The Joe Rogan Experience - June 26, 2017


Joe Rogan Experience #979 - Sargon of Akkad


Episode Stats

Length

3 hours and 59 minutes

Words per Minute

205.3

Word Count

49,069

Sentence Count

4,954

Misogynist Sentences

199


Summary

Joe Rogan is back in Los Angeles and ready to rock and roll again. He talks about his upcoming Netflix special, his new music video, and how he's feeling about going back on tour.


Transcript

00:00:00.000 Hey everybody, what the fuck's going on?
00:00:02.000 I should probably put my headphones on.
00:00:03.000 I'm freaking myself out.
00:00:04.000 I can't hear myself.
00:00:07.000 Thanks, everybody.
00:00:08.000 Came to Minneapolis and Indianapolis the first weekend of the big fat tour.
00:00:13.000 It was awesome.
00:00:15.000 This weekend, I'm at the Icy, the Ice House in Pasadena, but it's sold out.
00:00:19.000 And then next weekend on Friday night, I'm at the Ka Theater at the MGM.
00:00:26.000 And then the next weekend is sold out in Utah.
00:00:30.000 And the next dates that are available are Sacramento, Seattle, and San Diego.
00:00:35.000 All three of those I added second shows.
00:00:38.000 I think the first shows are sold out, but there's some tickets available for the second show.
00:00:41.000 JoeRogan.net forward slash tour.
00:00:44.000 And it looks like what I'm going to try to do is do another Netflix special at the end of this.
00:00:48.000 The tour goes on into December.
00:00:51.000 And then I'm thinking when that's done, I'll know exactly where to go and when.
00:00:56.000 But as of this weekend, I'm feeling good.
00:00:59.000 I'm ready to rock.
00:01:00.000 And six months from now, I'll be fucking primed.
00:01:03.000 Whoa.
00:01:04.000 This episode of the podcast is brought to you by Squarespace, which is an excellent platform for you to create your own website.
00:01:10.000 Go to squarespace.com now today.
00:01:13.000 Go to squarespace.com forward slash Joe, and you can get a free trial and 10% off your first purchase.
00:01:19.000 And they will show you how easy it is for you to create your own website.
00:01:23.000 It used to be very difficult.
00:01:25.000 It is now very easy.
00:01:27.000 And with Squarespace, each website comes with a free online store.
00:01:31.000 You get a free domain name if you sign up for a year.
00:01:34.000 And it's an impossibly easy to use, incredibly well-designed user interface that allows you to create these beautiful websites, beautiful templates, and a simple drag and drop user interface that you can use.
00:01:47.000 If you can do anything on a computer, like attach a photograph to an email, if you can move files around, if you do normal stuff, you know how to make a website with Squarespace.
00:01:58.000 And they have award-winning 24-7 customer support in case you get it all fucked up.
00:02:02.000 It'll look great on everything, and it's used by a wide range of people, whether it's restaurants, stand-up comedians, Duncan Trussell and Doug Stanhope.
00:02:11.000 Both of their websites are made with Squarespace, artists, designers, and more.
00:02:16.000 So for a free trial and 10% off your first purchase, go to squarespace.com forward slash Joe.
00:02:23.000 We're also brought to you by Blue Apron.
00:02:26.000 Blue Apron, which is a company that sends you a styrofoam cooler that's packed with fresh ingredients from local farms.
00:02:35.000 Produce is sourced from farms that practice regenerative farming.
00:02:39.000 The beef, chicken, and pork come from responsibly raised animals, and the seafood is sourced sustainably under standards developed in partnership with the Monterey Bay Aquarium Seafood Watch.
00:02:49.000 They have these partnerships with over 150 farms, fisheries, and ranchers across the United States.
00:02:55.000 They send you these pre-packaged, pre-portioned little packets and step-by-step photographic instructions that show you how to create delicious meals.
00:03:05.000 It's super easy to do and there's very little waste because everything is portioned out perfectly for you.
00:03:13.000 Even if you don't cook normally, if you can follow instructions, you can cook.
00:03:17.000 They make it real easy.
00:03:18.000 And again, photographic, step-by-step instructions takes between 40 minutes, somewhere around 40 minutes for most people to cook some of these meals.
00:03:29.000 And they cost less than $10 a person.
00:03:31.000 And they have delicious stuff like crispy, or excuse me, creamy shrimp rolls with quick pickles and sweet potato wedges, chili butter steaks with parmesan potatoes and spinach.
00:03:44.000 So it's like they're cool, interesting choices.
00:03:48.000 It's not boring food.
00:03:50.000 I've done it.
00:03:51.000 I like it.
00:03:51.000 And again, it's pretty affordable.
00:03:53.000 Less than $10 per meal and a wide variety.
00:03:57.000 Choose from a variety of new recipes each week.
00:04:01.000 Or you can let Blue Apron's culinary team surprise you.
00:04:04.000 And the recipes are not repeated within a year, so you'll never get bored.
00:04:08.000 Check out this week's menu and get your first three meals for free with free shipping by going to blueapron.com forward slash Rogan.
00:04:18.000 You'll love how good it feels and tastes to create your own incredible home cooked meals with Blue Apron.
00:04:22.000 And again, it's very easy to do.
00:04:24.000 BlueApron.com forward slash Rogan.
00:04:28.000 We're also brought to you each and every episode by Onnit.
00:04:31.000 That's O-N-N-I-T.
00:04:33.000 Onnit is a total human optimization company.
00:04:36.000 And what we concentrate on is anything that can inspire you.
00:04:44.000 We put on the Onnit Academy link.
00:04:46.000 It's filled with hundreds of articles on strength and conditioning, on the motivation to get yourself pumped up for life, on nutrition, nutrient absorption, all sorts of different interesting shit we find.
00:05:00.000 Q&As with influential individuals and inspirational people, and articles and videos.
00:05:07.000 And that sort of embodies what we're trying to do at Onnit.
00:05:10.000 What we're trying to do is provide people with excellent supplements in the form of AlphaBrain, which is a cognitive enhancing nootropic.
00:05:19.000 That's sort of the cornerstone for our supplement line.
00:05:22.000 Double-blind placebo-controlled studies that show that it improves verbal memory, reaction time, all that's available at Onnit.com for you to look at.
00:05:30.000 We sell strength and conditioning equipment, healthy snacks, and different various supplements.
00:05:36.000 But all of it concentrated on this one goal, getting you into your best state, whether it's your best state physically, your best state mentally, optimizing your time here, firing you up, getting shit done, and feeling good.
00:05:51.000 Go to Onnit.com to check it out, O-N-N-I-T.
00:05:55.000 Use the code word Rogan, and you will save 10% off any and all supplements.
00:06:00.000 My guest today is an interesting individual.
00:06:04.000 He's a very wise man, very smart guy, also very aggressive, very aggressive with his ideas.
00:06:09.000 He has a very popular YouTube channel.
00:06:12.000 And man, when I tell you that I got bombarded by people that wanted me to have him on the podcast, I mean fucking bombarded.
00:06:20.000 And he did my friend Dave Rubin's podcast, who's also been a guest on this podcast many times.
00:06:25.000 And Dave recommended him.
00:06:26.000 And we had a wonderful chat.
00:06:29.000 A few things we didn't totally agree on, but I really enjoyed talking to him.
00:06:32.000 And I hope you enjoy it as well.
00:06:34.000 Give it up for Sargon of Akkad.
00:06:38.000 Joe Rogan podcast, check it out!
00:06:40.000 The Joe Rogan experience.
00:06:42.000 Train my day!
00:06:43.000 Joe Rogan podcast by night all day!
00:06:48.000 Free Kekistan?
00:06:49.000 Oh no.
00:06:50.000 I meant to wear it at the panel yesterday, but I actually was late and I forgot.
00:06:56.000 Yeah, what we're talking about here, folks, is he has a free Kekistan shirt.
00:07:00.000 And I just learned about this very recently from Jordan Peterson.
00:07:03.000 Oh, yeah, he loves it.
00:07:04.000 I had no idea.
00:07:06.000 I didn't know what the deal with the frog was.
00:07:08.000 Do you know that you can't have the frog now?
00:07:11.000 What is it on iTunes?
00:07:12.000 Apple iTunes won't let you use the frog.
00:07:14.000 It's great.
00:07:15.000 What the fuck, people?
00:07:17.000 What is Apple doing?
00:07:18.000 It's a frog.
00:07:19.000 It's a cartoon frog.
00:07:20.000 So Hillary Clinton denounced it.
00:07:22.000 She denounced the frog?
00:07:23.000 Oh, yeah.
00:07:24.000 She said something about deplorables.
00:07:26.000 What did she say about the frog?
00:07:28.000 Oh, it was part of an article on her website.
00:07:32.000 Oh, God.
00:07:33.000 They were saying how this cartoon frog is a hate symbol.
00:07:34.000 It's like you're ridiculous.
00:07:35.000 You're a fucking idiot.
00:07:37.000 I mean, come on, let's get it.
00:07:38.000 It's a frog.
00:07:41.000 This is the best bit about the internet.
00:07:43.000 We can make them look ridiculous.
00:07:45.000 Well, they look ridiculous on their own.
00:07:47.000 I mean, it's they.
00:07:48.000 We're saying they, right?
00:07:49.000 We're already like doing the bad thing, lumping people into these groups.
00:07:52.000 Yeah, a mythical way.
00:07:53.000 But it is a weird thing that they don't understand.
00:07:56.000 Like, whoever gets upset at this does not understand how preposterous it is to freak out about a frog.
00:08:03.000 And who the fuck is over at Apple?
00:08:06.000 First of all, what about the poor bastard that made Pepe the frog for totally non-Kekistan, non-red pill reasons?
00:08:15.000 This poor bastard just had this cartoon frog that he made, feels bad, man, you know?
00:08:21.000 And then all of a sudden, that poor guy has to say, I'm killing off Pepe.
00:08:25.000 And then the internet's like, bitch, you're not killing off shit.
00:08:28.000 Nobody knows who you are, but everyone knows who Pepe is.
00:08:30.000 You can't kill off something like that.
00:08:32.000 That's so crazy.
00:08:33.000 Yeah, it's silly to think that he could have achieved it, isn't it?
00:08:36.000 Well, everybody's so, well, it's almost like they have to denounce it.
00:08:40.000 You know, otherwise they'll get lumped in with a supporter.
00:08:40.000 Yeah.
00:08:43.000 Like, he probably does think that he's a, people think he's a supporter.
00:08:46.000 Maybe.
00:08:47.000 Well, I don't think they do.
00:08:48.000 I think he's like, oh, you know, I don't know what's going on, so I want to stay out of it.
00:08:52.000 But I haven't really looked into that too much.
00:08:53.000 It's weird, man, because like, you know, people think they're being shadow banned.
00:08:59.000 Like, Scott Adams had a big post the other day where he said he's being shadow banned.
00:09:04.000 Yeah, but he said he talked to Jack at Twitter, and Jack said there's no shadow banning.
00:09:04.000 Probably is.
00:09:09.000 I talked to Jack.
00:09:10.000 Oh, shut up, Jack.
00:09:11.000 We know they're shadow banning.
00:09:13.000 This is Sargon speaking.
00:09:14.000 Jack's full of shit.
00:09:16.000 Do you think Jack's full of shit?
00:09:16.000 Right, Jack, we know they're shit.
00:09:17.000 No, absolutely.
00:09:18.000 Why the fuck haven't I been verified?
00:09:19.000 Jack?
00:09:20.000 Why haven't I been verified?
00:09:20.000 Tell me, Jack.
00:09:22.000 He's probably super busy with the 400 billion people that are on Twitter.
00:09:22.000 What's your excuse?
00:09:28.000 Probably super full of shit.
00:09:29.000 That's what I'm saying.
00:09:30.000 So have you tried to get verified?
00:09:31.000 Three times, and every time they've just denied it. Why?
00:09:34.000 I mean, what is it?
00:09:35.000 Don't have enough.
00:09:36.000 How many followers do you have?
00:09:36.000 200,000.
00:09:38.000 There are a bunch of parody accounts about me.
00:09:39.000 My fans need to know who I am on Twitter.
00:09:41.000 Don't you think that's the reason for verification, Jack?
00:09:44.000 Isn't that interesting?
00:09:45.000 Because if you were on a television show, it would be expected that you would be verified by now.
00:09:51.000 With that kind of following, that's a giant following.
00:09:53.000 And you're seeing that.
00:09:54.000 230,000 on YouTube or something?
00:09:56.000 Yeah, that's a lot of people.
00:09:57.000 Seems worth verifying me for.
00:09:59.000 It's a badge of status, which is why they de-verified Milo.
00:09:59.000 Yeah.
00:09:59.000 But it's not.
00:10:03.000 I mean, that's the most ridiculous one.
00:10:05.000 De-verifying someone.
00:10:06.000 You can't do that.
00:10:06.000 That proves it.
00:10:07.000 How do you verify someone and then de-verify?
00:10:10.000 It's not Milo anymore, is it, Jack?
00:10:11.000 That's the guy.
00:10:12.000 Yeah, you're a fucking liar.
00:10:13.000 You know it's a badge of status.
00:10:15.000 You know you're doing it to create some sort of privileged class on Twitter.
00:10:17.000 I mean, they even get like a different function, don't they?
00:10:20.000 Yeah.
00:10:20.000 Where you can only see through other verified.
00:10:22.000 Hell, every account on Twitter should be verified.
00:10:24.000 Yeah, that is a thing you do, right?
00:10:26.000 You're allowed to only see accounts that are verified.
00:10:29.000 Yeah, it's a privilege.
00:10:30.000 And he took it away from Milo to punish him for being a dissident.
00:10:34.000 And he won't verify me to punish me for being a dissident.
00:10:37.000 And he knows it.
00:10:38.000 This is a strange.
00:10:39.000 You're fucking balls deep in this, dude.
00:10:41.000 I feel it.
00:10:41.000 You're very emotionally invested in all this.
00:10:44.000 Oh, yeah.
00:10:44.000 That's because I've never had to play the game.
00:10:47.000 I'm where I am now entirely on my own merits.
00:10:51.000 I've never had any industry connections.
00:10:54.000 I've never been to America before.
00:10:55.000 I've never spoken to any of these people.
00:10:57.000 They all knew who I am, and they're all afraid of me.
00:11:00.000 They're all afraid of you though.
00:11:00.000 Because if I can do it, anyone can do it.
00:11:04.000 Why do you think it's fear?
00:11:05.000 Why do you think it's fear?
00:11:06.000 Because we don't need them.
00:11:07.000 Well, you definitely don't need them.
00:11:08.000 No one needs anybody anymore.
00:11:10.000 Exactly.
00:11:10.000 But that's a good thing.
00:11:11.000 But you can't create a system of privilege without interdependent connections.
00:11:16.000 There is no system of privilege if you don't have to rely on other people to maintain a privileged class.
00:11:20.000 If you don't have to maintain a privileged class and you don't have to rely on other people, then their status goes away.
00:11:26.000 Let's break this down for people who don't know what the fuck we're talking about.
00:11:30.000 Your...
00:11:32.000 That's your actual name, right?
00:11:34.000 Was that cool?
00:11:34.000 Are you dead naming me?
00:11:35.000 Are you doxxing me?
00:11:36.000 I am doxing you.
00:11:37.000 But you've already been doxed.
00:11:37.000 Oh, my God.
00:11:38.000 We were discussing this.
00:11:40.000 Now, your YouTube account, you have great videos.
00:11:43.000 Thank you.
00:11:44.000 Really great commentary.
00:11:45.000 Very smart guy.
00:11:46.000 Thank you.
00:11:47.000 And how did you get involved?
00:11:49.000 I mean, are you like, what is this?
00:11:51.000 Are you an alt-right guy?
00:11:52.000 No.
00:11:53.000 You're not, right?
00:11:54.000 Yeah, the alt-right is an identitarian movement for white people.
00:11:58.000 They're heavily nationalist, protectionist, anti-interventionist.
00:12:01.000 They're not like laissez-faire free market capitalists.
00:12:04.000 I'm a liberal.
00:12:05.000 I'm like an English liberal, as in I want universal rights, laissez-faire capitalism.
00:12:10.000 I don't mind, I mean, socialized healthcare and a welfare state I'm happy with as long as it's universal, as long as it doesn't just apply to some, as long as anyone could access it if they needed to.
00:12:20.000 Just universal rights for all.
00:12:23.000 I can't see a reason to discriminate against someone when it comes to dealing with rights.
00:12:28.000 Well, who's defining alt-right?
00:12:32.000 Has it been clearly defined?
00:12:33.000 No.
00:12:34.000 Because it's so new.
00:12:35.000 It's actually relatively well defined now because the alt-right have asserted themselves.
00:12:39.000 They said, look, no, we're for white people.
00:12:42.000 We're pro-white right.
00:12:42.000 But who's saying that?
00:12:43.000 Which guys?
00:12:44.000 Well, like, you know, it's like RamZPaul or Richard Spencer.
00:12:48.000 But as soon as someone like that says they're alt-right, then everybody else just assumes that, and that allows them to define it.
00:12:57.000 Because I think there's probably a lot of people who consider themselves alt-right where they were right-wing, but they weren't racist.
00:13:04.000 They weren't white supremacists.
00:13:05.000 They just didn't like this old man right that was there before.
00:13:10.000 I think they call themselves the alt-light.
00:13:13.000 I don't know whether that's a term of disparagement or not, or the new right or something like that.
00:13:18.000 But the thing is, I don't run in these circles, so I actually don't really know very much about those guys, like the Paul Joseph Watsons and Cernoviches.
00:13:24.000 I don't find them particularly offensive.
00:13:26.000 I just find them to be like a modern form of conservatism.
00:13:29.000 I think Paul Joseph Watson is a very reasonable guy.
00:13:32.000 He's very articulate.
00:13:34.000 He seems pretty smart.
00:13:36.000 He does the opposite of what the media does.
00:13:41.000 That's not a criticism.
00:13:42.000 But he makes good points.
00:13:43.000 Exactly.
00:13:44.000 About a lot of things.
00:13:45.000 Yeah, yeah, absolutely.
00:13:46.000 I think it's necessary for him to exist.
00:13:50.000 Because they both create narratives.
00:13:52.000 And it's not that these narratives aren't true.
00:13:55.000 It's just that sometimes these narratives sometimes leave things out, but not always.
00:13:58.000 This isn't like a condemnation of either side by saying this.
00:14:02.000 This is just what they do.
00:14:03.000 And that's okay.
00:14:04.000 You know, everyone's entitled to create their own sort of, and I'm going to use the word propaganda, but again, I'm not using it in a derogatory sense.
00:14:10.000 Right.
00:14:11.000 I mean, you're like, look, I want to make this point.
00:14:13.000 I don't want to make your point, you know, because you're making your point.
00:14:16.000 I'm going to make my point.
00:14:17.000 And these two points will both stand in existence and will both be true at the same time.
00:14:22.000 And so it's okay for Paul Joseph Watson to exist.
00:14:25.000 You know, it's fine.
00:14:26.000 I don't have a problem with him.
00:14:28.000 I don't really have any criticisms of him, to be honest.
00:14:30.000 Otherwise, other than, you know, I mean, I wouldn't do what he's doing.
00:14:33.000 What is he doing that you wouldn't do?
00:14:35.000 He goes a bit harder than I would.
00:14:37.000 And I don't think he is as objective as I would prefer out of the news source.
00:14:41.000 I prefer someone more like Philly D. Isn't that an interesting thing?
00:14:45.000 The objectivity.
00:14:46.000 This is a real problem when it comes to left versus right or any of these groups.
00:14:51.000 You get lumped into these categories and then you start defending your turf.
00:14:54.000 And that seems to be a huge issue today.
00:14:59.000 Everybody just sort of digs their heels in the sand and decides this is who I am.
00:15:03.000 And you start using they and them, which is why I try to avoid that.
00:15:07.000 The bad guys.
00:15:08.000 Yeah, they.
00:15:09.000 And they what they want, what the left wants.
00:15:11.000 It's really tough, actually, because I've actually done, like, channel polls where I've polled my subscribers and stuff.
00:15:11.000 Yeah.
00:15:19.000 And I got like it was a year ago, but my content hasn't really changed tenor.
00:15:23.000 And I got like tens of thousands of responses.
00:15:26.000 So it's a really good sample size.
00:15:27.000 And most of my subscribers are center-left liberals who just lost.
00:15:32.000 You know, they're not identitarians.
00:15:33.000 They don't want to sit there and go, you know, I mean, I was at Vicon and they had so many panels.
00:15:38.000 They're like, let me talk about me being a Latinx.
00:15:41.000 Let me talk about me being a Latina.
00:15:43.000 And it's like a Latinx.
00:15:45.000 Yeah.
00:15:45.000 A Latinx.
00:15:45.000 Non-binary Latina.
00:15:47.000 Oh, come on.
00:15:48.000 No, I'm not joking.
00:15:50.000 I'm not making this up.
00:15:50.000 I'm not.
00:15:51.000 I wouldn't.
00:15:51.000 I'm not joking.
00:15:52.000 I wouldn't make anything up and I won't lie to you.
00:15:53.000 Right.
00:15:54.000 I'm absolutely serious.
00:15:55.000 But the thing is, what really annoys me about identity politics, right?
00:15:57.000 And when they're like, let me tell you about my identity.
00:16:00.000 It's like, look, dude, that's none of my fucking business.
00:16:01.000 I don't care about your fucking identity.
00:16:03.000 Do not tell me how you identify.
00:16:05.000 It's not my problem.
00:16:06.000 It's not my business.
00:16:07.000 It's not political.
00:16:08.000 And if you want to make it political, I'm just going to tell you to shut up and fuck off.
00:16:11.000 Because I don't care.
00:16:12.000 Hey, Joe.
00:16:13.000 Let me tell you.
00:16:14.000 You think it's just indulgent?
00:16:15.000 Yeah, of course it is.
00:16:16.000 Let me tell you about my white identity.
00:16:17.000 Look at this.
00:16:18.000 Yeah, yeah.
00:16:19.000 I told you.
00:16:19.000 See?
00:16:20.000 Latino/a versus Latinx versus Latine. Which word best solves Spanish's gender problem?
00:16:26.000 Really addressing the hard-hitting issues here.
00:16:28.000 Jesus, Rachel, Richard, Reichard.
00:16:31.000 Reichard.
00:16:31.000 Did I say her name?
00:16:32.000 I don't know.
00:16:33.000 Reichard.
00:16:34.000 Reichard, yeah.
00:16:34.000 Reichard.
00:16:35.000 Yeah.
00:16:35.000 What word best solves Spanish's gender?
00:16:39.000 What?
00:16:40.000 What gender problem?
00:16:41.000 Exactly.
00:16:42.000 Exactly.
00:16:43.000 They've made up all this crap.
00:16:44.000 All it is is talking about themselves.
00:16:46.000 They're not talking about real problems.
00:16:47.000 Is this what happens when you have too many vaccines?
00:16:49.000 I think this is what happens when you have too much money.
00:16:53.000 Yeah, people are just too, it's too easy to get by.
00:16:56.000 This is a symptom of affluenza.
00:16:57.000 Yeah.
00:16:58.000 They're thinking, hey, people give a shit what I think of my own genitals.
00:17:01.000 Not really.
00:17:03.000 Yeah, unless you're sticking them in my face, I really don't give a fuck.
00:17:03.000 Get out.
00:17:07.000 Yeah, exactly.
00:17:08.000 But then that's an interpersonal thing.
00:17:09.000 That's not like a political problem.
00:17:10.000 No, it's not a political problem.
00:17:12.000 Well, I mean, the politics of human interaction, I guess.
00:17:16.000 Yeah.
00:17:16.000 It's a little bit of that.
00:17:17.000 The thing that annoys me most is when they say, let me tell you about my blackness.
00:17:20.000 And I'm like, what the hell do you mean by your blackness?
00:17:22.000 And they're like, well, me as a black person, you're going to start stereotyping yourself, aren't you?
00:17:26.000 You're going to give me a stereotype of black people.
00:17:29.000 And then you're going to say, this is somehow representative, not only you, but other people.
00:17:32.000 But then you're validating Richard Spencer when he says, hey, can I talk to you about my whiteness?
00:17:36.000 What's your argument?
00:17:37.000 Say no.
00:17:38.000 You're not allowed to do that.
00:17:38.000 Exactly.
00:17:39.000 But why not?
00:17:39.000 What's wonderful?
00:17:40.000 Why not?
00:17:41.000 You're talking to me about your blackness.
00:17:43.000 I'll talk about my whiteness.
00:17:44.000 And all of a sudden, we're going to both realize, you know, we're just talking about ourselves.
00:17:47.000 We're just talking about ourselves.
00:17:48.000 Why don't we talk about something real?
00:17:49.000 Did you see that woman, the Delaware professor, who was talking about that kid that got killed in North Korea, the kid who stole the propaganda poster?
00:17:59.000 I know about the kid.
00:18:00.000 I haven't seen the professor talking about it.
00:18:02.000 Oh, man.
00:18:02.000 Well, she just got fired today.
00:18:04.000 Oh, really?
00:18:05.000 Luckily, yeah.
00:18:06.000 Which is nice.
00:18:07.000 It's a good sign.
00:18:08.000 Yeah.
00:18:08.000 But what she had said is essentially he had it coming and that she's tired of rich, clueless white people who come to her class.
00:18:18.000 And then she just goes on this series of straw men who do coke and have no problem raping a girl at a frat party.
00:18:25.000 Whoa.
00:18:26.000 Dude.
00:18:27.000 Where did that come from?
00:18:28.000 This guy just got beaten to death in Korea.
00:18:30.000 Yeah.
00:18:31.000 He shouldn't have raped girls at frat parties.
00:18:31.000 He didn't come.
00:18:33.000 He didn't rape anybody.
00:18:34.000 Of course he didn't.
00:18:35.000 There's no evidence he did coke.
00:18:36.000 Like, this fucking bitch.
00:18:38.000 Geez.
00:18:38.000 No one should go find.
00:18:39.000 Well, she's apparently, you know, she's racist.
00:18:43.000 She's racist towards white people.
00:18:45.000 I mean, you can be that.
00:18:46.000 And that's one of the wonderful things about the social justice warrior movement is that they've made it so it's impossible to be racist towards white people.
00:18:55.000 They believe it.
00:18:56.000 Yeah.
00:18:56.000 This is a real belief.
00:18:57.000 Oh, yeah.
00:18:58.000 That if you are a white person, you're a person of privilege.
00:19:01.000 And racism is something that a person of privilege does to a person in a compromised group.
00:19:06.000 Yeah.
00:19:06.000 Do you know what's really funny about it?
00:19:08.000 Is they, by saying it's a power plus prejudice, they make it, they say, right, it's institutional.
00:19:13.000 I say, okay, but that means you're taking away the word for interpersonal racism. So if, like, an Asian guy goes up to a black guy and says, hey, N-word, that can't be racism, because he doesn't have institutional power. And there is no word for racism between individuals now; you've redefined it to be institutional. So what is the word for interpersonal? Yeah, it's like that old Chris Rock bit about, for every black dude who thinks he's tough...
00:19:36.000 There's a Native American waiting to kick his ass.
00:19:40.000 A more marginalized group.
00:19:42.000 It's just, it is one of those things where it's like people are seeking problems, and they're also seeking to clearly identify themselves as being from a more compromised group.
00:19:54.000 Politics of victimhood.
00:19:55.000 Yeah, exactly.
00:19:56.000 I'm the victim here.
00:19:57.000 Give me some special treatment.
00:19:58.000 It's like, dude, I don't want to be a victim.
00:19:59.000 How do you feel about transracialism?
00:20:02.000 I think that there's no logical argument against that.
00:20:04.000 Do you think you can identify as an Eskimo?
00:20:06.000 As far as they're concerned, you can.
00:20:07.000 There was a big hoo-ha in academia recently because someone published a paper, I can't remember the name of the person, published a paper saying, well, look, what's the argument against this?
00:20:14.000 If you can identify as another gender, why can't you identify as another race?
00:20:17.000 Same logic, just applied to a different... Not necessarily, because the argument of you identifying with another gender could be based on some sort of hormonal issue or some neurotransmitter misfire.
00:20:32.000 There could be something that could be physically.
00:20:34.000 There could be, if any of their arguments were hinging on the idea that that mattered, but it's not.
00:20:40.000 They say, well, you can identify as anything you want because it's all a social construct.
00:20:42.000 So you can do whatever you want with it.
00:20:44.000 Yeah, that's where it gets slippery, right?
00:20:45.000 Where it gets really slippery is that sex and gender are social constructs.
00:20:49.000 It's like, well, clearly that's not true.
00:20:51.000 Actually, it is true.
00:20:51.000 No, no, no, no.
00:20:52.000 They are social constructs.
00:20:53.000 But the thing is, they're not...
00:20:55.000 It's not a social construct.
00:20:57.000 Biological sex is based on your chromosomes.
00:20:59.000 But your gender is a group of constructed behaviors around that.
00:21:04.000 But it's constructed for a good reason.
00:21:06.000 Men have got, you know, clearly got a superior natural advantage in things like strength and speed over women.
00:21:11.000 And women need protection when they are, say, pregnant.
00:21:14.000 You know, when they're looking after kids.
00:21:15.000 They need society to look after them because they have something that needs protecting.
00:21:20.000 And so we've got a whole, and each society, the reason you can tell that these are actually social constructs is because every society is different.
00:21:25.000 I mean, in Saudi Arabia, the men walk in front, the women walk behind, and they keep their heads down and get, shut the fuck up.
00:21:31.000 That's a social construct.
00:21:32.000 That's a gender role, you know?
00:21:34.000 Right, but that's a cultural role.
00:21:35.000 Yeah, of course, but this is still a gender role, and that's my point.
00:21:38.000 You know, for us, we don't do that.
00:21:39.000 But it's not necessarily.
00:21:40.000 It's not like a social construct, meaning that this is something that occurs all the time, even in primates.
00:21:47.000 I mean, there's a lot of behavior that males exhibit in nature.
00:21:51.000 Oh, I'm not saying there's no relation to biology, but how is it a social construct?
00:21:56.000 Well, it's a behavior that we have.
00:21:59.000 It's not like something that's consistent.
00:22:01.000 Like, you know, all men pithra, the dicks, is not a social construct, right?
00:22:06.000 Every man on earth does this, but not every woman on earth has to walk behind her man silently 10 paces back because she's female, right?
00:22:14.000 In the West, it's actually, you know, the other way.
00:22:15.000 You know, the man opens the door, the woman goes forth.
00:22:17.000 There's a gender role, but it's not the same.
00:22:19.000 So it can be changed.
00:22:20.000 It is a social construct, but that doesn't mean that there isn't a gender role.
00:22:24.000 There is always a gender role.
00:22:25.000 It's always informed by biology because men and women are fundamentally different and they always will be.
00:22:30.000 Well, that's where people get slippery with you, right?
00:22:32.000 That's where they like it that way.
00:22:34.000 They like it that way.
00:22:36.000 Who's the one feminist?
00:22:36.000 What is that?
00:22:37.000 There's one really nutty young lady who was saying that even physical strength is just because men are encouraged to do physical things.
00:22:45.000 Hey, are you referring to one Anita Sarkeesian?
00:22:48.000 Is that her?
00:22:49.000 She's definitely said that.
00:22:50.000 Did she say that?
00:22:51.000 Oh, come on.
00:22:52.000 I know she's loony, but did she go that far?
00:22:54.000 There is a school of feminist thought that thinks this.
00:22:59.000 Oh, that's a fun one.
00:23:01.000 So they literally believe that the physical strength differences between men and women are just because of the way society treats women.
00:23:07.000 Men are socialised to do sports.
00:23:11.000 She really thinks that?
00:23:12.000 I don't know whether she believes that.
00:23:12.000 She said it.
00:23:13.000 See if you can find that.
00:23:15.000 It's from a video from like 2013 or something.
00:23:17.000 She seems quite the opportunist.
00:23:20.000 Well.
00:23:21.000 Isn't she on the Trust and Verify Galactic Council for Twitter?
00:23:25.000 She certainly was, which is probably one of the other reasons Jack...
00:23:32.000 You say she certainly was.
00:23:33.000 Is she still?
00:23:35.000 I don't know whether she's still, but I know that she certainly was at least a contact for them.
00:23:40.000 But there's been a bit of an interesting development with Miss Sarkeesian, if you'd like to hear about it.
00:23:44.000 Well, tell people who she is.
00:23:46.000 I don't even know if I need to.
00:23:48.000 Most people don't know.
00:23:50.000 You live in a very insulated world.
00:23:52.000 If I ask my mom, she won't know, but people are familiar with Feminist Frequency.
00:23:56.000 Yeah, yeah, of course your mum won't know.
00:23:58.000 Okay, Anita Sarkeesian is an internet feminist who decided after making a bunch of videos on her YouTube channel in like 2009 through to like 2010 or something, 12, she ended up getting a sort of, you know, favourable reception with the progressive intelligentsia, who are very feminist, very pro-social justice.
00:24:22.000 And they liked her patter because it's pretty stock, pretty stock in trade.
00:24:28.000 And she came across as being very respectable.
00:24:31.000 She's well-presented.
00:24:32.000 She's attractive.
00:24:32.000 She's well-spoken.
00:24:34.000 She knows her patter.
00:24:35.000 And so they promote her.
00:24:37.000 They made a big deal out of her.
00:24:39.000 She sounds like a total bigot when she's talking about anything.
00:24:44.000 How so?
00:24:44.000 A bigot?
00:24:44.000 Yeah.
00:24:45.000 Because she'll just say, well, this is men.
00:24:47.000 I mean, the whole premise of feminism is that men are oppressing women.
00:24:50.000 And if men aren't oppressing women, then feminism has no argument.
00:24:54.000 And they also have a sort of thing they say to each other, where you have the right to hate your oppressor.
00:25:03.000 It's right to do this.
00:25:04.000 And it's like, okay, but I don't think it's ever a right to hate, even if you are being oppressed.
00:25:08.000 This is something that she said personally?
00:25:09.000 Not her personally.
00:25:10.000 She probably has said it personally, but this is the sort of ideological thought.
00:25:14.000 I mean, if you read their papers, if you read...
00:25:17.000 They're doing that they-them thing again, right?
00:25:19.000 That's what we're talking about.
00:25:22.000 Yeah, we are talking about a specific sphere of academic feminists.
00:25:25.000 In her case, sex-negative.
00:25:27.000 Sex negative.
00:25:28.000 Yeah, they don't like objectification.
00:25:31.000 They don't like sexy women and titties.
00:25:33.000 Sex negative?
00:25:34.000 Yeah, they don't like sexy women.
00:25:35.000 The male gaze is oppressing women.
00:25:37.000 Lara Croft, Tomb Raider.
00:25:39.000 Doggy.
00:25:40.000 That was part of the big issue, right?
00:25:41.000 It was like sex oppression because of Lara Croft.
00:25:45.000 Terrible, terrible oppression.
00:25:46.000 I don't know how women can get over it.
00:25:48.000 But Anita Sarkeesian stepped up and was like, you know what?
00:25:51.000 It's wrong to look at these tits.
00:25:52.000 And everyone was like, but she's wearing a really tight top for a reason, and she's not real.
00:25:57.000 Yeah.
00:25:59.000 But yeah, so basically, Anita Sarkeesian comes from a school of sex-negative feminism and she decided that she would take advantage of all this.
00:26:06.000 Now, in 2010, I think it was 2010, there was a video recorded in black and white of her at a university giving a speech where she says, I'm not a gamer.
00:26:14.000 I had to learn a lot about gaming to get into all this.
00:26:17.000 And then come 2012, I mean, but that video wasn't released until after she did her tropes versus women in gaming Kickstarter.
00:26:24.000 2012, May 2012, she starts this Kickstarter and within the first few days, she starts posting updates on that.
00:26:31.000 I'm being harassed.
00:26:32.000 It's like, well, yeah, people generally react negatively to phenomenal bigots on the internet.
00:26:37.000 They don't like it.
00:26:38.000 People don't like it.
00:26:39.000 I mean, like when you say men are oppressing women and everything she says is a variant of men are oppressing women.
00:26:45.000 You know, the male gaze is highly oppressive to women.
00:26:48.000 Toxic masculinity is highly oppressive to women.
00:26:50.000 Okay, when can men legitimately look at women, Anita?
00:26:53.000 No answer.
00:26:54.000 What is non-toxic masculinity?
00:26:56.000 There's no such thing.
00:26:57.000 So what you're saying is masculinity and men checking out chicks is just bad and you hate it and the men are all bad for doing this and they're oppressing the women when they're doing this.
00:27:06.000 Well, that's why I said it seems like she's a bit of an opportunist.
00:27:08.000 She's got an angle and she realizes that there's a lot of attention that goes towards that angle and she runs with it.
00:27:15.000 I mean, if I turned around and said all women are a bunch of manipulative slags who will take your money and run, they'd be like, you're a fucking misogynist.
00:27:22.000 All right, but I got to stop you there because I don't think she's ever said all men.
00:27:25.000 I don't think she said that.
00:27:26.000 She just uses the term men.
00:27:28.000 But I don't think she does.
00:27:30.000 I mean, I really don't think she says that. She has that guy as her fucking head guy who writes for her.
00:27:35.000 Well, McIntosh.
00:27:36.000 No, she fired him years ago.
00:27:38.000 I know.
00:27:39.000 What is he doing now?
00:27:41.000 Nothing, basically.
00:27:42.000 This terrible thing.
00:27:43.000 But I want to set the record straight here, because this is something that really annoys me.
00:27:47.000 So I went to VidCon recently.
00:27:50.000 VidCon.
00:27:50.000 Yeah.
00:27:51.000 What is that?
00:27:51.000 YouTube convention in Anaheim.
00:27:53.000 Sounds like a good time.
00:27:54.000 It was a really good time for me, but it wasn't for them.
00:27:58.000 Their panels look like a bloody death march.
00:28:00.000 When they spotted us sitting in the crowd, their faces looked like, oh, God.
00:28:03.000 Why, they get upset at you?
00:28:04.000 Yeah, just because we sat there listening to what they had to say.
00:28:06.000 In fact, have you not heard about this?
00:28:09.000 No, dude.
00:28:09.000 I'm insulated.
00:28:10.000 Oh, dude, this is crazy.
00:28:13.000 So, Anita Sarkeesian, she spots me sitting in the audience, and then she starts freaking out.
00:28:18.000 Freaking out.
00:28:19.000 Yeah, she starts saying things like, there's a constant harasser of mine calling me, you know, she called me a shithead.
00:28:25.000 She called me a garbage human.
00:28:27.000 A garbage human?
00:28:28.000 I did nothing to her.
00:28:29.000 I didn't say a word.
00:28:30.000 just sat in the audience while she's there cussing me out on the stage.
00:28:40.000 And this was because you have made videos about her?
00:28:42.000 Is that what I'm saying?
00:28:42.000 Well, this is because I've made videos debunking her points.
00:28:45.000 Like, and she's under the impression that all of my videos are about her.
00:28:49.000 And I actually, on the ride over, I got people to counter up for me.
00:28:53.000 I've made around 30 videos in about four years, referring to her.
00:28:58.000 No, no, no, no.
00:29:00.000 It sounds like a lot.
00:29:01.000 But that's less than 4% of the videos I've made.
00:29:04.000 I've made 780 videos.
00:29:05.000 I work hard.
00:29:06.000 I'm sure you do, but that's still 30 videos.
00:29:08.000 I don't care if you've made 100 million videos.
00:29:10.000 You still make 30 videos about her.
00:29:12.000 I don't think that's a large number.
00:29:13.000 Maybe it's a large number compared to Anita, who hardly does any bloody work.
00:29:17.000 But that's another interesting point.
00:29:19.000 But what's interesting about it is if someone puts something out there, you're going to expect that someone is going to come along, especially a guy like you, that is very opinionated.
00:29:32.000 And if someone has very different opinions than yours, you're going to have a contrary point.
00:29:37.000 That's definitely not harassing.
00:29:37.000 Of course.
00:29:39.000 It better not be.
00:29:40.000 Well, it can't be.
00:29:41.000 We're in a scary position if that's terrifying.
00:29:44.000 So how are you harassing?
00:29:46.000 What was the thing?
00:29:47.000 What's the worst thing you've ever said about her?
00:29:50.000 That she's wrong.
00:29:52.000 I mean, you will never find someone who says she's a good critic.
00:29:52.000 Yeah, well, yeah.
00:29:57.000 She's a terrible critic, and she always has been.
00:30:00.000 It's always been rightly pointed out to her that she's cherry-picked every time.
00:30:03.000 No one stands by her work.
00:30:05.000 She's become like a symbol of feminism because of all the harassment she's received.
00:30:09.000 That is a part of the problem, right?
00:30:11.000 That victim culture.
00:30:12.000 It's very strange.
00:30:14.000 You don't get any props unless someone's sending you horrible messages.
00:30:17.000 And then a lot of people have been caught doing it to themselves, which is really fucked up.
00:30:22.000 Yeah, yeah, loads, like the Black Lives Matter activists and whatnot.
00:30:25.000 But yeah, so basically, her work sucks.
00:30:28.000 It's just bad work.
00:30:29.000 I mean, nobody in the video game industry would turn around and say, you know what, that's great criticism, Nita.
00:30:34.000 Only like the progressive press.
00:30:36.000 What Black Lives Matter?
00:30:37.000 Oh, it was ages ago at protests.
00:30:40.000 They basically got into the university's computer system, logged into a fake Twitter account or something, sent themselves a bomb threat, and the police tracked it down and they were like, we have the video camera footage of you doing this.
00:30:51.000 Yeah, yeah, it was actually really funny.
00:30:53.000 But yeah, people are so funny with their need to be a victim.
00:30:57.000 Well, it validates them.
00:30:59.000 Yeah, it really does.
00:31:00.000 When you don't have an argument, you need to be a victim.
00:31:02.000 Well, does she expect that somehow or another she's going to be able to put these points out and not have anybody dispute these points?
00:31:08.000 Well, I'm guessing so.
00:31:09.000 And she always conflates all criticism with harassment.
00:31:12.000 She's never addressed a critic before.
00:31:13.000 Oh, okay.
00:31:14.000 And the first time she addresses a critic, it's me and she's calling me names.
00:31:17.000 Yeah, that's a good idea.
00:31:18.000 That's not very mature.
00:31:20.000 Yeah, it's not very good.
00:31:22.000 She says that I'm responsible for all of the harassment.
00:31:25.000 All of it?
00:31:26.000 Well, at least.
00:31:27.000 It's definitely not hers.
00:31:28.000 It's not her fault.
00:31:29.000 If you have a controversial opinion and someone disputes it, that person who disputes it is responsible for all the harassment.
00:31:34.000 Yeah, well, basically, I've always said, right from the get-go, don't ever contact them.
00:31:39.000 Don't ever contact them because for a start, I mean, they don't want to listen.
00:31:42.000 She's not interested in hearing your fucking opinion.
00:31:44.000 So don't bother.
00:31:45.000 She's got nothing but contempt for you.
00:31:47.000 But secondly, don't send her harassment because A, it's immoral.
00:31:50.000 And B, she's got the victim card that gives her money.
00:31:54.000 When you send her a message saying, hey, Nita, I think you're a bitch.
00:31:57.000 She goes, cha-ching, you know, and then she runs off to the next interview where she gets paid.
00:32:02.000 I mean, some of her speeches at university, she's been paid 20 grand a speech.
00:32:05.000 That's hilarious.
00:32:06.000 I'm not joking, right?
00:32:07.000 And it's just, and for her to go, the internet's being mean to me.
00:32:10.000 Well, maybe if you weren't being a bigot online, they wouldn't be.
00:32:13.000 Maybe if you actually did some solid work and didn't say video games are making everyone sexist, which they're obviously not, this wouldn't be happening to you.
00:32:19.000 But the thing that pisses me off most about the accusations towards me of harassment, apart from the fact that I've never harassed her, and if there was any evidence I'd harassed her, it'd be all over the internet right now.
00:32:29.000 You know, I mean, it would be nothing but Twitter.
00:32:31.000 Of course, you know, because they're trying to paint me as a harasser when I'm actually just a critic of hers.
00:32:36.000 And I've always been completely against it.
00:32:38.000 But what really annoys me is that her Kickstarter, if you look at it now, if you go to it, you can pull it up now and you can see tropes versus women and like, you know, she's got a spiel and then, oh, the terrible messages, the harassment, the harassment.
00:32:49.000 That was in May 2012.
00:32:50.000 I didn't make a video on YouTube until July 2013.
00:32:54.000 This has been happening to her whole career.
00:32:57.000 It's nothing to do with me.
00:32:59.000 It's to do with her.
00:33:00.000 It's because she's a dick.
00:33:02.000 And everyone can see it.
00:33:04.000 And you can call a girl a dick, like you can call a guy a pussy.
00:33:07.000 I'm calling her a dick.
00:33:07.000 Yeah.
00:33:08.000 I don't care.
00:33:09.000 You're a dick, Anita.
00:33:10.000 Everyone knows you're a dick.
00:33:11.000 He looked into the camera.
00:33:12.000 Do you know what I'm saying?
00:33:12.000 That's harassment.
00:33:13.000 Do you want to know something?
00:33:14.000 I've spoken to a lot of interesting people recently since I've been here.
00:33:18.000 A lot of people who know you personally, Anita, and they say everyone hates you because you're a dick.
00:33:25.000 Oh, this is getting mean.
00:33:26.000 Did you see that?
00:33:27.000 You should have seen the article she put up about me today.
00:33:29.000 Another one?
00:33:30.000 Well, yeah, just calling me a harasser, calling me all this stuff.
00:33:31.000 It's like, no, you're a dick, Anita.
00:33:33.000 You're just being a dick.
00:33:34.000 Did you find a video of her saying that there's no biological difference between the sexes and that men's strength?
00:33:39.000 It'll be buried in the middle of the video.
00:33:42.000 I really think that it was her now that I'm thinking about this.
00:33:45.000 I've looked at it.
00:33:46.000 Maybe she took it down or something because I remember seeing it.
00:33:48.000 And I wish I'd saved it.
00:33:48.000 Someone's got to save it.
00:33:50.000 Someone's got to save it.
00:33:50.000 Someone on the internet will be able to tweet at us or something.
00:33:53.000 But that is so hilarious because it's so contrary to biology, just actual science.
00:33:59.000 Everything she says is contrary to reality.
00:34:01.000 I mean, she'll sit there and she'll pick out like the game Hitman.
00:34:03.000 In her video, she said something like, you're incentivized to kill the woman and then stuff her body in a trunk.
00:34:10.000 And the footage she's using, which she didn't take herself, obviously, and didn't credit whoever took it.
00:34:15.000 In the footage she's using, you see, like the guy go up and kill the girl.
00:34:18.000 You see them lose points.
00:34:20.000 It's like, no, that's a disincentivization.
00:34:22.000 You know, they're not incentivizing you.
00:34:24.000 They're saying they're penalizing you for this, Anita.
00:34:27.000 I mean, I thought, my very first video was to Anita as well.
00:34:31.000 Your very first video.
00:34:32.000 Yeah, it was terrible.
00:34:32.000 In like July 2013.
00:34:34.000 And I...
00:34:38.000 You were inspired.
00:34:38.000 Yeah, yeah.
00:34:39.000 Well, kind of.
00:34:40.000 But there was a lot of things that kind of, you know, that were going on.
00:34:43.000 I was like, you know, I feel the need to say something.
00:34:47.000 But apart from the quality being terrible, I'm bewildered at her.
00:34:52.000 I listened to it the other day and I was just like, wow, I had no idea what was going on.
00:34:55.000 And I was just like, look, Anita, I think you're just wrong on this.
00:34:59.000 I don't think you understand.
00:35:00.000 Like, there is a lot to this that you've left out.
00:35:05.000 I should have said cherry-picked, but you know.
00:35:06.000 And that's something that's dogged her whole career because that's all she's ever done.
00:35:09.000 Her criticism is terrible and everyone knows it.
00:35:11.000 Nobody likes her because she's a divisive bitch.
00:35:14.000 And all she does is play the victim.
00:35:16.000 She's got nothing.
00:35:17.000 She has nothing.
00:35:18.000 And all she does is love her.
00:35:19.000 Well, let's not talk about her specifically because we're kind of beating her down here.
00:35:22.000 Well, the thing is, it's hard to think of anything genuinely good about her.
00:35:26.000 Let's not talk about her.
00:35:27.000 What I'm saying is about these ideas.
00:35:29.000 Instead of concentrating on her as a person.
00:35:31.000 They've been debunked a million times all over the internet by everyone and their mother at this point.
00:35:38.000 Well, what I want to get to, what's interesting to me, is like what causes them?
00:35:41.000 What causes these people to group up into these weird sort of echo chambers and start reiterating things that don't really work or make sense?
00:35:50.000 Like, there's no biological difference between men and women, especially when it comes to strength.
00:35:55.000 And then the tropes about video games.
00:35:58.000 And also, like, here's the other thing.
00:35:59.000 Like, when that Gamergate thing happened, it was really fascinating to me that everybody wanted to absolutely lump people in one or two categories.
00:36:09.000 And the two categories are people that are appalled to it, appalled by it, and that think that sexism is abhorrent and all this stuff is terrible.
00:36:17.000 And then the other people who supposedly, if you wanted some sexy images in video games, you had to be a piece of shit.
00:36:25.000 Oh, yeah.
00:36:26.000 It's the sex-negative feminists I was talking about.
00:36:29.000 But it got into this weird category where people are like, hey, we don't want you meddling with art.
00:36:35.000 And whether it's a Frank Frazetta painting.
00:36:37.000 Yeah, I mean, like, if you look at some of Frazetta's stuff with the girls with the bikinis and I'm not familiar, but famous fantasy artist.
00:36:46.000 Oh, well, are we talking about like the 80s artists where they're beautifully painted?
00:36:50.000 It's like these oiled barbarians.
00:36:53.000 It's a celebration of the human physique at this perfect zenith.
00:36:56.000 Yes.
00:36:56.000 Yeah, yeah, yeah.
00:36:57.000 It's all Robert E. Howard's books on Conan, and Frazetta was like the most famous of the... Yeah, yeah, yeah.
00:37:07.000 You know, like with giant tits and blood all over the place.
00:37:11.000 Yeah, it's like, you know, it's silliness.
00:37:13.000 It doesn't make you hate women.
00:37:15.000 Like, that's what's gross.
00:37:16.000 Like, just like looking at Fabio in the cover of a romance novel doesn't mean you hate men.
00:37:21.000 Yeah, right.
00:37:22.000 In fact, it seems you're kind of lionizing them.
00:37:24.000 Look, this is the perfect depiction of what the artist wanted a man to look like, a huge rippling hero carrying the maiden or whatever.
00:37:32.000 Right, but it becomes this weird thing when it's a man, people don't have a problem with it.
00:37:36.000 Like no one says that it demonizes the male figure and that you have set these unnatural standards, so you've fucked with the self-esteem of all these young men.
00:37:46.000 But you absolutely have in the same way you do it with women, like body shaming or, you know, having ridiculous standards for bodies for females.
00:37:53.000 That works with men too.
00:37:55.000 It's the same thing.
00:37:56.000 Yeah, of course it does.
00:37:57.000 But they don't care because feminists dichotomize the world as women being oppressed and men the oppressors, and it's okay to hate your oppressor.
00:38:05.000 That's the root of all feminist thoughts.
00:38:07.000 But that's why I want to talk about this in terms of the ideas because I think it's kind of the same thing as the right versus the left.
00:38:13.000 And I think it's all like a symptom of human psychology.
00:38:16.000 Right, the right versus the left is tribalism.
00:38:18.000 It is tribalism, but so is female versus male.
00:38:23.000 When you look at this blind allegiance towards anyone with a vagina, it's very similar to someone who has blind allegiance to anyone who's in a red state.
00:38:30.000 Yeah, I mean, that's true.
00:38:31.000 But the people in the red state, do they say that the blue states are oppressing them?
00:38:34.000 Oh, yeah, absolutely.
00:38:35.000 Oh, do they?
00:38:35.000 For sure.
00:38:36.000 Yeah, like I said, absolutely.
00:38:37.000 Oh, yeah.
00:38:38.000 They hate Obama.
00:38:39.000 He's a communist.
00:38:40.000 He came from Kenya.
00:38:41.000 He wants to ruin America.
00:38:43.000 He wants to kill it from the inside.
00:38:44.000 I mean, that was the big narrative forever.
00:38:46.000 Yeah, but is that them saying he's holding them down?
00:38:49.000 Well, there was always a thing that he was holding them down with the economy or that he was holding them down with Obamacare and he was ruining so many different factors.
00:38:58.000 It makes it America.
00:38:59.000 He hates America.
00:39:00.000 It doesn't sound exactly the same to me because the feminists are essentially what is called neo-Marxists.
00:39:07.000 And they take the sort of Marxist dichotomy of the economy with the bourgeoisie oppressing the proletariat through their mere existence and their wealth and apply it to men being the bourgeoisie and women being the proletariat.
00:39:17.000 Which is highly ironic given the evidently privileged status in society women hold.
00:39:21.000 Well, that's when it got weird when everybody kept repeating that fucking wage gap thing.
00:39:26.000 That was one of the most frustrating talking points.
00:39:29.000 Even Obama.
00:39:30.000 The goddamn president of the United States on TV reiterating something that he knows is not true.
00:39:35.000 That was very frustrating.
00:39:36.000 Because I just, I feel like human beings are inclined very naturally to form a group, to stay tribal.
00:39:44.000 And then I think it works with, I think that is a part of what's happening, whether it's neo-Marxism that's connected to it or what, there's something that's happening with feminism versus masculine people.
00:39:55.000 Yeah.
00:39:56.000 You know, but there's no like masculinist, so I used to have to say masculine people.
00:40:01.000 There's meninists, but that's silly, though.
00:40:04.000 Yeah, that's funny.
00:40:05.000 Yeah, there's a parody of feminism.
00:40:06.000 It's like, what would you have?
00:40:07.000 Meninism?
00:40:08.000 That sounds dumb, doesn't it?
00:40:09.000 And they're like, yeah, well, then that's how feminism sounds to us.
00:40:12.000 Well, it's the most amazing thing that it's taken so long for male feminists to die off.
00:40:18.000 Like that's finally rotting away after a few of them have been outed as being creeps.
00:40:23.000 I was going to say that.
00:40:24.000 Isn't it funny?
00:40:25.000 Thank the baby Jesus they got caught because like look man, there's nothing wrong with being male and just because you're a human with a dick doesn't mean you're a bad person.
00:40:33.000 Yeah, but they think there's something wrong with being male.
00:40:35.000 Well then you know what?
00:40:36.000 They only do because they want to fuck those girls and they don't have any other way to fuck them.
00:40:40.000 Well, I mean, that's probably the root of it, but ultimately... 100%. They're both... Ultimately, you see male feminists.
00:40:44.000 Yes.
00:40:45.000 Are they going to win any fights?
00:40:46.000 No.
00:40:48.000 Do they even need to shave?
00:40:50.000 Some of them do.
00:40:51.000 They have receding hairlines and fedoras and they're m'lady-ing all over the place.
00:40:55.000 Jordan Peterson's not the best.
00:40:57.000 You said they're sneaky.
00:40:58.000 No, they definitely are.
00:40:59.000 But they're afraid of men like you.
00:41:01.000 They're never going to compete with you.
00:41:02.000 Yeah, but they don't have to.
00:41:04.000 They compete with themselves.
00:41:06.000 They don't see it that way.
00:41:07.000 They get fucked by genetics.
00:41:09.000 Let me tell you, a lot of them got fucked by genetics, and so they're taking the only evolutionary strategy they've got.
00:41:09.000 Some of them are.
00:41:13.000 You can hardly blame them.
00:41:14.000 But what they need to do is get to the lab and figure out CRISPR.
00:41:18.000 Work on re-engineering yourself to look like Thor.
00:41:21.000 Get down the fucking gym.
00:41:22.000 That's what they need to do.
00:41:23.000 Seriously, if you're a male feminist, just shut up.
00:41:25.000 Stop talking about feminism on the internet.
00:41:27.000 Look at yourself.
00:41:28.000 Go down the gym.
00:41:28.000 Go on a diet.
00:41:29.000 And in a couple of years' time, you'll be a bit bigger.
00:41:31.000 You'll be a bit thinner.
00:41:32.000 And if you're five foot one and you weigh 80 pounds, you're fucked.
00:41:35.000 Nah, just keep eating sprouts.
00:41:38.000 I've got a friend.
00:41:39.000 He's quite a short guy.
00:41:40.000 Do you know what he did?
00:41:41.000 He did martial arts.
00:41:42.000 He's a nippy little guy, and he's quite strong.
00:41:45.000 Oh, nippy, yeah, yeah.
00:41:46.000 No, no, he can get around.
00:41:46.000 He can get around.
00:41:47.000 He's an amazing, we go like bouldering as well, climbing.
00:41:51.000 He's amazing at it.
00:41:52.000 You would not believe how unbelievable.
00:41:54.000 His girlfriend is so attractive.
00:41:56.000 I believe you.
00:41:57.000 And he's so self-confident.
00:41:58.000 This guy should host a seminar.
00:42:00.000 This is how I did it.
00:42:01.000 Yeah, yeah, he should.
00:42:02.000 You can too.
00:42:03.000 It's just self-improvement.
00:42:04.000 You work on yourself and then things work out for you.
00:42:07.000 No, I mean, I absolutely think that that's a good thing.
00:42:09.000 Absolutely.
00:42:10.000 But I understand the motivation why someone becomes a male feminist.
00:42:15.000 The word ally is one of my favorite.
00:42:17.000 Ally.
00:42:18.000 Ally.
00:42:18.000 I just love that they are now synonymous with creepy harasser.
00:42:22.000 Well, you kind of knew from the beginning that that was what was going on.
00:42:26.000 Everybody knew it.
00:42:27.000 Like, especially heterosexual men.
00:42:29.000 You're like, I know what you're doing, you fucking weasel.
00:42:33.000 You know, you're putting women above all else.
00:42:34.000 One guy I read his Twitter profile and he was talking about, he goes, oh, in his Twitter post, rather, he said, I'm going to stop calling myself a feminist until feminists decide that I'm doing feminism right.
00:42:46.000 That'll make women respect you.
00:42:47.000 But the fact that he put it out on Twitter, like you put that out, like this is just like this giant white flag.
00:42:53.000 Yeah.
00:42:55.000 But yeah, that's the thing.
00:42:56.000 You know, people in general, women being people, respect boundaries.
00:43:00.000 They respect people who will enforce their own boundaries.
00:43:02.000 This guy's literally said, hey, I have no boundaries.
00:43:03.000 You tell me what to do.
00:43:04.000 I'll be your slave.
00:43:05.000 No one's going to respect that.
00:43:06.000 I'll be your willing servant.
00:43:08.000 I don't give a shit.
00:43:08.000 I don't have respect for that.
00:43:09.000 You're a nice douchebag.
00:43:11.000 You don't have to say, I'm going to stop saying I'm a feminist until feminists tell me I'm doing feminism right.
00:43:17.000 There's a reason that they call them cucks.
00:43:19.000 What does that mean?
00:43:20.000 It's a good word.
00:43:21.000 It is.
00:43:21.000 That's a good word.
00:43:23.000 I didn't know what it meant.
00:43:24.000 I thought it always meant like a cuckold, like in a porn film.
00:43:28.000 I think it kind of did originally, but now it just kind of means someone who won't stand up for themselves.
00:43:32.000 Yeah.
00:43:32.000 And that's really good use for it.
00:43:34.000 Well, it's also not just someone who won't stand up for themselves, someone who's pandering.
00:43:38.000 Well, yeah.
00:43:39.000 Like you're pandering to the left.
00:43:41.000 You're trying to paint yourself out as being a person.
00:43:43.000 You're selling yourself out.
00:43:44.000 And hoping that the person you're selling yourself out to will treat you leniently.
00:43:48.000 See, so instead of talking about individuals, what I'm really fascinated about is the psychology behind what's going on now in this culture of free expression.
00:43:58.000 The fact that a guy like you can, not there's anything wrong with you, but I mean, a guy who's just a gentleman from England who doesn't have a connection to the media can, just by virtue of your ability to communicate and put out some YouTube videos and have some really good points, you can have this massive, massive following.
00:44:18.000 And these ideas that people are spreading back and forth, even the weird ones like Kekistan, the frog, and all that shit.
00:44:25.000 It's just a joke.
00:44:26.000 But it's interesting to me.
00:44:28.000 It's interesting to see what happens.
00:44:31.000 What I feel like is Fox News and CNN and even major networks, especially all the ones that are trying to portray narratives, all the ones that are talking about the way life is.
00:44:44.000 They seem to me to be like these huddled up executives at the top of like a giant pile of crocodiles snapping at them.
00:44:55.000 It's like they have completely lost control over what people talk about and discuss.
00:45:01.000 It's a spire of rock and the ocean's slowly eroding its way.
00:45:06.000 Well, just 20 years ago, there was nothing there.
00:45:08.000 There was no water.
00:45:09.000 There was no crocodiles.
00:45:11.000 The only information that got out.
00:45:13.000 Yeah, I mean, the only information that got out was through mass media.
00:45:17.000 Mass media was controlled by corporations.
00:45:19.000 And we felt like for the longest time, that was the only way.
00:45:23.000 And they still operate like that's the only way.
00:45:26.000 It's really quite fascinating.
00:45:28.000 Narrative hegemony.
00:45:29.000 Social media has demolished them.
00:45:31.000 And we're not going back.
00:45:32.000 No, it's not going back.
00:45:34.000 It's not going back.
00:45:35.000 And I think the only companies that truly understand that are internet-based companies that are trying to manage it, like whether it's YouTube or Google or even Netflix.
00:45:45.000 They understand it.
00:45:47.000 They're trying to manage it in some weird ways, though.
00:45:49.000 Like, I'm really bummed out about Netflix thumbs up versus thumbs down.
00:45:53.000 Like, what?
00:45:54.000 You know what you should have?
00:45:55.000 You should have a 100% thing.
00:45:57.000 So give them one to 100.
00:45:59.000 The more range you give someone, the better.
00:46:02.000 But up and down, it's just fucking silly.
00:46:04.000 I mean, if I were going to review things, I would do it recommended, not recommended.
00:46:08.000 And it would be quite subjective to my personal taste.
00:46:10.000 It would be obviously, look, if you've been following me for a while, you know what I like.
00:46:13.000 And I'd give my reasons and I'd try and explain it as accurately and concisely and as objectively as I could so that the largest sweep of people watching the video could understand why I felt the way I did.
00:46:24.000 And so at the end of it, I'd just go, would I recommend it?
00:46:26.000 Yeah.
00:46:27.000 And, you know, your mileage might vary.
00:46:28.000 Or yeah, you've got to watch this or something like that.
00:46:30.000 You know, I'd like it.
00:46:31.000 But I hate the star system.
00:46:33.000 It's like, well, I mean, you know, it makes it sound objective when it's not really.
00:46:37.000 Well, sometimes it's objective.
00:46:39.000 You know, I mean, it can be.
00:46:41.000 It's not perfect, though.
00:46:42.000 I agree with you.
00:46:42.000 Yeah.
00:46:43.000 Again, with a lot of this sort of stuff, it's about aesthetics and taste.
00:46:47.000 For sure.
00:46:48.000 I mean, I loved the movie Hardcore Henry.
00:46:50.000 I mean, I realized that I irrationally love this movie.
00:46:54.000 That was the single person.
00:46:55.000 Yeah, the first person.
00:46:56.000 The first person, yeah.
00:46:57.000 There was a lot about it that I've seen.
00:46:59.000 I never saw that.
00:47:00.000 Yeah, I'm sure it's like it's not for everyone.
00:47:00.000 Oh, you'll love it.
00:47:05.000 It's for the sort of guy who's got a lizard brain who likes violence.
00:47:08.000 Lizard brain?
00:47:09.000 That's what you think about me?
00:47:10.000 No.
00:47:11.000 You were an MMA fighter.
00:47:13.000 I'm not a fighter.
00:47:14.000 You were, weren't you?
00:47:15.000 I kickboxed.
00:47:16.000 I did Taekwondo, and there was no MMA when I was fighting.
00:47:19.000 Okay, but did you enjoy fighting other people?
00:47:22.000 That's a very complicated question.
00:47:23.000 Obviously, I did.
00:47:24.000 I'm not saying did you enjoy beating people up on the streets.
00:47:25.000 I did it for many years, so I...
00:47:33.000 I definitely enjoyed winning, but it is such a hardcore hammer.
00:47:38.000 It was a task.
00:47:39.000 You'll like it.
00:47:40.000 When you get past the sort of, maybe the beginning of it is more shaking than the rest of it, but when you get past the initial sort of, this is weird to look at, it's fast-paced, it's action-packed, and it definitely taps into that sort of lizard brainware.
00:47:55.000 It's the male desire to fight.
00:47:58.000 This guy, at the end of the film, he's literally in a fist brawl with like 30 other dudes, and it's the most insane.
00:48:06.000 It's all coming at him, and it's just bam, bam, bam.
00:48:09.000 Get stoned and watch it, get drunk and watch it.
00:48:11.000 It's just the best thing if you're into that sort of thing.
00:48:14.000 But if you're not into that, you won't like it at all.
00:48:17.000 Sometimes I'm into that.
00:48:17.000 You won't see it.
00:48:18.000 Like, I like John Wick.
00:48:20.000 Not always, you know, but like, you know, I like John Wick too.
00:48:20.000 Exactly.
00:48:23.000 How about that?
00:48:24.000 It wasn't even that good.
00:48:25.000 But it's for a certain kind of person in a certain kind of mood.
00:48:28.000 But when it hits right, it's I was just enjoying this.
00:48:36.000 This was great fun, you know.
00:48:38.000 But I appreciate it.
00:48:39.000 It was just like tapping into the primordial desire to defeat your enemies.
00:48:43.000 I just thought it was interesting that they shot it like a video game, like Quake, first-person shooter.
00:48:49.000 So this VidCon thing that you went to, who puts this together?
00:48:55.000 Hank and John Green.
00:48:57.000 And who are they?
00:48:58.000 The Vlog Brothers.
00:48:59.000 They're famous YouTubers.
00:49:01.000 And I find it very interesting.
00:49:03.000 When Anita was calling me a shithead and garbage human, that's a direct violation of their code of conduct.
00:49:09.000 And do they do anything about it?
00:49:11.000 No.
00:49:12.000 Because she's a feminist.
00:49:13.000 Because she's a specialist.
00:49:14.000 She's a woman.
00:49:15.000 Oh, because she gets special...
00:49:19.000 Why does she get special treatment?
00:49:20.000 Because at this point, she's basically become the avatar of feminism.
00:49:23.000 If she fails.
00:49:24.000 Isn't that real, though?
00:49:25.000 I swear to God, didn't that make sense?
00:49:28.000 No, no, everyone hates her.
00:49:29.000 Why do they still prop her up?
00:49:31.000 Because she's Anita Sarkeesian.
00:49:32.000 But she doesn't make any sense.
00:49:34.000 What do you mean?
00:49:36.000 There's a mythos that's built up around her.
00:49:39.000 That's a good word.
00:49:40.000 Yes.
00:49:42.000 And they've all propped her up.
00:49:44.000 They've all propped her up.
00:49:45.000 They've all said good things about her.
have all defended her, and if it turns out that she is in fact an abuser, which she is according to the VidCon code of conduct, which would... Oh, I can't remember it verbatim, but I read it on a video.
00:50:01.000 It's one of my latest videos.
00:50:02.000 It was just a quick message to Hank and John Green.
00:50:05.000 And I know that thousands of people emailed them saying, hey, look, she did violate your code of conduct from a position of institutional privilege at VidCon.
00:50:12.000 She was on the panel.
00:50:13.000 She had a microphone.
00:50:13.000 I didn't have a microphone.
00:50:14.000 She's calling me names.
00:50:15.000 She's accusing me of things that aren't true.
00:50:18.000 And she was doing that while you were in the audience.
00:50:20.000 Directly, looking me in the eye.
00:50:22.000 And it was like, okay, that's fine, but that's not right.
00:50:27.000 And they've done nothing.
00:50:28.000 They have done nothing.
00:50:30.000 They've said nothing.
00:50:31.000 Did you talk to her?
00:50:31.000 And then about it?
00:50:32.000 And then afterwards, all I got from her fans, and I mean like, you know, hundreds and hundreds of messages, and I've just retweeted them all night with exactly the same language she uses when she's displaying, oh, look at the nasty messages I'm getting of people calling me a garbage human, which is what she called me.
00:50:49.000 And so she has done absolutely nothing different.
00:50:52.000 And so I said, right, Hank and John, she's violated your code of conduct.
00:50:55.000 Her fans, she's incited a cyber mob of harassers exactly as she claims I do, but she has actually done this.
00:51:02.000 Then on the Saturday, she was due to be on a panel about cyber harassment.
00:51:10.000 See, but it can't be harassment when she's a woman and you're a man because you are the oppressor.
00:51:17.000 So it's the same thing as racism.
00:51:19.000 Like a black person being racist against a white person is impossible.
00:51:22.000 But the only reason to do this then, to take no action is if you are highly ideological.
00:51:29.000 And interestingly, so you feel like they're highly ideological.
00:51:33.000 Oh yeah, incredibly biased.
00:51:35.000 But why are they if they're YouTube stars?
00:51:37.000 I mean, aren't they like, don't they?
00:51:39.000 They're very progressive.
00:51:40.000 Very progressive.
00:51:41.000 But you're fairly progressive, too.
00:51:42.000 No, I'm a liberal.
00:51:43.000 What is the difference between a liberal and progressive?
00:51:43.000 I'm not progressive.
00:51:44.000 A liberal wants equality of opportunity; a progressive wants equality of outcome.
00:51:48.000 Equality of outcome.
00:51:49.000 Yeah, complete night and day.
00:51:51.000 The problem with liberals and progressives.
00:51:54.000 By whose definition is that?
00:51:56.000 I've always thought progressive means that you support gay rights, that you support everybody's rights.
00:52:02.000 That's equalizing the outcome of gays with non-gays.
00:52:06.000 They want gays to have the same rights as non-gays.
00:52:07.000 I mean, don't get me wrong, I want that too.
00:52:08.000 And this is what I was going to say.
00:52:09.000 There's a lot of overlapping magisteria when it comes to these things.
00:52:12.000 For example, another great word.
00:52:13.000 Magisteria?
00:52:14.000 Yeah, I know.
00:52:15.000 I've never heard that before.
00:52:16.000 I've never heard that.
00:52:17.000 They use that.
00:52:17.000 Areas of interest.
00:52:18.000 Things that they want to have.
00:52:19.000 Domains of the world.
00:52:20.000 It sounds so much better than areas of interest, though.
00:52:23.000 Write that down, Jamie.
00:52:24.000 Who's going to start using that one?
00:52:26.000 Magisteria.
00:52:27.000 I'm glad I've given you something.
00:52:29.000 So, yeah, basically, you know, a liberal, there's no real argument against gay rights from a liberal perspective, because liberals want universal rights.
00:52:36.000 Of course.
00:52:36.000 If some people can get married, gay should be able to get married.
00:52:38.000 What's the argument?
00:52:39.000 Oh, I'm Christian.
00:52:39.000 Get fucked.
00:52:40.000 Not an argument.
00:52:40.000 I'm an atheist.
00:52:42.000 Whereas the progressives, they want it because someone else has it.
00:52:45.000 Someone else has this.
00:52:46.000 Therefore, they want this because they want it.
00:52:48.000 And by whose definition?
00:52:49.000 By theirs or yours?
00:52:50.000 Are you defining them?
00:52:51.000 That's what they define it as.
00:52:54.000 Yeah, they're the ones saying all this.
00:52:55.000 I mean, like, affirmative action is the example of this.
00:52:59.000 Okay.
00:53:00.000 A liberal would never ask for affirmative action because in the liberal worldview, people should be free to pursue their own goals.
00:53:06.000 Now, I mean, liberals are usually very anti-racist.
00:53:10.000 That's not a category on which you would judge someone.
00:53:12.000 Same with sex.
00:53:13.000 I would never turn around and say, right, okay, well, your CV is really good.
00:53:16.000 I mean, you clearly know what you're doing, but you've got some tits, so get out.
00:53:20.000 That would be ridiculous.
00:53:21.000 That's not a reason to do that.
00:53:24.000 But the progressive will do that if you're a man.
00:53:26.000 The progressive will say, well, I'm sorry, we want equality.
00:53:30.000 And this is where we define equality differently.
00:53:33.000 I call it equality of opportunity.
00:53:34.000 They call it equality of outcome.
00:53:35.000 Because they'll turn around and say, well, you know what?
00:53:36.000 We've already got like 70% men.
00:53:38.000 So, I mean, your CV might be great, but you've got a dick.
00:53:41.000 So we can't take you.
00:53:42.000 We're going to take that unqualified woman instead because we need to get to parity because they want equality of outcome.
00:53:47.000 They want 50% men, 50% women.
00:53:49.000 I have a friend who was a bigwig at a big internet company, and they had to deal with that shit all the time, where they would have men who were far more qualified, who they were getting pressured to push out to make room for, like, they were specifically looking for a qualified black woman.
00:54:07.000 They were going out of their way because they wanted to give out the appearance of diversity.
00:54:11.000 Exactly.
00:54:12.000 But that's because they're also scared of criticism, right?
00:54:14.000 So they're scared of being attacked from progressive activists.
00:54:17.000 And by race pimps.
00:54:19.000 I mean, there are a lot of race pimps, which is a lot of people.
00:54:21.000 I like your term race pimp, actually.
00:54:23.000 Yeah, it's not mine.
00:54:24.000 They used it for Jesse Jackson back in the day.
00:54:26.000 That's a good term.
00:54:27.000 And he's been outed as a race pimp.
00:54:29.000 I mean, they've talked to people who've famously come to organizations and said, I'm going to give you diversity courses, and you're going to pay me a certain amount of money, or we're going to protect you.
00:54:41.000 That's a secular religion, isn't it?
00:54:43.000 And they're a new priesthood.
00:54:44.000 Well, people don't like being criticized, and they don't want it to hurt their business.
00:54:48.000 And especially if you're dealing with an internet business, internet-based businesses, you know, like Google, for example, Apple, classically liberal.
00:54:59.000 You would say progressive.
00:55:00.000 Progressive.
00:55:01.000 Yeah, they're not liberal at all.
00:55:02.000 That's the thing.
00:55:02.000 Very liberal.
00:55:03.000 In England, liberal actually means liberal.
00:55:06.000 In the United States, liberal means progressive.
00:55:09.000 Well, I hate all these fucking terms.
00:55:12.000 They're a nightmare.
00:55:13.000 It really is a nightmare.
00:55:14.000 It's taken me a long time to really be able to hammer out the ins and outs ideologically of it.
00:55:19.000 And so I can explain it quite efficiently now because it's difficult.
00:55:22.000 Well, what's really dark about something like progressive thinking, well, not progressive thinking, but rather affirmative action is that you're going to create a certain amount of racism by putting someone who's unqualified in a position just because of their color.
00:55:36.000 No, no, no, no.
00:55:37.000 It's racist.
00:55:39.000 You're being discriminatory.
00:55:40.000 On the basis of race to a white person.
00:55:43.000 That's racism.
00:55:44.000 But they are racist.
00:55:45.000 But you go to the deep root of it, and you've got to realize, well, there's some people that just, they're given a terrible hand of cards.
00:55:52.000 We've got to be able to figure out how to fix that hand of cards, how to make it a little bit more balanced in our cities.
00:55:59.000 How do you do that?
00:55:59.000 That's a good question.
00:56:00.000 That's one way to do it.
00:56:02.000 But that's where it gets really crazy.
00:56:03.000 Oh, I don't know about that.
00:56:04.000 Of course, doesn't it?
00:56:05.000 I don't know about that.
00:56:06.000 But if people that are born in these crime-ridden communities and they get no assistance whatsoever from the very government that's sending billions of dollars overseas to assist these other countries.
00:56:16.000 I'm not against helping them.
00:56:18.000 Absolutely not against helping them.
00:56:19.000 I mean, like, you know, I'm not saying, you know what, they can go hang.
00:56:23.000 Absolutely not.
00:56:24.000 But anything you do has to start with them saying, I want to change.
00:56:30.000 Sure.
00:56:30.000 And if they can't start, if that's not a premise they're going to start with, then you can do nothing for them.
00:56:36.000 If they're happy where they are and how they are right now, then you can't change anything.
00:56:39.000 All you'll do is throw good money after bad and they'll know it.
00:56:42.000 I mean, like, think about people who win the lottery.
00:56:44.000 Are they happy?
00:56:46.000 Some of them.
00:56:47.000 Mostly, though.
00:56:48.000 It ruins their lives, doesn't it?
00:56:49.000 That's what they say.
00:56:50.000 I feel like the people who write those articles are always sad they didn't win the lottery.
00:56:54.000 Maybe, but I think there's a lot of money.
00:56:56.000 There's probably a lot of dudes out there who win a lottery that are just straight balling, having a great time.
00:57:01.000 What you're given has no value.
00:57:02.000 It's only the things that you earn that have value.
00:57:03.000 I don't know if that's always true.
00:57:05.000 Most of the time it is.
00:57:06.000 I think it's 100%.
00:57:07.000 Maybe if there's like a certain amount of money, maybe you just got to ease into it.
00:57:12.000 Maybe it's one of those things.
00:57:13.000 Maybe if you win $100 million and you just pretend you only won $100,000, spend that.
00:57:19.000 But I don't see that.
00:57:20.000 Start slow.
00:57:21.000 Don't do Ferrari.
00:57:22.000 I'm not saying we don't do that.
00:57:23.000 You've got to help these people, but you have to begin from a position where they say, you know what, look, I want help.
00:57:28.000 Because if they're sat there going, I don't want to change, well, then they're not going to change.
00:57:33.000 Well, I understand that, but I mean, a lot of them don't even understand how they would go about changing, and they're constantly surrounded and reinforced by all these people around them that are in jail or committing crimes.
00:57:44.000 I speak to black YouTubers occasionally, and a lot of them are like, you know what?
00:57:48.000 They do blame a lot of stuff on the white man.
00:57:49.000 They say the white man wants to keep you down and stuff like that.
00:57:52.000 See, but that's not.
00:57:55.000 Okay, listen, if this was a white guy and he was like, the black man's holding me down, you'd be like, shut up.
00:58:00.000 Shut up.
00:58:01.000 You wouldn't accept it.
00:58:02.000 And if you're going to accept that from them, that's you being a racist.
00:58:05.000 Treat them like you would treat white people.
00:58:06.000 If this was white people sitting around going, I'm not going to do anything.
00:58:09.000 I'm going to join a gang.
00:58:10.000 I think the black man or the Jews are trying to hold me down or something.
00:58:13.000 Yeah, you didn't let me finish.
00:58:14.000 I wasn't saying that white people are, in fact, holding black people down.
00:58:18.000 I was saying that idea of the white man like they have this mandate.
00:58:21.000 We've got to hold the black people down.
00:58:23.000 That's all right.
00:58:24.000 It's a standard argument.
00:58:25.000 I don't think there's a coalition of white people.
00:58:28.000 No, no, no, no, no, no, no.
00:58:29.000 I mean, it's a voice in their head.
00:58:30.000 Yeah, it's a sad thought perception.
00:58:32.000 I think that there absolutely was certain times.
00:58:36.000 I mean, there's absolutely neighborhoods that black people were not allowed to buy real estate.
00:58:39.000 There's absolutely systematic racism.
00:58:41.000 And there are still a lot of racists around.
00:58:43.000 I'm not saying it's not justified.
00:58:44.000 That's the thing.
00:58:46.000 I'm not saying it's not justified.
00:58:47.000 And I think that's where the argument goes off the rails because they go, what do you mean?
00:58:52.000 All this stuff's happening?
00:58:52.000 Yeah, it has happened.
00:58:53.000 And there are racists, but you don't have to let that hold you back.
00:58:57.000 No, you definitely don't.
00:58:58.000 That is up to them.
00:59:00.000 Yeah, I think they need examples.
00:59:02.000 I think more examples and more role models in that direction.
00:59:04.000 I've actually, I've seen, I can't remember the guy's name, but I remember seeing this black doctor doing a speaking tour at...
00:59:13.000 I don't know what the difference is there.
00:59:15.000 Chiropractors aren't really doctors.
00:59:18.000 Just found that out recently.
00:59:19.000 No, it's real.
00:59:20.000 They don't go to medical school.
00:59:22.000 Yeah.
00:59:23.000 Isn't that fucking crazy?
00:59:24.000 I didn't know about that until she was.
00:59:26.000 I'm having her on the podcast.
00:59:28.000 The woman who wrote that article, dude, they kill people.
00:59:31.000 People have died from that.
00:59:32.000 They've died from that.
00:59:33.000 There was some recent thing that Redband sent me about some Playboy Playmate who died because the doctor, the chiropractor adjusted her neck and she fucking died.
00:59:42.000 Fuck.
00:59:43.000 People die every year.
00:59:44.000 They do it to babies, man.
00:59:45.000 They adjust babies.
00:59:47.000 There's people that do like chiropractic adjustments on babies.
00:59:50.000 Their bones aren't even fully formed.
00:59:51.000 Their bones are soft.
00:59:53.000 And these assholes are manipulating them.
00:59:56.000 I'm really glad you told me that because I got two.
00:59:57.000 How do you know?
00:59:58.000 Two and a half-year-old son.
00:59:59.000 Nothing wrong with that.
00:59:59.000 I mean, look at that.
01:00:00.000 Playboy model Katie May's death caused by chiropractor, autopsy finds.
01:00:04.000 Scroll up so I could read that fucking, this nonsense.
01:00:07.000 I thought they were doctors.
01:00:09.000 I thought when a doctor tells you, when you say Dr. Mike, I thought it was a doctor.
01:00:13.000 I thought he went to medical school and learned how to be a chiropractor.
01:00:16.000 No.
01:00:16.000 I can't believe you're allowed to do that.
01:00:18.000 Not only that.
01:00:19.000 Settle down.
01:00:20.000 The guy who invented it was a magnetic healer.
01:00:20.000 Ready for this?
01:00:23.000 Okay, the guy who invented chiropractic.
01:00:25.000 There's all this like subluxation.
01:00:27.000 There's all these words they use.
01:00:28.000 Did he have clear?
01:00:29.000 Yes.
01:00:29.000 They're in total.
01:00:30.000 Entirely fake.
01:00:31.000 Not only that, the guy who invented it very possibly was murdered by his son.
01:00:36.000 His son ran him over for sure.
01:00:36.000 That's a suspicion.
01:00:38.000 And they think he might have done it on purpose.
01:00:40.000 And then his son ran him over with a car and then took over the business.
01:00:43.000 And his son was a totally shady piece of shit.
01:00:46.000 Well, clearly.
01:00:46.000 A fraud.
01:00:46.000 He ran over his own father.
01:00:47.000 Well, it could have been an accident, a tragic accident, and a love dad.
01:00:50.000 But the allegation is that he ran over his dad to take over the family business.
01:00:56.000 It is not real at all.
01:00:57.000 It is not a real science.
01:00:58.000 There's no science to support it.
01:01:02.000 But the crazy thing is, it's so accepted that insurance pays for this shit.
01:01:06.000 She had some clotting and went to the hospital where they tried to do some procedures, but she passed away.
01:01:13.000 So this guy was adjusting her neck and they tore her vertebral, her vertebral artery.
01:01:20.000 Did he have like a head lock or something?
01:01:22.000 No, man.
01:01:22.000 When you're adjusting you, you got to understand there's bones in there and there's violent jutting.
01:01:27.000 If you're a tender person or if you have some sort of vulnerability, yeah, a model.
01:01:33.000 They tore her fucking article.
01:01:35.000 Her artery, rather, not her article.
01:01:37.000 And subsequently cut off blood flow to her brain.
01:01:40.000 Her death was ruled accidental.
01:01:43.000 What in the fuck?
01:01:45.000 Can we get these people stopped somehow?
01:01:47.000 But how do you call yourself a doctor?
01:01:49.000 This is what confuses the shit out of me.
01:01:51.000 All these guys that have called themselves doctor, when I talked to them, I thought they were an actual fucking doctor.
01:01:57.000 Like I went to a place like this is Dr. Pete.
01:01:59.000 Dr. Pete's going to work on you.
01:02:01.000 Dr. Pete's not a fucking doctor.
01:02:03.000 Like it's 100% fake.
01:02:06.000 What is the article?
01:02:07.000 Pull up that article.
01:02:08.000 Like chiropractors are bullshit.
01:02:10.000 Because I want to give that lady props.
01:02:11.000 She's going to come on my podcast.
01:02:13.000 Oh, awesome.
01:02:14.000 She calls herself the science babe.
01:02:16.000 Yvette, the science babe.
01:02:17.000 I don't know who.
01:02:18.000 Do you know who she is?
01:02:19.000 The SciBabe.
01:02:20.000 She sounds awesome, though.
01:02:21.000 Yes.
01:02:21.000 Pro-science?
01:02:22.000 I'm very excited.
01:02:23.000 I'm very excited to have her on.
01:02:25.000 The article was actually synchronicity, because it was a conversation that I was having with Steve Rinella, who's a friend of mine, a couple of weeks ago, when he was talking about how his brother's got some issue with atrophy of his arm.
01:02:39.000 He's been going to a chiropractor.
01:02:40.000 And I said, I think chiropractors are bullshit.
01:02:42.000 And so after the podcast, I smoked a joint and I was thinking to myself, was I being too hard on chiropractors?
01:02:48.000 Was I being a dick?
01:02:49.000 So then I started researching chiropractors and I found all this crazy shit about the fucking magnetic healers and all this stuff.
01:02:55.000 Here's the article.
01:02:56.000 Chiropractors are bullshit.
01:02:58.000 You shouldn't trust them with your spine or any other part of your body.
01:03:01.000 And what is her name that she wrote this?
01:03:03.000 So we can give her some.
01:03:04.000 How do you go?
01:03:07.000 D'Entremont.
01:03:09.000 Entremont.
01:03:10.000 It's D. Apostrophe.
01:03:12.000 A capital.
01:03:13.000 Lowercase D. What is this bullshit non-American spelling?
01:03:17.000 Yeah.
01:03:17.000 Ew.
01:03:18.000 Or possibly Canadian.
01:03:19.000 Quebec.
01:03:19.000 Yeah.
01:03:21.000 Lowercase D, apostrophe, large capital E.
01:03:25.000 Don't worry.
01:03:26.000 I'm in favor of banning French as well.
01:03:28.000 I like French people.
01:03:29.000 No, no, the language.
01:03:30.000 Not the French, but we could ban the French.
01:03:30.000 I like their food, too.
01:03:31.000 The language sounds good if there's something.
01:03:32.000 There's one acceptable form of bigotry in England that's against the French, so fuck the French.
01:03:35.000 Oh, really?
01:03:36.000 You're allowed to?
01:03:37.000 Yeah.
01:03:37.000 How come you can do that?
01:03:38.000 Because we lost the Hundred Years' War.
01:03:40.000 Oh, so it's an England thing.
01:03:42.000 Yeah, we spent 100 years kicking the shit out of the French, and then God intervened to save them.
01:03:49.000 Yeah.
01:03:50.000 France is having a hard time.
01:03:51.000 Let me finish this talk about the doctor though.
01:03:53.000 Basically, like, I know that there are probably going to be a lot of black people who are sat there going, well, holy shit.
01:03:58.000 My whole life I've been like, I know that the white man's keeping me down and stuff like this, but this isn't all coming from me.
01:04:04.000 That's the thing.
01:04:04.000 I'm not the fucking expert here.
01:04:06.000 This is coming.
01:04:07.000 I wish I could remember this guy's name.
01:04:08.000 It was like a couple of years ago that I watched this guy doing this speaking tour.
01:04:11.000 And he was a black doctor, but he's like a PhD in, I don't know, business studies or something like that.
01:04:17.000 It wasn't like, you know, he was not a chiropractor.
01:04:20.000 But I was absolutely loving his speech to like a bunch of kids in school because he was just saying, look, if you drop out, you won't get to where you want to go.
01:04:29.000 I mean, you've got to look at it, just look at it sensibly.
01:04:31.000 Just look at it through an Aristotelian lens.
01:04:34.000 You are what you do.
01:04:35.000 Okay, where do you want to end up?
01:04:37.000 You want to end up as a scientist?
01:04:38.000 Okay, well, if you drop out of school, that's that goal off the table.
01:04:42.000 You can't become a scientist by dropping out of school.
01:04:43.000 So no matter how much you hate school, you just got to fucking stick it out.
01:04:47.000 You've got to gut it out.
01:04:48.000 If you want to do something that's involved in school.
01:04:50.000 If you want to be a scientist, you plot it out and go, right, okay, so I've got to get to college, I've got to get to university, get a PhD, then I've got to make sure that I've done all the right courses to get into the right field.
01:04:59.000 You've got to plot this out.
01:05:00.000 And if you drop out at any point, well, there we go.
01:05:03.000 you screwed your life plan there.
01:05:04.000 So you're going to have to figure out something to do.
01:05:06.000 And I know because I did exactly the same thing.
01:05:08.000 There's just no getting around it.
01:05:10.000 No one can give that to you.
01:05:11.000 You've got to give it to yourself.
01:05:12.000 That's what empowerment is.
01:05:14.000 If you want to be empowered, you've just got to go, you know what?
01:05:16.000 Let's just assume the white man hates you.
01:05:18.000 Let's just assume he does.
01:05:20.000 Just assume we're all fucking racist.
01:05:20.000 Every white man.
01:05:22.000 Why would you assume that though?
01:05:23.000 Because this is the worst case scenario.
01:05:25.000 This is worst case scenario.
01:05:26.000 Worst case scenario.
01:05:26.000 Let's think about it.
01:05:27.000 Every white person's racist.
01:05:28.000 They want to hold you back.
01:05:29.000 But legally, they can't.
01:05:31.000 So you might be thinking, right, okay, there's going to be some bigotry.
01:05:34.000 So what are you going to do?
01:05:35.000 You're going to work harder.
01:05:36.000 That's the only way you're going to do it.
01:05:38.000 If you want to get to where you're going to go, you've got to work.
01:05:40.000 But then you're living in a paranoid world where you're not going to be able to do that.
01:05:43.000 Every white person hates you.
01:05:46.000 No, no, no, hang on.
01:05:47.000 I'm not suggesting you should actually do that.
01:05:49.000 But why don't you bring it up there?
01:05:50.000 Because that's the worst case scenario for them.
01:05:52.000 That's what they'll say.
01:05:53.000 The white man's trying to hold me back.
01:05:54.000 Okay, let's say he is.
01:05:55.000 Let's just take that as a premise, right?
01:05:57.000 Okay, but does he actually, you know, he's not going to, I mean, they do actually have affirmative action for your university and stuff.
01:06:02.000 So even if we still assume that this is somehow him being a racist and hating black people, you can still go, well, I can still take advantage of that.
01:06:09.000 If what I need is these grades and these subjects, go and get those grades and those subjects.
01:06:14.000 In fact, you might think, well, Christ, I might actually go for the grade up just in case the white man tries to screw me even harder.
01:06:19.000 You know, I'm going to work even harder.
01:06:21.000 And you think, well, Christ, that means you've got to work hard.
01:06:22.000 Yeah, you've got to work hard.
01:06:23.000 I love the term the white man as if they're unified.
01:06:25.000 The white man.
01:06:26.000 The white man.
01:06:27.000 They're not unified in anything other than keeping black people down.
01:06:29.000 Yeah, yeah.
01:06:30.000 They're going to keep a brother down.
01:06:32.000 Why, except for when he gets to the presidency?
01:06:34.000 They clearly are in dispute over everything.
01:06:37.000 Yeah, yeah.
01:06:38.000 Like everything from top to bottom.
01:06:40.000 Is it secret?
01:06:42.000 The new center is the new thing that keeps getting banned.
01:06:45.000 Oh, the woke centrists.
01:06:46.000 The centrists, the new center.
01:06:49.000 The extremes are cancer, and we should just ignore them.
01:06:51.000 Yeah, well, the center sounds ridiculous, too.
01:06:55.000 It all sounds ridiculous.
01:06:56.000 But ask yourself, would you rather talk to Ben Shapiro or Richard Spencer?
01:06:56.000 I know.
01:07:00.000 Ben Shapiro.
01:07:01.000 Would you rather talk to me or Anita Sarkeesian?
01:07:01.000 Exactly.
01:07:03.000 Well, maybe not Anita Saris.
01:07:04.000 I'd like to talk to her.
01:07:05.000 Okay, she's a bad example.
01:07:06.000 Me or someone who says that all white people are oppressing all black people.
01:07:11.000 I always want to talk to someone who has these extreme beliefs.
01:07:16.000 I believe she does.
01:07:16.000 I don't need Franks.
01:07:17.000 Just because I want to know what goes on in her head.
01:07:21.000 I want to know why she thinks that there's something wrong with being male to the point where every male is toxic.
01:07:28.000 I mean, I'm talking about political candidates.
01:07:32.000 Would you rather have two people who are relatively moderate, relatively close to the center?
01:07:35.000 Oh, you're talking about political candidates now?
01:07:37.000 Yeah, yeah, sorry.
01:07:38.000 I should have specified that.
01:07:38.000 How'd you get there?
01:07:40.000 Well, because that's the center is the political sphere.
01:07:43.000 Oh, is it?
01:07:44.000 I see.
01:07:45.000 I feel like it's an ideology more than it is a political thing.
01:07:48.000 Well, it's how you live your life more than how you vote.
01:07:50.000 Yeah, it's not like an ideology in and of itself.
01:07:53.000 You've got like center-left, center-right.
01:07:54.000 got similar but different ideological traditions, and, I mean, it's basically there... Because you're disempowering yourself.
01:08:05.000 Except when it clearly is someone else's fault.
01:08:07.000 Sometimes it is, yeah.
01:08:08.000 Sometimes it is.
01:08:09.000 But most of the time it's not.
01:08:10.000 Most of the time it's your own fault.
01:08:11.000 Yeah.
01:08:12.000 Like the victim mentality that you were talking about when we're talking about like hardcore feminists that are constantly blaming men for things.
01:08:18.000 But the problem is like, what is it like being a chick?
01:08:21.000 It's probably fucking terrible.
01:08:22.000 It's probably not always.
01:08:24.000 I mean, that's really misogynist.
01:08:26.000 Wait, you didn't let me finish, you fuck.
01:08:28.000 What I'm saying is when women are constantly worried about being victims, right?
01:08:34.000 Like women are worried, yeah.
01:08:36.000 Yeah, I've talked to some that talk about like driving home.
01:08:36.000 Are you sure?
01:08:40.000 They get nervous when they're walking into certain areas at night in a way that a man doesn't.
01:08:45.000 This is interesting because that's the same opinion that Saudi Arabians have of the West.
01:08:49.000 How so?
01:08:49.000 What do you mean?
01:08:49.000 You've given me the feminist narrative that all women are constantly afraid of men.
01:08:52.000 No, I think a lot of women are, though.
01:08:54.000 Yeah, no, no, but I'm not saying they're not.
01:08:56.000 But like, you know, when you say, right, the overwhelming majority of women are afraid of men, that's a feminist narrative.
01:09:02.000 That's also the Wahhabi narrative from Saudi Arabia.
01:09:06.000 And that's why they have to go, well, that means that our women aren't allowed out.
01:09:08.000 They've got to wear burqas.
01:09:10.000 They've got to effectively become invisible.
01:09:11.000 So men aren't going to prey on them.
01:09:13.000 Well, they have a very ancient culture.
01:09:15.000 And in that culture, women are treated ridiculously different.
01:09:18.000 What do you mean they don't?
01:09:19.000 Wahhabism arrived in the 18th century with Abdul Wahhab, a little after 1700.
01:09:25.000 It's 1700.
01:09:26.000 That's pretty ancient.
01:09:27.000 To an American.
01:09:28.000 Yeah.
01:09:29.000 Not to anyone.
01:09:29.000 I'm American, man.
01:09:30.000 Okay, but I'm not.
01:09:31.000 It's an ancient culture.
01:09:32.000 It's not.
01:09:33.000 My point is.
01:09:33.000 It's not ancient at all.
01:09:35.000 It's old as fuck.
01:09:36.000 It's not as old as your country.
01:09:36.000 My point is.
01:09:37.000 Okay.
01:09:38.000 It's 1776.
01:09:38.000 The point.
01:09:40.000 Yeah, exactly.
01:09:41.000 No, no, this is like 1780 or something like that.
01:09:42.000 But even though it's not a problem, it's older by four years.
01:09:45.000 This country's older by four years.
01:09:47.000 Your country existed before 1776.
01:09:48.000 It just wasn't.
01:09:49.000 Sort of.
01:09:49.000 No, America existed.
01:09:50.000 It's the same intellectuals.
01:09:51.000 There were some people here, a bunch of fucking campers.
01:09:53.000 Hundreds of years before.
01:09:55.000 Like the people that walked the Appalachian Trail.
01:09:57.000 But this is way newer and way more cancerous as well.
01:09:57.000 Yeah, absolutely.
01:10:01.000 There's a lot of great stuff about... The point being, there is a very different classification of women versus men in Wahhabism, right?
01:10:10.000 It's the same theory behind it.
01:10:11.000 They say exactly the same words.
01:10:13.000 What do you mean?
01:10:14.000 Exactly exactly.
01:10:15.000 Women are afraid of men.
01:10:16.000 Yeah, but some women are.
01:10:17.000 I mean, like, in certain circumstances.
01:10:19.000 What I'm trying to get at is that the way women think about men is very different than the way men think about women.
01:10:25.000 A man does not walk through a group of women and worry about being raped.
01:10:29.000 A man does not walk through the streets at night and worry some woman's going to come out of the darkness and grab his ass.
01:10:35.000 But a woman doesn't walk into a courthouse, a divorce proceeding, and think, I'm going to lose everything.
01:10:39.000 That's true.
01:10:40.000 That's true.
01:10:40.000 Unless you're Roseanne and you're dating Tom Arnold, right?
01:10:44.000 Boom.
01:10:44.000 She's still paying that due to the child.
01:10:46.000 Again, that's one for the team.
01:10:48.000 Exception.
01:10:48.000 Exception that proves the rule there.
01:10:50.000 90% of the time, the woman, I mean, 90%, I mean, you hear horrific stories.
01:10:54.000 talk to some of the MRAs, you get some horror stories where, I mean, I know you're using all those terms.
01:11:01.000 I'm immersed in it all day.
01:11:03.000 But people aren't.
01:11:05.000 People that are listening to this, a lot of them are not.
01:11:07.000 Get your ass off the sofa, stop watching TV.
01:11:09.000 Or go online and just be immersed in that world.
01:11:12.000 No, go outside.
01:11:14.000 You don't have to be an MRA.
01:11:14.000 No, why?
01:11:15.000 You don't have to be a message.
01:11:16.000 What do you mean?
01:11:17.000 I spent thousands of pounds to come here to tell you what an MRA is.
01:11:21.000 No, but the thing is, this is the thing, right?
01:11:22.000 This is, this is, and I say this with the greatest respect, from the more venerable generations than mine who are more used to the television.
01:11:30.000 People don't watch TV anymore, and they won't go back to it.
01:11:33.000 Yeah, I think you're right.
01:11:34.000 I think it's dying.
01:11:35.000 We've got the statistics to prove that.
01:11:36.000 I've done videos on it.
01:11:37.000 We've got all the sources and the stats.
01:11:39.000 It's only the older generations.
01:11:40.000 I mean, like in Britain, all of the TV channels, Channel 4 was the young, hip, cool TV channel when I was young.
01:11:45.000 The average age of the viewer now is 44.
01:11:48.000 But once they found out that old dude had been fucking kids for all those years, Jimmy Saville.
01:11:52.000 Savile.
01:11:53.000 Yeah.
01:11:54.000 I think that lost a lot of faith.
01:11:56.000 That's not the reason.
01:11:57.000 There's TV watching.
01:11:58.000 That's not the reason.
01:12:00.000 Honestly, it's not the reason.
01:12:01.000 Jesus Christ.
01:12:04.000 I've been dealing with some people who are fucking idiots.
01:12:06.000 You know, and I've had to be really, really specific.
01:12:06.000 I get it.
01:12:10.000 There is a group of people online who know that I'm right, and they don't like that I'm right.
01:12:14.000 They don't like that you're right?
01:12:15.000 Oh, no.
01:12:15.000 They hate that I'm right.
01:12:16.000 Are we talking about Anita Sarkeesian again?
01:12:18.000 Her fans, they hate me.
01:12:21.000 And I don't want to give them any more ammunition than her.
01:12:23.000 That's so bad that she's not defending herself or she doesn't have the opportunity to.
01:12:28.000 When she called me a piece of shit and a garbage human, I had said nothing up until that point.
01:12:32.000 So I yelled back, I just want to talk.
01:12:34.000 And she was like, oh, whatever, dude.
01:12:36.000 And it's just like, I came all this way to hear what you had to say, and all you could do is insult me.
01:12:40.000 That was the one interaction that you've had with that.
01:12:42.000 I've got to rehash this.
01:12:43.000 I know that.
01:12:44.000 Jesus Christ.
01:12:46.000 I've been personally slighted, man.
01:12:48.000 I get it.
01:12:48.000 I get it.
01:12:52.000 There's a narrative going around among the progressive journalists, which has already been debunked by an actual journalist called Tim Pool and by the video evidence of the event that I personally took, that I stormed the stage and started screaming abuse at her.
01:13:06.000 They're saying that.
01:13:07.000 Probably who is saying that?
01:13:08.000 Journalists.
01:13:10.000 Actual journalists?
01:13:11.000 Pull up of an actual article.
01:13:12.000 Tell me an actual article that said that.
01:13:14.000 There probably aren't any yet, but I bet you tomorrow, or going forward, there will be articles about me, and some of them will say something like I abused her or she is some victim.
01:13:24.000 Do you not understand the irony?
01:13:24.000 Pause.
01:13:26.000 We're talking about people labeling themselves as victims and you are labeling yourself as victim of a crime that has not occurred.
01:13:26.000 What?
01:13:33.000 No, I don't know.
01:13:34.000 You're talking about articles that are being written by journalists and videos that are being made, but they have not been made yet.
01:13:39.000 And then you're saying, just watch, they are going to be made.
01:13:42.000 Yeah, but that's because I've been down this road before.
01:13:44.000 Okay, do you not see?
01:13:45.000 Come on, you've got to be able to see that.
01:13:47.000 Well, it's not that I mean, like, I'm not, like, a victim.
01:13:50.000 This is something that happened to me.
01:13:52.000 Right, but do you get that?
01:13:53.000 What we're just talking about right here?
01:13:53.000 Oh, yeah, yeah.
01:13:55.000 I asked you who is saying that you were stormed, that you stormed the stage.
01:13:59.000 You tell me, journalists, and I say, where are these articles?
01:14:02.000 And you say, just wait.
01:14:03.000 I'm going to watch.
01:14:04.000 At the moment, we're talking about rumors.
01:14:06.000 For example, like Tim Pool said that a friend of his, I think she's an actress, but it's the same sort of LA circles.
01:14:13.000 That's someone gossiping.
01:14:15.000 That doesn't mean that someone actually a journalist wrote an article about storming this.
01:14:20.000 But why do you think that's what you're doing?
01:14:22.000 Why would you social media?
01:14:23.000 Why would you even concentrate on that if it has not occurred?
01:14:27.000 I mean, isn't this sort of like a black guy complaining about the man holding you back when there is no man holding you back?
01:14:32.000 No, I don't think it's analogous.
01:14:34.000 I mean, it is.
01:14:35.000 You're talking about being a victim of a crime that hasn't occurred.
01:14:38.000 No, but I know what these people are like.
01:14:40.000 I've been dealing with them for long enough now.
01:14:42.000 I know what those white people are like, man.
01:14:44.000 I've been dealing with them my whole life.
01:14:45.000 Yeah, but we actually have a precedent for them doing this as well.
01:14:48.000 Right, but they're not doing it right now, and you're saying they are.
01:14:50.000 They are.
01:14:50.000 They're on social media right now.
01:14:52.000 I mean, you've got journalists like Ian Miles Cheong, who said, right, so the prevailing narrative among journalists at the moment is that Sargon stormed the stage and abused Anita Sarkeesian, when the video evidence shows the exact opposite.
01:15:04.000 Yeah, he posted on Twitter.
01:15:05.000 So there's no articles, but there is a Twitter.
01:15:07.000 You can tell that it's coming up.
01:15:08.000 They're going to be... No, no, well, it's brewing.
01:15:11.000 I mean, this is how these rumors start, and then someone starts reporting it, and then suddenly you get a narrative going on.
01:15:17.000 I mean, and I'm not the victim of this, because I've got a large enough reach to be able to say... Let's switch gears now.
01:15:21.000 Because I think people listening to this who have no idea who the fuck Anita Sarkeesian is and don't even know who you are, there's a lot of people listening to this.
01:15:27.000 I would like to get to more interesting things.
01:15:31.000 Okay?
01:15:31.000 Because I think you're an interesting guy.
01:15:33.000 Oh, thank you.
01:15:34.000 I think you're a little wrapped up in this right now, which is okay.
01:15:37.000 I understand very personal.
01:15:39.000 I'll tell you what, it was.
01:15:40.000 But you're a very smart guy.
01:15:42.000 Your videos are excellent.
01:15:44.000 Your podcasts are great.
01:15:45.000 Thank you.
01:15:46.000 I can't wait to tell Russ. Who's Russ?
01:15:49.000 Your friend?
01:15:50.000 Yeah, he's a huge fan of yours.
01:15:51.000 What's up, Russ?
01:15:52.000 Huge fan of yours.
01:15:53.000 I can't wait to see him.
01:15:55.000 Do you, like, now that you have this gigantic audience, do you feel a certain amount of responsibility?
01:16:00.000 Oh, yeah.
01:16:00.000 I always have.
01:16:01.000 Intellectual integrity is the most important thing.
01:16:03.000 And the great thing about this, and the great thing about the way the internet works, is that, I mean, look at the people who are at the top.
01:16:08.000 Look at someone like PewDiePie.
01:16:11.000 Man of outstanding moral character.
01:16:13.000 Is he?
01:16:13.000 Do you know him?
01:16:14.000 Do you hang with him?
01:16:15.000 I've spoken to him a few times, but I don't need to hang with him to know.
01:16:17.000 Look at what he does.
01:16:18.000 He doesn't go around bullying people.
01:16:19.000 He doesn't go around lying about people.
01:16:22.000 He goes around doing charity and being like, hey, bros, you know, and being nice to people and being inspirational.
01:16:26.000 What happened with him?
01:16:27.000 Can you?
01:16:28.000 You're not framed.
01:16:28.000 Okay, but let's go over the actual...
01:16:49.000 you could have ads on that YouTube video, and your YouTube videos get a lot of downloads, man.
01:16:55.000 I mean, it would be very lucrative for you.
01:16:57.000 And then with us, our YouTube money was cut by more than 60%, I think.
01:17:05.000 Yeah, yeah.
01:17:05.000 It was something crazy.
01:17:06.000 To retain 40% of your income was pretty good as well.
01:17:09.000 Yeah.
01:17:10.000 Well, anything controversial, anything having to do with anything right-wing is deemed not friendly for advertising.
01:17:21.000 Anything that has drugs in the title is deemed not friendly for advertising.
01:17:26.000 Like, we've literally been able to change the title of things and re-upload them, and then they accept the ads.
01:17:34.000 Yeah, so the same content.
01:17:36.000 So what all this came from is PewDiePie, who has an enormous YouTube presence, right?
01:17:42.000 Doesn't he have like 50 million followers?
01:17:44.000 Holy shit.
01:17:46.000 And he made some videos that the, what is it, the Washington Post?
01:17:51.000 Is that what it was?
01:17:52.000 The Wall Street Journal.
01:17:53.000 Yeah.
01:17:53.000 Wall Street Journal.
01:17:54.000 Please do.
01:17:54.000 Can I tell you about it?
01:17:55.000 Please do, because you, I'm sure, are much more fun.
01:17:58.000 I did a bunch of videos on this, and it was the funniest thing in the world.
01:18:01.000 I love the language they use as well.
01:18:03.000 What's a scoop?
01:18:04.000 A scoop is a news story that other people are not aware of yet.
01:18:08.000 And you got it first.
01:18:10.000 So if tens of millions of people have seen something, and then if I'm a Wall Street journalist and I come along and I find something that tens of millions of people have seen, can I rightly call that a scoop?
01:18:22.000 Well, you can definitely call it a news story.
01:18:24.000 And you can definitely say that people in the...
01:18:34.000 Like, what is a real network?
01:18:37.000 What is a real newspaper?
01:18:38.000 What is real journalism?
01:18:40.000 Like, in their mind, I'm sure the Wall Street Journal considers themselves to be in the hierarchy versus someone who has.
01:18:48.000 But they don't have 55 million downloads.
01:18:50.000 They don't.
01:18:51.000 I mean, there's no way.
01:18:53.000 The Wall Street Journal puts out a video.
01:18:56.000 Yeah, if they put out a video, if they got a million hits on their video, that would be a big deal.
01:19:00.000 Oh, yeah, they'd be thrilled.
01:19:01.000 If PewDiePie had only a million hits on one of these videos, he would be super depressed.
01:19:06.000 He is a much bigger entity than them.
01:19:08.000 But in their mind, like a scoop is them, the legitimate media.
01:19:14.000 So pointing out something in the illegitimate, this new weird video blogosphere.
01:19:22.000 So basically, PewDiePie, he's a comedian and he makes jokes.
01:19:25.000 But he's a, like I said, he's a man of good character.
01:19:27.000 And parents should be thanking PewDiePie for being the person he is because he's talking to their kids and their kids are going to grow up being decent people because of his influence.
01:19:36.000 But the Wall Street Journal, Ben Fritz and Rolf Winkler, and there was another one, I can't remember his name at the moment, they decided what they were going to do is, and I don't know who put them up to this, obviously, but they were going to watch about six months of his videos, go through, and find any anti-Semitic jokes that he had made.
01:19:50.000 Well, I mean, by definition, it's a joke, so there's not.
01:19:53.000 Six months?
01:19:54.000 Yeah, six months ago.
01:19:55.000 They sat together.
01:19:56.000 He does a video every day, and they got about, I don't know, like two minutes of footage or something with him making jokes at the expense of the Nazis, effectively.
01:20:03.000 And they were like, wow, this guy's promoting anti-Semitism.
01:20:05.000 I would like to see those jokes because I've heard about them, but I don't know the actual jokes that he said that they were saying.
01:20:12.000 But here's the, let's get to it.
01:20:14.000 They were saying that he is making light of being a Nazi or pretending to be a Nazi, and that this is what's fucked up.
01:20:21.000 And he was promoting anti-Semitism by parodying the Nazis.
01:20:25.000 Yeah, I don't understand that.
01:20:26.000 Trying to explain to people.
01:20:28.000 How does Hogan's heroes work then?
01:20:29.000 Yeah, exactly.
01:20:30.000 Isn't Mel Brooks a bit of an old anti-Semite?
01:20:33.000 He must be.
01:20:33.000 Jesus Christ.
01:20:34.000 Even though he's not a good person, oh, God, just start.
01:20:37.000 Yeah, I mean, springtime for Hitler, you could say, was somehow or another humanizing.
01:20:42.000 I imagine that created a lot of Nazis.
01:20:44.000 Can you imagine if somebody made that argument though?
01:20:47.000 Imagine if someone made the argument that Mel Gibson created Nazis.
01:20:50.000 That's how far we have come. You mean Mel Brooks.
01:20:52.000 What did I say, Mel Gibson?
01:20:54.000 Yeah.
01:20:54.000 He might well imagine that.
01:20:55.000 That was a Freudian slip.
01:20:57.000 That was clearly a Freudian slip.
01:20:58.000 All I ever think about.
01:20:59.000 When someone says Mel Gibson drunk, I think of him yelling about Jews at cops.
01:21:04.000 That's all I think of.
01:21:05.000 Big beard and everything.
01:21:06.000 Crazy.
01:21:07.000 Crazy old man.
01:21:07.000 Fucking Jews.
01:21:09.000 It's like Jesus, Mel.
01:21:10.000 Jesus, Mel.
01:21:11.000 Just, dude.
01:21:13.000 Alt-right poster boy, right.
01:21:13.000 Let's see.
01:21:15.000 Yeah, well, he is.
01:21:16.000 Well, he's a serious like 9-11 conspiracy believer.
01:21:20.000 Wasn't that Charlie Sheen?
01:21:22.000 Oh, probably.
01:21:22.000 Hmm, probably.
01:21:23.000 I'm lumping them together.
01:21:24.000 I think I might be.
01:21:25.000 So these videos, one of them was the funniest thing because they were taking like, and the jokes were like, you know, like, 20 seconds of joke taken out of like a 10-minute video, you know, where he's just, it's just like a cutaway to a parrot, you know, a skit.
01:21:39.000 You know, like, you know, one of them was a joke about how the media takes his jokes out of context to try and frame him as something negative.
01:21:50.000 Right.
01:21:50.000 And they cut that out of context and used it.
01:21:53.000 And it was just like, well, that was.
01:21:54.000 Which is amazing.
01:21:56.000 It's incredible.
01:21:56.000 It's amazing that they had the balls to do that.
01:21:58.000 And they were not only proud of what they did, they called it a scoop.
01:22:02.000 And then they went to advertisers and said, look at this hate speech.
01:22:06.000 But how did they get away with doing that?
01:22:08.000 How did they get away with them to do that?
01:22:10.000 How come no one from the Wall Street Journal sat them down and said, well, listen, this is definitely not him saying that you're going to take something out of context and then doing it, is it?
01:22:20.000 Because if it is, and then you took it out of context and did exactly what he's saying you shouldn't do, that's kind of crazy.
01:22:26.000 It's mental.
01:22:27.000 It is mental.
01:22:28.000 It's absolutely insane.
01:22:29.000 But they feel like they can get away with it because they're in the hallowed halls of actual media and they're putting this stuff out there.
01:22:35.000 Jamie, did you find it?
01:22:36.000 See what you can find in terms of the actual jokes that PewDiePie said that are supposedly...
01:22:43.000 Yeah, go grab them.
01:22:47.000 The Wall Street Journal.
01:22:48.000 Yeah, if you look at his numbers, they look like that.
01:22:49.000 And what's so strange about it?
01:22:51.000 He's making hand gestures down for the Wall Street Journal, up for PewDiePie.
01:22:54.000 Yeah.
01:22:54.000 And what's so strange about it, though, is that PewDiePie isn't their competition.
01:22:59.000 All right, here we go.
01:23:00.000 Here we go.
01:23:02.000 Some of them are great.
01:23:03.000 Let's see what we got here.
01:23:04.000 My name is PewDiePie.
01:23:06.000 Whoa.
01:23:08.000 Okay, so him, 27-year-old, better known as PewDiePie, became famous for playing video games on YouTube.
01:23:15.000 Typical Nazis.
01:23:18.000 Okay.
01:23:20.000 53 million subscribers.
01:23:22.000 He's still 55 now.
01:23:23.000 Jesus.
01:23:24.000 The biggest star on the site by far.
01:23:26.000 Let's go inside the stories.
01:23:28.000 The scoop.
01:23:28.000 Oh, no, that's better.
01:23:29.000 Scoop is better.
01:23:32.000 Let's get into the latest scoops.
01:23:37.000 So it's him.
01:23:39.000 I think this is the one.
01:23:40.000 Recently, some of his videos have included Nazi messages, images of Adolf Hitler, and explicit anti-Semitic commentary.
01:23:48.000 So this is the Washington Post's version?
01:23:50.000 The Wall Street Journal, rather.
01:23:52.000 Following a request for comment from the Wall Street Journal, Disney said the videos are inappropriate and cut ties to PewDiePie, who ran his business through
01:23:52.000 Sorry.
01:24:00.000 Disney subsidiary Maker Studios, costing PewDiePie millions.
01:24:04.000 In a January 11th video, Mr. Blah blah blah featured two men holding a sign reading death to all Jews after hiring them from a freelance website.
01:24:14.000 I like the editing.
01:24:18.000 See, the joke is his reaction to it.
01:24:21.000 And the fact that you can subscribe to Keemstar, who's like another big YouTuber, he's mocking that.
01:24:25.000 I didn't think they would actually do it.
01:24:27.000 I feel partially responsible, but just I didn't think they would actually do it.
01:24:32.000 It might just be my crude sense of humor, but I think there's something funny about that.
01:24:37.000 He fired back at the media for mischaracterizing him.
01:24:40.000 Again, I think there's a difference between a joke and actually meaning it.
01:24:46.000 Like, fuck, death to all Jews.
01:24:48.000 If I made a video saying, hey guys, uh, PewDiePie here, death to all Jews, I want you to say after me, death to all Jews.
01:24:59.000 And, you know, Hitler was right.
01:25:02.000 He really opened my eyes to white power, and I think it's time that we did something about it.
01:25:08.000 That's essentially how they're reporting this.
01:25:10.000 As if that's what I was saying or some shit like that.
01:25:13.000 It's amazing how popular he is with such a shitty grasp of English.
01:25:17.000 He's Swedish.
01:25:18.000 Apologies can camouflage messages that may still be received and celebrated by hate groups.
01:25:22.000 The Southern Poverty Law Center says.
01:25:24.000 Yeah.
01:25:26.000 So if the Southern Poverty Law Center condemns you, it's becoming a bit of a badge of honor these days.
01:25:30.000 In the video Since Removed, he posted a video of a man dressed as Jesus Christ.
01:25:34.000 I love how they have like Since Removed is bold, and then Jesus Christ is bold.
01:25:39.000 The Jesus guy?
01:25:39.000 Really nice guy.
01:25:40.000 Yeah.
01:25:41.000 What the fuck?
01:25:46.000 Don't worry, though.
01:25:47.000 He stands beside God now.
01:25:49.000 He isn't happy.
01:25:52.000 Jesus out.
01:25:54.000 And I wasn't the one who made him say it.
01:25:56.000 Or was it?
01:25:57.000 No, it wasn't.
01:25:58.000 I don't think so.
01:25:58.000 He set up a GoFundMe page and his own website, so you can take orders from Jesus himself.
01:26:06.000 And also, can I just point out that he said, hold on, stop.
01:26:09.000 This is not what I'm looking for.
01:26:11.000 Where are the actual videos of him doing the Nazi shit?
01:26:15.000 You just saw it.
01:26:16.000 That was it.
01:26:16.000 That's 100% it.
01:26:17.000 That's it.
01:26:18.000 This is their article that led to all of this.
01:26:20.000 Yeah.
01:26:21.000 Right, but what we're looking at here, this is the Wall Street Journal's version of it.
01:26:26.000 So this is...
01:26:27.000 This is the actual clips that they're saying are the thing that shows him to be anti-Semitic.
01:26:32.000 Holy shit.
01:26:35.000 I thought it was way better than I thought. It was just a little video he got made off Fiverr.
01:26:40.000 Yeah, it's just him pratting around on the internet.
01:26:42.000 It's just silly, childish jokes.
01:26:47.000 Three.
01:26:48.000 Really?
01:26:49.000 I guess two.
01:26:50.000 I think three is too high.
01:26:51.000 Nazis are so stupid.
01:26:52.000 It's like it doesn't take a lot to turn them.
01:26:57.000 The really bad thing about this, though, is the fact that they didn't put this article out until they had taken these clips and dobbed him in to Disney.
01:27:05.000 And Disney obviously knew, shit, there is going to be a shitstorm about this.
01:27:10.000 Why would someone do this?
01:27:11.000 Why would someone mount a political attack?
01:27:13.000 But why?
01:27:13.000 What is this?
01:27:14.000 Because he's making money and they're losing money.
01:27:16.000 His politics are not their politics as well.
01:27:18.000 Yeah, he mocks like, you know, he mocks feminism.
01:27:20.000 He mocks social justice.
01:27:22.000 He's a liberal.
01:27:23.000 He's not a progressive.
01:27:24.000 That's the difference.
01:27:25.000 Why would they think that they could just do that, though?
01:27:28.000 Because they've got the power to do that.
01:27:29.000 No one can stop them.
01:27:30.000 You couldn't stop them.
01:27:30.000 But they did.
01:27:31.000 They cost him millions of dollars.
01:27:32.000 I'm sure they did, but it's now costing them their reputation.
01:27:36.000 Absolutely.
01:27:37.000 Yeah, I mean, you can't just do that.
01:27:39.000 Like, the Wall Street Journals, they took a big hit from this, I'm sure.
01:27:43.000 They're unbelievably arrogant.
01:27:44.000 Well, actually, you know what's really interesting, right?
01:27:47.000 After all of this shit show blew over, there was a bunch of polls done of like 13 to 18 year olds.
01:27:52.000 And it was really, really fascinating how you had like favorability and how well known things were.
01:27:58.000 And like YouTube was really well known, really favorable with kids, you know.
01:28:02.000 And then you had the Wall Street Journal, which was really well known, but really unfavorable with kids.
01:28:07.000 How do kids know what the Wall Street Journal is?
01:28:09.000 Because of PewDiePie.
01:28:11.000 So now they think of the Wall Street Journal in this way, the millions of people that are his fans.
01:28:15.000 Exactly.
01:28:15.000 Well, that's a big mistake, going after a guy with a platform like that. In 10 years' time, when they're voters and they're reading and consuming news, the Wall Street Journal will continue to go down.
01:28:26.000 PewDiePie will still be doing well.
01:28:28.000 But what's crazy is I never thought of the Wall Street Journal as that.
01:28:31.000 What I thought of the Wall Street Journal as being like informative, objective information about business.
01:28:37.000 If you asked me, like, what is the Wall Street Journal?
01:28:40.000 That's how I've always thought of them.
01:28:42.000 So to see this video.
01:28:44.000 It's pretty shocking, isn't it?
01:28:45.000 Well, it is disturbing because there's nothing there.
01:28:48.000 Like the death to all Jews thing, the fact that, did he pay those guys to hold up that sign?
01:28:48.000 No.
01:28:53.000 Yeah, like $5 to write on a sign so he could go, oh, because, you know, because it's taboo, right?
01:28:59.000 He's just breaking a taboo.
01:29:00.000 Exactly.
01:29:00.000 That's all it is.
01:29:01.000 Exactly.
01:29:02.000 If nobody had said anything, nothing would have come of it.
01:29:03.000 It's in poor taste.
01:29:04.000 And if you're representing Disney, you can't fuck with that stuff.
01:29:07.000 That's all that is.
01:29:08.000 But that's stuck.
01:29:08.000 Well, they know that.
01:29:09.000 I understand.
01:29:10.000 But it's not.
01:29:10.000 My point is, I'm defending him.
01:29:11.000 It's not evidence of anti-Semitism.
01:29:13.000 No, yeah, no.
01:29:14.000 He's just saying something fucked up because he's trying to be shocking.
01:29:17.000 Yeah.
01:29:17.000 He's just broken.
01:29:18.000 I think the Hitler stuff, like, holy shit, there was nothing there.
01:29:21.000 It's so stupid.
01:29:22.000 It's just the stupidest thing in the world.
01:29:23.000 And it's, I mean, this is the thing, right?
01:29:25.000 Like, the Kekistani flag, what I love about this is it's a parody of the Nazi flag.
01:29:30.000 Right, of a swastika.
01:29:31.000 What is it in there?
01:29:32.000 What's in there?
01:29:33.000 It's silly internet stuff.
01:29:33.000 Keck.
01:29:35.000 Is it a K in there?
01:29:36.000 Yeah, yeah.
01:29:38.000 Okay.
01:29:39.000 Can't see it.
01:29:40.000 When you have to explain any of this to someone, it sounds like the most stupid thing in the world because it's internet humor.
01:29:44.000 I love how they're planting the Iwo Jima flag.
01:29:46.000 I know, isn't that great?
01:29:48.000 Isn't that great?
01:29:50.000 It's so stupid.
01:29:51.000 But that's what it's supposed to be.
01:29:52.000 Exactly.
01:29:52.000 Exactly.
01:29:53.000 Because that's how they come across to everyone else.
01:29:54.000 It's like, guys, this is how stupid you look to us.
01:29:57.000 Yeah, to the normal.
01:29:58.000 Did you see that thing that somebody put up about it's you and Milo and a bunch of other people and you've got a red pill?
01:30:07.000 And it says, get in the car, Normies.
01:30:10.000 We're going to save civilization.
01:30:12.000 Someone's got to do it.
01:30:14.000 Like, what?
01:30:15.000 First of all, who the fuck thinks that that's really...
01:30:21.000 Yeah.
01:30:22.000 But that's the point, isn't it?
01:30:24.000 It's about people who can't laugh at themselves.
01:30:26.000 If you can't laugh at yourself, then... Well, there's also so many layers to this thing.
01:30:31.000 You have to be paying attention to it all the time, like you, to be able to keep up.
01:30:34.000 Like, even I have to ask you questions, like, wait, what's that?
01:30:37.000 What's going on here?
01:30:38.000 Because I'm not paying attention.
01:30:40.000 It's a complex subject.
01:30:40.000 But the thing is, the idea of it.
01:30:42.000 This is the thing.
01:30:43.000 The Southern Poverty Law Center, of course, came out and condemned Kekistan.
01:30:47.000 Because they're fucking idiots.
01:30:48.000 But it's not real.
01:30:50.000 Exactly.
01:30:50.000 But is Kekistan supposed to, is that supposed to be alt-right, though?
01:30:53.000 No, no, no, of course not.
01:30:55.000 It's actually, it's an identity for people who don't want an identity.
01:30:58.000 Because I'm not an identitarian.
01:31:00.000 It sounds very non-binary.
01:31:01.000 It sounds like Latinx.
01:31:03.000 Well, that's the thing.
01:31:03.000 Kind of, yeah.
01:31:04.000 No, no, that's, but that's exactly what it's supposed to sound like.
01:31:07.000 Right?
01:31:07.000 When they say, you know, I'm a Latinx, I'm, you know, let me talk to you about my woke blackness and stuff like this.
01:31:13.000 I want to be a woke black person.
01:31:14.000 Well, you can be a Kekistani.
01:31:17.000 Let me tell you about my woke blackness.
01:31:18.000 This is one of the coolest things you can ever say.
01:31:20.000 But that's what it is, though.
01:31:21.000 Because when someone comes to you about their identity, it's like, okay, but this isn't a conversation I want to have with you.
01:31:24.000 It's not politics.
01:31:25.000 It's you talking about yourself.
01:31:26.000 And it's like, well, I don't have an ethnicity, like an identity.
01:31:31.000 I don't play identity politics.
01:31:32.000 They play identity politics.
01:31:33.000 Kekistan is a parody of identity politics, and every identity politician hates it.
01:31:39.000 Everyone who gets into bed with identity politics and says, yeah, what about my blackness or my Latino-ness or my whiteness or my Jewishness, whatever it is?
01:31:46.000 Okay, what about my Kekistani-ness?
01:31:48.000 And they're just like, well, shut up.
01:31:50.000 That's not real.
01:31:51.000 Well done.
01:31:52.000 You finally got it.
01:31:53.000 I don't give a shit about your fucking identity.
01:31:55.000 You don't give a shit about mine.
01:31:57.000 So let's talk about the issues.
01:31:59.000 The issues.
01:32:01.000 Like PewDiePie.
01:32:02.000 So where's it, how does he stand now?
01:32:05.000 So he lost this Disney deal because of this Wall Street Journal issue.
01:32:09.000 Pretty fine, obviously.
01:32:10.000 But how many people were actually aware of the real content?
01:32:14.000 Like what he actually did?
01:32:16.000 But what I'm saying is how many people are aware versus the magazine article?
01:32:20.000 Oh, it's about the type of people who are aware.
01:32:23.000 Because I wasn't aware.
01:32:24.000 Of course you were.
01:32:24.000 I knew who he was.
01:32:25.000 I heard his name.
01:32:27.000 And when I heard the story, I had heard a bunch of different accounts.
01:32:30.000 I tried to sift between the two.
01:32:32.000 And the middle seemed to be that he was doing like typical internet stuff, being silly, but none of it was really anti-Semitic.
01:32:39.000 Of course not.
01:32:40.000 But that's even milder than what I thought it was.
01:32:43.000 That's insanely milder, especially when you're using the clips of him joking about being taken out of context.
01:32:50.000 Yeah, it's absolutely pathetic.
01:32:52.000 But obviously the audience of the Wall Street Journal is the elites, the intelligentsia, the smarts, rich people.
01:33:00.000 Old people that don't have a good internet connection at all.
01:33:02.000 Yeah, exactly.
01:33:03.000 Well, no, they've undoubtedly got the best internet connections, but they're not going to live very long because they're old.
01:33:09.000 PewDiePie's audience is coming up.
01:33:11.000 They're going to be the ones who are essentially taking over the world.
01:33:13.000 And PewDiePie is the one who's showing them that, look at the things these people are saying, they're bad.
01:33:18.000 And you know, because they're attacking me and you know me, I'm not doing anything wrong.
01:33:23.000 So you know you can't trust these people.
01:33:24.000 They are going to come after you.
01:33:26.000 If they can go after PewDiePie, the biggest YouTuber in the world, then they can go after anyone.
01:33:32.000 Yeah, but it's really dangerous to frame things that way because it's dishonest.
01:33:37.000 And as soon as you do something like that and you make a dishonest representation of this guy, of PewDiePie, immediately every other thing that you print, I'm going to go, well, you guys already made that thing about PewDiePie, and that was bullshit.
01:33:50.000 Like you had to do a lot of work to make that.
01:33:53.000 Like to cherry-pick that hard.
01:33:56.000 Oh, they went through six months of video and got like two minutes of footage.
01:33:58.000 It's hilarious.
01:33:59.000 And he does a video every day.
01:34:01.000 So it's just like, because, I mean, you saw the quality of the video.
01:34:03.000 It's just, yeah.
01:34:04.000 Who funded that?
01:34:05.000 The Wall Street Journal funded that?
01:34:07.000 Like, that had to cost a lot of money, right?
01:34:10.000 Well, think about paying these guys by the hour.
01:34:12.000 They're working on that.
01:34:14.000 Watch a bunch of YouTube videos.
01:34:15.000 Millions of people have already watched and then go, look at the scoop we've got.
01:34:18.000 And try to find one or two Jew jokes.
01:34:23.000 See, if he was Jewish, he could have Mel Brooks his way right through that motherfucker and been fine.
01:34:27.000 But unfortunately for PewDiePie, he's got white male privilege, so he can't do a thing.
01:34:33.000 Well, you have to recognize your privilege.
01:34:35.000 Super important.
01:34:36.000 Check your privilege at the door.
01:34:38.000 It's a wonderful way for people to control you.
01:34:40.000 It's original sin.
01:34:41.000 Yes, it is original.
01:34:41.000 To tell you.
01:34:43.000 It really is.
01:34:44.000 These people are creating a new priesthood.
01:34:46.000 If you look at the diversity officers, I mean, like in Britain, the BBC is the prime example.
01:34:51.000 I mean, they actually have job positions where they hire on the basis of race and against people on the basis of race.
01:34:57.000 And not even for, like, if it was like in front of the camera, I could understand.
01:35:00.000 You know, I mean, like, I mean, I don't like the, we need more black people, we need more brown people, whatever.
01:35:05.000 But, I mean, I could at least go, well, I mean, like you were saying, you know, it's good to have examples.
01:35:10.000 I can accept that I don't like it, but not because I don't want to see brown people, but because it's discriminatory on the basis of race.
01:35:16.000 But I could at least accept it for a good conclusion.
01:35:19.000 But for this, it's just a research position behind the scenes, 18 grand a year, you know, making nothing.
01:35:26.000 And all you're doing is typing into Google and finding things.
01:35:28.000 And they're like, white people can't apply.
01:35:30.000 And I have to fund that with my TV license.
01:35:33.000 It's crazy that white people can't apply.
01:35:35.000 I know.
01:35:36.000 No whites allowed.
01:35:37.000 It's so racist.
01:35:38.000 And they're wondering why the alt-right is doing so well.
01:35:40.000 Why so many people are just like, you know what?
01:35:42.000 Get fucked.
01:35:43.000 But the idea that you could say that, like, your melanin content is going to prohibit you from getting this job.
01:35:49.000 Did you, I'm sure you paid attention to the Evergreen State College fiasco, Brett Weinstein.
01:35:54.000 Did you listen to Brett when he was on this podcast?
01:35:55.000 I haven't had time.
01:35:56.000 I've been busy.
01:35:57.000 He's doing all this.
01:35:58.000 He's a very shockingly progressive man.
01:36:00.000 He is.
01:36:01.000 He's a super liberal, essentially.
01:36:03.000 Yeah, please.
01:36:05.000 He's a really nice guy.
01:36:07.000 He seemed it.
01:36:07.000 Yeah, he seemed it.
01:36:08.000 And the poor guy's in hiding now.
01:36:11.000 Yeah, he's in hiding.
01:36:13.000 Literally, he had to remove his family from there and kind of put them in hiding.
01:36:19.000 The schools essentially almost shut down.
01:36:22.000 I mean, the thing is chaos.
01:36:22.000 Yeah, they took over.
01:36:24.000 And now they're voting to remove funding and turn it into a private school.
01:36:28.000 Good.
01:36:29.000 Because people have to do it.
01:36:30.000 They're like, you guys are out of your mind.
01:36:32.000 We can't give you tax dollars.
01:36:35.000 They told him that he should stay home because they wanted a day of absence where white people were not allowed to go to work.
01:36:41.000 And he was like, what the fuck are you doing?
01:36:43.000 I'm Jewish.
01:36:44.000 Yeah, not only that, he's like, that's racist.
01:36:46.000 And they're like, no, you're racist.
01:36:47.000 That's the important point.
01:36:48.000 Well, the woman, one of the women who was responsible for it, called his wife a racist and was saying that white, she called for on a Facebook post, she called for white women to take this racist white, take her racist white ass out or something like that.
01:37:06.000 The thing about this all of that.
01:37:07.000 Maybe call her out.
01:37:07.000 Call her out.
01:37:09.000 The thing about all of this is really dangerous because at some point white people go, yeah, okay.
01:37:12.000 But she's calling her white.
01:37:14.000 She's calling her racist white.
01:37:15.000 Like, God damn it, just because someone doesn't agree with you on something that's really racist doesn't mean they're a racist.
01:37:21.000 No, of course not.
01:37:22.000 But it's dangerous to do this because, I mean, if you're going to use this as a weapon against other people, that's what this is.
01:37:27.000 This is an attack.
01:37:28.000 Yeah.
01:37:29.000 He's refusing to cooperate, so they're attacking him.
01:37:31.000 Well, it's a control.
01:37:32.000 It's without a doubt, it's about control.
01:37:34.000 This is 100% about people trying to control other people.
01:37:36.000 It's political, entirely political, and this is a political attack.
01:37:39.000 And if you're going, right, you don't want to be a racist, do you?
01:37:42.000 At some point, white people are just going to go, I guess I'm racist.
01:37:45.000 Did you see where they had the president and they were talking to him?
01:37:48.000 And the president was addressing these students.
01:37:50.000 They were being fucking completely ridiculous.
01:37:52.000 And they told him to put his hands down because his hands were a microaggression.
01:37:56.000 And then a kid stood up next to him in like a threatening manner and told him to put his hands down.
01:38:01.000 And he put his hands down.
01:38:02.000 They started laughing.
01:38:03.000 They started laughing.
01:38:05.000 It's clearly a game.
01:38:06.000 They're playing ping-pong.
01:38:08.000 They're playing checkers.
01:38:09.000 They're just bullying them.
01:38:09.000 They're bullying him.
01:38:10.000 But it's not just that.
01:38:12.000 It's not just bullying.
01:38:13.000 Oh, no, of course.
01:38:13.000 It's like them grouping up to support each other.
01:38:16.000 Can you imagine how he feels?
01:38:18.000 Well, he was saying that he believes it's like a mass psychosis.
01:38:22.000 That literally it's like a mass hysteria.
01:38:24.000 It's like people losing their minds in this way.
01:38:27.000 They aren't losing their minds at all.
01:38:28.000 They're being given undue power that they don't deserve.
01:38:30.000 They haven't earned all of this power.
01:38:32.000 But they're doing something with that power.
01:38:33.000 If they gave you that power, you wouldn't.
01:38:35.000 So it is some sort of a psychosis.
01:38:37.000 Oh, yeah, no, no, absolutely.
01:38:38.000 These people are, they've never been... they're not being held to account.
01:38:42.000 They're not being told, look, when you earn power, you generally have to, I mean, that's the Karate Kid, isn't it?
01:38:47.000 You earn power, you have to be responsible with that power.
01:38:49.000 And that's what a good leader is.
01:38:51.000 These people are just being given power.
01:38:53.000 They haven't earned it.
01:38:54.000 They're being given it, and they are being told they can't do anything wrong.
01:38:57.000 And so they think, well, fuck, I'll do whatever the fuck I like then.
01:39:00.000 They're also told that you can always step up and talk about your blackness or whatever, and everyone has to kind of shut the fuck up.
01:39:08.000 There's this carte blanche.
01:39:10.000 Yes.
01:39:11.000 There's these subjects that you can sort of intersect or inject your race into or your gender or the fact that you're trans or the fact that you're marginalized.
01:39:24.000 Marginalized in any way, Latino.
01:39:26.000 And you can kind of shut down discussion.
01:39:28.000 And it's one of the things that you see when this president is addressing the students and students step up and they're asking questions.
01:39:34.000 This is a long, it looks like a hostage negotiation.
01:39:36.000 That's what I'm saying.
01:39:36.000 It is.
01:39:37.000 It's very strange.
01:39:38.000 They get up and they start talking about their own individual identity.
01:39:38.000 It is.
01:39:43.000 Let me talk to you about the color of my skin.
01:39:44.000 That's not my business.
01:39:45.000 Well, whatever it is, the fact that I have a vagina, the fact that I...
01:39:49.000 No, not really.
01:39:50.000 Even my wife's going to complain, to be honest.
01:39:53.000 It becomes the thing, you know?
01:39:54.000 It becomes the thing that you're discussing rather than whatever the actual thing you're discussing is.
01:39:59.000 Yeah, that's the problem with identity politics.
01:40:01.000 It's not a form of politics.
01:40:02.000 It's designed to grind the dialogue to a halt and to give the person making the argument control of everyone else around them.
01:40:02.000 It doesn't go anywhere.
01:40:10.000 That's where it is.
01:40:10.000 Unjustly.
01:40:11.000 This person hasn't earned this.
01:40:13.000 Yeah, that's where it is, right?
01:40:14.000 It's like you have a total control of the conversation now, and you have set these very clear parameters.
01:40:21.000 And there's a lot of that, man.
01:40:23.000 It's cancer.
01:40:23.000 There's a lot of that.
01:40:24.000 It is.
01:40:25.000 It's not healthy.
01:40:26.000 It's just bizarre that that would fly in a university setting.
01:40:29.000 And it's applauded.
01:40:30.000 I think the professor agrees with it.
01:40:32.000 The deans are.
01:40:32.000 Yeah.
01:40:33.000 Some of them, but also they don't want to lose their jobs.
01:40:36.000 And the students coming in, they're going to change the world.
01:40:39.000 They act like a mob.
01:40:41.000 They act like an enemy faction, an army within these things.
01:40:45.000 And this is the reason that I'm so good at identifying this, because I'm really well read in power politics.
01:40:50.000 If you want to know what's going on, what you need to do is read The Prince, read Alinsky, and read Robert Greene.
01:40:54.000 That'll teach you how to understand these people.
01:40:56.000 And these people are, I mean, everything they're doing is about power and control.
01:41:00.000 It's about influence over what's going on.
01:41:03.000 And at any point, you can break this influence by just simply understanding how they're gaining it.
01:41:08.000 I think Jordan Peterson's got some pretty unique insight into it because he's been dealing with it for several years and in a very high-profile way.
01:41:14.000 But one of the things that he said, he believes it's about revenge.
01:41:18.000 He says he thinks that a lot of these people that are so active on campus and they're in this fever pitch, he goes, a lot of them were bullied and pushed around when they were younger and they were socially awkward and not in the good, cool groups.
01:41:33.000 And now they want revenge.
01:41:35.000 And this is why they want to label everyone who plays sports as being some sort of a misogynist woman raper.
01:41:42.000 You know, I mean, that's really what it is.
01:41:44.000 Like everyone who's in a frat house is a misogynist, homophobic rapist.
01:41:50.000 Like there's no way around it.
01:41:51.000 You're in a frat.
01:41:52.000 It's like that woman who thought it was okay to say that that guy who got killed, like that they would do Coke and rape girls at a frat party.
01:42:00.000 Like what?
01:42:01.000 Okay, this is a caricature.
01:42:03.000 I mean, imagine if someone did an equal caricature about a black man.
01:42:07.000 It's so racist.
01:42:08.000 It's unbelievably racist.
01:42:10.000 It is racist, but it's just, it's all.
01:42:13.000 It's also mean.
01:42:14.000 It's like, this guy's dead.
01:42:15.000 He fucking got beaten to death for taking a poster.
01:42:18.000 That's all he did.
01:42:19.000 Like, you're putting all these straw man definitions on this guy, but all this guy did was a young guy's dumb thing.
01:42:27.000 I'm going to steal this poster.
01:42:28.000 And he's dead now.
01:42:29.000 Like, that's it.
01:42:30.000 The fact that he's white.
01:42:31.000 He got killed by a totalitarian regime.
01:42:32.000 You've got no sympathy for him whatsoever.
01:42:34.000 What the fuck's wrong with you?
01:42:34.000 Would you be cool if he was Colombian?
01:42:36.000 If he was a Colombian American and they beat him to death like that?
01:42:39.000 Would that be okay?
01:42:41.000 How would you feel then?
01:42:42.000 Would you come up with some sort of a Colombian influence that he's got that you don't have?
01:42:47.000 And so he's got some sort of a Colombian privilege?
01:42:49.000 I mean, how is this guy not being hideously oppressed anyway?
01:42:52.000 He's been captured, you know, taken by the North Koreans.
01:42:54.000 He's been beaten to death.
01:42:55.000 Yeah.
01:42:55.000 And they were literally like, oh, proof that your white male privilege isn't universal.
01:42:59.000 It's like, you know, if you're going to say that, I mean, you'd have to be a fucking moron to say that because everything about the ideology itself is contextual.
01:43:06.000 It's power plus privilege.
01:43:08.000 Therefore, it must have a context.
01:43:10.000 There must be more white people around for it to work.
01:43:12.000 And therefore, if we go to Zimbabwe, white people are hideously oppressed by their definition in Zimbabwe, when the white farmers are getting the farms appropriated by black African pan-Marxists or whatever they are, then of course they're being oppressed.
01:43:23.000 Of course it's not universal.
01:43:25.000 Of course it's contextual.
01:43:26.000 It's the most asinine thing to say.
01:43:29.000 Yeah.
01:43:29.000 You know, oh, it's not universal.
01:43:31.000 Fucking duh.
01:43:32.000 You've got universal.
01:43:34.000 What you were talking about.
01:43:35.000 I'm drinking a lot because I'm really dehydrated.
01:43:36.000 Don't worry about it.
01:43:37.000 I just have to hydrate myself.
01:43:39.000 You don't have to apologize for drinking water.
01:43:40.000 I don't know how you want to.
01:43:41.000 I know I'm consistently doing it and it's like, you know.
01:43:43.000 Don't worry about it, man.
01:43:45.000 But I think it's also what you were talking about where they want to make it all about them.
01:43:48.000 Like here, this woman has taken this situation, which is horribly tragic, and now she's talking about students that she has to deal with that are white men that are doing Coke and raping girls at frat parties, just like this guy.
01:44:01.000 But it is a horrible generalization.
01:44:05.000 They keep making up the stories of it.
01:44:06.000 But she's making it about her and her classes and the guys that she has, the clueless guys that she has to deal with in her classes.
01:44:13.000 And she's just deciding that this guy is that without any personal knowledge of him at all.
01:44:20.000 This is the future we've got now.
01:44:21.000 And she's turning it on herself and she's a victim because she has to deal with these students and these clueless white rich male students.
01:44:29.000 Fuck, man.
01:44:31.000 Now I'm not saying there's no twats out there, a bunch of dudes who are shitheads.
01:44:35.000 And for sure there are.
01:44:37.000 Just like there's women that suck too.
01:44:39.000 There's humans that suck.
01:44:40.000 It has nothing to do with gender or race.
01:44:43.000 That's racist to categorize them in that particular way.
01:44:46.000 There's plenty of humans that happen to be white males that are great, that happen to be black women that are great.
01:44:52.000 There's plenty of people that are cool across the board in every single gender, race.
01:44:58.000 Of course, you treat people as individuals.
01:45:00.000 You treat them by the content of their character.
01:45:02.000 Absolutely.
01:45:02.000 MLK was saying, I mean, that's the right way to do things.
01:45:04.000 That's the most disturbing thing about this, is that a fucking professor who's teaching young people is generalizing this horrendous way.
01:45:13.000 And she thinks it's okay because this kid stole a poster.
01:45:16.000 And what did he think was going to happen?
01:45:17.000 What did he think was going to happen?
01:45:19.000 Jesus Christ, when you're 20 years old, you're not thinking.
01:45:23.000 I mean, how the fuck does this dummy not know that?
01:45:25.000 How does she not understand just the science of the development of the human brain, which has been clearly established over the past 20, 30 years?
01:45:34.000 25.
01:45:35.000 The frontal cortex.
01:45:37.000 It's not even developed.
01:45:38.000 These kids do stupid things for you.
01:45:40.000 Of course.
01:45:41.000 Unbelievably stupid.
01:45:41.000 Of course.
01:45:42.000 And I have, and I'm sure you have, too.
01:45:44.000 Especially if you have a dick.
01:45:45.000 There's something about being a dude that makes you dumber.
01:45:47.000 My missus, dude.
01:45:48.000 My missus is like, look, you know, I don't want to sound like a bigot, but it's always men.
01:45:48.000 You do dumber shit.
01:45:52.000 It is?
01:45:52.000 It's always men.
01:45:53.000 They're trying to impress women or something like that.
01:45:54.000 It's always men.
01:45:55.000 But let me educate you on something.
01:45:57.000 Please do.
01:45:57.000 Say the word at.
01:45:58.000 At.
01:45:59.000 And I'll put the letters T-W before it.
01:46:02.000 TWAT?
01:46:03.000 That's how you say it?
01:46:03.000 Yes.
01:46:04.000 Yes.
01:46:05.000 And if you're going to appropriate our language, it's American now.
01:46:08.000 That's why we spell tires different.
01:46:10.000 We don't use a Y in tire.
01:46:12.000 It's TWAT.
01:46:13.000 It's TWAT.
01:46:14.000 You can say it any way you want, dude, but I'm saying TWAT.
01:46:14.000 No.
01:46:17.000 It's our language, Jeff.
01:46:18.000 If I say TWAT, I'm going to run into problems with my people.
01:46:21.000 There's more of us than there are of you.
01:46:23.000 We stole your language.
01:46:25.000 God damn it, man.
01:46:26.000 It's ours now.
01:46:26.000 It's American.
01:46:28.000 In the future, you know, it won't be English anymore.
01:46:30.000 Oh, yeah, I know.
01:46:30.000 But like Latin became Spanish.
01:46:32.000 It's just going to happen.
01:46:33.000 So I've got like another 10 minutes.
01:46:35.000 Is there anything else you want to talk about?
01:46:36.000 Anything, man.
01:46:37.000 What do you want to talk about besides Anita Sarkeesian?
01:46:39.000 Yeah, no, I don't want to talk about it.
01:46:40.000 Jesus Christ, bro.
01:46:41.000 She's going to be victimized by this.
01:46:44.000 Oh, she's already pissy.
01:46:45.000 But yeah, anyway.
01:46:46.000 By the way, Anita, I got no hate for you.
01:46:49.000 All right.
01:46:50.000 I'm not this dude.
01:46:51.000 I'd have you on.
01:46:52.000 I'd talk to you just about that McIntosh.
01:46:53.000 You should come on.
01:46:54.000 You should come on.
01:46:57.000 And Jonathan McIntosh, yeah.
01:46:58.000 Grow a pair.
01:46:59.000 There's a lot of those guys.
01:47:01.000 A lot of those vampire familiars.
01:47:03.000 You ever see the movie Blade?
01:47:04.000 Remember?
01:47:05.000 The vampire familiar hangs around with the vampires and thinks he's going to get in, and then they eventually kill him?
01:47:09.000 Yeah.
01:47:10.000 Same thing with 30 Days of Night.
01:47:12.000 Except the Vampire Familiars don't usually sexually assault the vampires.
01:47:16.000 Those male feminist guys, they rarely get that off.
01:47:19.000 I love that Patton Oswald tweeted that out, didn't he?
01:47:21.000 He was like, you know, the male feminist sexual predator is the sort of Catholic priest molesting children of our generation.
01:47:29.000 It's so true.
01:47:30.000 It is so fucking true.
01:47:31.000 In a lot of ways.
01:47:32.000 I love male feminists.
01:47:33.000 They're just like the creepiest looking motherfuckers.
01:47:36.000 And then it turns out.
01:47:37.000 By the way, if you're not that way and you identify yourself as a feminist, no, I'm not talking about you.
01:47:42.000 I know what you're saying.
01:47:43.000 There's a lot of people out there that really want to.
01:47:46.000 They get locked into it because it sounds right.
01:47:48.000 You know, like, oh, are you for equal rights for women?
01:47:51.000 Yeah, then you're a feminist.
01:47:52.000 Okay, then I'm a feminist.
01:47:53.000 And you say it, and then people treat you differently because you say it because they say, oh, you've done a good job, Billy.
01:47:58.000 I'm going to pat you on the head.
01:47:59.000 You called yourself a feminist.
01:48:01.000 Why'd you define yourself, man?
01:48:03.000 Are you a human?
01:48:04.000 Like, every good human wants equality.
01:48:06.000 Every good human.
01:48:07.000 It's like many labels as well.
01:48:08.000 It's not that you're denying sexism.
01:48:11.000 Like, you know, you don't have to be a pro-black ally to deny or to rather deny accepting any form of racism.
01:48:23.000 That's a bad way to describe it.
01:48:24.000 But it's how they define these things.
01:48:26.000 If they have a different definition, if they say it's structural, it's, you know, like, you know, institutional, when we talk about racism.
01:48:32.000 It's an easier way of framing it.
01:48:34.000 And you say, well, you know, I think racism, I don't think the structures are actually racist.
01:48:38.000 I think that the individuals, certain individuals within the structures are racist.
01:48:41.000 Then what you're doing is undermining their entire philosophy.
01:48:44.000 And if you're undermining their entire philosophy by effectively individualizing it rather than collectivizing it, then they have to consider you an enemy.
01:48:53.000 Because what you're, if everyone, if you go, yeah, and then someone else goes, you know what, that's a good way of looking at it.
01:48:57.000 I'm going to look at that too.
01:48:58.000 And then a lot of people go, you know what, that's a great point.
01:49:00.000 Then suddenly their philosophy is just going to collapse.
01:49:03.000 No one's going to believe what they're saying.
01:49:05.000 And they're going to be like, no, but this is, I'm being horribly oppressed.
01:49:08.000 And everyone's like, well, not really.
01:49:09.000 There are some racists, but we deal with them.
01:49:10.000 We've got laws.
01:49:11.000 You know, we've got laws against racism.
01:49:13.000 We've got laws, affirmative action laws to bring black people in.
01:49:16.000 You know what?
01:49:16.000 I don't think the system is actually as racist as you're saying.
01:49:18.000 I think that maybe there are a few bad actors.
01:49:20.000 And when we see one, we can spot them and get them out, you know.
01:49:23.000 Then how are these people going to play the victim?
01:49:26.000 How are they going to play the victim?
01:49:27.000 There's real racism and there's also real people playing the victim.
01:49:31.000 There's both.
01:49:32.000 Of course there's real racism, but we're vigilant for that.
01:49:34.000 Where are racists just tolerated?
01:49:36.000 Well, oh, the president of that company is now than ever, right?
01:49:38.000 Exactly.
01:49:39.000 The president of that company is a racist.
01:49:40.000 Yeah, well, we're just going to leave him because, you know, he's just like, you know, he's alright.
01:49:44.000 He's doing a good job.
01:49:45.000 And of course he's not.
01:49:46.000 He's going to be out of the door for sure.
01:49:48.000 They frame PewDiePie as being someone who may have been spreading anti-Semitism, which he probably wasn't.
01:49:53.000 I think they got him like millions of followers, though.
01:49:55.000 Well, they did, like two, three million, but they cost him millions of dollars.
01:49:58.000 Yeah, but I think ultimately it'll probably balance itself out.
01:50:01.000 Oh, yeah, it's already gone in his favor.
01:50:03.000 They've already fucked themselves.
01:50:04.000 Yeah, but that's only because there was a credible alternative media.
01:50:07.000 I mean, like, PewDiePie, there's me and another YouTuber called Kraut and Tea. We both made videos about that.
01:50:11.000 What's his name?
01:50:12.000 Do you want an Adderall?
01:50:12.000 You're talking so fast.
01:50:14.000 No, it's just there's so much I need to tell you.
01:50:15.000 Oh, okay.
01:50:16.000 I'm probably not going to be here again, and there's a lot I need to tell you.
01:50:18.000 You're coming back.
01:50:18.000 Come on, man.
01:50:19.000 Oh, I'd love to.
01:50:20.000 You know, you like it.
01:50:20.000 I love that.
01:50:21.000 And when I've had a bit more sleep as well, that'll be anything.
01:50:23.000 But anyway, a YouTuber called Kraut and Tea. Kraut and Tea?
01:50:27.000 Yeah, he's a German guy.
01:50:28.000 Kraut and Tea. Oh, like Tea, like the tea you sip.
01:50:31.000 Yeah, because he's half English, half German.
01:50:33.000 Okay, oh, I get it.
01:50:35.000 And he did a video about this as well.
01:50:38.000 I did a video about this.
01:50:39.000 And PewDiePie did a video using the information and statistics we'd compiled to do his own video and used us as sources.
01:50:46.000 Say, look, this is where I got this information.
01:50:47.000 And he tweeted out.
01:50:48.000 And I was like, holy shit, that's really nice of him.
01:50:50.000 But it's really interesting how we helped him fight back because he's a comedian.
01:50:55.000 He doesn't do all this professionally.
01:50:57.000 He doesn't know what to look for.
01:50:59.000 And why would he?
01:51:00.000 You wouldn't expect him to.
01:51:00.000 No one would.
01:51:01.000 But luckily, this was our speciality.
01:51:04.000 We've been going against the media since day one because we know they're a bunch of liars.
01:51:07.000 We know they are.
01:51:08.000 Some of them are.
01:51:10.000 There's got to be.
01:51:11.000 Not all of them.
01:51:11.000 Right.
01:51:12.000 I mean, there's got to be good journalists.
01:51:13.000 But that's no excuse, though.
01:51:15.000 You know, a platform like the Wall Street Journal, if you want to maintain your reputation, you don't publish shit like that.
01:51:19.000 Well, how do you think?
01:51:20.000 That's unacceptable.
01:51:21.000 What is their response?
01:51:22.000 They don't care.
01:51:22.000 They don't address it.
01:51:24.000 Why should they?
01:51:24.000 That seems so crazy.
01:51:26.000 They should address that.
01:51:28.000 And they should defend that.
01:51:29.000 Absolutely, they should.
01:51:30.000 I mean, is he suing them?
01:51:32.000 No, I don't think he sued them.
01:51:33.000 Why?
01:51:34.000 I don't know.
01:51:35.000 I don't think he wants the trouble.
01:51:37.000 If he sued them, I'd back him.
01:51:39.000 I think he would win.
01:51:40.000 I can't see how he could lose.
01:51:43.000 Can you prove they cost you money?
01:51:44.000 Easily.
01:51:46.000 Is this actual anti-Semitism?
01:51:48.000 Obviously not.
01:51:49.000 How could they do it?
01:51:50.000 Did they take you out of context?
01:51:51.000 Clearly.
01:51:52.000 And in fact, one of the jokes was that they would take him out of context.
01:51:56.000 How could he lose this?
01:51:58.000 Yeah.
01:51:59.000 Well, it is possible.
01:52:01.000 Honestly, I think that what he should do is find out how much the Wall Street Journal cost to buy and then crowdfund it.
01:52:07.000 I'd donate to that.
01:52:09.000 I'd spread it around.
01:52:10.000 Yeah, there was a lawsuit, recent lawsuit that I couldn't believe was thrown out.
01:52:15.000 God damn it.
01:52:16.000 I can't remember it now.
01:52:18.000 Yeah.
01:52:19.000 I'm not going to remember.
01:52:20.000 I'm not sure, you know, a few hundred million dollars.
01:52:22.000 But, like, I mean, he's got 55 million subscribers and he's a very influential man.
01:52:25.000 And there are a lot of people who like him who would love to support him.
01:52:28.000 Maybe it's possible.
01:52:30.000 Maybe he could speak to, I don't know, some nice wealthy benefactors who are like, you know what, I hate these people.
01:52:35.000 Like that Peter Thiel guy, what he did with Hulk Hogan versus.
01:52:38.000 Exactly.
01:52:39.000 Yeah.
01:52:40.000 Why not?
01:52:40.000 Wouldn't it just be the funniest thing in the world to see PewDiePie as the owner of the Wall Street Journal?
01:52:45.000 Wouldn't that just be fun?
01:52:46.000 That would be hilarious.
01:52:49.000 Wealthy billionaires who happen to be watching this and think, you know what, we could get one over on these bitches.
01:52:55.000 And we don't have to just let them win.
01:52:57.000 We don't have to just let them win.
01:52:58.000 We could.
01:52:59.000 Well, I feel like the New York Times feels incredibly challenged right now.
01:53:02.000 The Wall Street Journal we were talking about.
01:53:04.000 I'm talking about the New York Times.
01:53:04.000 I know.
01:53:05.000 Sorry, I'm just making sure that.
01:53:06.000 I was going to say, whoa, you're just so ready to go.
01:53:10.000 Just so many words.
01:53:10.000 I am.
01:53:11.000 Unfortunately, I know you don't have much time.
01:53:14.000 But about Donald Trump, I was going to say.
01:53:16.000 Yeah, I wish I'd.
01:53:17.000 Is there any way you could reschedule or schedule your flight?
01:53:19.000 Yeah, I could just, I could.
01:53:22.000 Well, you can.
01:53:23.000 I just want to make sure you don't get stuck here.
01:53:25.000 Well, I can just buy another ticket tomorrow.
01:53:27.000 It'll cost me a fortune, but it's fine.
01:53:27.000 Are you sure?
01:53:29.000 Well, a fortune for an hour of podcasting seems one accurate.
01:53:33.000 You've only got an hour.
01:53:34.000 I've got another hour.
01:53:36.000 800 pounds.
01:53:37.000 Well, no, probably more.
01:53:38.000 It's probably 1,000 pounds, so I probably shouldn't.
01:53:39.000 What's 1,000 pounds?
01:53:41.000 Probably about $1,200.
01:53:42.000 We'll take care of it.
01:53:43.000 Really?
01:53:44.000 Yeah, we'll put it on the budget.
01:53:45.000 What the fuck am I going to do for the rest of the day?
01:53:47.000 We're going to hang out, man.
01:53:48.000 You're going to go to the movies.
01:53:49.000 You serious?
01:53:50.000 Have an ice cream.
01:53:52.000 Well, what am I going to do?
01:53:53.000 I don't know what you're going to do, dude.
01:53:53.000 Seriously, what am I going to do?
01:53:55.000 But if you want to do another hour of podcasting, we can.
01:53:58.000 We should just say we were supposed to start at 9.
01:53:59.000 We're a little late because the traffic here is insane.
01:54:02.000 My wife's going to kill me.
01:54:03.000 Your wife's going to kill you if you don't come home?
01:54:04.000 Then go home, man.
01:54:06.000 She'll understand that.
01:54:07.000 You sure?
01:54:08.000 Yeah.
01:54:08.000 Mixed messages.
01:54:09.000 She'll understand that.
01:54:10.000 She's going to kill me because she learns to stand there.
01:54:11.000 All right, so we keep going?
01:54:12.000 Yeah, going on the hill.
01:54:13.000 So where were we?
01:54:13.000 Okay, cool.
01:54:14.000 Where were we?
01:54:15.000 Buying the Wall Street Journal with Peter Thiel's money.
01:54:17.000 Yeah.
01:54:18.000 Well, what I was saying is that the New York Times feels very challenged by Donald Trump's dismissal of them as fake news and all these different attacks.
01:54:26.000 And because of that, they've sort of doubled down on the ethics of journalism.
01:54:29.000 They've tried really hard to be better at what they do.
01:54:33.000 Yeah, yeah, I saw.
01:54:35.000 But that's a real indictment of the state of journalism, isn't it?
01:54:37.000 When they have to do a little bit.
01:54:38.000 You know, we're going to recommit ourselves to actually being journalists.
01:54:41.000 Good thinking, what's the new thing?
01:54:43.000 They've been called out in a way that they've never been called out before.
01:54:48.000 Of course they did lose.
01:54:49.000 Yeah, I mean, he's the president.
01:54:50.000 They became political activists.
01:54:51.000 They weren't journalists.
01:54:52.000 Right.
01:54:52.000 They're right.
01:54:53.000 And they were on the losing side.
01:54:55.000 Right.
01:54:55.000 Get wrecked.
01:54:56.000 You know, of course you've got to suddenly go, well, okay, hold our hands up.
01:54:59.000 Okay, let's try not being liars anymore.
01:55:02.000 But what does that trickle down from?
01:55:03.000 Is it from the editors who lost sight?
01:55:05.000 Is it the writers themselves that they hire?
01:55:08.000 Is it the ethics behind the business of the magazine or of the newspaper?
01:55:11.000 Unfortunately, I don't know the ins and outs of the- I don't know if you can.
01:55:15.000 If I had to speculate, I'd probably say it's coming from the top, and they're just like, what the fuck is happening here?
01:55:20.000 Like we're seeing with these tech companies.
01:55:20.000 Right.
01:55:22.000 Like we're seeing with these people that are like Uber progressive in Google or wherever the fuck they are.
01:55:22.000 Yeah.
01:55:28.000 Yeah, that's a weird thing, man.
01:55:30.000 We're dealing with a real, I mean, I hate to use the word, but it's the real word, culture war.
01:55:35.000 Yeah, it is.
01:55:36.000 It's the correct term.
01:55:36.000 I mean, that's the term.
01:55:37.000 These people are culture warriors.
01:55:39.000 And this is why I term this in like power politics, because they are attacking people.
01:55:39.000 Yeah.
01:55:45.000 I mean, like attacking PewDiePie, they're costing him money.
01:55:47.000 I mean, they didn't even publish their article.
01:55:49.000 The Wall Street Journal didn't even publish their article before they had gone to Disney and got them to pull their money from him.
01:55:56.000 Wow.
01:55:56.000 So they brought that to Disney.
01:55:57.000 Before publishing? What's behind that?
01:56:00.000 Do you think it's because they heard that he was getting this big fat deal from Disney and they're like, fuck this guy?
01:56:05.000 It's because PewDiePie doesn't play ball with them.
01:56:05.000 Oh, yeah.
01:56:07.000 Doesn't play ball?
01:56:08.000 How so?
01:56:09.000 He's not a progressive.
01:56:11.000 He doesn't sit there going, oh, well, you know, feminism's a good idea, guys.
01:56:11.000 He's a liberal.
01:56:15.000 He's red-pilling the new generation about feminism.
01:56:18.000 He makes anti-feminism videos every now and again, and they're actually quite funny.
01:56:22.000 People should check them out.
01:56:23.000 He's just ragging on this because it's stupid shit.
01:56:26.000 It's the sort of stupid shit like we've been talking about.
01:56:28.000 And rightly so, this is bollocks.
01:56:30.000 And it's good that he's telling kids, look, this is crap.
01:56:33.000 Just treat people well.
01:56:35.000 Just treat people as an individual.
01:56:38.000 Well, I say male feminism.
01:56:39.000 It is crap.
01:56:40.000 But don't you think it's being sort of like it's eroding the same way male feminism is eroding?
01:56:44.000 Like that term is a very derogatory term now.
01:56:47.000 You call someone a male feminist.
01:56:49.000 Most people just go, ugh.
01:56:51.000 If you call someone a social justice warrior, I mean, that is a mocking term now.
01:56:55.000 That term is not something that someone carries.
01:56:57.000 There was a while where people were like, I read quotes like, yeah, we are warriors.
01:57:01.000 Like, what?
01:57:01.000 Yeah.
01:57:02.000 I mean, in the beginning, they were like embracing it.
01:57:04.000 I mean, it's not an SJW on the Joe Rogan podcast.
01:57:04.000 Yeah.
01:57:07.000 Aren't you like the number one podcast in the world?
01:57:09.000 It's probably up there.
01:57:10.000 Yeah.
01:57:11.000 It's not an SJW sat here telling me, listen, let me tell you how to check your privilege show.
01:57:15.000 No, I haven't had one of those on.
01:57:17.000 I would be fascinated.
01:57:19.000 You should get one on just to pick up the picture.
01:57:21.000 They probably know where your studio is and you have to talk to them for an hour.
01:57:24.000 But I can tell you what they think because I've spoken to them.
01:57:24.000 Yeah.
01:57:26.000 I mean, that's a cop-out because the people would be like, yeah, man, share both sides.
01:57:31.000 You should have both sides.
01:57:32.000 No, actually, I shouldn't.
01:57:34.000 I should just talk.
01:57:35.000 I'm just not interested in you.
01:57:36.000 You're creepy and weird.
01:57:36.000 Yeah, I mean.
01:57:37.000 Just be honest, but you've probably harassed someone.
01:57:40.000 There's some merit to having these volatile discussions with people that you disagree with.
01:57:45.000 Oh, yeah, I like it.
01:57:46.000 People keep asking me to talk to Richard Spencer.
01:57:48.000 It's like, yeah, but I've talked to SJWs already.
01:57:50.000 I know what the identitarians think.
01:57:52.000 Every time someone like Kat Blaque is sitting there going, oh, let me tell you about my blackness.
01:57:56.000 Who is that?
01:57:57.000 A YouTuber called Kat Blaque.
01:57:58.000 Oh, you're deep in the YouTube world.
01:58:00.000 Oh, yeah, of course I am.
01:58:01.000 You refer to him like we all know who you're talking about.
01:58:03.000 Yeah, yeah, sorry.
01:58:04.000 But she's actually really great.
01:58:05.000 She's talking about her.
01:58:05.000 She's a Carl Rowe.
01:58:07.000 Yeah.
01:58:07.000 She's a black trans YouTuber.
01:58:08.000 She's actually really great.
01:58:09.000 And people are going to be surprised to hear me say that.
01:58:11.000 But I've never done a video about her because she's not full of shit.
01:58:15.000 She's just an identitarian.
01:58:16.000 But that doesn't mean that she's bad or anything like this.
01:58:19.000 That just means she talks about herself a lot.
01:58:22.000 But she's actually got this really great streak in her that's really fair.
01:58:26.000 And so when, I mean, these people run like lemmings off a cliff.
01:58:29.000 When they get an idea in their head, they're like, oh my God, yes, we should all, you know, jump on, you know, like Laci Green when she abandons their community or something.
01:58:38.000 She's another YouTuber.
01:58:39.000 I'm so Jesus Christ.
01:58:40.000 Hey, you're on YouTube, right?
01:58:41.000 You're on YouTube.
01:58:42.000 You should.
01:58:42.000 Allegedly.
01:58:44.000 Who is Laci Green?
01:58:45.000 What is she?
01:58:46.000 She's a feminist who realized how toxic the community that she was part of is.
01:58:51.000 Millions of subscribers.
01:58:52.000 Oh, wait a minute.
01:58:53.000 Is she the girl that, it turned out, was dating some guy who is...
01:59:02.000 Is he a neo-Nazi?
01:59:03.000 No.
01:59:04.000 What are you joking around?
01:59:06.000 Yeah, he's probably watching this.
01:59:07.000 Okay, well, I don't give a fuck if he's watching.
01:59:08.000 Let's try to follow this.
01:59:09.000 Yeah, but I'm going to talk to him later.
01:59:10.000 Yeah, no, he's not a neo-Nazi, obviously.
01:59:12.000 Jesus Christ.
01:59:13.000 Well, what was his deal, though?
01:59:15.000 Well, nothing.
01:59:16.000 They start dating.
01:59:17.000 But isn't he like a...
01:59:21.000 He's pretty liberal.
01:59:22.000 Oh, that's what it is.
01:59:23.000 Okay, so he's an anti-SJW person.
01:59:25.000 That's what he is.
01:59:26.000 Yeah, we're basically disaffected liberals, mostly.
01:59:28.000 And so...
01:59:31.000 People got mad at her because she was dating this guy who they don't agree with.
01:59:35.000 Yeah, because he's against feminism and Black Lives Matter and things like that.
01:59:38.000 Why do people give a fuck who people date?
01:59:40.000 Why does that matter?
01:59:41.000 Because they're controlling psychocultists.
01:59:44.000 And they dragged her all across social media.
01:59:46.000 I mean, some of the worst things I've seen.
01:59:48.000 Oh, yeah, yeah, absolutely.
01:59:48.000 Really?
01:59:50.000 But what did they talk about?
01:59:51.000 Oh, like, just, you know, I mean, like, one thing that annoys me is that, like, you know, six months ago, everyone was like, yeah, we love Laci Green.
01:59:58.000 She's great.
01:59:59.000 She's a feminist.
01:59:59.000 She's, you know, her sex ed stuff's brilliant, blah, blah, blah.
02:00:02.000 And then as soon as she's like, you know, I kind of, I met this guy, I like this guy, and he's actually making some really good points.
02:00:07.000 They're like, your sex ed stuff sucked.
02:00:10.000 You suck.
02:00:11.000 Everything you've ever done sucked.
02:00:12.000 Get out of our community.
02:00:13.000 And then, I mean, you know, she gets death threats and things like that.
02:00:16.000 And she used to get them from like, I guess, like alt-right types who were in their basements or something who hated her.
02:00:20.000 But now she gets them from the SJW types.
02:00:23.000 I wish people would just not send threats and shit.
02:00:25.000 It doesn't do anything.
02:00:26.000 It just makes, it's just nasty for people.
02:00:28.000 Well, it definitely is that.
02:00:29.000 But it's just, it's so sad that someone can't just date someone that they don't agree with.
02:00:34.000 Oh, yeah.
02:00:34.000 It's fucking crazy.
02:00:36.000 She obviously likes the guy.
02:00:37.000 Of course.
02:00:38.000 And you can't change your opinion either.
02:00:40.000 No, your opinion can't evolve.
02:00:42.000 It can't.
02:00:43.000 You'll be a flip-flopper.
02:00:45.000 Yeah.
02:00:46.000 Can you even control what you believe?
02:00:49.000 I believe that you could, but then what did I just say?
02:00:53.000 I said, I believe that I could control what I believe.
02:00:56.000 Yeah, exactly.
02:00:57.000 That's really important.
02:00:57.000 Exactly.
02:00:58.000 So now you don't believe it, right?
02:00:59.000 Because you've just decided, I don't believe that now.
02:01:00.000 I'm going to believe that I can change now.
02:01:02.000 Well, I believe that people do change, for sure.
02:01:04.000 Well, no, no, but they do.
02:01:07.000 It's not like a conscious choice to say, you know what, one day, I mean, I'm an atheist, so one day I'm going to be like, actually, yeah, I'm going to choose to believe in God.
02:01:13.000 You either believe or you don't believe in it.
02:01:14.000 I don't think it's a choice.
02:01:15.000 Well, I can see what you're saying, but I think that when you're going over the possibilities of any subject in your mind, you're influenced by a host of factors.
02:01:29.000 First of all, how you physically feel, where you are in your life, what kind of relationship you have with your family, like where your job is right now.
02:01:38.000 I know a lot of people that hit the fucking bottom and then all of a sudden they became religious.
02:01:42.000 Do you think that's a coincidence?
02:01:44.000 No, it's like at that stage in your life, you needed to believe something.
02:01:48.000 And that helped you.
02:01:49.000 And it helped you to believe that thing.
02:01:51.000 And maybe, I mean, how many dudes find Jesus while they're getting their dick sucked in a jacuzzi?
02:01:56.000 Yeah, exactly.
02:01:56.000 It's probably pretty low.
02:01:58.000 Right?
02:01:59.000 Unless it's like a really bad blowjob and you just feel sad and you actually like it went really badly wrong.
02:01:59.000 Yeah, yeah.
02:02:04.000 Yeah.
02:02:05.000 Terrible.
02:02:06.000 Jesus, help me.
02:02:08.000 Help me, Jesus.
02:02:08.000 Jesus.
02:02:10.000 But, you know, people find, they find love and happiness in weird places.
02:02:15.000 They find depression in weird places.
02:02:17.000 So it's really a lot of what you believe depends on where you're at in your life.
02:02:22.000 It can very well be influenced where you go one way or the other.
02:02:26.000 And then once someone has an opinion, whether it's God is real or God is fake, they fucking fight that to the death.
02:02:35.000 Like that is their stance, and they can't change it.
02:02:38.000 They can't move it around.
02:02:39.000 And if you do, you're a flip-flopper.
02:02:41.000 Well, if you didn't reason yourself into it, then you can't be reasoned out of it.
02:02:44.000 That's true.
02:02:45.000 Well, that term flip-flopper is so fucking hilarious.
02:02:48.000 It's so stupid because it's a moronic political term and really only existed because politicians did not have a forum to defend the evolution of their ideas before.
02:02:57.000 Now they can go online.
02:02:59.000 Bernie Sanders has a fucking podcast, you know?
02:03:02.000 Yeah, he does.
02:03:03.000 They can go on YouTube or Facebook.
02:03:05.000 They don't need to be in some sort of a political forum, some sort of a formal debate.
02:03:12.000 Yeah, they can just talk about their ideas.
02:03:14.000 Like, this is what I used to believe, and this is why I changed my mind.
02:03:17.000 And they could do that over the course of an hour in a YouTube video.
02:03:20.000 And as clearly as the mind could decipher, state their opinions on things.
02:03:26.000 But because people couldn't in the past, like, he used to be tough on crime.
02:03:30.000 Now he's a flip-flopper.
02:03:31.000 Yeah, your policy is.
02:03:32.000 You know, it might not be what happened.
02:03:34.000 He might have used to be tough on crime, but then he was given evidence of systematic racism.
02:03:39.000 Then he realized how corrupt the police department is.
02:03:41.000 And then he realized that they have mandated quotas where they have to arrest a certain amount of people, and this is probably the best place to do it because those people don't have the money for legal representation.
02:03:50.000 So that's why my opinion evolved.
02:03:52.000 But you couldn't do that in the past.
02:03:54.000 So someone would just call you a flip-flopper.
02:03:56.000 Yeah.
02:03:57.000 The internet's changed everything, and for the better, long term.
02:04:01.000 And by the way, this is not discrediting real flip-flopping, right?
02:04:03.000 Well, no, obviously, if someone doesn't really hold a position and they're just doing it because it's a popular thing to do, like Hillary Clinton, for example.
02:04:10.000 Perfect example.
02:04:11.000 Clinton and Trump on gay marriage.
02:04:13.000 Yeah.
02:04:14.000 That's the best one.
02:04:15.000 Trump has been consistent on that for decades now.
02:04:15.000 Yeah, it is.
02:04:18.000 Hillary Clinton in 2008 was like, well, I believe it's men and a woman.
02:04:21.000 Until 2013.
02:04:22.000 It wasn't until 2013 that she finally came clean and backed gay marriage.
02:04:26.000 I didn't even know.
02:04:26.000 Right.
02:04:27.000 Up until 2013.
02:04:29.000 I believe there's a video of her from 2013, in fact, saying that marriage is between a man and a woman.
02:04:34.000 Of course she would.
02:04:35.000 And then suddenly, oh, no, you've got to change your position on that.
02:04:38.000 And she's like, a sacred bond.
02:04:40.000 Yeah, whatever.
02:04:40.000 I don't give a shit because I'm Hillary Clinton.
02:04:42.000 Yes.
02:04:42.000 I'm here for the money.
02:04:44.000 I'm a careerist.
02:04:45.000 I'm going to be the first woman president because I had to fuck Bill Clinton.
02:04:50.000 I would love to read what the crazy shit she said to those bankers in those quarter million dollar speeches.
02:04:55.000 I would love to.
02:04:57.000 No, the transcripts that she won't release.
02:04:59.000 No, Wikileaks has released a bunch of them.
02:05:01.000 I did not know that.
02:05:03.000 Really?
02:05:03.000 When did they do that?
02:05:06.000 Like, she wants open borders, like hemispheric open borders, where one hemisphere is just free and open, and another one is free and open.
02:05:15.000 To eventually get a united globalized world.
02:05:19.000 And, you know, there's loads of stuff in there that's just.
02:05:21.000 And she's saying how she holds a public position.
02:05:23.000 She said this to the bankers.
02:05:24.000 She said that in the speech, that she holds a public and a private position.
02:05:28.000 Yeah.
02:05:28.000 I did not know that they got that.
02:05:30.000 Why didn't Bernie Sanders ever discuss that?
02:05:32.000 Was this a brief thing?
02:05:34.000 He's a cuck.
02:05:35.000 I don't know.
02:05:36.000 No, no, no, don't even.
02:05:37.000 No, no, cuck.
02:05:38.000 Oh, a cuck.
02:05:39.000 No, he's not a cunt.
02:05:39.000 Bernie Sanders seems like a very nice man.
02:05:42.000 He's a cuck.
02:05:43.000 Well, I mean, you saw Black Lives Matter take over a stage, right?
02:05:46.000 Oh, yeah, I did.
02:05:47.000 That was horrible.
02:05:48.000 Oh, no, it was sad.
02:05:48.000 It was fucking sad.
02:05:49.000 Bernie, these little bullies have got no right to your platform, and they're literally screeching at him.
02:05:54.000 "We are being reasonable," screaming in his face. Like, Bernie...
02:05:57.000 As soon as someone approaches him, he needs them on his side.
02:06:00.000 It doesn't matter.
02:06:01.000 It doesn't matter.
02:06:02.000 That lost me.
02:06:03.000 I didn't approve of that.
02:06:05.000 I was at that point a supporter of Bernie.
02:06:06.000 I like Bernie.
02:06:08.000 I thought, okay, he's maybe a little further left.
02:06:10.000 I mean, I'd have to go.
02:06:12.000 But he took a chance in having him go up there.
02:06:15.000 I like that, though.
02:06:16.000 I like that he took the chance to let them talk.
02:06:18.000 I don't know.
02:06:19.000 I don't think they should be able to do it, but I think it does give you a sense of who he is.
02:06:24.000 That he's like, all right, tell me what you want.
02:06:26.000 Tell me what you're looking for.
02:06:28.000 He's just like standing at the side like this.
02:06:29.000 And it's like, sorry, you're not a leader, mate.
02:06:32.000 Well, if he moves his hands, that's a microaggression.
02:06:34.000 Exactly.
02:06:34.000 We've learned that from Evergreen State.
02:06:36.000 You should have had your boundaries.
02:06:37.000 I mean, you didn't immediately capitulate.
02:06:39.000 Right.
02:06:39.000 So that was your boundary.
02:06:41.000 You just didn't stick to it.
02:06:42.000 Oh, that's a good point.
02:06:44.000 You should have said, listen, no.
02:06:45.000 I'm going to call security.
02:06:46.000 You're going to be escorted away.
02:06:48.000 Maybe we can discuss it in future.
02:06:50.000 But if someone comes up to you and starts screeching in your face that they're being reasonable, you draw the line and you stick to it.
02:06:56.000 You do not let that person bully you out of your podium.
02:07:00.000 It's yours, not theirs.
02:07:01.000 They've got no right to it.
02:07:02.000 Tell them to get fucked.
02:07:04.000 Well, you know, he's being investigated now.
02:07:06.000 They're going after.
02:07:07.000 I heard, but I haven't looked into it.
02:07:09.000 Financial dealings with his wife and him in a small university where she made some decisions that ultimately didn't pan out financially.
02:07:19.000 And it caused the university to go under.
02:07:22.000 Oh, dear.
02:07:22.000 They acquired a bunch of new land and they were going to expand the university and they got a lot of loans and they started construction, all this jazz, and apparently it just wasn't sustainable.
02:07:32.000 And so now I think there's allegations, which of course, whenever something fails like that, there's allegations that someone mishandled something.
02:07:40.000 So they've been forced to return to the state.
02:07:41.000 How was this all dredged up?
02:07:43.000 I do not know.
02:07:44.000 It's been the news.
02:07:46.000 I haven't really deeply gone into the story.
02:07:50.000 But the basics of the story are she worked for this university.
02:07:55.000 She wanted it to expand.
02:07:57.000 They spent a lot of money expanding.
02:07:59.000 It didn't work out.
02:08:02.000 It doesn't sound on the face of it like malfeasance was necessarily involved.
02:08:06.000 The worry is that he used his influence to make certain things happen that allowed it to take place.
02:08:14.000 Perhaps that's the allegation, I believe.
02:08:17.000 I still think Ron Paul can win, to be honest.
02:08:20.000 He might.
02:08:21.000 He's old.
02:08:22.000 Yeah, he's way too old.
02:08:22.000 Bernie's way too old.
02:08:23.000 They're always too old.
02:08:24.000 Why are politicians so old, dude?
02:08:26.000 Because it takes that long before we trust you.
02:08:28.000 It's sad, though.
02:08:30.000 Isn't there like some guy in his 40s who's basically saying the same thing as Bernie Sanders?
02:08:36.000 We don't trust him.
02:08:37.000 Fucking creep.
02:08:39.000 Who gives a shit who he fucks?
02:08:40.000 This is the thing with Bill Clinton, man.
02:08:41.000 It really pissed me off.
02:08:42.000 It's like, look, I don't give a shit.
02:08:43.000 People do.
02:08:44.000 It's weird.
02:08:45.000 Well, it's because they can find out about it and they couldn't find out about it before.
02:08:48.000 So you got a guy like JFK who's a total pussyhound, who people revere as a great, great person.
02:08:53.000 And MLK.
02:08:54.000 Yeah.
02:08:54.000 Total pussyhound.
02:08:55.000 Supposedly.
02:08:56.000 Yeah.
02:08:56.000 Yeah, he was.
02:08:57.000 I don't know if that's true.
02:08:58.000 He's smoking that fire.
02:09:00.000 Who knows?
02:09:01.000 And this is the thing.
02:09:02.000 But why do we care?
02:09:03.000 Yeah, exactly.
02:09:04.000 That's the other question.
02:09:05.000 That's a problem with him and his wife.
02:09:06.000 That's nothing to do with me.
02:09:08.000 And it doesn't surprise me at all.
02:09:09.000 Great men always get women flinging themselves at them.
02:09:12.000 And okay, they're only human.
02:09:14.000 If you've got some really attractive woman flinging herself at you, going, go on, we can just, nothing's going to go wrong.
02:09:18.000 You know, it'll be great.
02:09:19.000 No one will know.
02:09:20.000 No fucking surprise.
02:09:21.000 It's not a judgment on their character either.
02:09:21.000 Right.
02:09:23.000 That's just them being a bit weak in the moment.
02:09:25.000 You know, they know it's wrong.
02:09:27.000 They know they shouldn't have done it.
02:09:28.000 All right.
02:09:29.000 Well, the culture was very, very, very different.
02:09:32.000 The culture of politics was very different back then.
02:09:34.000 Oh, yeah.
02:09:35.000 People knew about the president's affairs.
02:09:38.000 Who was the president that had polio?
02:09:41.000 I don't know.
02:09:42.000 Who's the president that had polio?
02:09:44.000 Had a wheelchair.
02:09:46.000 What is it?
02:09:47.000 Was it Roosevelt?
02:09:49.000 Is it the guy who's got all the little round glasses and it's like, ah, smile.
02:09:53.000 Was it Roosevelt that had polio?
02:09:55.000 I'd have to look it up.
02:09:56.000 Anyway, he was, yeah.
02:09:58.000 Franklin.
02:09:58.000 Teddy?
02:09:59.000 Franklin Delano Roosevelt.
02:09:59.000 Franklin.
02:10:02.000 He had polio, and the press knew it, and they hid it from everybody.
02:10:06.000 They just, no one.
02:10:08.000 Yeah, it's weird, isn't it?
02:10:09.000 Yeah.
02:10:09.000 I mean, that was the thing with JFK.
02:10:11.000 That was the thing with Lyndon Johnson, who famously held a press conference while he was taking a shit.
02:10:16.000 He was literally taking a shit with the door open in the stall when he was talking to the press.
02:10:20.000 Because there was like boundaries, you know.
02:10:23.000 They couldn't say that.
02:10:24.000 This guy had a code of honor.
02:10:25.000 Yeah, this guy was taking a horrible steamer.
02:10:27.000 No.
02:10:29.000 Well, I also think that once Watergate broke, once you realize that, hey, it's pretty obvious this president is a fucking liar and a crook, and we need, if we have any responsibility, we need to expose this.
02:10:41.000 Well, once you expose that, man, you open up that door, woo!
02:10:44.000 People find out the genie's out of the bottle, and then they want to know everything.
02:10:48.000 As soon as it becomes permissible.
02:10:48.000 They want to know.
02:10:50.000 Well, it's up to people not to care.
02:10:52.000 It's up to people to understand that, look, at the end of the day, you are the person who is making the difference.
02:10:56.000 And it's a small difference.
02:10:57.000 But if we all agree that we're all making a small difference and we think, okay, I'm going to act in a way that I find personally responsible.
02:11:02.000 If we all say, right, well, we're all going to act in ways that we find personally responsible, we'll just end up with a better world without having to change anything.
02:11:08.000 Right.
02:11:09.000 Yeah.
02:11:10.000 It's all in our heads.
02:11:11.000 Don't you think that today is probably harder than ever to manage what you're paying attention to and what you're not?
02:11:15.000 Oh, yeah.
02:11:16.000 Well, if you think about fake news and like parts and, I mean, that's a real problem with social media.
02:11:22.000 There have been studies done that show like, and it's more on the left, but it's also very much on the right.
02:11:26.000 You know, like, you know, I don't know, like 70, 60 or something like that, you know, but it's slightly more on the left.
02:11:31.000 Where people just essentially get one political viewpoint.
02:11:35.000 And so the social media is just a stream of content that's all telling you the same thing.
02:11:40.000 What was the story that just came out about a fake news story that cost some company some untold millions of dollars?
02:11:50.000 Might have even been, they might have even said a billion dollars in stock.
02:11:54.000 And it just came out today.
02:11:54.000 Really?
02:11:56.000 Like that a company was extremely devalued because of a fake news story.
02:12:02.000 That is a danger.
02:12:03.000 Yeah.
02:12:03.000 Oh, it's a big danger.
02:12:04.000 Especially if you trade in the stock market and then all of a sudden it causes a cascade of panic and people just start dumping their stock and you're fucked.
02:12:11.000 And it all turns out to be fake.
02:12:13.000 I mean the idea would be that it would balance out eventually, but if it was a fragile company, who knows?
02:12:18.000 But who knows what damage that could have done to them.
02:12:21.000 Did you find it?
02:12:22.000 It's a really recent news story.
02:12:24.000 It just came out today or yesterday.
02:12:27.000 It's the flip side of being able to hold people accountable.
02:12:29.000 Yes.
02:12:30.000 Yeah, you can hold it.
02:12:31.000 The internet's fantastic.
02:12:32.000 Social media's fantastic.
02:12:34.000 YouTube's fantastic.
02:12:35.000 And I think you can make real arguments that these things are now critical to free speech.
02:12:40.000 I agree.
02:12:42.000 And this should be understood.
02:12:44.000 But on the flip side, someone could say something that's not true.
02:12:47.000 That's free speech.
02:12:48.000 Yeah, well, it's unfortunate, but it's amazing when you really think about it how rare it is.
02:12:54.000 I mean, even though it's fairly common, you're talking about 7 billion people.
02:12:59.000 Yeah, even if it only happened once a day, that's still only one thing a day.
02:13:04.000 That's nothing.
02:13:04.000 But I mean, it is up to everyone.
02:13:06.000 I mean, like, they'll probably have to educate kids in school.
02:13:09.000 When you see something going on on social media, you're going to have to fact check it.
02:13:12.000 I think there's going to be much more sophisticated ways of understanding whether or not someone's being honest.
02:13:18.000 And I think what we're looking at now is just like when people started reading books that were written, they never imagined that someone was going to have a printing press.
02:13:27.000 And when people had a printing press, no one was ever going to imagine that you were going to put that on the internet.
02:13:31.000 So I think there's probably something around the bend right now for us that's going to radically change the factual interpretation of information.
02:13:41.000 Like you're going to know what's real and what's not.
02:13:44.000 And it's probably going to be something that you're not even going to understand what it's about now.
02:13:49.000 Like that as technology evolves in this bizarre and exponential way, they're going to reach some new critical way of distributing information.
02:13:57.000 And then things are going to get very strange.
02:13:59.000 Just the way they're strange now, where we have this radical new ability to put out content like you do or I do.
02:14:06.000 I think in the future there's going to be another change that's just as monumental and just as huge.
02:14:11.000 And it's probably going to deal with how we interface with that data.
02:14:14.000 Like some sort of a wearable thing or a mind thing where you can recognize patterns as being true or untrue.
02:14:23.000 Recognize humans as being honest or dishonest.
02:14:26.000 Like you're going to be able to see.
02:14:28.000 You're going to find it, man.
02:14:29.000 We're going to get to, you know, people are really worried about the lack of privacy now.
02:14:35.000 I feel like that is just step one to this ultimate, very bizarre, symbiotic relationship we're going to have with each other and technology where we're going to be inside each other's heads in a very new way.
02:14:48.000 It comes back to kind of what I was saying about racism.
02:14:50.000 Like the SJWs and like the Evergreen College students going, hey, if you do this, you're a racist.
02:14:55.000 Eventually they just go, yeah, okay, I'm a racist.
02:14:57.000 They'll just say it.
02:14:58.000 Okay.
02:14:59.000 And they'll be like, yeah, but you're a racist and then I know what's next.
02:15:01.000 And they'll be like, yeah, but you're supposed to do something because otherwise I'll call you a racist.
02:15:04.000 Yeah, okay, I'll just be a racist then.
02:15:06.000 It doesn't mean anything anymore if I'm a racist.
02:15:09.000 I have to consent to give them that power.
02:15:11.000 And if I don't consent, like, you know, if you, like, what was Kurt Eichenwald or whatever who got caught with the tentacle porn?
02:15:18.000 He should have just owned it.
02:15:19.000 Well, yeah, fuck you.
02:15:20.000 You probably watched some shit stuff too.
02:15:22.000 You know, what would they say?
02:15:24.000 Oh, my God.
02:15:24.000 He looks at tentacle porn.
02:15:25.000 He's like, yeah, and you look at what?
02:15:27.000 Go on, tell me what your porn watching habits are.
02:15:29.000 Go on, tell me about your puritanical porn watching.
02:15:32.000 You just watch two people in missionary for 20 minutes, and then that gets you off, does it?
02:15:36.000 Shut up.
02:15:37.000 Well, even if it doesn't get you off and you watch it, you know, like, what if you're just like, what in the fuck?
02:15:42.000 Like, if you look through my YouTube lists or you looked at my history online, you'd go, is that something you like?
02:15:49.000 No, I wanted to see what the fuck it was.
02:15:51.000 I just wanted to see what it was, yeah.
02:15:52.000 I don't like car accidents, but I watched that motorcycle kick that car and the car spins out and hits the barrier and then flips that other car.
02:16:01.000 Yeah, I saw it.
02:16:01.000 Everybody saw it.
02:16:02.000 Doesn't mean I liked it.
02:16:03.000 No, you know, but you know, a lot of things.
02:16:06.000 It's a lot of people.
02:16:07.000 Tentacle porn.
02:16:08.000 Yeah, wow.
02:16:09.000 Something about that makes me skeptical.
02:16:11.000 Wow.
02:16:11.000 He should have just owned it.
02:16:12.000 It's such a silly.
02:16:13.000 I don't even know what you're talking about.
02:16:13.000 Who is he?
02:16:15.000 This guy Eichenwald, I think it is.
02:16:17.000 Who is he?
02:16:17.000 I don't know.
02:16:18.000 I think he's a right-wing journalist or something.
02:16:19.000 I don't know.
02:16:20.000 Oh, and they caught him into tentacle porn?
02:16:22.000 Yeah, I guess it was in his search history or something.
02:16:24.000 Did you find that story about?
02:16:27.000 Well, there's someone.
02:16:28.000 Dude, I know I have that devalued story.
02:16:30.000 Like, someone sent me a story.
02:16:31.000 Yeah, yeah, it was definitely something.
02:16:33.000 But it doesn't matter.
02:16:34.000 But the thing is, this is the point: if we're going to a future where literally everyone's going to know everything about everyone, then nothing's really going to matter about any person.
02:16:40.000 I mean, there's gonna be like, well, yeah, everyone does this, you know, you know, you know, exactly who does this, so it's not gonna be shameful, nothing will be shameful, nothing will be shocking.
02:16:47.000 Everyone will just know everything about everyone all the time.
02:16:50.000 Yeah, that's the real worry, right?
02:16:52.000 And then we're gonna become okay here.
02:16:53.000 I got it right here.
02:16:54.000 Who knows?
02:16:55.000 Okay, the fake news.
02:16:56.000 Here it is.
02:16:57.000 I'll tell you, I found it so easy, Jamie.
02:16:59.000 So sad at your skills.
02:17:01.000 It says, fake news of fatal car crash wiped out $4 billion in, here's the company, Ethereum.
02:17:12.000 E-T-H-E-R.
02:17:14.000 No?
02:17:14.000 That's Bitcoin.
02:17:15.000 That's like a cryptocurrency.
02:17:17.000 Oh, that's why you didn't find it.
02:17:18.000 I didn't know what that was.
02:17:19.000 That's a new one?
02:17:20.000 Yeah.
02:17:21.000 Oh, God.
02:17:22.000 Okay, so fake news of Fatal.
02:17:23.000 It's on Quartz.
02:17:25.000 QZ.com.
02:17:27.000 Yeah.
02:17:28.000 Wiped out the $4 billion in their market value yesterday.
02:17:32.000 How the fuck do they have $4 billion in market value of a crypto coin that I've never even heard of?
02:17:37.000 No idea.
02:17:37.000 I don't know how cryptocurrency works, to be honest.
02:17:39.000 So that's...
02:17:44.000 Jesus Christ, I mean, if there's $4 billion in value to it, that's as legitimate as a company.
02:17:49.000 It's got to have some legitimacy.
02:17:50.000 Point being that a fake news story wiped out the value of it.
02:17:54.000 But maybe now's the time to buy.
02:17:57.000 Well, yeah, it's lost value, and it's probably going to go back up.
02:17:59.000 Because this wasn't a real thing about it.
02:18:01.000 Get on the ground floor.
02:18:02.000 Dive in.
02:18:03.000 Take a chance, son.
02:18:04.000 Yeah, make it.
02:18:04.000 Well, that is.
02:18:07.000 There's a fake story about the guy that created it died in a car crash.
02:18:11.000 Oh.
02:18:11.000 So everybody panicked.
02:18:12.000 Yeah, and then everyone panicked because the creator, the guy that's controlling the money and the market of it supposedly died.
02:18:19.000 That didn't happen.
02:18:19.000 Oh.
02:18:20.000 Oh, okay.
02:18:21.000 But still, fake news.
02:18:24.000 By the way, your stock's gone down $4 billion.
02:18:26.000 Fucking what?
02:18:27.000 I know, right?
02:18:28.000 Can you even imagine?
02:18:29.000 That's the other weird part.
02:18:30.000 It's not stock.
02:18:31.000 It's the market value of the coin.
02:18:34.000 That's another one of my favorite stories of the last year.
02:18:37.000 The Theranos story, that woman who was pretending to be the new female Steve Jobs, and she had the look, so everybody went with it.
02:18:45.000 And she had this blood testing procedure that totally didn't work.
02:18:50.000 It's a great story.
02:18:51.000 Yeah, go and tell you.
02:18:52.000 She was touted as being the next genius.
02:18:55.000 She dressed exactly like Steve Jobs.
02:18:57.000 She wore black turtlenecks and was really hilarious.
02:19:00.000 And she gave the speech to like women in something or another.
02:19:04.000 But the speech was so jumbled and shitty.
02:19:08.000 And there's so much poverty of words in her sentences.
02:19:12.000 It's like the content was just so bad.
02:19:14.000 I was like, this is a genius.
02:19:16.000 Like, maybe she's super awkward and nervous.
02:19:18.000 And maybe she just didn't prepare this speech at all.
02:19:21.000 But no.
02:19:22.000 Turns out she was a massive fraud.
02:19:24.000 The whole thing's a massive fraud.
02:19:25.000 And her $34 billion company vanished.
02:19:29.000 So the company had a product that they were using where they were doing these blood tests.
02:19:35.000 And a whistleblower from inside the company said, this person, they're ignoring all the bad data.
02:19:40.000 They're ignoring all the failed tests.
02:19:42.000 They're pretending it's much more efficient than it is.
02:19:44.000 So it just didn't work, right?
02:19:45.000 Dude, not only did it not work, hundreds of thousands of people who took these tests were exposed to potential health threats because they thought, well, now I know where my blood is at.
02:19:55.000 So you could have gone more cautious or less cautious based on this or got treatment that you didn't need based on this or didn't get treatment that you probably did need based on this.
02:20:08.000 Can you imagine the kind of person who does that, though?
02:20:10.000 She's standing there bullshitting and putting lives at risk.
02:20:12.000 Sociopath in some sort of a way, or unless she was lied to, too.
02:20:18.000 I mean, who knows exactly what the actual story is.
02:20:22.000 But the whistleblower saying, no, she's absolutely aware of it.
02:20:25.000 And so now she went from being like the number one self-made female entrepreneur ever, where she was worth like $34 billion or something crazy, to massive debt.
02:20:37.000 Now her company is worth nothing.
02:20:39.000 It doesn't work.
02:20:40.000 So she fired a shitload of people, and the company still exists as an entity, but it's like one of those really tortured things.
02:20:46.000 There's lawsuits are flying around left and right now.
02:20:49.000 But the whole story is a cautionary tale in what we're looking for.
02:20:54.000 We're looking for this thing, this new image.
02:20:56.000 You know, here's this woman.
02:20:58.000 She's fitting the part.
02:20:59.000 She's got a black turtleneck on.
02:21:01.000 A con artist came along and played you.
02:21:02.000 Dropped out of college at 19 to start this business.
02:21:06.000 I mean, she doesn't even have a background in medical science.
02:21:09.000 She doesn't have a degree.
02:21:11.000 It's not like she's a PhD in some sort of bioengineering or blood testing or whatever the fuck it would be.
02:21:17.000 Yeah, yeah.
02:21:18.000 It's a crazy story.
02:21:20.000 Because she was saying, hey, exactly.
02:21:23.000 She dressed the part, looked the part, said the right words.
02:21:26.000 Confidence trickster comes along and says, hey, you're interested in a woman doing this?
02:21:30.000 I'm a woman.
02:21:30.000 I'm doing this.
02:21:31.000 But I remember before she got exposed, watching that video of her talking, going, this does not seem like a genius to me.
02:21:36.000 Yeah, if she's just like rambling along and kind of disjointed or something.
02:21:40.000 It's just clunky.
02:21:41.000 Yeah.
02:21:42.000 Like, pull up, see if you could find Elizabeth.
02:21:46.000 Is that her name?
02:21:47.000 Stanos?
02:21:48.000 What the fuck was her name?
02:21:49.000 I forget what her name is.
02:21:50.000 Theranos was the...
02:21:52.000 What was her name?
02:21:53.000 I'm looking at pictures of her.
02:21:54.000 It doesn't say her name anywhere, though.
02:21:57.000 But no, I did hear about it, but I'll have to check it out some other time.
02:22:01.000 Yeah, but it's...
02:22:04.000 What's that?
02:22:05.000 Elizabeth Holmes.
02:22:06.000 Yeah, okay.
02:22:07.000 So it's not discrediting women.
02:22:09.000 This is not discrediting the idea of a genius woman.
02:22:11.000 This is not discrediting the idea of a genius entrepreneurial woman.
02:22:14.000 What this is fascinating to me is a person who fits all of these categories that people are clearly looking for, right?
02:22:22.000 And they just bought into it.
02:22:24.000 Identity politics.
02:22:24.000 This is what they wanted.
02:22:26.000 Confirmation bias.
02:22:27.000 And we love the fact that it's a strong woman who's doing this.
02:22:30.000 It's amazing.
02:22:31.000 I'm with her.
02:22:32.000 Hashtag.
02:22:33.000 They weren't with Marine Le Pen, though, were they?
02:22:35.000 Who's that?
02:22:36.000 Oh.
02:22:36.000 The French woman.
02:22:37.000 Oh, that's right.
02:22:38.000 Yeah.
02:22:40.000 Oh, that's the thing, isn't it?
02:22:41.000 Oh, well, you know, I mean, I know she's the woman, but I'm actually for Emmanuel Macron.
02:22:45.000 Yeah, but you're "I'm with her," aren't you?
02:22:47.000 I'm with her when she thinks right.
02:22:47.000 Right.
02:22:47.000 Oh, you're not.
02:22:49.000 So it's not that she's a woman, it's the way she fucking thinks.
02:22:52.000 Why can't we just be honest about this?
02:22:54.000 You know?
02:22:54.000 That Merkel lady's a trip, too.
02:22:56.000 Oh, Merkel's a cancer.
02:23:01.000 What are you doing?
02:23:02.000 What are you doing?
02:23:03.000 She's a trip.
02:23:04.000 When I know about her, if I know who's running Germany, the Germans just can't help themselves.
02:23:10.000 What are they doing over there?
02:23:13.000 With her.
02:23:13.000 With what?
02:23:15.000 I don't know.
02:23:16.000 Merkel's.
02:23:17.000 I speak to, again, my friend Kraut, who tells me about these things.
02:23:20.000 she's famous for her, and I don't want to say flip-flopping, but I would actually be a lot more charitable and say her navigation of the political winds.
02:23:29.000 Oh.
02:23:29.000 She's like Hillary in that way, I guess. Another one.
02:23:36.000 I think they're sort of political opportunists.
02:23:38.000 Sailors?
02:23:38.000 Really, really, it's being a smart politician, isn't it?
02:23:41.000 It's saying, well, okay, where are people going?
02:23:44.000 I don't think that's smart anymore.
02:23:46.000 Well, no, I think it means you lack credibility.
02:23:49.000 It makes you look like you don't stand for anything.
02:23:51.000 Well, look, as much as Trump lies, and he lied 600-plus times in the 100-plus days that he was in office, they did a count of all the times he said things that weren't true publicly, right?
02:24:04.000 As much as he does that, everybody knows that's what he does.
02:24:07.000 Like, they know who he is.
02:24:09.000 They know what he does.
02:24:10.000 They know he tweets all the time.
02:24:12.000 So even if he tweets something ridiculous, they know who that is.
02:24:15.000 That's that guy.
02:24:17.000 That's him.
02:24:17.000 That's what he really thinks.
02:24:18.000 That's why he's saying it.
02:24:19.000 You never got that with Hillary Clinton.
02:24:21.000 What you got is these weird, boiled-down messages.
02:24:25.000 It seemed like they were put together in some sort of a writer's room.
02:24:29.000 Yeah.
02:24:30.000 I mean, she doesn't control her Twitter account.
02:24:30.000 And it was.
02:24:32.000 No.
02:24:33.000 Undoubtedly.
02:24:34.000 But Trump undoubtedly is in control of his own Twitter.
02:24:36.000 100%.
02:24:37.000 And it's so hilarious.
02:24:38.000 I thought it was hilarious when she wrote to him, delete your account.
02:24:41.000 Yes.
02:24:42.000 So that is legit banter.
02:24:44.000 That's good.
02:24:45.000 That was good.
02:24:46.000 You got to give it up to her there.
02:24:47.000 But I don't know who wrote that.
02:24:48.000 It was probably some 20-year-old kid somewhere.
02:24:50.000 She probably wrote it.
02:24:50.000 Yeah.
02:24:51.000 Some girl working for her.
02:24:52.000 Good job, guys.
02:24:53.000 Yeah, she wrote it.
02:24:53.000 Who knows?
02:24:54.000 The social media manager of Wendy's tweeted out the Wendy's icon as a Pepe.
02:25:00.000 Oh, shitstorm.
02:25:00.000 Oh, no.
02:25:02.000 Absolute shitstorm.
02:25:03.000 They were forced to delete it, and who knows whether that person still has a job.
02:25:06.000 But people think that Pepe is alt-right, and you're saying Pepe's not alt-right.
02:25:10.000 The alt-right use Pepe, but everyone uses Pepe.
02:25:12.000 Pepe's a ubiquitous internet symbol of sort of dissidence and chaos.
02:25:18.000 More importantly, disrespect.
02:25:21.000 But isn't Kek like the god of chaos?
02:25:23.000 Yeah, yeah.
02:25:24.000 Isn't that a little made-up?
02:25:26.000 That's a little made-up myth.
02:25:27.000 Made up.
02:25:28.000 What?
02:25:29.000 Everything's made up.
02:25:30.000 What if you die and you meet Kek?
02:25:31.000 Would you be feeling bad because you said he was made up?
02:25:34.000 I'll be quite surprised.
02:25:36.000 I'll be like, look, hey, you're going to have to work with me here because let's be honest, would you have believed it?
02:25:42.000 No.
02:25:43.000 But I would be so shocked if he had a gold robe.
02:25:48.000 It was a frog.
02:25:49.000 Yeah, and he got to heaven.
02:25:51.000 And it literally looks like Pepe.
02:25:54.000 And he's got a gold robe on.
02:25:56.000 Can I piss off literally everyone who believes in a religion?
02:26:00.000 Why would you want to?
02:26:01.000 Because what about hope?
02:26:03.000 Doesn't it just strike you as the most childish fucking thing, though?
02:26:06.000 Depends on the mind.
02:26:07.000 When I die, I'm going to go somewhere else, you know.
02:26:10.000 That's starting to make sense to me.
02:26:10.000 Not Scientology.
02:26:12.000 I can tell, actually.
02:26:13.000 But I said, I've been watching it.
02:26:15.000 But no, seriously, it's obviously wishful thinking.
02:26:17.000 And it's obviously because you're afraid of dying.
02:26:18.000 And it's obviously that you just want to believe it.
02:26:20.000 It's not true.
02:26:20.000 You fucking might just want that gold medal.
02:26:22.000 That's what I'm saying.
02:26:23.000 It's not true.
02:26:23.000 I want that Scientology gold medal that they gave Tom Cruise.
02:26:28.000 The big one, like a laptop.
02:26:30.000 He looks like Flavor Flav up there with that fucking gold medal on.
02:26:33.000 But isn't it so arrogant to think that humans are so important that they're going to go to an afterlife?
02:26:38.000 It might be right.
02:26:39.000 You know, the afterlife isn't that arrogant to me because it's the idea is that you're trying to pass on life, right?
02:26:49.000 And the mind understands that this is sort of a futile effort.
02:26:53.000 You're temporary.
02:26:54.000 You have a finite life.
02:26:56.000 You know, like, what are you doing?
02:26:58.000 Why even bother?
02:26:58.000 Why not just end this now?
02:27:00.000 You're constantly suffering.
02:27:01.000 Just jump off a building.
02:27:02.000 So, no, no, no, we can't do that because there's a greater good waiting for us.
02:27:06.000 There's a greater thing.
02:27:07.000 There's another level of the video game.
02:27:10.000 That doesn't make sense.
02:27:11.000 If you're going to go to heaven when you die, kill yourself.
02:27:15.000 Right, that's because you can't.
02:27:16.000 That's a loophole.
02:27:17.000 So you can't kill yourself when you go to heaven.
02:27:20.000 Because when you kill yourself, you make the people around you that loved you suffer.
02:27:24.000 So you don't go to heaven when you kill yourself.
02:27:26.000 That's convenient.
02:27:27.000 Or it's God's faith.
02:27:28.000 What if they kill you?
02:27:29.000 What if you're part of a cult ritual?
02:27:31.000 Like, you know, we're all just going to drink the Kool-Aid, the poison Kool-Aid, and kill ourselves.
02:27:35.000 Well, they answered that with Jesus.
02:27:36.000 Jesus on the cross.
02:27:37.000 That's why we still do this.
02:27:39.000 We literally make the cross sign, the thing that killed him.
02:27:44.000 Bill Hicks had a great joke about that.
02:27:46.000 He goes, it's like going up to Jackie Onassis with a rifle pendant on.
02:27:49.000 He goes, just thinking of John, Jackie.
02:27:52.000 Imagine if you're like a pagan or something, back in like the 7th century or whatever, and you're Northern European Lithuanian or something.
02:27:58.000 And then these Germans come along and go, hey, we've got this religion about a dead guy.
02:28:03.000 We think you should join it.
02:28:04.000 And you'd be like, why?
02:28:06.000 Well, it's really sad.
02:28:08.000 It's a really sad story, but he died for your sins.
02:28:10.000 But yeah, everywhere you go, there's this guy just being killed on a cross.
02:28:13.000 You'd be like, you know, my religion is way more fun than that.
02:28:17.000 Yeah, you have the original sin, too.
02:28:19.000 That's the great thing that you brought up earlier about being white.
02:28:22.000 I mean, it's sort of like just being a human, the original sin.
02:28:26.000 The only way they can have complete, total control over you is to say that you are guilty no matter what.
02:28:30.000 Just by virtue of being born.
02:28:32.000 Believing you're guilty.
02:28:32.000 Yeah.
02:28:33.000 Then it works.
02:28:34.000 It starts from that.
02:28:34.000 Yeah.
02:28:34.000 Yeah.
02:28:36.000 That this all-knowing, infinite thing that looks over the universe is incredibly petty, very jealous.
02:28:43.000 Yeah, if you worship anything else, any other false idols, death.
02:28:47.000 You wear two different types of clothes, death.
02:28:50.000 Yeah.
02:28:50.000 You know, it's really interesting because there have been religions in the past that thought that that was an evil god that created the earth.
02:28:57.000 And everything in the earth is created by an evil god.
02:28:59.000 And there is a loving god beyond it, but you can't pray to it.
02:29:02.000 It has no interaction.
02:29:03.000 You've got to get past earth.
02:29:05.000 Yeah, basically.
02:29:06.000 You have to live a good life and eventually you'll get over there.
02:29:08.000 Well, maybe it's true.
02:29:09.000 Maybe it's like this is a level of the video game, right?
02:29:11.000 Maybe you get past this.
02:29:12.000 It's not true.
02:29:12.000 You don't know.
02:29:13.000 No, I do.
02:29:14.000 You get through and that's awesome.
02:29:14.000 None of it's true.
02:29:16.000 I'll put money on it.
02:29:17.000 Well, money's not worth anything in the afterlife, sir.
02:29:20.000 Okay, okay, well, whatever you want on it, right? And in the afterlife? That's like saying, if it's true, I'll blow you in the afterlife. But you're, you're like an ethereal being, you're like some fake... I know, man. You don't know. Dude, I do know.
02:29:33.000 This is what pisses me off.
02:29:34.000 I've had a near-death experience, man.
02:29:37.000 I've been there, man.
02:29:38.000 It's obviously such wishful thinking.
02:29:40.000 Well, anybody that says they know, as soon as someone says, I know what happens, even if you say, I know that you die and there's nothing.
02:29:47.000 Well, you definitely don't.
02:29:49.000 You definitely don't know. Definitely don't know. But if I'm going to put money on it, I mean, like... But I mean, like, now, you know.
02:29:58.000 No, no, it's not for me.
02:29:58.000 Yeah, but tomorrow.
02:30:00.000 I mean, tomorrow they might be able to prove it.
02:30:01.000 That would be amazing.
02:30:02.000 Tomorrow, you know, if you go to a betting agency, it's like, well, actually, scientists have discovered that God is real, and it turns out that there are 72 virgins waiting in heaven.
02:30:11.000 So maybe you should convert to Islam now.
02:30:12.000 You'd be like, oh, shit, okay.
02:30:14.000 Allahu Akbar.
02:30:15.000 Did you ever see that priest that got in trouble because he took photos of him in heaven?
02:30:20.000 And he was selling photos of him in heaven?
02:30:22.000 It is wonderful.
02:30:23.000 It's him, like, waving, and he's in, like, this pure white background.
02:30:28.000 And he said...
02:30:30.000 Oh, boy.
02:30:33.000 It's on my Instagram account from a while back.
02:30:37.000 Such a bunch of people.
02:30:38.000 You can find it, Jamie.
02:30:39.000 It's hilarious.
02:30:41.000 I mean, it's adorable because he's so fucking stupid.
02:30:46.000 Like, the thing is so fucking stupid that it makes you just go, it's like that.
02:30:51.000 I just found out about this, too, the other day.
02:30:53.000 Jeron Horton told me about the 18-year-old kid that was running a medical practice and he did not have any medical.
02:31:01.000 He was a chiropractor.
02:31:02.000 He was a chiropractor, basically.
02:31:04.000 But they busted him and he sat down and gave interviews and said that he's being persecuted and said all the true facts will come to light.
02:31:12.000 But the way it's edited is so funny.
02:31:14.000 I mean, it's really, really funny.
02:31:16.000 I mean, this guy's a loon, this kid.
02:31:17.000 Well, he sounds like, yeah.
02:31:18.000 Did you find the photo of the guy in heaven?
02:31:20.000 Wow.
02:31:20.000 Yeah.
02:31:23.000 It's not a great version of an alt text.
02:31:26.000 There he is.
02:31:28.000 He's got his hand up.
02:31:29.000 There's a full version of it where you can see more of him.
02:31:32.000 Yeah, it's like that.
02:31:34.000 But it's all just, it's not that.
02:31:35.000 It's white.
02:31:36.000 Is that a fake one?
02:31:38.000 Oh, it's WhatsApp.
02:31:39.000 He charged $5,000 for one.
02:31:42.000 Yeah, he was charging people for the photo of him in heaven.
02:31:47.000 And he was, I guess he was going to sign it.
02:31:50.000 Imagine what kind of moron you'd have to be.
02:31:54.000 Look at this.
02:31:54.000 Look at the quote.
02:31:55.000 I just want to bless everyone with the experience I had talking with Jesus.
02:31:59.000 He's a great guy and showed me a lot of love.
02:32:02.000 I can't wait to get back up there with him.
02:32:09.000 That's it.
02:32:09.000 That's it.
02:32:10.000 You met the king of kings.
02:32:11.000 You met the son of God.
02:32:12.000 You met a man who's come back to life.
02:32:14.000 That's the guy that's right.
02:32:15.000 And this is what he says.
02:32:16.000 He says, he's a great guy.
02:32:18.000 He showed me a lot of love.
02:32:19.000 Oh, well, for sure.
02:32:21.000 It sounds like he had a real transcendent experience with Jesus.
02:32:24.000 He's a great guy.
02:32:25.000 He had pictures on his galaxy smartphone.
02:32:28.000 He brought his fucking Samsung up into heaven.
02:32:33.000 The worst part about this is, that's basically what the Bible stories are based on a lot of the time.
02:32:37.000 It's like someone was, you know, like Enoch was taken up into the crystal ship of God or something.
02:32:44.000 Yeah.
02:32:44.000 It's just horseshit.
02:32:45.000 It's just horseshit made up by idiots thousands of years ago because they don't know what's going on.
02:32:49.000 They can't explain what's going on, but they are kind of scared of getting killed.
02:32:52.000 And that's because it might really happen really fucking soon.
02:32:55.000 Let's get rid of that.
02:32:57.000 And they're talking about an oral tradition.
02:32:59.000 Like it was talked for thousands of years.
02:33:02.000 And by the time someone wrote it down, wasn't the original Bible like a thousand years old by the time anybody wrote it down?
02:33:09.000 Well, undoubtedly, there was, I mean, we don't know how old because it was an oral tradition.
02:33:12.000 They didn't come from anywhere.
02:33:12.000 They don't even know, right?
02:33:14.000 Well, also when you hear like the similarities between the Sumerian text stories, like the epic of Gilgamesh is really similar to Noah's Ark.
02:33:23.000 Oh, yeah, yeah.
02:33:24.000 Utnapishtim, who survived the flood and whatnot.
02:33:27.000 And in fact, it's like really so similar.
02:33:30.000 It's almost the same words, just replacing God with gods.
02:33:33.000 Right.
02:33:34.000 Well, they obviously replace gods with God.
02:33:36.000 And the thing is, we can even pinpoint the sort of time that it's likely to have been adopted into the religion.
02:33:41.000 During the Babylonian exile, when the Assyrians conquered the nation of Israel and left Judah, they undoubtedly, they took them back to, well the Assyrians, the Babylonians, under Nebuchadnezzar, took them back to Babylon.
02:33:55.000 And then there the smart Jews were, the elites, the educated people.
02:33:59.000 And unsurprisingly, suddenly their mythos includes loads of Mesopotamian ideas.
02:34:04.000 Right.
02:34:04.000 Whoa, where did that come from, genius?
02:34:06.000 Oh, and then they go back and all of a sudden, oh, okay, it's incorporated.
02:34:09.000 Okay, yeah, it's this is all horseshit.
02:34:11.000 It's all horseshit.
02:34:12.000 And there's no benefit into really trying to include the older stuff that they find either and try to like figure out whether or not these are connected.
02:34:19.000 No, it's stupid.
02:34:20.000 It's just like, I mean, it's just like, look, there are so many gods.
02:34:22.000 Why that one?
02:34:23.000 Why that one?
02:34:24.000 You don't know.
02:34:25.000 Probably because you grew up with it, right?
02:34:27.000 You weren't reasoned into that position.
02:34:29.000 You can't be reasoned out of that position.
02:34:30.000 You didn't sit there and think, you know what?
02:34:31.000 I've had a think, and I think it's Marduk.
02:34:34.000 That's the god.
02:34:35.000 I was thinking about it.
02:34:36.000 I was thinking Tiamat.
02:34:37.000 Yeah, it was.
02:34:38.000 Well, Tiamat, you couldn't really have, because that's like a primordial god.
02:34:41.000 But that's what I like.
02:34:42.000 I like the old ones.
02:34:43.000 Yeah, but they don't.
02:34:44.000 You know, you couldn't pray to Tiamat.
02:34:45.000 It's like Krom.
02:34:46.000 No, pray to Krom.
02:34:47.000 He doesn't listen.
02:34:48.000 No, he doesn't.
02:34:49.000 I mean, I like Ashur, to be honest, you know, or Ishtar.
02:34:52.000 But, you know, it's just like, it's nonsense.
02:34:56.000 It's all nonsense.
02:34:57.000 Well, it's all a myth, and it's fascinating.
02:34:59.000 And what's interesting to me is the historical idea that these people were living by these stories like thousands of years ago.
02:35:07.000 And like what, like, you try to put it in context, imagine what it would be like living thousands of years ago with no science, with no education, limited understanding of the world around you.
02:35:18.000 And then there you are abiding by these ancient stories.
02:35:21.000 And it's a fascinating thing.
02:35:24.000 Like the Dead Sea Scrolls, that's really fascinating stuff.
02:35:27.000 They've been deciphering that forever, but they don't incorporate it into the Christian doctrine.
02:35:32.000 You know, it's not a problem.
02:35:34.000 It's already codified.
02:35:34.000 We can't revisit that.
02:35:36.000 Yeah, what do they do about that?
02:35:37.000 Because if they know that that's ancient religious text and it's as old as anything that they found, right?
02:35:42.000 It's the oldest version in the world.
02:35:43.000 Yeah, first century or something.
02:35:45.000 It's old as fuck.
02:35:46.000 Yeah.
02:35:47.000 And these, you know, they don't use it.
02:35:49.000 What are they going to do?
02:35:49.000 It's weird.
02:35:50.000 But that's because they'd have to change their beliefs and they weren't reasoned into these beliefs.
02:35:54.000 So they're going to be like, no, I don't want to.
02:35:55.000 They just ignore it.
02:35:56.000 They just ignore it.
02:35:56.000 They've got a narrative.
02:35:57.000 I wonder what the official position of the Catholic Church is about the Dead Sea Scrolls.
02:36:03.000 I don't give a shit what the Catholic Church says.
02:36:05.000 I'm fascinated by that.
02:36:06.000 Well, yeah, but one thing that bothers me about it is, I can't remember what I was going to say.
02:36:14.000 Just before, when I brought up the Catholic Church, you blew a fuse.
02:36:19.000 Yeah, just super bummed out about them.
02:36:21.000 Yeah, no.
02:36:22.000 The Catholic Church is horseshit.
02:36:24.000 Oh, it certainly is.
02:36:25.000 This is my problem then.
02:36:26.000 It's like, I mean, if someone wants to believe in stuff, that's fine.
02:36:32.000 This is why secularism is important, you know.
02:36:34.000 It's like, but it's like identitarian politics.
02:36:37.000 You know, it's just identitarians saying, yeah, but let me tell you about my experience, you know, but you're just talking about yourself.
02:36:41.000 Right.
02:36:42.000 You know, you're not talking about a problem that's external to you.
02:36:44.000 You're saying, I have a problem.
02:36:45.000 I believe something.
02:36:47.000 You know, I believe in God.
02:36:48.000 I believe that black people are this.
02:36:50.000 I believe that white people are this.
02:36:51.000 I mean, yeah, but I don't really care.
02:36:52.000 You know, tell me about the things that are external to you that actually are affecting things in you rather than things that are in you that are coming out.
02:36:58.000 I don't care about that.
02:36:59.000 You've got this the wrong way around.
02:37:01.000 I just find it fascinating that there's like a certain level of like when you go back in time, there's a certain level of history where you just ignore those ideas.
02:37:11.000 Like when you get past 4,000 years old, like the epic of Gilgamesh, there's no one who's trying to live by the Sumerian text.
02:37:19.000 There's no one who's treating that as a religion, right?
02:37:21.000 The ancient Mesopotamian stories are basically thought of as purely myths.
02:37:25.000 Just like Odin and Thor and Zeus.
02:37:28.000 That's all abandoned.
02:37:29.000 But at one point in time, that was the ideology to live by.
02:37:32.000 I like the way the gods never really get resurrected.
02:37:34.000 And I don't really believe the neo-pagans and stuff.
02:37:37.000 They say, oh, I actually believe in this.
02:37:39.000 No, you don't.
02:37:40.000 You just think it's cool.
02:37:41.000 They think it's cool.
02:37:42.000 They don't really believe in it.
02:37:43.000 I'm a Wiccan.
02:37:44.000 And that Wiccan is not old either.
02:37:48.000 That was the 19th century or something like that.
02:37:50.000 That was invented.
02:37:51.000 Oh, that's even dumber.
02:37:52.000 Yeah, I know.
02:37:53.000 Isn't it right?
02:37:55.000 It's like the oldest version of the Mormons.
02:37:57.000 Yeah.
02:37:58.000 Oh, I love the Mormons.
02:37:59.000 Joseph Smith, is it?
02:37:59.000 What is it?
02:38:00.000 Yeah.
02:38:01.000 Dies in a gunfight or something like that in a prison.
02:38:01.000 Yeah.
02:38:04.000 It's like, that's my kind of prophet.
02:38:07.000 He came up with it when he was 14, too.
02:38:08.000 There's golden plates that no one got to see.
02:38:10.000 Good stuff.
02:38:11.000 I mean, who do you know that's 14 that isn't full of shit?
02:38:14.000 Have you ever met a 14-year-old that tells you the truth?
02:38:17.000 They're not necessarily lying all the time, but they've done it.
02:38:19.000 He knew things.
02:38:21.000 This boy has wisdom.
02:38:23.000 Where did you learn this wisdom?
02:38:24.000 A golden tablet.
02:38:25.000 Come with me.
02:38:26.000 I'll show you where it is.
02:38:27.000 The angels knew we were coming and they took it away.
02:38:31.000 Well, I'm persuaded.
02:38:33.000 Do you have your magic seer stone?
02:38:35.000 Oh, no, they took that as well.
02:38:37.000 God, it's so close.
02:38:39.000 So close.
02:38:39.000 Yeah, but they did tell me before they left that I could have nine wives.
02:38:44.000 So which one of these bitches wants to be one?
02:38:46.000 Do you know one of my favorite things about the Quran and the Hadiths is that Muhammad basically used to pull these fucking dictates from God out of his ass.
02:38:54.000 And at one point, his child bride turned around to him and said, you know, paraphrasing, but you know, these are coming at surprisingly convenient times for you, Muhammad.
02:39:04.000 It's like, oh.
02:39:07.000 You got called out by your own child bride, Muhammad, because it's obvious you were pulling this shit out of your ass for your own convenience.
02:39:14.000 It's just like, and there are some really gross things about Muhammad as well.
02:39:18.000 I better not go into it because someone will probably kill me.
02:39:20.000 Isn't that amazing that you really would have to worry about that?
02:39:23.000 Yeah, legit, yeah.
02:39:24.000 Yeah.
02:39:25.000 I mean, people who have mocked him, some of them we've lost. I think it's worth talking about Wahhabism.
02:39:32.000 I've been learning about that a lot recently.
02:39:34.000 Okay.
02:39:35.000 It's crazy, and Muslims hate it.
02:39:37.000 It's like the SJWs of the Muslim world.
02:39:40.000 Imagine if the country was taken over by rabid SJWs who had access to weaponry.
02:39:46.000 That'd be scary, right?
02:39:48.000 Yeah.
02:39:48.000 Imagine if they had in every city like SJW police who went around saying, right, blacks over there, whites over there, men over here, women over here.
02:39:56.000 And if you were even caught in, you know, fraternizing with the opposite race or sex, that's it.
02:40:01.000 You're going to get the lash.
02:40:02.000 Maybe you get your hand chopped off.
02:40:03.000 Maybe, you know, whatever punishment intersectional social justice would mete out if given the option.
02:40:09.000 That's basically what it's like in Saudi Arabia and under ISIS.
02:40:13.000 So it really is, in a lot of ways, a lot of things that we've been talking about today, they're very similar.
02:40:18.000 And these are deep grooves that are carved in the human psyche.
02:40:22.000 Imagine the kind of person who wants to be the person who enforces that.
02:40:27.000 Piece of shit.
02:40:27.000 Right.
02:40:28.000 That's why the social justice warrior is a piece of shit.
02:40:31.000 Control freaks.
02:40:31.000 Yeah, control freaks.
02:40:33.000 Moral busybodies who've got no real morals at all.
02:40:35.000 And people who lack a certain amount of objectivity, which is why that certain lady you kept talking about earlier feels like she can say those terrible things about you.
02:40:46.000 But she can't see it in herself.
02:40:48.000 Yeah.
02:40:49.000 And ironically, everything they're saying is projection.
02:40:51.000 I mean, who's doing the abusing?
02:40:52.000 Who's doing all of this?
02:40:54.000 It's not me.
02:40:56.000 None of this is really true.
02:40:58.000 But yeah, the Islamic world, the Wahhabis, it was really interesting.
02:41:01.000 I read Force and Fanaticism by Simon Ross Valentine, an English professor who went and lived down in Saudi Arabia for three years to learn about their ideology.
02:41:10.000 Really, really, really fascinating.
02:41:13.000 One of the most fascinating things is how much other Muslims don't like the Wahhabis.
02:41:17.000 And I'm using the term Wahhabi, but what it is, is a school of Islamic thought called Salafism, and it can best be described as autistic screeching.
02:41:27.000 It's like taking a really hardline interpretation, ignoring other arguments against this hardline interpretation, and going, you know what, just no, not even listening.
02:41:36.000 And then just carrying on and seeing where that train of logic gets you to.
02:41:39.000 And suddenly it gets to religious police, suddenly it gets to throwing gays off buildings and stuff that regular Muslims don't feel the need to do, because, you know, they'll be like, well, Muhammad's not pro-gay.
02:41:48.000 I mean, and this is another thing I'll get into.
02:41:50.000 But Muhammad says, don't kill people.
02:41:52.000 You know, at some point in the Quran, he says that, and so they kind of say, right, okay, well, I'm not going to kill someone.
02:41:57.000 You know, I mean, okay, maybe the gay thing was wrong because a lot of Muslims are actually pro-death sentence for homosexuality.
02:42:03.000 But, you know, they'll have, I mean, everything's reasoned.
02:42:06.000 Everything is argumentation.
02:42:08.000 Right.
02:42:08.000 And so the important thing is to understand that they are persuaded into becoming jihadis.
02:42:14.000 And it's not, and this is one thing the West has to understand.
02:42:18.000 It is not that we are at war with them, right?
02:42:22.000 You can go to ISIS's, like, they produce magazines.
02:42:25.000 There was one called Dabiq, which was a city they owned, but they lost it.
02:42:31.000 So they created a new one called Rumiyah, which is Rome.
02:42:35.000 Arabic for Rome.
02:42:36.000 Which they have conquered before, not ISIS, but Islam, and want to conquer again.
02:42:41.000 And in them, they literally say, I can't remember, I think it was like issue 14 of Dabiq or something.
02:42:46.000 They literally turn around and say, We do not fight you because you bomb the Middle East.
02:42:50.000 We fight you because you are not Muslims.
02:42:53.000 And no, it's not because you bomb the Middle East.
02:42:56.000 Even if you stop bombing the Middle East, we will fight you until you become Muslims.
02:43:00.000 This is why we do it.
02:43:01.000 Don't get me wrong, you bombing the Middle East isn't making us like you anymore.
02:43:04.000 You know, we're going to keep doing it.
02:43:06.000 We fight you because you insult Muhammad.
02:43:08.000 We fight you because you don't believe in Allah and this Prophet in the last day.
02:43:11.000 This is why we fight you.
02:43:12.000 Do not misunderstand.
02:43:13.000 And they literally address the liberal or the progressives or, you know, whatever you'd like to call them.
02:43:18.000 They're literally talking to these people saying, don't think that we're crazy and we don't know.
02:43:22.000 We just, this is what we believe.
02:43:24.000 You are cancer and we want to exterminate you, basically.
02:43:27.000 And you get the SJWs saying, well, I mean, it's kind of the bombing.
02:43:30.000 I mean, it's American foreign policy.
02:43:32.000 No, it's nothing to do with that.
02:43:33.000 It's all about their argumentation.
02:43:35.000 Even if we were just peaceful, peace-loving people who did nothing, they would still be like, well, Allah has told me to cut your head off.
02:43:41.000 I mean, but again, this is a small minority who are widely despised by other Muslims.
02:43:47.000 I mean, like, for example, if you went to a Christian school, everyone would probably be fairly chill, you know, even a Catholic school or something.
02:43:55.000 And most people you talk to, they'd be fairly chill.
02:43:57.000 But in every group of people, you'd have a couple of people who'd be the sort of people watching you.
02:44:02.000 And if you started doing something, they'd go and get the nuns.
02:44:04.000 You know, they'd go, whack, you know, how dare you?
02:44:07.000 You know, they are those people.
02:44:09.000 It's just for Islam, they end up going and killing things.
02:44:13.000 Because in Islam, you're told, right, if you die, and this is another thing that drives me crazy, they're like, oh, the 9-11 hijackers were drinking and carousing and doing drugs.
02:44:21.000 They're obviously not real Muslims.
02:44:22.000 No, you fucking idiots.
02:44:24.000 They think they're having all of their sins expunged the second they die in the service of Allah.
02:44:30.000 This is what martyrdom is.
02:44:31.000 No matter what you've done, you go to heaven and get your virgins.
02:44:36.000 So why wouldn't you do whatever the fuck you like?
02:44:39.000 They have a free pass because they know they're going to die.
02:44:41.000 Exactly.
02:44:42.000 So that's what they do.
02:44:43.000 And when they're like, oh, ISIS aren't really Islamic, because look at, they're doing all these un-Islamic things.
02:44:47.000 No, they know they're going to die.
02:44:48.000 And, well, you know, in the logic of the theology that they live under, they think they're going to die and that they're going to get it all rescinded, remission of sins.
02:44:58.000 Isn't that the most fascinating part of it?
02:45:00.000 It's just the things that people are willing to believe.
02:45:02.000 Yeah, but they've been reasoned into it. If you take, say, right, I believe that the Quran is the word of God, and I believe that the Hadiths and the Sunnah are the actual sayings and doings of Muhammad, and you believe these things are true, then you're building from a certain base, and basically it's how you construct arguments.
02:45:18.000 And if you start leaving out parts of what otherwise is like a regular Muslim sort of theological argument, then you end up building like a weird tower over here that starts going off, you know, and the arguments start constructing themselves off to the left rather than staying reasonably central.
02:45:33.000 I mean, like, most Muslims are never going to become terrorists because they don't autistically believe that a bunch of things just aren't true just because they don't want them to be true.
02:45:40.000 You know, they're not cherry-picking from the Quran as much as the, well, you know, to the extent that the Wahhabists or Salafists are.
02:45:47.000 And they all believe, do you want to know what they hate?
02:45:50.000 Do you know what their watchword is?
02:45:51.000 It's the dumbest thing in the world.
02:45:53.000 What?
02:45:54.000 There will be no more innovations.
02:45:56.000 What?
02:45:57.000 No more innovations.
02:45:58.000 Muhammad was the perfect man.
02:46:00.000 We must live like Muhammad.
02:46:02.000 Therefore, I mean, you would not believe how dumb this is.
02:46:06.000 This is really part of their doctrine.
02:46:08.000 Yes.
02:46:08.000 It's called Bida.
02:46:10.000 Bida.
02:46:11.000 How do you spell that?
02:46:12.000 B-I-D-A.
02:46:13.000 I think it might have an H on it.
02:46:14.000 I'm not sure.
02:46:15.000 But I don't speak Arabic.
02:46:17.000 And the idea is that there are to be no innovations.
02:46:20.000 No.
02:46:21.000 But what about the innovation that they already use, like Toyota trucks?
02:46:24.000 Yeah.
02:46:25.000 Like, for example, the King of Saud had a hell of a time.
02:46:29.000 Right, okay, so Saudi Arabia was founded by the Ibn Saud family in conjunction with Abdul Wahhab.
02:46:37.000 He was a known, in like the late 18th century, he was a known theologian, and he was a brilliant man, but he was, I can't think of a better way of putting it than autistic, because he was really focused, hyper-focused on a single sort of strand of thought.
02:46:54.000 And everything that he thought was either black or white.
02:46:56.000 There was no gray in between, right?
02:46:57.000 And I mean, I literally mean he may well have had autism.
02:47:01.000 I'm not just using it as an insult.
02:47:02.000 He may well have literally had autism.
02:47:04.000 It's either this or nothing at all, you know?
02:47:06.000 And so he ends up building this theology.
02:47:08.000 And it's not bad.
02:47:10.000 Well, no, no, that's not what I mean.
02:47:12.000 It's not that it's not bad.
02:47:13.000 It's that there is an argument there.
02:47:17.000 There obviously must be an argument there, otherwise people wouldn't believe it, right?
02:47:21.000 Wait, wait, you're getting me really confused.
02:47:23.000 Okay.
02:47:24.000 What do you mean, there must be an argument, otherwise people wouldn't believe it?
02:47:26.000 What he's saying, his interpretation of the Quran and Islam and all of this, he's got a particular interpretation.
02:47:32.000 And he says to someone, hey, you should follow my interpretation of Islam because I've actually got the right interpretation of Islam.
02:47:40.000 And most people were like, no, you don't.
02:47:42.000 No, you don't.
02:47:43.000 You're ignoring almost everything.
02:47:45.000 And that's fine.
02:47:47.000 What does that have to do with innovation?
02:47:48.000 Right, well, that's the thing.
02:47:49.000 Like, bida is, you know, harmful innovations.
02:47:53.000 And don't get me wrong, harmful innovations are bad by definition, but there are also good innovations.
02:47:58.000 So, you know, the telephone, you know, the cars, all this sort of stuff.
02:48:01.000 These are considered good innovations, and so other Muslims are free to use them.
02:48:04.000 The Wahhabis, I imagine now at this point, they've been persuaded that they are okay.
02:48:11.000 But when, like, for example, Ibn Saud first tried to explain to them, look, we need guns, and, you know, we need a maxim gun.
02:48:20.000 We need planes.
02:48:22.000 We need bombs and bullets and cars.
02:48:24.000 All of these now televisions and satellite dishes and all this sort of thing.
02:48:29.000 They were actively resistant to this because, oh, Muhammad didn't have this.
02:48:33.000 And he had these things called Ikhwan warriors.
02:48:36.000 I'm probably pronouncing that horribly.
02:48:38.000 So apologies, anyone who knows how to pronounce these words because I've only ever seen them written down.
02:48:43.000 But these guys were like ISIS.
02:48:45.000 I mean, the way Saudi Arabia was formed is exactly the same way as ISIS is being, the Islamic State is being formed now.
02:48:52.000 Exactly the same.
02:48:53.000 These guys were the most brutal men you've ever seen.
02:48:55.000 And they would just go in and murder entire villages, right?
02:48:57.000 He had 150,000 of these people, you know, that he used to conquer what is now Saudi Arabia.
02:49:03.000 And eventually, he was like, I want a car.
02:49:08.000 I think a plane is a good idea.
02:49:10.000 And they were like, no, no.
02:49:11.000 And so they revolted against him.
02:49:13.000 And the British, seeing him as a useful client, decided to help him out with a few maxim guns and a few, like, you know, biplanes or something.
02:49:20.000 So they revolted against him because he wanted a car?
02:49:23.000 Yeah, because he's un-Islamic.
02:49:24.000 Muhammad never had a car.
02:49:26.000 And the British killed them all.
02:49:26.000 Right?
02:49:29.000 Which was probably, honestly, it was probably a good thing for the world.
02:49:32.000 Don't get me wrong, because these guys were nuts.
02:49:34.000 They would regularly just, I mean, and this is the thing.
02:49:36.000 It's another thing they've got, the takfiri doctrine.
02:49:39.000 You're not a real Muslim.
02:49:41.000 That's an interesting thing, isn't it?
02:49:42.000 And it's interesting how we do that to them.
02:49:45.000 We takfir them.
02:49:46.000 You're not a real Muslim either.
02:49:48.000 We're not in a position to tell anyone who's a real Muslim or not.
02:49:50.000 You know, if they consider themselves Muslim, they're Muslim.
02:49:53.000 And I mean, like, literally, they are the most Muslim in their opinion.
02:49:56.000 In their opinion, nobody else is a real Muslim.
02:49:58.000 And therefore, it's okay to kill them.
02:50:01.000 You only have to give a shit about actual Muslims under this doctrine.
02:50:04.000 Again, this is not what most Muslims think.
02:50:06.000 This is the antithesis of what most Muslims actually think.
02:50:09.000 And so these guys used to go on huge raids.
02:50:12.000 Is it funny that you even have to qualify that?
02:50:14.000 Like, you have to stop and say, I know.
02:50:15.000 You know, the other nutty shit is not as nutty as this nutty shit.
02:50:19.000 Yeah, and just while we're on the subject, right?
02:50:21.000 Islam is totally illiberal.
02:50:24.000 A Western liberal country should never defend Islam. That's very Islamophobic of you.
02:50:30.000 It's the most illiberal thing.
02:50:30.000 Exactly.
02:50:31.000 Are you Islamophobic?
02:50:32.000 Oh, absolutely.
02:50:33.000 I would hate.
02:50:34.000 You would hate to live under Sharia law.
02:50:36.000 Well, you might not, well, not you personally, but as a man, you would entirely benefit from living under Sharia law.
02:50:41.000 For real?
02:50:41.000 How so?
02:50:42.000 But girls have to walk around like beekeepers, right?
02:50:45.000 Not always.
02:50:46.000 That's actually not.
02:50:47.000 How do you know what they really look like?
02:50:49.000 That's actually not in the Quran.
02:50:50.000 It just says be modest, right?
02:50:51.000 Oh, modest.
02:50:52.000 You could just wear a headscarf or something.
02:50:54.000 Well, it's all relative, isn't it?
02:50:55.000 If you're like an internet hoe, modest is like cheeky shorts.
02:50:58.000 Yeah.
02:50:59.000 And if you're an autistic ISIS member, that's full-on burka with a grill over your face.
02:51:03.000 So what's interesting about this conversation...
02:51:14.000 In a legal system?
02:51:15.000 In the legal system.
02:51:16.000 So do they use that legal system?
02:51:17.000 Like, they use the Quran to enforce the laws?
02:51:21.000 So if a woman says something like, oh, Sargon raped me, and you'd be like, bitch.
02:51:26.000 You need a female witness.
02:51:27.000 He has two points.
02:51:28.000 You need more than one female witness.
02:51:30.000 Yeah, you need another female witness to testify.
02:51:32.000 And even then, the worst part about it, she only counts as half.
02:51:37.000 It's entirely possible that the judge will punish her for tempting me.
02:51:41.000 Yeah, because if she has a half an opinion and then the other chick has a half an opinion, that's still only like one person's whole story.
02:51:47.000 And that one person's a woman.
02:51:48.000 Yeah, this runs through the whole thing.
02:51:49.000 I mean, like, women are entitled to half of the inheritance money a man is entitled to.
02:51:54.000 Where does the other half go to?
02:51:56.000 Any brothers or, you know, anything like that.
02:51:59.000 If you're a lone sibling, then yeah, okay.
02:52:00.000 A lone child, then yeah, okay, you're fine, obviously.
02:52:02.000 But like, if you've got a male sibling, you'll get half of what he gets.
02:52:05.000 Wow.
02:52:06.000 Pretty horrible, isn't it?
02:52:07.000 But that's pretty ancient.
02:52:08.000 Yeah, and this is just like the most mild thing.
02:52:10.000 And like the legal system is basically about as useful as Hammurabi's code of laws, where an eye for an eye, a tooth for a tooth.
02:52:18.000 I mean, that was outdated 2,000 years ago in the West.
02:52:22.000 You know, with the advent of Jesus, but not in Islam.
02:52:24.000 It's really brutal stuff.
02:52:25.000 And it's just like, okay, well, I mean, any progressive who ever advocates for anything positive about Islam is an idiot and doesn't understand Islam.
02:52:36.000 They've got, I mean, I've seen them going, Islam is the most feminist religion.
02:52:39.000 Yeah, if you want to be made a second-class citizen, it is absolutely.
02:52:42.000 If that's what you want feminism to be, the installation of a systemic oppression against women, sure, that can be feminism if you like.
02:52:50.000 And that really is revealing about feminism, isn't it?
02:52:53.000 What is going on lately with this weird sort of acceptance of, how do you say, hijab?
02:53:00.000 How do you say it, the headgear?
02:53:01.000 Hijab.
02:53:02.000 Do you know what offends me more than anything, right?
02:53:04.000 But what's going on with that?
02:53:06.000 There's this weird thing where women are like being empowered.
02:53:08.000 They're pretending.
02:53:10.000 But they're pretending they're empowered by this and that this is their own choice and that this is a feminist value.
02:53:15.000 Unbelievably stupid.
02:53:16.000 But it's very common.
02:53:17.000 Have you seen it?
02:53:18.000 Yeah, I'm seeing a lot.
02:53:19.000 And one thing that I've never really thought of myself as the sort of person who gets offended, but when I see an American flag hijab, I find that offensive.
02:53:26.000 Come on, they have those?
02:53:27.000 Oh, yeah.
02:53:28.000 I know.
02:53:29.000 I literally look at it and go, no, this is the antithesis of what these values are meant to represent.
02:53:32.000 That seems like something a biker would wear.
02:53:35.000 It sounds like a parody.
02:53:37.000 Sounds like a parody.
02:53:38.000 But maybe that's really what it is.
02:53:40.000 Maybe like one of them Duck Dynasty guys.
02:53:43.000 Maybe, maybe, maybe I've been fooled.
02:53:45.000 Oh, my God.
02:53:46.000 Isn't that just the symbol of liberty?
02:53:48.000 Cover yourself up.
02:53:49.000 Is that real?
02:53:52.000 Wow.
02:53:53.000 Cover your fucking hair up, you know.
02:53:56.000 So is she wearing that to show that she's very American?
02:53:59.000 That when Trump is saying that he wants to shut down mosques.
02:54:03.000 Did you see that whole Megan Kelly thing with Alex Jones?
02:54:07.000 I saw that he scooped her, but to be fair, I mean, Alex probably did say a lot of crazy shit, and he probably did lie about a lot of crazy shit, didn't he?
02:54:15.000 Well, in the past, he definitely said crazy shit about Sandy Hook.
02:54:18.000 But she was saying that she was...
02:54:23.000 I've known Alex a long time.
02:54:26.000 It's not that.
02:54:27.000 It's just that he's just always looking for conspiracies.
02:54:32.000 Like, when you say you know for sure that something's a conspiracy, when you don't really know for sure, you're just saying it.
02:54:39.000 And then you just run with it.
02:54:40.000 When you're talking about children's lives, it's very dangerous.
02:54:44.000 It's horrible.
02:54:45.000 You better have some bloody convincing proof.
02:54:47.000 And he's abandoned it.
02:54:49.000 He's abandoned that.
02:54:50.000 Yeah, but he hasn't abandoned it and apologized for saying it.
02:54:54.000 What he's done is he's made some video where he was talking to the parents directly and saying that he couldn't imagine, and he feels their pain, and he feels very sorry for their loss.
02:55:14.000 But he doesn't go as far as saying, I'm sorry I supported a hoax that said you were liars and your children were never killed.
02:55:22.000 You know, there was a really crazy story that I read about a guy who was a conspiracy theorist, and then his own son was murdered in Sandy Hook.
02:55:28.000 And then he realized, holy shit.
02:55:32.000 He realized like, oh my God, this is how insane these conspiracy theories are.
02:55:36.000 There's a whole movement right now where they think that polio was a massive hoax, that there was no polio, and that by injecting people with the polio vaccine they hurt all these people.
02:55:54.000 As someone who has been fooled before, I can tell you the only solution is education.
02:55:58.000 You can't just tell these people they're idiots and never talk to them because they'll just think, well, you don't have anything to rebut me with.
02:56:03.000 Flat earth.
02:56:04.000 Right, yeah, exactly.
02:56:04.000 The flat earth.
02:56:05.000 You've got to engage with these people.
02:56:06.000 I mean, that's weirdly popular these days.
02:56:08.000 Weirdly.
02:56:09.000 It's like, what the fuck do you think the satellites are for?
02:56:11.000 Well, I put that Elon Musk thing up.
02:56:12.000 Have you seen that Elon Musk thing I put on Instagram?
02:56:15.000 Check this out.
02:56:16.000 Pull this up.
02:56:17.000 Elon Musk posted something about SpaceX.
02:56:22.000 They can shoot something into space now and then land it on a pad.
02:56:25.000 They've figured out how to do that.
02:56:26.000 And so they put up a sped-up version.
02:56:29.000 So instead of taking an hour or whatever it takes to get up there and get back.
02:56:33.000 But check this out.
02:56:34.000 Look how beautiful that looks when they're coming down on the earth.
02:56:36.000 But then watch.
02:56:37.000 But why doesn't the water fall off, Jay?
02:56:39.000 I don't know.
02:56:40.000 But check this out.
02:56:41.000 The water, it does.
02:56:44.000 So watch how this thing lands.
02:56:46.000 This is really kind of crazy.
02:56:48.000 Boom.
02:56:48.000 Isn't that nuts?
02:56:50.000 Fucking hell, right?
02:56:51.000 Yeah, that's incredible.
02:56:52.000 Dude, Elon Musk for president.
02:56:54.000 Well, I don't know about his politics, but I mean, leave Elon Musk to build shit.
02:56:58.000 That's amazing.
02:56:59.000 That's wonderful.
02:56:59.000 But look at that fucking video of the Earth.
02:57:02.000 So all these dip shits in the comments.
02:57:05.000 I knew it when I put it up.
02:57:06.000 Fisheye lens.
02:57:07.000 I was going to put it up because it's awesome.
02:57:09.000 Just, it's awesome.
02:57:10.000 I'm going to put it up whether or not dipshits think the earth is flat or not.
02:57:13.000 But you clearly see the fucking circle.
02:57:16.000 You see the curve of the earth.
02:57:18.000 And there's so many people in the comments.
02:57:20.000 Like, it's CGI.
02:57:21.000 It's fake.
02:57:21.000 You really believe that, bro?
02:57:22.000 You're such a sellout.
02:57:24.000 You're such a sellout, bro.
02:57:25.000 Why do we have satellites?
02:57:26.000 What's the point?
02:57:28.000 They don't think they're satellites.
02:57:29.000 But you can see them launching in space.
02:57:31.000 No, no, no.
02:57:31.000 No.
02:57:32.000 No, they're flying around in the planes.
02:57:34.000 Okay, why can't you see the sun at any time of the day or night?
02:57:37.000 Why can't you?
02:57:38.000 Yeah.
02:57:38.000 Because the sun goes around behind the flat Earth.
02:57:41.000 What?
02:57:42.000 No.
02:57:44.000 It doesn't make any sense.
02:57:45.000 Yeah, no.
02:57:46.000 Why are you saying it like you're looking for a rational explanation?
02:57:49.000 Well, yeah, but that's the thing, isn't it?
02:57:50.000 Because let's say, you know, this is the flat Earth.
02:57:52.000 The ice wall.
02:57:53.000 You got the sun.
02:57:53.000 And they say about it like it's like a, I don't know, like a fucking, like a spotlight, right?
02:57:58.000 And it is for some reason rotating around.
02:57:59.000 But it's like, well, at any point, I should be able to point over and go, oh, look, there's the sun.
02:58:02.000 It's just shining downwards, you know.
02:58:04.000 You have a poor understanding of flat earth theory, sir.
02:58:07.000 I think I probably do.
02:58:08.000 I think you're absolutely right.
02:58:09.000 But yeah, the only way to get rid of these things is to educate people.
02:58:12.000 You can't ridicule them.
02:58:13.000 Right.
02:58:14.000 Because they're not necessarily being irrational.
02:58:16.000 They're just dumb.
02:58:17.000 But it's like everything else we've talked about, where people have an idea in their head and then they dig their heels in the sand.
02:58:23.000 And here's the thing.
02:58:24.000 Once you've publicly said you think the earth is flat, and once you've said that, you have to sort of stick to your guns.
02:58:32.000 Well, this is.
02:58:33.000 Because if you come out and say, I'm sorry, I'm so stupid.
02:58:36.000 I really did believe these fucking idiots on YouTube instead of believing actual astrophysicists, actual people who are involved in the fields of cartography, airline travel, shipping routes.
02:58:47.000 I mean, Jesus fucking Christ.
02:58:48.000 You can go on and on and on about satellites, aerospace engineering, all the different people that probably would have to be in cahoots.
02:58:56.000 But let me tell you about the wall of ice.
02:58:58.000 There's a photo of it.
02:58:58.000 The walls of ice.
02:58:59.000 Someone sent it to me.
02:59:00.000 You fucking idiot.
02:59:01.000 Here's the wall.
02:59:02.000 It's like, it's 10 feet high.
02:59:03.000 That's a piece of shit.
02:59:06.000 I watched a video by some YouTuber about this.
02:59:09.000 And some of the people.
02:59:11.000 It's not the problem, man.
02:59:12.000 Fucking YouTubers.
02:59:14.000 It is a part of the problem.
02:59:15.000 But he was like, why do we believe in a flat earth?
02:59:18.000 And I was like, great.
02:59:20.000 I want to know.
02:59:21.000 That was legitimate.
02:59:22.000 Vaccines and chemtrails.
02:59:24.000 Actually, it was the hope.
02:59:25.000 Bigfoot.
02:59:26.000 It gives us hope. Man, we'll talk about Bigfoot.
02:59:28.000 It's a hope?
02:59:28.000 They get a hope?
02:59:29.000 Yeah, it gives them hope.
02:59:30.000 Like, hope beyond the ice wall that a better land exists.
02:59:33.000 And it's like, why would it be any better than what we've got now?
02:59:36.000 Don't you have a photo, you dumb cunts?
02:59:38.000 Not one guy has taken a fucking camera over to that ice wall, you piece of shit.
02:59:42.000 Can't we fly a drone over?
02:59:43.000 Show me the edge.
02:59:44.000 Show me the outside edge.
02:59:46.000 And it's a conspiracy to keep it hidden.
02:59:46.000 Why?
02:59:48.000 No.
02:59:48.000 Fucking why.
02:59:49.000 Here's what it is.
02:59:50.000 People love to know things that other people don't know.
02:59:54.000 They love when they uncover a truth, when there's a secret that someone is hiding about something as monumental as the shape of the earth.
03:00:02.000 People love to be the people in the know.
03:00:04.000 They love to be hashtag woke.
03:00:08.000 That's what it is.
03:00:09.000 And they don't like to read books.
03:00:11.000 No, no, no, they don't.
03:00:12.000 And I think for some of them, though, like, I mean, this guy who's saying, look, it gives us hope.
03:00:16.000 And it's like, that's because you don't have any hope.
03:00:19.000 You need something.
03:00:20.000 And you're obviously not a religious person.
03:00:21.000 And so you take on this as kind of a religious belief.
03:00:24.000 No, no, no, you're wrong.
03:00:24.000 Listen, a lot of them are religious.
03:00:26.000 That's part of the thing.
03:00:27.000 Part of the whole thing is that evolution is a lie.
03:00:30.000 That's one of the big parts of flat earth theory is that the earth is flat, the firmament.
03:00:34.000 Breaking the condition of today.
03:00:36.000 Yeah, man.
03:00:36.000 Listen, there's a big religious part of this whole thing.
03:00:40.000 Evolution is a lie.
03:00:41.000 A huge part of the flat earth movement is a belief in creationism, a belief that space is not real and that above us is heaven.
03:00:50.000 That we live in a dome.
03:00:52.000 No, there's a lot of different factions to the flat earth movement.
03:00:55.000 I'm sure there's some good reasons.
03:00:58.000 Yes, I have.
03:00:59.000 But no, a big part of this is a deep belief in creationism.
03:01:05.000 Oh, so it's an American Christian thing.
03:01:08.000 Oh, yeah.
03:01:08.000 The ones who believe in it also don't believe in dinosaurs.
03:01:11.000 They think dinosaurs are fake.
03:01:13.000 This is wonderful, right?
03:01:15.000 The main guy believes that dinosaurs are fake.
03:01:18.000 By the way, oh, he thinks nuclear bombs are a hoax.
03:01:20.000 Yes, nuclear bombs are a hoax.
03:01:20.000 What?
03:01:22.000 Oh, what does it do?
03:01:22.000 Just a bomb.
03:01:24.000 Just a bomb.
03:01:25.000 Just a really big.
03:01:25.000 Not a big deal.
03:01:26.000 It's a bomb.
03:01:27.000 It's a hoax.
03:01:28.000 Yeah, but like.
03:01:29.000 This is a massive hoax.
03:01:30.000 They spell it.
03:01:31.000 He believes it's a bomb.
03:01:31.000 Just a really big one.
03:01:32.000 Dinosaurs are a hoax, too.
03:01:33.000 Yeah, I believe they are.
03:01:34.000 Bro.
03:01:34.000 Chemtrails are real, though.
03:01:35.000 Of course they are.
03:01:37.000 The irony is chemtrails probably could be real.
03:01:40.000 Geoengineering is a thing.
03:01:41.000 Oh, yeah.
03:01:42.000 And so, like, it could actually be right.
03:01:43.000 And I could see the rationale from the government's point of view.
03:01:45.000 If they're like, well, look, global warming's happening.
03:01:47.000 And if we put extra clouds in the sky, then it'll deflect an excessive amount of the sun's rays and cool the earth down.
03:01:54.000 Well, it does do that.
03:01:55.000 Yeah, exactly.
03:01:55.000 But it doesn't mean it's a conspiracy to do that.
03:01:57.000 That's why they're stupid.
03:01:58.000 Well, yeah, it's not...
03:01:59.000 They're not trying to...
03:02:01.000 They think they're trying to poison people, or they don't understand that that's what happens when a jet engine goes through condensation in the air.
03:02:09.000 It creates these fake clouds.
03:02:10.000 And yes, these fake clouds can cool the earth down a little bit.
03:02:15.000 They put them out on purpose?
03:02:16.000 Yeah, but they're not putting them out on purpose.
03:02:17.000 They're just a side effect of air travel.
03:02:19.000 Like they're making these assumptions based on seeing something in the sky and thinking someone's spraying something.
03:02:25.000 Yeah, but wasn't there actually like a geoengineering thing that was actually something they're doing or planning to do?
03:02:29.000 There is definitely the possibility of one day us having no real protection, no ozone layer.
03:02:37.000 There's a possibility.
03:02:38.000 And one of the things that scientists have speculated is that suspending some sort of reflective particles in our atmosphere, like satellites, that that would somehow or another protect us from solar radiation.
03:02:51.000 That's what I was thinking.
03:02:51.000 So it's something that people have thought of.
03:02:53.000 Yeah.
03:02:53.000 Whether they're doing all that.
03:02:54.000 Well, for sure someone has thought about doing it if that's the issue.
03:02:57.000 If we're missing some sort of an atmosphere, can we create an atmosphere?
03:03:01.000 How would we do it?
03:03:01.000 Yes, we can.
03:03:02.000 We'd spray chemicals in the sky.
03:03:03.000 Okay.
03:03:04.000 But that's not what you're seeing when you're seeing those clouds behind jets.
03:03:07.000 It's really simple science.
03:03:09.000 The heat of the jet engine creates it.
03:03:11.000 Combined with the constant, yeah, the condensation in the atmosphere, it creates these artificial clouds.
03:03:16.000 They look like clouds because they are clouds.
03:03:18.000 Yeah, yeah.
03:03:18.000 And that makes perfect sense.
03:03:20.000 But to the armchair conspiracy theorist that looks up, they're like, they're spraying us, man.
03:03:25.000 Death spray in the sky, man.
03:03:27.000 Can I come out and defend Alex Jones for a minute?
03:03:29.000 I love Alex Jones.
03:03:29.000 You know what I'm saying?
03:03:30.000 I know.
03:03:31.000 You're going to defend him about Sandy Hook?
03:03:32.000 No, no, I'm going to defend him about the frogs.
03:03:35.000 What about the frogs?
03:03:36.000 They are turning the frogs gay.
03:03:36.000 It's true.
03:03:38.000 Oh, they turned frogs gay.
03:03:39.000 You know what's really ironic as well?
03:03:40.000 The globalists are turning the frogs gay.
03:03:42.000 Well, there is.
03:03:43.000 You know, I mean, I'm actually being literally serious when I'm saying that.
03:03:45.000 Right, well, there's pollution.
03:03:46.000 Yeah.
03:03:47.000 Yeah, pollution actually does make the frogs gay.
03:03:49.000 It's corporate malfeasance.
03:03:50.000 And it's only a certain kind of frog and a certain kind of pesticide.
03:03:54.000 I actually read through the study.
03:03:55.000 He actually played my video on his channel before.
03:03:57.000 My video was on my old channel that got taken down by a man.
03:04:00.000 Your channel got taken down?
03:04:02.000 It was just a little shitposty channel.
03:04:03.000 A shitposty channel, what do you mean?
03:04:03.000 Don't worry about it.
03:04:06.000 Just put bullshit on there.
03:04:07.000 Bullshit, I found funny.
03:04:07.000 Oh.
03:04:08.000 And one of the funny things was looking into this study Alex Jones is citing, so I thought I'd go and get the study, actually read it.
03:04:13.000 And he's right, kind of.
03:04:16.000 For example, these people are actually globalists.
03:04:18.000 It's like a multinational corporation.
03:04:20.000 People who are actually open borders.
03:04:21.000 They are pro-open borders.
03:04:22.000 They're pro-you know, corporate influence around the world because they run a giant corporation.
03:04:27.000 The chemicals they were putting in the water, technically, he wasn't even going far enough.
03:04:31.000 These chemicals were making the frogs trans.
03:04:34.000 Like, it would, for 90% of the frogs, and it was only a certain chemical, a certain pesticide, and it was affecting a certain kind of frog.
03:04:41.000 So, and it, it doesn't, I know.
03:04:43.000 How does it make them trans?
03:04:44.000 Well, I don't know the process, obviously.
03:04:46.000 I'm not a scientist, but like, um, but I mean, for 90% of them, oh, yeah.
03:04:51.000 Well, no, no, literally, for 90% of the frogs that were touched, something like that, I'll have to, you know, check the numbers.
03:04:56.000 But for a large, for the majority of the frogs touched, it would make them behave in like the sort of feminine way.
03:05:02.000 Male frogs, it'd make them behave in a feminine way.
03:05:04.000 And for 10% of the frogs touched by this chemical, it would actually set them through a process to transform into a fertile female frog.
03:05:12.000 So it wasn't even making them gay.
03:05:12.000 Holy shit.
03:05:14.000 It was literally making them trans, right?
03:05:16.000 That's not even trans, right?
03:05:17.000 But that's literally like changing your actual sex.
03:05:20.000 Yeah, yeah.
03:05:20.000 It's literally like transitioning into an actual female that's actually fertile.
03:05:24.000 So it's better than we can do for human beings.
03:05:26.000 But this was actually a real thing.
03:05:29.000 It wasn't like an intentional plan to try and turn people into gays or trans or whatever, though.
03:05:35.000 And it was done by people who are globalists, probably, but it wasn't done as part of a nefarious plot.
03:05:40.000 It was done because it was cheaper to just dump the chemicals, you know, obviously.
03:05:43.000 And it was just an unintended side effect.
03:05:45.000 It's just ironic that when Alex Jones says, you know, the globalists are putting chemicals into the water that make frogs gay.
03:05:50.000 Well, he's right.
03:05:51.000 Well, you know, during Desert Storm, there was a project that they put together where they were trying to figure out a gay bomb.
03:05:58.000 Yeah.
03:05:59.000 Do you know about that?
03:06:00.000 Yeah, I heard about that.
03:06:00.000 Like, these soldiers were about to start like, well, okay, I'm not feeling very horny, but now I've decided I like men.
03:06:06.000 They would blow this stuff in the air, blow it up, and it would land down on them, and it would transfer them into gay people, and they'd be so ashamed that they would stop fighting and lose morale.
03:06:16.000 This was a real consideration.
03:06:17.000 They spent millions of dollars on these.
03:06:20.000 Yeah, exactly.
03:06:21.000 These guys were fearsomely gay.
03:06:23.000 That was a superpower to the Thebans.
03:06:25.000 Scientists developed gay bomb to make enemy soldiers stop fighting and make love.
03:06:30.000 It says 2007?
03:06:31.000 Do, do, do, do, do, do.
03:06:34.000 Scientists at Wright Laboratory in Dayton, Ohio, working to make America's military even mightier, made the discovery in 1994, according to the detailed papers unearthed through a freedom of information request.
03:06:46.000 And last night, they were finally rewarded with an Ig, or...
03:06:52.000 Ig Nobel Awards, sorry.
03:06:54.000 Ig Nobel Prize for Peace, a spoof of the Nobel Prize.
03:06:57.000 Oh, okay.
03:06:58.000 Due to be announced next week, Marc Abrahams, editor of the Annals of Improbable Research and the man behind the Ig Nobel Awards, explained, we don't know if this document was the start and end of it or whatever.
03:07:10.000 In fact, this project continued and perhaps continues to this day.
03:07:16.000 So to this day, they might be working on a gay bomb.
03:07:18.000 Do you think Trump puts a stop to that?
03:07:20.000 He doesn't like unnecessary spending.
03:07:22.000 He's pro-gay, though.
03:07:23.000 Yeah, but he doesn't want more gays.
03:07:25.000 It's too complicated to change people.
03:07:26.000 What if a regular guy like him gets bitten by that spider?
03:07:29.000 By a radioactive gay spider.
03:07:30.000 Do you know about that tick that turns people into vegans?
03:07:35.000 Go on down.
03:07:36.000 There's a tick.
03:07:37.000 This tick bites you and you develop a meat allergy.
03:07:40.000 Oh, really?
03:07:41.000 It's called a lone star tick.
03:07:41.000 Yep.
03:07:43.000 Where do they live?
03:07:44.000 Where they live?
03:07:45.000 Probably Texas because it's called a lone star tick.
03:07:47.000 But I think they're spreading.
03:07:50.000 It would be ironic if it was in the meat-eating capital of the world.
03:07:53.000 Lone Star Tick, making people allergic to red meat.
03:07:56.000 You could still eat the fuck out of some chicken, though.
03:07:58.000 I'll tell you that.
03:08:00.000 Do you reckon it was engineered by the government?
03:08:02.000 I just think it's just nature deciding we're tired of their bullshit.
03:08:06.000 Look, it's been recorded as far north as Maine and all the way to central Texas and Oklahoma.
03:08:11.000 Wow.
03:08:12.000 The ticks are found in the dense undergrowth and wooded areas as well as around animal resting areas.
03:08:18.000 You know, I've been thinking about this, man, because I run in the hills where these ticks are.
03:08:22.000 I saw a tick, like Ari and I did a podcast the other day that we will release on July 18th, which is the day that Ari's Netflix special comes out.
03:08:29.000 But we did a Podcast where we went on a hike and I put on a weighted vest and we went through the woods and did the whole thing that I do when I run.
03:08:35.000 And a tick landed on Ari and we knocked it off and then we started talking about it and like going, is there a way to stop ticks from coming on you?
03:08:44.000 Is there like an effective do they have a spray?
03:08:47.000 Like ticks are not like mosquitoes, right?
03:08:49.000 No.
03:08:50.000 Is there an effective tick spray?
03:08:52.000 No idea.
03:08:53.000 You should know, dude.
03:08:54.000 You're so smart.
03:08:55.000 Dude, I don't know about ticks.
03:08:57.000 I'm not that smart.
03:08:57.000 There's tick repellent.
03:08:58.000 Oh, there is.
03:08:59.000 I don't know how effective it is, though.
03:09:01.000 Hmm.
03:09:02.000 Yeah, mosquito repellent doesn't always work.
03:09:04.000 I get bit.
03:09:05.000 It never fucking works.
03:09:06.000 I get bitten loads.
03:09:07.000 You know what works?
03:09:08.000 Thermacell.
03:09:09.000 You ever use a Thermacell?
03:09:10.000 No.
03:09:10.000 Oh, man.
03:09:11.000 I don't know if it's bad for you, but it's amazing.
03:09:13.000 I don't even know what that is.
03:09:14.000 A Thermacell is a piece of equipment that people take on hikes and camping and hunting.
03:09:20.000 Yeah, I don't go to the woods much.
03:09:22.000 There's this chemical that screws into the bottom, and then it has an element.
03:09:26.000 You click it and it heats up, and it makes this fine mist through this element.
03:09:31.000 And this fine mist, it doesn't smell bad to people, but mosquitoes cannot fuck with it.
03:09:39.000 They just vanish.
03:09:42.000 It's not poisonous long-term or anything?
03:09:44.000 I don't know, man.
03:09:44.000 That's the problem.
03:09:46.000 I don't want to be doing this, and then five years later, you've got dementia.
03:09:49.000 I already have it.
03:09:50.000 No, no, no.
03:09:51.000 Panicking?
03:09:51.000 Yeah, I'm panicking now.
03:09:52.000 Shit.
03:09:53.000 Some things I can't remember.
03:09:54.000 So Bigfoot then.
03:09:56.000 Yeah, what about them?
03:09:57.000 Have you seen one?
03:09:58.000 Oh, no, I definitely haven't.
03:09:59.000 But I went looking.
03:10:00.000 Yeah.
03:10:00.000 Did you?
03:10:01.000 I did a whole TV show about it.
03:10:02.000 Yeah, did you?
03:10:03.000 Oh, I didn't see it.
03:10:04.000 Well, I did this show that I used to do on the sci-fi channel, and it made a lot of people mad at me, and it also made me realize what conspiracy theories really were all about.
03:10:15.000 The show was called Joe Rogan Questions Everything.
03:10:18.000 So I would go and I would embed myself with these people and hang out with them.
03:10:21.000 And I embedded myself with these Bigfoot people.
03:10:23.000 And what I said at the end of it was, here's one thing that you don't find when you go looking for Bigfoot.
03:10:28.000 Black people.
03:10:30.000 You're more likely to find Bigfoot than you are black people looking for Bigfoot.
03:10:34.000 It's a bunch of unfuckable white guys out camping.
03:10:39.000 Oh, Jesus.
03:10:40.000 No, it's horrible.
03:10:41.000 Black people don't want to have nothing to do with it.
03:10:41.000 It's true.
03:10:43.000 It's a goofy white obsession.
03:10:46.000 And it's also, there's no one that I talked to.
03:10:50.000 There's like one lady that had a pretty good story.
03:10:53.000 She believed what she was saying.
03:10:55.000 But here's the thing.
03:10:55.000 Black bears are huge.
03:10:57.000 And you can get a black bear that's seven feet tall easily.
03:11:01.000 And they walk on their hind legs sometimes.
03:11:02.000 They just do.
03:11:03.000 Everybody knows they do.
03:11:04.000 There's video of them.
03:11:05.000 It's not a mystery.
03:11:06.000 Black bears walking through the woods on hind legs that are seven feet tall.
03:11:09.000 You would think that's Bigfoot.
03:11:10.000 This is really just turning my life upside down.
03:11:12.000 I mean, if there's no Bigfoot, but the Earth is flat.
03:11:14.000 What do I believe?
03:11:15.000 Well, I don't think the Earth is flat either.
03:11:17.000 I think it's just everything's a lie, man.
03:11:20.000 This is all psychology.
03:11:21.000 But the Bigfoot thing, there are so many really convincing stories.
03:11:24.000 That's the thing.
03:11:25.000 I've never heard one.
03:11:27.000 I haven't heard one of them.
03:11:27.000 Sorry?
03:11:28.000 What, convincing stories?
03:11:29.000 Of Bigfoot?
03:11:30.000 No.
03:11:30.000 Oh, I've heard loads where they're like, you know, I was like, you know, five feet away from this thing, and it didn't have like, you know, it didn't have a bear's face, it had a man's face.
03:11:38.000 And they'll be like, you know, and it was, you know, looking at me and stuff like this.
03:11:41.000 Is that okay?
03:11:42.000 But nobody has ever got a good picture of any of these things, you know?
03:11:44.000 Yeah, nobody believes that.
03:11:45.000 But nobody who goes to the woods on a regular basis believes it, other than Les Stroud, Survivorman, he believes it.
03:11:50.000 Yeah, but there are never any bodies.
03:11:52.000 There's never any feces.
03:11:54.000 I've heard bears make a fucking monkey noise, too.
03:11:57.000 I heard him say that.
03:11:58.000 He's like, it made it sound like, ooh, hoo-hoo-hoo-hoo.
03:12:00.000 And he's like, it was clearly a primate sound.
03:12:03.000 But I've heard, I've seen with my own eyes in the woods a bear do that with another bear.
03:12:08.000 They make that when they fight with each other.
03:12:10.000 Yeah, it must be bears.
03:12:11.000 It must be.
03:12:12.000 I can't remember.
03:12:13.000 Because, I mean, you can't have like, you can't have like a species of Australopithecus or whatever it's supposed to be.
03:12:18.000 Gigantopithecus.
03:12:19.000 Sorry, yeah, gigantopithecus.
03:12:21.000 Whatever it's supposed to be.
03:12:22.000 And just like, I mean, don't get me wrong, America's a huge place.
03:12:25.000 Yeah.
03:12:27.000 You would find something.
03:12:27.000 You would find something.
03:12:28.000 You'd find a dead one or something after a while.
03:12:31.000 There's enough people looking, but the thing is, when they found out about that Hobbit guy, do you know the Hobbit person?
03:12:40.000 Homo floresiensis.
03:12:41.000 Florencesis.
03:12:42.000 Florences.
03:12:43.000 Yeah, I know the thing you're talking about.
03:12:44.000 Yeah, the island of Flores, yeah, in Indonesia.
03:12:47.000 They found within the last 14,000 years, there was absolutely a little person that was like two and a half, three feet tall.
03:12:56.000 Yeah, but they look 100% human.
03:12:59.000 This thing did not look 100% human.
03:13:02.000 It was another type or an offshoot of Homo sapiens.
03:13:06.000 Something that's something that classified it.
03:13:10.000 I don't know if they have DNA.
03:13:11.000 It's a relation of humans, obviously, that split off somewhere.
03:13:14.000 It's a type of human, but it's not Neanderthal and it's not Homo sapiens.
03:13:18.000 And this thing was 14,000 years ago.
03:13:21.000 So they know that Gigantopithecus was a real animal, and it lived 100% alongside human beings as recently as 100,000 years ago.
03:13:30.000 Right, okay.
03:13:31.000 But this is just nonsense, though, the Bigfoot thing.
03:13:34.000 Well, that orang pendek is much more interesting.
03:13:37.000 The orang pendek is a small hobbit-like man that lived, I think that might have been Indonesia as well.
03:13:45.000 But maybe Vietnam.
03:13:47.000 Maybe it's Vietnam.
03:13:48.000 Well, I think it's some of these tropical jungle climates where there might have been, as recently as 100 years ago or whatever, there might have been a few of those fuckers remaining.
03:13:58.000 They don't know.
03:13:59.000 I mean, when someone dies in the woods, man, do you know how many people we lose in the United States in the woods?
03:14:03.000 Thousands every year.
03:14:04.000 Thousands.
03:14:05.000 I watch David Paulides all the time on Art Bell and stuff.
03:14:09.000 It's fascinating.
03:14:09.000 I think he's great.
03:14:10.000 I love that he's such a clean-cut all-American guy.
03:14:13.000 And he never goes into the wild conspiracy theory angle of it.
03:14:16.000 But it's always hinting that it's probably Bigfoot.
03:14:20.000 It's always hinting it's probably Bigfoot.
03:14:21.000 Well, that was a movie rather.
03:14:22.000 He never is.
03:14:24.000 Bobcat Goldthwait made a movie about Bigfoot eating people.
03:14:27.000 Oh, wow.
03:14:27.000 It's a great movie.
03:14:28.000 Terrifying.
03:14:29.000 It's called Willow Creek.
03:14:30.000 It's a horror movie.
03:14:32.000 Bobcat Goldthwait made a horror movie?
03:14:34.000 Dude, he made a great horror movie.
03:14:36.000 It's like a sort of like one of those Blair Witch Project type movies about a couple that go up looking for Bigfoot, you know, and it's filmed where the original Patterson footage was taken.
03:14:48.000 It's a person in a suit.
03:14:49.000 People love, they want to believe.
03:14:50.000 It's a person in the fucking suit.
03:14:51.000 Bobcat wants to believe.
03:14:53.000 Yeah, but why do they put tits on the suit?
03:14:55.000 Because it's a girl.
03:14:59.000 Why would they make it?
03:15:00.000 It looks terrible.
03:15:01.000 Anybody who looks at it, it looks like a person in a suit.
03:15:04.000 That's another part of my bit.
03:15:05.000 I said, whenever you see something that looks like a person in a suit, it's a person in a suit.
03:15:09.000 There's not a fucking animal alive.
03:15:10.000 It looks like a person in a suit.
03:15:13.000 I just wanted to know what the conversation was.
03:15:14.000 It's like, look, if we get a male suit, I mean, it's going to look like a person in a suit, but if we get a female suit, they're going to be like, why would they put tits on it?
03:15:22.000 Well, it has to be a real big female.
03:15:24.000 They did a female because a female would be smaller than a male.
03:15:28.000 So instead of it being an eight-foot-tall thing, you get a six-foot-tall thing.
03:15:32.000 And it's like a regular size dude with some nice tits.
03:15:35.000 And you're like, it's a chick, bro.
03:15:36.000 Meanwhile, it's a big guy.
03:15:37.000 It's like a big six-footer.
03:15:39.000 I love the Bigfoot.
03:15:40.000 I love the breakdown videos all over YouTube.
03:15:43.000 They're great.
03:15:43.000 And it's always like, no, that's a dude in the suit.
03:15:45.000 There was a WikiLeaks that was just released today about NASA knowing about aliens.
03:15:51.000 Really?
03:15:52.000 Yes.
03:15:54.000 Come on, go.
03:15:55.000 Pull it up, Jamie.
03:15:57.000 NASA knows about aliens.
03:15:59.000 They have information that Wikileaks knows, and they're going to spill the beans.
03:16:04.000 I'll tell you, if it comes out tomorrow that it really is true, and then there's like an alien ambassador on the White House lawn or something.
03:16:09.000 You're going to be like, oh, okay, I was wrong. You'd look like a prat then, wouldn't you?
03:16:12.000 Oh, it's Anonymous.
03:16:13.000 I thought it was WikiLeaks.
03:16:14.000 Oh, yeah.
03:16:15.000 Anonymous, much more likely to be pranksters.
03:16:17.000 Yeah.
03:16:18.000 Yeah.
03:16:19.000 Yeah, much more likely.
03:16:21.000 That's very generous.
03:16:22.000 Anonymous, a global hacking collective, believes that alien life exists and thinks NASA is about to confirm it.
03:16:29.000 The shadowy group.
03:16:30.000 Oh, yeah.
03:16:30.000 I love that.
03:16:31.000 It's a shadowy group.
03:16:32.000 It's a group of autistic people.
03:16:33.000 So you can be in.
03:16:34.000 You could join.
03:16:35.000 No, how do you know I'm not?
03:16:36.000 I'm in, too.
03:16:37.000 In a 12-and-a-half-minute video published in an unofficial YouTube channel on Tuesday, the video centers around recent findings by the American Space Organization, including the discovery of 219 new planet candidates, 10 of which present similar conditions to Earth, by NASA's Kepler Space Telescope team in June, as well as comments made by a senior NASA official.
03:16:37.000 Exactly.
03:17:01.000 But here's the real thing, right?
03:17:03.000 You believe, and I believe as well, that it's entirely possible that some life exists somewhere in the infinite universe.
03:17:08.000 I can't believe it doesn't.
03:17:10.000 Of course.
03:17:10.000 We're on the same page.
03:17:11.000 Why are we so skeptical about it visiting us?
03:17:14.000 Because it's very difficult to visit us.
03:17:16.000 Right.
03:17:16.000 But if it's way more advanced than us, shouldn't it be able to figure it out, just like we could figure out how to get a probe to Jupiter?
03:17:22.000 Why can't it figure out how to visit us?
03:17:23.000 But then we've got a government conspiracy in cover-up.
03:17:27.000 Do we?
03:17:28.000 Or are they just people and they don't know any more than you or I?
03:17:31.000 Like, we don't know shit.
03:17:32.000 If aliens came here, do you think they would like go, oh, that's an elected official?
03:17:36.000 Clearly that guy's the mayor.
03:17:38.000 Let me talk to him.
03:17:39.000 No, they'd look at this fucking ant hill and they'd be like, which one of these dipshits should we tell?
03:17:43.000 I don't know.
03:17:43.000 We'll just pick that guy up and stick some stuff up his ass and run some tests on him.
03:17:49.000 That's bollocks, though, because I've never been listening.
03:17:52.000 I'm really offended.
03:17:53.000 We've got friends online forever, and this is just a lot of fun.
03:17:55.000 I've got a counter-argument to this.
03:17:56.000 We can distinguish one ant from another.
03:17:59.000 We can see the workers, we can see the soldiers, we can see the queen.
03:18:01.000 They can tell, they would surely be smart enough to see the family.
03:18:04.000 Yeah, but the soldiers and the queen are physically shaped different.
03:18:09.000 They're different things.
03:18:11.000 Like this big one makes all the babies.
03:18:14.000 It's pretty obvious that a queen bee, she has a giant stinger that doesn't detach and she goes and kills all the baby bees.
03:18:21.000 Oh, that bitch is trying to kill the women.
03:18:23.000 Why can't they speak English?
03:18:23.000 They can't speak English.
03:18:25.000 Yeah, so they'd probably miss the president. Okay, well, we have to think that if something is alien, it could be so alien that all of our concepts of culture are so bizarre to it that there's no context for it to relate.
03:18:25.000 They probably can.
03:18:38.000 And how can they understand our language?
03:18:40.000 They just know what we're saying.
03:18:42.000 Right.
03:18:42.000 They know what we're saying based on the fact that it's fairly simple.
03:18:46.000 If they can understand our language, then they must have relatable or similar concepts in their own language.
03:18:51.000 Sure.
03:18:51.000 Not necessarily.
03:18:52.000 To be able to translate.
03:18:54.000 Then they can't understand our language.
03:18:55.000 That's not necessarily true.
03:18:56.000 That's putting them into a limitation that we currently have.
03:18:59.000 Like, we don't understand dolphin language, but we know they have a language.
03:19:02.000 Yeah, but we can appreciate the dolphins probably have a concept for eating.
03:19:07.000 Right?
03:19:08.000 So, like, if the dolphins, I don't know what dolphin language is.
03:19:11.000 Yeah, but we're talking about culture in terms of like who's the queen, who's the male.
03:19:15.000 The hierarchical.
03:19:16.000 You know, they'll have like an alpha dolphin.
03:19:18.000 And people, the dolphins will have some, like, you know, clicks that mean the person in charge.
03:19:24.000 I mean, they must have some concept of hierarchy because they have a hierarchy.
03:19:28.000 You can see them operate under a concept of hierarchy.
03:19:30.000 But you know that we don't really understand dolphin language.
03:19:33.000 Well, no, but I mean, we can surely pick out like the concepts themselves.
03:19:38.000 I mean, they must understand the concept of leader, or they just can't understand the concept of leader.
03:19:43.000 I don't think it is widely understood what dolphins are saying.
03:19:48.000 I think we can translate dolphin to human.
03:19:51.000 Well, we can't even recreate it.
03:19:53.000 No, no, I'm sure.
03:19:54.000 But I mean, we can't recreate their sounds.
03:19:56.000 Like, we don't know what their sounds mean.
03:19:58.000 Like, if we said a bunch of shit to a dolphin, like, of course not.
03:20:02.000 I'm not saying we can talk to dolphins or anything, but like with a dog, you know, dogs have got certain kinds of very, very primitive methods of communication.
03:20:10.000 But like a dog whimpers, you know it's in pain.
03:20:12.000 Right.
03:20:13.000 You know.
03:20:14.000 And so I have to have a concept of being in pain to understand the dog's whimpering, to know that that means the dog's hurt.
03:20:21.000 I must understand that the concept of hurt.
03:20:23.000 Right.
03:20:24.000 And so if the aliens, if they can understand English, then they must understand the concept of what a president is.
03:20:30.000 And they must have some sort of way of analyzing it, surely.
03:20:35.000 If they can take in all the superiors.
03:20:35.000 Sure.
03:20:37.000 They're superior to us, obviously.
03:20:38.000 Yeah.
03:20:39.000 Well, if not superior, technologically far more advanced in that they've been alive longer.
03:20:44.000 But when you consider the possibility of all sorts of different weather conditions, different life conditions, maybe different solar systems where they have zero concern about being hit by an asteroid.
03:20:55.000 They're always bipedal.
03:20:56.000 Yeah.
03:20:58.000 Well, that's because they're supposed to represent us in the future.
03:21:02.000 That's because it's easier.
03:21:03.000 Is that what it is?
03:21:04.000 Well, surely.
03:21:05.000 It's trying to relate to.
03:21:06.000 I think it's a relatable thing.
03:21:07.000 It could be that too.
03:21:08.000 But that was my point, is that I think that aliens could potentially be so alien that they can't even relate to the concept of our ability to communicate with noises.
03:21:18.000 Like they might be communicating with smells.
03:21:21.000 Yeah, exactly.
03:21:22.000 They might be using some telekinesis or some shit.
03:21:26.000 Psychics, man.
03:21:28.000 Yeah.
03:21:28.000 Maybe.
03:21:29.000 I mean, yes.
03:21:30.000 So do you think, if you had to bet all of your YouTube money, have aliens been here or no?
03:21:40.000 I'd probably bet no.
03:21:43.000 No?
03:21:44.000 Yeah, I'm a bit of a skeptic.
03:21:45.000 I wouldn't need to.
03:21:47.000 I mean, all of the ancient alien stuff is total horseshit.
03:21:50.000 Total horse shit.
03:21:50.000 Yes.
03:21:52.000 That stuff really fits more into the Graham Hancock Randall Carlson mode of there being an ancient, very advanced civilization that probably was wiped out.
03:22:02.000 Just like we know there have been civilizations that were wiped out, it's much more likely that someone, like say in Machu Picchu, someone was able to construct things that were like really confusing to us today.
03:22:17.000 It wouldn't come from the new world.
03:22:18.000 It's really interesting because as soon as you start going down this line, you think, okay, what do you need to have high civilization?
03:22:25.000 You need a certain series of things.
03:22:26.000 And there are a bunch of books you can read to.
03:22:28.000 Abundance of food, agriculture.
03:22:30.000 It's not just that.
03:22:31.000 You need a lot more than that as well.
03:22:34.000 You can read Thomas Sowell's Wealth, Poverty and Politics.
03:22:37.000 I can't remember what the title is.
03:22:39.000 And Why Nations Fail, and what's it, Guns, Germs, and Steel?
03:22:44.000 Yes, I read Guns, Germs, and Steel.
03:22:46.000 Read these books and then suddenly you understand what populations need.
03:22:49.000 I read three quarters of it, I should be honest.
03:22:51.000 It's really good.
03:22:51.000 I bailed on it, but there's a whole slew of things you need as a civilization for humans to develop to high sort of standards.
03:23:02.000 Population density and all this sort of thing.
03:23:04.000 But it's not just population density, it's lots of other places that have high population densities.
03:23:09.000 So in Africa, for example, you've got very few areas that have high population density.
03:23:15.000 And so you don't have great civilizations in, like, sort of sub-Saharan Africa, because there just aren't great, dense cities to trade with.
03:23:23.000 And then you go, okay, well, why? It's difficult to trade, because in Africa, and this is something that Thomas Sowell goes into, the rivers aren't very navigable.
03:23:31.000 You know, there are lots of rapids, lots of rocks.
03:23:33.000 They're not very deep.
03:23:34.000 You can't carry huge barges down.
03:23:36.000 They're not like the Danube.
03:23:38.000 This thing goes for hundreds of miles and it's deep and you can get barges all up and down it and stuff like this.
03:23:45.000 I mean, and traveling goods by water is so much more efficient.
03:23:49.000 Like, you know, like 20 times more efficient than taking them over the land.
03:23:53.000 And so, pre-large machines, you need that innovation in order to create the large machines in the first place.
03:24:00.000 Exactly.
03:24:00.000 And you need somewhere to take it.
03:24:02.000 You're not trading with anyone if you're the one Atlantean civilization, no one else is buying your goods.
03:24:10.000 It's all horseshit.
03:24:11.000 It just couldn't happen based on what you need for this thing to actually end up existing.
03:24:16.000 Right.
03:24:17.000 In order for an Atlantis to exist, you really have to have a bunch of Atlantises.
03:24:20.000 A whole bunch of other Atlantises.
03:24:21.000 Which is more likely that there were some sophisticated civilizations that were hit by asteroids.
03:24:29.000 A lot of evidence for that.
03:24:30.000 A lot of iridium, yeah.
03:24:31.000 There's a lot of actual physical evidence.
03:24:33.000 Yeah, there's also this stuff, stuff called nuclear glass.
03:24:36.000 Well, it's the best evidence because iridium comes from space.
03:24:39.000 You know, iridium, when you find it in large quantities, indicates an asteroidal impact.
03:24:43.000 Iridium is very common in space and extremely rare on Earth.
03:24:47.000 So when there's a whole sediment layer of iridium, that's pretty much proof of an impact.
03:24:52.000 Yeah, but that's when it was hit by something.
03:24:54.000 Wouldn't there be some sort of ruins underneath this then?
03:24:56.000 Yeah, they found a lot of ruins.
03:24:58.000 I mean, this is what's really fascinating about how Gobekli Tepe is 12,000 plus years old.
03:25:03.000 Oh, yeah, but that didn't die out because it was hit by an asteroid.
03:25:06.000 Well, this is what Graham Hancock and Randall Carlson have been speculating.
03:25:10.000 That was really primitive.
03:25:11.000 Okay, you don't know that.
03:25:12.000 But before we go into it.
03:25:13.000 Well, massive, huge stone columns that are 19 feet tall, carved out of stone by people that are hunters and gatherers.
03:25:19.000 It's pretty impressive.
03:25:20.000 No, no, no.
03:25:21.000 But the point being, it's impressive for the thing, but it's not like a technological achievement.
03:25:26.000 Well, okay.
03:25:28.000 What is impressive?
03:25:29.000 Machu Picchu impressive?
03:25:30.000 It's pretty impressive, right?
03:25:31.000 That's not very old, though.
03:25:32.000 Well, it's thousands, a couple thousand years old.
03:25:34.000 1400.
03:25:35.000 1400.
03:25:36.000 Something like that.
03:25:36.000 Okay, I don't think that's true, and I don't think they know that for a fact.
03:25:39.000 And this is one of the speculations that Graham Hancock and Randall Carlson have presented, is that the evidence of this nuclear glass, which only exists on impact sites, before you go to your phone, listen to me so you can understand what I'm saying.
03:25:52.000 Impact sites where meteors hit or where they've done nuclear tests.
03:25:58.000 So they've found this stuff and micro-diamonds, evidence of massive impacts.
03:26:02.000 When they do core samples, they found it at 10,000 and 12,000 years, which is essentially the same time period as the end of the ice age.
03:26:09.000 I did a huge podcast on it with geologists.
03:26:12.000 They don't know, though.
03:26:13.000 Here's the thing.
03:26:13.000 When you talk to an expert, this is pure speculation.
03:26:16.000 There's no carbon-dated evidence that points to that age.
03:26:20.000 Whereas there is with the Sphinx.
03:26:21.000 So with the Sphinx, or excuse me, with the Great Pyramids.
03:26:24.000 The Great Pyramid of Giza, they've done some tests on some of the sediment and some of the stuff in between the stones, and they brought it to about 2,500 BC.
03:26:33.000 They dated it pretty specifically.
03:26:34.000 Yeah, they found a bunch of stuff that shows with a very high degree of possibility or probability that it's around 2500 BC.
03:26:44.000 What they've done with this evidence of massive impacts- Yeah.
03:26:52.000 But there is with, rather, the Great Pyramid.
03:26:54.000 But there's plenty of evidence that there was some sort of a massive event around 10,000 and 12,000 years ago.
03:27:03.000 And what their belief is that this is where all the stories of Atlantis come from.
03:27:07.000 This is where all these stories of these great civilizations that were wiped out come from, is that most likely we experienced a big die-off.
03:27:15.000 And a lot of people were affected by these disasters and catastrophes that coincided with asteroidal impacts.
03:27:22.000 Maybe, but I mean, we know where the stories of Atlantis come from.
03:27:27.000 Right.
03:27:27.000 Plato.
03:27:28.000 And he claims to have gone to Egypt and been taught this.
03:27:31.000 But it's all, I mean, it's not like a real thing.
03:27:33.000 I mean, he said it was 9,000 years before the present.
03:27:36.000 But the thing is, there are anachronisms in the story.
03:27:44.000 For example, 9,000 years ago, there wasn't an Egypt.
03:27:48.000 2,500 years ago from now, like 500 BC.
03:27:53.000 Like 9,000 years before that, there was no Egypt.
03:27:55.000 There was no Athens.
03:27:56.000 And yet the story involves them.
03:27:57.000 That's not necessarily true either.
03:27:59.000 Robert Schoch, who's a geologist out of Boston University, has done a lot of investigation on the temple that the Sphinx was carved out of, like where the Sphinx is, the walls of the temple.
03:28:12.000 And he believes it's from water erosion.
03:28:14.000 There's these fissures, these deep erosion marks, that they believe indicate that the Sphinx enclosure, the area it's carved out of, is far older than that; like, they know that the Great Pyramid of Giza is 2,500 BC, but they always attribute the Sphinx to the same timeline.
03:28:34.000 Dr. Robert Schoch from Boston University says that's not the case.
03:28:37.000 He said, what you're looking at is thousands of years of rainfall, and they have hundreds of geologists that have signed off on this.
03:28:43.000 This is not like, and there's, he goes over the physical evidence of these fissures and how they're made.
03:28:49.000 He's like, as a geologist, I can tell you, you are looking at thousands of years of rainfall.
03:28:53.000 The last time there was serious rainfall in the Nile Valley was about 7,000 BC.
03:28:58.000 So what he's saying is it's entirely possible that there have been many different civilizations that lived in this area, and that Egypt is a civilization that might have gone back many, many thousands of years longer than just the construction of the Great Pyramid of Giza.
03:29:14.000 I wasn't talking about the Great Pyramid of Giza, really.
03:29:16.000 But that's Egypt.
03:29:17.000 Well, no, that's just a monument in Egypt.
03:29:20.000 Right, but Egypt is a place that did exist 10,000 years ago, 12,000 years ago, 50,000 years ago.
03:29:27.000 But don't say no, it didn't, because you don't really know.
03:29:29.000 Well, no, well.
03:29:30.000 But you're saying no, it didn't.
03:29:31.000 I'll tell you.
03:29:32.000 Are you an Egyptologist?
03:29:33.000 Do you have boots on the ground and talk to the people over there?
03:29:36.000 We don't really know, right?
03:29:38.000 Well, you don't know what I'm about to say.
03:29:39.000 But you said, no, it didn't.
03:29:41.000 Well, yeah, but the state of Egypt.
03:29:43.000 Like as, all right.
03:29:43.000 Okay.
03:29:43.000 Okay.
03:29:46.000 There undoubtedly would have been people living in the area, right?
03:29:49.000 But calling it Egypt, you're saying.
03:29:50.000 Yeah, they wouldn't have called it Egypt.
03:29:51.000 When did they start calling it Egypt?
03:29:53.000 Oh, I'd have to look that up.
03:29:54.000 I don't know.
03:29:55.000 But we can tell, archaeologists can tell, like when the grains were first domesticated and things like this.
03:30:04.000 And you're never going to have a state without farming.
03:30:08.000 You're not going to have a hunter-gatherer state where they have a government.
03:30:12.000 And you need a surplus of food to be able to support a priestly class and a sort of bureaucratic class.
03:30:20.000 And to get something built, you need these things.
03:30:25.000 Not necessarily carving out of stone.
03:30:27.000 If you're going to build a pyramid, yes, you need these things.
03:30:29.000 Maybe carving a Sphinx, that's something tribesmen could do.
03:32:32.000 Maybe Gobekli Tepe, however it's pronounced.
03:32:34.000 Gobekli.
03:32:35.000 Yeah, Gobekli Tepe.
03:30:36.000 I can easily believe that nomadic hunter-gatherers, they could have crafted that because you're just basically sanding and chipping a stone.
03:30:44.000 Well, they moved them, too.
03:30:45.000 These are huge chunks of stone that were done back when they didn't have a surplus of resources.
03:30:51.000 They're hunters and gatherers.
03:30:53.000 I mean, don't get me wrong, I'm not saying it wasn't like a monumental achievement.
03:30:56.000 For the time and the resources, it obviously was.
03:30:59.000 But we know that they weren't farming.
03:31:02.000 We might be right about that, or we might not be right.
03:31:05.000 You're talking about an incredibly long time ago.
03:31:08.000 12,000 years ago is a long time.
03:31:10.000 And whatever evidence would be left has really probably been absorbed by the Earth by now.
03:31:15.000 Maybe, but I mean, we can't sit there and speculate about something that might have happened if we've got no evidence.
03:31:21.000 But that's exactly what we're doing if we're saying that they were definitely hunter and gatherers.
03:31:25.000 We don't know.
03:31:26.000 We do know.
03:31:27.000 What we do know is they were capable of making these incredibly massive, complicated structures 12,000 years ago for sure.
03:31:34.000 Is the Sphinx an incredibly massive complicated thing?
03:31:36.000 Well, no, we're not talking about that.
03:31:36.000 We're talking about Gobekli Tepe.
03:31:38.000 But yeah, it's not big.
03:31:40.000 It's fucking huge.
03:31:41.000 And for you to say that is very disturbing because they've only uncovered 5% of it or 10%.
03:31:46.000 It's massive.
03:31:47.000 I thought they'd uncovered more than that.
03:31:48.000 But like the stones themselves, I mean, they're quite big stones.
03:31:48.000 No.
03:31:50.000 But see, but this is the problem.
03:31:51.000 You don't know a lot about it, but you're automatically jumping to this skeptical.
03:31:55.000 I don't know what you're talking about.
03:31:56.000 That's the thing.
03:31:58.000 But you haven't read about it, if you didn't think it was big.
03:31:58.000 I wonder what they like.
03:32:00.000 I just can't remember off the top of my head exactly.
03:32:01.000 That's the problem, but you're so convinced in your argument, but you're not really, you know, you haven't really spent a whole lot of time thinking about it.
03:32:07.000 But they're carving stone, man.
03:32:08.000 That's not very difficult.
03:32:09.000 They're not just carving stone.
03:33:10.000 They're carving stone with three-dimensional objects in the stone, meaning like lizards that are on the outside of it, 19-foot-tall stone columns in concentric circles.
03:32:20.000 They're huge structures.
03:33:22.000 Look at the technology it takes to do that.
03:33:25.000 What it takes to move those things.
03:32:27.000 Look at the image on the far right.
03:32:28.000 That's what they have figured out so far by using radar underground.
03:32:32.000 I mean, look at the size of this.
03:32:34.000 Don't get me wrong, that is huge.
03:32:35.000 It's fucking giant.
03:32:37.000 It's fucking primitive.
03:32:38.000 It's just stone.
03:32:40.000 It's not that primitive for someone to make something out of stone.
03:32:43.000 It's really impressive.
03:32:44.000 It's really impressive, right?
03:32:45.000 It must be a bit more.
03:32:45.000 It's a bit primitive in terms of what, like they don't have a TV?
03:32:48.000 Is that what you mean?
03:32:50.000 For example, I'm sure if they went and did some sort of super high resolution scans on it or something like that, they'd be like, right, okay, this has been carved with other stone tools, or they can tell that it's been carved with copper tools or whatever.
03:33:02.000 That won't have been carved by copper tools.
03:33:05.000 Well, you know, they don't even know that when it comes to the Great Pyramid.
03:33:07.000 In fact, there's a lot of people who speculate they figured out a way to make some sort of a diamond-tipped drill in order to make the king's chamber and some of the sarcophaguses.
03:33:17.000 It's not fringe.
03:33:18.000 They don't really know how they carved out some of the stone.
03:33:21.000 There's a simple vase.
03:33:23.000 There's a simple vase in Egypt that they don't understand because it's carved completely out of one piece of stone.
03:33:29.000 But it's so intricate and so hard that they've made this incredibly thin lip on this vase and they've gone deep inside of it and the whole thing is entirely smooth.
03:33:38.000 They really don't know how the fuck they did it.
03:33:40.000 Probably sandpaper.
03:33:40.000 And rubbing it.
03:33:41.000 Patience.
03:33:44.000 You can't get your fingers inside of this thing.
03:33:44.000 I'm talking about an actual vase.
03:33:45.000 Yeah, it's very, very confusing.
03:33:47.000 Do modern archaeologists.
03:33:48.000 This is not fringe in any way.
03:33:51.000 I don't know about that.
03:33:51.000 I was talking about ancient civilizations themselves.
03:33:53.000 But what is an ancient advanced civilization?
03:33:56.000 Like, don't you agree that the Parthenon, the Acropolis, that's an advanced civilization.
03:34:00.000 Not particularly for the time.
03:34:02.000 For what they accomplished was amazing.
03:34:04.000 Yeah, they said, no, no, it's a great thing.
03:34:06.000 The massive stones that are cut in place, the beautiful columns.
03:34:09.000 They've been cutting stones for centuries.
03:34:11.000 Well, then what is an advanced civilization if that's not advanced?
03:34:15.000 Well, that's the point, isn't it?
03:34:16.000 What are we talking about when we're calling them advanced?
03:34:18.000 Pericles created the Parthenon as the glorification of Athens over its empire, right?
03:34:23.000 It's not that these things didn't already exist, it's that he wanted to make Athens look special in the eyes of the rest of the world.
03:34:29.000 This is what I'm confused.
03:34:30.000 Like, what year did this happen and when do you consider people to be able to do that?
03:34:34.000 It was about, I don't know, 460 BC or something.
03:34:37.000 Right, so that looks like it.
03:34:38.000 Isn't that an advanced civilization?
03:34:40.000 No, really.
03:34:41.000 Come on, man.
03:34:43.000 If that didn't exist anywhere and then all of a sudden 10,000 years ago.
03:34:47.000 Yeah, but if 10,000 years ago, something like the Parthenon and the Acropolis, if we could prove that 10,000 years before anybody had done anything else, that existed, you'd be blown away by that, right?
03:35:00.000 Yeah, but shocking.
03:35:01.000 Yeah, no, the Parthenon.
03:35:02.000 The Parthenon may be 10,000 BC.
03:35:04.000 That would be incredible, right?
03:35:05.000 Right, it would be, right?
03:35:06.000 But a bunch of upright stones that have been nicely carved to have animals on them and stuff like that.
03:35:12.000 Not just that.
03:35:13.000 Massive stones, 19 feet tall in concentric circles, all in this huge stone.
03:35:18.000 We've got Stonehenge in Britain.
03:35:20.000 We know how they made it.
03:35:21.000 We know where the stones came from.
03:35:22.000 We know how they got them there.
03:35:23.000 You're one of those dudes.
03:35:24.000 You're one of those dudes.
03:35:24.000 What?
03:35:25.000 No, not just read history, because you don't know anything about Gobekli Tepe, and yet you're so eager to...
03:35:25.000 That I've read history?
03:35:30.000 You just keep interrupting me.
03:35:30.000 Well, I know a bit.
03:35:33.000 Oh, I interrupt you.
03:35:34.000 You're adorable.
03:35:36.000 You keep going with these things.
03:35:38.000 You really don't even know anything about this area, and yet you're convinced it was made by hunters and gatherers.
03:35:44.000 There's a lot of argument one way or the other about this.
03:35:46.000 No, no, it's entirely possible that they had discovered farming, I guess.
03:35:51.000 I mean, that's the area where farming originally came from.
03:35:54.000 That is part of the problem, right?
03:35:55.000 When you get to 12,000 years ago, who knows what the fuck they were doing.
03:35:57.000 That's the end of the ice age, isn't it?
03:35:59.000 And we think they knew things, and then we find out that they knew more.
03:36:03.000 The Earth is mostly covered in ice, how could you get an advanced civilization?
03:36:08.000 Part of the Earth is mostly covered in ice.
03:36:10.000 Like when you're dealing with North America, right.
03:36:12.000 North America 12,000 years ago, big parts of it, right?
03:36:17.000 And the South, too.
03:36:18.000 So you've got like a band in the middle.
03:36:19.000 Yeah.
03:36:20.000 I mean, where do they think Turkey was like back then?
03:36:24.000 Because this is where Gobekli Tepe is in, Turkey.
03:36:26.000 What do they think Turkey was like 12,000 years ago?
03:36:29.000 And the problem is, what they can carbon date absolutely to 12,000 years ago is when it was covered up.
03:36:34.000 It was intentionally covered 12,000 years ago.
03:36:37.000 This is the story of Gobekli Tepe.
03:36:40.000 When was it constructed?
03:36:41.000 That they don't know, because you can't carbon date stone.
03:36:44.000 Like the way they did it with the Egyptian pyramids, they got the material in between each stone.
03:36:48.000 I can completely believe that, you know, like 30,000 years ago, 20,000 years ago, in a certain area of the world, a bunch of tribesmen who were hunter-gathering, maybe seasonally nomadic, worked on it for 100 years, you know; maybe they considered it to be a religious duty.
03:37:07.000 I mean, that's the thing.
03:37:08.000 Stonehenge was built over a period of thousand years.
03:37:12.000 I mean, it took a long time to just put these fucking stones up.
03:37:15.000 Right.
03:37:15.000 You know, so I mean, we don't know how long this would have taken, but to have a bunch of carved stones upright, even if they're big, that just requires more people.
03:37:23.000 That's not a great technological feat.
03:37:25.000 That's the thing.
03:37:25.000 It's not.
03:37:26.000 Well, what is a technological feat if carving the Acropolis is not a technological feat?
03:37:31.000 Well, it was a technological feat, but it wasn't particularly unique to the world at the time.
03:37:35.000 In fact, the Persians and the Babylonians and the Assyrians, they'd been around for thousands of years before that.
03:37:39.000 The Greeks were catching up.
03:37:41.000 Doesn't that then support the idea that Gobekli Tepe is even more unique?
03:37:46.000 Because there wasn't anything like it before.
03:37:49.000 No, no, that's right.
03:37:50.000 It's an advanced civilization then.
03:37:52.000 Well, yeah, I mean, it's advanced for the rest of the world, but it's...
03:37:57.000 If we wanted to build something like that today, it would take massive resources.
03:38:01.000 It would take huge machines.
03:38:02.000 It would take a gigantic group of engineers.
03:38:05.000 That would be a huge project.
03:38:06.000 Yeah, but that's out of convenience.
03:38:08.000 What do you mean out of convenience?
03:38:09.000 Why are you discussing that?
03:38:10.000 How is it out of convenience when you've got 19-foot-tall stones and you're moving them into a circle?
03:38:18.000 I mean, it's a lot of work.
03:38:19.000 Yeah, of course it is.
03:38:20.000 Yeah, that's why we'd use machines.
03:38:21.000 Because it would just be a lot of work.
03:38:23.000 It's not as simple as you just use machines.
03:38:25.000 You'd have to get very specific, massive machines that are incredibly difficult to engineer.
03:38:30.000 And these people did that without any of that shit.
03:38:32.000 Yeah, they did.
03:38:32.000 They used ropes and pulleys and logs.
03:38:35.000 Probably not pulleys, actually, but the ropes and logs.
03:38:35.000 Probably.
03:38:37.000 Who knows how the fuck they did it?
03:38:39.000 But that's what we're supposed to believe.
03:38:40.000 There was no wheel back then.
03:38:41.000 No, there probably wasn't.
03:38:43.000 I mean, it's pretty spectacular.
03:38:46.000 Yeah, yeah, no, no.
03:38:47.000 Okay, yeah, right.
03:38:48.000 Okay, so let me.
03:38:49.000 So I'm not trying to take away the amount of effort involved in building this thing and the willpower to do it and the manpower to do it.
03:38:57.000 What are you trying to say?
03:38:59.000 This is not indicative of advanced civilization.
03:39:03.000 I think it's our definition of advanced.
03:39:04.000 For example, like I would consider advanced civilization to have writing.
03:39:08.000 I would consider them to have metallurgy.
03:39:11.000 I wouldn't even call this advanced.
03:39:14.000 This puts them where we were 4,000 years ago, you know, as a species.
03:39:19.000 So advanced is, like, Sumer.
03:39:24.000 So it's like, I mean, what...
03:39:26.000 I mean, yeah, I mean, if you want to speculate, there might have been a civilization like pre-Ice Age or during the Ice Age that had writing and metallurgy and, you know, great cities and international trade, agriculture, all the sort of stuff that we would...
03:39:47.000 But when I say advanced, we mean not as advanced as we are now.
03:39:51.000 Pre-modern, right?
03:39:52.000 Well, that's what everyone speculates about Atlantis as well.
03:39:55.000 No one ever thinks that Atlantis was like today.
03:39:57.000 They think some crazy things, some people.
03:39:59.000 Really?
03:40:00.000 Well, yeah, some people.
03:40:01.000 Some people are.
03:40:02.000 But yeah, we should have been more specific.
03:40:06.000 But yeah, I mean, don't get me wrong.
03:40:08.000 I'm sure that for the time, like when a random tribe came over the hill and they saw Gobekli Tepe and it was incredible, they were just like, oh, the gods have built this.
03:40:16.000 You know, that's what they would have thought, I'm sure.
03:40:18.000 You know, to them, it would have relatively been incredible.
03:40:22.000 And I don't want to take away the achievement for the people who made it because it's unbelievably good, especially for the time.
03:40:22.000 And don't get me wrong.
03:40:29.000 But if they didn't have metal tools, they didn't have writing, they didn't have agriculture, I wouldn't call that advanced.
03:40:34.000 I'd just call that a lot of work for a primitive civilization in the same way that Stonehenge would have been.
03:40:38.000 Yeah, the real problem is we don't really know what they had.
03:40:41.000 And until more evidence gets uncovered, they're just guessing.
03:40:45.000 They have some carbon dating.
03:40:46.000 They know that the stones exist.
03:40:48.000 They don't even know.
03:40:49.000 I mean, they didn't even know about this until 1990 something.
03:40:52.000 I mean, they found it in the 90s.
03:40:54.000 Yeah, but I mean, all over Northern Europe, there are stone circles, henges, menhirs, whatever.
03:40:59.000 Well, they find more of those now than ever before with satellites, right?
03:41:02.000 You've seen the ones that they're finding now in the Amazon?
03:41:02.000 Yeah.
03:41:05.000 And they're like, what the fuck is this?
03:41:06.000 They're finding all these structures and channels.
03:41:09.000 You sent me a link to that because that sounds really interesting.
03:41:10.000 Well, that's the lost city of Z. You know about that, right?
03:41:15.000 There was a guy who went looking.
03:41:16.000 He was an explorer.
03:41:17.000 He went looking for the lost city of Z in the Amazon.
03:41:21.000 It was like a mythical lost city, and he got eaten by cannibals, apparently.
03:41:25.000 There's some movie that's coming out about him right now.
03:41:28.000 That's racist, Joe.
03:41:29.000 No, it's not.
03:41:30.000 They were white cannibals.
03:41:31.000 How about that?
03:41:33.000 post-privilege.
03:41:33.000 It was speculated that the Amazon was covered in an ocean somewhere around 10 million years ago or something.
03:41:43.000 Oh, yeah, yeah.
03:41:44.000 But they think that these people had some sort of, you know, for the time again, what is an advanced civilization that probably needs to be defined.
03:41:52.000 Yeah, I think that's the first stuff.
03:41:53.000 But yeah, I mean, it's interesting to say, isn't the Parthenon an advanced thing?
03:41:56.000 No, not really.
03:41:57.000 You know, it was a beautiful thing.
03:41:59.000 It was a great construct.
03:42:00.000 And it cost the Athenians a lot of money, and they did it to beautify the city and make it the jewel of Greece.
03:42:05.000 But it wasn't like, I mean, you know.
03:42:08.000 What's more impressive is the Parthenon is built on the Acropolis.
03:42:14.000 They did it.
03:42:14.000 The Acropolis is what's really a trip.
03:42:17.000 It's because it's giant fucking chunks of stone.
03:42:17.000 Why?
03:42:21.000 You know how big they are?
03:42:22.000 Yeah, I've been there.
03:42:23.000 They're huge.
03:42:24.000 Yeah, they don't even know why they're there.
03:42:27.000 They weren't even built at the same time.
03:42:28.000 The Acropolis is just a natural stone formation.
03:42:31.000 Not necessarily.
03:42:32.000 There's stones under that natural stone formation.
03:42:34.000 Like the actual platform that the Parthenon was built on existed before the Parthenon.
03:42:41.000 I'm not sure I follow what you're saying there, because as far as I'm aware, it's just a large stone formation.
03:42:41.000 Right.
03:42:46.000 I guess they must have leveled it off or something.
03:42:49.000 I think there's actual physical stones, actual monstrous physical stones that are involved in the construction.
03:42:55.000 Look at Malta and see the same sort of thing.
03:42:57.000 Just a photo, see if you could find that.
03:42:58.000 There's a bunch of, yeah, the old one.
03:43:00.000 What's the thing at Malta where they thought it was built by Cyclopses or something?
03:43:04.000 Giant stones like 3,000 years BC.
03:43:06.000 Is that Baalbek?
03:43:07.000 No, that's in Lebanon.
03:43:08.000 There's Baalbek in Lebanon.
03:43:10.000 They found these enormous stone carved things.
03:43:13.000 Yeah, see those stones?
03:43:15.000 Which ones?
03:43:17.000 There's some stone.
03:43:18.000 See if you could find some images of the side.
03:43:22.000 I'm pretty sure that's just an actual stone outcrop.
03:43:24.000 And the sort of fortifications around it, you can probably Google that and find out when they were put up.
03:43:29.000 It was probably like 16th century or something.
03:43:32.000 There's nothing mystical about the Parthenon.
03:43:34.000 It's just a really useful defensive fortification.
03:43:36.000 Well, not mystical.
03:43:37.000 I'm just saying ancient.
03:43:38.000 I think the speculation was that the Acropolis...
03:43:44.000 The bottom.
03:43:44.000 The Parthenon's the top.
03:43:46.000 No, the Parthenon is on the Acropolis.
03:43:48.000 The Parthenon is on the top.
03:43:49.000 And the Acropolis is on the bottom.
03:43:51.000 That's what I'm saying.
03:43:53.000 The thing that they think is ancient stone.
03:43:55.000 The Acropolis.
03:43:56.000 Okay, all together.
03:43:57.000 Yeah.
03:43:58.000 And the Parthenon is just the building.
03:43:59.000 Yeah.
03:44:00.000 Right, okay.
03:44:00.000 The Parthenon was a temple.
03:44:02.000 And so when you say the Acropolis, the Parthenon is continuous.
03:44:05.000 Most cities had an Acropolis, which is just a raised rocky bit that they built defensive fortifications on.
03:44:10.000 They'd put temples on and stuff like that.
03:44:12.000 So find where I read that; see if you could Google man-made stones in the Acropolis.
03:44:19.000 There was some sort of weird speculation that the Acropolis was a part of some structure that was far older than the Parthenon.
03:44:27.000 I'm not persuaded by all of that sort of thing.
03:44:30.000 That's a bit fringe stuff for me.
03:44:32.000 You're not into fringe shit?
03:44:33.000 Not when it comes to history.
03:44:36.000 I like to know accurate stuff, you know.
03:44:38.000 Yeah, no, it makes sense.
03:44:40.000 Fringe is always more attractive, though.
03:44:42.000 It's interesting, isn't it?
03:44:43.000 Because you never know, and they're sort of like, oh, you might be able to find out for yourself.
03:44:46.000 But it's also like, undeniably, it has gravity towards me, at least.
03:44:51.000 I love the old stuff.
03:44:52.000 They're like, no, no, no.
03:44:53.000 You know what they found out, man?
03:44:54.000 They were werewolves.
03:44:55.000 There's always something that you find out.
03:44:57.000 Yeah, I went through all of this in my 20s, where I went through literally all of these fucking.
03:45:01.000 I'm a late bloomer.
03:45:02.000 Yeah, really.
03:45:04.000 You'll end up eventually where I am.
03:45:05.000 Oh, I definitely won't end up where you are.
03:45:07.000 You will.
03:45:07.000 How dare you?
03:45:08.000 No, no, it's a good place to be.
03:45:11.000 You'll go through all of these theories and you'll immediately dismiss things.
03:45:15.000 No, you don't immediately dismiss things.
03:45:16.000 Then you have to start reading about why things happened.
03:45:19.000 That's the thing.
03:45:20.000 You have to start reading the source material.
03:45:22.000 That's why I know who built the Parthenon.
03:45:24.000 Right, but that's also why you just flippantly decided that.
03:45:27.000 Because Gobekli Tepe was small.
03:45:29.000 Yeah, well, no, no, it's not the size of it's irrelevant.
03:45:31.000 It's the sort of like technological advancement.
03:45:34.000 You know, it doesn't take much to carve stone.
03:45:36.000 They used to, you know, they'd use other stone to break against it, and then they'd smooth it with hides and sand.
03:45:41.000 It just took a lot of work for a long time.
03:45:43.000 And I mean, don't get me wrong, it must have been fucking awful.
03:45:45.000 You know, get a hide of leather and put sand on it.
03:45:47.000 You can spend ages smoothing out stone.
03:45:49.000 Do it yourself if you like.
03:45:50.000 It's fucking boring.
03:45:51.000 But I guess if you've got nothing else to do all day, because you're a play driver.
03:45:53.000 What is this right here, James?
03:45:55.000 But that's not carved.
03:45:56.000 That's, like, this is bas-relief, I believe.
03:45:59.000 Oh, wow.
03:46:00.000 This is what most of the art on Gobekli Tepe was.
03:46:03.000 It was like a 3D.
03:46:05.000 The stone around it was removed, not carved into.
03:46:07.000 Yeah, that's what was interesting about it.
03:46:07.000 Right.
03:46:09.000 It's more difficult.
03:46:10.000 Yeah, yeah, I don't remember.
03:46:11.000 I mean, I don't want to take anything away from the artist because that must have been a lot of work.
03:46:14.000 What was the man-made stones in the Acropolis?
03:46:17.000 Did you find any of that?
03:46:18.000 There was evidence of these enormous chunks of rock that were moved.
03:46:23.000 But I might be conflating Baalbek in Lebanon with the Acropolis.
03:46:29.000 It's all fascinating to me because we don't know.
03:46:32.000 We just don't know what these people are up to.
03:46:34.000 And we see what they left behind.
03:46:36.000 We pieced it together and people have studied it for thousands of years.
03:46:39.000 And it's just, it's amazing.
03:46:42.000 I'm sure you've been to Rome, right?
03:46:44.000 You haven't?
03:46:45.000 It's on my list.
03:46:46.000 I haven't seen it.
03:46:47.000 The Coliseum's fairly recent, right?
03:46:49.000 You know, they're like one century AD or something.
03:46:53.000 Yeah, I mean, in terms of like ancient civilization, very recent.
03:46:57.000 It's so crazy when they show you the history of it, how that whole thing used to be, like, Nero's backyard, and then once he was gone they decided to build on it.
03:47:09.000 Yeah. And this thing has gone through all these different stages.
03:47:19.000 And when they show you all the different things they used to do there, like have these boat fights where they would fill it up with water and they would raise these lions and all these animals through the bottom of the cage and they would fight against them.
03:47:30.000 Like, you're fucking crazy.
03:47:31.000 And not that long ago.
03:47:32.000 Well, yeah, near 2,000 years.
03:47:34.000 Yeah, I mean, like, people used to be brutal.
03:47:36.000 That's the thing.
03:47:37.000 People forget.
03:47:38.000 Well, we're talking about ISIS.
03:47:39.000 They still are.
03:47:40.000 Yeah, well, they are.
03:47:41.000 But now, I mean, that shocks and is horrible, you know, to most people.
03:47:45.000 But in the 15th century, 16th century, kids used to do cockfighting.
03:47:50.000 They do it here in the Mexican neighborhoods.
03:47:53.000 Yeah.
03:47:54.000 Just watching these chickens kill each other.
03:47:55.000 Dude, an old guy that I used to use as a gardener, like, six, seven years ago, used to cockfight.
03:48:02.000 No, no sympathy, no compassion for anything that wasn't human, basically.
03:48:05.000 It was like, oh.
03:48:06.000 Definitely not for chickens.
03:48:07.000 They put razor blades on them.
03:48:09.000 They put spurs on them.
03:48:10.000 They fight each other.
03:48:11.000 Yeah, it's fucking crazy.
03:48:12.000 That's nice.
03:48:13.000 Yeah.
03:48:13.000 I went to this guy's yard.
03:48:14.000 He had like a hundred roosters in these pens.
03:48:18.000 And they would have this carved out area in a pit.
03:48:20.000 He'd let the roosters go.
03:48:22.000 It's super common.
03:48:23.000 It's super common in many parts of the world, in fact.
03:48:26.000 It's really common in the south of America.
03:48:28.000 Not South America, but south of the United States.
03:48:31.000 It's real common, like cockfights.
03:48:34.000 It's not thought of the same way as dog fights.
03:48:37.000 Dog fights horrify people.
03:48:39.000 But chicken fights, people are like, who cares?
03:48:42.000 I'll just eat afterwards.
03:48:43.000 Maybe it's because you can't eat the dog.
03:48:45.000 Well, they still do bullfights in Spain.
03:48:48.000 They do.
03:48:48.000 They're pretty amazing.
03:48:49.000 This is pretty controversial these days.
03:48:51.000 They just lost another matador a couple days ago.
03:48:53.000 He got fucked up.
03:48:54.000 He stepped over his own cape, tripped over the cape, got jacked.
03:48:58.000 Yeah, he'd get torn apart.
03:49:00.000 No sympathy whatsoever.
03:49:01.000 No, no, that's a foolish way to die and risk your life.
03:49:06.000 And the whole thing is crazy.
03:49:09.000 But Spain is an advanced civilization.
03:49:12.000 Well, I've known too many Spaniards.
03:49:17.000 How dare you?
03:49:18.000 But don't you know, yeah, they're a modern Western civilization and they have bullfighting.
03:49:24.000 Yeah.
03:49:24.000 Do you know what's really funny about Spain, though?
03:49:26.000 Because I mean, like, in America, you've got a terrible history, right?
03:49:30.000 We have a terrible history of being terrible.
03:49:31.000 Is that what you're saying?
03:49:32.000 Yeah.
03:49:33.000 Okay.
03:49:34.000 So Spanish history is highly Islamophobic, obviously.
03:49:40.000 And I went to Spain a few years ago and everything, everywhere.
03:49:44.000 It's a white guy killing a Muslim on every portrait, on every wall.
03:49:49.000 Because they spent 800 years kicking the Muslims out of Spain because the Muslims came along and conquered Spain.
03:49:49.000 Oh, yeah.
03:49:55.000 And so everything is totally Islamophobic because that is their history.
03:49:59.000 They can't get away from it.
03:50:00.000 There wouldn't be a Spain because the Moors came over and conquered it.
03:50:03.000 And they were like, right, everyone's a Muslim now, or you're going to have to pay the jizya.
03:50:06.000 And you're going to be a dhimmi.
03:50:08.000 And they were like, no, they spent 800 years as resistance or whatever.
03:50:13.000 The Spanish before it would be.
03:50:15.000 Like, you know, just the Reconquista.
03:50:17.000 And it took several crusades.
03:50:19.000 It took, you know, hundreds of years and gallons of blood.
03:50:22.000 But eventually Spain became Christian again.
03:50:24.000 And the Muslims and the Islamophiles who will sit there and go, oh, well, you know, Al-Andalus, yeah, it was stolen from someone else and these people took it back.
03:50:33.000 I've got no fucking sympathy for these people who got kicked out who came in originally and weren't even supposed to be there.
03:50:38.000 And I tell you what, the Muslims are fucking lucky that Columbus discovered the New World because the Spanish were basically just going to roll up all of North Africa.
03:50:45.000 They were planning to take it all fucking back because this was all Christian.
03:50:49.000 It really annoys me that people are like, oh, no, that's Muslim.
03:50:52.000 No, it wasn't.
03:50:53.000 And it's because they took it.
03:50:54.000 And so if the Spanish are going to take it back, that's just as valid.
03:50:58.000 You don't have an argument against them doing it.
03:51:00.000 Yeah, once something's been established, though, we don't want you taking it back.
03:51:02.000 Well, no, no, no, no, I'm not suggesting it now.
03:51:04.000 I'm not suggesting it now, obviously.
03:51:06.000 But I know you're not suggesting it, but it's still.
03:51:06.000 Nationalism.
03:51:10.000 But like, you know, when we're looking back at history through the lens of having a moral judgment about it, and they go, oh, oh, the Spanish kicked the Muslims out of Spain.
03:51:19.000 Yeah.
03:51:19.000 Good.
03:51:20.000 For them.
03:51:21.000 They have every right to.
03:51:23.000 You can't really have moral judgments about history or not.
03:51:25.000 Well, people do this all the time.
03:51:26.000 Yeah.
03:51:27.000 Like the Crusades.
03:51:28.000 Like Jimmy Dore.
03:51:29.000 There's nothing worse than the Crusades.
03:51:30.000 Shut the fuck up, Jimmy.
03:51:32.000 You do not know what you're talking about.
03:51:33.000 Just shut up.
03:51:34.000 Cenk, stop talking about the Crusades.
03:51:36.000 Oh, unless, I tell you what, you're right.
03:51:38.000 Let's call them back.
03:51:39.000 Bring back the Crusading armies.
03:51:40.000 Let's call up Baldwin.
03:51:41.000 Baldwin of Jerusalem.
03:51:42.000 Come on, you're going to have to come home.
03:51:44.000 Come on, Baldwin.
03:51:45.000 This is wrong.
03:51:45.000 We shouldn't be crusading.
03:51:46.000 That ended 800 years ago.
03:51:46.000 Oh, what?
03:51:48.000 Boom, done.
03:51:49.000 Conversation over.
03:51:50.000 It pisses me off, man.
03:51:51.000 They keep bringing up this shit.
03:51:53.000 Absolute shit.
03:51:54.000 But what about the Crusades?
03:51:55.000 What about the fucking Crusades?
03:51:56.000 Where are the Crusaders, Cenk?
03:51:57.000 You know?
03:51:58.000 Where are the fucking Crusaders?
03:51:59.000 Are you talking about Cenk Uygur from The Young Turks?
03:52:01.000 Jimmy Dore.
03:52:02.000 Do you think people know who you're talking about?
03:52:04.000 Oh, yeah.
03:52:05.000 Oh, yeah, yeah.
03:52:06.000 Some people do, but most people don't.
03:52:07.000 Cenk fucking knows.
03:52:08.000 I saw him at VidCon.
03:52:09.000 I'm sure he knows because you're talking about him when you say his name, but I'm saying you're saying it to a large audience.
03:52:14.000 People know people know.
03:52:16.000 Just because you don't know people.
03:52:17.000 No, a lot of people don't.
03:52:19.000 I do know.
03:52:20.000 It would be terrifying for people.
03:52:20.000 I do know.
03:52:22.000 How do we get out of bullfighting to this?
03:52:24.000 Dunno.
03:52:26.000 Cultural relativism, basically, I think is what it is.
03:52:29.000 But we're talking about advanced civilization.
03:52:31.000 I'm absolutely sick of the apologism, though.
03:52:32.000 It's like, look, look, things that happened in the past, we don't have to apologize for.
03:52:36.000 Right.
03:52:37.000 These things just happened.
03:52:39.000 We don't need to do things.
03:52:40.000 But don't you think white people should apologize for all the horrible things that we've done?
03:52:44.000 What about the Native Americans?
03:52:45.000 Should we apologize to them?
03:52:47.000 I think that if they're prepared to apologize to the people they genocided.
03:52:51.000 Well, that's you're going a little too far back.
03:52:52.000 Well, they don't exist anymore.
03:52:53.000 No, you go too far back.
03:52:54.000 That's a problem.
03:52:55.000 Oh, who designed it?
03:52:56.000 Like, once we have done that, once we've trespassed on their land, you don't bring up them doing it prior to that.
03:53:03.000 That would be Native Americanophobia.
03:53:05.000 Yes, it's not our place.
03:53:07.000 It's not our place.
03:53:08.000 Yeah.
03:53:08.000 See, this is the whole thing.
03:53:09.000 And this whole relativism, it pisses me off.
03:53:09.000 It's all bollocks.
03:53:11.000 And it really actually makes me angry.
03:53:13.000 And none of them will ever listen to the counter-arguments.
03:53:16.000 It's really annoying.
03:53:17.000 And so it's just like, no, nobody's got any respect for you.
03:53:20.000 When you turn around and say, but what about the Crusades?
03:53:22.000 I mean, whataboutery is the dumbest damn thing anyway.
03:53:24.000 Because you're never addressing the point.
03:53:25.000 It's like, yeah, you didn't even give a context.
03:53:28.000 We're talking about Islam, basically.
03:53:34.000 They, for some reason, think that Christianity is as bad as Islam.
03:53:36.000 It's like, no, it's not.
03:53:37.000 I mean, like, the Crusades were distinctly un-Christian.
03:53:40.000 I spoke to a historian Tom Holland about this.
03:53:42.000 And I was like, look, how did they come to the conclusion that they should have a crusade?
03:53:47.000 And he was like, well, they didn't get it from Christianity.
03:53:49.000 And the Orthodox patriarch in Constantinople was just completely against the idea because there's nothing in the Bible that justifies it, you know, in the New Testament.
03:53:59.000 You know, Jesus was a man of peace.
03:54:01.000 He turns the other cheek.
03:54:02.000 He lets his enemies strike him.
03:54:04.000 So you can't pick up a sword and say, I'm going to kill someone for Jesus.
03:54:06.000 It doesn't make sense.
03:54:07.000 But isn't it just what happens?
03:54:08.000 No, no, no.
03:54:09.000 When people have massive amounts of power, it goes weird, like it went weird with the Aztecs.
03:54:13.000 It goes weird with everybody.
03:54:15.000 It was entirely temporal.
03:54:16.000 It was about conquering the Middle East.
03:54:17.000 But it was also about making sure that they could get to Jerusalem.
03:54:20.000 It's like an ideological thing.
03:54:21.000 Because the Turks were basically preventing pilgrims from getting to Jerusalem.
03:54:25.000 Muslims had owned Jerusalem for like 400 years.
03:54:27.000 And everything was okay because the pilgrims could travel along a pilgrim trail.
03:54:31.000 And people made money doing this.
03:54:33.000 They'd sell things.
03:54:34.000 And then the Turks came in and were like, hey, we're Muslims and fuck everyone because we're the Turks.
03:54:38.000 And that was it.
03:54:39.000 The pilgrims couldn't go to Jerusalem.
03:54:40.000 And Pope, I think it was Innocent II or Urban II.
03:54:43.000 I never remember which one it was.
03:54:45.000 Whichever the Pope was, he turned around and said, hey, I could create a spiritual kingdom or an empire, an empire of faith on earth.
03:54:53.000 And he's like, why don't you all, rather than fighting each other in Europe, I mean, Europeans were not very advanced, but they were very, very warlike.
03:55:02.000 And why don't you just go to the Middle East?
03:55:04.000 And they did.
03:55:05.000 And the Crusade was actually, I mean, it's a real thrilling thing to read about now.
03:55:09.000 You know, you see, like, historical films, like, all the time.
03:55:12.000 It's always the Roman Empire.
03:55:16.000 Just every historical fucking film.
03:55:16.000 Which one?
03:55:18.000 Right.
03:55:18.000 Okay.
03:55:19.000 And it's just like, look, no, someone needs to do a film about the First Crusade because the First Crusade is one of the most thrilling things you can ever read about.
03:55:25.000 It's just the most, it's insane what these men did.
03:55:28.000 You know, marching.
03:55:29.000 I mean, they marched like a thousand miles from northern Europe and just they marched through and just did so much stuff.
03:55:35.000 Like, the Pope first calls a crusade and it goes kind of wrong for him because everyone's, like, the emperor in Constantinople, the Byzantine Emperor in Constantinople, he says to the Pope, look, do me a favor, I'm getting thrashed by the Turks.
03:55:49.000 I need about 4,000 knights, professional soldiers, good fighters, so I can beat them back.
03:55:54.000 And the Pope's like, hey, great idea.
03:55:56.000 Why don't we all go over there and beat up the Turks for Christianity?
03:55:59.000 And loads of peasants were like, oh, that's a good idea.
03:56:03.000 And so like 70,000 peasants, led by Peter the Hermit, turn up on the Emperor's doorstep.
03:56:08.000 And he's like, what the fuck is this?
03:56:09.000 You know, why have you sent me the scum of Europe to come and fight the Turks?
03:56:13.000 So he just gets them across into Anatolia and they go on and they just get massacred by the Turks, absolutely butchered.
03:56:19.000 And then the Crusading armies actually arrive.
03:56:22.000 Peter the Hermit actually manages to survive.
03:56:24.000 But then the Crusading armies arrive, and the Turks think, well, this will be fine.
03:56:27.000 And the Emperor's like, shit, look at all these really...
03:56:32.000 Gets them across and they end up smashing the Turks.
03:56:35.000 The Turks are just expecting another bunch of peasants.
03:56:37.000 And they're not expecting what they call the men of iron, which couldn't be broken.
03:56:40.000 And that's why they both fight in Paris.
03:56:43.000 Probably.
03:56:43.000 Spain.
03:56:44.000 Wherever the fuck they are.
03:56:45.000 But it's a really interesting thing.
03:56:47.000 And no one ever talks about it.
03:56:49.000 Well, the history of human beings is fascinating.
03:56:52.000 What we have learned and how long it's taken us to get to this point.
03:56:55.000 The regressive left will always talk to you like this is a one-sided thing, like, oh, the Christians just bullied them out of the blue one day.
03:57:01.000 It's like, no, there was an established order.
03:57:02.000 There were things going on.
03:57:03.000 There was a really detailed political landscape.
03:57:06.000 And you get someone like Jimmy Dore who goes, well, there's nothing worse than the Crusades.
03:57:09.000 Just shut up.
03:57:10.000 Just, you don't know what you're talking about.
03:57:13.000 Just be quiet.
03:57:14.000 All right, fella.
03:57:15.000 Glad you got that off your chest.
03:57:16.000 I've had that on my chest for a long time.
03:57:16.000 I'm angry.
03:57:18.000 Listen, we did an extra hour past what I thought I could do.
03:57:21.000 It's two now.
03:57:22.000 This is when your flight was.
03:57:23.000 So let's get you another flight.
03:57:24.000 Yeah.
03:57:25.000 And thank you, brother.
03:57:26.000 I really appreciate you coming on, man.
03:57:27.000 Thanks, man.
03:57:28.000 It was fun.
03:57:28.000 It was very fun.
03:57:28.000 It was fun talking.
03:57:30.000 It's nice that I can bitch some people out who need to be bitched out.
03:57:33.000 He definitely did that.
03:57:34.000 He definitely did that.
03:57:35.000 All right, folks.
03:57:36.000 We'll be back tomorrow with Chris D'Elia.
03:57:38.000 See ya.
03:57:39.000 Do you say a COD or a CAD?
03:57:41.000 Not sure.
03:57:43.000 I've only seen it written.
03:57:44.000 I've never heard it said.
03:57:47.000 He was a lot of fun, though.
03:57:48.000 Thanks, everybody, for tuning into the podcast.
03:57:50.000 Thanks to Caveman Coffee.
03:57:51.000 Go to cavemancoffeeco.com.
03:57:54.000 Use the code word Rogan and you'll save 10%.
03:57:57.000 Thank you to Blue Apron.
03:58:00.000 Go to blueapron.com forward slash Rogan and you can get your first three meals for free with free shipping.
03:58:09.000 That's blueapron.com forward slash rogan.
03:58:12.000 Thank you to Squarespace.
03:58:13.000 Build your own fucking amazing website yourself at Squarespace for a free trial and 10% off your first purchase.
03:58:23.000 Go to squarespace.com forward slash Joe.
03:58:26.000 And thank you each and every episode to Onit.
03:58:29.000 Go to O-N-N-I-T, use the code word Rogan and save 10% off any and all supplements.
03:58:36.000 Okay, all right.
03:58:38.000 All right, my guest tomorrow is my friend Chris D'Elia, hilarious stand-up comedian, regular at the Comedy Store.
03:58:43.000 Hang with him all the time.
03:58:45.000 And he's got a new Netflix special coming out.
03:58:48.000 Oh, yeah.
03:58:49.000 So that should be fun.
03:58:50.000 That's tomorrow.
03:58:52.000 Josh Barnett, former UFC heavyweight champion and a very wise man, very interesting, a Renaissance man himself.
03:58:58.000 We'll be here on Wednesday and more.
03:59:01.000 All right.
03:59:01.000 Until then, bye.