The Joe Rogan Experience - September 22, 2021


Joe Rogan Experience #1710 - Cullen Hoback


Episode Stats

Length: 2 hours and 48 minutes
Words per Minute: 159.5
Word Count: 26,938
Sentence Count: 2,074
Misogynist Sentences: 8
Hate Speech Sentences: 53


Summary

Joe Rogan sits down with documentary filmmaker Cullen Hoback, director of the HBO series "Q: Into the Storm," which traces the origins of QAnon and investigates who was actually posting as Q. They discuss how the series was released and received, the January 6th attack on the Capitol, and the broader questions the movement raises about censorship, misinformation, and knowing what's real. Hoback explains how platforms like YouTube and Amazon handled Q-related content, argues that the erosion of online privacy and engagement-driven algorithms helped create today's fractured realities, and walks through the culture of 4chan and 8chan: anonymous posting, trip codes, and the early Q drops. They also cover General Flynn's embrace of the movement, the QTubers who amplified it, the roles of Paul Furber and Ron Watkins, and why Hoback believes Ron Watkins was ultimately behind Q.


Transcript

00:00:12.000 What's up, man?
00:00:13.000 How are you?
00:00:14.000 I'm good.
00:00:16.000 It's really exciting to be here.
00:00:19.000 Right before we got rolling, I was mentioning, I tweeted back in April, would love to go on Joe Rogan's podcast at some point.
00:00:29.000 Is that when the documentary series came out in April?
00:00:32.000 Yeah.
00:00:32.000 So it premiered in March, like end of March.
00:00:36.000 And then it was rolled out in two episode per weekend installments.
00:00:41.000 So it took three weeks for the whole thing to release.
00:00:46.000 And it was actually really exciting because it was...
00:00:49.000 You had the audience reacting in real time.
00:00:53.000 There'd be a week to see, okay, how is the Q-munity receiving this?
00:00:57.000 How is the mainstream media?
00:00:58.000 Did you say the Q-munity?
00:00:59.000 The Q-munity, yeah.
00:01:00.000 You really said that?
00:01:01.000 Yeah.
00:01:01.000 Is that what they call themselves?
00:01:02.000 I mean, I have so much of this lexicon now that I can't avoid.
00:01:06.000 It's just...
00:01:06.000 It was funny.
00:01:07.000 I say it at the beginning of the series.
00:01:08.000 It seeps into your thoughts.
00:01:09.000 It really does.
00:01:10.000 Right.
00:01:11.000 Clinton body count, right?
00:01:12.000 Clinton body count, 17. Can't see that without thinking Q. Right, right.
00:01:16.000 Impossible.
00:01:17.000 Just a number.
00:01:18.000 Yeah.
00:01:20.000 It's really well done, I just want to say.
00:01:22.000 You did a fantastic job.
00:01:24.000 It's really excellent.
00:01:25.000 Thank you.
00:01:26.000 And it's such a compelling subject.
00:01:28.000 And it opens up so many conversations on censorship, on, you know, like, what is the truth?
00:01:38.000 And how important is it to know what's real and what's not real.
00:01:46.000 It's such a complex and unusual conversation that we have to have today about misinformation, disinformation, propaganda, and this whole Q thing, which is just like...
00:02:02.000 It, to me, embodies the perfect example of what's like worst case scenario.
00:02:09.000 If someone just started making up some wild shit.
00:02:14.000 We assume making up wild shit online.
00:02:17.000 It's a safe assumption.
00:02:18.000 And got an enormous group of people to go along with it.
00:02:23.000 And then they wind up attacking the Capitol.
00:02:27.000 I mean...
00:02:29.000 When they attack the Capitol building, when they stormed the Capitol building, and you realize, like, oh my god, like, this is literally like the wings of the butterfly create the storm, and then here it is, like, a thing that looked preposterous just a couple of years ago,
00:02:46.000 when people are talking about all the Q drops and all this, like, the people that I knew that are into QAnon, Jamie pointed out earlier that a lot of them were the same people that are into Flat Earth.
00:02:55.000 It's the same sort of folks, like, kind of...
00:02:58.000 Without, you know, uncharitable terms, unsophisticated, gullible, and into secrets, into finding out secrets.
00:03:08.000 I mean, I would say that in my experience, there weren't too many QAnons who were also big flat earthers.
00:03:14.000 I can introduce you to a few.
00:03:14.000 Or maybe they had set that aside a little bit.
00:03:17.000 But, I mean, you see in episode one, Liz Crokin, who is a big QTuber, big...
00:03:23.000 Sort of celebrity in terms of analyzing Q-drops and talking about the meaning.
00:03:29.000 I bring that up to her.
00:03:30.000 I'm like, well, what about Flat Earth?
00:03:32.000 She's like, tell me anything.
00:03:34.000 Nothing's too crazy, right?
00:03:35.000 And this is the thing about Q. It's like, Q... And I think a lot of people were attracted to this premise because it feeds directly into this notion that we should be skeptical of things, which was: question everything.
00:03:50.000 Question everything.
00:03:52.000 Question whether or not the Earth is flat.
00:03:54.000 Question whether or not...
00:03:55.000 And in the beginning, that included question Q. And that faded away pretty quickly.
00:04:04.000 A few months into QAnon, you could question anything as long as you didn't question Q. You jumped right to January 6th, sort of the finish line.
00:04:20.000 When I think of Jan 6, I don't think it would have happened without QAnon.
00:04:27.000 I also don't think it happened solely because of QAnon.
00:04:30.000 I think that's fair.
00:04:31.000 There were a lot of forces that coalesced that day.
00:04:36.000 Yeah, I would agree with you, 100%.
00:04:38.000 I think it's such a strange time, you know, where people are learning how to use social media, and then we're also aware that social media is heavily manipulated by foreign countries.
00:04:57.000 I'm sure you're aware of the Internet Research Agency in Russia, this whole...
00:05:03.000 Essentially a troll farm that's designed to fuck with us, and very successfully. Renee DiResta has highlighted this really well, and it was really interesting to talk to her and find out how deep the rabbit hole goes with this stuff. Hundreds and hundreds of thousands of memes, often hilarious memes, that get shared on social media, created by Russia. I mean,
00:05:30.000 can you imagine getting that job in Russia where it's like, your job will be to be a memesmith?
00:05:35.000 Your job is make funny with American politics.
00:05:38.000 Good luck.
00:05:39.000 Yeah, like, what do you...
00:05:41.000 Like, where do you start?
00:05:43.000 Yeah, that was one of the questions I've had when you're looking at something like 8chan, now 8kun, where Q had been posting.
00:05:53.000 And we talk about free speech on those platforms, and...
00:05:59.000 This idea that, you know, anyone can post whatever they want within, you know, sort of the limits of free speech, right?
00:06:07.000 Like, there are certain things which are illegal.
00:06:09.000 But does that protect people who are abroad as well?
00:06:13.000 You know, free speech traditionally in America would protect American citizens.
00:06:16.000 But in the case of the Internet, you know, the Internet is global.
00:06:19.000 So that means that you can have actors in South Africa or in Russia posting on these sites, protected under the same premise as what Americans are accustomed to.
00:06:33.000 And able to use speech against America in certain instances.
00:06:37.000 So what you're describing, you know, these kind of memesmiths in Russia are in some ways using free speech against us, right?
00:06:45.000 Yeah.
00:06:45.000 I guess.
00:06:46.000 Yeah, I mean, I guess.
00:06:48.000 I think what it's doing...
00:06:50.000 This is an odd idea, but I think what it's doing is it's forcing us to adapt and evolve our ability to detect bullshit.
00:07:05.000 And it's doing it almost like an immune system response.
00:07:11.000 Like, what we're reacting to and what we're recognizing from all this stuff is like, oh, we didn't know what this was, and this has resulted in this riot, whatever you want to call the Capitol Hill attack.
00:07:26.000 And now we're looking at more censorship on social media.
00:07:32.000 We're looking at them trying to batten down the hatches and figure out how to handle something like QAnon or the people that were allegedly promoting these ideas.
00:07:46.000 A lot of them that are banned from social media, the stuff you highlighted in your show.
00:07:50.000 We have to figure out what's true and what's not true.
00:07:53.000 And so there's been some sort of draconian measures that have been suggested, you know, like hiring some sort of a team that goes over social media and make sure that everything is according to what they deem to be correct or incorrect.
00:08:11.000 Which obviously is subject to biases, and we're very aware that that's going on today, that there's a lot of that going on today, where necessarily the truth doesn't, like the Hunter Biden laptop story is a great example of that, right?
00:08:24.000 Like the social media platforms.
00:08:28.000 They censored news from the New York Post, one of the oldest newspapers in America, on the Hunter Biden laptop story because they decided that somehow or another it was propaganda or somehow or another it was not good to get that information.
00:08:42.000 But it was news.
00:08:43.000 It was real news.
00:08:45.000 It was a real story and they decided it was too close to the election.
00:08:49.000 This could hurt Biden.
00:08:50.000 We don't want Trump to win.
00:08:51.000 So you're dealing with biases.
00:08:53.000 This is not just like simply, here's information that we know to be true or here's information that we know to be a lie.
00:08:59.000 We're going to stop that from getting through.
00:09:00.000 No, they knew it to be true, but they decided to stop it because it wasn't convenient or it didn't fit the narrative they were trying to promote.
00:09:09.000 Right.
00:09:10.000 How do you find a neutral arbiter of the truth if you are going to entrust someone with that responsibility?
00:09:19.000 And I think that it's just an incredibly slippery slope.
00:09:23.000 Incredibly slippery.
00:09:23.000 And what these big tech companies have suggested is that, well, maybe we don't use humans.
00:09:31.000 Maybe we use algorithms, right?
00:09:33.000 You know, to moderate everything.
00:09:35.000 And, you know, the algorithms, in many ways, had bolstered something like Q, because they're basically sociopathic when it comes to just trying to drive attention as much as possible.
00:09:49.000 So now they can kind of invert those algorithms and punish those who talk about that kind of content.
00:09:56.000 And oftentimes, even if their goal was just to prevent conversation around QAnon because they consider it to be problematic, what else gets swept up with that?
00:10:09.000 I saw a lot of people who were reporting on QAnon maybe coming from the side of critiquing it.
00:10:18.000 Their videos were being wiped out.
00:10:20.000 People who were documenting January the 6th, their content was being wiped out.
00:10:24.000 People who were critical of QAnon, they had websites that were sort of on the other side.
00:10:29.000 That was being wiped out as well.
00:10:30.000 And that's because, of course, it's sort of this blunt force that an algorithm wields.
00:10:38.000 So people even, people that were analyzing the movement from a critical standpoint, people who are looking at how ridiculous this is, look at this, they had their channels wiped out as well?
00:10:48.000 Yes.
00:10:49.000 So any content on QAnon, they just want to erase it from the internet, essentially.
00:10:54.000 That seemed to be the initial response.
00:10:57.000 It's so strange that they all move together in sync.
00:11:00.000 I mean, I think that if I did not have HBO in my sails with this project, it wouldn't have seen the light of day.
00:11:07.000 Really?
00:11:08.000 Like, if you try to put it on YouTube, you think?
00:11:10.000 Oh, yeah.
00:11:11.000 Yeah.
00:11:11.000 I mean, when we, even when we, so when we first released the series, you know, there was, there was, there were some articles floating around, like, oh, maybe this is going to make it things worse.
00:11:23.000 If I typed Q: Into the Storm into YouTube, it wouldn't auto-populate at a certain point.
00:11:29.000 It started out auto-populating, and then that went away.
00:11:32.000 So, yeah, I wouldn't feel confident at all that, you know, if we didn't have a gorilla in our corner, that this story that revealed ultimately who was behind QAnon...
00:11:48.000 Would have been seen, would have been able to find an audience.
00:11:53.000 Shout out to HBO. Shout out to HBO. I mean, they really had my back.
00:11:56.000 They've been amazing for decades.
00:11:58.000 You know, you really think about it.
00:11:59.000 I mean, they're the people that when Bill Maher's show Politically Incorrect got pulled off of, what was it, on ABC? I forget.
00:12:05.000 Yeah, I'm not sure.
00:12:06.000 Network television.
00:12:08.000 They immediately took it, brought it over, turned it into Real Time, and made it even better.
00:12:12.000 You know, it's uncensored now, and it's, in my opinion, Real Time with Bill Maher is probably one of the very best social commentary shows and comedy shows that, like, really doesn't pull any punches on any network, ever.
00:12:27.000 You want to hear something fucking crazy.
00:12:29.000 So...
00:12:31.000 Just yesterday, I was talking with someone who's helping distribute the film.
00:12:36.000 And I said, well, what about Amazon?
00:12:39.000 You know, are we going to be able to put it out on Amazon internationally?
00:12:41.000 And they said, well, as of the last year, they have stopped taking documentaries.
00:12:48.000 All documentaries.
00:12:49.000 What?
00:12:50.000 You cannot publish a documentary.
00:12:52.000 What?
00:12:52.000 On their platform.
00:12:53.000 And the reason was because, that I was told, is because, you know, there was all this conspiracy, Flat Earth stuff, and they were getting blowback, but eventually they said, we don't want to have to decide what we publish and what we don't, what's real and what's not.
00:13:07.000 We're just not going to publish anything.
00:13:09.000 Oh, my God.
00:13:12.000 One that they couldn't get published was The Cove.
00:13:15.000 I don't know if you saw that documentary.
00:13:17.000 The Dolphin documentary.
00:13:18.000 Yeah, the Dolphin documentary.
00:13:19.000 It won an Oscar.
00:13:20.000 Just to check it, I looked it up, and sure enough, The Cove wasn't available on Amazon.
00:13:27.000 Those who said that there wouldn't be a slow creep of censorship, starting with things that I think everybody agrees they wouldn't like to be in society, you know, things like The Daily Stormer, or maybe a lot of people don't want 8chan.
00:13:40.000 You know, there's a progression, you know, until you end up, it seems like something like, you know, the Cove can't find an audience on a major platform.
00:13:50.000 And I don't want to conflate government censorship with the corporate censorship too much.
00:13:57.000 However, in a lot of ways, it does feel like the government has passed the buck to these corporations to do what they legally can't.
00:14:04.000 Which I think is the same thing we saw the government do with privacy, right?
00:14:09.000 Like, they wouldn't have been able to get all of this data from us directly, but if you give it to a Facebook or a Twitter, it's very easy for the government to then go and get access to that information.
00:14:19.000 So I think what we saw happen with the Fourth Amendment, we're now seeing happen with the First Amendment, where they can say, well, look, we couldn't restrict conversation around certain topics.
00:14:29.000 We couldn't directly decide what's true or what's not.
00:14:33.000 We're going to put that in the hands of these companies.
00:14:36.000 And, of course, these companies have intimate relationships with many members of the government.
00:14:43.000 You know, there's a revolving door there.
00:14:45.000 So I... When people want to talk about limiting what we can say online or limiting disinformation and other things, I think that it's almost the wrong place to start.
00:14:58.000 I think we have to go back to the privacy issues.
00:15:02.000 And I actually think if we had not let privacy be eroded online, we wouldn't be having this debate.
00:15:08.000 Because if these gigantic companies hadn't collected thousands of data points on us, didn't know our fears, our desires, if they hadn't built these psychometric profiles, they wouldn't have been able to manipulate us,
00:15:26.000 use these algorithms to drive us into echo chambers, which have really created these disparate realities.
00:15:33.000 And now these disparate realities can't agree necessarily on a set of facts.
00:15:38.000 Sometimes you're considered, you know, sometimes people will be ostracized for even talking to somebody from the quote-unquote other side, right?
00:15:49.000 And so now there's this conversation about what should be allowed to be said online.
00:15:53.000 And I think that that's simply a byproduct of, you know, our privacy having been eroded.
00:16:00.000 So, you know, if I was to do anything about these issues, I would start by restoring rights.
00:16:05.000 I would go back and say, all right, well, how do we get ownership and privacy rights online when it comes to our personal data?
00:16:14.000 Let's start there before we start, you know, going after the speech itself.
00:16:18.000 Do you think that the algorithms are designed to do this or do you think that it's just a function of human nature that we tend to gravitate towards things that outrage us and then huddle up together in echo chambers?
00:16:35.000 That this is just a natural tribal behavior and that what the algorithms do is essentially just highlight what we're really interested in.
00:16:46.000 They magnify it in a feedback loop, right?
00:16:49.000 So you're right to say that humans do have these traits.
00:16:54.000 You know, and I haven't designed the algorithms, but I've also talked to people who have, and, you know, a lot of, they don't even understand how they work at a certain point.
00:17:02.000 Like, they're off to the races.
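A rough sketch of the feedback loop being described here, in Python. This is a toy model with made-up topic names and weights, not any platform's actual ranking system; it only illustrates how serving whatever the user already engaged with compounds a small initial preference into a one-topic feed.

```python
def engagement_feedback_loop(initial_interest, catalog, steps=6):
    """Toy model of the dynamic described above: the recommender always
    serves the highest-weighted topic, and each view boosts that topic's
    weight further. Illustrative only -- not any real platform's system."""
    weights = {topic: initial_interest.get(topic, 1.0) for topic in catalog}
    served = []
    for _ in range(steps):
        pick = max(weights, key=weights.get)   # recommend the top-weighted topic
        served.append(pick)
        weights[pick] *= 1.5                   # engagement feeds back into ranking
    return served

catalog = ["billiards", "muay thai", "news", "conspiracy"]
print(engagement_feedback_loop({"conspiracy": 1.1}, catalog))
# A 10% initial tilt toward one topic ends up as a feed of nothing else.
```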
00:17:03.000 Have you seen The Social Dilemma?
00:17:05.000 Yeah.
00:17:05.000 Yeah.
00:17:06.000 What'd you think about that?
00:17:07.000 It's great.
00:17:08.000 It's great.
00:17:08.000 Amazing, right?
00:17:08.000 Yeah.
00:17:09.000 Yeah.
00:17:10.000 It's scary, too.
00:17:12.000 The conclusions that they draw.
00:17:14.000 Oh, certainly.
00:17:15.000 I mean, yeah, I made a film about eight years ago called Terms and Conditions May Apply.
00:17:20.000 You know, and that came out right before the Edward Snowden revelations.
00:17:25.000 You know, and when it came out, the initial response was like, oh, this is maybe conspiratorial.
00:17:30.000 Surely the government doesn't have this much insight into our behavior and, you know, and access to our devices and our personal information.
00:17:42.000 And then the Snowden revelations came out, and then it was like, oh, well, maybe the series didn't go far enough.
00:17:51.000 And back then we had talked about the idea of how is technology influencing us?
00:17:57.000 How is it changing us, manipulating us?
00:17:59.000 And it didn't feel like that was the biggest story at the time.
00:18:03.000 And this question of privacy and how our rights are being eroded through these agreements that nobody ever really reads.
00:18:10.000 And you could find all kinds of juicy tidbits hidden in there in terms of what the companies were actually doing and kind of revealing this unholy collusion between the government and big tech.
00:18:23.000 You know, but at the time people would often say, well, what's the cost?
00:18:29.000 What's the big deal if they're mining my personal data to serve me with ads?
00:18:33.000 And I'd say that the environment we find ourselves in now is the cost.
00:18:39.000 Do you do anything to personally protect your data?
00:18:42.000 Do you use DuckDuckGo for searches and things along those lines?
00:18:47.000 Do you use Brave as a browser?
00:18:48.000 Do you do that stuff?
00:18:49.000 All that, yeah.
00:18:50.000 Use Signal.
00:18:52.000 VPNs?
00:18:53.000 VPNs.
00:18:53.000 I mean, I do my best.
00:18:56.000 If a government actor really wants to get at you, they're going to be able to.
00:19:01.000 I mean, you saw the NSO clickless spyware story, right?
00:19:04.000 Yes.
00:19:05.000 I mean, if there's a zero day that allows you to get access to a microphone and everything that someone's doing on their phone without them even having to click on a link, it's game over.
00:19:17.000 Of course governments will abuse that.
00:19:21.000 Yeah, and probably are right now.
00:19:23.000 Probably are right now, yeah.
00:19:24.000 Where are our phones at?
00:19:25.000 Right there.
00:19:27.000 Yeah, it's interesting because you would think that there would be a market for a platform that becomes bulletproof.
00:19:36.000 And there have been some, you know, Linux-based cell phone operating system phones that they sell, like they buy a phone.
00:19:44.000 You get a Google phone, they de-Google it and put different software on it and stuff.
00:19:49.000 But I'm not sure if that's like...
00:19:52.000 If you're deluding yourself into believing that you're actually protected with that stuff or you actually are protected, I would think they could work around all those things, especially something that's, I mean, it's essentially like open source, right?
00:20:05.000 Like if it's a Linux-based operating system, there's some super geniuses out there.
00:20:11.000 I'm sure they're going to be able to hack into that.
00:20:15.000 Yeah, I mean, I think with stuff like Signal, you're just protecting yourself as best you can.
00:20:20.000 You use something that's end-to-end encrypted.
00:20:23.000 You're doing better than 99% of people who are out there.
00:20:26.000 You're making somebody really have to work to get access to your stuff.
00:20:30.000 And if you're using a VPN and you're using DuckDuckGo, then you're minimizing your digital footprint.
00:20:35.000 And you're not worth as much of these companies.
00:20:38.000 They're not able to...
00:20:42.000 Manipulate you, I guess, in the same way through these algos.
00:20:45.000 But let me ask you this.
00:20:46.000 What would you do about the algorithm problem?
00:20:49.000 Because on the one hand, algorithms are necessary for something like a search engine.
00:20:54.000 On the other hand, they drive the most sensational content, things like QAnon, and I think have largely facilitated the situation we find ourselves in now.
00:21:06.000 It's a really interesting question because on one hand, for me, like when I go to my YouTube page, it highlights the things that I'm interested in.
00:21:14.000 And when I go to YouTube, in general, I'm interested in entertainment.
00:21:20.000 I mean, that's all I get.
00:21:21.000 I mean, I have a very boring YouTube page.
00:21:24.000 If you go to it, there's a few political talk shows that I listen to, like Breaking Points and...
00:21:30.000 Jimmy Dore and Kyle Kalinske and a few other folks.
00:21:34.000 And then there's a lot of billiards matches and Muay Thai fights.
00:21:39.000 That's most of my YouTube.
00:21:41.000 So it's boring for the average person.
00:21:46.000 It's not suggesting anything to me that's going to lead me down any rabbit holes.
00:21:52.000 Because I don't use it for that.
00:21:54.000 I use YouTube like, oh, I got 10 minutes to kill.
00:21:57.000 Let me watch something stupid.
00:21:58.000 Let me sit here and watch a pool match.
00:22:01.000 That's what I do.
00:22:03.000 But if you're a person that is, and I've been that person in the past, that got into conspiracy theories.
00:22:12.000 And like, is this real?
00:22:13.000 Like, who built the pyramids?
00:22:15.000 And the next thing you know, you're- How do magnets work?
00:22:18.000 Right.
00:22:18.000 Yeah.
00:22:20.000 And you're going down this rabbit hole and then that becomes your fucking life.
00:22:23.000 Like, one of the things that to me was so fascinating about your documentary series was seeing into the lives of these people that were utterly obsessed with these Q drops.
00:22:36.000 And it becomes a thing that gives life meaning when, you know, for lack of better terms, a lot of those folks in that documentary are misfits.
00:22:49.000 Like, a lot of the people are a little goofy.
00:22:54.000 You know, the way they talk about things and think about things is a little goofy.
00:22:59.000 And when QAnon came along, it gave them something to latch on to that was bigger than them.
00:23:05.000 They were a part of a movement.
00:23:07.000 And you see that same sort of thinking, that same sort of mental pattern in people that get obsessed with UFOs.
00:23:15.000 You see it in people that get obsessed with political dogma.
00:23:19.000 You see it in a lot of things.
00:23:21.000 They become a part of a movement that's far bigger than them.
00:23:24.000 It's one of the reasons why people get so invested in political candidates.
00:23:27.000 And political campaigns.
00:23:29.000 It's not necessarily that they're looking at the big picture objectively and they think that this politician is going to be better for their lives.
00:23:37.000 They're going to highlight problems that we have with inequality or problems that we have with laws.
00:23:46.000 But mostly they want their team to win.
00:23:49.000 You know, there's a lot of that with a lot of people.
00:23:52.000 They connect ideologically with a team, and then they get very tribal.
00:23:57.000 And that was the thing that I saw in that documentary.
00:24:00.000 I'm like, this is a pattern of human behavior.
00:24:03.000 And the QAnon thing just locked into it because it was secretive, it was interesting, the idea that Trump had this insider, and this insider was like dropping all this information that it was all going down.
00:24:18.000 In Trump, we had this guy that was going to clean up the swamp.
00:24:22.000 He was going to find those people that were eating babies and all that.
00:24:27.000 Yeah, and it was a narrative that contradicted what people were seeing in the mainstream media as well.
00:24:33.000 And I think that a lot of those individuals who voted for Trump and also gravitated towards QAnon, they wanted to believe that there was this sort of secret story that was happening behind the scenes,
00:24:49.000 you know, where all of these arrests were coming, where, you know, what they had hoped would happen would actually happen.
00:24:57.000 And I think that, I mean, you hit a lot of the big points there.
00:25:01.000 And that's why QAnon is sort of part religion, part political movement, part interactive game.
00:25:08.000 And it draws in people from the UFO crowd.
00:25:11.000 I mean, it's a big tent for all kind of...
00:25:17.000 Conspiratorial fringe thinking or, you know, sort of strong religious convictions as well.
00:25:22.000 I mean, you see a lot of people who are evangelicals who also believe in QAnon.
00:25:27.000 There's a big overlap there.
00:25:29.000 So I think it's individuals who are looking for community as well, looking for purpose.
00:25:35.000 I spent so much time with these guys, right?
00:25:38.000 And I would get phone calls in the middle of the night where they just wanted to talk, a lot of times because they just found me to be a grounding force in their lives.
00:25:49.000 And so I would try to always take it when I could and just give them a more neutral perspective versus what they might be hearing, especially some of the QTubers.
00:26:02.000 So some of the Qtubers, like, they were going down these paths, or these, you know, they were in these rabbit holes, and then they would call you and go, hey man, does this make any sense?
00:26:13.000 Am I out of line?
00:26:15.000 Because they were wrapped up in it, and they saw you as an objective sort of voice of reason.
00:26:21.000 I think so, yeah.
00:26:23.000 I mean, you see in Episode 5 one of the craziest things in the series, which is that these ex-mil guys, whether it's General Flynn or, in this case, General Paul Vallely or Major General.
00:26:39.000 Is he a Lieutenant Major?
00:26:40.000 Anyway, he's using his ex-mil guys to seed his political agenda with these QTubers.
00:26:50.000 So what happens is they suddenly have someone who is claiming to have this super secret intel reaching out to them and saying, you know, Osama bin Laden's actually still alive.
00:27:01.000 Would you like to talk about that on your Qtube station?
00:27:04.000 And they're going, well, you know, and they're useful because someone like in the case of Craig and his site JustInformedTalk, I mean, he had like half a million or more followers.
00:27:15.000 So they're a useful inflection point.
00:27:19.000 So he would be someone who might call me up and be like, what do you think about this?
00:27:23.000 Like, why are they telling me this?
00:27:26.000 You know, what's their motive?
00:27:28.000 Should I tell this to my audience?
00:27:31.000 I'd be like, it didn't really matter what I said.
00:27:33.000 He usually just ended up telling it to his audience.
00:27:34.000 But, you know, I would try to talk him through it.
00:27:37.000 Did you try to get a hold of General Kelly?
00:27:39.000 I did not.
00:27:40.000 I did not.
00:27:40.000 The only general that I spoke to, and he didn't end up making an appearance in the series, just because I wanted to keep the series focused primarily on the investigation into who was behind it, was General Hayden.
00:27:55.000 The Kelly one is interesting because when he's standing outside in front of his house with his family, he's like, where we go one, we go all.
00:28:01.000 And they're reading the whole speech, the QAnon speech.
00:28:03.000 Oh, General Flynn.
00:28:04.000 I'm sorry.
00:28:04.000 General Flynn.
00:28:05.000 I said Kelly.
00:28:05.000 Sorry.
00:28:06.000 Flynn.
00:28:07.000 That was...
00:28:08.000 So, did you talk to him?
00:28:09.000 I had tried to get a hold of General Flynn and to do an interview with him.
00:28:13.000 Of course I would have.
00:28:15.000 He was going through some legal troubles at the time.
00:28:19.000 So he was not making as many public statements.
00:28:22.000 He was not really giving interviews.
00:28:24.000 So what you saw was his family, you know, Joe Flynn, Barbara Redgate, I think was her name.
00:28:31.000 Some of these characters who were in his orbit, family members who were kind of speaking on his behalf.
00:28:37.000 Meanwhile, General Flynn was messaging QTubers behind the scenes, bolstering this.
00:28:44.000 And he had like $5 million in legal bills or some crazy amount of money that he had to spend in his legal bills.
00:28:51.000 So he needed all the support he could get.
00:28:54.000 And I had talked to some individuals who had helped him in his fundraising efforts from nearly the beginning at one of these Q conventions.
00:29:03.000 You know, and they said internally within the family there was debate as to whether or not QAnon was helpful or harmful.
00:29:12.000 But ultimately they all gravitated towards it.
00:29:15.000 So do you think he was using it just as a fundraising thing?
00:29:19.000 Like he was attaching himself to that because he knew that they would support his legal defense?
00:29:25.000 Cynically?
00:29:26.000 I mean, I do think that that's a part of it, right?
00:29:29.000 You have an incredibly passionate base who are...
00:29:33.000 Most of them believe that Flynn was Q, right?
00:29:38.000 That was the predominant theory among QAnons.
00:29:41.000 Because Flynn was so central to the Q narrative...
00:29:45.000 You know, he was a good guy in that narrative who warranted their support and was kind of, you know, working against the cabal.
00:29:54.000 So you ask most QAnons and Flynn would be very high on their list of suspects for who might be behind it.
00:30:01.000 And it would not be until the last year in the approach to the election that Flynn would openly embrace it, doing what I think you were describing where he, you know, raises his hand with the statement on the Fourth of July and they all take the Q oath.
00:30:16.000 Whoa.
00:30:17.000 Yeah, that was wild to see an actual general do that.
00:30:22.000 And then you saw all of the QAnons doing the same thing.
00:30:25.000 So then they went on YouTube and started taking the same kind of oath.
00:30:30.000 You saw how much power he had really generated vis-a-vis QAnon.
00:30:36.000 What's even more wild, though, is that in Japan, there is a huge crowd that supports Flynn there.
00:30:46.000 So there's like Q Army Flynn Japan.
00:30:50.000 And there's over 100,000 people in Japan who are like giant Flynn supporters.
00:30:55.000 It's almost like a religious offshoot.
00:30:58.000 In Japan, there's sort of two different big primary segments there of QAnon.
00:31:03.000 But one of the biggest ones is really centered around Flynn.
00:31:10.000 And that's always shocked me.
00:31:13.000 I mean, imagine if there was like a general in Japan who had a big following in the States.
00:31:18.000 They managed to do that through 4chan or 8chan or 8kun or whatever.
00:31:22.000 Well, that was another aspect that I thought was odd is that how people in other countries like the guy from South Africa are so into American politics.
00:31:33.000 It's hard to try to get their perspective.
00:31:37.000 I would imagine that it's just very different being there and looking at us.
00:31:43.000 I think America is such a bizarre anomaly.
00:31:45.000 But America has so much influence on world policy that if you want to influence World policy, you influence America.
00:31:53.000 But it's so, I mean, it is really unusual that we don't, we barely know who the fuck the Prime Minister of Canada is, you know, right?
00:32:03.000 We know very little about Canadian politics.
00:32:06.000 We know Trudeau, right?
00:32:07.000 Handsome guy, see him on TV, a lot of Canadians hate him.
00:32:10.000 That's what we know, you know?
00:32:12.000 I mean, we really don't know anything about all the other people.
00:32:15.000 We knew about that guy What was his name?
00:32:18.000 The one who was the mayor of Toronto who was out of his funk, Ford.
00:32:21.000 Remember that guy?
00:32:22.000 I do remember him.
00:32:23.000 He was crazy.
00:32:24.000 Yeah, it just requires a big, flashy story.
00:32:27.000 Yeah, so you've got to be in the tabloids.
00:32:29.000 Yeah, something big has to happen.
00:32:31.000 But that guy in South Africa was fully invested in American politics.
00:32:37.000 Oh, I mean, living and breathing in it.
00:32:39.000 A huge conspiracy theorist, too.
00:32:41.000 I mean, he's also my sort of prime suspect for the original Q. Really?
00:32:46.000 Yeah.
00:32:47.000 Interesting.
00:32:48.000 I mean, I don't say it in the series because I couldn't prove it definitively, and I didn't want to...
00:32:54.000 It is possible that there are...
00:32:57.000 I have sort of two other theories for who may have started it, but he's the predominant.
00:33:02.000 Well, he was so adamant that the second Q wasn't Q. And I was like, hmm, how does he know that?
00:33:10.000 Like, why does he think that?
00:33:11.000 Why is he so convinced?
00:33:13.000 I mean, if he doesn't have access to some data points, if he doesn't have access to whatever it is, like the location or something, the ISP, the IP address, rather, what's his reasoning?
00:33:33.000 Right.
00:33:53.000 I mean, there's a lot of reasons to think that Paul would have been running it up until that point.
00:33:58.000 And there's a lot of reasons to think why CodeMonkey or slash Ron Watkins would have taken it from him at that point.
00:34:05.000 I mean, he was posing a real threat to the, you know, quote-unquote operation.
00:34:10.000 He had just appeared on Alex Jones.
00:34:12.000 So this is late December 2017. Q's been actively posting for about two months at this point.
00:34:19.000 Who had just appeared on Alex Jones?
00:34:21.000 Ron?
00:34:21.000 So Ron did not appear anywhere.
00:34:23.000 The first recorded interview that I know of that he gave was with me a year and a half or so, or a year later.
00:34:33.000 This is when Q kind of gets its first...
00:34:37.000 I consider Alex Jones to almost be mainstream.
00:34:40.000 He has a big enough audience, right?
00:34:41.000 It brings it out to a new, bigger audience.
00:34:44.000 It escapes the chans through Alex Jones.
00:34:48.000 And this happens in late December 2017. And Coleman Rogers and Paul Furber both appear on his show.
00:34:53.000 This is when Alex Jones is not banned from social media either.
00:34:56.000 Correct.
00:34:57.000 So it's much bigger reach.
00:34:58.000 Yeah, he's got a bigger reach at this point.
00:35:00.000 So Jerome Corsi, who had been talking about Q-drops, sort of famed conspiracy theorist, largely responsible for the old birther movement thing around Obama, Swift Boat with John Kerry, he's been doing this political operative stuff for years.
00:35:16.000 He brings it to Alex Jones, and Alex Jones brings these two guys on his show, late December.
00:35:21.000 It's Paul Furber and Coleman Rogers.
00:35:24.000 You know, Coleman Rogers would go on to start a 24-hour news channel devoted to Q. Paul Furber, you know, was running the board that Q was posting on.
00:35:33.000 Which is very suspicious.
00:35:34.000 Very suspicious.
00:35:36.000 You know, for some reason, Q leaves 4chan in late November 2017 and jumps over to Paul Furber's board on 8chan.
00:35:48.000 Why didn't Q create its own board?
00:35:51.000 Why that board?
00:35:52.000 Was he on Paul Furber's board on 4chan?
00:35:55.000 Did Paul Furber have a board on 4chan?
00:35:57.000 No.
00:35:58.000 So Q is largely posting on /pol/ on 4chan.
00:36:02.000 So they didn't have control.
00:36:04.000 /pol/ is a politics?
00:36:05.000 Yeah.
00:36:05.000 It's like politically incorrect.
00:36:07.000 Is it P-O-L? P-O-L. Yeah.
00:36:09.000 Yeah.
00:36:09.000 And that's where usually /pol/ on 4chan or 8chan is where you see the most racist kind of anti-Semitic white supremacist stuff.
00:36:17.000 If you're going to see that, it's there.
00:36:19.000 And you can never really tell when you're on that board if people are being ironic or post-ironic.
00:36:24.000 Shitposting.
00:36:24.000 Shitposting.
00:36:25.000 Yeah.
00:36:25.000 Right.
00:36:25.000 Can't really tell.
00:36:26.000 But anyway, that's the board that when the chans get reported on, usually they're pulling screenshots from that.
00:36:34.000 So Q was posting there, and people on 4chan were like, this is a fucking LARP. Stop it.
00:36:39.000 What is the first post?
00:36:41.000 They didn't like Q very much.
00:36:43.000 They thought it was bullshit?
00:36:44.000 They thought it was bullshit, yeah.
00:36:45.000 Interesting.
00:37:00.000 Almost exponentially at the very end, up until Q decides, I'm out of here.
00:37:05.000 You can see.
00:37:06.000 And then right at the end there, that's when Q jumps ship and goes over to 8chan.
00:37:11.000 Now, they would also say that Q had been banned on 4chan, and that also motivated the leap.
00:37:17.000 But I do think that it's interesting that...
00:37:20.000 Was it banned on 4chan?
00:37:21.000 So, my understanding is that the trip code itself, like, Q is no longer able to post using, I think, using the trip on 4chan.
00:37:29.000 The trip code, explain that?
00:37:31.000 So, it's just a cryptographic function that allows someone to type in a password.
00:37:37.000 A simple password.
00:37:38.000 It's eight characters long.
00:37:41.000 And it will produce a code.
00:37:44.000 And that's called a trip code.
00:37:45.000 And this is a way of anonymous or pseudo-anonymously kind of identifying yourself.
00:37:49.000 So if I have the password, Q's password, eight character or so password, type it in, boop, boop, boop, it's going to produce a code.
00:37:56.000 And that code will appear on every post that the password is entered for.
00:38:01.000 So this is how people knew that Q was posting.
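A minimal sketch of the trip code idea, for readers who want to see the mechanism. The password and the hash function here are hypothetical; real chan trip codes are derived differently (a DES-based crypt), so this only illustrates the one-way password-to-code mapping being described.

```python
import hashlib

def tripcode_sketch(password: str) -> str:
    """Illustrative only: a fixed one-way function turns a short secret
    password into a short public code. Whoever knows the password can
    reproduce the code on any post; nobody can work backwards from the
    code to the password. (Not the actual 4chan/8chan algorithm.)"""
    digest = hashlib.sha256(password.encode("utf-8")).hexdigest()
    return "!" + digest[:10]

# The same password always yields the same visible code, which is how
# readers could tell that later posts were signed with the same secret.
print(tripcode_sketch("example-pass"))   # hypothetical password
print(tripcode_sketch("example-pass"))   # same code again
print(tripcode_sketch("other-pass"))     # different password, different code
```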
00:38:03.000 And in fact, the first 127 posts didn't have a trip code.
00:38:07.000 So there was no way to even know who Q was in the beginning, right?
00:38:12.000 In fact, the Anons had to go back and review all the old posts just to see, like, okay, which ones were Q, which ones weren't.
00:38:20.000 So, like, the first 127 drops, like, anybody could have been coming in, like, LARPing as Q, kind of jumping in, trying to write in this style.
00:38:28.000 And the idea of Q hadn't even been established yet.
00:38:31.000 I mean, Q itself?
00:38:32.000 Yeah, anybody.
00:38:33.000 Any asshole.
00:38:34.000 Like you.
00:38:35.000 You could have done it.
00:38:35.000 Yeah, of course.
00:38:36.000 Yeah, you could have done it.
00:38:37.000 Anybody could have done it.
00:38:43.000 Paul Furber could have been there doing it sometimes, but someone else could jump in, you know, kind of put on the hat.
00:38:43.000 And lots of people were trying to do that in the beginning.
00:38:46.000 So, like, when you post there, you don't have, like, a screen name?
00:38:51.000 No, there's no login.
00:38:52.000 So anybody can post without having to log in.
00:38:55.000 That's what makes the Chans really unique.
00:38:59.000 There's also no algorithms on the Chans.
00:39:01.000 But if you're using a VPN, which 4chan makes that very difficult, 8chan less so, If you go there and you wanted a shitpost and you don't want to have it traced back to you, you can do that.
00:39:17.000 So that's why people, you know, it gives them this true sense of anonymity without having all of their posts be logged and associated with their IP. How does that establish a community, though?
00:39:27.000 Because when people have, whether it's Twitter or whatever, people find people based on their screen names.
00:39:35.000 Well, it's more of a culture.
00:39:37.000 And I think that those who are heavy Chan users would...
00:39:43.000 The term is fame fagging, that's what they would say, where people use a name like Q to bring more notoriety.
00:39:56.000 In their minds, the best ideas rise to the top.
00:40:00.000 So they don't want to...
00:40:05.000 They don't want to have, they don't want to give, they don't want to have, I guess, the power that's associated with identity.
00:40:12.000 And that's part of why they were so annoyed when Coleman Rogers and Paul Furber went on the Alex Jones show, because they were, you know, doing what Chan users would call fame fagging at that point.
00:40:24.000 And that's a term they love to throw around on the Chans heavily.
00:40:28.000 So if you had a screen...
00:40:30.000 And Q was doing that, too.
00:40:31.000 So Q, by using a trip code, would have been, you know, that term they threw out.
00:40:37.000 But it's...
00:40:39.000 When Q started posting, Q was not known as Q at the time.
00:40:44.000 Q was just a random anon writing stuff.
00:40:48.000 And it wasn't until...
00:40:49.000 But there was no Q at the end of it or anything?
00:40:50.000 No, there was no Q at all in the beginning.
00:40:53.000 So that concept wasn't even developed yet.
00:40:57.000 And it was not until drop, I believe, is it 20, 30?
00:41:03.000 It was like, I have it, maybe I brought it with me, but it's like drop, let me look this up, I don't want to get it wrong.
00:41:11.000 Drop 34. At drop 34, that's the first time we see Q Clearance Patriot.
00:41:20.000 And so, you know, some anon just signs off, Q Clearance Patriot.
00:41:25.000 And the LARP kind of continues.
00:41:27.000 And so it's almost like an improv game where people are jumping in and they're like, oh, what's the story that we're developing?
00:41:33.000 Let me try to add something new to the story.
00:41:35.000 You know, it's not until Drop 61 that you see the idea of Q, signing off as Q.
00:41:46.000 So the first, you know, 60 or so drops, that's not even a thing.
00:41:52.000 And it's not until drop 128 that Q signs off using a trip code.
00:41:59.000 So at that point, one individual locked it down with a password.
00:42:05.000 And then started, you know, writing from there.
00:42:07.000 So it's possible that that password could have been shared between multiple people.
00:42:11.000 You know, that was the story that Q told.
00:42:15.000 That, you know, that there was a whole kind of, like, less than 10 people who knew the full truth about Q, right?
00:42:25.000 And this was something I'd hear from a lot of QAnons.
00:42:32.000 But most likely, if you were just an old-school troll on 4chan or 8chan, you would just want it all to yourself.
00:42:41.000 Are you really going to share the password with other people?
00:42:43.000 Because someone else could easily just change the trip code and lock you out.
00:42:47.000 And they could also reveal that the two of you were in on this together, if you have a falling out.
00:42:52.000 Yeah, and we haven't seen that happen.
00:42:53.000 Nobody has come forth and been like, okay, actually, here's what was going on behind the scenes.
00:42:58.000 You would expect there to be some kind of a leak around all of this.
00:43:02.000 And you asked about the community thing.
00:43:04.000 You know, that term I used before, there's tons of terms like that and a culture that exists on 4chan and 8chan that is very abrasive.
00:43:14.000 And is designed to keep normies out.
00:43:17.000 And they try to keep normies out.
00:43:19.000 They want to be as edgy as possible.
00:43:22.000 It's a game of who can be the most provocative in a lot of instances.
00:43:28.000 So that's largely what drives those communities.
00:43:32.000 And people will have friends on there, and they'll become friends through Discord channels and stuff like that.
00:43:38.000 And sometimes you can have a handle, like CodeMonkey has a handle, right?
00:43:43.000 People know him as CodeMonkey.
00:43:44.000 But the thing about being anonymous is anyone can LARP as you.
00:43:47.000 How do you know if you're talking to the same person or not?
00:43:50.000 That is crazy, right?
00:43:51.000 So anybody can jump in and pretend to be Code Monkey.
00:43:54.000 Yeah, that's a fascinating aspect of this whole thing.
00:43:58.000 In his case, he has his own trip code, and his trip code produces Ode Monkey.
00:44:02.000 Ah, okay.
00:44:03.000 Yeah, but it's actually pretty easy to hack a trip code.
00:44:06.000 You were with them for how long when you were doing this documentary series?
00:44:13.000 I was with them, I mean, we were filming on and off for about three years.
00:44:17.000 Over the course of that time, like, Ron's story evolves. Either he forgets what he told you initially, or he forgets the way he was sort of describing Q and his idea of politics and what he feels about politics. And towards the end, he's saying essentially he's been educating normies on how to do politics, and you're like, hey, what the fuck?
00:44:46.000 What is this?
00:44:48.000 But along the way, part of the thing was analyzing the difference between the original Q and then the Q once it goes to 8chan, right?
00:44:58.000 There's a different style of communication.
00:45:01.000 Yeah, yeah.
00:45:02.000 I mean, so the two points there.
00:45:04.000 So when we get to the end, we can go into all the reasons that Ron is Q. But if we're just talking about the early days, because I get a lot of people asking me, okay, well, who was the original Q? What you were describing, that style shift,
00:45:20.000 that's a huge indicator.
00:45:22.000 A, that there was only one person writing at one point and then another person writing at the next.
00:45:27.000 And B, it tells us when the shift might have happened.
00:45:30.000 So, you know, that shift that's most detectable is somewhere between the jump from 4chan to 8chan and a little bit after when Paul loses control of the board and believes that Q has become fake.
00:45:47.000 And Q starts posting really obvious doctored photos.
00:45:55.000 There's punctuation changes.
00:45:59.000 You can see that it's someone who's trying to emulate.
00:46:01.000 Punctuation changes?
00:46:02.000 Yeah, like way more exclamation marks.
00:46:06.000 And stuff.
00:46:08.000 To be clear, it's this kind of punctuation I've seen Ron use on his Telegram feed extensively since he was booted from Twitter in January.
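The punctuation comparison being described can be sketched very simply. The sample post text below is invented for illustration; the point is just how a jump in exclamation-mark frequency between two batches of posts would show up.

```python
from collections import Counter

def punctuation_per_1000_chars(text: str) -> dict:
    """Count selected punctuation marks per 1,000 characters of text --
    a crude stylometric signal of the sort described above."""
    marks = "!?.,;:"
    counts = Counter(ch for ch in text if ch in marks)
    return {m: round(1000 * counts[m] / max(len(text), 1), 2) for m in marks}

# Hypothetical stand-ins for posts before and after a suspected change of author.
early_posts = "Who controls the narrative? Expand your thinking. Future proves past."
late_posts = "BIG things coming! Nothing can stop what is coming! Trust the plan!!!"

print(punctuation_per_1000_chars(early_posts))
print(punctuation_per_1000_chars(late_posts))  # note the jump in '!' frequency
```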
00:46:18.000 Is he leaning into it?
00:46:21.000 Well, after the series concluded, Ron messaged me.
00:46:29.000 And that was the first time he had seen it.
00:46:31.000 He was watching it alongside everybody else.
00:46:34.000 And he messaged me and he said, Cullen, you know, I identify more with villains.
00:46:42.000 He said something to the effect of, I learned a long time ago that you have to make internet personalities larger than life because it makes for a more entertaining existence.
00:46:54.000 I'm not Q, but I may as well lean into it.
00:46:58.000 So he has to continue to deny being Q, I think, in his mind, for whatever liability might come with that.
00:47:09.000 It's like he comes as close to admitting it without getting rid of the plausible deniability.
00:47:20.000 And I think he also assumes that all of our communications are being monitored.
00:47:25.000 Right.
00:47:26.000 Which they probably are.
00:47:27.000 I'm sure they are.
00:47:29.000 100%.
00:47:29.000 There's that moment where you're both laughing, where he's laughing, where he talks about that, that he's been doing this 10 years anonymously.
00:47:41.000 But he's not Q.
00:47:42.000 And then you laugh, and then he laughs.
00:47:45.000 Well, he breaks first.
00:47:46.000 Right, that's right.
00:47:46.000 Which is a huge tell as well.
00:47:48.000 Like, he realizes he just fucked up.
00:47:50.000 Like, he realizes he just slipped.
00:47:51.000 Yeah.
00:47:52.000 And I have...
00:47:55.000 Ron has some tells.
00:47:58.000 You know, he'll clear his throat.
00:47:59.000 That's a big tell for when he's lying.
00:48:03.000 Maybe.
00:48:04.000 I mean, that was after three years of this kind of cat and mouse thing.
00:48:09.000 And he had always denied being really involved in the boards at all.
00:48:16.000 You know, he would kind of pretend not to know certain things about Q. So this was a big shift for him.
00:48:24.000 To come out and say, yeah, actually, like, I was leading digs on the boards, which is basically finding and sort of curating the research or evolving theories or, you know,
00:48:40.000 and this is what happens on /pol/, it's what happens on Q Research, is that they're going out and collecting whatever stuff they find on the internet and saying, okay, you know, something about Huma Abedin or something about, you know,
00:48:56.000 nuclear subs.
00:48:57.000 You might just get this sort of lengthy list, and those are the digs.
00:49:00.000 And then they keep getting kind of recycled.
00:49:02.000 And so what Q would really do was just look at all of that research, you know, kind of pick the best, the most spicy stuff, ask a question about it or reference it in some way, shape, or form, and then it would make those who were on those sites...
00:49:42.000 That's what's interesting, right?
00:49:43.000 It's like you could not be a deep government agent, some person who was in the White House, who had massive responsibilities, who was side by side with the president, and have the knowledge of that community.
00:49:59.000 Really, almost wouldn't be possible.
00:50:01.000 No, and you would be potentially criminally liable.
00:50:07.000 Right, like if they went into your White House laptop and go, hey motherfucker, what are you doing on 8chan?
00:50:13.000 Like, what's up with all these frogs with Hitler armbands?
00:50:16.000 You know, like...
00:50:19.000 Guys, if you think it's really General Flynn, do you think he would tell you?
00:50:25.000 Or do you think he would at least have layers and layers between that?
00:50:29.000 The only reason he would lean into it is because it can't come back to haunt him in that same way.
00:50:35.000 And someone like Ron, is what Q did illegal?
00:50:41.000 That's a big question.
00:50:42.000 Right.
00:50:42.000 Is it incitement?
00:50:45.000 That's a tough thing to prove.
00:50:47.000 But, you know, pretending to be a government insider with Q-level clearance, well, who?
00:50:56.000 Who are you pretending to be?
00:50:58.000 Well, not only that, you're doing it, again, on a website that has a lot of frogs with swastikas.
00:51:04.000 Yeah.
00:51:05.000 You know, like, are we supposed to believe that's real?
00:51:09.000 But at a certain point in time, it becomes very clear that there's a massive movement behind this.
00:51:14.000 Like, literally millions of people believe this, and some of them don't even go to those websites, correct?
00:51:22.000 Like, there's individual websites that collect all of the drops, right?
00:51:26.000 Oh, sure, yeah.
00:51:27.000 I mean, most of those who were following Q were not actually engaging with the site where Q was posting.
00:51:37.000 I would say that is...
00:51:39.000 Most?
00:51:40.000 Yeah, the vast majority.
00:51:41.000 So what happens?
00:51:41.000 I've never actually really been to 8chan.
00:51:45.000 And you see that moment in the series, right?
00:51:48.000 I talked to Liz Crokin, you know, who was a gossip columnist for the Tribune once upon a time.
00:51:54.000 I think she either wrote for like The Star or The Inquirer, one of those rags as well.
00:51:58.000 But, you know, she's like, there's, you know, child porn on those sites?
00:52:03.000 That's news to me.
00:52:04.000 Right.
00:52:05.000 You know, it's like, well, it gets deleted because 8chan moderates just like every social media company has to moderate.
00:52:12.000 You know, I think they moderate sort of to the absolute limit.
00:52:17.000 You know, but...
00:52:18.000 Yeah, I mean, she also didn't want to engage with that material.
00:52:25.000 It's an uncomfortable place to be for a lot of people.
00:52:28.000 So that's part of why in the series I wanted to say, well, look, in order to understand Q, you have to understand...
00:52:34.000 The entire mentality of the world from which Q is born.
00:52:38.000 And in fact, the person who is the architect, I believe, behind all of this, and I think we proved to be the architect behind all of this, is, you know, an edgelord of that space.
00:52:48.000 It's the exact kind of person who would run that kind of campaign.
00:52:51.000 And I don't think for a second, like, no one could sit back and be like, I'm going to create this massive global movement and be successful.
00:52:58.000 It's just, this is the one that kind of stuck.
00:53:01.000 I mean, there have been other LARPs in the past, you know, other people who've kind of come out and been these secret anons.
00:53:07.000 I mean, before Q, there were a couple of other prototypes.
00:53:10.000 I didn't mention this in the series, but, you know, there was like an FBI anon, a CIA anon, a Mega anon, and...
00:53:18.000 Mega or MAGA? Mega anon.
00:53:20.000 What's Mega anon?
00:53:21.000 This is just like a...
00:53:23.000 Someone, in that case, it was supposed to be kind of like someone who had a good insight to Washington, D.C. politics and had good sources and was writing on the chans and was supposed to be female.
00:53:36.000 And, yeah, this had happened kind of in the year running up to Q and tailed off at the beginning of 2018, this other kind of heightened anon.
00:53:48.000 But you can go all the way back to the early 2000s with Art Bell and Coast to Coast.
00:53:53.000 You know, there was somebody who pretended to be a time traveler from the future who was trying to prevent World War III, John Titor.
00:53:59.000 Oh yeah, I remember that.
00:54:00.000 Tons of people got into it.
00:54:02.000 Were you into it?
00:54:02.000 No.
00:54:03.000 No, but that was a fascinating one, the time traveler one.
00:54:06.000 I had a comedian send me this like he was serious about it.
00:54:09.000 Like, dude, I think this guy's for real.
00:54:11.000 I'm like...
00:54:12.000 I don't have to change my number.
00:54:14.000 I think this is a fucking time traveler.
00:54:16.000 But this 8chan thing, what's fascinating to me is how many members does 8chan have or how many users does 8chan have?
00:54:28.000 There are a lot of...
00:54:31.000 There's what Ron would say, and then there's probably what the sort of truth is.
00:54:35.000 We did some analytics of the traffic, you know, and it certainly went way up thanks to Q. Way more, way more users until Q had become the predominant reason that people were visiting 8chan and 8kun.
00:54:54.000 So we're talking, it looked like over 2 million were actively engaging there during sort of the peak of Q. It might have been higher than that.
00:55:01.000 Before that.
00:55:02.000 It was a little difficult because they got wiped from the web and a lot of the data around usage got wiped as well.
00:55:09.000 You know, when Q was really booming right after the Epstein incident, you know, that- Just after Epstein's death?
00:55:19.000 Yeah.
00:55:19.000 Yeah.
00:55:20.000 And then there was another shooting.
00:55:22.000 And then that, as you see in the series, kind of triggers a series of events that causes 8chan to get pulled from the web.
00:55:30.000 But- The New Zealand shooting.
00:55:34.000 So, which one actually?
00:55:36.000 The Christchurch one, the Facebook one?
00:55:37.000 It might have been Poway at that point.
00:55:39.000 No, so Poway was in April.
00:55:41.000 What was Poway?
00:55:42.000 Poway was another one, it was a synagogue shooting.
00:55:45.000 So there was Christchurch and there was Poway, and then there was another shooting that took place in El Paso in August.
00:55:57.000 There's so many shootings, I can't even keep track of all of them.
00:56:02.000 Which is its own story, I guess.
00:56:04.000 But, you know, in 2019. And that was the third one, the point where there was a lot of public outcry, largely led by Fred Brennan, their opposition.
00:56:16.000 To wipe the site from the web.
00:56:19.000 And I think the assumption was that people were being radicalized on 8chan, that they were, you know, being exposed to dangerous ideas and that this is what was leading them to these shootings.
00:56:32.000 One of the things I tried to point out actually in this series was that that's kind of a misguided assumption.
00:56:40.000 You know, I think that we can look specifically at the New Zealand shooter, right?
00:56:46.000 Lots of headlines were saying that the New Zealand shooter had done so because they were radicalized on 8chan.
00:56:54.000 But I think they'd visited the Baltics, they'd donated to white supremacist organizations, and they had, even in their own testimony, said that they'd been radicalized on YouTube, that it was like the YouTube algorithms that had drawn them to a lot of this material once upon a time,
00:57:11.000 years earlier.
00:57:13.000 And so I think that's yet another example where you say, okay, well, algorithms are driving people, you know, in a certain direction.
00:57:21.000 We can have a conversation about what kind of seatbelts you might want to put on algorithms, but, you know, 8chan doesn't have any.
00:57:29.000 So...
00:57:32.000 So it was interesting that it was almost like the big tech was passing the buck, that 8chan was the low-hanging fruit.
00:57:40.000 And it's like, you get rid of 8chan, you've solved the problem, right?
00:57:42.000 And, like, in reality, something like Q wouldn't have been successful if it wasn't for big tech.
00:57:49.000 These chans are old.
00:57:50.000 They've been around for decades.
00:57:53.000 And Q just escaped the chans.
00:57:55.000 And it escaped the chans thanks to algorithms and people who were susceptible to a narrative that they wanted to believe was true.
00:58:08.000 Wanted to be a part of something.
00:58:10.000 I mean, you use the word misfits a lot.
00:58:12.000 I think that seems applicable in a lot of cases.
00:58:17.000 You know, they're LARPing.
00:58:19.000 You know, like I made a film about LARPers in 2008. Did you?
00:58:22.000 What is it?
00:58:22.000 Yeah, it's called Monster Camp.
00:58:24.000 Very DIY little film, but it was back before people really knew what LARPing even was.
00:58:28.000 You know, and I saw some parallels here.
00:58:31.000 People who don't have a lot of, you know, sort of close friends in their lives and are looking for meaning and, I guess, kind of want to feel special.
00:58:44.000 Yeah.
00:58:46.000 And I think what you see actually over the course of the series, and this is why we structured it this way, I mean, Q did really kind of start as a sort of interactive game that took on a life of its own.
00:58:57.000 And it grew really rapidly until, you can call Q a meme, it memed itself into reality.
00:59:05.000 Until someone like Ron Watkins was advising President Trump.
00:59:13.000 Near the end of Q and near the end of his term.
00:59:17.000 Was he really?
00:59:18.000 Yeah.
00:59:19.000 I mean, I haven't interviewed Trump on this, but I do know that he was communicating with Giuliani.
00:59:28.000 He was advising them on what to do in relationship to the election.
00:59:31.000 How the fuck does that happen?
00:59:33.000 Trump was retweeting him.
00:59:35.000 Right, but how does Ron Watkins get a hold of Giuliani?
00:59:39.000 Why is he taking any advice from him?
00:59:45.000 He was...
00:59:46.000 I asked Ron the same question.
00:59:50.000 This is true?
00:59:51.000 Option of last resort.
00:59:52.000 But this is 100% true?
00:59:53.000 I mean, I have not talked to Giuliani, but you can see in the data trail, you can see him...
01:00:00.000 Yeah, you can see that Ron was openly advising them on what to do in relationship to the election.
01:00:08.000 He has been an advisor ever since.
01:00:10.000 Still?
01:00:11.000 It blew my mind.
01:00:11.000 Still right now?
01:00:12.000 Yeah, he's like, I mean, he was deeply involved with all this stuff with Lindell.
01:00:16.000 You know, he was...
01:00:17.000 Lindell?
01:00:19.000 Mike Lindell.
01:00:20.000 You know, the whole symposium, the...
01:00:22.000 MyPillow guy.
01:00:22.000 Yeah, like the iPillow guy.
01:00:24.000 Oh, the MyPillow guy.
01:00:25.000 That guy.
01:00:25.000 I shut that guy off as soon as I hear him talking.
01:00:28.000 I don't...
01:00:29.000 You know, the audit in Arizona.
01:00:33.000 Ron's been advising on that.
01:00:35.000 Really?
01:00:36.000 Yeah.
01:00:36.000 But did they watch your documentary?
01:00:39.000 I don't know.
01:00:40.000 If they watch your documentary, wouldn't they go like, hey, what the fuck are we doing?
01:00:43.000 Who are we talking to here?
01:00:45.000 Yeah, I mean, I don't...
01:00:46.000 It's wild.
01:00:48.000 I'm not sure they all saw it.
01:00:50.000 But that's wild.
01:00:51.000 But he also still has a huge following.
01:00:53.000 And he was...
01:00:55.000 So where does he have a following now?
01:00:56.000 He's kicked off of Twitter?
01:00:57.000 Well, all of this makes you wonder if he had a relationship before he migrated from, you know, Q shuts down, right, on election day, and then Ron starts actively really posting on Twitter.
01:01:09.000 And the fact that he was able to build such a massive following and quickly start advising on election fraud issues makes you wonder if they had a connection with the administration in some way, shape, or form before that.
01:01:23.000 You know, Ron would often say while I was filming with him that it was a marketing campaign.
01:01:26.000 That's how some would describe it, which is kind of a glib way of putting it.
01:01:29.000 That too was a marketing campaign?
01:01:30.000 Yeah, that it would end at the election.
01:01:32.000 You know, he and Jim would say that. Jim Watkins, his father, would say that often.
01:01:36.000 You know, a glib way of putting it.
01:01:38.000 But some might describe it as a psyop.
01:01:43.000 You know, and...
01:01:44.000 And so I consider it a possibility, especially since 8chan was directing a high amount of traffic to Donald Trump's website in the first election cycle, that a relationship had formed somewhere along the way.
01:02:02.000 Maybe there were payments.
01:02:04.000 I don't know.
01:02:04.000 I haven't been able to prove that.
01:02:06.000 But how does someone like Ron Watkins go from...
01:02:09.000 You know, obscurity to suddenly being, you know, a celebrity in those circles almost overnight.
01:02:17.000 And to being the guy who's like, yeah, I'm reading this election manual.
01:02:21.000 Let me give you some advice on how to do it.
01:02:25.000 I mean, he said he's like, I must have just been the option of last resort.
01:02:28.000 But here's the question.
01:02:29.000 Why would they even...
01:02:31.000 If he's not admitting he's Q, all he is is a guy who's running 8chan and then 8kun, why are they even communicating with him?
01:02:40.000 Out of all the human beings in the world, all of the political experts, all of the military experts, all of the people that are...
01:02:52.000 Heads of whatever friendly media organizations that would communicate with him.
01:02:56.000 Why is he communicating with Ron?
01:03:00.000 Well, I think Q became very useful to the Trump campaign in the last year.
01:03:04.000 So they knew that he was Q? Well, I don't know that.
01:03:07.000 So why him then?
01:03:09.000 But we do know that Jason Sullivan, who was Roger Stone's head of social media, was using his algorithm hijacking tools on Twitter to amplify Ron.
01:03:20.000 And he was hoping to get a hold of Q. You know, it is possible that something along those lines was happening behind the scenes there.
01:03:30.000 It may just be the case that Ron was able to leverage his position and those relationships over time because he was telling them what they wanted to hear.
01:03:41.000 You know, his lawyers, everybody was saying, like, what are we going to do about this?
01:03:45.000 And Ron's like, I've got a plan.
01:03:48.000 Right?
01:03:50.000 But it blows my mind to this day.
01:03:55.000 You know, my jaw dropped.
01:03:59.000 When I saw Ron appearing on OAN with his black hat on, you know, that he bought when I was filming, a big black cowboy hat, and suddenly he was the election expert.
01:04:14.000 And suddenly he was advising these guys.
01:04:18.000 You know, that tells you a lot.
01:04:22.000 OAN is so strange.
01:04:24.000 Whenever I watch...
01:04:25.000 There he is.
01:04:26.000 There he is, yeah.
01:04:27.000 Cyber analyst on Dominion Voting.
01:04:29.000 Shocking vulnerabilities.
01:04:31.000 And mind you, when I spoke with him after he had been on the air with Chanel Rion on OAN, he's like, well, Q actually did a drop while I was on OAN, so I can't be Q. It's like, you know, the more he tried to make it seem like he wasn't Q,
01:04:49.000 the more he made himself seem like he was.
01:04:51.000 He had gone through that massive, massive effort to make it seem like Bannon was Q, right?
01:04:56.000 Right.
01:04:56.000 A huge effort.
01:04:57.000 I mean, I even believed it.
01:04:58.000 Explain that to people who haven't seen it yet.
01:05:00.000 First of all, go see it.
01:05:01.000 It's on HBO Max.
01:05:03.000 Go watch it.
01:05:04.000 Please.
01:05:05.000 It's so good.
01:05:06.000 Thanks for watching all six hours.
01:05:08.000 Yeah.
01:05:08.000 More than that, I watched more than six hours.
01:05:10.000 Were you working out while you were watching it?
01:05:12.000 Some of the time I was actually on the treadmill, or the stair mill rather, watching and it kept me distracted.
01:05:20.000 But I watched it again last night.
01:05:22.000 I watched a few of them last night.
01:05:23.000 I watched two of them last night.
01:05:25.000 Because I just wanted to kind of refresh my brain.
01:05:28.000 The thing that I was getting at Initially is like...
01:05:32.000 The Bannon thing?
01:05:33.000 Yeah.
01:05:34.000 Well, let's just do the Bannon thing and then I'll get to the thing I was getting at earlier.
01:05:37.000 Okay.
01:05:37.000 Where I got distracted.
01:05:38.000 So he sort of tried to...
01:05:43.000 It was very strange that he was doing that.
01:05:46.000 That he was like outing Bannon, right?
01:05:48.000 Super strange.
01:05:49.000 If Bannon was Q, why would you dox your most famous user?
01:05:52.000 Right.
01:05:52.000 And he's saying that what he's using is IP addresses, and that he's isolating it to a very specific location that is very near where Bannon's house is, right?
01:06:04.000 Correct.
01:06:05.000 Yeah, so the very first time I met Ron, we shot an interview in the Philippines.
01:06:11.000 You know, I had gone there saying, you know, primarily it's gonna...
01:06:15.000 I'm interested in looking at free speech through the lens of Q. That's what I, you know, that was sort of the pretense.
01:06:21.000 And at the end of my interview with him, he pulls me aside and he says, you know, no one's really been looking at this, but I think Bannon is Q. Very first time.
01:06:34.000 Let's explain Bannon.
01:06:35.000 Steve Bannon, he's Donald Trump's advisor.
01:06:39.000 Yeah, he was sort of the architect, mastermind behind the 2016 campaign.
01:06:46.000 There's a lot of people that probably aren't paying attention to politics on this.
01:06:48.000 Yeah, okay.
01:06:48.000 Steve Bannon.
01:06:49.000 Okay, so then, go ahead.
01:06:51.000 So he pulls you aside.
01:06:52.000 You know, who eventually kind of gets booted from Trump's inner circle, right before Q gets started.
01:06:58.000 Yeah.
01:06:58.000 Now he's a podcaster.
01:07:00.000 He's a podcaster, yeah.
01:07:01.000 He's got his war room or whatever it is.
01:07:03.000 Yeah, the war room.
01:07:04.000 I don't know if Ron's appeared on it yet, but...
01:07:05.000 This is the space room.
01:07:06.000 This is a great room.
01:07:07.000 Thank you.
01:07:08.000 Great room.
01:07:08.000 Really, yeah.
01:07:09.000 I just want to, like, hover in here.
01:07:12.000 I feel like we should be floating right now.
01:07:14.000 We can if you want to go there.
01:07:17.000 Is there time?
01:07:20.000 There's always time.
01:07:21.000 Yeah.
01:07:23.000 Especially once you've started.
01:07:27.000 So Steve Bannon.
01:07:30.000 So Ron pulls me aside.
01:07:31.000 He's like, yeah, it's this guy, Steve Bannon.
01:07:34.000 No one's looked.
01:07:35.000 And I, you know, I've got my absurd amount of research, you know, I've got Infinity Board, thousands of assets, I've got timelines of all the possible suspects for Q at this point.
01:07:44.000 You know, and Bannon was a prime suspect, just based, just sheerly on kind of the circumstances around him, like his character, you know, he knows the chans, he knows that world.
01:07:57.000 He knows how to co-opt it for political gain.
01:08:01.000 So, You know, he makes for a possible suspect.
01:08:04.000 Well, yeah, I've been considering Bannon.
01:08:06.000 Now, why would Ron, the very first time... You know, first off, like, trolling journalists, and I don't consider myself a journalist, but trolling journalists is, like, the gold standard for trolls on the chans.
01:08:16.000 Like, that's, if you can get them to publish something fake, that's their dream, right?
01:08:20.000 So, like, I'm just thinking, okay, well, why is he telling me Bannon the very first time I meet him?
01:08:24.000 Fine.
01:08:25.000 And I say, can you show me some data associated with this?
01:08:28.000 He's like, I'll get around to it.
01:08:32.000 And some time goes by, and I think six months later, he could really tell that the heat was on him.
01:08:39.000 I have Sauron.
01:08:40.000 I think that's who it is.
01:08:42.000 He's like, I've got to throw this guy off the trail.
01:08:46.000 He's like, okay, I've got the data.
01:08:47.000 I've got the data that shows that it's Steve Bannon.
01:08:49.000 You know, and you can see in the series, in episode four, you know, he's like, I've known it from the beginning.
01:08:56.000 I've known it, Steve Bannon.
01:08:56.000 Can I stop you for a second?
01:08:57.000 Yeah.
01:08:58.000 Why was the heat on Ron at that point?
01:09:02.000 Well, all of the evidence pointed to him.
01:09:04.000 You know, I had a massive list of evidence pointing to Ron.
01:09:07.000 And like, for example...
01:09:09.000 Oh, he would say, like, while I would be with him, he would say things that, like, Q's going to start going on the offense.
01:09:21.000 As we're, like, coming down the mountain.
01:09:22.000 And then during the interview, he brought that up again.
01:09:24.000 Q's going to start going on the offense.
01:09:26.000 Two days later, Q writes, going on the offense.
01:09:28.000 This is not something that Q does often.
01:09:31.000 I think there was one more usage of the word offense in, like, the thousands of drops that had occurred up until that point.
01:09:36.000 Like, that's, okay, fine.
01:09:38.000 That's one data point, right?
01:09:40.000 But I have many, many more data points like that.
01:09:42.000 And in fact, actually, Ron, after watching the series and he saw that scene, I was going to get around to this after the whole Bannon bit, but he ended up throwing his own right-hand man under the bus.
01:09:52.000 He's like, well, I had this guy with me that day.
01:09:53.000 He must have been taking notes, and maybe he's Q. That's how it must have happened.
01:09:57.000 That's the only way that could have happened.
01:09:59.000 All right, fine.
01:10:01.000 You've already convinced me of Bannon, but now let's throw your right-hand guy under the bus.
01:10:07.000 Interesting.
01:10:09.000 First off, he would be able to access the trip code.
01:10:13.000 It wouldn't be hard for him to easily hijack it at any point in time.
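To make the trip code point concrete: on chan-style boards, a tripcode is just a short code the server derives from a secret the poster types after their name, which is why whoever runs the server software is in a position to display, or hijack, any tripcode. Below is a minimal sketch in Python; it uses a generic SHA-256 hash rather than the boards' actual tripcode routine, and the function names are illustrative only.

import hashlib

# Illustrative only: a generic hash stands in for the boards' real tripcode routine.
def tripcode(secret: str, length: int = 10) -> str:
    """Derive a short, stable code from the secret a poster types after their name."""
    return hashlib.sha256(secret.encode("utf-8")).hexdigest()[:length]

def render_name(raw_name: str) -> str:
    """Turn 'Name#secret' into 'Name !code' so the secret itself is never shown."""
    if "#" in raw_name:
        name, secret = raw_name.split("#", 1)
        return f"{name} !{tripcode(secret)}"
    return raw_name

print(render_name("Q#supersecret"))   # same secret, same code every time
print(render_name("Q#supersecret"))
print(render_name("Q#wrongsecret"))   # different secret, different code
# Whoever runs the server performs this substitution, so the operator can make
# any post display any tripcode they like.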
01:10:19.000 Could he falsify IP data?
01:10:21.000 How is he getting you this data?
01:10:23.000 Well, so the data set that he presented, I believe he had been LARPing as Bannon for some time.
01:10:30.000 Oh, Jesus Christ.
01:10:34.000 So he, yeah, he had put on the hat of Steve Bannon, I believe, probably sometime.
01:10:39.000 He was like, who's a likely suspect?
01:10:40.000 He'd done the same thing I did.
01:10:41.000 He was like, who can I pin this on so it wouldn't be traced back to him?
01:10:44.000 And I think when we see that static IP, this is my best guess, what you can do is set up a proxy server.
01:10:52.000 You can just run it, run all your traffic through that one IP address.
01:10:57.000 And so from that point on, that IP address can also serve as a verification method. Right.
01:11:09.000 Right.
01:11:32.000 Well, it's the new IP address.
01:11:34.000 So Q could always use the static IP to fall back on as a verification method.
01:11:41.000 And this is what Paul was complaining about.
01:11:43.000 He's like, okay, well, suddenly Q is using this new IP. Now, interestingly, Ron's right-hand man during one of these other interviews is like, let me show you.
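A rough illustration of the proxy setup being described, as a minimal Python sketch: if all of a poster's traffic exits through one proxy, every post arrives from the same source address, and a board operator could treat that stable address as an informal fallback check. The moderator-side function and the IP addresses here are hypothetical (documentation example addresses, not real ones).

# Hypothetical moderator-side check; the addresses are documentation examples.
KNOWN_STATIC_IP = "203.0.113.7"  # the one proxy all of the poster's traffic exits from

def looks_like_same_poster(post_source_ip: str) -> bool:
    """Treat a post as 'verified' only if it arrived from the known static proxy IP."""
    return post_source_ip == KNOWN_STATIC_IP

print(looks_like_same_poster("203.0.113.7"))    # True: came through the proxy
print(looks_like_same_poster("198.51.100.42"))  # False: some other origin
# The weakness is obvious: whoever controls the proxy controls the "identity".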
01:11:52.000 Ron was actually messaging me all the way back in early January of 2018 that he thinks it's Steve Bannon.
01:11:58.000 So, and then actually, when this takeover, I think, took place, when Paul Furber, the previous board owner of Q, says that it's a fake Q, some of the first things that get posted are in relationship to Steve Bannon.
01:12:17.000 It's like a Steve Bannon article.
01:12:18.000 So it really looks like right from that moment, someone, most likely Ron, was laying the pipe to create a forensics data set that pointed to Steve Bannon.
01:12:32.000 And I think he was rather disappointed that nobody had picked up on it.
01:12:37.000 And he was like, let me show you.
01:12:39.000 It's so good.
01:12:40.000 We even staged a whole thing where we sent someone out with cameras in front of Avenatti's office, shared that stuff, who lives like 20 minutes or so away from where Steve Bannon's house was.
01:12:53.000 You know, staged this whole thing so that there was a forensic trail that wasn't pointing in their direction.
01:12:57.000 And that's why it confused the hell out of me.
01:12:59.000 I was like, you know, all this data would be such a pain in the ass to fake.
01:13:03.000 Well, it wasn't that the data was fake.
01:13:05.000 It was that the data was designed to look like a certain thing because, of course, Ron would know how to do that.
01:13:11.000 And I'm sure he was just like, ah, he wanted better competition.
01:13:14.000 He wanted somebody to see how smart he was.
01:13:16.000 How much time did you spend thinking about this?
01:13:19.000 I mean, dude.
01:13:22.000 I spent the last few months just trying to, like, purge my brain.
01:13:26.000 Damn it, now I'm going on Joe Rogan's show.
01:13:29.000 I'm going to have to, like, re-upload it all.
01:13:32.000 So what I wanted to get to before is, when the initial Q drops happen and all this stuff starts happening, how does it leak and become a mainstream thing?
01:13:45.000 How does it get out?
01:13:47.000 Like, what is...
01:13:48.000 Is there a specific moment or is it just a slow sort of recognition by people who are conspiratorially minded that there's this gentleman or person or whatever that's posting pretending to be this Trump insider?
01:14:06.000 Like, what is it?
01:14:08.000 That makes it become this huge thing.
01:14:10.000 I don't remember when I started hearing about it, but I assume by the time stuff reaches me, it's leaked out into the mainstream.
01:14:19.000 So what was the event?
01:14:23.000 People started talking about this anonymous insider on the chans almost immediately.
01:14:28.000 It was almost instantaneous.
01:14:30.000 And I think that for those who were trying to get an edge on YouTube, you know, having this...
01:14:38.000 Being able to say, like, oh, there's this secret government insider who's releasing drops, and all of our dreams are coming true.
01:14:43.000 So you had people like Tracy Beans, Jordan Sather.
01:14:49.000 Right from the start, kind of being like...
01:14:51.000 Is this legit?
01:14:52.000 Here's what they're saying.
01:14:53.000 And then it kind of built from there.
01:14:56.000 People are following them.
01:14:58.000 Start checking out the chans.
01:15:01.000 But it was still fairly contained.
01:15:04.000 It was fairly contained up until, like, late December of 2017, when these characters, Tracy Beans, Paul Furber, and Coleman Rogers, and I believe it was Tracy's idea, say, let's start a board on Reddit that's devoted to QAnon.
01:15:21.000 That's gonna reach, you know, a much wider audience.
01:15:26.000 So that's 2017. 2017, yeah.
01:15:29.000 So Reddit is where it branches out.
01:15:31.000 Reddit branches out and then it also branches out when they go on the Alex Jones show.
01:15:36.000 And that becomes a big boom.
01:15:39.000 So then it becomes fun.
01:15:40.000 Mm-hmm.
01:15:41.000 Yeah, that's when the excitement kind of kicks up.
01:15:44.000 But you had a lot of the Anons who were pissed off that they had gone on Alex Jones.
01:15:48.000 They were going mainstream.
01:15:51.000 They wanted everybody to stay anonymous.
01:15:53.000 And they didn't like the celebrity status of it.
01:15:55.000 That's fascinating.
01:15:57.000 Fascinating part of the culture, right?
01:15:59.000 Yeah, really fascinating.
01:16:01.000 And you can see it even in Q's mentality towards Tracy.
01:16:06.000 Okay, so this is great.
01:16:07.000 I talked with all of the board owners for Q. So everybody who had ever been in charge of the craziest thing on the internet that's QAnon, right?
01:16:21.000 Everyone whose board Q had posted on.
01:16:21.000 I asked them all, like, okay, well, did Q ever communicate with you, right?
01:16:25.000 Did Q ever, you know, send you messages?
01:16:28.000 And they said, well, not, like, direct messages.
01:16:31.000 But Q, because of that IP address, would sometimes post anonymously, openly on the boards, so that only the board owner or the moderators would know it was Q. So it was a way of communicating with those who were running the boards without the entire public knowing it.
01:16:50.000 And so they would be able to just go, okay, same IP, bring it up.
01:16:55.000 And Q didn't do this very often, very, very rarely.
01:17:00.000 But one of the board owners was like, there's a really unbelievable moment.
01:17:04.000 My jaw just dropped that Q was saying this stuff.
01:17:08.000 And I was on this call with a couple of other folks who were big into Q at the time, I think.
01:17:17.000 Anyway, they were all like, wait, what is it?
01:17:20.000 What is it?
01:17:21.000 I think we have the...
01:17:22.000 It's like Secret Q drop.
01:17:24.000 Secret.
01:17:25.000 Oh, this is great because Q is just shit-talking Tracy Beans and some of...
01:17:31.000 Just writing like a trash-talking anon.
01:17:34.000 And so this was after the switch to 8chan?
01:17:37.000 Yeah, yeah, yeah.
01:17:37.000 So this is April.
01:17:38.000 I believe it was from April of...
01:17:41.000 2018, and people are really starting to monetize Q at this point.
01:17:45.000 Tracy Beans is kind of moving away.
01:17:48.000 She's built this huge audience off of QAnon.
01:17:51.000 She's kind of rejecting it.
01:17:53.000 And Q, if we scroll up, you can see...
01:17:57.000 R equals 18. So you can see the ID right there, the FC, whatever.
01:18:02.000 This is something that only a board owner or a moderator would be able to see.
01:18:06.000 It's right next to Anonymous.
01:18:08.000 So then you see that little F-C-F-E-3-I. So if you scroll up and you can see a Q drop, that one right there, it has the same ID next to it, the F-C-F. And that's because it's like a hash representation of the IP address.
01:18:25.000 So it's the same.
01:18:26.000 So you know that Q, even though they didn't use the trip code, was also posting these anonymous drops.
01:18:33.000 And we didn't include this in the series, but this is kind of wild that Q was actually making posts that those who were running the boards could actually see.
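To make the ID mechanism concrete: many chan-style boards derive a short per-thread poster ID by hashing the poster's IP, often salted with the thread number, so two posts from the same IP in the same thread share an ID even when one is signed with the Q tripcode and the other is plain Anonymous. Here is a minimal Python sketch under that simplified assumption; the exact scheme 8chan used may differ, and the address is a documentation example.

import hashlib

# Simplified scheme: hash the poster's IP together with the thread number and
# keep a few characters, so the same IP gets the same short ID within a thread.
def poster_id(ip: str, thread_no: int, length: int = 6) -> str:
    digest = hashlib.sha256(f"{ip}:{thread_no}".encode("utf-8")).hexdigest()
    return digest[:length].upper()

ip = "203.0.113.7"              # documentation example, not a real address
print(poster_id(ip, 1001))      # the tripcoded "Q" post in a thread
print(poster_id(ip, 1001))      # an "Anonymous" post in the same thread: same ID
print(poster_id(ip, 1002))      # a different thread: different ID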
01:18:48.000 So you have this one here, R equals 18. Why is that interesting?
01:18:51.000 Why is that unusual that he's making...
01:18:53.000 Well, because he's secretly communicating with the board owners and moderators and only they'd be able to see it.
01:18:59.000 But if you scroll down...
01:19:00.000 So this is only to moderators?
01:19:02.000 Yeah, so you can see...
01:19:03.000 Well, the ones that have Q were public and everybody would know.
01:19:06.000 The ones that are anonymous, no one knows that these are actually Q posting, except for the moderators.
01:19:13.000 So, you can see there, it says Beans.
01:19:16.000 Hey, patrons.
01:19:17.000 No, up above that, it says Beans, you are shit.
01:19:20.000 That's Q writing.
01:19:22.000 Yeah, interesting.
01:19:24.000 That doesn't seem like it makes any sense.
01:19:28.000 Then you go down here, it's like, help me change the world, not us, but her.
01:19:30.000 Scam, scam, scam, what a disgrace.
01:19:32.000 So, this is Q being very upset at Tracy Beans for monetizing Q. For pivoting mainstream.
01:19:43.000 And that's her YouTube channel?
01:19:46.000 It says fake outlets like CNN, New York Times.
01:19:49.000 So that's what it is?
01:19:50.000 Like going on those outlets and then promoting her YouTube channel?
01:19:53.000 Is that what it is?
01:19:54.000 Promoting her YouTube channel, trying to get people to give her money.
01:19:57.000 Basically, you can see here, Bean started at 8K followers, now 77K because of Q. This is Q writing.
01:20:04.000 Now pushing for more money and drops Q to be more accepted mainstream.
01:20:14.000 So Q is like, she got famous off of Q and now is dropping Q and Q is not happy with this.
01:20:20.000 This is someone upset that someone's getting famous.
01:20:26.000 Someone's writing it out.
01:20:27.000 Well, but this is Q writing it.
01:20:28.000 Right, a lot, yeah.
01:20:29.000 Whoever, you know, which was likely Ron at this point, writing this.
01:20:33.000 Which seems like, yeah, this is not like, why the fuck would a government insider give a shit who's becoming famous from distributing what's supposed to be real information, right?
01:20:45.000 Yeah, yeah.
01:20:47.000 And just coming on here and just writing like a typical anon, just like shitposting and bitching.
01:20:51.000 Right, right, right, exactly, yeah.
01:20:53.000 Like when Q was riding to the border, he's just like another anon.
01:20:57.000 How many people at one point in time were doing Q-related YouTube videos?
01:21:06.000 I mean, I don't...
01:21:07.000 A lot.
01:21:08.000 A lot.
01:21:09.000 I mean, there were, I think, 15 or 20, you know, who were, like, the predominant ones, you know, and maybe, like, 10 really big ones, big accounts.
01:21:21.000 We follow a couple of them in the series.
01:21:23.000 You know, Craig had a huge following.
01:21:25.000 Liz had a huge following.
01:21:27.000 Dustin had a sizable following.
01:21:29.000 Jordan Sather had a really big following.
01:21:31.000 You know, and they kind of came at it from different angles.
01:21:34.000 Like Craig was the evangelical approach.
01:21:37.000 You know, Dustin was more of the truther who kind of went after everybody.
01:21:43.000 Craig was the guy who had Donald Trump on his wall.
01:21:45.000 Yeah, yeah.
01:21:47.000 And then you have someone like Jordan Sather who comes from the David Wilcock UFO New Age crowd.
01:21:57.000 Like the reptile people, lizard people or whatever, blue avians, that's his baby.
01:22:03.000 5G. So, yeah, so you could, and so for them, it was like, okay, well, here's a whole new audience that maybe we can tap into.
01:22:13.000 And as Q became this umbrella for all beliefs, all conspiracies, all every, you know, the big tent, it was very useful for, say, someone like Wilcock or Jordan Sather to be like, okay, well, let's inject our worldview into the broader narrative.
01:22:29.000 And I think that's one of the most interesting things about Q is that, you know, Q didn't have complete control, right?
01:22:35.000 It was kind of call and response.
01:22:37.000 Like, what does the audience want to some extent?
01:22:39.000 Because it was an evolving narrative.
01:22:42.000 And the idea that lizard people or blue avians got introduced to a subset of that, you know, subset of that ecosystem isn't something that was introduced by Q itself.
01:22:53.000 It was just something that others who glommed on to it introduced.
01:22:58.000 So, I don't know.
01:23:02.000 It's kind of fascinating to watch how it took on a life of its own.
01:23:05.000 And in some ways, Q didn't have quite as much control as you might imagine.
01:23:10.000 I mean, Q even did Q&As at one point, you know, where people would ask...
01:23:13.000 Really?
01:23:13.000 Yeah, people would ask a question and then Q would be like, well, no, 9-11, not an inside job.
01:23:19.000 You know, aliens, real.
01:23:23.000 You know, those kind of things.
01:23:25.000 Earth is not flat.
01:23:26.000 How does he have time for this?
01:23:29.000 I mean, if you go to Ron's Telegram, look at how much he's been posting every day.
01:23:33.000 If it's Ron.
01:23:34.000 If it's, I mean...
01:23:35.000 You're convinced.
01:23:38.000 I'm convinced, yeah.
01:23:39.000 100%?
01:23:40.000 100%.
01:23:40.000 Wow.
01:23:41.000 I mean, I'm not convinced that Ron was doing it entirely autonomously.
01:23:45.000 You know, he had support.
01:23:47.000 There were overlapping networks that made this possible.
01:23:50.000 But was Ron, you know, the linchpin of it all?
01:23:55.000 Did he have control of the Q account?
01:23:58.000 Yes, that I believe 100%.
01:24:01.000 It seems like it because the shift from 8chan, when 8chan was taken down, and then they come back as 8kun, and then Q starts posting before anybody can post.
01:24:13.000 Oh, yeah.
01:24:14.000 I mean, duh.
01:24:15.000 Right?
01:24:16.000 I mean, that's a big duh, right?
01:24:18.000 That's another huge one, yeah.
01:24:19.000 That's a giant duh.
01:24:20.000 Like, hey, fuckface, how dumb do you think we are?
01:24:24.000 Right?
01:24:24.000 Like, that's...
01:24:28.000 No one else has the ability to post, and you don't know who Q is, and you're not in direct communication with Q, but Q posts.
01:24:36.000 Yeah, yeah.
01:24:36.000 Somehow, yeah.
01:24:37.000 Somehow, no one else was able to do it, really.
01:24:40.000 But I was sitting there trying.
01:24:41.000 He's like, well, Q has high-end military.
01:24:44.000 Who knows how he was able to do it?
01:24:45.000 Just like, how did Q know how many users were on 8chan posting?
01:24:51.000 He's like, I don't even know that.
01:24:53.000 Okay.
01:24:54.000 Okay.
01:24:54.000 Okay.
01:24:55.000 Yeah, but that was a huge tell.
01:24:58.000 I mean, there were lots of little clues I wasn't able to include as well.
01:25:01.000 I went through and analyzed reflections in some of the photos that showed how Q was holding the phone when they were taking pictures, and it was left-handed.
01:25:14.000 I wrote about this on Twitter, and then Ron messaged me.
01:25:20.000 He's like, well, I'm ambidextrous.
01:25:23.000 Oh God.
01:25:24.000 Jim is a character.
01:25:26.000 Jim is a fascinating person.
01:25:28.000 I don't know anyone like that guy.
01:25:30.000 Watching that guy just talk and communicate and watching his mannerisms.
01:25:36.000 And how old is he?
01:25:39.000 Oh, I don't know his age off the top of my head.
01:25:41.000 Mid-50s.
01:25:42.000 He looks a lot older.
01:25:44.000 Right?
01:25:45.000 He looks like a man who's lived.
01:25:48.000 Yeah, like he's had a hard life.
01:25:49.000 But the way he's communicating, it's like he's playing all the time.
01:25:57.000 Like everything's playing.
01:25:59.000 It's very playful.
01:26:00.000 When he's talking to you and he's explaining things, everything...
01:26:05.000 He's got an incredible sense of awareness of how he's being perceived and what he's projecting.
01:26:15.000 Like, I'm watching the way he's talking and I'm like, this is not like a...
01:26:22.000 It's not his first dance.
01:26:24.000 You know what I'm saying?
01:26:25.000 He knows how to talk to people.
01:26:27.000 Well, he was a recruiter in the military.
01:26:30.000 So he did that for a long time before.
01:26:33.000 So he knew how to convince somebody to pick up a gun.
01:26:37.000 Yeah.
01:26:39.000 So he has the gift of gab, and that requires some pretty solid communication skills, I think.
01:26:48.000 But it's interesting that that's lost when you communicate in text.
01:26:53.000 What's interesting about him is when he talks, and you can see him, and you get the whole big picture.
01:27:04.000 Of who he is.
01:27:05.000 This oddball character.
01:27:07.000 In the end, he's got the crazy facial hair and all that jazz.
01:27:10.000 But all of his weird quirks, all the weird things about him.
01:27:15.000 The guy should have a fucking podcast.
01:27:17.000 I don't know if he does.
01:27:18.000 I mean, he YouTubes regularly.
01:27:20.000 Does he?
01:27:20.000 Or BitChute.
01:27:21.000 Or he has his own thing.
01:27:22.000 Are you allowed?
01:27:23.000 Is he allowed to be on YouTube?
01:27:24.000 I don't think...
01:27:24.000 I mean, he's...
01:27:26.000 I'm not sure if he's on YouTube anymore.
01:27:27.000 He created his own version of YouTube.
01:27:31.000 He did?
01:27:31.000 Yeah.
01:27:32.000 He created it?
01:27:32.000 Yeah, it's called, well, it's not quite YouTube, but it's Tiger Network is the name of it.
01:27:37.000 Tiger Network?
01:27:38.000 Yeah.
01:27:39.000 See if he's on YouTube.
01:27:41.000 But I mean, while I'm watching and I'm listening to him talk, I'm like, he's a fun guy.
01:27:45.000 Well, remember, it's edited, so you're cutting out all of the stuff that's not that compelling.
01:27:52.000 I understand.
01:27:53.000 But what I'm saying is, he's really aware of how he's being seen.
01:27:58.000 He is, and I think that that playfulness is very intentional, and it masks something more sinister in a lot of cases.
01:28:04.000 And you see that on the 6th, right?
01:28:06.000 When we're approaching the Capitol, you see that kind of crack.
01:28:11.000 What do you see when you see the whole 6th thing with him?
01:28:16.000 I mean, he's a little...
01:28:20.000 In the approach to that, he's almost starry-eyed.
01:28:25.000 Like, he's looking around and going...
01:28:28.000 I mean, he says it, right?
01:28:30.000 He's like, this is the most non-business thing I've ever done.
01:28:33.000 Right.
01:28:34.000 And he...
01:28:37.000 I think it drained him financially keeping Q online in the long run.
01:28:44.000 He had to sell his pig farm.
01:28:46.000 How did it drain him financially?
01:28:47.000 What was the hit?
01:28:49.000 Well, it was expensive to keep 8chan online, for one.
01:28:54.000 Expensive to run?
01:28:56.000 As they describe, it's a yacht that you just sink money into.
01:29:01.000 And Fred Brennan, their opposition who had created 8chan, made it difficult to keep it going.
01:29:08.000 Made it expensive legally and technically for them.
01:29:13.000 You know, they had a lot of legal costs associated with all of that.
01:29:17.000 I don't know.
01:29:19.000 You know, they had to change locations for their businesses.
01:29:23.000 They had to change server setups.
01:29:25.000 I mean, you know, he just made things more costly.
01:29:33.000 And Tom, I guess, said that Jim is somebody who, when he gets money, he kind of spends it.
01:29:37.000 So sometimes he has a lot and sometimes he doesn't have very much.
01:29:42.000 Fred would say that that's all bullshit and that Jim is actually super wealthy and it's all just a character he's playing.
01:29:49.000 Because they have Five Channel in Japan, which is an incredibly popular Chan.
01:29:53.000 That's, I think, where the majority of their income stream comes from.
01:29:56.000 And you see some of that Chan drama play out where, you know, he had this split with his old business partner, Hiroyuki, he's super famous in Japan.
01:30:04.000 Might also explain why there's the whole Q-Japan Flynn thing we were talking about earlier, because they have headquarters there, so if they're gonna get Q to be popular anywhere, it's gonna probably be in Japan.
01:30:14.000 You know, and Chan culture is also much bigger in Japan, I guess, in part because if I was to guess or I've sort of heard that when you're in an environment where you feel like you're socially not as free, you need more outlets and the Chan serve as that outlet.
01:30:33.000 And the chans in Japan, are they Japanese characters?
01:30:37.000 They are, yeah.
01:30:38.000 And can you translate it?
01:30:39.000 Is it translatable?
01:30:40.000 Like, is there a button that you can hit or anything?
01:30:42.000 I don't think it's quite that convenient.
01:30:44.000 It's not like on Twitter where you translate tweets.
01:30:46.000 So my point was, like, it's all Japanese traffic, essentially.
01:30:51.000 Yeah, yeah, Japanese traffic.
01:30:53.000 And it makes it really hard to really know what's going on in Japan.
01:30:59.000 If you want to have your business operations somewhere, it's difficult to do research.
01:31:03.000 We just released the series in Japan with a Japanese dub and everything.
01:31:09.000 I'm very interested to see how it's received there.
01:31:14.000 And that's where Ron is still.
01:31:16.000 I mean, we never...
01:31:17.000 That end sequence where I sort of confront him on, you know, why I think he might be Q, and he kind of comes out with it.
01:31:30.000 Initially, we were supposed to meet again to film, and I wanted to kind of confront him on sort of the list of reasons I thought he was Q. And he wanted to do it on an ice wall.
01:31:42.000 So, like, ice climbing.
01:31:44.000 He had just gone on one ice climbing trip.
01:31:48.000 He apparently has a YouTube channel where he talks about God and sings hymns.
01:31:54.000 What?
01:31:55.000 That sounds about right.
01:31:56.000 Yeah, it's like it's unrelated to all of his other stuff.
01:31:59.000 Oh, my God.
01:32:00.000 There's only one video on it right now, and that's it.
01:32:02.000 Just one?
01:32:03.000 Things might have gotten deleted.
01:32:04.000 I mean, this is what he...
01:32:06.000 He calls himself Watkins Xerxes?
01:32:12.000 That's how I had to find it.
01:32:13.000 It wasn't coming up with his name.
01:32:17.000 And he's in the Philippines still?
01:32:19.000 No, he's in California.
01:32:21.000 Really?
01:32:21.000 Oh, that's right.
01:32:23.000 That's right, he moved to California.
01:32:26.000 Why did he have to do that again?
01:32:32.000 I usually try to keep people's families out of it, but it was a family reason.
01:32:38.000 So, the January 6th thing again, you were saying that you see more of a sinister aspect of his personality on the January 6th thing.
01:32:50.000 Like, what do you mean by that, other than the fact that he said it's the most non-business thing?
01:32:55.000 Well, you can see a fire in his eyes, right?
01:32:57.000 But don't you think that that's also just the moment itself?
01:33:00.000 Before anyone attacks the Capitol building itself, you know, you're there in this wild, crazy mass of humans.
01:33:08.000 Have you ever been to like a- But I mean, I had seen that look in his eyes before.
01:33:12.000 You know, that wasn't the first time.
01:33:14.000 When had you seen that before?
01:33:16.000 When I would bring up Fred.
01:33:18.000 Did he get upset?
01:33:20.000 Mm-hmm.
01:33:20.000 You know, I think the first time that we...
01:33:23.000 When he had to go and confront Congress, I mean, he was obviously pretty riled up about that.
01:33:29.000 I mean, you know, he's used to being the boss in the Philippines.
01:33:32.000 So being in a situation where he maybe had less power...
01:33:36.000 And he would even tell you that he's, you know, he can be volatile.
01:33:42.000 And the first day we met and filmed with him, at the end of that day, he said to me, you know, you're the enemy, right?
01:33:51.000 And I was like, I don't think I'm your enemy.
01:33:55.000 Well, if you were the enemy, why is he talking to you?
01:33:58.000 It's voluntary, right?
01:33:59.000 The whole thing was voluntary.
01:34:01.000 Like, why did they agree to do that?
01:34:03.000 Did they think that they were smarter than you?
01:34:05.000 They were going to be able to, like, keep shucking and jiving to the very end, and at the end of it, you would think it was Steve Bannon or whoever?
01:34:12.000 I mean, in the beginning, I had first reached out to Fred, and I didn't know that their relationship was dissolving, that the tension was growing between Fred and Jim and Ron.
01:34:25.000 You know, and I was genuinely interested in the free speech side of what 8chan was doing.
01:34:33.000 And I didn't think that they were behind Q when I went there.
01:34:39.000 I just went to talk to them also in part because if anybody knew who was behind Q, it would be those with the technical data.
01:34:47.000 Everything else is just, you know, can be noise.
01:34:52.000 You can sort of see what you want in the writing.
01:34:57.000 But the data itself was sort of the most valuable.
01:35:00.000 Of course, after I left, you know, I was like, God, these guys are suspicious.
01:35:05.000 Right away.
01:35:06.000 Oh, right.
01:35:07.000 I mean, as I was doing those interviews, I was like, what is going on?
01:35:09.000 You know what got me?
01:35:10.000 The tape on the glasses.
01:35:12.000 I was like, what the fucker?
01:35:14.000 You got an expensive watch on and you got tape on your glasses.
01:35:17.000 Like, what's going on?
01:35:18.000 You know, Ron reacted to that, right?
01:35:19.000 What?
01:35:20.000 He's like, it was dental floss.
01:35:24.000 Reacted to what?
01:35:25.000 Your...
01:35:25.000 Me?
01:35:25.000 You.
01:35:26.000 Really?
01:35:26.000 Yeah.
01:35:26.000 Dental floss.
01:35:27.000 Yeah, he was using dental floss instead of tape.
01:35:30.000 That's what it was?
01:35:31.000 Yeah, I guess so.
01:35:33.000 Why didn't you just buy a new pair of glasses?
01:35:35.000 Well, he's...
01:35:36.000 I mean, look, I think I even told him this.
01:35:39.000 I was like, I think it's just part of the character you were playing, right?
01:35:42.000 Yeah, that's what I thought.
01:35:48.000 Well, I have a bunch of comments from his Telegram in relation to it, if you want to look at them.
01:35:48.000 No, I don't.
01:35:49.000 Probably not.
01:35:50.000 Was he upset about it?
01:35:52.000 I mean, he was just being a little sassy about it.
01:35:55.000 That's all.
01:36:00.000 It was just an opportunity to be like, actually, it was dental floss.
01:36:00.000 Well, whatever the fuck it was.
01:36:02.000 It's so on the nose.
01:36:04.000 It's just a character Ron's playing.
01:36:05.000 Ron is a theater nerd.
01:36:08.000 He's a huge theater nerd.
01:36:09.000 I played sports, but I was also a theater nerd.
01:36:14.000 So you know both worlds?
01:36:15.000 I know both worlds, yeah.
01:36:17.000 I definitely know the type, though Ron just takes things to the extreme more than anyone I've ever met.
01:36:24.000 We discovered when we were going to Reno that we had both been in The Music Man and both been in the barbershop quartet and both still remembered the music to it.
01:36:33.000 But Ron has this idea that we're all, you know, the old Shakespeare thing, we're all just actors on stage.
01:36:41.000 But then the question is, what part do you want to play if that's true?
01:36:47.000 And, you know, he seems to like that villain role.
01:36:54.000 He's drawn more to that.
01:36:57.000 And I think that the glasses are just a character.
01:37:00.000 I mean, the watch, he was covering up his watches, his fancy, fancy watches.
01:37:05.000 And when he stopped covering it up is when we were in Reno filming, and then he, like, kind of rolled it up to show me that he was now wearing a Casio watch.
01:37:13.000 And I went back and looked through all the footage and said, like, okay, like...
01:37:16.000 Did he have on fancy watches?
01:37:19.000 Because Q would use fancy watches to confirm he's still in fact Q and fountain pens.
01:37:25.000 That was a good red herring for a bit.
01:37:28.000 And maybe Jim was also in on it or got in on it at some point.
01:37:32.000 Because he's really into pens.
01:37:33.000 He's super into pens.
01:37:35.000 As Ron would say, he's autistic about pens. That would be Ron's description of it.
01:37:41.000 But Ron's into them too.
01:37:43.000 He kind of let that slip at one point.
01:37:45.000 He's also into pens.
01:37:45.000 Yeah, he's like, oh, I tried to buy this $3,000 pen and I couldn't get it.
01:37:49.000 You can hear it.
01:37:50.000 It's in one of the scenes in Reno.
01:37:54.000 So yeah, Ron's also super into fountain pens, just not as obsessed.
01:37:59.000 What a bizarre thing to get into.
01:38:00.000 Pens.
01:38:02.000 Do they write a lot?
01:38:03.000 But you can see, this is all for show, right?
01:38:05.000 So he puts on the Casio watch.
01:38:06.000 He's thinking very much about the image he's presenting and how it's going to be interpreted.
01:38:13.000 Even the green hat.
01:38:15.000 Do you know what the green hat means in China?
01:38:18.000 No.
01:38:19.000 It means you're a cuck.
01:38:20.000 Really?
01:38:21.000 Yeah.
01:38:22.000 You've been made a cuckledove.
01:38:25.000 Cuckledove?
01:38:26.000 Yeah.
01:38:26.000 Oh, I never heard that term.
01:38:28.000 Maybe I'm using it wrong.
01:38:30.000 I like it.
01:38:31.000 If you invented it, congratulations.
01:38:33.000 It's a good one.
01:38:34.000 Here we go.
01:38:35.000 Cuckledove is fun.
01:38:36.000 Have you heard of cuckledove?
01:38:39.000 No, Jamie's pretty deep.
01:38:41.000 How would you spell that?
01:38:43.000 Well, no, like you've been made a cuckold of.
01:38:46.000 Oh, cuckold of.
01:38:48.000 Two separate words.
01:38:50.000 Well, I think you just inadvertently made a good word.
01:38:53.000 Cuckoldove.
01:38:54.000 Cuckoldove is nice.
01:38:55.000 Oh, like a dove.
01:38:56.000 Like a pair of them?
01:38:57.000 Yeah, like a cuckoldove.
01:38:59.000 You're like a little peaceful cuck.
01:39:01.000 On the third day of Christmas.
01:39:07.000 Dove is the peace bird, right?
01:39:10.000 You're a cuckoldove.
01:39:12.000 I like it.
01:39:13.000 You wear that green hat around China and women will approach you and this is why he wears it.
01:39:18.000 They will approach you and do what?
01:39:20.000 You know what that green hat means.
01:39:23.000 So it's a pickup trick.
01:39:25.000 What kind of hat?
01:39:26.000 And also he's leaning into the cuck aspect.
01:39:30.000 Was it a baseball hat?
01:39:32.000 I don't remember what it was.
01:39:32.000 It's just like a big green, kind of a baseball hat.
01:39:35.000 I'm trying to remember it.
01:39:35.000 It's like a baseball hat.
01:39:37.000 Okay.
01:39:37.000 So that's why he wears that hat.
01:39:39.000 Yeah.
01:39:40.000 So he's very conscious of his image.
01:39:42.000 The whole martial arts thing was odd too.
01:39:44.000 When I'm watching him punch the makiwara and all that jazz.
01:39:47.000 Yeah.
01:39:48.000 Because it doesn't look like he does that a lot.
01:39:50.000 You know what I'm saying?
01:39:53.000 I'm not a martial arts expert.
01:39:55.000 It's very performative.
01:39:56.000 It seems very performative.
01:39:57.000 Because I'm watching them throw the punches.
01:40:00.000 They're not efficient.
01:40:02.000 It's not like a guy who has done that a lot.
01:40:04.000 Have you watched video footage of someone hitting a makiwara?
01:40:09.000 People who do that on a daily basis.
01:40:11.000 I did not side-by-side them, no.
01:40:13.000 No, you don't need to.
01:40:15.000 He just seems like a guy who's like, you know, maybe he's done it a few times.
01:40:19.000 But he's making it seem like this is like a common aspect of his day, this practice.
01:40:25.000 Right, well he exaggerates everything.
01:40:26.000 Yeah.
01:40:27.000 So, like he, recently on his telegram, he's like, I do ice climbing, I'm an ice climber.
01:40:33.000 And then he posted images from the only time I know he's ever done ice climbing.
01:40:38.000 So I imagine it might be similar with a martial arts thing.
01:40:41.000 Well, if you do martial arts a couple times, you're technically a martial artist.
01:40:47.000 Like, I play basketball with my kids sometimes.
01:40:50.000 I am not a basketball player, but I guess I could say I play basketball.
01:40:54.000 Yeah.
01:40:54.000 Ron does a lot of things.
01:40:58.000 He's a jack of all trades.
01:41:01.000 He said after Q stopped posting and he stopped being the admin of 8kun at the same time, he's like, I'm going to be a woodworker now.
01:41:12.000 I'm going to start making crafts.
01:41:17.000 I'm leaving 8kun behind to become a woodworker.
01:41:20.000 He actually posted this really absurd video where he went out into the woods and kind of...
01:41:30.000 This happened maybe a month or two after the series aired.
01:41:36.000 It was like...
01:41:37.000 Here's my new character that I'm playing, and I'm, you know, leaving the old cue thing behind.
01:41:43.000 I think I have the video here.
01:41:47.000 It's sort of like, it feels like he's playing the ASMR crowd a little bit, and he dubbed the whole thing.
01:41:53.000 It's very...
01:41:55.000 The ASMR crowd is strange.
01:41:58.000 It's strange, but it's really compelling.
01:42:00.000 Some of it's very compelling.
01:42:01.000 Like there's one guy, I go to his channel, and he cooks out in the woods.
01:42:07.000 And he chops the wood, and he makes the fire, and there's no conversation at all.
01:42:11.000 It's crackling, and then he's eating the food.
01:42:14.000 And like, you know, video might be 30 minutes long, and it's got millions of views.
01:42:20.000 Like, this is wild.
01:42:21.000 Like, people are really into just, like, listening and watching people do stuff where they don't talk.
01:42:26.000 I mean, do you want to listen to Ron, you know, when you put your head on your pillow at night?
01:42:30.000 You're like, maybe just Ron's voice lulling me to sleep.
01:42:33.000 Well, that's not necessarily ASMR, right?
01:42:35.000 Well, if you heard the way he did it, it very much feels like he was going for the ASMR crowd.
01:42:43.000 It's so...
01:42:44.000 I mean, he, like his dad, they're clever, and it's funny how playful this...
01:42:52.000 It's like an internet person out in the world sort of interacting.
01:42:58.000 We'll hear this.
01:42:59.000 I don't know if that's six.
01:43:01.000 This is it.
01:43:01.000 This is the end of it, I think.
01:43:04.000 Leaving 8kun, I have hand-built many wooden toys.
01:43:11.000 Donating to children and orphanages throughout my region of the world.
01:43:21.000 That being said, I am no longer affiliated with 8kun and have no interest discussing 8kun or Q-related material.
01:43:34.000 I have always been opposed to violent criminal acts in the name of any group... He's a master.
01:43:48.000 Master troll.
01:43:49.000 Master troll.
01:43:51.000 I do believe strongly in the fundamental right to legally express our beliefs and ideas in ways protected by the Constitution.
01:44:01.000 Is he speaking in Japanese and then dubbing it in English?
01:44:06.000 I mean, just wait for the last bit here.
01:44:09.000 Watch this.
01:44:10.000 I am not Q and have never posted as Q. Why is it dubbed?
01:44:17.000 Oh my god.
01:44:18.000 Why is it dubbed?
01:44:20.000 And he puts on a black hat.
01:44:22.000 He's just like, never posted as Q.
01:44:23.000 It's amazing.
01:44:23.000 And the whole thing is just like a...
01:44:25.000 It's great.
01:44:26.000 It's great.
01:44:27.000 I mean, it's very entertaining.
01:44:28.000 It's very fun.
01:44:29.000 Shitting in the town square.
01:44:31.000 That's why I included...
01:44:32.000 That's shitting in the town square?
01:44:33.000 Kind of, yeah.
01:44:34.000 How's that?
01:44:35.000 Because he's just messing with people.
01:44:38.000 He's hijacking culture to get us talking about it, to get people...
01:44:43.000 But is that what he's doing, or is he just having fun?
01:44:45.000 I mean, it's kind of having fun.
01:44:47.000 That seems like fun to me.
01:44:49.000 I mean, look.
01:44:49.000 But also when he puts on his black hat, like, I've never posted as Q, and then looks away from the camera.
01:44:54.000 It's so performative.
01:44:54.000 It's super performative.
01:44:55.000 But it's also the weirdness of the dubbing.
01:44:57.000 Like, what is that?
01:44:58.000 What's the original?
01:44:59.000 Is it originally in Japanese?
01:45:01.000 I mean, Ron would never, it's like he would always do a funny walk, you know?
01:45:06.000 I have no idea if he even said the same things.
01:45:10.000 He probably just went out, recorded himself, came back in and was like, alright, I'm gonna, you know, read my script now.
01:45:16.000 But he's talking in the thing and it doesn't match up.
01:45:20.000 Yeah.
01:45:21.000 He did not...
01:45:21.000 I brought that up.
01:45:22.000 He didn't worry about that.
01:45:23.000 I think he did it on purpose.
01:45:24.000 Oh, he definitely did it on purpose.
01:45:25.000 For fun.
01:45:26.000 Just to make it as weird as possible.
01:45:28.000 Exactly.
01:45:28.000 You know?
01:45:29.000 But why do you say that it's like hijacking?
01:45:31.000 Well, the idea is like to...
01:45:34.000 So the reason I included the Diogenes bit in one of the episodes is because I felt it was the most illustrative of Ron's worldview.
01:45:43.000 What is it?
01:45:43.000 What is the Diogenes...
01:45:45.000 The cynic.
01:45:46.000 So cynicism.
01:45:47.000 And this is something that Ron had brought up along the way.
01:45:49.000 This idea that, like, a dog can shit in the middle of the town square, why can't I? Okay, okay.
01:45:53.000 You know, so it is super performative.
01:45:56.000 And I think what he's just, he's leaning into assumptions and trying to, and I mean, you could also just call Diogenes a troll.
01:46:06.000 Like, that's the same thing.
01:46:07.000 It's like, why can't I do this thing?
01:46:09.000 It gets people talking about the, you know, look at that guy.
01:46:11.000 He just took a shit in the middle of a town square.
01:46:13.000 Who does he think he is?
01:46:14.000 That's so, you know...
01:46:15.000 And I think that really helps explain how Ron sees the world.
01:46:21.000 You know, he's partially treating it like it's a game, but...
01:46:29.000 If he has any religion, I would say it's that flavor of cynicism.
01:46:34.000 It's like, Socrates gone mad.
01:46:37.000 I don't know, man.
01:46:38.000 That was funny to me.
01:46:39.000 Oh, it's funny.
01:46:39.000 I mean, that's why I brought it here, because I thought it was super weird and kind of hilarious.
01:46:44.000 I do not dislike them.
01:46:47.000 At all.
01:46:48.000 I mean, I don't...
01:46:49.000 Part of me with all this kind of QAnon stuff is like, and many things online, and it gets into this conversation of censorship and whether or not censorship is necessary or whether it's evil, whatever.
01:47:04.000 The thing about it is, all these things, is it's not...
01:47:08.000 It doesn't work on me in terms of, like, the Q drops and all that stuff. I'm not, I never got interested, right? So I never got invested in it. I mean, I got invested, but in the way that I wanted to figure out who is behind it. You make a documentary, a different flavor of obsession. But, I mean,
01:47:28.000 Ron is fucking with misfits.
01:47:29.000 If you look at what he's done in recent...
01:47:32.000 I mean, look, I think you can find likable things about anybody.
01:47:37.000 And I think that if you look at what Ron's been doing since Q sort of stopped and what he's been up to on Telegram...
01:47:48.000 I mean, he's been running these same kind of hype trains to get people really excited that something is going to happen, and then it doesn't.
01:47:54.000 I think the most insane one, though, that has happened in recent times, he got this woman who was working in an election office in Colorado.
01:48:07.000 She sent him some documents.
01:48:10.000 He started trumping up this idea that there was this whistleblower, this Dominion whistleblower.
01:48:15.000 It was going to be the...
01:48:16.000 You know, the end-all, be-all.
01:48:19.000 She had revealed the passwords to the system, and this was going to blow open the whole thing.
01:48:26.000 He gets tons of new followers on Telegram.
01:48:30.000 He's telling people, spread the video.
01:48:34.000 Spread it far and wide.
01:48:36.000 This person's a hero.
01:48:38.000 And he releases this video.
01:48:41.000 And we already knew all of this stuff.
01:48:48.000 And people started reporting that it had come with malware.
01:48:52.000 The video?
01:48:53.000 Yeah.
01:48:54.000 They started showing their calendars that were full of virus notifications.
01:48:58.000 Things couldn't be shared.
01:48:59.000 So it sounds like you told people to spread this video far and wide and it was laden with malware.
01:49:05.000 And what was the malware attempting to do?
01:49:07.000 I don't actually know what the malware...
01:49:09.000 Probably create a botnet, something like that.
01:49:11.000 So do you think that he released the malware?
01:49:14.000 So you can see, this is what people were showing on there.
01:49:17.000 Put it down to protect your priceless data.
01:49:19.000 Oh my god.
01:49:20.000 And this is a Google Calendar?
01:49:22.000 Is that what that is?
01:49:23.000 This is an iCalendar, I think.
01:49:24.000 Like someone's iPhone.
01:49:25.000 Wow.
01:49:26.000 You know, so lots of people were saying this.
01:49:28.000 And then it gets even crazier because Lindell has this whole cyber symposium.
01:49:31.000 They're going to reveal that the whole election was a fraud.
01:49:33.000 They bring the whistleblower out.
01:49:35.000 What happened is Ron essentially doxxed her when he released the information, even though he said that he was scrubbing everything.
01:49:44.000 He released some passwords that got traced back to her, and now she's entangled in a big legal suit.
01:49:50.000 But essentially she ended up doing the thing that she had claimed the other side was doing.
01:49:56.000 You know, she revealed she somehow got access to the password that, I think, the Secretary of State, the government, was supposed to be the only ones who had access to this.
01:50:07.000 And then she ended up releasing it.
01:50:08.000 So it's a huge, like, legal problem for her.
01:50:11.000 You know, Ron didn't really protect her in that situation.
01:50:16.000 And it's created chaos.
01:50:19.000 So talk about fucking with misfits.
01:50:21.000 Yeah.
01:50:22.000 Wow.
01:50:23.000 Is it possible that he didn't know about the malware?
01:50:28.000 No, because he cut the video.
01:50:30.000 He put it together and then he released it.
01:50:33.000 Are you saying that his video purposely had malware in it?
01:50:38.000 That is the indication, yes.
01:50:42.000 Based on all of the comments after the video got released from people who had been following his Telegram for some time.
01:50:48.000 Like, this thing's full of malware.
01:50:49.000 Why the fuck would someone do that, though?
01:50:51.000 It seems like it would be so obvious that that's what...
01:50:54.000 Well, maybe he didn't know what the outcome would be.
01:50:56.000 Could have been a prototype.
01:50:58.000 I don't know why he would bother doing something like that.
01:51:00.000 But again, he is someone who...
01:51:03.000 Go back to the Diogenes thing.
01:51:06.000 I think for him, all of this is just entertainment.
01:51:10.000 So even releasing malware would be fun.
01:51:12.000 Like, haha, I got everybody to click this video.
01:51:14.000 It's all shit you already knew.
01:51:16.000 And now you have malware on your computer.
01:51:17.000 I mean, Q is malware.
01:51:20.000 It's releasing a lot of ideological or conceptual viruses into people's minds and getting people to believe that all these arrests were going to happen that never happened.
01:51:30.000 Educate me on Telegram, because I know very, very little about it.
01:51:34.000 I've never been on Telegram.
01:51:36.000 Is it essentially for people that got banned from other social media platforms or that feel like they're being censored?
01:51:43.000 Yeah, I mean, it's an encrypted platform.
01:51:46.000 Is it all right-wing?
01:51:48.000 It doesn't use algorithms.
01:51:50.000 No, I mean, it's a messaging platform.
01:51:53.000 Oh, so it's not like a Twitter?
01:51:55.000 No, it's not quite like a Twitter.
01:51:57.000 So how the hell do you...
01:51:59.000 You can just follow people on there.
01:52:01.000 So it started as a messaging platform, but you can have gigantic groups.
01:52:06.000 And the guy who started it, like, you know, he...
01:52:08.000 I think he used to...
01:52:10.000 And this is a little outside of my expertise.
01:52:12.000 He was, I believe, like, uh...
01:52:14.000 He's Russian.
01:52:16.000 He had...
01:52:17.000 I think maybe he'd worked at VKontakte before.
01:52:19.000 He left, started Telegram.
01:52:21.000 Russia, not his biggest fan.
01:52:24.000 Um...
01:52:25.000 They raised a bunch of money in some crypto ICO. I think it's probably helped their technology to scale.
01:52:31.000 And they are, you know, far more permissive when it comes to people using their platform.
01:52:40.000 Technically, unless you're a member of the group, you shouldn't be able to see what people are posting.
01:52:44.000 But you can forward messages now from, you know, what one person wrote to another group.
01:52:48.000 You can follow people on there.
01:52:50.000 And so Ron has actually managed to accrue, I think, over 430,000 followers at this point on Telegram, where he can post something and share it and then people can just comment in the aftermath.
01:53:01.000 And there is a kind of retweet functionality in the form of forwarding messages, but it doesn't have the same kind of amplification tools built into it.
01:53:13.000 So...
01:53:13.000 It's just...
01:53:14.000 Because there's no algorithm.
01:53:15.000 Because there's no algorithm.
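A minimal sketch of the distinction being drawn here: a chronological channel feed versus an engagement-ranked one. The Post class, its fields, and the scoring weights below are invented for illustration; this is not Telegram's or Twitter's actual code.

```python
# Toy contrast: a feed with no algorithm (newest first) versus a feed where
# engagement decides what gets amplified. All names and numbers are hypothetical.
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    text: str
    timestamp: float   # seconds since some epoch
    forwards: int = 0  # times subscribers re-shared it to other groups
    likes: int = 0

def chronological_feed(posts):
    """Channel view with no ranking signal: newest first, nothing else."""
    return sorted(posts, key=lambda p: p.timestamp, reverse=True)

def engagement_ranked_feed(posts):
    """Algorithmic view: whatever drew the most engagement rises to the top."""
    return sorted(posts, key=lambda p: p.likes + 2 * p.forwards, reverse=True)

posts = [
    Post("channel_a", "calm update", timestamp=1000, forwards=1, likes=3),
    Post("channel_a", "outrage bait", timestamp=900, forwards=40, likes=200),
]

print([p.text for p in chronological_feed(posts)])      # ['calm update', 'outrage bait']
print([p.text for p in engagement_ranked_feed(posts)])  # ['outrage bait', 'calm update']
```

Forwarding copies a post to another group verbatim, but nothing in the first function ever reorders the feed by engagement, which is the amplification gap being described.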
01:53:16.000 But so is it like Twitter where you could just like, I could follow you on Twitter and I can go say, oh, what's he up to?
01:53:22.000 And then click on it and then see all of your tweets?
01:53:26.000 Yeah.
01:53:26.000 Yeah.
01:53:26.000 You can go back and see everything Ron's done or anybody who has a Telegram feed is done unless they've deleted it.
01:53:32.000 So it's a messaging app, but it's also like a social media app as well?
01:53:36.000 I think it's evolving into a social media app.
01:53:39.000 So it was initially just like Signal?
01:53:42.000 Yeah, I mean, that's how I'd use it initially.
01:53:44.000 And then I was surprised to hear, oh, wait, it had been sort of retooled so that, you know, you could follow people and create sort of big followings on there.
01:53:53.000 And how does someone become aware of it?
01:53:55.000 Is it just word of mouth in terms of, like, following people and...
01:54:00.000 You can say, follow this person.
01:54:03.000 So if you have a big Telegram account, you can forward somebody else's message or say, hey, here's another handle.
01:54:12.000 Follow this person.
01:54:14.000 I'm not the foremost expert on Telegram, so I may not know exactly what its origin was, but I do remember that initially I was using it primarily as an encrypted messaging tool.
01:54:24.000 And then pretty quickly, several years ago, you know, crypto groups and other things were using that as a, you know, means for communicating with a large audience.
01:54:36.000 So it's not essentially designed as like a Twitter replacement, but it's being morphed into something along those lines.
01:54:43.000 It's being utilized in a similar, but I think, better way for society.
01:54:49.000 I think the fact that it doesn't have algorithms makes it a, and the fact that it's encrypted, and that the company itself isn't mining your data, and that's not the...
01:54:57.000 I don't know what their business model is, but maybe they don't know yet.
01:55:01.000 Right.
01:55:01.000 Well, didn't Twitter lose massive amounts of money for a long time?
01:55:06.000 Like, they weren't making any money, right?
01:55:07.000 But they were worth a lot of money on paper.
01:55:10.000 Because users are valuable.
01:55:10.000 Yeah.
01:55:11.000 Yeah, but they didn't exactly...
01:55:13.000 I don't know.
01:55:13.000 Maybe they know how to monetize it now, right?
01:55:17.000 Right.
01:55:18.000 But for the longest time.
01:55:20.000 Yeah, yeah.
01:55:21.000 I mean, once they went public and then kind of figured out how to use the conversations that were happening, have people pay for, you know, to get stuff trending and have ads served.
01:55:33.000 What is the replacement for Twitter?
01:55:35.000 Like, when people get kicked off of Twitter, what is the standard?
01:55:38.000 Where's the standard places they go?
01:55:39.000 I mean, there's a lot of companies that are trying to be the replacement.
01:55:43.000 I don't think any have really stood out.
01:55:45.000 I think Telegram is winning at this point.
01:55:50.000 Because it is a little bit of a...
01:55:51.000 It's a little evolved compared to like a Gab or something along those lines.
01:55:55.000 Gab was one of the things that had really picked up, but...
01:55:58.000 You know, Telegram just has better technology, too.
01:56:01.000 I mean, the fact that it is encrypted makes an improvement.
01:56:04.000 It's not mining user data in the same way.
01:56:07.000 How many users does Gab have?
01:56:09.000 Do you know?
01:56:09.000 I don't know.
01:56:10.000 But that became popular for quite a few people that did get banned from social media platforms, right?
01:56:19.000 Yes.
01:56:19.000 So I think, you know, Ron and Andrew Torba, who I believe is the CEO of that company...
01:56:27.000 And Ron had had an account there.
01:56:29.000 I think they had some kind of data leak that was problematic for them.
01:56:34.000 That Ron had a data leak?
01:56:36.000 Well, not Ron had a data leak, but the Gab had a data leak.
01:56:38.000 Oh.
01:56:39.000 That was problematic for them.
01:56:41.000 And then there's Minds, too, right?
01:56:43.000 Yeah, there's a bunch of these.
01:56:45.000 Speech platform?
01:56:46.000 Mm-hmm.
01:56:47.000 Yeah, well, they say that they're...
01:56:48.000 But what do we even mean when we say that, right?
01:56:51.000 Because, you know, I think one of the things that people misunderstand about Section 230 is that it doesn't...
01:56:59.000 It actually...
01:57:00.000 What it does is it protects companies who want to moderate.
01:57:04.000 Explain Section 230?
01:57:06.000 Yeah.
01:57:06.000 So Section 230 is part of the Communications Decency Act, which many people in the digital rights space would say sort of created the Internet.
01:57:18.000 In some ways, it is the First Amendment of the Internet in that it allows companies to...
01:57:26.000 To foster speech on their platforms in line with the First Amendment or not.
01:57:32.000 It can either be as permissive as you want it to be or you can moderate as much as you want.
01:57:37.000 In fact, there was a court case early on that is what led to this.
01:57:43.000 Where, you know, I think it was Prodigy being sued for something that someone had posted on their site.
01:57:48.000 And then they started moderating, you know, moderating.
01:57:51.000 And this is when they determined that actually, you know, you can moderate but not be responsible for the content that's being published there.
01:57:59.000 This is what allows comment sections.
01:58:01.000 This is what allows for social media to exist.
01:58:04.000 It really is the thing that drives the internet.
01:58:08.000 A lot of times people will say, well, this is a handout to big tech, which isn't quite accurate either.
01:58:15.000 It's really the thing that allows small companies to...
01:58:18.000 I think?
01:58:40.000 Then they can get in trouble.
01:58:41.000 But if we got rid of Section 230, which I've heard people say, well, if I can't say whatever I want on Twitter, let's get rid of Section 230. It's like, well, do you think that these companies are going to be more or less permissive if they're liable for every single thing that's there?
01:58:58.000 And really what's going to happen, what some of these bigger companies are going to drive towards is using AI moderation that only they can afford.
01:59:06.000 And when something goes wrong, they're just going to blame, you know, say, bad AI, right?
01:59:10.000 And then meanwhile, you know, competitors, smaller companies that can't afford that moderation will simply be edged out.
01:59:17.000 So, if Twitter was somehow liable for everything that was on the site, they would probably integrate a lot of these AI moderation tools.
01:59:26.000 And we've seen how well that works.
01:59:30.000 You know, it's not particularly good at determining what should or should not be allowed online, and it ends up casting a far wider net.
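A toy sketch of the trade-off just described: if liability pushes a platform to cast a wider net, an automated filter's threshold drops and more lawful posts get swept up. The example posts, risk scores, labels, and threshold values below are invented for illustration and come from no real moderation system.

```python
# Each entry: (text, model_risk_score, actually_violating). All values are made up.
posts = [
    ("news report quoting extremist rhetoric", 0.45, False),
    ("satire about a conspiracy theory",       0.50, False),
    ("genuine call to violence",               0.90, True),
    ("documentary clip analyzing Q drops",     0.40, False),
]

def moderate(posts, threshold):
    """Remove everything the model scores at or above the threshold."""
    removed = [(text, violating) for text, score, violating in posts if score >= threshold]
    false_positives = sum(1 for _, violating in removed if not violating)
    return len(removed), false_positives

# Permissive threshold: only the clear violation comes down.
print(moderate(posts, threshold=0.8))   # (1, 0)

# Liability-driven threshold: the net widens, lawful speech gets removed too.
print(moderate(posts, threshold=0.45))  # (3, 2)
```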
01:59:39.000 Yeah.
01:59:40.000 So, yeah, I don't...
01:59:42.000 That doesn't seem to be the solution here.
01:59:45.000 But I do think that we have a challenge given the scale of these companies.
01:59:49.000 You know, how many people use Twitter?
01:59:51.000 How many people use Facebook?
01:59:53.000 When they become the arbiters of truth and are the primary place that many people are getting their information from, where news is being created, I think we do have to ask the question, well, what...
02:00:09.000 Should there be rules in place that require them to treat content more neutrally, or do we want them to operate as autonomous businesses, even at that scale, that can dictate what people see and what they don't?
02:00:24.000 We've just never seen anything in our lives where, you know, hundreds of millions of people were using the same thing, and there was an entity that could determine what should be allowed.
02:00:36.000 And you could make the public square argument around that as well and say, like, so many people are using it that this has become a kind of digital public square.
02:00:45.000 But as the law currently stands, you know, it's a private company.
02:00:48.000 It can do what it wants.
02:00:50.000 What are your thoughts on this?
02:00:52.000 Especially because you're so deeply invested in this Q phenomenon, and we see how that went.
02:00:59.000 I mean, if you ever really wanted to suppress free speech, what you would do is engineer something like Q, and then have it reach this boiling point, which is January 6th, where you have an arguable point, if you wanted to say,
02:01:17.000 this is what we want to avoid, and this is why we need at least some form of censorship.
02:01:23.000 I mean, I think that, and this is what drew me to the story in the beginning, Q is testing the limits of free speech.
02:01:29.000 And that's kind of how you know whether or not you have a right, whether or not you can have dangerous ideas, whether or not you can say unpopular things.
02:01:38.000 And that's why I was drawn to Q in the first place, because Reddit had banned it, and that seemed novel at the time.
02:01:43.000 And it seemed like maybe, well, is this where the internet is headed?
02:01:46.000 So you saw that and that's what led you to start investigating and setting up this documentary series?
02:01:54.000 Yeah, yeah, because I had a background covering digital privacy.
02:01:57.000 Did you bring this to HBO and say, hey, look at this.
02:02:00.000 Very late in the game.
02:02:01.000 So I had been shooting this entirely independently up until September 2020. You financed it yourself?
02:02:08.000 Yeah.
02:02:08.000 Whoa.
02:02:09.000 Yeah.
02:02:10.000 You risky man.
02:02:11.000 Loan, credit card.
02:02:12.000 Wow.
02:02:13.000 It was a gamble.
02:02:15.000 It's a big gamble, right?
02:02:17.000 It paid off.
02:02:18.000 Holy shit.
02:02:19.000 In this case, it did.
02:02:20.000 It got it to HBO and it worked out.
02:02:24.000 Because it wasn't what it was.
02:02:25.000 But there was no guarantees that that was going to happen.
02:02:27.000 Also, there's no guarantees you're going to have an ending, like January 6th.
02:02:31.000 I'm not saying that you were rooting for something crazy like that to happen, but boy, did that pay off.
02:02:37.000 I mean, it's almost like you have to predict where things are headed in order to tell a story like this.
02:02:45.000 I mean, we had put together that whole opening sequence that takes place in D.C. before January 6th.
02:02:52.000 I mean, the ideation for that had started in November.
02:02:55.000 It gave it a whole new meaning, of course, when you see these kind of...
02:02:58.000 All of these beliefs and conspiracy theories, or whatever you want to say, kind of overtaking DC. And you see the kind of characters memeing themselves into existence in that opening.
02:03:11.000 But would this series have been substantially different if that hadn't happened?
02:03:17.000 Of course.
02:03:18.000 And it wasn't even that obvious to most who were working on it in the lead-up to the Sixth that I needed to go.
02:03:25.000 You know, I was...
02:03:26.000 Because we were in the throes of post, and it was an incredibly aggressive post schedule.
02:03:31.000 Really aggressive.
02:03:32.000 I mean, we were turning...
02:03:33.000 I was turning out 16-18 hour days.
02:03:35.000 Everybody was working around the clock for five months straight on this thing.
02:03:38.000 So for me to step away to go shoot at that point, a lot of people thought I was out of my mind.
02:03:43.000 What could possibly be so important that you would drop editing on this incredibly aggressive post schedule to go and document Jim?
02:03:54.000 Thank God you did.
02:03:58.000 Right?
02:03:59.000 Well, you must have been at the end, like on January 7th, you must have been like, fuck yeah!
02:04:04.000 I was just like, oh fuck.
02:04:07.000 Actually, on January 7th, I think everybody was in a little bit of a state of shock.
02:04:14.000 And didn't really...
02:04:16.000 I hadn't even seen all of the news reports.
02:04:19.000 I hadn't seen what everybody else had been seeing because we were on the ground.
02:04:23.000 So it wasn't until I started looking at all of the footage and all of the archival from other sources that you're like, holy shit, this is what was going on inside?
02:04:34.000 So what did you see?
02:04:36.000 I mean, we were on the side where the scaffolding was, where people were climbing all of the scaffolding.
02:04:41.000 I was following Jim that day.
02:04:43.000 I was very anxious going into it.
02:04:46.000 I didn't really sleep the two nights before.
02:04:48.000 Really?
02:04:48.000 Because I thought it was going to be bad.
02:04:49.000 I thought it was going to be way worse than it was.
02:04:51.000 I actually thought it could have broken out into a civil war that day.
02:04:55.000 That's how bad I thought it might be.
02:04:58.000 Meanwhile, most other folks, if you weren't tracking all of these movements, I think their minds were absolutely blown that this could even happen.
02:05:12.000 That's me.
02:05:13.000 So I had no idea.
02:05:15.000 I had no idea that anything was boiling below the surface, but you apparently, you thought that it was going to be way worse than what it was.
02:05:23.000 Yeah.
02:05:24.000 What were you tracking that led you to believe this?
02:05:28.000 I mean, all of these instigators around Trump, Steve Bannon, Roger Stone, General Flynn, they were all stoking the flames to the max.
02:05:40.000 If you looked at the chatter on the chans, if you looked at the chatter on social media in general, I think?
02:06:04.000 Shit's going to get real.
02:06:05.000 What were his actual words?
02:06:07.000 He said wild protest.
02:06:08.000 I mean, he wrote that.
02:06:10.000 It was on Twitter?
02:06:11.000 Yeah.
02:06:13.000 I'm sure we could pull it up.
02:06:14.000 He asked for a wild protest.
02:06:17.000 A wild protest.
02:06:19.000 And this was a protest of what he believed was election fraud.
02:06:25.000 Right.
02:06:25.000 And if you listen to his words in the speech that day, you know, he's, he's, them's fighting words.
02:06:31.000 You know, we're going to go to the, we're going to go there and we're going to take it, take it back.
02:06:35.000 And so there were a lot of different forces who were pushing that day for something to happen and, you know, for Trump to invoke the Insurrection Act.
02:06:45.000 This is something you'd hear a lot.
02:06:46.000 You know, Ron was saying Trump needs to cross the Rubicon.
02:06:49.000 Like, there was this idea that he needed to—that the only way that they could keep democracy was to overthrow the— Take, you know, take things over.
02:07:04.000 Here it says, this is Trump's Twitter.
02:07:06.000 It says, Peter Navarro released a 36-page report alleging election fraud more than sufficient to swing victory to Trump.
02:07:13.000 A great report by Peter.
02:07:15.000 Statistically impossible to have lost the 2020 election.
02:07:18.000 Big protest in D.C. on January 6th.
02:07:21.000 Be there.
02:07:21.000 Will be wild.
02:07:24.000 Hmm.
02:07:26.000 And what is this 36-page report?
02:07:29.000 Did you get into that?
02:07:32.000 I don't know the specifics about that, but I do know that there hasn't been anything.
02:07:41.000 Of all the lawsuits and everything that's been kicked out, nothing is stuck.
02:07:45.000 So this is the Director of the Office of Trade and Manufacturing Policy, Peter Navarro.
02:07:52.000 Yeah.
02:07:52.000 So he thought at the end, I mean, he thought essentially that getting these people to protest and that this 36-page report, that there would be enough data and that eventually someone somewhere would reverse the decision and prove that there was enough election fraud to reinstate him.
02:08:19.000 I mean, that day they were certifying the vote.
02:08:23.000 So animosity had been directed towards Pence, who I believe they incorrectly assumed if he didn't certify the vote that it would...
02:08:33.000 Or that he could somehow usurp authority in that case.
02:08:40.000 Yeah, they were calling him a traitor.
02:08:42.000 And Ron stoked the flames of that, of course.
02:08:44.000 They released something that night that made it seem like Pence was trying to run a coup against Trump.
02:08:50.000 To agitate people as much as possible.
02:08:53.000 They released.
02:08:54.000 Ron and Jim.
02:08:56.000 So they tried to release these documents. You see it at the end of the series where, you know, they're saying the mother of all bombs is coming when it comes to, you know, an information drop.
02:09:09.000 And that information drop was specifically targeting Pence.
02:09:12.000 And there was so much anger towards Pence.
02:09:16.000 And this is as Ron and Jim?
02:09:18.000 It's not as Q? Correct.
02:09:20.000 At this point, yeah.
02:09:20.000 Q was done at this point.
02:09:22.000 So now Ron was just...
02:09:23.000 How odd is that?
02:09:24.000 Yeah.
02:09:24.000 Right?
02:09:25.000 Yeah.
02:09:26.000 I mean, how coincidental.
02:09:27.000 Yeah.
02:09:28.000 Sure.
02:09:29.000 You could call it a coincidence.
02:09:30.000 Let's do that.
02:09:32.000 What is the motivation?
02:09:33.000 Why do you think Ron is doing this?
02:09:36.000 What's the reason?
02:09:43.000 Well, he does treat the whole world like it's a game.
02:09:47.000 And what are the objectives of that game?
02:09:51.000 Maybe gaining power, right?
02:09:53.000 Because he's in contact with the administration at this point.
02:09:56.000 I mean, think about it.
02:09:57.000 This guy was just running some fringe website out of, you know, Southeast Asia and Japan.
02:10:04.000 And suddenly he's advising, you know...
02:10:10.000 The president.
02:10:11.000 Like, he's gotten to the seat of power somehow using that website and Q and his sort of fame that was developed through Q. I mean, remember, Q mentions Ron, Code Monkey, very early on in the narrative.
02:10:26.000 So Q, right at that hijacking point early in January, he's like, Ron, Code Monkey.
02:10:31.000 How odd.
02:10:32.000 Yeah.
02:10:33.000 A little bit of theater.
02:10:36.000 That means Q is deeply invested in the community of the board, right?
02:10:40.000 And it means the community is deeply invested in Ron.
02:10:42.000 And so he was able to pivot that notoriety from obscurity to infamy.
02:10:48.000 And he is interested in infamy.
02:10:50.000 He will embrace infamy, as he has said.
02:10:54.000 He's a character.
02:10:55.000 I mean, you couldn't ask for better characters.
02:10:58.000 I mean, if you had developed this as a drama, if this was fiction.
02:11:01.000 People wouldn't believe it, probably.
02:11:03.000 I don't know.
02:11:04.000 I mean, there's something to be said for...
02:11:09.000 It's like the two of them, particularly Ron and Jim, and then even Fred.
02:11:15.000 Like, all of it is just...
02:11:17.000 It's so weird.
02:11:20.000 Oh, I mean, they're unbelievable characters, to be sure.
02:11:24.000 And you have to say, like, what kind of person would be drawn to becoming an edgelord on something like 8chan, right?
02:11:34.000 And Ron grew up with his dad running Chans.
02:11:39.000 I mean, what was it like in that household growing up?
02:11:41.000 Right, right.
02:11:42.000 So, yeah, I mean, very fascinating character studies, to be sure.
02:11:50.000 I mean, so much of human history, though, is written by people with, I think, not identical tendencies, but people who are driven for power and are willing to do things that others aren't.
02:12:04.000 Do you think you could ever get Paul to admit that he was the original Q?
02:12:10.000 Would he be in trouble for that?
02:12:13.000 Because if you could, if he admitted it, that would really pull the floor out of the whole thing, wouldn't it?
02:12:24.000 I mean, I think that the fact that the first 127 drops were anonymous in the first place, that, you know, there's all these shakeups, there's a style change.
02:12:33.000 Like, I think that, I think there's lots of, the fact that the vast, vast, vast majority of things that Q prophesized or that people believed didn't come true, all the central tenets, you know, the arrests and all that, you would think that that would be enough.
02:12:47.000 If you could get Paul to come out and say, yeah, I was it.
02:12:50.000 I mean, people would only even believe that if he came with the data.
02:12:53.000 And even then, many still would be like, man, it's the deep state.
02:12:56.000 It's this, it's that.
02:12:57.000 Like, they'll come up with some explanation to write it off.
02:13:01.000 And Paul, he released a whole book after this took place, you know, explaining his...
02:13:09.000 I think?
02:13:23.000 Why would they just let a fake Q take it over?
02:13:27.000 And why wouldn't they try to reach out to you and reclaim it?
02:13:30.000 And his answer, which I just think is how he was thinking about it, is, well, maybe they just wanted it to continue, you know?
02:13:37.000 And he thought of it almost like a child, like a baby that had gone out into the world and it was becoming this big movement.
02:13:46.000 And he wanted to see it continue to grow.
02:13:50.000 Yeah.
02:13:52.000 How odd.
02:13:53.000 And so Fred started 8chan?
02:13:57.000 Fred started 8chan on a mushroom trip.
02:14:00.000 Really?
02:14:00.000 Well, on the comedown from the mushroom trip.
02:14:03.000 You know, he wanted to take the board creation of Reddit and the anonymity of 4chan and basically turn that into 8chan.
02:14:12.000 And what was the difference?
02:14:13.000 Was it a lack of moderation?
02:14:15.000 Anybody could create a board.
02:14:17.000 But what that meant was that people would go there and create boards with a lot of illegal content.
02:14:22.000 So he had a huge moderation problem in the beginning because they could post illegal content there, child porn or something like that.
02:14:30.000 And first off, there could be hundreds of boards.
02:14:34.000 How is he just able to manage all of that?
02:14:37.000 You know, and they could do it totally anonymously.
02:14:38.000 And if someone does post stuff and then they take it down, the person who posted that stuff can still post.
02:14:44.000 They can still post.
02:14:45.000 Because they can't, it's not like a Reddit user where you could ban the user because they posted illegal content.
02:14:52.000 And you can ban an IP, but if they're using a VPN, which someone who's doing something illegal like that probably would be using a VPN, unless they're real dumb.
02:15:00.000 You know, it makes it very hard to track.
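A minimal sketch of why the IP ban described here fails against a VPN: the ban keys on whatever address the poster connects from, and a VPN exit node changes that address at will. The addresses and ban list are made up for illustration; real chan software is not shown.

```python
# Hypothetical ban list keyed on source address.
banned_ips = {"203.0.113.7"}

def can_post(source_ip: str) -> bool:
    """An IP ban only checks the address the request arrives from."""
    return source_ip not in banned_ips

print(can_post("203.0.113.7"))    # False: the banned address is blocked...
print(can_post("198.51.100.42"))  # True: ...but a VPN exit node sails through
```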
02:15:03.000 Now 8chan does respond to government requests for illegal activity like that.
02:15:09.000 I mean, all the websites do.
02:15:12.000 I mean, Facebook has tons of issues with that kind of material getting posted.
02:15:16.000 So Fred does it with Jim?
02:15:19.000 So Fred creates this thing.
02:15:22.000 He creates 8chan.
02:15:24.000 It revs up.
02:15:26.000 And it's not really a huge hit overnight.
02:15:29.000 It's not until Gamergate happens when Chris Poole, who was the owner of 4chan at the time, says, Gamergate...
02:15:40.000 You know, what's going on with these gamers and this whole, like, attack strategy that they're using?
02:15:45.000 Too much.
02:15:46.000 I don't want that on this site.
02:15:47.000 He bans all discussion of Gamergate, bans talk around it, creates a lot of animosity in those communities, and they're looking for a new home.
02:15:54.000 And the new home that they go to is 8chan.
02:15:58.000 So suddenly Fred has a huge influx of users.
02:16:01.000 He's like, what do I do?
02:16:02.000 So he needed more resources.
02:16:05.000 He was cash-strapped.
02:16:06.000 I mean, he was just, like, operating out of, you know, like a very low-rent situation in New York.
02:16:16.000 So he starts getting offers, and he felt that Jim and Ron had experience running Chans and would be the best partners.
02:16:27.000 They offered server space for him.
02:16:30.000 They offered to fly him out to the Philippines where he could continue to run 8chan.
02:16:36.000 So for him at the time, that was a pretty good deal.
02:16:40.000 So he got on a plane and headed out to the Philippines.
02:16:45.000 And their relationship over time started to devolve.
02:16:51.000 What caused that?
02:16:57.000 Fred would, I think, say that he started to depart philosophically, that maybe they made him kind of uncomfortable in certain situations, like that they weren't taking his safety seriously, things like that.
02:17:13.000 But he also found a wife in the Philippines.
02:17:17.000 And his wife was, her dad was like a priest, and he has since described it, and people who I was shooting with there would describe that religion, kind of as a cult.
02:17:28.000 So he was newly religious and 8chan also doesn't jive super well with that philosophy.
02:17:35.000 You know, maybe, probably he believed in a lot of that stuff.
02:17:40.000 I think Fred is impressionable and he's also going through life faster than the average person because he doesn't think he's going to live that long.
02:17:48.000 So that could have factored into it.
02:17:52.000 You know, the big rift, though, that happens between them is Fred just doesn't really want to work on 8chan anymore.
02:17:59.000 They're giving him too much responsibility in his mind.
02:18:01.000 He doesn't show up to work one day.
02:18:03.000 Doesn't show up for a couple of days.
02:18:05.000 Maybe he's going to stop working on it.
02:18:06.000 And that's when Jim, who had given him a place to stay in the Philippines free of rent, but that Jim owned, just barged into his apartment and was like, why aren't you at work?
02:18:20.000 And Fred's account is that it was very traumatizing for him.
02:18:26.000 And he knew he needed to get out of that situation immediately.
02:18:30.000 So I'm sure that there was a confluence of factors that led to them kind of pulling apart.
02:18:36.000 And also Fred was young when he created 8chan.
02:18:38.000 You know, he was like 18. So we're talking about somebody who's still growing up.
02:18:42.000 Right.
02:18:43.000 You know, and like, he's 21, he's like, I don't want to do this.
02:18:45.000 Maybe I don't want to be the...
02:18:47.000 Right.
02:18:48.000 But he still likes that world.
02:18:51.000 Right.
02:18:52.000 You know, and Fred is a very eccentric character and he likes to lean into that eccentricity as much as possible.
02:18:59.000 And who Fred is online is very different than who Fred is in real life.
02:19:04.000 And I mean, both he and Tom... like Tom, who you see, who's Jim Watkins' kind of right-hand man, was an artist in college.
02:19:11.000 I think that Jim was hoping that Fred was going to have a similar relationship to him, that he would take him under his wing while he was like, you know, 19 or 20 or whatever, and that he would just continue to be a part of their organization going into later years.
02:19:26.000 And, you know, and Tom, I didn't mention this in the series, but he's a psychonaut.
02:19:31.000 Psychonaut.
02:19:32.000 So he loves to experiment with psychedelics.
02:19:35.000 That's like his, that's his jam.
02:19:36.000 So he and Fred would do psychedelics, like, on occasion.
02:19:40.000 And they had a good relationship for a long time.
02:19:45.000 And that's why I think Tom is the one that you see when they're trying to smooth things out in the Philippines.
02:19:52.000 And Fred's really going after them, trying to get the site offline.
02:19:55.000 That's why Tom is the sort of most suitable peacemaker in that situation.
02:19:59.000 He's, as Fred would say, sort of the ice to Jim's fire.
02:20:05.000 That's also why Tom's pupils, I think, are like this in that one scene.
02:20:09.000 He said it was just the lighting, but I think he really is into that stuff.
02:20:13.000 Is Tom the guy who Fred records the conversation they're having and puts it all online?
02:20:20.000 Yeah, Fred records that conversation and puts it all online.
02:20:23.000 Without Tom knowing about it.
02:20:25.000 Correct.
02:20:25.000 Yeah.
02:20:26.000 Yeah, so, you know, that's not really the most noble approach when you're trying to negotiate with somebody.
02:20:35.000 I think that Fred was maybe interested in...
02:20:42.000 I mean, he had gained a new following.
02:20:46.000 He had appeared on a lot of news outlets and stuff in the run-up to that happening.
02:20:52.000 And...
02:20:53.000 I don't know.
02:20:55.000 He's also known as copy-paste.
02:20:57.000 So anything you send to Fred, it's possible that he will simply screenshot it and then share it with someone else.
02:21:05.000 What is he doing now?
02:21:07.000 That's just his deal, his technique.
02:21:09.000 What is he doing now?
02:21:12.000 Well, the other thing that Fred really likes, and I was saying, yeah, you should focus more on this other thing, is fonts.
02:21:18.000 He designs fonts.
02:21:19.000 Right, right, right.
02:21:20.000 So he's been doing font work for a variety of companies.
02:21:24.000 I don't know.
02:21:25.000 He's probably working on a...
02:21:27.000 Who knows what else he's working on?
02:21:30.000 But he's living with his family now on the East Coast.
02:21:34.000 You know, his mom and his brother have the same condition he does.
02:21:37.000 Oh, wow.
02:21:38.000 So he's helping his mom out, living with her.
02:21:43.000 You know, he's still pretty prolific on Twitter.
02:21:47.000 He posts a lot.
02:21:49.000 So he hasn't been banned?
02:21:51.000 He's also into furry culture.
02:21:53.000 What culture?
02:21:54.000 Furries?
02:21:54.000 Oh, okay.
02:21:55.000 Furry culture, which he's been posting about more lately.
02:22:00.000 But yeah, I mean, I think he's actually doing a lot better.
02:22:02.000 I think that him being back at home with his mom is a better situation.
02:22:06.000 And he's not in trouble in the Philippines?
02:22:07.000 Oh, he's in trouble in the Philippines.
02:22:09.000 Yeah, he can't go back.
02:22:10.000 But they don't extradite?
02:22:13.000 No, we don't.
02:22:14.000 No, they can't extradite him for a cyber libel charge.
02:22:17.000 And that was one of the craziest things to me in all of this.
02:22:20.000 It's like, here you're running an absolutist free speech website, taking it right to the edge, but you're going to go after somebody for something they said on Twitter.
02:22:28.000 Right.
02:22:28.000 And the thing is, he was saying, like, there's nothing I can do.
02:22:33.000 Now he's in trouble with the law.
02:22:37.000 Yeah.
02:22:37.000 Right, because the way it works in the Philippines is that it becomes like a criminal suit, so the state takes it over if they determine that there's a case.
02:22:49.000 So when Fred was leaving the Philippines, he got advance notice that an indictment was going to drop.
02:22:57.000 Yeah, I saw that part of it.
02:22:59.000 It's very compelling because it's under the wire, like barely gets out.
02:23:05.000 Yeah, and...
02:23:05.000 He would have been fucked.
02:23:07.000 He would have been fucked.
02:23:08.000 Especially because COVID was on our heels.
02:23:10.000 Right.
02:23:10.000 And if he gets COVID, I mean, he's already...
02:23:13.000 Oh, it's probably, you know, could have been game over.
02:23:15.000 Especially at that time, we had no idea.
02:23:17.000 Right, and if he goes to jail, it's most likely he's going to get COVID. Right.
02:23:20.000 Well, especially in the Philippines.
02:23:22.000 I mean, the Bicutan Detention Center, which is for foreigners, he would not fare well there.
02:23:28.000 And with the COVID stuff ramping up, I have no idea.
02:23:34.000 He could have likely died.
02:23:35.000 I don't think that that's an exaggeration.
02:23:37.000 No, I don't think it is either.
02:23:39.000 Out of all this, all the time that you spent working on this, when you're alone with your thoughts... I think that this subject highlights some very important questions and important conversations about free speech and about what roles,
02:24:01.000 if any, these platforms, whether it's 8chan or Twitter or what have you, have in protecting free speech or censoring questionable behavior. And I mean, obviously, this did not end well.
02:24:17.000 You know, whether or not 8chan is responsible for some of it, or whether the Q movement is responsible for some of it, or what percentage of it, it's clear that this becomes a vector for a lot of very questionable ideas and questionable behavior.
02:24:43.000 And what should be done?
02:24:46.000 Well, that's part of why the approach I took was to be as neutral as possible going into this world.
02:24:55.000 Create a historical document and show the mechanics.
02:24:59.000 And how did all of this actually work?
02:25:02.000 Don't look at the magic trick.
02:25:04.000 Don't look at the spectacle it's generating.
02:25:07.000 You know, look behind the curtain so that that magic trick can't work again, or at least the exact same magic trick, right?
02:25:14.000 It's like you watch Penn & Teller, and when they reveal the magic trick, sometimes it's actually more magical.
02:25:19.000 But now you know how the magic trick works.
02:25:22.000 Right.
02:25:24.000 You know, I struggle with this question, would you rather live in a world where something like Q is possible?
02:25:31.000 And if not, what's the cost?
02:25:33.000 And this has always been the challenge with any case that kind of tests the limits of free speech.
02:25:40.000 It always seems like, well, surely this goes too far.
02:25:46.000 But I think that's why you always get sticky things kind of like this, testing free speech.
02:25:53.000 It's not going to be something light and fluffy.
02:25:57.000 It's going to be something that actually has a significant, potentially damaging impact on many people's lives.
02:26:05.000 And then we sit back and go, well, what do we do about it?
02:26:10.000 But I think that all rights come with a cost.
02:26:17.000 It's usually on a spectrum of security to freedom.
02:26:21.000 And how much freedom do you want to give up for how much security?
02:26:25.000 So, I mean, what do you think?
02:26:27.000 Do you think, would you rather live in a world where something like Q is possible?
02:26:34.000 I'm torn, right?
02:26:36.000 Because on one hand, my perspective is, that wouldn't work on me.
02:26:46.000 Flat Earth doesn't work on me.
02:26:48.000 This doesn't work on me.
02:26:49.000 These movements, chemtrails don't work on me.
02:26:51.000 All these movements don't work on me.
02:26:53.000 But would they work on me when I was 15?
02:26:57.000 I think the answer is yes.
02:26:59.000 You know, therein lies the problem.
02:27:01.000 Would it work on me when I was 20?
02:27:03.000 Maybe.
02:27:05.000 25?
02:27:06.000 Likely.
02:27:07.000 30?
02:27:08.000 Maybe not anymore.
02:27:09.000 Like, then I'm online.
02:27:11.000 And then I'm starting to...
02:27:14.000 Piece things together in a broader perspective.
02:27:17.000 I'm not looking at things at face value and saying, oh, this is true.
02:27:20.000 I'm going, hold on.
02:27:21.000 And now I'm operating out of a lot of experience and I'm operating out of, you know, an understanding of how all this shit works in terms of like propaganda and nonsense and shitposting and a lot of these things.
02:27:38.000 So, should we protect people from things that wouldn't work on you?
02:27:43.000 Like, I'm sure QAnon's not working on you, right?
02:27:45.000 You have a much more sophisticated understanding of how the internet works.
02:27:53.000 You know, all these things.
02:27:55.000 Like, what are we supposed to do?
02:27:58.000 The rational argument is you counter bad speech with better speech.
02:28:06.000 Like, you explain things in a much more educated or a much more precise...
02:28:14.000 Well, that's what the series is doing in relation to Q, right?
02:28:16.000 Yes, for sure.
02:28:17.000 It's unpacking it.
02:28:18.000 That's exactly what your series does, right?
02:28:20.000 And at the end of it, it looks preposterous, and it's not gonna, like, if a Q drop happens tomorrow, people are gonna be like, bitch, I saw that series, right?
02:28:28.000 So this is the argument for free speech.
02:28:32.000 Because imagine if in a world where you are, like what you were saying earlier, that if it wasn't for HBO and their balls, and their, you know, their bravery.
02:28:44.000 I mean, HBO's like, they can do whatever the fuck they want.
02:28:47.000 They're HBO, right?
02:28:48.000 And they're like one of the few who still will.
02:28:50.000 Yes, one of the few who still will.
02:28:52.000 And they did.
02:28:53.000 And again, I go back to Bill Maher, but I think he's got a very important show.
02:28:57.000 And it's one of the few networks that did support him and that kind of program where you are allowed to talk about controversial ideas.
02:29:08.000 If they're taking out people that are criticizing Q as well as supporting Q, then you've got a really weird, slippery thing.
02:29:19.000 You're trying to erase reality.
02:29:21.000 So what happens to the people that were invested in this?
02:29:25.000 They don't get closure.
02:29:26.000 They don't get an understanding.
02:29:27.000 I would hope that a lot of the people, like that one family, the guy with the big neck and his little kid, I would hope they would watch this series and go, Jesus, we got duped.
02:29:38.000 This is kind of nonsense.
02:29:39.000 I would hope, right?
02:29:41.000 I hope a lot of these folks...
02:29:43.000 And a lot have.
02:29:43.000 You know, a lot of folks who were big Q believers or people who have family who have watched it.
02:29:48.000 I mean, there's been a de-escalation quality, too.
02:29:50.000 Because when you understand what drew people into this, you know, it softens your view of them.
02:29:55.000 And when somebody who believed in all of this sees what was going on behind the scenes and actually engages with the material...
02:30:00.000 For some of them, you know, they can go, okay, well now I have new information and I can arrive at a new conclusion.
02:30:06.000 You're not telling them how to think, you're just presenting them with ideas and saying, you know, what do you think now?
02:30:11.000 And this is the argument for free speech, but it's also, it shows how complicated it is to unpack something like this.
02:30:18.000 I mean, the amount of time and effort that you put in to expose what this really is, is pretty substantial.
02:30:26.000 It's substantial, yeah.
02:30:27.000 And difficult, I would imagine.
02:30:30.000 Not just amount of time, but you have to frame this.
02:30:37.000 How many hours of footage did you have to go through?
02:30:39.000 Over 1,600.
02:30:40.000 Jesus!
02:30:42.000 To come up with six episodes.
02:30:44.000 Holy shit, man.
02:30:45.000 It was a lot.
02:30:46.000 It was a lot of material.
02:30:47.000 I mean, we had multiple angles, you know, right?
02:30:49.000 But a Herculean effort.
02:30:51.000 But you really managed to nail it.
02:30:54.000 You did a fantastic job.
02:30:55.000 And this is the argument for free speech.
02:30:58.000 Because you show that this is most likely 99.999% nonsense.
02:31:07.000 Right?
02:31:07.000 We don't really know who's posted.
02:31:10.000 Sometimes you'll get a hit.
02:31:11.000 But the point is, it's like, through free speech, and through your ability to accurately disseminate information, you've produced a really amazing and entertaining thing that gives people an insight into the psychology behind the folks that believed in this, and the psychology behind the folks who are likely perpetrating it. It's an argument for free speech,
02:31:37.000 but it also shows how fucking difficult it is to really parse this out. Yeah, yeah, I mean, look, conspiracy theories, or painting your enemies in black-and-white, heaven-or-hell, all-or-nothing terms, that's something that humans have done forever.
02:31:58.000 that's something that humans have done forever.
02:32:00.000 I mean, even during the Revolutionary War, there were a lot of theories going around.
02:32:04.000 About King George, you know?
02:32:06.000 He wanted to turn everybody into slaves.
02:32:08.000 There is...
02:32:09.000 So this is...
02:32:11.000 And I'm sure that 15-year-olds and 20-somethings were impressionable, and it motivated them in those situations.
02:32:21.000 And I do think that the international component does add another element to this that's a little bit different.
02:32:27.000 But that's why I always bring it back to the privacy side.
02:32:32.000 I'd say before we worry about deciding what should or should not be said online, let's restore privacy rights.
02:32:40.000 Let's give people ownership over their data.
02:32:43.000 Let's make it so that these companies can't know more about us than we know about ourselves and see what impact that has first.
02:32:50.000 It does seem like that is where everything got really crazy because your data became a commodity that you didn't know you had.
02:33:01.000 Like, you didn't know it was valuable.
02:33:03.000 So when you signed off on the terms and conditions and you just started posting things and you allowed these companies to track all of your information and all your stuff that you do online, you didn't realize that you were creating these enormous companies with massive amounts of resources that just collect data.
02:33:24.000 The surveillance industrial complex.
02:33:26.000 Right.
02:33:26.000 They have products, essentially.
02:33:28.000 Gmail is a product.
02:33:29.000 Google's a product.
02:33:30.000 But really, you're the product.
02:33:32.000 You're the product.
02:33:32.000 Because you're what they sell.
02:33:35.000 These things, they just offer you that, so you give them the data.
02:33:38.000 And then once you give them the data, then they sell it.
02:33:41.000 And so you're essentially a customer, but you're also what they're selling.
02:33:46.000 I think you can argue that January 6th is a byproduct.
02:33:51.000 How so?
02:33:52.000 If you trace the lineage of...
02:33:55.000 Of data harvesting, to psychometric profiles, to those psychometric profiles being used to target us with information that, you know, that is tailored to our insecurities and desires.
02:34:08.000 It drives us into more extreme groups.
02:34:11.000 And then when we move in those more extreme directions, we're less willing to entertain an opposing reality.
02:34:16.000 And then when we're less willing to entertain an opposing reality, we become less willing to hear things from another side because we only believe in our one truth.
02:34:24.000 And then we become more interested in silencing whatever that one truth is.
02:34:27.000 We become more angry.
02:34:29.000 As they say, the hostility of suppression speeds up the treadmill of extremism.
02:34:33.000 The more you shut someone up, the angrier they become.
02:34:35.000 The more it validates the very thing that they were trying to stop in the first place.
02:34:39.000 And then you end up with a cultural climate or a social climate where something like January 6th is possible.
02:34:47.000 So, you know, I don't think it's the only reason, but I think that the data mining was a huge factor and the data mining in concert with the algorithms because those algorithms, of course, use what was reaped from that data mining.
02:35:03.000 In order to drive people towards the crazy shit on the internet.
02:35:07.000 You know, Q wouldn't have been possible without the algorithms.
02:35:10.000 Bottom line, just wouldn't have.
02:35:12.000 It wouldn't have escaped the chans.
02:35:15.000 So algorithms, essentially, there's a problem in the inherent manipulation of people's viewing habits, and they're doing it to accentuate their profits, and they're doing it to accentuate the amount of time that you spend online.
02:35:39.000 And to feed us with whatever is most sensational.
02:35:43.000 Yeah, but not necessarily, right?
02:35:45.000 We talked about what my algorithms show.
02:35:48.000 That's what I'm interested in.
02:35:50.000 My friend Ari Shafir pointed this out.
02:35:53.000 He decided to do a test where he only looked up puppy videos on YouTube.
02:35:59.000 And then YouTube just started only recommending puppy videos.
02:36:02.000 So it's like the argument becomes...
02:36:04.000 We get driven into...
02:36:06.000 It learns very quickly and then it reinforces those assumptions.
02:36:09.000 Right, but it's human nature is the problem, right?
02:36:13.000 It's what people are actually interested in is the problem.
02:36:16.000 So algorithms, they take advantage of our human nature, which is to find these echo chambers, to find confirmation bias.
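A toy sketch of the reinforcement loop being described, in the spirit of the puppy-video experiment: a recommender that only counts what you watched and serves more of the same. The categories, titles, and cold-start rule are invented for illustration, not any platform's actual system.

```python
from collections import Counter

watch_history = Counter()

def watch(category: str):
    """Record one view in the given category."""
    watch_history[category] += 1

def recommend(catalog):
    """Serve more of whatever category dominates the history."""
    if not watch_history:
        return catalog  # cold start: no signal yet, show everything
    top_category, _ = watch_history.most_common(1)[0]
    return [item for item in catalog if item[0] == top_category]

catalog = [
    ("puppies", "golden retriever compilation"),
    ("puppies", "puppy learns to swim"),
    ("conspiracy", "what THEY don't want you to know"),
]

for _ in range(5):
    watch("puppies")        # only look up puppy videos, as in the experiment

print(recommend(catalog))   # only puppy videos come back; the loop reinforces itself
```

The same loop works in reverse: a handful of views in a more extreme category tips the history that way instead, which is the echo-chamber dynamic being discussed.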
02:36:29.000 For sure, yeah.
02:36:30.000 I mean, I think a bigger question might be, are algorithms free speech?
02:36:35.000 You know, is that an expression of free speech?
02:36:40.000 We have a court case that I disagree with that asserted that money is free speech.
02:36:45.000 Money is an amplifier.
02:36:48.000 What court case is this?
02:36:49.000 Citizens United.
02:36:50.000 What was the case?
02:36:53.000 You know, this is the flagship Supreme Court case that actually Barr, who is Jim Watkins' attorney in DC, he was a part of that case, bringing it to the Supreme Court, which basically said that you could...
02:37:23.000 We're good to go.
02:37:31.000 I mean, in my lifetime, the biggest lie that was told was the one that brought us into Iraq, ultimately.
02:37:36.000 Weapons of mass destruction, right?
02:37:38.000 Like, that got broadcast and cost how many lives?
02:37:41.000 So these things aren't new.
02:37:44.000 The idea that, like, a lie can have huge consequences in the world aren't new.
02:37:50.000 But the algorithms are.
02:37:52.000 And I do think that there is a bigger question to be had around should there be some restrictions placed on them?
02:38:01.000 What limitations might we put in place so that we at least know what the rules are?
02:38:13.000 It's a black box of amplification.
02:38:17.000 And it gives incredible power to those who run that black box.
02:38:22.000 Do you think it's at all possible that algorithms could be exposed in a way where people, where the narrative shifts and we realize that algorithms are actually problematic?
02:38:36.000 And that it has done irrevocable damage.
02:38:41.000 And it's moved our society in this way, as highlighted in The Social Dilemma, where you're realizing that the ultimate path for this sort of separation and this reinforcement of tribalism,
02:39:02.000 it really leads to conflict, almost undeniable.
02:39:07.000 Yeah, yeah.
02:39:08.000 That we could do something where we could recognize that first of all, these corporations only exist because we didn't realize that data is a commodity.
02:39:17.000 Once we do realize that data is a commodity, people are giving up that data Almost against their understanding.
02:39:24.000 They don't really understand what they're doing until it's too late.
02:39:27.000 They're being duped, right?
02:39:29.000 There's like this gigantic three-card monte game going on with your data.
02:39:33.000 And then you have the algorithm problem and we're recognizing that this is essentially being manipulated.
02:39:41.000 It's being manipulated by foreign entities like the IRA. It's being manipulated by who knows how many other countries that have similar programs installed.
02:39:51.000 What do we do?
02:39:54.000 Yeah, and you, what do we do?
02:39:57.000 And you look at Twitter, and do you get rewarded for saying something neutral, or do you get rewarded for saying something hostile?
02:40:04.000 Hostile.
02:40:04.000 You know?
02:40:05.000 And the more hostile, the more, you know, the more it satisfies the audience, the more it, the person who's posting is trained by that.
02:40:14.000 They're trained to go, okay, this is the thing that people want.
02:40:18.000 So it does drive that tribalism and then it also makes people very sure of whatever that worldview might be because they can feel that there's a lot of people who like the same thing that they do.
02:40:28.000 I mean, I know with absolute certainty that the QTubers were reinforced in the same way because they told me that they were.
02:40:34.000 I had Craig who was in it.
02:40:38.000 We talked afterwards.
02:40:39.000 He would discuss how he knew what the audience wanted, what he could and could not say.
02:40:48.000 And he wanted to keep that audience.
02:40:50.000 And he was trained over time to say things around QAnon that would drive more eyeballs.
02:40:56.000 But at the same time, that became his livelihood.
02:41:00.000 And in the end, after this series came out, he had said at one point, I still have all these people who follow me who want to believe that it's all going to come true.
02:41:16.000 But he's like, look around, man.
02:41:19.000 It's a fairy tale.
02:41:20.000 It's like our team, it's like a basketball game.
02:41:24.000 Our team lost and you guys are still just dribbling around shooting hoops and the score is over.
02:41:31.000 What does he do now?
02:41:32.000 But at the same time, he at that point was not willing to say that to his audience.
02:41:36.000 Right, because they were still his audience.
02:41:39.000 What does he do now?
02:41:40.000 I don't know.
02:41:41.000 I mean, he expressed to me that he was trying to step out of the spotlight some, but I think he's sad that he...
02:41:50.000 I mean, he's told me he's sad that he's like, I just wanted to be a YouTuber.
02:41:53.000 That's all I ever wanted to be, and now I can't be a YouTuber.
02:41:56.000 Wow.
02:41:58.000 So he can't come back with another channel?
02:42:00.000 He's banned on everything.
02:42:03.000 He could probably go on Tiger Network.
02:42:07.000 But I don't know how stable it is.
02:42:09.000 I don't know anything about it.
02:42:10.000 I haven't really used it.
02:42:12.000 And the real problem with banning people, it's like... Oh, one of the craziest things, though, that Craig told me, you know, was in that hot tub scene at the end, after he'd just been banned on YouTube.
02:42:25.000 You know, and you could see that that was coming right up towards the election, that things were going to crescendo in that direction.
02:42:32.000 And he's like...
02:42:35.000 He admitted something to me, which is just that he knew all along that the idea that this was a military operation, or that Trump was behind it...
02:42:43.000 He's like, that was a bunch of, you know, that was a bunch of bullshit.
02:42:47.000 I never believed that.
02:42:48.000 You know, in the beginning, he's like, we're just kidding ourselves.
02:42:52.000 It's just a bunch of a-holes LARPing on 8chan, or 4chan, and then suddenly, you know, in his mind, the military got to him.
02:42:58.000 Because he had these guys, these ex-military guys reaching out to him and sort of using him as a conduit for their agenda.
02:43:07.000 And so there's that crazy aspect of this too.
02:43:13.000 But it is fascinating to me that he openly admits that it started as a LARP. But it memed itself into reality.
02:43:24.000 And that's what you see come January 6th as this thing that was casting this imaginary view of the world was trying to make itself real.
02:43:33.000 And in many ways, it didn't actualize all of the beliefs, but some of the central ideas of it did. If we assume that Ron is Q, well, Ron eventually managed to get access to the seat of the Capitol.
02:43:47.000 If you want to say that the storm is coming, well, a storm eventually came.
02:43:54.000 This is meme magic at work.
02:43:58.000 This is like a collective imagination that willed something into existence.
02:44:04.000 And I know that's a little bit of a derail from what you were saying before about the banning.
02:44:08.000 No, but it's relevant.
02:44:11.000 Was this ultimately satisfying for you?
02:44:14.000 Obviously your investment paid off.
02:44:17.000 You put together a masterpiece.
02:44:21.000 You really did.
02:44:22.000 I had an awesome team.
02:44:24.000 Everybody involved put together a masterpiece.
02:44:27.000 Shout out to everybody.
02:44:28.000 But was this ultimately satisfying to you or does this leave you with a sense of impending doom?
02:44:38.000 Or both?
02:44:43.000 Well, unpacking the mystery is satisfying.
02:44:47.000 Reaching a conclusion is satisfying.
02:44:49.000 But do I think that we're out of the doghouse?
02:44:53.000 No.
02:44:54.000 Do I think that these patchwork solutions of censorship on these platforms are going to solve the problem they think they're solving?
02:45:02.000 No, I think it's going to make it worse.
02:45:05.000 Do I think that it's driving more of a wedge in society?
02:45:08.000 Yes, I do.
02:45:09.000 So, if the goal is in fact to de-escalate things or to make the polarization in society go away, this strategy, historically and at present...
02:45:27.000 I mean, there are lots of historical examples of how this doesn't work.
02:45:32.000 And why censorship ends up having the opposite of its intended effect.
02:45:37.000 And I think we're watching that in real time.
02:45:40.000 But all of these people who are being banned, censored, they don't just disappear.
02:45:45.000 Just because you don't see them on Twitter doesn't mean they're gone; they're mad.
02:45:51.000 And they'll find other platforms, but I think you have to let it work itself out.
02:45:57.000 And if you're going to tinker with something, tinker with the business model.
02:46:02.000 But these companies are never going to offer that as a solution.
02:46:05.000 Of course.
02:46:05.000 Ever.
02:46:06.000 Well, the amount of resources that they've managed to acquire is staggering.
02:46:11.000 There's never been a company like Google before.
02:46:13.000 Imagine.
02:46:14.000 A company like YouTube, they own YouTube, right?
02:46:18.000 Both of them together.
02:46:19.000 I mean, either one individually, there's never been anything like them.
02:46:22.000 The fact that they're both owned by one company is fucking incredible.
02:46:26.000 And then there's Twitter, and then there's Facebook, and they're both amazing.
02:46:30.000 And Facebook also has Instagram, also an amazing new thing.
02:46:35.000 Like, God, the reach.
02:46:37.000 And the reach is global.
02:46:38.000 And that's why when we talk about, you know, are we getting out of the quote-unquote doghouse here?
02:46:43.000 It's like...
02:46:47.000 Whatever the next war is, if there's going to be another war, it will be borderless.
02:46:52.000 Right?
02:46:53.000 Because these ideas seep across borders.
02:46:56.000 The internet is borderless.
02:46:57.000 And the internet is the new world.
02:46:58.000 You know, with its own laws and its own regulations, and the companies running those places are the ones setting the rules.
02:47:06.000 And...
02:47:08.000 And I saw this when I was traveling.
02:47:10.000 Just, you know, South Africa, Macau.
02:47:13.000 I had production assistants who were working with me there who were totally red-pilled on Q-related stuff that they had experienced on YouTube.
02:47:19.000 So the ideas are penetrating abroad.
02:47:23.000 So if the polarization continues, it is not just in America.
02:47:29.000 You know, it is happening ideologically on a global scale.
02:47:33.000 And so it wouldn't be nation versus nation.
02:47:36.000 It would be, you know, these patchworks of ideologies where people have very disparate views of reality.
02:47:45.000 And that's why I'm such an advocate for, you know, folks talking to people that they've sort of stopped listening to.
02:47:59.000 If you have Q friends or family, or if you believe in Q yourself, reaching out to family members again, that's going to be a necessary step.
02:48:08.000 But until we fix the privacy problem, I don't see any of this going away.
02:48:14.000 It may be a runaway train.
02:48:16.000 I hope it's not, but it feels like it right now.
02:48:20.000 I think that's a great way to wrap this up.
02:48:22.000 And thank you.
02:48:23.000 Thank you for being here and talking to me.
02:48:26.000 And thank you for your incredible amount of work that you put into that HBO series.
02:48:31.000 It's amazing.
02:48:32.000 I really, really appreciate it.
02:48:33.000 And thank you for having me on and for watching all six episodes and then re-watching some of it.
02:48:39.000 My pleasure.
02:48:39.000 It's been a great conversation.
02:48:42.000 Thank you.
02:48:43.000 We're fucked!
02:48:45.000 Bye, everybody.
02:48:45.000 But there's hope.
02:48:46.000 There's hope, I guess.
02:48:47.000 We're here.
02:48:48.000 We're okay now.
02:48:49.000 You can try without hope.
02:48:50.000 Yes.
02:48:50.000 We can try without hope.
02:48:51.000 All right.
02:48:52.000 Bye, everybody.