The Joe Rogan Experience - April 13, 2023


Joe Rogan Experience #1970 - Bill Ottman


Episode Stats

Length

2 hours and 52 minutes

Words per Minute

156.6

Word Count

27,023

Sentence Count

2,320

Misogynist Sentences

16

Hate Speech Sentences

22


Summary

In this episode of the Joe Rogan Experience, Joe sits down with Bill Ottman, founder and CEO of Minds, a community-owned, open-source social network. They discuss decentralized protocols like Nostr, privacy and encryption in the age of Pegasus-style spyware, the RESTRICT Act and California's AB 587, copyright in a world of torrents and AI training data, the state of Twitter under Elon Musk, and why open-sourcing social media matters.


Transcript

00:00:01.000 Joe Rogan Podcast, check it out!
00:00:04.000 The Joe Rogan Experience.
00:00:06.000 Train by day, Joe Rogan Podcast by night, all day.
00:00:13.000 Hello, Bill.
00:00:14.000 Good to see you, buddy.
00:00:15.000 What's going on?
00:00:16.000 We're here, man.
00:00:18.000 You are here.
00:00:18.000 Everything's going wild.
00:00:19.000 How is your site going?
00:00:21.000 It's going.
00:00:22.000 We're decentralizing as fast as possible, getting it out of our hands so that we need to protect ourselves from ourselves.
00:00:31.000 How do you do that?
00:00:33.000 Tell everybody it's Maps.
00:00:35.000 Minds.
00:00:36.000 Minds, rather.
00:00:36.000 Sorry.
00:00:37.000 Minds.com.
00:00:38.000 M-I-N-D-S.com.
00:00:39.000 I just had Rick Doblin on.
00:00:40.000 I have psychedelics on the brain.
00:00:42.000 I sense kinship with MAPS. Yeah.
00:00:45.000 Yeah, MAPS and Minds, the two of them should go together perfectly.
00:00:48.000 We probably should work together.
00:00:49.000 Seamlessly.
00:00:54.000 There's protocols and there's platforms.
00:00:58.000 Twitter, Minds, other social networks, these are platforms.
00:01:03.000 They're kind of built in the traditional social media style, which is on servers that live in huge cloud centers.
00:01:14.000 There's also protocols.
00:01:15.000 The one that we're working with now is called Nostr, which stands for Notes and Other Stuff Transmitted by Relays.
00:01:22.000 So no company owns this protocol.
00:01:26.000 The founder is anonymous, sort of similar to Bitcoin.
00:01:29.000 And what it is, is it's all about crypto key pairs and signing stuff.
00:01:37.000 So with Nostr, this is all happening in the background on Minds.
00:01:41.000 Every user has a cryptographic key pair, which you can download your settings.
00:01:45.000 You're the only one who gets the private key.
00:01:47.000 That's your identity.
00:01:49.000 Your content, your followers, all that is tied to your identity.
00:01:54.000 So when you post something, when you follow somebody, that is creating a signature on this decentralized network of relay nodes.
00:02:03.000 So we run a relay.
00:02:05.000 Thousands of other people run relays.
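
For the technically curious, here's a rough sketch of the key-pair-and-signing flow being described, in Python. It's illustrative only: real Nostr (per NIP-01) uses BIP-340 Schnorr signatures over secp256k1, while this sketch substitutes Ed25519 from the widely available `cryptography` package so it runs anywhere.

    # Illustrative sketch of Nostr-style event creation and signing.
    # Real Nostr uses Schnorr/secp256k1 (BIP-340); Ed25519 stands in here.
    import json, time, hashlib
    from cryptography.hazmat.primitives import serialization
    from cryptography.hazmat.primitives.asymmetric import ed25519

    # The private key IS the identity; only the user ever holds it.
    private_key = ed25519.Ed25519PrivateKey.generate()
    pubkey_hex = private_key.public_key().public_bytes(
        serialization.Encoding.Raw, serialization.PublicFormat.Raw
    ).hex()

    # NIP-01 hashes a canonical JSON array to derive the event id.
    created_at, kind, tags, content = int(time.time()), 1, [], "hello, relays"
    serialized = json.dumps(
        [0, pubkey_hex, created_at, kind, tags, content],
        separators=(",", ":"), ensure_ascii=False,
    )
    event_id = hashlib.sha256(serialized.encode()).hexdigest()

    # Signing the id proves authorship; any relay or reader can verify
    # it with nothing but the public key.
    sig = private_key.sign(bytes.fromhex(event_id)).hex()
    print({"id": event_id, "pubkey": pubkey_hex, "sig": sig, "content": content})
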
00:02:07.000 Snowden's on Nostr now.
00:02:09.000 It's like getting serious endorsement.
00:02:13.000 Because it doesn't have a company.
00:02:15.000 Because companies are choke points.
00:02:19.000 We saw what happened with Napster, for instance.
00:02:22.000 What happened with Napster?
00:02:23.000 Napster just got rocked by the music industry and they basically died.
00:02:27.000 I mean, they still sort of exist, but they pretty much got taken down because there was that entity to go after.
00:02:33.000 I see.
00:02:34.000 So if they were decentralized, but there was no real decentralization back then, was there?
00:02:39.000 Well, torrents are decentralized.
00:02:40.000 Bit torrents, right.
00:02:42.000 Actually, in Nostr, there's work that we're involved with now on integrating Nostr with torrents so that heavier files, video and rich media, can be shared over the network.
00:02:52.000 Right now, it's text and links, essentially.
00:02:56.000 What is your opinion when it comes to copyright protection and stuff like that?
00:03:00.000 Well, like if someone has copyrighted material, like say if like NBC has a show and then someone uploads it.
00:03:08.000 I mean, that's illegal.
00:03:10.000 Right.
00:03:11.000 But if it's on torrents, it's like, how do you get it down?
00:03:14.000 You don't.
00:03:15.000 You don't.
00:03:16.000 So it's just, even though it's illegal...
00:03:18.000 Yeah, I mean, and that's why, like, so we have moderation.
00:03:21.000 I mean, if a copyrighted video gets reported on Minds, then, you know, we will take it down from our interface.
00:03:29.000 But it still exists on the decentralized network.
00:03:32.000 So you really can't stop people from sharing it.
00:03:36.000 No.
00:03:36.000 No.
00:03:37.000 So, you know, I think I kind of have a nuanced view on copyright.
00:03:41.000 Like, I think that people's work should be protected and, you know, not stolen and monetized.
00:03:47.000 But at the same time, it's just like the nature of information is not compatible with copyright.
00:03:54.000 Right.
00:03:54.000 Just because information wants to do what it wants to do.
00:03:57.000 That's one of the weirder things about what's happening with the internet: essentially, as time goes on, it becomes more and more difficult to control what's just ones and zeros.
00:04:13.000 As they're out there, the bottlenecks between people and information, they're getting broader.
00:04:21.000 It's getting more and more difficult to stop things.
00:04:23.000 And I think as time goes on, it'll be impossible.
00:04:26.000 I don't think there'll be any stopping it.
00:04:28.000 In the future, like anything, I have a feeling that we'll have zero privacy.
00:04:35.000 I have a feeling that all of this encryption and all this stuff, I think it's all going to be invalid once quantum computing is ubiquitous.
00:04:46.000 I just have a feeling that there's no way you can stop information when...
00:04:54.000 Technology moves past where it is now to some place where basically anybody could get access to anything at any time, and then the problem becomes how do you control money when that happens? Because money, I mean,
00:05:10.000 a lot of it is just ones and zeros.
00:05:12.000 And the only thing that stops you from being able to steal it or transfer it is encryption.
00:05:17.000 When you don't have any more encryption and anyone kind of has access to anything that's online.
00:05:24.000 So, yeah, I hear you.
00:05:26.000 It does seem like over time technology is working against privacy.
00:05:31.000 Yes.
00:05:31.000 That's a better way to put it.
00:05:34.000 But, you know, encryption hasn't been broken.
00:05:37.000 Some encryption has been broken.
00:05:38.000 But encryption is still good.
00:05:40.000 It still works.
00:05:41.000 And as quantum computing advances, encryption protocols will advance too.
00:05:47.000 So it's a race.
00:05:48.000 Right.
00:05:48.000 And there are people trying to break it.
00:05:51.000 But I heard the other day you were mentioning the whole thing with Signal and Tucker.
00:05:56.000 Yeah.
00:05:56.000 And so, yeah, to clarify that, so based on my understanding, Signal has not been compromised.
00:06:04.000 The Pegasus program that you were referring to does have the ability to hack people and get into their device.
00:06:11.000 But once you get into someone's device, it's actually easier to get into someone's machine or phone than it is to hack the encryption protocol.
00:06:19.000 So if you can get into someone's phone, then you probably can get into their messages.
00:06:24.000 I think that's what they did.
00:06:28.000 Right.
00:06:28.000 It's not that Signal's irrelevant, it's just that the government right now has the ability to get into things no matter what.
00:06:35.000 According to Gavin de Becker, who's a security expert, he said all they need is your phone number.
00:06:40.000 Signal One, excuse me, Pegasus One, was you needed a link.
00:06:46.000 So someone would have to, that's the whole Jeff Bezos story.
00:06:49.000 Someone sent him a link on WhatsApp, he clicked on that link, bang, all of a sudden they have access to his phone.
00:06:55.000 Right.
00:06:56.000 And you have to be super sophisticated to lock yourself down to avoid that.
00:07:00.000 Like, I'm sure that there are people who, well, I know there are people who...
00:07:04.000 Are probably less victim to something like Pegasus.
00:07:07.000 Sure.
00:07:08.000 Probably less victim.
00:07:09.000 But I mean, I guess with a guy like Jeff Bezos, he probably has someone scan his phone all the time.
00:07:14.000 But I wonder if they could even detect Pegasus at this point.
00:07:18.000 Yeah.
00:07:18.000 And if we know about Pegasus 2, how do we know if there's a Pegasus 3?
00:07:23.000 Like, whatever sort of workarounds they've found, I'm sure we're not going to be privy to it until it's too late.
00:07:30.000 Yeah, I mean, the layers of surveillance kind of keep zooming out.
00:07:34.000 And, you know, I think this whole intelligence world is getting out of control.
00:07:42.000 I mean, because you've got artificial intelligence, you've got...
00:07:46.000 We have what we know is the state of human intelligence.
00:07:49.000 We then have what is speculated to be potential non-human intelligences.
00:07:54.000 Have you heard about this AARO office that just came out?
00:07:57.000 No.
00:07:58.000 So, AARO, the All-domain Anomaly Resolution Office.
00:08:03.000 In the NDAA recently, there's all this UFO language.
00:08:09.000 And Christopher Mellon, who you've had on, is kind of really knowledgeable about this.
00:08:14.000 I learned a lot of this from him.
00:08:17.000 Anyway, the reason I'm talking about this is I think that intelligence and secret information, as it transcends through the corporate world and the government world, it's all kind of interconnected.
00:08:29.000 And the same types of models are used.
00:08:32.000 So...
00:08:33.000 But anyway, AARO is...
00:08:35.000 The NDAA that recently passed now gives all these whistleblower protections to people to come forward, which is an absolute game changer.
00:08:42.000 And AARO is this new office that's decently funded, I think like $11 million a year or something like that.
00:08:51.000 They're doing an audit, basically, of the government on these issues.
00:08:56.000 And I think that it correlates to surveillance to me, at least in speculation, because, like, what is this system?
00:09:07.000 Like, if they're here, which I don't know, but that is surveillance.
00:09:12.000 And, you know, it's just all compartmentalized and we don't know what's going on.
00:09:17.000 So that's why it's great that this office exists.
00:09:20.000 So, like, if someone wants to get a hold of...
00:09:24.000 I feel like when you have something like Pegasus, all you have to do is have access to it, and you can get a hold of anything.
00:09:33.000 And someone wouldn't even know if you...
00:09:35.000 Obviously, Tucker didn't know.
00:09:36.000 I don't know what kind of security protocols Tucker's company or what he uses, but obviously he didn't know that they had access to his phone.
00:09:46.000 Right.
00:09:47.000 Yeah, I don't know all the specifics of Pegasus, but, you know, we also know everything from Snowden and the PRISM program, which, you know, maybe was wound down a little bit.
00:10:01.000 I remember I heard some of that stuff got rolled back, but realistically, no.
00:10:06.000 No, no, no.
00:10:07.000 It's probably expanded.
00:10:09.000 For sure.
00:10:10.000 Yeah.
00:10:10.000 Yeah.
00:10:11.000 I mean, I'd imagine they could just get all the access to anyone who's in any room.
00:10:19.000 Just zoom in on that room, know where all the phone numbers are, and just bam.
00:10:26.000 Yeah, but the paradox of having all these back doors is that it makes everybody less safe.
00:10:32.000 It makes all of the government people less safe to have these back doors.
00:10:35.000 Right.
00:10:43.000 The chat rooms that they had, they were using Signal.
00:10:46.000 The government uses Signal very heavily.
00:10:49.000 And the Twitter rooms where they had people from the social media industry, they used that.
00:10:58.000 So to advocate for backdoors into encryption is just self-mutilation.
00:11:05.000 It's bizarre.
00:11:06.000 It's bizarre, but it's one of those things where they want to use it on other people, even if it could be used against them.
00:11:12.000 They probably feel like a lot of the people they're checking in on don't have access to that.
00:11:17.000 Like a standard journalist, they want to zoom in on Matt Taibbi's phone.
00:11:21.000 I'm sure he doesn't have access to that.
00:11:23.000 So it's like he can't use it on them, but they could use it on him.
00:11:28.000 Yeah, I just feel like we're living in sort of a petri dish a little bit.
00:11:35.000 The type of information, the level of discourse on this planet right now is very tamped down compared to what's really going on.
00:11:44.000 I mean, that's just an undeniable fact.
00:11:46.000 Tamped down?
00:11:47.000 Yeah.
00:11:48.000 I mean there's – and this is verified.
00:11:52.000 I actually – I'm a fucking dork.
00:11:54.000 I read the whole Wikipedia article just about top secret information and like all the levels.
00:11:59.000 And the US government produces more classified information than non-classified information.
00:12:10.000 So even if like there's an audit, they could redact everything.
00:12:14.000 Yeah.
00:12:14.000 Yeah, and it's just all these different divisions and departments and they all have their own protocols.
00:12:21.000 So just getting a handle on it is, I mean, that's the first thing that has to get done.
00:12:25.000 We have to, but not that we're even going to get the real information from there.
00:12:30.000 Right.
00:12:31.000 But then there's also the national security aspect of it.
00:12:34.000 It's like, you know, you have to have some things redacted because, you know, of China and Russia.
00:12:40.000 Like, you could just say that and then...
00:12:42.000 Yeah, that is the phrase that gets used.
00:12:44.000 Sure.
00:12:45.000 And it's so sad.
00:12:45.000 Because they have a full, like, clampdown on their population.
00:12:50.000 I mean, they limit the access to the internet.
00:12:52.000 Their internet is essentially, like, China-based.
00:12:55.000 Like, you...
00:12:57.000 VPNs are illegal.
00:12:59.000 And they're trying to do that here in America.
00:13:00.000 It's all backwards, yeah, with the RESTRICT Act.
00:13:02.000 That is wild.
00:13:03.000 It's getting nasty.
00:13:05.000 20 years if you use a VPN, which is hilarious.
00:13:08.000 And it's managed by the Commerce Department.
00:13:12.000 Unelected bureaucrats are the ones...
00:13:15.000 See, TikTok's actually not named in that act.
00:13:18.000 Right.
00:13:19.000 They're just letting the Secretary of Commerce decide which apps.
00:13:23.000 Yeah.
00:13:24.000 That's insane.
00:13:25.000 It's insane.
00:13:26.000 So, yeah.
00:13:28.000 Dan Crenshaw posted about it.
00:13:30.000 He...
00:13:33.000 He thinks it's not that big a deal because he thinks that there's a lot of acts that get pushed and that they never get passed through.
00:13:42.000 But what's disturbing is just the idea, the desire to do this and the fact that – imagine if it did get passed.
00:13:51.000 I mean, it's just a fucking full-on assault on free speech.
00:13:55.000 Yeah, I mean, it seems to be getting a toxic stigma connected to it.
00:14:00.000 Did you see Jesse Watters grill Lindsey Graham about it?
00:14:03.000 No, I didn't.
00:14:04.000 Oh.
00:14:05.000 He didn't read it, but he endorsed it.
00:14:08.000 Oh, Jesus.
00:14:09.000 And he just got completely called out.
00:14:11.000 It was really funny.
00:14:13.000 Like, that should be illegal.
00:14:14.000 You should not even be able to sign something that you haven't read.
00:14:17.000 And they can't read it.
00:14:18.000 It's too long.
00:14:20.000 There's not enough time.
00:14:21.000 There's not enough time.
00:14:21.000 That's a lot of these acts, right?
00:14:24.000 And they slip a bunch of shit in there that's like, wait a minute, what about page 485?
00:14:29.000 Like, what the fuck is going on there?
00:14:31.000 And then like, oh, I didn't read that.
00:14:35.000 But meanwhile, it's going to change discourse in this country.
00:14:38.000 It's going to change what people have, you know, the access that people have to free speech and communication.
00:14:44.000 And I mean, I think a lot of people endorsed it righteously being concerned about TikTok.
00:14:50.000 You know, that's what was so sneaky is, you know, they enrage you to then support this disaster.
00:14:59.000 Yeah.
00:15:00.000 And it's just like we can all agree that there's a problem with TikTok and that there's, you know, the Chinese government having access to all of this data is problematic.
00:15:12.000 But, like, there should be an encrypt act.
00:15:15.000 Like, encrypt everything, but you can't go around banning apps.
00:15:19.000 It just doesn't work.
00:15:20.000 It's irrelevant.
00:15:21.000 People are going to use VPNs.
00:15:23.000 I think this act needs...
00:15:25.000 I don't think it's going to make it.
00:15:27.000 I hope you're right.
00:15:29.000 Because more people are talking about it.
00:15:31.000 Tulsi Gabbard posted a big thing about it.
00:15:33.000 There's a lot of people that are up in arms.
00:15:35.000 But my concern is if it wasn't for social media, that act, which was kind of ironic, right?
00:15:42.000 If it wasn't for social media and people sharing this and becoming outraged and people discussing this, it would have slipped right through like the Patriot Act did.
00:15:50.000 The Patriot Act existed in a time where there wasn't social media.
00:15:53.000 And people weren't really aware of what they were pushing through until it was too late.
00:15:58.000 Yeah, I think there's much better solutions.
00:16:00.000 I mean, did you watch any of the TikTok CEO getting grilled?
00:16:04.000 Yes, I did.
00:16:05.000 Okay, so, you know, that was interesting because, you know, he's a pretty, he seemed like a sober guy.
00:16:12.000 But his point was, well, you have to have consistent standards for other social media companies, too.
00:16:18.000 I mean, like, how do we know that Facebook and Google, just because they're US-based doesn't mean that they're not giving data to China?
00:16:26.000 We have no idea.
00:16:27.000 We have no idea.
00:16:28.000 So that's really the issue.
00:16:30.000 We need to understand what specifically are all of these apps doing.
00:16:34.000 They should be labeled very specifically.
00:16:37.000 And we're starting to see some of that happen, but the thing is you can't know with these proprietary apps because they're just not sharing anything.
00:16:46.000 I think one of the problems that people have with whether any kind of decentralized app like yours or any other...
00:16:58.000 Decentralized social media network is that people immediately go, oh, what do I have to do to do this?
00:17:04.000 Like Mastodon.
00:17:05.000 When people start using Mastodon and you get on it, you're like, what is this?
00:17:09.000 There's so many servers and how do I know what to join and what's going on here?
00:17:14.000 Yeah.
00:17:15.000 Well, Minds is different.
00:17:16.000 Minds is actually not fully decentralized.
00:17:18.000 We're a hybrid.
00:17:19.000 So we run a centralized infrastructure, but we interface through delegated cryptographic event signing.
00:17:28.000 That's happening in the background, but our app feels like...
00:17:33.000 A normal social media app.
00:17:35.000 Mastodon, the way that that works, is federated instances.
00:17:39.000 So there's all of these different instances with different URLs and there's like 20 people on each one.
00:17:45.000 But there is sort of some interoperability between the instances because you can subscribe to somebody on another instance from your instance.
00:17:54.000 But it's not fully decentralized.
00:17:56.000 It's federated.
00:17:57.000 And the problem is that you don't own your identity.
00:18:01.000 So if one of those instances goes down, you're screwed.
00:18:04.000 Your stuff is gone.
00:18:07.000 In Nostr, which is an architecturally different setup, and there's other protocols similar to Nostr, it doesn't matter if the website goes down.
00:18:16.000 You just pop over to another one, upload your key, and all your stuff is there.
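
As a concrete illustration of that portability, here's a minimal sketch (Python, using the third-party `websockets` package; the pubkey and relay URL are placeholders) of asking any public relay for your own posts by public key, per the NIP-01 REQ message:

    # Sketch: identity is a key pair, so "moving" clients just means
    # asking some other relay for the events signed by your public key.
    import asyncio, json
    import websockets  # pip install websockets

    PUBKEY = "<your 64-char hex public key>"  # placeholder
    RELAY = "wss://relay.example.com"         # placeholder: any public relay

    async def fetch_my_notes():
        async with websockets.connect(RELAY) as ws:
            # NIP-01 subscription: this author's text notes (kind 1).
            await ws.send(json.dumps(
                ["REQ", "mysub", {"authors": [PUBKEY], "kinds": [1], "limit": 20}]
            ))
            while True:
                msg = json.loads(await ws.recv())
                if msg[0] == "EVENT":
                    print(msg[2]["content"])   # your post, served by a stranger
                elif msg[0] == "EOSE":         # end of stored events
                    break

    asyncio.run(fetch_my_notes())
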
00:18:22.000 And that's why we like it, because it keeps us in check.
00:18:25.000 Because our users can now basically, if we fuck around, they'll bounce.
00:18:32.000 And they can take their stuff.
00:18:36.000 Because the social graph, specifically, is the key.
00:18:39.000 Because you spend a decade getting all these followers.
00:18:43.000 It's your life!
00:18:44.000 People spend their lives doing this.
00:18:46.000 And then to be able to just get taken out by YouTube is so devastating and unethical.
00:18:51.000 It's ridiculous.
00:18:52.000 Well, it's really creepy, too, because many of the things they took people out for have turned out to be true.
00:18:57.000 There was a lot of things that they were labeling as disinformation or misinformation, which are 100% proven fact now.
00:19:05.000 And people lost their accounts.
00:19:07.000 And there's no recourse.
00:19:08.000 They're not going to reinstate you.
00:19:09.000 And that was a problem also with Twitter.
00:19:12.000 That for the longest time, if you said anything that was contrary to whatever the narrative was, whether the government was pushing it or the CDC was pushing it, anything contrary to that narrative, you would get fucked.
00:19:27.000 Yeah, and those people are not back, though.
00:19:30.000 I think Twitter's making way more progress than everyone else.
00:19:33.000 And look, I'm ultimately an Elon fan.
00:19:36.000 I'm rooting for him.
00:19:37.000 I think it's vastly improved.
00:19:39.000 But there's chaos currently underway at Twitter.
00:19:42.000 Oh, sure.
00:19:42.000 And those people have not all been let back on.
00:19:45.000 And I don't really understand why.
00:19:47.000 Who hasn't been let back on?
00:19:47.000 The people that we don't know.
00:19:49.000 The random Joe Schmo posting a COVID study, like, has he been let back on?
00:19:53.000 All the thousands of people that got banned?
00:19:56.000 Well, I think he essentially let back on everyone who didn't do anything illegal.
00:20:02.000 Not Alex.
00:20:03.000 Not Alex.
00:20:04.000 Yeah, that's true.
00:20:05.000 Why?
00:20:06.000 Yeah, well, that's a personal opinion of Elon's, which I don't agree with at all.
00:20:11.000 Yeah.
00:20:12.000 Because they let Andrew Tate on.
00:20:15.000 Right.
00:20:15.000 Yeah, it doesn't mean that he's endorsing Alex to let him back on.
00:20:18.000 Right, it doesn't.
00:20:19.000 I mean, because there's a lot of people that are back on that are, you know...
00:20:23.000 They didn't make that one specific mistake that Alex made, but they've said some horrific shit.
00:20:28.000 But the reason Alex was banned was because he confronted...
00:20:34.000 It was actually for something he did off Twitter.
00:20:35.000 So he confronted this journalist, Oliver Darcy, in a line at some event.
00:20:40.000 And he was, you know...
00:20:42.000 Being Alex Jones, sort of ranting at him.
00:20:45.000 And then Twitter said, oh, you're bullying this guy, and this is not acceptable behavior, so you're going to leave.
00:20:51.000 But then when I remember the exchange with Elon and whoever it was that was asking, it was that he hadn't been let on because of the Sandy Hook stuff, which is not the same.
00:21:02.000 That's not even why he was banned.
00:21:03.000 Right.
00:21:05.000 You know, it's not easy.
00:21:07.000 I understand, you know, the politics of it.
00:21:09.000 And he probably has Tim Cook being like, you know, we're not going to advertise if you have Alex Jones.
00:21:16.000 But I don't know what's going on, but it doesn't seem to me.
00:21:19.000 Because he could win the argument if he would just let him back on.
00:21:23.000 Right.
00:21:23.000 And did you see this crazy clip of Elon and the BBC guy?
00:21:29.000 I did.
00:21:30.000 I posted it today on Twitter.
00:21:31.000 Oh, you did?
00:21:31.000 It was amazing.
00:21:32.000 Amazing.
00:21:32.000 It was amazing.
00:21:33.000 It was amazing, yeah.
00:21:34.000 Because the guy kept trying to change subjects and, let's move on.
00:21:37.000 Like, no.
00:21:39.000 What the fuck are you talking about?
00:21:41.000 Because that guy thought he could just say the narrative without specific examples.
00:21:47.000 Like, give me an example.
00:21:48.000 And the guy had no examples.
00:21:50.000 That's most people who are concerned about this.
00:21:52.000 Well, there's a lot of people that I know that are famous that publicly announced they were leaving Twitter.
00:21:58.000 And one of them I really love.
00:22:00.000 And I was like, why are you doing this?
00:22:02.000 I didn't even say anything to her, but I'm like, why are you doing this?
00:22:04.000 This is so dumb.
00:22:06.000 You're just doing this because this is the thing that everyone feels like they're supposed to do.
00:22:10.000 Hey, well, Twitter's kind of fucked now, so bye.
00:22:13.000 No, it was fucked before.
00:22:14.000 It's less fucked now.
00:22:16.000 Yeah.
00:22:16.000 Are there people that are going to say things like what I showed you earlier today, which is hilarious, and someone posted to Kamala Harris after she said something about the assault weapons ban?
00:22:25.000 That shit's important.
00:22:27.000 It's important to have people mock people.
00:22:29.000 Like, I'm sorry if it hurts someone's feelings, but that shit's important.
00:22:33.000 Yeah, and I think the way that Elon handled that was great, because obviously you need a specific example to back up an argument.
00:22:40.000 However, I sort of think the whole premise of the conversation is wrong.
00:22:44.000 This war that Twitter is at with all the think tanks, and I think it was the Institute for Strategic Dialogue that had actually compiled the information that the BBC guy was talking about.
00:22:54.000 And there is information there.
00:22:56.000 There is data showing, you know, hate speech XYZ has increased.
00:23:00.000 However, this is the wrong conversation.
00:23:03.000 It's not...
00:23:04.000 The existence or even rise of hate in the presence of that content on an app is not...
00:23:12.000 You're not just trying to ban hate.
00:23:16.000 Banning hate does not stop hate.
00:23:19.000 And this is what the peer-reviewed research shows.
00:23:22.000 So trying to bully Elon and Twitter for...
00:23:26.000 Look, even if there was a bump of hate speech since it became a little bit more free...
00:23:31.000 I mean, it seems like that's a potentially understandable intermediary effect to happen while things reorient.
00:23:43.000 Like, we open up free speech, we open up the valve a little bit, okay?
00:23:47.000 Because we think that this is going to be healthy for society long term.
00:23:50.000 So let it bump a little bit.
00:23:52.000 We need that.
00:23:52.000 We need to see what we hate or what other people hate.
00:23:56.000 You need to, like, what is it?
00:23:59.000 Free speech lets us know who the idiots are.
00:24:01.000 Like, you need to identify them.
00:24:03.000 Yes.
00:24:04.000 Yeah, the best response to whatever it is, bad speech, is better speech, is better arguments.
00:24:12.000 And that's, you literally have a debate platform, which is what Twitter essentially is.
00:24:18.000 Yeah.
00:24:18.000 That is the purpose.
00:24:19.000 Yeah, it's the purpose.
00:24:20.000 Yeah.
00:24:21.000 Yeah.
00:24:21.000 And not to mention that the hate isn't defined.
00:24:24.000 So it's only one type of hate that these people are typically referring to.
00:24:29.000 Right-wing hate.
00:24:30.000 Right-wing hate.
00:24:31.000 Not left-wing hate.
00:24:32.000 Right.
00:24:32.000 That's okay.
00:24:33.000 And so, actually, we're suing California.
00:24:38.000 We just filed this complaint.
00:24:43.000 They passed this social media law called AB 587. It's a censorship law.
00:24:51.000 They require these policies on disinformation, misinformation, hate speech, and then they use the words extremism and radicalization, undefined.
00:25:01.000 There's no definitions.
00:25:02.000 They don't require you to have a child exploitation material policy.
00:25:09.000 But they do require you to have a policy on hate, which isn't defined.
00:25:13.000 And so we're suing them with the Babylon Bee and Tim Pool.
00:25:17.000 When did they start this?
00:25:19.000 When did they start trying to pass this?
00:25:22.000 It just went into effect in January.
00:25:25.000 So it's now?
00:25:26.000 It's now.
00:25:26.000 It's in.
00:25:27.000 So if you live in California, what's the repercussions?
00:25:31.000 So it's targeted at social media companies.
00:25:35.000 It basically mandates that social media companies submit these policies.
00:25:42.000 They would force us to write a policy on hate speech.
00:25:49.000 And then additionally, we would have to, on a biannual basis, submit analytics about all of our moderation data.
00:25:56.000 Which, honestly, we're already transparent about our moderation data, so that's largely public anyway.
00:26:01.000 We have a jury system.
00:26:03.000 And we have in-house moderators, but it's a huge burden.
00:26:08.000 It's crazy that they would expect companies to submit all that and then have these arbitrarily, well, actually not arbitrarily, specifically chosen categories for policies that are clearly politically charged.
00:26:23.000 And Newsom, when he came out and announced this law...
00:26:26.000 It was very, you know, we have to stop hate on social media and misinformation and disinformation.
00:26:32.000 Protect society.
00:26:34.000 Protect democracy.
00:26:35.000 No, you know, you're not protecting democracy by stopping free speech.
00:26:41.000 Right, because there's no checks and balances in place if something turns out to be accurate for whoever put out that disinformation initially. Like, if someone posts something, like say, masks don't work,
00:26:57.000 and they get banned off of Twitter, and say, oh, this is in response to the CDC's...
00:27:01.000 But if it turns out that masks actually don't work, the CDC doesn't get punished.
00:27:05.000 Which is kind of fucking crazy.
00:27:07.000 Because if they're the ones that are setting these guidelines and these guidelines turn out to be inaccurate and people get banned off of social media for arguing with these guidelines, there's no repercussions.
00:27:21.000 There's nothing, you know, just these people are fucked and there's no recourse.
00:27:25.000 There's nothing to do.
00:27:26.000 Actually, that's the policy that we do need.
00:27:28.000 We need the policy for social networks and media companies to, you know, apologize and fix their wrongs.
00:27:35.000 Yeah.
00:27:35.000 Well, and also, like, why wouldn't the CDC be punished then?
00:27:39.000 Well, shouldn't they be banned?
00:27:40.000 Or shouldn't they be, like, have a strike against them?
00:27:43.000 But they're not.
00:27:43.000 It's like, it's really frustrating because you're dealing with narratives that are oftentimes 100% propaganda.
00:27:52.000 And they're not backed by science.
00:27:55.000 It's just like some things that people say.
00:27:57.000 Like Rachel Maddow was on television telling everybody that this vaccine stops the virus in its tracks.
00:28:04.000 If you get vaccinated, the virus can no longer affect you.
00:28:07.000 It can't affect anyone else.
00:28:08.000 It stops and we can get out of this thing.
00:28:10.000 Well, that's not fucking true at all.
00:28:13.000 And that person, there was no repercussion other than public mockery, which continues to this day where whenever she posts something, people post that video.
00:28:22.000 What about this, stupid?
00:28:23.000 And there's nothing other than that.
00:28:26.000 I know.
00:28:27.000 They're not even inviting the other side on to correct the issues.
00:28:32.000 They're certainly not correcting their errors.
00:28:37.000 It's just not that hard to admit you're wrong.
00:28:39.000 I don't know.
00:28:40.000 Maybe it is hard.
00:28:41.000 But people get into their own egos and they just can't handle it.
00:28:44.000 Well, there's people that are beyond reproach.
00:28:46.000 And that's the problem.
00:28:47.000 Or organizations.
00:28:49.000 One of the things that's fascinating about Twitter now is they fact-checked the Biden administration.
00:28:53.000 So the Biden administration put out some tweets that were 100% horseshit.
00:28:58.000 And then underneath it, Twitter fact-checked them, so they deleted the tweets, which is glorious.
00:29:03.000 It is.
00:29:04.000 That's amazing, and that's never happened before.
00:29:06.000 Yep.
00:29:07.000 Community Notes is my favorite feature on Twitter by far.
00:29:10.000 It keeps everybody in check.
00:29:11.000 It has a process for kind of vetting information, surfacing it to the top, showing the better idea to the bad idea.
00:29:20.000 And actually, their Community Notes stuff is all open source.
00:29:23.000 It was a little bit disappointing, though, when they open sourced the algorithm the other day.
00:29:31.000 Open sourcing the algorithm is a step in the right direction.
00:29:31.000 I'm not trying to attack, but what we learned basically is that it's not the live algorithm.
00:29:39.000 It's an algorithm.
00:29:40.000 What do you mean?
00:29:40.000 What does that mean?
00:29:41.000 So it's not the production algorithm, at least from what we can tell, because you saw what happened with Substack.
00:29:47.000 Has that been reversed?
00:29:50.000 I think it has.
00:29:51.000 Yeah, I don't...
00:29:52.000 I would need to confirm.
00:29:53.000 I think that the ability to engage with those posts got changed back, but then, like, Taibbi's posts, you couldn't even search them.
00:30:01.000 But the point being is that...
00:30:03.000 And I submitted a comment on their GitHub where this algorithm exists.
00:30:09.000 It's like, well, if this is the algorithm, why wasn't the Substack blocking showing up in the algorithm the other day?
00:30:17.000 I mean, it wasn't.
00:30:19.000 It wasn't there.
00:30:20.000 So clearly there's some sort of a link blacklist.
00:30:24.000 And, you know, Twitter did say that this isn't the whole algorithm and they're going to be releasing more over time.
00:30:30.000 But the problem is we should have seen something change in that algorithm
00:30:34.000 when all the Substack blocking went down.
00:30:39.000 How many users does Minds have now?
00:30:43.000 We are 5 million.
00:30:45.000 Oh, that's great.
00:30:46.000 We actually had to take a little haircut because we, you know, and we're trying to be as honest as possible.
00:30:52.000 Because, basically, we had been counting that people who, you know, fail...
00:30:59.000 Data is hard.
00:31:01.000 And so people who had, like, tried to sign up were, in our data, showing as signed up.
00:31:07.000 So, you know...
00:31:09.000 We backed it up a little bit.
00:31:10.000 But the point being, we don't use any closed-source proprietary analytics tools.
00:31:15.000 So the temptation when you're running a startup, trying to create an app, is to just go to the Silicon Valley platform.
00:31:47.000 You are becoming Google.
00:31:49.000 You are now part of Google's tentacles.
00:31:52.000 And you're basically handing over all that user data to Google.
00:31:58.000 And our whole foundation has been...
00:32:04.000 fully open source and just don't take shortcuts.
00:32:08.000 Our growth path is healthy.
00:32:11.000 It's happening.
00:32:12.000 We're continually growing, but I don't care about the pace of growth as much as the quality.
00:32:22.000 We do machine learning.
00:32:24.000 We're starting to do AI, but we're doing it in an open source way.
00:32:29.000 So, like, in the AI wars right now, you have, like, quote-unquote OpenAI, which isn't, you know, it's barely open.
00:32:37.000 They don't share much of what's going on, and they shroud that in some sort of, like, oh, we're, you know, we need to protect you, and we need to not let this get out of control, which maybe there's an element of truth to that.
00:32:47.000 But then there's a whole other part of the AI world, like with Stable Diffusion and Stability, and it's all open source, and it's being done in the open, and everybody has access.
00:33:04.000 As you see all this AI stuff coming about you, how do you feel about that?
00:33:11.000 Do you think that you should be compensated in a way?
00:33:13.000 You mentioned copyright.
00:33:14.000 Do you have any issues with it?
00:33:16.000 No.
00:33:16.000 I mean, it is what it is.
00:33:18.000 There's no stopping that.
00:33:19.000 And I saw it a long time ago because this company from Canada was the first one to take all of...
00:33:29.000 I mean, they essentially got a database of all my audio recordings, which is...
00:33:37.000 Fucking thousands of them.
00:33:38.000 So there's so many hours of me talking that they could easily have me saying a bunch of things.
00:33:43.000 And so they put together, like, just this recording of me saying a bunch of things that I've never said.
00:33:51.000 And me talking about some subjects and doing these things.
00:33:55.000 And it was a conversation that I never had.
00:33:57.000 And then they did one with me doing a podcast with Steve Jobs, which is wild.
00:34:02.000 And now there's a new one with me doing a podcast with Sam Altman.
00:34:07.000 And it's a full podcast.
00:34:08.000 I listened to the beginning of that one.
00:34:09.000 I think they did it tastefully because they made it very clear that it's not you.
00:34:13.000 Yeah.
00:34:14.000 It's a proof of concept.
00:34:15.000 It's like they're showing that this is something that can be done.
00:34:18.000 And there's no flavor to it, which is interesting.
00:34:21.000 Like, there's no...
00:34:21.000 Like, if you and I are having a conversation, it's...
00:34:24.000 There's fun.
00:34:26.000 There's laughing.
00:34:29.000 There's human interaction.
00:34:31.000 This was just like question, answer.
00:34:33.000 And you would be like, oh, well, that's interesting.
00:34:36.000 And that just kept happening.
00:34:37.000 You can tell.
00:34:38.000 Yeah, you can kind of tell.
00:34:40.000 But for now, for now, I mean, they'll be able to sort of code your personality in the future and sort of gauge yours and maybe even have some...
00:34:53.000 You know, some weird interactions that are just silly that you could sort of program in that make it look like personality.
00:35:01.000 And you can do that now.
00:35:03.000 You can prompt more like casual attitude.
00:35:08.000 But I think that, okay, so maybe you don't care.
00:35:12.000 No, it's not that I don't care.
00:35:14.000 It's just that it is what it is.
00:35:16.000 Would I rather it not exist?
00:35:18.000 Of course.
00:35:19.000 Of course.
00:35:20.000 I'm getting sent these things where I'm doing these ads for products that I've never used and never talked about.
00:35:29.000 And these companies, and we tried to chase one of them down.
00:35:33.000 I think it was like a testosterone booster or something like that.
00:35:36.000 But when you do go down this rabbit hole legally, when you sick a lawyer on it, they're like, Jesus Christ, there's a web.
00:35:45.000 There's shell companies, shell corporations.
00:35:48.000 It's very difficult to find.
00:35:49.000 It's all overseas.
00:35:51.000 Very difficult to find who's actually making this and how they would profit off of it.
00:35:56.000 Yeah.
00:35:57.000 When we talk about AI, like displacing work, you know, on a mass scale, I think that we do have to have a conversation about these companies and, you know, where they're getting all their data and, you know, who's getting compensated.
00:36:10.000 So I actually prompted chat GPT and asked it, you know, do you have the rights to the data that you're using?
00:36:17.000 And it said no.
00:36:21.000 It said that we do not have access or rights to all of the data that we're using.
00:36:26.000 And that, you know, that could become more of a concern over time.
00:36:30.000 I posted this.
00:36:31.000 And so what does that mean?
00:36:34.000 You know, because what are they doing?
00:36:35.000 They're scraping the world's use of language.
00:36:38.000 All the data, all the imagery, everything.
00:36:40.000 And so when we talk about OpenAI becoming, you know, worth hundreds of billions of dollars probably.
00:36:48.000 It's the fastest-growing app in the history of the world.
00:36:51.000 Right.
00:36:51.000 It got to 100 million way faster than everybody.
00:36:54.000 Doing massive $10 billion deal with Microsoft.
00:36:59.000 They're profiting off of the world's data.
00:37:03.000 Right.
00:37:04.000 So I'm not saying that that's not okay at all.
00:37:08.000 But I mean, if something that I created that I didn't give them permission to use is, you know, if they're profiting off of that...
00:37:16.000 Basically, I think there's a world where some type of UBI that...
00:37:22.000 But the companies...
00:37:24.000 pay people.
00:37:25.000 So, like, we split revenue with our users.
00:37:28.000 We do, like, crazy rev share programs.
00:37:31.000 Because our whole thing is, like, if you bring energy to this network, like, get paid.
00:37:36.000 So how does that work?
00:37:37.000 Like, when you say revenue, like, what kind of revenue are we talking about?
00:37:40.000 And, like, where's revenue coming from?
00:37:41.000 So we have Minds Plus, Minds Pro, similar in functionality to Twitter Blue, but you get more reach, more exposure, verified, and more video upload, HD. And you pay monthly?
00:37:57.000 Yeah, you pay monthly.
00:38:00.000 Minds Plus is like $7.
00:38:02.000 Minds Pro is more.
00:38:04.000 How much is Minds Pro?
00:38:05.000 Minds Pro is $50 a month.
00:38:07.000 50?
00:38:07.000 Yeah.
00:38:09.000 Dude, I think people want to support platforms that they care about.
00:38:14.000 What do you get for 50 bucks a month?
00:38:15.000 You just get a lot more video.
00:38:17.000 Dude, video is so expensive.
00:38:19.000 Right, because you have to put it up on servers.
00:38:21.000 Transcoding?
00:38:21.000 And you have to transcode it into every version and its mother.
00:38:24.000 We have free users who are actually costing us...
00:38:27.000 I shouldn't even say this because...
00:38:31.000 Because they're just uploading video all day, and it's getting no traffic, so it's like they're costing us lots of money.
00:38:36.000 But it's okay.
00:38:38.000 We want free video upload.
00:38:40.000 But YouTube, the amount that they're paying, they were losing money forever because of that.
00:38:45.000 So the video is killer.
00:38:48.000 And then we also have a non-surveillance native ad network where you can boost your posts.
00:38:53.000 And so people can pay for that.
00:38:56.000 But what we do is that, well, so one, if you serve boosts on your page, we split it with you.
00:39:01.000 And we may even go deeper and give people more than half for the ads that they're serving.
00:39:07.000 And then additionally, if you get someone to buy a membership, you get half.
00:39:12.000 That's cool.
00:39:13.000 Forever.
00:39:13.000 Is anyone making a living off of Mines?
00:39:16.000 There are some, yes.
00:39:17.000 Really?
00:39:17.000 Yes.
00:39:17.000 So that's their job, is they post on Mines?
00:39:20.000 Yep.
00:39:20.000 Wow.
00:39:21.000 How much money?
00:39:22.000 Well, one of them.
00:39:22.000 I mean, you know, they're making thousands of dollars a month.
00:39:26.000 So, you know, it's money.
00:39:28.000 It's real money.
00:39:29.000 Yeah.
00:39:29.000 But that, you know, think of if you're doing it in a traditional business way.
00:39:34.000 Like, we're going to build a sales force.
00:39:37.000 And bring on a bunch of sales reps and send them out into the world and help sell our product and give them a commission.
00:39:43.000 You know, maybe typically a company, if they're trying to do a sales force, you know, maybe 20% commission, maybe 10 to 20% commission if you sell something.
00:39:54.000 We're just like, well, I don't really want to manually hire everybody.
00:39:59.000 Like, why wouldn't we just offer this crazy commission to our whole community?
00:40:04.000 Yeah.
00:40:04.000 And then because we are community owned.
00:40:08.000 So our code is owned by everybody.
00:40:10.000 Everybody owns their content.
00:40:12.000 We actually just reopened our stock offering.
00:40:15.000 You can buy our stock on WeFunder right now.
00:40:19.000 We want to be owned by the world.
00:40:27.000 It's like a distributed sales force.
00:40:31.000 Not everybody's a big creator.
00:40:33.000 Only some people can make a ton of money serving ads on their page.
00:40:37.000 There's only a certain type of people that that's relevant to.
00:40:41.000 A normal person on social media might be able to make a couple dollars a month.
00:40:46.000 They get a few views.
00:40:49.000 The reason the commission program matters is because everybody has a network.
00:40:54.000 Everybody has friends.
00:40:55.000 Everybody can have boots on the ground and go off and do work and sell stuff.
00:41:04.000 So if you sell a $1,000 ad deal on Minds, take $500.
00:41:11.000 And that's real incentive.
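
To make the arithmetic concrete, here's a quick sketch comparing a conventional sales commission to the 50% split described here (Python; the numbers are illustrative, not Minds' actual terms):

    # Toy comparison of commission models (illustrative numbers only).
    def commission(sale_amount: float, rate: float) -> float:
        """Seller's cut of a deal at a given commission rate."""
        return sale_amount * rate

    deal = 1_000.00  # the $1,000 ad deal from the example above
    print(f"Traditional rep at 15%: ${commission(deal, 0.15):,.2f}")
    print(f"Community split at 50%: ${commission(deal, 0.50):,.2f}")
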
00:41:13.000 We want to really go far beyond.
00:41:18.000 Here's a tip for Elon, and Elon, we'd love to collaborate with you, but he should have a $1,000 a month option.
00:41:26.000 People would pay.
00:41:28.000 People want Twitter to win.
00:41:32.000 What would you get for $1,000 a month?
00:41:34.000 What's worth $12,000 a year?
00:41:36.000 I mean, you could bake in advertising.
00:41:40.000 You could bake in a lot of advertising so you get a lot more reach.
00:41:43.000 You could get access to the Twitter team.
00:41:48.000 There's stuff that people will pay for.
00:41:51.000 People want to grow on social media because reach is influence.
00:41:56.000 Right.
00:41:56.000 Right, but wouldn't that be a problem?
00:41:59.000 You're only allowing that reach for people that have a lot of money?
00:42:02.000 No, it's not.
00:42:03.000 They can afford a thousand bucks a month?
00:42:05.000 Yeah, but they're already selling ads.
00:42:07.000 You can already do that.
00:42:09.000 What's the difference?
00:42:10.000 If you just bake it into a subscription.
00:42:12.000 So how does it work?
00:42:13.000 Because I've never bought an ad on those things.
00:42:15.000 Just click add, give a credit card, say how much you want to spend per day, how long you want it to go.
00:42:21.000 And it just shows up in people's feeds.
00:42:23.000 Yep.
00:42:24.000 Because I've always wondered, like, why is this in my feed?
00:42:27.000 Because for a long time it wasn't.
00:42:29.000 And then now you do see these, like, paid ads.
00:42:33.000 Yeah, the promoted posts.
00:42:34.000 And, you know, the people who do it on Minds are much less, like, annoying advertisers and more so just artists and people trying to get the word out about their stuff.
00:42:45.000 Yeah.
00:42:48.000 What's your typical user base?
00:42:49.000 How are you attracting people other than coming on a podcast and talking about it?
00:42:55.000 People just find us organically looking for alternatives to big tech.
00:43:00.000 I mean, people are sick of this shit.
00:43:01.000 This is insane.
00:43:02.000 Everything has gotten totally out of hand and it's just unapologetic.
00:43:08.000 With Facebook and Google, you know, they're just not, they're not leveling with everybody.
00:43:14.000 Like, okay, so We understand that there's horrible content on the internet.
00:43:20.000 Let's deal with this and not erode freedom of speech.
00:43:23.000 I would love to see Sundar Pichai or Zuckerberg or Tim Cook have more of a balanced conversation about this, but it just seems like whenever they talk about it, it's like they're posturing.
00:43:37.000 It doesn't feel real, and they're not really acknowledging even the academic conversation with regards to censorship, because the academics are saying that censorship causes increased radicalization.
00:43:54.000 So what's going on here?
00:43:59.000 Yeah.
00:44:00.000 What other things could be done to sort of level the playing field?
00:44:10.000 I mean, open sourcing is the foundation, number one.
00:44:16.000 Like, everyone's afraid.
00:44:19.000 You know, look, companies will say, oh, we couldn't share our secret sauce.
00:44:25.000 You know, that's what makes us competitive.
00:44:28.000 But...
00:44:31.000 It's just really not true.
00:44:33.000 You can use software licenses as well that restrict people.
00:44:37.000 So you can still be transparent.
00:44:38.000 For instance, there's this really cool app called Uniswap, which is a decentralized protocol for crypto.
00:44:46.000 So you can swap tokens, and there's no intermediary.
00:44:51.000 And they used a time-delayed GPL. GPL is the General Public License, one of the most famous free software licenses.
00:44:59.000 But they basically said, look, we need to be transparent.
00:45:02.000 No one takes anything in crypto seriously unless it's transparent and audited.
00:45:07.000 So we're going to make it so that we're showing you all the code.
00:45:11.000 You can make sure we're not spying on you, doing anything sketchy.
00:45:14.000 But if you're a commercial entity, you cannot fork our code and compete with us for the next two years.
00:45:21.000 So they basically were giving themselves a head start.
00:45:23.000 The license that we use is the GPLv3, which says that anyone can do whatever they want with it.
00:45:29.000 And people do.
00:45:30.000 There are other versions of mines around the world, people running it and having their own social network.
00:45:35.000 But if they make changes, they have to share those changes with the world.
00:45:39.000 So it's referred to as copyleft in the copyright world.
00:45:44.000 It's basically that, yeah, you have to, it's sort of a pay it forward.
00:45:50.000 I borrowed from you.
00:45:51.000 I'm going to use that to build my business.
00:45:53.000 Yeah, I'm going to sell it.
00:45:54.000 I'm going to make a ton of money.
00:45:55.000 But, you know, the development that I did, I also have to share.
00:46:00.000 And there's many others.
00:46:02.000 I mean, there's even licenses that are way more restrictive but still provide the transparency.
00:46:07.000 There's ones that are just read-only.
00:46:08.000 Like, listen, you can read this and see it, but you cannot touch it.
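
In practice, projects signal these license terms file by file; a common convention (not something Minds specifically was said to use) is an SPDX identifier at the top of each source file:

    # SPDX-License-Identifier: GPL-3.0-or-later
    # Anyone may run, modify, and redistribute this file, but distributed
    # modifications must be published under the same license (copyleft).
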
00:46:12.000 You know, that would be a step in the right direction for Facebook.
00:46:16.000 And the thing is that they know about this power dynamic because they do create tons of open-source tools.
00:46:24.000 React.
00:46:25.000 Facebook created React, which is one of the most popular JavaScript frameworks.
00:46:29.000 Angular was made by Google and tons of databases and back-end tools.
00:46:36.000 These big tech companies do contribute a ton to open source, but they only do it on the stuff that are developer tools.
00:46:45.000 Because they know that the developers will only use their stuff if it's open source.
00:46:51.000 Developers are never going to use something that they don't have control over.
00:46:55.000 So it's like this very intentional game that they're playing.
00:46:58.000 Their main apps, they're not transparent about it all, but they know that they need the developer energy.
00:47:05.000 So, you know, I think that they should just do it.
00:47:08.000 And the great thing, even though the Twitter algorithm's not, you know, there yet, I think that Elon is, when I saw that happening, I would just say...
00:47:20.000 It's like one of the big guys dipped their toe in the water.
00:47:25.000 But it really has to be someone like Elon, who's eccentric and insanely wealthy, who's willing to go out on a limb for $44 billion and overpay for a company and then sort of like fucking throw it upside down.
00:47:39.000 But it's working.
00:47:42.000 It's working.
00:47:43.000 And besides all the people that publicly decry that they were done, that's not real.
00:47:49.000 The reality is that's not really that important.
00:47:52.000 The real important stuff is the mass amounts of humans that are constantly sharing information.
00:47:58.000 And that seems to have gone up.
00:48:01.000 Yeah, I think that Elon also just changed the way that the billionaires act.
00:48:10.000 Like, before him, you know, who's another billionaire that shitposts memes?
00:48:16.000 Nobody.
00:48:16.000 I mean, just, but that, don't underestimate, and I know you don't, but the fact that he did that, I think it sort of paves the way for other people up on his level to start being more real.
00:48:30.000 You think so?
00:48:31.000 I hope so.
00:48:31.000 I think most of them are cowards.
00:48:34.000 And they just don't have the courage to be that wild and just really post things they think are funny.
00:48:41.000 For me, my favorite one ever was the Bill Gates one when he posted the photo of Bill Gates next to a pregnant man emoji.
00:48:49.000 And it's like if you ever want to lose a boner real fast.
00:48:52.000 And he has a specific beef with Bill Gates because he's shorting Tesla stock.
00:48:56.000 Yeah.
00:48:57.000 But yeah, exactly.
00:48:58.000 I mean, they should.
00:49:00.000 It's good for...
00:49:01.000 People like real things.
00:49:03.000 People don't like fake things.
00:49:04.000 Like, they should...
00:49:04.000 Even if they were being sketchy and, like, manipulative about it.
00:49:09.000 Like, they should be acting real.
00:49:10.000 Because that's what people like.
00:49:12.000 I saw Bezos tweeting...
00:49:15.000 Bezos, I think he wants to come out of his shell.
00:49:19.000 He was tweeting, like, a Barry Weiss article, which was weird.
00:49:22.000 Interesting.
00:49:23.000 You know, because...
00:49:24.000 Amazon, we just ditched Amazon.
00:49:27.000 We moved our whole operation over to Oracle because they're more committed to free speech.
00:49:34.000 Does Amazon censor?
00:49:36.000 I mean, they banned Parler.
00:49:39.000 Oh, did they?
00:49:40.000 Yeah.
00:49:40.000 Interesting.
00:49:41.000 Yeah.
00:49:41.000 And they have much worse terms, which, like, kind of...
00:49:45.000 spell it out.
00:49:46.000 What was their justification for banning Parler?
00:49:49.000 Did they make a statement?
00:49:50.000 It was all around the Jan 6 stuff and, you know, extremism type reasons.
00:49:58.000 Meanwhile, all that content is on Facebook and certainly on other platforms that were on AWS. So do you think that that's just sort of like a PR move to ban Parler?
00:50:14.000 Partially.
00:50:14.000 I don't know all the specifics of the back and forth between them.
00:50:21.000 I don't know if they were given warning or the ability.
00:50:25.000 But the thing is, even if they were, it's like, what Amazon would have been asking is, take down this content or you're going to have to leave.
00:50:34.000 You know, that would be violating what they were trying to do with free speech.
00:50:38.000 You know, because there's a group of companies now that are like pro-free speech platforms, which are gaining dominance.
00:50:44.000 It's awesome.
00:50:45.000 You know, you've got Rumble, you've got Parler, you've got Mines, you've got, you know, there's a bunch of them.
00:50:51.000 And...
00:50:53.000 But unfortunately, the waters get muddied because, you know, Rumble uses Google Analytics.
00:51:01.000 They are totally closed source.
00:51:04.000 But they're doing some of the speech stuff right, which is absolutely essential.
00:51:08.000 And so, again, it's a huge step in the right direction.
00:51:10.000 I actually had a back and forth with Chris, their CEO, and he said that he would be open to some open source stuff, except he's been icing me on a couple emails recently.
00:51:20.000 He's probably busy though, no?
00:51:21.000 Oh, he's very busy.
00:51:23.000 But I think that this open source issue needs to be homed in on.
00:51:29.000 We can't just let the next wave of free speech companies be doing all the same shit that big tech was doing on the technological end.
00:51:38.000 We can't just let them keep doing the surveillance, keep doing the secrecy.
00:51:44.000 We're not moving forward if that is where we ultimately end up.
00:51:50.000 But what I'm hoping...
00:51:52.000 So we're core developers at...
00:51:55.000 Well, we contributed what's called a NIP, which is a Noster Improvement Proposal, which is the framework which could enable a site like Twitter or Rumble...
00:52:07.000 Or any of them to integrate Noster like we do.
00:52:10.000 So you don't have to be fully decentralized, but you can integrate NIP 26, which is delegated event signing, so that your users have an escape hatch.
00:52:24.000 That's really all people want.
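To make "delegated event signing" concrete: under NIP-26, a user's root key signs a short token that authorizes a second key, say one the platform holds, to publish events on the user's behalf under stated conditions. Here is a minimal TypeScript sketch of that idea; the crypto helpers are assumed stand-ins rather than a real library's API, and the authoritative wording is the NIP-26 document in the nostr-protocol/nips repository.

```typescript
// Assumed stand-in crypto helpers, not a specific library's API.
declare function sha256(msg: string): string;
declare function schnorrSign(digest: string, privkey: string): string;
declare function schnorrVerify(sig: string, digest: string, pubkey: string): boolean;

// A NIP-26 tag: ["delegation", delegator pubkey, conditions, token]
type DelegationTag = ["delegation", string, string, string];

function createDelegation(
  delegatorPrivkey: string,
  delegatorPubkey: string,
  delegateePubkey: string,
  conditions: string // e.g. "kind=1&created_at<1700000000"
): DelegationTag {
  // Per NIP-26, the token signs sha256("nostr:delegation:<delegatee>:<conditions>").
  const digest = sha256(`nostr:delegation:${delegateePubkey}:${conditions}`);
  return ["delegation", delegatorPubkey, conditions, schnorrSign(digest, delegatorPrivkey)];
}

function verifyDelegation(tag: DelegationTag, delegateePubkey: string): boolean {
  const [, delegatorPubkey, conditions, token] = tag;
  const digest = sha256(`nostr:delegation:${delegateePubkey}:${conditions}`);
  return schnorrVerify(token, digest, delegatorPubkey);
}
```

The escape hatch is that the root key never leaves the user, so if the platform misbehaves, the same identity keeps working anywhere else on the network.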
00:52:25.000 People are going to keep using Twitter.
00:52:26.000 Just because a fully decentralized option exists doesn't mean people are going to stop using Twitter.
00:52:31.000 And I feel like that's kind of what Elon has in his head.
00:52:34.000 Because, I mean, Noster was even on the... Remember when, like a couple months ago, Twitter came out with this policy, like, we're banning links to Instagram, Facebook, and Mastodon?
00:52:45.000 And then they rolled it back.
00:52:46.000 Do you remember that?
00:52:47.000 I don't.
00:52:48.000 So they did do that, and people were bugging out.
00:52:52.000 Of course.
00:52:53.000 And so they rolled back the policy.
00:52:55.000 They said, okay, we're not going to do that.
00:52:56.000 Noster was actually on that list.
00:52:58.000 So they're aware of this system, but I think that they're thinking about it the wrong way.
00:53:05.000 There's a fully pure Noster client called Damus, which is super nice, and it's a great option.
00:53:15.000 You know, fully decentralized options don't have a ton of functionality.
00:53:20.000 You know, they don't have the notifications.
00:53:21.000 They don't have a lot of the discoverability of stuff.
00:53:25.000 There's serious limitations with fully decentralized stuff that's never going to be able to compete with more of, like, a centralized option where you can do all this fancy stuff. I think that this idea that we need to push out decentralized competitors is really just the wrong strategy.
00:53:48.000 Elon repeatedly says, we need to maximize public trust.
00:53:52.000 And I do believe that he believes that and wants that.
00:53:55.000 And that's why he's trying to be more transparent.
00:53:57.000 But maximizing public trust...
00:54:00.000 Is about, you know, giving people their own keys.
00:54:03.000 And then, you know, that's going to hold the company accountable.
00:54:05.000 And then if, you know, Twitter messes around, they can go pop over to someone else.
00:54:09.000 But they don't lose all their stuff.
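A rough sketch of what "give people their own keys" means in practice: identity is just a keypair, and every post carries a signature anyone can verify against the public key, so authorship doesn't depend on trusting the host. The helpers are again assumed stand-ins, and real Nostr events hash a canonical serialization with more fields than shown here.

```typescript
// Assumed stand-in helpers for illustration only.
declare function generateKeypair(): { privkey: string; pubkey: string };
declare function sha256(msg: string): string;
declare function schnorrSign(digest: string, privkey: string): string;

interface SignedPost {
  pubkey: string;     // the author's portable identity
  created_at: number; // unix seconds
  content: string;
  sig: string;        // verifiable against pubkey by any client or relay
}

function signPost(content: string, keys: { privkey: string; pubkey: string }): SignedPost {
  const created_at = Math.floor(Date.now() / 1000);
  // Simplified: real Nostr hashes a canonical JSON serialization of the event.
  const id = sha256(JSON.stringify([0, keys.pubkey, created_at, content]));
  return { pubkey: keys.pubkey, created_at, content, sig: schnorrSign(id, keys.privkey) };
}
```

Because the signature travels with the post, "porting your stuff" amounts to replaying your signed events to a different relay; nothing depends on the original host staying cooperative.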
00:54:12.000 But how would you possibly move Twitter stuff to some other network, though?
00:54:19.000 Because if they...
00:54:21.000 Let's say if someone did something like that, and you had all your posts on Twitter, and you've been on Twitter since 2009, and Minds gives you the ability to port your shit over to there.
00:54:35.000 How the fuck would you ever wind up doing that?
00:54:38.000 I mean, if it gets integrated, then all the posts can just be signed.
00:54:45.000 If it gets integrated.
00:54:46.000 But that seems like that would be very beneficial to Minds, but not very beneficial to Twitter.
00:54:50.000 Because at the end of the day, Twitter is still a company.
00:54:54.000 But think about it.
00:54:55.000 We're a company.
00:54:57.000 Right, but it would be beneficial to you.
00:54:59.000 You're far smaller.
00:55:00.000 How many users does Twitter have?
00:55:02.000 Oh, of course.
00:55:02.000 Hundreds of millions, right?
00:55:04.000 Right, but...
00:55:06.000 Actually, there were people on our team who asked the same question.
00:55:11.000 Why would we give people the ability...
00:55:15.000 Why would we let people...
00:55:18.000 They're just going to go take their Minds stuff and go somewhere else.
00:55:20.000 So it's not just us.
00:55:24.000 It's not...
00:55:25.000 Right, but you're a very small company, relatively speaking.
00:55:28.000 Like, when you first came on the podcast, I think you had two million.
00:55:31.000 Is that what it was?
00:55:32.000 Somewhere around there?
00:55:33.000 Yeah, yeah, that sounds right.
00:55:33.000 And now you're more than double that.
00:55:36.000 You're five million.
00:55:37.000 But relatively speaking, when people start talking about social media networks, Minds does not get into the conversation.
00:55:44.000 As a mass, you know, distributor of information.
00:55:48.000 Right.
00:55:49.000 And, you know, now Rumble is.
00:55:51.000 Yeah.
00:55:52.000 And do you know why?
00:55:53.000 They've spent a lot of money.
00:55:54.000 Oh, God, they've spent a lot of money.
00:55:55.000 They came to me with a fucking shitload of money.
00:55:58.000 Same thing happening with Substack.
00:55:59.000 They're spending millions of dollars bringing over top talent.
00:56:03.000 It's not sustainable.
00:56:05.000 I wonder about Rumble.
00:56:07.000 I'm like, how much are you guys spending?
00:56:10.000 That's a lot of fucking money.
00:56:11.000 Because they have Steven Crowder on there.
00:56:14.000 They have Russell Brand.
00:56:15.000 They have all these people that I know.
00:56:17.000 They had to fork up some serious cash.
00:56:21.000 Yeah, I don't know.
00:56:22.000 I mean, I think that they're well bankrolled, that's for sure.
00:56:25.000 I guess.
00:56:26.000 And it also seems to be working in terms of just generating more publicity and users.
00:56:34.000 I know there was something that was on Rumble recently.
00:56:37.000 I forget what the video was, but it was over 2 million views.
00:56:40.000 I was like, that's significant.
00:56:41.000 Yeah, no, it's great.
00:56:42.000 I mean, it's great for...
00:56:45.000 It's very validating.
00:56:46.000 Regardless, see, I'm in it for the actual speech.
00:56:50.000 I want them to succeed.
00:56:52.000 I want that to happen.
00:56:55.000 Just while it's in my head, because I forgot to bring it up before about Substack.
00:57:03.000 So Elon actually accused them of stealing Twitter data.
00:57:08.000 And that was part of his justification for blocking them.
00:57:13.000 How did they steal Twitter data?
00:57:14.000 So, I mean, that's a very strong word that he used.
00:57:17.000 And I don't know all the details, but what I know is that, so Substack is largely powered by Twitter's API. Which means API application programming interface.
00:57:28.000 It's basically a developer tool set so that websites can integrate with each other. You see login with Facebook, login with Twitter.
00:57:37.000 You have tweets embedded in Substack.
00:57:40.000 The API is how you facilitate that.
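As one concrete example of this kind of integration, Twitter has long exposed an oEmbed endpoint that turns a tweet URL into embeddable HTML. A minimal sketch; the endpoint shape follows the public oEmbed convention, and the details should be treated as illustrative rather than current API documentation:

```typescript
// Fetch embeddable HTML for a tweet via the oEmbed-style endpoint.
// Every call like this is work the platform does for the embedding
// site, which is the cost pressure described below.
async function fetchTweetEmbed(tweetUrl: string): Promise<string> {
  const endpoint = `https://publish.twitter.com/oembed?url=${encodeURIComponent(tweetUrl)}`;
  const res = await fetch(endpoint);
  if (!res.ok) throw new Error(`oEmbed request failed: ${res.status}`);
  const data = (await res.json()) as { html: string };
  return data.html; // a blockquote plus script that renders the tweet
}
```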
00:57:43.000 And Twitter has been locking down its API. Because...
00:57:49.000 It's been, quite frankly, probably costing them millions of dollars because when Substack makes an API call to Twitter, that costs Twitter money.
00:57:57.000 So Elon's perspective is, okay, we're hemorrhaging money.
00:58:01.000 I'm speculating.
00:58:02.000 I don't want to put words into his mouth.
00:58:04.000 But I think that he's locking down the API because...
00:58:07.000 It's costing them so much money to be supporting all these websites that aren't paying them.
00:58:12.000 Right.
00:58:12.000 So, you know, Substack, the authentication and sign-up uses Twitter.
00:58:17.000 The social graph and recommendations uses Twitter.
00:58:20.000 So, Elon's tweet said that, you know, they've basically been abusing our API to bootstrap their own social network.
00:58:28.000 Because Substack just came out with a social feed called Notes.
00:58:31.000 Right.
00:58:32.000 How does that work?
00:58:33.000 It's just a news feed.
00:58:34.000 Can you show me that?
00:58:36.000 I want to see what it looks like.
00:58:38.000 Substack notes.
00:58:40.000 Because I just heard about this.
00:58:42.000 I don't want to stain this table.
00:58:44.000 Oh, it's good to stain it.
00:58:45.000 It's good.
00:58:46.000 It gives it life.
00:58:47.000 This is a relatively new table.
00:58:49.000 The old one we had, the old studio, was covered in stains.
00:58:53.000 Introducing Substack Notes, unlocking the power of the subscription network.
00:58:58.000 And so...
00:58:59.000 What is this?
00:59:02.000 So there's a screenshot down there.
00:59:05.000 You have to sign in.
00:59:06.000 No, just do continue.
00:59:07.000 Yeah, there it is.
00:59:08.000 Okay.
00:59:09.000 So it's just a news feed.
00:59:09.000 So it looks exactly like Twitter.
00:59:11.000 Right.
00:59:12.000 Literally.
00:59:12.000 It has a heart, it has comments, and it has some sort of a repost, but it's not a square.
00:59:18.000 It's a circle.
00:59:20.000 It's basically a copy of Twitter, just like the Truth Social Network is, right?
00:59:26.000 Right.
00:59:26.000 Yeah.
00:59:27.000 And so to dig into that a little deeper, like, so you know Marc Andreessen?
00:59:32.000 Mm-hmm.
00:59:33.000 Brilliant guy.
00:59:34.000 I love that dude.
00:59:35.000 I mean, he's super, yeah, he's one of the most legendary tech investors of all time.
00:59:39.000 He's so smart.
00:59:40.000 And he's so smart.
00:59:40.000 And so he owns, I don't know what percentage of Substack, but Andreessen Horowitz is one of Substack's primary funders.
00:59:49.000 Okay.
00:59:49.000 And Andreessen Horowitz also put hundreds of millions into the Twitter deal.
00:59:56.000 So, Elon and Andreessen are probably super tight.
01:00:00.000 And so what's happening...
01:00:01.000 In another tweet, someone had said, you should buy Substack.
01:00:06.000 And Elon responded, yeah, maybe I will.
01:00:10.000 And this was like two months ago.
01:00:12.000 And so they've been pursuing...
01:00:14.000 It seems as though they've been in negotiation for Twitter to actually buy Substack.
01:00:19.000 I mean, Elon said it.
01:00:20.000 And he's also super tight with Marc Andreessen.
01:00:23.000 And so...
01:00:27.000 Probably, Substack thinks it has a certain valuation.
01:00:30.000 Elon wants to get it for less and was trying to say, listen, you are reliant on us.
01:00:36.000 Right.
01:00:36.000 We're helping you grow and then I'm thinking about buying you.
01:00:39.000 It's costing me more money the more time that you're doing this.
01:00:42.000 So what I'm going to do is...
01:00:44.000 Cut off your API access and show you who's daddy.
01:00:48.000 Ooh.
01:00:49.000 Right.
01:00:50.000 So, I mean, who knows?
01:00:52.000 Because I don't think that...
01:00:53.000 Just speculation.
01:00:53.000 I don't think that he was out to censor.
01:00:56.000 And unfortunately, Elon has successfully enraged both mainstream and now independent journalists.
01:01:01.000 Which is not—I don't think he intended for that to happen, but that's what happened.
01:01:06.000 I mean, all the Substack journalists—you know, that's one of the best places for independent journalism.
01:01:09.000 Yeah.
01:01:10.000 And now they're all pissed because, you know, there was a period of time where their businesses were screwed up.
01:01:15.000 Well, not only that, it was the very people that were using Twitter to put out these Twitter Files.
01:01:24.000 So there were all the emails that showed the collusion between the intelligence agencies and the former heads of Twitter.
01:01:32.000 You know, like this was the same guys and they were publishing this.
01:01:36.000 Yeah, and they all thought they were friends.
01:01:38.000 Yeah.
01:01:39.000 Well, as soon as the money got involved, that's where things get weird, right?
01:01:43.000 Yeah, and so I think it kind of is a corporate negotiation byproduct.
01:01:51.000 It seems like that's kind of what happened because I don't think that he intentionally wants to hurt those people.
01:01:58.000 What I like about Elon is he will change course.
01:02:01.000 If people respond in a negative way, they don't like it, they get upset, he's like, okay, we won't do that.
01:02:07.000 And he's publicly said that he would do that, and I like that he does do that.
01:02:11.000 He's flexible in that regard, and he's not completely dogmatic about these ideas.
01:02:18.000 Yeah, he's doing it live.
01:02:20.000 Yeah, he really is.
01:02:21.000 Fuck it, we'll do it live.
01:02:23.000 He really is, which is kind of interesting.
01:02:26.000 And it's also part of the fun of the chaos of Twitter under him.
01:02:30.000 You've got this one incredibly intelligent, super eccentric guy who happens to be one of the richest people on the earth.
01:02:37.000 And he decided to take over the...
01:02:40.000 And he did it very specifically because he thinks it's a threat to democracy.
01:02:44.000 That having this censorship and having this...
01:02:47.000 This control over the access to information and the narrative, which is what you were previously getting. And it's like, we see it with YouTube, we see it with these other social media networks, where someone has a problematic opinion, they get shadow banned and ghosted, and their algorithm gets fucked up,
01:03:06.000 their access to new subscribers gets fucked up, and there's real-world consequences.
01:03:12.000 There's also demonetization, right?
01:03:15.000 YouTube's got this really sneaky thing that they do where they just demonetize things, and so you self-censor because you don't want to get demonetized.
01:03:22.000 And, you know, that's not good for anybody.
01:03:25.000 It is an existential threat having social media platforms censored.
01:03:29.000 I mean, that is how humanity is educating itself.
01:03:32.000 Right.
01:03:32.000 But it's also a new thing, right?
01:03:34.000 Because these things didn't exist two decades ago.
01:03:37.000 So now all of a sudden we have this new platform that really should be considered like a utility.
01:03:43.000 I think your access to it should really be just like your access to the internet.
01:03:47.000 Like, if we just decided that someone was problematic and you can't have the internet anymore, holy shit!
01:03:54.000 That would freak people the fuck out.
01:03:55.000 Well, it should freak you the fuck out if you don't have access to YouTube.
01:03:59.000 It should freak you out because that is the primary way where people share video and post video and post their opinions on things.
01:04:08.000 Yeah, it gets hairy too because Twitter's still playing this game with other countries.
01:04:14.000 Which we've basically said no to. And actually, we're dealing with this right now, I'll just call them out.
01:04:19.000 Germany has this horrible piece of legislation called the Network Enforcement Act, which is similar to the California thing I was mentioning.
01:04:29.000 And, you know, they want you to take down stuff at their request.
01:04:34.000 And we're just not going to do that.
01:04:35.000 So we've made that decision.
01:04:37.000 And Twitter's doing that for Germany?
01:04:39.000 Twitter's doing that.
01:04:40.000 And they're also doing it with India.
01:04:42.000 Really?
01:04:43.000 So if someone in India posts something negative about the government, the government can say, take this down, and Twitter will take it down.
01:04:48.000 Twitter has these interstitials, like, kind of different content policies in different states.
01:04:52.000 And so Pakistan, there's basically a different Twitter in all of these different countries. Is that because it's the only way they can be on in those countries?
01:05:01.000 That's the argument.
01:05:03.000 But it's the same argument that Google goes through when they're like, should we go to China?
01:05:07.000 We want to go to China.
01:05:08.000 We have this Dragonfly project.
01:05:10.000 Are we going to, you know, let's work on it behind the scenes.
01:05:13.000 What is Dragonfly?
01:05:14.000 It's like Google's, I don't know all the details, but it's their kind of China project.
01:05:21.000 Their project to get Google accessible in China and have it be okay with the government.
01:05:27.000 I was friends with someone who was an executive at Google back in the day, and what she described to me was that if they didn't, and they were in negotiation and doing business with China, and she said, if we don't do this, they're just going to copy it.
01:05:40.000 They're basically just going to rip off Google.
01:05:42.000 And so we're in this battle to either appease them with their rules and have it go over there and have some things like Tiananmen Square be censored.
01:05:54.000 Where you can't access information about Tiananmen Square.
01:05:56.000 It's just like, well, it seems to be a situation where we kind of have to do that or they're just going to copy Google.
01:06:03.000 Yeah, I don't think that...
01:06:05.000 That's just a game that we're not interested in playing.
01:06:08.000 I think that we're just going to stick to...
01:06:13.000 The laws that we're required to obey and if other countries are going...
01:06:18.000 I mean, we've been banned in China.
01:06:19.000 We've been banned in Vietnam.
01:06:21.000 We had a huge wave of, like, this was one of our largest...
01:06:24.000 We got, like, half a million users in, like, two days from Vietnam because there was a revolving door between their government and Facebook.
01:06:33.000 And, you know, the journalists of Vietnam found out and we got this huge wave.
01:06:39.000 And then shortly after, Vietnam banned us.
01:06:43.000 But, you know, people still can use VPNs at their own risk there.
01:06:48.000 I just feel like it's a losing battle, constantly catering to all of these different countries and their censorship laws.
01:06:54.000 It feels like just sort of a waste of time.
01:06:58.000 And we can bypass all that with decentralized protocols.
01:07:01.000 And so, you know, but I get it.
01:07:05.000 It's not easy, especially when, you know, a company could go bankrupt.
01:07:09.000 And if, you know, you lose all your German users, if Twitter gets banned in India, that's a major problem.
01:07:16.000 We're not dependent on those users.
01:07:19.000 And I appreciate that you're not dependent.
01:07:20.000 I think what you do is very important.
01:07:22.000 And I'm glad you're out there.
01:07:23.000 I really am.
01:07:24.000 And I'm glad there's this option.
01:07:27.000 And I'm glad you guys have these, like, rock-solid ethics in regards to that.
01:07:31.000 It's very important, and I wonder, like, at scale, if we come back and do another podcast in two years, and Minds now has 50 million.
01:07:42.000 Or a hundred million.
01:07:43.000 Let's make that happen!
01:07:44.000 You're like, what happens then?
01:07:47.000 I have to be honest, I only got on Minds a couple times after you gave me a login.
01:07:54.000 You know what's funny?
01:07:54.000 You know your first time sharing Minds?
01:07:58.000 No.
01:07:59.000 It was in like 2009, or no, maybe like 2011, and you shared this viral video that was going on on Minds of this, like, quantum levitation disk.
01:08:14.000 I don't know if you've ever seen that.
01:08:15.000 It's like, if you just search, like, quantum, you know, superconducting levitation. And you had just seen that, and this was way before we knew each other, and you had just shared that.
01:08:27.000 You sure it's me?
01:08:28.000 I'm almost positive.
01:08:30.000 I bet it's not.
01:08:31.000 No, I had friends message me.
01:08:33.000 They were like, dude, Rogan just shared this bit.
01:08:35.000 I bet it's not.
01:08:35.000 Really?
01:08:36.000 Yeah.
01:08:36.000 I bet it's a fake me.
01:08:38.000 No.
01:08:39.000 I think he got duped.
01:08:41.000 Really?
01:08:41.000 Let's go look at it.
01:08:42.000 Can we find it on your Twitter?
01:08:44.000 Yeah, how would it be me?
01:08:46.000 No, because you just shared a link to a video.
01:08:47.000 You didn't know.
01:08:48.000 Oh, so I shared it on Twitter.
01:08:50.000 You shared it on Twitter.
01:08:51.000 Oh, okay, that makes sense.
01:08:51.000 Yeah, no, no, you weren't a user.
01:08:53.000 Right.
01:08:54.000 Yeah.
01:08:54.000 Okay.
01:08:55.000 Yeah.
01:08:57.000 But...
01:08:58.000 That makes sense.
01:08:59.000 Yeah, I think.
01:08:59.000 So I shared this thing, and this thing originated on Minds, and then it boosted Minds.
01:09:05.000 Mm-hmm.
01:09:07.000 It was just like one of those like crazy levitation videos where it's just like, what the hell is going on?
01:09:12.000 Like, this magnet is just levitating.
01:09:15.000 Right, right, right.
01:09:16.000 That sounds like something I would post.
01:09:17.000 Yeah.
01:09:19.000 So, like, what is, like, what's...
01:09:21.000 Do you have, like, the number one post ever on Minds?
01:09:25.000 Like, how many people does it reach?
01:09:28.000 I mean, yeah, we have posts that have hit probably over a million views.
01:09:34.000 And is it because someone's sharing it on other social media networks like I did?
01:09:39.000 Yep.
01:09:40.000 For sure.
01:09:41.000 Yeah.
01:09:42.000 And that's what's so malicious about these algorithms getting clamped down is that, you know, everyone's in survival mode.
01:09:49.000 And even on Twitter, what we learned from the algorithm is that, you know, links are punished.
01:09:55.000 All links.
01:09:56.000 If you post a link on Twitter, it's not getting elevated in the algorithm.
01:10:02.000 And that is because they want to keep people on Twitter.
01:10:05.000 Really?
01:10:05.000 Yeah.
01:10:06.000 And we always speculated this, because you can just tell when you post that the reach of posts with links is way down.
01:10:12.000 And so we historically have obviously used that.
01:10:15.000 I mean, the problem is that the way that big tech emerged, there was nothing else.
01:10:20.000 There was nothing to throttle them.
01:10:22.000 Right.
01:10:22.000 So now they can just throttle competition by not allowing links to outside sites to be elevated.
01:10:31.000 I'm not saying they don't have the right to do it, but I think that it's just not helpful for the open web.
01:10:37.000 I think that people need to be able to post links and not get punished for it.
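To see mechanically what a link penalty does, here is a toy ranking function with a multiplicative down-weight for posts containing external links. Every number here is invented for illustration; the claim in the conversation is about the observed behavior, not these weights.

```typescript
// Toy feed-ranking score with a multiplicative penalty for external links.
interface Post {
  likes: number;
  reposts: number;
  replies: number;
  hasExternalLink: boolean;
}

function rankScore(p: Post): number {
  const engagement = p.likes + 2 * p.reposts + 1.5 * p.replies;
  // A factor below 1 means a link post needs far more engagement to
  // reach the same feed position as a native post.
  const linkPenalty = p.hasExternalLink ? 0.4 : 1.0;
  return engagement * linkPenalty;
}

// Identical engagement, very different reach:
const native = rankScore({ likes: 100, reposts: 20, replies: 10, hasExternalLink: false }); // 155
const withLink = rankScore({ likes: 100, reposts: 20, replies: 10, hasExternalLink: true }); // 62
```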
01:10:42.000 Yeah, for sure.
01:10:43.000 I didn't know that that was even a thing.
01:10:45.000 Yep, that's one of the things we learned.
01:10:46.000 You know, obviously they're favoring native video, native content, they're favoring Twitter Blue, which is, again, some of it makes sense, but, you know, don't piss off the journalists.
01:11:00.000 No, it doesn't make sense.
01:11:01.000 It's not smart.
01:11:03.000 But the link thing is really weird.
01:11:05.000 So...
01:11:07.000 I could understand encouraging people to post links, but how would you do that?
01:11:13.000 Not encouraging people to post links, but posting it native.
01:11:17.000 I can understand that.
01:11:18.000 Just put the whole article on Twitter, or just put this on Twitter.
01:11:22.000 Yeah, and it is a better experience typically to use the native functions of the app.
01:11:25.000 It's smoother.
01:11:26.000 It looks better.
01:11:27.000 But it's so easy to just say, you know, if I find something interesting on YouTube, oftentimes I will just say, oh, that's a fascinating video.
01:11:37.000 Let me just post that real quick.
01:11:38.000 And then the problem with that sometimes is it posts it with the...
01:11:44.000 Clickbait headline, you know, and then people think it's yours, that you're saying that, you know, libs get owned, you know, like that kind of shit.
01:11:51.000 Yeah, one word, two of the words in caps.
01:11:53.000 Yeah, so now I'm like, ah, fuck.
01:11:55.000 Maybe I should, like, post it, but put my own thing on it instead of...
01:12:00.000 Right, but if you want to share a video from YouTube, you don't want to have to download the video and then upload it.
01:12:06.000 First of all, you can't do that.
01:12:08.000 You're not supposed to do that.
01:12:09.000 The only way to do that is to screen capture it or get an app that downloads.
01:12:16.000 Yeah, so all the sites do that.
01:12:19.000 I mean, when we were first starting, because we have over a million followers on our Facebook pages, and we would drive crazy traffic back in the day.
01:12:29.000 Just posting viral videos and cool articles, and we had journalists on, and we would post their stuff.
01:12:35.000 So much traffic.
01:12:39.000 Millions and millions of users a month hitting the site.
01:12:42.000 And then Facebook was just like...
01:12:44.000 And then, boom.
01:12:47.000 But we always knew that we didn't want to be...
01:12:49.000 It was a nice-to-have thing.
01:12:51.000 The reason we do what we do is because we didn't want to be relying on them.
01:12:55.000 But it just goes to show how much power they have.
01:12:58.000 I mean, they can literally wipe out jobs and people's livelihoods and companies just overnight.
01:13:04.000 That's what happens.
01:13:06.000 If you're not favored and you don't play the algorithm game. It's so sickening, this worrying about the algorithm.
01:13:14.000 It's like people just worship this thing and you kind of have to for survival because you're trying to succeed, but then what are you really spending your time doing?
01:13:22.000 It's an interesting discussion in the stand-up comedy community because a lot of comics are trying to game the algorithm.
01:13:31.000 And you hear these discussions.
01:13:32.000 I was talking to a friend of mine that was telling me about these comics at the cellar that were having this conversation where they were trying to figure out, like, this is what you do to get the algorithm.
01:13:42.000 This is what you do to go viral.
01:13:44.000 This is what you do.
01:13:45.000 This comic, who's like an established comic, was like, this is fucking gross.
01:13:49.000 I don't want to have any part of this.
01:13:50.000 Just make the best shit you can make.
01:13:54.000 Don't do this.
01:13:55.000 And so then you see people that become sort of captured by this idea.
01:14:01.000 There's some guys that do it where it's an art form, like Mr. Beast.
01:14:05.000 He's figured out how to do it in a way that's really kind of fascinating.
01:14:09.000 Because he really knows what words to use, what images to use, and that there's an actual science to it.
01:14:16.000 Oh, yeah.
01:14:17.000 He is a weird and interesting guy.
01:14:20.000 Like, watching his interview with Lex, the way that his brain works, it's like everything is being processed through how does this play to the algorithm.
01:14:31.000 Yeah.
01:14:32.000 And it slightly unnerved me a little bit.
01:14:36.000 I think that, you know, he just seemed like...
01:14:39.000 I got this sense...
01:14:41.000 And he seems like a great guy and, like, super smart and, you know, obviously.
01:14:45.000 But...
01:14:46.000 It seems like he felt like, oh, I don't even necessarily know if this is worth my time to even be doing this interview right now.
01:14:54.000 Because I don't know how many views it's going to get.
01:14:57.000 And I don't know how it's going to play.
01:15:00.000 Because his time is so...
01:15:02.000 He knows how much he can make every second of the day, spending on a video.
01:15:07.000 So if he's going to go do an interview with another smaller YouTuber, that's impacting his...
01:15:16.000 I think that's dangerous to let that control the content that you create.
01:15:22.000 It's backwards.
01:15:23.000 Like the comic was saying, do the thing that you do as best as you can possibly do it.
01:15:30.000 Don't make the algorithm...
01:15:32.000 If the algorithm is your thing, if that's what you want to be the passion of your...
01:15:38.000 I guess maybe some people do love it, but it seems a little bit inverted.
01:15:43.000 It's a little bit inverted, but that's also...
01:15:46.000 I mean, he does so much good.
01:15:49.000 And it's so interesting to watch him do the thing.
01:15:52.000 I mean, I get it because, like, his content is fantastic.
01:15:56.000 So what he's trying to do is maximize the reach of his content.
01:16:01.000 And so it's just clever.
01:16:04.000 Oh, yeah.
01:16:05.000 The blind...
01:16:06.000 Curing the blind people...
01:16:07.000 I mean, that was...
01:16:08.000 We need more of that.
01:16:09.000 What's wild is he got hate for that.
01:16:11.000 Right.
01:16:12.000 What was the hate specifically?
01:16:13.000 Who gives a fuck?
01:16:15.000 But it's just funny.
01:16:16.000 Because you cannot do anything that won't piss someone off.
01:16:21.000 Because people are engaged in recreational outrage.
01:16:24.000 And that's what they're doing.
01:16:26.000 And they're trying to figure out an angle.
01:16:28.000 Oh, this rich guy's doing this thing.
01:16:31.000 And really, he shouldn't even be rich.
01:16:33.000 No one should be rich.
01:16:36.000 Oh, you're just trying to do this to make yourself look good.
01:16:39.000 There's all these angles.
01:16:41.000 That people will take on things, which is their prerogative.
01:16:45.000 And in a world of free speech, the beauty of what is the nature of the First Amendment is that you should be able to express yourself.
01:16:52.000 And so I fully support those cunts to rag on him.
01:16:57.000 How many blind people have you cured?
01:16:59.000 I mean...
01:17:00.000 I mean, they're allowed to have their perspective on it.
01:17:03.000 And yeah, I mean, obviously that's the take.
01:17:05.000 Like, you're not doing anything, so shut the fuck up.
01:17:07.000 But people who aren't doing anything are also allowed to chime in on stuff.
01:17:11.000 And they can look petty, and they can look foolish, or they can have really good points.
01:17:16.000 You know, and that makes you... But that's the beauty of what we have today. There's so much dumb shit involved in social media, and there's so much bickering and hate, and there's so many people that are addicted to it. It's elevated their anxiety level, and they're all on medication now because they're fucking tweeting 12 hours a day. There's a lot of that going on. But if you can figure out how to manage yourself and manage it... Now we have access to information at an unprecedented level,
01:17:46.000 where something like the Restrict Act gets picked apart by brilliant people on Twitter, on Facebook, on Instagram, on everything.
01:17:56.000 And that's so valuable.
01:17:59.000 So we have to figure out a way to preserve that.
01:18:01.000 And to have this kind of thing today... It has never existed in human history.
01:18:09.000 Never.
01:18:10.000 Not one time has there been a time in human history where a person could tweet about a thing and it could be shared by millions of people and all of a sudden the conversation about this subject changes.
01:18:23.000 So you have a public narrative that's being pushed forth by these propagandists, and then someone comes along and says, actually, this is what's really going on.
01:18:32.000 And Twitter will fact check, and then people will chime in.
01:18:35.000 And this is a beautiful thing.
01:18:37.000 I mean, it's really an amazing thing.
01:18:38.000 So with all the bad that comes with social media and all the weird shit that it's doing to kids, it's just not good.
01:18:46.000 Yeah, the government is now using it for, you know, propaganda purposes as well.
01:18:53.000 Everyone's playing the game now.
01:18:54.000 The government's in it.
01:18:55.000 I mean, the Twitter files by Lee Fang exposed how the government was, you know, pushing propaganda.
01:19:04.000 So it's not only about taking down but also pushing out.
01:19:08.000 Yeah.
01:19:08.000 So, you know, they're in it.
01:19:10.000 They're in it not just that way.
01:19:12.000 I guarantee you they're in it the same way Russia's in it and these Eastern Bloc countries that have troll farms.
01:19:18.000 The idea that the United States is not engaged in something like that seems to me to be pretty ridiculous.
01:19:24.000 I'm sure they are.
01:19:24.000 Oh, yeah, they are.
01:19:25.000 I mean, it's on record now.
01:19:28.000 When you see, like, someone will post something, and then you'll see someone has a very specific response. And then if you take that specific response and Google it, you'll see hundreds of verbatim exact responses from people that look like real people.
01:19:45.000 You go to their site, there's a picture of them smiling.
01:19:48.000 There might even be a picture of them with some fucking AI-generated kids.
01:19:52.000 It's really weird when you look at someone's feed.
01:19:56.000 I've looked at someone posting something controversial, and then looked at someone who has what seems to be, like, well, a suspicious take.
01:20:05.000 And then I'll go to their page and it's all suspicious takes.
01:20:09.000 And occasionally there's retweets of things that go along with the narrative that they're posting, but it seems very calculated.
01:20:16.000 And then you realize this person has 30 followers.
01:20:19.000 This is not even a real person.
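The tell described here, the same wording appearing verbatim across many unrelated accounts, is straightforward to check for mechanically. A toy sketch, with an arbitrary threshold:

```typescript
// Group replies by normalized text and flag any text posted by
// suspiciously many distinct accounts -- the copy-paste pattern above.
function findCopypastaReplies(
  replies: { account: string; text: string }[],
  minAccounts = 20 // arbitrary cutoff for illustration
): Map<string, Set<string>> {
  const byText = new Map<string, Set<string>>();
  for (const r of replies) {
    const key = r.text.trim().toLowerCase(); // light normalization
    if (!byText.has(key)) byText.set(key, new Set());
    byText.get(key)!.add(r.account);
  }
  return new Map([...byText].filter(([, accounts]) => accounts.size >= minAccounts));
}
```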
01:20:21.000 Yeah.
01:20:21.000 Human detection is so key now.
01:20:24.000 Because armies of bots, AI bots, fully autonomous, are coming.
01:20:29.000 They're mostly here.
01:20:31.000 And they're not going away.
01:20:33.000 I mean, there's going to be mastermind sort of engineers who have these armies that they control.
01:20:40.000 And it's actually entertaining.
01:20:45.000 The AI JRE, you know, that's getting hundreds of thousands of views.
01:20:48.000 I mean, that's like, they're blowing up.
01:20:51.000 So that's a whole thing for that person.
01:20:53.000 And there's going to be value and there's going to be negative stuff that comes from it.
01:20:58.000 But, you know, I just love being able to watch it play out and just everyone's at war, mainstream media, at war with independent journalists.
01:21:07.000 I mean...
01:21:07.000 Where do you think it goes?
01:21:09.000 Have you ever tried to extrapolate?
01:21:11.000 Do you ever try to look at it from where we're standing now with the chat GPT influence, the AI influence, all this different stuff, all the deepfakes?
01:21:20.000 Do you ever wonder how does this play out?
01:21:25.000 I think that it's similar to what you were saying before, just like the transparency is just increasing so drastically.
01:21:33.000 Even just thinking about how transparent everything is now compared to 20 years ago and what we're seeing play out real time.
01:21:42.000 Seymour Hersh, who's a Pulitzer Prize winning journalist, at war with the New York Times over Nord Stream right now.
01:21:50.000 And that's just happening in front of us.
01:21:52.000 Like, the U.S. government is denying that we did Nord Stream.
01:21:56.000 Seymour Hersh thinks otherwise.
01:21:59.000 And, you know, that is just...
01:22:01.000 What's the New York Times' take on it?
01:22:02.000 Oh, man.
01:22:03.000 If we can bring this up, I don't want to abuse Jamie's powers, but on Easter, their post on the Nord Stream situation was so egregious.
01:22:17.000 They said it may be best that the truth is not revealed about this.
01:22:25.000 Why?
01:22:26.000 I mean, because they probably think nuclear warfare is on the table.
01:22:33.000 I mean, it would be a false flag.
01:22:35.000 Look at this.
01:22:52.000 What?
01:22:52.000 That's wild.
01:22:54.000 I mean, you literally are in the business of revealing more.
01:22:58.000 I mean, that's what you're supposed to be doing.
01:23:00.000 If the United States government is engaging in something that is potentially dangerous to the human civilization because we can start a fucking nuclear war because of this, if you don't report on that, that's going to allow more of that shit to take place.
01:23:17.000 I can't believe those words were printed and they're still there.
01:23:20.000 How can that be where we're at?
01:23:24.000 In terms of where it's going, honestly, I think that we are on the precipice of a whole new paradigm and level of access to information that is just going to be like a total shift in humanity.
01:23:41.000 I think we're getting closer.
01:23:43.000 And I mean that with regards to classified information coming out, corporate secret information coming out.
01:23:51.000 It's happening.
01:23:52.000 I mean, what we're seeing from the Twitter files, which Elon just...
01:23:56.000 Just did.
01:23:57.000 He became a whistleblower on himself.
01:23:59.000 Yeah.
01:23:59.000 And, I mean, the amount of information that that gave us and they were denying it for years and gaslighting everybody.
01:24:08.000 It's so egregious and dark.
01:24:11.000 It's so dark and they don't have any punishment on the table.
01:24:15.000 Which is really crazy.
01:24:17.000 You know, when you saw Vijaya, like, testifying, and she had to kind of admit things that they've said in the past that were not true, it's really fascinating to watch that.
01:24:27.000 Because the ramifications of that, they're so—it's so dangerous.
01:24:33.000 It's so dangerous to limit the truth and deny the truth.
01:24:38.000 And when you're something like a social media network that is basically the town square for the world, and you're doing that, and you're doing that based on your ideology, and you're doing that based on input from intelligence communities, it's
01:25:22.000 social engineering.
01:25:22.000 Yeah.
01:25:22.000 We need that transparency to kind of happen and for us to fix it.
01:25:26.000 The problem is the idea is that we're in competition with China and China does not allow that.
01:25:32.000 So we will hinder ourselves and our ability to compete with them.
01:25:36.000 That's the argument.
01:25:37.000 And it's not a good argument, right?
01:25:39.000 Because this country has a very robust belief in free speech.
01:25:48.000 It's one of the reasons why, you know, dissent is tolerated.
01:25:52.000 It's one of the reasons why we can get away with what we get away with.
01:25:55.000 It's like we have protection in this country, and it's really the only country that has that.
01:25:59.000 And the idea that in order to compete with China, we have to become China is fucking gross.
01:26:05.000 It's not true, I don't think.
01:26:07.000 I don't think it is either.
01:26:08.000 I think the competition that we have because of free enterprise, free speech, the reason we have the best businesses in the world is because of that.
01:26:16.000 It wouldn't be allowed to happen in other countries.
01:26:18.000 They just scoop them up and then distort the vision.
01:26:23.000 It's a national security risk to not be transparent with the American people.
01:26:29.000 You know, all this banning crypto, the Restrict Act, all of this stuff is devastating for innovation.
01:26:36.000 I mean, that's the edge of innovation.
01:26:40.000 I mean, these lawmakers don't understand the technology.
01:26:47.000 It's hard to understand.
01:26:48.000 It's hard to understand.
01:26:49.000 They don't even care to understand it.
01:26:51.000 They're not even reading the stuff.
01:26:52.000 They're just saying whatever they think their constituents want to hear or whatever the public narrative is or whatever the government is pushing.
01:26:59.000 And they just sort of like pile on and repeat the propaganda.
01:27:04.000 I mean, rather than banning Bitcoin, the US government should be stockpiling Bitcoin right now.
01:27:13.000 It's a national security risk to not do that.
01:27:16.000 Because what if all of the other countries do that?
01:27:19.000 They're all moving.
01:27:20.000 De-dollarization is occurring.
01:27:22.000 You know, Russia, China, they're all starting to trade, you know, outside of the petrodollar.
01:27:27.000 Yeah.
01:27:28.000 And if Bitcoin adoption is occurring in these other countries, I mean, even as a hedge, you know, our country should be stockpiling.
01:27:39.000 What if all these other countries do it and it blows up and becomes the global reserve currency, and then the U.S. is just out of luck because we didn't participate, because we were trying to be, you know, too heavy-handed with it?
01:27:51.000 It's not that, like, Bitcoin and the dollar can't coexist. They can and should coexist.
01:27:56.000 Like, it's not necessarily one or the other.
01:28:00.000 But, I mean...
01:28:02.000 Yeah, we just need a new wave of politicians who are going to open up.
01:28:09.000 Is that going to happen?
01:28:10.000 I don't know.
01:28:11.000 I doubt it.
01:28:11.000 I have to take a leak.
01:28:12.000 Let's come back.
01:28:14.000 So this is kind of wild.
01:28:15.000 Jeremy Corbell just sent me this.
01:28:17.000 He said he asked the project manager for the government's largest UFO program if our government has in its possession downed flying saucers.
01:28:28.000 He said for the first time being interviewed on camera about this, he answered publicly and the answer was yes.
01:28:35.000 Oh shit.
01:28:36.000 What the fuck?
01:28:37.000 Are we back?
01:28:38.000 Yeah, we're back.
01:28:38.000 Here we go.
01:28:39.000 What does that mean?
01:28:41.000 I just sent it to you, Jamie.
01:28:43.000 Let's watch it.
01:28:43.000 Just see what your take on this.
01:28:45.000 Because I know that you're fascinated by the subject, too.
01:28:50.000 This is a bizarre subject when it comes to disclosure.
01:28:55.000 Let's hear this.
01:28:57.000 I don't hear anything.
01:29:01.000 Maybe it's nothing, you know, but what...
01:29:04.000 What do you know about the phenomenon, about UFOs?
01:29:08.000 What can you say for sure?
01:29:10.000 Maybe it's nothing, but what do we know about UFOs and where does this go?
01:29:15.000 As I said, what we know is what we gleaned from all of the data that has been discussed.
01:29:23.000 UFOs are real?
01:29:26.000 Yes, UFOs are technologically sophisticated.
01:29:32.000 They have performance characteristics.
01:29:35.000 The five observables are sort of well documented.
01:29:40.000 But they also have very profound effects on some people, or they have superficial effects on other people, but they do have effects on people.
01:29:51.000 So going forward is to combine both of those, is to study UFO performance, and, you know, the hope is that out of UFO performance can come theoretical physics,
01:30:08.000 which will eventually translate into engineering.
01:30:11.000 Does our government have downed UFOs from unknown origin that they've been trying and are trying to reverse engineer and exploit those technologies to understand the physics and understand that technology?
01:30:22.000 Do we have that to work with?
01:30:25.000 I can't talk about that, but the answer is yes.
01:30:29.000 So this is from Weaponized, Jeremy Corbell and George Knapp's podcast.
01:30:35.000 Just that answer alone.
01:30:37.000 So who is Com?
01:30:38.000 Who is he?
01:30:39.000 Well, he is, according to Jeremy, he is a program manager for the government's largest UFO program.
01:30:51.000 Which, is that AARO?
01:30:52.000 I don't know.
01:30:53.000 It might be.
01:30:55.000 Jamie could find that out.
01:30:57.000 Yeah.
01:30:58.000 I mean, the type of whistleblower protections that are available, like, I hope we can expand that so it's not just for UFO-related stuff, but also for surveillance, to protect the Snowdens and Assanges of the world.
01:31:13.000 Like we need we need that.
01:31:15.000 And that's why I think that this office is going to be successful.
01:31:19.000 It's because, like, the people who are doing the black projects...
01:31:24.000 We need to have reassurance that they're not going to be screwed after coming forward.
01:31:30.000 And so, you know, this type of legislation is so key.
01:31:35.000 I had an exchange with Mellon recently, and he actually told me to mention this.
01:31:42.000 He was saying that Lazar should come forward through AARO.
01:31:47.000 No.
01:31:48.000 Sorry.
01:31:49.000 So, because why not?
01:31:52.000 I mean, if he was involved, then he could participate in that organization.
01:31:58.000 And put it through, because that's the channel that we have now.
01:32:03.000 The idea that someone is going to come out and say, we have downed UFOs, but we can't talk about it.
01:32:11.000 What does that mean?
01:32:12.000 Do you have something that was unidentified that is from some other state, some other government that you found and it's unidentified?
01:32:23.000 Or are you saying you have something from another planet?
01:32:26.000 There's two very different answers.
01:32:28.000 Right, because the Chinese balloons are included in this as well.
01:32:33.000 So if that gets downed, you know, that constitutes crash materials.
01:32:37.000 Right, like, what is it? It's unidentified.
01:32:40.000 What does that mean?
01:32:40.000 But are you saying that you have something that you're back engineering from another country?
01:32:46.000 Or are you saying you have something that is of some completely unknown origin, like outside of this world?
01:32:54.000 Well, I mean, based on the testimony from, you know, whether you're talking about Roswell or, particularly, the one that I've been just loving, deep in the rabbit hole, is the Varginha incident in Brazil.
01:33:07.000 I mean, that is completely mind-blowing.
01:33:11.000 James Fox did an epic documentary about it, Moment of Contact.
01:33:16.000 You know, there are military whistleblowers in his documentary which say that the U.S. flew in and transported – I can't believe I'm about to say this – but bodies and materials.
01:33:28.000 Like, in this documentary – there's aliens running around all over Varginha, Brazil.
01:33:34.000 And these two sisters and their friends saw it.
01:33:38.000 The doctors saw it.
01:33:40.000 They did x-rays on it.
01:33:41.000 Like, they talked to these doctors.
01:33:43.000 They did x-rays on the bodies.
01:33:46.000 And they had to demolish...
01:33:48.000 The whole wing of the hospital where it occurred, because the smell was so bad. And this was reported independently from the guy who saw the crash.
01:34:00.000 Everyone said that it smelled like ammonia and sulfur, and they couldn't get it out of their system for weeks.
01:34:07.000 And so, you know, there's record of the hospital, you know, doing this, like this renovation.
01:34:15.000 And I don't know, the speculation was saying that it's maybe like a skunk or something, like a skunk-type smell.
01:34:22.000 Yeah.
01:34:23.000 Well, no, but that's like a skunk.
01:34:25.000 You know, animals have reactions when they're in a scary environment.
01:34:29.000 And that maybe these aliens have a similar reaction to like a skunk?
01:34:33.000 Yeah.
01:34:34.000 Everyone said that the smell was so horrible that they couldn't get it off their bodies.
01:34:40.000 They had to demolish the whole wing of the hospital.
01:34:45.000 Varginha is very similar to Roswell.
01:34:47.000 When you go to Roswell, it's like UFOs.
01:34:51.000 It's a tourist attraction.
01:34:53.000 You go to Varginha, there's huge UFOs everywhere.
01:34:56.000 It's like the culture.
01:34:57.000 At this point because the whole city was basically locked down.
01:35:01.000 They had like hundreds of people all over the city that saw military checkpoints or know somebody who was directly involved.
01:35:09.000 And like, you know, I approach it obviously agnostic on what is happening, but it's like, you don't have hundreds of people...
01:35:42.000 Yeah, there's lots of evidence.
01:35:46.000 So, all of the...
01:35:49.000 I mean, there are hospital records.
01:35:52.000 There are...
01:35:54.000 You mean, like, in terms of physical?
01:35:58.000 Well, James says that there is, and that he's, you know, on top of it to try to find it.
01:36:05.000 But, I mean, evidence is a spectrum.
01:36:10.000 So, you know, physical evidence, you know, people died.
01:36:14.000 So this guy, Chereze. So they confronted this guy, Eric Lopes, who, with this guy, Chereze, I think Carlos Chereze, they basically were police who, like, captured,
01:36:30.000 apparently, allegedly, one of these things.
01:36:32.000 And they talked to this guy's sister.
01:36:35.000 Who died, like, a week after grabbing this thing.
01:36:39.000 And, you know, she has a death certificate.
01:36:41.000 And, like, the cause of death was unknown.
01:36:44.000 He had this, like, super weird infection.
01:36:47.000 And so, you know, I would consider a death certificate evidence. You know, he was a state worker.
01:36:57.000 And he died for super bizarre reasons.
01:37:01.000 So he touched...
01:37:02.000 He grabbed it and then apparently he got a cut or something, infection, and he died.
01:37:09.000 They confront the other guy who was driving the car, Eric Lopes, and he threatened to kill them when they went on his property.
01:37:14.000 It's so crazy.
01:37:17.000 But it's more so like it seeps into the – when something seeps into the culture that deeply, like it did in Roswell and, you know, like it did in Brazil, I just – I tend to feel like there's something there because of just how overwhelming – you know,
01:37:36.000 so many people saw the craft, so many people were involved.
01:37:40.000 It's just, it's either the most ridiculous hoax of all time, which, why would you even, like, who would do that and why?
01:37:49.000 And so it's like, when you're talking about Occam's razor, you know, the simplest explanation is the most likely answer.
01:37:58.000 I feel like Occam's razor in both cases of Roswell and Varginha is that something actually did happen.
01:38:08.000 Yeah, Roswell has some weird stories about bodies.
01:38:11.000 You know, bodies and little caskets and stuff like that.
01:38:14.000 But it's just like there's so much attention to these things and there's so much hype that I always wonder.
01:38:20.000 Like how much of this is just people feeding off of the narrative and the stories and the fact that this is like an exciting thing to talk about and how much of it is that they want it to be real?
01:38:32.000 I don't think any of these people want it to be real.
01:38:35.000 Half of the witnesses will only go on video with voice modulation and from behind.
01:38:44.000 Isn't that possibly because of ridicule?
01:38:46.000 Yeah, because they don't want to do it.
01:38:51.000 They're just not interested in coming forward.
01:38:55.000 So I feel like if that was...
01:38:58.000 I mean, that certainly is the reason for a lot of people in the UFO world that they can just milk it and they get speaking gigs and they were contacted and, you know, it's a life.
01:39:09.000 Yeah, there's a lot of that, obviously.
01:39:11.000 And that's something that whenever there's a speculative phenomenon like that, it just seems like you're always going to get a bunch of bullshit artists.
01:39:20.000 But, you know, the type of specific evidence that I'm interested in seeing more of is like, okay, so if the military witness claims that, you know, U.S. Air Force came and transported something, like, there are probably records somewhere of these flights.
01:39:37.000 So, like, that's the type of thing that's just inaccessible to us.
01:39:41.000 Yeah.
01:39:41.000 So, but, you know...
01:39:43.000 It probably exists somewhere and you just need a mechanism to dig that up.
01:39:49.000 And so hopefully, hopefully AARO can get down to it.
01:39:53.000 But it's this guy, Sean Kirkpatrick, I want to say, is running it.
01:39:58.000 But, you know, Kirsten Gillibrand was grilling some military officials over the lack of funding.
01:40:08.000 They had requested that it gets far more funding.
01:40:12.000 It's getting like $11 million a year right now, which I think for government ops is somewhat modest for the magnitude of what this really needs to be.
01:40:23.000 But, you know, yeah, there's clips of people in Congress who want this.
01:40:28.000 What do you think is going on?
01:40:32.000 I think it's happening.
01:40:34.000 I do.
01:40:36.000 I also want it to be happening.
01:40:39.000 That's my problem.
01:40:40.000 I recognize that urge in myself.
01:40:43.000 But, I mean, I've been...
01:40:45.000 I have no interest in falling for bullshit.
01:40:50.000 I don't...
01:40:51.000 I mean, I'm looking for holes.
01:40:53.000 I want holes.
01:40:55.000 I don't want to, you know, be...
01:40:59.000 Yeah, that would just be embarrassing.
01:41:01.000 There's so many different cases, and it's becoming legitimized now.
01:41:07.000 I mean, they changed the whole word to UAP because of the stigma.
01:41:10.000 So we're seeing official acknowledgement and progress.
01:41:16.000 Like, government is investigating this now.
01:41:19.000 They're compiling the information.
01:41:21.000 We have the best, you know, who you've had on, the best military witnesses, Air Force pilots.
01:41:26.000 Like, these guys are not fucking around.
01:41:29.000 They're just not.
01:41:31.000 Do you?
01:41:31.000 I mean, certainly you don't think that Ryan Graves or Commander Fravor are full of it.
01:41:38.000 No, I don't think they are.
01:41:40.000 Ryan Graves doesn't have, like, a physical interaction with them.
01:41:44.000 He basically, what he said was, they upgraded their equipment and then immediately they started seeing things that are defying what they understand to be the laws of physics and currently known methods of propulsion, and the way these things are able to operate and stay stationary in 120 mile an hour winds, and things along those lines.
01:42:03.000 He was pretty convinced that this is something outside of our realm of understanding.
01:42:09.000 And then there's the Commander Fravor instance, which is very bizarre because it's multiple witnesses, video evidence, the equipment that they use to detect it.
01:42:21.000 All of it rings true.
01:42:23.000 It all works correctly in that this thing was able to go from 50,000 plus feet above sea level to 50 feet in less than a second.
01:42:33.000 They don't know what the fuck it is.
01:42:35.000 They don't know why it behaved the way it behaved, the fact that it was blocking their radar systems, the fact that this thing… It knew where they were going to go.
01:42:44.000 Yeah, it went to their CAP point, which is their predetermined location where they were supposed to meet up later.
01:42:50.000 It went immediately to that.
01:42:52.000 The fact that this thing operates with no visible method of propulsion and that it moves at these insane speeds, it would turn human beings into jelly if they were inside of it.
01:43:04.000 If there was a biological entity inside of that.
01:43:06.000 Which makes you think, like, okay, are these drones?
01:43:09.000 Are these, like, super sophisticated drones that some black ops project's been working on for a long time?
01:43:16.000 Mellon thinks that it's a post-biological probe, and that the gray, you know, the traditional kind of gray, because that's how they were described almost exactly in Varginha, the drawings of it, the renditions of what the witness saw look just like the gray,
01:43:32.000 except it was brown with red eyes.
01:43:35.000 Which is a little bit strange.
01:43:37.000 You also have to think, why would we assume that there's only one version of this thing?
01:43:43.000 If we look at the cosmos, why would this gray alien with the black eyes be the only one that exists? Wouldn't there be some sort of parallel evolution?
01:43:55.000 I mean, my take on where human beings are headed seems that we're headed into some sort of an integration with technology.
01:44:02.000 It's already integrated into our lives to the point where it's inescapable.
01:44:07.000 And then what if it becomes physically integrated?
01:44:10.000 And what if, when we're looking at declining sperm counts, human beings are becoming more feeble and weaker, and there's all this weirdness with gender in our culture.
01:44:29.000 And as technology advances, this obsession with gender and the lack of gender and gender being a social construct, and the decrease in testosterone and penis sizes, and actually, didn't they say penis sizes are going off?
01:44:42.000 Isn't it kind of weird that the transhuman – you kind of have the transhuman movement but then also the transgender?
01:44:48.000 Yeah.
01:44:49.000 It's like both sort of this divergence.
01:44:53.000 Well, one of my concerns is that it's just like a decaying of biological relevancy.
01:44:59.000 And that we're eventually going to succumb to this thing and we're going to become a part of it.
01:45:05.000 And that's what we're seeing when we see these essentially genderless, muscle-less creatures with enormous heads.
01:45:12.000 That that becomes what all primates eventually evolved to.
01:45:18.000 And that we're thinking of technology like it's not life.
01:45:22.000 And that maybe it is life.
01:45:24.000 And maybe it's just a different kind of life.
01:45:26.000 And we're creating this kind of life.
01:45:28.000 And we will eventually be that kind of life.
01:45:30.000 Yeah, I think that within the transhuman path, there's multiple branches.
01:45:35.000 So it's not as if it's all sort of this degradation.
01:45:42.000 Though I typically...
01:45:44.000 Like, I'm in no rush to integrate, you know, Neuralink.
01:45:48.000 But...
01:46:01.000 Yeah.
01:46:03.000 Yeah.
01:46:12.000 And then the thing is, if it does do what Elon thinks it's going to be able to do, which is radically alter your access to information and change your ability to process information, it's going to give the people that adopt it a significant advantage.
01:46:28.000 Not just a significant advantage, but an almost insurmountable advantage without it.
01:46:33.000 And then everyone's going to do it, just like how everyone wears clothes.
01:46:37.000 Someone invented clothes because it's way better.
01:46:39.000 You can survive outside, you know, with the fucking down parka on and wool undergarments, and you could live in a way that you could never live without it.
01:46:49.000 So it's much more sustainable to use clothes.
01:46:54.000 So everybody eventually put on clothes.
01:46:56.000 I mean, clothes are a form of technology.
01:46:59.000 We were all wearing them.
01:47:01.000 You can't go anywhere without seeing people in clothes.
01:47:04.000 And that sort of, like, it is an invention.
01:47:10.000 And we don't think of it that way.
01:47:12.000 We just think of it as clothes.
01:47:13.000 But it's a method that we have devised in order to walk on sharp surfaces and in inclement weather.
01:47:21.000 And we protect ourselves physically and biologically from that.
01:47:26.000 If that is just one step in the human's invention that sort of removes us from the biological limitations that we currently have, that's going to keep going.
01:47:38.000 And it's going to keep going and the end point seems to be integration.
01:47:44.000 Yeah, clothes as technology.
01:47:46.000 That's so important for people to recognize because we don't consider them the same for the most part.
01:47:54.000 So do you know the first use of the word computer?
01:47:59.000 Guess.
01:48:02.000 What year?
01:48:04.000 What do you want me to guess?
01:48:06.000 I'm going to go crazy.
01:48:07.000 Let's say 1500. Pretty good.
01:48:09.000 1613. Oh.
01:48:11.000 And it was to refer to a human who conducts computations.
01:48:19.000 Oh, it's a computer.
01:48:20.000 A human was the first meaning of the word computer.
01:48:22.000 Somebody who's just doing math.
01:48:24.000 And then it just escaped from there.
01:48:29.000 Yeah.
01:48:30.000 So we are computers.
01:48:32.000 And, you know, there's going to be a fork in the road where, see, like, I just would...
01:48:38.000 Neuralink needs to be open source.
01:48:40.000 I don't know about, like, putting something in my brain that can just, like, switch me off, unless I can at least unleash some computer scientists on it to audit it, make sure, like, okay, is this going to do anything to me?
01:48:54.000 I mean...
01:48:56.000 That's the thought process that I go through when I ask, like, would I integrate?
01:49:00.000 I would probably only do it if I had to.
01:49:04.000 Do you think there's going to come a time when you have to?
01:49:07.000 Or someone of your generation will have to?
01:49:10.000 Well, have to.
01:49:11.000 It better not be have to.
01:49:12.000 I don't want to say, like, even physically forced, but forced in order to compete and to exist, like phones.
01:49:18.000 You don't have to have a phone.
01:49:20.000 You don't have to have a phone.
01:49:22.000 You could be that dude who lives in the woods and, you know, chops down trees.
01:49:26.000 But do you think that that...
01:49:29.000 is going to be the same decision?
01:49:30.000 I feel like a lot more people are going to resist biological integration.
01:49:36.000 Even though phones, you know, they're on our bodies.
01:49:39.000 They are biological extensions, sort of.
01:49:42.000 But, you know, they're actually having biological impact.
01:49:45.000 You know, your foot, your leg vibrates.
01:49:47.000 Like, there's definitely energy exchange.
01:49:50.000 But...
01:49:52.000 I don't know.
01:49:53.000 I mean, I don't envision myself just like buckling to some cultural trend.
01:49:58.000 I don't think it's necessarily just a cultural trend.
01:50:01.000 I mean, you really, if you want to compete in the workforce and you don't have a smartphone, you're at a significant disadvantage.
01:50:12.000 But you don't have to physically integrate to get access to that.
01:50:18.000 Well, yeah, you probably do, to a certain degree.
01:50:20.000 If you're going to be able to do the stuff that Neuralink's potentially capable of, but like, yeah.
01:50:25.000 I mean, but are you saying that you would because of that competition?
01:50:28.000 I'm not saying that I would.
01:50:29.000 I'm saying that everybody would.
01:50:31.000 I think if it gets to a certain point, look, it took a while before cell phones became basically universal and across all cultures.
01:50:40.000 I mean, sure, you can go to some hunter-gatherer tribes and they don't have cell phones.
01:50:44.000 But you don't want to live like that.
01:50:46.000 But it's so much easier to make the decision to get a cell phone than get an operation.
01:50:50.000 Yeah, but what if it becomes a thing that becomes very easy to acquire?
01:50:56.000 The other thing is there's also a haves-and-have-nots aspect to it, because the people that are early adopters, if it is effective and it does work, will have a massive advantage over everyone else.
01:51:10.000 If it really does change the way your mind is able to access information and the way your mind is able to process it. And, you know, imagine having the computational power and the access to information that ChatGPT has,
01:51:25.000 but instantaneously in your mind.
01:51:28.000 That seems to me where it's all headed.
01:51:31.000 And it just doesn't seem like it's going to stop.
01:51:34.000 It seems like everything keeps moving in this general direction of integration.
01:51:38.000 Agreed.
01:51:39.000 Yeah, I don't think it's stoppable at all.
01:51:41.000 And it's more so, how do we guide it so that it isn't just total dystopia?
01:51:47.000 Because there are different versions of this technology.
01:51:50.000 And similar to this whole issue with, like, you know, freedom-based social media versus, you know, big tech.
01:51:58.000 Like, it's the same...
01:52:00.000 You know, there will be super advanced transhuman technology where the creators of it actually want to be as ethical as possible.
01:52:11.000 There's already those AI camps that exist.
01:52:14.000 And so if the AI is getting integrated with the hardware that's coming in, then, you know, We just need to be cognizant of that distinction.
01:52:21.000 Because we don't want to stop it.
01:52:24.000 I mean, it's amazing what's happening.
01:52:27.000 There was an instance with ChatGPT where this guy, his dog was dying.
01:52:33.000 And he was like, what am I going to do?
01:52:36.000 The doctor's telling me there's nothing we can do.
01:52:38.000 He takes the blood test from the doctor.
01:52:40.000 He's like, give me the blood results.
01:52:42.000 Feeds the specific results into ChatGPT and asks it some questions about like what could possibly be going wrong here.
01:52:50.000 You know, give me something to work with.
01:52:52.000 It gives him a couple of options.
01:52:55.000 He then takes that back to the doctor and they found it and the dog's fine.
01:52:59.000 The dog was going to die.
01:53:00.000 The dog was anemic.
01:53:02.000 So what was wrong?
01:53:04.000 I don't remember all of the specifics, but the dog was anemic.
01:53:10.000 And there was something, but he successfully used it to help the doctor diagnose.
01:53:19.000 That's wild.
01:53:19.000 And it actually fixed the dog.
01:53:21.000 So, you know, at this point, I mean, specifically with medical applications.
01:53:28.000 So here it is.
01:53:29.000 Started the dog on the proper treatment.
01:53:31.000 She's made almost a full recovery now.
01:53:33.000 Note that both of these diseases are very common.
01:53:39.000 Babesiosis is the number one tick-borne disease.
01:53:42.000 Oh, it's a tick-borne disease.
01:53:43.000 And IMHA is a common complication of it, especially for this breed.
01:53:48.000 So that looks like an Australian Shepherd, I think.
01:53:52.000 I think.
01:53:52.000 I'm not sure.
01:53:54.000 It says, I don't know why the first vet couldn't make the correct diagnosis, either incompetence or poor management.
01:54:01.000 GPT-3.5 couldn't place the proper diagnosis, but GPT-4 was smart enough to do it.
01:54:07.000 I can't imagine what medical diagnostics will look like 20 years from now.
01:54:12.000 Wow.
01:54:14.000 Will you say that?
01:54:15.000 Pull that a little higher up there.
01:54:17.000 The most impressive part was how well it read and interpreted the blood test results.
01:54:22.000 I simply transcribed the CBC test values from a piece of paper, and it gave a step-by-step explanation and interpretation, along with the reference ranges, which I confirmed were all correct.
01:54:34.000 Wow.
01:54:36.000 That's pretty wild.
01:54:38.000 That's pretty wild.
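For anyone curious what that workflow looks like concretely, here's a minimal sketch using the OpenAI Python client as it existed around GPT-4's release (the v0-style openai.ChatCompletion interface). The CBC values and the prompt wording are invented for illustration, and obviously none of this replaces a vet.

```python
# Sketch of the workflow described above: hand-transcribe the blood panel,
# then ask GPT-4 for possible explanations to bring back to the vet.
import openai

openai.api_key = "sk-..."  # your API key

# Hypothetical CBC values, typed in from the printout
cbc_values = """
RBC: 3.1 M/uL  (reference 5.65-8.87)
HCT: 21%       (reference 37.3-61.7)
HGB: 7.2 g/dL  (reference 13.1-20.5)
"""

response = openai.ChatCompletion.create(
    model="gpt-4",
    messages=[{
        "role": "user",
        "content": (
            "These are my dog's CBC results:\n" + cbc_values +
            "\nThe vet hasn't found a cause. What conditions could explain "
            "this anemia, and what follow-up tests should I ask about?"
        ),
    }],
)
print(response.choices[0].message["content"])
```

The model's value here is as a differential-diagnosis brainstorm to take back to a professional, which is exactly how the story above used it.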
01:54:41.000 So even as a supplemental tool for doctors to be referring to when they're coming up with their own...
01:54:47.000 Yeah.
01:54:48.000 You know, it's a helper.
01:54:49.000 Well, the question is, like, are you going to need doctors?
01:54:52.000 Because it seems like you're going to need surgeons, but are you going to need general practitioners that can disseminate information based on test results when they're not even that good at it?
01:55:01.000 And they have, like, 15 people coming to their office and everybody's got five minutes and...
01:55:06.000 You know, and they have student loans to pay and the insurance to pay.
01:55:10.000 And all they have are eyeballs.
01:55:11.000 They have human eyeballs.
01:55:12.000 And also all they have is what they've absorbed in terms of the amount of research and information they have.
01:55:20.000 And that varies widely between...
01:55:22.000 the doctors, because some doctors are more studied, and some aren't, and some are specialists, and some aren't, and some have, you know, done an incredible amount of work on certain subjects, and some of them are completely ignorant of it.
01:55:33.000 You ever try to talk to a general practitioner about vitamins?
01:55:36.000 They don't.
01:55:37.000 They go, oh, you don't need vitamins?
01:55:38.000 You just need a well-balanced diet.
01:55:40.000 And you look at this fat guy who's telling you this, you're like, the fuck do you know?
01:55:44.000 You don't.
01:55:44.000 You don't.
01:55:45.000 I mean, you're literally talking to a person that hasn't had an education in nutrition.
01:55:49.000 And they're telling people.
01:55:50.000 I've had conversations with doctors like that.
01:55:53.000 I'm like, well, that's pretty ridiculous.
01:55:54.000 What you're saying is patently untrue.
01:55:57.000 Yeah, I mean, it's going to potentially become a liability to go with the human.
01:56:05.000 Because, you know, if you're studying an MRI or something just with your human eyeballs...
01:56:12.000 AI can find the most minute little trace of something that you could so easily miss.
01:56:20.000 And so that's just, I mean, that's going to be a revolution for lifespan.
01:56:27.000 Is ChatGPT-4 now available to anyone?
01:56:30.000 You have to pay.
01:56:31.000 You have to pay.
01:56:31.000 But you can pay and get access to it?
01:56:33.000 Yeah, it's like 20 bucks a month or something.
01:56:34.000 But for anyone?
01:56:35.000 Yep.
01:56:35.000 Oh, wow.
01:56:36.000 They did remove some abilities, features, upgrades recently.
01:56:42.000 It got too busy.
01:56:43.000 Oh, I see.
01:56:44.000 I don't know what that means and how they can do that or...
01:56:46.000 Processing.
01:56:46.000 Turning it on and off.
01:56:47.000 How many people are using it?
01:56:49.000 You do have...
01:56:49.000 I mean, there's occasional just, like, total garbage.
01:56:53.000 It's so confidently spit out, which you just really...
01:56:58.000 It's why it's not ready for prime time, but...
01:57:00.000 What kind?
01:57:01.000 It will, like, I don't know.
01:57:03.000 I asked it about me just to see what it would come up with.
01:57:06.000 And it just came up with this, like...
01:57:08.000 Right-wing piece of shit.
01:57:10.000 Problematic.
01:57:11.000 Anti-American.
01:57:12.000 I mean...
01:57:13.000 Communist.
01:57:13.000 It said that I was, like, founder of...
01:57:16.000 It said a lot of...
01:57:17.000 90% correct.
01:57:18.000 But then it said that I was with this other company that I just never even heard of.
01:57:21.000 Is that because it's getting the information from websites that are...
01:57:26.000 incorrectly saying these things?
01:57:27.000 We don't know.
01:57:28.000 So the system could be gamed in that way.
01:57:30.000 You could put up some sort of, like, factchecker.org bullshit website where you put out a bunch of propaganda, and it sucks that information off the web and uses it, at least to flavor an answer.
01:57:44.000 Yeah, that's a great point.
01:57:45.000 To be honest, I wouldn't be surprised if that becomes much more of a problem.
01:57:49.000 And that's why we need to know what are the data sets that these tools are scooping from.
01:57:56.000 So you're scanning the whole web for all the world's data and images and use of language, but like...
01:58:03.000 You're pulling in from, you know, misinformation.com?
01:58:08.000 Why are you using that?
01:58:11.000 And we don't know what they're using.
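To make the data-provenance point concrete, this is roughly what source-level curation looks like at the pipeline layer: a toy sketch that drops scraped documents from blocklisted domains before they ever reach training. The domain names, the blocklist, and the shape of the corpus records are all hypothetical; real pipelines also deduplicate, score quality, and handle licensing.

```python
# Toy sketch: filter a scraped corpus by source domain before training.
from urllib.parse import urlparse

# Hypothetical blocklist of low-credibility sources
BLOCKLIST = {"misinformation.com", "factchecker-lookalike.org"}

def keep_document(doc: dict) -> bool:
    """Keep a scraped document only if its source domain is not blocklisted."""
    domain = urlparse(doc["url"]).netloc.lower()
    if domain.startswith("www."):
        domain = domain[4:]
    return domain not in BLOCKLIST

corpus = [
    {"url": "https://example.edu/paper", "text": "..."},
    {"url": "https://misinformation.com/story", "text": "..."},
]

training_docs = [d for d in corpus if keep_document(d)]
print(len(training_docs))  # 1 -- the blocklisted source is dropped
```

The hard part isn't the filter, it's that nobody outside the lab can inspect the list, which is the complaint being made here.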
01:58:13.000 Well, that's probably going to be better with ChatGPT 4.5.
01:58:18.000 So what you're dealing with is like a constant improvement upon this resource that has, in a relatively short period of time, revolutionized the way people get access to answers.
01:58:30.000 So the word that is an interesting word which is going to become more talked about is alignment.
01:58:37.000 Have you heard this?
01:58:38.000 No.
01:58:39.000 So alignment is the phrase used in AI research to kind of understand how aligned this technology is with humanity.
01:58:52.000 And it's going to be abused for the same way that we see ChatGPT becoming biased already.
01:58:58.000 I mean, you can ask it to make a joke about...
01:59:01.000 I took this screenshot off of the ChatGPT ad for...
01:59:06.000 Safer and more aligned.
01:59:07.000 Aligned.
01:59:08.000 And I was like, what does that mean?
01:59:08.000 Aligned with how you want to use it.
01:59:12.000 Yeah, it's all about alignment.
01:59:13.000 Announcing GPT-4, a large multimodal model with our best-ever results on capabilities and alignment.
01:59:23.000 Yeah, what is that?
01:59:24.000 So it sounds very similar to creating...
01:59:28.000 I think there's validity to it.
01:59:30.000 We need to be concerned about alignment.
01:59:32.000 We need to be concerned, you know, how is this technology helping or hurting humanity?
01:59:39.000 However, the problem is that different people have very different perceptions of what is good and what is bad for humanity.
01:59:44.000 So, you know, the people at OpenAI are telling it not to make a joke about Biden, but make a joke about Trump.
01:59:53.000 Yeah.
01:59:54.000 They're already kind of creating these rule sets of like what is acceptable and what's not acceptable.
02:00:00.000 You know, Elon comes out, you know, we need TruthGPT, the fully uncensored version, which is going to come out.
02:00:05.000 And then that's going to be chaos because, you know, you'd be able to, oh, how do I, you know, create this virus?
02:00:11.000 Right.
02:00:12.000 How do I make a bomb?
02:00:13.000 Yeah.
02:00:14.000 Yeah.
02:00:14.000 Granted, you can already do that in searching.
02:00:17.000 So, you know, ultimately...
02:00:20.000 Models are going to become more personalized.
02:00:22.000 You would hope that you would have the ability to control that yourself.
02:00:25.000 You know, how censored of a version do I want?
02:00:28.000 Do I want the safer version?
02:00:31.000 But, you know...
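As a thought experiment on what user-controlled moderation could look like, here's a hypothetical sketch of a "safety dial" on the client side. Nothing like this exists as a single knob in current APIs; the scoring function is a stub standing in for a real moderation model, and every name here is made up.

```python
# Hypothetical sketch: the user picks how censored a version they want,
# and the client filters completions against that threshold.
from dataclasses import dataclass

@dataclass
class SafetySettings:
    level: float  # 0.0 = fully uncensored, 1.0 = maximally filtered

def score_harm(text: str) -> float:
    """Stub: a real system would call a moderation model here."""
    return 0.2  # pretend harm score in [0, 1]

def complete(prompt: str, settings: SafetySettings) -> str:
    draft = "model output for: " + prompt  # stand-in for a model call
    # Refuse the draft only if it scores above the user's chosen tolerance
    if score_harm(draft) > 1.0 - settings.level:
        return "[filtered at your chosen safety level]"
    return draft

print(complete("hello", SafetySettings(level=0.5)))
```

The open question raised above is who gets to set the dial: the user, the vendor, or, with something like TruthGPT, nobody.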
02:00:33.000 And then Max Tegmark and his institute came out with this whole thing to pause AI, and Elon signed it. Did you see that?
02:00:46.000 Yes.
02:00:47.000 Which is...
02:00:49.000 Surprising, but also, you know, they're concerned, rightfully.
02:00:54.000 We don't know because it's becoming more autonomous and it's potentially, yeah.
02:01:01.000 So here's something that Jamie just pulled up.
02:01:03.000 Elon Musk reportedly purchases thousands of GPUs for generative AI project at Twitter.
02:01:09.000 Reports say it's a commitment to AI despite signing cautionary AI pause letter.
02:01:16.000 Wow.
02:01:18.000 I don't know exactly what that means.
02:01:19.000 Yeah, so basically they're pursuing their own.
02:01:21.000 So there's some drama involved with Elon and OpenAI because he put $100 million into the original.
02:01:31.000 So they were a non-profit in their first years.
02:01:36.000 And then they realized, okay, we need more money.
02:01:39.000 We need supercomputers.
02:01:41.000 So this is not going to be a cheap endeavor.
02:01:43.000 We need billions of dollars.
02:01:45.000 And so now they have a sort of hybrid model where there's a for-profit entity and a non-profit, and they sort of help each other.
02:01:54.000 And I think there was a Semafor article on how Elon tried to take over OpenAI, which I'm not going to say is the full truth of it.
02:02:04.000 I think we don't know.
02:02:05.000 But so Altman and Elon were working together early on.
02:02:10.000 And then Elon, I think, tried to like take, I don't even know if it was a hostile takeover, but wanted to lead the effort.
02:02:17.000 And it didn't work.
02:02:18.000 People didn't want it to happen.
02:02:20.000 And so, you know, they went off and did their thing, and then Elon left.
02:02:25.000 And I think that, you know, that must be frustrating to him.
02:02:29.000 He put $100 million into OpenAI, which, you would assume, means it's going to be open source and ethical.
02:02:37.000 Yeah.
02:02:37.000 And then they keep on going.
02:02:39.000 They become the biggest app in the world.
02:02:41.000 And, you know, the original investors are kind of like, okay.
02:02:44.000 Yeah.
02:02:45.000 Right.
02:02:46.000 You know, that's not cool.
02:02:48.000 And so, you know, Twitter should start their own.
02:02:53.000 They have some of the best training data in the world.
02:02:56.000 It's just the most accurate real-time language use in humanity.
02:03:00.000 So they're probably going to come out with something cool.
02:03:02.000 And then you've got a bunch of other folks.
02:03:04.000 There's an open alternative to ChatGPT called Colossal, which is decent, but they're still reliant on some small parts of ChatGPT. But yeah, I mean, with what we're seeing,
02:03:20.000 there's more than meets the eye.
02:03:22.000 There's a lot of business drama happening behind the scenes, you know, with the Substack stuff, with the OpenAI stuff.
02:03:28.000 And, you know, I think that I just don't buy into the whole secrecy is going to save us mentality.
02:03:35.000 And I don't see OpenAI saying, okay, at this point, we now believe this to be safe enough to release.
02:03:44.000 I just think that they're going to keep hoarding.
02:03:47.000 And when really they should give us a path to when it's...
02:03:50.000 If we're going to be relying on them, they're getting so powerful so fast.
02:03:54.000 Yeah.
02:03:54.000 They need to give us a path to when it's going to be transparent and also why are you doing all this bias?
02:04:00.000 And I think there should be rev share baked in for humanity.
02:04:05.000 Rather than have the government fund UBI, why not have the billion dollar tech companies that are taking everybody's data give everybody a little rev share?
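As a quick sanity check on what "a little rev share" could mean, here's a back-of-the-envelope sketch. Every number is invented for illustration; the point is the mechanism, not the figures.

```python
# Hypothetical rev-share arithmetic: a fixed slice of model revenue
# split across the users whose data it was trained on.
annual_revenue = 10_000_000_000   # $10B/year, invented
rev_share_rate = 0.10             # 10% earmarked for data contributors
contributors = 1_000_000_000      # 1B people whose data was scraped

payout = annual_revenue * rev_share_rate / contributors
print(f"${payout:.2f} per person per year")  # $1.00
```

Split evenly at that scale it's pocket change, so any serious version would have to weight payouts by how much of your data the model actually used, which is its own unsolved attribution problem.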
02:04:29.000 That's your mindset.
02:04:32.000 Yeah.
02:04:32.000 But to be honest, it's...
02:04:35.000 It's not just me.
02:04:37.000 That is what is going to be smart for them.
02:04:40.000 Elon even is doing monetization on Twitter.
02:04:45.000 They're working on it.
02:04:46.000 They're trying to do encrypted messages, apparently.
02:04:49.000 Ultimately, what's best for the community is, I think, better for the corporation long term, unless they want to fight this war, which they're going to, and maybe they do.
02:05:01.000 Fight this war.
02:05:02.000 This war for, you know, to be the top AI platform.
02:05:05.000 Yeah, I'm sure that's what they're doing.
02:05:07.000 Yeah, I mean, it kind of makes sense, but it's a very good point.
02:05:11.000 Like, where are you getting the data that's allowing you to do this?
02:05:13.000 It's basically everyone's data.
02:05:15.000 It's everyone's data.
02:05:16.000 Yeah.
02:05:16.000 It's just the internet.
02:05:17.000 Yeah.
02:05:18.000 And who owns the internet?
02:05:19.000 Because the internet is an open-source, decentralized protocol.
02:05:23.000 But it seems to me like this is the first rumblings of cyborgs.
02:05:27.000 This is the first steps that is going to force our integration.
02:05:31.000 Because if this does become something you can access instantaneously in your mind with some thing that you put on your head or something you put in your head, I don't see people not taking that.
02:05:48.000 What does that look like in your visual system?
02:05:51.000 So you integrate Neuralink, and it's in there, because we kind of have like, you know, our internal eye, mind's eye, I don't know what it is.
02:06:02.000 Whatever it is we see when we dream, it's something.
02:06:05.000 Our eyes aren't open, but we're seeing.
02:06:07.000 Right.
02:06:08.000 But what would that even feel like to search the internet through your brain?
02:06:15.000 Like, are you seeing something?
02:06:16.000 Do you feel it?
02:06:17.000 I think there's probably going to be multiple versions of it.
02:06:20.000 It's going to get better and better, which is why you, A, don't want to be an early adopter, but maybe you do because those are the people that are going to be able to get access to the second version of this and the third, and it's going to become progressively more and more powerful.
02:06:35.000 I really wonder, when we're talking about aliens, when we're talking about these creatures, maybe that's a natural course of progression for intelligent life. Intelligent life eventually realizes there are limits to biological evolution, but technology allows you to jump-start and bypass all those limitations radically quickly, where the versions that are created by human beings, once AI is able to take over and
02:07:05.000 make better versions of it,
02:07:07.000 it's gonna make a better version of itself.
02:07:08.000 It's gonna continue to evolve to the point where, I mean, you really become something that's completely different than a human being. And it's a question of, you know, do I want to live?
02:07:23.000 You know, that's the question that we're going to get faced with.
02:07:25.000 Like, do you want to keep living or do you want to die?
02:07:29.000 And you got to pick if you're going to integrate or not if you want to live because our bodies just can't handle it.
02:07:36.000 Do you want to be in a Norman Rockwell painting or do you want to be in a Stanley Kubrick movie?
02:07:42.000 Yeah.
02:07:42.000 How do you want to live?
02:07:44.000 And like for what you were saying about, you know, where is the disclosure process taking us?
02:07:52.000 I think that there is real possibility for social unrest and the global economy.
02:08:01.000 This is most likely the reason that they're taking their time letting information out: because of religion, the global energy economy. I mean, if these things are real, and they're powered by some, you know, propulsion system with a new form of energy that is,
02:08:18.000 you know, near limitless or whatever it is, then what does that do to the economy?
02:08:25.000 Suddenly, all the largest energy companies in the world are just irrelevant.
02:08:31.000 Right.
02:08:33.000 And then the religious stuff is a whole other level of the equation.
02:08:38.000 But people will have concern.
02:08:42.000 And they should.
02:08:43.000 And they should.
02:08:44.000 Because we might not be people very much longer.
02:08:48.000 But there's a cool book called A.D. After Disclosure, which kind of does a thought experiment about what would happen.
02:08:56.000 So it's a fiction book about what happens if the government comes out and has an official press conference and tells us what's going on.
02:09:03.000 And it's cool.
02:09:05.000 It's by a really smart researcher.
02:09:07.000 How does it play out?
02:09:11.000 I didn't read it.
02:09:13.000 But I love Richard Dolan, who wrote it, and I've watched like...
02:09:18.000 Hundreds of hours of his podcast.
02:09:20.000 He's a great researcher.
02:09:22.000 I need to read it.
02:09:24.000 Maybe I'll listen to it.
02:09:27.000 If you had to speculate as to, like, imagine if there is some large worldwide wholesale disclosure of information, like if invasion is imminent or some sort of undeniable event takes place,
02:09:45.000 how do you think that plays out?
02:09:48.000 But the weird thing is that if Virginia happened, if you're living there and you're seeing that, this is it!
02:09:57.000 Right, but that's a crash.
02:09:59.000 Right?
02:10:00.000 What if it's a landing?
02:10:02.000 Right, like on the White House lawn.
02:10:04.000 Yeah.
02:10:05.000 Yeah.
02:10:05.000 I mean, they have buzzed the White House.
02:10:08.000 Yeah, in the 50s.
02:10:10.000 In the 50s.
02:10:10.000 Yeah.
02:10:11.000 And so, but yeah, there's something going on where they're, you know, in some senses being bold, but they're not being bold enough that it, for some reason, is able to take hold, and all of human society says,
02:10:27.000 okay, we can't not talk about this anymore. Because what'll happen now is there'll be all these, you know, New York Times articles, Pentagon leaks, and it's like, whoa, can we, like, stop and talk about that for five minutes before we just go back to life?
02:10:41.000 Right.
02:10:42.000 And we just keep going back to life and forgetting about that.
02:10:46.000 But don't you think there's like an information overwhelming aspect to this?
02:10:52.000 That it's just overload of data and you're just constantly inundated by new stories and new things and there's so much to talk about and think about that it's very difficult for one thing to stick.
02:11:04.000 Unless it's like a nonsense thing that people get excited about, you know, like this Bud Light nonsense.
02:11:11.000 Like, you know, Dylan Mulvaney is the spokesperson for Bud Light, and now people are shooting Bud Light.
02:11:17.000 And, you know, my take on it has been like, yeah, I think that person's a silly person and an attention whore and a fucking weird person to be a cultural lightning rod.
02:11:27.000 They're obviously just a narcissist, but...
02:11:30.000 Why aren't you freaking out about all these other things that are happening?
02:11:35.000 Why are you not freaking out about the Restrict Act?
02:11:37.000 Why are you not freaking out about what's happening with ChatGPT?
02:11:41.000 Why is everybody freaking out about this dumb thing with Bud Light?
02:11:47.000 Yeah, I mean, there has to be room for paying attention to just silly, crazy drama.
02:11:53.000 It's good to a degree.
02:11:54.000 Why is that primary?
02:11:56.000 I don't know.
02:11:57.000 Do you think it's primary?
02:11:58.000 Yeah.
02:11:58.000 Yeah, I do.
02:11:59.000 I think there's way more people concerned with Bud Light than they are with the Restrict Act.
02:12:04.000 You have to force that into people's minds.
02:12:06.000 I think there's a certain percentage of our population where their minds just gloss over.
02:12:12.000 Like when you start talking about open source, a lot of people are just like...
02:12:18.000 They don't have whatever it is, the software to handle that.
02:12:25.000 Yeah.
02:12:26.000 I don't know.
02:12:27.000 I think it's coming and we just need to stay on it.
02:12:35.000 What else matters?
02:12:37.000 I don't want to die without knowing.
02:12:42.000 Because if that information exists, and we're living in a world right now where there are people in, you know, D.C., you know, elite figures in the government who do know, it's basically multiple civilizations coexisting.
02:12:58.000 There's people with access. Because when you digest certain new breakthrough information, which, you know, humanity has done repeatedly, I think that changes who we are.
02:13:09.000 It changes how we interface with the world.
02:13:11.000 It changes all of the decisions that we make.
02:13:13.000 And so we know that we're living in this almost, like, information caste system where there are people with access and people without.
02:13:24.000 And so that does exist.
02:13:28.000 Whether that includes actual aliens, I don't know.
02:13:33.000 But it exists for...
02:13:36.000 All kinds of issues that absolutely matter to our existence could be dealing with energy technology, could be dealing with corruption, you know, major geopolitical events that would totally change the trajectory of the world.
02:13:53.000 So I'm just like not comfortable not knowing and having these people that I don't know who they are and why do they get to know?
02:14:03.000 Right.
02:14:03.000 It's not cool.
02:14:04.000 No, it's not.
02:14:06.000 But that's always been the story that has taken place.
02:14:12.000 I'm sure you're aware of the Jackie Gleason story.
02:14:18.000 Right.
02:14:19.000 Jackie Gleason was friends with Nixon.
02:14:20.000 They're drinking, and Nixon says, hey, you want to see some fucking crazy shit?
02:14:25.000 And he takes him to whatever Air Force base where they have a downed UFO. And that he has access to see this, and he sees alien bodies, and that this is something that the government has always had.
02:14:39.000 And they've had this thing.
02:14:41.000 And Gleason, you know, there's speculation as to whether or not he really did have that conversation with Nixon, but there's no disputing the fact that he built a fucking house that looked like a UFO after that happened.
02:14:54.000 And that house is, you know, you can look at it.
02:14:56.000 It looks like a flying saucer house.
02:14:58.000 Yeah.
02:14:59.000 And we just don't – I mean the laws that we need are laws for more information.
02:15:07.000 What do you think would happen if human beings absolutely knew that we do have UFOs from another world, that we're back engineering and we're trying to figure it out, that we have been in contact with alien civilizations, we are being monitored?
02:15:22.000 You know, the UFO folklore is, of course, that once we dropped the bombs, all of a sudden they started showing up, which totally makes sense.
02:15:29.000 Like, okay, there's a detection that these people, these beings on this planet, they've entered into this new phase of understanding of technology, and they can harness the atom, and of course they use it as a bomb.
02:15:44.000 Now, they're using it as nuclear power.
02:15:46.000 They're using it in all these different ways.
02:15:47.000 And we have to make sure that they safely transition to the next stage of evolution without complete total destruction, which would send them back to, at the very least, the Stone Age.
02:16:00.000 And then you're dealing with a, you know, tens of thousands of years of, you know, reinventing everything and getting back to the state where they were at now and give them another shot at doing it again.
02:16:11.000 So maybe that's why they showed up and maybe that's why they're monitoring us and that's why they're here.
02:16:19.000 That it's a bridge.
02:16:20.000 And they're just here to make sure that we don't fuck everything up.
02:16:24.000 So that this is a very delicate and precarious process that exists with all intelligent life forms all throughout the universe.
02:16:32.000 And that we're at a stage just like, you know, there's speculation and there's people openly discussing that some primates, lower primates, have entered into the Stone Age.
02:16:45.000 They're using tools.
02:16:47.000 They're using things.
02:16:48.000 And this is very similar to what they think happened to human beings.
02:16:52.000 That over the course of millions of years, they'll eventually become like us.
02:16:57.000 And that this is just a natural process that exists where intelligent life forms learn things slowly at first, and then eventually incredibly rapidly, which is what we're seeing now with AI. It's totally logical.
02:17:11.000 I think that...
02:17:12.000 Totally logical.
02:17:12.000 And the way that we interface with species below us is, you know, sometimes we're fascinated by them and want to learn about them and study, but other times we're like, eh, a squirrel, okay.
02:17:24.000 Right, exactly.
02:17:24.000 And so it's kind of both of those things.
02:17:26.000 Like, you know, people polarize the conversation, they say.
02:17:29.000 Oh, they'd never be interested in us.
02:17:31.000 That's stupid.
02:17:32.000 Of course they would be.
02:17:33.000 Just like we are in everything.
02:17:36.000 But at the same time, they probably also have other cooler things to pay attention to.
02:17:43.000 Maybe there's other way cooler species than us to pay attention to.
02:17:45.000 I'm sure there are.
02:17:47.000 But this is the transition.
02:17:49.000 So, if I was an alien life form, a superior intelligence from some other galaxy or some other part of the universe, I would 100% be concentrating on integrating this new species into the galactic network,
02:18:05.000 if there is one, you know, and then, you know, helping them get to this next stage.
02:18:12.000 I mean, maybe life is fairly rare.
02:18:15.000 Maybe intelligent life is much more rare.
02:18:18.000 And maybe this thing that we're seeing is like these rare nuggets of some very precious thing that exists in the universe.
02:18:28.000 And so they seek it out.
02:18:30.000 And they recognize that this is the thing that brings them to the ultimate next stage.
02:18:37.000 Yeah, it's rare, but also potentially fundamental.
02:18:42.000 Right.
02:18:43.000 So basically consciousness as the core of all things, all matter.
02:18:49.000 You know, that guy you had on, I actually connected with at one point, Philip Goff.
02:18:54.000 He's really interesting.
02:18:56.000 He's a panpsychist.
02:18:58.000 Yes.
02:18:58.000 And those, you know, actually Annaka Harris, Sam Harris's wife, wrote a cool book, which I also haven't read, but I know about it.
02:19:07.000 It's called Conscious, and it's all about panpsychism, which is cool coming from her because she obviously is rigorous.
02:19:15.000 If you're going to be married to Sam Harris, you probably need to not be full of woo.
02:19:20.000 And so she kind of explored it, and I listened to some podcasts with her explaining it. So this is a real thing that's getting taken seriously.
02:19:31.000 It's not necessarily physically provable, but you can't actually deny that it's a possibility that consciousness is sort of the engine of everything.
02:19:44.000 So philosophically, when you're arguing materialism versus fundamental consciousness, the materialists kind of arbitrarily try to point to when consciousness emerges.
02:19:59.000 But how can you pick that point?
02:20:02.000 Because if all matter is evolving and eventually becoming complex enough to have conscious properties, then still, show me the exact time when you consider something conscious.
02:20:17.000 Because there's plants, there's all kinds of microbes.
02:20:21.000 You just keep going back.
02:20:23.000 Is it when it's reactionary to its environment and trying to preserve its own life?
02:20:29.000 Or is it when it's interactive and conscious to the point where it's communicating?
02:20:34.000 Is that when we decide that it's conscious?
02:20:37.000 Is it when it's able to manipulate its environment like we are?
02:20:41.000 I mean, we obviously prioritize our consciousness above orcas because what we're doing to orcas is fucking horrific and accepted in some strange way that, you know, you can go to SeaWorld and watch something that might be as smart as us do tricks for fish.
02:21:01.000 You know, like why are we willing to do that?
02:21:04.000 That's clearly a conscious thing that has a language.
02:21:07.000 And yet we're like, yeah, yeah, yeah.
02:21:09.000 Stay in the pool, bitch.
02:21:12.000 It's weird what we do and what we prioritize as consciousness.
02:21:16.000 We seem to have like a caste system of our own where we decide that we only favor things that are able to manipulate their environment.
02:21:25.000 Yeah.
02:21:26.000 I mean, but that's way beyond, I think, you know, orca.
02:21:30.000 It's like, that's so, maybe to other people, they don't consider orcas conscious, but I feel like that's, like, a certain type of person that really hasn't looked into this at all.
02:21:40.000 Yeah, they're clearly conscious.
02:21:41.000 They're clearly conscious.
02:21:42.000 And then, I mean, but going back to microbe, I mean, I would say that a microbe is, like, obviously conscious.
02:21:49.000 I mean, like, but that's just where I'm at in the conversation.
02:21:52.000 Maybe viruses are conscious as well.
02:21:55.000 Right.
02:21:55.000 I mean, there's obviously evolution in viruses.
02:21:58.000 It's all just degrees.
02:21:59.000 It's very rapid.
02:21:59.000 You know, look what we saw in terms of variants from COVID, like these vaccine-resistant variants that popped up like almost instantaneously, you know, that these viruses recognize where immunity is and they figure out a way to move around that immunity and become more contagious.
02:22:20.000 There's also, you know, the words we use matter, like, life and conscious are a little bit different, you know, because something would be considered alive probably earlier than most people would consider conscious.
02:22:34.000 So some of the panpsychists like to...
02:22:38.000 Talk about experience.
02:22:39.000 You know, is the matter having any sort of an experience?
02:22:45.000 And sometimes that's a little bit more digestible for people.
02:22:49.000 Well, isn't that just an evolved level of consciousness?
02:22:52.000 You know, we have single-celled organisms.
02:22:55.000 We consider them alive.
02:22:56.000 Yeah.
02:22:57.000 But they're not multicellular organisms.
02:23:00.000 I mean, I tend to lean towards it being fundamental just because I can't place a spot on the emergent continuum.
02:23:12.000 I'm not going to pick.
02:23:14.000 You're not going to make me pick.
02:23:16.000 There's not a light bulb that goes off.
02:23:16.000 Right.
02:23:16.000 There's not like a switch that gets hit or a turkey timer that pops out.
02:23:21.000 Boink!
02:23:22.000 You know, like, oh, it's ready.
02:23:23.000 Right.
02:23:24.000 Yeah.
02:23:24.000 And it's the same thing when we look at, you know, a human being born and, like, being in the womb.
02:23:31.000 Like, you know, people arbitrarily saying, oh, this moment, stop.
02:23:35.000 Right.
02:23:35.000 How do you know?
02:23:36.000 You don't know.
02:23:37.000 We're not going to play this, like, number of days game.
02:23:40.000 Right.
02:23:40.000 Like, when does this soul enter into the fetus?
02:23:43.000 And that's why Chris Rock and Louis both have great bits about that, where it's just like, oh, you know, you should be able to kill it.
02:23:53.000 You know, like, kill it!
02:23:56.000 But you're killing!
02:23:58.000 Right.
02:23:58.000 You know?
02:23:59.000 Yeah.
02:23:59.000 Yeah, Bill Burr has a great bit about that as well.
02:24:02.000 Like, I support your right, but I also think you're killing a baby.
02:24:04.000 And you would think that, that just feels like a healthier place to be having the conversation.
02:24:11.000 Well, it's just very uncomfortable for people.
02:24:14.000 Right.
02:24:14.000 And especially people that support a woman's right to choose.
02:24:17.000 They don't want to think of it that way.
02:24:19.000 But it's undeniable.
02:24:21.000 I had a conversation with someone online once, a long time ago, where Richard Dawkins was saying that there's no difference between a human fetus and a pig fetus.
02:24:32.000 I'm like, what are you talking about?
02:24:34.000 That's a ridiculous way to look at it.
02:24:37.000 A human fetus has the potential to become a human being.
02:24:40.000 That could be your neighbor.
02:24:41.000 It could be your best friend.
02:24:42.000 It's going to turn into a person.
02:24:44.000 If someone doesn't intervene or if it doesn't die, it's going to become a person.
02:24:48.000 Pig fetus is going to become a pig.
02:24:50.000 They're very different things.
02:24:52.000 You could argue that a pig is intelligent because they are.
02:24:55.000 They're calculated.
02:24:56.000 They do things.
02:24:57.000 They're smart.
02:24:58.000 They're smarter than dogs.
02:25:00.000 Is your dog conscious?
02:25:01.000 I think my dog's conscious.
02:25:02.000 I have conversations with him.
02:25:04.000 He knows when it's time to eat.
02:25:06.000 He knows what I'm saying when I ask him if he wants to play.
02:25:08.000 He knows.
02:25:10.000 He's fucking conscious.
02:25:11.000 He's just not at the level that a human being is.
02:25:14.000 But that's also because he doesn't have the hardware.
02:25:18.000 Yeah, I mean, with the pro-life, pro-choice debate, it just feels so shallow, similar to the climate change debate, where it's like, obviously human life is precious and we should preserve it as much as is humanly possible.
02:25:33.000 And guess what?
02:25:34.000 Yes, if you choose to have an abortion, yeah, it's killing something.
02:25:38.000 You're killing something.
02:25:39.000 You know, whether or not people should have the legal right to do that in certain circumstances, I'm not someone to answer that.
02:25:45.000 But like, it would be so much more of a healthy conversation if we could agree at least on what we're talking about.
02:25:50.000 Right.
02:25:50.000 Which seems that people are trying to deny that it's life, which is not reality.
02:25:54.000 And with climate change, it's like, who cares if it's, you know, global warming, blah, blah, blah.
02:26:03.000 Do you think pollution is good?
02:26:13.000 Which is less pollution.
02:26:14.000 Yes.
02:26:15.000 But what we're not worrying about is whether that's, like, based on, you know, millions of years, like, who knows?
02:26:19.000 Right.
02:26:21.000 So it's like, I feel like some of these debates are just arguing about the wrong things.
02:26:25.000 And it's just, we need to reset some of these conversations to just find the place where we can agree on one thing and then make progress from there.
02:26:35.000 Because I think that a lot of people who are pro-abortion, like Louis and Chris Rock, who made those bits, which are, you know, kind of crude bits, but they're at least honestly acknowledging what's going on.
02:26:48.000 Yes.
02:26:48.000 Yeah.
02:26:49.000 But that's the purpose that humor does serve in those ways: it makes you laugh at something, and you kind of leave there going, he's got a fucking good point.
02:27:01.000 It was funny, but actually there's some truth in that.
02:27:01.000 And if it didn't have truth in it, no one would be laughing.
02:27:04.000 And that's the test of it all.
02:27:06.000 It's like, does it resonate with that part of your mind that recognizes these inconsistencies?
02:27:12.000 And there's also another problem with both the pro-life, pro-choice and also the climate argument.
02:27:21.000 That they become embedded in ideologies.
02:27:23.000 So they become these dogmatic things that you cannot question.
02:27:26.000 If you're a part of this group and you want to be accepted by this group, there's essentially two groups.
02:27:33.000 There's one group that believes in guns but also believes that babies
02:27:40.000 are sacred.
02:27:41.000 You know, they're in general pro-military and pro-killing bad people, but also they think all life is precious.
02:27:49.000 Protect babies with guns.
02:27:50.000 It's fucking, it's wild, but it's also, you know, if you don't give up your gun, people with guns will take your guns and they'll have all the guns to make sure that you don't have guns.
02:27:59.000 And it's just like, Jesus Christ.
02:28:01.000 It is a thought-virus cult to think that you have to have your identity sort of align with this cluster of beliefs.
02:28:11.000 Everything intersects.
02:28:13.000 I honestly feel bad for people who find themselves in that spot.
02:28:20.000 Well, it's brilliant people.
02:28:22.000 This is the problem.
02:28:23.000 Thomas Sowell had a great point about that, that intellectuals will oftentimes compartmentalize and ignore evidence.
02:28:30.000 We're talking about brilliant people that we rely on to be the voice of reason or the voice of intelligence and fact.
02:28:40.000 When it comes to certain particular arguments, they'll align with only the facts that support their case or support their position.
02:28:49.000 And if it doesn't support their position, they'll conveniently ignore it.
02:28:52.000 And, you know, we see that with almost everything.
02:28:55.000 We saw that very clearly with the discussion about COVID. And whether or not metabolic health is important and whether or not other treatments are important or whether or not you should accept the narrative that there's one binary argument and that there's one answer and that this is the only answer to solve this problem.
02:29:16.000 And clearly that's not the case and not true.
02:29:18.000 It was represented across so many different demographics, so many people that were similar but did different things, had better outcomes.
02:29:28.000 So there's also this weird thing that human beings almost automatically do, is they try to find the most convenient answer and the answer that supports their pre-existing conditions, their pre-existing positions rather.
02:29:46.000 Yeah, it's weird how COVID seems far away now, but history books are going to look at that as, you know, the largest scale psychological experiment on humanity that's ever occurred.
02:30:05.000 I mean, literally billions of people behaving the same way. I mean, you know, it used to be, like, the Stanford Prison Experiment, you know, the elevator experiment. You know that one?
02:30:15.000 Where like people, everyone in the elevator is turned the opposite way of the door, and then people who walk in just turn and face the other wall just because everybody else in the elevator is doing it.
02:30:26.000 I mean, billions of people doing that type of behavior now.
02:30:31.000 Yeah, it's a problem with human beings that we're tribal and we want the respect and the acceptance of our tribe.
02:30:39.000 And if our tribe is, even if it's illogical, is heavily leaning towards one position, we feel absolutely compelled to do what the tribe is doing.
02:30:50.000 And we support each other.
02:30:51.000 We talk about it.
02:30:52.000 We discuss it.
02:30:53.000 And we find rationalizations.
02:30:55.000 It's hard, though, because you can't know everything and you want to be able to trust experts and defer to scientific consensus, which isn't even really, shouldn't be a thing.
02:31:05.000 Well, no, it's not even scientific consensus.
02:31:08.000 It's scientific consensus among people that are willing to accept the proposed narrative from these very corrupt organizations, where you could follow a very clear paper trail of money and influence that led them to these decisions. Like, the lab leak hypothesis is one of the best examples of that, where there are literal emails, actual emails, that show people thinking that it came from a lab, and then there's discussion with other people that don't think it came from the lab, and
02:31:38.000 then there's money that gets exchanged, where they get these grants, and they've changed their position.
02:31:44.000 And it's very weird.
02:31:46.000 And no one is discussing it in any mainstream source where they're saying, hey, we've got a real fucking problem with this and this is the problem, this is how it happened.
02:31:54.000 It takes independent journalists and people that are very brave that get censored, they get removed from YouTube, they get banned from Twitter.
02:32:02.000 And these are the people that came out and had a problem with this.
02:32:05.000 And a lot of them have rock-solid credentials.
02:32:08.000 A lot of them are established doctors and scientists, and they're saying, like, here's the problems, and why aren't we addressing these problems?
02:32:16.000 And these people are getting, you know, these pejoratives labeled on them.
02:32:19.000 Like, they're anti-vaxxers.
02:32:21.000 They're conspiracy theorists.
02:32:22.000 They're fools.
02:32:24.000 You know?
02:32:25.000 It's very strange.
02:32:27.000 But we did watch it.
02:32:30.000 We did witness it.
02:32:31.000 And some people learned from it.
02:32:32.000 And some people developed a new healthy sense of skepticism about public narratives.
02:32:38.000 And other people are just – they're still – they have their heels dug in.
02:32:41.000 Why don't you believe in science?
02:32:43.000 Like what science are you – what are you talking about?
02:32:45.000 Science is data.
02:32:47.000 I don't believe in data.
02:32:48.000 Are you looking at all the data?
02:32:49.000 Because I bet you aren't.
02:32:50.000 Let's discuss some of that data.
02:32:52.000 And then when you see the panic on their face, when they're forced to discuss this data, and they're forced to discuss these inconvenient realities, it's fucking fascinating.
02:33:02.000 Yeah, and it's just another example of...
02:33:04.000 The lack of access to information.
02:33:06.000 Because China is just, you know, keeping it all super tight.
02:33:09.000 You know, because obviously if the pandemic spawned from them, that's going to have geopolitical consequences.
02:33:17.000 Yeah.
02:33:17.000 But it's a perfect example.
02:33:20.000 Like, that just ripped through this planet.
02:33:24.000 And we don't know.
02:33:26.000 What happened?
02:33:27.000 And there are people who know more than we know about what happened.
02:33:31.000 And, you know, they probably have it on lock for national security reasons, all governments.
02:33:37.000 We got it.
02:33:38.000 We can't let this out because, you know, it could cause unrest.
02:33:43.000 Well, how about that fucking New York Times article about Nord Stream?
02:33:46.000 It's the same attitude.
02:33:47.000 Like, maybe it's best that we don't know.
02:33:49.000 Did you see the clip of Biden saying that, oh, we will put an end to Nord Stream?
02:33:55.000 I did not.
02:33:56.000 Oh, my God.
02:33:57.000 So this is in the original Seymour Hersh article, you know, basically how the US destroyed Nord Stream.
02:34:03.000 And there's a clip of Biden, pre-Nord Stream, saying, you know, we're gonna put an end to Nord Stream.
02:34:12.000 That guy must be such a fucking nightmare to those people because he's so compromised, you know.
02:34:20.000 Wait.
02:34:20.000 His mind.
02:34:22.000 A nightmare to his team?
02:34:24.000 Yeah, to the people that don't want that out there.
02:34:26.000 Oh, right.
02:34:27.000 Because he's so mentally compromised.
02:34:30.000 You can't trust him.
02:34:32.000 He gets in front of a microphone.
02:34:34.000 Let's hear what he says.
02:34:38.000 If Russia invades, that means tanks or troops crossing the border of Ukraine again, then there will be no longer a Nord Stream 2. We will bring an end to it.
02:34:54.000 Jesus Christ.
02:34:57.000 How will you do that exactly, since the project and control of the project is within Germany's control?
02:35:07.000 We will, I promise you, we'll be able to do it.
02:35:11.000 Oh my god.
02:35:13.000 What a fucking nightmare he must be.
02:35:16.000 And, you know, from the New York Times end of things, like, they're scared.
02:35:20.000 People are scared.
02:35:20.000 Because if it, I mean, that is an act of war.
02:35:23.000 Yeah, but it seems like what Seymour Hersh said seems to be true.
02:35:27.000 I mean, he has all the documentation.
02:35:29.000 He knows what happened, when it happened, who did it.
02:35:32.000 Yep.
02:35:33.000 And he's Seymour Hersh.
02:35:34.000 He's not a fucking moron.
02:35:36.000 He's not just some random person.
02:35:38.000 He's an 80-plus-year-old Pulitzer Prize-winning journalist.
02:35:42.000 He's one of the best journalists that's ever existed.
02:35:45.000 Exactly.
02:35:46.000 And for this, I mean...
02:35:50.000 It's basically the most recent false flag that has occurred, if it's true.
02:35:57.000 Right.
02:35:58.000 Because that was providing energy to Europe.
02:36:04.000 Right.
02:36:04.000 I mean, that had serious effects on how much people were paying for energy.
02:36:09.000 Serious effects.
02:36:10.000 Serious effects.
02:36:11.000 Yeah.
02:36:12.000 So it's scary.
02:36:13.000 Is it completely destroyed?
02:36:15.000 Yeah.
02:36:16.000 So it's no longer providing...
02:36:19.000 Right.
02:36:19.000 That was the second pipeline, I think.
02:36:21.000 Right, there was two.
02:36:22.000 There's another one that still does.
02:36:23.000 Two.
02:36:24.000 Then what I heard recently was that Saudi Arabia is buying Russia's oil and then distributing it to Europe.
02:36:29.000 Oh, great.
02:36:30.000 So they bypassed it.
02:36:32.000 Right.
02:36:34.000 Right.
02:36:35.000 Smart of Saudi Arabia.
02:36:37.000 I mean, when you have a president that's saying something like that, then it actually happens, and he's like, we don't have anything to do with it.
02:36:42.000 But this is actually one of the things that makes me not believe in aliens, is that unless the current administration is just so in the dark about what is going on on our planet, I would think that if the government was actually in on this and aware,
02:36:59.000 they wouldn't be going around blowing up pipelines.
02:37:02.000 Like, wouldn't...
02:37:05.000 Government relationships with aliens mean that they have some semblance of understanding of how to behave?
02:37:15.000 But do you think that that's the level of understanding and information exchange they have?
02:37:19.000 Or do you think it's more like chimpanzees in the Congo being aware of researchers watching them?
02:37:27.000 Because I would imagine it's more like that.
02:37:29.000 But even still, that is so...
02:37:31.000 If they know...
02:37:32.000 I doubt Biden has been, you know, clued in, if it is real, but...
02:37:37.000 About UFOs?
02:37:38.000 About UFOs.
02:37:39.000 Well, it seems that Clinton, and at least Obama, and Trump, even Trump talked about it.
02:37:46.000 They tried, but it seems like they failed.
02:37:49.000 Based on...
02:37:51.000 Yeah, I think...
02:37:53.000 Trump said that he knows some things, but he can't talk about them.
02:37:56.000 Yeah, and then famously, like, Obama.
02:37:59.000 And, you know, this is the one thing that is still funny about Jimmy Kimmel, which, you know, his level of funniness has degraded.
02:38:06.000 But he grilled Obama and Hillary Clinton about UFOs, which is just a great thing to do, which we can all agree on.
02:38:14.000 But don't you think that that was previously discussed?
02:38:17.000 Because when you do one of those shows, they break down what you're going to talk about.
02:38:21.000 Yeah, but I mean, why would they agree to it?
02:38:24.000 Because they have a patented way of dismissing it and answering.
02:38:28.000 But Obama said, like, I cannot reveal more.
02:38:31.000 I don't know.
02:38:32.000 I think that they're aware.
02:38:33.000 Based on their behavior, it seems they're aware and interested in the subject.
02:38:37.000 You've got Hillary walking around holding the UFO book, which we have on camera.
02:38:43.000 What UFO book?
02:38:45.000 She was photographed holding a well-known book about UFOs.
02:38:50.000 Because Podesta also has a...
02:38:55.000 Yeah, but to get back to would Biden engage...
02:39:00.000 Would we be engaging in all this warfare if we knew aliens were kind of observing us?
02:39:06.000 It just seems...
02:39:07.000 It doesn't make sense.
02:39:09.000 I mean, not that it needs to make sense, but if I were in office and I knew that, you know, there were these craft that, like...
02:39:18.000 were watching us.
02:39:19.000 We didn't really know what was going on.
02:39:20.000 Like, I would be focusing on that and trying to, like, clean up, you know, make nukes safer, clean up the planet, like, do things that...
02:39:28.000 And even go talk to Putin and Xi about what's going on.
02:39:33.000 Yeah, but it doesn't seem like you're ever going to stop this...
02:39:36.000 control of natural resources and this grasp of power that human beings have.
02:39:44.000 And if you're engaged in that, and that's a gigantic part of the economy, of international relations, why wouldn't they continue business as usual, even with the knowledge that UFOs exist?
02:39:56.000 I feel like it's a risk.
02:39:57.000 It would be a risk to our relationship with the ETs.
02:40:01.000 What if we don't have a real relationship with them?
02:40:03.000 Well, but even if we don't, if the behavior that we're exhibiting is just reckless, blowing each other up, destroying the planet, that's not gonna make us look good to them, in which case maybe they would start messing with us.
02:40:16.000 I don't know.
02:40:17.000 It just seems like it doesn't...
02:40:19.000 If there's a higher level of, like, beings that are higher than us, I would not be risking playing these earthly games.
02:40:29.000 Right, but how do you stop this?
02:40:31.000 If the whole purpose of this, like this war machine, is this constant attempt to control resources and consolidation of power, which it seems to be.
02:40:46.000 This is the game that they're playing no matter what.
02:40:48.000 And there's a lot of money involved in this game.
02:40:50.000 And there's a lot of influence in this game.
02:40:52.000 This is the game that the United States plays.
02:40:54.000 But no matter what...
02:40:56.000 They're gonna stop playing this game because there's a Space Daddy out there watching us?
02:40:59.000 I think that the Space Daddy is potentially the one thing that could make us stop.
02:41:05.000 But it hasn't.
02:41:06.000 Right, which makes me think that there is no Space Daddy.
02:41:09.000 It makes me think they don't have a relationship with Space Daddy.
02:41:13.000 This is what I think.
02:41:15.000 I think they're just doing what they've always done and maybe in relation they're just watching all this shit go on and they don't have real answers.
02:41:23.000 I think they're risking our planet. If aliens were going to get hostile, I feel like they would much more likely get hostile to humans who could potentially fly out into the universe with nukes and spread our devastation.
02:41:40.000 But that's very far in the future.
02:41:42.000 We don't really have that capability right now.
02:41:44.000 I would imagine that if they're watching and they're observing that this is business as usual for the human race, if they were going to intervene and step in, they would step in if we employed nuclear weapons and we were really at the risk of destroying ourselves.
02:42:00.000 This is like a little slap in the face on the schoolyard.
02:42:03.000 This is not as horrific as Hiroshima and Nagasaki.
02:42:09.000 But imagine you're in office and you get verification.
02:42:14.000 You get the physical proof that you need to know.
02:42:17.000 And it's super classified.
02:42:20.000 Would that not change your whole reality and view on the world?
02:42:25.000 What do you do with that?
02:42:26.000 Do you go to the military-industrial complex?
02:42:29.000 Do you go to the pharmaceutical industry?
02:42:30.000 Do you go to all these people that are polluting rivers and all the fucking East Palestine thing?
02:42:38.000 Do you say, hey, this can never happen again because UFOs are watching us?
02:42:42.000 Well, I think that you at least hit up world leaders on private channels.
02:42:48.000 And start having the conversation like, hey guys, can we talk about this for a second?
02:42:54.000 Because we're seeing these crafts and it's occurring and we want to be on the same page because probably a good idea to do that.
02:43:05.000 Yeah, but there's no evidence that it's a good idea to do that.
02:43:07.000 There's no evidence that they're doing anything other than watching us.
02:43:10.000 Well, but...
02:43:12.000 Wouldn't it be a good idea for us to coordinate with other players on Earth to understand our strategy?
02:43:19.000 Sure.
02:43:20.000 It would if it got to that.
02:43:23.000 But it hasn't gotten to that.
02:43:24.000 Right now it's gotten to the, oh, there's this thing that flies really fast and goes into the water and doesn't make a splash.
02:43:30.000 The fuck is that?
02:43:31.000 I don't know, but...
02:43:33.000 It's a lot of fucking oil, and we've got to get that oil, and there's a lot of resources.
02:43:36.000 We've got to control those resources, because if we don't, China will.
02:43:40.000 If we don't, Putin will.
02:43:41.000 If we don't, this and that.
02:43:43.000 And we have to maintain our control, and we're the world's leader, and we're this and that.
02:43:48.000 It's just business as usual while this is all happening.
02:44:03.000 In this very weird way where the Pentagon discusses it, but not very specifically, and doesn't give you all the information, and just says, I can't talk about it.
02:44:03.000 But yeah, we do have a crashed UFO. And we have more video that we haven't shared, which we know.
02:44:10.000 Mellon said that.
02:44:11.000 Yes.
02:44:11.000 That they have higher-res stuff that they're specifically not sharing.
02:44:15.000 Mind-blowing stuff.
02:44:17.000 Just the fact, when I first heard that, that they have higher res, like, in my brain, I feel like I changed.
02:44:25.000 Because that felt like another level of validation for this being more of a real thing.
02:44:32.000 Not that I believed in that moment, but I feel like it changed who I was a little bit because it felt more real.
02:44:38.000 My concern is that it's a distraction, that this is all black ops stuff, that these are drones, things we have the capability of utilizing and that we've created, that United States citizens don't need to know about and the world's citizens don't need to know about. And if this is a massive advantage that the people in control of these technologies have over everyone else,
02:45:05.000 wouldn't it be better to say these are from another world and we don't have any idea?
02:45:09.000 I mean, to be honest, I sort of lean towards that option given how they're behaving.
02:45:15.000 Because they're behaving more in a military application mode of this stuff.
02:45:23.000 They're not behaving in a human way.
02:45:26.000 Whenever they say it's something, I'm like, why are you telling me that?
02:45:29.000 Well, you're telling me that because you want me to believe it's something.
02:45:31.000 Like, why would I believe you when you lie about basically everything?
02:45:36.000 If they lie about Nord Stream, they lie about the Gulf of Tonkin, they lie about, you know, all the different things they've lied about.
02:45:42.000 Why would they tell the truth about this all of a sudden?
02:45:46.000 Right.
02:45:46.000 And the whole thing like, oh, well, they wouldn't be messing with our own pilots in restricted airspace.
02:45:53.000 Well, maybe they would do that because that is how you create confusion.
02:45:58.000 But that being said, I don't know.
02:46:00.000 It's just like a bobblehead.
02:46:02.000 I'm just back and forth every day.
02:46:05.000 Well, I think we're basically just mentally masturbating until there's some real information.
02:46:09.000 Well, there is, but there is.
02:46:11.000 But there is.
02:46:12.000 So, then we're back again.
02:46:15.000 But there's so little.
02:46:16.000 There's so little, and it's so inconclusive, and there's so many, you know, statements.
02:46:21.000 Maybe we can teach AI to, like, go and get the information for us.
02:46:24.000 I wonder what ChatGPT says about UFOs.
02:46:29.000 Yeah.
02:46:30.000 And now that it's becoming autonomous...
02:46:33.000 Jamie, you want to give it a run?
02:46:34.000 What would you like me to specifically ask it?
02:46:36.000 Let's say, to your knowledge, are we being visited by beings or crafts from another world?
02:46:47.000 How about that?
02:46:48.000 Okay.
02:46:49.000 Being visited by...
02:46:52.000 Beings or crafts from another world.
02:46:55.000 Dun, dun, dun.
02:46:57.000 It'll probably provide a bullet list of cases.
02:47:01.000 That's what I'm guessing.
02:47:04.000 Here it goes.
02:47:05.000 As an AI language model, I do not have personal beliefs or opinions, but I can provide you with information based on the topic of extraterrestrial visitation.
02:47:12.000 As of my knowledge cutoff date of 2021, there's no conclusive evidence that definitively proves the existence of extraterrestrial life.
02:47:40.000 The search for extraterrestrial intelligence, SETI, is an ongoing scientific effort to detect signs of intelligent life elsewhere in the universe, while initiatives such as Breakthrough Starshot aim to develop technology for interstellar travel.
02:47:54.000 In summary...
02:47:56.000 While there is no concrete evidence of extraterrestrial visitation, the search for extraterrestrial life and the possibility of interstellar travel remain topics of scientific interest and investigation.
02:48:08.000 Oh, that's so weak.
02:48:10.000 Yeah.
02:48:10.000 I mean, they're not even aware of the most recent advances in the U.S. government for how it's being looked into.
02:48:18.000 You know, they're talking about...
02:48:19.000 As of 2021. Oh, 2021. Yeah, so it's...
02:48:22.000 Why is it giving you information from two years ago?
02:48:24.000 When all the stuff really kind of broke.
02:48:27.000 It broke in 2017. That was the New York Times article that claimed that the Pentagon...
02:48:32.000 Right, so that should be there.
02:48:33.000 Yeah.
02:48:34.000 Yeah.
02:48:35.000 I mean...
02:48:36.000 Yeah, man.
02:48:37.000 No, I love it.
02:48:37.000 All of these issues are...
02:48:41.000 UFOs are so much fun to talk about.
02:48:43.000 They're the most fun.
02:48:44.000 UFOs are the most fun to talk about.
02:48:46.000 And it's just like, I'm just focused on trying to have more civil conversations about it.
02:48:55.000 Because it just seems like everybody just wants to jump on their team's side and just have shallow conversations.
02:49:03.000 Like, be able to pursue...
02:49:36.000 It's not.
02:49:37.000 Accept the narrative.
02:49:41.000 The media is doing the same thing with these Pentagon files that the New York Times did with the Nord Stream stuff.
02:49:48.000 So there's basically been a media blackout over these Pentagon files.
02:49:51.000 It's being called the biggest U.S. government leak since Snowden.
02:49:56.000 This happened in the last couple weeks.
02:49:58.000 And Fox News even made a statement, we're not going to cover this.
02:50:02.000 We're not going to show the documents because, like, serious blueprints for Ukraine were leaked.
02:50:09.000 And do you hear about these?
02:50:11.000 Yes.
02:50:12.000 Explain that to people.
02:50:13.000 I'm definitely not an expert on this.
02:50:16.000 But it seems like a blueprint for setting up this war.
02:50:20.000 Yes.
02:50:21.000 Like, very sensitive document.
02:50:24.000 I mean, Kirby, one of the...
02:50:26.000 John Kirby, he's like Biden's spokesperson.
02:50:29.000 He's the guy who lied about the Nord Stream thing.
02:50:32.000 You know, in the press room.
02:50:33.000 And he came out and said, he specifically told media, don't cover this
02:50:41.000 leak.
02:50:42.000 Which is just a shocking thing to hear from, you know, a government official telling the press not to cover a leak.
02:50:50.000 Did he give a reason?
02:50:52.000 I mean, national security and like screwing up the whole Ukraine situation.
02:50:59.000 With the truth.
02:51:00.000 Yeah.
02:51:01.000 Yeah, I mean, it's a huge strategic blunder.
02:51:03.000 Don't come to the truth, guys.
02:51:06.000 You're with us.
02:51:06.000 Yeah.
02:51:07.000 Meanwhile, we've been fucking you for decades.
02:51:12.000 Yeah.
02:51:12.000 Yeah, that money...
02:51:13.000 But listen to us.
02:51:15.000 So, these leaks are...
02:51:17.000 The leaks are escalating, which, you know, obviously you don't want it to be...
02:51:22.000 And this is why the government should start the process of, like, an actual disclosure, you know, opening up FOIA, the Freedom of Information Act, much further, making it much...
02:51:35.000 Setting us on a path to get the information because otherwise we're going to keep getting these leaks and it's going to be worse for them.
02:51:42.000 Yeah.
02:51:42.000 Because it's going to come out willy-nilly and it's going to cause a lot more problems.
02:51:46.000 It's going to completely erode trust.
02:51:48.000 Right.
02:51:49.000 Yeah.
02:51:50.000 Yes.
02:51:50.000 So we can probably all get on board for a path to getting there.
02:51:56.000 Yeah.
02:51:57.000 Well, listen man, I appreciate you and I really appreciate what you're doing and your thought process behind all this, and that you've stuck to these ethics.
02:52:09.000 You've stuck to them from the very beginning when you formulated this company and you're still doing it.
02:52:14.000 Thanks so much for having me.
02:52:15.000 Appreciate it very much.
02:52:17.000 Tell people it's Minds.org.
02:52:19.000 Minds.com or Minds.org.
02:52:21.000 Both.
02:52:22.000 Or you can get it on the App Store.
02:52:24.000 Just search Minds, M-I-N-D-S. Or if you want to support us, WeFunder.com slash Minds.
02:52:29.000 Trying to be community owned.
02:52:30.000 My pleasure, brother.
02:52:33.000 Thank you very much.
02:52:34.000 Bye, everybody.