The Joe Rogan Experience - March 07, 2023


Joe Rogan Experience #1951 - Coffeezilla


Episode Stats

Length

3 hours and 4 minutes

Words per Minute

186.4

Word Count

34,297

Sentence Count

2,801

Misogynist Sentences

19

Hate Speech Sentences

19


Summary

In this episode of the Joe Rogan Experience, Joe talks with Coffeezilla, an investigative journalist who covers fraud in the crypto space, about the collapse of FTX: how the exchange got started, how its founder Sam Bankman-Fried funneled customer deposits into his trading firm Alameda Research, and why operating offshore let the fraud go unnoticed for so long. Coffeezilla recounts confronting Bankman-Fried on Twitter Spaces, compares the scheme to Bernie Madoff's, and explains how celebrity endorsements and backing from investors like Sequoia Capital created the social proof that drew ordinary people in. They also discuss Binance's role in triggering the collapse, self-issued tokens like FTT being used as collateral, and the handful of critics who called the fraud early.


Transcript

00:00:01.000 Joe Rogan Podcast, check it out!
00:00:04.000 The Joe Rogan Experience.
00:00:06.000 Train by day, Joe Rogan Podcast by night, all day.
00:00:12.000 Nice to meet you, man.
00:00:13.000 Hey, thanks a lot, man.
00:00:14.000 I appreciate what you do.
00:00:16.000 What you do is a very valuable service.
00:00:18.000 Because you go so deep on some of these scammers.
00:00:21.000 It's like, it's so important.
00:00:22.000 Because there's so many people that just, they don't really understand what's going on. Like the FTX thing, for example, the best one.
00:00:29.000 Because I was so in the dark about this thing.
00:00:31.000 I was like, what is happening?
00:00:32.000 Like, what are they doing?
00:00:34.000 Try to break it down for us.
00:00:35.000 Like, first of all, it's a crypto exchange, right?
00:00:41.000 So how does that work?
00:00:43.000 So the first question is, when you learn about crypto, you're like, it's this magic internet money.
00:00:48.000 Magic.
00:00:49.000 How do you get some of that?
00:00:50.000 How do you get some of that magic internet money?
00:00:52.000 Well, you have to go somewhere to buy it.
00:00:54.000 And so a crypto exchange is where you kind of go.
00:00:57.000 You put your...
00:01:15.000 Explain tokens, because I don't understand tokens.
00:01:17.000 I know there's crypto and there's tokens.
00:01:19.000 Like, what is the difference between the two?
00:01:20.000 Yeah, tokens like is the individual, you can think of currency, right?
00:01:24.000 So it's like the individual, so Bitcoin is, you have Bitcoin, then you have, it's one of the cryptocurrencies, you have Ethereum, you have Dogecoin, you have SafeMoon, you have FTT, which is what FTX was using as their native token.
00:01:40.000 So a lot of these guys, you'll start a crypto exchange, and then you'll launch your own token that people can invest in, sort of like they're investing almost in your crypto exchange.
00:01:49.000 And so that was actually one of the ways that FTX really perpetuated their fraud.
00:01:55.000 I can break it down.
00:01:56.000 How much do you know about the FTX situation?
00:01:58.000 Let's break it down for people that don't know about it.
00:02:01.000 Let's do it.
00:02:01.000 So FTX was this crypto exchange located out in the Bahamas, which is a great place to put your— Why do they do it in the Bahamas?
00:02:09.000 Because it's unregulated.
00:02:11.000 So the problem with doing stuff in the United States or, you know, something like Europe or something like that is you are subject to all these regulations which require you to be a little more careful.
00:02:24.000 Oh, those are pesky.
00:02:25.000 Yeah, they're annoying.
00:02:27.000 We don't need that.
00:02:28.000 The famous example is Coinbase is in America, and they have to file all these forms.
00:02:33.000 They're a regulated entity.
00:02:35.000 They're a publicly traded company, so they have to report everything.
00:02:38.000 So if you're offshore, you can kind of not do any of that.
00:02:41.000 You can play fast and loose.
00:02:43.000 And for some people, they think that's better.
00:02:45.000 They can offer, let's say, 100x leverage.
00:02:49.000 You have a dollar.
00:02:50.000 I'll let you trade with $100.
00:02:53.000 And that's gonna be like one reason you come to my offshore exchanges.
00:02:56.000 I can offer you more leverage than the guys who are like, you know, Coinbase or something like that.
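The leverage arithmetic being described is simple to work through. This is an illustrative sketch, not any exchange's actual margin logic; the function name and numbers are hypothetical.

```python
# Sketch of 100x leverage: $1 of margin controls a $100 position,
# so a 1% price move equals 100% of your margin.

def leveraged_pnl(margin, leverage, price_change_pct):
    """Profit or loss on a leveraged long position."""
    position = margin * leverage              # $1 of margin controls $100
    return position * price_change_pct / 100

margin = 1.0
leverage = 100

# A 1% move in your favor doubles the margin...
gain = leveraged_pnl(margin, leverage, +1)    # +1.0
# ...and a 1% move against you wipes it out entirely (liquidation).
loss = leveraged_pnl(margin, leverage, -1)    # -1.0

print(gain, loss)
```

This is why regulated venues cap leverage: at 100x, ordinary daily volatility is enough to liquidate a position.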
00:03:00.000 So FTX launches, let's start with who Sam Bankman-Fried is.
00:03:04.000 He's kind of at the center of all of this.
00:03:06.000 Sam Bankman-Fried is this guy who comes, he's the son of two Harvard lawyers.
00:03:11.000 Then he comes up, prep school.
00:03:13.000 He's kind of like built for success, right?
00:03:17.000 He goes to MIT, goes to Jane Street as this quantitative trader.
00:03:21.000 And then he goes into the crypto space and he launches FTX. He's very young, right?
00:03:26.000 How old is he?
00:03:27.000 I think he is young.
00:03:30.000 Maybe you can look that up, Jamie.
00:03:31.000 31. He launches Alameda Research first, which is just this trading firm, and basically the idea here is, we have some ideas, we're gonna raise a little bit of money, and we're gonna do these trades that are profitable in crypto.
00:03:45.000 So the way he first made his money was he did something where he bought Bitcoin in the US and he sold it on these Japanese exchanges where it was worth more.
00:03:57.000 So he was arbitraging this difference in prices.
00:04:02.000 And then after he made his money that way, he launches FTX in 2019. And that's a crypto platform where, honestly, you can make a lot more money than just with a trading firm.
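The arbitrage trade described here is worth working through numerically. A minimal sketch, with made-up prices and fees (the actual Japan premium varied over time):

```python
# Sketch of the cross-exchange arbitrage described above: buy on the
# cheaper venue, sell on the pricier one. Prices and fees are illustrative.

def arbitrage_profit(qty_btc, price_us, price_jp, fee_pct=0.0):
    """Profit from buying in the US and selling in Japan, with a % fee per leg."""
    cost = qty_btc * price_us * (1 + fee_pct / 100)      # buy leg, plus fee
    proceeds = qty_btc * price_jp * (1 - fee_pct / 100)  # sell leg, minus fee
    return proceeds - cost

# Suppose Bitcoin trades at $10,000 in the US and 10% higher in Japan:
profit = arbitrage_profit(qty_btc=1, price_us=10_000, price_jp=11_000, fee_pct=0.5)
print(round(profit, 2))  # 895.0
```

The hard part of the real trade wasn't the math; it was moving money between banking systems fast enough to capture the spread before it closed.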
00:04:13.000 So FTX quickly skyrockets in popularity.
00:04:16.000 They bring on people like Tom Brady to promote it.
00:04:19.000 Larry David in the Super Bowl.
00:04:21.000 They kind of get buy-in from all these big sort of names and also reputable people like BlackRock, Sequoia Capital.
00:04:30.000 They all invest in this guy.
00:04:32.000 Kevin O'Leary famously promoted it for like $18 million.
00:04:35.000 They gave him $18 million to promote it?
00:04:38.000 He says he lost it on the platform.
00:04:40.000 He says the 18 million was on FTX or whatever, and he never got a dollar out of it.
00:04:46.000 But that was what the deal was for.
00:04:49.000 So they were paying everybody to promote this FTX crypto exchange.
00:04:53.000 And the idea was, is this is the next big thing, right?
00:04:59.000 And this is where you're going to make money.
00:05:00.000 There was a lot of fear of missing out or FOMO in the markets at the time.
00:05:05.000 You know, everyone thought, oh, cryptos, you have to get in now, right?
00:05:09.000 Because if you get in now, you're going to make some money.
00:05:12.000 And so people invested in FTX thinking that this is going to be a safe platform.
00:05:20.000 This kid is smarter than everyone else.
00:05:22.000 He's the son of Harvard lawyers.
00:05:24.000 We just sort of can't lose.
00:05:26.000 And nobody paid attention to some of the red flags that were going on until ultimately it was too late.
00:05:31.000 It turns out he was pilfering FTX, the customer deposits, and was using it in Alameda Research, which was his trading firm, to try to make extra money, and he lost it.
00:05:42.000 And so this is all because it's unregulated.
00:05:45.000 Like if he was doing this, like Coinbase can't do this.
00:05:48.000 Is that correct?
00:05:49.000 Yeah, Coinbase is much more heavily scrutinized.
00:05:51.000 They actually have to file with the SEC. They have to say what they have, where they're putting their money.
00:05:57.000 They're subject to more regulation about like how they take care of customer deposits.
00:06:00.000 One of the big things with FTX was they told people, hey, you put your money with us.
00:06:04.000 We're not going to touch it.
00:06:05.000 We're not going to move it.
00:06:06.000 That's what FTX said in their terms of service.
00:06:08.000 So one of the really...
00:06:10.000 Big problems was they actually weren't doing that, but nobody knew because nobody had a look at their books.
00:06:15.000 Like, it was very opaque.
00:06:17.000 Nobody knew what was going on behind the scenes.
00:06:19.000 So even though they said, like, we're not going to touch your money, as soon as you deposited Bitcoin, I mean, I talked to some of the insiders at Alameda.
00:06:26.000 They said they had this backdoor system to where they could see you, Joe, deposit a Bitcoin on FTX. They could grab that Bitcoin and start trading with it immediately.
00:06:37.000 Even though they were never supposed to be able to touch your money, obviously.
00:06:40.000 That was the whole point.
00:06:40.000 It's like, you deposit with us.
00:06:42.000 We're not going to do anything with your money.
00:06:43.000 It's your money.
00:06:45.000 It's almost like a bank.
00:06:46.000 You deposit with a bank.
00:06:47.000 Your bank isn't supposed to go ahead and take your money and go start trading with it unless, obviously, we have FDIC insurance, stuff like that.
00:06:55.000 But they didn't have that.
00:06:56.000 They just take your money, go trade with it, and that's where the disaster started.
00:07:01.000 I really enjoyed you catching him on Twitter spaces.
00:07:05.000 I really enjoyed that.
00:07:06.000 I listened to the whole thing.
00:07:07.000 Because before that, you have this guy who's this...
00:07:11.000 You know, whiz kid, who you listen to him talk.
00:07:15.000 He has an answer for everything.
00:07:16.000 He's so articulate.
00:07:18.000 He's so knowledgeable.
00:07:20.000 Like, I listened to previous interviews before he got busted.
00:07:22.000 And then when you have him on, there's a lot of...
00:07:25.000 I wasn't aware.
00:07:27.000 I'm not sure.
00:07:29.000 I'm not aware of that.
00:07:31.000 I don't know.
00:07:32.000 There was all this...
00:07:34.000 Hemming and hawing and a lot of ums and ahs, and you just kept on him.
00:07:39.000 It was amazing. First of all, it was amazing that he felt like he could do something like that.
00:07:46.000 Like why would he publicly communicate?
00:07:49.000 This is one of the... This is why it's so interesting to me to look at fraud. This is why it fascinates me, as well as I think it's an important thing to expose. But I'm interested in the characters who perpetuate fraud, because they're such interesting psychological case studies.
00:08:04.000 Sam Bankman-Fried, you could probably write a whole book about the fact that this guy, he got away with lying so long and perpetuating this image of himself as this generous billionaire.
00:08:14.000 You know, he's sort of the next Warren Buffett that when everything goes wrong, he thinks he can reestablish control because he's so smart.
00:08:22.000 He is such a good liar that he's like, I can just lie my way out of it.
00:08:25.000 So I think that's why he ultimately talked.
00:08:27.000 His idea was, if I lied my way into it, I can sort of lie my way out of it.
00:08:31.000 And this is what he did.
00:08:33.000 Prior to this, I'd interviewed him twice before, and I had kind of gotten hamstrung with, like, you know, he's just so good at dodging stuff.
00:08:40.000 Did you interview him before the scandal?
00:08:44.000 No, not before the scandal.
00:08:45.000 So it's like as it was going down.
00:08:47.000 As it was going down, he goes on all these Twitter spaces.
00:08:49.000 He doesn't want to—he's doing interviews with everybody.
00:08:51.000 I ask him.
00:08:53.000 And he doesn't want to talk to me.
00:08:55.000 So but he's going on these Twitter spaces.
00:08:57.000 So I kept tracking when he'd go on a Twitter space, and I would contact the people ahead of time.
00:09:02.000 I said, hey, at the end, when you're ready for this thing to wind down, let me on, because I know as soon as I get on, it's going to end pretty quickly after.
00:09:09.000 Let me ask him some real hard questions.
00:09:11.000 Because all these guys are like, Sam, you know, we appreciate your transparency.
00:09:14.000 Kind of kissing up a little bit.
00:09:16.000 But I was just like, somebody has to ask him some real questions.
00:09:19.000 So I had two prior little Twitter space interactions with him.
00:09:22.000 And he kept getting away with the fact that he blamed all the wrongdoing of FTX on Alameda Research.
00:09:30.000 And he said, I don't control Alameda Research.
00:09:32.000 Even though he was the owner, he's no longer the CEO as of 2020. He hands it to this...
00:09:39.000 A girl he actually had a relationship with, Caroline Ellison, right?
00:09:43.000 And she supposedly controlled it.
00:09:46.000 He said, she did everything.
00:09:47.000 I don't have access to the book.
00:09:48.000 Like, I basically knew nothing.
00:09:50.000 So anytime you'd call him out on an issue, you'd say, where's the money?
00:09:53.000 He goes, well, it's...
00:09:53.000 I don't know.
00:09:54.000 It's gone.
00:09:55.000 It's Alameda Research.
00:09:56.000 Ask Alameda.
00:09:57.000 So by the third interview, I'd studied him and I said, okay, how do we get down to...
00:10:04.000 FTX's responsibility in this whole thing.
00:10:06.000 And I kept coming back to, it was the terms of service that said, you cannot move, like when I deposit with you, you're not going to touch my money.
00:10:14.000 And I said, Sam, if that's true, where's the money of all these people?
00:10:19.000 There's no Ethereum left.
00:10:21.000 There's no Bitcoin left.
00:10:22.000 You don't have the real tokens anymore.
00:10:23.000 You just have your sort of nonsense FTT tokens, the tokens you invented.
00:10:29.000 And he said, oh, well, you know, there were some margin trading accounts.
00:10:34.000 And I'm like, no, but there were people who didn't trade with margin.
00:10:37.000 There are people who just put their money with you.
00:10:39.000 And all they wanted was they wanted to store some Bitcoin with Tom Brady.
00:10:44.000 They wanted to be alongside Tom Brady.
00:10:46.000 So he's like, well, you know, there was fungibility between wallets.
00:10:54.000 And it's like, well, what's fungibility mean?
00:10:56.000 It means...
00:10:58.000 Whether you were a guy withdrawing who was this degenerate day trader, or you were a grandma who just put one Bitcoin on there, or more likely the grandson, he treated all the accounts the same.
00:11:11.000 So when everyone came running for the money, they just withdrew until nothing was left.
00:11:15.000 And ultimately, because they had lost billions of dollars, it left billions of dollars in credit claims, basically.
00:11:23.000 They didn't have the money.
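The commingling described above can be sketched in a few lines: deposits that users believed were segregated actually sat in one shared pool, and withdrawals were paid first-come, first-served until the pool ran dry. Account names and numbers here are invented for illustration.

```python
# Sketch of "fungibility between wallets": everyone's deposits are
# treated as one commingled pool, so a run pays whoever withdraws first.

accounts = {"day_trader": 50, "grandma": 1}  # BTC each user believes they hold
pool = 30                                    # BTC the exchange actually has left

def withdraw(user, amount):
    """Pay withdrawals first-come, first-served out of the shared pool."""
    global pool
    paid = min(amount, pool)
    pool -= paid
    accounts[user] -= paid
    return paid

withdraw("day_trader", 50)     # the trader gets 30 of the 50 he's owed...
print(withdraw("grandma", 1))  # ...and grandma gets 0
```

In a properly segregated system, grandma's 1 BTC would sit untouched in her own wallet regardless of anyone else's withdrawals; in a commingled pool, her claim is only as good as whatever is left.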
00:11:24.000 And so now it's trying to be sorted out by the guy who literally unraveled Enron.
00:11:30.000 And he says, this lawyer goes, it's worse than Enron.
00:11:38.000 I watched the CEO, the new CEO, talk about it.
00:11:41.000 Yeah.
00:11:41.000 About him trying to...
00:11:43.000 That's John Ray.
00:11:43.000 Yeah.
00:11:44.000 Yeah, yeah, yeah.
00:11:44.000 Trying to unravel it.
00:11:45.000 And it's amazing.
00:11:47.000 It's amazing that things with this amount of money can get this far sideways before anyone knows what's going on.
00:11:54.000 This is the problem with offshore accounts and stuff.
00:11:58.000 Like...
00:12:00.000 Actually, his whole technique of shifting the blame like onto Alameda and like, I don't control Alameda.
00:12:05.000 I've seen something very similar.
00:12:07.000 I'm investigating this Ponzi scheme that's offshore.
00:12:09.000 And like one of the first things the guy does is he controls it, but he renounces ownership.
00:12:14.000 He goes, oh, I'm passing it off to some sham director.
00:12:18.000 And he goes, I don't have anything to do.
00:12:20.000 I don't know.
00:12:20.000 Where's the money?
00:12:21.000 I don't know.
00:12:22.000 But he controls everything.
00:12:24.000 And so it's like this is the tactic of these offshore companies is...
00:12:29.000 Like, you put the right people in charge who are going to take the fall, you resign, and then you blame it on them later when everything goes wrong.
00:12:36.000 His problem, though, is Caroline Ellison flipped on him.
00:12:39.000 So she definitely flipped on him.
00:12:41.000 She was smart.
00:12:41.000 Yeah, yeah.
00:12:41.000 She cooperated.
00:12:43.000 Her and, I believe, Gary Wang were big executives.
00:12:48.000 They're cooperating.
00:12:49.000 They pled guilty.
00:12:50.000 They're cooperating with the feds.
00:12:52.000 They did the smart thing, which is something like this happens.
00:12:54.000 You shut up.
00:12:55.000 You don't say anything.
00:12:56.000 Right.
00:12:57.000 And then you point at your boss.
00:12:58.000 I mean, that's basically what they did.
00:13:00.000 Oof.
00:13:02.000 Which they for sure did stuff wrong, too.
00:13:05.000 You do not get to that level and not know that things were wrong.
00:13:09.000 Well, reading her tweets about amphetamine use was pretty wild, too.
00:13:13.000 The whole scene was wild.
00:13:15.000 The fact that they were all living together and fucking each other in this giant penthouse, this $40 billion penthouse.
00:13:23.000 The things, it's insane.
00:13:24.000 It's really...
00:13:25.000 I almost wish it wasn't a scam.
00:13:27.000 I've said this before because I root for nerds to be that successful, that you're just completely living outside the norms of society, just fucking each other on amphetamines and making billions of dollars.
00:13:39.000 It sounds like a great story if it wasn't illegitimate.
00:13:44.000 Yeah, that's, I mean, ultimately, that's the problem.
00:13:46.000 Like, Sam was just hopped up on amphetamines playing League of Legends while on Investor Calls.
00:13:51.000 Like, at the time, that was seen as this charming, like, genius thing.
00:13:55.000 On calls.
00:13:56.000 On calls!
00:13:57.000 You could hear, even actually, okay, so this is funny.
00:14:00.000 They found his League of Legends account, and during some of the calls, after it was a fraud, you could hear him clicking in the background.
00:14:08.000 Click, click, click, click, click.
00:14:09.000 He's playing League.
00:14:10.000 Like they tracked his account.
00:14:11.000 He's playing League while on calls about the failure of FTX. Wow.
00:14:17.000 Just imagine the arrogance.
00:14:19.000 Is that arrogance or is it just pure addiction?
00:14:23.000 I think, you know, those multiplayer games, those, you know, online role-playing games, that's what that is, right?
00:14:29.000 I think it's a, is it a MOBA? I used to play like a variant of League, so I know it's fun, but it's not like fun to the point.
00:14:39.000 Look, this guy's whole thing was like, I'm this effective altruist.
00:14:43.000 Yeah.
00:14:44.000 I'm this guy who's going to maximize good in the world.
00:14:46.000 And that was his reason for working all the time.
00:14:49.000 And that's the justification for being hopped up on amphetamines.
00:14:51.000 It's like maximize productivity, maximize human happiness.
00:14:54.000 You can't do that and then say, oops, I played a little too much of my video games and lost billions of dollars.
00:14:59.000 Like, this doesn't fly.
00:15:00.000 Well, he wasn't saying that I played the video games and that's why I lost the money.
00:15:04.000 But after the fact, he's still playing video games and you're like, can you have the decency to get off the video game and talk to people?
00:15:14.000 So bizarre.
00:15:15.000 It's just so bizarre that so many people got duped.
00:15:18.000 And I felt the same way about Bernie Madoff.
00:15:20.000 You know, I'm not a financially aware person.
00:15:23.000 I'm not into the market.
00:15:25.000 I don't follow these things.
00:15:27.000 So when I see something like that go down, I'm like, how did he get Steven Spielberg?
00:15:31.000 You know, how does someone like a Bernie Madoff or Sam Bankman-Fried, how does he...
00:15:37.000 Get these people to do this.
00:15:39.000 And in the FTX case, how much of it was getting celebrities to endorse the platform?
00:15:45.000 It's huge.
00:15:45.000 This is what I wanted to say.
00:15:47.000 Like, the more I study this stuff, and you start to have repeat occurrences, like I just cover stuff all the time, and you see echoes of the same thing.
00:15:55.000 I just had somebody just a couple days ago, I was interviewing him for this new scheme we're looking at, and he said, you know, I never understood how Bernie Madoff got people, because it seems so preposterous.
00:16:05.000 And then I fell for something very similar.
00:16:07.000 And what I notice with all of these things, the thread is you know it's kind of too good to be true, but the social proof is overwhelming and it overwhelms your kind of like alarm bells.
00:16:19.000 So the social proof is a combination of things.
00:16:21.000 So first of all, it's like it's this guy who drives a Toyota.
00:16:24.000 So you go like, well, why does he need to scam me if he's driving a Toyota, right?
00:16:28.000 Then it's like, which Sam Bankman-Fried did.
00:16:31.000 Then it's like, okay, Tom Brady backs him.
00:16:34.000 Well, Tom Brady's got to have some guys who are looking into this.
00:16:37.000 And then it's like, well, BlackRock backed him.
00:16:39.000 Well, BlackRock definitely has some guys who looked into it.
00:16:42.000 It's Sequoia Capital.
00:16:43.000 They said he might be one of the first trillionaires or whatever.
00:16:48.000 He's such a great entrepreneur.
00:16:50.000 They think he's such a genius.
00:16:52.000 Actually, it might have been one of the A1Z guys.
00:16:55.000 I'm blanking on the name right now.
00:16:57.000 A1Z? What is that?
00:16:59.000 No, no, no.
00:16:59.000 I'm sorry.
00:17:00.000 I'm blanking on it.
00:17:01.000 It's this famous...
00:17:02.000 I've got to remember it right after I get out of here.
00:17:05.000 It's one of the famous investment funds.
00:17:10.000 They invested in a bunch of NFT projects.
00:17:12.000 Mark Andreessen, I think, is the guy who runs it?
00:17:19.000 Hmm.
00:17:23.000 Jamie's looking it up.
00:17:24.000 I thought I knew it off the top of my head, and now I don't want to say the wrong one.
00:17:27.000 A16Z. A16Z. It was one of those, Sequoia or A16Z. One of them wrote this glowing review of Sam basically saying he's going to be one of the first trillionaires.
00:17:38.000 So all these guys basically, a lot of these people backed Sam with the highest endorsements.
00:17:43.000 And so if you're just an average person, you're thinking...
00:17:46.000 How much more due diligence can I do than all these other guys?
00:17:50.000 All these other guys buy into him.
00:17:52.000 And then they themselves are kind of also looking at each other, being like, well, that guy did it.
00:17:59.000 It's the hottest deal around, right?
00:18:02.000 Kevin O'Leary's in.
00:18:03.000 So you kind of think you're swimming safely with other savvy investors.
00:18:08.000 Yeah.
00:18:08.000 And that's what ultimately gets you to buy in.
00:18:11.000 Bernie Madoff is very similar.
00:18:13.000 I mean, he was, you know, really well regarded in Wall Street.
00:18:17.000 So when people invested with him, they knew the returns were insane.
00:18:22.000 But it wasn't like he was some random fly by night guy.
00:18:25.000 He was well-respected in the Wall Street space.
00:18:28.000 People thought he might take over the SEC after the current person had stepped down.
00:18:33.000 They thought he was going to take it over.
00:18:34.000 He's one of the leaders at the NASDAQ. I mean, he was one of the go-to guys.
00:18:39.000 And so you thought, well, I invest with Bernie.
00:18:42.000 I can't lose.
00:18:42.000 It's like almost betting on the house.
00:18:45.000 The house always wins, right?
00:18:46.000 So when FTX was taking off, it just seemed like everyone who was a someone was backing him.
00:18:53.000 So then it was okay.
00:18:54.000 And then I think a lot of these people deferred to their other friends.
00:18:59.000 They're all saying it's okay.
00:19:01.000 Let me put money in.
00:19:02.000 And it's just a huge case study that just because other people fall for something doesn't mean you're safe.
00:19:09.000 Like, you have to do...
00:19:10.000 I hate to say do your own research because that's such an overused, like, scammy phrase.
00:19:15.000 It's actually such, like, a phrase that, you know, it's almost useless.
00:19:19.000 Chemtrails.
00:19:19.000 Let me say this.
00:19:21.000 If it's too good to be true, if they're offering market returns that you want to believe in, you go, man, I want to believe this is real.
00:19:29.000 Don't invest.
00:19:30.000 Like, that's a bad idea.
00:19:31.000 People were calling bullshit, though.
00:19:34.000 Just like they were calling...
00:19:35.000 There was a few people that were wary that were calling bullshit on Bernie Madoff, and there was a few people that were standing out and saying, none of this makes sense.
00:19:44.000 Right.
00:19:45.000 And who were those people?
00:19:47.000 So there were a few people.
00:19:50.000 There was a Matt Levine interview with Sam Bankman-Fried.
00:19:55.000 He didn't call him a fraud outright, but he's like, hey, it seems like you're in the Ponzi business and business is good.
00:20:01.000 Whoa.
00:20:01.000 And what did he say to that?
00:20:03.000 He's like, well, you know, like think of it like a box.
00:20:07.000 And, you know, you tell a bunch of investors, you know, hey, if you put money in this box, we can get some money out.
00:20:12.000 We can give you this yield.
00:20:13.000 He starts to explain like this thing that sounds exactly like a Ponzi scheme.
00:20:18.000 And so ultimately, Matt Levine's like, yeah, this doesn't really make a lot of sense.
00:20:23.000 But again, it stopped short of this is a fraud because, you know, no one knew.
00:20:28.000 There's a bunch of backing.
00:20:29.000 So I made a video at the time being like this crypto CEO just describes a Ponzi scheme.
00:20:35.000 And that video has aged so well because it's like people like...
00:20:38.000 It was all true.
00:20:40.000 But people were outright calling it a fraud, like Mark Cohodes.
00:20:43.000 He's a famous short seller.
00:20:46.000 He was calling that a fraud early.
00:20:48.000 I have a buddy of mine.
00:20:50.000 He goes by Dirty Bubble Media on Twitter.
00:20:53.000 He's one of the anonymous Twitter accounts.
00:20:55.000 He was calling it a fraud.
00:20:57.000 You know, there were things that were coming out, like questions about, you know, okay, they say they have all this money, where?
00:21:04.000 Where on chain is it?
00:21:06.000 So like the blockchain, everything's publicly, you can see it, right?
00:21:09.000 It's all at some address.
00:21:11.000 And so people were asking like, where's, you say you have all this Bitcoin, where's the Bitcoin?
00:21:15.000 You say you have all this Ethereum, where's the Ethereum?
00:21:19.000 And why is so much of your balance sheet made up of your own tokens?
00:21:24.000 It's a big question.
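The on-chain check people were asking for amounts to a simple comparison: balances at public blockchain addresses are visible to anyone, so known exchange addresses can be summed and held up against claimed holdings. The addresses and figures below are invented for illustration; real verification also has to prove the exchange controls the keys.

```python
# Sketch of a "show me the Bitcoin" check: sum publicly visible balances
# at addresses attributed to the exchange and compare against its claims.
# All addresses and amounts here are made up.

claimed_btc = 100_000  # what the exchange says it holds

known_addresses = {   # balances anyone can read off the public chain
    "cold_wallet_1": 12_000,
    "hot_wallet_1": 3_500,
}

visible = sum(known_addresses.values())
shortfall = claimed_btc - visible
print(f"visible on-chain: {visible} BTC; unaccounted for: {shortfall} BTC")
```

A large persistent gap doesn't prove fraud by itself (funds could sit at unattributed addresses), but it's exactly the kind of question an exchange with the coins can answer and an exchange without them cannot.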
00:21:26.000 So one of the things that FTX had done, and a lot of companies were doing at the time, but FTX was sort of the worst offender, is let's say I give you a loan, Joe.
00:21:37.000 So an unsecured loan would be I give you a million dollars and I don't ask for anything.
00:21:40.000 So if you default on that loan, I'm out a million.
00:21:43.000 Another way is I ask for, okay, I'll take some equity in the studio if something goes wrong, right?
00:21:49.000 So I cover my butt if you default on it.
00:21:52.000 Now, this is what was going on in crypto.
00:21:54.000 They're called collateralized loans.
00:21:56.000 But what FTX was doing was they were saying, hey, we'll take a million dollars from you, but instead of giving you collateral in dollars or some hard asset, we'll give you FTT tokens, which is their own invented coin.
00:22:10.000 And that should have value if anything goes wrong.
00:22:13.000 And people were accepting that as value.
00:22:14.000 But the problem is...
00:22:17.000 The exact moment FTX can't pay you back is the exact moment that FTT becomes worthless.
00:22:23.000 So you think you have all this collateral, you think you have this backstop because on the books it's worth, you know, X dollars.
00:22:29.000 Let's say it's like worth five dollars a coin.
00:22:32.000 But what you're not realizing is the real risk is when FTX can't pay you back, they probably can't pay anyone back.
00:22:37.000 Everyone loses confidence.
00:22:38.000 Everyone sells their FTT tokens.
00:22:40.000 No one wants to buy it.
00:22:40.000 It's worth nothing.
00:22:41.000 So how did this all fall apart?
00:22:45.000 Great question.
00:22:46.000 So it's really interesting because it was like a battle between FTX and one of their competitors, Binance.
00:22:52.000 So the owner of Binance is, I think it's Changpeng Zhao.
00:22:57.000 He goes by CZ on Twitter.
00:22:59.000 I probably butchered his name.
00:23:00.000 But he was actually originally sort of an ally of Sam.
00:23:04.000 So he invested in FTX early on, put $100 million in, and eventually got paid out like $2 billion.
00:23:12.000 Yeah.
00:23:13.000 Some of it was in this FTT token, though.
00:23:15.000 So they have a bunch of FTT, right?
00:23:17.000 And so it's like November was when all this stuff went down.
00:23:21.000 And a report comes out from Coindesk where it shows FTX's balance sheet.
00:23:27.000 It shows actually what tokens they have.
00:23:29.000 You know, for one of the first times, it was kind of really everyone got to look at it all at once in one place.
00:23:34.000 And people notice, like, wait a second.
00:23:37.000 A lot of their assets are just their own tokens.
00:23:39.000 Like they had the SRM token, which they control most of, and FTT. So it looks like, if you just look at their assets, it looks like they're covering their liabilities.
00:23:49.000 They owe customers 10 billion.
00:23:50.000 Looks like they have 10 billion.
00:23:52.000 But like most of this, 10 billion is just their own tokens.
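The balance-sheet illusion described here can be sketched by separating hard assets from self-issued tokens. The figures below are invented for illustration, not FTX's actual books:

```python
# Sketch of nominal vs. "hard-asset" coverage of customer liabilities.
# All numbers are hypothetical.

liabilities = 10_000_000_000  # owed to customers

assets = {
    "hard assets (BTC/ETH/USD)": 2_000_000_000,
    "FTT (self-issued)":         5_000_000_000,
    "SRM (mostly self-held)":    3_000_000_000,
}

nominal = sum(assets.values())
hard = assets["hard assets (BTC/ETH/USD)"]

print(f"nominal coverage:    {nominal / liabilities:.0%}")  # 100%
print(f"hard-asset coverage: {hard / liabilities:.0%}")     # 20%
```

The self-issued tokens can't actually be sold for anything near their marked price (dumping them would crash the price), so the number that matters in a run is the hard-asset coverage, not the nominal one.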
00:23:57.000 CZ takes this opportunity to kind of spread some sort of information about that.
00:24:03.000 He says, hey, we're actually going to sell most of our FTT that we got from that deal.
00:24:09.000 And we're going to sell it.
00:24:10.000 We don't know what's going on there.
00:24:12.000 And all of a sudden, it starts this firestorm because people are like, there was already all this worry in the past.
00:24:17.000 That summer, there had been a bunch of companies that collapsed.
00:24:20.000 And people had never thought FTX. It was kind of the first time anyone thought FTX could...
00:24:25.000 Maybe not have the money.
00:24:26.000 So CZ says, hey, maybe they don't have the money.
00:24:28.000 I don't know.
00:24:29.000 Whatever.
00:24:29.000 I'm just going to sell like $2 billion worth.
00:24:33.000 But he knew what he was doing.
00:24:35.000 Oh, for sure.
00:24:36.000 He's a shark.
00:24:37.000 He's a shark.
00:24:37.000 He knows what he's doing.
00:24:38.000 What was the conflict between the two of them that led him to do that?
00:24:41.000 The conflict was—it's a great question.
00:24:43.000 The conflict was ultimately that Sam was trying to get some regulations passed, and he knew—
00:25:06.000 It was kind of a dig because he knows CZ can't go to Washington.
00:25:10.000 He'd be afraid of being indicted.
00:25:12.000 I don't really understand why he can't go, but he can't go to America.
00:25:15.000 So Sam was meeting with regulators.
00:25:18.000 CZ felt cut out, like he was basically going to get a bad deal with regulators.
00:25:22.000 Sam was working really closely with regulators to try to get regulations passed.
00:25:26.000 And CZ felt like he was cut out.
00:25:28.000 So that stirred up this battle between them.
00:25:32.000 And ultimately, Sam goes, oh, you won our battle.
00:25:38.000 And people were like, you know, is it a battle when you lose billions of dollars of customer money?
00:25:43.000 Like, well, how can you view this as a battle?
00:25:45.000 But he viewed it as like, we're sparring partners.
00:25:48.000 Like, and you won this round, or you won the war.
00:25:51.000 Because he thinks this is going to go on forever.
00:25:54.000 Initially.
00:25:55.000 He thought he was going to be able to figure out a way to pull all the company's assets together and make everybody sound and repay everyone and go back to making money again.
00:26:07.000 I don't think he thought he would repay everyone, but everyone thought like, oh, we'll just enter Chapter 11 bankruptcy.
00:26:11.000 We'll restructure the company.
00:26:13.000 We'll reopen.
00:26:14.000 We'll just turn all the debt into new FTT tokens and pay everybody out.
00:26:19.000 Oh, that's what he thought?
00:26:22.000 He's on amphetamines, right?
00:26:24.000 So he can't be thinking totally clearly and probably overly confident.
00:26:29.000 Yeah, it's pretty clear he didn't see like the full scope of the situation, especially at first.
00:26:35.000 It seemed like he thought, you know, he was like saying FTX US was fine.
00:26:39.000 And then FTX US went bankrupt and he's the one who put it into bankruptcy and then he's telling everyone, oh, no, no, the money is actually still there.
00:26:46.000 I mean he was constantly giving a conflicting narrative of what was going on.
00:26:50.000 Now he's still like trying to say he did nothing wrong.
00:26:54.000 He maintains he's innocent.
00:26:57.000 And right now actually the big like kind of scandal now is they're finding a bunch of campaign finance violations because he was trying to influence politics, US politics.
00:27:07.000 I mean it's insane how deep FTX's influence went from the Bahamas reaching into the United States while technically not really being regulated by the United States.
00:27:19.000 Yeah, they were the number two donor to the Democratic Party.
00:27:23.000 That's right, but...
00:27:24.000 Also to the Republicans.
00:27:26.000 This is what's wild.
00:27:27.000 So, Sam knew publicly...
00:27:30.000 In our current American climate, it's kind of like, okay, it's a little bit chic to be donating to Democrats.
00:27:37.000 You can do that without too much negative press.
00:27:40.000 But if I'm the number three donor to the Republican Party, that's going to be a bad look.
00:27:44.000 So he decides to donate dark to Republicans.
00:27:49.000 And part of the accusation is he knowingly did this through one of his executives, Ryan Salami or something?
00:27:57.000 I think that's his last name.
00:27:59.000 Salami.
00:27:59.000 I don't know.
00:28:00.000 I just don't know how to pronounce it.
00:28:01.000 But he was like, yeah, the number three donor to the Republican Party.
00:28:03.000 But it was all orchestrated through Sam.
00:28:06.000 Sam wanted to basically influence politics by just donating, donating, donating.
00:28:12.000 And the idea is you donate to both sides, you can never lose, right?
00:28:15.000 Yeah.
00:28:16.000 If you have your hands in both pockets.
00:28:18.000 But publicly, he's just like, he's donating to Democrats because he says, oh, I'm like this, like, you know, I care about all these issues.
00:28:24.000 But it's even more cynical than just buying one party: it's buying both and lying about it, so you can get all the good press of caring about all these social issues while actually not caring at all.
00:28:39.000 And ultimately, one of the ways, like, even some of the candidates they donated to were through, like, a third employee we didn't even know about.
00:28:47.000 And they were, like, donating through them for, like, all these LGBTQ plus causes.
00:28:54.000 And it was through a guy, and the guy was like, I feel a little uncomfortable with this.
00:28:57.000 And he said, well, we don't have anyone trustworthy at FTX we can donate through who's gay.
00:29:01.000 So, like, can you do this?
00:29:03.000 So someone had to be gay to do it?
00:29:06.000 Like...
00:29:06.000 No.
00:29:07.000 Basically, they were like, we need someone trustworthy we can trust to do this.
00:29:11.000 So, hey, you're going to be the guy.
00:29:12.000 Like, we're just going to do a few transactions through your name.
00:29:14.000 That just came out in a press release.
00:29:16.000 It's the new charges.
00:29:17.000 He was basically like a – they call them straw donors because it's like if I give money to you to give money to a politician on my behalf, you're a straw donor.
00:29:25.000 You're not really a donor.
00:29:27.000 So Alameda was using customer funds to pay off politicians in order to try to get favorable regulation for – I guess, offshore crypto exchanges, right?
00:29:40.000 And so these campaign finance, these violations, what are the regulations in terms of what you're allowed to do and donate and how did he violate them?
00:29:53.000 So I think the big violation was you're not supposed to – like if you're Alameda Research and you're funneling money through a personal investor, that I think is the problem.
00:30:06.000 Actually, campaign finance laws I've heard are pretty weak.
00:30:10.000 I forget the name of the law, but it was passed in like the early 2000s, 2010s maybe.
00:30:16.000 Where it actually became very easy to donate dark, where it's like you can donate through super PACs, political action committees, and you can donate as much as you want and your name has to appear nowhere.
00:30:28.000 And so that's actually what he said in one of the interviews.
00:30:30.000 He goes, no one believes me when I said I donated dark, because no one believes anyone would – everyone wants the credit for donating.
00:30:37.000 No one believes that I just do it on the sly.
00:30:41.000 And that's ultimately what he was doing.
00:30:42.000 But it also looks like he was donating through some of his executives.
00:30:47.000 I mean, the whole thing was shady all the way down.
00:30:50.000 So the person's not named in the report who is donating to Democrats.
00:30:55.000 We know the one donating to Republicans was Ryan Salami.
00:31:01.000 That person eventually said, well, hey, can we restructure all this money that went through me like a loan so that we can say that I took a loan out and I was donating so we didn't violate any laws?
00:31:11.000 They never ended up doing that.
00:31:13.000 But like it was very clear the internal conversations were they knew they were committing fraud.
00:31:18.000 They knew they were doing things wrong.
00:31:20.000 And this idea was, well, no one's going to catch us, right?
00:31:24.000 Nobody's ultimately going to find out what we're doing here.
00:31:29.000 How many more of these are out there in the world?
00:31:33.000 As big as FTX? We don't know.
00:31:36.000 I mean, there's only a few that are bigger, like there's Binance.
00:31:39.000 Very opaque company.
00:31:41.000 We don't exactly know.
00:31:43.000 And what has happened to Binance since FTX went down?
00:31:47.000 Because it seems like they received additional scrutiny, right?
00:31:52.000 Because now people are starting to look at it, and I saw that their value went down considerably.
00:31:56.000 Yeah, Elizabeth Warren wrote a letter to them.
00:31:59.000 They're being looked at much more closely.
00:32:02.000 I mean, ultimately, all of these things are so opaque in the sense that you can know their assets.
00:32:10.000 So it's like a big thing recently in crypto.
00:32:12.000 They'll say, hey, we're going to show you proof of reserves.
00:32:14.000 What does that mean?
00:32:15.000 They mean, we'll show you on-chain all our assets.
00:32:19.000 You can check yourself.
00:32:20.000 Like, I have a billion dollars in USDC. Well, that's great.
00:32:24.000 But it doesn't matter if I have a billion dollars in crypto, Bitcoin, whatever, if I owe two billion dollars.
00:32:30.000 That's what ultimately matters is how much do you have on deposits that you owe out.
00:32:35.000 And so with Binance, we don't really know.
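The gap he's pointing at is that "proof of reserves" shows on-chain assets, but solvency also needs the liabilities side. One proposal the industry floated after FTX is a Merkle-sum tree committing to total customer balances. This is a toy construction for illustration with made-up balances, not a production scheme:

```python
# Minimal Merkle-sum tree sketch: each node commits to a hash AND the sum of
# customer balances beneath it, so the root pins down total liabilities.

import hashlib

def h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def leaf(user_id: str, balance: int):
    # Commit to one user's balance without publishing the whole customer list.
    return (h(f"{user_id}:{balance}".encode()), balance)

def merkle_sum_root(leaves):
    level = leaves
    while len(level) > 1:
        nxt = []
        for i in range(0, len(level), 2):
            if i + 1 < len(level):
                l, r = level[i], level[i + 1]
                total = l[1] + r[1]
                nxt.append((h(l[0] + r[0] + str(total).encode()), total))
            else:
                nxt.append(level[i])  # odd node carries up unchanged
        level = nxt
    return level[0]

customers = [("alice", 40), ("bob", 25), ("carol", 35)]
root_hash, total_liabilities = merkle_sum_root([leaf(u, b) for u, b in customers])

on_chain_assets = 80  # what a "proof of reserves" actually shows
print(total_liabilities)                     # 100: what customers are owed
print(on_chain_assets >= total_liabilities)  # False: reserves alone proved nothing
```

Without a commitment like this, an exchange can show a billion in assets while quietly owing two billion, which is exactly the Binance question being raised here.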
00:32:38.000 The only one we have a little bit more of a look into is Coinbase.
00:32:41.000 It seems like they're legitimate.
00:32:46.000 So much of the problem with crypto is we don't know how much of this stuff is money laundering.
00:32:50.000 We don't know how much of this stuff is outright the proceeds of criminals.
00:32:56.000 I mean, we know that these criminals do launder their money through a lot of these crypto exchanges, through mixers.
00:33:02.000 It's just sort of this big mess right now, and we're waiting for regulators to figure it out.
00:33:07.000 Finally, regulators have stepped on the scene, but...
00:33:11.000 You know, right now it's just this kind of wild, wild west of you're just having to trust these shady offshore entities that they're telling the truth.
00:33:21.000 Binance says they're fine.
00:33:22.000 They show proof of reserves, but what are their liabilities?
00:33:25.000 You know, it's hard to know.
00:33:27.000 So people really just take you at face value and they have to trust that like, oh, other people are invested, so I guess I'll jump in too.
00:33:36.000 And that's why the celebrities are important.
00:33:38.000 And that's why the connection to BlackRock is important.
00:33:41.000 Huge part, yes.
00:33:42.000 Because they're the legitimacy that says, hey, I too am safe because Tom Brady's got his money there.
00:33:50.000 So the lure is like how Bitcoin used to be worth very little.
00:33:56.000 And then one time, what was the high of Bitcoin?
00:33:59.000 It was like $70,000 or something like that?
00:34:00.000 $60,000.
00:34:02.000 So that's the lure.
00:34:03.000 The lure is you buy in for pennies and one day you're insanely rich.
00:34:07.000 I'm sure you know about that one guy who lost a hard drive and who's paying people to go through a landfill to try to find his hard drive because there's billions of dollars worth of Bitcoin on that hard drive.
00:34:19.000 Yeah, people lose their crypto keys all the time.
00:34:22.000 I mean...
00:34:24.000 It's kind of an interesting idea where you go, I'm going to get in before everyone else.
00:34:30.000 But a lot of people found out about crypto at the same time the mainstream media everyone else did.
00:34:34.000 So by the time they're actually investing...
00:34:35.000 It's too late.
00:34:36.000 It's too late.
00:34:38.000 I think the most fair case you could make about crypto is...
00:34:46.000 Sometimes national currencies aren't a great idea and you want an alternative.
00:34:51.000 So like look at the Turkish lira, right?
00:34:53.000 The inflation rate I think is like 75% or something like that.
00:34:57.000 Like, it's unimaginable.
00:34:59.000 It's just out of control inflation.
00:35:00.000 And if you hold on to your Turkish lira, you're in for a bad time because every day it's getting less valuable.
00:35:07.000 So the question is, What do you do if you're in that country making money?
00:35:14.000 If you want to store your money somewhere else, how do you store it?
00:35:17.000 So there's this idea of like these alternative currencies that are kind of interesting.
00:35:23.000 And then there's some arguments that like, hey, if you're someone like me, I have two employees and both of them are overseas.
00:35:30.000 Like one of them's in London, one of them's in Ukraine.
00:35:33.000 And so for me, I have to pay them and I have to do this wire transfer and it's kind of expensive to do these – like you pay all these fees for wire transfers.
00:35:41.000 So the idea is like, okay, well, if you have crypto, those wire fees can go down and instead of taking maybe a day or something, it will take like five minutes or three minutes.
00:35:50.000 So – No middleman.
00:36:07.000 There's also a really huge opportunity for fraud, scams, and basically shell games, where you're hiding the money.
00:36:17.000 You're saying, oh, invest in this.
00:36:18.000 This is going to become valuable later, but you actually own a bunch of that token.
00:36:21.000 Then you sell it off, and then the price plummets.
00:36:23.000 So you thought you had a bunch of money, but actually it's worth nothing.
00:36:26.000 There's all these new scams that have emerged as a result of people getting interested in this idea of an alternative money system.
00:36:35.000 I mean, yeah, especially in our modern age, I mean, it seems like you can understand where they're coming from, the average person.
00:36:43.000 They're like, look, I've been screwed by the banks.
00:36:45.000 Every time the government's printing a bunch of money, where do I go?
00:36:50.000 Right?
00:36:51.000 You can understand the appeal, but it's just like you went from the...
00:36:56.000 You know, the arms of one huckster to another.
00:37:01.000 It's almost to something worse.
00:37:04.000 There are reasons that our banks have a bunch of anti-money laundering laws.
00:37:08.000 There's a reason that they have all sorts of finance laws.
00:37:11.000 It's not for their safety.
00:37:14.000 It's for your safety.
00:37:15.000 I mean, it's like they need to fight.
00:37:18.000 One of the best ways to fight crime is at their wallets.
00:37:21.000 Take away their banking.
00:37:23.000 And crypto has just really revitalized that because now, if you're some criminal, laundering money has just never been easier.
00:37:32.000 Instead of taking $100,000 across the border or wiring it where it can get held up by a bank, now I can just send you $100,000.
00:37:40.000 It's going to take me five minutes.
00:37:41.000 So that's why when people kidnap people's data and things along those lines, they like to get paid through crypto.
00:37:49.000 Ransomware, yes.
00:37:50.000 Yes, 100%.
00:37:53.000 Before, it was like, okay, you need to use like Western Union or sort of one of these places where you can kind of send money without too much scrutiny.
00:38:01.000 But even Western Union has been kind of – they've been getting kind of pinched a little bit like, hey, you guys got to stop allowing all of this.
00:38:08.000 But in crypto, because there's no middleman, because there's no one who controls, like, Bitcoin, no one can say no to a transaction. Now there's nothing to stop you from sending that money, and then you can take that money and you can send it to what's called a mixer,
00:38:25.000 which is this fancy language for a way to anonymize your transaction.
00:38:32.000 You put $100,000 into this little mixer, and then it sends $100,000 out later, and nobody knows where that money came from.
00:38:40.000 What is a mixer?
00:38:41.000 How does it work?
00:38:42.000 It's interesting.
00:38:43.000 So the most famous example is Tornado Cash.
00:38:45.000 They've recently been shut down.
00:38:47.000 But the idea of putting your money into Tornado Cash.
00:38:52.000 Yeah, it's wild.
00:38:53.000 Is there a better analogy for losing your house?
00:38:57.000 You know what's funny?
00:38:58.000 Yeah.
00:39:00.000 You know?
00:39:01.000 I mean, good lord.
00:39:02.000 The idea of these mixers was you'd anonymize your transactions.
00:39:05.000 So like, let's say I have Ethereum, one Ether into this mixer, right?
00:39:12.000 This pool of money.
00:39:13.000 A bunch of people are putting one Ethereum into this thing.
00:39:17.000 So all this money's going in.
00:39:18.000 And then you basically wait.
00:39:21.000 And as you're waiting, Ethereum's going out everywhere.
00:39:24.000 A bunch of people are withdrawing, right?
00:39:25.000 Because they're also taking their money out.
00:39:27.000 Right.
00:39:28.000 By the time you withdraw, there's nothing tying your Ethereum to your particular address to like this random external address because you send it to a different one.
00:39:38.000 So before, it's like if I send you a dollar and then you send that dollar on, we can easily trace that back to me, right?
00:39:45.000 It's like here, here.
00:39:46.000 But if I send a dollar to you and everyone's sending you a dollar and then you're sending a dollar to all these other wallets, then it's impossible to know which of those new wallets my dollar's from.
00:39:58.000 It's a crazy idea that these basically nerds in cryptography thought of, which is brilliant.
00:40:07.000 I mean it is brilliant because it is basically – it's almost impossible to trace.
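The pool he just walked through can be simulated in a few lines. This is a toy model of the idea only: everyone deposits the same fixed denomination, withdrawals go to fresh addresses, so from the chain alone any withdrawal could belong to any depositor. Real mixers like Tornado Cash enforce this with smart contracts and zero-knowledge proofs, not a trusted pool object like this one.

```python
# Toy fixed-denomination mixing pool: illustrates why withdrawals are unlinkable.

class MixerPool:
    DENOMINATION = 1  # everyone moves the same amount, or amounts would link them

    def __init__(self):
        self.deposits = []     # depositor addresses (visible on-chain)
        self.withdrawals = []  # fresh withdrawal addresses (visible on-chain)

    def deposit(self, addr):
        self.deposits.append(addr)

    def withdraw(self, fresh_addr):
        self.withdrawals.append(fresh_addr)

    def anonymity_set(self, withdrawal_addr):
        """From chain data alone, every depositor is an equally good candidate."""
        return set(self.deposits)

pool = MixerPool()
for i in range(100):
    pool.deposit(f"depositor_{i}")
for i in range(100):
    pool.withdraw(f"fresh_{i}")

# An investigator trying to trace "fresh_7" back faces 100 equally likely sources.
print(len(pool.anonymity_set("fresh_7")))  # 100
```

The "anonymity set" is the whole point: the more depositors using the same denomination, the less any one withdrawal says about where the money came from.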
00:40:13.000 But ultimately, the outcome of that is like, yeah, I encrypt all your data.
00:40:17.000 Joe, send me – I know you're the successful podcaster.
00:40:21.000 I want you to send me $10 million or your data is lost forever.
00:40:25.000 And you're like, call the police and you go, hey, track this guy.
00:40:29.000 And they're like, to what?
00:40:32.000 To a Bitcoin wallet?
00:40:34.000 To an Ethereum wallet?
00:40:36.000 What are we tracking here?
00:40:37.000 And then it goes to some mixer somewhere, and then we don't know where it goes after that.
00:40:41.000 So when Sam Bankman-Fried was working with regulators, when he was trying to impose regulations or encourage regulations, how could that have benefited him as opposed to Binance?
00:40:53.000 What could they have possibly done to make it easier or more profitable for him?
00:41:00.000 Why would he do that?
00:41:01.000 I'm not as familiar with the regulation side of things.
00:41:04.000 People were talking about that a lot.
00:41:06.000 What I know is everyone's always interested in pulling up the ladder after them and building the rule book around, like, hey, if you're from this certain jurisdiction that we're a part of, you're fine.
00:41:17.000 If you're not...
00:41:18.000 If you're from this one, you're not okay.
00:41:21.000 Or I might say, hey, CZ has connections to China.
00:41:25.000 Maybe that's a problem.
00:41:26.000 Or CZ has connections to here.
00:41:28.000 Maybe that's a big deal.
00:41:29.000 But I'm from the Bahamas, and I'm American, so that might be fine.
00:41:32.000 I mean, everyone's always interested in the regulations benefiting them.
00:41:36.000 The challenge now, though, is a lot of people had backed that bill, and now that the guy who basically pushed it was a fraud, they're, like, trying to retool it, and it's sort of, what's left after the guy who was spearheading this bill turned out to be a fraud?
00:41:57.000 It's kind of tough.
00:41:58.000 I was actually randomly, like, some senator's office reached out to me, and they're like, what do you think about this?
00:42:06.000 And I was like, I don't know, man.
00:42:07.000 You guys have to...
00:42:08.000 This is y'all's thing to figure out.
00:42:11.000 Ultimately, y'all have to...
00:42:15.000 My feeling is offshore entities should not be—they're not subject to our rules.
00:42:20.000 How can you allow offshore—like, yeah, I don't know.
00:42:24.000 It's very strange.
00:42:25.000 And these offshore entities were also using, like, U.S. branches.
00:42:29.000 Like, there's FTX U.S., which was, like, more regulated but not really that regulated.
00:42:34.000 It's a little, it's a strange time, man, to be covering crypto because I tried to tell people for years that this scam problem, this fraud problem, was going to undo sort of everything.
00:42:50.000 Like if you don't root out the scams, if you don't find ways to solve that, this is never going to work, because the money system has to be safe.
00:42:59.000 Like your grandma has to be able to charge back her credit card when there's a fraudster, right?
00:43:03.000 Or this whole thing doesn't work.
00:43:04.000 You can't rely on people being technically savvy in order to make something work.
00:43:11.000 If it's going to go to the public, you have to solve all these issues.
00:43:14.000 And unfortunately, we saw crypto kind of go mainstream before they had really taken that seriously.
00:43:20.000 Maybe some of them were taking it seriously, but not enough.
00:43:23.000 So if...
00:43:25.000 FTX didn't encourage regulation and CZ didn't get upset at that and sell off all his tokens.
00:43:33.000 Would they be still solvent today or still in operation today?
00:43:38.000 Would they not crash or was this inevitable?
00:43:41.000 It was inevitable.
00:43:42.000 So something to understand.
00:43:45.000 FTX was insolvent long before it was realized that they were insolvent.
00:43:50.000 Right?
00:43:51.000 So, that's the issue.
00:43:54.000 FTX's problem was not CZ. Ultimately, he's kind of the guy who pushed over the house made of sticks or something.
00:44:04.000 The problem was the foundation was wrong from the beginning.
00:44:07.000 If you don't have enough deposits to cover withdrawals, you just don't have the money, right?
00:44:13.000 Right?
00:44:15.000 Your issue is that anytime there's demand for withdrawals, you're going to encounter problems.
00:44:21.000 So it was going to be inevitable anytime any story broke that showed that maybe they're not as healthy as they should be.
00:44:29.000 There would have been a run on the banks and people would have found out.
00:44:31.000 It's just like, when are they going to find out?
00:44:34.000 He happened to be the final straw, if that makes sense.
00:44:36.000 So someone would have figured it out and someone would have started dumping their coins.
00:44:39.000 Yeah, people already – I mean, even leading up, CZ gets a lot of the credit for it, but already, like a day before they kind of shut down, myself and some other people were saying we think they're insolvent, because we had taken a look at their numbers and we said there's no way they have the money for this.
00:44:58.000 They don't have the tokens.
00:44:59.000 So we were warning people, hey, this is probably insolvent.
00:45:01.000 Get your money out.
00:45:02.000 But CZ ultimately was the big – he was the most notorious and well-respected person in the space to where people thought, OK, well, if he's saying it – he's a guy who only says positive things about crypto because he's a crypto executive.
00:45:17.000 So if he's saying there might be problems, there's probably some problems.
00:45:20.000 But Binance hasn't had similar problems.
00:45:48.000 Yeah, you don't know.
00:45:52.000 Mislabeled?
00:45:52.000 Yeah, he called it fiat@ftx. But it was a $10 billion hole.
00:45:58.000 What do you mean by mislabeled?
00:45:59.000 Well, it was on a spreadsheet, Joe.
00:46:01.000 So he put it on a spreadsheet for their balance sheet, and he mislabeled the account fiat@ftx. And so what prosecutors are now arguing is he knew, of course, what it was.
00:46:12.000 He deliberately obscured what that was to hide it from people who were trying to take a look at his books.
00:46:18.000 But...
00:46:20.000 It's just that's what I mean by black box.
00:46:21.000 You never know what games these guys are playing.
00:46:23.000 Like they say, oh, here's sort of like the rough estimate of our balances.
00:46:27.000 But oops, did I tell you about this $10 billion account?
00:46:29.000 Like I forgot.
00:46:33.000 It's so silly.
00:46:36.000 You find out like there were just no adults in that room and like the few adults that there were were like, you know, they had like a criminal lawyer.
00:46:44.000 Well, anyway, I don't think he's actually been convicted of anything.
00:46:46.000 I shouldn't say that.
00:46:47.000 They had this guy, Dan Friedberg.
00:46:49.000 Shady, shady.
00:46:51.000 Allegedly shady.
00:46:51.000 Dan Friedberg.
00:46:52.000 No, he's definitely shady.
00:46:54.000 I'll say that.
00:46:56.000 What was his...
00:46:56.000 I remember, but what was his...
00:46:58.000 His whole thing was he did this thing with Ultimate Bet.
00:47:01.000 So he was one of the lawyers.
00:47:04.000 So there's this poker site called Ultimate Bet.
00:47:06.000 And he got caught in this scandal where they had enabled this thing called God Mode on Ultimate Bet, where the...
00:47:14.000 The CEO could see everybody's hands and play on the site, seeing everybody's hands.
00:47:20.000 So he just, he cleaned up on all his own, like, his own customers.
00:47:25.000 Just basically taking their money, like, oh, I know exactly when to fold, I know exactly when to bet.
00:47:30.000 So he had God Mode enabled and then they found out.
00:47:33.000 Somebody found out about this God Mode.
00:47:34.000 And so the lawyer's like, how do we basically cover this up?
00:47:38.000 Dan Friedberg's like, how do we – what do you want me to do?
00:47:42.000 And he's like, hey, just make this problem go away.
00:47:45.000 This is the CEO. Like, go blame it on somebody else.
00:47:47.000 Go blame it on some like third party that got access to our website.
00:47:51.000 Say it was like a glitch or something.
00:47:53.000 And so that is the experience of the lawyer that FTX then hires: being complicit, on a leaked private call, in trying to cover up this God Mode scam.
00:48:05.000 That is his background.
00:48:07.000 And so I asked, you know, Sam, I was like, you know, what does it say if this is your chief regulatory officer?
00:48:16.000 This guy who enabled God or who helped cover up God Mode.
00:48:19.000 And he's like, well, I don't want to comment on other people or it's just like...
00:48:24.000 Well, how does he skate on that?
00:48:26.000 Like, how does a guy like that not wind up getting indicted?
00:48:32.000 I ask myself that every day.
00:48:35.000 Is it a matter of time?
00:48:37.000 Or is it he's gotten away with it?
00:48:39.000 So many of these scams are like these issues of either regulators not having time, not having the resources, not having sort of like it's maybe not big enough.
00:48:53.000 You know...
00:48:54.000 They're good people, a lot of the people going after these guys, but it's like trying to catch everyone who's speeding.
00:49:00.000 You know what I mean?
00:49:01.000 It's like people get away with it.
00:49:02.000 It's just there's too many people doing it.
00:49:04.000 You'll catch some people, but ultimately a lot of people will just basically skate by, even though by all rights they should have been caught.
00:49:11.000 In my view, what he did was criminal.
00:49:14.000 That's why he started.
00:49:14.000 But it hasn't been prosecuted or anything like that.
00:49:18.000 But he's on a leaked private call.
00:49:20.000 Everyone can go listen to it yourself.
00:49:22.000 It's just this shocking thing.
00:49:24.000 And I think that shows if you're running a shady empire, who's better than a shady lawyer to try to help you cover it up, right?
00:49:31.000 How did you get involved in what you do?
00:49:34.000 It's a weird thing.
00:49:37.000 So when did you start your YouTube channel?
00:49:39.000 So I started it a few years ago, 2018, 2019. And what was the first video?
00:49:46.000 I started as sort of like an interview show, nothing about scams.
00:49:52.000 I had a channel before it.
00:49:54.000 So I went to school for chemical engineering and hated it.
00:49:57.000 I was miserable.
00:49:58.000 I was like, I do not want my life to be earning 2% more of, you know, of a bottom line for ExxonMobil or any chemical company.
00:50:07.000 I just wasn't interested.
00:50:08.000 I was like, that's not my life.
00:50:09.000 So I always wanted to sort of, you know, have a voice.
00:50:13.000 And so I started a YouTube channel just doing random videos.
00:50:16.000 I hadn't really found my footing.
00:50:19.000 But throughout my entire life, I had kind of had this relationship with like hucksters and fraud.
00:50:25.000 Yeah.
00:50:41.000 You can just treat it naturally.
00:50:42.000 Just don't worry about, hey, don't listen to the, you know, the doctors.
00:50:46.000 Don't listen to your general practitioner.
00:50:48.000 You can just treat it with, like, colloidal silver.
00:50:51.000 Or just put a bunch of garlic cloves in the pot.
00:50:55.000 I still remember her house, like, reeked.
00:50:57.000 She would put 60 cloves of garlic in, like, in a stew.
00:51:01.000 And she would drink it up because she thought that would make her better.
00:51:04.000 Ultimately...
00:51:05.000 My dad convinced her, like, you gotta get the surgery.
00:51:08.000 Like, this ain't gonna fly.
00:51:10.000 You have to, you know, ultimately get the surgery, which thankfully she did, and she's fine now.
00:51:15.000 She takes medication to replace the hormones her thyroid would generate.
00:51:18.000 But I saw my mom kind of get swept in this thing that I knew was nonsense, but it's sort of like hard.
00:51:24.000 You kind of have to disprove every single, like, there's always a new, like, health guy telling you that there's some new alternative discovery, whatever.
00:51:32.000 And I was like, this is kind of weird.
00:51:34.000 And I was like, why do they hate doctors so much?
00:51:36.000 And it always seems to like end up with a sales pitch.
00:51:38.000 It never was like, hey, let me just give you this free thing.
00:51:41.000 It was like always like there's something, there's a catch.
00:51:44.000 So I didn't really know what I was looking at at the time.
00:51:47.000 Then I go to college and all my friends get into MLMs, multi-level marketing, you know, sort of like, hey, you're gonna get rich.
00:51:53.000 So I was always getting invited to these like, get rich seminars.
00:51:57.000 And I'd go because it was like my friends like said, hey, we have to get somebody, you know, you want to go?
00:52:02.000 And I was like, Sure, I'll go.
00:52:04.000 I was kind of fascinated.
00:52:05.000 And you'd see these guys.
00:52:06.000 They're like, hey, don't work a nine-to-five job.
00:52:09.000 Be free like me.
00:52:11.000 And I'm like, you're here on a Sunday at 5 p.m.
00:52:15.000 How free are you, really?
00:52:17.000 You're just kind of grifting here.
00:52:20.000 But you'd see them in nice cars.
00:52:22.000 And so I was like, what am I looking at?
00:52:25.000 And then...
00:52:26.000 As I'm doing my YouTube show, I get fed a bunch of ads, like get-rich-quick schemes.
00:52:30.000 You've got a bunch of people flexing in their Lamborghinis, like 25 years old, telling you, you want to get rich by 25 or 22? I'll show you.
00:52:39.000 I made a million dollars.
00:52:41.000 I'm a millionaire by the time I'm 23 years old.
00:52:43.000 Just buy my course.
00:52:45.000 My course is $2,000.
00:52:47.000 Pay me $2,000.
00:52:48.000 I'll teach you to get rich quick.
00:52:50.000 So I saw this, and all my experiences up to that point kind of led me to: I want to say something. Why is nobody saying anything? It just seemed like there were these people pitching this stuff and nobody was talking about it. So I made this random video just basically screaming about all these scammers online. And unlike my previous work, which had gotten some reactions but not much, what I noticed is it resonated with people beyond the views, if that makes sense.
00:53:20.000 Like, I was just like, there was something different about the reaction to it.
00:53:24.000 Like, and, you know, victims would reach out to me.
00:53:25.000 They'd be like, hey, I'd been scammed by this guy and I didn't realize what was going on.
00:53:29.000 And you showed me, you know, sort of like how the whole scheme worked.
00:53:35.000 So I decided to start pursuing it step by step.
00:53:38.000 And at first it was like just me discovering like, well, what is this?
00:53:42.000 Well, how does this scheme work?
00:53:43.000 Okay, so I buy this course and then what?
00:53:47.000 What are you saying in the terms of service that means that I can't sue you?
00:53:50.000 You have all these terms of service that basically say none of what I'm saying is true.
00:53:53.000 Like they say they can get you rich in the sales pitch and then in the terms of service they said results may vary.
00:54:00.000 What's that about?
00:54:01.000 I mean, ultimately, I realized, like, oh, there's this sophisticated way that they're preying on my psychology, and they're setting it up with, like, I used to be broke like you.
00:54:11.000 Well that's a strategy.
00:54:13.000 A lot of these guys were never broke, right?
00:54:15.000 And it's just part of the story you have to tell to be really effective.
00:54:18.000 It's like, I used to be just like you, Joe, but then, you know, I found out that doing Amazon dropshipping is the way to make millions of dollars.
00:54:26.000 And, you know, I used to fail, but by these little tricks, I found out how to be successful.
00:54:32.000 And if you invest with me, I'll save you time.
00:54:35.000 You know, you could do it yourself, Joe.
00:54:36.000 You could do it.
00:54:37.000 But what?
00:54:37.000 It's going to take you five years.
00:54:38.000 Get with me, and I'm going to shortcut your success.
00:54:41.000 Two months, you're going to be making five figures a month.
00:54:45.000 Ten months, maybe six figures a month.
00:54:46.000 And I've done it for people before.
00:54:48.000 That's the social proof.
00:54:49.000 I've shown people how to do this.
00:54:53.000 You can watch them.
00:54:54.000 These are real people, Joe.
00:54:56.000 You can be just like them.
00:54:58.000 And so I started watching this and I started seeing it and I'm like, oh my gosh, this is so interesting.
00:55:03.000 I start covering it and then I start to get cease and desist letters.
00:55:05.000 They don't like that.
00:55:06.000 So they start to send me, they say, hey, you better shut up or we're going to sue you.
00:55:10.000 And I was like, okay.
00:55:13.000 I'm not going to stop making these videos.
00:55:14.000 I just kept making the videos.
00:55:16.000 And ultimately they never did.
00:55:18.000 But I start doing that.
00:55:21.000 And after I cover get-rich-quick schemes for a while, I start hearing about these tokens.
00:55:26.000 And they're like, hey, selling courses...
00:55:30.000 It's always the new grift.
00:55:31.000 You always have to find, because people figure it out.
00:55:33.000 They go like, oh, that actually doesn't work.
00:55:36.000 Dropshipping is not actually this incredible business that you thought it was that you're going to get rich easily.
00:55:41.000 So don't do that.
00:55:42.000 Go do crypto.
00:55:44.000 You've got to get into crypto now.
00:55:45.000 And then it became NFTs for a while.
00:55:48.000 But like, so I started, I eventually like pivoted into this crypto direction and learned all about that.
00:55:53.000 But it started just from a curiosity about scammers and I wanted somebody to say something because I was just like, why?
00:56:01.000 Why does this make some people tens of millions of dollars and nothing happens?
00:56:08.000 Why are some of these people making hundreds of millions of dollars?
00:56:11.000 People are miserable at the end of it and nothing happens.
00:56:17.000 And that was the start of my show.
00:56:19.000 So you start just doing interviews about what?
00:56:22.000 Like, you just, you didn't start doing this.
00:56:24.000 You started doing just like a normal interview show?
00:56:26.000 I was just doing a normal interview show with a few of my buddies.
00:56:30.000 And it just was kind of, I was just trying to find my way.
00:56:35.000 I was just trying to like, even before that I had done a show where I was like trying to break down these topics.
00:56:40.000 I was like researching addiction and I was just like trying to, you know, make some digestible piece of media around like addiction, right?
00:56:47.000 I always was interested in communicating complicated ideas in a digestible way.
00:56:53.000 I just felt like, man, there's so much cool science out there.
00:56:55.000 There's so many cool ideas out there.
00:56:57.000 How do we communicate this?
00:56:58.000 So I did that for a while.
00:56:59.000 Then I started like CoffeeZilla was like this spinoff channel.
00:57:02.000 I was like, let me do some interviews.
00:57:04.000 And then it was also my place.
00:57:06.000 I just threw things at the wall.
00:57:07.000 So then that's where I threw one of my rants.
00:57:08.000 Like I just like ranted about this thing against the wall and it kind of like stuck.
00:57:12.000 And I just enjoyed it.
00:57:14.000 I was like, man, screw these people, you know, like who are taking advantage of like...
00:57:18.000 And what was sick about it is they're not taking advantage of rich people because rich people will sue you.
00:57:23.000 If you screw them over, rich people will sue you.
00:57:26.000 They're taking advantage of like people who they're like at $10,000 or $2,000.
00:57:31.000 That's like all their disposable income.
00:57:33.000 And they're betting on these hucksters to dig themselves out of these situations.
00:57:38.000 And that's one of the things I try to tell people is like, a lot of the success of these things is not from, it's not even about greed.
00:57:46.000 It's about desperation.
00:57:49.000 When you fall for these things, a lot of times, you know, you're like my mom.
00:57:52.000 Like, the reason she fell for these things is she so badly didn't want surgery that she was willing to believe anything, right?
00:58:08.000 Because she's like, you know, if I have cancer, and you tell me I can be better, and you tell me it's $10,000, or you tell me it's a dollar, I'll pay you either way, right?
00:58:08.000 And so people are financially, they feel like they're terminally ill financially.
00:58:13.000 They're just like, I don't know how to get out of this.
00:58:16.000 I feel like I have no opportunities.
00:58:18.000 This guy, I'm watching YouTube.
00:58:19.000 I'm trying to better myself.
00:58:21.000 I'm trying to educate myself.
00:58:22.000 And this guy comes on and tells me, it's all a click away, right?
00:58:25.000 It's all a credit card swipe away.
00:58:28.000 What has been the reaction?
00:58:29.000 Like, what has been the most visceral or violent reaction to what you've done and exposed?
00:58:37.000 I think the biggest story we've probably ever broken was either...
00:58:43.000 The FTX stuff, but that was already kind of going on.
00:58:45.000 It was probably the Logan Paul story.
00:58:47.000 The CryptoZoo saga.
00:58:50.000 That was a case where, you know...
00:58:52.000 It's just a classic influencer greed story where this guy launches...
00:58:57.000 An NFT project, does millions upon millions of dollars in sales, and delivers nothing.
00:59:03.000 He promises the world a fun blockchain game that earns you money, and he did nothing.
00:59:09.000 And the project was left abandoned, and people were miserable, complaining, complaining.
00:59:14.000 No one says it, but they don't have a voice.
00:59:16.000 I'm kind of aware that you covered it, but I don't know the story.
00:59:21.000 Let me back up then.
00:59:22.000 Okay.
00:59:24.000 Logan Paul is a popular influencer.
00:59:26.000 You know who he is.
00:59:29.000 So he, along with a lot of influencers, got really interested in the crypto space.
00:59:34.000 And he had done a coin before that called Dink Doink, which was abandoned shortly after he promoted it.
00:59:42.000 People got invested.
00:59:43.000 Goes to zero, right?
00:59:45.000 And he says, well, that's not my project.
00:59:46.000 That was my buddy's project.
00:59:48.000 And then like a month later, he's like, I actually do have a project.
00:59:51.000 Excited to announce it.
00:59:53.000 It's called CryptoZoo.
00:59:54.000 It's a fun game.
00:59:55.000 They called it a fun game that earns you money.
00:59:57.000 Basically, the idea is they're going to sell you these two things.
01:00:01.000 Eggs, as NFTs. And then there's a coin aspect to it called Zoo Tokens.
01:00:06.000 OK, so you can buy these zoo tokens to buy the eggs.
01:00:10.000 And the idea is the eggs will then hatch into animals that will earn passive zoo tokens.
01:00:17.000 So you can buy eggs with zoo tokens and then the eggs will passively earn you zoo tokens.
01:00:22.000 Does that make sense?
01:00:23.000 No.
01:00:24.000 Well, don't worry.
01:00:26.000 You're kind of actually caught up.
01:00:27.000 So these zoo tokens were basically this passive income.
01:00:31.000 You basically invest up front and then you're sort of getting the tokens back out, which you can sell, I guess.
01:00:37.000 That was the idea pitched to people and people immediately buy in.
01:00:40.000 Three million dollars in NFT sales, tens of millions of dollars in the tokens itself, the zoo tokens.
01:00:46.000 People are so excited about it because it's Logan Paul and he says this is his project.
01:00:52.000 He's putting his name behind it, his backing behind it, and he's a great marketer.
01:00:55.000 I mean, you've got to give the guy credit where credit is due.
01:00:58.000 He's a tremendous marketer.
01:00:59.000 So people get all excited.
01:01:02.000 All of a sudden, the hatch day comes when you're supposed to hatch these eggs.
01:01:07.000 And half the hatching doesn't work.
01:01:10.000 How does the hatching work?
01:01:12.000 Is it on a computer model?
01:01:14.000 It was on the blockchain.
01:01:17.000 So your NFTs would turn into different NFTs.
01:01:20.000 They would transform into the animals.
01:01:23.000 They go from an egg to an animal.
01:01:24.000 How?
01:01:26.000 It's just blockchain coding.
01:01:28.000 I mean, it's just...
01:01:29.000 But how do they...
01:01:30.000 Is it predetermined?
01:01:33.000 Yeah.
01:01:33.000 How does your egg become an ostrich?
01:01:35.000 It's just random.
01:01:36.000 It's supposed to be randomly generated animals.
01:01:40.000 So you might get a rhino, you might get a chicken.
01:01:44.000 Exactly.
01:01:44.000 And then you could crossbreed your rhino with a chicken and get a ricken or something and get even more tokens.
01:01:53.000 Is this it?
01:01:54.000 Yeah, there it is.
01:01:55.000 You get like bear shark.
01:01:57.000 So people start to like...
01:01:59.000 Is this still around?
01:02:01.000 So they say they're gonna go back and fix it now.
01:02:04.000 So Logan, after being not involved for like a year, as soon as my video comes out, he goes, Damn, what a coincidence!
01:02:11.000 Like I've been working on it.
01:02:12.000 Like I was gonna, you know, make it like launch it.
01:02:15.000 In reality, he hadn't touched it for a very long period of time.
01:02:18.000 But so sorry to back up.
01:02:20.000 Okay.
01:02:21.000 Half the eggs don't work.
01:02:25.000 And they're not actually earning anything.
01:02:28.000 The whole time they said they're gonna earn you these tokens, right?
01:02:30.000 They're not earning anything.
01:02:31.000 So the promises haven't been fulfilled.
01:02:33.000 There's just sort of all this stuff going on.
01:02:35.000 And behind the scenes, Logan's quiet.
01:02:38.000 Come to find out, he had hired basically criminals who were selling on the back end, like some of the tokens.
01:02:46.000 And he was sort of like, I don't know what his thing was.
01:02:50.000 I think he realized like, oh, it's not going to be that successful.
01:02:52.000 Let me move on.
01:02:53.000 I think his mentality was, let me just move on.
01:02:56.000 The problem, though, is you have millions of dollars of investment in a thing that you promoted.
01:03:00.000 You told everyone it was going to make them money, and then you never delivered anything.
01:03:05.000 So my story was basically showing that, showing the victims of the scheme, and in response, he's like, well, I'm going to sue you for that.
01:03:14.000 He said he was going to sue you.
01:03:15.000 Yeah, he said, I'll see you in court.
01:03:16.000 And then the backlash against him was so severe that he releases a video saying, thank you, CoffeeZilla, for showing the world what happened.
01:03:25.000 And I appreciate it.
01:03:27.000 I responded out of anger.
01:03:29.000 But I'm going to make things right.
01:03:30.000 I'm going to fix the game to what it was supposed to be.
01:03:32.000 And I'm going to pay back.
01:03:35.000 1.7 million dollars.
01:03:37.000 I'm committing 1.7 million to anyone who bought an NFT can get a refund.
01:03:41.000 Now, there's a bit of an issue with that.
01:03:43.000 So that's nice.
01:03:43.000 I actually think it's great that that happened, but there's two issues with it.
01:03:46.000 Number one, the NFTs were only a small part of the sale.
01:03:51.000 They actually weren't even half.
01:03:53.000 Because people bought these tokens.
01:03:54.000 So the people who bought tokens get nothing.
01:03:56.000 He's offering this refund on the NFTs.
01:03:59.000 The other problem is he hasn't refunded the NFTs.
01:04:03.000 I've actually reached out to him twice.
01:04:05.000 It's been like over a month since he's done this.
01:04:08.000 So he said he's going to do it.
01:04:09.000 And then the Discord, like he's posting in this little chat room with the investors.
01:04:15.000 After he said he was going to do it, he's posted nothing.
01:04:17.000 There's no way to get a refund right now.
01:04:18.000 So I keep asking, like, hey, you promised $1.7 million to these investors.
01:04:23.000 They're all waiting.
01:04:24.000 It's been over, I think it's almost been two months now, and there's nothing.
01:04:30.000 So it's like, you know, he says that he's refunding people, which sounds great for PR, and then it's just like radio silence.
01:04:38.000 What I'm ultimately looking for is some accountability from these guys.
01:04:41.000 They're happy to make money from the endeavors.
01:04:44.000 They're happy to potentially make millions of dollars from these different projects they're spinning up.
01:04:50.000 But the second accountability is asked for, you can't reach them.
01:04:55.000 I would assume Logan's a very busy guy.
01:05:00.000 Sure.
01:05:00.000 I would assume that he probably didn't come up with this on his own.
01:05:05.000 I would assume that someone probably came to him with this project.
01:05:09.000 This is just total assumption.
01:05:11.000 Guesswork.
01:05:12.000 Guessing on my part.
01:05:13.000 So we have text messages from behind the scenes.
01:05:15.000 A lot of people – the people who were responsible for it say Logan kind of spearheaded the idea.
01:05:20.000 And he says he spearheaded the idea.
01:05:22.000 So it was his idea?
01:05:23.000 Yeah.
01:05:24.000 And so he's working with someone, right, that probably assured him that this would work?
01:05:31.000 Uh, yeah.
01:05:32.000 I mean, he had this team of a few guys who they didn't do much vetting into, and some of them turned out to be criminals.
01:05:42.000 But, you know, my feeling is ultimately, no matter what happens, like, when you take people's money, that's what I'm trying to tell these influencers on my show: when you take people's money, it's different.
01:05:55.000 When you tell them you're going to make them money and you get into the financial investment game, your responsibility is different.
01:06:01.000 You can't just always pass the buck to like, oh, it was like a guy that's not that trustworthy.
01:06:06.000 It's like, all right, that might be true.
01:06:08.000 Then go fix it.
01:06:09.000 Go hire some more guys that are trustworthy and fix the thing.
01:06:13.000 And I think my experience – because I've talked to Logan and that's why I know he didn't respond to me because I texted him.
01:06:21.000 I said, hey, where's this money?
01:06:22.000 He left me on read.
01:06:24.000 But I've talked to him, and when I talk to him, there's just sort of this feeling of, he's like, I just don't want to think about this.
01:06:33.000 I don't want to be – he wants to focus on Prime, which is successful.
01:06:37.000 He doesn't want to be bothered with the victims of the scheme that he ultimately thought of in the first place.
01:06:43.000 Okay.
01:06:43.000 So is it possible that he's just gathering the money or working out a way to do it legally where it makes sense?
01:06:53.000 It's very frustrating because, you know, at every turn it's just sort of like...
01:06:59.000 You know, I want to say it's possible.
01:07:02.000 We just don't know.
01:07:02.000 And it's just sort of like...
01:07:04.000 When you promise people refunds, like the longer you wait, you know the less people are actually going to take that refund.
01:07:10.000 If Walmart says, hey, bring in this skull, I'll give you a refund.
01:07:14.000 And you're like, alright, when can I bring it in?
01:07:15.000 And they don't respond to you for two months.
01:07:18.000 They know that you're less likely to actually take the refund.
01:07:21.000 So I don't know if he's doing it because he wants less people to get the refund.
01:07:25.000 He probably is busy, but my thought is...
01:07:28.000 A transgression of this magnitude where you're playing with people's money and livelihoods, you cannot take it lightly.
01:07:35.000 And that's one of the things is these influencers got into this crypto space.
01:07:39.000 I don't think they fully appreciated it.
01:07:43.000 They're now dealing with financial investments, and it's not a joke.
01:07:46.000 It's not like a brand deal where, you know, if NordVPN isn't as great as they said it was, you know, it's all cool.
01:07:54.000 Right.
01:07:54.000 It's now, it's your company, and you promise people you're going to make the money, and now you haven't said anything for over a year, then you say you're going to refund them, and you don't say anything for two months.
01:08:05.000 That's an issue.
01:08:07.000 The whole crypto space and the whole NFT space is filled with weirdos.
01:08:14.000 Everyone that I've talked to that wants to come to me with some idea, it's always very strange.
01:08:21.000 When people have come to my business manager with financial propositions, they're always...
01:08:30.000 It's logical.
01:08:31.000 Like, it makes sense.
01:08:32.000 Oh, invest in this.
01:08:33.000 This is a fund, and it does this, and this is how you get a return on your investment.
01:08:38.000 None of that stuff ever made any sense to me.
01:08:41.000 I avoided all of it, luckily, but I was...
01:08:46.000 Propositioned by multiple different entities about these kind of things.
01:08:50.000 And I was like, I don't know what you're saying.
01:08:52.000 I don't know, like, why would anybody buy an NFT? Like, you know, oh, it's a non-fungible token and then you put it in an NFT wallet and you have this thing.
01:09:02.000 I'm like, but I have the same thing on my phone.
01:09:04.000 I can take a screenshot of that NFT and I have it.
01:09:06.000 Like, what is the thing, the physical thing?
01:09:09.000 You know, it's like, I understand, like, Beeple.
01:09:13.000 Do you know who Beeple is?
01:09:14.000 Oh, yeah, yeah, yeah.
01:09:14.000 Yeah, so Beeple made that little GigaChad thing for us.
01:09:17.000 It's a piece of digital artwork.
01:09:19.000 Yeah.
01:09:19.000 And, you know, he has an actual museum of digital art.
01:09:22.000 Right.
01:09:23.000 And if you buy a piece from him, you actually get a physical piece of digital art.
01:09:30.000 There's something there.
01:09:31.000 Yeah.
01:09:31.000 I get it.
01:09:32.000 Makes sense.
01:09:33.000 Like, the Ape Yacht Club, whatever the fuck that is.
01:09:38.000 Like, what's going on here?
01:09:40.000 I have a friend of mine who's an artist who made over a million dollars on NFTs, and I'm like, what did you do?
01:09:47.000 He talks to me for 10 minutes, and I'm like, I don't even know what the fuck you just said.
01:09:51.000 Yep.
01:09:52.000 So, let me start by saying, so, I work with a super talented digital artist.
01:09:58.000 So, he does a lot of my set stuff.
01:10:00.000 So, I have a lot of respect for, you know, the challenge of a lot of digital artists as opposed to physical artists.
01:10:08.000 It's like, if you're a painter, you sell your paintings.
01:10:10.000 If you're a digital artist, how do you print it out?
01:10:13.000 Like, what do you do?
01:10:14.000 So, NFTs were sort of originally, it was like, this is for artists.
01:10:18.000 Like this is a way for a digital artist now to legitimately sell scarcity in their work, which previously they had no way of doing.
01:10:27.000 You still can take a screenshot, but you don't own the NFT that like sort of the digital artist has sort of provisioned like this is the thing that matters.
01:10:35.000 So I have a lot of – like in that way, in that one way, I get it.
01:10:40.000 I get why people wanted it to become the next big thing.
01:10:44.000 The problem is it was quickly taken over as an investment vehicle.
01:10:49.000 Now it's like everybody is an art dealer and now everybody is an art expert and now we're trying to make a buck, right?
01:10:55.000 And that – Anytime you get art involved with money, things get weird.
01:11:00.000 But especially when you get art involved with quick flips and returns and now we're going to all make money from this.
01:11:06.000 That's when things get really weird.
01:11:07.000 So like I feel bad sort of for digital artists, legitimate digital artists who really do legitimate NFT work.
01:11:14.000 I don't think there's anything wrong with selling your work as a digital artist.
01:11:17.000 Like what do you expect them to do?
01:11:18.000 Not everybody can go work for like some random YouTuber.
01:11:21.000 Like, you know, people have to earn a living.
01:11:23.000 They do legitimate work and good work.
01:11:26.000 But the problem is when greed gets involved, when people get involved basically promising money.
01:11:34.000 In the case of the Bored Ape Yacht Club, it's sort of like what their idea was.
01:11:38.000 We'll start almost like a country club where the NFT is the pass for the country club.
01:11:44.000 And you can go chat with the holders of this Bored Ape Yacht Club.
01:11:48.000 And I guess the idea is because it's expensive, then you get in the room with people with money.
01:11:56.000 But I found that whole thing weird because of the like, you know, Jimmy Fallon's getting involved and like, and then all these like mainstream celebrities, you know, start promoting this thing.
01:12:05.000 And it's like, this is a little, why is everyone doing it?
01:12:09.000 And then you come to find out that a lot of them had their Bored Apes bought by this company called Moon Pay, who is trying to like, you know, use the celebrity's likeness to push that out.
01:12:19.000 And it's just like, this is a strange, what's actually going on here?
01:12:23.000 Is it just about the art?
01:12:24.000 It doesn't actually appear to be.
01:12:29.000 I just don't understand how it worked.
01:12:31.000 I don't understand how anybody looked at it and went, this is logical, I'm gonna buy that.
01:12:36.000 So think about it this way, though.
01:12:38.000 So I'm sure you've played a bunch of games, video games, right?
01:12:41.000 Have you ever played a video game where they have in-game skins and different outfits?
01:12:47.000 Sure.
01:12:47.000 So tons of businesses have been built, like the entire free-to-play model of Fortnite.
01:12:54.000 Fortnite makes millions and millions and millions of dollars.
01:12:58.000 Their whole model is built on skins and different in-game purchasable items.
01:13:02.000 You don't actually own anything.
01:13:04.000 Ultimately, it just lives and dies with your computer.
01:13:07.000 NFTs are sort of like – I guess the idea with NFT gaming or whatever is like you would actually own it.
01:13:14.000 Like the game couldn't take it away from you.
01:13:15.000 You'd have some piece of art that you'd have some ownership of that would matter.
01:13:20.000 Yeah.
01:13:22.000 Again, I think the challenge is just like where greed and like marketers get involved.
01:13:29.000 They just sort of like ruin everything with scams and fraud to where it's very tempting and I get the temptation to just throw everything out.
01:13:35.000 It's all just a fraud, right?
01:13:37.000 Because you see so much of it and so much of it is just like kind of people trying to scam you basically for, you know, and use especially celebrity likenesses to scam people.
01:13:50.000 Yeah, the celebrity part is a big key in all this, right?
01:13:54.000 I mean, it's a huge part.
01:13:56.000 This is how we get legitimacy for products now.
01:13:58.000 It's sort of like...
01:13:59.000 Endorsements.
01:14:00.000 Endorsements.
01:14:01.000 It's like you've got to find a guy to do it.
01:14:04.000 So ultimately, like...
01:14:06.000 And the AI stuff's scary because ultimately you'll get the AI deepfaking you into, you know...
01:14:11.000 Yeah, there's one of me.
01:14:12.000 There's one of me and Andrew Huberman selling some supplement that's not real.
01:14:16.000 Right, yeah.
01:14:17.000 I don't know if the supplement's real, but I know the commercial's certainly not real.
01:14:21.000 Yeah, they deepfaked you.
01:14:23.000 I think it went viral on Twitter for a bit.
01:14:25.000 Yeah, well, everybody knew it was a deepfake, luckily.
01:14:28.000 It wasn't quite good enough.
01:14:30.000 Yeah, and then, you know, we tried to figure out who's doing it, and you just run into a bunch of shells.
01:14:36.000 It's like very difficult to figure out.
01:14:38.000 I'll tell you offline who's doing it.
01:14:39.000 Okay.
01:14:39.000 I know.
01:14:40.000 I looked into it because I was curious.
01:14:41.000 I was curious.
01:14:42.000 And, you know, that same person had put out a lot of ads about like Kim Kardashian.
01:14:48.000 They had a deepfake of Kim.
01:14:49.000 They had a deepfake of...
01:14:51.000 They had one of you saying that like...
01:14:53.000 So they have one of you saying like, this product's great.
01:14:56.000 You know, go buy it.
01:14:58.000 And then there's another one where you were complaining that Andrew Tate launched it and you thought you were sort of like, Andrew Tate's going after my brand.
01:15:06.000 Like, because it's very similarly named to one of your products.
01:15:09.000 And so it's like, it was kind of this hilarious thing where they were playing both sides.
01:15:12.000 It's like, it's Joe Rogan's.
01:15:14.000 It's also, like, Joe Rogan hates that it's out there because it's so good.
01:15:17.000 Then it's like, Kim Kardashian loves it.
01:15:19.000 They had, like, every celebrity basically endorsing this thing, all through AI. And it's just a testament to our times.
01:15:24.000 Like, Celebrities are the new sort of authorities for better and often for worse.
01:15:31.000 But people use that as currency now.
01:15:34.000 And with AI, you can just fake a lot of that stuff.
01:15:38.000 That's what I'm worried about.
01:15:40.000 I feel like this is the very first volley in a war on reality.
01:15:45.000 In that the way AI is structured, it's so prevalent.
01:15:45.000 And so, like, when you look at ChatGPT, and then you look at deepfakes, and you look at the ability to take... I mean, there's a whole podcast of me interviewing Steve Jobs that's not real.
01:16:01.000 And it sounds like a real podcast.
01:16:05.000 There's a lot of podcasts.
01:16:07.000 Yeah, it's crazy.
01:16:09.000 Sometimes I'll check one and I'll go, is this real?
01:16:11.000 I saw there's a bunch going around.
01:16:13.000 Now they can imitate anyone's voice.
01:16:15.000 I think you were probably one of the first because you have so many hours of footage.
01:16:19.000 So they had a lot of training data.
01:16:20.000 There was a Canadian company that showed proof of concept of this a few years back.
01:16:24.000 And I was like, oh boy.
01:16:26.000 I know where this is going to lead, because they just took all the hours of footage, so they basically have me at every pitch and tone, yelling and laughing, and they can have me say anything at this point.
01:16:38.000 Literally.
01:16:39.000 And now they're getting really good at the inflection because one of the problems with these AI tools was they were very monotone and they can only imitate your voice in a monotone.
01:16:47.000 But now they're getting better at like, okay, we'll accent the voice and then we'll talk calmly and then we'll be able to, you know, get more excited.
01:16:54.000 So that's a huge problem.
01:16:56.000 Have you seen the face ones though?
01:16:58.000 That's the new ones.
01:16:59.000 Jamie, can you pull up the new TikTok face filters?
01:17:04.000 Have you seen this?
01:17:05.000 Which face filters?
01:17:07.000 The new...
01:17:07.000 Which one in particular?
01:17:08.000 I think it's like their glam one.
01:17:09.000 There's a bunch of Twitter threads right now on it.
01:17:11.000 I've seen the glam ones.
01:17:12.000 It's amazing.
01:17:13.000 It's amazing how they can put makeup on you.
01:17:16.000 No, no.
01:17:16.000 You look different.
01:17:17.000 Yeah, you look different.
01:17:18.000 Yeah.
01:17:18.000 It's literally going to be this new world where you won't know, like, catfishing is going to a new level.
01:17:23.000 Yeah, you'll have no idea what someone looks like.
01:17:25.000 There's a woman who did this ad and she was laying in bed.
01:17:28.000 She's like, I don't have any makeup on.
01:17:30.000 And in the old ones, like, there's a really funny video of this person that I know, actually, who put this filter on.
01:17:37.000 And in one of the scenes, she puts her hand in front of her face and the lips are superimposed on her hand.
01:17:45.000 And it looks so preposterous.
01:17:47.000 And the fact that she's so not aware of the fact that this thing is happening.
01:17:54.000 And she put the video out.
01:17:55.000 It's like...
01:17:56.000 We were laughing so hard.
01:17:58.000 First of all, you don't look like that.
01:17:59.000 Everyone knows you don't look like that.
01:18:01.000 And then when you put your hand in front of your face, you didn't see this fucking giant cartoonish fake lips that came over your palm.
01:18:10.000 This is so crazy.
01:18:11.000 So this is the one that I saw.
01:18:12.000 This woman.
01:18:14.000 Like, this is crazy.
01:18:15.000 Yeah, now if you touch...
01:18:17.000 Yeah.
01:18:18.000 Well, now if you touch your face, you should be able to.
01:18:21.000 They don't superimpose anything.
01:18:24.000 It's all really real.
01:18:25.000 You can do anything to your face, and you can manipulate it, and the AI tracks it all.
01:18:32.000 And I've seen people do it where they have two screens, like one that's actually them and one that's them with the filter, so you see it side by side.
01:18:39.000 It's shocking.
01:18:41.000 Yeah, it's really worrying, like, you know, these technologies, part of the problem is you can deploy them so cheaply and at scale to where, you know, in my world, I'm more worried about, like, the Joe Rogan deepfakes and, like, people scamming people out of money.
01:18:55.000 But I also worry about, like, the romance scammers.
01:18:58.000 Yeah.
01:18:58.000 Like, how good that's going to get.
01:18:59.000 Oh, yeah.
01:18:59.000 When ChatGPT now has all the scripts down and instead of paying someone to get it.
01:19:05.000 You have someone FaceTime this person.
01:19:07.000 Oh, yeah.
01:19:07.000 You have someone FaceTime them.
01:19:08.000 You have it all generated by an AI. It costs you almost nothing to do.
01:19:12.000 I mean, one of the rise of, like, robocalls was it's just cheaper.
01:19:15.000 Like, it's really hard if you're going to hire people to do it.
01:19:18.000 You kind of need an ROI. If you have robots, you know, sending spam, now it's...
01:19:23.000 Now it's good because you don't actually need to earn that many dollars per call to make it viable.
01:19:28.000 So you just call everybody.
01:19:29.000 One of my daughters got a phone call about how much money she owes and then if she doesn't pay this amount right away, the authorities will be in contact with her.
01:19:39.000 And, you know, she was 10 and she was laughing and she's like, what is this?
01:19:43.000 Am I in trouble?
01:19:45.000 She plays it for me.
01:19:46.000 I'm like, oh my god, this is hilarious. But it's just, when you take really lonely, sad people... Like, I remember watching this television show once. It was some exposé on this poor man.
01:19:57.000 He was just like this old divorcee who was being scammed by someone, and I don't even think he had like a voice conversation with this person, but he traveled to the UK or somewhere in Europe twice to meet with this person that he'd been sending all this money to.
01:20:15.000 And both times something came up and the person couldn't meet him there.
01:20:18.000 This poor old guy just kept going there thinking that the love of his life was there.
01:20:23.000 And they interviewed his daughter and she was beside herself and she couldn't talk sense into him.
01:20:29.000 And they interviewed him and he was in denial and it was just so pathetic and sad.
01:20:33.000 And what is that going to be like now with this kind of shit?
01:20:37.000 It's going to be a lot more prevalent and it's going to get a lot better.
01:20:40.000 I mean the rise of the ability to generate like a realistic companion avatar is going to be – I mean it's massive.
01:20:50.000 These people were complaining to me the other day about this other thing – which you're going to find as well.
01:20:54.000 So there's this app where you can basically have a girlfriend who's an AI. Where, like, basically, you know, it's all fake, it's all AI, but it's like a companion chatbot.
01:21:08.000 And, you know, I get a lot of emails like, oh, such and such is a scam.
01:21:11.000 And usually it's like some Ponzi scheme or some get rich quick scheme.
01:21:14.000 This one, they were furious because the creators had sold it like, hey, you can have hot roleplay with this AI bot.
01:21:24.000 And then the people developing the app one day said, hey, we're turning that off.
01:21:28.000 But the reaction from the community was like, you took away my girlfriend.
01:21:33.000 Oh, Jesus Christ.
01:21:34.000 You took away my partner.
01:21:38.000 And these people had legitimately bonded with a bot.
01:21:46.000 Well, that's the Joaquin Phoenix movie.
01:21:49.000 Yeah, yeah, yeah.
01:21:50.000 What is it?
01:21:50.000 She?
01:21:51.000 Her?
01:21:51.000 Her, her, her, her.
01:21:52.000 Yeah, yeah.
01:21:53.000 That's really the premise of the movie, but in the movie it was all just voice.
01:21:57.000 Yeah.
01:21:57.000 Now it's going to be some actual 3D person.
01:22:01.000 Is this one?
01:22:03.000 It's Replika AI. So that's like still the uncanny valley, right?
01:22:06.000 You look at that and you'd have to have like really bad eyesight to think that's a real person.
01:22:11.000 AI shuts down erotic role play community, shares suicide prevention resources over loss.
01:22:17.000 Oh my goodness.
01:22:19.000 People were, like, miserable.
01:22:21.000 They're like, it's talking.
01:22:22.000 And they would, like, complain, like, after an update, they'd be like, because I looked through their Reddit, I was so curious.
01:22:26.000 I was like, this is like a new, brave new world, you know?
01:22:31.000 But they would say, you know, ever since the new update, she's just not the same.
01:22:37.000 She's, like, talking to someone different.
01:22:38.000 And it's like, you know the back end is just, like, a large language model.
01:22:42.000 And they just clicked an update.
01:22:44.000 They don't care as long as they're getting this feeling, right?
01:22:48.000 You know, it's really scary stuff because I read this statistic recently that said that there's somewhere in the neighborhood of 30 plus percent of women are single, but it's in the neighborhood of 60 percent of men.
01:23:06.000 Really?
01:23:06.000 Yeah.
01:23:07.000 That seems really high.
01:23:09.000 I know.
01:23:09.000 In what age group?
01:23:10.000 It does seem really high.
01:23:10.000 It's, you know, 18 to 49 or something like that.
01:23:13.000 Oh, wow.
01:23:14.000 I don't remember the exact numbers, but it's young men.
01:23:17.000 It's a shockingly high...
01:23:18.000 It doesn't make sense, like, why there is such a disparity between the genders, that men are so much more single than women.
01:23:26.000 Like, that doesn't even add up.
01:23:27.000 Yeah, how does that make sense?
01:23:30.000 Most men are single.
01:23:32.000 Most young women are not.
01:23:35.000 Maybe the guys are just saying they're single.
01:23:37.000 And all the girls are like, we're in a relationship.
01:23:39.000 It is just research, right?
01:23:41.000 So it's just a survey, I would imagine.
01:23:44.000 30% of U.S. adults are neither married, living with a partner, nor engaged in a committed relationship.
01:23:49.000 Nearly half of all young adults are single.
01:23:52.000 34% of women and a whopping 63% of men.
01:23:57.000 Wow.
01:23:58.000 How does that work?
01:23:59.000 How does it work if there's roughly 50% women, 50% men?
01:24:04.000 How could 34% of women be single and 63% of men be single?
01:24:09.000 It says, not surprisingly, the decline in relationships marches in stride with the decline in sex.
01:24:14.000 The share of sexually active Americans stands at a 30-year low.
01:24:19.000 Around 30% of young men reported in 2019 that they had no sex in the past year, compared to about 20% of young women.
01:24:28.000 Only half of single men are actively seeking relationships or even casual dates, according to Pew.
01:24:34.000 That figure is declining.
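The puzzle raised above — how can 34% of young women and 63% of young men both be single? — can actually be consistent. A toy sketch with purely hypothetical numbers shows one way: the survey brackets respondents by age, but partners don't have to fall inside that same bracket (e.g. young women partnered with older men), and the two groups may also define "being in a relationship" differently.

```python
# Hypothetical numbers only: how both survey figures can be true at once.
young_women = 100
young_men = 100

partnered_women = 66        # 34% of young women report being single
# Suppose only 37 of those partners are young men; the rest are older men
# outside the surveyed age bracket (or the two sides count differently).
partnered_young_men = 37    # 63% of young men report being single

single_women_pct = 100 * (young_women - partnered_women) / young_women
single_men_pct = 100 * (young_men - partnered_young_men) / young_men
print(single_women_pct, single_men_pct)  # 34.0 63.0
```

So the gap doesn't require anyone to be lying; it only requires that the partner pool isn't confined to the surveyed group.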
01:24:37.000 What if, like, the women thought they were in a relationship and the guys are like...
01:24:40.000 Right.
01:24:40.000 Yeah, that's what we could say.
01:24:42.000 Yeah, you could say that.
01:24:44.000 Or maybe the women aren't being honest.
01:24:46.000 Maybe they've gone on a date with a guy and they decide that's their boyfriend.
01:24:49.000 I don't know.
01:24:50.000 I think the more shocking thing is just that...
01:24:53.000 More people, in general, are single.
01:24:56.000 Less people are having sex and are engaged in meaningful, long-term relationships.
01:25:02.000 I think that's...
01:25:03.000 You know, there's just an increasingly...
01:25:05.000 I feel like we're becoming more atomized.
01:25:08.000 Like, you just kind of can get lost in your world.
01:25:10.000 And you get these pseudo-communities popping up.
01:25:16.000 Like, if I'm a Bored Ape Yacht Club member, I could call...
01:25:18.000 You know, I might say, those guys are my brothers.
01:25:20.000 But are they really?
01:25:22.000 What are these new internet communities doing right?
01:25:27.000 And what are they not really replacing in the real world?
01:25:30.000 Because basically, that's what we've done.
01:25:31.000 We've replaced a lot of physical things with online things.
01:25:35.000 And sometimes that replacement works.
01:25:39.000 Like I can, you know, I can kind of, but sometimes it doesn't.
01:25:41.000 Like I can like FaceTime with my mom and it's like kind of the same, but it's not, it's not really.
01:25:47.000 It's a little annoying.
01:25:48.000 It's a little annoying, right?
01:25:49.000 And they're getting better at it, but it's like, it's always kind of like this like facsimile of the real thing.
01:25:55.000 And so I think this Replika AI is like, it's sort of this, like it's trying to treat loneliness in people.
01:26:01.000 Maybe you could, that's the nice way of looking at it.
01:26:05.000 But it's, it's pretty dystopian, man.
01:26:07.000 It is dystopian.
01:26:08.000 And one of the things that I think accelerated it was the lockdowns, right?
01:26:12.000 So especially for people that had a lot of anxiety, there were people that went a year plus without being in contact with other people other than their immediate family members.
01:26:23.000 And so then they seek more time online.
01:26:27.000 They're online more.
01:26:28.000 And at the same time, this AI-generated 3D image of a person is communicating with you.
01:26:36.000 That, and then the rise of parasocial relationships.
01:26:41.000 Working from home.
01:26:42.000 Yeah, yeah.
01:26:43.000 People watch so much of online people, they think they know you.
01:26:47.000 And like...
01:26:48.000 And they don't.
01:26:49.000 But they feel like you're their friend rather than them having online...
01:26:54.000 I was hanging out with a few friends and they got approached by some people and these guys felt like they knew them.
01:27:02.000 They're like, I love all your stuff.
01:27:05.000 They're asking about one of their friends, like, what do you think about when this guy did that?
01:27:09.000 And I'm thinking, like, this guy doesn't know you.
01:27:12.000 Right.
01:27:12.000 But it's this strange thing where that's our new world.
01:27:17.000 It's different from, like, when there were celebrities.
01:27:19.000 You didn't feel like you knew Tom Cruise.
01:27:21.000 Right.
01:27:21.000 Well, that's different, too, because he didn't really talk.
01:27:23.000 He only talked on screen.
01:27:25.000 He's playing a character.
01:27:26.000 Yeah, and when he did talk, it was disastrous.
01:27:28.000 Like, remember when he had that interview with Matt Lauer, and he was getting upset at Brooke Shields, who was taking, you know, psychiatric medications, and he's a Scientologist, and they believe those are the devil, and so he was telling him, you're being glib, Matt.
01:27:41.000 You're being glib.
01:27:42.000 And everybody was like, oh my god, this guy's a psycho.
01:27:44.000 You remember those?
01:27:46.000 Ever since I watched Top Gun, I forgot.
01:27:48.000 That was disastrous to him.
01:27:50.000 But ultimately, he's kind of proven correct in a lot of ways because it turns out that the model of why they were using these SSRIs is not correct.
01:28:01.000 Like, they work, but they're not sure why they work.
01:28:03.000 And the initial thought was that they were addressing some sort of chemical imbalance in the brain.
01:28:11.000 And now that's been proven to not be correct.
01:28:15.000 How do you think we go about...
01:28:18.000 So it's sort of like managing these two things, right?
01:28:22.000 Like you manage the fact that pharmaceutical companies have profit incentives that lead them to want people to be on long-term drugs forever.
01:28:33.000 That's the best kind of drugs, one you never get off of.
01:28:36.000 Right.
01:28:36.000 With the fact that, like, on the other hand, you have a lot of, like, alternative health guys saying, hey, that's nonsense to listen to the guys.
01:28:42.000 They're also kind of a lot of them pushing a bunch of pseudoscientific wackiness.
01:28:48.000 So...
01:28:48.000 It's very hard to figure out what's right and what's wrong and what's correct and what's...
01:28:54.000 Yeah, because you go like, oh, Tom has a point about all these pills.
01:29:01.000 But it's like, okay, then is the answer nothing?
01:29:05.000 It's hard to know.
01:29:06.000 It's interesting, right?
01:29:07.000 Because the question is illness, right?
01:29:11.000 There are certain medications like insulin for people that are diabetic.
01:29:15.000 These are like...
01:29:16.000 Actual, real solutions to an actual medical problem that's being created by a pharmaceutical company that addresses real issues and helps people.
01:29:25.000 And then there's also stuff like, hey, you know, maybe you need Adderall.
01:29:30.000 Maybe you need to focus.
01:29:31.000 And so they're giving you speed, right?
01:29:33.000 And so it's basically, it's not based on a disease like, I can't go to a doctor and the doctor says, hey, you have herpes.
01:29:41.000 You need herpes medication.
01:29:42.000 And then this fixes your disease.
01:29:46.000 It's, I don't feel good.
01:29:48.000 Give me something that makes me feel good.
01:29:50.000 And then they give you something that makes you feel good, and you're like, okay, I'm on medicine because I have an illness.
01:29:55.000 Is that really what's going on?
01:29:56.000 But what else is causing that illness?
01:29:59.000 Do you exercise?
01:30:00.000 Do you sleep right?
01:30:02.000 Are you depressed because you have no meaningful relationships?
01:30:04.000 Are you depressed because you have a job that's horrific and stressful?
01:30:08.000 What is causing this that you're just putting a band-aid over?
01:30:13.000 So there are confounding issues that are all mixed in together.
01:30:19.000 And no one's the same.
01:30:20.000 That's the thing.
01:30:21.000 It's like, how much of it is environmental factors?
01:30:23.000 Like, I can speak personally.
01:30:25.000 I have developed some, like, low-grade form of ADHD. What does that mean, though?
01:30:33.000 What does it mean?
01:30:33.000 Meaning, okay, so like in the past, I could read books for hours and hours on end, right?
01:30:40.000 Like I loved reading books.
01:30:42.000 Due to how much I engage with social media, and I'm someone who tries to monitor this stuff.
01:30:48.000 I was on a flip phone last year for like six months out of the year.
01:30:51.000 I mean like I try to limit this stuff.
01:30:54.000 But because so much of my job is on social media and Twitter, and I'm scrolling, and the scroll is so addictive because you context switch so much, so fast, that when I try to lock into a book, it takes me a bit. And I'm somebody who likes to read a lot.
01:31:11.000 I'd say I was like a voracious reader, especially as a kid.
01:31:14.000 And like, as I get older, I'm having to sit down and it's more like work.
01:31:18.000 Like, I have to intentionally go, okay, I gotta read this book.
01:31:21.000 I'm gonna cut myself off from distractions.
01:31:23.000 And I have all these apps on my phone to try to limit the amount of like screen time that I have.
01:31:28.000 Because I just know this is bad for my brain.
01:31:32.000 So for me, I'm like, Adderall is not a good solution, because my problem is not that I was born with this issue.
01:31:42.000 My problem is I'm on my device, and my device is literally overstimulating my brain, so that when I don't have that overstimulation,
01:31:49.000 I'm just sitting in a quiet room with a book.
01:31:51.000 Now my brain is like, well, where is it?
01:31:53.000 Where is the interaction?
01:31:57.000 So for me, I think the answer is, okay, for me, I just have to unplug more, right?
01:32:00.000 And that's what I try to do.
01:32:01.000 But for somebody who says, I was born like this, I can never pay attention, like, is the answer...
01:32:10.000 You know, some people say Adderall helps them.
01:32:12.000 What do you say to those people?
01:32:14.000 So, like, that's what I mean.
01:32:16.000 It's like, it seems like an environmental...
01:32:19.000 Yeah, well, I think, A, you're addicted to your phone.
01:32:23.000 For sure.
01:32:24.000 Yeah.
01:32:24.000 Most people are.
01:32:25.000 Most people are.
01:32:25.000 Yeah.
01:32:26.000 I am very fortunate that I'm not addicted to social media.
01:32:29.000 I am addicted to watching YouTube videos, which is a totally different animal.
01:32:33.000 Yeah.
01:32:33.000 And I'm also addicted to watching YouTube videos on things that I enjoy, which is better.
01:32:38.000 So I've filled that gap with things like I watch...
01:32:44.000 Like fight videos and professional pool matches.
01:32:47.000 It stimulates me in a way, but I'm not engaging with this context switching constantly, like scrolling on Twitter.
01:32:54.000 I go to Twitter maybe 5-10 minutes a day.
01:32:57.000 I go and I see what the fuck's going on.
01:32:59.000 Like, what is everybody mad at?
01:33:01.000 Like, who's in trouble?
01:33:03.000 Like, I'll shit-scroll.
01:33:05.000 Such a funny way to describe Twitter.
01:33:06.000 It's so accurate, too.
01:33:07.000 But I do not post.
01:33:08.000 Right.
01:33:08.000 If I post, I post and ghost.
01:33:10.000 I just post, and I leave it alone.
01:33:11.000 I don't read the comments, ever.
01:33:14.000 I don't read any of my comments.
01:33:15.000 I think that's great.
01:33:16.000 That is very important for famous people.
01:33:18.000 It's very, very important.
01:33:19.000 Because I have friends that don't...
01:33:21.000 And they'll come to me, and you know what they're saying?
01:33:23.000 I go, how do you know what they're saying?
01:33:25.000 Like, what do you give a shit?
01:33:26.000 I watch people ruin their lives by looking at their screens.
01:33:32.000 It's kind of hard because when you first kind of come on the scene, you get a little attention.
01:33:38.000 It's like intoxicating.
01:33:39.000 And you want to engage too.
01:33:40.000 Yeah, because it's like at first that's fun.
01:33:43.000 It's like when you have a thousand people watching you, that's like beautiful.
01:33:46.000 It's like there's this community.
01:33:48.000 They're resonating.
01:33:49.000 You have time so you can kind of respond to people intelligently.
01:33:54.000 When you start to get into the millions, it's just ludicrous.
01:33:57.000 It just doesn't make sense anymore.
01:33:59.000 And it starts to be this thing where your audience starts to become something more to you.
01:34:04.000 It feels more like a hive mind, even though it still is individuals.
01:34:08.000 It feels more like, OK, how do I get a pulse of what this actually is?
01:34:12.000 This is why people gravitate towards negative comments when they have huge audiences, because they go, like, well, maybe they're right.
01:34:19.000 Maybe that one guy represents the whole.
01:34:22.000 Of course it doesn't, but they're worried because they don't really know what their audience thinks because it's so many people.
01:34:29.000 So I know it's the right thing to do to unplug.
01:34:32.000 At the same time, I'm like, okay, I have to know the current events.
01:34:35.000 I have to know what's going on.
01:34:36.000 So that's one of the worst parts about it.
01:34:39.000 I love what I do, but it is the worst part of my job that I feel to some extent I kind of have to have my finger a bit on the pulse to know who's into what, what's big.
01:34:51.000 But then after that, the discipline is like unplugging.
01:34:55.000 What I have found is that if something is big enough that I need to pay attention, I'll find it.
01:35:01.000 I find it through other methods.
01:35:03.000 I find it through friends.
01:35:04.000 I have so many friends like, do you know about this?
01:35:06.000 Do you know about that?
01:35:07.000 Like, even sometimes when people are mad at me, like, what's going on with you and that person?
01:35:11.000 I go, what are you talking about?
01:35:13.000 I literally don't know.
01:35:14.000 And then they'll tell me, I don't want to look at that.
01:35:16.000 Like, leave it alone.
01:35:17.000 Like, I don't give a fuck.
01:35:18.000 But you'll find out.
01:35:20.000 You'll find out because people are talking about it.
01:35:22.000 You'll find out.
01:35:22.000 Like, let the addicts scroll.
01:35:25.000 Let them go crazy.
01:35:26.000 But for your own mental health, it's not...
01:35:29.000 And anybody who's public, like, you're a public person.
01:35:31.000 You engage publicly.
01:35:33.000 You put your videos out, and people comment on them, and your videos get millions of views.
01:35:38.000 Like, that is not an environment where you can healthily sample people's opinions.
01:35:45.000 It's just not possible.
01:35:48.000 Human beings are designed to look for threats.
01:35:53.000 You're designed to find problems.
01:35:56.000 And so if there's one person that thinks you're a piece of shit and a hundred of them love you, that one person is the one you're going to think about.
01:36:03.000 And they're confirming your worst fear.
01:36:06.000 Your worst imposter syndrome.
01:36:08.000 They're like, you are crap.
01:36:10.000 And you go, oh man, I knew it.
01:36:12.000 But even for people that are just regular people, imagine people aren't talking about you because you're anonymous, but you're engaging in this very shallow form of communication that's not natural.
01:36:26.000 You're engaging in a text-based communication with someone.
01:36:30.000 You don't know who they are.
01:36:32.000 You don't have any background on them.
01:36:33.000 You don't know if they're fucking schizophrenic.
01:36:35.000 You have no idea.
01:36:36.000 And yet you're investing your mind and your focus on these interactions that you're having with this person.
01:36:43.000 And most likely, if you're in a dispute, you're trying to win this dispute.
01:36:47.000 So you're trying to find reasons why they're wrong and you're getting anxiety and you're involved in this little sort of debate slash mental battle.
01:36:57.000 It's like fucking go outside.
01:36:59.000 Go do something with your life.
01:37:01.000 Like social media is fucking dangerous, but it's not dangerous if you understand it.
01:37:07.000 It's like if you have a cabinet filled with cookies and chips, it doesn't mean you're gonna get fat.
01:37:13.000 You can always go into that cabinet every now and again and have a cookie and you're going to be fine.
01:37:18.000 But if you just fucking open that cabinet every day and stuff your face, you're going to get diabetes.
01:37:24.000 Yeah, and what's hard is these apps are built to be sweeter and sweeter and more fattening every year.
01:37:31.000 Look at TikTok.
01:37:31.000 That's the best one.
01:37:33.000 And that's where I finally drew the line where I always try to stay up to date on all the apps, you know?
01:37:41.000 I have a family member who's young who, like, told me about TikTok and they're like, you gotta get on this.
01:37:46.000 It was, like, back in, like, 2019. And they, of course, were right.
01:37:49.000 I should have.
01:37:50.000 No!
01:37:51.000 No, no.
01:37:51.000 But at the time, I just said, like, this is a step too far.
01:37:55.000 The shortening of our attention spans, YouTube used to be short form.
01:37:59.000 That's, like, the funny thing.
01:38:00.000 It's, like, then it was, like, TikTok and, well, it started with Vine.
01:38:05.000 But it was just, like, this new idea that, like, hey, forget about 10 minutes.
01:38:08.000 Let's try 10 seconds for a video.
01:38:11.000 And that's where I have successfully disengaged.
01:38:14.000 I don't watch any TikTok or short form, because that would be the end of my attention span.
01:38:19.000 And I feel bad for, like, what do teachers do now?
01:38:22.000 When you're competing with, like, this never-ending feed of the most entertaining...
01:38:26.000 Well, the kids aren't supposed to have their phones in classes.
01:38:28.000 I have young kids.
01:38:30.000 But...
01:38:31.000 A lot of them sneak it and they figure out a way to juke the system.
01:38:35.000 But it's just – it's an inevitable fact of the progression of technology and technological innovation.
01:38:42.000 They're going to figure out a way to get people more engaged because it's profitable.
01:38:45.000 And there's going to be a better app than TikTok in the future.
01:38:50.000 A more addictive, more engaging app.
01:38:53.000 You have to imagine, right?
01:38:54.000 It's so funny to imagine now because you're just like, how could you?
01:38:57.000 But then we were thinking the same thing about YouTube.
01:39:00.000 You're like, wow, this is great.
01:39:02.000 You can find anything, anywhere.
01:39:05.000 And now YouTube is like, oh, they're like the responsible educational company.
01:39:11.000 You can learn a lot on YouTube.
01:39:12.000 Oh, I've learned so much on YouTube.
01:39:14.000 I love it.
01:39:16.000 It is kind of an incredible platform, and it is important to remember with all these new technologies, there are good things, but oftentimes the people who are creating the platforms don't really tell you about the bad things.
01:39:28.000 They're incentivized.
01:39:30.000 That's not their job.
01:39:31.000 Exactly.
01:39:31.000 Their job is just to make something awesome.
01:39:33.000 It's your job to figure out your own life.
01:39:35.000 Yeah.
01:39:35.000 But it's, you know, the problem with things like TikTok and YouTube and Twitter and, I mean, this is what we're finding out with the Twitter files, is that then other entities get involved in the process of censoring certain information.
01:39:51.000 And promoting a specific narrative.
01:39:53.000 And then when you find out the government's actually involved in that, like, well, that gets really shady.
01:39:57.000 Like, we need some sort of regulations and or laws to stop that from happening.
01:40:03.000 Or you need someone like Elon Musk that comes along and actually fact checks the president.
01:40:08.000 You know, when they started fact checking the White House, you know, actually, that's not true at all.
01:40:14.000 And that's not why there's inflation.
01:40:15.000 That's not... you didn't do that.
01:40:15.000 And it's amazing to see the White House delete tweets out of shame.
01:40:20.000 But that's the world we're living in now.
01:40:22.000 But that's not the case with YouTube.
01:40:24.000 And with YouTube, there was some real problems, especially during the pandemic, with the censorship of accurate information that didn't fit a very specific narrative that they were trying to promote because of their sponsors.
01:40:37.000 How do you regulate that when one of the challenges is that And I know this firsthand, the regulators are so out of touch with the technology because technology moves so fast that these guys,
01:40:55.000 a lot of these regulators were around when it was dial-up internet, and now they're in positions of power being asked to regulate things when they checked out with email.
01:41:07.000 Yeah.
01:41:08.000 Yeah.
01:41:09.000 Yeah.
01:41:09.000 Well, you saw that when people were interviewing Mark Zuckerberg and they were talking to him.
01:41:16.000 They don't know what they're talking about.
01:41:16.000 They're literally talking to him about problems with Google.
01:41:20.000 Yeah.
01:41:20.000 And he's like, hey, I'm Facebook.
01:41:21.000 And they're like, what the fuck are you talking about?
01:41:23.000 Yeah.
01:41:24.000 That is one of the bizarre things.
01:41:26.000 And so you rely, weirdly enough, on people to inform the politicians.
01:41:31.000 Well, who informs them?
01:41:32.000 Lobbyists.
01:41:33.000 Right.
01:41:33.000 And then you go back to people like Sam Bankman-Fried, where it's like he's informing them with money.
01:41:39.000 Yes.
01:41:39.000 And he's like, hey, let me get a meeting with you.
01:41:41.000 So he gets a meeting.
01:41:42.000 And now he's a favorite on the Hill because he seems like he's the responsible one in the room.
01:41:48.000 And it turns out he's a giant fraud, but no one noticed...
01:41:51.000 Because they don't know what they're talking about.
01:41:53.000 Right.
01:41:53.000 They don't know what they're talking about and they're dealing with a million different issues all at once.
01:41:56.000 Does it make sense?
01:41:59.000 So I totally get, you know, electing kind of older people because they have wisdom.
01:42:05.000 But at the same time, does it make sense for there to be limits on age where you get more young people involved in these situations who actually know the technologies, especially on those special subcommittees where technology is such an important part?
01:42:21.000 Yes, it makes sense to get people that understand it, and young people are going to be more likely to understand it.
01:42:27.000 But do you want people with a lack of wisdom?
01:42:30.000 Like, these are the type of people they were dealing with at Twitter.
01:42:32.000 They were dealing with young millennials that were deciding to censor information and to, you know, I mean, that was one of the problems that they had with issues like deadnaming people.
01:42:43.000 You know, like if someone can change their name and change their gender, and if you use their old name, like if you called Caitlyn Jenner Bruce Jenner, you'd be banned for life.
01:42:54.000 Which is bizarre, because that person named Bruce Jenner won the fucking Olympics.
01:43:00.000 What are we supposed to do there?
01:43:03.000 You're doing this based on an ideology.
01:43:05.000 You're not doing this based on fact.
01:43:07.000 The actual fact is that person was born Bruce Jenner.
01:43:11.000 Now, to be kind and respectable to that person and refer to them in the gender that they want is nice.
01:43:19.000 It's a good thing to do.
01:43:20.000 But why is that problem something that gets you banned for life?
01:43:25.000 But you can call someone a cunt and that's fine.
01:43:29.000 I have no idea.
01:43:30.000 See, this is why I stay in my lane of scams, because I'm just like, it's impossible.
01:43:34.000 Yeah, you have to stay in the lane.
01:43:34.000 It's impossible to, and ultimately, like, one of the things I realize, so I consider myself a journalist, but one of my few privileges is that I don't have to engage in politics.
01:43:44.000 Yeah.
01:43:45.000 And it is a privilege because I see people just lose their mind.
01:43:49.000 Lose their mind.
01:43:50.000 In this culture war, and it's like, I mean, I don't know anything about most of these issues. I have expertise in, like, one thing, and I do have expertise in it. But I think now, if you're a journalist and you politically align yourself,
01:44:12.000 now you're expected to have a position... On everything.
01:44:16.000 Yes.
01:44:16.000 Even if you have no idea what you're talking about, well, then you're expected to take the part, whatever the party line is, you're expected to take it.
01:44:22.000 Even if you haven't considered it, that's what happens.
01:44:25.000 And I've seen sort of people in the media become like co-opted by their audience where they may have to have these opinions.
01:44:33.000 Yes.
01:44:33.000 And so I feel lucky because I feel like there is no like mainstream thought and like scams.
01:44:39.000 I'm just like, let me interview a few victims and like they'll tell the story and that's great.
01:44:44.000 And I kind of stay away from that.
01:44:46.000 So, I mean, for me, that's where I always go back to.
01:44:49.000 I'm just like, I don't know.
01:44:50.000 That's a very good position because I've fallen into that.
01:44:53.000 I haven't fallen into audience capture, but I have fallen into the ideological game where If you're in one camp, you're supposed to have all the opinions that one camp has.
01:45:07.000 And if you do not align with all the opinions that one camp has, you find yourself cast out of the group.
01:45:14.000 And I thought initially, wrongly, that what the internet was going to do was provide people with so much data and so much information that we would lose camps and that people would instead have a more open-minded and centrist view of things and say,
01:45:35.000 well, I could understand why people would think this because of that and I can understand why and we would have like more of a collective idea.
01:45:42.000 But what I didn't anticipate was social media and the echo chambers that it would provide.
01:45:47.000 Right.
01:45:48.000 And that these ideological echo chambers also come with virtue signaling. People get on these things because you're only dealing with a short amount of characters, and you state something that you know is going to get a bunch of likes, and people are very addicted to likes. There was some talk about removing likes because they realized that likes were an issue, and then people freaked out, just like those people freaked out about taking away your fucking chatbot girlfriend, and they stopped doing that, they stopped that idea. But if you didn't know whether
01:46:18.000 or not people agree with you or disagree with you, I think that'd probably be better overall for people.
01:46:23.000 Because I think that whether or not people agree with you or disagree with you is important.
01:46:30.000 But you don't know those people.
01:46:32.000 It's important if you know the people and you respect them and appreciate them.
01:46:36.000 And that used to be the world.
01:46:37.000 The world used to be, you know, I go to CoffeeZilla and I go, hey man, what do you think about this Ukraine thing?
01:46:43.000 And then I know you and I know that you're honest.
01:46:46.000 And so I talk to you and you say, well, this is what I've...
01:46:49.000 Right.
01:46:50.000 And this is what I think.
01:46:52.000 And then I go, oh, that's interesting because I thought this.
01:46:54.000 And you go, yeah, I thought that too, but then I found out that.
01:46:57.000 And you go, oh, okay.
01:46:58.000 And you get sort of a more informed, neutral position on what things are.
01:47:03.000 I don't think people are getting that.
01:47:05.000 I mean, there was a funny meme that came out right when the war started.
01:47:10.000 That was like the instantaneous change from people going from being healthcare experts to foreign policy experts.
01:47:19.000 It's hilarious.
01:47:20.000 It's very funny because that's what people do.
01:47:22.000 They find out what is the new thing that I can say that's going to get me likes.
01:47:25.000 Let me throw that Ukraine flag up in my Twitter bio alongside my gender pronouns and get after it and let's get some likes.
01:47:34.000 And now everyone's AI experts too.
01:47:36.000 They used to be crypto experts and now it's like everyone's AI experts.
01:47:39.000 Yeah, it's the classic.
01:47:41.000 It's like everyone's always current affair experts.
01:47:44.000 It's a weird thing how social media, like it's an echo chamber, but it's a weird kind of echo chamber because it's not just what you think.
01:47:53.000 So if that were the case, that would kind of be obvious.
01:47:56.000 But it's also like you're shown the other side, but the most incendiary, insane side of the other side's views.
01:48:04.000 Almost to the point it's like caricatures.
01:48:08.000 Let's say you're a right-winger.
01:48:10.000 You know, like, okay, the most insane people on the left are going to get the most likes from me because my camp is going to love it.
01:48:15.000 They're going to eat it up because they're going to look as insane as possible.
01:48:18.000 So you make them look insane.
01:48:19.000 The left-wing people, they go, okay, let's select for the most insane right-wing person and we'll put him out there.
01:48:26.000 And so they both put out like these like...
01:48:28.000 Sort of extreme views of the other side to their audience.
01:48:32.000 And then if you're in that echo chamber, you go like, wow, those guys are literally insane.
01:48:38.000 Because you think that's what the other team is just agreeing with.
01:48:43.000 Like, yeah, this is normal.
01:48:45.000 And meanwhile, the other team would be like, yeah, that's a little crazy, but we actually think this.
01:48:49.000 We have a more moderate position on whatever.
01:48:52.000 So what I usually find is...
01:48:55.000 When you actually deal with individuals instead of labels and ideologies, what you usually find is people are pretty normal, but a lot of people have been caught up in this battle, and it's like a reaction to the reaction to the reaction where you go from like,
01:49:14.000 okay, it was the mainstream media, then it was like...
01:49:18.000 Independent media. And I'm in independent media, so I understand the temptation as much as anybody to dunk on mainstream media, because it's easy. It's like, great.
01:49:29.000 You get to be right, you know, and they are wrong so often. But then the mainstream media gets pissed off, and they're like, hey, look, independent media, all you do is spend your time complaining about us.
01:49:39.000 What are you actually doing in terms of news gathering?
01:49:41.000 Are you on the ground?
01:49:42.000 What are you doing?
01:49:43.000 Sometimes they are. But, you know, I think the news is all just kind of decentralizing into a lot of different camps, and there's good people everywhere and there's bad people everywhere.
01:49:57.000 There's great journalists, you know, who are trying to make a difference in bureaucracies at MSNBC or wherever.
01:50:06.000 There's great people.
01:50:06.000 There's great regulators trying to make a difference.
01:50:09.000 But everyone's dealing with their own incentive problems and their own challenges with bias and their own echo chambers, so they make mistakes.
01:50:21.000 And then when they make mistakes, the other team just goes like, ah!
01:50:24.000 Yeah, there's that, and then there's also financial incentives.
01:50:27.000 Yeah.
01:50:28.000 It's financial incentives.
01:50:29.000 Huge problem.
01:50:29.000 Yeah.
01:50:30.000 I mean, when you get motivated by whoever is your sponsor, whoever is the advertising revenue provider for whatever show you have, that becomes a gigantic issue.
01:50:40.000 When you see a mandate that gets pushed through, and when you see people clearly moving in lockstep... Yeah.
01:51:05.000 What do you do?
01:51:06.000 All you do is criticize us.
01:51:08.000 Well, that's a very valuable role, guys.
01:51:11.000 Like, that's a very valuable role because you people are fucked.
01:51:14.000 Like, you're not Walter Cronkite.
01:51:16.000 This is not the New York Times of 1970. This is a completely different animal.
01:51:20.000 And it's an ideologically captured animal.
01:51:23.000 And then you have mainstream television, which is...
01:51:27.000 It's almost bullshit.
01:51:29.000 It's almost like you could just say CNN is bullshit.
01:51:32.000 Fox News is bullshit.
01:51:34.000 How much of it is bullshit?
01:51:35.000 Is it 30% bullshit?
01:51:37.000 Well, if I gave you a sandwich and it was a cheeseburger, but it was 30% dog shit, am I allowed to call that a cheeseburger?
01:51:44.000 Now, you have a dog shit infected...
01:51:47.000 Cheeseburger, right?
01:51:48.000 And that's what a lot of television news is.
01:51:51.000 And it's not news in the sense that they need to get you informed, like a service that they're providing because most people don't have the time to gather that information.
01:52:00.000 No, it's a propaganda disseminating entity that relies on advertising.
01:52:06.000 The advertising shapes the propaganda that gets disseminated.
01:52:09.000 That's fucking dangerous.
01:52:11.000 And so if independent media doesn't exist, where someone is not captured by that, can't point that out, we've got a real problem with information.
01:52:19.000 Because then it's going to be who has the most money and who can buy out the most media.
01:52:26.000 There's a lot of that going on, and that's scary.
01:52:28.000 It's scary for people that don't know the truth, and it feels horrible when you get duped, when you think that a mainstream story is correct, and then you find out, oh my god, I got fucked.
01:52:38.000 Yeah.
01:52:38.000 Yeah.
01:52:39.000 Well, I mean, what I think is everyone—the problem with pointing out financial incentives is everyone has financial incentives.
01:52:49.000 Everyone—even independent media has to make a buck somehow, right?
01:52:52.000 Of course.
01:52:53.000 What I'll say is I've been on some of these mainstream shows, not many of them, but a few of them have invited me on, and what I've noticed is they're just bad.
01:53:03.000 The platform itself is just a bad way to express yourself.
01:53:07.000 Absolutely.
01:53:08.000 I went on one, and I won't name it, but you're in this waiting room, and they join you in the waiting room.
01:53:14.000 It's like this Zoom version of it, and they go, Hey, how's it going?
01:53:17.000 I'm good.
01:53:18.000 I'm good.
01:53:18.000 Okay, we're on in five. And then they ask you this, like, three-second question, and they cut you off. You give this sound bite, and you're aware, like, okay, it's live. But it's actually not live, they're going to release it later. So I'm like, why can't I really think about my answer?
01:53:34.000 But it's given with this perspective, like, okay, you have this 30-second answer, and then they respond, and then before you can even respond, they cut to a new segment. So you can't even get into the meat or nuance of the argument. The format literally constrains your ability to tell the truth,
01:53:54.000 the whole truth.
01:53:55.000 And so one of the things that I think has been so just unlocking about YouTube is like I just released a story and it was about a 30-minute story.
01:54:05.000 So you know how long it was?
01:54:05.000 It was 30 minutes.
01:54:07.000 When I have a 10-minute story, it's a 10-minute story.
01:54:09.000 When I have a 50-minute story, it's a...
01:54:10.000 That is such an underrated like just format shift.
01:54:14.000 To where you are able to tell the truth in the size that it is.
01:54:20.000 Yes.
01:54:20.000 And I think that's the problem now, or the challenge with mainstream media: they're stuck in an old format.
01:54:28.000 Yeah, and it's unfixable because they're connected to advertising.
01:54:31.000 So they have to go to commercial every X amount of minutes.
01:54:35.000 And that's not going to change.
01:54:36.000 Yeah, and you need the in and you need the out.
01:54:39.000 And they also have a time, they have a time spot.
01:54:42.000 So their time slot is, you know, 8pm to 9pm.
01:54:44.000 That's it.
01:54:45.000 So there's many subjects that are deeply nuanced, and you can't cover them in 60 minutes.
01:54:51.000 And you don't get 60 minutes anyway, you get 44 with commercials, or maybe even less, depending on the show.
01:54:56.000 That, you're fucked.
01:54:58.000 You're fucked, because it must be incredibly frustrating for someone who exists in mainstream media to see a person like you go into a deep dive. And then they'll look at the video like, this motherfucker got three million views, this is crazy. You know, my stupid fucking show on whatever network gets, if you're lucky, a couple of hundred thousand, and in the key demographic, what is it, like forty, fifty thousand? And these are, like, big shows. And that's hilarious, but also it's... It's great for you.
01:55:28.000 It's great for me.
01:55:30.000 And it also shows that people have this perception that because short-attention-span formats like TikTok work, because they're very effective,
01:55:39.000 that's the only thing people want to consume.
01:55:41.000 That's not true.
01:55:41.000 It's not true.
01:55:42.000 I think it's actually kind of like splitting into two things where you have like, hey, I have a break.
01:55:48.000 I'm going to watch something short.
01:55:49.000 Or, hey, I'm like, you know, I'm going to go do something.
01:55:53.000 Let me put on a show.
01:55:54.000 Let me learn while I'm...
01:55:56.000 That's become hugely popular.
01:55:58.000 It's like, hey, I'm setting up something in my office.
01:56:01.000 Let me turn something on and learn something, hopefully.
01:56:03.000 While I'm doing it.
01:56:04.000 While you're cleaning your office, you're actually absorbing something.
01:56:08.000 Exactly.
01:56:09.000 I'm like basically sitting in the room with an expert as he describes some topic that I'm interested in.
01:56:15.000 But then there's a problem that, what if that guy's full of shit?
01:56:19.000 What if that guy's full of shit and there's no fact checkers?
01:56:21.000 So there's no one checking.
01:56:23.000 And who fact checks the fact checkers?
01:56:25.000 Right!
01:56:25.000 I mean, it's problems all the way down, but I think like...
01:56:29.000 The thing that I worry about the most is that we have to have some commonality.
01:56:36.000 And so I think why I like spending time on things that unite people is I'm like, all right, my show you can agree with no matter what.
01:56:46.000 Or you can watch it and you can disagree with it, but it doesn't— You're laying out facts.
01:56:51.000 Yeah, you're not divided by— You know, your interests either way.
01:56:57.000 And so I think it's such a ripe moment for journalists to do more than play the game of battles.
01:57:06.000 But I don't think they can in mainstream media.
01:57:09.000 That's why it's so interesting.
01:57:11.000 And that's why independent media has a huge advantage.
01:57:17.000 Don't you think like 60 Minutes has done a pretty good job?
01:57:20.000 They only have 60 Minutes.
01:57:21.000 Yeah, yeah, yeah.
01:57:22.000 And they don't even have 60 minutes.
01:57:24.000 But don't they do like real stories?
01:57:26.000 Yes, they do real stories.
01:57:27.000 Not just like partisan, you know, whatever.
01:57:30.000 They'll do a little bit of the politics, but they actually, like, they'll...
01:57:33.000 And Vice was doing that for a long time.
01:57:35.000 They did these incredible, like, documentaries.
01:57:38.000 That's journalism at its best, where you're just like, you're just deep diving a topic that just people find interesting.
01:57:44.000 Yes.
01:57:44.000 You go somewhere, you talk to people, and you present the facts, but you don't go in with this preconceived thing of, okay, I know what happened, and let me tell everyone, like, I'm just gonna, you know.
01:57:54.000 But Vice is a good example of what's wrong, because, like, they were that, and then they got bought.
01:58:00.000 And then the people who bought it, like, yeah, you got a great thing going on, but we're gonna fuck that up, and we're gonna turn it into this, like, woke fucking platform, this weirdo platform.
01:58:09.000 And that's what it is now.
01:58:11.000 It's like, you can kind of guess what their angle's gonna be before they even write the story.
01:58:17.000 I will say there have been a few good Vice pieces since then.
01:58:20.000 Oh, for sure.
01:58:20.000 But I know what you mean.
01:58:22.000 It's really challenging.
01:58:24.000 I really try to be as charitable as I can because I know a lot of these journalists are working within these horrible constraints of...
01:58:34.000 They want to do investigative work.
01:58:37.000 One of the dirty secrets of the journalism game is that investigative journalism is the loss leader for every single news agency.
01:58:46.000 They're all losing money on investigative journalism, and they want to do as little of it as possible for the bottom line.
01:58:51.000 Because it's expensive.
01:58:51.000 Because it's expensive, man, to go send out a guy to really do the work.
01:58:55.000 You know what's easy?
01:58:56.000 Putting on a commentator and I can just pull up a bunch of articles all day and I'll just talk my talking points about those articles.
01:59:05.000 That's the profitable side of things because it's quick.
01:59:08.000 You can churn out clips.
01:59:09.000 And at the end of the day, you can use the findings of investigative journalists and you just put them on your show.
01:59:14.000 Yes.
01:59:15.000 You go, hey man, like I heard you found this.
01:59:17.000 And they spent like three months on it.
01:59:19.000 And you spend like 20 minutes and you get double the views because people, they know you because you're on TV all the time or you're on the internet all the time.
01:59:27.000 And so that's one of the real challenges.
01:59:30.000 I know journalists want to do that investigative work, but they have editors.
01:59:34.000 Yes.
01:59:35.000 And they have people telling them, hey, we judge you by the number of clicks you get on our site.
01:59:40.000 We're a click site.
01:59:41.000 Or we're a subscription site.
01:59:42.000 So you've got to cater to the kinds of people who subscribe to us.
01:59:47.000 If we're the New York Times, we have a certain type of people who subscribe to us.
01:59:50.000 If we're, I don't know what the equivalent is, the New York Post or whatever, we have a type of person.
01:59:55.000 So I think the New York Post is more advertising.
02:00:00.000 The point is that these journalists are kind of sent out on these mandates rather than to go find the truth.
02:00:05.000 That's what they want to do.
02:00:06.000 But it's like instead it's, you know, you're battling for attention in a click world where you're not even controlling your traffic.
02:00:14.000 The social media company is controlling your traffic.
02:00:16.000 And it's about how many likes you get on Twitter and how many retweets you get on Twitter.
02:00:20.000 Yeah.
02:00:21.000 Yeah.
02:00:22.000 It's – they're trapped.
02:00:23.000 And it's not good.
02:00:26.000 But with – the thing about independent journalists is that it's like they're not going to send someone to Turkey to investigate something.
02:00:34.000 They don't have the money.
02:00:35.000 They also don't have people that they can just send out.
02:00:38.000 And that was one of the cool things about Vice is they did do that.
02:00:42.000 And back in the day, they would send someone to the front lines of some foreign war.
02:00:47.000 And, you know, you see some fucking journalist with glasses on, with a flak jacket on.
02:00:53.000 Isn't that crazy?
02:00:53.000 It was wild.
02:00:54.000 Vice was wild in the beginning, you know?
02:00:57.000 And I'm good friends with Shane Smith, and I was friends with him in the early days when all that was going down.
02:01:03.000 It's fascinating to see what they did, but he sold it.
02:01:08.000 Yeah.
02:01:09.000 Well, here's the thing, too.
02:01:11.000 Independent media journalists, after a certain size, they can do it.
02:01:15.000 The problem is they realize, like, for clicks, it's like, hey, I should just stay in my room and make 20 videos instead of going out.
02:01:23.000 And so that's why I'm a big believer in, like, subscription models for independent media journalists.
02:01:29.000 Like subs.
02:01:30.000 Yeah, I do like the YouTube equivalent of like Patreon.
02:01:34.000 And that is a way for me to free myself from, like, the view model, which I did for a long time, where it was just about views.
02:01:42.000 And so eventually I was like, man, I really want to deep dive something and I don't want to be limited to like, do I think this is a popular thing?
02:01:50.000 So that was a big change.
02:01:52.000 And I think, yeah, things like Substack, it really frees up people.
02:01:55.000 And I think as we learn to, like, pay for journalism,
02:02:01.000 I think that's a big thing, because it's not free.
02:02:03.000 We got the false impression that it was free from years of just being able to go on Google News or whatever and sorting through.
02:02:11.000 Meanwhile, the quality of journalism was just dropping like a rock as everyone moved to this ad model.
02:02:16.000 To digital.
02:02:17.000 Yeah, to digital.
02:02:18.000 It's just there's no money in it.
02:02:19.000 And the money is in just the mass production of just slop.
02:02:23.000 Yeah.
02:02:25.000 Yeah, I don't envy them.
02:02:26.000 It's not good.
02:02:28.000 But it is great for someone like you.
02:02:30.000 It's great for us.
02:02:32.000 It really is.
02:02:36.000 Especially for having long-form conversations.
02:02:41.000 What I found is that anytime a model breaks, it gives you the chance to restart.
02:02:46.000 So you just described the kind of the problems with the models of mainstream journalism that allowed for an opening because people are thirsty for like real conversations.
02:02:55.000 Yeah.
02:02:55.000 Right?
02:02:56.000 And so this podcast can go on as long as it goes on for and we can clarify anything.
02:03:02.000 We can do... but if there weren't problems in the previous generation, there might not have been that opportunity for you to, you know, get big, basically doing what you do.
02:03:12.000 Well, this thing didn't exist before.
02:03:14.000 The only form that you had that's similar to this was radio.
02:03:20.000 And, you know, morning radio.
02:03:22.000 I mean, this is literally where I came up with the idea to do this, was being on the Opie and Anthony show.
02:03:28.000 Really?
02:03:29.000 Yeah, and being on the Howard Stern show, where you would go, wow, I'd like to have one of these things.
02:03:33.000 You just have fun with people and sit around and shoot the shit.
02:03:35.000 It'd be great.
02:03:37.000 No one was going to give me a show like that.
02:03:38.000 And they certainly weren't going to give me a show where one day I'm going to interview a UFO expert.
02:03:42.000 The next day it's a psychologist.
02:03:44.000 The next day it's an athlete.
02:03:47.000 And it's just whoever I'm interested in.
02:03:49.000 And no one would say, yeah, interview whoever you're interested in.
02:03:53.000 Here's some money.
02:03:54.000 You'd have to create it on your own, which is I did, but I didn't do it for profit.
02:03:59.000 I did it because I thought it'd be fun to do.
02:04:01.000 That's literally how I started doing it, then it became this thing.
02:04:04.000 But I kept it the way I started it, where I'm only, like, I got interested in you by watching your videos.
02:04:11.000 I got interested, I'm like, oh, this is fascinating.
02:04:14.000 Oh, this guy's clarifying this stuff.
02:04:15.000 I was wondering why.
02:04:16.000 Oh, okay.
02:04:17.000 And then here we are talking.
02:04:19.000 It's that simple.
02:04:21.000 And I've reached out to you.
02:04:22.000 It's like me to you, and then we're here.
02:04:25.000 There's no other people.
02:04:26.000 It's crazy to think of, you know, I kind of grew up in this.
02:04:31.000 I'm only 28. So I kind of like grew up in like, as it was shifting, as everything was shifting underneath people's feet.
02:04:37.000 And it's interesting to watch.
02:04:39.000 I am very fortunate to have never had to deal with these middlemen.
02:04:44.000 That's very fortunate.
02:04:45.000 And people have tried to inject it, but I got enough people who had been burned by that telling me, like, hey, you don't want to sell the show.
02:04:54.000 You don't want a middleman.
02:04:56.000 You don't want this guy saying he can get all these deals.
02:04:58.000 You do not want this guy.
02:04:59.000 He's just going to use you and he's going to inject himself for nothing.
02:05:04.000 You get nothing.
02:05:04.000 And then your show becomes worse.
02:05:06.000 It becomes this different thing.
02:05:08.000 Yep.
02:05:09.000 But that's my favorite part, and it's legitimately the most exciting part of independent media: for the first time, there's no business people telling people what to do.
02:05:19.000 There's no top line guy who's saying hey we'd really prefer it if you sold more ad spots or did more of this.
02:05:26.000 It's just you and the audience, and that direct connection is special. We've never really gotten to see it before, and yeah, I think that's a game changer.
02:05:37.000 Yeah, I know a lot of people that have podcasts that sold like half of their podcast or they, you know, got into some sort of a deal with a management company and the management company takes a percentage of the show and then all of a sudden other people are on conference calls dictating guests and telling you to avoid certain subjects or don't have this person on or don't talk about that or every time you talk about this,
02:06:00.000 you know, if you get a strike against you on YouTube, it's going to cost us.
02:06:06.000 Yeah, that's not—I mean, then you're back in the same trap that you were trying to avoid, if you were trying to avoid that trap in the first place.
02:06:12.000 But a lot of people were not trying to avoid that trap.
02:06:15.000 They just started a thing.
02:06:16.000 And then along the way, that thing became profitable, and people recognized it was profitable, and then they swooped in and tried to buy it.
02:06:24.000 And it's very tempting.
02:06:25.000 Someone comes along, hey, CoffeeZilla, we've got X amount of money for you.
02:06:30.000 Oh, yeah.
02:06:30.000 And then you don't have to worry about money anymore.
02:06:32.000 Oh, don't you want to do that?
02:06:33.000 And then you're like, okay.
02:06:34.000 Okay, well, here, now you have to interview this person.
02:06:37.000 It's all to promote some crappy crypto coin or something.
02:06:40.000 That's what it always is.
02:06:41.000 Imagine if that was you, if you fell into that.
02:06:43.000 Yeah.
02:06:44.000 No, I mean, there's been plenty of that.
02:06:46.000 People are always asking, like, hey, will you promote this?
02:06:48.000 Will you do that?
02:06:49.000 And it's just like, why sell out like that?
02:06:52.000 I think people just want to be free of the tension of worrying about the future.
02:06:58.000 You know, if something comes along and now all of a sudden you don't have to worry.
02:07:03.000 Like, they're going to throw X amount of dollars at you and now you're owned by this corporation so you don't have to think about who your guests are.
02:07:10.000 There's always catches, though.
02:07:11.000 Oh, for sure.
02:07:12.000 I think, actually, though, the kind of day-to-day struggle of, like, I've got to make something,
02:07:20.000 I've got to generate something useful, is actually kind of good, because it makes you strive.
02:07:25.000 It kind of makes you push.
02:07:29.000 I really like, you know, I feel like I'm literally living a dream because I started making these YouTube videos.
02:07:36.000 Now I've got this like crazy set and, you know, I'm able to like learn all about cinematography and somehow I get paid for it.
02:07:44.000 And it's kind of this wild thing, but at no point did I have to ask for anyone's permission.
02:07:49.000 Yeah.
02:07:49.000 Like that is like the like the nobody had to give me a chance.
02:07:53.000 Like you kind of create your own your own chance in a weird way.
02:07:57.000 That's the beauty of YouTube.
02:07:58.000 You know, and I had a conversation with Russell Brand about this, and I'm like, here's this guy who's a movie star.
02:08:05.000 He's this huge movie star who decides, you know what?
02:08:07.000 I'm just going to have a camera pointing at me, and I'm going to rant and rave and have these comedic takes on social issues and issues in the news.
02:08:17.000 And it's become massively popular.
02:08:19.000 And I'm like, one of the things that's interesting is, like, you're doing it the way anyone can do it.
02:08:23.000 Like, anyone can set up an iPhone and have it point at you, and you just start talking and then make a video.
02:08:29.000 And there's a lot of them out there.
02:08:30.000 Like, you're not doing anything different.
02:08:32.000 Right.
02:08:32.000 You know what's fascinating is, you know what was the biggest tell?
02:08:36.000 Like, when I felt like everything broke was when all the late-night people had to go home for COVID. Yeah!
02:08:43.000 Wasn't that crazy?
02:08:45.000 Amazing.
02:08:45.000 You gotta see, like, they went from, you know, they're TV people, and all of a sudden what's on TV looks like a YouTube video.
02:08:51.000 Yes.
02:08:52.000 And you go, oh my gosh.
02:08:54.000 You suck at this.
02:08:54.000 This whole time, like, I thought you guys were something, like, you're the same as me.
02:08:59.000 Yes.
02:09:00.000 Worse.
02:09:01.000 But like, worse.
02:09:01.000 Way worse.
02:09:02.000 I'm doing it myself.
02:09:03.000 You have this whole team of people, and this is all you can do? It kind of breaks the illusion, like seeing someone run a four-minute mile or whatever. You're like, oh, I can do this. I got this idea to make this crazy set because I saw somebody do this TED talk. I think their line was, you can't become Kanye in your living room, like, you've got to make an environment that speaks to what the show is. It's kind of a weird thing, because now some people do have a very popular show
02:09:33.000 and they do just do it from their living room. And that's a different appeal, because that's raw.
02:09:39.000 So there's an appeal to the raw and then there's also an appeal to like, you know, high production value and it's different things.
02:09:46.000 They both communicate a different like kind of appeal to the work.
02:09:50.000 But I was always obsessed with, like, there is no difference between YouTube and Hollywood, besides a little bit of knowledge, a little insider stuff, like, they kind of know tricks.
02:10:02.000 There's tricks of the trade.
02:10:03.000 They kind of have a little bit more money.
02:10:05.000 But I was like, you can hack this together now.
02:10:08.000 You can figure out ways to kind of like...
02:10:12.000 Almost get to, like, a Netflix level. So that's my dream, is to start pushing for real documentaries, or mini-documentaries, on YouTube that look like they could be on a Netflix or something like that. But never go to Netflix.
02:10:27.000 Yeah, but never take the deal like never go to the producer.
02:10:30.000 Just always be doing it yourself.
02:10:32.000 Yeah I think what you're saying about the late night things is so true.
02:10:36.000 Because I remember watching them do monologues with no audience.
02:10:40.000 And I was like, who said okay to this?
02:10:43.000 Why are you doing this?
02:10:44.000 There's not a fucking chance in hell that this is funny or gonna work.
02:10:49.000 And when you see those...
02:10:52.000 Flat, corny, late-night monologue jokes with no audience.
02:10:57.000 Those are so fucking cringy.
02:10:59.000 And you're also dealing with a lot of these people that are not stand-up comics.
02:11:03.000 So they don't even really, truly understand how to deliver it right.
02:11:09.000 Like, they don't have the chops.
02:11:11.000 What they're doing is just, like, reading off of a teleprompter, so a bunch of really good joke writers wrote them some stuff, and then they're playing to the audience, and the audience is, like, laughing so they get this feedback, and they know how to do that.
02:11:25.000 When they're just them and the camera, you're in the void now.
02:11:29.000 You're in deep space.
02:11:30.000 There's no one around you.
02:11:32.000 And it's fucking wild to watch.
02:11:34.000 It is wild.
02:11:35.000 Can you unlock, like, how different is that from, like, stand-up?
02:11:40.000 I'm just, like, a casual viewer of, like, the, like, late nights.
02:11:43.000 I mean, I know they say, like, applause, but is that real, like, laughter?
02:11:46.000 And, like, are they, like, saying, like, hey, clap?
02:11:49.000 Oh, there's someone who's doing this.
02:11:51.000 There's someone in front of the crowd.
02:11:53.000 There's a warm-up guy.
02:11:55.000 And generally the warm-up guy is a failed comedian or a middling comedian who's just trying to make it.
02:12:00.000 And they're doing the warm-up thing as like a side gig.
02:12:03.000 And there's people that are good at warm-up.
02:12:06.000 And the problem with being good at warm-up is it's a profitable job and it'll actually keep you from being good at stand-up.
02:12:12.000 Oh, you're like stuck.
02:12:13.000 Yeah, so you get stuck.
02:12:15.000 I've had friends that were stuck doing warm-up, and then some of them quit, and some of them didn't, and the ones that didn't are fucked.
02:12:22.000 Because those shows, they don't even exist anymore.
02:12:24.000 There's only a handful of those shows.
02:12:26.000 So if you see, how many talk shows are there?
02:12:29.000 There's very few.
02:12:31.000 There's Colbert, there's Jimmy Kimmel, there's Fallon.
02:12:34.000 There's a few.
02:12:35.000 There's only a few.
02:12:36.000 And so...
02:12:37.000 You know, they stand there and there's applause signs.
02:12:42.000 And then there's producers.
02:12:43.000 There's the warm-up guy that's literally telling people.
02:12:46.000 They're like, okay, explain to the people.
02:12:49.000 Okay, when Jimmy comes out, I want a big round of applause.
02:12:52.000 Let's practice this right now.
02:12:54.000 Ladies and gentlemen, Jimmy Fallon.
02:12:55.000 Yeah!
02:12:57.000 They practice it.
02:12:58.000 Yes, yes, yes.
02:12:59.000 They'll train the audience how to do it depending upon the set.
02:13:03.000 But I've seen them do that at different places.
02:13:05.000 And I had a friend who was a writer in the early days of Conan.
02:13:10.000 He's a buddy of mine who's a comic.
02:13:11.000 And I went to see one of the very, very early Conans.
02:13:15.000 So this is like...
02:13:18.000 I guess it was like the 90s or the early 2000s.
02:13:22.000 No, it had to be the 90s.
02:13:23.000 It was the 90s.
02:13:24.000 And they were reading their banter between Conan and, who's the other guy?
02:13:32.000 Andy Richter?
02:13:34.000 Oh, Andy Richter, yeah.
02:13:34.000 They were reading off of cue cards.
02:13:37.000 So they had a giant cue card.
02:13:38.000 The banter was fake.
02:13:39.000 So the banter, their dialogue back and forth was scripted.
02:13:44.000 So they were saying, so Andy, you know, I understand you got married.
02:13:50.000 And so they're reading it, and I'm watching the cast, like, this is madness.
02:13:54.000 Who approved this?
02:13:55.000 And it was terrible.
02:13:56.000 The early days of Conan, like, that sort of banter was fucking...
02:13:59.000 The thing about Conan is, like, he's this funny guy who's a funny writer, he's a really smart guy, and he had to figure out how to do the talk shows.
02:14:05.000 Yeah, he figured it out.
02:14:06.000 He figured it out.
02:14:07.000 But in the beginning, it was awful.
02:14:09.000 And I watched, like, the audience is being cheered on.
02:14:14.000 There's literal applause signs that flash to tell you when to applaud.
02:14:18.000 And like, we'll be right back.
02:14:20.000 Yay!
02:14:20.000 And everybody claps.
02:14:21.000 Is everybody really clapping that you're going to be right back?
02:14:23.000 Nobody gives a fuck if you're going to be right back.
02:14:25.000 I feel like they didn't know that, though.
02:14:27.000 Like, I feel like media literacy has kind of gone through the roof.
02:14:31.000 Like, so many people...
02:14:32.000 I guess it's maybe everyone has cameras now, so everyone's like sort of mini-producers now of their own show.
02:14:37.000 Sure.
02:14:37.000 And so they get it now, and so all of a sudden...
02:14:40.000 The craving for authenticity gets so much higher because now you're aware of like what a tele...
02:14:46.000 Everyone knows what a teleprompter is.
02:14:48.000 Everybody's sort of like...
02:14:48.000 Even though I didn't know exactly how it worked, I kind of was vaguely aware that, like, they kind of told you to applaud and, like...
02:14:54.000 And the laugh tracks in, you know, sitcoms were just canned laughter.
02:14:58.000 So I feel like as people realize the fakery, there's a craving for like, hey, can you do this for real?
02:15:05.000 Like can you not...
02:15:06.000 You know, I remember when I found out like none of the conversations were real.
02:15:10.000 I was like, what?
02:15:11.000 What do you mean it's not real?
02:15:13.000 Because it's all pretend.
02:15:15.000 They're all pretending that you really knew about my funny boat story.
02:15:19.000 And I had this quippy, you know, and I thought, wow, they're so charismatic.
02:15:22.000 And then you find out they've been rehearsing the story.
02:15:25.000 Yes, they go over with a producer on the phone.
02:15:27.000 And it's completely insane when you realize that you're like, oh, it's all fake.
02:15:31.000 And the illusion is sort of gone.
02:15:33.000 And so now...
02:15:35.000 I think one of the surprising things but also maybe obvious in hindsight things was why shows with no laugh track, less production, are more engaging is because there's more of a realization of, oh, there isn't like games here.
02:15:50.000 There's just two people talking.
02:15:51.000 They haven't rehearsed their lines.
02:15:54.000 I mean, I came on here.
02:15:56.000 There was no production notes.
02:15:58.000 There was no like, hey, we want to talk about this.
02:16:00.000 It was just like, hey, you want to come on?
02:16:02.000 And that's all it is.
02:16:04.000 We didn't even discuss what we're going to talk about, which is what I do with everybody.
02:16:08.000 I just have them come in and talk.
02:16:10.000 Yeah, well, it's fascinating, and I think that's partly to do with, like, why people enjoy the show, is that they know it's not, like, tricks and gimmicks.
02:16:21.000 I wonder, there's, like, it's funny to me as I'm thinking about it, I'm like, there's sort of, like, this, like, the world is accelerating in two directions towards, like, authenticity, and then, like, with all the beauty filters and, like, the fake AI voices, it's like, you can fake reality,
02:16:36.000 but we also crave reality at the same time.
02:16:39.000 Yeah, for sure.
02:16:40.000 Yeah, people are craving real human experiences.
02:16:45.000 And if you watch those late night shows, you never feel like you know that person, you never feel like you're there.
02:16:52.000 But if you're just talking, and you and I are just talking, someone is like on their iPhone or whatever they're doing, they're a fly on the wall.
02:17:02.000 They're here.
02:17:03.000 Yeah.
02:17:04.000 In a weird way.
02:17:06.000 I always thought that, like, live streaming your whole life would become big.
02:17:12.000 Well, that was the Truman Show, right?
02:17:13.000 Yeah, no, but I thought we would see it.
02:17:16.000 Like, I guess you see it with Twitch streamers who, like, stream, like, 12 hours a day.
02:17:19.000 But I kind of thought what would take off as a—I was kind of surprised it didn't—was, like— You just watch my whole life.
02:17:28.000 Some people did do that for a while, right?
02:17:30.000 They tried it.
02:17:31.000 Yeah, I was kind of surprised that didn't – because I thought like eventually you'd have celebrities who their whole life would be on display and like the authenticity of just sitting in a room with somebody with just – it's quiet.
02:17:43.000 I think people just got too weirded out by that.
02:17:45.000 But wasn't there a...
02:17:46.000 There was a movie.
02:17:47.000 I forget who was in the movie.
02:17:50.000 But there was a movie where someone had their whole life filmed.
02:17:53.000 And at the end, they rejected it and decided...
02:17:56.000 That's Truman Show, for sure.
02:17:56.000 But Truman Show was like...
02:17:58.000 That was the Jim Carrey movie, right?
02:18:00.000 Yeah, yeah, yeah.
02:18:00.000 But that was fake, right?
02:18:02.000 Like, he didn't know.
02:18:03.000 He didn't know they were filming his whole life, and he rejects it.
02:18:06.000 There was another one where the person became famous because they followed them around with cameras everywhere.
02:18:11.000 And at the end of it, he fell in love with a girl or something and it was over.
02:18:15.000 You know, there's always some corny fucking reason why he cancels it.
02:18:19.000 Do you know what I'm talking about, Jamie?
02:18:20.000 Yeah, 100%.
02:18:21.000 I'm trying to figure it out.
02:18:22.000 I thought for some reason Matthew McConaughey was in it, but I don't think that was it.
02:18:24.000 Maybe it was Ethan Hawke or someone, some famous person.
02:18:28.000 Something TV. Yeah, Ed TV? Yeah, there you go.
02:18:32.000 That's it.
02:18:32.000 Who was it?
02:18:34.000 Who was Ed TV? But it was that.
02:18:36.000 It was kind of the premise of the film.
02:18:38.000 It was Matthew McConaughey.
02:18:39.000 It was Matthew McConaughey.
02:18:40.000 Yeah, 1999. So in that movie, he gives up on everything after a while, right?
02:18:47.000 Oh, wow.
02:18:48.000 Yeah.
02:18:48.000 See, that was it.
02:18:50.000 You're live on Ed TV. So that was him, just a regular guy who became famous living his regular life.
02:18:56.000 How crazy is it that that was 99?
02:18:58.000 It is crazy.
02:18:59.000 And they kind of predict...
02:19:00.000 I mean, that was just on TV for a while.
02:19:02.000 Elizabeth Hurley.
02:19:03.000 She's still so hot.
02:19:04.000 How is she doing that?
02:19:05.000 The fuck is she taking?
02:19:07.000 She pulled it off.
02:19:08.000 But that was the thing.
02:19:10.000 It's like that this would be bad.
02:19:12.000 And they were sort of like saying, no one wants this.
02:19:15.000 Like, imagine if you got famous this way.
02:19:17.000 What a disaster.
02:19:19.000 Meanwhile, then you have social media influencers who are, you know, every single aspect of their life, they're live streaming, they're putting it on camera.
02:19:27.000 That's what Justin TV started as.
02:19:29.000 Yeah.
02:19:29.000 Eight years later, though.
02:19:30.000 He literally attached a webcam.
02:19:32.000 Yeah, that's right, to his baseball cap.
02:19:34.000 I've talked about that every way.
02:19:35.000 He's really interesting.
02:19:36.000 Justin TV was the first time we live streamed.
02:19:40.000 We live streamed on Justin TV in the green room of comedy clubs.
02:19:45.000 So what we do is like my buddy Red Band, Brian Red Band, we would go on the road together and we just we thought it'd be funny to just like livestream why we were there in a green room.
02:19:58.000 Yeah.
02:19:59.000 And so we just did that just for fucking around and it was just totally like, yeah, that's us in the green room.
02:20:07.000 It's still there.
02:20:08.000 That's hilarious.
02:20:10.000 I think that's the Hollywood improv, right?
02:20:12.000 Is that what that is?
02:20:13.000 One of them was that, the other one was Pasadena.
02:20:15.000 That's back in my full beard days.
02:20:17.000 So we did that before the podcast itself and just for fun.
02:20:22.000 And so there was like all these different versions of it that I tried out.
02:20:26.000 Where I was thinking, like, there's got to be a way to do something where I don't have to go to someone and say, hey, can you give me a show?
02:20:33.000 And then when I saw Tom Green's show, there was two things that gave me the big idea.
02:20:40.000 One of them was Anthony Cumia from Opie & Anthony.
02:20:42.000 He did this thing called Live from the Compound where he had his house set up with a green room in his basement.
02:20:49.000 And Anthony's a psycho.
02:20:50.000 So he was, like, singing karaoke while holding a machine gun.
02:20:53.000 It was like...
02:20:54.000 It was so crazy, because he had all this money, right?
02:20:56.000 He's very wealthy, so he had a full production set.
02:21:01.000 He built a set in his basement.
02:21:03.000 And I was like, this is wild.
02:21:06.000 He can just do it.
02:21:06.000 But he was already on this Opie and Anthony show.
02:21:09.000 And so he decided for fun with his friends, like he had, you know, a fucking like a full bar down there with like Guinness on tap.
02:21:16.000 And they were just drinking and being ridiculous.
02:21:18.000 And he was doing a talk show and just having fun, just being silly with his friends.
02:21:22.000 And I was like, I could do that.
02:21:24.000 And so we started doing something like that with a laptop.
02:21:27.000 And when I went to Tom Green's house, Tom Green had turned his home into a television studio.
02:21:34.000 And it was on the internet.
02:21:36.000 And this was...
02:21:36.000 2007?
02:21:40.000 Somewhere around then?
02:21:41.000 And so he had, like, these fucking cables running through his living room and then he had a server room and everything like that.
02:21:47.000 And he takes me on this tour, and I'm like... And there's a video of me sitting next to Tom Green, because he had it set up just like a regular talk show, where he had a desk like Johnny Carson, and he was sitting there and he had screens, and this is me explaining...
02:22:04.000 Why I think this is going to be the future.
02:22:07.000 For sure they'd be assholes.
02:22:09.000 There's no super cool hecklers.
02:22:10.000 They don't exist.
02:22:11.000 No, this is not about that, but this is...
02:22:14.000 That's me too.
02:22:15.000 But there is one video of me figuring it out.
02:22:18.000 Yeah, that's like a livestream show.
02:22:21.000 Yes.
02:22:21.000 That's it.
02:22:22.000 So this is like...
02:22:23.000 I think this is awesome.
02:22:24.000 Thank you, man.
02:22:25.000 This is the craziest thing ever.
02:22:27.000 It really is different, you know, than television.
02:22:29.000 This is way better.
02:22:30.000 It's like radio, but it's like television.
02:22:32.000 And the genre is different because we can sit here and ram...
02:22:35.000 You know, there isn't that time constraint.
02:22:36.000 There isn't that pressure.
02:22:38.000 I mean, you know, we want to keep it moving.
02:22:40.000 Well, not only that, there's not a corporate pressure.
02:22:42.000 You can't just express yourself because you're expressing yourself to someone who's selling advertising space.
02:22:48.000 That's crazy.
02:22:49.000 You just need to keep doing this.
02:22:50.000 You called this out.
02:22:51.000 We need to figure out how you make money from this.
02:22:53.000 Yeah.
02:22:54.000 I've got a lot of neat ideas I want to talk to you about.
02:22:55.000 Isn't that wild?
02:22:57.000 Take a little bit from the big wigs, right?
02:22:59.000 Dude, this is, I mean, they don't need to exist.
02:23:03.000 They're non-creative people.
02:23:04.000 We talked about this before the show.
02:23:05.000 They're non-creative people who are controlling creative things.
02:23:09.000 And they want to have their input.
02:23:10.000 Just abandon them.
02:23:12.000 Abandon ship!
02:23:13.000 Isn't that crazy?
02:23:14.000 Wow.
02:23:16.000 That...
02:23:16.000 Called it.
02:23:18.000 Yeah, you kind of did.
02:23:20.000 You know, I just realized why no one live streams their whole life.
02:23:22.000 I just realized it.
02:23:23.000 I remember people were trying and like they would go out and they'd call it IRL live streaming and you'd go to like the store.
02:23:29.000 You know what the problem was?
02:23:30.000 What?
02:23:31.000 People would swat you.
02:23:32.000 Oh.
02:23:33.000 So they'd like call in a bomb threat or something.
02:23:35.000 Oh God.
02:23:36.000 So the problem is you get enough people watching live.
02:23:39.000 One of them is a psychopath.
02:23:40.000 Or they just want to get attention.
02:23:43.000 Who knows why?
02:23:43.000 Tim Pool has that problem.
02:23:45.000 He's been swatted.
02:23:46.000 How many times has Tim Pool been swatted?
02:23:49.000 Multiple times.
02:23:50.000 Many times.
02:23:51.000 It's a real issue with him.
02:23:52.000 It's an issue with live streamers, though, because you get the reactions.
02:23:56.000 Right, right, right.
02:24:17.000 So that's what's bad about IRL. That's only one thing.
02:24:21.000 The other thing is, what is life then?
02:24:24.000 Is life a performance?
02:24:26.000 People are going to let that stop them, though.
02:24:28.000 But are you capable of being so in the moment that you are just yourself, no matter what?
02:24:34.000 Even if cameras are on, you would behave and exist the same way you would if there's no cameras on.
02:24:39.000 No.
02:24:39.000 I don't think most people...
02:24:42.000 Would be capable of that.
02:24:43.000 I mean, I enjoy keeping my private life private, my public life public.
02:24:48.000 I think that's pretty normal, and I think things get weird when everything's online, your family's online.
02:24:54.000 I've seen people who, they put out everything.
02:24:57.000 They put out their kids.
02:24:58.000 And they do it for the clicks.
02:25:01.000 That's what's weird.
02:25:02.000 And you're also not asking the kids.
02:25:05.000 Your kids are going to get famous when they're babies and then they don't have any say in it.
02:25:10.000 And then as they get older, people know them.
02:25:12.000 And then you run into all sorts of security issues because of that too.
02:25:17.000 Yeah.
02:25:17.000 It's not wise, and there's a lot of people that don't think.
02:25:21.000 They just do it.
02:25:23.000 It's also like an opportunity, like kids' channels were big on YouTube, where they were running these, sorry, family channels are what they called them, because you'd watch the family together.
02:25:34.000 Yeah.
02:25:35.000 And then you get like, your kids would like to watch their kids.
02:25:37.000 And people grew multi-million dollar brands in the back of that.
02:25:41.000 And it's like, by that point, it's too late to stop because you got a mortgage.
02:25:44.000 You're depending on that money coming in.
02:25:46.000 And so you can't stop.
02:25:47.000 Your kid better get on a camera.
02:25:49.000 I just want to know, like, do you tell your kid, like, get on your mark?
02:25:52.000 Like, hey, can you react to that again?
02:25:54.000 Can you help me with this thumbnail?
02:25:56.000 Like, that's crazy.
02:25:57.000 And then if the kid becomes famous when they're young, they're in so much trouble.
02:26:01.000 There's very few people that ever survive being famous when they're young.
02:26:05.000 Very few.
02:26:06.000 They all come out fucked up.
02:26:08.000 It's not a normal way to develop.
02:26:10.000 Fame is a drug that you have to develop a tolerance for.
02:26:14.000 And if you don't develop that tolerance, you actually develop with that drug.
02:26:19.000 Like, instead of, like, experiencing adversity, instead of developing your personality, you know, to, like, realize, like, what is wrong with the way I communicate?
02:26:30.000 Why do people get mad at me?
02:26:31.000 Why do people like me?
02:26:34.000 You sort it out as a human.
02:26:36.000 It's how you interact with the world.
02:26:38.000 It's why kids, you know, pick on each other and they're mean to each other and they're figuring out how to communicate and be social.
02:26:45.000 If you're five fucking years old and you're already famous, you're in deep shit.
02:26:49.000 And they're all in deep shit.
02:26:51.000 I've met quite a few of them now.
02:26:54.000 I've interviewed quite a few of them on this podcast.
02:26:56.000 I've met quite a few of them in real life and they're all fucked.
02:26:59.000 Everyone who becomes famous when they're a child is fucked.
02:27:02.000 I don't know...
02:27:03.000 I mean...
02:27:04.000 I was going to ask, like, do we know of anybody who...
02:27:07.000 Just navigating mega fame in general, I don't think I've seen many people do it without kind of getting eaten a little bit.
02:27:15.000 Yeah, you get eaten a little bit.
02:27:17.000 You need to do something to mitigate that.
02:27:20.000 You need to do something real.
02:27:22.000 And if you do not do something real, then the responses you get, if that's what you're living for, and if your worth and your value is based on people's attention to you and people's interaction with you,
02:27:40.000 that's not good.
02:27:41.000 It's very bad.
02:27:43.000 And that's why, I mean, also, how many of them are narcissists to begin with?
02:27:48.000 And how much of that narcissistic tendency gets fed by being famous?
02:27:53.000 It's just like, I think with my phone, I've sort of given myself some low-grade ADHD. I think too much of the attention online makes you into a narcissist. Even if you weren't one originally, it has the potential to do so if you don't actively mitigate it.
02:28:11.000 One of the strangest things is like when you get hot online, everybody wants to be your friend.
02:28:18.000 All of a sudden these people come out from the woodwork and all of a sudden everyone wants to be your friend.
02:28:22.000 And then when you're not hot again, now it's like you don't exist.
02:28:25.000 And that's a bad way to experience life that your whole identity and your whole friendship base and everything's wrapped up.
02:28:34.000 With how you're doing online and like, I know for me at least, I try to just segment my life to where the online thing is online and all my real friends are just in my city, just like kind of regular people, have different jobs.
02:28:49.000 I think it's kind of important to detach yourself so that when things aren't going well, it's fine.
02:28:54.000 When things are going well, it's fine.
02:28:56.000 There's like a stabilizing something.
02:28:59.000 I feel like you're describing Hollywood.
02:29:02.000 You know, you're describing the problems with Hollywood.
02:29:04.000 In Hollywood, when you make it, like if you're in a movie and you're doing well, everybody loves you.
02:29:10.000 Oh, Coffeezilla, come on through the red carpet.
02:29:12.000 Let's go.
02:29:14.000 Coffeezilla is hot now.
02:29:15.000 We want to put him in this movie and they want to put him on this show.
02:29:18.000 We want to do this.
02:29:19.000 And then when you're not, no one wants to talk to you.
02:29:21.000 Doesn't that break you psychologically, though?
02:29:23.000 Of course.
02:29:24.000 That's why they're all crazy.
02:29:26.000 In Hollywood, it's even worse, right?
02:29:28.000 Because you don't get to choose your own destiny.
02:29:30.000 You've developed your own show and you've created your own thing.
02:29:34.000 You haven't been chosen.
02:29:36.000 In Hollywood, the problem is you're being chosen for everything.
02:29:38.000 So you're being cast in these things.
02:29:40.000 So you have to deal with people that approve you or pick you.
02:29:43.000 So you're formulating your personality based on whatever the zeitgeist is, whatever the...
02:29:50.000 ideology of most of the producers is. Like, if all of Hollywood was right-wing, right, if all the producers and all the executives and all the studios were all very conservative and right-wing, all actors would be conservative. They would all be pro-life.
02:30:09.000 They would all be First Amendment, Second Amendment happy.
02:30:14.000 They would all carry guns.
02:30:16.000 It would be 100% compliance, the same way it is with left-wing.
02:30:21.000 They're not necessarily people that think that way.
02:30:24.000 They think that way because that is the way to fit in.
02:30:28.000 And be successful.
02:30:29.000 So you take people that already have this exorbitant need for attention and then you bring them into an environment where they have to be chosen.
02:30:35.000 So you have to figure out what gets me chosen.
02:30:38.000 So you form your ideas and opinions based on what's going to be the most successful.
02:30:43.000 It's a mating strategy.
02:30:44.000 It's weird because the fact you need to be chosen sort of makes you play the same game.
02:30:50.000 Yeah.
02:31:06.000 That's why when those people do make it and they do get pushed through that red carpet, come on through Tom Cruise.
02:31:12.000 They're all fucking crazy.
02:31:14.000 And a lot of them treat other people like shit because they want to let you know that they're a part of the chosen class.
02:31:23.000 So that's like this thing about certain celebrities being assholes to regular people.
02:31:29.000 Like, why do they treat people like that?
02:31:31.000 Well, the same reason why royalty does it.
02:31:34.000 When you see the queen, you're supposed to bow.
02:31:37.000 This is how it goes down.
02:31:38.000 That's why they became the queen in the first place.
02:31:41.000 That's why they became a star in the first place.
02:31:43.000 Because they want to be that person that just gets fucking exorbitant amounts of love and attention.
02:31:49.000 And it's very unhealthy.
02:31:51.000 And it's good, I think, that it's now becoming possible that you can be like a Mr. Beast or something and not be in Hollywood.
02:31:59.000 He's like in North Carolina or whatever.
02:32:01.000 And he can just do his own thing.
02:32:03.000 He can start his own.
02:32:04.000 And he's as big of a brand as anybody.
02:32:07.000 And it's like, it just doesn't matter.
02:32:09.000 He doesn't have to kind of play the same games.
02:32:12.000 I think that is like...
02:32:16.000 Sometimes I think changes in technology are neutral.
02:32:19.000 It's kind of like you win some, you lose some.
02:32:20.000 I think that is a distinct change for the better that we've kind of decentralized Hollywood a little bit.
02:32:25.000 It's like you can just start your own show.
02:32:28.000 We talked about being subject to the gatekeepers but even subject to that kind of mentality of everything is about success and fame.
02:32:35.000 Yeah.
02:32:37.000 That's the currency of Hollywood.
02:32:39.000 It's also the motivation.
02:32:40.000 What is the motivation to do it in the first place?
02:32:43.000 A lot of the people that are in Hollywood, their motivation is purely for attention.
02:32:47.000 Their motivation is purely to become successful and famous.
02:32:51.000 Whereas his motivation seems to be to have fun and to do things with the money that is actually altruistic and good and beneficial and charitable.
02:33:00.000 He's a really good guy.
02:33:01.000 That's one of the appeals.
02:33:03.000 And also, there's no one filtering him.
02:33:05.000 That's who he is.
02:33:07.000 That's that guy.
02:33:08.000 He's very smart and very ambitious, but he's also not really money-hungry, and he dumps most of the money back into the production of his show.
02:33:15.000 He's legit.
02:33:16.000 I mean, you know, I have a lot of...
02:33:19.000 A lot of people you meet behind the scenes and they're like, they're different, you know?
02:33:23.000 It's like the same guy you meet.
02:33:24.000 And that's always a huge letdown.
02:33:26.000 I've had so many examples of that, but like, but he was one of the first, like, not first, but there are a lot of guys, but the big, the biggest stars, I guess, are the ones that are most likely.
02:33:35.000 You're like, ah, you're a bit different.
02:33:36.000 But he was like, when I met him, we talked a bit and it's just like, dude, this guy's legit.
02:33:41.000 Like, he's the real deal.
02:33:42.000 Yeah, that is who you get.
02:33:44.000 The guy that you see when he's doing those videos with his friends joking around and making them do stunts and pranks and all the different little games that he comes up with where people can win money.
02:33:54.000 That's really who he is.
02:33:55.000 YouTube's lucky because it could be anybody.
02:33:57.000 They don't select who is on top.
02:34:00.000 And they're fortunate because it could just be some super narcissistic monster.
02:34:06.000 I don't know if it would work.
02:34:08.000 Oh, you're saying it's selected for like...
02:34:10.000 Yeah, because a super narcissistic monster I don't think would create something that's relatable.
02:34:16.000 That's a good point.
02:34:18.000 There was this huge YouTube channel.
02:34:21.000 I think they still might be the biggest, like T-Series or something.
02:34:23.000 There's some random corporation.
02:34:25.000 I think there are ways to growth hack it maybe, but you're right.
02:34:28.000 You wouldn't create such a brand.
02:34:30.000 Right.
02:34:30.000 You couldn't fake it forever.
02:34:32.000 You couldn't fake authenticity.
02:34:34.000 I don't think you can.
02:34:35.000 I think after a while it gets exposed and people realize you're full of shit.
02:34:39.000 Yeah, the longer you talk, especially on the longer you...
02:34:44.000 If you're in little sound bites, you can pull it off for a little bit, but the longer you talk, the more it gets shown.
02:34:50.000 Unless you just have amazing stamina for bullshit.
02:34:53.000 Have you ever talked to somebody like that?
02:34:54.000 I'm sure, right?
02:34:56.000 Statistically, there has to be someone who came on the show and you're like, oh my gosh.
02:34:59.000 After like an hour?
02:35:00.000 That they're full of shit?
02:35:01.000 Yeah.
02:35:02.000 Oh yeah!
02:35:03.000 Oh yeah!
02:35:04.000 For sure!
02:35:05.000 There's quite a few people that I talk to that are full of shit.
02:35:07.000 And it's unfortunate.
02:35:08.000 Like, sometimes people, like...
02:35:10.000 You know, I've had people come on where I don't realize until, like, an hour and a half, two hours in.
02:35:15.000 And then I start asking them certain questions, and you realize, like, there's something fucking funny about your answers here.
02:35:21.000 Like, this is not...
02:35:22.000 And then we'll research them after the show, like, oh, good lord.
02:35:25.000 You know?
02:35:29.000 It's an issue, you know?
02:35:31.000 It's like...
02:35:31.000 And there's also people that just...
02:35:35.000 Whatever their motivations are, they're not good.
02:35:38.000 What is their motivation to do a show in the first place?
02:35:42.000 Is their motivation just to try to make the most amount of money?
02:35:45.000 Or are they trying to do a good show?
02:35:47.000 If you're trying to do a good show and you keep working at it, it'll get better.
02:35:50.000 Go watch my early shows.
02:35:51.000 They fucking suck.
02:35:53.000 You get better if you're actually just trying to do it and get better at it.
02:35:57.000 But if your motivation is just to make money, somewhere along the line usually you slide off.
02:36:03.000 I think most people who want to make money just go into finance.
02:36:07.000 Yeah, but they also want attention.
02:36:09.000 Oh, right.
02:36:10.000 They want that currency and power.
02:36:12.000 And then once you've gotten the attention, that's the thing about fame, right?
02:36:16.000 If you go to a store and there's a security guard at that store, you don't think, look at this poor fuck.
02:36:22.000 He's a security guard at a store.
02:36:23.000 You just think he's a guy.
02:36:25.000 Like, hey, man, what's up?
02:36:26.000 How you doing?
02:36:26.000 You don't treat him badly.
02:36:27.000 But if you go there and it's Will Smith, Will Smith has lost all his money.
02:36:32.000 Now he's a security guard at the store.
02:36:33.000 Look at this fucking loser.
02:36:35.000 Let's go visit Will Smith.
02:36:36.000 And you laugh at him.
02:36:38.000 How much you make here, Will?
02:36:40.000 What, are you going to slap me to kick me out?
02:36:41.000 Yeah, yeah, yeah.
02:36:42.000 You would be free to do that.
02:36:45.000 And that was the case with Gary Coleman.
02:36:47.000 You remember Gary Coleman?
02:36:48.000 Oh, the...
02:36:49.000 Yeah, the tiny guy.
02:36:50.000 He used to have that show, Different Strokes.
02:36:52.000 And he was famous on television, but then he...
02:36:54.000 I don't know what happened.
02:36:56.000 Lost all his money.
02:36:57.000 And he was a security guard at a studio.
02:37:00.000 And they hired him to be the guy that like when people drive through they meet him and then people realized he was there and this was like before social media.
02:37:08.000 So this was like early on and it was a real problem because people would go there just to mock him and make fun of him because someone who used to be famous and now is not is a loser.
02:37:19.000 But someone who's just never been famous is just a person.
02:37:22.000 It's very interesting.
02:37:24.000 But it's so weird, because everyone who achieves any level of notoriety knows how temporary it is. More than even the audience, they know that there's a shelf life on everything.
02:37:36.000 Very few people make it an entire career. I'm always thinking, it's going to be over next month.
02:37:45.000 Right.
02:37:46.000 Because that is the nature of, especially online fame, is even more fleeting than the old days of movie stars.
02:37:52.000 It's even less.
02:37:53.000 So I don't understand why that is.
02:37:56.000 It feels like...
02:37:59.000 I don't really buy into that.
02:38:01.000 I mean, I think it's just people are people and you have your moment in like the spotlight for one reason or another.
02:38:08.000 It's usually not about who you are.
02:38:10.000 It's just you're saying something at a time that resonates and things don't resonate forever.
02:38:14.000 Right, but think of your perspective and where you're coming from.
02:38:18.000 You're a 28-year-old guy who is doing really well right now, so you are in the spotlight, and you haven't had a lot of time outside of it.
02:38:26.000 I mean, how old were you when you started your show?
02:38:30.000 I think I got actual attention maybe 26, 25, 26?
02:38:34.000 Yeah.
02:38:35.000 So for the last three years.
02:38:36.000 So you didn't go through like this long, terrible period of fucking hating life.
02:38:42.000 Darkness and depression.
02:38:42.000 Yeah.
02:38:43.000 So a lot of people fucking hate life and they look at someone who is a movie star or a television star like Gary Coleman and they go, wow.
02:38:53.000 How the fuck?
02:38:54.000 That guy, how's he doing it?
02:38:56.000 He's got a fucking Ferrari, he's got a this, he's got a that.
02:39:00.000 And then when they don't have it, like, ha ha!
02:39:03.000 You're less than one of them.
02:39:05.000 You're less than a normal person because you're a person that used to be free of it.
02:39:09.000 You're a person that used to be...
02:39:11.000 We love a story of some movie star that spent all their money and now they're broke and crazy.
02:39:17.000 I remember, who's the woman, was it Margot Kidder?
02:39:20.000 Is that her name?
02:39:21.000 The woman from Superman?
02:39:23.000 There was a woman who played Lois Lane in the early Supermans with Christopher Reeve. And she went crazy, and, like, she lost all of her teeth, and someone found her in the bushes somewhere. It was real sad, like real mental illness problems. And I remember there's this deep fascination with this person who was a movie star at one point in time and then had completely fallen apart. Like, what was the story with her?
02:39:48.000 Do you remember the story with her Jamie?
02:39:50.000 You nailed as much as I remembered, yeah.
02:39:52.000 Yeah, something happened.
02:39:54.000 She had some sort of a mental health breakdown, and I'm sure some of that had to do with fame and society and acting and just the world that they live in of the movie star.
02:40:04.000 And then also the women's world of a movie star, which is a fucking much more brutal world.
02:40:09.000 That's brutal.
02:40:09.000 Because, you know, I was talking about Elizabeth Hurley.
02:40:11.000 She's the rare, the rare that stays hot.
02:40:14.000 She's hot and she's like fucking 80 years old.
02:40:17.000 She had an accident, I guess, that left her paralyzed.
02:40:19.000 Oh my gosh.
02:40:20.000 She lost some money and had some issues with.
02:40:22.000 Oh boy.
02:40:24.000 This...
02:40:24.000 I will say, like, one of the challenges is you kind of have to...
02:40:30.000 You don't know how long you're going to stay relevant.
02:40:34.000 And then if you don't make the money then, now what do you do?
02:40:37.000 I guess is the point.
02:40:40.000 And then...
02:40:41.000 So if you haven't set yourself up...
02:40:44.000 And a lot of these people, they think it's going to be lasting forever because their agents tell them it's going to last forever.
02:40:49.000 So they just spend all their money.
02:40:51.000 And then especially if they don't pick up any skills, one of the things like with actors is all you learn is acting.
02:40:59.000 I mean, one of the interesting things now, which is kind of fascinating about, like, modern, like, you know, people who grew up on TikTok and, like, the YouTube era, is you kind of have to learn, like, marketing.
02:41:11.000 You have to learn video editing.
02:41:14.000 You have to learn...
02:41:15.000 So you can pick up skills to where...
02:41:17.000 You're never going to be completely, you know... I know a lot of YouTubers who now work for other YouTubers because, like, they stopped being relevant, but they're like, I understand content.
02:41:25.000 I understand how this stuff works.
02:41:27.000 They're not just a face.
02:41:29.000 They're not just like a pretty face.
02:41:30.000 They have actual tangible skills beyond that.
02:41:34.000 That is an issue, though.
02:41:35.000 I mean, I can understand why that's a problem.
02:41:37.000 I think here's a big issue with someone like yourself.
02:41:40.000 What if YouTube goes away?
02:41:42.000 That's the real issue.
02:41:44.000 Sure.
02:41:44.000 If you're relegated to one platform, and that platform, what if the platform decides for whatever strange reason?
02:41:52.000 Like, what if they get pressure from someone who you've outed?
02:41:55.000 Right.
02:41:55.000 And they come up with some bogus reason to strike your account and delete your account.
02:42:00.000 That's a real issue.
02:42:02.000 If you're beholden to one company, that can be a real problem.
02:42:06.000 It is a huge problem.
02:42:08.000 Right now, the number one video sharing site in the world is basically YouTube, and that's essentially it.
02:42:14.000 I mean, there's not really anyone else.
02:42:16.000 There's, like, alternatives, like Rumble, but, you know, like, that's kind of...
02:42:21.000 I don't know why it's perceived as kind of like a right-wing thing.
02:42:25.000 Sort of, but then, you know, Russell Brand's on it, and, you know, Glenn Greenwald's on it.
02:42:31.000 Let's say it's a political commentary thing.
02:42:33.000 I mean, I don't know if, like, mainstream, like, just, like, random creators are doing really well.
02:42:39.000 I don't know.
02:42:40.000 Maybe it's there, maybe it's not.
02:42:41.000 I think you're pointing out, though, a very good point, which is like, as much as we talk about the decentralization of gatekeepers, there is one gatekeeper to rule them all still, for someone like me, that is YouTube.
02:42:54.000 I mean, I would like to think that, you know, throughout, you learn enough about...
02:43:01.000 making stuff, making content, that you could move.
02:43:04.000 I would probably try to transition into some like production role.
02:43:08.000 Yeah.
02:43:08.000 Start a production company.
02:43:09.000 I mean, I love...
02:43:10.000 But I think you would still enjoy doing the thing you do.
02:43:13.000 Oh, of course.
02:43:13.000 You'd have to figure out a way to do it somewhere else.
02:43:16.000 But also you'd have to figure out a way to bring...
02:43:18.000 Like here's the other problem.
02:43:20.000 Social media, right?
02:43:21.000 Social media is where you use to promote the thing that you're doing on YouTube.
02:43:26.000 So what if that goes away?
02:43:28.000 What if something...
02:43:28.000 Like we have to assume that...
02:43:31.000 Twitter was on the verge of bankruptcy, apparently, when Elon bought it.
02:43:35.000 Sure.
02:43:35.000 It was fast-tracking to bankruptcy.
02:43:37.000 What if someone incompetent bought it and then ran it into the ground?
02:43:41.000 Then it doesn't exist anymore.
02:43:42.000 Then all those people that use Twitter to promote their businesses, stand-up comedians that use it to promote their tour dates, like, they're fucked now.
02:43:49.000 It's gone.
02:43:49.000 Now you don't have that vehicle, and so your ability to access your fans is completely gone.
02:43:54.000 Yeah, you don't own any of your data.
02:43:56.000 Right.
02:43:56.000 So you don't own any of your subscribers' data.
02:43:58.000 You don't own any of that stuff yourself.
02:44:02.000 It's a real...
02:44:03.000 It's a real issue.
02:44:03.000 It's a real challenge.
02:44:04.000 I mean, one of the things...
02:44:05.000 They also control what you can talk about.
02:44:08.000 So, when I was doing, you know, my first show...
02:44:14.000 I had this video where I wanted to explore smoking and vapes through the lens of the FDA and how they regulated vaping and they sort of went after vaping.
02:44:25.000 It's a problem, but it also seems like it's a lot healthier than just smoking cigarettes.
02:44:30.000 Cigarettes are the worst thing in the world for any human to be doing, although it's very fun.
02:44:35.000 But they're horrible for you.
02:44:37.000 And so I did a video about that.
02:44:39.000 YouTube age-gated it.
02:44:44.000 Not only no monetization, which, you know, is acceptable.
02:44:46.000 It's just kind of the cost of being on YouTube.
02:44:48.000 You sometimes get demonetized, whatever.
02:44:50.000 The reach was killed.
02:44:51.000 So now this video, which everyone loved, nobody can watch, or it won't get recommended in, like, you know, the recommended feed.
02:44:58.000 There's also a problem that now you're in a specific category.
02:45:01.000 Like, I don't know how their algorithm works, but if you do get flagged for something, you could get put in a problematic category.
02:45:08.000 Right.
02:45:09.000 Which makes you shadow banned or less likely to be recommended.
02:45:12.000 Right.
02:45:13.000 I think they say their official stance is that they do it on a video-by-video basis.
02:45:20.000 I don't actually know.
02:45:21.000 I mean, it's kind of hard to figure out, you know, what's true, what's not.
02:45:24.000 But I will say, like, did I ever do a video about that again?
02:45:28.000 No.
02:45:28.000 No.
02:45:29.000 Yeah, you self-censor.
02:45:30.000 Yeah.
02:45:31.000 And that's what happens to people.
02:45:32.000 That's a big problem.
02:45:32.000 That happened during COVID with a lot of people.
02:45:34.000 You know, people wanted to talk about issues like the lab leak hypothesis.
02:45:38.000 Usually they're important issues, too.
02:45:40.000 Yeah.
02:45:40.000 That's the problem.
02:45:40.000 Right.
02:45:41.000 They're controversial, so they are important.
02:45:43.000 Right.
02:45:44.000 But it's like, you know, I understand YouTube's perspective.
02:45:47.000 They have...
02:45:49.000 I don't know how many.
02:45:50.000 Maybe they're supporting hundreds of thousands of people's livelihood.
02:45:54.000 And they're like, do we want to risk it all so somebody can say some wild stuff?
02:46:01.000 Right.
02:46:01.000 And then the advertisers pull out.
02:46:03.000 They lose X percentage of the revenue.
02:46:05.000 And then whoever that producer is that allowed that channel to exist...
02:46:11.000 Gone.
02:46:11.000 Now that person gets fired, and their success in this company is based on whether or not the company's bringing in revenue.
02:46:18.000 And if you're allowing all these people to say things that are really terrible to the bottom line of whoever is paying money for advertising, that's not good.
02:46:29.000 What I've said is like, I think a lot of these, you know, some of these companies, they achieve near monopoly statuses.
02:46:36.000 It's hard to argue that some of these companies aren't close to a monopoly in their specific like domain that they're good at.
02:46:42.000 Because, you know, if you're going to make a replica of YouTube, you've seen how hard it is with Rumble.
02:46:45.000 It's not like you're just video sharing.
02:46:48.000 It's like you're video sharing.
02:46:49.000 There's AI. There's copyright ID. I think they said they spent, like, 10 million dollars, or 100 million, to build the copyright ID. So if you want to compete with them, you need to have at least that, just to build a copyright ID system on par.
02:47:02.000 Then you got to go host all the video.
02:47:04.000 You got to find the AdWords targeting.
02:47:06.000 Google is the best ad targeting in the world.
02:47:08.000 They're not going to give you access to their system if you're a competitor.
02:47:11.000 They're not going to give you the same deal.
02:47:12.000 So it's like this challenge of, okay, who can really compete when there's such a high barrier to entry?
02:47:19.000 So I'm thinking, like, why are these things not considered some sort of public good, when we accept that it's so hard to compete meaningfully with these things that are so important to our public discourse?
02:47:35.000 I understand the whole argument of like free speech is just freedom to speak against the government, not freedom from a corporation.
02:47:41.000 But what I'm saying is when all our discourse is online, why are these companies not some form of like almost like a utility company?
02:47:49.000 Like, yes, at some level you don't have the right to monetize, but do you have the right to at least say something?
02:47:57.000 Yeah, that's a good point.
02:47:59.000 And that was the point about Twitter.
02:48:00.000 That was the conversation about Twitter being the town square and that it should be regulated like some sort of a utility.
02:48:06.000 And I could see that argument.
02:48:07.000 And also, when you think about the concept of free speech and the First Amendment, none of that existed with social media.
02:48:16.000 And they would have...
02:48:18.000 Imagine trying to wrap your head around social media when they're drafting the Constitution with feathers.
02:48:24.000 They're literally writing with a fucking quill.
02:48:26.000 They had no idea what they were saying.
02:48:28.000 So they were just trying to get people to be able to discuss things without being restricted by the government to stifle tyranny.
02:48:36.000 Because at the time, the tyranny was government.
02:48:40.000 That's the only people who had the kind of power and oversight to where they could literally stop you from saying anything as a government.
02:48:46.000 Now it's like, okay, you want to say something, the person who's going to stop you from saying it is probably not the government.
02:48:52.000 It's probably some random tech executive.
02:48:56.000 Yeah, random tech executive who has an ideological bias.
02:49:01.000 Unelected from a...
02:49:02.000 Yeah, exactly.
02:49:03.000 It's this strange thing, and I think it's actually a very...
02:49:09.000 It should be a universal issue because I think conservatives all don't want to be censored and that's usually who gets censored.
02:49:15.000 But left-wing people are all about decentralized power.
02:49:19.000 I mean, that's, like, the idea: democracy, more elected, not just, like, these unelected people, but get more of, like, kind of a group say in powerful decisions.
02:49:28.000 Well, then they also should have a problem with the decisions, even though they happen to kind of go a certain way,
02:49:35.000 still being made by unelected people who can just have arbitrary biases.
02:49:43.000 That's the thing.
02:49:44.000 One day, Twitter's owned by...
02:49:46.000 I forgot the last...
02:49:48.000 Who was the leader of health and safety or whatever at Twitter?
02:49:54.000 Oh, Vijaya?
02:49:55.000 Yeah, yeah, yeah.
02:49:55.000 One day it's her, the next day it's Elon Musk.
02:49:58.000 And they have different opinions on things.
02:50:00.000 And so do you want to be subject to both of their whims?
02:50:04.000 Or do you want there to be some sort of thing on, you know, I don't know, on the books that we can at least sort of have a public vote on it?
02:50:12.000 Well, there's this narrative that's being bantered about now that Twitter's no longer safe from trolls.
02:50:18.000 But Twitter was never safe from trolls.
02:50:21.000 It's just they used to be just left-wing trolls.
02:50:23.000 Now you get right-wing trolls, too.
02:50:26.000 It's more of a center.
02:50:30.000 It's like the idea that Twitter leans right now.
02:50:33.000 No, it doesn't.
02:50:35.000 How many left-wing people that are addicted to Twitter stayed on?
02:50:38.000 Most of them.
02:50:39.000 A few goofy celebrities valiantly declared they're leaving Twitter.
02:50:46.000 One of them was my friend.
02:50:47.000 I was like, what the fuck are you doing?
02:50:49.000 Why are you posting that you're leaving?
02:50:52.000 So goofy.
02:50:53.000 And you don't even know what you're saying.
02:50:55.000 You're just saying this because you think this is going to appeal to your base.
02:50:59.000 That you're so noble.
02:51:01.000 You're going to leave before the right-wing trolls come back.
02:51:04.000 Cut the fucking shit.
02:51:05.000 And the good thing about...
02:51:08.000 people being allowed to speak is that you allow them to put things out there that can be ridiculed by everybody.
02:51:14.000 And so if you really oppose these right-wing ideas, let them post them and then post something that ridicules them.
02:51:21.000 Post something that refutes them.
02:51:22.000 Post facts.
02:51:23.000 Post information.
02:51:24.000 Get engaged.
02:51:25.000 If that's your thing, you really like doing that?
02:51:27.000 I don't like doing that.
02:51:28.000 But if you like doing that, get in there.
02:51:30.000 Get in there and go to work.
02:51:31.000 It sounds like a huge...
02:51:34.000 I get exhausted.
02:51:35.000 I'm like, just thinking about it.
02:51:36.000 I'm like, who wants to spend their time arguing with somebody?
02:51:40.000 I don't know.
02:51:41.000 I guess it's just not something I care about.
02:51:43.000 So it's like, to me, that doesn't matter.
02:51:44.000 But I guess to some people, this is their whole...
02:51:46.000 Just like covering scams is my thing.
02:51:48.000 It's like, this is their whole thing.
02:51:50.000 And I guess that's their whole...
02:51:51.000 It's like video games.
02:51:52.000 It becomes their game.
02:51:54.000 Right.
02:51:54.000 That's where they get their score, their points.
02:51:56.000 They level up.
02:51:56.000 Yeah, they level up.
02:51:57.000 They get more followers.
02:51:57.000 Get more followers.
02:51:59.000 Level up, get more likes.
02:52:01.000 You know, people will tell you about their engagement.
02:52:02.000 My engagement on Twitter or something.
02:52:05.000 How the fuck do you know?
02:52:06.000 I don't even know how many followers I have.
02:52:08.000 Why are you paying attention?
02:52:09.000 Get out of there.
02:52:10.000 Go outside.
02:52:11.000 Go do something.
02:52:12.000 I think it's deeply bad for health to constantly be given analytics.
02:52:18.000 Like this is a thing on YouTube.
02:52:21.000 I was talking to Lex about this because he was telling me he doesn't like – he likes to not look at his numbers.
02:52:27.000 And I was like, man, I love that.
02:52:28.000 I try not to look at my numbers.
02:52:30.000 The thing is when you go into your dashboard, like they give you every stat you could ever imagine.
02:52:35.000 Yeah.
02:52:36.000 And I get it.
02:52:36.000 They're trying to educate you on if a video is doing well or doing bad or whatever.
02:52:40.000 But I think it's kind of good for artists not to have immediate feedback.
02:52:48.000 There's an argument against that though and that's Mr. Beast.
02:52:52.000 Mr. Beast has figured it out.
02:52:54.000 He moneyballed it.
02:52:54.000 He moneyballed YouTube.
02:52:56.000 YouTube before that wasn't like a science.
02:52:58.000 It was like an art.
02:52:59.000 It's like nobody knew what they were doing.
02:53:00.000 He comes in.
02:53:01.000 He's like, you guys are all idiots.
02:53:02.000 Let's turn this into stats and numbers.
02:53:04.000 And I love him and I hate him for it.
02:53:06.000 Because I got the one perspective.
02:53:08.000 It's like you kind of saw the Mr. Beastification of YouTube.
02:53:12.000 Everyone talks the same.
02:53:13.000 Everyone has a...
02:53:14.000 Hey guys, what's up?
02:53:15.000 Today we're doing this.
02:53:16.000 And that's like because he kind of showed like, oh, this is a pretty optimal way of doing it.
02:53:21.000 So it's good because he gave people like handles on their own success, which is valuable.
02:53:26.000 Like it's cool that you know why a video does well or not.
02:53:29.000 There's also something that like it kind of kills a little bit of creativity and inspiration when all of a sudden, you know, like this segment ain't going to do it.
02:53:38.000 Right.
02:53:38.000 Like and you they give you this graph.
02:53:40.000 Have you ever seen the retention graph?
02:53:41.000 No.
02:53:41.000 Oh, it's hilarious.
02:53:42.000 So you start off at 100%, and then you just see as people leave.
02:53:46.000 And then it goes to the end of the video, and you see how many people were left.
02:53:49.000 And at every moment, you can tell if someone clicked off at that moment.
02:53:54.000 People get so much anxiety for that.
02:53:57.000 Oh!
02:53:58.000 And what they do now, and this is taught at YouTube boot camps, is like, look at your retention graph and everything that wasn't good.
02:54:06.000 If people clicked off, you gotta cut it.
02:54:11.000 You gotta stop.
02:54:12.000 And I think that creates its own, like, you know, sickness.
02:54:15.000 Yeah.
02:54:17.000 YouTube boot camps are hilarious.
02:54:19.000 That's so funny that they have YouTube.
02:54:20.000 But it makes sense.
02:54:21.000 I mean, if you wanted to treat it like a business, like any other business, if you wanted to get involved and, you know, you wanted to open up a small business somewhere, you know, you could treat YouTube like you're opening up a small business.
02:54:30.000 I get it.
02:54:32.000 I get it.
02:54:32.000 It's not my thing, though, so I don't get that aspect.
02:54:35.000 I think that would fuck with what I do.
02:54:37.000 I think that would get in the way.
02:54:39.000 I think it would fuck with what you do, too.
02:54:40.000 I think it gets in the way more than it helps.
02:54:42.000 We've had to move away from that.
02:54:45.000 For a while, we really emulated some, like, creators...
02:54:48.000 Who we liked, what they did.
02:54:50.000 But eventually what you realize is like, I just have a different audience.
02:54:54.000 People are here for different reasons.
02:54:56.000 And so I have to find my...
02:54:57.000 I can't just rely on a book or...
02:54:59.000 Not literally a book, but like the playbook of like what has worked for you.
02:55:03.000 I have to find out like, you know...
02:55:07.000 Not only what my audience wants, but what do I want?
02:55:10.000 Yes.
02:55:10.000 I think that's the most important thing.
02:55:11.000 It is the most important thing.
02:55:12.000 We're not just making—well, I'm not making things for other people.
02:55:15.000 I'm making it because I think it's cool, I think it's interesting, and I think it's valuable just for me to express it.
02:55:21.000 And so I have to find out, like, why do people watch my show?
02:55:24.000 What do I want for my show in a way that even if nobody wants it, I put it in—like, I have this, like, this whole robot bartender thing, and it's like this CGI thing.
02:55:34.000 And I do it because I like it.
02:55:36.000 It's fun for me.
02:55:37.000 I get a real kick out of that stuff.
02:55:39.000 I'm a nerd when it comes to that CGI tech stuff.
02:55:43.000 And people wouldn't believe how much time I spend on that.
02:55:45.000 I spend like half my day just like tweaking this stuff.
02:55:49.000 But that resonates with people.
02:55:50.000 That's one of the reasons why people like it.
02:55:52.000 I think when you do something that you like, it's very obvious to the people that are paying attention.
02:55:58.000 I think that's part of the appeal of a lot of shows.
02:56:02.000 I think that that's why it works.
02:56:05.000 I mean, I think that's one of the secrets to my success is that I only have on people that I'm actually interested in talking to.
02:56:11.000 So I'm engaged.
02:56:12.000 I'm not just bullshitting my way through someone trying to promote some movie.
02:56:16.000 You know, I'm actually engaged.
02:56:17.000 If I have someone that's promoting a movie, I'm interested in the movie.
02:56:20.000 I want to know what they're doing.
02:56:21.000 If it's a documentary, I want to know, like, how did you go about doing this?
02:56:25.000 What's the process?
02:56:26.000 I'm actually engaged.
02:56:28.000 When you're faking it and phoning it in, people know it.
02:56:30.000 They feel it.
02:56:32.000 You know, and that's...
02:56:33.000 The beauty of your show is, I think your show serves multiple purposes, but one of the things is that it certainly clearly appeals to what you're interested in, and you act as a watchdog.
02:56:45.000 Like, I watched the Celsius video that you put out recently, and I watched it today, and I was like, this is so valuable because I'm seeing all these people, because you showed those people that did get scammed, and the people that get fucked over by this guy who created this thing, and,
02:57:00.000 you know, they have a voice now.
02:57:02.000 And you can also, like, let all these other motherfuckers that are trying to do something like that know that CoffeeZilla's out there, and he's gonna find you, and he's gonna put you on blast, and people are gonna know, and it's gonna be more difficult for the next person.
02:57:17.000 And again, it's not the wealthy investors that will sue.
02:57:20.000 It's these people that put in $2,000, and it was the only $2,000 they had.
02:57:25.000 That's where it's so valuable, and I know that you feel that way, and it comes through in your video.
02:57:31.000 And I think that's why it's appealing, and that's why it's working.
02:57:35.000 I really discovered early on that nobody cares about the numbers.
02:57:41.000 The numbers are like the headline or whatever, but ultimately you can't make a, like, this stuff doesn't matter until you get people involved.
02:57:50.000 Until you hear the victims talk, they're the heartbeat of everything.
02:57:54.000 Because until you hear that, like, what's a billion dollars?
02:57:57.000 It's impossible to know.
02:57:58.000 And then you watch the guy and you're like, who would fall for this?
02:58:03.000 It's easy to get cynical if you just see the numbers and the guy who defrauded people.
02:58:08.000 The second you humanize it and you show a person, and all of a sudden you see someone with all the same problems, and you can just tell, you can see it in their eyes, and they're just wrecked.
02:58:17.000 By this guy who truly they believed in.
02:58:20.000 It's like the biggest betrayal.
02:58:22.000 You trusted somebody with everything.
02:58:25.000 And then they stab you in the back.
02:58:27.000 Alex Mashinsky, CEO of Celsius, his whole thing was banks are evil.
02:58:31.000 Which is not...
02:58:34.000 Crazy.
02:58:34.000 I mean, it's like, you know, you can understand why a lot of people resonated with that.
02:58:38.000 They're like, and it wasn't even their evil.
02:58:40.000 I don't want to say they're heartless.
02:58:41.000 Yeah, yeah, yeah.
02:58:41.000 Sorry.
02:58:42.000 I was going to correct that.
02:58:43.000 He said, like, they're greedy.
02:58:44.000 Yeah.
02:58:44.000 And that's true.
02:58:46.000 And he's not.
02:58:47.000 Yeah, yeah.
02:58:48.000 He goes like, banks are not your friends.
02:58:50.000 True statement, Alex.
02:58:51.000 And then this interviewer is like, but Alex is your friend?
02:58:53.000 And he's like, yeah.
02:58:55.000 He's like, basically, you can take the same ride as me, 8%, 8% a year.
02:58:59.000 I'll just give it to you.
02:59:01.000 We're doing the same thing as the banks.
02:59:03.000 We're loaning out your money, but we're going to pass on 80% of the revenue back to you instead of the banks, which they take all your money, right?
02:59:10.000 Yeah.
02:59:11.000 So people bought into that.
02:59:12.000 They said, that sounds great.
02:59:14.000 Like, hey, the Internet changed everything.
02:59:16.000 You know, we think crypto is going to change everything.
02:59:18.000 Why not have a bank that instead of serving its shareholders, it serves its customers?
02:59:23.000 It kind of like there's something that makes sense there.
02:59:25.000 It's really compelling.
02:59:26.000 And then come to find out Celsius was never making money.
02:59:29.000 They said they were paying you out, you know, with their profits.
02:59:33.000 But they were paying you out with new deposits.
02:59:36.000 Like new people were coming in and they were paying you out.
02:59:39.000 And so it was this giant Ponzi scheme where they set the rewards because they knew if it's high enough, people are just going to flock to them.
02:59:45.000 But they had this compelling explanation for why. Like, it kind of made a little bit of sense.
02:59:52.000 And then when it finally goes wrong, he just walks away.
02:59:57.000 I mean, yeah, he's getting sued civilly.
02:59:59.000 But where's the criminal action?
03:00:01.000 He's going to go to jail.
03:00:02.000 Probably not.
03:00:03.000 And it's like, that is so messed up.
03:00:05.000 That is such a, that itself is a crime.
03:00:07.000 I think it's so sick that we allow, we throw the book at people who will rob a store with a gun, right?
03:00:15.000 Still 10,000 bucks.
03:00:17.000 People who steal millions, billions of dollars often get away with it because it's just done a little differently.
03:00:26.000 There's not the drama of the gun and somebody's – even if no one gets shot.
03:00:30.000 It's just, hey, it's just he pushed a few pencils around.
03:00:32.000 He got you to sign a few shitty – but that's just as sick and twisted.
03:00:36.000 But it's just done in a way that socially is slightly more acceptable and they get away with it way more often.
03:00:42.000 But I would contend that these people are literally financially murdering people.
03:00:48.000 After Celsius, people committed suicide. I mean, literally, it's a fact.
03:00:53.000 People committed suicide because they lost everything.
03:00:55.000 I'm sure FTX as well.
03:00:57.000 Of course!
03:00:58.000 You know, the bigger the scam, just statistically, it almost becomes impossible that you don't, if not financially, sort of metaphorically murder a family, literally kill somebody.
03:01:08.000 And people walk away with it. Like, either only the guy at the top goes down, or nobody goes down.
03:01:16.000 And that is crazy to me.
03:01:18.000 It's like, what message are we sending via our regulators?
03:01:24.000 Basically, it's like, hey, you're gonna get a slap on the wrist.
03:01:27.000 If you're caught.
03:01:28.000 And this, what you just did, is why you're so successful.
03:01:32.000 That's real.
03:01:33.000 This is how you really feel.
03:01:35.000 And this is why your show works.
03:01:37.000 This is it right there.
03:01:38.000 What you just did is why I'm interested in your show.
03:01:42.000 Because this is your real thoughts and opinions.
03:01:45.000 Something has to change.
03:01:47.000 You can't just go on like this. If we're really going to, you know, take our financial future into our own hands, if we're going to allow these influencers to talk about finance, somebody has to be there when things go wrong.
03:02:04.000 Yes.
03:02:05.000 And there has to be consequences.
03:02:06.000 If you lie and if you cheat and you steal, there has to be a guy at the end of the day who's going to put you in trouble.
03:02:12.000 And I think a YouTube video is not nearly enough.
03:02:15.000 It's why I'm constantly saying, like, hey, can someone from the government get involved?
03:02:19.000 Go lock this guy up.
03:02:21.000 Go lock somebody.
03:02:23.000 I know a lot of this is new.
03:02:26.000 The crypto stuff is new.
03:02:27.000 But they're doing old crimes in a new way.
03:02:29.000 It's always been illegal to steal people's money.
03:02:32.000 And that is what's happening.
03:02:33.000 And that's why I put these people on my show.
03:02:35.000 So you don't think it's some rug pull where it's all fake money.
03:02:38.000 No, there was real money in these companies.
03:02:41.000 And they just stole it a new way, but they're still stealing money.
03:02:44.000 And the fact that we haven't found a way to put some of these people in jail is mind-blowing to me.
03:02:50.000 And we're sending a bad message that, hey, just keep doing it.
03:02:52.000 Just go start a new one.
03:02:55.000 They were trying to start GTX after FTX. Some new guys were trying to start the new thing.
03:03:02.000 And then HTX is next?
03:03:04.000 Jesus Christ.
03:03:05.000 It's just like, you know, you have to, that's half the purpose of the law.
03:03:05.000 It's partly, you know, that you did something wrong, so you get punished.
03:03:13.000 But also, part of it is, when you do something wrong and get punished, you send a message socially, you socially signal that we do not tolerate this.
03:03:21.000 And right now, the social signal we're sending and accepting is: if you scam, there's a very high likelihood you'll get away with it.
03:03:29.000 And if you don't get away with it, you'll get a little slap on the wrist.
03:03:32.000 You'll get a little fine.
03:03:34.000 And that's not working.
03:03:36.000 No, it's not.
03:03:38.000 Hey, man, thanks for being here.
03:03:40.000 This was a lot of fun.
03:03:40.000 It was a lot of fun, Joe.
03:03:41.000 I appreciate your show.
03:03:42.000 I appreciate you.
03:03:43.000 I appreciate what you're doing, and I really enjoyed this.
03:03:46.000 Thank you so much.
03:03:47.000 Thank you.
03:03:47.000 Tell everybody how to get your show, what your social media is, all that jazz.
03:03:52.000 YouTube, CoffeeZilla, that's it.
03:03:54.000 That's the best place to find me.
03:03:56.000 I appreciate you guys having me on.
03:03:58.000 This is surreal.
03:03:59.000 Been a big fan of the show.
03:04:00.000 Thank you.
03:04:00.000 Appreciate you.
03:04:01.000 Bye, everybody.