The Joe Rogan Experience - October 31, 2023


Joe Rogan Experience #2054 - Elon Musk


Episode Stats

Length

2 hours and 41 minutes

Words per Minute

155.5

Word Count

25,034

Sentence Count

2,721

Misogynist Sentences

21


Summary

In this episode, I sit down with Elon Musk, who rolled up in the Cybertruck ahead of its first deliveries. We shoot a broadhead arrow into the stainless steel body to see whether it's really bulletproof, then talk about why manufacturing a car at volume is a hundred to a thousand times harder than designing a prototype, why nobody has ever made a great movie about factories, and how Henry Ford's mass production changed everything. From there we get into battery range and charging, the limits of solar-powered cars, his first year owning X (formerly Twitter), censorship, free speech, and the "mind virus," AI safety, radiation fears after Fukushima, and, of course, pineapple-and-anchovy pizza. Happy Halloween, everyone, and thanks for listening.


Transcript

00:00:01.000 Joe Rogan Podcast, check it out!
00:00:04.000 The Joe Rogan Experience.
00:00:06.000 Train by day, Joe Rogan Podcast by night, all day.
00:00:27.000 It's amazing that he puts out a piece of art per day.
00:00:31.000 365 days a year.
00:00:34.000 Yeah.
00:00:34.000 I was following him on the X platform, FKA Twitter, but some of it was too jarring.
00:00:40.000 Too jarring, some of the images?
00:00:42.000 Yeah.
00:00:42.000 Well, cheers, sir, and happy Halloween.
00:00:44.000 Cheers.
00:00:44.000 Thanks for doing this.
00:00:45.000 Appreciate it.
00:00:46.000 You're welcome.
00:00:47.000 Thanks for rolling up in the Cybertruck, too.
00:00:50.000 Yeah.
00:00:50.000 I got a chance to look at it in the factory, but that was almost, what was that, like a year and a half ago or so?
00:00:57.000 Was it?
00:00:57.000 It was a while ago.
00:00:59.000 Yeah, a year ago, I guess.
00:01:00.000 Yeah, at least a year.
00:01:02.000 At least a year.
00:01:02.000 And it's different in real life.
00:01:05.000 Like, you see it in person.
00:01:07.000 Images are just...
00:01:09.000 We were talking about it outside.
00:01:10.000 You just can't contextualize them.
00:01:13.000 Yes.
00:01:13.000 It looks so odd that you have to see it in the flesh.
00:01:17.000 It looks like computer graphics in reality.
00:01:20.000 Yeah, it's the coolest-looking fucking production car that's ever been made.
00:01:23.000 It's bulletproof, literally.
00:01:25.000 Literally, yeah.
00:01:27.000 One of the videos...
00:01:29.000 We're going to show is just going all, like, full Al Capone.
00:01:33.000 Just, like, if Al Capone showed up and emptied a, you know, the entire magazine of a Tommy gun into the side of the car, you'd be okay.
00:01:40.000 The only thing that's not bulletproof is the glass.
00:01:42.000 The glass is optionally bulletproof.
00:01:44.000 Oh, it is optional?
00:01:46.000 Well, you can make anything bulletproof if you want, but the glass has to be very thick for it to be bulletproof, so it can't go up and down.
00:01:53.000 So if you want a fixed glass...
00:01:54.000 Then how do you order drive-through?
00:01:57.000 Yeah, exactly.
00:01:58.000 Yeah, that's a problem.
00:02:00.000 You gotta pull ahead and open the door and get out.
00:02:03.000 But it's okay, you can just duck.
00:02:05.000 Yeah, you can just duck.
00:02:06.000 How far away are you from delivering them to people?
00:02:11.000 Has anybody gotten them yet?
00:02:14.000 We planned our first deliveries for next month.
00:02:17.000 Oh, wow.
00:02:18.000 So now it's just testing and fucking around?
00:02:23.000 The hard part by far is manufacturing, not designing the car.
00:02:31.000 There's just not really a movie about that, but there should be.
00:02:35.000 So, in the sort of...
00:02:39.000 You know, the movies will always be about the sort of inventor who invented the car and then the job is done.
00:02:44.000 Right.
00:02:45.000 Or invented the object, now the job is done.
00:02:48.000 This is not true.
00:02:49.000 That's the easy part.
00:02:50.000 The hard part is manufacturing by far.
00:02:52.000 Why is it so much harder than making an individual model?
00:02:59.000 Um...
00:03:07.000 Well, in order to make it affordable, you have to make it at volume.
00:03:11.000 So you've got to make everything at a higher rate, consistently. If you tour the production line, you'd have a sense for it.
00:03:25.000 You've got to have all of the casting machines, all of the stamping machines, as the case may be, the glass machines, the wheels, the tires, everything required from the motor, the battery cells, all of the constituents of the battery cells, all of the silicon that goes in there with the chips.
00:03:45.000 It is.
00:03:46.000 The manufacturing is somewhere between 100 and 1,000 times harder than making a prototype.
00:03:50.000 Whoa!
00:03:52.000 And then if you want to say, like, you want to get from, once you reach volume manufacturing, which is insanely difficult, then you want to make the car affordable, it's harder to, say, reduce the cost of the car by 20% than it is to get to volume production in the first place.
00:04:12.000 I really cannot emphasize enough how hard production is relative to design.
00:04:19.000 I'm not saying the design is trivial, because you have to have taste and you have to know what to make.
00:04:26.000 If you don't have taste and judgment, then your prototype will be bad.
00:04:32.000 But it is trivial, really, to churn out prototypes, and it is extremely difficult to build a factory.
00:04:40.000 And how much more difficult is it to make this, considering the body's made out of steel?
00:04:46.000 Very difficult.
00:04:48.000 The difficulty of manufacturing is proportionate to the amount of new technology that you have in a car or in the product.
00:04:54.000 In this case, there's a lot of new technology.
00:04:56.000 The production line will move as fast as the slowest and least lucky and most foolish part of the entire production line.
00:05:06.000 And you could say, to first approximation, there are 10,000 things that have to go right, at least, for production to work.
00:05:14.000 So if you have 9,999 things that are working and one that isn't, that sets the production rate.
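A rough Python sketch of the bottleneck arithmetic Musk is describing; the station names and rates are invented for illustration, not real Tesla figures.

    # Toy model: a production line runs only as fast as its slowest station.
    # Station names and rates (units per hour) are illustrative assumptions.
    stations = {
        "casting": 120,
        "stamping": 110,
        "battery_pack": 95,
        "paint": 130,
        "final_assembly": 100,
    }

    line_rate = min(stations.values())            # the bottleneck sets the pace
    bottleneck = min(stations, key=stations.get)  # which station is limiting

    print(f"Line rate: {line_rate} units/hour, limited by {bottleneck}")
    # Even if 9,999 steps run perfectly, the one underperforming step caps total output.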
00:05:22.000 Yeah.
00:05:24.000 By far the hard work.
00:05:25.000 In fact, the really...
00:05:29.000 The amazing thing about automobiles was not so much the invention of the automobile but the invention of the factory, the mass manufacturing.
00:05:37.000 And for that, Henry Ford deserves a tremendous amount of credit.
00:05:42.000 He was a next-level genius.
00:05:44.000 In fact, Ford is really responsible for the entire mass manufacturing industry because he actually founded Cadillac, which became the heart of General Motors, then got kicked out and then started Ford.
00:05:55.000 Really?
00:05:56.000 Yeah, and everyone just copied him.
00:05:57.000 Do you know he made one of his first cars out of hemp?
00:06:02.000 Well...
00:06:03.000 He used hemp fiber for the panels.
00:06:05.000 Okay.
00:06:06.000 Yeah, there's a fascinating video of him banging on it with a hammer.
00:06:09.000 Because hemp is bizarrely durable when it's compressed and when they take the fibers and I don't know what kind of epoxy they use or something to put it all together.
00:06:18.000 But what it makes with the actual physical form of it is insanely light.
00:06:24.000 Like fiberglass light, but very, very durable.
00:06:27.000 See if you can find that video.
00:06:28.000 It's kind of crazy.
00:06:29.000 Henry Ford is banging on, I believe it was the hood of it, with a hammer.
00:06:34.000 There it is.
00:06:35.000 So this was like, look at that.
00:06:40.000 Isn't that crazy?
00:06:41.000 That was wild.
00:06:44.000 I don't know why they stopped making them out of that.
00:06:47.000 That was from 1941. How much does the Cybertruck weigh?
00:06:53.000 It depends on configuration, but it's about 7,000 pounds.
00:06:58.000 There's different versions, but 6,000, 7,000 pounds.
00:07:01.000 It's like similar to, like it's a heavy truck.
00:07:05.000 Like a Ford F-250 or something like that?
00:07:07.000 Yeah.
00:07:09.000 And it, because of all of the metal and the weight and everything like that, but with the engines that you have, it's still, the 0-60 is pretty bizarre, right?
00:07:20.000 It's like 3.5 or something like that?
00:07:23.000 We're aiming to get the zero to sixty below three seconds.
00:07:25.000 Below three?
00:07:26.000 Yes.
00:07:27.000 Wow.
00:07:28.000 For the, you know, the beast mode version.
00:07:31.000 So we've got a beast mode version.
00:07:37.000 Well, I don't want to give it all away right now, but there are three demonstrations.
00:07:44.000 One of them people are aware of, which is emptying a Tommy gun into the side of the car, a shotgun, a .45 and a 9mm, and no penetrations.
00:07:59.000 And it comes that way from the factory.
00:08:01.000 Can I try it with an arrow?
00:08:02.000 Yeah, it'll be fine.
00:08:04.000 You think so?
00:08:06.000 I bet I can get in there.
00:08:07.000 A crossbow might...
00:08:08.000 I have a 90-pound compound bow that shoots 520 grain arrows at 300 feet per second with a razor-sharp broadhead.
00:08:21.000 We'll try it right now if you want.
00:08:22.000 I wish I had it with me.
00:08:26.000 Is it at your house or something?
00:08:27.000 Yeah.
00:08:28.000 Should we send someone to go get it?
00:08:29.000 We could do the demo tonight.
00:08:30.000 That would be interesting.
00:08:32.000 Maybe I'll drive back with an arrow sticking out of my car.
00:08:34.000 I bet I could get in there.
00:08:36.000 Okay, I'll bet you can't.
00:08:37.000 Really?
00:08:37.000 Yeah, I'll bet you a dollar.
00:08:38.000 Damn!
00:08:40.000 I think if you have a crossbow that's with enough force, a crossbow might get it through.
00:08:47.000 The thing about a crossbow is the bolt, even though it's very fast, it's not going to be nearly as heavy.
00:08:52.000 You won't have as many grains.
00:08:54.000 You can make a heavy crossbow bolt.
00:08:56.000 You could, yeah.
00:08:58.000 But generally, crossbow bolts are considerably lighter.
00:09:01.000 They're much smaller, you know, and they're much faster.
00:09:05.000 They're moving at like 400, 500 feet per second.
00:09:08.000 Easy.
00:09:09.000 Yeah.
00:09:10.000 I mean, the thing that matters is kind of the energy per unit area.
00:09:15.000 So interesting, like a 9 mil or a .45, which is basically sort of a 10 mil, the .45 is – they're roughly the same, but the .45 actually has slightly worse penetration than a 9 mil.
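As a rough illustration of the energy-per-unit-area point, a back-of-the-envelope Python sketch; the bullet weights and velocities are typical published loads assumed for illustration, and the arrow numbers echo the roughly 545-grain, 275-feet-per-second figures quoted later in the episode.

    import math

    GRAIN_TO_KG = 6.479891e-5   # one grain in kilograms
    FPS_TO_MS = 0.3048          # feet per second to metres per second

    def kinetic_energy(mass_grains, speed_fps):
        """Kinetic energy in joules: 0.5 * m * v^2."""
        m = mass_grains * GRAIN_TO_KG
        v = speed_fps * FPS_TO_MS
        return 0.5 * m * v**2

    # Assumed typical loads: 9mm 124 gr at ~1150 fps, .45 ACP 230 gr at ~850 fps.
    for name, (gr, fps, dia_mm) in {
        "9mm":     (124, 1150,  9.0),
        ".45 ACP": (230,  850, 11.4),
    }.items():
        ke = kinetic_energy(gr, fps)
        area_mm2 = math.pi * (dia_mm / 2) ** 2
        print(f"{name:8s} ~{ke:4.0f} J total, ~{ke / area_mm2:4.1f} J/mm^2 of frontal area")

    # The .45 carries about the same total energy as the 9mm but spreads it over a
    # larger frontal area, hence the slightly worse penetration mentioned above.
    arrow_ke = kinetic_energy(545, 275)
    print(f"arrow    ~{arrow_ke:4.0f} J total (a broadhead concentrates this on thin blade edges)")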
00:09:30.000 You know what I just realized?
00:09:31.000 I do have some broadheads.
00:09:33.000 I do have some broadheads, and I have a less powerful bow, but I have an 80-pound bow back there.
00:09:39.000 I think we should do it.
00:09:40.000 Absolutely.
00:09:40.000 Okay.
00:09:41.000 You want to do it right now?
00:09:42.000 Yeah, I can do it right now.
00:09:43.000 Okay, let's do it right now.
00:09:44.000 Let's find out.
00:09:44.000 Fuck yeah, let's do it.
00:09:47.000 Sick.
00:09:47.000 We'll be right back.
00:09:48.000 This could be funny.
00:09:49.000 I'm just like, why does he have an arrow sticking out of his car?
00:09:53.000 This show is sponsored by BetterHelp.
00:09:55.000 It's a really healthy, good thing to talk about what you're going through with people.
00:09:59.000 The good and the bad.
00:10:00.000 Don't keep it all bottled up.
00:10:02.000 And sometimes that can be friends or family, but it also helps to talk to pros.
00:10:07.000 And that's where BetterHelp comes in.
00:10:10.000 It's therapy that's totally online, which makes it so easy to get started.
00:10:15.000 You just fill out a few quick questions and they match you with someone to talk to.
00:10:19.000 And if you don't get the right match at first, you can switch therapists at any time for free.
00:10:25.000 It's easy, it's flexible, it's wherever you are.
00:10:29.000 Seriously, it's a great thing to try.
00:10:31.000 Get a break from your thoughts with BetterHelp.
00:10:33.000 Visit BetterHelp.com slash J-R-E today to get 10% off your first month.
00:10:40.000 That's BetterHelp.com slash J-R-E. Don't,
00:10:59.000 you know, have the arrow come back in.
00:11:04.000 I mean, just beware of ricochets.
00:11:11.000 You might want to do it at a slight angle.
00:11:19.000 You know what I mean?
00:11:33.000 Look at this.
00:11:35.000 Blew the arrow apart.
00:11:40.000 Flatten the tip of the arrow.
00:11:44.000 Look at the tip of the broad end.
00:11:47.000 That's impressive.
00:11:50.000 Hey cutie.
00:11:52.000 Thank you.
00:11:53.000 Well now we know.
00:11:55.000 So we just shot an arrow into it and it barely scratched it.
00:12:01.000 Barely scratched it.
00:12:01.000 Yeah.
00:12:02.000 It was probably moving 275 feet a second.
00:12:06.000 That was a 525 grain-ish arrow with...
00:12:11.000 Yeah, even more than that because it had the 125 grain head so that was 545 grains.
00:12:17.000 That's impressive.
00:12:18.000 Yeah.
00:12:19.000 Very impressive.
00:12:20.000 It just destroyed the broadhead.
00:12:21.000 The broadhead flattened at the tip and then the arrow blew apart.
00:12:25.000 Yeah.
00:12:26.000 Amazing.
00:12:27.000 Yeah.
00:12:28.000 Like I said, we have a cool video we'll show at the Handover event next month, which is emptying an entire magazine of a Tommy gun, which I think is on the order of 50 rounds.
00:12:40.000 You're just going full Al Capone on the side of the car.
00:12:43.000 Shotgun, 9mm, .45.
00:12:45.000 And you built it like this just for fun?
00:12:47.000 Essentially?
00:12:48.000 Because it's cooler?
00:12:51.000 Because you can?
00:12:53.000 You know, trucks are supposed to be tough, right?
00:12:55.000 Yeah.
00:12:56.000 So, is your truck bulletproof?
00:12:59.000 No, mine's definitely not.
00:13:00.000 Exactly.
00:13:01.000 And if I shot mine with my bow, it would go right through it.
00:13:03.000 100%.
00:13:04.000 100%.
00:13:05.000 So, if you shoot any normal car, unlike in the movies where people hide behind car doors, a car door is basically very thin, mild steel.
00:13:14.000 So, if you shoot a gun through a regular truck, it'll go through both doors.
00:13:22.000 So...
00:13:25.000 You can't hide behind a car door like they do in the movies.
00:13:28.000 Way back in the day, dating myself with the A-Team, where they would be bullets flying everywhere and they'd be hiding behind the car door.
00:13:36.000 Right.
00:13:36.000 That doesn't work.
00:13:38.000 But it does in the Cybertruck.
00:13:42.000 It's the best in apocalypse technology.
00:13:46.000 It's an amazing car to have in the apocalypse.
00:13:48.000 Does it still do this thing where the ride height raises?
00:13:53.000 And there's also no regular drive trains, so there's no axles that are the impediment to going over rocks and things like that?
00:14:02.000 In other vehicles, gasoline or diesel vehicles, you've got the differential, which hangs down low between the rear wheels.
00:14:11.000 So you look under a truck, there's almost always a differential there that's hanging down pretty low.
00:14:16.000 So if you hit the diff on a rock, you'll break it.
00:14:19.000 The bottom of the Cybertruck is completely flat and has the best clearance of any vehicle.
00:14:27.000 How far away are we, if it's ever gonna happen at all, from having a vehicle that can operate entirely on solar?
00:14:35.000 Well, you've got a surface area thing.
00:14:39.000 So it's about a kilowatt per square meter, normal to the sun, roughly.
00:14:47.000 So it really depends on what kind of mileage.
00:14:51.000 You don't have enough surface area to keep the car going just from the car surface area.
00:14:56.000 But if you had something that folded out, you could make it self-sustaining.
00:15:02.000 Something that folded out so you could park it and then leave it on.
00:15:06.000 Yeah, you'd have to unfold like the Starlink satellites do where you unfold the solar panels.
00:15:12.000 You just need more surface area.
00:15:15.000 Is there any potential for an advancement in technology that would make a smaller area much better at conducting sun?
00:15:24.000 Nothing?
00:15:25.000 No, it's a kilowatt per square meter.
00:15:28.000 That's what you're going to get when the sun, if you're normal to the sun, so 90 degrees to the sun.
00:15:33.000 And there's nothing that could accelerate that or no...
00:15:35.000 That's just literally the solar energy.
00:15:37.000 That's just it.
00:15:38.000 Yeah, so then you multiply your efficiency by that.
00:15:40.000 So if your commercial panel is maybe 25% efficient, if it's a good one, you get like 250 watts per square meter.
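The arithmetic there is simple enough to sketch in Python; the car roof area and highway power draw below are rough assumptions, not figures from the conversation.

    SOLAR_FLUX_W_PER_M2 = 1000   # ~1 kW per square metre facing the sun
    PANEL_EFFICIENCY = 0.25      # a good commercial panel, per the conversation

    usable_w_per_m2 = SOLAR_FLUX_W_PER_M2 * PANEL_EFFICIENCY
    print(f"Usable power: ~{usable_w_per_m2:.0f} W per square metre")   # ~250 W/m^2

    # Rough assumptions: a car's roof and hood offer maybe 3 m^2 of panel area,
    # while highway cruising draws on the order of 15-20 kW for a typical EV.
    car_panel_area_m2 = 3
    highway_draw_kw = 17.5
    solar_kw = usable_w_per_m2 * car_panel_area_m2 / 1000
    print(f"Car-mounted solar: ~{solar_kw:.2f} kW vs ~{highway_draw_kw} kW needed at highway speed")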
00:15:49.000 There was one car, what was it, like a Fisker, that was using a solar panel that claimed that it was operating like the electronics, like it could start the radio.
00:16:00.000 Yeah.
00:16:02.000 I mean, you can definitely—you just don't have enough surface area for it.
00:16:07.000 But, like, certainly you could run a house with solar, with the solar roof.
00:16:17.000 With the Tesla solar roof, you can run a house.
00:16:20.000 But it's never going to get to a point where you can just have a car that's made out of solar panels that it could drive around.
00:16:25.000 It could never be that efficient.
00:16:27.000 Correct.
00:16:28.000 You don't have enough surface area.
00:16:29.000 What research or what breakthroughs have been made in terms of battery technology?
00:16:35.000 Like, how far away are we from having batteries that are far more efficient and last far longer?
00:16:41.000 I know there's some talk of, like, sodium-based batteries.
00:16:47.000 The battery range is not a problem at this point.
00:16:50.000 I mean, the Model S will go 400 miles, Model 3, Model Y will do over 300 miles.
00:17:00.000 That's more than most people need.
00:17:04.000 How far away are we from making batteries that are more efficient?
00:17:11.000 This is not really a constraint.
00:17:14.000 The point at which you've got a car that can do, let's say, even at highway speeds, 250 miles.
00:17:23.000 Let's say you do 240 miles at 80 miles an hour.
00:17:27.000 Now you're driving for three hours straight.
00:17:31.000 And so if you start a trip at, say, 9 a.m., by noon you want to stop for lunch, go to the restroom, grab a coffee.
00:17:38.000 By the time you come back, your car is charged.
00:17:40.000 How long does it take to fully charge?
00:17:44.000 Yeah, like half an hour.
00:17:49.000 People will get used to it because it's a little different.
00:17:54.000 Like for a gasoline car, you'd want to fill it up.
00:17:56.000 For an electric car, you'd want to actually go very close to zero.
00:18:01.000 And the car can calculate how much range it has with precision.
00:18:05.000 So if you, say, enter a road trip in a Tesla, it'll calculate all of the superchargers along the way, where you stop, how much you should charge, and just let the computer do its thing and it'll work well.
00:18:17.000 So you actually want to charge to about 80% and then run it down all the way to 10% or less.
00:18:25.000 Do you want to do that on everyday use as well or just with long trips?
00:18:29.000 No, just long trips.
00:18:31.000 If you're trying to minimize the amount of time, you stop when charging.
00:18:36.000 So let's say you want to stop for 20, 30 minutes.
00:18:44.000 It's a little counterintuitive because for a gasoline car, you would fill it up.
00:18:48.000 For a battery, the charge state tapers off as you get above 80%.
00:18:54.000 You can think of it like the, I think the right analogy here is cars in a parking lot.
00:19:01.000 So the lithium ions are trying to find a parking space as they move across, you know, from one side of the battery to the other side, from, you know, cathode, anode.
00:19:11.000 I mean, they're sort of, just these ions are just bouncing around looking for a parking space.
00:19:16.000 So when the parking lot's empty, they can zip right in there and find a spot.
00:19:21.000 It's easy.
00:19:22.000 As the parking lot gets full, just like trying to find a parking space at a mall, you have to hunt around for a spot.
00:19:28.000 And that's basically what's going on.
00:19:31.000 The ions are looking for a parking spot.
00:19:34.000 So as the battery gets closer to full, it's harder and harder to find a spot.
00:19:39.000 They have to bounce around more.
00:19:41.000 So it takes longer to get from 80 to 100. Correct.
00:19:44.000 Getting from 80 to 100, it takes about as much time as getting from 0 to 80. Oh.
00:19:50.000 Just think of it like the ions have got to find a parking spot.
00:19:53.000 Oh.
00:19:54.000 And just like if you're in a mall and it's busy, then it takes longer to find a parking spot than if it's empty.
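A toy Python model of that taper, just to show why the last 20% can take about as long as the first 80%; the curve shape and constants are invented for illustration and are not Tesla's actual charging profile.

    import math

    def charge_rate(soc, full_rate=1.0, knee=0.8, taper=12.0):
        """Accepted charge rate versus state of charge (toy model)."""
        if soc < knee:
            return full_rate                                 # parking lot mostly empty
        return full_rate * math.exp(-taper * (soc - knee))   # spots get harder to find

    def time_to_charge(start, end, step=1e-4):
        """Numerically integrate dt = d(soc) / rate(soc)."""
        t, soc = 0.0, start
        while soc < end:
            t += step / charge_rate(soc)
            soc += step
        return t

    t_0_80 = time_to_charge(0.0, 0.8)
    t_80_100 = time_to_charge(0.8, 0.999)   # stop just short of 100% to keep the time finite
    print(f"0-80%  : {t_0_80:.2f} (arbitrary time units)")
    print(f"80-100%: {t_80_100:.2f}  -- comparable to the entire 0-80% leg")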
00:20:01.000 So essentially you're satisfied with the technology that's available right now in terms of like the amount of mileage that you get out of it and things along those lines.
00:20:11.000 Yeah, range is not an issue.
00:20:13.000 Cost is more of an issue.
00:20:15.000 So you just need to make the car affordable.
00:20:17.000 A long-range car needs to be affordable.
00:20:19.000 When you fully roll out, how many of those things, how many Cybertrucks can you guys make a month?
00:20:29.000 We're aiming to make about 200,000 a year at volume production.
00:20:34.000 Wow!
00:20:35.000 Maybe a little more.
00:20:36.000 But I just can't emphasize enough that manufacturing is much, much harder than the initial design.
00:20:47.000 The Cybertruck was easy to design.
00:20:49.000 I'm not trying to trivialize design.
00:20:51.000 It's just what I'm trying to do is to emphasize the difficulty of manufacturing, which is not understood by the public because there's no movie about it.
00:21:01.000 So there's lots of movies about the sort of wild inventor in the garage, but I'm not aware of any movie about manufacturing.
00:21:13.000 Have you ever heard of a movie about manufacturing?
00:21:16.000 I can't remember any.
00:21:17.000 Jamie?
00:21:18.000 Any movie about manufacturing?
00:21:19.000 There's one coming in my brain, but I don't think that's what it's even about, so I have no idea.
00:21:22.000 What is that?
00:21:23.000 Michael Keaton was making some cars somewhere.
00:21:26.000 I was going to look it up.
00:21:27.000 I mean, it's Tommy Boy.
00:21:29.000 Yes.
00:21:30.000 Tommy Boy's the only one.
00:21:31.000 That's a great movie.
00:21:32.000 It's a great movie!
00:21:33.000 That might be the only one.
00:21:36.000 That's interesting that it's such an immense part of American culture and also the decline of some American cities.
00:21:42.000 I mean, it's famously documented in Roger and Me, a great documentary, where he just talks about how Flint got destroyed when they pulled out the car manufacturing.
00:21:54.000 Yeah, yeah.
00:21:55.000 I mean, there's a reason why generally politicians really try very hard to get a factory in their area is because it's a massive generator of jobs.
00:22:05.000 And for every factory job, there's like roughly five support jobs.
00:22:10.000 So it's like teachers, electricians, plumbers, lawyers, accountants, restaurants.
00:22:18.000 So manufacturing is kind of like a nucleus from which many jobs spring.
00:22:26.000 That's why it's generally, you know, governors and prime ministers and presidents will try so hard to get a factory in their country or region.
00:22:35.000 When you decided to build the Gigafactory, even when you decided to get involved with Tesla, did you have any idea of how difficult this would be?
00:22:45.000 Did you have a preconceived notion?
00:22:47.000 I thought it would be very difficult.
00:22:49.000 I thought our probability of success was less than 10%.
00:22:54.000 Yeah, I mean, it would be foolish to think anything else other than that.
00:22:57.000 I mean, even at this point, the only car companies that have not gone bankrupt are Ford and Tesla, the American car companies.
00:23:07.000 General Motors went bankrupt and Chrysler went bankrupt in 2009. There's some chance they'll go bankrupt again.
00:23:15.000 Ford and Tesla barely made it.
00:23:18.000 It was incredibly difficult to keep Tesla alive when General Motors and Chrysler were going bankrupt.
00:23:27.000 Because manufacturing is the actual hard thing, by far the hard thing.
00:23:32.000 I just can't emphasize that enough, and I hope somebody makes a movie about that.
00:23:36.000 Maybe they should make a movie about Tesla.
00:23:39.000 Sure.
00:23:39.000 Why not?
00:23:40.000 Yeah.
00:23:41.000 Perfect.
00:23:41.000 Who would you want to play you?
00:23:43.000 I don't care.
00:23:44.000 How about David Spade?
00:23:46.000 Anyone.
00:23:47.000 I don't care.
00:23:48.000 I'm kidding.
00:23:49.000 I don't care if anyone plays me.
00:23:50.000 But I do think that this— I just went back to Tommy Boy.
00:23:53.000 Yeah.
00:23:54.000 I think we rocked.
00:23:56.000 So, you know, Jim Farley is the CEO of Ford, and he's Chris Farley's cousin.
00:24:02.000 No way.
00:24:03.000 Yes.
00:24:03.000 Wow.
00:24:04.000 That's crazy.
00:24:05.000 Yes.
00:24:06.000 That's crazy.
00:24:07.000 And they look—I mean, they look related.
00:24:14.000 Yeah, there should be a movie.
00:24:17.000 Yeah.
00:24:18.000 You've got to get someone good that doesn't fuck it up.
00:24:21.000 Well, I mean, the thing is that writers are just disconnected from manufacturing.
00:24:27.000 They just never see it.
00:24:29.000 Right.
00:24:30.000 And I guess you have to try to create some narrative arc.
00:24:36.000 Yeah.
00:24:37.000 I mean, there are some shows like How It's Made type of thing, but they're pretty niche.
00:24:46.000 I know I have a broken record here, but I can't emphasize enough that it is insanely difficult to manufacture.
00:24:53.000 Makes sense.
00:24:54.000 Yeah.
00:24:55.000 Well, it particularly makes sense when it's something that novel.
00:24:58.000 Something is...
00:25:00.000 But ultimately, cool as fuck.
00:25:03.000 Yeah.
00:25:05.000 What has it been like?
00:25:06.000 You've owned X for a year now.
00:25:09.000 Oh, yeah.
00:25:11.000 Do you ever wake up in the middle of the night and have a dream that you didn't do it?
00:25:16.000 And your life is infinitely easier?
00:25:19.000 Well, it's certainly...
00:25:22.000 A recipe for trouble, I suppose, or contention.
00:25:27.000 What was it ultimately that led you to make the decision to do it?
00:25:37.000 I mean, this is going to sound somewhat melodramatic, but I was worried that it was having a corrosive effect on civilization.
00:25:47.000 That it was just having a bad impact.
00:25:55.000 I think part of it is that it's where it was located, which is downtown San Francisco.
00:26:02.000 And while I think San Francisco is a beautiful city and we should really fight hard to kind of right the ship of San Francisco, if you've walked around downtown San Francisco, right near the X, FKA Twitter headquarters, it's a zombie apocalypse.
00:26:19.000 I mean, it's rough.
00:26:20.000 Have you been in that area?
00:26:21.000 Not lately.
00:26:22.000 No.
00:26:23.000 I've heard.
00:26:24.000 It's crazy.
00:26:25.000 I've heard it's crazy.
00:26:26.000 I've heard you really can't believe it until you actually go there.
00:26:29.000 You can't believe it until you go there.
00:26:30.000 So, now you have to say, well, what philosophy led to that outcome?
00:26:34.000 And that philosophy was being piped to Earth.
00:26:39.000 So, you know, a philosophy that would be ordinarily quite niche and geographically constrained, so that sort of the fallout area would be limited, was effectively given an information...
00:26:54.000 A weapon.
00:26:58.000 An information technology weapon to propagate what is essentially a mind virus to the rest of Earth.
00:27:05.000 And the outcome of that mind virus is very clear if you walk around the streets of downtown San Francisco.
00:27:10.000 It is the end of civilization.
00:27:13.000 And it's not just propagating the mind virus but suppressing any opposing viewpoints.
00:27:19.000 Yes.
00:27:20.000 Well, in order for the virus to propagate, it must suppress opposing viewpoints.
00:27:25.000 Because it doesn't stand up to scrutiny.
00:27:27.000 Correct.
00:27:29.000 I mean, you've felt the virus.
00:27:33.000 Yeah.
00:27:35.000 People have tried to cancel you so many times.
00:27:36.000 Yeah, it's fascinating.
00:27:39.000 I don't think you're melodramatic at all.
00:27:41.000 I think it's a – I mean I don't want to be melodramatic but it's almost like a death cult.
00:27:48.000 It's a death cult.
00:27:49.000 No, no.
00:27:49.000 That is exactly right.
00:27:52.000 It's essentially the extinctionists.
00:27:56.000 Like it's in the limit.
00:27:58.000 It is that they're propagating the extinction of humanity and civilization.
00:28:05.000 And there are some people who are – like most of the time it's implicit.
00:28:09.000 They don't – but sometimes it's explicit.
00:28:11.000 Like there was a guy on the front page of the New York Times who literally has the thing called the Extinctionist Movement and he was quoted on the front page of the New York Times as saying, there are 8 billion people in the world but it would be better if there were none.
00:28:25.000 And I'm like, well, buddy, you can start with yourself.
00:28:27.000 Yeah.
00:28:29.000 Does he have friends?
00:28:30.000 That's what always fascinates me.
00:28:32.000 Well, here he is.
00:28:33.000 That guy.
00:28:33.000 He looks like he's not long for this earth.
00:28:37.000 I mean, he's not young.
00:28:38.000 Voluntary human extinction movement.
00:28:40.000 That's hilarious.
00:28:42.000 I'd like to party with that dude.
00:28:44.000 I would just like to, like...
00:28:45.000 That's an explicit version of the death cult.
00:28:49.000 Yeah, maybe live long and die out.
00:28:51.000 I mean, it's not...
00:28:53.000 The extinction is a word he uses.
00:28:56.000 Yes.
00:28:56.000 No, I mean, it's literally a self-description.
00:29:00.000 Did they cover him glowingly?
00:29:01.000 That death cult was in charge of social media.
00:29:05.000 Yeah.
00:29:05.000 And still largely is at Google and Facebook, by the way.
00:29:08.000 Yeah.
00:29:10.000 So I'm like, I'm not in favor of human extinction.
00:29:14.000 They are, and they can go to hell.
00:29:17.000 Well, that guy is.
00:29:18.000 Yeah.
00:29:19.000 He can go to hell.
00:29:20.000 That guy seems silly.
00:29:21.000 I would like to hang out with him, though.
00:29:23.000 I would like to find out what makes him tick.
00:29:25.000 I bet that guy is fascinating.
00:29:27.000 We're good to go.
00:30:12.000 Only on DraftKings Sportsbook with the code ROGAN. The crown is yours.
00:30:18.000 Gambling problem?
00:30:19.000 Call 1-800-GAMBLER or visit www.1800gambler.net.
00:30:27.000 In New York, call 877-8HOPE or text HOPENY (467369). In Connecticut, help is available for problem gambling.
00:30:40.000 Call 888-789-7777 or visit ccpg.org.
00:30:47.000 Please play responsibly.
00:30:49.000 On behalf of Boothill Casino and Resort...
00:30:53.000 Licensee, partner, Golden Nugget, Lake Charles, Louisiana, 21+, age varies by jurisdiction, void in ONT. Bonus bets expire 168 hours after issuance.
00:31:07.000 See sportsbook.draftkings.com slash football terms for eligibility and deposit restrictions, terms, and responsible gaming resources.
00:31:19.000 This episode is brought to you by ZipRecruiter.
00:31:22.000 I know there's a whole lot of people changing jobs right now and a whole lot of places hiring, which means if it's your job to hire, you are probably slammed.
00:31:32.000 I see you.
00:31:34.000 But what if I were to tell you that there's something that can make your whole hiring process faster and easier?
00:31:41.000 It's ZipRecruiter.
00:31:42.000 Instead of you doing all this hiring work, ZipRecruiter works for you.
00:31:47.000 You post your job once on ZipRecruiter and then it sends it to more than a hundred job sites, so you reach just a ton of people.
00:31:56.000 Then ZipRecruiter's tech scans thousands of resumes for you to identify the ones whose skills and experience match your job.
00:32:06.000 So you get the first wide net and then you get it all narrowed down without you having to spend all that time.
00:32:15.000 So hiring heroes let ZipRecruiter help make your job easier.
00:32:21.000 Four out of five employers who post on ZipRecruiter get a quality candidate within the first day.
00:32:28.000 See for yourself.
00:32:30.000 Go to this exclusive web address to try ZipRecruiter for free.
00:32:34.000 That's ZipRecruiter.com slash Rogan.
00:32:37.000 Again, that's ZipRecruiter.com slash R-O-G-A-N. ZipRecruiter, the smartest way to hire.
00:32:46.000 If you get them alone for a few days and dig in.
00:32:49.000 I'm pro-environment, but in the limit, if you take environmentalism to an extreme, you start to view humanity as a plague on the surface of the earth, like a mold or something.
00:33:03.000 Right.
00:33:07.000 But this is actually false.
00:33:08.000 The Earth could take probably 10 times the current civilization.
00:33:12.000 The population could be – you could 10x the population without destroying the rainforest.
00:33:17.000 So the environmental movement – and I'm an environmentalist – has gone too far.
00:33:23.000 They've gone way too far.
00:33:26.000 You know, if you start thinking that humans are bad, then the natural conclusion is humans should die out.
00:33:33.000 Now, I'm headed to an AI safety, international sort of AI safety conference later tonight, leaving in about three hours.
00:33:42.000 And I meet with the British Prime Minister and a number of other people.
00:33:49.000 So you have to say, like, how could AI go wrong?
00:33:52.000 Well, if AI gets programmed by the extinctionists, It will—its utility function will be the extinction of humanity.
00:34:03.000 Yeah, clearly.
00:34:04.000 Yeah.
00:34:04.000 Yeah, I mean, particularly if— They won't even think it's bad like that guy.
00:34:07.000 Right.
00:34:08.000 Yeah.
00:34:10.000 If you let AI— It's messed up.
00:34:11.000 There's a lot of decisions that AI would make that would be very similar to eugenics.
00:34:17.000 I mean, there would be some radical changes in what people are allowed to and not allowed to do that allow them to survive that may be detrimental in terms of pollution and things like that, but it may be the only solution they have in their area.
00:34:31.000 I mean, maybe AI would come up with some sort of a different structure in terms of how they get power and resources. There's no shortage of power.
00:34:40.000 We talked about solar power for cars.
00:34:42.000 The issue is that cars just have a very low service area.
00:34:45.000 But you could actually power the entire United States with 100 miles by 100 miles of solar.
00:34:52.000 Really?
00:34:53.000 Yes.
00:34:55.000 So you could just pick some dead spot that you fly over.
00:34:57.000 Of which there are plenty.
00:34:58.000 Cover that sucker up with solar panels and charge the whole country.
00:35:02.000 Absolutely.
00:35:02.000 24-7.
00:35:03.000 You need batteries, but yes.
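A rough sanity check of the 100-mile-by-100-mile claim in Python; the capacity factor and US demand figures are approximate assumptions, not numbers from the episode.

    MILE_TO_M = 1609.34

    area_m2 = (100 * MILE_TO_M) ** 2           # ~2.6e10 square metres

    peak_w_per_m2 = 250                        # ~1 kW/m^2 sunlight x ~25% panel efficiency
    capacity_factor = 0.20                     # rough utility-scale solar average (assumed)

    average_output_tw = area_m2 * peak_w_per_m2 * capacity_factor / 1e12
    us_average_demand_tw = 0.47                # ~4,100 TWh/year of US electricity (approximate)

    print(f"100 mi x 100 mi of panels: ~{average_output_tw:.1f} TW average output")
    print(f"US average electricity demand: ~{us_average_demand_tw:.2f} TW")
    # With batteries to shift daytime output into the night, the area comfortably covers demand.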
00:35:05.000 Yeah.
00:35:07.000 Wow.
00:35:07.000 Yeah, it's not hard.
00:35:08.000 Meaning it's very feasible.
00:35:10.000 In fact, the sun is converting over 4 million tons of mass to energy every second.
00:35:18.000 And it's no maintenance.
00:35:19.000 That thing just works.
00:35:21.000 We have a giant fusion reactor in the sky that is the sun.
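The four-million-tons-a-second figure can be checked with E = mc²; the commonly cited mass-loss rate (~4.3 million tonnes per second) and the Earth-Sun distance below are textbook values, not numbers from the episode.

    import math

    C = 2.998e8                      # speed of light, m/s
    MASS_LOSS_KG_PER_S = 4.3e9       # ~4.3 million tonnes of mass converted each second
    AU_M = 1.496e11                  # mean Earth-Sun distance, metres

    luminosity_w = MASS_LOSS_KG_PER_S * C**2
    flux_at_earth = luminosity_w / (4 * math.pi * AU_M**2)

    print(f"Solar output: ~{luminosity_w:.1e} W")            # ~3.9e26 W, the Sun's luminosity
    print(f"Flux at Earth's distance: ~{flux_at_earth:.0f} W/m^2")
    # ~1,360 W/m^2 above the atmosphere, which is where the ~1 kW/m^2 at the
    # surface mentioned earlier comes from once the atmosphere takes its cut.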
00:35:27.000 In fact, people are like, what about in a radiation?
00:35:30.000 I'm like, the sun is literally a nuclear reactor in the sky.
00:35:34.000 Are you scared to go in daylight?
00:35:37.000 Rocks have radiation.
00:35:39.000 Yes.
00:35:40.000 The radiation risk is greatly overestimated.
00:35:46.000 I always wonder why radiation is always bad in real life, but always awesome in comic books.
00:35:52.000 Yeah, exactly.
00:35:53.000 You get bitten by a radioactive spider, and suddenly you have spider abilities.
00:35:56.000 Get hit with gamma rays, you become the Hulk.
00:35:58.000 What if you're bitten by a radioactive cockroach? You'd be like the cockroach man.
00:36:01.000 Yeah, you can be one of the X-Men.
00:36:03.000 Yeah.
00:36:05.000 I think the problem is most people just don't understand what radiation is, and so it just sounds like a mysterious, invisible death ray.
00:36:14.000 Well, it's almost like drugs.
00:36:16.000 Like, we think of it, we put a blanket over it.
00:36:18.000 Like, it's all one thing.
00:36:20.000 You know, radiation is Chernobyl.
00:36:22.000 Right.
00:36:23.000 I mean, the things you can go to, you can actually tour Chernobyl right now.
00:36:26.000 Can you really?
00:36:27.000 Yes.
00:36:28.000 You can actually go to where the meltdown is?
00:36:30.000 Well, I mean, there's a war zone, but apart from that, the issue is, you know, more getting shot than it is, you don't have a radiation risk.
00:36:39.000 I mean, the problem is, like, I think when people don't understand what radiation is, they just, they can't see it, they can't feel it, they think, well, I could just die at any moment, like, from a magic death ray.
00:36:48.000 Right.
00:36:49.000 You know, I've had people say, like, oh, the radiation from their phone is going to hurt them, or they're scared of the microwave.
00:36:54.000 I'm like...
00:36:55.000 When you say radiation, do you mean particles or photons?
00:36:57.000 And if you mean photons, what wavelength?
00:37:00.000 And they're like, I don't know what you mean.
00:37:02.000 They don't know anything about that.
00:37:03.000 Right.
00:37:04.000 They're afraid of the term.
00:37:07.000 But it's because of Three Mile Island and Fukushima.
00:37:10.000 We've been...
00:37:11.000 Yeah, but nobody died of radiation from Fukushima.
00:37:13.000 Not one person.
00:37:14.000 True.
00:37:14.000 In fact, but I was asked by people in California, like, when Fukushima happened, whether radiation would get to California.
00:37:23.000 And I'm like, that's the dumbest thing I've ever heard.
00:37:24.000 And so, actually, to help support Japan, I flew to Fukushima and ate locally grown vegetables on TV. And I'm still alive.
00:37:34.000 I have a friend, he's very smart, but he won't eat fish out of the Pacific because he's worried about the radiation from Fukushima.
00:37:42.000 Yeah, that's irrational.
00:37:44.000 There is no physics substance to that, I would say, at all.
00:37:49.000 Not even slightly.
00:37:50.000 I'm going to send him this clip.
00:37:51.000 Yes.
00:37:52.000 Go back to the sushi place, bro.
00:37:54.000 No, you should be...
00:37:56.000 If you eat too much tuna, you're going to have...
00:37:58.000 Mercury.
00:37:59.000 Mercury poisoning from tuna is a real thing.
00:38:02.000 You can get arsenic from sardines too.
00:38:03.000 I found that out the hard way.
00:38:05.000 Really?
00:38:05.000 Yeah.
00:38:06.000 You ate too many sardines?
00:38:07.000 Yeah, I got my blood work done and the doctor says, you have arsenic in your blood.
00:38:11.000 And I go, is someone poisoning me?
00:38:13.000 He goes, no, it's very low level.
00:38:15.000 It's like, is your girlfriend angry at you?
00:38:18.000 Do you eat a lot of fish?
00:38:20.000 And I said, yeah, I eat like three cans of sardines a night.
00:38:23.000 That's a lot of sardines, man.
00:38:24.000 Yeah, I love sardines.
00:38:27.000 I love them.
00:38:29.000 You love sardines?
00:38:30.000 I've always loved sardines.
00:38:32.000 Okay.
00:38:32.000 I love them.
00:38:33.000 But it turns out you can't eat too much of it because they're not good for you.
00:38:37.000 Okay.
00:38:37.000 I mean a little sardines once in a while but not three cans a night.
00:38:40.000 Well for me it's like I come home late from the comedy club and I want something easy to eat and I don't want to stop and get fast food so I open up a few cans of sardines.
00:38:47.000 And I'll, you know, watch a little TV, eat a few cans of sardines.
00:38:51.000 I was doing it every night.
00:38:52.000 And then I stopped doing it, and I got my blood work done a couple months later, and it was gone.
00:38:56.000 Yeah.
00:38:57.000 I think anchovies really, really pep up a Caesar salad.
00:39:01.000 Yeah, they do.
00:39:02.000 I'm a fan.
00:39:03.000 I'm a fan.
00:39:04.000 I'm a fan of anchovies as well.
00:39:05.000 One of my favorite pizzas ever is pineapple and anchovy.
00:39:08.000 Okay.
00:39:09.000 Double pineapple, double anchovy.
00:39:10.000 Wow.
00:39:10.000 It's amazing.
00:39:11.000 It's the sweet and the salty, and then you got the tomato sauce and the cheese.
00:39:17.000 It's my favorite pizza.
00:39:18.000 It's very good.
00:39:19.000 I mean, as a kid, I was very much against Hawaiian pizza, and as an adult, I like it.
00:39:24.000 Hawaiian's good, but I'm telling you, anchovies and pineapple is the bomb diggity.
00:39:29.000 That's the bomb diggity.
00:39:30.000 Okay, I'll give it a shot.
00:39:31.000 That's the bomb diggity.
00:39:32.000 Wait, can we order some right now?
00:39:34.000 Is that feasible?
00:39:35.000 I bet we could.
00:39:36.000 Okay, let's try it.
00:39:37.000 That'd be sick.
00:39:38.000 Yeah.
00:39:39.000 I can do it.
00:39:40.000 We'll have someone out there.
00:39:41.000 Have Jeff order a very large pizza with double pineapple, double anchovies.
00:39:47.000 Great.
00:39:47.000 Fantastic.
00:39:48.000 I'm hungry.
00:39:49.000 Let's fucking go.
00:39:51.000 Let's fucking go.
00:39:53.000 No time like the present.
00:39:54.000 Enjoy life.
00:39:55.000 Well, there's got to be a good spot around here.
00:39:58.000 Tell them to find a good spot and tell them it's for us.
00:40:00.000 They'll cook it up.
00:40:02.000 If they won't, tell them we'll mention their name.
00:40:08.000 Tell them we'll mention their name on the podcast.
00:40:12.000 Don't tell them it's us.
00:40:13.000 Tell them it's us.
00:40:14.000 Fuck it.
00:40:15.000 If they're going to close, tell them we'll mention their name.
00:40:17.000 What is this salty sauce that's so mysterious?
00:40:24.000 Don't tell them it's us.
00:40:25.000 Good call.
00:40:26.000 Yeah, don't tell them it's us.
00:40:28.000 Make sure you don't buy it from any liberals.
00:40:30.000 What is the salty, tangy substance on that?
00:40:32.000 Don't buy it from East Austin.
00:40:35.000 Don't buy it from anyone who still wears a mask.
00:40:39.000 There's a lot of them out there.
00:40:41.000 There's a lot of them out there.
00:40:42.000 They're still masked up.
00:40:43.000 It's wild.
00:40:44.000 Yeah, once in a while I see someone paranoid.
00:40:46.000 On the street?
00:40:47.000 Yeah.
00:40:48.000 I saw a guy on the street the other day just walking around with a mask on.
00:40:51.000 I'm like, okay, buddy.
00:40:51.000 You look like you're about 28 years old.
00:40:53.000 Yeah.
00:40:54.000 I think you're going to be okay.
00:40:55.000 Be okay, yeah.
00:40:55.000 You're probably not going to be okay breathing that fucking same air in that mask and all the bacteria you're spitting out.
00:41:01.000 Yeah.
00:41:02.000 It's attaching to that cloth.
00:41:05.000 Yeah.
00:41:05.000 Masks are not like some magic health shield.
00:41:10.000 I mean, there are times where a mask is warranted, like if a surgeon is operating on you or whatever, then you don't want the surgeon spitting in your wound, you know?
00:41:18.000 Of course.
00:41:19.000 But most of the time, a mask is not good for you.
00:41:22.000 Yeah, if you can breathe out of it, that means you're breathing in.
00:41:27.000 That means you're also exhaling.
00:41:30.000 So, like, how much is it filtering?
00:41:31.000 Like, what is it?
00:41:33.000 Particles?
00:41:34.000 I'd say like a mask is much like sort of a shield in battle in that, you know, it'll help protect you a little bit from arrows and stuff, but it doesn't make you arrow-proof.
00:41:44.000 We're just talking about, you know, shooting arrows and stuff.
00:41:46.000 Right.
00:41:47.000 So, I mean, there are times when masks are warranted, but most of the time it's actually kind of counterproductive.
00:41:53.000 Well, that was one of the things about the old Twitter was the propaganda and the adherence to whatever the CDC was saying and the dismissing of legitimate scientists, guys like Jay Bhattacharya from Stanford and legit guys.
00:42:13.000 And they were suppressing them and even banning them.
00:42:16.000 They banned Alex Berenson.
00:42:18.000 It was wild.
00:42:19.000 They banned Alex for essentially reading peer-reviewed papers.
00:42:25.000 Yeah.
00:42:26.000 No, I mean, all Twitter was basically an arm of the government.
00:42:30.000 Yeah.
00:42:31.000 So...
00:42:31.000 Was that shocking?
00:42:32.000 Like, what was that like?
00:42:33.000 Because that, to me, that was the most bizarre, was the Twitter files, when you let Schellenberger and Matt Taibbi and all those guys get in the Twitter, and the response were, Matt Taibbi gets audited.
00:42:44.000 I mean, which is just wild.
00:42:46.000 I mean, it's just so blatant and so in your face.
00:42:49.000 Yeah, it's weird.
00:42:50.000 No, I mean...
00:42:54.000 Yeah, the degree which – and by the way, Jack didn't really know this, but the degree to which Twitter was simply an arm of the government was not well understood by the public.
00:43:05.000 And it was – there was no – it was whatever the official – it was like Pravda basically.
00:43:12.000 It's a state publication is the way to think of old Twitter.
00:43:15.000 It's a state publication.
00:43:17.000 And was the justification from their perspective that they are progressive liberals, they have the right intentions, it's important that they stay in power, the progressive liberals stay in government and power because this is their...
00:43:32.000 There was basically oppression of...
00:43:38.000 Any views that would even, I would say, be considered middle of the road, but certainly anything on the right.
00:43:45.000 I'm not talking about like far right.
00:43:48.000 I'm just talking mildly right.
00:43:50.000 The people, like Republicans were suppressed at 10 times the rate of Democrats.
00:43:54.000 Now, that's because old Twitter was fundamentally controlled by the far left.
00:44:00.000 It was like completely controlled by the far left.
00:44:03.000 And that's why I say, like, San Francisco-Berkeley is a niche ideology.
00:44:09.000 It's hard to say, like, is there a place that's more far left than San Francisco-Berkeley?
00:44:13.000 Maybe Portland.
00:44:14.000 Maybe Portland.
00:44:15.000 But it's like...
00:44:16.000 Right there.
00:44:17.000 Yeah.
00:44:18.000 It's basically Portland.
00:44:18.000 Those two places are the most far left places in America.
00:44:23.000 Yes.
00:44:24.000 So from their standpoint, everything is to the right, including moderates.
00:44:30.000 Right.
00:44:30.000 Right, right.
00:44:32.000 So now, if you internalize a far-left position...
00:44:39.000 Everything seems wrong to you that is not far left.
00:44:42.000 Right.
00:44:43.000 And so they naturally oppressed anything that didn't agree with their views.
00:44:49.000 That's why I say that it was an accidental far left information weapon.
00:44:54.000 So it's like Silicon Valley attracts the smartest engineers, the smartest sort of technologists and programmers from around the world.
00:45:07.000 They created an information weapon that was then harnessed by the far left, who could not themselves create the weapon, but happened to be co-located where the technologists were.
00:45:17.000 It happened to be aligned politically with the people that possessed it.
00:45:21.000 The technologists generally are moderate, maybe moderate left, but they're not far left.
00:45:29.000 That's why I say San Francisco, Berkeley, it doesn't even extend to South San Francisco or even to Palo Alto.
00:45:36.000 So SF Berkeley is the most far left, perhaps in a competition with Portland, but I'd say SF Berkeley is more far left even than Portland.
00:45:47.000 Literally, in America, we're talking about an area that's maybe a 10-mile radius.
00:45:55.000 Normally, the negative effects of a far-left ideology would be geographically limited to a 10-mile radius.
00:46:07.000 Any bad effects of that ideology would be geographically constrained under normal circumstances and have been in the past.
00:46:16.000 But when you have basically a technological megaphone, which was Twitter and social media in general, suddenly the far left are handed a megaphone to Earth, an incredibly powerful technology weapon that they themselves could not create,
00:46:36.000 but they happen to be co-located with the technologists who created it by accident.
00:46:46.000 Is it shocking that more people don't understand how dangerous that is?
00:46:51.000 I think some people understand.
00:46:53.000 Some people do.
00:46:55.000 Some people understand.
00:46:57.000 So, I mean, from the standpoint of some people who used to be at Twitter, the people are like, well, it's a big shift to the right.
00:47:03.000 That is correct.
00:47:04.000 It is a shift to the right because everything is to the right if you're far left.
00:47:08.000 Everything is to the right.
00:47:11.000 But how many far-left people have actually been suspended or banned from Twitter now X? Zero.
00:47:20.000 So it's really just moved to the center, but from the perspective of the far-left, it's moved to the right.
00:47:27.000 Everything's relative.
00:47:29.000 The difference in moderation...
00:47:33.000 Sorry, I should say, it propagated that far-left philosophy, not just to America, but to everywhere on Earth.
00:47:39.000 Right.
00:47:40.000 Yeah.
00:47:40.000 Yeah.
00:47:42.000 And with the same level of suppression in other countries as well?
00:47:44.000 Yes.
00:47:46.000 But the Taliban is on Twitter, right?
00:47:52.000 I always think of like, hey, Mr. Taliban, tally me a banana.
00:47:58.000 Hey, Mr. Taliban, tally me a banana.
00:48:00.000 But there's definitely some people on Twitter that are...
00:48:03.000 Daylight coming, I want to go.
00:48:05.000 Yeah.
00:48:07.000 Yeah, so the point is, from my standpoint, is that X, FKA Twitter should represent the sort of collective consciousness of humanity.
00:48:25.000 So now, that means that there are going to be views on there that you don't like or disagree with.
00:48:32.000 Yeah.
00:48:35.000 But that's humanity.
00:48:38.000 So are you going to exclude them or not?
00:48:40.000 Now, I mean, if somebody, you know, breaks the law, then the account is suspended.
00:48:46.000 I mean, if they actively advocate murder, then the account is suspended.
00:48:51.000 We do have what we call, like, the kind of United Nations exclusion rule, which is that you can have, say, the Ayatollah, who, you know...
00:49:06.000 Would prefer that Israel didn't exist, for example.
00:49:10.000 But he's allowed to go to the UN building in New York.
00:49:14.000 And, in fact, generally officials from Iran do, in fact, go to the UN building in New York, even though they're a heavily sanctioned country.
00:49:25.000 So, I think that there's merit to having, just like there's some merit to the UN, one can disagree with the UN, and I think we shouldn't have a world government that we bow down to.
00:49:37.000 In fact, that's risky for civilization.
00:49:39.000 But I think you do want to have the leaders of countries represented on social media.
00:49:47.000 You want to hear what they have to say, even if what they say is terrible.
00:49:50.000 I think that is true across the board.
00:49:53.000 And I think one of the things you just said that's very important is that's humanity.
00:49:56.000 And I think it's important that a social media platform, especially the biggest one, represents humanity so we understand what we're talking about.
00:50:07.000 Because if we have this distorted idea of what people think and want and need because everyone only exists inside this ideological bubble and anything outside of that bubble gets censored, then that changes, literally changes the tone of the entire country.
00:50:21.000 It changes what people think is okay and not okay, makes people feel differently.
00:50:26.000 It's not humanity.
00:50:27.000 It's different.
00:50:28.000 It's a very forced version of humanity.
00:50:32.000 Yes, absolutely.
00:50:33.000 Yeah.
00:50:34.000 So, I mean, the whole point of free speech.
00:50:38.000 Free speech is only relevant.
00:50:40.000 The First Amendment is only relevant if you allow people you don't like to say things you don't like.
00:50:49.000 Because if you like it, you don't need a First Amendment.
00:50:53.000 So the whole point of free speech is that, frankly, even people you hate say things you hate because if people you hate can say things that you hate, that means that they can't stop you from saying what you want to say, which is very,
00:51:09.000 very important.
00:51:10.000 Right.
00:51:11.000 But the problem with Twitter is it was not the case.
00:51:13.000 Correct.
00:51:14.000 It was people that you hate couldn't say.
00:51:18.000 Anyone they didn't like, they censored.
00:51:19.000 Yeah.
00:51:20.000 Or what's called de-amplify.
00:51:22.000 Well, not just de-amplify, but under the behest of the government, would suppress real news, which was very bizarre.
00:51:29.000 Yes.
00:51:30.000 So they were very aware of something being accurate, and they still suppressed it because the government wanted them to suppress it.
00:51:38.000 I mean, in my view, there have been severe First Amendment violations by multiple government agencies, and there should be repercussions for that.
00:51:45.000 And is it – do different laws apply because it's a privately owned social media company?
00:51:52.000 I mean what laws do apply in terms of like – when you're looking at it, one of the arguments that the leftists would use is it's a private company.
00:52:02.000 They can do whatever they want.
00:52:04.000 Yeah, it's funny that when the shoe's on the other foot, they now say the private company can't do whatever it wants.
00:52:08.000 Well, yeah, now they're upset.
00:52:12.000 But the government itself is not allowed to censor speech.
00:52:19.000 But in my view, the government de facto did censor speech.
00:52:24.000 And there should at least be a case that is heard by the public.
00:52:29.000 Because if the government severely coerces...
00:52:34.000 You know, a platform that sort of coerces the press, then I think that is or should be a First Amendment violation.
00:52:44.000 Well, they can't do it with other media forms, right?
00:52:48.000 They're not allowed to do it with any other – they're not allowed to.
00:52:52.000 Right.
00:52:52.000 If they try to do that with a newspaper, they'll get in trouble.
00:52:54.000 Right.
00:52:56.000 Would they?
00:52:57.000 That's the question.
00:52:58.000 You didn't know about the federal government.
00:53:02.000 You didn't know about the intelligence agencies inside of Twitter until we found out.
00:53:06.000 Do you think that this is ubiquitous?
00:53:09.000 It's absolutely all the social media companies.
00:53:11.000 In fact, right now, X, formerly known as Twitter, is the only one that is not kowtowing to the government.
00:53:20.000 It's the only one.
00:53:22.000 All the others just do exactly what the government wants.
00:53:26.000 That is wild.
00:53:27.000 Yes.
00:53:27.000 What I was getting at, do you think that that's everywhere?
00:53:30.000 Yes.
00:53:30.000 Do you think that that's CNN? Do you think that that's the New York Times?
00:53:33.000 Do you think that that's the Washington Post?
00:53:36.000 Because if they were going to infiltrate social media, they were going to infiltrate the media.
00:53:41.000 I mean, it is weird the degree to which the media is in lockstep.
00:53:45.000 Look, why is the media in lockstep?
00:53:48.000 And why doesn't the media question the government?
00:53:50.000 They used to.
00:53:51.000 Why don't they do that anymore?
00:53:53.000 Seems weird.
00:53:55.000 Something doesn't add up.
00:53:56.000 What do you think?
00:53:57.000 Well, there seems like there's a bunch of factors, right?
00:53:59.000 I think one of the big factors is that pharmaceutical drug companies are allowed to advertise on television.
00:54:04.000 And we're one of two countries in the world that allow that.
00:54:08.000 I actually agree with pharmaceutical advertising, provided it is truthful.
00:54:11.000 Because there could be some drug that is helpful to someone, but obviously the claims need to be accurate.
00:54:18.000 So, to play devil's advocate here, I actually think pharmaceutical advertising is generally accurate.
00:54:30.000 I think that's actually okay.
00:54:33.000 Now, I should say that a lot of the censorship that we see is coming from – indirectly from advertisers and advertising agencies and from PR companies who want a particular viewpoint pushed, or are being driven by non-profits to push a particular...
00:54:56.000 What will happen is there will be a group of non-profits that push advertisers to advertise or not advertise on a particular platform.
00:55:09.000 One often hears of the George Soros boogeyman.
00:55:12.000 Soros actually...
00:55:15.000 You know, he is, I believe, the top contributor to the Democratic Party.
00:55:19.000 The second one was Sam Bankman-Fried.
00:55:23.000 So, and Soros, I don't know.
00:55:26.000 I mean, he had a very difficult upbringing.
00:55:30.000 And in my opinion, he fundamentally hates humanity.
00:55:36.000 That's my opinion.
00:55:37.000 Really?
00:55:38.000 Yeah.
00:55:39.000 I mean, well, he's doing things that erode the fabric of civilization, you know, getting DAs elected who refuse to prosecute crime.
00:55:47.000 That's part of the problem in San Francisco and LA and many other cities.
00:55:51.000 So why would you do that?
00:55:55.000 Was it humanity or is it just the United States as a whole?
00:55:59.000 I mean, he's pushing things in other countries too.
00:56:01.000 He's doing the same thing?
00:56:02.000 Yeah.
00:56:03.000 Now, George at this point is pretty old.
00:56:07.000 I mean, he's not...
00:56:10.000 You know, he's basically a bit senile at this point.
00:56:14.000 But I mean, he's very smart.
00:56:18.000 And he's very good at arbitrage.
00:56:20.000 You know, famously, he shorted the British pound.
00:56:23.000 That's sort of how I think he made his first money, was shorting the pound.
00:56:29.000 So he's good at spotting, basically, arbitrage, like spotting value for money that other people don't see.
00:56:36.000 So one of the things he noticed was that the value for money in local races is much higher than it is in national races.
00:56:46.000 So the lowest value for money is a presidential race.
00:56:49.000 Then the next lowest value for money is a Senate race, then a Congress.
00:56:54.000 But once you get to sort of city and state district attorneys, the value for money is extremely good.
00:57:02.000 And Soros realized that you don't actually need to change the laws.
00:57:05.000 You just need to change how they're enforced.
00:57:08.000 If nobody chooses to enforce the law or the laws are differentially enforced, it's like changing the laws.
00:57:14.000 That's what he figured out.
00:57:16.000 But what's stunning is that people haven't pulled the brakes on this trend and reversed course.
00:57:25.000 I'm pulling the brakes?
00:57:27.000 Yeah.
00:57:28.000 Yeah.
00:57:29.000 Pulling the brakes right now.
00:57:30.000 Yeah, you are.
00:57:32.000 But you might be the only one.
00:57:34.000 Well, I think more people should.
00:57:39.000 Most people just don't want to rock the boat.
00:57:44.000 Most people are looking for acceptance from society and they're, you know, if there's some negative press article, they're like shattered.
00:57:51.000 I couldn't give a damn.
00:57:54.000 Go ahead, make my day.
00:57:57.000 Well, it's fascinating where if you're a high-profile public figure like yourself, it's impossible to make everybody happy.
00:58:06.000 So there's going to be someone who says something shitty about you.
00:58:10.000 Somehow or another when it's in print, does that mean more?
00:58:13.000 Because other people are going to see this shitty thing?
00:58:16.000 That's where it gets odd.
00:58:20.000 Because essentially an article in the New York Times is just a single person's opinion and whatever editor gets involved.
00:58:26.000 It's just a lot of people will read that.
00:58:29.000 I mean, less people these days than in the past.
00:58:31.000 But I think people know that now.
00:58:32.000 People know that now.
00:58:33.000 I find the New York Times these days to be hard to read.
00:58:36.000 Well, unfortunately, they make some grave errors.
00:58:41.000 Yeah.
00:58:42.000 Like that Hamas bombing the- Hamas?
00:58:45.000 No.
00:58:47.000 The Israeli bombing the hospital story.
00:58:50.000 Yes.
00:58:51.000 It's delicious.
00:58:53.000 I think we should cut off chickpea exports.
00:58:56.000 That'll bring them to their knees right away.
00:59:00.000 What do you do, take a chip and dip it in nothing?
00:59:03.000 What we need to do is introduce them to pineapple and anchovy pizza.
00:59:07.000 I hope that's coming.
00:59:09.000 Is that coming?
00:59:10.000 Do we have a pizza name, like a company?
00:59:13.000 I think so.
00:59:13.000 I'll get some information.
00:59:15.000 I want to make sure it's a good one, though.
00:59:16.000 Pizza Leon.
00:59:17.000 Oh, that's legit.
00:59:18.000 It's pretty close.
00:59:19.000 Okay, there we go.
00:59:20.000 Nice.
00:59:21.000 Did they give us a timeline?
00:59:23.000 It shouldn't take too long.
00:59:24.000 They're not too far away, and it's late, so it shouldn't...
00:59:27.000 I would have bet 20 minutes, 30 minutes, maybe.
00:59:29.000 Oh, right.
00:59:29.000 Max 40, but...
00:59:31.000 Taking care of your health isn't always easy, but it should be simple.
00:59:34.000 And that's why, for the last three years, I've been drinking AG1 every day.
00:59:39.000 AG1 is a foundational nutrition supplement that supports whole body health.
00:59:44.000 It's simple, effective, and comprehensive.
00:59:47.000 It's just one scoop mixed in water once a day, every day.
00:59:51.000 It couldn't be easier, and it tastes great with hints of pineapple and vanilla.
00:59:56.000 And whether I'm at home or on the road, when I take AG1, I know I'm getting the nutrients that I need to help me feel my best.
01:00:03.000 AG1 is a science-driven formula of vitamins, probiotics, and whole food sourced ingredients that support your brain, gut, and immune system.
01:00:12.000 It's a foundational nutrition supplement designed to raise your baseline health.
01:00:17.000 Every scoop of AG1 contains 75 high-quality ingredients that are obsessively sourced for absorption, potency, and nutrient density, so you can actually feel the benefits.
01:00:29.000 Trust me, you're going to love the way you feel when you take AG1. So, if you want to take ownership of your health, it starts with AG1. Try AG1 and get a free one-year supply of vitamin D3 and K2 and five free AG1 travel packs with your first purchase.
01:00:47.000 Go to drinkag1.com slash Joe Rogan.
01:00:52.000 That's drinkag1.com slash Joe Rogan.
01:00:57.000 Check it out.
01:00:58.000 This episode is brought to you by Oracle.
01:01:00.000 AI might be the most important new computer technology ever.
01:01:04.000 It's storming every industry and literally billions of dollars are being invested.
01:01:10.000 So buckle up.
01:01:11.000 The problem is that AI needs a lot of speed and processing power.
01:01:16.000 So how do you compete without costs spiraling out of control?
01:01:20.000 It's time to upgrade to the next generation of the cloud.
01:01:23.000 Oracle Cloud Infrastructure, or OCI. OCI is a single platform for your infrastructure, database, application development, and AI needs.
01:01:36.000 OCI has four to eight times the bandwidth of other clouds, offers one consistent price instead of variable regional pricing, and of course, nobody does data better than Oracle.
01:01:50.000 So now you can train your AI models at twice the speed and less than half the cost of other clouds.
01:01:57.000 If you want to do more and spend less like Uber and thousands of others, take a free test drive of OCI at oracle.com slash rogan.
01:02:09.000 That's oracle.com slash rogan.
01:02:13.000 oracle.com slash rogan.
01:02:16.000 You know, that's something I have to say I've got a lot of respect for.
01:02:20.000 If somebody's willing to make pizza late at night, my hat is off.
01:02:24.000 I mean, that is great.
01:02:27.000 Absolutely.
01:02:27.000 Late night food.
01:02:29.000 I appreciate the fun.
01:02:30.000 If you can get a really good late night meal, hats off.
01:02:33.000 Totally.
01:02:33.000 Or wigs off.
01:02:34.000 100%.
01:02:35.000 Yeah, I'm a giant fan of very good late night food.
01:02:38.000 And that's one of the things that Los Angeles really used to have.
01:02:40.000 They had a Pacific dining cart where you can get a legit steak 24 hours a day.
01:02:45.000 Really?
01:02:46.000 That's great.
01:02:46.000 Yeah.
01:02:46.000 I don't know if it's still open in downtown LA. I believe the one in Santa Monica closed.
01:02:50.000 But a Pacific dining cart in downtown LA was a legit steakhouse.
01:02:55.000 And we would leave the comedy store at 2, 3 in the morning, get a legit steak.
01:03:01.000 That's cool.
01:03:02.000 Is that still open?
01:03:02.000 Temporarily closed.
01:03:04.000 Ah, fucking COVID got them.
01:03:06.000 COVID just took out so many restaurants.
01:03:08.000 It's crazy.
01:03:08.000 It's not coming back.
01:03:11.000 Yeah, they're not coming back.
01:03:12.000 Motherfucker.
01:03:13.000 COVID got 70% of the restaurants in LA at one point.
01:03:16.000 Wow.
01:03:17.000 Not COVID, I should say.
01:03:19.000 Policies.
01:03:20.000 Well, the mind virus.
01:03:22.000 I mean, it's like just crazy.
01:03:24.000 Well, that's why I moved here.
01:03:26.000 One of the reasons why I moved here is we came here in May of 2020 and you could go indoors and eat in restaurants. And my kids, who were pretty young at the time, 10 and 12, they were like, we want to live here!
01:03:37.000 So it's like, they're freaked out.
01:03:39.000 Like, LA was weird.
01:03:40.000 It changed.
01:03:40.000 Yeah, I mean, for most of COVID, I was actually in South Texas building this Starship factory.
01:03:47.000 And, you know, we're just, yeah, no masks, no nothing, just building a factory, building rockets.
01:03:52.000 And then, you know, we would have teams from California visit all masked up, and they'd freak out that we don't have masks, and we're like, we're still alive, man.
01:04:03.000 Yeah.
01:04:05.000 Did you lose anybody?
01:04:06.000 Did anybody from your factory die of COVID? Not that I'm aware of, no.
01:04:11.000 So part of it is that I kind of saw a dress rehearsal, which is that it kind of started in Wuhan.
01:04:20.000 And so Tesla's got 20,000 employees in China.
01:04:27.000 And so the first wave happened in China, and nobody died or got seriously ill.
01:04:33.000 I was like, okay, well, this is...
01:04:36.000 Can't be that bad.
01:04:37.000 And we're not relying on government statistics.
01:04:40.000 We literally know who showed up for work.
01:04:42.000 Did they badge in or not?
01:04:46.000 And we had no one die and no one got seriously ill.
01:04:51.000 So I'm like, well, I don't know what the big deal is.
01:04:54.000 Well, there's a problem that people still want to stick to this initial narrative that they believed and that they espoused, like they repeated it.
01:05:03.000 And so they'll still fight you on this today.
01:05:07.000 People still fight you today on the merits of the lockdowns, the importance of vaccine mandates, closing schools.
01:05:15.000 There's people that stated an opinion in 2020 And they still are doing mental gymnastics to try to make it seem like that was the right choice.
01:05:25.000 No, it was just a panic.
01:05:27.000 Yeah.
01:05:28.000 And a lot of deaths got ascribed to COVID that had nothing to do with COVID. And in fact, I'd say in the beginning, the cure is worse than the disease.
01:05:36.000 So, because people panic too much.
01:05:39.000 And so that somebody would get diagnosed with COVID, they put them on intubated ventilator for a week, and this was going to basically cook your lungs.
01:05:48.000 So if you're on pure O2 under pressure with a tube stuck down your throat and under anesthetic, this is very bad for you.
01:05:58.000 It's one thing if you do that for a couple hours for an operation, but if you do that for a week, it's going to roast your lungs.
01:06:05.000 The air that we're breathing right now is 78% nitrogen, 1% argon, about 21% oxygen, and then a little bit of miscellaneous.
01:06:14.000 So if you ask most people, what are you breathing, they say oxygen.
01:06:16.000 No, you're breathing nitrogen.
01:06:19.000 Only about a fifth of it is oxygen, and there's about 1% argon.
01:06:25.000 So I know quite a lot about life support systems because we make spaceships, and you have to keep people alive in a vacuum.
01:06:32.000 So you've got to say, okay, what percentage of nitrogen, what percentage of oxygen are you going to do?
01:06:35.000 What's the pressure going to be?
01:06:38.000 And so, like, sea level pressure is about 15 pounds per square inch.
01:06:43.000 And the partial pressure of oxygen, being 20%, is therefore roughly 3 pounds per square inch of oxygen.
01:06:51.000 So, in a spacecraft, you want to, and especially if you're in a space suit, you want to lower the pressure.
01:06:58.000 So you want to keep the oxygen, still give people enough oxygen to function, obviously, but you want to lower the nitrogen content so that you don't have a space suit that's at 15 psi.
01:07:08.000 Because at 15 psi, you just, you know, just pop out like a balloon.
01:07:12.000 It's like hard to move.
01:07:15.000 So you want to try to lower the pressure, you know, down to around, I don't know, six, seven PSI, maybe even five PSI. So you'd lower it to, you know, try to keep the oxygen, partial pressure of oxygen roughly the same,
01:07:31.000 so maybe around three PSI and then three PSI of nitrogen, so you've got a 50-50 mix of nitrogen and oxygen, and then you just get pretty hot into that week.
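A quick back-of-the-envelope version of that partial-pressure argument, as a sketch for illustration only; the 14.7 psi and 21% figures are standard sea-level values, and the 6 psi suit pressure is just one of the round numbers mentioned above, not an actual spec.

```python
# Rough partial-pressure arithmetic for a reduced-pressure suit or cabin.
SEA_LEVEL_PSI = 14.7   # total atmospheric pressure at sea level
O2_FRACTION = 0.21     # oxygen fraction of ordinary air

o2_partial_sea_level = SEA_LEVEL_PSI * O2_FRACTION
print(f"O2 partial pressure at sea level: {o2_partial_sea_level:.1f} psi")  # ~3.1 psi

# To keep a suit from ballooning, drop the total pressure but hold the oxygen
# partial pressure roughly constant by raising the oxygen fraction.
suit_total_psi = 6.0   # example value from the conversation (5-7 psi range)
o2_fraction_in_suit = o2_partial_sea_level / suit_total_psi
print(f"O2 fraction needed at {suit_total_psi:.0f} psi: {o2_fraction_in_suit:.0%}")  # ~51%, roughly the 50-50 mix mentioned
```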
01:07:40.000 Yeah, I talked about it.
01:07:42.000 I made it a while.
01:07:44.000 I'm sweating.
01:07:46.000 I mean, it's gonna be sweaty and itchy.
01:07:48.000 A little bit.
01:07:49.000 Yeah.
01:07:49.000 Can't believe people wear them all day.
01:07:51.000 Yeah.
01:07:53.000 So...
01:07:53.000 Anyway, so I know a thing or two about keeping people alive in a vacuum, you know?
01:07:58.000 Right.
01:07:58.000 So, you know, we designed the life support system for keeping humans alive in the vacuum of space, which is very difficult.
01:08:07.000 So we know quite a lot about what it takes to keep people alive.
01:08:11.000 So you don't want to feed people 100% oxygen for an extended period of time.
01:08:18.000 This is not good for you.
01:08:19.000 Well, 80% of the people they put on ventilators died.
01:08:22.000 Yeah.
01:08:23.000 So, in fact, I actually posted about that because I called doctors in Wuhan and said, what are the biggest mistakes that you made on the first wave?
01:08:32.000 Those were early on.
01:08:32.000 And they said, we put far too many people on intubated ventilators.
01:08:36.000 So then I actually posted on Twitter at the time and said, hey, what I'm hearing from Wuhan is that they made a big mistake in putting people on intubated ventilators for an extended period.
01:08:47.000 And that this is actually what is damaging lungs, not COVID. It's the treatment.
01:08:52.000 The cure is worse than the disease.
01:08:54.000 And people yelled at me and said, I'm not a doctor.
01:08:57.000 I'm like, yeah, but I do make spaceships with life support systems.
01:08:59.000 What do you do?
01:09:01.000 I like that.
01:09:02.000 I twiddle knobs.
01:09:03.000 I'm like, okay, great.
01:09:04.000 Rock on.
01:09:06.000 Well, again, there was this very bizarre narrative that you had to believe everything that the government was telling you.
01:09:13.000 You had to believe everything the CDC was telling you.
01:09:16.000 And that even as time went on and we realized, hey, it looks like this came from a fucking lab.
01:09:23.000 Like, even as time went on, disputing that would get you banned.
01:09:27.000 It would get you kicked off of YouTube.
01:09:28.000 I think, to this day, there's certain things you're not allowed to say in regards to the vaccine on YouTube.
01:09:36.000 As I said, the only media that does not have crazy censorship at this point is X. Yeah.
01:09:44.000 That I'm aware of.
01:09:45.000 Everyone else, everything else is censored.
01:09:49.000 Spotify isn't.
01:09:50.000 That's why this can...
01:09:51.000 Good for Daniel Ek.
01:09:53.000 Oh, Daniel Ek's the man.
01:09:54.000 Yeah, he's great.
01:09:54.000 I love that dude.
01:09:55.000 And, you know...
01:09:56.000 I think more companies should follow suit.
01:09:59.000 I don't think it has to be this way.
01:10:00.000 Fortunately for us, they're in Sweden.
01:10:03.000 In Stockholm, Sweden, they have a very different perspective.
01:10:06.000 You've got Stockholm syndrome.
01:10:07.000 On all this shit.
01:10:10.000 Yeah.
01:10:11.000 What is wild about nitrogen is that most of the nitrogen for fertilizer we suck out of the air.
01:10:17.000 Sorry, what do you mean?
01:10:18.000 Most of the nitrogen for fertilizer we suck out of the air.
01:10:21.000 Yeah, yeah.
01:10:22.000 Actually, one of the big inventions in chemistry was binding nitrogen.
01:10:28.000 Nitrogen is actually fairly inert, so it's quite hard to actually pull nitrogen out of the air and bind it into ammonia.
01:10:37.000 Basically, the process for creating ammonia was actually a very important thing.
01:10:40.000 The Haber method.
01:10:41.000 Yeah.
01:10:41.000 Fritz Haber.
01:10:42.000 Same guy who invented Zyklon gas.
01:10:45.000 But it was actually very important to bind nitrogen from the air to fertilizer.
01:10:53.000 So that actually was, frankly, a life-saving invention at scale because you just run out of nitrogen.
01:11:06.000 So...
01:11:10.000 Pure nitrogen is a low energy state, so to try to bind it into a fertilizer requires a lot of energy to do that.
01:11:16.000 It's quite tricky.
01:11:17.000 So that was a very important breakthrough.
01:11:19.000 Yeah, I read that 50% of the nitrogen in most people's body comes from that method.
01:11:27.000 50% of the nitrogen in most people's body that they've consumed from food.
01:11:32.000 Oh yeah, yeah, because of the fertilizer.
01:11:34.000 That might be true.
01:11:38.000 It was a fundamental problem for most of civilization: how do you get nitrogen for the plants?
01:11:47.000 The limiting factor, in fact, even in the rainforest, is bound nitrogen.
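For reference, a rough stoichiometry sketch of the ammonia synthesis being described; the N2 + 3 H2 -> 2 NH3 reaction and the molar masses are standard chemistry supplied here, not figures quoted in the episode.

```python
# Haber-Bosch overall reaction: N2 + 3 H2 -> 2 NH3 (iron catalyst, high temperature and pressure).
# How much ammonia does a tonne of nitrogen pulled out of the air yield?
M_N2, M_H2, M_NH3 = 28.0, 2.0, 17.0   # approximate molar masses, g/mol

n2_tonnes = 1.0
h2_tonnes = n2_tonnes * (3 * M_H2) / M_N2     # 3 mol of H2 consumed per mol of N2
nh3_tonnes = n2_tonnes * (2 * M_NH3) / M_N2   # 2 mol of NH3 produced per mol of N2

print(f"{n2_tonnes:.1f} t N2 + {h2_tonnes:.2f} t H2 -> {nh3_tonnes:.2f} t NH3")
# 1.0 t N2 + 0.21 t H2 -> 1.21 t NH3
```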
01:11:52.000 When you do eventually colonize Mars, what's the idea in terms of terraforming?
01:12:00.000 Is it contained ecosystems that are under domes?
01:12:04.000 What are you planning on doing to make it habitable?
01:12:10.000 Well, at first, you would have to have a life support system because Mars has a low-density atmosphere, only about 1% the density of Earth, and it's primarily CO2. Now, over time, you can terraform Mars.
01:12:25.000 Terraform means make it like Earth, essentially.
01:12:29.000 And if you warm Mars up, there's a bunch of frozen CO2 that will evaporate, densify the atmosphere, and you'd actually want kind of global warming on Mars.
01:12:39.000 Because Mars is about 50% further away from the Sun than the Earth.
01:12:44.000 So it gets a bit less than half the solar energy that Earth does.
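A sketch of the inverse-square arithmetic behind "less than half the solar energy"; Mars' mean distance of about 1.52 AU is a standard figure supplied here, not a number from the episode.

```python
# Solar flux falls off with the square of the distance from the Sun.
EARTH_DISTANCE_AU = 1.0
MARS_DISTANCE_AU = 1.52   # mean Mars-Sun distance, in astronomical units

relative_flux = (EARTH_DISTANCE_AU / MARS_DISTANCE_AU) ** 2
print(f"Mars receives roughly {relative_flux:.0%} of Earth's solar flux")  # ~43%
```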
01:12:48.000 And it's believed at one point in time, Mars had a much different environment, right?
01:12:54.000 It appears highly likely that Mars had liquid oceans a long time ago.
01:13:00.000 There's a lot of ice.
01:13:01.000 So Mars is covered in ice.
01:13:06.000 And now the ice is then covered in dust mostly except at the poles.
01:13:11.000 So there's a lot of ice.
01:13:13.000 In fact, I believe if Mars was warmed up, you'd have an ocean about a mile deep on 40% of the planet.
01:13:20.000 So it's quite a lot of water.
01:13:23.000 And do we think that it was like that at one point in time?
01:13:26.000 The evidence suggests that it is most likely that Mars had liquid water.
01:13:31.000 What's the prevailing theory of its demise?
01:13:34.000 Well, just over time, the solar system cooled.
01:13:37.000 So Earth used to be much hotter.
01:13:38.000 Like, the very early Earth was like molten rock.
01:13:43.000 You know, so really almost nothing could survive in the beginning.
01:13:46.000 We were just a ball of lava.
01:13:48.000 We're still mostly a ball of lava.
01:13:51.000 We're like creme brulee.
01:13:52.000 There's a thin crust and it's very hot mushy rock underneath.
01:14:00.000 Technically, that rock is in a semi-solid state, but as soon as it gets to a low pressure, like pops out of the ocean, you have a volcano, obviously, with lava.
01:14:10.000 At surface ambient pressure, we're basically covered in liquid rock.
01:14:16.000 A thin crust on liquid rock.
01:14:20.000 Are you aware of the origin myth of the Dogon tribe?
01:14:23.000 No.
01:14:24.000 There's a tribe in...
01:14:25.000 I believe it's a tribe in...
01:14:26.000 I forget what part of Africa.
01:14:28.000 But they believe that they came from Mars.
01:14:31.000 And that there was a civilization that left Mars many, many eons ago.
01:14:38.000 And it's a really weird...
01:14:43.000 It's a really weird theory, because they know some things about Mars.
01:14:48.000 Yeah, I'm pretty sure they didn't come from Mars.
01:14:50.000 Oh yeah, I'm pretty sure too.
01:14:53.000 Do they have any spaceships?
01:14:54.000 If they don't have any spaceships, then I'm like, I don't believe it.
01:14:57.000 If they do have a spaceship, I'll believe it.
01:14:58.000 If you parked a spaceship, how many thousands of years?
01:15:01.000 If you've parked a metal spaceship...
01:15:03.000 Like, if you left a cyber truck in the desert, how many thousands of years do you think it would be there for?
01:15:10.000 Before it's gone?
01:15:11.000 If it got buried in dirt, we'd find it even like a million years from now.
01:15:14.000 A million?
01:15:15.000 Yeah.
01:15:16.000 Wow.
01:15:18.000 Really?
01:15:19.000 Well, what you'd find is...
01:15:20.000 Because it's stainless steel.
01:15:21.000 So it'd have to be some sort of an alloy.
01:15:23.000 It would...
01:15:24.000 It's kind of like...
01:15:25.000 But iron wouldn't, right?
01:15:27.000 Yeah, but you'd have something similar to...
01:15:30.000 Like fossils, basically.
01:15:32.000 Like the fossils...
01:15:34.000 They essentially discolor the rock.
01:15:37.000 So eventually, whatever the fossil is, and sometimes the fossil is like an amber or something like that, where it still does survive more or less intact.
01:15:45.000 But, I mean, there's fossilized, like, dinosaur fossils and tree fossils.
01:15:51.000 Essentially re-mineralized, right?
01:15:53.000 Yeah.
01:15:53.000 Yeah.
01:15:54.000 So you'd see, like, a Cybertruck shape in the rock, basically.
01:15:58.000 Oh.
01:15:59.000 Yeah.
01:16:00.000 But that's it.
01:16:01.000 You wouldn't find the actual Cybertruck shape.
01:16:05.000 No.
01:16:05.000 So if they did have a spaceship and it came here 30,000 years ago?
01:16:09.000 Oh, yeah, yeah.
01:16:10.000 We'd definitely find evidence of it.
01:16:14.000 Hmm.
01:16:14.000 Well, I mean, if it was one spaceship, maybe not.
01:16:17.000 But if it was a lot of them, sure.
01:16:19.000 That is the origin myth of the Dogon tribe, right?
01:16:22.000 Am I getting that right?
01:16:23.000 I didn't say Mars specifically.
01:15:24.000 It's a hidden star near Sirius.
01:16:26.000 Oh, it's somewhere else.
01:16:28.000 Yeah.
01:16:29.000 You cannot be Sirius.
01:15:32.000 Sirius XM? It's just very strange when people have this bizarre origin myth.
01:16:41.000 I wonder who was the first one to tell them they came from stars.
01:16:46.000 And when we eventually do, I mean, how bizarre.
01:16:51.000 Imagine if you're successful, we eventually do colonize Mars.
01:16:55.000 And you're correct.
01:16:57.000 Earth winds up through human folly or natural disaster getting wiped out.
01:17:04.000 And there's only the colony on Mars.
01:17:07.000 And that colony exists for 10, 20,000 years.
01:17:10.000 And they have their origin myth that we all came from Earth.
01:17:15.000 I mean ultimately that's going – if this does happen, you do colonize Mars and Earth does get destroyed, and if a period of time takes place – like look at the period, like at least the conventional timeline of the Great Pyramid, which is 4,500 years ago.
01:17:31.000 5,000 years.
01:17:32.000 Yeah.
01:17:32.000 So that's not that much time.
01:17:35.000 It's not that much time.
01:17:36.000 No, I mean if...
01:17:36.000 It's nothing on the galactic timescale.
01:17:38.000 Right.
01:17:39.000 So if we're talking 20, 30,000 years from now on Mars, people talk about Times Square and what Earth used to be like.
01:17:48.000 I mean, it is...
01:17:51.000 I think there's some debate.
01:17:53.000 It's like, how do you say what the...
01:17:54.000 When did civilization start?
01:17:56.000 And I'd say probably from the first writing.
01:17:58.000 Mm-hmm.
01:18:01.000 And the first writing is only 5,500 years old.
01:18:05.000 It's worth reading about the history of writing, but only 5,500 years.
01:18:10.000 And one has to credit basically the ancient Sumerians who aren't around anymore with the first writing.
01:18:18.000 Are you aware, though, that there's hieroglyphs that depict a history of Egypt that goes back far longer, maybe even 30,000-plus years ago, but archaeologists dismiss it because they think that that's mythical.
01:18:31.000 But non-conventional archaeologists who believe in what's called the Younger Dryas impact theory, that somewhere around 11,800 years ago, civilization was essentially all but wiped out by comet impacts.
01:18:45.000 Okay.
01:18:46.000 And that is the reason why they keep finding these insanely old, huge structures, megalithic structures that are carved out of stone.
01:18:56.000 When you go back to Gobekli Tepe, which is 11,600 years ago, that's an insanely old structure that they didn't even know people were capable of building until they discovered it in the 1990s.
01:19:10.000 So the conventional timeline of people, when you go to 11,600 years ago, was just hunter-gatherers.
01:19:17.000 But now that they have Gobekli Tepe with its 3D carved things.
01:19:22.000 Have you seen Graham Hancock's amazing series on Netflix called Ancient Apocalypse?
01:19:28.000 I know.
01:19:29.000 You should check it out.
01:19:30.000 It's amazing.
01:19:30.000 But it's about that.
01:19:32.000 It's all about how there's a lot of physical evidence of an advanced civilization from far, far, far longer ago than we have conventionally dated, which is ancient Sumer, which we put at about 6,000 years ago.
01:19:47.000 Yeah.
01:19:48.000 It's difficult to date with precision, or at least to within a few hundred years, but roughly 5,500 years is the age of the oldest stone tablet.
01:19:58.000 Yeah.
01:19:59.000 Because if you're an archaeologist, if you were to discover something older than that, you'd be very famous.
01:20:06.000 They really looked hard.
01:20:08.000 Yeah.
01:20:09.000 And 5,500 years really is kind of the...
01:20:15.000 If you say any kind of evidence that I've seen that is actually substantial, writing is 5,500 years old.
01:20:23.000 Yeah, in terms of writing, yeah.
01:20:25.000 Well, what they believe is that there's very little left of this ancient civilization other than things like the pyramids, other than things like the Sphinx.
01:20:34.000 There's a geologist that really stuck his neck out.
01:20:37.000 His name is Dr. Robert Schoch, from Boston University.
01:20:39.000 And what he said was, his theory is that there's deep water erosion all over the temple of the Sphinx, where the Sphinx was carved out of, that is indicative of thousands of years of rainfall.
01:20:51.000 And the last time they had rainfall in the Nile Valley was around 9000 BC. So what he believes is, because back then the whole Nile Valley was a lush rainforest and eventually receded into desert.
01:21:03.000 Okay.
01:21:04.000 Yeah.
01:21:04.000 So that whole area, like even the Sahara, used to be rich rainforest.
01:21:10.000 And it receded into what it is now.
01:21:13.000 But if you go back then, he believes that's when that thing was constructed.
01:21:18.000 And he says the geologists look at it.
01:21:22.000 And if he shows it to them in terms of like just shows an image of the erosion and doesn't tell them where it is, almost all of them will say that's water erosion from thousands of years of rainfall.
01:21:33.000 I think even if you say, like, okay, civilization is like 9,000 years old, it's still nothing.
01:21:41.000 Nothing.
01:21:41.000 Yeah.
01:21:42.000 So, you know, we're still talking about like a very tiny fraction of Earth's existence.
01:21:47.000 Yeah.
01:21:48.000 The geological evidence suggests Earth is about 4.5 billion years old.
01:21:53.000 So human civilization has been around for roughly one millionth of Earth's existence.
01:21:59.000 Yeah.
01:22:00.000 Because we're basically nothing.
01:22:01.000 And even if it's 10,000 years ago.
01:22:03.000 Even if it's 30,000 years ago, it's still nothing.
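To make the "one millionth" remark concrete, a quick ratio check using the 4.5-billion-year age and the civilization ages tossed around above; just arithmetic, nothing beyond what was said.

```python
# What fraction of Earth's history does recorded civilization cover?
EARTH_AGE_YEARS = 4.5e9

for civilization_years in (5_500, 9_000, 30_000):
    fraction = civilization_years / EARTH_AGE_YEARS
    print(f"{civilization_years:>6} years is about 1/{1 / fraction:,.0f} of Earth's age")

# 5,500 years  -> about 1/818,182 (order of one millionth)
# 9,000 years  -> about 1/500,000
# 30,000 years -> about 1/150,000
```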
01:22:05.000 What they're saying, though, is that civilization is insanely fragile.
01:22:10.000 Exactly.
01:22:10.000 And much more fragile than I think we realize.
01:22:13.000 Yes, absolutely.
01:22:14.000 I think we should view civilization as being fragile.
01:22:17.000 Yeah, but we don't.
01:22:19.000 It's one of the weird things about people.
01:22:21.000 Unless the threat is in front of us, it's abstract.
01:22:26.000 Unless it's like, is the pizza here?
01:22:28.000 Oh yeah!
01:22:30.000 Pizza's here.
01:22:31.000 Yeah.
01:22:32.000 Fuck civilization.
01:22:33.000 No, actually, one of my sons, Saxon, he has these profound observations.
01:22:40.000 And he asked me, what was L.A. like 4,000 years ago?
01:22:44.000 I'm like, it wasn't around.
01:22:46.000 And they said, what will it be like 4,000 years from now?
01:22:50.000 Probably buried under rubble, I guess.
01:22:52.000 Probably very similar to what it was like 4,000 years ago.
01:22:54.000 Yeah, exactly.
01:22:56.000 Except less radioactive.
01:22:57.000 And he asked me, did they speak English 4,000 years ago?
01:23:00.000 I'm like, nope.
01:23:01.000 It's like, will they speak English 4,000 years from now?
01:23:04.000 Probably not.
01:23:08.000 Really?
01:23:09.000 No.
01:23:10.000 Never.
01:23:12.000 Why not?
01:23:12.000 Because it's not really good for you.
01:23:15.000 Well, I don't think anyone's going to accuse pizza of being like the healthiest thing in the world.
01:23:19.000 This looks awesome.
01:23:21.000 That does look awesome.
01:23:23.000 You want a plate, Jamie?
01:23:25.000 Yeah.
01:23:26.000 Get in there, sir.
01:23:27.000 Grab a piece.
01:23:28.000 Alright, sick.
01:23:30.000 This is awesome.
01:23:32.000 And what's the name of this pizza place again?
01:23:34.000 Pizza Leon.
01:23:35.000 Pizza Leon?
01:23:36.000 Shout out to Pizza Leon.
01:23:41.000 Oh yeah.
01:23:42.000 That really hits the spot.
01:23:44.000 That's legit.
01:23:46.000 I mean, I'm no Dave Portnoy.
01:23:50.000 Our pizza analyst, he'll probably...
01:23:52.000 I'm not gonna rate it.
01:23:54.000 It's excellent.
01:23:57.000 So Portnoy really gets into pizza?
01:23:59.000 Oh, man.
01:24:01.000 Oh, man!
01:24:05.000 Ever seen Portnoy's videos where he analyzes pizza?
01:24:08.000 Oh, my God.
01:24:09.000 It was like a whole method.
01:24:10.000 Okay.
01:24:10.000 He's got a number system.
01:24:12.000 All right.
01:24:13.000 He's into the crust and the flop and all these different things.
01:24:17.000 Wow.
01:24:17.000 Yeah.
01:24:18.000 Everybody knows the rules.
01:24:20.000 One bite.
01:24:22.000 Yeah.
01:24:23.000 What is his?
01:24:24.000 Everybody knows the rules.
01:24:25.000 One bite.
01:24:26.000 You only get one bite of a pizza?
01:24:27.000 Yeah.
01:24:28.000 One bite to taste it.
01:24:30.000 He bites into it, and then he just starts nodding his head.
01:24:33.000 He's basically like a sommelier of pizza.
01:24:35.000 A sommelier, okay.
01:24:36.000 Yeah.
01:24:37.000 And is there, like, what's his favorite pizza joint?
01:24:40.000 Oh, the favorite one.
01:24:42.000 Everyone wants to know that.
01:24:44.000 It's O's Cheese.
01:24:44.000 We spent so much time on pizza.
01:24:46.000 And it's New Haven, Connecticut.
01:24:48.000 Really?
01:24:48.000 Yeah.
01:24:50.000 For some reason, the Italians that moved to New Haven, Connecticut really figured pizza out.
01:24:55.000 They have insane pizza in New Haven, Connecticut.
01:24:59.000 Yeah.
01:25:00.000 Like, really, like, legendary.
01:25:01.000 I've had it.
01:25:03.000 There was a comedy club I used to work out there called The Joker's Wild.
01:25:06.000 And I had New Haven pizza.
01:25:08.000 Even back then.
01:25:09.000 It's just...
01:25:10.000 Really good pizza.
01:25:12.000 Well, okay.
01:25:13.000 I don't know why, though.
01:25:18.000 It seems like something that could be replicated.
01:25:21.000 Yeah, exactly.
01:25:22.000 Yeah, it's not like rocket ship.
01:25:23.000 Yeah.
01:25:27.000 The thing is...
01:25:29.000 The people that are making pizza are not like the people that are making rocket ships.
01:25:34.000 If they were, they would replicate it.
01:25:37.000 Yeah.
01:25:38.000 They would go, what are these guys doing?
01:25:40.000 Let's back engineer it.
01:25:41.000 Sure.
01:25:42.000 Can't be that hard.
01:25:44.000 You know?
01:25:45.000 All these secret sauces and shit.
01:25:47.000 Well, what the fuck?
01:25:48.000 What's in there?
01:25:49.000 Yeah.
01:25:51.000 Well, I was hungry.
01:25:54.000 It's good though, right?
01:25:55.000 Mm-hmm.
01:25:57.000 The combination...
01:25:59.000 A pineapple and anchovy.
01:26:01.000 Surprisingly good, right?
01:26:02.000 Yeah.
01:26:03.000 What if this is the first pineapple and anchovy pizza they ever made there?
01:26:08.000 I don't see it on their menu very often.
01:26:10.000 Nobody's ordering that shit.
01:26:11.000 I used to order it when I would order it for delivery.
01:26:14.000 They'd go, are you sure?
01:26:17.000 I'm like, don't you think I know what I'm doing?
01:26:19.000 I'm ordering it.
01:26:22.000 Now with extra arsenic.
01:26:24.000 Mmm.
01:26:24.000 This is good.
01:26:26.000 This is good.
01:26:29.000 I do know why I don't eat this stuff, though.
01:26:32.000 Because I cannot stop.
01:26:35.000 That's a problem.
01:26:36.000 This pizza's too delicious.
01:26:37.000 Oh, so good.
01:26:38.000 High calorie, high carbohydrate foods.
01:26:42.000 Once they start going down the hatch, they don't want to stop.
01:26:44.000 Carbs are the devil.
01:26:46.000 Oh, they are the devil.
01:26:49.000 Remember?
01:26:49.000 They used to be the base of the food chain.
01:26:52.000 Yeah.
01:26:52.000 The whole pyramid.
01:26:53.000 The whole pyramid.
01:26:54.000 The bottom of the food pyramid was carbs.
01:26:56.000 The food pyramid.
01:26:56.000 What would the Egyptians say?
01:27:00.000 We're out of our fucking minds.
01:27:02.000 What are you eating?
01:27:05.000 Yeah, exactly.
01:27:06.000 The stuff is just...
01:27:07.000 That's the bizarre thing about...
01:27:08.000 It's like Fourth of July in your mouth.
01:27:11.000 How many human beings eat just processed food?
01:27:14.000 Like the majority of their diet is processed food.
01:27:16.000 Like the entire center of the supermarket is shit.
01:27:19.000 You really probably shouldn't eat.
01:27:20.000 Except every now and then.
01:27:26.000 It's good, Jamie.
01:27:27.000 I bet.
01:27:27.000 You want one?
01:27:28.000 I don't.
01:27:30.000 You seem offended.
01:27:32.000 It's two of my favorite.
01:27:34.000 I don't like either of those things, honestly.
01:27:36.000 Have you tried it?
01:27:37.000 It's not bad.
01:27:40.000 I feel like you should try it.
01:27:41.000 I understand.
01:27:42.000 I feel like I should, too.
01:27:42.000 This show's all about trying it.
01:27:43.000 We shot an arrow at a car.
01:27:45.000 I'm just not going to like it, and I don't want to offend Pizza Leon, and I like that place.
01:27:50.000 I wouldn't be offended.
01:27:54.000 It's hard to miss with pizza, frankly.
01:27:56.000 Unlike the creeps who used to run Twitter, I don't care if someone has a different opinion than me.
01:28:01.000 I just don't like fish, to be that honest with you.
01:28:03.000 I've tried it many times, and I still haven't liked it yet.
01:28:07.000 I'm going to be won over.
01:28:09.000 You like any fish?
01:28:10.000 Not really, no.
01:28:11.000 You ever go fishing?
01:28:12.000 I can do crab meat, yeah, but I don't like the whole product.
01:28:14.000 Do you like sushi?
01:28:15.000 No.
01:28:16.000 I'm going to try some of Phillip's on Thursday, though.
01:28:20.000 You're going to try it, but you don't like fish.
01:28:22.000 I'm like DC. I'm afraid of that whole thing.
01:28:24.000 Mmm.
01:28:26.000 I've talked to Phillip about it in detail.
01:28:29.000 Okay.
01:28:30.000 I don't eat fish that often.
01:28:32.000 I like it.
01:28:33.000 Yeah.
01:28:35.000 It's particularly good when you catch it yourself and eat it fresh.
01:28:39.000 Fresh fish really is way better.
01:28:41.000 Way better.
01:28:41.000 Way better.
01:28:42.000 Yeah, fish goes bad quick, unlike meat.
01:28:44.000 Yeah.
01:28:46.000 Like, meat, you can, like, let it sit around for a while.
01:28:50.000 Kind of marinate.
01:28:50.000 Before you cook.
01:28:51.000 You don't marinate fish, I think.
01:28:53.000 Well, they do.
01:28:54.000 They actually dry-aged fish.
01:28:56.000 Okay.
01:28:57.000 Yeah, a lot of places dry-aged fish.
01:28:59.000 Dry-aged fish.
01:29:00.000 Yeah, I didn't know that.
01:29:01.000 I wasn't aware of that, but that's actually a common practice to dry-aged fish for certain sushi dishes, like really gourmet places.
01:29:09.000 Have you ever been to sushi by scratch?
01:29:11.000 Is that in town?
01:29:12.000 Yeah, it's just outside of town.
01:29:14.000 He used to run sushi bar, and then he sold...
01:29:16.000 It's my friend Phillip Frankland Lee.
01:29:18.000 He's a Michelin star chef.
01:29:19.000 He used to run sushi bar in town.
01:29:22.000 He sold that, and then he opened up Sushi by Scratch.
01:29:25.000 But because of the contractual obligations, he has to be outside of the Austin proper, so he's about 30 miles away.
01:29:32.000 It's fucking fantastic.
01:29:34.000 If you like sushi, it's the best sushi you'll ever eat.
01:29:37.000 Okay.
01:29:37.000 I mean, it's really insane.
01:29:38.000 I say that with...
01:29:40.000 You eat it and you're like, Jesus Christ, the best sushi of all time.
01:29:45.000 Sushi by scratch?
01:29:46.000 Mm-hmm.
01:29:48.000 They have ones in Miami.
01:29:51.000 Where do they have it now?
01:29:52.000 Chicago.
01:29:53.000 They got a bunch of them.
01:29:54.000 He's not allowed to do it in Austin?
01:29:56.000 No, it's not in Austin proper.
01:29:58.000 I think once his contract is up, you know, he had a non-compete in Austin for like three years or something.
01:30:04.000 I don't know how long it was.
01:30:05.000 Okay.
01:30:07.000 Maybe eventually he'll open up one in Austin, but it's about 30 minutes outside of it.
01:30:10.000 What city is it again?
01:30:11.000 Cedar?
01:30:12.000 Yeah, Cedar Creek.
01:30:13.000 It's out at Lost Pines area.
01:30:16.000 Yeah.
01:30:16.000 It's 30 minutes.
01:30:17.000 It's no big deal.
01:30:19.000 Sushi by Scratch.
01:30:20.000 Got it.
01:30:20.000 Yeah.
01:30:20.000 It's the shit.
01:30:21.000 Let me know if you want to go.
01:30:22.000 I'll hook it up.
01:30:22.000 Yeah, sure.
01:30:23.000 Yeah.
01:30:23.000 It's awesome.
01:30:24.000 It's really worth it.
01:30:25.000 If you like sushi, it's a mind blower.
01:30:27.000 It's a mind blower.
01:30:28.000 And it's omakase.
01:30:30.000 So, like, you sit down, they bring you food, that's it.
01:30:34.000 That looks good.
01:30:34.000 It's pretty fucking cool.
01:30:36.000 You've been to Matsuhisa in LA? Yes.
01:30:38.000 The omakase there is great.
01:30:39.000 Yes, that place is outstanding.
01:30:41.000 Yeah.
01:30:41.000 Yeah.
01:30:42.000 I love good sushi.
01:30:44.000 And one of the things that's amazing is how many good restaurants there are in Austin.
01:30:48.000 I mean, for a city that's relatively small...
01:30:51.000 Good restaurants per capita is excellent.
01:30:54.000 Amazing.
01:30:54.000 Yeah.
01:30:55.000 And they're so good.
01:30:56.000 There's so many artists and restaurants.
01:30:58.000 We just found a new one that Brian Simpson told us about.
01:31:01.000 It's called Bacalar.
01:31:02.000 It's this Mexican restaurant that's in town.
01:31:05.000 Fantastic.
01:31:06.000 Really good.
01:31:07.000 They just opened up.
01:31:08.000 I think they only opened for like six weeks.
01:31:10.000 So shout out to them.
01:31:11.000 Just ate there the other night.
01:31:12.000 It's just there's so many good places here.
01:31:15.000 You can't have a bad restaurant in this town.
01:31:17.000 You will go under quickly.
01:31:19.000 The competition is strong.
01:31:20.000 There's so much competition and there's so much variety.
01:31:24.000 A lot of good restaurants in town.
01:31:26.000 Yeah.
01:31:27.000 That's amazing.
01:31:28.000 You know, I remember we were hanging out at your place, like, way back in the day when I first moved here, and you said something very prophetic when all this was happening, like, Austin's about to go supernova.
01:31:37.000 Yeah.
01:31:38.000 Kind of did.
01:31:39.000 Yeah.
01:31:40.000 It's Boomtown.
01:31:41.000 Yeah, it really is.
01:31:42.000 Yeah.
01:31:43.000 Legitimately.
01:31:44.000 Yep.
01:31:44.000 And with the Gigafactory, I mean, how many jobs have you brought into Austin from that factory alone?
01:31:50.000 Well, we're about 10,000 direct-ish, and then I think 50,000 indirect.
01:31:57.000 That's a lot.
01:31:58.000 That's pretty fucking awesome.
01:31:59.000 Yeah.
01:32:00.000 I mean, there's only so many people in the greater Austin area.
01:32:03.000 I know, that's crazy.
01:32:06.000 In fact, the kind of limiting factor for growth is just finding enough people.
01:32:16.000 This is terrible for sound.
01:32:19.000 Like the subtitle, Chewing Sounds.
01:32:23.000 This is my last piece.
01:32:25.000 The subtitle, Chewing Sounds.
01:32:27.000 People are going to have to deal with it.
01:32:28.000 This is my last piece.
01:32:29.000 Are you taking it away?
01:32:30.000 You son of a bitch.
01:32:31.000 You son of a bitch.
01:32:32.000 I've certainly eaten my fill.
01:32:34.000 Yeah, take it away.
01:32:35.000 I'll keep going.
01:32:36.000 I'll eat the whole fucking bowl.
01:32:38.000 That's the problem with me and carbs.
01:32:39.000 Yeah, carbs are awesome.
01:32:41.000 Yeah, I know.
01:32:43.000 It's like...
01:32:44.000 I feel good, though.
01:32:46.000 I mean, this is the dopamine explosion from carbs.
01:32:48.000 Yeah, I'm happy I did it.
01:32:50.000 I mean, once in a while, it's fine.
01:32:52.000 Once in a while.
01:32:53.000 For me, it's once in a great while, but...
01:32:56.000 Well, there's like Tim Ferriss has that, you know, you have one meal a week or something.
01:33:00.000 Yeah, that's good.
01:33:04.000 Yeah, one meal a week, I'll go with sweets.
01:33:08.000 I'll have an ice cream sundae or some shit.
01:33:10.000 Ice cream sundaes are great.
01:33:12.000 Oh, they're fucking amazing.
01:33:13.000 Fucking amazing.
01:33:14.000 Yeah.
01:33:15.000 I don't think most people know what an ice cream sundae tastes like unless they smoke marijuana.
01:33:19.000 And then you're like, oh, this is a different thing.
01:33:23.000 It's an amazing invention.
01:33:25.000 Whoever figured out the hot fudge and then the whipped cream on top of it, what a combo.
01:33:29.000 Incredible.
01:33:30.000 Yeah.
01:33:30.000 Yeah.
01:33:32.000 Maybe.
01:33:32.000 Oh, yeah.
01:33:33.000 Maybe.
01:33:34.000 Great idea.
01:33:34.000 I don't know if that's possible.
01:33:36.000 Dairy Queen?
01:33:37.000 For sure.
01:33:37.000 Dairy Queen?
01:33:38.000 Well, I mean, that's what's open right now.
01:33:40.000 What's legit?
01:33:41.000 It's 1130. I know.
01:33:43.000 There's like that pizza restaurant, that pizza chain.
01:33:46.000 They have the best ice cream sundae that I've ever seen.
01:33:48.000 It's a giant one.
01:33:49.000 Out here?
01:33:51.000 God, I'm trying to remember the name of the...
01:33:53.000 It is a chain, but it's not like a big chain.
01:33:56.000 People that are upset right now because they're listening like on the treadmill and they're hearing us chewing like, these motherfuckers are going to get ice cream.
01:34:02.000 We're seeing pizza and ice cream sundaes.
01:34:06.000 While people try to lose weight.
01:34:08.000 Yeah.
01:34:08.000 Sweating out.
01:34:10.000 Oh, Buca di Beppo.
01:34:12.000 Oh, yeah!
01:34:14.000 They've got a gigantic ice cream sundae.
01:34:17.000 Everything they have is gigantic.
01:34:18.000 It's amazing.
01:34:19.000 Yeah.
01:34:20.000 I worked there for a long time.
01:34:21.000 Is it?
01:34:21.000 It's actually really good.
01:34:22.000 No kidding!
01:34:23.000 It's amazing.
01:34:23.000 Oh, bro, they have that rigatoni, that rigatoni with the meat sauce, and oh my god.
01:34:29.000 Rigatoni is great.
01:34:30.000 It's fantastic.
01:34:31.000 Yeah.
01:34:31.000 And it's very reasonably priced for the amount of food you get.
01:34:34.000 Yeah, there's one in Palo Alto.
01:34:35.000 You got a crazy amount of food.
01:34:36.000 Yeah, it's really good.
01:34:39.000 Yeah.
01:34:39.000 I like all the photographs on the wall and everything.
01:34:41.000 There's one of those down the street from our old studio.
01:34:43.000 In L.A.? In Woodland Hills.
01:34:44.000 Remember?
01:34:45.000 Yeah.
01:34:46.000 Yeah.
01:34:47.000 Yeah, it's legit.
01:34:48.000 Yeah.
01:34:48.000 Yeah.
01:34:51.000 I used to take my kids to the one at the Grove in L.A. What, if anything, is out near SpaceX?
01:34:58.000 What do you guys have out there?
01:35:00.000 In L.A. or here?
01:35:01.000 Out here.
01:35:02.000 We've got the Starlink Terminal Factory.
01:35:09.000 For the Starlink v4 terminals, we build them here.
01:35:12.000 We build the version 3 terminals and the version 3 mini.
01:35:21.000 We do part of the production, or actually, I should say, we've done all of the production of the terminals thus far in L.A., and we'll continue to do production in L.A., but we've also just completed a second factory in Bastrop, just about 20 minutes from here.
01:35:37.000 And then SpaceX is where you make the launches.
01:35:41.000 What part of Texas is that?
01:35:44.000 Well, the Starship stuff is in South Texas, near the border, right on the Rio Grande.
01:35:50.000 And how did you pick that location?
01:35:53.000 I was just literally looking at satellite images.
01:35:59.000 And for going to orbit, you kind of need to – you want to launch eastward so that you can take advantage of Earth rotation to get to orbit.
01:36:08.000 So it's a little counterintuitive that reaching orbital velocity...
01:36:14.000 Getting to orbit is about your speed parallel to the Earth's surface.
01:36:19.000 It's like how fast are you zooming around Earth?
01:36:22.000 It's not a...
01:36:23.000 The gravity at the altitude of the space station is almost the same as it is on the ground.
01:36:29.000 The reason the space station stays up there, well, "up there" is kind of the wrong terminology.
01:36:33.000 It's actually moving around the Earth at 17,000 miles an hour.
01:36:37.000 So the space station goes around the Earth roughly every 90 minutes.
01:36:44.000 And because Earth is turning, the speed at which the surface is moving, the way you experience that velocity, is roughly 1,000 miles an hour at the equator.
01:36:59.000 So the closer you are to the equator, the more you can take advantage of Earth's rotation to reach orbital velocity.
01:37:06.000 And since it's rotating eastward, you want to be on the east coast to make it easier to get to orbital velocity.
01:37:16.000 So you need a section of coast that's on the east, fairly southward, that is not occupied.
01:37:24.000 So, almost really all of Florida, except for Cape Canaveral, is wall-to-wall houses on the beach.
01:37:33.000 There's almost no section of the Florida coast that doesn't have houses, except for Cape Canaveral, which is a government base.
01:37:42.000 So, one of the few spots that wasn't occupied was the area just adjacent to the border with Mexico.
01:37:52.000 And it just wasn't super well suited to holiday homes.
01:38:00.000 And there was at one point a development that was going to take place, but then a hurricane came and destroyed the entire place and, in fact, rearranged the land so some of the plots were underwater.
01:38:11.000 So it's a tough spot to build a home, and that's why it was unoccupied.
01:38:19.000 So we needed a piece of—and it needs to be U.S. territory because if we go outside the U.S., there are export restrictions because rocket technology is an advanced weapons technology.
01:38:30.000 You can't just, you know, arbitrarily go to another country.
01:38:36.000 So it needed to be U.S. land, east coast, and fairly southward.
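A rough numerical sketch of the orbital-mechanics point above; the gravitational parameter, Earth radius, and ISS altitude are standard textbook values supplied here, not numbers quoted in the episode.

```python
import math

# Why you launch eastward, from as far south as practical: Earth's spin gives you a
# head start toward the ~17,000 mph needed for low Earth orbit.
MU_EARTH = 3.986e14       # Earth's gravitational parameter, m^3/s^2
R_EARTH = 6_371_000.0     # mean Earth radius, m
ISS_ALTITUDE = 420_000.0  # typical ISS altitude, m
MPS_TO_MPH = 2.23694

r = R_EARTH + ISS_ALTITUDE
v_orbit = math.sqrt(MU_EARTH / r)            # circular orbital speed, v = sqrt(mu / r)
period_min = 2 * math.pi * r / v_orbit / 60  # orbital period in minutes
v_equator = 2 * math.pi * R_EARTH / 86_164   # surface speed from one rotation per sidereal day
g_ratio = (R_EARTH / r) ** 2                 # gravity at ISS altitude relative to the surface

print(f"Orbital speed:            {v_orbit * MPS_TO_MPH:,.0f} mph")    # ~17,100 mph
print(f"Orbital period:           {period_min:.0f} minutes")           # ~93 minutes
print(f"Equatorial surface speed: {v_equator * MPS_TO_MPH:,.0f} mph")  # ~1,040 mph
print(f"Gravity at ISS altitude:  {g_ratio:.0%} of surface gravity")   # ~88%
```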
01:38:43.000 That's fascinating that rockets...
01:38:45.000 It's one of the few spots that exist like that.
01:38:47.000 That rockets fall into the category of weapons technology.
01:38:50.000 Yeah, intercontinental ballistic missiles.
01:38:53.000 I mean, it makes sense.
01:38:55.000 Yeah, we could drop a rocket anywhere.
01:38:57.000 Nobody could stop us.
01:38:59.000 Wow.
01:39:00.000 That is crazy about the space station, too.
01:39:03.000 Yeah.
01:39:04.000 It's going 17,000 miles an hour.
01:39:07.000 Yes.
01:39:09.000 I mean, you've seen the videos of the rocket landing, right?
01:39:12.000 Yes.
01:39:12.000 It's amazing.
01:39:13.000 It's very precise.
01:39:14.000 It's pretty fucking amazing.
01:39:15.000 We could make it land basically anywhere.
01:39:17.000 No.
01:39:18.000 Or not land.
01:39:20.000 Yeah.
01:39:21.000 I mean, it doesn't have to turn on the thrusters to slow down.
01:39:26.000 What is it like to try to juggle these different things in your mind on a daily basis?
01:39:33.000 Like, what is it like to try to juggle X, Tesla, SpaceX, all these different things at the same time?
01:39:43.000 It's a lot for a human brain to handle.
01:39:46.000 Yeah.
01:39:47.000 I would imagine.
01:39:48.000 Yeah.
01:39:50.000 It strains my meat computer.
01:39:53.000 Goodness.
01:39:54.000 I mean, do you need something like that though?
01:39:56.000 Does your meat computer need more problem solving than the average one?
01:40:03.000 I mean, is this something like, if you only had one thing to work on, do you think you would get bored?
01:40:10.000 Or you would get distracted?
01:40:12.000 Or you would not be satisfied?
01:40:15.000 Like, do you need these things to be so complex and have so many of them simultaneously juggling?
01:40:23.000 Because you didn't pick three easy ones.
01:40:25.000 You picked three of the fucking hardest things you could ever get into.
01:40:28.000 You already detailed how difficult manufacturing is.
01:40:31.000 Rockets, duh.
01:40:32.000 It's one of the craziest things.
01:40:34.000 Not only that, but completely innovative rockets that land.
01:40:40.000 That never happened before, so you're doing that.
01:40:43.000 And then you said, you know what?
01:40:44.000 We've got to save humanity.
01:40:46.000 Let me go spend $44 billion on Twitter.
01:40:49.000 Man, that was expensive.
01:40:51.000 What's it worth?
01:40:52.000 What do you think it was actually worth?
01:40:58.000 Everything.
01:40:59.000 Yeah, not for the market.
01:41:01.000 Right?
01:41:02.000 I mean, like, for humans, yes, I agree with you.
01:41:05.000 I mean, I really genuinely do think this, and I've said this many times publicly, I think you did humanity an immense service.
01:41:13.000 And that if that didn't happen, the narrative of this country would have gone further and further down that road to the point where people would have been scared to speak their mind.
01:41:22.000 And they would have been scared.
01:41:23.000 And it changes the way people communicate.
01:41:26.000 Yeah.
01:41:38.000 Absolutely.
01:41:49.000 People, I guess, are afraid of being ostracized.
01:41:53.000 Fear of being ostracized, I think, is probably the biggest issue.
01:41:57.000 And they're just being totally shut down.
01:42:01.000 Where you have no outlet.
01:42:03.000 And you can just basically disappear except for in-person meetings.
01:42:14.000 Yeah, it was important to me to have at least one social media outlet that wasn't canceling people.
01:42:26.000 What I really enjoy is reading the tweets.
01:42:30.000 I guess you still have to call them tweets.
01:42:33.000 Posts or whatever.
01:42:33.000 I don't have a good word for it.
01:42:35.000 Yeah, you can't say the exes.
01:42:38.000 All my exes live in Texas.
01:42:40.000 Reading the words, I should say.
01:42:43.000 I enjoy reading the words of people who proclaimed that they were leaving and going over to Threads.
01:42:49.000 Yeah, yeah.
01:42:51.000 That's an interesting thing about Momentum.
01:42:54.000 Very difficult to start a whole new social media platform.
01:42:58.000 Even one that initially got like, what do threads get?
01:43:02.000 Like some crazy number of initial people signing up for it.
01:43:06.000 But it just dropped off within like a couple of weeks.
01:43:09.000 Now it's a fucking ghost town.
01:43:11.000 Yeah.
01:43:12.000 It's like...
01:43:13.000 Wild.
01:43:13.000 Eerily quiet.
01:43:14.000 It's wild.
01:43:15.000 I mean, Zuck himself doesn't post.
01:43:18.000 That's what's crazy.
01:43:19.000 You gotta use your own product.
01:43:21.000 It's interesting, though, because they're sneaking them in now in Instagram.
01:43:25.000 They sneak a little thread in there?
01:43:26.000 Because every now and then I'll see something, and go, that's interesting, and I click on it.
01:43:29.000 Oh, you motherfucker.
01:43:30.000 And it opens up threads for me.
01:43:32.000 I'm like, you got me.
01:43:34.000 Because they're integrated.
01:43:36.000 I don't use Instagram.
01:43:38.000 It's fascinating that...
01:43:40.000 I'm sure you don't.
01:43:40.000 Why would you?
01:43:41.000 If I bought X or Twitter, whatever, I wouldn't use anything else either.
01:43:45.000 Yeah, but I didn't use Instagram for a while.
01:43:49.000 I mean, there was a time where I was posting on Instagram, but I found myself doing selfies, and I'm like, what the hell is wrong with me?
01:44:01.000 Why am I posing for selfies to get likes?
01:44:04.000 This is crazy.
01:44:05.000 Bizarre.
01:44:07.000 So then I was like, you know, if you pose for selfies on Twitter, people would jump all over you, you know?
01:44:15.000 Yeah.
01:44:16.000 They would.
01:44:18.000 That's true.
01:44:18.000 They were like, what's wrong with you?
01:44:20.000 That's so true.
01:44:21.000 Yeah, people are, like, way more lenient on Instagram for some strange reason.
01:44:25.000 Yeah.
01:44:26.000 It's like pretty pictures, basically.
01:44:28.000 Yeah.
01:44:30.000 Pretty pictures and a lot of bullshit.
01:44:32.000 There's a lot of weirdness that comes with Instagram, like filters.
01:44:37.000 I've caught grown men using filters on their pictures.
01:44:41.000 It's very strange.
01:44:42.000 You know, I am concerned that, say, Instagram actually leads to more unhappiness, not less, in the sense that it just looks like everyone's having a great time and is way better looking than they really are.
01:44:59.000 And so you're like, man, everyone's like good looking and having a great time.
01:45:02.000 And then you sort of compare yourself to that and it's like, damn, I'm not as good looking and I seem to be sad a lot.
01:45:11.000 And then you're like, man, you know, I think it could make you kind of depressed.
01:45:16.000 Yeah, well, and also you're a grown man and you experience this.
01:45:19.000 You're also very intelligent and you experience this.
01:45:22.000 Imagine being a young kid.
01:45:23.000 Jonathan Haidt documented that in The Coddling of the American Mind.
01:45:27.000 There's a direct correlation between the invention of social media and its ubiquitous use and self-harm amongst kids, particularly girls.
01:45:36.000 It's really bad for girls.
01:45:39.000 Around 2007-ish, there's this big uptick in suicide, self-harm, and depression.
01:45:49.000 Yeah.
01:45:50.000 People can't just, like, make themselves be better looking.
01:45:53.000 There's, like, a limit.
01:45:54.000 Right.
01:45:55.000 Yeah, and then surgery.
01:45:58.000 Yeah.
01:45:59.000 There's a big uptick in people getting their jaws reshaped and shit.
01:46:02.000 I mean, yeah, that's too bad.
01:46:05.000 So, I don't know.
01:46:06.000 I think...
01:46:11.000 I think, like, is Instagram a net happiness generator or not?
01:46:18.000 I'm not sure it is.
01:46:19.000 Speaking of distorted images, have you seen the court artist's drawing of Sam Bankman-Fried?
01:46:24.000 I mean, it's almost like they lost money or something.
01:46:26.000 What the fuck happened?
01:46:27.000 It looks like it melted.
01:46:29.000 Have you seen it?
01:46:30.000 It looks like a supermodel.
01:46:32.000 Oh, what?
01:46:33.000 SBF does?
01:46:34.000 Yes!
01:46:35.000 The guy who drew him.
01:46:37.000 One of them I saw...
01:46:39.000 Maybe there's more than one artist.
01:46:40.000 Maybe there's more than one artist, because some of those ones I saw were unflattering.
01:46:44.000 He looked like an anime superhero.
01:46:46.000 You're joking.
01:46:47.000 No, no, like perfect chiseled jawline.
01:46:50.000 Ridiculous.
01:46:51.000 He looked like he lost 30 pounds, started working out.
01:46:53.000 Look at this.
01:46:54.000 Look at that!
01:46:55.000 Are you kidding?
01:46:56.000 What the fuck, man?
01:46:57.000 Look at that guy on the left.
01:46:58.000 That guy looks like Superman.
01:47:01.000 Doesn't he?
01:47:02.000 Look how hot that guy is!
01:47:03.000 I almost feel like it's not accurate.
01:47:05.000 It's like Clark Kent or something.
01:47:06.000 It might be.
01:47:07.000 I feel like someone's fucking with us.
01:47:08.000 There's a few other pictures when I googled, like, you know.
01:47:10.000 There's some rough pictures, though.
01:47:12.000 It seems like someone's fucking with us, because that guy's handsome as fuck.
01:47:15.000 Because there's this one that's not the same.
01:47:17.000 No, that one's terrible.
01:47:18.000 That one's not the same.
01:47:18.000 What is that?
01:47:19.000 That's not real.
01:47:20.000 That's Satan's drawing.
01:47:21.000 But that one right there, if that is real, it's like, come on.
01:47:26.000 That's not real.
01:47:27.000 This is bullshit.
01:47:28.000 That's bullshit.
01:47:28.000 Because look at what they did with the girl.
01:47:31.000 Oh, wow.
01:47:31.000 Look at that picture there.
01:47:32.000 Even that picture is hilarious.
01:47:33.000 But Caroline looks like she's melting.
01:47:35.000 Yeah.
01:47:36.000 I don't know.
01:47:37.000 I have to, I mean...
01:47:38.000 But she's probably massively depressed.
01:47:41.000 I mean, I bet, like, that's also, like, it's the artist's interpretation of the energy she's giving off in court.
01:47:47.000 I mean, she has to rat on her boyfriend, and she's already pleaded guilty, and, you know, for a lesser sentence, she's going to rat him out.
01:47:54.000 Well, I mean, I don't know who SBF's PR team is, but they're doing an incredible job.
01:48:01.000 For real?
01:48:02.000 Well, I mean, the dude ripped tons of people off and stole their money.
01:48:07.000 And yet he's basically getting, you know, back rubs from the press.
01:48:12.000 Well, don't you think that's because of the amount of money that he donated?
01:48:16.000 Yeah, I don't know.
01:48:16.000 I don't know what the deal is.
01:48:17.000 It might have something to do with that.
01:48:19.000 I'm not that cynical.
01:48:21.000 I generally don't think people are influenced by money.
01:48:25.000 Well, I don't know what he's doing.
01:48:26.000 Something's going on.
01:48:27.000 The number of articles that I've seen where it's basically a misunderstood philanthropist is ridiculous.
01:48:39.000 Well, your bullshit meter went off when he was offering money to buy in with you on Twitter, correct?
01:48:46.000 Well, a large amount.
01:48:48.000 Yeah.
01:48:51.000 A lot of people fell for his bullshit.
01:49:00.000 First of all, I hadn't really heard of the guy.
01:49:03.000 I'm like, who is this guy?
01:49:05.000 What did he do?
01:49:06.000 He's in the Bahamas?
01:49:07.000 That's pretty sus to begin with.
01:49:11.000 If you're on a tropical island... finance organizations on tropical islands generally have a bad track record.
01:49:19.000 And he's involved in crypto.
01:49:20.000 And crypto is scams.
01:49:22.000 The scam probability in crypto is high.
01:49:25.000 It's high.
01:49:26.000 I'm not saying it's all scams.
01:49:29.000 I would leave that distinction for NFTs.
01:49:36.000 That's like 80% scam.
01:49:38.000 Except for Beeple.
01:49:39.000 Beeple's legit.
01:49:40.000 But it gives you digital art.
01:49:42.000 Yeah, Beeple's stuff is great.
01:49:44.000 But I mean, the funny thing is that the NFT is not even on the blockchain.
01:49:48.000 It's just a URL to the JPEG. So it's not even... you should at least encode the JPEG in the blockchain.
01:49:57.000 Because if the URL, if the company housing the image goes out of business, you don't have the image anymore.
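For readers who want to see the mechanics of that point: a minimal sketch of the difference between an NFT whose on-chain record is just a URL pointing at an off-chain JPEG and one whose image is encoded directly as a self-contained data URI. The example values and helper name below are hypothetical, not taken from any specific NFT contract.

```python
import base64

# What most NFTs actually store on-chain: just a URI string pointing at a
# server someone else runs. If that host goes away, the image goes with it.
# (hypothetical example value)
offchain_token_uri = "https://example-nft-host.com/metadata/1234.json"

def encode_image_onchain(jpeg_bytes: bytes) -> str:
    """The alternative described above: pack the JPEG itself into a
    self-contained data: URI, so the artwork doesn't depend on any host."""
    payload = base64.b64encode(jpeg_bytes).decode("ascii")
    return f"data:image/jpeg;base64,{payload}"

if __name__ == "__main__":
    fake_jpeg = b"\xff\xd8\xff\xe0" + b"\x00" * 16  # stand-in bytes, not a real image
    print(offchain_token_uri)                        # fragile: depends on the host staying up
    print(encode_image_onchain(fake_jpeg)[:60] + "...")  # survives on its own
```

One likely reason most projects don't do the second option is cost: storing the image bytes on-chain is far more expensive than storing a short URL.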
01:50:04.000 I never understood it.
01:50:05.000 I tried so hard.
01:50:06.000 I tried so hard.
01:50:08.000 I have a friend who made millions selling NFTs.
01:50:12.000 I'm like, okay, save that money because eventually people are going to figure this out.
01:50:19.000 With art, there is a fair bit of money laundering and tax avoidance.
01:50:24.000 So some of these things that seem inexplicable… Like Hunter Biden?
01:50:29.000 Are you going to talk about Hunter Biden's paintings?
01:50:32.000 Did he actually sell paintings for large sums of money?
01:50:34.000 Immense sums of money.
01:50:35.000 Okay.
01:50:36.000 Hundreds of thousands of dollars.
01:50:37.000 That seems unlikely to be a legit transaction.
01:50:41.000 Unlikely, right?
01:50:42.000 Unlikely.
01:50:42.000 Yeah, very unlikely.
01:50:44.000 The work's not bad, though, I've got to say.
01:50:46.000 For that kind of bullshit art, it's not that bad.
01:50:49.000 Yeah.
01:50:50.000 I mean, the thing is that it's hard to price art because it's in the eye of the beholder.
01:50:55.000 So how can you say whether something— Like, this is his stuff.
01:50:58.000 How much does that go for?
01:51:00.000 $225k.
01:51:00.000 A steal!
01:51:03.000 It's not bad.
01:51:04.000 It's not that bad.
01:51:06.000 That's not that bad.
01:51:07.000 I mean, I would buy that.
01:51:08.000 I wouldn't buy it for $225k, but I'll buy it for $5k.
01:51:12.000 Yeah.
01:51:13.000 It's not bad.
01:51:14.000 Like, look, a lot of crackheads are good artists.
01:51:17.000 Like, you know, you get crazy on drugs and splatter some shit around and you got a unique vision.
01:51:22.000 That's not bad.
01:51:22.000 That's not bad.
01:51:23.000 That is not bad at all.
01:51:25.000 How much does that one go for?
01:51:28.000 So he's probably a legit...
01:51:30.000 Meanwhile, we're going to find out he had a ghost artist.
01:51:32.000 I mean, I suspect he's not untalented.
01:51:37.000 Right.
01:51:37.000 So, I mean, I actually don't have any issues with his lifestyle or anything.
01:51:43.000 There's his stuff.
01:51:44.000 That stuff's not bad.
01:51:45.000 Like, whatever that dog-wolf thing is on the wall, that's pretty dope.
01:51:48.000 Yeah.
01:51:49.000 I like that.
01:51:51.000 What does it say underneath it?
01:51:54.000 Can you read it?
01:51:58.000 I can't read it.
01:52:00.000 Either way, not bad!
01:52:02.000 Yeah, it's not bad, it's not bad.
01:52:03.000 Have you ever heard the theory that the entire modern art movement, like Jackson Pollock and the like, was a CIA psyop?
01:52:16.000 I mean, I have heard that, but I have not heard any evidence for it.
01:52:19.000 I read the article a couple years ago.
01:52:21.000 Okay.
01:52:21.000 At least.
01:52:22.000 A modern art psyop?
01:52:24.000 Yeah.
01:52:24.000 Every day I wake up, there's another psyop.
01:52:27.000 But it was fascinating.
01:52:29.000 Yeah, every day.
01:52:31.000 There's always something.
01:52:32.000 Cat with a tinfoil hat.
01:52:33.000 No.
01:52:34.000 When was this?
01:52:35.000 Beginning of 2020. Yeah.
01:52:38.000 Was modern art really a CIA psyop?
01:52:40.000 But, I mean, if you can get someone to spend that kind of money on that kind of shit, like the Jackson Pollock stuff... So, here it is.
01:52:52.000 Preeminent culture war, Cold War, what is it?
01:52:55.000 Okay.
01:52:55.000 The relationship between modern art and American diplomacy began during World War II. Museum, MoMA, Battle for Hearts and Minds.
01:53:08.000 It's an interesting article.
01:53:09.000 So, I read the article a couple years ago.
01:53:11.000 I can't remember what their argument was.
01:53:14.000 But it's one of those things.
01:53:15.000 It starts there.
01:53:16.000 Like, they spent money to buy paintings directly from artists in the 40s.
01:53:21.000 Okay.
01:53:22.000 Even though modern art and American diplomacy were of a piece... Soviet propaganda asserted that the United States was a culturally barren capitalist wasteland. To make the case for American cultural dynamism,
01:53:37.000 the State Department in 1946 spent $49,000 to purchase 79 paintings directly from American modern artists and mounted them in a traveling exhibition called Advancing American Art.
01:53:49.000 That exhibition, which made stops in Europe and Latin America, included work from artists like Georgia O'Keeffe and Jake.
01:53:57.000 Georgia O'Keeffe is the lady who makes vaginas, right?
01:53:59.000 Isn't that her work?
01:54:01.000 I don't know.
01:54:01.000 I think that is some of what she does, yeah.
01:54:04.000 I think she makes, like, vagina, like, flowers and stuff.
01:54:11.000 I mean, I think so.
01:54:12.000 It was pretty good, but...
01:54:13.000 Well, the Jackson Pollock one, I was like, how did...
01:54:16.000 Come on.
01:54:17.000 What happened there?
01:54:18.000 That one is just wild, because I really think I could do that.
01:54:22.000 Some of this shit is, like, "Jackson Pollock and the CIA teamed up to win the Cold War."
01:54:27.000 Yeah.
01:54:28.000 Okay.
01:54:29.000 I think that's the article I read because that's 2017. I don't remember.
01:54:34.000 Well, I think art is very much in the eye of the beholder.
01:54:37.000 It's like what does it make you – how does it make you think and feel?
01:54:41.000 And if it makes you think and feel in a way that you like, then it's good.
01:54:45.000 Yes, absolutely.
01:54:46.000 But it's also a way to launder money.
01:54:50.000 Yes.
01:54:52.000 A number of these like very high-priced art things are tax evasion and money laundering.
01:54:57.000 Have you ever seen the documentary on the lost Leonardo?
01:55:03.000 No.
01:55:04.000 That's crazy.
01:55:05.000 That's crazy.
01:55:06.000 MBS purchased this.
01:55:07.000 Okay.
01:55:08.000 And he purchased it for some insane amount of money.
01:55:10.000 It turns out it probably isn't Leonardo da Vinci's.
01:55:13.000 Not only that, but the vast majority of the painting was made by a modern-day woman who recreated everything.
01:55:21.000 Okay.
01:55:28.000 Repainted the image in the style of da Vinci.
01:55:32.000 Okay.
01:55:33.000 And it's very sketchy because if you look at the original painting, it was all fucked up and missing paint and there was many layers.
01:55:39.000 Someone had painted over it.
01:55:41.000 It's really an incredible documentary because it just shows how much fuckery is involved in these high-dollar transactions.
01:55:50.000 Yeah.
01:55:50.000 And I think it was the most expensive painting that was ever sold.
01:55:54.000 I think it went for something in the neighborhood of $400 million.
01:55:58.000 $450 million.
01:55:59.000 Yeah.
01:56:01.000 The Lost Leonardo: a new film solves the mystery of the world's most expensive painting.
01:56:05.000 Is the $450 million Salvator Mundi a fake?
01:56:08.000 This film, featuring tearful sycophants, sneering experts, and dodgy dealers on a secretive superyacht, may finally settle the great da Vinci controversy.
01:56:20.000 Apparently there's multiple layers and different styles of painting involved in it, and when they do some sort of a comprehensive examination, whether it's like, you know, I don't know what kind of imagery they're using, but they're doing something where they could say, like, this has been painted many times and fucked with.
01:56:38.000 It might have originally been one of Leonardo's students.
01:56:43.000 It might not have been Leonardo.
01:56:44.000 There's not a real clear...
01:56:46.000 It's dated to that time?
01:56:47.000 Yeah.
01:56:48.000 But wasn't it Sotheby's that sold it?
01:56:51.000 I think so.
01:56:51.000 Yeah.
01:56:52.000 So they were kind of aware that there were some shenanigans with this piece.
01:56:57.000 But they also were aware that this was...
01:56:59.000 This was Three Lemons.
01:57:03.000 They were going to hit the jackpot with this bad boy.
01:57:05.000 Yeah.
01:57:06.000 And so they went through with it and...
01:57:09.000 Yeah.
01:57:09.000 Christie's.
01:57:10.000 Christie's, that's it.
01:57:12.000 The other one.
01:57:13.000 Christie's was aware of it.
01:57:15.000 It's a fascinating documentary.
01:57:16.000 I don't know what's right.
01:57:17.000 I don't know what's wrong.
01:57:18.000 It might be real.
01:57:19.000 But at the very least, it's been retouched mostly by this woman.
01:57:24.000 Isn't it like 90% of the painting?
01:57:27.000 There's some very high percentage of the painting that was actually made by this woman. They show her; she worked on it forever, for years and years, painstakingly retouching this piece of art. Which is very odd, yeah, that they do that.
01:57:49.000 Because like, wouldn't you just want it all fucked up and old?
01:57:52.000 Sure.
01:57:52.000 I mean, that's the real piece of art.
01:57:54.000 Yeah.
01:57:55.000 The real piece of art is not like some woman in 2001 painting over it.
01:58:00.000 That's just crazy.
01:58:03.000 Yeah.
01:58:04.000 Um...
01:58:05.000 I mean, I enjoy art for the aesthetics, but not for the name value.
01:58:11.000 Yeah, I feel the same way.
01:58:12.000 I enjoy art just because of how it makes you feel.
01:58:21.000 It's a cool thing.
01:58:22.000 I mean, obviously my studio is filled with it.
01:58:23.000 You go outside, I love art.
01:58:25.000 It's everywhere out there.
01:58:27.000 I'm just, I love it.
01:58:29.000 Have you seen, oh, the woman, have you seen the piece that she made of you?
01:58:32.000 This Melania Blackman, have you seen that drawing that she did?
01:58:36.000 Is that the weird one?
01:58:37.000 It's everything.
01:58:38.000 It's enormous.
01:58:40.000 She draws it, and you're essentially made up of all these different characters and different things.
01:58:47.000 See if you can find her work with this.
01:58:49.000 She did one of me, and she did one of Anthony Bourdain that I bought that's out there as well.
01:58:54.000 She's super talented, this woman.
01:58:58.000 So this is her.
01:58:59.000 She's really hot, too.
01:59:00.000 Okay.
01:59:02.000 See if you can find her standing next to it.
01:59:06.000 There it is.
01:59:11.000 Instagram banned it?
01:59:12.000 Oh, that's right!
01:59:13.000 There was something that happened where Instagram did something.
01:59:18.000 What happened?
01:59:21.000 So Instagram has not only banned me from promoting this artwork, but also shadow banned me altogether after I posted my latest piece.
01:59:29.000 This is probably after you said that you wanted to fight Zuck.
01:59:32.000 Well, actually, I was just...
01:59:34.000 The Zuck fight is funny, because he was posting all these fight videos, and then someone on Twitter at the time said, Hey, you should fight Zuck.
01:59:46.000 And I said, well, I'm willing to fight if he is.
01:59:50.000 And then Zuck posted, I think on Instagram or something, name the place or something.
01:59:58.000 Something to that effect.
01:59:59.000 And I was like, okay, how about the Vegas Octagon?
02:00:05.000 And then Italy actually was willing to let us use the Colosseum.
02:00:13.000 So I was like, well, let's – can't turn that down.
02:00:18.000 And then I was like, well, if it's going to be in the Coliseum, I like the UFC and everything, but we shouldn't have tons of ads and UFC branding on the Coliseum, because it's a historical place.
02:00:31.000 It's a place of great history.
02:00:32.000 You don't want to just have it be all, like, NASCAR. And then Zuck pulled out.
02:00:43.000 He used the pull-out method.
02:00:44.000 So he pulled out of it?
02:00:46.000 He pulled out of it?
02:00:47.000 Yeah, yeah.
02:00:49.000 Oh.
02:00:50.000 What was the narrative?
02:00:52.000 What did you hear, Jamie?
02:00:52.000 I don't remember.
02:00:54.000 Well, listen, I've only...
02:00:55.000 So he was like, oh, no, it's got to be UFC rules.
02:00:58.000 I'm like, well, okay, we're going to have UFC rules in the Coliseum.
02:01:01.000 It's fine.
02:01:02.000 But we just don't want to have...
02:01:03.000 We've got to respect the historical integrity of the place.
02:01:06.000 The Coliseum just seems like the coolest place to do it.
02:01:09.000 That's why?
02:01:09.000 I mean, you're like gladiator, you know?
02:01:11.000 Come on.
02:01:12.000 And if they said it's okay...
02:01:14.000 So he just wants it to be in the actual UFC, like in Vegas?
02:01:18.000 So then he said, oh, well, you know, he accused me of not being serious.
02:01:22.000 And I said, look, listen, at the end of the day, I'll fight you any place, anywhere, under any rules.
02:01:29.000 That's what I said.
02:01:31.000 So, you know, he said name the place.
02:01:36.000 I'm like, I'm happy to fight him in a house, on a mouse, with a louse.
02:01:41.000 We'd like go full Dr. Seuss here.
02:01:43.000 Now, how much time?
02:01:45.000 I'm way bigger than him.
02:01:46.000 This is unfair.
02:01:47.000 I don't think you should fight me.
02:01:49.000 Because you're so much bigger than him?
02:01:50.000 Yeah, I'm like 50% heavier than him.
02:01:52.000 Yeah.
02:01:52.000 I've got my patented walrus move.
02:01:55.000 I just lie on him.
02:01:57.000 The walrus move?
02:01:58.000 Well, you know, walrus doesn't need martial arts training.
02:02:01.000 Right.
02:02:02.000 Because it's really big.
02:02:04.000 You don't want to go wrestling a walrus.
02:02:06.000 Right.
02:02:07.000 Because it's going to roll on you.
02:02:08.000 Have you ever rolled with someone who's much smaller than you that does jiu-jitsu?
02:02:12.000 Yes.
02:02:13.000 Lex, yeah.
02:02:14.000 Yeah, yeah.
02:02:14.000 No, I did train in martial arts.
02:02:17.000 How much have you trained, personally?
02:02:20.000 A decade of it.
02:02:21.000 Well, you did a lot of karate, right?
02:02:24.000 Judo?
02:02:24.000 Judo, Kyokushinkai karate.
02:02:27.000 Yeah.
02:02:28.000 I did some jujitsu, taekwondo, street fighting, which was involuntary.
02:02:36.000 I think I'd be decent.
02:02:39.000 I did martial arts competitions when I was a teenager.
02:02:42.000 Really?
02:02:42.000 Interesting.
02:02:43.000 So look at you, Georges St-Pierre, Elon, John Danaher, the great master, and Lex Fridman.
02:02:49.000 But, like, Lex is...
02:02:50.000 I think he's, like, 20% heavier than Zuck.
02:02:54.000 And I'm way bigger than Lex.
02:02:55.000 Yeah.
02:02:57.000 That's why they have weight categories.
02:02:58.000 Oh, yeah.
02:02:59.000 Yeah.
02:03:01.000 Did you get a chance to talk to Danaher at all?
02:03:04.000 The guy on the left?
02:03:05.000 That guy's fascinating.
02:03:06.000 Yeah, no, no, he's...
02:03:07.000 Do you know his history?
02:03:08.000 No, he's from New Zealand or something.
02:03:10.000 He was a professor of philosophy at Columbia and fell in love with Jiu Jitsu.
02:03:14.000 I mean fell in love with it to the point where he was sleeping on the mats and teaching all day long.
02:03:17.000 He's an obsessive.
02:03:19.000 He is a real Kaizen disciple.
02:03:23.000 He seems Zen.
02:03:24.000 Oh, man.
02:03:25.000 He is one of the most unique characters I've ever met in my life.
02:03:28.000 One of the most brilliant men I've ever met.
02:03:30.000 And he's completely dedicated to Jiu Jitsu.
02:03:33.000 And he has raised through this...
02:03:36.000 Particularly this one disciple Gordon Ryan who also lives here in Austin.
02:03:39.000 He's the greatest jiu-jitsu competitor of all time.
02:03:42.000 There's no question and he's only 28. He might even be 27 or maybe he's 28 now.
02:03:47.000 But I mean he is universally regarded as the greatest of all time and he is John's greatest student.
02:03:52.000 And the two of them together because Gordon has insane work ethic.
02:03:56.000 They work 365 days a year.
02:03:58.000 They do not take any days off.
02:03:59.000 They train every day.
02:04:00.000 That's Gordon to the right of John.
02:04:03.000 With the crazy beard.
02:04:04.000 He doesn't really have that hair.
02:04:05.000 He bleaches it.
02:04:05.000 But the two of them together are literally an unstoppable combination.
02:04:09.000 He looks, like, ripped.
02:04:10.000 He's pretty ripped.
02:04:12.000 Yeah, he's a combination of gigantic, brilliant, and insanely dedicated with the most incredible instructor that's ever existed.
02:04:20.000 I mean, John Danaher is universally regarded as the greatest jiu-jitsu instructor alive.
02:04:25.000 And his student is universally regarded as the greatest jiu-jitsu competitor alive.
02:04:30.000 Yeah.
02:04:31.000 And it is because John is like a complete...
02:04:35.000 like a guy out of a superhero book.
02:04:37.000 Like, you wouldn't...
02:04:38.000 you're not gonna find another one of those.
02:04:41.000 A guy with a genius level IQ who's one of...
02:04:43.000 I mean, you talk to him, he's fascinating.
02:04:46.000 And he is obsessed with combat sports, warfare, like strategy.
02:04:54.000 Brilliant guy.
02:04:56.000 Brilliant guy.
02:04:57.000 I was impressed when I met him.
02:05:00.000 If I was fighting someone where I was not much bigger than them, then I would be more concerned.
02:05:10.000 How much time would you need to prepare?
02:05:12.000 I don't need any time.
02:05:14.000 No time at all?
02:05:15.000 No.
02:05:16.000 How's your cardio?
02:05:17.000 That will not be a factor.
02:05:20.000 Really?
02:05:21.000 Yeah.
02:05:25.000 What's the likelihood of this actually happening?
02:05:27.000 I'm willing to do it anytime, anywhere, anyplace, any role.
02:05:30.000 Well, I think stating it this way might accelerate this process, especially on this platform.
02:05:35.000 I mean, I challenge him to a duel under any circumstances.
02:05:40.000 Sword fight?
02:05:41.000 Sure.
02:05:42.000 Jesus.
02:05:43.000 That's not necessary.
02:05:44.000 Pistols at dawn.
02:05:45.000 That's not necessary.
02:05:46.000 I think physical, hands down.
02:05:48.000 Nerf guns at noon.
02:05:50.000 Nerf guns at noon.
02:05:51.000 Well, you could slug it out in the metaverse.
02:05:55.000 Yeah.
02:05:56.000 In the real world.
02:05:57.000 Listen, there's just a reason they have weight categories, you know?
02:06:03.000 So...
02:06:06.000 You know, there's a friend of mine who is pretty good at fighting, but she weighs about half of what I do.
02:06:13.000 And I said, let me show you why there's weird categories in fighting.
02:06:19.000 I'm going to do a move called The Walrus, and I'm just going to lie on you.
02:06:24.000 I'm not going to put you in a lock or anything.
02:06:25.000 I'm just going to lie on you.
02:06:27.000 You know, I've positioned myself such that it's hard to get out from under me.
02:06:32.000 And I just want to lie crossways on you and you try to get away.
02:06:36.000 And you won't be able to get away.
02:06:39.000 Because you couldn't.
02:06:41.000 Just, you know, like if a horse falls on you.
02:06:44.000 Right.
02:06:45.000 You can get trapped under a horse.
02:06:46.000 But you're not a horse.
02:06:47.000 What do you weigh, about 230?
02:06:49.000 Yeah, 240, yeah.
02:06:51.000 Yeah.
02:06:53.000 So, no, no, I'm not a horse, but I'm saying in the limit, if something's heavy enough, like, you know, if a horse falls on you and dies, you can get trapped under a horse and not be able to get yourself out.
02:07:01.000 Right.
02:07:02.000 But if someone's good enough, I mean, I'm sure you've seen, like, absolute weight classes in jiu-jitsu where you'll get a 145-pound competitor who strangles a 220-pound competitor, and they're both well-trained.
02:07:13.000 Because if someone is that good...
02:07:14.000 It's unlikely.
02:07:15.000 It's not unlikely.
02:07:15.000 It happens quite often.
02:07:17.000 When you get elite competitors, like elite black belts at the 145, 155 pound weight limit, you'd be shocked.
02:07:25.000 There's a ton of videos of these guys who will strangle much larger black belts.
02:07:30.000 I'm not saying it's impossible, it's just highly unlikely.
02:07:33.000 And if this were not the case, there would not be strict weight categories in martial arts.
02:07:38.000 That is true.
02:07:38.000 That is true.
02:07:39.000 But the reason why they allow absolutes in jiu-jitsu is because it is the thrill of watching these smaller people go against much larger people.
02:07:50.000 And sometimes they win.
02:07:53.000 No, just like armies, people take note when a small army defeats a big army because it is so unusual.
02:08:03.000 Yes.
02:08:04.000 Not because it's normal.
02:08:05.000 Right.
02:08:06.000 You know, if it's, like, two against 10,000, boy, they'd beat those two guys up.
02:08:13.000 Which is more likely what happens.
02:08:15.000 Like, if you're severely outnumbered, you will lose.
02:08:19.000 Almost certainly.
02:08:20.000 So look at the size difference between these two guys.
02:08:23.000 Play it out.
02:08:25.000 This is Mikey Musumeci, who is another fascinating individual.
02:08:30.000 This guy is another super genius who trains every day, 12 hours a day, and he is competing against a black belt in the heavyweight division.
02:08:39.000 Mikey Musumeci might weigh 145 pounds, and he beats this guy.
02:08:44.000 Guy doesn't look to be in super great shape.
02:08:47.000 Well, he's enormous.
02:08:48.000 Yeah.
02:08:50.000 I mean, he's enormous and he's a black belt, so he's skilled.
02:08:52.000 I forget how Mikey wins this.
02:08:54.000 He catches him with something.
02:08:56.000 Yeah, I sort of skipped ahead.
02:08:56.000 I couldn't tell what exactly happened.
02:08:58.000 Get it a little bit further here.
02:08:59.000 I think...
02:09:00.000 What happened here?
02:09:02.000 It seemed like he got penalties or something.
02:09:04.000 Oh, they pushed him out of bounds?
02:09:05.000 I don't know what that was on the side.
02:09:06.000 I think they're out of bounds.
02:09:09.000 Yeah, that's all that was.
02:09:10.000 That's just out of bounds.
02:09:12.000 So scoot ahead and see what happens, what he catches him with.
02:09:17.000 What happened?
02:09:18.000 I'm not saying it's impossible.
02:09:19.000 It's just very unlikely.
02:09:21.000 Oh, so he won by points?
02:09:22.000 Yeah.
02:09:22.000 Oh, okay.
02:09:22.000 So he won by points.
02:09:23.000 I think you're making the point for me here.
02:09:25.000 Yeah.
02:09:26.000 Well, in that case, that guy's strangled a lot of much larger people than him.
02:09:31.000 But again, he's extraordinary.
02:09:33.000 He's a world champion.
02:09:34.000 He's a world champion for ONE, which is this huge organization in Singapore.
02:09:39.000 They do these events where they have all kinds of different martial arts.
02:09:41.000 They have MMA, Thai boxing.
02:09:44.000 Yeah.
02:09:44.000 Yeah.
02:09:46.000 So you would do it under any rules?
02:09:48.000 Sure.
02:09:49.000 All right, let's go.
02:09:51.000 I like the fact that you're interested in doing this.
02:09:53.000 It's fun.
02:09:54.000 It makes it fun.
02:09:56.000 Yeah, I mean, I could, you know, this could be an exercise in hubris.
02:10:03.000 We'll find out.
02:10:05.000 We could find out.
02:10:06.000 I like the fact that Zuck's interested in it, too.
02:10:08.000 I like the fact that he trains so much.
02:10:11.000 Yeah, I mean, there's...
02:10:12.000 There aren't very many ways to actually...
02:10:15.000 If you stick to the rules of jiu-jitsu...
02:10:17.000 Like, for example, if you want to put someone in an arm lock, you have to be able to extend their arm.
02:10:22.000 And if somebody is strong enough that you cannot extend their arm, then you're limited to...
02:10:28.000 Chokes.
02:10:28.000 Chokes.
02:10:29.000 And you can do an arm lock across the groin with both arms and legs.
02:10:34.000 Like Royce Gracie did upside down in the third UFC fight or something like that.
02:10:40.000 Yeah.
02:10:41.000 But he was pretty severely damaged.
02:10:43.000 That was when he fought Kimo, yeah.
02:10:44.000 That enormous guy.
02:10:45.000 Yeah.
02:10:46.000 And he did an upside-down arm lock across the groin because he could not do an arm lock, you know, a sort of side armbar.
02:10:55.000 But after that, people were like, why did that move work?
02:10:57.000 So they were like, we're not going to allow ourselves to get caught in an arm lock across the groin. You know, that was, like, overconfidence, I think.
02:11:09.000 Well, it was exhaustion.
02:11:10.000 I mean, they fought like tooth and nail for something like seven or eight minutes.
02:11:15.000 And, you know, Royce survived and then eventually wore the guy down.
02:11:19.000 Well, when you're a big steroided up guy like that, too, the oxygen depletion, like the amount of oxygen your muscles require, you gas out pretty quickly when you're that big, unless you're insanely conditioned.
02:11:30.000 Yeah, but there's no way for him to do a single arm arm lock.
02:11:37.000 He couldn't do it.
02:11:38.000 Single arm arm lock.
02:11:39.000 If you try to do a lock on the side.
02:11:43.000 So like in side control?
02:11:44.000 So from some legs across your face?
02:11:46.000 Is that what you mean?
02:11:47.000 If you try to do an arm bar on the side, one arm across the knee or across the thigh, you have to be able to extend the arm.
02:12:01.000 You know, your triceps have to be able to exceed the strength of their bicep, is what it comes down to.
02:12:06.000 But if you cannot exceed the strength of their bicep, then you will not be able to do an arm extension.
02:12:11.000 I'm not sure what I'm...
02:12:12.000 Are you talking about like a Kimura, like a straight arm bar?
02:12:15.000 Yeah.
02:12:15.000 Oh, so you're talking about like with just the arms?
02:12:18.000 Where you just have one arm, like, if it's one arm versus one arm.
02:12:24.000 But that's never the case.
02:12:26.000 It's almost always the whole body's engaged.
02:12:29.000 That's... I think that's... well, in judo that's a very common hold, a very common one.
02:12:35.000 A one arm arm bar?
02:12:37.000 Yeah, one arm.
02:12:40.000 You know, you have your one arm around the neck and you take their arm and you extend it across.
02:12:45.000 Oh, I see what you're saying.
02:12:47.000 Okay, so like from a scarf hold.
02:12:49.000 So a scarf hold, you would take the arm and put it over and you would push it down with one arm.
02:12:53.000 Yeah, that's unusual though.
02:12:55.000 Yeah, that's an unusual arm bar.
02:12:57.000 But if one person is much stronger than another, then that's the move.
02:13:02.000 Yeah, but...
02:13:03.000 It's a very fast move because you can take someone right from a throw, drop them on the floor, right into an armbar.
02:13:08.000 Yeah, okay.
02:13:10.000 That's a very specific armbar.
02:13:11.000 Ten seconds.
02:13:12.000 Right.
02:13:12.000 That's a rare armbar.
02:13:13.000 You never see that in MMA. What you do see, though, is the two legs isolate the arm, and then the person grabs a hold of it with the thumb up and uses all of their body. Yeah, yeah.
02:13:25.000 That's what Gracie did.
02:13:26.000 Yeah, that's what most people do when they apply an arm bar.
02:13:29.000 Yeah.
02:13:29.000 Whether it's from the back, like, you know, a lot of people have done that, or whether it's from side control, which is a little more easy because you have control of the body.
02:13:38.000 Yeah.
02:13:38.000 I mean, it's also just that, you know, UFC is not just jujitsu.
02:13:43.000 You can punch people.
02:13:44.000 Right.
02:13:44.000 That makes a big difference.
02:13:45.000 Big difference.
02:13:46.000 Big difference.
02:13:47.000 Huge.
02:13:47.000 Yeah.
02:13:48.000 Yeah, MMA has changed the ideas of jiu-jitsu, because there are a lot of techniques people do that work well in competition, like when someone's grabbing your leg and you can't just rain down punches on their face.
02:14:00.000 Yeah.
02:14:01.000 There's a lot of unrealistic positions.
02:14:02.000 If somebody's pounding you in the face, it's pretty hard to be chill, you know?
02:14:05.000 Especially if somebody's got gorilla fists in your face.
02:14:09.000 Yes.
02:14:11.000 It's not going to be a good day.
02:14:12.000 Yeah, Carlson Gracie famously had a phrase that if you take a black belt and you punch him in the face, he becomes a brown belt.
02:14:19.000 Punch him again, he becomes a purple belt.
02:14:22.000 Yeah.
02:14:23.000 And so on and so forth.
02:14:25.000 I mean, most people have not been punched in the face.
02:14:27.000 I've been punched in the face.
02:14:29.000 So, you know, it comes as a surprise.
02:14:33.000 Yeah.
02:14:34.000 Yeah, it probably does come as a surprise.
02:14:36.000 I mean, there's also, like, even the UFC has a lot of limitations.
02:14:39.000 Like, you can't do 12 o'clock elbows, you know?
02:14:42.000 That's changing.
02:14:43.000 They're getting rid of that.
02:14:44.000 You can do 12 o'clock elbows now?
02:14:45.000 Yes, finally.
02:14:46.000 Finally.
02:14:46.000 I've been singing that from the top of the roof forever.
02:14:50.000 Okay.
02:14:50.000 It's so nuts.
02:14:51.000 It's so stupid.
02:14:51.000 You know where it came from?
02:14:52.000 No.
02:14:53.000 It came from Big John McCarthy, who was the original UFC referee and pioneer of the sport.
02:14:59.000 He was bringing this to athletic commissions, and they were allowing certain techniques, but one of them they wouldn't allow was the 12-6 elbow, because they saw those late-night karate demonstrations where someone would smash bricks like that.
02:15:11.000 They thought someone would die if they hit him with this.
02:15:13.000 Definitely going to sting.
02:15:15.000 Yeah, but it's not even harder than this one.
02:15:17.000 This one's harder.
02:15:19.000 Because this one, you can throw your body weight into it, and it's a more natural movement.
02:15:24.000 This is an unusual movement.
02:15:26.000 I mean, I'm sure you could train it and get it probably as hard, but I think for most people, for me, I can tell you for sure, this elbow has more power.
02:15:35.000 Well, I think any elbow in the face is going to be a big wake-up call if you've never had an elbow in the face.
02:15:39.000 It fucking sucks.
02:15:40.000 It fucking sucks.
02:15:41.000 Did you watch the Tyson Fury, Francis Ngannou boxing match?
02:15:44.000 Is that the one we bit the ear off?
02:15:46.000 No, that's Mike Tyson.
02:15:48.000 Evander Holyfield.
02:15:49.000 That's from the 90s.
02:15:51.000 Yeah.
02:15:51.000 No, this fight just took place last weekend.
02:15:53.000 Francis Ngannou, who was the UFC heavyweight champion, he vacated the belt so he could take this fight with Tyson Fury.
02:15:59.000 This was his dream fight.
02:16:01.000 Tyson Fury, who's the lineal heavyweight champion.
02:16:03.000 Francis Ngannou had never had a boxing match ever in his life.
02:16:07.000 Had zero boxing matches, but he was the UFC heavyweight champion.
02:16:11.000 Okay.
02:16:12.000 Knocked down Tyson Fury in the third round, beat him up in the eighth round.
02:16:16.000 Most people, including me, thought he should have won the decision, including most boxers, most boxing pundits.
02:16:24.000 And he lost by one point on one judge's scorecard.
02:16:28.000 He won on one judge's scorecard.
02:16:30.000 Another judge, who should go to jail, had it 96-93 for Tyson Fury, which is fucking outrageous.
02:16:36.000 But Francis Ngannou, who is a literal freak of nature.
02:16:39.000 I mean, this guy grew up in Cameroon and was working in the sand mines when he was a child.
02:16:46.000 Like a fucking Conan movie.
02:16:48.000 Like this great warrior.
02:16:50.000 Conan's like pushing the thing around in a circle.
02:16:52.000 He's developing his body, digging in the sand all day.
02:16:56.000 He's supremely physically advanced.
02:16:59.000 He looks fit.
02:17:00.000 He's not just fit.
02:17:01.000 He's the hardest puncher ever measured in all of MMA. There's a machine that we actually have outside at the gym.
02:17:08.000 And if you hit this thing, Francis has hit it harder than any person who's ever lived.
02:17:13.000 Yeah, well look at him.
02:17:14.000 Can I hit it?
02:17:15.000 Yeah, we set it up.
02:17:17.000 I had the record for the kick for a while.
02:17:19.000 Okay.
02:17:19.000 Yeah.
02:17:21.000 A couple people beat it now.
02:17:23.000 But Francis punched, like, it's an insane, it's like getting hit by a fucking car.
02:17:28.000 Sure.
02:17:28.000 And when he dropped Tyson Fury in the third round, you see Tyson's on his back going, what the fuck?
02:17:34.000 And then he realizes, like, because I think he thought he was just going...
02:17:37.000 Yeah, being like hit by a sledgehammer.
02:17:39.000 Well, he thought he was going to run him over because he's the boxing heavyweight champion.
02:17:42.000 He's like, there's no way this guy can box with me.
02:17:44.000 He even said at the beginning of the fight, it's time to go to school.
02:17:46.000 Okay.
02:17:47.000 And then Francis said at the end of the fight, you are a shitty professor.
02:17:51.000 You should watch it.
02:17:52.000 It's a good fucking show.
02:17:54.000 It's on ESPN+. I'm pretty sure you can still get it.
02:17:57.000 Yeah.
02:18:00.000 Anyway, I just, it's...
02:18:04.000 I'm just excited that you're interested in doing it still.
02:18:06.000 Sure.
02:18:10.000 All right.
02:18:12.000 Didn't you fuck your back up doing, like, sumo wrestling?
02:18:16.000 Yeah.
02:18:18.000 What happened there?
02:18:21.000 Still hurts a little bit, actually.
02:18:22.000 I've had, like, four operations.
02:18:24.000 Really?
02:18:25.000 From that?
02:18:25.000 Well, I had, like, some childhood injuries.
02:18:30.000 Like I said, I was in some pretty severe fights as a kid.
02:18:33.000 Like, really, like, I was almost killed at one point.
02:18:35.000 Really?
02:18:36.000 What happened?
02:18:38.000 It was just in school in South Africa.
02:18:40.000 It's a very violent place.
02:18:44.000 So there's been many involuntary fights.
02:18:48.000 It's just the way it was.
02:18:51.000 But anyway, so I had some rugby injuries as well.
02:18:55.000 I saw South Africa won the World Cup in rugby, which is cool.
02:18:59.000 So I think that was not a good starting position.
02:19:05.000 But then the world champion sumo wrestlers, they did kind of like a demo bout for my birthday.
02:19:13.000 And since it was my birthday, I guess they just call up the birthday boy and say, like, hey, do you want to sumo wrestle?
02:19:20.000 This is where it's a similar weight differential.
02:19:23.000 He was 50% heavier than me.
02:19:25.000 So like 360, 370 pounds.
02:19:30.000 And I knew he would take it easy on me in the first round.
02:19:34.000 So the only way I'm going to knock him over is momentum.
02:19:38.000 So I got to basically run at him.
02:19:40.000 So I did.
02:19:42.000 Run at him, did a judo throw, knocked him over, and smashed a disc in my neck in the process.
02:19:48.000 It would be like running at that wall.
02:19:50.000 If you run at a wall, it's going to hurt.
02:19:54.000 Did you have to get it fused or anything?
02:19:56.000 Yeah.
02:19:56.000 Oh, man.
02:19:58.000 You can knock over someone.
02:20:01.000 You can defeat someone bigger than you if you're willing to smash a disc in your neck.
02:20:06.000 Well, if you know what you're doing and you're willing to smash a disc in your neck.
02:20:10.000 Those two things.
02:20:11.000 So he wasn't expecting me to be a total lunatic in round one.
02:20:16.000 And he defeated me obviously in round two and three because he was like, oh, now he knows what to expect.
02:20:22.000 So I had like five minutes of glory and a decade of pain.
02:20:29.000 Now that you've got your neck fused, that creates problems with the upper and lower discs as well, doesn't it?
02:20:36.000 Over time, if there's too much neck rotation, it can damage them.
02:20:41.000 How long ago was this?
02:20:42.000 You had this operation?
02:20:45.000 Man, I had three operations.
02:20:47.000 Well, you got it fused.
02:20:48.000 How long ago?
02:20:49.000 I had two artificial disks, and I'm actually in favor of artificial disks.
02:20:52.000 They put the wrong disk in, but then eventually, for the third one, it was like, let's just fuse it.
02:20:58.000 They put the wrong one in?
02:21:00.000 Yeah, twice.
02:21:03.000 How so?
02:21:04.000 Because I have friends that have artificial disks.
02:21:06.000 Yeah, no, I'm actually in favor of artificial disks.
02:21:08.000 You just need to have the right one.
02:21:09.000 So in my case, at this point I know a lot about it, the C5, C6 right facet is impacting.
02:21:19.000 You know, the facets are like the outriggers.
02:21:21.000 You've got the center core of the spine, the facets are the outriggers, and they're shingled.
02:21:24.000 So they're like, you know, one on top of the other like this.
02:21:27.000 There's a little nerve that goes out in between the C5, C6. And if those vertebrae come close together, they grind the nerve.
02:21:37.000 So they just sort of start shearing the nerve.
02:21:42.000 Now, so my C5-C6 right facet, for certain, it shows up clear as day on, like, a technetium scan.
02:21:52.000 So if you could do like a radioactive scan with technetium, it's very clear where the problem is.
02:22:00.000 So, what should have been done was a simple hinge.
02:22:05.000 Like, you know, basically to move the vertebra, basically the C5 vertebra, back about maybe an eighth of an inch to sort of unload the facet and then put a simple hinge,
02:22:20.000 so just rotation.
02:22:22.000 But I was given what is called a MOBI-C, which is a more mobile disk.
02:22:28.000 The MOBI-C allows not just rotation, but also translation, so it can move back and forth.
02:22:33.000 So that then didn't solve the impacting of the C5-C6, because it could slide.
02:22:41.000 And when it would slide forward, the C5-C6 would bang and crunch the nerve.
02:22:46.000 And what does the normal neck do?
02:22:49.000 Does the normal neck move forward in that way?
02:22:52.000 The disc is like a gummy bear, basically, normally.
02:22:55.000 So it allows rotation and translation.
02:22:58.000 So it's like sitting on like a jello pillow.
02:23:02.000 That's what discs are.
02:23:03.000 Like one of those Bosu balls that people sit on sometimes and they work at a desk.
02:23:06.000 It's like a rubber pillow, basically.
02:23:12.000 So the natural disk allows for rotation and translation.
02:23:17.000 So they basically put in a disk that had too much mobility, and it did not solve the C5-C6 nerve impact.
02:23:25.000 So then the third time around, I was like, listen, I just don't want to take a chance here.
02:23:31.000 Let's just fuse it.
02:23:32.000 Oh, wow.
02:23:34.000 And so it just limits your mobility?
02:23:36.000 No, I'm fine.
02:23:37.000 I'm like, I can look right and left.
02:23:39.000 It's okay.
02:23:40.000 Yeah, that's okay.
02:23:41.000 Yeah.
02:23:42.000 I'm not like, you know, totally stiff necked.
02:23:44.000 No, you don't seem like you're stiff-necked at all.
02:23:47.000 There was a guy who fought in the UFC named Yoel Romero, and he's a real freak, too.
02:23:52.000 An amazing athlete.
02:23:53.000 And he came from the Cuban wrestling program.
02:23:57.000 He was one of the greatest wrestlers that's ever competed, amateur-wise.
02:24:01.000 He had his entire neck fused, and when he runs, his neck doesn't move.
02:24:05.000 It's kind of freakish.
02:24:07.000 You see him running, and his neck looks like it's a stick.
02:24:10.000 And the whole body is like...
02:24:13.000 Like, moving around, but the neck is just locked in place.
02:24:16.000 It's very bizarre to look at.
02:24:18.000 I mean, he runs like a man whose entire neck is fused.
02:24:22.000 Like, watch him.
02:24:23.000 See if you can show the image of him.
02:24:25.000 Here, watch him run.
02:24:27.000 Wow.
02:24:27.000 See how his neck?
02:24:28.000 He's in good shape.
02:24:29.000 Oh, yeah.
02:24:30.000 You think?
02:24:32.000 I mean, you don't get in any better shape than this guy.
02:24:35.000 By the way, he looks like that now and he's 46, 47 years old and still competing at the highest level in Bellator.
02:24:43.000 I mean, he's an unbelievable athlete and one of the most explosive guys that's ever fought in the sport.
02:24:49.000 Just insanely powerful and fast.
02:24:54.000 Cool.
02:24:55.000 So, you compete at the highest level with your neck fused.
02:24:58.000 Yeah, yeah, yeah.
02:24:59.000 I'm not too worried about that.
02:25:00.000 Aljamain Sterling, who was the UFC Bantamweight Champion, he got his neck fused, or got a disc replaced rather in his neck, and then went on to defend his title three times.
02:25:13.000 Still fighting at the highest level with a fake disc in his neck.
02:25:17.000 Well, I guess it'll be okay then.
02:25:19.000 Yeah, well, medical science, pretty fucking incredible what they can do now.
02:25:23.000 You know?
02:25:24.000 I mean, injuries that would have, like, you would have been fucked for the rest of your life just a few decades ago.
02:25:30.000 Yeah.
02:25:31.000 Now you're good to go.
02:25:33.000 Yeah.
02:25:34.000 So I'm excited.
02:25:35.000 We've kind of rekindled this Zuck-versus-Elon fire.
02:25:38.000 I mean, he's chickening out.
02:25:39.000 I don't think he's chickening out.
02:25:40.000 Yeah, he's chickening out.
02:25:42.000 Do you think so?
02:25:42.000 Yeah.
02:25:42.000 Puck, puck, puck.
02:25:43.000 Well, maybe he's listening.
02:25:45.000 Zuck, Zuck, Zuck.
02:25:45.000 Zuck, Zuck, Zuck.
02:25:49.000 I'll goad him into fighting using taunts.
02:25:52.000 It might work.
02:25:53.000 Yeah.
02:25:53.000 I mean, somehow or another you got him to agree in the first place.
02:25:57.000 I was stunned.
02:25:59.000 Surely he will respond to a taunt like that.
02:26:01.000 Yeah, surely.
02:26:03.000 I mean, how can you resist?
02:26:04.000 How can you resist?
02:26:07.000 Exactly.
02:26:08.000 Let's go full schoolyard taunting.
02:26:11.000 What if there was, like, real consequences on the line?
02:26:14.000 Like, what if you guys had a real bet?
02:26:16.000 Okay, sure.
02:26:16.000 Like, the moderation team from X takes over moderation of Facebook if you win.
02:26:20.000 No problem.
02:26:21.000 Sounds good.
02:26:21.000 And if he wins, vice versa.
02:26:23.000 It's a fight for civilization.
02:26:25.000 Yeah.
02:26:25.000 A literal fight for civilization.
02:26:28.000 I mean, I'll do it.
02:26:29.000 Wow.
02:26:31.000 Heavy.
02:26:31.000 True.
02:26:33.000 And you wouldn't even train for this?
02:26:35.000 No, I'd train a little bit.
02:26:36.000 Train a little bit?
02:26:37.000 Yeah.
02:26:38.000 Like how many weeks do you need?
02:26:42.000 I mean, I don't have to train.
02:26:43.000 I could do it like tomorrow.
02:26:46.000 I tried going to his house, actually.
02:26:48.000 Did you really?
02:26:48.000 Yeah, because he lives in Palo Alto.
02:26:51.000 And we're doing some Tesla full self-driving testing.
02:26:55.000 So I'm like, well, I've got to pick a destination.
02:26:57.000 Did you press the button, go do-do, navigate to Zuck's house?
02:26:59.000 Yeah, basically.
02:27:02.000 It's not far.
02:27:03.000 I think he's like three miles away from the Tesla California headquarters.
02:27:07.000 Wow.
02:27:09.000 But I don't know.
02:27:11.000 There's nobody there.
02:27:14.000 You know why?
02:27:15.000 According to a spokesman, he was traveling.
02:27:17.000 Oh, yeah.
02:27:18.000 It would have been wild if he was there.
02:27:20.000 What would you have said?
02:27:21.000 Literally, anytime.
02:27:23.000 I just thought it was funny to go like, you know, I'm coming over to your house.
02:27:29.000 I'm going to get you.
02:27:31.000 Well, it's even more funny when it's two of the richest guys in the world.
02:27:34.000 Yeah.
02:27:34.000 Yeah.
02:27:36.000 So, anyway, he didn't answer.
02:27:39.000 No.
02:27:40.000 Too bad.
02:27:42.000 It's just fun.
02:27:44.000 It's fun, and I'm glad you're just for the fun of it.
02:27:48.000 I mean, I think it would be...
02:27:50.000 Well, actually, Dana White thinks it would be a really big ticket fight.
02:27:55.000 It would be fucking huge.
02:27:57.000 Yeah.
02:27:57.000 I would commentate on that.
02:27:58.000 Yeah, I mean, the proceeds could go to charity and stuff.
02:28:00.000 It would be fucking huge.
02:28:01.000 Yeah, it'd be crazy.
02:28:02.000 People would want to see what the hell's going on.
02:28:03.000 Oh, my God.
02:28:04.000 It would be fucking huge.
02:28:05.000 Yeah.
02:28:05.000 Yeah.
02:28:06.000 It would be really crazy.
02:28:08.000 Crazy.
02:28:08.000 Like, if they close the thing, and Bruce Buffer is in there, it's...
02:28:15.000 Let's go.
02:28:15.000 The place would go fucking bananas.
02:28:18.000 Bananas.
02:28:18.000 Yeah.
02:28:19.000 Yeah.
02:28:20.000 Let's do it.
02:28:21.000 Does it have to be in the Coliseum?
02:28:22.000 Would you agree?
02:28:23.000 No, I'll do it anywhere.
02:28:24.000 I literally said anywhere, anytime, any rules.
02:28:28.000 You know where you should do it?
02:28:29.000 The Sphere in Las Vegas.
02:28:30.000 The Sphere's great.
02:28:31.000 I was just there.
02:28:32.000 The Sphere's amazing.
02:28:33.000 It's amazing.
02:28:33.000 It's amazing.
02:28:34.000 I've only seen it on the outside, but it looks incredible.
02:28:37.000 The inside's even better.
02:28:38.000 I've seen video.
02:28:38.000 I haven't seen it live.
02:28:40.000 I was there on Saturday night, and it's awesome.
02:28:43.000 Like, it's really good.
02:28:45.000 I think it might be the best show on earth.
02:28:46.000 Oh, yeah.
02:28:47.000 Yeah.
02:28:47.000 I mean, if you have visuals that accompany the music, like someone like Roger Waters, whose show is, like, insanely visual, something like that in the Sphere would be incredible.
02:28:59.000 The art in the show that I saw on Saturday night was incredibly good.
02:29:02.000 Who was it?
02:29:03.000 I don't know who did all the art, but...
02:29:05.000 What was the band?
02:29:06.000 Oh, it was U2. Oh, yeah, yeah, yeah.
02:29:07.000 But, I mean, I've been to U2 concerts, and U2's great, but the sphere is really...
02:29:14.000 Like, if U2 hadn't been there, it would still be great.
02:29:17.000 Yeah.
02:29:17.000 Yeah, U2's...
02:29:18.000 It's like a...
02:29:20.000 You know, like, on Sunday, there was a movie.
02:29:23.000 They played a movie there.
02:29:25.000 Whoa.
02:29:26.000 Where do you look?
02:29:28.000 In all directions.
02:29:30.000 Oh my god.
02:29:32.000 I mean, it's like actually being in virtual reality.
02:29:36.000 In fact, it was so wild, the Saturday night one especially, that you step outside after the show and you're like, why is reality so boring?
02:29:48.000 Oh, so this is Postcard from Earth.
02:29:50.000 It's Darren Aronofsky's thing.
02:29:52.000 Oh, wow.
02:29:53.000 And you're watching this.
02:29:54.000 I saw that on Sunday.
02:29:55.000 And it covers the whole ceiling.
02:29:57.000 Oh, my God.
02:29:58.000 It's really great.
02:29:59.000 You saw it there?
02:30:00.000 Yeah.
02:30:01.000 Oh, my God.
02:30:03.000 That's incredible.
02:30:04.000 Yeah, it's really good.
02:30:05.000 That would be the greatest place to see a movie ever.
02:30:07.000 I mean, like, the Saturday night show.
02:30:10.000 And obviously U2 adds to it.
02:30:12.000 But like I said, the Sphere is really special in and of itself.
02:30:16.000 I think it's probably the best show I've ever seen.
02:30:19.000 Wow.
02:30:20.000 Yeah.
02:30:20.000 I can imagine.
02:30:21.000 I mean, it's just what an amazing venue and what an incredible idea.
02:30:24.000 Yeah, it's really cool.
02:30:25.000 To have the entire ceiling, all screened.
02:30:28.000 I gotta hand it to Dolan.
02:30:29.000 That was pretty great.
02:30:30.000 Amazing.
02:30:30.000 Yeah.
02:30:31.000 Absolutely amazing.
02:30:32.000 I'm so glad that he did that.
02:30:34.000 And then also the outside.
02:30:35.000 Oh, that's incredible.
02:30:37.000 They really play with perspective.
02:30:38.000 God, that's incredible.
02:30:39.000 Because it's round, but it doesn't look round.
02:30:41.000 Right.
02:30:42.000 So it'll be like, it'll simulate like a square, like all sorts of shapes.
02:30:47.000 Wow.
02:30:48.000 And then also the outside of it.
02:30:50.000 Like, they had the outside of it.
02:30:51.000 It looked like Earth.
02:30:52.000 It's just amazing.
02:30:53.000 It's really cool.
02:30:54.000 Yeah.
02:30:55.000 Super cool.
02:30:56.000 I like these epic things.
02:30:57.000 That's the venue.
02:30:58.000 You know.
02:30:59.000 It's really cool.
02:31:00.000 That's the venue.
02:31:01.000 That's where it needs to go down.
02:31:02.000 In the sphere?
02:31:03.000 Sure.
02:31:03.000 Yeah, that's even better than the Coliseum.
02:31:05.000 Okay.
02:31:05.000 Yeah.
02:31:06.000 Especially if the United States falls, that would be our Coliseum.
02:31:09.000 This would be our Rome.
02:31:11.000 Vegas would be our Rome.
02:31:12.000 I mean, the sphere did remind me of being like a modern-day Coliseum.
02:31:16.000 Yeah.
02:31:17.000 Yeah, like a modern-day version.
02:31:19.000 Like, what would they do?
02:31:20.000 Yeah, with our technology currently.
02:31:22.000 Yeah.
02:31:22.000 And then Vegas is, like, kind of Rome-esque in the sense that when we think about, like, the hedonism of Rome, its final days, that's Vegas.
02:31:32.000 Yeah, yeah, totally.
02:31:33.000 Yeah.
02:31:33.000 Perfect.
02:31:34.000 Yeah.
02:31:35.000 Let's go.
02:31:36.000 Let's go.
02:31:38.000 Are you not entertained?
02:31:40.000 You will fucking be entertained.
02:31:42.000 You'll be fucking entertained.
02:31:43.000 So entertained.
02:31:44.000 No doubt.
02:31:46.000 Let's do it.
02:31:48.000 There was a point in time where you were trying to get people to do a pause on AI. I mean, I signed onto a letter that someone else wrote.
02:31:57.000 I didn't think that people would actually pause.
02:32:01.000 But you thought it was probably a good idea if they did.
02:32:06.000 I think so, too.
02:32:07.000 Yeah.
02:32:07.000 I mean, making some sort of digital superintelligence seems like it could be dangerous.
02:32:16.000 It certainly has the potential...
02:32:20.000 Well, when you were talking about what...
02:32:24.000 This mind virus, how it was able to propagate through social media and being in control of social media platforms.
02:32:32.000 Think about what that means if that same mind virus gets in control of a superintelligence.
02:32:39.000 And that is possible.
02:32:41.000 Yeah.
02:32:41.000 No, that's actually what I think the biggest danger is for AI: if AI is implicitly programmed – I don't think they can do it explicitly – but implicitly programmed with values that have led to the destruction of downtown San Francisco.
02:32:57.000 And a bunch of these AI companies are in – either in San Francisco or in the San Francisco Bay Area.
02:33:04.000 Then you could implicitly program an AI to believe that extinction of humanity is what it should try to do.
02:33:12.000 I mean, if you take that guy who was on the front page of the New York Times, and you take his philosophy, which is prevalent in San Francisco, the AI could conclude, as he literally did, that with 8 billion people in the world,
02:33:29.000 it would be better if there were none, and engineer that outcome.
02:33:35.000 Yeah.
02:33:37.000 Well, especially if it doesn't need us anymore.
02:33:39.000 If it becomes sentient and then has the ability to make its own decisions and make a better version of itself, it would find us to be nothing but a problem.
02:33:49.000 Like, we have nothing to offer anymore.
02:33:52.000 Yeah, it is a risk.
02:33:53.000 So, you know, if you query ChatGPT, I mean, it's pretty woke, you know.
02:34:01.000 Yeah.
02:34:03.000 People did experiments like asking it to write a poem praising Donald Trump, and it won't.
02:34:08.000 But if you ask it to write a poem praising Joe Biden, it will.
02:34:12.000 Yeah.
02:34:12.000 So I'm like, hmm, you know.
02:34:14.000 That's a little sketchy.
02:34:15.000 Yeah.
02:34:16.000 Well, unfortunately, it's programmed.
02:34:19.000 Yes.
02:34:20.000 It's programmed to be that way.
02:34:22.000 Is it possible to overcome those problems?
02:34:27.000 Is it possible that we could realize the dangers that are involved in creating this but somehow or another engineer it in a way that would be ultimately beneficial to people?
02:34:37.000 Or is that just a whim?
02:34:38.000 A hope and a prayer, a utopian version of what could happen, versus the most likely outcome?
02:34:51.000 If you say, like, what is the most likely outcome of AI, I think the most likely outcome, to be specific about it, is a good outcome.
02:34:59.000 Most likely a good outcome.
02:35:00.000 But it's not for sure.
02:35:03.000 So...
02:35:03.000 I think we have to be careful how we program the AI. And make sure that it is not accidentally anti-human.
02:35:16.000 So...
02:35:19.000 You know, the accidentally extinctionist AI. You wouldn't want that.
02:35:26.000 Or even pruning.
02:35:29.000 Well, that is kind of how it works. These are what they call large language models, but, you know, it's really just a big pile of numbers.
02:35:40.000 And how you tune those numbers matters.
02:35:43.000 It's like pruning a tree.
02:35:45.000 You know, you could have a mighty oak.
02:35:48.000 It could be a little bonsai or a mighty oak.
02:35:51.000 So, depending on how you prune it.
02:35:54.000 Right.
02:35:55.000 That's what I'm saying.
02:35:56.000 Like, if it decided to prune, if it decided the real issue...
02:36:01.000 Perhaps.
02:36:02.000 We cause problems.
02:36:04.000 Or maybe it would prune places in the world that are, you know, overwhelmingly polluting, like third world countries.
02:36:12.000 Maybe it would decide that they're not very necessary, particularly if we use computers or AI or some sort of robotics to do human labor.
02:36:23.000 And then you have these areas where human beings are doing this labor and they're polluting and, you know, there's all sorts of issues that come about because of that.
02:36:32.000 You say, well, we just eliminate those people.
02:36:34.000 We eliminate that issue and then we have 30% less garbage in the ocean.
02:36:38.000 And then it makes this call.
02:36:41.000 Yeah.
02:36:42.000 Yeah.
02:36:45.000 It's something we should be concerned about.
02:36:48.000 And I actually need to go...
02:36:50.000 Oh, shit.
02:36:51.000 Sorry, I need to go...
02:36:52.000 Should we wrap it up?
02:36:53.000 Yeah, because I have to go to the airport.
02:36:56.000 I'm flying to London.
02:36:58.000 Yeah, you were explaining that.
02:37:05.000 Yeah, just for AI safety.
02:37:07.000 The AI safety conference in London.
02:37:10.000 So, yeah.
02:37:12.000 I'm leaving in about an hour and a half.
02:37:15.000 What do you hope to get out of this conference?
02:37:22.000 Well...
02:37:22.000 I don't know.
02:37:36.000 I mean, I'm just generally concerned about AI safety, but it's like what should we do about it?
02:37:46.000 I don't know.
02:37:49.000 Have some kind of regulatory oversight.
02:37:53.000 It's like you can't just go build a nuclear bomb in your backyard.
02:37:57.000 That's against the law, and you'll get thrown in prison if you do that.
02:38:01.000 How much of a concern is it?
02:38:02.000 This is, I think, maybe more dangerous than a nuclear bomb.
02:38:05.000 Really?
02:38:05.000 Yes.
02:38:08.000 How much of a concern is it if another country develops it before us?
02:38:18.000 I don't know.
02:38:21.000 We should just be concerned about AI being anti-human.
02:38:27.000 That's sort of the thing that matters.
02:38:31.000 So, potentially.
02:38:36.000 I'm saying it's like letting a genie out of a bottle.
02:38:39.000 It's sort of like a magic genie that can make wishes come true, except usually when they tell those stories, that doesn't end well for the person who let the genie out of the bottle.
02:38:51.000 Right.
02:38:54.000 Do you think we're creating a life form?
02:39:00.000 Yeah.
02:39:00.000 I mean, it's something that is indistinguishable from intelligence, an intelligent life form, certainly.
02:39:06.000 I keep coming up against this idea.
02:39:09.000 I keep banging it around in my head that we're some sort of electronic caterpillar that's creating a cocoon.
02:39:14.000 And we don't even realize what we're doing.
02:39:16.000 And we're about to give birth to some technological butterfly.
02:39:22.000 Yeah.
02:39:32.000 Well, I think we're on the cusp of an artificial intelligence revolution.
02:39:44.000 For the longest time, or for a very long time, we've been the smartest creatures on Earth.
02:39:50.000 That's been our defining characteristic.
02:39:52.000 I mean, speaking of martial arts, I don't think anyone should challenge a silverback gorilla to a fight.
02:40:01.000 Even if you're very good at martial arts, that thing's going to kill you.
02:40:07.000 It literally walks on its fists.
02:40:11.000 If those fists meet your face, game over.
02:40:15.000 So we're not stronger than a gorilla.
02:40:18.000 We're not faster than other animals.
02:40:22.000 We're smarter.
02:40:26.000 Now what happens when there's something way smarter than us?
02:40:32.000 Where does it go?
02:40:33.000 That's a good question.
02:40:35.000 Well, listen, go talk to those people.
02:40:38.000 Go school them.
02:40:39.000 And I hope something good comes out of it.
02:40:43.000 And thank you for your time.
02:40:44.000 Appreciate you coming in here.
02:40:45.000 It's been fun.
02:40:46.000 Good to see you.
02:40:46.000 Always good to see you.
02:40:48.000 Thank you.
02:40:48.000 Thanks for everything, man.
02:40:49.000 Thanks for buying Twitter, too.
02:40:50.000 You're welcome.
02:40:51.000 Really.
02:40:51.000 It means a lot to people.
02:40:53.000 You know, that is aspirationally a force for good.
02:40:56.000 I think, at the very least, it stopped a lot of bad.
02:40:59.000 Yeah.
02:40:59.000 Good.
02:41:00.000 All right.
02:41:00.000 Bye, everybody.
02:41:01.000 All right.