Timcast IRL - Tim Pool - April 18, 2024


Youtube NUKES TimcastIRL Deleting Biggest Shows, Veiled Threat Of PERMANENT BAN | Timcast IRL


Episode Stats

Length

2 hours and 3 minutes

Words per Minute

197.58354

Word Count

24,339

Sentence Count

1,914

Misogynist Sentences

35

Hate Speech Sentences

28


Summary

On today's show, we discuss the removal of our most popular episodes from YouTube, the Biden Plane Crash, and a viral video about furries having litter boxes in the bathrooms. Plus, a new limited edition batch of Casper Coffee and much, much more.


Transcript

00:00:00.000 And it is beginning.
00:00:10.000 Today we got a notification from YouTube that our two biggest episodes, featuring Joe Rogan, Michael Malice, Alex Jones, and many others, had been removed for retroactive policy enforcement.
00:00:23.000 I spent some time on the phone with Google.
00:00:27.000 I'm angrily discussing this, and there's a lot to break down in this, but it's an election year.
00:00:34.000 And so three years, and for one of the episodes longer than three years, four years, but between three and four years after these shows had already aired, and they had no policy violations, they come back and make up fake reasons as to why these episodes are being removed from YouTube.
00:00:53.000 What they told us effectively said to me, we cannot be on YouTube.
00:00:59.000 My only options would be to delete and purge every single show and clip from this YouTube channel based on what they told me.
00:01:07.000 And then they said, no, no, no, no, don't do that.
00:01:10.000 But we'll get into all the finer details as we begin to talk about, there's a lot to break down.
00:01:15.000 I'm gonna get into the dirt and grime of how the business operates, why we do the things we do, what our moves are going to be going forward.
00:01:24.000 I've already had discussions with some top men, top men, and I think everyone's gonna be very, very excited as to what this means because YouTube basically just said, we don't want you, we don't like you, get the out.
00:01:38.000 So, let me save the greater details for the actual segment, because we're doing the intros, but we do have a lot of other news.
00:01:44.000 Biden, I guess, is claiming that something happened with the plane crashes.
00:01:48.000 Was it Uncle or something?
00:01:49.000 Was Uncle got eaten by cannibals?
00:01:50.000 Almost, almost.
00:01:52.000 Alright, we got to make sure we get politics in there.
00:01:54.000 I don't know that there's anything too tremendous politically, but I think the biggest story, actually, was Bloomberg writing that The UAE's attempt at weather manipulation resulted in mass flooding, which is wreaking havoc on the country.
00:02:10.000 And now you've got all these articles popping up from the left, like, no, it's a conspiracy theory!
00:02:13.000 Weather modification did not cause the Great Flooding!
00:02:16.000 Uh, yeah, they were cloud seeding.
00:02:18.000 And according to Bloomberg, the cloud seeding made these floods substantially worse.
00:02:23.000 Major backfire in government weather manipulation, I guess.
00:02:26.000 So, uh, we'll talk about that as well.
00:02:28.000 And then we have this viral video where children staged a walkout of their school, complaining that furries have litter boxes in the bathrooms.
00:02:36.000 The guy filming the video says, I heard that was a rumor.
00:02:38.000 That's not true.
00:02:38.000 And they're like, no, they're there!
00:02:40.000 Now, we've not confirmed this independently, so I think it's important to take all that information with a grain of salt, but I really don't believe someone orchestrated 70 children leaving a building and lying about something.
00:02:49.000 That seems like a conspiracy theory.
00:02:52.000 And Occam's Razor would suggest these kids are pissed off about furries in their school and litter boxes in their bathrooms, so we'll talk about that... Before we do, ladies and gentlemen, head over to casbrew.com.
00:03:02.000 Because they're trying to ban us.
00:03:02.000 Why?
00:03:04.000 Buy our coffee, I guess.
00:03:05.000 We got big plans for Casper Coffee.
00:03:07.000 We want to build physical locations all over the country.
00:03:09.000 We have a physical location being built right now in Martinsburg, West Virginia.
00:03:13.000 It's not so much about coffee.
00:03:14.000 It's about the third place.
00:03:16.000 Somewhere you can hang out, meet like-minded individuals, share an honest cup of joe, and talk about ideas, and get organized, and build community.
00:03:24.000 Something that is tremendously antithetical to the authority establishment plans.
00:03:31.000 They want you living in a pod, eating the bugs in virtual reality, and we want to resist that.
00:03:36.000 Now, the coffee's delicious, don't get me wrong, but I recommend you buy it.
00:03:40.000 Appalachian Nights is everybody's favorite.
00:03:41.000 We are now sold out of ReRise with Roberto Jr., but there's going to be a small limited batch popping up of 700 bags.
00:03:47.000 We're going to be selling those at about $7 each, just to move them, because we had the extra bags lying around, so we're going to brew some of the fresh coffee.
00:03:53.000 But it is good coffee.
00:03:54.000 It's my favorite.
00:03:55.000 Appalachian Nights, of course, is the best.
00:03:56.000 And when you buy Casper Coffee, the money that we're getting from it, we're not taking any profits or anything out right now.
00:04:03.000 Hopefully in the future it's a big profitable company.
00:04:05.000 It's all being reinvested into setting up these physical locations.
00:04:08.000 In Martinsburg, if you're a member of TimCast.com's Elite Club, that is, you click join us and it's 100 bucks a month, you're gonna get a key fob.
00:04:16.000 You're gonna be able to walk up to the building in the secret side entrance and boop, you're way right on in, walk upstairs and hang out in the club.
00:04:22.000 That's our plan for the private club.
00:04:24.000 But become a member at TimCast.com for 10 bucks a month and you'll get access to our Discord server where you can hang out and chat with like-minded individuals and network digitally.
00:04:32.000 It's not perfect, but it's better than nothing.
00:04:35.000 And you'll get access to our members-only uncensored shows, where you can even call in if you are a member.
00:04:41.000 Now, more importantly, I think it's important to stress that this show, right now, if it weren't for TimCast members, it would not exist.
00:04:49.000 The cost of flying out guests, putting up people in hotels, paying for the staff, the drivers and the coordination and all that stuff is very expensive.
00:04:55.000 So we are only able to do this because you guys are members.
00:04:59.000 So naturally when YouTube effectively declares war on us, and there's a lot I'll break down in that.
00:05:06.000 There's a great risk.
00:05:07.000 And so, uh, obviously we have options.
00:05:10.000 There is a demand for great media shows like ours, and many people are interested.
00:05:15.000 And YouTube, uh, is- is interested in destroying their company and brand.
00:05:19.000 Fine.
00:05:19.000 Whatever.
00:05:20.000 We'll see what happens when TikTok gets banned, I guess.
00:05:23.000 Maybe YouTube's not worried about it, but, uh, I strongly request y'all become members.
00:05:27.000 Because, uh, you'll get access to our uncensored show, a bunch of other bonus stuff.
00:05:31.000 You'll get to watch live spaces with Josie, if you're a fan of Josie's.
00:05:35.000 And, uh, we've got many more stuff.
00:05:36.000 We've got a couple documentaries.
00:05:38.000 We'll be releasing those on multiple platforms soon as well.
00:05:41.000 But, uh, become a member.
00:05:43.000 Hang out in the Discord.
00:05:44.000 We got more member stuff we're working on.
00:05:46.000 I think the, uh, physical locations are gonna be a blast.
00:05:48.000 Don't forget to also smash that like button.
00:05:51.000 Subscribe to this channel.
00:05:52.000 And don't forget to subscribe on Rumble.
00:05:54.000 At rumble.com slash Timcast IRL.
00:05:57.000 I think that's our URL.
00:05:58.000 Let's check it out.
00:05:59.000 We'll make sure we got that one right.
00:06:01.000 And follow me personally on X at Timcast.
00:06:04.000 Those are going to be very important because next week we'll be moving to a new studio.
00:06:08.000 I won't say much more beyond that, but follow us in those places if you catch my drift.
00:06:14.000 Unfortunately, today we did have a guest.
00:06:16.000 There was a medical emergency.
00:06:20.000 So instead, Phil Labonte is the guest.
00:06:22.000 How you doing, everybody?
00:06:23.000 My name is Phil Labonte.
00:06:23.000 I'm the lead singer of the heavy metal band All That Remains.
00:06:25.000 I'm an anti-communist and counter-revolutionary.
00:06:28.000 I was like, we have no guest.
00:06:29.000 Phil's famous.
00:06:30.000 Phil, sit in the chair.
00:06:31.000 I've followed your work for a long time.
00:06:32.000 I'm glad you're here.
00:06:32.000 Thank you.
00:06:33.000 I appreciate it.
00:06:34.000 And then in third chair, we have Mr. Bocas.
00:06:38.000 What a great cat.
00:06:39.000 It's a painting of Mr. Booker.
00:06:41.000 Nice job, Josie.
00:06:42.000 Yeah, I called everybody.
00:06:43.000 I was like, Oh, man, you know, these things happen.
00:06:46.000 It's rare.
00:06:47.000 It's rare, but it happens last minute thing, you know, and so, you know, then you got the animal surge that just steps up every chance every moment.
00:06:54.000 Yeah, we'll just we'll just make sure to talk more tonight.
00:06:56.000 Find a night by the way.
00:06:57.000 Thanks, yeah, appreciate it.
00:06:59.000 People watch, like, people watch the show, I mean, the guests are obviously, it's great to have the guests and stuff like that to get a different perspective, but people really do watch the show because- You're a guest, Phil!
00:07:08.000 Well, thank you very much.
00:07:09.000 Yeah, it's kind of like, what do you do in your normal life, this show, this show?
00:07:12.000 I was trying to get Richie Jackson to come.
00:07:14.000 Actually, the first person we tried calling was Libby, and I was like, we'll just see if Libby can come in and fill in the seat because she comes on all the time, and she was unavailable, and I was like, I know who needs to come on the show.
00:07:25.000 Raymond G. Stanley Jr.
00:07:26.000 Oh, that'd be great.
00:07:27.000 He was too far away.
00:07:28.000 He wasn't able to make it.
00:07:29.000 I was like, that actually would be a really great show.
00:07:31.000 It was so awesome.
00:07:32.000 Everybody knows him.
00:07:33.000 He's so funny.
00:07:34.000 Yeah.
00:07:34.000 And, uh, and he works here.
00:07:36.000 Am I, I don't want to speak out of turn, ex-Marine is Ray?
00:07:39.000 Former.
00:07:39.000 No, no, no, no.
00:07:40.000 Former Marine.
00:07:40.000 Yeah.
00:07:41.000 Oh, that's a different word you use?
00:07:42.000 Well, you're not, you're never not a Marine.
00:07:43.000 Once you're a Marine, you're always a Marine.
00:07:45.000 So there's a former because you were, you know, you were in service and you're, when you're not in service and you can be called a former Marine.
00:07:51.000 But I was calling Richie Jackson, too.
00:07:52.000 I was like, you know, he's a wild, crazy guy.
00:07:54.000 But everybody just was like, man, it was a perfect storm of everybody dipping.
00:07:59.000 What a crazy day.
00:07:59.000 Yeah, really.
00:08:00.000 Sir, just pressing the buttons.
00:08:01.000 Ian's here, too.
00:08:02.000 Oh, yeah.
00:08:02.000 Hello.
00:08:03.000 Hi.
00:08:03.000 Welcome.
00:08:04.000 Let's do this.
00:08:04.000 I love the meta shows where we talk about, like, the show beneath the show.
00:08:08.000 Oh, here we go.
00:08:08.000 I mean, we're going to talk about Nitty Gritty.
00:08:10.000 I'll break down everything for everybody.
00:08:13.000 I mean, I typically do.
00:08:14.000 I wonder if we're, like, the most transparent of shows when it comes to how shows run.
00:08:19.000 Among them.
00:08:19.000 Let's jump to this story from the post-millennial.
00:08:23.000 So earlier today, I was working out.
00:08:27.000 I had a great mini-ramp session.
00:08:29.000 I would say I'm around 20% of my capabilities.
00:08:32.000 I haven't skated mini-ramp in a very long time.
00:08:34.000 And Richie and I were getting the session going.
00:08:36.000 I burned 1600 calories.
00:08:38.000 It was glorious.
00:08:38.000 Then I went to physical training and I almost passed out because I was really pushing it.
00:08:42.000 I heard you were talking about like your blood pressure was like, you felt like you were gonna, or no, your watch was telling you that you maxed out, yeah.
00:08:49.000 Yeah, yeah, I had about 33 minutes of VO2 max, and then as soon as this, I get a message from Dane, our social media guy, and he's like, hey, take a look at this.
00:08:56.000 And it says your videos have been removed.
00:08:58.000 So two of the biggest, the two biggest, TimCast IRL shows on YouTube were deleted.
00:09:05.000 Both today.
00:09:07.000 Three years after they aired, with retroactive policy enforcement, that they claim were always in effect, but only now they decided to remove.
00:09:18.000 The biggest episode on YouTube, of course, was Joe Rogan, Alex Jones, Blair White, Michael Malice, me, I believe Luke Rikowsky was there, Ian was there, was there, Drew Hernandez was there, it was a massive show
00:09:30.000 in Austin in this trailer.
00:09:32.000 Joe Rogan pulls up to our big trailer mobile studio and he comes in and we're like, let's
00:09:37.000 roll and we had like 160,000 concurrent viewers, massive.
00:09:42.000 There's no policy violations.
00:09:44.000 We talked about things everybody always talks about.
00:09:46.000 We're very, very strict on this show.
00:09:47.000 You guys know, we've deleted episodes live during the show, and we've been working on engineering a dump button, which basically means if there's ever a policy violation, we hit a button, and then there's a delay, so that whatever violated the rules never appears, and we don't have to take the show down anymore.
00:10:04.000 So we built all this.
00:10:06.000 Show's deleted.
00:10:07.000 The other episode was the Michael Malice Alex Jones episode, which was our second biggest, which we did after they took down the first one.
00:10:15.000 Alex Jones and Michael Malice came on the show.
00:10:17.000 It was hilarious and fun.
00:10:18.000 They deleted it and gave us a warning.
00:10:21.000 I think, I don't know if we got, I don't think we ever got a strike from it.
00:10:25.000 No, we didn't.
00:10:25.000 We got a warning from it.
00:10:27.000 And the warning was on the channel for Two and a half years.
00:10:32.000 As soon as this happened, I'm on the phone with Google, and they're saying, we can't tell you what the policy violation was.
00:10:38.000 And I'm like, how are we supposed to do better and fit your terms of service if we have no idea what you're mad about?
00:10:44.000 And they're like, too bad.
00:10:45.000 I said, okay.
00:10:46.000 So I had all these people messaging and commenting, being like, duh, you're babies, you took the episode, blah, blah, blah.
00:10:52.000 And I was like, I called Michael Maus and Alex Jones and said, guys, can you come back immediately and do the show again?
00:10:58.000 They want to take down our episode and they won't tell us why.
00:10:59.000 We'll do the show again.
00:11:01.000 And so a week later, Alex and Michael came back and that was at the time our biggest episode ever, the Try Me YouTube episode.
00:11:09.000 I can only assume they were not happy we did that, but I got to tell you, before we even had Alex Jones on, I emailed our liaison at Google and said, what are the rules pertaining to Alex Jones?
00:11:18.000 Is he allowed to be a guest on shows?
00:11:20.000 And they said, absolutely.
00:11:22.000 And I said, okay.
00:11:23.000 You guys have no issue with this then?
00:11:24.000 He said, no, no.
00:11:25.000 He just can't have his own channel.
00:11:26.000 I said, done.
00:11:27.000 They deleted the episode.
00:11:28.000 They found whatever reason.
00:11:30.000 We did the show again.
00:11:31.000 No problems.
00:11:33.000 Over the past several years, I've actually spoken with people at Google and they said they were great episodes.
00:11:37.000 They were fine.
00:11:38.000 I've had people be like, oh, my friends work at YouTube.
00:11:41.000 They're big fans.
00:11:41.000 I've had people who work at YouTube tell me, I love watching the show every night.
00:11:46.000 They took down our two biggest episodes at the same time.
00:11:50.000 One, they claimed we promoted QAnon.
00:11:53.000 That is, I say defamation.
00:11:55.000 I've never promoted QAnon.
00:11:57.000 I mock people who are promoting QAnon.
00:11:59.000 When people on this show ever mention anything about it, we say that's silly nonsense.
00:12:04.000 They claim that we had some kind of vaccine medical misinformation, so you didn't get a strike on that one, but it's okay.
00:12:11.000 Here's what this means.
00:12:13.000 I'm on the phone with Google.
00:12:14.000 Immediately after this happened, I get an email and they're like, we just want to let you know we took these episodes down.
00:12:18.000 And I said, three years after these episodes aired, you're now claiming a policy violation.
00:12:24.000 And they're like, well, it was always against the rules.
00:12:27.000 And I said, okay, here's what we're going to do.
00:12:29.000 I will instruct my social media guy right now to delete every single video off the Tim Cast IRL channel.
00:12:35.000 We will air the episodes, and a week later, delete them from the platform.
00:12:38.000 We will put the clips up, and a week later, delete them from the platform, because that is the only thing we can do based on retroactive policy enforcement.
00:12:46.000 If you tell us what we're doing is fine, and we behave in that way for three years, I've got a thousand episodes.
00:12:54.000 I said we've got 1,006 episodes.
00:12:56.000 Probably about 990 are on YouTube, plus every single clip, which is three to six clips per episode.
00:13:03.000 And you tell us what we did on that show was fine for three years.
00:13:06.000 That means from that point on till today, we did the exact same things we did in that episode.
00:13:12.000 How many episodes am I supposed to go through now to figure out if they violate the rules?
00:13:16.000 And I was told by the person at Google, well, I don't know of any other episodes where this is an issue.
00:13:21.000 And I said, sure.
00:13:23.000 And you didn't know for three years this episode was an issue.
00:13:26.000 So my only option then is to delete every single show off the platform or you're going to ban us.
00:13:32.000 What they effectively told me was, no, no, it's fine, you're fine, and I said this.
00:13:39.000 Okay.
00:13:41.000 Then someone at the highest level of Google or YouTube came down to you guys and said, delete those episodes, I don't want them on the platform, make up a reason.
00:13:51.000 And you're telling me it's fine, and we're not gonna get banned, because you know it's political, and it was someone at Google who ordered the shows to be removed.
00:13:59.000 If that is not the case, then you have retroactively placed policy enforcement actions against us, which leaves me with no alternative but to delete every video off this channel, otherwise at any moment we could be banned.
00:14:12.000 And they said, no, no, I don't know, I can't tell you that.
00:14:16.000 Okay, great, you can't.
00:14:17.000 Well, I immediately made some calls to top men, who I will not reveal.
00:14:22.000 We have big plans coming up for the studio move, which is taking place this weekend.
00:14:28.000 A lot of people have said, Tim, go to Rumble!
00:14:30.000 Tim, go to Rumble!
00:14:31.000 Well, you know, we're on Rumble.
00:14:33.000 And then people ask us why the live show isn't on Rumble.
00:14:37.000 The live show is the biggest driver of memberships at TimCast.com, which is the only way we're able to do all of this.
00:14:45.000 If we downsized and became like a digital over-the-air show where we just Skype people in and stuff like that, Sure, that shaves off a ton of money from our budget, and we could make things a lot cheaper, but I think one of the things that makes the conversations on the show work better, and I've talked to a lot of people in the industry about it, and everyone agrees, is in-person conversations in real life.
00:15:09.000 Well, I don't want to do that.
00:15:10.000 That means right now, based on how much it costs to run this show, drivers, staff, hotels, etc., the amount of members we have is maintaining.
00:15:21.000 The amount of memberships we have is at a decent amount where we make a little bit more every month than we spend on the show, which gives us the ability to invest in other projects.
00:15:32.000 My concern when I talk to all these other big companies and they're like, we want TimCast IRL live here, here, or here, or otherwise, is that the clips don't drive a lot of memberships.
00:15:42.000 The live show does because once we wrap the live show, we say, hey, the show continues at TimCast.com.
00:15:48.000 Become a member to watch the members only uncensored.
00:15:51.000 With that, we are maintaining a slight growth.
00:15:54.000 We have a slight uptick, a little bit, in how many members we have, but we don't grow a whole lot.
00:15:59.000 It's very slow and steady.
00:16:02.000 And this means, based on the model we have, the show can continue.
00:16:07.000 If we were to stop doing live on YouTube the way we are, divide it up to other platforms, we run the risk of deranking, we run the risk of losing a large portion of what funds the show, driving new members, and then we become a sinking ship.
00:16:23.000 We would have to start firing staff, cutting corners, reducing investment in projects.
00:16:29.000 Possible.
00:16:30.000 We could do that.
00:16:31.000 I'd prefer not to.
00:16:32.000 So the conversations we've had with other big networks has been, can you cover the costs of how much we make through YouTube in ad revenue, so that if we make this move, and we lose money, we stay afloat for at least a certain amount of time?
00:16:47.000 And typically the answers have been, I don't know, maybe.
00:16:51.000 I don't know if we want to do that.
00:16:52.000 And I say, okay, well then we're gonna keep doing what we're doing on YouTube because the live show generates the memberships that make the show work.
00:16:59.000 It's that simple.
00:17:00.000 I'll let you guys in on another... I don't know if it's a secret or whatever.
00:17:03.000 Inside baseball.
00:17:04.000 Inside baseball.
00:17:05.000 I'm a big fan of rumble.
00:17:07.000 Uh, I'm friends with Chris Pavlovsky, he's a great dude.
00:17:09.000 We use Rumble infrastructure for TimCast.com.
00:17:11.000 We use Parallel Economy for our memberships.
00:17:12.000 We are absolutely utilizing their infrastructure, and they make money from it, we make money from it, because we want to build the Parallel Economy.
00:17:18.000 But there is a reality.
00:17:20.000 When Rumble launched, and we split our clips from YouTube to Rumble, we lost probably 40% of the revenue we got from YouTube.
00:17:27.000 Because as much as Rumble is great and we want Rumble to exist and we want to be on Rumble, we don't make ad revenue off those videos.
00:17:34.000 So when a video normally got $80,000 to $100,000 on YouTube, we would make a couple hundred bucks.
00:17:40.000 Now the video on YouTube gets $50,000, $60,000, and on Rumble gets $30,000.
00:17:44.000 That means we lost all of that ad revenue.
00:17:46.000 So we have to maintain memberships.
00:17:48.000 We can't just shut down and switch to Rumble because then we'd have to start laying people off and shrinking the ship.
00:17:56.000 Don't want to do that.
00:17:57.000 We are currently having conversations with some other companies.
00:18:00.000 Now that YouTube has made these moves, there is renewed interest in how we can make these changes.
00:18:05.000 I don't want to say too much because business negotiations are ongoing, but there are some potentially big moves that may happen based on this.
00:18:12.000 I think what we're seeing with YouTube is a few things.
00:18:15.000 Just yesterday, many people noticed the view count on the show was going all wild and crazy, but that wasn't unique to us.
00:18:21.000 It affected tons of other YouTubers and channels that were noticing weird issues pertaining to live and view count.
00:18:26.000 And the day before, something similar happened.
00:18:28.000 View counts crashed on a bunch of videos and people were like, whoa, my video didn't get any traffic.
00:18:33.000 It was not unique to any one channel.
00:18:35.000 Something must have happened at YouTube with a policy change.
00:18:39.000 Now there's retroactive enforcement.
00:18:41.000 I don't think it's a coincidence that around the same time, two episodes are nuked instantly.
00:18:46.000 Maybe some new guy came in.
00:18:48.000 Maybe they hired a new person.
00:18:49.000 Who knows?
00:18:50.000 Have no idea.
00:18:51.000 I told Google, I cannot run a business if this is how you treat your business partners.
00:18:58.000 This is an F you to me and a threat.
00:19:00.000 They issued a warning on our channel requiring us to take a class to better understand how we broke the rules.
00:19:06.000 But we didn't break any rules.
00:19:07.000 Not a single rule was broken.
00:19:08.000 They lied.
00:19:09.000 They're liars.
00:19:11.000 And so, I said, if you email me and say, due to this, that, or otherwise, we're going to remove these videos from your channel, don't worry, no effect to you.
00:19:21.000 I would grumble and complain.
00:19:23.000 But when you issue a warning on my channel, you are saying we are prepared to ban you permanently.
00:19:28.000 We are prepared to take you down for a week the next time this happens.
00:19:32.000 With any one of your videos from the past four years, it could happen.
00:19:35.000 We are prepared to permanently ban you if we can find three more videos over the past four years that we can interpret as breaking the rules, your show is permanently banned in every respect off of YouTube.
00:19:48.000 So I said, what should I do then?
00:19:50.000 The only thing I can do is delete every single video.
00:19:54.000 I don't think it's a coincidence that it's 2024, we knew things were going to get crazy this year, and now YouTube has taken such an extreme and drastic action such as a 3 year retroactive enforcement.
00:20:09.000 There's a couple things we can do.
00:20:11.000 We'll have more information on Monday.
00:20:14.000 We have big plans.
00:20:15.000 We're moving into our new studio on Saturday is the big opening party and skate jam and contest.
00:20:22.000 Friends and crew and friends of the show are going to be there.
00:20:24.000 We're going to be eating catering courtesy of Dutch's Daughter.
00:20:27.000 We're huge fans.
00:20:28.000 They're a great restaurant in Frederick, Maryland.
00:20:30.000 They make some of the best food.
00:20:31.000 And that will be the opening party.
00:20:34.000 That means Monday the show will be live from the new studio.
00:20:38.000 And I'm just so excited because the cameras look so good.
00:20:40.000 There's no more color issues and everyone looks a lot sexier.
00:20:44.000 So it's going to be great.
00:20:45.000 Everyone will look very thin with these cinematic beautiful cameras.
00:20:50.000 We're also going to be planning a change to how we broadcast the show but Considering what just happened, there are some business happenings behind the scenes.
00:21:00.000 I'd have no problem telling everybody literally what those plans are and what we're negotiating on, but because it involves third parties who are negotiating as well, it's a violation of their privacy.
00:21:09.000 I won't do that.
00:21:10.000 There may be some great news on Monday, though, and it could benefit this show in many ways.
00:21:18.000 And then I think following this, what we're going to do is Ramp up marketing in ways we've never done before.
00:21:25.000 So, with YouTube taking this attack against us, we have a couple, we have a couple, uh, there's a couple things we can do.
00:21:31.000 Here's a secret!
00:21:32.000 It's not really a secret, I've mentioned this before.
00:21:34.000 What I pay myself in terms of a salary comes from the Tim Pool Daily Show, which is youtube.com slash timcastnews.
00:21:41.000 That show alone generates me personally, produced by me, 99%, I say 99 because sometimes someone who works here might send me something and there's moderate assistance, but I wake up, I sit down, I reach the news, I monologue, I make a million dollars a year.
00:21:57.000 It is just above a million bucks off of that morning show alone.
00:22:02.000 If I did not do TimCast IRL, I would work a regular shift, have the rest of the day for family and travel.
00:22:09.000 I could do the show literally anywhere in the world with my girlfriend.
00:22:12.000 I could live in the mountains and we could ski all year round and do whatever and not have to worry about it.
00:22:17.000 Timcast IRL does generate profit, and it does generate hard assets, and these things do benefit my net worth.
00:22:23.000 I don't want to pretend that's not the case.
00:22:25.000 I don't pay myself a salary based off what is coming from Timcast IRL, however.
00:22:28.000 The overwhelming majority of the money basically covers the cost of everyone's salaries, travel, equipment, all of these things.
00:22:36.000 Again, I stress, the equipment and all that does add to my net worth.
00:22:38.000 I'm not going to lie about that.
00:22:40.000 But I'm not doing this, and I'm not making money from this.
00:22:44.000 This is just something that is fun to do, that is important, I enjoy doing, I enjoy bringing people here, I think it's beneficial across the board, and then the small amount of excess revenue that we get basically invests in these other projects.
00:22:56.000 So we've got, uh, Pop Culture Crisis, of course.
00:22:59.000 We've got, uh, Shane Cashman's Inverted World Show that we're building.
00:23:02.000 We've got the Boonies Skate Show that we're building.
00:23:05.000 That, uh, and I will stress, the, uh, the Boonies stuff is very, very expensive, but most of the cost is the building of a new studio for the sake of TimCast IRL.
00:23:14.000 So, what we need to do is There's two options.
00:23:21.000 I just say, wow, they got us.
00:23:23.000 We've lost.
00:23:24.000 Why am I even dealing with these headaches?
00:23:26.000 We're getting sued.
00:23:27.000 All of this nightmarish stuff for something that doesn't personally make me money that I can go spend on vacation at casinos and things like that.
00:23:34.000 Again, I'll stress, like, there's profit and there's, there's a net worth gain, but it's, it's like the exponential workload compared to how much you make.
00:23:41.000 It's just, I could work in the mornings and Tim pool show and make a million bucks a year and then sell sponsorships and even make more and not have to think about it.
00:23:49.000 Or, here's what else we can do.
00:23:51.000 We can work a deal with a... I don't want to say too much because it's going to be up to them when we do finalize the deal, but third parties.
00:24:01.000 And then attack this thing through massive marketing campaigns and basically make it a point that YouTube is not safe for your business.
00:24:13.000 If you try to start on YouTube, they will, with no warning, and with no reason, and everyone already knew this, but let's stress this, they will destroy your company overnight.
00:24:24.000 We have to build a parallel economy, and so that's the attack factor we're going to take.
00:24:28.000 As of next week, hopefully we'll have more information on this, as to what we're doing in terms of parallel economics, and how we're going to support, fund the show, and grow the show, in defiance of YouTube's ridiculous and insane retroactive enforcement, and hopefully, We will plant the seeds.
00:24:44.000 Nay, I should say, we will water the trees that have already been planted by other great people working hard on this, namely those at these other social media networks.
00:24:51.000 Shout out to Rumble, of course.
00:24:53.000 To Bill Ottman.
00:24:54.000 Mines.
00:24:55.000 Shout out to Elon Musk.
00:24:56.000 And we will water those trees that have already been planted, and then we will supplant and displace the corrupt and crooked establishment that breaks the rules for their own personal benefit and politics.
00:25:08.000 So.
00:25:09.000 That's the gist of my rant.
00:25:12.000 It would be great to federate the Minds, Rumble, and X. I would love to see these platforms interlocking.
00:25:19.000 That would be so hot.
00:25:20.000 I have no idea what that means.
00:25:22.000 It means like if you're on Minds logged in, you can follow your Twitter follower, your X followers, or respond to your Rumble comments.
00:25:28.000 I imagine that doesn't really help the owners of the platforms though.
00:25:35.000 You'd be surprised.
00:25:36.000 It seems like you're actually going to lose because you're giving other people more, but a rising tide raises all ships.
00:25:42.000 You really end up, it can end up becoming a really, really good, it's kind of like a federation of states.
00:25:48.000 Well, I don't know anything about the financials of that, but I know that $44 billion for X is a lot of money, and so he's going to have to be able to make some kind of profit off of it, at least for X. I don't know about what kind of financial situation Minds at Rumble is in.
00:26:04.000 Oh, I was slightly on a tangent.
00:26:07.000 Did you have another follow-up?
00:26:08.000 No, go ahead.
00:26:09.000 I got into internet video in 2006 because it's immensely powerful.
00:26:12.000 I mean, you can change the world with an internet video.
00:26:15.000 And it was on YouTube.
00:26:16.000 At first, it was on MySpace, but MySpace was a little clanky and I would have to embed my YouTube videos on my MySpace blog and then email them to my friends so that they could hear my thoughts.
00:26:24.000 And then MySpace just What happened was one month it got so popular that it was before virtual servers.
00:26:30.000 Their traffic ground to a halt.
00:26:32.000 You couldn't use the website for like a month and everyone jumped ship over to Facebook.
00:26:35.000 But anyway, the point was, it was always about the internet video.
00:26:38.000 It was never about YouTube.
00:26:39.000 People would be like, they'd have their YouTube shirts and they'd be so proud.
00:26:42.000 We do these YouTube live events and I love that stuff.
00:26:44.000 But it was about the internet video, the power of internet video.
00:26:47.000 That's always what it's been about.
00:26:48.000 It doesn't matter what network you're on.
00:26:49.000 And I feel betrayed.
00:26:51.000 It's a slight feeling of betrayal to have my stuff taken down by my provider, by my platform.
00:26:56.000 And it's not my plan, I understand it's not mine, and there are contracts involved, but it's like, you're supposed to have our back, man.
00:27:03.000 No.
00:27:05.000 We all saw the video that got released by, um, it was Breitbart and, uh, who else was it?
00:27:12.000 Oh, I can't remember.
00:27:13.000 Some tech company, media website.
00:27:15.000 Google staffers crying when Trump won.
00:27:18.000 Yeah, the ideological capture that, you know, you see in colleges when it comes to like sociology departments and stuff like that and the humanities departments, that has been going on for a decade.
00:27:34.000 That's been going on for a decade and that means they've been pumping people that believe the ideology that they're taught in school.
00:27:40.000 Those people have been pumped out into society.
00:27:42.000 So the reason that people that are at Google and in positions of at least some kind of authority and power, that it's because they got the ideology in college.
00:27:56.000 We have a story here.
00:27:58.000 I want to preface it by saying, if you're wondering why it is that three years after we aired them, our two biggest shows on YouTube... Granted, Darren Beattie was our biggest show ever.
00:28:09.000 It's got like 7 or 8 million views.
00:28:11.000 Yeah, I believe he's a former speechwriter for Trump.
00:28:14.000 People really, really wanted to watch that show.
00:28:16.000 They loved it, and it's on Rumble.
00:28:17.000 But our two biggest on YouTube were Alex Jones and Joe Rogan.
00:28:21.000 If you're wondering why they got deleted, you need only look at this news story from the National Review.
00:28:27.000 Police arrest Google employees who staged anti-Israel office protests.
00:28:32.000 Need I say more?
00:28:34.000 Employees at Google staged a sit-in of their own company requiring the boss to call the police and have them physically removed and they were placed on leave and their access was severed.
00:28:47.000 Now, with employees like that, I wonder how it's even possible a show like this exists.
00:28:54.000 I mean, that's the whole of the tech industry or whatever, the tech companies and stuff.
00:29:00.000 It is an ideology that is pervasive.
00:29:02.000 Obviously, it's not everybody, but because of the way that that ideology, the people operate, if you speak out too strongly or if you don't keep your head down, they go to HR and accuse you of all kinds of things and then you're out.
00:29:15.000 Everyone knows that that happens basically in modern corporate America today.
00:29:21.000 We have a video clip.
00:29:22.000 This is Cassie Dillon, now Cassie Akiva, and you can see here Google Office.
00:29:28.000 You guys are gonna leave? You guys refusing to leave?
00:29:30.000 Yes.
00:29:31.000 Okay.
00:29:38.000 And there it is.
00:29:38.000 I mean, that's it.
00:29:40.000 Google employees being arrested in a Google office.
00:29:43.000 Is it for trespassing?
00:29:44.000 Yes.
00:29:44.000 Well, that's what she had.
00:29:45.000 It's because they were protesting.
00:29:47.000 Just think about how insane this is, okay?
00:29:49.000 First, they weren't trespassing.
00:29:52.000 They work at Google.
00:29:54.000 They are in a Google office where they work.
00:29:57.000 They go and sit in their boss's office and then say, we're now protesting.
00:30:00.000 He says, you can get out of my office.
00:30:01.000 Say no.
00:30:01.000 He's like, I guess I have to call the police on you!
00:30:05.000 Then they get arrested for trespassing where they work!
00:30:10.000 I gotta say, man, I am impressed.
00:30:13.000 I am absolutely impressed with the left because, well, the right has everything to lose.
00:30:20.000 The fabric of their nation they love so dear?
00:30:23.000 The flag.
00:30:24.000 The lives and futures for their children, their sacred honor, blood and treasure.
00:30:29.000 And many of them just say, I will not speak up, challenge the system, because I have to feed my family.
00:30:37.000 But they have everything to lose.
00:30:38.000 These leftists are willing to protest at their own companies and get arrested after their boss calls the cops on them.
00:30:46.000 That's the extent of their zeal.
00:30:49.000 How do you win when this is the case?
00:30:51.000 Look, there was that kid that burned himself alive.
00:30:53.000 People are... I mean, you've got people that are so committed that they'll burn themselves alive, you know, over an ideology.
00:31:04.000 I'm convinced that it's music is the solution and it's almost silly.
00:31:07.000 Like I used to think I had to go find the people and then help them and I'd go around and like, I got to get to that guy and I got to get to that guy.
00:31:13.000 Then I realized if I just make the best sound, people come to me and it creates like this environment of like, That was the 60s, man.
00:31:23.000 That's the same thing that people have been saying since the 60s, and it didn't work then.
00:31:28.000 Do you remember the South Park episode?
00:31:32.000 Where the hippies are like, we gotta fight the establishment.
00:31:35.000 Okay, what do we do?
00:31:36.000 We gotta play harder, man!
00:31:37.000 Just keep playing music, nothing's getting done.
00:31:40.000 Yeah, but if you do it right, I mean, look at the Beatles.
00:31:42.000 I've been watching so much Beatles music, the way that they transform the world, the entire world.
00:31:47.000 They change music.
00:31:49.000 I gotta translate Ianisms for the general audience, right?
00:31:52.000 Because a lot of people are posting one saying Ian is wrong.
00:31:54.000 No, no, no, no, Ian is not wrong.
00:31:56.000 It's not the idea that music changes the world is that music builds influence.
00:32:00.000 And once you're an influential person in entertainment and pop culture, you can change hearts and minds.
00:32:04.000 That's why so many people are freaked out about Taylor Swift.
00:32:07.000 She commands masses of fans because she's an entertainer with music.
00:32:13.000 And then she writes a song called, what is it, You Need to Calm Down, which depicts conservatives as crack-toothed yokels and insults them for not supporting the LGBTQ ideology, and that's the power of music.
00:32:27.000 So perhaps it does sound a little naive, but let's translate that.
00:32:32.000 Creating entertainment that people want to follow and makes them feel good gives you a path towards influencing them in a variety of ways.
00:32:40.000 From the indirect to the direct.
00:32:41.000 You write a song, it's catchy, they like it, and you slip in their insults of the right and conservatives.
00:32:45.000 Then, once you're the most famous musician in the world, you get on stage and say, everyone go vote Democrat.
00:32:51.000 And Republicans fear that.
00:32:53.000 So Ian, I believe is correct in Breaking down the idea, it's true.
00:32:59.000 It's not just the song that does it, it's the industry that builds influence.
00:33:03.000 Is that what you meant?
00:33:04.000 Big time, yeah.
00:33:04.000 And you don't even have to put the lyrics in the song.
00:33:08.000 Often what'll happen is you'll make a song that someone puts on repeat and listens to 20 times in a night, and then they'll go to your website and find out who you are, and then they'll just adopt your politics.
00:33:16.000 They're like 19-year-olds or 14-year-olds and stuff.
00:33:20.000 Real quick, the Taylor Swift song, I think it's called You Need to Calm Down, is that it?
00:33:23.000 Yes, it is.
00:33:24.000 Imagine the 14 and 15 year olds who are hanging out at a bagel shop, and that song is playing, and they're not really listening, and in the background she's like, don't step on my gown, you need to calm down.
00:33:34.000 That is indoctrinating young people towards these ideas.
00:33:38.000 It is the radio screaming in their ears everywhere they go, right is bad.
00:33:44.000 And there's a comic about this actually.
00:33:46.000 It's like someone pencil drew this comic, it's great, and some guy says, you're brainwashed, and the other person says, and then it shows a music festival where the singer's like, Republicans are bad!
00:33:58.000 A guy on the TV saying, Republicans are bad!
00:34:00.000 A guy outside yelling, Republicans are bad!
00:34:02.000 Protesters, Republicans are bad!
00:34:04.000 And then the person's being like, you're brainwashed actually.
00:34:07.000 That that is entertainment and and and influence.
00:34:10.000 Yeah, I mean that there is there is truth to that.
00:34:13.000 So like if you're going to talk about being able to influence the culture, but I mean that's the that's the point of really like the overall point of just Tim cast like as a as an entity, right?
00:34:23.000 Like it's the the IRL.
00:34:26.000 Actually moves the needle, you know, I mean a friend of mine was talking today about this the situation with YouTube and stuff I think that we should reach out to some of the people in Congress we know and see if Congress will send someone one of the whether it be gates or whether it be Someone just to send a letter to Google and be like hey Did you guys take this down because of you know, why did you take this stuff down?
00:34:46.000 Why are you censoring people?
00:34:47.000 I mean, I know that they have I agree.
00:34:49.000 you know, terms of service and stuff. But if they don't have a legitimate answer, I mean,
00:34:53.000 it's something that the, that it does affect the interest of the American people. So it's,
00:34:57.000 it's something that they might do. I mean, I think that there might be in the term. I haven't read
00:35:01.000 the Google terms in a long time, if ever, actually in totality, but it might say can ban anyone at
00:35:07.000 It does.
00:35:07.000 They put those silly, and that could be unconstitutional, you could argue.
00:35:10.000 Those often can be unenforceable.
00:35:13.000 It just really depends.
00:35:15.000 You know, I've talked to lawyers about various platforms, and they go, well, they put those in there, but look, it really comes down to a judge, and the judge contracts don't mean much of anything.
00:35:27.000 They mean a lot.
00:35:27.000 They mean, don't get me wrong, but people, I think it's because of movies, to be honest.
00:35:33.000 Uh, there's a show on Netflix.
00:35:35.000 I don't know if you guys saw.
00:35:35.000 It's a, what is it, a Black Mirror episode, I think?
00:35:37.000 Where the woman's life is being broadcast on Netflix.
00:35:40.000 Okay, it's a Netflix show about a woman who turns on Netflix and there's a show about her life.
00:35:46.000 Oh, okay.
00:35:46.000 It's not the Truman Show kind of thing?
00:35:48.000 But like, they said, we use an AI that calculates your life and predicts it perfectly, and so she's watching all of her private moments broadcast in the show, and everyone's like, it's you.
00:35:58.000 She goes to her lawyer and says, how do I stop them from doing this?
00:36:01.000 And they're like, this contract here is ironclad.
00:36:03.000 You agree to the terms of service.
00:36:05.000 Sorry!
00:36:06.000 Yeah, that's not how it works.
00:36:07.000 That's not how it works.
00:36:08.000 If you go to someone and say, hey, let's do a deal.
00:36:11.000 Phil, I'll buy that drink off you right there.
00:36:13.000 What is that, a Spindrift?
00:36:14.000 I will give you 20 bucks for it.
00:36:16.000 Sound good?
00:36:16.000 Alright, let me just draft this contract up that says we're gonna do just that.
00:36:19.000 Just sign the contract.
00:36:21.000 Phil says, sure, I trust you, I'll sign the contract.
00:36:23.000 Ha ha!
00:36:24.000 In the contract it says I get all the rights to his music!
00:36:27.000 I own all of his music catalog!
00:36:28.000 He signed it!
00:36:29.000 A judge is gonna laugh.
00:36:30.000 He's gonna say, shut up!
00:36:32.000 You did a deal for a soda, not a music catalog.
00:36:34.000 Throw it in the garbage, get out of my courtroom, don't waste my time.
00:36:36.000 Just because you signed it doesn't mean anything.
00:36:37.000 Yeah.
00:36:38.000 So, when it comes to the rules, What really matters is the expectation of both parties, consideration provided.
00:36:45.000 That is, what did you exchange and for what?
00:36:48.000 And I believe Google is absolutely in violation across the board.
00:36:52.000 It's just who wants to sue Google?
00:36:54.000 Who wants to go up against a trillion dollar company or whatever it's worth?
00:36:58.000 Alphabet.
00:36:59.000 It's even bigger.
00:37:00.000 They've got like they own I don't know how many Alphabet 9 companies they own under their umbrella and it's like big military tech, like really wild life extension.
00:37:08.000 They're just all over the globe right now.
00:37:10.000 Alphabet, a huge, huge company.
00:37:12.000 When I started YouTube, it was YouTube.
00:37:14.000 Google didn't own them.
00:37:15.000 It was just YouTube.
00:37:16.000 Steve Chen started the company and it was like broadcast yourself.
00:37:20.000 We're having fun!
00:37:20.000 Look at my dog!
00:37:21.000 Look at these sloths!
00:37:23.000 And then Google bought it, and I was like, oh god, corporatization.
00:37:25.000 And now Alphabet, and now the governments involved, we know, through like Edward Snowden's PRISM stuff, we know that the governments, and with the Twitter files and all that, we know that governments have been heavily involved, the American government, with censorship on social media.
00:37:38.000 So it feels just like part of the war machine at this point.
00:37:40.000 I will add, A possibility as to what happened.
00:37:47.000 For the first time ever, I recorded a quick 30-second bit.
00:37:50.000 I opened up TimCast IRL.
00:37:51.000 I clicked popular, showing our most popular episodes.
00:37:54.000 And then I said, if you're looking for a show on culture, news, and politics, check out TimCast IRL, Monday to Friday, 8pm.
00:37:59.000 Click the subscribe button.
00:38:00.000 Come hang out.
00:38:01.000 We've had great guests from, you know, Kanye to Joe Rogan.
00:38:05.000 And tried to keep it light.
00:38:07.000 They denied it as an auction ad.
00:38:09.000 I appealed.
00:38:10.000 They denied it.
00:38:10.000 I recorded a new ad.
00:38:11.000 They denied it.
00:38:12.000 They said it was an election ad.
00:38:13.000 They said I had to fill out some form and get a certificate as an election advertiser.
00:38:16.000 And I'm like, but I'm not promoting any politicians!
00:38:20.000 So I contact Google.
00:38:21.000 Their staff say, you know what?
00:38:23.000 You're right.
00:38:23.000 This is not an election ad.
00:38:25.000 We don't know what's going on.
00:38:25.000 We'll get back to you.
00:38:27.000 Three or so days ago, or I think it was three days ago, Monday, they get back to me and say, your ad is approved.
00:38:33.000 Thank you.
00:38:34.000 Those two episodes it took down were featured in that advertisement.
00:38:38.000 So I wonder if a component of this is someone working Google Ads sees an advertisement for their biggest live show on average, averaging the largest live audience on YouTube, and the two biggest episodes are Relics Jones.
00:38:53.000 I wonder if a higher-up saw that and said, delete those episodes now.
00:38:56.000 I don't care how, make up a reason.
00:38:59.000 Now they've got an advertisement on Google that's I'm like, do I just blast a ridiculous amount of money at this ad?
00:39:06.000 Should I make an ad on Google right now?
00:39:08.000 You know, maybe I'll do that.
00:39:09.000 It'll be funny because like, will they deny the advertisement?
00:39:11.000 I make an ad where I say, YouTube wrongly deleted our two biggest episodes.
00:39:16.000 They'll probably deny it because it's too, what are you, meta?
00:39:21.000 But it doesn't violate any of the rules to do that.
00:39:23.000 Hmm.
00:39:24.000 Interesting.
00:39:24.000 Yeah.
00:39:26.000 Hey, here's YouTube.
00:39:27.000 Yeah, these are the two biggest shows for three years and after I made an ad they deleted them clearly They're scared that you they're scared of you finding out that they had these shows I That hanging out with Alex and Joe Rogan and everybody in the RV That was like one of the more fun nights of my life so far That was a really I mean it was just an exhilarating evening to hang with those dudes and watch everyone talk What's crazy is that show to come back on it wasn't our most concurrent viewership And, but it was our most viewership after the fact.
00:40:00.000 I think it had like 2.4 million views.
00:40:03.000 On YouTube, on YouTube.
00:40:03.000 The Darren Beattie one, you can look it up.
00:40:05.000 I think it's like, what, 7 million total or something like that.
00:40:07.000 It's like 6 point something on Rumble.
00:40:09.000 It was several hundred thousand on YouTube.
00:40:11.000 It was a couple hundred thousand on other platforms.
00:40:13.000 And by the way, yes, on Rumble.com slash Timcast IRL.
00:40:17.000 It's on Rumble.
00:40:18.000 The whole channel's on Rumble right now.
00:40:19.000 So make sure you subscribe to our Rumble channel, TimCastIRL, as well as our, uh, x, uh, YouTube do- I'm sorry, uh, x.com slash TimCast?
00:40:27.000 I think it's an x.com- x.com slash TimCast?
00:40:29.000 It's still twitter.com, though, I think.
00:40:31.000 Yeah, maybe it'll work for either Twitter.com.
00:40:33.000 I think at this point anyone it's tough to say I would encourage people to multi stream just to do it and build followings on all but I bless you Tim.
00:40:41.000 I understand the thank you.
00:40:44.000 Absolutely, sir.
00:40:45.000 My pleasure that to focus it all into one platform to generate massive ad revenue on that one platform does make a lot of sense.
00:40:53.000 Face value, but man just having your tentacles all over the place if you're starting out we it's if you're starting out It's it's there's a temptation to like build a platform build a following on a platform And then that's where you feel like you're at home, and you want to stay there But it is good to spread out But you a lot of times you'll need that that home base kind of platform to start you off so that way you can actually like continue to actually have a You know produce a show or whatever you know for a lot of smaller producers.
00:41:20.000 I mean I And YouTube's been so good with ad revenue.
00:41:22.000 The whole partner program thing, 2008, when they introduced it, that was one of the great things about Google buying YouTube is that they introduced that money, big money, and they could pay people.
00:41:29.000 And that was like, what the fuck?
00:41:31.000 I'm just doing this for fun.
00:41:32.000 I didn't expect, I'm a waiter.
00:41:32.000 I'm doing this to help people.
00:41:34.000 That was my life at the time.
00:41:35.000 I didn't, I just wanted to help people.
00:41:37.000 And then they were like, we're going to give you money.
00:41:39.000 And then, man, I got nervous about that.
00:41:41.000 Actually, they didn't bring me on the partner programs.
00:41:42.000 I was too racy.
00:41:43.000 I was getting high and talking about saying fuck shit and talking about politics and all the racy stuff.
00:41:48.000 Yeah, welcome back guys.
00:41:49.000 It's real life.
00:41:51.000 We're all in this together.
00:41:52.000 And it was a shocking twist to watch people start to get paid and go.
00:41:57.000 Then we made Maker Studios.
00:41:58.000 I got in with Ben and Danny and we conceptualized Maker Studios
00:42:01.000 and we built out this multi-channel marketing concept and then all the YouTubers came.
00:42:06.000 Phil DeFranco was there, Dave Days, Casam G, Lisa Nova.
00:42:12.000 We were all in Venice making stuff.
00:42:12.000 It was awesome.
00:42:14.000 It was just such a good time.
00:42:15.000 And they sold it to Disney.
00:42:16.000 They sold it to Disney for a billion.
00:42:18.000 But I was so high, I was just like, I don't give a fuck about the money, man.
00:42:21.000 I was just so like...
00:42:22.000 Dark at that time in my life.
00:42:24.000 Let me tell you, I worked for Fusion, which was owned by Univision and ABC News, so Disney.
00:42:32.000 And one of the funniest things in the world was when I was talking about some collaborations that would be great to do, they mentioned, hey, aren't those people signed to Maker Studios?
00:42:41.000 And I was like, yeah, I think so.
00:42:43.000 And they were like, awesome, we'll do it.
00:42:45.000 And I was like, cool, all right.
00:42:47.000 Well, let me reach out to these guys and see what we can, Do you know what Disney bought?
00:42:52.000 And they went, what do you mean?
00:42:53.000 They're with Maker, right?
00:42:55.000 I'm like, yeah.
00:42:56.000 So let me reach out to them and see if they're interested in what they need to do it.
00:42:59.000 And they're like, no, no, no, they're already signed to Maker.
00:43:01.000 Disney owns it.
00:43:02.000 And I was like, and?
00:43:03.000 And they were like, so we'll just do it.
00:43:05.000 And I was like, do you know what Disney bought?
00:43:10.000 So there were high ups at this company that thought they bought a talent roster,
00:43:17.000 which gave them signed talent under obligations, like a talent agency.
00:43:22.000 And they did not realize all it did was a rev split.
00:43:26.000 Yep, yeah.
00:43:27.000 So I was like, I ended up having this conversation where I'm like, do you think that these people are signed to a talent agency we own and that they have to go through us for gigs?
00:43:27.000 And it didn't do anything.
00:43:37.000 And they were like, what is Maker?
00:43:40.000 And I was like, it's a multi-channel, like, what is it called?
00:43:43.000 Multi-channel marketing network, MCM.
00:43:46.000 That was like what it became called.
00:43:46.000 Yeah.
00:43:47.000 Multi-channel network.
00:43:48.000 Multi-channel network.
00:43:48.000 Yeah, I was like, all this means is that their YouTube channel is part of a multi-channel network to generate revenue for their channels.
00:43:55.000 And they were like, so we can't work with them?
00:43:57.000 And I'm like, we can work with them the same as we can work with literally any person at any company, but that means we have to negotiate a rate, figure out who their manager and agent is, and their agent could say no.
00:44:07.000 And they were like, we don't own their agency?
00:44:09.000 And I was like, no dude!
00:44:10.000 They had no idea what they bought.
00:44:12.000 The whole point of Maker in the beginning when me and Danny and Ben were talking about
00:44:15.000 it in a hotel room at YouTube Live in 2007 in San Francisco, I was like, we got to make
00:44:20.000 a web actors guild.
00:44:22.000 We'll call it WAG.
00:44:23.000 I don't know.
00:44:24.000 We had SAG, Screen Actors Guild for actors.
00:44:26.000 And all these YouTubers were getting screwed because we weren't getting...
00:44:28.000 It was just like, I could see the whole, the people taking advantage of it.
00:44:32.000 I wanted to create some sort of union.
00:44:35.000 And that's where the impetus came from to make Maker.
00:44:38.000 And then, so everyone was just kind of poured it coming in.
00:44:40.000 they were already wealthy, they were already making stuff, and we were just working together
00:44:43.000 I used to say I don't care about the money and lately I've been thinking about this a lot.
00:44:46.000 I care about it.
00:44:48.000 I care about it.
00:44:49.000 I understand the value and the usefulness of it but it's not my primary motive.
00:44:54.000 Um, I'm fortunate that I've never been starving on the street.
00:44:57.000 Pretty much.
00:44:58.000 I've lived in a car for a little bit of time, but like, like when I say I don't care about, I got to find a better way to phrase that.
00:45:03.000 It's not my priority.
00:45:05.000 Yeah, social capital nobody nobody will invest in a guy when he's like I care what the money they're gonna be like
00:45:10.000 exactly Cuz they want to make anyone to see some money
00:45:12.000 But like social capital is real if you've got a hundred people that will work for you for free
00:45:16.000 That's is more valuable than a million bucks nowadays You're not gonna find a hundred people that are gonna work
00:45:19.000 for you for free didn't that in the economy that we have right now
00:45:22.000 See sincerely this is this is an actual This is an actual material thing that you're gonna actually
00:45:27.000 have to confront if you actually want to do something like that and with people
00:45:31.000 having such a hard time with making ends meet with the jobs that they have the the
00:45:35.000 The value of the dollar going down so much as it has in in the past
00:45:40.000 You know year or two like Getting people to work for free, you have to be able to support yourself as it is, and you've got people that can't get houses, can't start families, they can't do all kinds of things that they want to do.
00:45:51.000 You hear people constantly talking about that, that they don't have the money for this, can't afford this, everything's gone up, the prices are so much, blah, blah, blah.
00:45:58.000 Getting people to work for free is not going to happen.
00:46:01.000 I would suggest another plan.
00:46:03.000 What you can do is get a bunch of exercycles and wire them to large batteries and then offer people a free exercise program to get in shape.
00:46:13.000 And what they're really doing is powering your house, saving your electric bill.
00:46:17.000 You can also start a cryptocurrency.
00:46:19.000 People did that.
00:46:19.000 I don't know how legal that is anymore.
00:46:21.000 You shouldn't.
00:46:22.000 I want to give a shout out to Nathan for you.
00:46:25.000 You see that episode where he's like, I've created a moving company.
00:46:30.000 And I've created a new workout program.
00:46:33.000 And he got a guy who like never did this to go on TV and claim that moving is actually the most robust and all-around workout.
00:46:41.000 And then he goes to him and he's like, okay, I'm a moving company.
00:46:44.000 We'll move your house and all your furniture to your new location.
00:46:47.000 It'll cost you X. He's like, okay.
00:46:49.000 Then he goes to a bunch of people who want to lose weight.
00:46:50.000 And he's like, it's a great workout program where you move furniture.
00:46:53.000 And they're like, oh, wow.
00:46:54.000 So he basically, he gets laborers for free to move someone's furniture for him.
00:46:58.000 That show was great!
00:47:00.000 Let's jump to the story.
00:47:02.000 This is actually the big news, I gotta be honest.
00:47:05.000 This is from Bloomberg.com, and I laughed a lot when I read the headline.
00:47:10.000 Dubai grinds to standstill as cloud seeding worsens flooding.
00:47:14.000 I would just like to stress, the headline is effectively, government weather manipulation backfires, worsening flooding.
00:47:23.000 I don't wanna say it caused it, okay, maybe it did, but yo, look at this.
00:47:30.000 Torrential rains across the UAE prompted flight cancellations, forced schools to shut, and brought traffic to a standstill.
00:47:36.000 The heavy rains that caused widespread flooding across the desert nation came after cloud seeding.
00:47:42.000 The UAE has been carrying out seeding operations since 2002 to address water security issues, even though the lack of drainage in many areas can trigger flooding.
00:47:50.000 So I don't know if you guys saw these videos that were going viral.
00:47:52.000 Yeah.
00:47:53.000 Insane flooding.
00:47:54.000 Apparently just a few days before, they do this thing where they spray potassium chloride into updrafts, which launches salts into cloud formations, which is then, it attracts water particles.
00:48:08.000 It's a, you know, salt.
00:48:10.000 It wants to absorb the water, it pulls the water in, creating a dense pocket of water that falls down as rain.
00:48:17.000 I don't know if they accidentally let loose too much, but it sparked, look at this, it's a natural salt, this is potassium chloride, and it resulted in this mass flooding all over Dubai.
00:48:29.000 Oh, this is kind of like a good thing.
00:48:31.000 Not the flood, but the warning itself is a good thing, like that we know that this can happen.
00:48:36.000 No, this is a terrible thing.
00:48:38.000 And the reason this is a terrible thing is because there are people talking about using methods to affect the amount of sunlight the planet gets
00:48:50.000 in order to prevent the Earth from warming anymore.
00:48:53.000 Now, first of all, the idea that the Earth warming is bad is controversial in and of itself.
00:49:01.000 When human beings meddle with stuff like this, they do not have the ability to predict the outcome,
00:49:08.000 which is why you have floods in Dubai, right?
00:49:12.000 So this is similar to what happened with Lysenkoism in the Soviet Union.
00:49:23.000 was a scientist and he rejected darwinism and this was soviet science soviets rejected darwinism and their belief was that plants that are like each other work communally this was an argument made because they they were against western science totally and they said you should plants you should plant plants that are like You know, of the same variety.
00:49:46.000 You can plant them very close together because they will work as a one unit and they will be more prosperous.
00:49:54.000 That is absolutely wrong and it caused a famine that killed millions and millions of people.
00:50:00.000 This is what happens when the, when man thinks that he needs to affect nature on that grand
00:50:08.000 a scale.
00:50:09.000 A similar thing happened in China when you talk about the, or when you hear about the
00:50:12.000 sparrows, there was an argument that that Mao was making that sparrows were foreign.
00:50:18.000 They were not Chinese.
00:50:20.000 They were not native to China.
00:50:21.000 So because they were not Chinese.
00:50:23.000 They were not communist So the communists should get rid of the of the non-communist sparrows I know it sounds crazy, but that's the argument that he made And so that's what they did every time the sparrows landed people would go and chase off the sparrows.
00:50:35.000 They would kill them They would they would they would you know, just whatever just to get them into the air and get them to go away What ended up happening?
00:50:42.000 Was the fact that there were no sparrows, or not enough sparrows, meant that the bugs ended up creating another, like, a massive swarm of bugs that ate the crops, and there was another famine.
00:50:54.000 These types of grandiose plans to affect, like, the amount of sunlight that falls on Earth are doomed, and they doom millions and millions of human beings.
00:51:06.000 They're risky.
00:51:06.000 Billions, possibly.
00:51:07.000 Some of them are effective.
00:51:08.000 Sometimes geoengineering is good.
00:51:10.000 Why would you want to make less sunlight?
00:51:12.000 The Amazon River Basin, for instance.
00:51:14.000 Apparently the Amazon rainforest was man-made.
00:51:17.000 Apparently humans have worked on dirt.
00:51:18.000 If you look up the dirt under the Amazon... I don't believe that one bit.
00:51:21.000 Check this out.
00:51:22.000 I know, it's shockingly bizarre.
00:51:24.000 No, I don't believe that at all.
00:51:25.000 There's this rich soil.
00:51:26.000 But Phil, did you know that Atlanteans were white?
00:51:29.000 I heard that clip on Joe Rogan.
00:51:32.000 Amazon soil, it's really dark, rich soil that's man-made.
00:51:36.000 They find it in the basin of the Amazon.
00:51:37.000 Have you guys studied this dark terra preta, is what it's called?
00:51:40.000 And apparently it was created by humans.
00:51:42.000 I have done no research on this.
00:51:43.000 I have read absolutely nothing about it.
00:51:45.000 And I'm going to sit here and smugly tell you he's wrong.
00:51:48.000 It's a very dark, fertile, anthropogenic soil found in the Amazon basin.
00:51:52.000 I don't know anything about this, but I don't believe I don't believe that the Amazon forest was created by humans.
00:52:00.000 No, what happened was they made the dirt to fertilize the area while they were like that tens of thousands of years ago or whatever, could have been Atlantis, could have been an ancient civilization, and then after everything passed away, the Amazon just flourished because of this soil, this rich soil that they'd created.
00:52:13.000 Right, but I don't think creating soil is comparable to chemical geoengineering.
00:52:18.000 Yeah, it's a bit different.
00:52:19.000 Yeah, so, Washington Post has a different take on it, and this is interesting.
00:52:24.000 The headline is, This Technology Didn't Cause Dubai's Floods, Scientists Say Here's Why.
00:52:30.000 No, no, wait, hold on there a minute.
00:52:32.000 This technology?
00:52:35.000 Why did an editor who picked this up say, don't put cloud seeding in the headline?
00:52:39.000 Why not?
00:52:40.000 Why could they not say cloud seeding?
00:52:41.000 This is weird.
00:52:43.000 This article may as well not exist.
00:52:45.000 But that's what they're trying to say after nearly two years worth of rain flooded the Dubai region Tuesday.
00:52:50.000 Attention quickly shifted to cloud seeding.
00:52:52.000 Why did they put cloud seeding in the headline?
00:52:54.000 It's almost like they don't want people to know they're doing this.
00:52:57.000 I'm not saying that's the case.
00:52:58.000 It's just a nondescript article nobody's going to read is weird.
00:53:02.000 Yeah, I mean, this stuff gives me the gives me the shivers because of things like, you know, like the... Remember pig iron?
00:53:11.000 Pig iron?
00:53:11.000 Yeah, it's when they said, we need the metal for weapons, or what were they?
00:53:15.000 They told everybody to melt down all of their tools to make weapons, but it was garbage iron that broke.
00:53:21.000 Yeah, the iron became very brittle.
00:53:23.000 Yeah, Chinese Communist Party.
00:53:24.000 Oh, yes.
00:53:24.000 Yes.
00:53:25.000 Yes.
00:53:25.000 Okay.
00:53:25.000 Yes.
00:53:25.000 I do.
00:53:25.000 I do remember that.
00:53:26.000 I'm not familiar with the story, but I do remember that.
00:53:28.000 But this is the thing that I'm concerned about is like this type of impulse by the powers that be or whatever, NGOs, big governments, whatever you want to call it, or whoever's involved in it, because I think that they're probably it's not just just governments.
00:53:42.000 It's climate activists.
00:53:44.000 And there are NGOs that are involved in stuff like the UN and stuff.
00:53:48.000 The things that are going on in Europe about the farmers and the protests and trying to prevent the farmers from using certain kinds of fertilizer because of carbon and stuff like that, all of those things will have massive downstream effects on the rest of the world.
00:54:07.000 And when you meddle with What actually are delicate systems, right?
00:54:12.000 The system that provides food for the 8 billion people on earth is because of petrochemicals.
00:54:21.000 It's because of oil.
00:54:22.000 Without oil, if we just say leave it in the ground like the environmentalists say they want to, that means that billions of people die.
00:54:31.000 Not millions, billions.
00:54:33.000 And I think that there are people that are far too quick to think that humans have everything figured out, especially nowadays with the information and technological revolutions that we've had since just since the turn of this century, never mind last century.
00:54:47.000 But this one, people frequently think, OK, we've solved these problems.
00:54:51.000 We've got everything under control.
00:54:53.000 AI is almost here.
00:54:54.000 We're going to figure everything out.
00:54:55.000 We can go ahead and just go ahead and do it and we'll figure it out and everything will be fine.
00:54:58.000 But that is probably wrong.
00:55:01.000 I got a conspiracy for you.
00:55:02.000 Let's hear it.
00:55:02.000 Remember global cooling?
00:55:04.000 Yes, in the 70s.
00:55:05.000 Yeah, they were telling people to drive as much as possible.
00:55:07.000 Were they really?
00:55:08.000 Yes.
00:55:09.000 There were magazine articles about it.
00:55:10.000 There's a viral video where it's a guy saying, we may face another ice age as the planet cools rapidly.
00:55:16.000 Trends are showing the planet getting cold.
00:55:19.000 Conspiracy theory.
00:55:20.000 Yeah.
00:55:20.000 The government fearing global cooling said the people aren't producing enough carbon.
00:55:26.000 So we are going to have a new ice age and it will destroy our economy.
00:55:29.000 It'll destroy this country.
00:55:31.000 So they created a device that would heat the planet just a little bit to stave off global cooling.
00:55:37.000 But oh no!
00:55:38.000 They lost control and it overheated and started global warming and now they're like, oh quick, global warming's the problem now.
00:55:44.000 That explains everything.
00:55:45.000 Overcompensation.
00:55:46.000 That's right.
00:55:47.000 I understand.
00:55:47.000 Now we have global warming and they're desperately trying to stop everybody from carbon.
00:55:52.000 You made a great point before the show about that we're in an interglacial period still.
00:55:56.000 We're still in an ice age.
00:55:57.000 We're coming out of the last ice age.
00:55:58.000 I think it's confusing because the comets seem to have hit 10,000, 12,000 years ago and melted a bunch of ice.
00:56:03.000 So it looks like we're kind of out of it already.
00:56:05.000 But the reality is we still have ice on earth because we're in an ice age.
00:56:08.000 People don't realize that the term ice age means that there is constant ice on the polar caps.
00:56:15.000 The Earth has gone in and out of ice ages.
00:56:18.000 The fact that we have polar ice caps currently means that we are currently in an ice age.
00:56:24.000 We're coming out of it, and that is natural.
00:56:26.000 There will be a point in the future when there are no ice caps on Earth.
00:56:31.000 Human beings will survive.
00:56:32.000 We will be able to deal with this.
00:56:37.000 And reptiles and all sorts of land-based animals have dealt with that kind of stuff for as long as there has been life on earth human beings being the the Conscious and creative and opposable thumb having Machines that we meet machines that we are we'll figure this out, too It's not the end of the world and it really does boil down to Governments are just trying to use the the climate as an excuse to control the populations populations Let me let me play this clip.
00:57:08.000 This is from Damn, that's interesting on reddit and it's a clip from 1978 warning of an impending ice age check this one out At least eight times in the past million years, it has advanced and retreated with clockwork regularity.
00:57:22.000 If we are unprepared for the next advance, the result could be hunger and death on a scale unprecedented in all of history.
00:57:30.000 What scientists are telling us now is that the threat of an ice age is not as remote as they once thought.
00:57:37.000 During the lifetime of our grandchildren, Arctic cold and perpetual snow could turn most of the inhabitable portions of our planet into a polar desert.
00:57:50.000 Wow!
00:57:51.000 In 1977, the worst winter in a century struck the United States.
00:57:59.000 Arctic cold gripped the Midwest for weeks on end.
00:58:04.000 Great blizzards paralyzed cities of the Northeast.
00:58:08.000 One desperate night in Buffalo, eight people froze to death in marooned cars.
00:58:13.000 Pat Bushnell was on the road that night.
00:58:16.000 Traffic just absolutely stopped.
00:58:18.000 I was afraid of being stuck in the car all night long, with the cold and the wind running out of gas.
00:58:26.000 And then what?
00:58:27.000 I think that if we had to go through a real bad winter, just like we just went through, I think we'd have to think about moving someplace else.
00:58:37.000 Move where?
00:58:39.000 The brutal buffalo winter might become common all over the United States.
00:58:44.000 Climate experts believe the next ice age is on its way.
00:58:49.000 According to recent evidence, it could come sooner than anyone had expected.
00:58:53.000 Ooh, scary music.
00:58:56.000 And this is in my lifetime.
00:58:58.000 You know, like I was two, but still.
00:59:04.000 Sea coasts long free of summer ice are now blocked year-round.
00:59:09.000 According to some climatologists, within a lifetime, we might be living in the next ice age.
00:59:19.000 Of the nine planets in our solar system, only Earth has conditions favorable to human life.
00:59:24.000 So, uh, imagine if in 1978 when they made this video, governments of the world decided to enact
00:59:37.000 a global geoengineering project to prevent global cooling.
00:59:42.000 Imagine the catastrophe.
00:59:45.000 Now that they believe it's global warming and the sea levels will rise, imagine they were like, okay, we're going to, you know, enact all these policies, we're going to create these devices, these chemicals, that will make the planet warmer.
00:59:58.000 Then 20 years later, they're like, uh-oh, the planet's actually warming, the exchange was wrong.
01:00:01.000 They might have done that.
01:00:02.000 They might have literally done that.
01:00:03.000 Well, I'm not saying they did, I'm saying... That's interesting.
01:00:05.000 If it is true that climate change is happening and the planet's getting warmer, Imagine if in 1978, they actually tried to heat up the planet out of fear of an Ice Age.
01:00:15.000 This is the problem with humans thinking they're smart enough to control everything.
01:00:19.000 And also, I read about, we're in what's called the Quaternary Ice Age, which started around 2.6 million years ago.
01:00:24.000 So this whole thing, they were already in an Ice Age, that entire show that they were just doing, when they were like, we may enter another Ice Age.
01:00:31.000 You smart humans didn't know you were in an Ice Age when you were making that video?
01:00:35.000 The stupidity of intelligence.
01:00:36.000 Oh, dude, when we were kids they thought dinosaurs were lizards.
01:00:39.000 Now they're birds.
01:00:40.000 Yeah.
01:00:41.000 So maybe they're not even birds.
01:00:42.000 Maybe they're mushrooms.
01:00:43.000 Who knows?
01:00:44.000 Yes.
01:00:47.000 Bro, I was just thinking, I had a vision a couple nights ago about something about, something about, you were sparking some memory about what I think, what we think something is that it's not.
01:00:57.000 Anyway, it'll come back to me.
01:00:58.000 No, I don't think dinosaurs are mushrooms.
01:00:59.000 That's not why I said that.
01:01:00.000 But they have feathers.
01:01:01.000 Apparently they have feathers.
01:01:02.000 That's freaking cool.
01:01:04.000 Yeah, they're birds.
01:01:05.000 That's why chickens look like little dinosaurs.
01:01:07.000 That was one of my favorite moments on IRL.
01:01:11.000 You almost saw a memory like, my mind was, my neural pathways were reforming.
01:01:15.000 And in that moment, Ian shattered through the veil and saw the truth of the universe.
01:01:19.000 If I was on DMT.
01:01:20.000 Dinosaurs were mushrooms.
01:01:21.000 We would have known.
01:01:23.000 I think that mushroom, you know my theory about mushrooms, about fungus.
01:01:26.000 I think what happened was we got a planet, it's twisting open, you've got hydrogen, oxygen making all this water.
01:01:30.000 Panspermia.
01:01:31.000 You get all these spores just splash into Earth.
01:01:34.000 So we've got these spores in our tide pools.
01:01:37.000 The spores that start eating the vegetable matter become mushroom, become fungus.
01:01:41.000 The spores that start eating other spores become animal.
01:01:44.000 And that's where we came from.
01:01:46.000 Well, where did those come from?
01:01:47.000 Space.
01:01:48.000 Well, yeah, but that's just pushing it back.
01:01:50.000 That's a cop-out.
01:01:52.000 I like the way you think, Phil.
01:01:53.000 Well, Phil, what happened was, there was a volcanic eruption, and in this charged particulate burst, it made contact with water, and then these chemical compounds began to merge, forming proteins that began to self-replicate.
01:02:13.000 Yeah, the formation of amoebas are fascinating because it's like a single cell that joins with another single cell and they work together to get the food to come in between the two of them and then other cells will come up around and become kind of like they'll curl in so the food doesn't fall out and you see like six cells working together to capture food that they can all share and then it becomes an organism and you're like oh that's a thing now and we see a six-celled organism.
01:02:34.000 Pretty cool, that's a pretty good theory.
01:02:36.000 And the way that fungus- evolutionary biologists would be able to tell you specifically where they think- how they think fungus evolved from, like, the tide pool.
01:02:43.000 I'm not- You know, the funny thing is, the next evolution, of course, are gonna be these gigantic, creepy robots.
01:02:48.000 Oh, dude.
01:02:48.000 It's kinda awesome, though.
01:02:50.000 You got that video of the Boston Dynamic?
01:02:52.000 The new one?
01:02:53.000 I kinda wanted to save that one for its own segment.
01:02:55.000 Oh man, creepy as hell.
01:02:56.000 You tweeted it out like... Hold on, before we move on to that, I want to at least make the point, they've been talking about global warming and stuff for a long, long time.
01:03:06.000 The coastlines have not changed, right?
01:03:11.000 There is significantly less ice on the North Pole than there was 20-30 years ago, but the coastlines have not changed.
01:03:18.000 But what about the Sphinx?
01:03:21.000 Wasn't that underwater?
01:03:22.000 It looks like it.
01:03:23.000 Was it?
01:03:24.000 Apparently there was water involved in the erosion.
01:03:26.000 There's lots of erosion on the sides.
01:03:28.000 I don't know if it was heavy rainfall or if it was actually submerged.
01:03:31.000 And I think that it actually, it was like eroded and then they built up.
01:03:34.000 Wait a minute.
01:03:36.000 If, you know, you guys know what expanding earth theory is?
01:03:38.000 Yeah.
01:03:39.000 There's a cool video we could pull up.
01:03:40.000 Hold on.
01:03:42.000 What if the earth is expanding?
01:03:44.000 Because if it does, that means the water levels will go down.
01:03:46.000 Oh.
01:03:49.000 That's interesting.
01:03:50.000 Yeah, that proves it.
01:03:52.000 If it's, unless it's, that's actually, that's possible.
01:03:58.000 Unless it's making more water as it expands open, but I think you might be, you might have an interesting point.
01:04:02.000 So let's say you've got, so I don't, I don't know that I, this is true.
01:04:06.000 I think this is just like fringe internet stuff.
01:04:07.000 But the idea is that titanic plates aren't actually going under and over and overlapping and spinning around.
01:04:13.000 It's that they're overlapping and unfolding.
01:04:16.000 And so the earth is actually getting bigger.
01:04:18.000 So imagine you've got, you know, a ball.
01:04:21.000 and around it is an inch of water in every direction.
01:04:24.000 If the ball gets bigger, the water will spread thinner and thinner to cover the mass.
01:04:28.000 So if the earth was more compressed 4,000 years ago and has been expanding,
01:04:32.000 the water would be going down because it would less and less water
01:04:34.000 to cover the surface of the expanding ball.
01:04:36.000 Whatever keeps Barack Obama's house dry is the theory that we should go with.
01:04:41.000 It's possible that there is more hydrogen coming out as it expands to make more water.
01:04:46.000 So it might be, there might be a homeostasis with it.
01:04:48.000 But the concern with the ice caps melting and the sea levels rising is that there's a word for it.
01:04:55.000 The ice is pressing down on the poles.
01:04:58.000 It's pressing down on Antarctica.
01:04:59.000 So if that ice is abruptly removed and Antarctica lifts up because there's no more weight on top of it, Earth elsewhere will dip down.
01:05:07.000 It will sink.
01:05:07.000 So, like, that's, they think what happened to Atlantis is that because all those ice caps just abruptly changed, Atlantis sunk down as well as got hit by a flood.
01:05:17.000 But if it's not abrupt, then I don't think, yeah, if it doesn't happen all abrupt, like just in a day or three days or something, then it might be a really slow, you might be able to, There's a funny viral video.
01:05:28.000 Who was it on Joe Rogan?
01:05:29.000 Was it Graham Hancock?
01:05:30.000 Today, yeah.
01:05:31.000 Or while the clip was circulating today, I believe it was Graham Hancock.
01:05:34.000 Yeah.
01:05:34.000 And who else was on that show?
01:05:35.000 Basically, the other guy was saying that these ideas are rooted in white supremacy.
01:05:39.000 Yeah.
01:05:40.000 Because they believe that Atlantis was a bunch of white people.
01:05:43.000 Yeah.
01:05:43.000 And I was like, well... That was a cool... Did you watch the whole show?
01:05:46.000 I didn't get a chance to see it.
01:05:47.000 I mean, look, I don't know about... I'm not a guy that studies the Atlantis stuff, but I don't imagine that Atlantis has, you know, had...
01:05:47.000 No.
01:05:57.000 I don't know.
01:05:58.000 I don't imagine that Atlantis was actually real, to be honest with you.
01:06:01.000 But that being said, whoever comes up with a story or whatever culture is creating the story, because I think that it is a myth, so whoever's writing the myth, they're going to make the inhabitants like them, especially a thousand years ago or whenever the story of Atlantis first started circulating or whatever.
01:06:22.000 People just imagine themselves.
01:06:23.000 People project.
01:06:25.000 It's ridiculous to call it racist or white supremacist because...
01:06:29.000 Someone is sitting around a room full of white dudes and then is imagining another, like a person a hundred years ago and they imagine a white dude.
01:06:35.000 It's like people, you know, there's, there's pictures of Jesus in different cultures and there's like, there's like Japanese Jesus, there's black Jesus, Arabic Jesus and all that stuff.
01:06:43.000 I think the, I think the- Swarthy Jesus.
01:06:45.000 The Atlanteans had Neanderthals in prison.
01:06:47.000 The last Neanderthals on earth were in prison in the Capitol and they all died in the flood.
01:06:50.000 That's in my script that I'm writing anyway.
01:06:52.000 That's in my, in my, it's awesome movie that is going to be produced.
01:06:55.000 It's going to be the greatest movies of Atlantis ever made called The Lost City of Atlantis.
01:06:58.000 I love it.
01:06:59.000 Big budget. The guy that was debating Graham Hancock's name is Flint Dibble.
01:07:03.000 There you go.
01:07:03.000 And that's the guy that wrote an article and kind of in...
01:07:05.000 I don't want to speak out of turn, but they're saying that he was...
01:07:08.000 Drag that man.
01:07:10.000 Drag that man.
01:07:11.000 He took my quote out of context.
01:07:14.000 Yeah, he made a thing, it was like, Graham Hancock stuff, he's citing sources that are racist, and then he was like
01:07:21.000 associating Graham with racist.
01:07:22.000 Drag that man.
01:07:23.000 And Graham was like, you're making me look bad, you're associating me with things.
01:07:25.000 Like, no, that's not what I meant to do. I'm out of context.
01:07:28.000 It was a pretty cool episode.
01:07:29.000 Well, let's jump to this video we have from Boston Dynamics.
01:07:32.000 Lex Fridman posted this.
01:07:34.000 I'd like to play for you your dystopian apocalyptic nightmare clip starting now.
01:07:38.000 Oh man, it's oh Oh wow.
01:07:50.000 You saw its legs turn around?
01:07:53.000 It's so creepy.
01:07:55.000 It's standing backwards.
01:07:57.000 Its head and legs spin around.
01:08:03.000 Wow, dude.
01:08:04.000 I mean, I get the point of that is to show the articulation, how it's capable of moving in a great in a way that is more as more mobile than a human being.
01:08:13.000 But still, there's a whole lot of man.
01:08:16.000 That's the exorcist kind of movement that you look at.
01:08:18.000 You're just like, when is she going to go ahead and climb on the ceiling?
01:08:21.000 I'd like to contrast the two worldviews here.
01:08:24.000 Lex Redmond says, Congrats to Boston Dynamics on their new electric version of Atlas Robot.
01:08:29.000 Thanks to all the amazing engineering teams at Boston Dynamics, Tesla, and others pushing the field of robotics forward.
01:08:34.000 I can't wait to hang out with Atlas and Optimus together at some point, Robot Party.
01:08:38.000 To which I responded, I can't wait to fight these things as my friends scavenge a run-down gas station for food and I attempt to buy them time before we flee into the sewers.
01:08:46.000 My thought when I saw that was, they will also be in the sewers and they can see in the dark.
01:08:51.000 So, yes, they will be.
01:08:53.000 So, night vision is a great technology.
01:08:55.000 And they can, like, crumple up.
01:08:56.000 Yeah, they can turn into a little box.
01:08:58.000 You'll kick it.
01:08:58.000 You'll be walking in the water and accidentally kick the thing.
01:09:01.000 I mean, how much does that look like the stuff from my robot?
01:09:03.000 Imagine you're in the sewer with your buddies and you have, like, a backpack and you're, like, armed.
01:09:07.000 You've got limited provisions and you're, like, we need to make it through because the robots find us, they'll kill us.
01:09:11.000 And then you stub your toe and you look down and there's a box and you go, Oh my god, and then it starts curling, curling up and shifting around and his arms are folding, his head spins around and then it goes like, human detected.
01:09:24.000 God, it looks like T-1000, yeah, it feels like the Terminator from T-1000.
01:09:28.000 So here's an honest question though, like, why would anyone assume these things would not become dangerous?
01:09:35.000 Well, if you can create an intelligence, and if people that create the intelligence decide that they're gonna give it motivations, which is stuff that people that are working on AI are gonna do, because that is what is happening here.
01:09:51.000 Like, we're watching, not only are we watching robotics come to a place where it can mimic human form, we're also trying to mimic human intelligence, and they're going to be combined, without question.
01:10:02.000 In 50 years, there are going to be artificial intelligence, Humanoids walking around in society like that's going to be very normal.
01:10:09.000 There's going to be well normally I'd say this you would think there would be some kind of flag.
01:10:16.000 They would require like any artificial human humanoid robot is required to wear something or have a mark.
01:10:23.000 So, you know, it's not a real person but based on how the internet evolved that won't happen.
01:10:27.000 Yeah.
01:10:28.000 We are legislatively paralyzed.
01:10:30.000 So, the internet, for example, on Axe, for instance, even with Elon Musk doing a great job, as he does, of getting rid of predators, you still have hardcore adult content on Axe, and 13-year-olds are allowed on there.
01:10:41.000 That's insane!
01:10:42.000 That should not be allowed.
01:10:43.000 There should be age verification.
01:10:44.000 They should block those hardcore channels so that you can't watch this stuff.
01:10:48.000 That says to me, they are going to make AI humanoid robots.
01:10:54.000 They've already got rudimentary ones that are clearly not people.
01:10:59.000 You watch the videos and they're looking better and better, but they move stiff and they're like, hello, Phil, it's great to see you.
01:11:06.000 And you're like, okay, the voice is getting better.
01:11:09.000 We saw that one video where the robot stuttered.
01:11:11.000 They add a fake AI stutter voice.
01:11:14.000 To make it more human-seeming.
01:11:15.000 Right.
01:11:16.000 And so what's gonna happen is there will be no regulation.
01:11:18.000 You'll be walking down the street one day and there'll just be some guy and he'll be like, how's it going?
01:11:22.000 You'll be like, hey, what's up?
01:11:23.000 You won't even realize it was a robot.
01:11:25.000 Robot the whole time!
01:11:25.000 It's gonna be data from Star Trek.
01:11:28.000 No.
01:11:29.000 Next Generation.
01:11:29.000 Like, sort of.
01:11:31.000 But one of the first, the funny thing about data from Star Trek The Next Generation is He had no emotions, and he struggled to act human.
01:11:39.000 He was trying to learn how to be human.
01:11:41.000 The first thing they're doing is creating the personalities.
01:11:44.000 Do you think that they'll work out, these robots, just to seem more human?
01:11:48.000 Because they don't need to work out, but do you think you'll see one running on a trail and be like, hello, sir?
01:11:52.000 They don't regenerate the way we do, so that would just limit their lifespans.
01:11:55.000 So you'll know if you see a guy running down a trail and he says, hey to you, that it's not a robot.
01:12:00.000 Unless it's a spy robot intended to infiltrate, you know, these places.
01:12:04.000 I mean, we're literally, like, just a skin suit away from the first gen Terminators that they talked about in the movie.
01:12:10.000 Then you could see them because their skin was latex.
01:12:12.000 I think, uh, I don't think they're gonna build these things for any functional work purpose first.
01:12:18.000 I think it'll be, uh, saxobots.
01:12:21.000 Because, look, I can hire someone at minimum wage, or I can spend how much money on one of these robots to lift boxes?
01:12:29.000 Wow, you got like a $30,000 robot that just cleaned your house, and you had sex with you?
01:12:35.000 That would be crazy.
01:12:36.000 I didn't mean you, personally, but I'm just saying in general.
01:12:39.000 The rhetorical you.
01:12:40.000 If someone could do it, they'd go get groceries.
01:12:42.000 Maybe leaving the house is a little extreme for a robot at this point.
01:12:45.000 So this is the issue, actually.
01:12:47.000 Actually, fair point.
01:12:47.000 I was wrong.
01:12:48.000 How much do these robots cost?
01:12:52.000 I'm not sure the cost, but I imagine that they're going to be looking to be, to... It's gotta be, what, millions?
01:12:58.000 Maybe now, but I mean, the... I don't know, because the... I don't know what the technology's like, but really, it's like, you're talking about electric motors and servos, so I don't know how involved it is, and I'm speaking definitely as a very ignorant person about this, but the technology is really in the software, and it's not in the servos and etc.
01:13:18.000 Like, the actual motors and stuff like that, it's not...
01:13:21.000 Super crazy, far-out technology to do it.
01:13:24.000 The important part is the balance and the software.
01:13:28.000 If it costs $30,000 for one of these robots, McDonald's replaces their staff in two seconds.
01:13:34.000 Because they're going to say, over the course of three years, we are going to pay any minimum wage employee $40,000.
01:13:43.000 These robots last five years and they cost $30,000 upfront.
01:13:46.000 $30,000 up front.
01:13:48.000 Done.
01:13:48.000 And they don't get burnt, and they don't come and call in sick,
01:13:52.000 and they can't, and they're not gonna go to HR, and they're not gonna, I mean, just, you know,
01:13:57.000 it's like, they could double as security at your building.
01:13:59.000 Well, no, you don't wanna do that, because what do you, why would you steal any,
01:14:04.000 why would you worry?
01:14:04.000 Because you've got all robots inside, you order at a kiosk, the robots make it and hand it to you.
01:14:09.000 What are you gonna, what's secure, what do you have to worry about security?
01:14:11.000 Because everyone's paying with their credit card or whatever.
01:14:14.000 There's no reason.
01:14:15.000 Yeah, it's like, let someone come in and break something, whatever, you've got insurance.
01:14:21.000 It takes all of the concerns of safety go away aside from safety for your customers.
01:14:29.000 Obviously, you want to make sure that the people that are coming to get food from there are safe, but otherwise, internally, for your business and stuff, all of the worries about OSHA and stuff, like, get out of here, who cares, you know?
01:14:43.000 There's a ton of stuff, there's a ton of things that make it more appealing.
01:14:48.000 This is exactly what the automotive industries did.
01:14:51.000 It's just that the automotive industry has big gigantic robots that have arms and stuff like that.
01:14:56.000 If the automotive industry can have these and just give them existing power tools and they can do what your average person is doing on the line, You're talking about wiping out entire industries for jobs.
01:15:12.000 Have you guys seen the, uh, they have, so there was an article in the New York Post about a guy who spends $10,000 a month on AI girlfriends.
01:15:19.000 No.
01:15:19.000 God.
01:15:20.000 Yeah.
01:15:21.000 Because, you know, there was that, there was that one company where people were using it for, you know, titillation.
01:15:28.000 And so they banned it and they were like, stop.
01:15:30.000 And then all of a sudden everyone revolted and said, but my waifu.
01:15:33.000 So they said, okay, grandfathered in, but from now on, no more of this weird, creepy, you know, titillating content.
01:15:39.000 So these other companies emerged and they were like, we'll let you do it.
01:15:43.000 So a lot of people are wondering where these like AI porn images came from that popped up all over Twitter.
01:15:48.000 There are services that allow you to generate your own girlfriend, like you can customize it and everything.
01:15:53.000 And then you pay a subscription, they allow you to generate Overt adult content of AI women guys are paying for it.
01:16:01.000 I was thinking the next era of Luddites It's not gonna be factory workers.
01:16:06.000 It's not it's not gonna be trades.
01:16:07.000 It's going to be sex workers There's gonna be a bunch of, yeah.
01:16:11.000 Musicians, too, maybe, because I was thinking in the shower earlier, like, geez, I, yeah, making music, it's like, it's not about the finished product.
01:16:20.000 Making music is actually about making the music.
01:16:22.000 It's about banging on a drum with your buddy and, like, making some sounds together.
01:16:25.000 Well, maybe.
01:16:26.000 When we played the A.I.
01:16:27.000 songs when Harmeet was here, she was just dismissive, saying, oh, this is stupid, it's boring, it's bad.
01:16:33.000 And I think the issue for a lot of people is they assume all music is Zeppelin or The Weeknd or Taylor Swift or something like well-crafted songs when I would probably estimate, I don't know, Phil probably knows better, but I'd say like 70% of music is background instrumental stuff for jingles, for movies, for, like, most people don't realize that when you're watching a movie, There's really subtle background music the whole time, almost the entire time in films.
01:17:01.000 And there's a guy who writes all that music.
01:17:03.000 So you go online, you can AI generate all of that now.
01:17:07.000 That's going to eliminate a large portion of money in the music industry.
01:17:11.000 Well, I'm down to talk more about sex work, actually.
01:17:13.000 Oh, before we go into sex work, the Boston Dynamic Robot, I got a price tag.
01:17:18.000 The Spot Robot Dog is $74,500.
01:17:21.000 What about the guy?
01:17:23.000 I haven't seen one for the guy.
01:17:24.000 And by the way, I'm on Brave.
01:17:25.000 It's AI answering my search queries.
01:17:27.000 It's got AI generating an answer.
01:17:29.000 I want to know how much this robot guy costs.
01:17:33.000 I mean, this is probably like prototypes.
01:17:36.000 There's probably like 20 or so of those that they've got made now.
01:17:39.000 How amazing would it be to get one of those, get a realistic silicon Seamus mask, put it over its head, and then attach it to chat GPT real-time voice with Seamus's voice, and Seamus could never leave us.
01:17:56.000 That's right.
01:17:56.000 He would leave, but we always have a- He'd be great!
01:17:59.000 You could have an AI that knows everything there is to know about potatoes.
01:18:02.000 Do you think you'd ever buy one of those Boston Dynamic dogs?
01:18:06.000 You can buy them now, can't you?
01:18:06.000 Yeah, it's $75,000.
01:18:07.000 Just to have it patrol the studio or something?
01:18:10.000 Freak people out?
01:18:11.000 People would lose their shit.
01:18:13.000 Actually, I mean, I gotta be honest, it's a really great thing to have because they walk into their own charger, I'm pretty sure, and they sit down.
01:18:19.000 Is that what they do, right?
01:18:20.000 I don't know.
01:18:21.000 Yeah.
01:18:21.000 Once the battery gets low, they walk to their charger and then charge.
01:18:23.000 The new place, it might not be a bad idea to have one on patrol.
01:18:27.000 Teach it to scale.
01:18:27.000 Well, because think about this.
01:18:29.000 The biggest issue we have is information when it comes to security.
01:18:33.000 So with all the buildings we've got, the reason we have security is because if someone comes around who shouldn't be, we need to know what's happening.
01:18:41.000 Then we call backup.
01:18:42.000 Now, a human being can also be armed in West Virginia.
01:18:45.000 So we've got a handful of those guys.
01:18:47.000 Ain't nobody coming around.
01:18:48.000 But 74,000 dollars for one robodog?
01:18:52.000 Costs way more for security.
01:18:54.000 Yeah.
01:18:54.000 I don't know how secure... It's more of a scout, I think, at this point.
01:18:57.000 Exactly.
01:18:58.000 And so, you can reduce the amount of security guards you have, have a couple guys who are armed, and have a couple robodogs, you cut your costs down.
01:19:05.000 Just built-in night vision.
01:19:06.000 Because the robodogs can do the patrolling.
01:19:08.000 All night long.
01:19:09.000 And then alert the security team to any.
01:19:12.000 And also scare off wildlife and stuff.
01:19:14.000 Yeah.
01:19:14.000 You can also have a thermal on a robot dog.
01:19:18.000 Yeah.
01:19:18.000 Thermal vision as opposed to night vision.
01:19:19.000 It's actually better if you're looking to identify living things.
01:19:23.000 What I was actually planning on doing was building fake auto-defense turrets for Freedomistan.
01:19:28.000 It'd be so cool!
01:19:29.000 So you would just see these, like, two big things moving back and forth with lasers pointing, and, you know, they wouldn't actually have any capability to do anything other than look intimidating.
01:19:37.000 Just follow someone if they get on motion.
01:19:40.000 And what it would actually be is, you know the sprinklers that go ch-ch-ch-ch?
01:19:43.000 Yeah.
01:19:43.000 We would just put a big cylinder on it so it would look like it's patrolling, but it's actually just a sprinkler.
01:19:47.000 That's funny.
01:19:48.000 And then people would be like, I'm not going anywhere near that thing, I don't know what that is.
01:19:51.000 Yeah, that'd be hilarious.
01:19:52.000 Yeah, it's freaky.
01:19:53.000 You know, I was talking about this earlier.
01:19:54.000 In West Virginia, we've had weirdos come onto the property, assuming nobody's there.
01:19:59.000 We had that incident that happened, I think it was last year, when some guys broke into one of the buildings and one of our security guys opened fire on them.
01:20:06.000 them you should yeah yeah and so like you but this is the price of freedom
01:20:11.000 yeah do you like look we we're out in the middle of nowhere and there's crime
01:20:17.000 You go to New York, there's crime.
01:20:18.000 In New York, you have no freedom and there's crime.
01:20:20.000 In West Virginia, there's crime and you have freedom.
01:20:22.000 So you can defend yourself.
01:20:23.000 You should get a gun safe for in the studio.
01:20:27.000 And so I can bring a gun to leave there when I'm not there.
01:20:30.000 It's safe, locked up.
01:20:31.000 Constitutional carry state.
01:20:32.000 Yeah.
01:20:33.000 You know?
01:20:34.000 But anyway.
01:20:35.000 Robots.
01:20:37.000 Robot sex dogs.
01:20:39.000 What?
01:20:40.000 Those words, I didn't mean to say them at the same time.
01:20:43.000 But now I'm thinking about it.
01:20:44.000 Clip it.
01:20:45.000 Uh-oh.
01:20:47.000 It brings a whole new meaning to the word doggy style, if you know what I'm talking about.
01:20:50.000 No!
01:20:51.000 No, down-vosh, down!
01:20:53.000 Dude!
01:20:54.000 Wow, that's coming up on the horizon and I did not mean to manifest that.
01:20:59.000 Oh God.
01:21:01.000 I mean, we talked about this before.
01:21:04.000 There's already a mod for, I think it's Skyrim, where you can talk to an AI companion and it uses GPT to answer your questions and talk to you.
01:21:13.000 Now with these AI girlfriends, this is the first thing they're putting money in because they know guys will spend money on it.
01:21:19.000 Look, you build a robot that can carry boxes, Amazon will go to their insurance company, they'll talk about liability, they'll talk about rates, They make the AI porn and the guys, they're buying it up.
01:21:32.000 I imagine like this, the, the robots like this, they're going to be, you know, they're going to be home appliances where, you know, you're going to have a robot around to do menial tasks.
01:21:44.000 You've already got Alexa that goes and turns people's lights in mine, in my apartment at the down here, the apartment, the apartment I have Alexa and it's handy.
01:21:53.000 When I go home, I don't, I don't have that stuff in my place in New Hampshire.
01:21:56.000 So when I go home, I can't even wait.
01:21:59.000 The chat is putting 20s for what he had just said.
01:22:02.000 Because you know I'm right.
01:22:03.000 That's so funny.
01:22:04.000 See, this is what the show would be every night if we didn't have to censor ourselves.
01:22:09.000 Let's be free together.
01:22:11.000 We didn't have to censor that.
01:22:12.000 You said it on the show.
01:22:13.000 Yeah, dawg.
01:22:14.000 Now you're home.
01:22:15.000 People are putting 20s in chat!
01:22:17.000 I agree, that was a 20 all the way.
01:22:19.000 I don't talk to the YouTube chat very much, but... Are you talking about the IRL chat or the Discord chat, or are you talking about the... IRL.
01:22:26.000 They were saying Ian King, ha ha ha, 2020.
01:22:28.000 I sexed good lord.
01:22:29.000 Robi- Don't encourage it, no more.
01:22:32.000 I love you.
01:22:33.000 Thank you for the chat, keep it coming.
01:22:35.000 I think it's important that we coin the term now, it already exists, but-
01:22:38.000 Robots, sex dogs.
01:22:39.000 Robosexuals.
01:22:40.000 Robosexuals.
01:22:41.000 Dude, would it be- The guys who have AI girlfriends are robosexuals.
01:22:45.000 Would it be rape if you had sex with a robot, but it didn't tell you it was a robot?
01:22:50.000 Would the robot have raped you?
01:22:52.000 Wait, what?
01:22:52.000 If the robot didn't disclose it was a robot before it had sex with you, would that be considered rape?
01:22:56.000 I don't think inanimate objects have intent.
01:23:01.000 It would be the owner of it, actually.
01:23:03.000 Oh, that's weird.
01:23:04.000 This is actually a question we haven't actually answered, and the Supreme Court's gonna have to take it up.
01:23:10.000 You're in a self-driving car.
01:23:12.000 Let's say we're at the point where we have those self-driving taxis, right?
01:23:16.000 You're sitting in the back seat and you're on your phone, boop, boop, boop, and an old lady steps out from between two cars and she sees the car and she goes, wah!
01:23:22.000 And then the self-driving taxi has to make a decision, hit the old lady, swerve out of the way, crash, killing the passenger.
01:23:30.000 What does it do?
01:23:31.000 Who does it prioritize?
01:23:33.000 We don't know.
01:23:34.000 A human would react instinctively and swerve, probably, and put their passengers at risk.
01:23:37.000 We don't know.
01:23:38.000 But a person has to program the vehicle to do it.
01:23:41.000 So the next question is, if a self-driving taxi has nobody and it's driving around and it hits somebody, injuring them, who's at fault?
01:23:51.000 The terrible thing is that there's no criminal charge at all because it's a corporation.
01:23:54.000 It would be a fine and a lawsuit.
01:23:56.000 But if a human being is driving that car, that human being is responsible.
01:23:59.000 It's a scary prospect.
01:24:01.000 So, if somebody makes a robot, like Boston Dynamics' Atlas robot, what happens if one of their robots goes rogue and starts raping people?
01:24:12.000 Boston Dynamics is on the hook for that. It's gotta be.
01:24:15.000 Are they though? Who owns the robot?
01:24:17.000 First you gotta catch the robot and interrogate the thing.
01:24:20.000 Get its code and be like, why is it doing this?
01:24:23.000 The self-driving taxis are sold to another company.
01:24:25.000 Right? So the person driving... someone buys a Toyota and crashes it.
01:24:28.000 Toyota's not at fault.
01:24:30.000 You could sue Toyota, maybe, depending on what happened, but typically it's the driver of the car, and we say, you're driving a car, and you crash the car.
01:24:36.000 Now there's no driver who's at fault.
01:24:38.000 The company who made the self-driving car, or the company that bought the self-driving car and pressed go.
01:24:41.000 Right, because someone could buy the dynamic robot and change its code, potentially.
01:24:45.000 I don't know if that's actually feasible.
01:24:46.000 Or not even.
01:24:47.000 They buy it, and they say, I want this robot to provide companionship, but then it goes, Roger that, yeah!
01:24:55.000 Everybody knows that these things are going to be Wi-Fi, someone's going to hack it, and then you're just going to control it like it's a drone, man.
01:25:03.000 And it's going to have a built-in camera that's going to be transmitting your sex life to someone.
01:25:08.000 I kind of moved away from the sex part once I said you take over it.
01:25:11.000 You just want to keep going back.
01:25:14.000 It's probably the biggest driver of humanity is sex.
01:25:18.000 Like, it is, the porn, I think they say that porn is responsible for the success of the internet in a lot of ways.
01:25:23.000 Yeah, porn is responsible for the VHS over Betamax.
01:25:26.000 And it's also why the internet speeds ramped up is because there was massive demand for, the main video demand was, you know, graphic content.
01:25:33.000 So that's what I'm saying, like, with these AI girlfriends, the chat communications and video development, it's like, the big, okay.
01:25:42.000 You have chat GPT, like I signed up.
01:25:44.000 How much does that cost?
01:25:44.000 It's like cheap.
01:25:45.000 And what do you do?
01:25:46.000 You like ask it questions and it's like, eh, fine, whatever.
01:25:49.000 But the AI girlfriends are a billion dollar industry.
01:25:51.000 Yeah.
01:25:51.000 Guys are dumping money on this stuff.
01:25:54.000 The monetary drive for the advance of this technology is because...
01:25:58.000 Simp guys want to bang robots.
01:25:59.000 They're robo-sexuals.
01:26:00.000 Look, I mean... I think it's important we say that, too.
01:26:03.000 I think we call them robo-sexual.
01:26:04.000 I'm fine with that.
01:26:06.000 There is going to be a demand for that, clearly, because you hear, you hear, you know, all the red pill dudes.
01:26:12.000 Actually, it's probably less the red pill dudes, more the people that listen to the red pill dudes.
01:26:16.000 But they're, they complain about the fact that women are, their standards are too high, etc, etc.
01:26:22.000 And they're, Women complain about men and the sexes have never been more at each other's throats and there are dudes that are like, I'm checking out of society or checking out a dating market and stuff.
01:26:34.000 There's a huge percentage of young guys that are, you know, 18, 19, 20 years old that have never had a girlfriend that have never been on a date.
01:26:40.000 There's all kinds of women that are like, oh, I can't find a guy.
01:26:44.000 People are going, when you can customize something like a robot or an AI to give you what you're looking for, there's going to be a lot of people that are going to gravitate to that.
01:26:54.000 Now I'm wondering about women with their AI robot men.
01:26:58.000 A woman wants to feel safe.
01:27:00.000 If there's a robot, I will protect you.
01:27:02.000 And he's got laser turrets on his arms.
01:27:04.000 He's like, no one's going to mess with me and my kid.
01:27:06.000 And he's also able to inseminate you and give you kids with your genetic desires or whatever the hell.
01:27:11.000 He's going back to it.
01:27:12.000 Yeah.
01:27:12.000 Like, would a woman take that as a robot husband?
01:27:15.000 Like, better than any of those simp dudes.
01:27:17.000 Like, this guy doesn't even work out.
01:27:18.000 This robot can lift 7,000 pounds.
01:27:20.000 He's got the horn tonight.
01:27:23.000 Here's the issue, right?
01:27:24.000 Guys, not every guy, but a lot of guys like to be domineering.
01:27:28.000 They like to dominate.
01:27:31.000 I wonder if women do.
01:27:34.000 I was reading this thing about the success of strip clubs.
01:27:36.000 Why is it that strip clubs are almost always women?
01:27:39.000 There are, you know, clubs where guys strip, but they're rare.
01:27:43.000 And I think this might be like OkCupid data.
01:27:45.000 They said men like watching women in submissive positions.
01:27:48.000 Women don't like seeing men.
01:27:50.000 To women, on average, submissive men are not attractive.
01:27:54.000 They want strong, commanding men.
01:27:56.000 So to see a guy on stage serving you is a weak position.
01:28:00.000 It's more of a funny thing to watch and less of a, you know, like attractive.
01:28:07.000 Whereas for guys, they just see the woman's body and they're like, yeah, dance, right?
01:28:11.000 So I wonder how that will translate to robots.
01:28:15.000 You said on stage, like, women don't want to see a guy as a servant.
01:28:19.000 So you're thinking, they don't want a servant robot.
01:28:21.000 They want a robot that's going to take charge and be like, we're going to the park today.
01:28:25.000 Guys, maybe, I don't know.
01:28:27.000 Guys are going to buy these.
01:28:28.000 So look, with these websites, I think it's called, what was it called?
01:28:32.000 I can't remember what the name of the website was.
01:28:34.000 New York Post had the story.
01:28:36.000 But, uh, it's like you could, you, you pick the kind of girl you want, like the kind of hair, the size of the, the funny thing is like all of the AI girls that have been generated have massive knockers.
01:28:45.000 And it's just like, just like ridiculously obscene, not real.
01:28:48.000 I don't, I don't think we got that.
01:28:49.000 Yeah.
01:28:50.000 So like on the, on the New York post, they showed a bunch of pictures of, of demo women and their boobs are just like, like those women would be in serious pain.
01:28:58.000 Yeah.
01:28:58.000 They would need surgery.
01:28:59.000 They're back breakers.
01:29:00.000 Yeah, and then, like, I pulled up the website for the show.
01:29:04.000 Here's the creepiest thing of it.
01:29:05.000 When you click create, it gives you two options.
01:29:08.000 Real or anime.
01:29:11.000 I don't get that.
01:29:13.000 That's the weirdest thing to me.
01:29:14.000 Like, what?
01:29:15.000 Wow, I don't get it.
01:29:16.000 Dudes want anime waifus, and I'm like, why not?
01:29:19.000 I guess this is the depopulation of humanity.
01:29:21.000 Why is it anime? That's just so weird. That's it. It's that's something about weebs. I guess what's a weed like an egg an
01:29:26.000 anime dork?
01:29:27.000 Dudes that like I'm like, dude, I like anime, but I don't get I guess this is the depopulation of humanity
01:29:34.000 Like it's it's a self-selecting system where people are choosing to have
01:29:38.000 virtual relationships and then just Until they're dead and because it's easy and then it's it
01:29:44.000 also works for if there really is an agenda a global general
01:29:48.000 There are too many people you guys we can't keep exponentially growing at this rate with this technology
01:29:54.000 Jeez, man And so just kind of you look you look at this stuff these AI girlfriend stuff and it's like They will say whatever you want them to say you program their personalities what they look like they can generate graphic images and then Imagine a guy grows up on that stuff, and then he meets a woman in real life, and she's like, hey, I'm not into that, like, we have to have boundaries.
01:30:14.000 Whoa, boundaries?
01:30:15.000 Robo-girlfriend has no boundaries.
01:30:16.000 She does whatever I tell her to do.
01:30:18.000 There's just, like, it's gonna shatter brains.
01:30:20.000 It's gonna break people.
01:30:21.000 The changes that have happened in the past 25 years, I mean, obviously, I mean, even even Ted Kaczynski's, you know, the manifesto, he acknowledged all of the changes that had happened just in the previous 100 years since the 150 years since the Industrial Revolution.
01:30:41.000 Humanity has had all of the things that have had social pressures and evolutionary pressures, all of that stuff has been removed because we have machines to do our work, we have machines to protect us, we have machines and technology to inform us and stuff, all of the things, all of the connection to actual nature and stuff, all that stuff's been removed and now with Machines becoming so, like, I mean, if you thought machines were, you know, common when you had toasters and cars and forklifts, like, when people are going to have, you know, when people have Neuralink and personal robots, nowadays personal robots, like, you have a robot that, you know, sweeps your house!
01:31:33.000 Or some people do.
01:31:34.000 You can buy, you know, some robots like this.
01:31:36.000 We used to have the Roombas.
01:31:37.000 Yeah.
01:31:38.000 But they suck.
01:31:38.000 Well, yeah, they're small in their opinion, but they're not as good as the big ones.
01:31:41.000 But that guy's gonna be able to grab the Dyson that you bought and do the Dyson for you, and that Dyson works like mad, man.
01:31:47.000 And he can probably fold up into a little cube and then sit himself in the corner.
01:31:51.000 So he'll sit right next to the Dyson, he'll get up, pick up the Dyson, actually do a good job cleaning.
01:31:55.000 That sounds terrible.
01:31:56.000 Or they'll make these, have you seen these amorphous robots?
01:31:59.000 They're like, they can change form, they can go through tubes and stuff, they're like, look like a goo, kind of.
01:32:04.000 Yeah, robots are- That could clean your floor real easy.
01:32:06.000 Well, I mean, maybe, but robots aren't going to be like robot isn't going to be one kind.
01:32:09.000 It's not going to be just the humanoid thing.
01:32:11.000 I mean, nowadays, everyone think nowadays you can actually think of, you know, Tim's car is a robot because it's got, you know, it's a Tesla that can do all kinds of stuff that.
01:32:20.000 Other cars can't do that.
01:32:23.000 I require this of Elon to add a voice assistant for Teslas.
01:32:27.000 Oh, awesome.
01:32:28.000 How am I not at the point where there is not like a red bar that, you know, moves up and down and says, Hello, Tim, where would you like to go today?
01:32:36.000 And I'd be like, we're going to the casino.
01:32:39.000 Hollywood it is.
01:32:40.000 And then it just goes, start driving.
01:32:42.000 Yeah, that's a good idea.
01:32:43.000 Also, Elon, make the headlights also double as projector screens so you can project a movie onto, like, the back of your garage while you're chilling and watch.
01:32:51.000 Also, Elon, make me a sandwich.
01:32:53.000 Yeah, Elon, get over here.
01:32:55.000 It's go time.
01:32:56.000 After you colonize Mars, we require more of you.
01:32:58.000 Exactly.
01:32:59.000 He's doing more than most people to... We'll build an electrostatic slingshot to get things into Martian orbit.
01:33:03.000 He actually is making more Elons.
01:33:05.000 So he's got, like, seven kids or something like that?
01:33:08.000 What were you gonna say?
01:33:09.000 Talk about Andrew Tate?
01:33:10.000 No, I was gonna talk about space sling shots.
01:33:14.000 I'd rather talk about space sling shots.
01:33:16.000 Have you seen those things where it's like a big, it's a big disc and there's like a hammer in it that spins around really fast and then shoots the thing straight in the sky?
01:33:22.000 Yeah, spin launch.
01:33:23.000 Yeah, that's the company.
01:33:25.000 And that's Earth to orbit.
01:33:26.000 But what you can then when once it's you got something in orbit, you can send it through like a mag rail that just fires it off into another orbit that catches it in like a reverse mag love magnet.
01:33:36.000 And so you can really like shoot packages.
01:33:38.000 Have they done that yet already?
01:33:40.000 I don't think so.
01:33:40.000 No, it's a big thing.
01:33:41.000 It spins really, really fast.
01:33:42.000 Oh, spin launch.
01:33:43.000 Yeah, it's up and active.
01:33:44.000 You can't really send organic like humans up because of the pressure will kill them.
01:33:47.000 But you can send.
01:33:48.000 I wonder how this is doing.
01:33:51.000 Yeah, man, spinning and throwing that...
01:33:53.000 We're gonna go to Super Chats!
01:33:55.000 So if you haven't already, would you kindly smash that like button, subscribe to this channel, share the show with your friends, head over to TimCast.com, click join us, become a member, because it doesn't look like YouTube likes us very much, and it'll be interesting to see what happens moving forward.
01:34:12.000 But of course, the premise of this episode is three years after our biggest episodes aired, they made up reasons to take them down.
01:34:20.000 And they won't give us assurances, they put a warning on the channel, And I'll stress this again, because I told them, I was like, look, you got a problem with the episodes, three years later, tell me you're taking them down.
01:34:30.000 Fine.
01:34:31.000 Instead, you issued a warning on the channel, which is effectively a strike.
01:34:35.000 So it's a four-strike system.
01:34:36.000 Here's how it works.
01:34:37.000 The first violation, we warn you.
01:34:39.000 The second violation, you get a seven-day suspension from broadcast.
01:34:44.000 The third strike, the official second strike after the warning, I believe it's two weeks, and the third is a permanent ban.
01:34:52.000 Permanent ban.
01:34:54.000 So, if they were like, look it's been three years, I know this has been sitting on the channel for a long time, we're just gonna take them off the channel, I would have been offended and angry and I would have said whatever.
01:35:04.000 There's no threat to us being banned when they say something like that.
01:35:07.000 But to come to me and say, not only that, not only will we retroactively ban your show, We'll delete you permanently if we find any anything in any clip you've ever done over the past four years and your thousand plus episodes.
01:35:22.000 So I'm like I gotta so I have to delete every episode.
01:35:25.000 We would have to literally just go in and purge the entire channel because we have no idea when they will decide to retroactively ban us.
01:35:32.000 And there you go.
01:35:33.000 Anyway, we've got plans, we've got plans.
01:35:36.000 I can't say too much.
01:35:37.000 A lot of people are like, why are you still on YouTube, blah blah blah.
01:35:39.000 We do post all our clips on Rumble.
01:35:40.000 There is projects, stuff behind the scenes going on, that were not for third parties' involvement in their interests, I would gladly tell you.
01:35:48.000 But again, I'll respect other people's privacies in that regard.
01:35:51.000 We'll read your Super Chats.
01:35:53.000 Become a member at TimCast.com.
01:35:54.000 We'll have the uncensored show coming up.
01:35:55.000 It'll be fun.
01:35:56.000 Alright. 1596.
01:36:00.000 159648 Sentile says, Time for a Timcast brought to you by Rumble.
01:36:05.000 Love you guys.
01:36:06.000 Hate Google and YouTube.
01:36:08.000 Well, there was a lot going on behind the scenes.
01:36:09.000 I have spoken with top men.
01:36:12.000 We'll see.
01:36:14.000 Ted Diorio says, Not today, Clint!
01:36:17.000 Marodney says, Did I really beat Clint?
01:36:19.000 You both did.
01:36:20.000 Clint.
01:36:21.000 Amazing.
01:36:23.000 Alright, Colby Hanson says, For Phil, Donut Operator has new t-shirts that says, The left lane is for crime.
01:36:30.000 Donut's a smart man.
01:36:34.000 What is this?
01:36:35.000 Shadow's Hand says, Warhammer 40k is now woke.
01:36:38.000 They took male factions that were that way for over 30 years and added women for no reason, and then gaslighted the fans into saying they were always there.
01:36:45.000 Get woke.
01:36:46.000 Really?
01:36:47.000 Carl did a video on Sargon of Akkad.
01:36:49.000 Actually, under the Sargon of Akkad page on YouTube, he did a video on 40k.
01:36:56.000 It is a shame.
01:36:58.000 40k seemed like one of the only properties that was doing really good at keeping woke out.
01:37:04.000 And the reason is because it literally is about space fascists.
01:37:08.000 It's about everybody's evil in the whole 40k world.
01:37:12.000 Have you played it before?
01:37:14.000 No, no.
01:37:15.000 I'm just familiar.
01:37:16.000 I'm not well versed, but I'm familiar with the lore.
01:37:20.000 Kinsei Sensei says if they shut you down, I'm canceling my premium membership and moving to Rumble permanently.
01:37:26.000 The issue is, you know, one of the things I said to Google was, maybe we just shut it down, move to a different platform, but maybe that's exactly what you want.
01:37:35.000 Like, if they took down our two biggest shows ever, which they said were fine for three years, it seems to me like I know the subject of this, I've had some conversations.
01:37:45.000 The idea is they can't ban TimCast IRL instantly.
01:37:51.000 They need to do things so that when it does get banned, they'll say, oh, well, but he had several strikes over the past several months.
01:37:58.000 What were we supposed to do?
01:38:00.000 So they look through all our episodes, retroactively enforce against two of them, giving us a warning.
01:38:05.000 The next thing that happens is in a month from now, because I took the training, which takes 90 days for the warning to resolve, Couple months from now, they'll say, oh, episode 412.
01:38:16.000 Look what we found.
01:38:19.000 Strike.
01:38:20.000 Now you can't broadcast for a week.
01:38:22.000 Exactly.
01:38:23.000 Because if they came outright right now and banned us, there would be a huge stink, a huge conundrum, there'd probably be a lawsuit, it'd be crazy.
01:38:31.000 So, instantly I'm like, okay, here we go, game's on, I get it.
01:38:35.000 We'll see.
01:38:37.000 As I mentioned, we're talking with top men, so I don't want to say too much, but there's a strong possibility that this entire YouTube channel has all of its videos purged within a week.
01:38:47.000 And then what we end up doing is the show is on YouTube for a week before being permanently deleted and then being archived on other platforms or maybe even being on other platforms.
01:38:55.000 The YouTube clips will be up for maybe a month before being deleted.
01:39:00.000 I don't know.
01:39:02.000 People do watch old episodes.
01:39:04.000 They do.
01:39:04.000 We can see in the analytics.
01:39:06.000 And people do watch older clips.
01:39:07.000 Sometimes clips will get views for a month or two.
01:39:10.000 But, uh, what do we do?
01:39:13.000 You know?
01:39:13.000 That's what YouTube wants.
01:39:15.000 That's the world they've created.
01:39:17.000 They've outright said they would like to be irrelevant.
01:39:19.000 I love this.
01:39:20.000 I love this.
01:39:21.000 Uh, I remember meeting with Google 11 years ago, and they were like, we are losing to Netflix, and we need to compete.
01:39:30.000 Okay, well, here's why you lose.
01:39:32.000 Netflix has edgier content than we've ever had.
01:39:35.000 Crazier content.
01:39:36.000 They have ancient alien conspiracy stuff on Netflix.
01:39:40.000 You can't even have that on YouTube.
01:39:42.000 They'll ban you.
01:39:43.000 I mean, they do, but like, you never know.
01:39:45.000 YouTube will just destroy your company overnight.
01:39:48.000 What sane person wants to start a business?
01:39:51.000 That's what I've been saying for a long time.
01:39:52.000 If you're looking to get into this, you start on Rumble.
01:39:54.000 You don't start on YouTube.
01:39:56.000 To be fair, I will stress this.
01:39:58.000 We need Rumble to launch their ad network.
01:40:01.000 We need that ad revenue.
01:40:03.000 And they have some, but it doesn't compare.
01:40:07.000 Same thing for X. X is pretty good.
01:40:09.000 The ad share has gone up.
01:40:11.000 And so, what we need is...
01:40:17.000 This is a component of X functionality.
01:40:19.000 If X had a live player with a live chat feed, that would be massive for generating revenue.
01:40:26.000 Because when you post a tweet, or an X post, what happens is, if ads appear in it, everybody who sees it generates revenue.
01:40:33.000 And you get a share of that.
01:40:34.000 That's fantastic.
01:40:35.000 It's awesome.
01:40:36.000 I have 2,000,000 followers, 2,010,000 followers on X, hundreds of millions of impressions, and I think I get like $1,000 a week.
01:40:43.000 Maybe like $5,000 a month or something.
01:40:47.000 That will not run a company.
01:40:48.000 It's fantastic for me just posting garbage and satire and jokes and nonsense on the platform.
01:40:54.000 You know, it's good income, but it certainly can't run a company.
01:40:57.000 I wonder, if we were to get hundreds of millions of impressions on a show like this, I don't know if it would generate the revenue we need it to.
01:41:03.000 It's hard.
01:41:04.000 It is.
01:41:05.000 Ad revenue is very different from membership revenue.
01:41:06.000 Membership revenue is asking a person to directly give that ten bucks.
01:41:10.000 And then you have, man, this is also difficult too, is inflation.
01:41:14.000 Ain't nothing I can say about that.
01:41:16.000 Inflation makes it harder because, you know, it gets to a point where we have to pay people more to cover the cost of gas and rent insurance, but then The cost of running this show goes up.
01:41:28.000 We have to then ask everyone to pay more, but then if we do, we might just lose members outright, so it's difficult.
01:41:33.000 It's real tough, man.
01:41:36.000 Oh, YouTube's on the fritz.
01:41:38.000 Of course.
01:41:38.000 Not surprised.
01:41:40.000 Raymond G. Stanley Jr.
01:41:41.000 says, I would have been super honored to be on IRL.
01:41:43.000 We called, but it was like a last minute thing.
01:41:46.000 Our guest had an emergency.
01:41:49.000 And then I was like, oh man, we gotta have Raymond on the show.
01:41:53.000 Like everybody knows who he is.
01:41:54.000 So it would like, everybody would be, it would be awesome.
01:41:58.000 But we'll, another time, another time.
01:42:00.000 It's out in the ether now.
01:42:01.000 We will plan for it.
01:42:03.000 I love meeting.
01:42:04.000 It was kind of like meeting a superhero.
01:42:07.000 What a cool name too, right?
01:42:09.000 I mean, you got it all.
01:42:10.000 I like Raybert G. Stanbert Jr.
01:42:13.000 Just like somebody decided to make a parody of Raymond.
01:42:17.000 That's funny.
01:42:18.000 Was it Burtman?
01:42:19.000 Dr. Tran said you or Ian said something bad about Israel.
01:42:22.000 That's what happened.
01:42:22.000 It was just three years ago though Like nobody was talking.
01:42:26.000 Yeah, and I'm I mean, yeah, who knows I'm pretty neutral I'm I like seeing both sides, but you know, I digress Silver Screen Psychopathy says, you talk a lot about not supporting evil corporations, but you're paying Screwtube.
01:42:40.000 Time to head over to Rumble, baby.
01:42:42.000 The tube doesn't want you.
01:42:43.000 I'm sure Rumble will be happy to have you.
01:42:45.000 Let's go.
01:42:46.000 I will just simply stress again, I have spoken with top men.
01:42:51.000 We'll see what happens next week, but...
01:42:53.000 I don't pay YouTube.
01:42:54.000 They pay me.
01:42:55.000 It's an Indiana Jones reference, by the way.
01:42:58.000 You ever seen Raiders of the Lost Ark at the end of the movie?
01:43:00.000 Where's the Ark?
01:43:02.000 It's being taken care of by Top Men.
01:43:04.000 Right.
01:43:04.000 Who's that?
01:43:05.000 Top Men.
01:43:06.000 Yeah.
01:43:06.000 Yeah.
01:43:07.000 I got goosebumps.
01:43:09.000 Top Men.
01:43:09.000 Dude.
01:43:11.000 And then it shows the guy in the warehouse and he's like carting it.
01:43:14.000 Top Men, an Indiana Jones reference.
01:43:17.000 Alright.
01:43:18.000 Best movie.
01:43:19.000 John Eddie says, my cousin Katie had a blood vessel pop in her brain, causing her to have a fatal heart attack.
01:43:24.000 Her parents need help with final expenses.
01:43:26.000 There's a GoFundMe for her.
01:43:28.000 Katie Rodriguez, Sholo.
01:43:30.000 Thank you.
01:43:30.000 Sorry to hear, man.
01:43:31.000 That's crazy.
01:43:34.000 I will stress to everybody, we are on Rumble.
01:43:37.000 All of the clips are on Rumble.
01:43:39.000 The full live show isn't.
01:43:42.000 But, uh, you know, we'll see what happens.
01:43:44.000 Amishman says, this is all a publicity stunt ahead of Timcast moving to the X platform next week.
01:43:50.000 Nothing, uh, they literally took our episodes down.
01:43:54.000 Um, YouTube deleted them.
01:43:56.000 Sent us notifications.
01:43:57.000 The YouTube video is struggling to play right now.
01:43:59.000 Is other people experiencing that?
01:44:02.000 I'm clear.
01:44:02.000 Looks good on my end.
01:44:03.000 Looks like YouTube on our end is like on the verge of crashing.
01:44:07.000 Yeah.
01:44:09.000 James Savick says, my warning to YouTube, ban Tim and I am gone.
01:44:14.000 Look, man.
01:44:15.000 Remember when they banned Alex Jones and Milo and Paul Joseph Watson from all these platforms?
01:44:19.000 They don't care.
01:44:21.000 What they're looking at is... it's political.
01:44:25.000 There are employees there who just got arrested because they want Google to divest from Israel.
01:44:29.000 I guarantee you there are managers at YouTube who hate Israel and don't like the fact that this show has nuance on the subject matter.
01:44:38.000 Phil's defended Israel on several occasions.
01:44:41.000 Uh-oh, can't have that.
01:44:42.000 I've been thinking lately that there's this big picture, earth politics, there's this global business that's happening.
01:44:48.000 There's all this business going on.
01:44:49.000 And if you come up with an ideology, the global business will be like, can we tolerate this ideology?
01:44:54.000 Is this ideology going to derail our global business?
01:44:57.000 If it's not, we'll accept it.
01:45:00.000 Okay.
01:45:00.000 Then you're going to have to find a way to let the population maul the ideology and figure it.
01:45:05.000 And then if the population can come up with a way to integrate that ideology, to make global business a little better, they'll let it.
01:45:09.000 But if you push the ideology too hard without showing them that you're going to make business better, they'll kill you.
01:45:14.000 So you you've got to be, or they'll ban you or they'll do.
01:45:16.000 So when it's good to have ideology, but you've got to learn how to synchronize it with the business of earth.
01:45:23.000 Jason Dixon says, Tim, I'd like to sell 10 Bitcoin and invest in Timcast.
01:45:27.000 Timcast has no investors.
01:45:28.000 I am the sole owner and none of its other companies, related companies, have any shared interest as well.
01:45:36.000 SCNR has partial ownership from Bill Ottman of Minds.com, who's a good friend, but I don't believe Timcast will ever take investment.
01:45:45.000 You know, I say I don't believe because I don't, maybe I die at some point.
01:45:49.000 The issue is just that We want to expand cultural endeavors and grow.
01:45:54.000 And so the money that comes in through everything basically funds and supports the mission and the operation.
01:45:59.000 It's like we've gone over expenses and like salaries and all that stuff and we're like, man, it's just like the bulk of the costs are travel, accommodation, massive expenses.
01:46:09.000 It could be upwards of like $3,000 per day.
01:46:10.000 So a lot of money.
01:46:13.000 Yeah, in that regard.
01:46:14.000 And then we have international guests, people who come from Europe.
01:46:17.000 The craziest thing is that we were looking at a flight to Texas and it was 2,500 round trip.
01:46:22.000 Whoa.
01:46:23.000 Yeah.
01:46:24.000 And we're not talking first class.
01:46:25.000 And I was like, wow.
01:46:27.000 The other thing too is we book a lot of travel within like a week or two, because a lot of guests shift around.
01:46:34.000 And the problem is that if we book someone like two months in advance, which we sometimes do, and then book their flights and they cancel on us, we lose that money.
01:46:42.000 We've also had certain people be like, I missed the flight or I can't take the flight.
01:46:46.000 And so it ends up costing a lot of money.
01:46:49.000 The crazy expense is driving.
01:46:52.000 If we had a studio next to an airport, it would be a lot cheaper, but a lot noisier.
01:46:56.000 It's funny, that's how, for me too, when I travel, the flights are like 150 bucks, but to get to the airport and back is 180 with an Uber.
01:47:04.000 It's crazy.
01:47:04.000 We're so far in the woods.
01:47:06.000 I mean, and to be honest with you, like the cost of travel and stuff, it's not going down because the cost of oil is not going anywhere but up unless there's some kind of change in the U.S.
01:47:17.000 policy, so.
01:47:19.000 All right.
01:47:20.000 Noor Allahi says, if you post and stream to Rumble the way you do to YouTube, I will stream and watch TimCast IRL there.
01:47:25.000 We all have that app on our Roku.
01:47:26.000 Ask Crowder how loyal Fanbase can be.
01:47:28.000 Here's a hundred bucks, a show of good faith.
01:47:30.000 I really appreciate it, my friend.
01:47:31.000 One of the concerns is that I think around 60% of viewers watch on the YouTube app on their TVs.
01:47:39.000 So they're not chatting, they're not super chatting, they're sitting on their couch with their friends and family and they turn the TV on.
01:47:45.000 This is one of the craziest things that I didn't know for a long time because we look at the concurrent viewership and we're like, wow, we have 44,000.
01:47:52.000 Total viewership is actually much bigger than that.
01:47:54.000 We can't track that because we don't have the same tools as, like, Nielsen Ratings.
01:47:57.000 But, uh, I ended up learning that, like, a guy, his wife, and his friend, or kids, will be watching the show.
01:48:04.000 They'll, like, it'll be the end of the day, and they'll turn the show on the TV.
01:48:07.000 They'll open up the YouTube app, press play, and then there's, like, three or four people in one room watching the show.
01:48:11.000 That counts as one person in the concurrent viewership.
01:48:14.000 Yeah.
01:48:15.000 So, there's no real way to track all that.
01:48:19.000 So we don't actually know the full size of viewership.
01:48:22.000 What we have with the concurrent viewers is not people, it's screens.
01:48:26.000 And it's around 60-70% television screens, which means the viewership's actually a lot bigger.
01:48:32.000 It's crazy.
01:48:33.000 And so then, you know, people will come and be like, wow, you get 40,000 concurrent viewers?
01:48:37.000 You average that?
01:48:38.000 And I'll be like, screens.
01:48:40.000 So if we're talking like your average family or whatever and these are people in their 30s and they may it may just be like at most like two people we're looking at concurrent viewership is actually closer to around like 70 or 80.
01:48:54.000 Some people are watching on their phones and laptops too for sure.
01:48:58.000 And that's the big challenge too with moving to another platform is that people would have to switch to Rokus and other things like that.
01:49:04.000 But I do believe we have a solution.
01:49:05.000 It's just, you know, the other top men that I've spoken with, they want to get their ducks in a row before we say what's going on.
01:49:14.000 Andrew Starr says, no one cares about your salary, dude.
01:49:18.000 Well, they sure do chat a whole lot, endlessly, about how I'm only doing it for money.
01:49:23.000 So I make it a point to point out, I would live a much more comfortable life if I only did the morning show.
01:49:31.000 That was the original plan for IRO with the van.
01:49:33.000 I could just drive around and do my morning show, my monologue clips, anywhere.
01:49:40.000 Could be skating and skiing anywhere I wanted, living in a van down by the river.
01:49:44.000 But then we did this show, and, uh, decided to, you know, build stuff, I guess.
01:49:50.000 Yeah.
01:49:51.000 Build stuff.
01:49:51.000 Gotta give back.
01:49:53.000 Yep.
01:49:53.000 Then we hired a bunch of people and then we built a bunch of infrastructure and tried to make it professional and better and we keep expanding.
01:50:00.000 The new studio sorely needed, definitely.
01:50:03.000 People complain that the lighting makes them look like zombies.
01:50:07.000 Yeah, it can be pretty bright sometimes.
01:50:10.000 The new studio looks so good.
01:50:12.000 It's mostly the balance.
01:50:13.000 It's like cinema quality.
01:50:14.000 Wesley just nailed it.
01:50:16.000 Yeah.
01:50:17.000 He's so good.
01:50:18.000 I think Aaron was involved too.
01:50:18.000 And it's not just Wesley.
01:50:19.000 I was impressed with how thin I looked.
01:50:21.000 Oh, good job, man.
01:50:22.000 You've been working on it quite a bit.
01:50:24.000 No, it's not that.
01:50:25.000 It's that these cameras, they flatten your face.
01:50:28.000 It's funny, too, that the weirdest thing is people who are like, Tim is short and fat.
01:50:32.000 And I'm like, then they watch a video of me skating.
01:50:34.000 They're like, oh, Tim's kind of tall, actually.
01:50:37.000 He's taller than me.
01:50:38.000 It's got to be the beanie.
01:50:39.000 When the beanie comes off, man, your brain, it's just, people, when you see what you- It's actually a mirror.
01:50:44.000 Yeah, when they see what got the beanie, it's like, okay, he actually is a genius.
01:50:47.000 Because you see, like, it's a large brain.
01:50:51.000 Sure.
01:50:51.000 Relatively.
01:50:53.000 But I was actually surprised because I don't know what it is.
01:50:53.000 Pretty interesting.
01:50:55.000 Lenses have a huge impact.
01:50:56.000 Did you guys ever watch a video of how lenses change how people look?
01:50:59.000 Yeah.
01:51:00.000 And so these new cameras make everybody look very different.
01:51:04.000 Everybody looks pretty tall.
01:51:08.000 Sick!
01:51:08.000 Yeah, it makes you look slim. I don't know. It's a bigger room. It's a fixed lens.
01:51:15.000 And they're higher quality cameras. They're actual DSLRs.
01:51:21.000 These are camcorders. These are really good ones, and they do look great. They also lit the
01:51:26.000 backgrounds, which might be causing dynamic shape. So you can see there's some shape definition
01:51:32.000 in the bodies.
01:51:33.000 Yeah, so the new studio has spotlight lighting.
01:51:36.000 Each person has a light that shines directly on them, plus LED bar backlighting.
01:51:42.000 And then there's like windows and stuff, so decorations and things on the wall will be harder to see because the room's a lot bigger too.
01:51:48.000 But uh, it looks great.
01:51:49.000 Monday's gonna be epic.
01:51:52.000 What do we have on Monday?
01:51:54.000 Who's the first guest?
01:51:56.000 Is that Scott Pressler?
01:51:59.000 Yeah, looks like it.
01:52:01.000 I love him, man.
01:52:02.000 That'll be great.
01:52:03.000 Yeah, Pressler will be our first guest in the new studio.
01:52:07.000 Gonna be fun on a bun.
01:52:09.000 Let's grab some more Super Chats.
01:52:13.000 What do we have?
01:52:14.000 YouTube's on the fritz for me.
01:52:17.000 I've still got it here.
01:52:18.000 Amir Habibi says, Mr. Bocas makes some good points.
01:52:21.000 We need more podcasts with him as a guest.
01:52:23.000 Well, rest in peace, Mr. Bocas, but we are planning on having Seamus on at some point.
01:52:27.000 You know, it would be great if we had Seamus and Seamus.
01:52:32.000 Seamus 1 and Seamus 2.
01:52:34.000 Yeah, Seamus 1 is the cat.
01:52:35.000 Yep.
01:52:36.000 Seamus 2 is the cartoonist.
01:52:37.000 We don't have a lot of respect for cartoonists over here.
01:52:41.000 Dirty AI.
01:52:44.000 Seamus should be here.
01:52:45.000 I believe he'll be here all next week.
01:52:46.000 So we're excited to have him back.
01:52:48.000 I think Seamus is fun.
01:52:49.000 Yeah, he's a good dude.
01:52:50.000 I love him because he says he's a Christian, but he likes to question things.
01:52:54.000 No, I'm just kidding, Seamus.
01:52:56.000 I believe you.
01:52:56.000 Don't put words in his mouth.
01:52:57.000 Yeah, yeah.
01:52:58.000 I'm gonna make you question everything.
01:53:00.000 Clint Torres says, howdy people.
01:53:01.000 Apologies for the tardiness.
01:53:02.000 I had to see a lady about a cat fill.
01:53:05.000 You should do a couple of gym sessions with Tim.
01:53:08.000 It sounds like he has a lot to teach about going hard.
01:53:11.000 He definitely does.
01:53:12.000 Today was nuts.
01:53:14.000 Why?
01:53:14.000 Because Richie wouldn't let me stop skating.
01:53:17.000 So I was trying to do a run on the mini ramp.
01:53:20.000 And these are not even like the craziest tricks, it's just I haven't skated a mini ramp in a long, long time.
01:53:27.000 So I was doing a boardslide, fakie disaster, axle, back disaster, nollie front disaster, no stall, switch blunt, and then like a rock to turn around and then a kickflip 5-0.
01:53:42.000 And I got one, but I hit the wall with my hand.
01:53:46.000 And so that's not clean.
01:53:48.000 And then I was like, I'm so beat, I'm done.
01:53:50.000 And then Richie was like, you gotta get a clean one.
01:53:51.000 So I skated for another 40 minutes at max heart rate.
01:53:54.000 And then I couldn't get it.
01:53:56.000 I was like, look, the one I got is what I got.
01:53:58.000 But then I was like, on the verge of dying.
01:54:01.000 He's really good at pushing you.
01:54:02.000 I like Richie.
01:54:03.000 He's a great teacher.
01:54:03.000 Richie's great.
01:54:05.000 Yeah, good dude.
01:54:06.000 We were trying to get him to come on the show, but he wasn't here.
01:54:08.000 Yeah.
01:54:09.000 So we're trying to find everybody.
01:54:11.000 Well, I don't know exactly what he's saying, but I think everyone's trying to help Rumble get bigger and bigger and bigger.
01:54:16.000 I can respect that.
01:54:17.000 He wants you to stream to Rumble because it would benefit him directly so he's pitching Fitz like a woman.
01:54:22.000 Well, I don't know exactly what he's saying, but I think, like, everyone's trying to make, help Rumble get bigger and
01:54:30.000 bigger and bigger.
01:54:31.000 I can respect that.
01:54:32.000 I would just like to stress, and in all fairness, we need to make money off the clips so that we can pay people who
01:54:41.000 work here.
01:54:42.000 the We put the clips on Rumble either way.
01:54:46.000 Those clips don't generate money.
01:54:48.000 It's like very, very little.
01:54:50.000 So we lost a lot of ad revenue by doing so, but we want to be on Rumble.
01:54:54.000 We think Rumble's important.
01:54:55.000 We think it's good.
01:54:56.000 The live show is the biggest driver of memberships to TimCast.com because when we're live, we say, hey, the members-only show starts now, go watch, and then tons of people instantly sign up.
01:55:07.000 The fear is that if we disrupt that, and we don't see the same turnaround, because we don't know...
01:55:12.000 We stop generating memberships, and then we become a sinking ship.
01:55:16.000 And then we have to figure out the stability point where, okay, how many memberships do we generate through streaming on other platforms?
01:55:23.000 And if the number is that it's lower because we've deranked ourselves on YouTube, split our audience up, some people can't find the stream or otherwise, then we have to say, how do we shrink the ship to maintain its current size based on the current level of growth?
01:55:36.000 Right now, where we're at with YouTube, we have moderate to slow growth.
01:55:40.000 I would call it stable.
01:55:42.000 And that's great!
01:55:43.000 Then there's an opportunity for BizDev with like Casper and other things.
01:55:47.000 Other companies have asked us to stream on their platform, and I said, if we do that, and it reduces our current level of memberships, then we have to start cutting fat.
01:55:56.000 I don't want to do that.
01:55:57.000 We are stable where we're at in everything we're doing.
01:56:00.000 We're seeing moderate viewership growth on YouTube, moderate membership growth, and so that allows us to invest in other ways to shore up the defenses for the show.
01:56:09.000 If we venture off into the unknown, don't generate the revenue, It's only a risk for us.
01:56:16.000 So what I've said to all these companies is, mitigate that risk, deal, and most of them have said, we don't know if we can do that.
01:56:23.000 We'll see what happens next week.
01:56:25.000 YouTube has changed the game and opened the door for a lot of competitors in ways they should not have by doing this.
01:56:32.000 The fact that they took down one of the craziest podcasts ever.
01:56:37.000 I'm offended by this.
01:56:39.000 Alex Jones, Joe Rogan, Blair White, Michael Malice, Luke Rutkowski, Ian Crosland, me, Drew Hernandez.
01:56:44.000 I said Drew Hernandez?
01:56:45.000 Yeah, Drew Hernandez was there.
01:56:46.000 You just, yeah, you just did.
01:56:47.000 Who am I forgetting?
01:56:48.000 Lydia was there too.
01:56:49.000 It was back when Lydia was on the show.
01:56:50.000 All of these people on this crazy show.
01:56:52.000 It's a cacophony of nonsense.
01:56:53.000 Joe Rogan's laughing.
01:56:54.000 Jones is going nuts.
01:56:55.000 Michael Malice is laughing.
01:56:57.000 YouTube deleted it.
01:56:59.000 That's crazy.
01:57:00.000 Yep.
01:57:01.000 That's crazy.
01:57:03.000 One of the craziest podcasts ever.
01:57:04.000 It was just such a simple show.
01:57:06.000 Yeah.
01:57:07.000 And they were just like, after three years, I didn't even bring up Klaus Schwab.
01:57:09.000 I wanted to bring up Klaus Schwab, and I should have, because it would have been funny.
01:57:12.000 I remember the moment when I could have said it, too.
01:57:14.000 It was a funny moment when you asked Joe to do DMT or something, or ayahuasca, and he was like, what?
01:57:18.000 Who are you?
01:57:19.000 Yeah, he was like, well, I'm gonna wake up, and I'm like, why didn't I puke?
01:57:21.000 I was like, God, this guy's funny as fuck.
01:57:23.000 He's not just, he's not famous for no reason.
01:57:25.000 I can't believe they did, that's insane.
01:57:26.000 He was funny.
01:57:28.000 Wow.
01:57:29.000 Alright, we'll grab a couple more of these here, Super Chats.
01:57:32.000 We are gonna have that Members Only Uncensored show, and we'll talk to you guys, so become a member!
01:57:36.000 Support the show.
01:57:38.000 You know, I do believe that if we, uh, were... So, here's something, I think if we were to, like, say, okay, YouTube, screw you, and we chose any other platform, we would see a massive burst in memberships instantly.
01:57:54.000 But then people, their memberships, they cancel them, their cards expire.
01:58:00.000 We don't have a membership team that calls people and asks them to re-sign up.
01:58:05.000 I feel like that's annoying.
01:58:07.000 Maybe we should.
01:58:08.000 Maybe there's a lot of people who don't realize their memberships lapsed and they would love to stay members.
01:58:13.000 If we just had someone hit them up and be like, hey, we see that your membership stopped, would you want to keep going or no?
01:58:18.000 Yeah.
01:58:18.000 I know what you mean, because it is annoying to get a call you don't want to get, but I think Valuetainment does that.
01:58:23.000 They do.
01:58:24.000 They have a dedicated marketing facility.
01:58:27.000 Maybe if we had like two people and all they did was like send messages to people and say, Hey, we noticed your membership dropped off.
01:58:33.000 We'd like to, we'd love to have you back.
01:58:35.000 Is there anything we can do?
01:58:36.000 And if they said no, they can have a nice day and maybe people would be like, Oh, I didn't realize here.
01:58:39.000 So yeah, sign me back up.
01:58:41.000 Phone calls, particularly phone calls, hearing a voice.
01:58:44.000 I guess, you know, I wish I could do it.
01:58:46.000 I can't.
01:58:49.000 We'll just, yeah, we'll get an AI that sounds like me, but like, I am Tim Poole.
01:58:52.000 Ooh, we get AI to do it.
01:58:53.000 Trust me.
01:58:53.000 Please sign up for my website.
01:58:56.000 Welcome to the future.
01:58:57.000 Actually, if you could just get a voice, an AI voice filter, anyone could do it for you, they just sound like you.
01:59:01.000 No, it can't do me.
01:59:02.000 Oh, really?
01:59:02.000 Yeah, it's weird.
01:59:04.000 We've tried a couple times to take a recording of my voice, and I've even talked to, like, Seamus about it, and he's like, impersonating Tim is hard to do.
01:59:13.000 It's like, yeah, I don't know why.
01:59:16.000 People have told me that it's hard to do an impression of me.
01:59:18.000 You do sound very neutral.
01:59:20.000 There's not a lot of things that you could actually grab onto and exaggerate a little bit to make... Joe Rogan too.
01:59:27.000 Joe Rogan's a hard voice to impersonate.
01:59:27.000 Yep.
01:59:30.000 I've seen people who have tried, but...
01:59:33.000 There's ways he talks you can get, but the actual sound of his voice, like people can imitate Trump and it sounds like Trump, you know?
01:59:38.000 Yeah.
01:59:39.000 But we tried putting my voice into the AI voice generator and it sounded weird.
01:59:43.000 It sounded like this.
01:59:44.000 Hi, I am Tim Pool.
01:59:45.000 I'm like, that is not.
01:59:46.000 Cause your voice, it kind of sounds like high, but it's deep.
01:59:46.000 Yeah.
01:59:50.000 It's got like, it's like, it's like low register, but kind of like the upper, I don't know.
01:59:55.000 And it's sharp too.
01:59:55.000 It's got like a sharpness to it.
01:59:57.000 The way you like finish a sentence and finish a sound a lot of times.
02:00:01.000 Karsten Ellsworth says, Tim, I'm a professional marketer and longtime Timcast member looking for a job change.
02:00:06.000 The new marketing effort sounds fun.
02:00:08.000 If you're hiring, I'd love to join the culture war.
02:00:09.000 Where can I send a resume?
02:00:11.000 I don't know that we actually would bring on someone.
02:00:11.000 I don't know.
02:00:16.000 You know, Dane already is our marketing guy.
02:00:18.000 I think all we would do is just like make ads and do like awareness campaigns and just generate ubiquity.
02:00:27.000 It's not so much that the ads make people watch, but it's that everyone becomes familiar with the show.
02:00:32.000 And you know what I was thinking of doing?
02:00:34.000 What if every Monday we put up an ad on a variety of platforms that says like, this week on Timcast IRL we've got, and then it shows the guests.
02:00:43.000 And it's like, watch live Monday through Friday this week, and then we just change that every week.
02:00:47.000 Because then we're directly advertising something.
02:00:51.000 Maybe that.
02:00:51.000 It's too bad it takes too long to get ads approved, because if YouTube actually did quick turnaround on ad approvals, I would do a daily, you know, where it's like, tonight at 8pm, check out, you know, Philobonti on Tim Castellano.
02:01:04.000 The occasional, when the guest doesn't show, and a marketing, a big marketing thing for something that doesn't happen might be a problem.
02:01:11.000 I think it's fine.
02:01:14.000 All the big cable networks do this.
02:01:15.000 And then if someone doesn't show up, they're just like, unfortunately, they weren't able to make it.
02:01:18.000 All right, everybody, smash that like button, subscribe to this channel, share the show with your friends.
02:01:22.000 Don't forget to subscribe to our Rumble channel at rumble.com slash TimCastIRL, and subscribe, follow me on Twitter, or I should say Axe, at TimCast, and the show, of course, at TimCastIRL.
02:01:34.000 We're gonna go to the members-only show right now, so become a member at TimCast.com.
02:01:37.000 Like I said, Phil, what's going on?
02:01:39.000 I am PhilThatRemains on Twix.
02:01:42.000 I am PhilThatRemainsOfficial on Instagram.
02:01:43.000 The band is All That Remains.
02:01:45.000 You can follow us on Apple Music, Spotify, Pandora.
02:01:50.000 I don't know, what are the other ones?
02:01:51.000 YouTube, you know.
02:01:53.000 Amazon Music.
02:01:53.000 Amazon Music, there you go.
02:01:55.000 YouTube, you know, the internet.
02:01:56.000 And don't forget, the left lane is for crime.
02:01:58.000 Oh, another thing.
02:01:59.000 Check out the All That Remains Instagram page.
02:02:01.000 It's Instagram.com slash all that remains.
02:02:05.000 Keep an eye on that because it just got wiped today and there are things coming.
02:02:09.000 Do you have a date for the song's release?
02:02:10.000 Not yet, but it will be announced probably in the next few days or week or so.
02:02:15.000 So keep an eye out.
02:02:17.000 And Bucko, did you have any last words?
02:02:21.000 Okay.
02:02:22.000 All right.
02:02:22.000 Meow.
02:02:23.000 Good job.
02:02:24.000 I'm Ian Crossland.
02:02:25.000 Follow me at Ian Crossland on Rumble on YouTube, which I'm still on.
02:02:29.000 I've had my channel for 18 years or whatever the hell.
02:02:31.000 Follow me all over the place.
02:02:32.000 Every social network.
02:02:33.000 I probably got a presence except TikTok.
02:02:35.000 I don't mess with it.
02:02:36.000 And I'm going to be in Austin on April 27th for the Minds Festival.
02:02:40.000 It's going to be awesome, dude.
02:02:42.000 Toby Turner's kicking off the show with the music set.
02:02:44.000 I may play a song with him.
02:02:46.000 And it's a night of roundtable debates, discussions, comedy, music.
02:02:49.000 It's going to be fantastic.
02:02:50.000 You go to festival.minds.com and get your tickets there.
02:02:54.000 Use promo code Ian for 20% off.
02:02:57.000 And I'm really looking forward to seeing you there.
02:02:59.000 And I'll probably be hanging out with people after the show and meet the crowd and everything.
02:03:02.000 So catch you there.
02:03:03.000 See you.
02:03:04.000 All right.
02:03:05.000 Thanks, y'all.
02:03:05.000 Have a good night.
02:03:06.000 See you tomorrow.
02:03:07.000 We'll see you all over at TimCast.com.
02:03:09.000 Not tomorrow, but in a few minutes.