Based Camp - November 19, 2024


Trump to Make Online Censorship Illegal (This Will Change the Internet Forever)


Episode Stats

Length: 46 minutes
Words per Minute: 176.7
Word Count: 8,265
Sentence Count: 610



Summary

Free speech is under threat in the UK, and it's time to reclaim it in the USA. Simone and I discuss why free speech matters and why we should all be fighting for it.


Transcript

00:00:00.000 Upon my inauguration as president, I will get big online platforms out of the censorship business.
00:00:07.360 From now on, digital platforms should only qualify for immunity protection under Section 230
00:00:13.080 if they meet high standards of neutrality, transparency, fairness, and non-discrimination,
00:00:20.120 dramatically curtailing their power to arbitrarily restrict lawful speech.
00:00:25.800 I'm used to getting really excited about a candidate, and then they win, and then they proceed to immediately disappoint me.
00:00:33.980 Like, wait, how did you even know? I didn't even put this on my wish list, and yet here it is. This is amazing.
00:00:40.400 Europeans were like, I don't see how America is that different from Europe. I don't see how you're more free than we are.
00:00:46.400 Do you get it now? Do you get that Trump was allowed to win in this country?
00:00:52.400 Now, let's go start a fucking revolution.
00:00:56.440 The entire world would be better off if these people were permanently removed from these platforms.
00:01:01.920 Like, there is no downside and only upside to see people like Candace Owens, Tucker Carlson, Tim Pool,
00:01:06.900 never be allowed to publicly broadcast their opinions ever again.
00:01:09.800 Tread on them! Tread the fuck all over them!
00:01:13.980 I don't give a fuck about anybody that winds up at any of these rallies and gets shot or whatever the fuck, okay?
00:01:17.940 You gotta fight for your right!
00:01:20.880 Is free speech under threat in the UK? With the rise of so-called non-crime hate incidents,
00:01:31.780 arrests over grossly offensive memes, and the government's online safety bill threatening to
00:01:37.480 clamp down on social media posts, can you really speak your mind in 21st century Britain?
00:01:43.240 If any U.S. university is discovered to have engaged in censorship activities such as flagging
00:02:01.420 social media content for removal or blacklisting, those universities should lose federal research
00:02:08.320 dollars and federal student loan support for a period of five years.
00:02:12.860 Ding dong! It's America, motherfucker!
00:02:15.260 Furthermore, when users have their content or accounts removed, throttled, shadow banned, or
00:02:22.160 otherwise restricted, no matter what name they use, they should have the right to be informed
00:02:27.900 that it's happening, the right to a specific explanation of the reason why, and the right
00:02:34.700 to a timely appeal.
00:02:46.700 In addition, all users over the age of 18 should have the right to opt out of content moderation
00:02:52.360 and curation entirely, and receive an unmanipulated stream of information if they so choose.
00:02:59.700 Would you like to know more?
00:03:02.260 Hello, Simone!
00:03:03.540 I am excited to be here with you today.
00:03:05.060 Today is another in a series of episodes we're going to be doing on the first major projects
00:03:09.040 of the Trump administration, and this one is relevant to your and my daily life, our potential
00:03:16.100 income, our careers.
00:03:18.440 It's not just important, it's existential.
00:03:22.400 And it could change the way the internet works forever, and in a way that is so astronomically
00:03:28.960 positive, again, it feels like we just got kicked on to a whole new timeline.
00:03:34.480 Oh my gosh.
00:03:35.220 Oh yeah, yeah, yeah.
00:03:37.220 Because you haven't gone into this yet, and you don't know about this plan yet, this is
00:03:42.060 game changing.
00:03:44.320 So let's play a little bit of this here, and we can talk about it as we're watching it,
00:03:47.840 okay?
00:03:48.240 If we don't have free speech, then we just don't have a free country.
00:03:52.440 It's as simple as that.
00:03:54.700 If this most fundamental right is allowed to perish, then the rest of our rights and
00:03:59.360 liberties will topple, just like dominoes, one by one.
00:04:03.300 They'll go down.
00:04:04.680 That's why today I'm announcing my plan to shatter the left-wing censorship regime and
00:04:11.180 to reclaim the right to free speech for all Americans.
00:04:15.200 And reclaim is a very important word in this case, because they've taken it away.
00:04:20.200 In recent weeks, bombshell reports have confirmed that a sinister group of deep state bureaucrats,
00:04:26.820 Silicon Valley tyrants, left-wing activists, and depraved corporate news media have been
00:04:33.760 conspiring to manipulate and silence the American people.
00:04:38.200 What came out?
00:04:39.480 Like he's referring to some kind of bombshell reveal, Twitter files of...
00:04:44.640 We know from Mark Zuckerberg.
00:04:47.540 Mark Zuckerberg.
00:04:48.520 Oh, yeah, yeah, right, right, right.
00:04:50.140 The FBI asked him to censor the Hunter Biden laptop story, despite knowing that it was true,
00:04:57.120 in order to help the Democrats win an election cycle.
00:05:00.340 That is insane.
00:05:01.940 That is obscene.
00:05:03.680 That should not be allowed to happen.
00:05:05.920 With our own videos, I know when I talk about certain topics, and there are things I just
00:05:11.820 can't mention.
00:05:12.900 And I don't.
00:05:13.800 I don't mention it.
00:05:14.640 And it makes me feel like I'm a very dishonest person, but I'm like, if I do, then people
00:05:17.960 aren't going to see our videos.
00:05:19.260 And even mentioning that this is the case, well, it won't be the case for long if this
00:05:24.400 goes through.
00:05:25.200 But just this being the case is really, really sad.
00:05:28.640 I cannot mention basic facts, like how many people voted in each election cycle.
00:05:33.480 That is something that has gotten videos demonetized in the past, and we'll test to see if it demonetizes
00:05:37.880 this one.
00:05:38.520 And this is not like a conspiracy series.
00:05:41.440 It's just numbers of voters in each election cycle.
00:05:44.280 I cannot.
00:05:44.940 And this was true during like the Wuhan thing, right?
00:05:47.460 When they would, like, demonetize you for saying it was the lab leak theory.
00:05:50.760 And now we know the names of the first three people who got it, and they were all gain
00:05:54.180 of function researchers at the Wuhan virus labs.
00:05:57.000 I'm sorry, not virus labs, COVID virus labs.
00:05:59.760 The COVID gain of function virus labs.
00:06:02.380 The labs that were studying, how do we make a COVID virus that can spread faster in humans?
00:06:06.280 Like, it's wild to me, the lies that people and the tech companies have spread.
00:06:15.100 And I like that they are labeling this as an exit, because so many people were like, this
00:06:20.400 was like when What's-His-Face came out with the revelation that the NSA was spying on
00:06:25.800 average Americans.
00:06:26.920 Oh, Edward Snowden.
00:06:28.300 Edward Snowden.
00:06:29.000 And everyone's like, oh, wow, that's terrible.
00:06:31.340 So anyway, whatever.
00:06:33.120 Yeah, that was a big nothing burger in the end.
00:06:35.000 Well, no, and that's the way the Democrats acted when it came out that the FBI had suppressed
00:06:39.900 true information to help Dems win an election.
00:06:42.340 Well, actually, you know what?
00:06:43.800 So each of these developments, I think, in one sense, life goes on.
00:06:48.640 But in the other sense, people also understand that they're being censored.
00:06:51.920 And I see this more and more, because I'm watching YouTube in the background basically
00:06:55.980 all day: the self-censorship, the strange language that people are using.
00:07:00.940 In fact, now it's even spreading into the common lexicon.
00:07:05.880 People use it offline, even when they're not on YouTube.
00:07:08.720 Like unaliving or graping.
00:07:10.120 Yeah, yeah.
00:07:10.640 There was one fantasy event that people were trying to put on.
00:07:15.020 It was kind of a LARP event that they were sending out invites to.
00:07:18.540 And they're like, you might even unalive one of your enemies.
00:07:21.560 And it's like, no, this is an invite.
00:07:23.920 Like, but yeah, it's getting to that point.
00:07:26.600 So I do think that people have internalized it.
00:07:29.040 And it has led to a lot of creepy behavior.
00:07:31.420 So it both has been and hasn't been a big deal.
00:07:35.420 No, it's been a huge deal.
00:07:37.260 The problem is, you are aware of the words that people have found to get around the system.
00:07:41.640 You are not aware of the entire categories of topics that regularly get our videos hidden
00:07:47.620 from people.
00:07:48.280 That's true.
00:07:48.860 Yeah, that's fair.
00:07:49.980 Yeah.
00:07:50.120 So you're here being like, oh, here's where we notice it.
00:07:52.640 It's where you don't notice it that it's an existential issue to our democracy.
00:07:57.000 And what I was pointing out, that was this noted thing.
00:08:00.180 It was like, oh, so crazy.
00:08:01.500 Okay, anyway.
00:08:02.340 And the Dems tried to do that again with this revelation.
00:08:05.380 And I think that most Americans expected this.
00:08:07.680 Because even the legacy GOP would have swept this under the rug.
00:08:10.920 They're like, you know, when we were at the NatCon, a lot of people there were like,
00:08:15.220 screw the base.
00:08:15.840 I was like, that's not what the base wants.
00:08:17.060 They go, we just need a deep state that's controlled by Republicans.
00:08:19.580 And I was like, no, we need to burn the deep state.
00:08:21.200 We need to burn the deep state.
00:08:22.260 We need to burn the deep state.
00:08:23.380 And they're like, how could you say that?
00:08:24.780 Like, that's where we're all employed, like gesturing at everyone in the room.
00:08:28.260 And I'm like, well, maybe that's part of the problem.
00:08:30.020 And so I think that, again, a lot of conservatives today, they get angry.
00:08:35.060 And this is what we saw from, like, Leather Apron Club.
00:08:37.040 And he was like, no one should vote for president.
00:08:38.620 He's like, look, the new right is accepting of gays.
00:08:41.620 I don't like that the new right is accepting of gays.
00:08:44.320 I don't like that they have, you know, people like Scott Pressler considered absolute heroes among them.
00:08:49.300 I do not like that gays speak at their events.
00:08:51.780 You know, he was talking about all this.
00:08:53.200 We should not support them or we are platforming this new agenda, which is basically just what Democrats were in the 90s.
00:08:58.960 And I'm like, if you think that, you are fundamentally wrongheaded because you're so stuck in a 90s culture.
00:09:04.240 We just basically aren't fighting those battles because they're not relevant in the common era.
00:09:10.600 We are fighting much bigger battles than either Republicans or Democrats were fighting in the 90s era.
00:09:16.620 Battles over things like basic free speech.
00:09:19.980 We think that this should be a fundamental right, whether I'm on YouTube or X or any other platform.
00:09:25.800 And this needs to be enshrined in law and in the way that we act.
00:09:30.480 And so that the new right goes after this, because this isn't what leftists would have gone after in the 90s.
00:09:35.780 This isn't what leftists would have gone after at the 2000s.
00:09:37.780 This would have been seen as radical and insane what's about to be proposed.
00:09:40.520 But it is what the new right cares about.
00:09:42.940 And so the question is, do you care about this stuff more than you care about banning gay marriage?
00:09:49.080 You know, do you care about keeping our borders safe more than you care about hating on Hispanic people?
00:09:55.260 There is a portion of the legacy GOP that's like, yeah, these are just bigger battles.
00:10:01.520 And then there's a portion that genuinely did care about their homophobia and racism more than actually fixing anything.
00:10:07.000 And we're like, oh, we're happy to see you go.
00:10:09.140 I'm earnestly telling you that it may not be in your best interest to vote this election cycle.
00:10:13.940 To vote for a candidate is to assent to their platform, to signal to those running their campaign that you approve of the messaging that they have put out.
00:10:21.820 Our message to gay Americans tonight is this.
00:10:25.440 You're free to marry who you want, if you want, without the government standing in your way.
00:10:30.540 And I think that, frankly, I wouldn't be surprised if me and Trump won just the normal gay guy vote.
00:10:37.920 Because, again, they just wanted to be left the hell alone.
00:10:40.760 But what should you do if, like so many other conservatives, you see the Republican Party shifting ever further away from real conservative values?
00:10:47.640 It reminds me of the atheist movement, when the atheist movement shattered and there was one faction that was like, oh, actually all the statistics say religion is super important and as society has descended into secularism, everything has fallen apart.
00:10:59.440 And then there's another group that we found out just liked dunking on conservatives.
00:11:02.900 And that's where like Bill Nye went and like black science guy went, whatever his name is, Tyson, whatever.
00:11:08.440 Anyway, Neil deGrasse Tyson.
00:11:10.640 Well, he was riding his bike.
00:11:12.300 Well, there was a lightning strike.
00:11:13.580 And now he reads real fast.
00:11:14.900 He's good at science and math.
00:11:16.280 Black doctor.
00:11:17.640 I'm going to go out on a limb here.
00:11:20.300 Where exactly did he get the bike?
00:11:22.180 He stole it.
00:11:23.160 Right.
00:11:23.440 That's what I thought.
00:11:24.200 And what we learned is that they never actually cared about what was true.
00:11:30.560 They weren't partnered with us because they wanted to make things better for people.
00:11:33.580 They were partnered with us because they liked dunking on a specific sub population.
00:11:37.980 For them, it was conservatives.
00:11:40.060 And I think what many conservatives are realizing now is there was a faction, small faction of the party that turns out wasn't relevant to winning elections.
00:11:47.060 They really just liked dunking on gay people and racial minorities.
00:11:50.120 And they're gone now.
00:11:51.120 So hooray.
00:11:52.720 Watch our video where we go over like every major influencer in the space.
00:11:55.620 And all of them were like none of my followers vote.
00:11:58.180 And people were like, oh, it was just staging.
00:11:59.940 It wasn't just staging.
00:12:01.160 Leather Apron Club released this video right before the election happened.
00:12:06.220 Okay.
00:12:06.400 Not with enough time for Democrats to feel comforted by it.
00:12:09.160 And due to the YouTube algorithm, it was mostly seen by his base, who he would have known are Republican legacy voters.
00:12:16.160 And I think that through that, he created a mandate.
00:12:18.780 And the other people who, you know, went this direction created a mandate for the party to not listen to their types of voices.
00:12:26.460 And this whole "if you don't attack gays, we're not going to support you anymore" crowd?
00:12:30.340 Those people didn't vote, because they're not relevant anymore.
00:12:31.940 But let's continue with all of this craziness.
00:12:34.260 So I'm going to play some more video here.
00:12:36.140 They have collaborated to suppress vital information on everything from elections to public health.
00:12:43.580 The censorship cartel must be dismantled and destroyed.
00:12:48.460 And it must happen immediately.
00:12:50.960 And here is my plan.
00:12:52.300 First, within hours of my inauguration, I will sign an executive order banning any federal department or agency
00:13:00.740 from colluding with any organization, business, or person to censor, limit, categorize, or impede the lawful speech of American citizens.
00:13:11.220 I will then ban federal money from being used to label domestic speech as mis- or disinformation.
00:13:19.540 And I will begin the process of identifying and firing every federal bureaucrat who has engaged in domestic censorship directly or indirectly,
00:13:30.300 whether they are the Department of Homeland Security, the Department of Health and Human Services, the FBI, the DOJ, no matter who they are.
00:13:40.060 I guess it's sad that this has to be done.
00:13:46.060 Like, how did we get here where we have to be like, we're going to have to fire the people who, you know, we shouldn't be-
00:13:53.260 We're manipulating elections.
00:13:54.800 Yeah.
00:13:54.940 Yeah.
00:13:55.220 Like, hmm.
00:13:57.020 How did that happen in the first place?
00:13:59.520 But no, I think that this is really important, and I'm glad that there's going to be consequences for this.
00:14:04.940 And I think we don't just need to fire them, we need to fire them and say why they're being fired.
00:14:08.780 Well, yeah, I want to be public about firing, but I want to be really public about, you know, what has happened and what we're doing about it.
00:14:17.340 I think, I've heard some people commenting on their hope for the Twitter files, but for various government departments, like, what actually happened with COVID?
00:14:25.540 What's actually going on with all these different things?
00:14:28.620 Like, just release the information.
00:14:30.980 I mean, one, the government should be transparent and public about these things anyway.
00:14:35.100 Unless it's top secret stuff that's going to get spies killed, then it should be 100% transparent and available for people.
00:14:42.260 We have the technology.
00:14:43.440 We have the ability.
00:14:44.540 And honestly, the accountability would make everything work more smoothly.
00:14:48.980 So-
00:14:49.580 Well, and here's what I think a lot of government officials didn't realize.
00:14:52.600 And if we get in the Trump administration, this is something I want to work on implementing if we can, is do mass scrapes of government data into AI to find all evidence of collusion and wrongdoing.
00:15:04.940 Because I think a lot of people thought there was just too much information to sort through.
00:15:08.780 And with AI, that's no longer the case.
00:15:11.280 I mean, maybe, except what I will say is among the most effective people at knowing to keep things offline are the government people.
00:15:21.360 They know to meet in person.
00:15:23.080 They know to do it offline.
00:15:25.260 No paper trail, et cetera.
00:15:27.280 They're among the very, very, very best.
00:15:29.180 We can look for unusual patterns, which is what AI is very good at doing.
00:15:32.720 Yes, you absolutely can.
00:15:34.260 You can look for connections and social graphs and things like that.
00:15:37.220 I would just say that that isn't going to be as exciting as it would be for-
00:15:41.940 I disagree.
00:15:43.000 And here's why I disagree.
00:15:44.420 Because we got the Hillary Clinton email leaks.
00:15:47.260 That's true.
00:15:48.320 That's true.
00:15:49.140 And this is supposed to be-
00:15:49.920 They talk in coded language, but it's very obvious that it's coded language when it's coded language.
00:15:54.200 We can't talk about that, because if we talk about pizza-related maps, I don't know what's going to happen.
00:16:00.880 Yeah, I don't know what's going to happen, but I can tell you they definitely weren't looking for a pizza-related map, because that's not a real thing.
00:16:07.980 Are you kidding me?
00:16:09.380 Haven't you seen-
00:16:10.640 No, sorry, this is where, like, all of my walls broke down.
00:16:13.400 It's when they were talking about pizza-related maps on a napkin.
00:16:17.380 That would have been an amazing Halloween costume, is, like, some kind of explorer with, like, a giant, like, pizza-related map, like, an actual one.
00:16:25.680 And, like, you're actually-
00:16:26.860 No, but I-
00:16:27.320 Your Halloween costume is you are a PDF file, but it turns out that, like-
00:16:31.620 Sorry, I remember seeing that, and I was like, wait.
00:16:34.760 Even if we're going to say this isn't what-
00:16:36.500 And I don't think it's what the conspiracy theorists thought it was.
00:16:39.260 We need to at least admit that obviously coded language from the Clinton campaign was released in the emails.
00:16:46.020 Why is nobody asking follow-up questions to this?
00:16:49.540 Even if we admit that, okay, yeah, oh, the cheese bread-related date and all that is wrong, you know, all that's nonsense.
00:16:56.880 Why were they emailing a party about a pizza-related map on a napkin?
00:17:02.440 For those who don't know what we're talking about here, in the email leaks that were confirmed to be actual leaked emails, even by the campaign, there is a quote where one person is asking another person,
00:17:13.660 The realtor found a handkerchief.
00:17:16.200 I think it has a map that seemed pizza-related.
00:17:19.260 Is it yours?
00:17:20.260 They can send it to you if you want.
00:17:22.260 I know you're busy, so feel free not to respond if it's not yours or you don't want it.
00:17:27.020 And then replies,
00:17:27.920 It's mine, but not worth worrying about.
00:17:30.220 Like, we all agree that obviously they are not talking about a handkerchief that is pizza-related.
00:17:36.800 What are they talking about here?
00:17:38.300 It's clearly a code for something.
00:17:40.760 Or here's another fun quote from them.
00:17:42.580 We plan to heat the pool, so a swim is a possibility.
00:17:46.140 Bonnie will be Uber service to transport Ruby, Emerson, and Maeve Luzzatto.
00:17:50.380 11, 9, and almost 7.
00:17:52.540 So you'll have some further entertainment, and they will be in that pool for sure.
00:17:56.900 Pizza-related map on a napkin.
00:17:58.500 They obviously were not following up with a party looking for a napkin.
00:18:02.860 Like, so what was it there?
00:18:05.180 What was the thing?
00:18:06.340 What were they talking about?
00:18:07.840 Right, yeah.
00:18:08.520 Why is there no natural curiosity among the reporter class?
00:18:11.340 And I'm suddenly realizing that we're super not cool for not having, like, code words for all the evil things we do.
00:18:16.400 What's up with that, Malcolm?
00:18:18.360 Mm.
00:18:19.480 Who knows how much this video is going to be suppressed, right?
00:18:21.900 But here's the thing.
00:18:22.960 Not for long.
00:18:23.900 But, you know, the other thing I wanted to talk about, this being mentioned here, is to have an order banning federal departments and agencies from colluding with any organization, business, or person to censor, limit, categorize, or impede the lawful speech of American citizens.
00:18:38.140 And, like, why didn't he campaign on this shit?
00:18:43.520 Like, it is so weird.
00:18:45.620 What's so cool about Trump winning is normally I'm used to getting really excited about a candidate, and then they win, and then they proceed to immediately disappoint me.
00:18:56.900 Yeah, no, this is like, he won, and we're unwrapping presents, and instead of them being full of shit, they're full of good things.
00:19:02.180 He wasn't even promising us this.
00:19:03.640 Yeah, it's like, wait, how did you even know?
00:19:06.140 I didn't even put this on my wish list, and yet here it is.
00:19:09.580 This is amazing.
00:19:10.600 Yeah.
00:19:11.040 Second, I will order the Department of Justice to investigate all parties involved in the new online censorship regime, which is absolutely destructive and terrible, and to aggressively prosecute any and all crimes identified.
00:19:27.560 These include possible violations of federal civil rights law, campaign finance laws, federal election law, securities law, and antitrust laws, the Hatch Act, and a host of other potential criminal, civil, regulatory, and constitutional offenses.
00:19:46.280 To assist in these efforts, I am urging House Republicans to immediately send preservation letters, and we have to do this right now, to the Biden administration, the Biden campaign, and every Silicon Valley tech giant, ordering them not to destroy evidence of censorship.
00:20:06.780 Okay, so damn.
00:20:08.000 All right, this isn't just, hey, we're going to tamp down on this.
00:20:11.220 He's out to punish.
00:20:12.500 He's out to punish people.
00:20:13.560 Yeah, I love the spurious lawsuits they hit him with for not labeling his prostitute payments as prostitute payments.
00:20:19.900 They got him with a felony for this, and they're actually doing nefarious shit, and they felt safe.
00:20:25.700 And I am so glad, all of these Dems being like, oh, he's going to hit us with these spurious lawsuits.
00:20:31.100 No, no, no.
00:20:31.920 They weren't afraid of spurious lawsuits.
00:20:33.700 They knew they were doing shady, shady things.
00:20:37.740 And now they're going to get hit with real lawsuits for all the real naughties that they did.
00:20:43.120 And I am so excited, so excited for people who did bad things to finally be punished.
00:20:50.920 And we just have to make sure all this goes through.
00:20:53.300 We need to make sure this government operates efficiently.
00:20:56.340 The machine, if we don't stop the machine now, if they win next time, it's going to take over.
00:21:03.000 However, our world, our civilization is so close to being subsumed.
00:21:08.760 We have a chance to recreate a great America, a real great America.
00:21:14.920 I'm so excited for that.
00:21:16.720 And for the people who don't know what he's talking about with these preservation letters,
00:21:18.960 it's basically letters saying that you can't delete things or it's like super illegal.
00:21:22.160 And this can be a huge problem for a company like Google, because if they delete them and then the government
00:21:25.820 goes to Google and finds them on their servers and stuff, it's not going to look good.
00:21:29.840 Yeah.
00:21:30.080 Yeah.
00:21:30.560 It's not good.
00:21:31.280 That's, again, why I actually think the government people are really good about this.
00:21:36.880 They're among the first people that I learned from the whole call me.
00:21:40.540 Like, you know those people you've met where every time you email them about something,
00:21:43.820 they're like, call me.
00:21:44.620 Or like, let's meet in person.
00:21:46.320 And you know that those, like, those people almost invariably have worked with the government
00:21:50.660 or have been involved with the government in some way, which is really funny.
00:21:54.240 Well, what if AI can go through phone records now?
00:21:59.580 I'm just saying I'm going to have a lot of fun with all the nefariousness that they've done
00:22:04.040 if we have confidence.
00:22:04.860 I do.
00:22:05.280 Well, again, I mean, this indicates to me that we do have a shot at this Twitter file for
00:22:10.460 everything that's been going on in the government.
00:22:12.180 And I'd love for that.
00:22:12.740 I want it to all come out.
00:22:14.620 I think this is a sign that we are, whatever.
00:22:22.660 Third, upon my inauguration as president, I will ask Congress to send a bill to my desk
00:22:27.780 revising Section 230 to get big online platforms out of the censorship business.
00:22:35.840 From now on, digital platforms should only qualify for immunity protection under Section 230
00:22:41.760 if they meet high standards of neutrality, transparency, fairness, and non-discrimination.
00:22:48.440 We should require these platforms to increase their efforts to take down unlawful content such
00:22:55.220 as child exploitation and promoting terrorism while dramatically curtailing their power to
00:23:02.960 arbitrarily restrict lawful speech.
00:23:05.800 Don't you love this?
00:23:07.120 Because this is what everyone's going to say.
00:23:08.480 They're going to say, well, what if people use this for like negative things?
00:23:11.440 And he's like, no, child exploitation and promoting terrorism.
00:23:16.000 Those are the two categories where we will clamp down.
00:23:19.600 And I think they need to be very clear about promoting terrorism because people will try
00:23:23.840 to claim, because they'll redefine like everything the right's doing.
00:23:26.480 It's terrorism to misgender someone.
00:23:28.700 It's terrorism to, you know, that's exactly the direction they're going to go.
00:23:31.980 So if we get in the administration, we're going to work really hard to make sure that the language
00:23:34.800 is super clear on this.
00:23:36.200 But I love this.
00:23:39.500 This is transformational.
00:23:43.400 It's something that Democrats and Republicans should be able to get behind.
00:23:46.080 But many Democrats won't because they know that the system benefits them.
00:23:48.760 They know that they have been colluding.
00:23:50.320 They know all the evil done by their party.
00:23:52.340 But I think enough of them will that we can get this through the House and the Senate,
00:23:56.640 even if we have a few defectors.
00:23:58.700 Which is so exciting, so exciting that this is happening.
00:24:03.740 And keep in mind how much this would change our YouTube channel, our daily life, because
00:24:07.940 every day with this YouTube channel, I expect the next day to wake up and have the channel
00:24:12.360 banned.
00:24:13.920 That's our life, right?
00:24:15.380 Like I can try to grow an audience on Rumble, but it's basically impossible.
00:24:18.460 I can try and grow an audience on Substack where we post full videos.
00:24:21.680 You can see the full videos and often less censored versions of them on Substack.
00:24:26.140 But I know that's not going to happen, right?
00:24:28.700 If you want to sign up for a Substack, you'll get an email every day when the video goes
00:24:32.080 live.
00:24:32.600 And before it goes live on YouTube, by the way.
00:24:34.840 But is anything going to come of this?
00:24:36.300 No, no, no.
00:24:37.200 I'm not going to be able to build up anything.
00:24:38.760 YouTube is the algorithm.
00:24:39.940 YouTube gets us in front of people, right?
00:24:42.200 Everything else is an indulgent shitpost, basically.
00:24:45.980 Which is why we edit everything with the assumption that people are going to be watching it visually.
00:24:49.880 Oh, wow.
00:24:50.540 She has got you hard.
00:24:53.600 You got to fight her there.
00:24:54.940 But this would be transformational to your and my job, career, and life.
00:25:01.540 I've always been really concerned about free speech on the internet and the extent to which
00:25:05.260 companies are sort of forced to quash free speech.
00:25:09.460 I do still think that we're going to see a certain amount of censorship based on people's
00:25:13.880 fears around advertisers.
00:25:15.340 But I think...
00:25:15.720 No, no, no.
00:25:16.020 This makes that illegal.
00:25:18.240 Oh, am I not getting this right?
00:25:19.620 Oh, interesting.
00:25:19.820 Yes, you're misunderstanding.
00:25:20.960 This makes it illegal for a company to say, I am censoring stuff based on advertisers.
00:25:25.620 Oh, even advertisers.
00:25:27.020 Okay, wow.
00:25:27.580 Yeah, I guess I hadn't processed that.
00:25:29.500 That's...
00:25:29.980 You can only censor two categories:
00:25:33.240 things that exploit children and things that promote terrorism.
00:25:35.360 I mean, there will be ways around that because money is money.
00:25:39.040 Like a business that gets put out...
00:25:40.880 No, I disagree.
00:25:42.140 First of all, the whole advertiser thing was always fake.
00:25:45.180 It was always fake with companies that have been captured by the woke virus trying to force
00:25:51.160 the virus into new environments and protect the virus from the burning light of truth.
00:25:57.040 And so what they would do is they would force these other organizations...
00:26:02.000 This is why...
00:26:02.560 Hold on.
00:26:02.820 I'll give you an example.
00:26:03.920 Elon buys Twitter, right?
00:26:05.560 And turns it into X. Disney leaves because they're all like, oh, you're not censoring
00:26:11.100 people enough.
00:26:11.960 You know, they recently came back.
00:26:14.340 Yeah, they completely capitulated on that.
00:26:16.480 They weren't leaving for business reasons.
00:26:18.760 They were leaving to enforce their cultural values.
00:26:22.100 That's interesting.
00:26:23.040 Wow.
00:26:23.640 Okay.
00:26:25.280 Okay?
00:26:25.580 Yeah.
00:26:26.600 Well, I guess advertisers won't leave in the future if they know that that wouldn't work
00:26:29.980 anyway.
00:26:30.540 So they'll just be where the people are.
00:26:32.620 Yeah.
00:26:33.180 There's a lot of companies today that are not interested in maximizing their profits.
00:26:37.260 They're interested in enforcing an ideological value set because they have been captured,
00:26:41.280 like the cordyceps virus or fungus, by this self-replicating memetic set that only cares
00:26:46.240 about protecting and replicating itself.
00:26:48.240 And that won't be allowed anymore.
00:26:52.580 That is what is being made illegal here.
00:26:54.280 Can you imagine, by the way, Europeans who are like, I don't see how America is that
00:26:58.320 different from Europe.
00:26:59.320 I don't see how you're more free than we are.
00:27:01.680 Do you get it now?
00:27:03.940 Ding dong.
00:27:04.920 It's America, motherfucker.
00:27:06.780 Do you get that Trump was allowed to win in this country?
00:27:10.880 Oh yeah.
00:27:11.420 They tried to kill him a few times, but he had old Shinzo Abe protecting him.
00:27:15.680 And they, their witches and their spells.
00:27:19.200 There are these great, like, posts that are like, we, we Wiccans, we're trying to cast
00:27:24.120 spells on him, but he seems to have some protective force field around him that's protecting him
00:27:28.660 from our magic.
00:27:29.560 And there's all these like Shinzo Abe things.
00:27:31.300 And that's the reality is we have divine protection because we will save this civilization where
00:27:39.320 all of you bureaucrats and all these Europeans, right?
00:27:41.680 They watch our show and they're like, why are you so anti-German?
00:27:44.080 Why are you so anti-?
00:27:45.260 Because you capitulated.
00:27:47.060 You allowed your countries to fall.
00:27:50.400 Why are we so anti-French?
00:27:51.540 Because of what you have descended into.
00:27:54.900 Okay.
00:27:56.080 When you show me that your countries can be based and awesome, like the Trump administration,
00:28:02.600 then I'll be a bit more excited for you.
00:28:04.260 And I've been excited about some European countries.
00:28:06.260 We said some positive things about some of the governments in other countries and some
00:28:10.780 of the other countries themselves, when they have elected, when they have stood on the
00:28:14.500 side of civilization and not on the side of the barbarians that wish nothing but to destroy
00:28:21.880 civilization.
00:28:24.000 Anyway, sorry if I'm offensive there.
00:28:28.100 I'm a little bit of a patriot myself.
00:28:29.680 America.
00:28:34.200 America, fuck yeah.
00:28:37.040 Coming again to save the motherfucking day, yeah.
00:28:40.080 America, fuck yeah.
00:28:42.780 Freedom is the only way, yeah.
00:28:45.880 Fourth, we need to break up the entire toxic censorship industry that has arisen under the
00:28:52.180 false guise of tackling so-called mis- and disinformation.
00:28:56.040 The federal government should immediately stop funding all nonprofits and academic programs
00:29:02.360 that support this authoritarian project.
00:29:05.380 If any U.S.
00:29:06.640 university is discovered to have engaged in censorship activities or election interference in the past,
00:29:14.180 such as flagging social media content for removal or blacklisting, those universities should lose federal
00:29:21.700 research dollars and federal student loan support for a period of five years and maybe more.
00:29:27.600 We should also enact new laws laying out clear criminal penalties for federal bureaucrats who partner with private entities to do an end run around the Constitution and deprive Americans of their first, fourth and fifth amendment rights.
00:29:43.860 In other words, deprive them of their vote.
00:29:47.560 And once you lose those elections and once you lose your borders like we have, you no longer have a country.
00:29:54.340 Furthermore, to confront the problems of major platforms being infiltrated by legions of former deep staters and intelligence officials,
00:30:03.640 there should be a seven-year cooling-off period before any employee of the FBI, CIA, NSA, DNI, DHS or DOD is allowed to take a job at a company possessing vast quantities of U.S. user data.
00:30:22.120 I absolutely love this concept of a digital bill of rights.
00:30:25.560 I, there's something so scary and empty when you get disempowered by a system and there's just absolutely nothing you can do.
00:30:35.060 Like, oh, sorry, like your account has been completely locked.
00:30:38.260 And this isn't just with regard to censorship.
00:30:41.300 I think this is also with regard to like your data, like all of your photos are shared on some platform and suddenly they just decide that they're going to shut you down.
00:30:49.140 I think that there do need to be better citizen rights: if you are a social media or data management company domiciled in the U.S.,
00:30:58.160 you should be subject to giving your users some minimal rights.
00:31:02.940 I mean, I'm really against regulation and the EU has gone way overboard with this.
00:31:06.200 But when it comes to, like, just literally saying, you know, you can't even export your data, all of this is pretty messed up.
00:31:15.260 Well, one, come on, don't you love that all of these academic programs and all of these academic systems, because the university systems have become the priesthood caste, or at least the certification for the priesthood caste, of the urban monoculture, of this memetic virus that's spreading through society.
00:31:34.400 And they have enforced it.
00:31:35.940 They have done, quote unquote, studies to enforce what's said and stuff like that.
00:31:40.240 And they have had no fear around doing this to people.
00:31:42.900 They have had no fear around censoring the common American citizen for saying normal things.
00:31:48.620 And what he is doing here is, these people who sit around all day, these gender studies people, these fat studies people who don't have to do real research because it's not a real field of study, and who come up with ways to enforce the urban monoculture on people who are not of that cultural subsystem,
00:32:06.320 they now, they now have to fear. There are now consequences to doing this to other people.
00:32:15.140 And frankly, if somebody was like, could you roll out the most pronatalist policies you could conceive of?
00:32:22.180 It would be this plus his education policy.
00:32:24.960 With the education policy, he's breaking the urban monoculture's back there.
00:32:28.200 He's doing the Bane Batman smash on the urban monoculture.
00:32:32.700 And here he's tossing the corpse in a ditch because this is how they enforced their cultural value system on the disintermediated conversation that's been happening online.
00:32:43.360 And the level of safety, I feel, for this country and the future, if this stuff gets enacted, is absolutely enormous.
00:32:54.700 Fifth, the time has finally come for Congress to pass a digital bill of rights.
00:33:00.460 This should include a right to digital due process.
00:33:04.640 In other words, government officials should need a court order to take down online content, not send information requests such as the FBI was sending to Twitter.
00:33:15.980 Furthermore, when users of big online platforms have their content or accounts removed, throttled, shadow banned, or otherwise restricted, no matter what name they use,
00:33:28.980 they should have the right to be informed that it's happening, the right to a specific explanation of the reason why, and the right to a timely appeal.
00:33:39.260 In addition, all users over the age of 18 should have the right to opt out of content moderation and curation entirely, and receive an unmanipulated stream of information if they so choose.
00:33:53.400 The fight for free speech is a matter of victory or death for America and for the survival of Western civilization itself.
00:34:01.320 To sort of sum things up, what I really love about this is that we may be returning to a time of genuine discourse and letting the best ideas win.
00:34:29.520 So when you go back to, like, the old university systems, they were all about debate, and often very raucous, very animated, very passionate debate, but in the way that, like, sports teams compete, right?
00:34:42.720 You know, people would get animated, but there wouldn't be personal attacks.
00:34:45.840 It'd be letting the best ideas win.
00:34:47.900 And the problem with censorship and moderation is that the best ideas can't win, and even worse, really shitty ideas end up in echo chambers where they get shittier, and there's no challenging them.
00:34:59.160 And if we return to a system where this is no longer allowed, where dumb ideas aren't shunted to isolated bubbles where they become stronger and fester, but instead are allowed to just be exposed to mainstream ecosystems so they can die out and be challenged, I'm thrilled.
00:35:17.020 And this is great.
00:35:18.220 I mean, it could, I think a lot of people think that unmoderated internets turn to greater and greater extremism.
00:35:23.460 I think a lot of the extremism and polarization that we see as a result of the internet isn't a result of free discourse.
00:35:29.880 It's a result of the very lack thereof.
00:35:31.740 It's a result of the moderation of the isolation and the formation of these siloed bubbles, specifically because there is censorship.
00:35:39.840 Yeah.
00:35:40.400 Here, what I want to talk about is, one, the idea of a digital bill of rights.
00:35:43.560 Why didn't we have this before?
00:35:44.960 This seems like an obvious thing that we should have, given that the digital world is the medium through which we communicate.
00:35:51.280 You cannot have free speech if someone else owns the air and can stop certain words or phrases or ideas from traveling through the air, or can make them a little quieter when they're traveling through the air.
00:36:03.220 And that's what's been happening.
00:36:05.720 That's why we haven't had a real free speech ecosystem for a while.
00:36:09.860 And I think here that the idea of digital due process, that one, the FBI and other organizations like that, if they're demanding something from a company, they need to go through judges.
00:36:22.360 They need to get a court order.
00:36:23.940 They need to get basically a search warrant.
00:36:26.420 Like, that is amazing.
00:36:28.340 Why wasn't that the case before?
00:36:29.500 It's one of these things where it's like so dumb that it wasn't the case before, that just a random person at the FBI could be like, I don't want Trump to win.
00:36:35.040 Okay, yeah, sure, I'm going to call up and get the Hunter Biden laptop story erased.
00:36:40.200 How did that happen?
00:36:41.580 That should, our democracy shouldn't work that way.
00:36:44.180 And yet it works that way all over Europe.
00:36:45.740 It works that way all over the United States these days.
00:36:47.460 This is horrifying.
00:36:49.800 Horrifying.
00:36:50.260 And I also am glad that he's putting consequences on the people who are using the government in this way.
00:36:55.620 And then in addition to that, the idea that you cannot be shadow banned without being explained why you were shadow banned and being told that you're shadow banned.
00:37:04.220 But better than that is you have to be told why.
00:37:07.180 I can't tell you all the times YouTube will say one of our videos is ineligible for monetization.
00:37:13.000 And I'll be like, why?
00:37:15.480 Why?
00:37:16.880 I don't know.
00:37:17.960 Like, you've got to tell me where.
00:37:19.940 Why?
00:37:20.260 Why specifically?
00:37:21.400 What word?
00:37:23.120 That was an hour-long video that was mostly about, like, science or something like that.
00:37:27.920 And you guys don't even know.
00:37:29.060 It's like the most benign stuff that this happens to.
00:37:35.560 Why?
00:37:36.920 Why did you shadow ban me?
00:37:39.120 Why did you block my video?
00:37:40.640 Why aren't people seeing it?
00:37:41.980 And I can't be told it.
00:37:42.940 I always know.
00:37:43.980 Look, it's whatever the video.
00:37:46.960 And what's interesting is it's not like the sexual videos.
00:37:49.320 The sexual videos never get blocked.
00:37:52.460 It's the video.
00:37:53.260 Isn't that weird?
00:37:53.860 Yeah.
00:37:54.420 Where we talk about stuff that you're not supposed to know.
00:38:00.460 Well, yeah.
00:38:01.180 It seems like the common themes are certain names and certain keywords, right?
00:38:06.700 No, that's not the common theme.
00:38:08.200 I can't even say the common theme.
00:38:09.600 Oh, no.
00:38:10.020 Really?
00:38:11.020 The common theme is basically how scared you should be of government overreach.
00:38:14.920 Oh.
00:38:15.680 Hmm.
00:38:17.040 That's the common theme.
00:38:18.200 The common theme is how broken is our system at a fundamental level?
00:38:23.320 And whenever you provide proof of that, you get banned.
00:38:26.900 And that's shocking to me that we've reached this level in our country.
00:38:31.280 And watch the AI ban this one too, because I can tell they're doing it with AI now.
00:38:35.100 And that's a recent switch they made.
00:38:36.500 It cares about Google's interests over the interests of the watcher.
00:38:40.000 It cares about protecting the urban monoculture from evidence of
00:38:48.200 Yeah.
00:38:49.760 And I think that this is the thing.
00:38:51.540 That's what gets me, is evidence.
00:38:53.160 Presenting evidence is the thing that is most likely to get you banned.
00:38:56.920 That's not good.
00:38:58.820 Yeah.
00:38:59.280 Yeah.
00:38:59.380 So, how do you feel about Trump winning right now?
00:39:04.660 Pretty damn good.
00:39:05.980 Yeah.
00:39:06.540 How do you feel?
00:39:07.240 As you said, I'm so used to people winning, and I unbox it, and it's full of shit.
00:39:13.360 And here I'm unboxing it, and I'm like, why weren't you telling me this on the campaign trail?
00:39:16.540 Yeah.
00:39:17.180 Yeah.
00:39:17.600 Like, and you know why I think he wasn't telling?
00:39:20.240 Because he didn't have the team around him yet.
00:39:21.840 Now he's got Vivek.
00:39:22.680 Now he's got Elon.
00:39:23.300 The new right are not old government deep state people that he's staffing everything with.
00:39:28.780 They are actually new outside Silicon Valley entrepreneurial types.
00:39:33.900 Who did build?
00:39:34.540 Who built?
00:39:35.200 Who fixed?
00:39:35.980 Yeah.
00:39:36.760 They fix things.
00:39:38.260 Yeah.
00:39:38.560 They're about fixing things.
00:39:41.080 Yeah.
00:39:41.240 And I think that that's why.
00:39:43.240 It's because if you were unwrapping the package that the Heritage Foundation had laid, it was like, and we like the Heritage Foundation.
00:39:49.320 They're going to get on board with the new right.
00:39:50.540 I know it.
00:39:50.960 They'll be 100% soon.
00:39:52.560 We work with them to improve.
00:39:54.340 But it was: ban pornography.
00:39:56.520 Censor women on the internet.
00:39:58.220 You know, like, all of this stuff that's like actually like far lefty stuff if you're in online circles.
00:40:02.880 You know, like the Tracer butt controversy, the Stellar Blade controversy.
00:40:05.540 I'm like, why are you being a nanny state?
00:40:07.420 You know, and they're like, what do you mean why are we being a nanny state?
00:40:10.020 I thought this was right-leaning.
00:40:11.260 And I was like, this shit hasn't been right-leaning in 20 years.
00:40:14.220 Like, what are you talking about?
00:40:16.180 You're acting like a blue-haired feminist right now.
00:40:19.660 I mean, well, and I think, you know, so the people at Heritage, I think I'll get it.
00:40:26.960 I think a lot of it just has to do with organizational inertia.
00:40:30.100 And so that's why I'm not too worried about them.
00:40:32.520 Well, who knows?
00:40:33.040 You know, they might hire us this next time around and we can help vet their stuff for anything that does not fit the new right's agenda.
00:40:39.180 We'll see.
00:40:39.640 I doubt that, but.
00:40:42.180 I wouldn't be surprised, Simone.
00:40:44.140 Organizational inertia.
00:40:45.180 There's a line.
00:40:46.220 So for all of the DC-based think tanks, there is a long line of people, and you have to pay your dues and do your time, and then you get put up through those organizations.
00:40:56.100 Some organizations are willing to reform faster than others.
00:40:59.120 Like, the Heritage Foundation of all the organizations we've talked to, which is actually kind of ironic because they've got all the hate for the, you know, Project 2025.
00:41:05.580 We're going to ban pornography.
00:41:06.580 We're going to ban furries.
00:41:07.400 We're going to ban.
00:41:07.800 Anyway, they basically, they got hit by all of this mindset the hardest when they're actually the most open to change.
00:41:16.480 And I actually say, if you're reading Project 2025 and you're like, some of this stuff is really, like, old-school, ultra-progressive, nanny-state stuff:
00:41:24.060 of all the organizations, the Heritage Foundation is the most open to change.
00:41:28.860 They are the most quickly changing.
00:41:30.840 They are the most likely to be our allies.
00:41:33.680 Yeah, they're very agile.
00:41:35.460 They're very smart.
00:41:36.420 They're very pragmatic.
00:41:37.920 With other groups.
00:41:38.600 I won't name the other groups, but they're all big names.
00:41:41.120 Every single one of them is worse than the Heritage Foundation.
00:41:43.600 Yeah.
00:41:43.780 And so what this tells me when I read these plans is that these people aren't in the room when this stuff is happening.
00:41:49.700 And this is why they're complaining about people like Elon giving too much advice to the president.
00:41:54.080 That was a Drudge Report story recently.
00:41:55.280 They're like, Elon, like staffers tell us that Elon's like a bad guest staying at Mar-a-Lago too long.
00:42:00.200 I think what's happening is Elon is doing demon mode on the transition, which is what he does.
00:42:05.060 He does sprints.
00:42:06.640 He goes in and he goes super, super high focus on stuff because that's how he fixes problems.
00:42:12.880 He doesn't go slow.
00:42:14.040 He doesn't.
00:42:14.800 And this is honestly very appropriate.
00:42:16.440 This kind of focus, hyper-focus, and basically pit-bull biting down on stuff until it's fixed is the way to go during a governmental transition because these initial days are so essential.
00:42:30.580 Yeah, I agree 100%.
00:42:32.980 I think our country and the world owes him a great debt.
00:42:36.640 And for people who want to attack Elon, you can shut the fuck up.
00:42:41.080 I don't think anyone is.
00:42:42.220 I guess you're saying he's getting a lot of hate.
00:42:44.540 No, people always, whenever you say nice things about Elon, there's this like sewer class.
00:42:49.400 But there are people who don't like him?
00:42:51.200 Oh yeah, even in our comments.
00:42:52.680 Despite, like, his track record of shot calling and just making the impossible happen.
00:42:56.660 Don't you know, he didn't really found those companies.
00:42:59.620 And I'm like, bitch, do you have any idea how hard it is to grow a company?
00:43:02.240 When he came into these companies, they were often nothing burger companies.
00:43:05.980 Yeah, founding companies is, like, alarmingly easy.
00:43:09.840 It's dealing with them after they're founded.
00:43:11.860 That's really freaking hard.
00:43:13.060 And they're like, he didn't develop this technology.
00:43:15.400 It's like, yeah, he hired the people and bought the companies that developed this technology.
00:43:19.020 That's really hard.
00:43:19.740 Other people could have done that.
00:43:21.260 Like, do you not, at a fundamental level, understand how, like, money works or entrepreneurship works?
00:43:27.600 Or they'll say, oh, he got it all through government contracts.
00:43:30.380 Good.
00:43:31.360 Most major companies make a ton of money through government contracts.
00:43:35.320 That means that his founders don't know what they're doing.
00:43:38.520 Well, when you're doing something like space travel, it's kind of hard to not get involved
00:43:46.800 with government contracts.
00:43:47.860 Through his own money.
00:43:49.280 Yeah.
00:43:50.320 Why did he achieve more than NASA did in X many years for like a fraction of the cost
00:43:55.420 on his own money?
00:43:56.940 What are you talking about?
00:43:59.540 Yeah.
00:44:00.820 Anyway.
00:44:01.840 Well, what do you want for dinner?
00:44:06.260 Well, what are my options?
00:44:07.940 Did you still cook anything?
00:44:09.980 No, I have to do that tomorrow morning.
00:44:12.300 So tonight, your options are...
00:44:15.100 Do we have any chicken left?
00:44:17.860 Chicken from what?
00:44:19.520 From the teriyaki chicken pack.
00:44:20.760 Yeah.
00:44:21.240 I can do teriyaki chicken.
00:44:22.800 I can do that with rice.
00:44:23.700 I can do that with gyoza.
00:44:24.400 I can do that with stir fry vegetables too.
00:44:26.200 Do that with the teriyaki chicken with the long hot and some scallions and then mix in
00:44:31.820 some sauces that I'll leave you.
00:44:33.460 You did a great job last night.
00:44:34.420 Would you like me to just do penang and teriyaki?
00:44:37.420 And I couldn't find the oyster sauce, but you didn't seem to miss it.
00:44:41.100 If I put it on the table, it was a bottle of the drippy stuff.
00:44:45.960 It was a bottle in the middle of the table.
00:44:48.700 Oh, okay.
00:44:49.400 Well, I mean, you'd put out the other stuff, but I...
00:44:53.200 I put a bucket of penang next to a bottle with something black and drippy.
00:44:57.200 That's oyster sauce.
00:44:58.740 Yes, I did.
00:44:59.400 I put them both back in the fridge too after giving them to you.
00:45:02.000 You put a little too much penang in last time, so put less of that in.
00:45:05.120 Yeah, I thought that right after. Instant regret, but you know...
00:45:10.880 Put in about half of the penang you put in last time, use the oyster sauce, and I can flavor
00:45:15.260 the rest myself, and you can just heat up some rice for me, or you can cook some...
00:45:19.940 No, let's just heat up rice.
00:45:20.920 That'll be easier.
00:45:21.900 What were you thinking otherwise?
00:45:23.440 Stir fry vegetables?
00:45:24.420 Because I can just add vegetables.
00:45:25.400 Yeah, I was going to say stir fry vegetables, but you could cook the stir fry vegetables with
00:45:28.440 the chicken, I guess?
00:45:29.480 That's what I was thinking.
00:45:30.340 Or the stir fry potato thing that we have as well?
00:45:32.800 No, no, no, no.
00:45:33.620 Those rosemary potatoes are a different kind of flavor profile, I think.
00:45:37.480 But what we...
00:45:38.600 So are you okay with vegetables instead of rice, or you still want rice?
00:45:42.040 Yes.
00:45:43.060 Oh, no, not instead of rice.
00:45:44.780 I'm okay with you cutting up some vegetables and cooking them with the chicken, but not
00:45:47.960 instead of rice.
00:45:49.060 Just add some vegetables.
00:45:50.880 I love you.
00:45:51.980 I love you too.
00:45:55.060 And oh my gosh, I get to start dinner early.
00:45:58.080 This is really nice.
00:46:03.620 What are you doing?
00:46:10.320 Octavian, what do you have to do?
00:46:17.180 Are you fishing?
00:46:18.440 Yeah.
00:46:19.620 Are you a good fisherman?
00:46:20.480 I'm trying to catch a fishbowl on it in the fish.
00:46:26.300 Okay.
00:46:28.900 I'm going to take a fishbowl and hook it, okay?
00:46:33.580 Daddy, daddy, daddy.
00:46:34.820 You'll bring the fish up and grab it.
00:46:36.920 Daddy.
00:46:38.420 It's not going to fall.
00:46:41.720 Yeah, don't play with that.
00:46:43.800 Do you like hot of fish, Dad?
00:46:45.480 Yeah.
00:46:45.620 I'm not going to fall.
00:46:46.240 I'm not going to fall.