Timcast IRL - Tim Pool - April 27, 2022


Timcast IRL - Biden Forms DHS Ministry Of Truth Amid Elon Musk Twitter Win w-Sharyl Attkisson & Poso


Episode Stats

Length

2 hours and 5 minutes

Words per Minute

192.4

Word Count

24,244

Sentence Count

2,083

Misogynist Sentences

29

Hate Speech Sentences

45


Summary

Biden's Department of Homeland Security has a new department called the Ministry of Truth, and it's run by a woman who is lamenting Elon Musk's takeover and is basically a Russiagate proponent. We discuss all of this and much more on today's episode of Hot Messengers.


Transcript

00:00:01.000 Something crazy is going on.
00:00:03.000 So on Twitter, my What's Happening section, it's this curated feed that appears on the right side of the screen.
00:00:10.000 All of a sudden, skateboarding is trending, baseball is trending, Ubuntu is trending, and I'm like, I like those things.
00:00:18.000 Baseball's kinda meh, but I like skateboarding, and I like Linux.
00:00:22.000 Why is skateboarding trending?
00:00:24.000 The entire time I've been on Twitter, or at least for the past seven or eight years, it's been completely political.
00:00:30.000 There was even this one period where Twitter was trending a story that was about me that had no traction, no interest, and it was completely fabricated.
00:00:40.000 Now all of a sudden, the trending has cleared up.
00:00:43.000 Something dirty is happening behind the scenes at Twitter.
00:00:47.000 And I think there is now ample evidence suggesting that Twitter is cleaning house and trying to purge nefarious code.
00:00:54.000 I think perhaps Vijaya Gadde was crying in the meeting because they lied to Congress about what they were doing behind the scenes.
00:01:01.000 Personally, I think Twitter was suppressing right-wing accounts and creating fake left-wing accounts for the sake of healthy conversations, right?
00:01:10.000 Trying to create some kind of pseudo balance.
00:01:12.000 Now, I'll present my case for this.
00:01:14.000 It is just a hypothesis.
00:01:17.000 I won't say it's definitive because I don't know for sure, but that's how things really seem.
00:01:21.000 Amid this, all of these shenanigans with Elon Musk buying Twitter, Biden's DHS has announced a new Ministry of Truth.
00:01:29.000 It's actually some kind of department of battling disinformation, but sure.
00:01:35.000 It's being run by some woman who is lamenting Elon Musk's takeover.
00:01:39.000 Someone who is basically a Russiagate proponent.
00:01:42.000 So we can see where this is going.
00:01:44.000 They are not just going to back down.
00:01:46.000 We've now got journalists claiming Elon Musk is already in breach of contract for buying Twitter because he disparaged Twitter.
00:01:53.000 Surprise, surprise, he's not.
00:01:54.000 They're lying once again.
00:01:56.000 Senator Hawley is calling for a censorship audit on the platform.
00:02:01.000 We gotta talk about this stuff.
00:02:02.000 We got a bunch of other stories maybe we'll get to.
00:02:04.000 We've got illegal immigrants now in the United States, I think over a million.
00:02:08.000 We've got a new op-ed from Stephen Marche on civil war saying abortion may be like a large catalyst for this.
00:02:17.000 We've talked about it before, but I don't know if we'll get to all of that because so much is going on with this Twitter stuff.
00:02:22.000 It is not just about censorship anymore.
00:02:25.000 It looks like there may be some serious Enron-level illegal or malfeasant goings-on at this company.
00:02:33.000 And Elon Musk, as Jack Posobiec said, bought the evidence.
00:02:36.000 So, joining us to discuss all this today is Sharyl Attkisson.
00:02:41.000 Would you like to introduce yourself?
00:02:43.000 Hello, I'm Sharyl Attkisson, and my voice is a little funky today, but I think we can hang in there.
00:02:50.000 You want to pull the mic up a little bit?
00:02:51.000 And you can, you can keep your, you know, keep it down.
00:02:54.000 Yeah, just rest your voice and take it easy, and we'll keep the hot tea coming.
00:02:59.000 Well, I am a longtime establishment journalist.
00:03:02.000 Working for CNN and CBS and PBS before going out on my own.
00:03:08.000 And I tend to cover a lot of media issues, sort of looking at my own industry in a critical way that I think is very healthy, but a lot of other journalists tend not to do.
00:03:17.000 Right on.
00:03:18.000 Well, thank you for coming.
00:03:18.000 It should be interesting considering what we're dealing with now.
00:03:21.000 Already there are journalists trying to lie and cover up.
00:03:23.000 We've got this guy from NBC saying, Twitter says the flux in followers is all organic.
00:03:29.000 We got Jack Posobiec hanging out.
00:03:29.000 Sure.
00:03:33.000 The number one trend in the United States of America now is the Ministry of Truth because earlier today nobody was talking about this DHS Disinformation Governance Board until I happened to be on Twitter.
00:03:48.000 Someone sends me this thing that it's Nina Jankowicz who's in charge of this thing.
00:03:53.000 Nina Jankowicz, you may know her from previously claiming that Trump supporters were planning to show up at the polls on Election Day, militarized and with weapons, to intimidate people from going to the polls.
00:04:06.000 That's her in October 2020 on CNN saying that. Previously, she was a member of a Harry Potter fan band known as the Moaning Myrtles.
00:04:17.000 That's a scandal.
00:04:18.000 Who had lyrics that I don't think I can say on YouTube about, you know, obviously underage boys.
00:04:26.000 Yes, I'm serious.
00:04:27.000 And no one was talking about this other than the fact that she very obviously leaked the news to Politico and then was, you know, crowing about this.
00:04:36.000 And for some reason, Politico didn't even think to do a story on the Ministry of Truth that was being enacted by the Biden administration.
00:04:45.000 So it was up to little old me to have to go tweet this out and all the receipts of Nina Jankowicz, who she is, what they're doing.
00:04:52.000 Because Tim and everybody else here, I got to say this, and Sharyl, it's amazing.
00:04:55.000 I do hope you feel better, but I'm really honored to be on with you tonight.
00:05:00.000 She is one of the people who immediately, when she saw the Hunter Biden laptop story, said that it was Russian disinformation.
00:05:08.000 She said it was a fairy tale that he could have left his laptop in this Delaware tech shop because, you know, never heard of a crackhead losing something before.
00:05:19.000 This is the person who's now in charge of your ministry of truth.
00:05:23.000 Wow.
00:05:23.000 All right.
00:05:24.000 Well, Jax, thanks for joining us.
00:05:25.000 Thanks for blowing that one open, baby.
00:05:27.000 Also, buy Pillow.
00:05:27.000 Buy Pillow.
00:05:28.000 Buy Pillow.
00:05:30.000 My favorite book.
00:05:30.000 Just buy the Pillow.
00:05:31.000 Just buy the Pillow.
00:05:32.000 I keep being surprised that there's political pushback against Elon buying Twitter when just before it was Vanguard, BlackRock, and Morgan Stanley that owned a quarter of the company.
00:05:42.000 I never heard anyone mention that.
00:05:44.000 So I don't know.
00:05:45.000 What's better off in the hands of one man like Jeff Bezos and The Washington Post?
00:05:50.000 Or is it better off in the hands of a multinational corporation?
00:05:53.000 Anyone other than Vanguard, State Street, BlackRock, et cetera, et cetera.
00:05:53.000 You decide.
00:05:58.000 Bezos.
00:05:59.000 Yeah.
00:05:59.000 Yeah.
00:06:00.000 All right.
00:06:00.000 We also got Lydia pressing the buttons.
00:06:02.000 I am here pushing buttons.
00:06:03.000 I had a great conversation with Sharyl earlier this evening.
00:06:06.000 I'm really optimistic, hoping her voice holds out for us tonight.
00:06:09.000 Hopefully we can make it the whole show.
00:06:10.000 We'll see what we can do.
00:06:11.000 Before we get started, my friends, head over to Stronger Bones and Life and pick up your Ageless Multicollagen from Biotrust.
00:06:19.000 You can secure your supply of Ageless Multicollagen up to 51% off.
00:06:23.000 Why do you need it?
00:06:24.000 Well, if you are starting to get old, it's good for your joints, your hair, your skin, your nails.
00:06:30.000 It'll keep you looking young and supple.
00:06:32.000 And as I skate all the time, we're out on the, uh, we got a new 7-foot vert wall.
00:06:36.000 It's sort of vert, you know, it's like you ever see skateboarders go up the half pipe.
00:06:39.000 That's what we got.
00:06:40.000 And so we're getting big air and we're falling down.
00:06:43.000 We're getting hurt.
00:06:43.000 We got to keep our knees healthy.
00:06:45.000 So I got to eat this ageless multi collagen.
00:06:47.000 Just head over to StrongerBonesAndLife.com and you'll get a 60-day money-back guarantee.
00:06:52.000 The healthy aging support of collagen in its ideal forms.
00:06:55.000 Five key types of collagen you need from four different sources.
00:06:58.000 For every order today, they will donate a nutritious meal to a hungry child in your honor through their partnership with NoKidHungry.org.
00:07:06.000 To date, BioTrust has provided over 5 million meals to hungry kids.
00:07:09.000 Please help BioTrust hit their goal of 6 million meals this year.
00:07:13.000 It is non-GMO and free of artificial colors, flavors, preservatives, and sweeteners, free of gluten, antibiotics, and rBGH and rBST.
00:07:21.000 Nearly no odor or taste, unlike bone broth.
00:07:24.000 There's no clumping.
00:07:25.000 You'll get free shipping with every order, free VIP live health and fitness coaching from BioTrust's team of expert nutrition and health coaches for life with every order, and their free eReport, the 14 foods for amazing skin with every order.
00:07:37.000 Again, that's over at StrongerBonesAndLife.com.
00:07:41.000 Shout out Biotrust.
00:07:42.000 And don't forget, head over to TimCast.com.
00:07:45.000 In the top right, you will see that beautiful sign up button.
00:07:47.000 Sign up to help support our journalists and the work we do.
00:07:51.000 We recently put up a billboard in Times Square calling out the Washington Post for lying.
00:07:56.000 With your support, we will continue to call out the establishment for their lies.
00:08:00.000 And everything we're seeing right now, yo, we are storming the hill with the Daily Wire, building culture, with what we've been working on with our journalists, with the show we do here, challenging the mainstream press and the lies every night.
00:08:12.000 And then putting up a billboard with the help of the Daily Wire.
00:08:15.000 I think it's a white pill moment.
00:08:19.000 It's good times.
00:08:19.000 Good times ahead.
00:08:20.000 But this is a battle being won.
00:08:22.000 The war is not over.
00:08:22.000 So go to TimCast.com, become a member.
00:08:25.000 And you will get access to exclusive members-only segments of this show Monday through Thursday at 11 p.m.
00:08:29.000 We'll have one up for you tonight.
00:08:31.000 But don't forget to smash that like button right now, subscribe to this channel, share the show with your friends, and let's jump into the first story.
00:08:39.000 I'd like to show you this tweet from Elon Musk.
00:08:42.000 It is a meme!
00:08:43.000 It's a meme of me!
00:08:44.000 And boy, did this one trigger a lot of people.
00:08:48.000 But it's not so much about the meme.
00:08:49.000 I do want to highlight.
00:08:51.000 It shows me saying, here's an example of Twitter's left-wing bias.
00:08:54.000 Twitter says, we need to take into consideration the context.
00:08:57.000 Me then saying, the context is affected by your bias.
00:09:01.000 And then Twitter saying, we need an example of that.
00:09:03.000 And the cycle continues, and I give you another example.
00:09:06.000 Do people understand that this actually comes from your appearance with her on Joe Rogan?
00:09:12.000 Because I actually think people may not understand the context.
00:09:14.000 Well, I think a lot of people don't understand that this is just summarizing.
00:09:18.000 I went on Joe Rogan.
00:09:20.000 Man, this comes up a lot now because this keeps becoming relevant.
00:09:23.000 And we had this conversation.
00:09:25.000 Dude, that was huge.
00:09:26.000 That was huge.
00:09:26.000 Yeah.
00:09:27.000 Let me help you guys understand because this tweet's got 34,000 retweets.
00:09:31.000 What does this tweet mean?
00:09:32.000 During the show, I was sitting down with Joe Rogan, Vijaya Gadde, the top lawyer, the one who reportedly cried at a meeting, and Jack Dorsey, the former CEO.
00:09:41.000 I said, you have banned many people for saying hashtag learn to code.
00:09:48.000 That is an example of your bias.
00:09:50.000 Vijaya Gadde said, yes, but you need to take into consideration the context.
00:09:54.000 They were saying that to harass people.
00:09:57.000 I responded with, no, they weren't.
00:10:01.000 Your interpretation of that is based on fake news and leftist biased media.
00:10:06.000 And she said, I would need to see an example of that, to which I responded, here's an example.
00:10:12.000 You suspended the editor-in-chief of the Daily Caller for indirectly saying Learn to Code in a quote tweet.
00:10:20.000 He didn't tweet it at anybody.
00:10:21.000 I'm pretty sure that's what happened, right?
00:10:23.000 He didn't direct it at anybody.
00:10:23.000 He didn't tweet it.
00:10:24.000 He was like... No, I don't think it was even a quote tweet.
00:10:26.000 It was just a standalone tweet.
00:10:28.000 And they suspended him for it.
00:10:29.000 That's right.
00:10:30.000 And there were many other people who did not direct Learn to Code.
00:10:34.000 So I say that to her, and then, well, you gotta take into consideration the... So that was a mistake, but the context around it... Okay, here's another person.
00:10:42.000 Who tweeted a blanket thing like, people are saying learn to code and it's funny.
00:10:46.000 Banned.
00:10:47.000 So this cycle goes on and on.
00:10:49.000 In this tweet though, I want to show you some other things because what this really is about, this is about some shady goings on.
00:10:56.000 We talked about this yesterday, but I think we've got, we have a mystery, my friends.
00:11:00.000 Take a look at Twitter.
00:11:01.000 Let's pull up the tweet again.
00:11:02.000 On the right side, what is this?
00:11:04.000 Skateboarding is trending?
00:11:05.000 Nah, Tim Pool is trending.
00:11:07.000 Okay, I can't do anything about that.
00:11:08.000 Elon Musk tweeted me out.
00:11:09.000 But skateboarding?
00:11:10.000 Dodgers at Diamondbacks?
00:11:12.000 Now that's interesting.
00:11:13.000 Now this, it says the Blackphone is trending.
00:11:15.000 That's promoted.
00:11:17.000 The What's Happening tab, for me, maybe many of you have noticed this, has been consistently and overtly political, and it's typically leftist politics.
00:11:27.000 It's typically saying something about, you know, it'll say, Joe Biden did not shake hands with thin air, according to fact checkers.
00:11:35.000 That's always what's going on in my What's Happening.
00:11:37.000 All of a sudden, it's like, you like skateboarding, Tim?
00:11:41.000 Well, I actually do like skateboarding, but you've never recommended that to me before.
00:11:46.000 Something strange is going on, and when you look at some of these tweets... Let me see if, uh... I'll pull this one up right here.
00:11:53.000 Daily Mail reports:
00:11:55.000 Burning the evidence before the new boss starts, Don Jr.
00:11:58.000 and right-wingers see giant leaps in their Twitter followers after Musk's bid was accepted.
00:12:03.000 Let me see if I can pull up this tweet, because I might have things out of order.
00:12:06.000 This is a tweet from me.
00:12:09.000 Referencing the red-headed libertarian, a good friend of the show, Josie, her Twitter handle is @TRHLOfficial.
00:12:17.000 That account was abruptly suspended January 20th, 2021, after she pointed out that on January 9th, 2019, she predicted Joe Biden would run for president and Kamala Harris would be the VP.
00:12:32.000 A year later, with no warning, for no reason, she got suspended.
00:12:36.000 That's it.
00:12:37.000 She didn't break any rules.
00:12:38.000 Why did that happen?
00:12:40.000 Well, just, I believe it was today, she received an email.
00:12:43.000 Abruptly, they reinstated her account.
00:12:47.000 Why?
00:12:48.000 It is obvious at this point that the drop-off of progressives, their follower counts are collapsing, and conservatives are rising.
00:12:56.000 It is not organic.
00:12:58.000 Twitter is lying.
00:12:59.000 There are journalists in the press claiming it's, oh, Twitter says it's organic.
00:13:02.000 They're assisting in the lie.
00:13:04.000 I think what they're doing is they're playing a game of saying it's organic because it's not bots.
00:13:11.000 So it's not bot activity.
00:13:13.000 Yes, it is real people that are leaving.
00:13:15.000 A lot of progressives are leaving.
00:13:17.000 A lot of conservatives are coming back.
00:13:18.000 But the activity is not organic.
00:13:21.000 The activity is artificial because this is the ban hammer which had swung down on people like Josie, people like so many others that are getting these.
00:13:30.000 Now it's being lifted and they're magically coming back.
00:13:33.000 I don't think progressives are leaving.
00:13:34.000 Shaun King wouldn't even leave.
00:13:36.000 You know, he deactivates his account.
00:13:38.000 Whoa, whoa, whoa.
00:13:39.000 Shaun King did leave for like 12 hours.
00:13:42.000 Right, but I mean he came back.
00:13:43.000 And then he came back.
00:13:44.000 And then he pretended like he didn't leave.
00:13:45.000 And then they were trying to call you out, Jack, as if you were lying.
00:13:48.000 Right, and then he was trying to call me out as if I had made it up, and it's like, bro, we all saw you take your account down.
00:13:53.000 I'm gonna say it.
00:13:55.000 I'm looking for the simple solution here.
00:13:57.000 Right.
00:13:57.000 I'm trying to make the least amount of assumptions.
00:14:00.000 So when I see, you take a look at the social blade analytics.
00:14:04.000 Yeah.
00:14:05.000 Monday was the day that Elon Musk, it was announced, would purchase Twitter at 2:53 p.m.
00:14:10.000 I know because I was recording live, I do all my recordings for my main show live, and at 2:53 the tweet comes out.
00:14:17.000 Yeah.
00:14:17.000 Okay.
00:14:18.000 So you mean to tell me the day, 8 in the morning, when the Wall Street Journal announced Elon Musk was in negotiations with Twitter.
00:14:27.000 Like final negotiations.
00:14:28.000 Final negotiations.
00:14:28.000 It was 8am.
00:14:29.000 Not a single conservative said, I'm going to come to Twitter and gloat.
00:14:32.000 Not a single one came and said, let's cheer this on.
00:14:34.000 Not a single leftist said, I'm going to leave, this is getting dangerous.
00:14:37.000 Okay, fine, maybe, because it hadn't happened yet.
00:14:39.000 By 2:53 p.m., not a single conservative joined Twitter to tweet, we got it, baby.
00:14:44.000 Not a single one.
00:14:45.000 Not a single progressive said, I'm leaving.
00:14:47.000 I can't believe this just happened.
00:14:48.000 No, they all just, for some reason, waited 24 hours.
00:14:51.000 They all said, you know, Elon Musk got Twitter back.
00:14:55.000 I would like to gloat as a conservative.
00:14:57.000 I'm going to wait until tomorrow to gloat.
00:14:59.000 I think that's because the news broke the next day overnight, and they weren't responding to him buying Twitter.
00:15:05.000 They were responding to the news telling them to be afraid of it.
00:15:08.000 Wrong!
00:15:10.000 There would be a tiny, tiny bump.
00:15:13.000 When I track analytics, and I've done it for like a dozen different accounts, you would think you would see a 5% increase at least, right?
00:15:20.000 Because this was trending like crazy.
00:15:22.000 Every major news outlet was boom, breaking, breaking, breaking.
00:15:26.000 CNN had it on immediately.
00:15:28.000 You'd think there would be a tiny, tiny anomaly.
00:15:32.000 So I gain maybe 1300 followers per day.
00:15:35.000 You'd think Monday I'd see 1500, if that was true, Ian?
00:15:38.000 No, it was the exact same.
00:15:40.000 It wasn't until the next day I saw 20,000.
00:15:44.000 The next day, 40,000.
00:15:46.000 Something happened overnight.
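A minimal sketch of the kind of check being described here: compare each day's follower gain against a trailing average and flag the days that jump far past it. The dates and counts below are illustrative stand-ins for the roughly 1,300-per-day baseline and the 20,000 and 40,000 jumps mentioned above, not real Social Blade data, and the three-day window and 3x threshold are arbitrary choices.

```python
# Illustrative only: daily follower gains standing in for the numbers discussed above.
daily_gains = {
    "2022-04-23": 1300,
    "2022-04-24": 1350,
    "2022-04-25": 1320,   # day the deal was announced: no visible bump
    "2022-04-26": 20000,  # the overnight jump
    "2022-04-27": 40000,
}

def flag_anomalies(gains, window=3, threshold=3.0):
    """Return (day, gain, baseline) for days whose gain exceeds threshold x the trailing average."""
    days = list(gains)
    flagged = []
    for i in range(window, len(days)):
        baseline = sum(gains[d] for d in days[i - window:i]) / window
        if baseline > 0 and gains[days[i]] > threshold * baseline:
            flagged.append((days[i], gains[days[i]], round(baseline)))
    return flagged

for day, gain, baseline in flag_anomalies(daily_gains):
    print(f"{day}: +{gain:,} vs trailing average ~{baseline:,} -> anomalous")
```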
00:15:49.000 And now when I look at the what's happening tab, yo, now skateboarding is trending.
00:15:54.000 For the first time for me in 8 years?
00:15:57.000 Yeah, I've been skateboarding my whole life.
00:15:58.000 Now they're recommending that to me?
00:16:00.000 I want to add something to this.
00:16:02.000 When the progressives are losing followers, I don't think progressives are leaving, for the exact same reason.
00:16:08.000 You would think there would be a small anomaly.
00:16:11.000 The day the news was announced of progressives saying, I'm gonna leave.
00:16:16.000 Maybe Ian is right.
00:16:17.000 That many people didn't notice until the next day when the news was really all over the place.
00:16:22.000 But you'd think at least some people in the know would have left.
00:16:25.000 A small percentage.
00:16:27.000 2%?
00:16:27.000 There is zero anomalous data.
00:16:31.000 It is static, like normal.
00:16:32.000 And then the next day, boom!
00:16:34.000 5,000 followers gone, 10,000 followers gone.
00:16:37.000 No, I think these were bot accounts.
00:16:39.000 I think Twitter, I would argue...
00:16:43.000 It's possible, at least.
00:16:44.000 Twitter was involved in this.
00:16:46.000 We've already seen the story from Judicial Watch, where the Democratic Party, I think it was in California, was requesting censorship.
00:16:54.000 How much do you want to bet?
00:16:55.000 That was D.C.
00:16:56.000 Drano.
00:16:56.000 D.C.
00:16:57.000 Drano.
00:16:57.000 Drano?
00:16:57.000 That was D.C.
00:16:58.000 Well, that's his lawsuit.
00:16:59.000 Okay, okay.
00:16:59.000 He was censored after reques- and I believe- Well, no, I think- I think Judicial Watch revealed this.
00:17:03.000 We had Tom Fitton on recently.
00:17:04.000 Yeah, I think they did it, and then I know Harmeet Dhillon is working on that one as well, where his censoring actually came at the behest of... I'm willing to bet that Twitter was operating fake accounts for the sake of quote-unquote health of the platform.
00:17:25.000 That is, the platform was overwhelmingly being dominated by right-leaning voices.
00:17:31.000 That liberals were overwhelmingly rejecting wokeness.
00:17:35.000 A healthy platform, a good balance between left and right, right?
00:17:38.000 Well, the problem is former hippie skateboarding liberal types and left-wing types like Tim Pool all of a sudden are saying, we got to vote for Trump.
00:17:45.000 The scales had shifted.
00:17:47.000 How much you want to bet Twitter said, we've got to, we have the mission on our hands.
00:17:52.000 So we're going to ban a good chunk of the right.
00:17:54.000 We're going to artificially inflate the left to fake the health of the conversation.
00:17:59.000 Well, I think that almost gives them too much credit for doing something that's to the benefit of the whole.
00:18:06.000 And I think there may be an element of that thought process in there.
00:18:09.000 But I do think there was a big element, as in California, of political figures and corporate figures able to call the shots with Twitter so that some of the balance, or what may look like balance, is actually influence.
00:18:22.000 And I think that's what makes the case for Twitter not just being a private company that can make its own policies, which is what people say when there's a question of whether they have the right to censor: people say, well, it's a private company.
00:18:33.000 It's not, really. I argued in the last book that I wrote that it is a quasi-official public government organization because of its contracts.
00:18:43.000 Because it is beholden to the government, because it fears regulation if it doesn't abide by certain rules and requests, and it has admitted to taking directives from political figures and members of Congress who were not elected by us to control Twitter, but that's exactly what they've been allowed to do.
00:19:02.000 Let me address that.
00:19:03.000 The path to hell is paved with good intentions.
00:19:06.000 The people at Twitter think they're on a glorious mission to save humanity.
00:19:10.000 But if you have the majority of the American people saying, wokeness is not good, and you end up with post-liberals, these are people who voted for Democrats.
00:19:20.000 I was supporting Bernie Sanders in 2016, and I did not vote for Trump.
00:19:24.000 Now, here I am, having voted for Trump and the Republicans in 2020.
00:19:28.000 That is them saying, uh-oh, something bad is happening because our worldview is being rejected.
00:19:34.000 In reality, they are insane psychopaths
00:19:36.000 who thought,
00:19:39.000 How do we save the balance of the system?
00:19:41.000 I know, as psychopaths, we need to create the perception that psychopaths are actually normal.
00:19:50.000 Just because they have good intentions doesn't mean they're not doing evil.
00:19:54.000 So the path to hell is paved with good intentions.
00:19:54.000 Right.
00:19:56.000 They think they're good people.
00:19:57.000 They're not.
00:19:58.000 I can echo that, man.
00:19:59.000 As a social media administrator for like six, five or six years at Minds, what would happen is you'd get people boosting stuff and posting stuff.
00:20:06.000 And I'd go through and I'd be like, OK, all politics are not safe for work.
00:20:10.000 I want to keep that conversation in its own area.
00:20:13.000 And then I'd see something would come up that I would agree with.
00:20:15.000 And I'd be like, wow, I agree with that, and I want that message to be propagated, but my job as an admin is not to make that decision.
00:20:22.000 I have to put that in the bucket with all the other politics.
00:20:25.000 For instance, Tim's work.
00:20:26.000 He'd make a video, and it would be cogent, and I'd be like, well, it's politics, so it goes with the politics.
00:20:29.000 And it takes a strong mind to continuously do that.
00:20:32.000 I don't think any human's really capable of setting their emotions aside like that.
00:20:36.000 Let's take it back in time.
00:20:37.000 How did this all begin?
00:20:39.000 Gizmodo reported May 9th, 2016.
00:20:43.000 Far-right Gizmodo.
00:20:44.000 Far-right Gizmodo.
00:20:45.000 Former Facebook workers.
00:20:46.000 We routinely suppressed conservative news.
00:20:49.000 And when I said, wow, look at this report from what is NewsGuard certified.
00:20:56.000 Real news.
00:20:58.000 I get accused of echoing false claims from conservatives that they routinely face suppression and censorship.
00:21:04.000 And then every time that comes up, I'm like, oh, I've never asserted that as a fact.
00:21:08.000 I've only cited Gizmodo.
00:21:10.000 Now, of course, it's a fact.
00:21:11.000 We have ample evidence it's happening.
00:21:13.000 But the crazy thing is, when the news broke, it was Gizmodo that was telling conservatives they were being suppressed.
00:21:20.000 All of a sudden, now the narrative shifts.
00:21:22.000 I mean, look at their image.
00:21:23.000 It's an elephant with a sheet over it.
00:21:26.000 Gizmodo of all outlets.
00:21:28.000 We know what's happening.
00:21:30.000 I think Vijaya Gadde may have been crying at that Twitter meeting because they were doing something unethical, amoral, or potentially illegal.
00:21:39.000 And they're about to get caught.
00:21:40.000 I think that Monday night, when Twitter learned that, so this makes more sense.
00:21:47.000 A lot of people are saying progressives are leaving, conservatives are joining.
00:21:51.000 It is true, to a certain degree, that many conservatives are tweeting like, look, I just signed up.
00:21:56.000 Sure.
00:21:56.000 You know what makes more sense?
00:21:58.000 When the Twitter staff had that all-hands-on meeting later in the day, and the CEO said, it's happening, let me answer your questions.
00:22:07.000 After that meeting, which is now 5 p.m.,
00:22:09.000 it's probably 6, 7 p.m. Eastern,
00:22:10.000 someone there said, clean it up, clean it up, clean it up, man. You got to get rid of all that code.
00:22:16.000 He's gonna come in.
00:22:16.000 He's gonna see this. We are going to jail, dude. And then they pulled the code out, and that night, boom, Josie gets reinstated. All of these learn-to-code people start getting reinstated.
00:22:27.000 I think they were running an algorithmic ban on right-leaning users for saying things like learn-to-code.
00:22:34.000 I bet there was a list of phrases that were obvious and overtly right-leaning.
00:22:39.000 I bet you Hunter Biden's in there.
00:22:41.000 And I think we all know the pharmaceutical industry.
00:22:45.000 There's many, many pharmaceutical interests and topics that were controlled.
00:22:51.000 Sharyl, does the pharmaceutical industry have influence on the media?
00:22:53.000 What are you saying?
00:22:55.000 Guys, who is this?
00:22:56.000 That deviates from the left-right pattern, but I think that's another big one that's at play.
00:23:02.000 Jack, that's completely not true.
00:23:03.000 Let me just quickly say, this episode is brought to you by Pfizer.
00:23:08.000 You see that meme where every morning a news show is brought to you?
00:23:11.000 We're not really brought to you by Pfizer.
00:23:13.000 I have to clarify, that was a joke.
00:23:14.000 No, we're kidding.
00:23:16.000 I'm actually brought to you by Pfizer.
00:23:17.000 We're just kidding.
00:23:19.000 Also, Sharyl can beat- pretty sure she can beat us all up.
00:23:21.000 What bothers me is that there's no way to- well, at this stage, there's no way to know if there's nefarious stuff going on in the Twitter code.
00:23:27.000 I wish that we could watch that happen.
00:23:29.000 It's another value of having the software be free.
00:23:31.000 Well, if they went ahead and changed stuff really fast, is it too late now?
00:23:35.000 Did they make changes that even if someone were to go in and try to see, is it too late?
00:23:39.000 Look, I said this yesterday on War Room, and I'll say it here again.
00:23:43.000 Elon Musk didn't just buy a platform.
00:23:47.000 Elon Musk bought evidence.
00:23:49.000 He's got all this.
00:23:50.000 And I guarantee you that he knows people that he can bring in that when they actually peel back the curtain, look under the hood of this thing, they can go back as far as they have to go because he will bring in the highest caliber people.
00:24:04.000 Because you see all the stories, by the way, in Business Insider, they say, oh, he's, you know, he's so demanding and he's always firing people.
00:24:10.000 No, because he wants the best.
00:24:11.000 Can I say something?
00:24:13.000 If he's really smart, and I think he's really smart, these transactions, as you know, they're gone over by attorneys and analysts.
00:24:21.000 Some attorney from Elon Musk's group sent some attorneys over at Twitter a note that said, don't touch things.
00:24:28.000 Preserve your records.
00:24:31.000 But watch out for Twitter going down mysteriously, which of course happens.
00:24:37.000 Watch for any kind of server migration.
00:24:40.000 So right now, if they manipulate code, will there be a way to see
00:24:45.000 that they've done changes, to look at edits? But correct me if I'm wrong, Ian, if they migrate
00:24:49.000 the servers fresh, copy only the existing code, all the records are going to be lost.
00:24:54.000 I don't think I'm qualified to answer that. I don't know.
00:24:57.000 So can I, can I just go back real quick onto the, the Ministry of Truth, by the way,
00:25:01.000 cause I've just got, I want to... Okay, because there's an update, and oh my goodness.
00:25:06.000 Let me pull up this story first, and we'll throw it to Jack.
00:25:08.000 We have this from TimCast.com.
00:25:09.000 Department of Homeland Security forms disinformation governance board.
00:25:14.000 Homeland Security Secretary Alejandro Mayorkas said the board would work to prevent disinformation campaigns that target minority groups. They lost Twitter, and this quickly, they are already trying to play dirty games.
00:25:28.000 We have it over at the Daily Mail, where they mention it is being headed by a Russia expert who called the Hunter laptop story a Trump campaign product and said she shudders to think about Elon Musk taking over Twitter.
00:25:46.000 Wow.
00:25:46.000 Here's our Ministry of Truth.
00:25:48.000 Jack, what's going on?
00:25:49.000 Well, Tim, see, here's the thing is that Nina Jankowicz of the Moaning Myrtles has responded.
00:25:56.000 So she's responded to some of this criticism.
00:25:58.000 She notes, for those who believe this tweet is key to all my views, it is simply a direct quote from both candidates.
00:26:05.000 For you see, this was during the final presidential debate.
00:26:08.000 And if you look at my timeline, you will see I was live tweeting.
00:26:13.000 See, she was just live tweeting.
00:26:14.000 She was just live during the debate.
00:26:16.000 Except here's the problem.
00:26:18.000 In the immediate follow-up tweet to this, she wrote, the emails didn't need to be altered to be part of an
00:26:25.000 influence campaign.
00:26:26.000 Voters deserve the context, again, the keyword context, not a fairy tale about a laptop
00:26:33.000 repair shop. She called it an influence campaign in the very next tweet.
00:26:38.000 So we caught her in day one on the job lying about what she said about the Hunter Biden laptop, trying to cover it up.
00:26:47.000 But I'm sorry, Nina, we got the receipts.
00:26:51.000 Can I say that I find it a little odd that we've all bought into arguing over sort of the minutiae of how this is done, instead of stepping back and looking at 2015 and the notion that anybody should control for any reason?
00:27:08.000 So we're arguing, should they control a certain hashtag?
00:27:10.000 Does it really attack people?
00:27:12.000 Was it attacked?
00:27:13.000 Why are they controlling a hashtag in the first place?
00:27:16.000 Before 2015, there was relatively little, if any, discussion over bringing in third parties, a corporation no less, to control the content that we can access and make our own decisions on on the internet.
00:27:30.000 The social media companies had already invented tools where we could control our own experience.
00:27:34.000 If you don't want to see something, if you want to curate your experience, you can follow the right people, you can block other people.
00:27:41.000 The notion that someone else should be doing that for us was introduced to us in about 2016 and we've kind of bought into it.
00:27:49.000 We've argued over the terms.
00:27:51.000 We've kind of bought into that.
00:27:52.000 There are reasons why this should be done and there are people who could maybe do it better instead of stepping back in my view and saying.
00:27:59.000 Only that which is illegal should be moderated by anybody, in my view.
00:28:04.000 Well, that's where we've been, right?
00:28:05.000 So we had Torba from Gab on the show who said, here's my rules, and he pulls out the Constitution, and I'm like, agreed.
00:28:12.000 Well, that's kind of what Elon's saying now, that his notion, and by the way, this is why these two stories are connected, because Elon said that his view, and he said this in that TED talk he gave last week, or it was actually an interview, I guess, that his notion should be that it should be up to the laws of the state in which Twitter is operating because those laws are reflective of the will of the people, I guess, if you're in a democracy, right?
00:28:37.000 That being said, right, that being said, the very same way, the ink isn't even dry on the paperwork for Twitter and already the Biden administration is launching a ministry of truth.
00:28:49.000 They know they've lost this one.
00:28:51.000 And they had such power in the private sector because their sycophants were just like, but it's a private company.
00:28:56.000 Now that they've lost it because someone had the money to buy it, now it's going government.
00:29:01.000 This will be interesting.
00:29:03.000 You'll need money to combat this in the courts.
00:29:06.000 It'll be difficult.
00:29:07.000 Does anybody read or are they required to read anymore?
00:29:11.000 I'm a little older than you guys, but 1984 and Animal Farm in school.
00:29:16.000 My kid wasn't.
00:29:17.000 I mean, we read it at home.
00:29:20.000 This was required reading when I was growing up.
00:29:23.000 And when you say ministry of truth, and I get it, a lot of people get it.
00:29:26.000 I think a lot of people, when I've mentioned these things about 1984, and how language is used to mean the exact opposite of what
00:29:34.000 it really is by the government, all the things that are happening that are such perfect
00:29:38.000 parallels to what were written about decades ago, so many people seem unaware of. And I feel
00:29:43.000 like if people had been opened up to this, they could see it happening and witness it with some
00:29:49.000 knowledge that they're not able to in many cases.
00:29:53.000 Right. And the main character, for those who haven't read it, the main character of 1984 works for the Ministry of Truth.
00:29:59.000 That's his job, is to go back in time and censor things that are no longer in line with the party narrative.
00:30:04.000 The Ministry of Peace is all about war.
00:30:06.000 The Ministry of Love is all about hatred or whatever.
00:30:10.000 They call it double speak.
00:30:11.000 Double speak, yeah.
00:30:11.000 So we normally save superchats for the end, but sometimes we get a good one.
00:30:15.000 We do have a good one here from Christopher who said, they're purging their code, but they're bringing back followers to make their earnings report look good so Elon can't buy.
00:30:23.000 Now I don't know if that's true, that he wouldn't be able to buy if the earnings report looks good, but this will affect the earnings report if Twitter comes out and says, we had a massive increase of users!
00:30:33.000 Look at all this!
00:30:34.000 I think Elon Musk cornered Twitter.
00:30:37.000 He came to them knowing their earnings report was going to be bad and the stock would fall.
00:30:42.000 And so they had no choice but to accept the premium because he made the offer only a couple weeks before they had to do their earnings report.
00:30:50.000 If they did not accept the $54.20 per share, earnings report comes out, stock drops to $30, the board can be sued for that massive loss by the shareholders who would be outraged.
00:31:01.000 So he wins this one.
00:31:03.000 They can't break the deal now.
00:31:04.000 If they do, they lose a billion dollars.
00:31:06.000 But it is interesting because whatever Twitter is doing right now, removing these bans, will make their growth look better.
00:31:14.000 This is someone that's not paying attention.
00:31:16.000 Right.
00:31:16.000 People that are like, well, I was unbanned.
00:31:18.000 Except the report will say in quarter one, we saw an influx of 3.4 million users.
00:31:23.000 They won't say it was the last day that they unbanned everybody.
00:31:26.000 Yeah, they're not going to talk about how.
00:31:27.000 They got to be careful with fraud.
00:31:28.000 If they're going to try and fraud.
00:31:29.000 It's not fraud.
00:31:30.000 No, it's not.
00:31:31.000 If they defraud their investors by telling them that a bunch of accounts that they had anyway, that they were considering unbanning before, they just chose to do it to make some extra money, they're going to go to prison.
00:31:40.000 That's why they're running around to everyone in the media saying, oh, no, no, no, this is organic.
00:31:46.000 That's why they're making sure to give statements every single day to everyone who asks them.
00:31:51.000 Josie getting unbanned is not organic.
00:31:52.000 Someone did that.
00:31:53.000 And they, in my opinion, lied to Congress.
00:31:57.000 When they're like, there's nothing in place for censoring these people or politics, it's like, bro, your rules outright say misgendering will get you banned, but it's an inversion of how conservatives see what misgendering is.
00:32:11.000 Their rules are overtly targeting conservatives, and they are lying to Congress about it, and they're not getting in trouble.
00:32:17.000 I'm looking at the top 10 owners of Twitter.
00:32:18.000 It's Vanguard, Morgan Stanley, BlackRock, SSGA Funds Management, Aristotle Capital Management, Fidelity Management, ClearBridge Investments.
00:32:26.000 It's 10 investment firms.
00:32:27.000 Do you know what SSGA is?
00:32:29.000 Uh, no.
00:32:30.000 Google it.
00:32:31.000 SSGA Funds Management?
00:32:33.000 You're gonna love this.
00:32:33.000 Yeah.
00:32:35.000 Wow.
00:32:36.000 Yeah, look it up.
00:32:37.000 What does the SSGA stand for?
00:32:39.000 Oh, State Street, isn't it?
00:32:41.000 State Street Global.
00:32:42.000 So State Street's number four.
00:32:43.000 Yep.
00:32:44.000 So they don't care about lying to Congress.
00:32:45.000 They're going to send Jack Dorsey up there, tell him to lie to Congress, and then he's going to be the one that's on the hook if something goes wrong.
00:32:51.000 But listen, what happens is they give themselves plausible deniability.
00:32:55.000 They go to Jack and say, hey, Jack, here's the report on everything we're doing.
00:32:59.000 And he goes, OK.
00:33:00.000 Then he goes to Congress and they're like, are you doing this?
00:33:02.000 We are not doing that.
00:33:03.000 Based on a report he read that was a lie to him.
00:33:06.000 And then he says, well, they lied to me.
00:33:07.000 I didn't know.
00:33:08.000 I told the truth.
00:33:10.000 And the person who lied to me wasn't under oath.
00:33:13.000 That's the game.
00:33:14.000 That's a tough one.
00:33:15.000 How do you navigate that kind of situation?
00:33:18.000 Really high paid lawyers.
00:33:21.000 Are they claiming net gains of users? Because there are also people claiming they're fleeing Twitter.
00:33:27.000 But are they saying that overall they're gaining way more than they're supposedly losing?
00:33:30.000 That's the strategy.
00:33:32.000 That's what Tim's saying, is that the strategy is they can pick up and far surpass by removing the bans.
00:33:38.000 So it's not even a political thing in this theory.
00:33:41.000 It's just it's about the earnings report.
00:33:43.000 Katy Perry lost 200,000 followers.
00:33:46.000 You're not gonna convince me that a bunch of Katy Perry fans were like, I am outraged that Elon Musk bought the platform.
00:33:52.000 But she also lost Russell Brand, more importantly.
00:33:54.000 Did she?
00:33:55.000 That's sad.
00:33:56.000 I mean, if anyone's gonna, like, annoyingly just leave out of emotion, it would be someone that follows Katy.
00:33:56.000 Oh, really?
00:34:02.000 No offense, Kate, but, you know, your fans are like bubblegum people.
00:34:05.000 Didn't Barack Obama lose, you know, hundreds of thousands?
00:34:08.000 Look, everybody knows that those main accounts, and Dave Rubin had the tweet up today, that the New York Times has 53 million followers and yet gets like 50 retweets per tweet.
00:34:20.000 And Elon Musk even responded to that saying, what's going on?
00:34:23.000 Actually, Rubin used to talk about this all the time.
00:34:24.000 He called it the Rubin ratio, right?
00:34:26.000 So the Rubin ratio is how many followers you have versus what is your engagement.
00:34:30.000 Meanwhile, like, you know, I can write something in a certain way.
00:34:35.000 You know, or drop receipts on someone like we just did with this government official Nina Jankowicz, catching her in a provable lie, in a demonstrable lie.
00:34:42.000 And that's going to get tons of retweets.
00:34:44.000 But a New York Times article with 53 million followers, you know, 1% of that, right, should be enough to get you tons of retweets.
00:34:50.000 Didn't he say he could tweet a celebrity photo and a banana and it would get more tweets?
00:34:54.000 See, I wasn't going to bring that up, Sharyl.
00:34:56.000 And it did.
00:34:56.000 He did.
00:34:57.000 And it got more tweets.
00:34:58.000 It was like an 80s sitcom and a banana.
00:35:00.000 And he got more.
00:35:01.000 He got like 5,000 retweets on it.
00:35:02.000 Let me actually pull this tweet up from Dave Rubin because there's a lot more context in this.
00:35:07.000 We have the tweet from Dave Rubin himself.
00:35:09.000 He says, Hey Elon Musk, as long as you're digging, check into how New York Times, Forbes, etc.
00:35:14.000 bought their Twitter followers to fake influence.
00:35:16.000 New York Times has 53 million followers and rarely gets 50 retweets.
00:35:22.000 I could post a banana emoji and a pic of an 80s sitcom star and get more.
00:35:26.000 See next tweet.
00:35:28.000 Okay, here is Dave's next tweet.
00:35:30.000 It is an 80s sitcom star with a banana and has 7,718 retweets.
00:35:34.000 God bless her.
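For what it's worth, the "Rubin ratio" being described is just engagement normalized by audience size. A quick sketch using the figures quoted in this exchange (53 million followers and roughly 50 retweets per tweet for the New York Times, 7,718 retweets on the banana tweet); the second account's follower count is a made-up placeholder, since it isn't given here.

```python
def retweets_per_million(avg_retweets: float, followers: int) -> float:
    """Average retweets per tweet, normalized per million followers."""
    return avg_retweets / (followers / 1_000_000)

nyt = retweets_per_million(avg_retweets=50, followers=53_000_000)
banana = retweets_per_million(avg_retweets=7_718, followers=1_000_000)  # hypothetical follower count

print(f"NYT:          {nyt:.2f} retweets per million followers")     # ~0.94
print(f"Banana tweet: {banana:,.2f} retweets per million followers")  # 7,718.00
```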
00:35:35.000 Now, Elon Musk responded.
00:35:37.000 I noticed that.
00:35:39.000 I noticed that too.
00:35:40.000 Pretty weird.
00:35:41.000 Everyday Astronaut says.
00:35:43.000 Conversely, for some reason the last two days my account suddenly got 30,000 followers a day and we've done nothing different.
00:35:48.000 It's far beyond our average 1 to 2k per day.
00:35:51.000 You're not going to convince me that that is all organic.
00:35:54.000 I responded to Elon Musk.
00:35:56.000 So Elon said almost every media outlet on earth wrote about me acquiring Twitter, causing a massive influx of new users.
00:36:01.000 I just want to point out there was no influx on the Monday this was announced.
00:36:04.000 Take a look at this.
00:36:06.000 So Social Blade is GMT, by the way.
00:36:08.000 So that's London time.
00:36:10.000 So just to take that into consideration.
00:36:10.000 Sure, sure, sure.
00:36:12.000 And it's possible, I was saying, that they calculate everything by like 5 p.m.
00:36:12.000 Absolutely.
00:36:15.000 That still doesn't explain how... So whoever's account this is, Everyday Astronaut... Right, so 3 p.m.
00:36:22.000 Eastern would be what?
00:36:23.000 8 p.m.?
00:36:24.000 On 4/23, 3,000.
00:36:24.000 On 4/24, 3,700.
00:36:25.000 On 4/25, 3,900.
00:36:26.000 And then the next day, 30,000. I don't buy it.
00:36:33.000 No, no, you can't.
00:36:35.000 I mean, that's what is that?
00:36:36.000 Well, the other way to check is... A 10X increase?
00:36:38.000 The other way to check is to check the created-on date.
00:36:41.000 So go through and look, because I've seen a lot of accounts that are created April 22 now with zero followers.
00:36:46.000 I have seen that.
00:36:47.000 OK.
00:36:48.000 And some people were tracking on governors.
00:36:50.000 Actually, a mainstream outlet, I forget which one, was looking at Governor DeSantis' account, and they saw that some... But, but it was only like 10% of the followers were created on in April 22.
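A rough sketch of the created-on-date check being described: given the account-creation dates of someone's recent followers, compute what share were registered on or after a chosen cutoff. The sample dates are invented purely to show the calculation; the 10% figure above comes from the outlet mentioned, not from this code.

```python
from datetime import date

# Invented sample data: creation dates of ten hypothetical follower accounts.
follower_created_on = [
    date(2019, 6, 11), date(2022, 4, 26), date(2020, 1, 3),
    date(2022, 4, 27), date(2017, 9, 30), date(2021, 11, 8),
    date(2022, 4, 25), date(2016, 2, 14), date(2018, 7, 22), date(2015, 5, 1),
]

def share_created_since(dates, cutoff: date) -> float:
    """Fraction of accounts created on or after the cutoff date."""
    return sum(1 for d in dates if d >= cutoff) / len(dates)

print(f"{share_created_since(follower_created_on, date(2022, 4, 22)):.0%} created since April 22, 2022")  # 30% here
```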
00:37:03.000 So take a look at this.
00:37:04.000 On 4/25, it's 3,900. On 4/24, it's 3,700.
00:37:09.000 Now there's a gain of about just shy of 200, and 160, that can be explained by,
00:37:15.000 it can be simply explained by Monday more people are at work and they're on Twitter.
00:37:21.000 But there's nothing... There should be a larger anomaly because it was 8am on the 25th when they said Elon was in final talks to buy this.
00:37:30.000 Shouldn't there be at least a small percentage of conservatives being like, I'm gonna follow an astronaut?
00:37:37.000 I would think so.
00:37:38.000 So I think there's something dirty happening.
00:37:41.000 But back to Dave Rubin's point.
00:37:43.000 Let me tell you what's going on with some of these accounts.
00:37:45.000 For one, journalists buy fake followers.
00:37:48.000 I know all about that.
00:37:49.000 And the New York Times likely has... When you sign up for Twitter, they tell you to follow these people.
00:37:56.000 It says follow these accounts and then you'll just go boop, boop, boop.
00:38:00.000 New York Times is probably one of the first recommended and it's probably why.
00:38:02.000 I get that constantly.
00:38:03.000 If you scroll back on, so the way to check, people use this for identification purposes many times because typically some of the first accounts you follow will be geographically co-located.
00:38:16.000 Yeah, so you might follow your local newspaper.
00:38:19.000 Now, if you're in New York, that might be New York, but if you're in Minneapolis, you follow the Star Journal, right?
00:38:24.000 Maybe you're in Nashville, maybe wherever you are, right?
00:38:26.000 And so typically, if you scroll back on someone's Twitter followers, those are in sequential order.
00:38:34.000 So you're actually looking at a timeline of when they followed each person.
00:38:39.000 So if you go back to the earliest ones, usually the first two or three are going to be like New York Times, CNN, or Washington Post, because that's what's recommended to you.
00:38:39.000 And in that time.
00:38:50.000 In many cases, when you're signing up for your account, they require that you follow three before you can sign up and they present those three to you.
00:38:59.000 Tim, how do you buy followers?
00:39:01.000 We just Google it.
00:39:02.000 I don't know if you can still do it, but it was... Fiverr used to have it.
00:39:06.000 How expensive is it?
00:39:09.000 They're a little bit more professional now.
00:39:11.000 It used to be like these Macedonian platforms.
00:39:13.000 I'll say it.
00:39:14.000 Everybody accuses everybody else of buying followers.
00:39:19.000 You can't really track this stuff anymore.
00:39:21.000 The bot farms have gotten really good at obfuscating this.
00:39:26.000 What people used to do was track engagement.
00:39:29.000 So they would look at a certain account and then run it through some program and it would be like 75% of their followers don't tweet, they're fake.
00:39:37.000 And my response to people was, dude, if you go to Donald Trump and you run him through this, you're going to see 90% of his followers don't tweet.
00:39:44.000 They're his fans who signed up for Twitter.
00:39:44.000 Why?
00:39:48.000 They follow him, they don't post.
00:39:49.000 So you can't call people fake for that.
00:39:51.000 So it's really difficult to know for sure.
00:39:53.000 That may be true of the New York Times.
00:39:54.000 The New York Times might not be getting retweets because... Everybody signed up because they had to.
00:39:59.000 Well, no, no, no, no, no.
00:40:01.000 I don't comment on the New York Times.
00:40:03.000 I follow all these news outlets.
00:40:04.000 I don't engage.
00:40:04.000 That's true.
00:40:05.000 I see the story and I click the link.
00:40:07.000 I'm not going to argue with a news article.
00:40:09.000 I like, Jack, what you said earlier about the real value of these numbers is what is your follower-to-interactivity ratio.
00:40:17.000 Rubin ratio.
00:40:19.000 In early YouTube, 2006-2007, I started noticing you get 1,000 subscribers and you get 8,000 subscribers.
00:40:24.000 But I was only getting like 4,000 views.
00:40:25.000 I'm like, where's those other 4,000 people?
00:40:28.000 I wish I had a button where I could have a bunch of accounts unfollow me, like unfollow me if they're dead accounts, if they haven't logged in in 30 or 60 days.
00:40:36.000 Because I need to schleff that nonsense number.
00:40:39.000 I want an accurate account of who's really there.
00:40:41.000 Yes, yes, yes.
00:40:42.000 But Ian, I'll give some insights to people who watch this show.
00:40:46.000 We produce this live show.
00:40:47.000 We then produce, I think, between three and five clips from the show the next day.
00:40:51.000 And then I have three clips on two different channels.
00:40:55.000 Of those videos, the average subscriber watches 10 per month.
00:41:01.000 So that means if I'm putting up, let's say, 8 clips per day for 31 days, we've got 248 clips or whatever.
00:41:11.000 The average person only sees 10 of them.
00:41:14.000 So when I'm wondering, the average follower, the average follower, right.
00:41:19.000 Only sees about 10 of the clips that I put up every, every month.
00:41:22.000 So if I, you know, I have 1.3 million followers on my main, on my personal Tim Pool channel, I get 300 or so thousand, 200, 300,000 views.
00:41:32.000 I'm not surprised because of those, you know, people, you can just basically do the math.
00:41:38.000 Some people are diehard fans and they'll watch every video.
00:41:41.000 Some people will watch every other video.
00:41:42.000 Some people watch a video once a week.
00:41:44.000 Yeah, it's the accounts that haven't logged in in 60 days that I'm not interested in having around anymore.
00:41:48.000 I feel like they're bloating my numbers.
00:41:49.000 But YouTube does delete those?
00:41:51.000 Sometimes, but I want a button where I can manually do it.
00:41:54.000 I'm trying to tell YouTube, put it on there, man.
00:41:56.000 Let people see their real numbers.
00:41:58.000 I mean, that can be... You're the only one that wants to see your real numbers.
00:42:01.000 I'm joking.
00:42:02.000 Most people want to see the big numbers.
00:42:03.000 They want the inflated, living on top of the hill of gold thing.
00:42:07.000 It's like, you know, if you're going to make that actually like social currency, your follower number, then there needs to be some regulations about buying fake stuff.
00:42:14.000 It's like counterfeiting money and telling everyone you're rich.
00:42:19.000 Where this comes into an issue is when advertisers come up.
00:42:23.000 Because if you're using bot traffic, and I'm going to be careful about this, the previous owners of Newsweek got in trouble for this.
00:42:31.000 Because the previous owners, before they were bought out, were using bot traffic to juice their numbers, juke the stats, and then present that to advertisers, claiming that they were getting X amount of traffic, which was completely false.
00:42:46.000 Well, so back in the day, there was this big scandal around ad rights distribution or ad rights sales.
00:42:52.000 And an ad right was that you could have a website that gets a thousand views per month.
00:42:58.000 You sell the rights to those views to a bigger network.
00:43:02.000 The network aggregates 50 websites that each get, you know, a thousand views.
00:43:07.000 And now they say we get 50,000 views.
00:43:10.000 Technically, that's true.
00:43:11.000 What was happening was there would be a company, and we'll call it Golden Brand, right?
00:43:18.000 They're the Golden Brand.
00:43:19.000 They're the hottest brand.
00:43:21.000 They go to an advertiser.
00:43:22.000 Look, we get 50 million views per month on Golden Brand.
00:43:27.000 Now do you want your product associated with our golden brand?
00:43:30.000 Yeah.
00:43:31.000 Then you gotta pay for 50 million views.
00:43:34.000 And then what would happen is, yeah, 5 million would be on Golden Brand, 5 million would be on clickfarm.garbage, the others would be on ultimateamericanpatriot.info, and what these websites would do is they advertise, you know, top 25 celebrities.
00:43:51.000 And then when you click it, it'll show you a celebrity.
00:43:55.000 And in order to see the next picture, you got to click to a new page.
00:43:58.000 Turning one person into 25 views that they can then fluff their numbers up, sell to advertisers.
00:44:05.000 It was a huge scandal.
00:44:07.000 You click those every time, don't you, Tim?
00:44:07.000 Everybody was doing it.
00:44:09.000 Thank God not everybody.
00:44:09.000 Top 25 celebrity skateboarders.
00:44:10.000 And the picture they get you to click on isn't in the whole thing.
00:44:14.000 But that's why.
00:44:15.000 That's so aggravating.
00:44:16.000 You can go through the whole thing to get you to the end.
00:44:18.000 You're one person reading one article, but now one article or one page becomes 25 pages or more.
00:44:24.000 25 views!
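The arithmetic behind that complaint is simple enough to sketch. Only the 25-page slideshow and the 50-million-view pitch come from the conversation; the audience size below is a hypothetical placeholder. Forced pagination multiplies reported pageviews without adding a single reader:

```python
def reported_pageviews(unique_visitors: int, pages_per_visit: int) -> int:
    """Pageviews as they would be pitched to advertisers."""
    return unique_visitors * pages_per_visit

readers = 2_000_000                          # hypothetical real audience
honest = reported_pageviews(readers, 1)      # one article, read once
slideshow = reported_pageviews(readers, 25)  # same audience forced through a 25-slide gallery

print(f"Honest:    {honest:,} views")      # 2,000,000
print(f"Slideshow: {slideshow:,} views")   # 50,000,000
print(f"Inflation: {slideshow // honest}x")
```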
00:44:25.000 But you just answered the question, maybe.
00:44:27.000 I don't understand all this and I get approached from time to time, like everybody on the web that has a web page or website, whatever.
00:44:36.000 And people call me and say, can I put your website?
00:44:39.000 You don't mind if I put it on my aggregation site, right?
00:44:42.000 Like, this is good for you, right?
00:44:43.000 I'm like, no, but you know, are they selling to somebody who's buying ads from them based on what your viewers are?
00:44:52.000 If you make a video... and this happens all the time.
00:44:55.000 The Daily Wire publishes an article.
00:44:56.000 Someone will just quote the article and then repost it on their blog.
00:45:00.000 It's called newsjacking.
00:45:01.000 And they'll do that to try and get clicks and make money off of it.
00:45:01.000 Right.
00:45:04.000 So what they're doing is they're, in some cases, they might be copying your work wholesale and then wrapping it on their site so that nobody ever actually clicks through to charlatans.com.
00:45:17.000 But you're a click, you're reading it on, you know, whatever it was Tim said, clickfarm.garbage.
00:45:22.000 And because they never clicked through to your site, you don't get any of the ad traffic.
00:45:26.000 Is it too far off the beaten path for me to mention something about?
00:45:29.000 We're talking about ads.
00:45:31.000 I have a website.
00:45:32.000 It's pretty expensive to maintain.
00:45:34.000 It's just a little thing I do myself.
00:45:37.000 Nothing like the big traffic you guys get, I'm sure.
00:45:39.000 But it gets more expensive as more people visit.
00:45:42.000 So I let Google AdSense put ads on there, you know, rotate them through, make a little money to pay one of my web guys.
00:45:49.000 I got a notice a week ago that said you can no longer use AdSense.
00:45:52.000 And I have violations for factual articles that they want me to take down, but it's not over that.
00:45:57.000 You can no longer use AdSense unless you approve our new terms.
00:46:01.000 And then there's no way to approve the new terms.
00:46:03.000 So I had my web guy look.
00:46:04.000 I said, am I crazy?
00:46:05.000 There's no way to get in to approve the terms to continue using AdSense, to continue collecting off the ads.
00:46:12.000 And the web guy goes in and there is no way to approve the terms.
00:46:15.000 So I'm in effect locked out of the AdSense cycle.
00:46:18.000 There's no one to contact.
00:46:19.000 There's no way to appeal.
00:46:21.000 There's no one who will answer the question.
00:46:23.000 But I feel like there are these sneaky ways. Even if they can't pull you down for blatantly violating something they say you violated,
00:46:30.000 they find other ways to make it pretty impossible to operate.
00:46:33.000 That sounds like an oversight developmentally rather than a malicious attack on you.
00:46:38.000 I can't tell but that's what it sounds like.
00:46:40.000 Except I didn't read anywhere that anybody else was.
00:46:42.000 Usually you can Google something like that or search it and people are saying that happened to me.
00:46:46.000 I have the same problem and I just don't see other people have it.
00:46:50.000 I definitely saw a lot of people talking about Google AdSense sending out new terms.
00:46:56.000 I didn't see anybody mentioning that.
00:46:58.000 Maybe that's a great question if anyone's in the super chat who's watching or can send anything in.
00:47:04.000 But that does sound like a technical issue.
00:47:05.000 But it reminds me of Facebook.
00:47:08.000 that recently said I had a violation, and then if you click in, there is no violation. And it says if you don't fix it, we're going to take down your page. And you click to appeal, and it says there's too many appeals, we can't consider yours.
00:47:20.000 It's, to me anyway, a trap that I can't get out of, so I just quit using my professional Facebook.
00:47:25.000 Regarding Terms of Service, this is very important.
00:47:28.000 When a company updates their Terms of Service and wants you to click accept on them, what they should have to do legally is show you the old Terms of Service and the new Terms of Service side-by-side with highlights of all the changes.
00:47:40.000 So you can click in the areas, go right to the place that's been changed and see exactly what's been changed.
00:47:45.000 Completely unconscionable the way they do it.
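As a rough sketch of what that side-by-side comparison could look like, assuming nothing about how any real platform delivers its terms (the sample clauses below are invented), Python's standard difflib can flag exactly which lines changed between two versions:

```python
# Minimal sketch of highlighting what changed between two Terms of Service
# versions, using only the standard library. The sample terms are invented.
import difflib

OLD_TERMS = """\
1. You retain ownership of your content.
2. Ads may appear alongside your content.
3. Disputes are resolved in small claims court.
"""

NEW_TERMS = """\
1. You retain ownership of your content.
2. Ads may appear alongside and within your content.
3. Disputes are resolved by binding arbitration.
"""

# unified_diff marks removed lines with '-' and added lines with '+', so a
# reader can jump straight to the clauses that actually changed.
for line in difflib.unified_diff(
    OLD_TERMS.splitlines(),
    NEW_TERMS.splitlines(),
    fromfile="terms_v1.txt",
    tofile="terms_v2.txt",
    lineterm="",
):
    print(line)
```

For a true side-by-side view with the changed cells highlighted, which is closer to what's being proposed here, difflib.HtmlDiff().make_file(OLD_TERMS.splitlines(), NEW_TERMS.splitlines()) would produce an HTML comparison table.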
00:47:46.000 Now let's change it.
00:47:48.000 I want to pull up this tweet from ContraPoints.
00:47:51.000 ContraPoints is a prominent leftist YouTuber and trans woman who quoted the Elon Musk tweet of me and Vijaya Gadde and said, the Rogan clip he's referencing is about whether Twitter having a rule against misgendering trans people is, quote, left-wing ideological bias.
00:48:09.000 Partly true, but yes.
00:48:12.000 And I wonder if Contra, tweeting this earlier today, with tremendous respect to ContraPoints for actually knowing the argument I made and getting it earlier in the day.
00:48:22.000 Natalie Wynn, ContraPoints then says, Source.
00:48:25.000 These are the people restrained by the current moderation that Elon apparently intends to remove.
00:48:30.000 Says everything you need to know about the sort of place that this is about to become.
00:48:37.000 I wanted to highlight this because this from ContraPoints, I believe, shows you exactly what the left's view of the platform is.
00:48:46.000 So when I was on Rogan's show, I said, your rules are overtly biased.
00:48:51.000 Jack Dorsey took issue with this and said, no, they're not.
00:48:54.000 And I said, you have a misgendering policy.
00:48:58.000 Conservatives do not agree with progressives on what misgendering means.
00:49:03.000 To a conservative, if someone is born male, they are he him.
00:49:07.000 To a progressive, if someone identifies with a pronoun, they are whatever pronoun they want.
00:49:12.000 That is a difference in worldview.
00:49:15.000 And at the very least, you can say, Republicans gonna vote one way, Democrats gonna vote the other.
00:49:20.000 That is not a moral statement.
00:49:23.000 I am not saying it is good to misgender anybody.
00:49:25.000 I'm saying if Twitter decides that the conservative perspective on this is out the window, and they will enforce with permanent removals the progressive worldview, then there is a biased system to the left.
00:49:40.000 ContraPoints, and the fans of ContraPoints, believe it's a good thing that conservatives are not allowed to have their worldview on the town square and the largest political social media platform.
00:49:53.000 Now, conservatives think the left should be allowed to have their views on the platform, but the right should as well.
00:49:59.000 I just gotta say right there, this is why post-liberals, people who used to vote Democrat, used to vote for Bernie, are now finding themselves voting for Trump.
00:50:06.000 I'm one of them.
00:50:07.000 Because I'm like, I think it's fantastic that ContraPoints expresses all of these opinions and has left-wing views.
00:50:14.000 I'd like to argue them.
00:50:14.000 Wonderful.
00:50:16.000 I also appreciate that Jack Posobiec has his opinions as well.
00:50:18.000 But they banned Alex Jones.
00:50:20.000 They banned Donald Trump.
00:50:21.000 They banned Laura Loomer, Milo Yiannopoulos.
00:50:22.000 They banned Carl Benjamin.
00:50:23.000 And for what reason?
00:50:24.000 They said naughty words.
00:50:26.000 Words they did not like.
00:50:28.000 Well, okay, that's politics.
00:50:30.000 If you want a political platform, expect things you don't like.
00:50:33.000 Some people like what they say, Vijaya did not.
00:50:36.000 There's a really, another response to this.
00:50:39.000 Wait, Tim, you're leaving out not one, but two brothers that absolutely deserve to have their Twitter accounts back.
00:50:48.000 Please, please remind everyone who that is.
00:50:50.000 The Krasenstein brothers.
00:50:51.000 Absolutely the Krasenstein brothers.
00:50:52.000 Well, but so they were accused of using bots.
00:50:55.000 Now that was a bot issue.
00:50:55.000 I don't care.
00:50:56.000 But they should have their accounts back.
00:50:58.000 They accused many conservatives using bots as well.
00:51:01.000 I haven't seen any evidence.
00:51:02.000 You show me proof that the Krasensteins were using bots, I'll say, okay, fine.
00:51:05.000 For the time being, I don't know about that.
00:51:07.000 One of the other responses was that there were several examples of breaking the rules.
00:51:15.000 That doesn't negate my point.
00:51:17.000 Vijaya Gadde told me that Carl Benjamin was suspended for saying some really awful things.
00:51:24.000 And I'm like, yeah, I agree.
00:51:25.000 I think those things are really awful, too.
00:51:27.000 I can't remember exactly what he said.
00:51:29.000 But does that matter?
00:51:31.000 Perhaps there are some people who would come out and say, I don't believe anybody who opposes free speech should be allowed on the platform.
00:51:38.000 It is an affront to American culture.
00:51:41.000 It is an affront to the founding fathers and everything this country stands for.
00:51:44.000 How about this?
00:51:45.000 How about every conservative who says burning the flag is wrong?
00:51:48.000 If this platform was run by Donald Trump, he'd ban people for showing pictures of burning the flag.
00:51:54.000 And that would be wrong in my opinion.
00:51:56.000 But it goes to show you that when they- But then what if you're reporting?
00:51:59.000 What if you're reporting on Antifa?
00:52:01.000 Well, I don't think Trump would ban you for reporting on Antifa doing it.
00:52:05.000 I'm saying if Carl Benjamin goes on the platform and insults someone and calls them a name, so they ban him for it.
00:52:12.000 I believe if it was run by staunch conservatives, people like Donald Trump, and you posted yourself burning a flag, you'd be banned for it.
00:52:19.000 But this is also my point, though, is that if you're running this through some arbitrary system, you have some machine that's banning people, well, you start with saying, OK, I don't like the American flag being burned.
00:52:33.000 I personally don't like the American flag being burned, obviously.
00:52:36.000 But then you start banning people for showing that.
00:52:39.000 Well, then if I'm reporting on Antifa or if you're reporting on whoever you're reporting on, right, and you have to show extreme symbols.
00:52:46.000 So I remember I did an Antifa documentary.
00:52:48.000 And was it Vimeo?
00:52:50.000 Vimeo took it down because they said it was violence.
00:52:52.000 Well, it was a documentary about a violent group that uses violence to attain political ends.
00:52:58.000 So yes, there's going to be violence in it, but I'm not promoting the violence.
00:53:01.000 Again, we're arguing something that is easily resolved and was in the original design, I think.
00:53:07.000 Somebody says something you don't like, you block them once, you'll never see them again.
00:53:11.000 That's your method, your tool that you can use. But what you're saying
00:53:16.000 when you want people to be blocked is you don't want other people to see it. You're not even
00:53:20.000 talking about yourself. You're trying to control what other people can access. That's what I have a
00:53:24.000 problem with. There are the tools that exist there. If I don't like what you say, I can hit block
00:53:29.000 and I'll never see it again.
00:53:30.000 It only has to happen once. Avert your eyes and move on.
00:53:33.000 It is insane to me that the left thinks you should not be allowed on the platform
00:53:40.000 If you would choose to call them something they did not choose to be called.
00:53:45.000 In what reality?
00:53:47.000 I can go outside, walk down the street, and call everyone a chicken-effer.
00:53:54.000 I can see a guy walking down the street and be like, screw you, chicken-effer!
00:53:57.000 And you know what's gonna happen?
00:53:58.000 Nothing.
00:53:59.000 Now, in some places, they might attack you.
00:54:01.000 They might hit you for it.
00:54:04.000 99% of the time, they're gonna be like, okay, whatever, and they're gonna walk off.
00:54:09.000 Yet, for some reason, these people on Twitter are like, Not even Zuby, famously suspended for saying, okay, dude, not trying to misgender someone by calling them a dude, but just being like, okay, dude, like in response.
00:54:22.000 I call everyone dude.
00:54:23.000 I actually call everyone dude, regardless of gender.
00:54:25.000 They are actually arguing that you would see an image of someone.
00:54:30.000 And because you use the wrong pronoun, you should not be allowed on the platform.
00:54:35.000 Right.
00:54:36.000 Insane.
00:54:38.000 Man, it's a private company owned by a guy.
00:54:41.000 If he wants to censor people and kick them off, I get it.
00:54:43.000 When the company... Well, the company is Twitter.
00:54:46.000 Twitter.com is just a piece of property that's owned by the company, Twitter.
00:54:49.000 So I want to talk about the product, Twitter.com.
00:54:52.000 If, when it gets big, you're like, OK, now this guy is controlling something that we're using in the commons, what do you do?
00:54:58.000 Use the government to say you can't censor?
00:55:01.000 I don't like that.
00:55:02.000 That's the only thing I can think of, is to make him open up the code so that other people have access to other Twitters.
00:55:08.000 Well, I agree with you.
00:55:09.000 I think having the government come in and say anything, it just compounds the situation that we're already in.
00:55:15.000 I think that social media titans should have said when they were initially approached with trying to moderate and fake fact check and do all the things they do, their best answer for their own protection, too, would have been to say, we don't do it.
00:55:30.000 As Tim said, Gab said, we only censor or control that which is illegal.
00:55:35.000 The other stuff you can do yourself.
00:55:37.000 Except doxing.
00:55:39.000 And spam.
00:55:40.000 No, I don't... Did he say ban spam?
00:55:43.000 He wants to.
00:55:44.000 He said he wants to.
00:55:46.000 And so they're interesting and fair points.
00:55:48.000 And authenticate all humans, which is interesting.
00:55:50.000 No, no, no, I mean... I know Elon does.
00:55:51.000 I'm talking about Torba.
00:55:52.000 Oh, sorry.
00:55:53.000 I don't know if he bans spam, but I'm on Gab.
00:55:56.000 But you can't dox people.
00:55:59.000 And I said, I think everybody agrees with that.
00:56:01.000 Because showing someone's address is not an expression of your political views or opinions.
00:56:06.000 It's just, you know, it's an attack on someone.
00:56:08.000 So should we...
00:56:09.000 But hold on.
00:56:11.000 I actually wonder if that, if doxing should be allowed.
00:56:14.000 That's what I'm wondering.
00:56:15.000 Should we amend the Constitution, the right to privacy, to cover doxing?
00:56:18.000 You can't do it.
00:56:19.000 Well, this does get into, though, the area that we're talking about where essentially what Elon has said, well, I want to just go with whatever the law is, and that's very noble.
00:56:32.000 We've all come to agree that there are certain things that we want moderated, like doxxing, that aren't necessarily covered by the law.
00:56:42.000 Yeah, like, and this is, maybe you don't want to moderate it, but like, white supremacy, for instance.
00:56:46.000 You'll go to Twitter, it'll be white supremacy, you'll see a swastika, which is totally legal, big in your newsfeed, and you'll see all these people tagging you in it, and you'll be like, no, this is what would happen with an unmoderated, like, open network, and what'll happen is these people with these really niche Intense powerful even destructive ideas will go hard on it like all day because they're obsessed with it and they destroy the network any ability to have a normal communication like not want to be inundated with like You know political ideology and racism and all this crap So I get that the moderator is like we got to stop that we want to prevent another Hitler from rising up on our platform.
00:57:22.000 Here's my I threw out a take on this the other day and it very much in line with what you're saying here is that when the Internet was still kind of in its infancy, Google used to have this idea called safe search.
00:57:33.000 Remember safe search?
00:57:34.000 You know, safe search on, safe search off.
00:57:36.000 And you always had to turn it off to find the good stuff, right?
00:57:40.000 What the idea was, though, the concept behind that was that self-moderated content, right?
00:57:46.000 So if my kids, for example, are picking up, like let's say we're in a long car ride, we try to limit their screen time.
00:57:53.000 We don't do like tablets in the house or like anything like that.
00:57:56.000 But if we're on a long car ride or on a flight or something, right?
00:57:58.000 I might have the tablet for them, but I'm making sure that it's on lockdown.
00:58:03.000 I'm moderating that content.
00:58:05.000 Now, as a dad, right, I know what sort of things I'm going to allow my kid to do.
00:58:10.000 Same idea is that you go on and then you can set, maybe you can set certain, you could click, I don't want to see hate speech, right?
00:58:17.000 And that's a filter and boom, you click that and then anything that Twitter, the good people at Twitter who have deemed to be hate speech, you don't see that.
00:58:25.000 But it doesn't deny anybody the ability to use the platform.
00:58:30.000 Well, that's what I was going to say.
00:58:31.000 It'd be so easy to create a tool that says, if I'm someone I want a third party to moderate what I can see and I like what Twitter's doing, it should be an opt-in.
00:58:41.000 And there can be different things you can opt into.
00:58:44.000 I talked with Bill Ottman of Minds about this.
00:58:48.000 The idea being that there's the overworld and the underworld, and the underworld is whatever is legal.
00:58:53.000 And so if advertisers have an issue, they don't advertise in the underworld portion, but people can opt in and say, I want to see everything from the underworld or I don't.
00:59:01.000 And so you have your standard moderation policies, you know, which is, they, you know, Minds is still very much a free speech platform that allows a lot.
00:59:09.000 But the real serious nasty stuff that you might not want to see or the outright, you know, hateful stuff, all of that will be considered, you know, underbelly versus overworld.
00:59:18.000 I don't think Minds is actually doing that.
00:59:20.000 Well, yeah, there's a not safe for work type of way to view the site, kind of like what you were saying, Jack, a toggle.
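A minimal sketch of the opt-in filtering being described, assuming posts already carry labels from whatever tagging or moderation process exists; the label names and data shapes are invented. The point is that filtering happens per user at display time, rather than by removing the post or banning the poster.

```python
# Minimal sketch of an opt-in, per-user content filter: posts stay on the
# platform, and each user chooses which labels they do not want to see.
# Label names and data shapes are invented for illustration.
from dataclasses import dataclass, field

@dataclass
class Post:
    author: str
    text: str
    labels: set[str] = field(default_factory=set)          # e.g. {"nsfw", "hate_speech"}

@dataclass
class UserPrefs:
    hidden_labels: set[str] = field(default_factory=set)   # what this user opted out of

def visible_to(user: UserPrefs, post: Post) -> bool:
    """A post is hidden for this user only if it carries a label they opted out of."""
    return not (post.labels & user.hidden_labels)

if __name__ == "__main__":
    feed = [
        Post("alice", "skateboarding clip"),
        Post("troll", "something vile", {"hate_speech"}),
    ]
    cautious = UserPrefs(hidden_labels={"hate_speech", "nsfw"})
    see_it_all = UserPrefs()

    print([p.text for p in feed if visible_to(cautious, p)])    # ['skateboarding clip']
    print([p.text for p in feed if visible_to(see_it_all, p)])  # both posts
```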
00:59:25.000 Problem is if someone uploads a swastika and they're like, no, I'm not going to, I'm not going to self tag this one.
00:59:30.000 It's just, I'm going to tag it as a dog.
00:59:31.000 And then you, your kid opens up the website and they see a swastika and they're like, what?
00:59:35.000 So then you got to report the thing.
00:59:37.000 Then it goes to a group of people or something much worse than a swastika, something graphic, like a body blown open or something.
00:59:43.000 It goes to the admins to decide like, okay, this was mistagged.
00:59:47.000 If it's from an account that consistently mistags their stuff, you can ban it.
00:59:51.000 But that person's gonna start a new account.
00:59:53.000 Now, do you have to make the user have a unique email address to start their own account?
00:59:57.000 I think they do at Twitter.
00:59:58.000 Minds... I think Minds does.
01:00:00.000 We tried not having one of those.
01:00:02.000 Of course, you get the same guy.
01:00:03.000 I'll make 90 accounts and then upload the swastika on every account.
01:00:07.000 Hashtag dog.
01:00:08.000 And you're like, wow.
01:00:09.000 So...
01:00:11.000 So getting the community to moderate, honestly, is probably the best way to do it, but they're going to see some nasty stuff.
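And the mistagging problem just described could be handled roughly like this sketch: reports go to a reviewer, and only accounts that repeatedly mislabel their uploads get suspended. The threshold and names are assumptions for illustration, not anyone's actual policy.

```python
# Rough sketch of the report-and-review loop described above: users flag a
# mislabeled post, a reviewer confirms or rejects, and accounts that rack up
# confirmed mislabeling strikes get suspended. The threshold is arbitrary.
from collections import defaultdict

STRIKE_LIMIT = 3
strikes: defaultdict[str, int] = defaultdict(int)
suspended: set[str] = set()

def review_report(author: str, claimed_label: str, reviewed_label: str) -> None:
    """A reviewer compares the uploader's self-applied tag with what the post actually is."""
    if author in suspended:
        return
    if claimed_label != reviewed_label:       # confirmed mistag, e.g. "dog" vs. "graphic"
        strikes[author] += 1
        if strikes[author] >= STRIKE_LIMIT:   # consistent mistaggers lose the account
            suspended.add(author)

if __name__ == "__main__":
    for _ in range(3):
        review_report("repeat_offender", claimed_label="dog", reviewed_label="graphic")
    review_report("honest_user", claimed_label="nsfw", reviewed_label="nsfw")
    print(suspended)               # {'repeat_offender'}
    print(strikes["honest_user"])  # 0
```

The ban-evasion problem raised next, new accounts on throwaway emails, isn't solved by anything here; this only shows how the first pass of community reporting can be kept cheap.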
01:00:18.000 Under Elon Musk's rules, I'm willing to bet you'll be freely posting swastikas.
01:00:24.000 You know what's really interesting, too, is that Twitter banned... I'm pretty sure they banned the American Nazi Party.
01:00:29.000 Do you guys remember that?
01:00:31.000 I'm pretty sure they did.
01:00:32.000 I wonder if Elon Musk would reinstate that political... Well, there will be people who will want to test it and make that point.
01:00:39.000 But I also, as a reporter, the kind of journalism I do, I don't want to see just the accepted version of something.
01:00:47.000 I'm looking for stuff that may seem objectionable to other people because I'm looking for sources.
01:00:53.000 I'm looking for who's saying things that are off the narrative.
01:00:56.000 I'm finding whistleblowers that way.
01:00:59.000 I'm looking to see what's being passed around.
01:01:01.000 That may not be true.
01:01:02.000 Maybe I'm looking for something that's not true on purpose.
01:01:06.000 And it's become very difficult, whether you're working on the internet or social media, to find the things I used to be able to find a lot more easily 10 years ago.
01:01:13.000 I just want to say Elon Musk is the hero we need and deserve because he just tweeted, next I'm buying Coca-Cola to put the cocaine back in.
01:01:25.000 Yo, these are the kind of tweets I strive for.
01:01:28.000 Michael Malice, you should take heed of Elon Musk.
01:01:32.000 Do you get the impression, like... All right, and I'm just gonna, like... This is my
01:01:37.000 impression. I've never met Elon, I don't think I know anyone who knows him or anything like
01:01:41.000 this, but my impression is that Elon just kind of really likes
01:01:45.000 Twitter, and he enjoys playing the game of Twitter, and he's talked about
01:01:52.000 this, that, you know, Twitter is like going into the ring, or going into the arena, and
01:01:58.000 it's like these people came and ruined the game. They ruined the fun of Twitter,
01:02:02.000 they ruined the great thing that Twitter was, and he's just taking it back because he likes it so much and he
01:02:09.000 wants it to be the way it was. But doesn't he also know how it is to be an outlier, or to be
01:02:14.000 censored or left out of the... I'm not talking about social media,
01:02:18.000 But not being invited to the White House when other experts are being convened on something you're an expert in
01:02:24.000 He kind of understands that world of certain people being carved out or excluded or censored.
01:02:29.000 I think that makes him someone kind of on the outside looking in and understanding.
01:02:33.000 That's a really good point.
01:02:35.000 I was trying to pull up a tweet of his from earlier that I was looking at about free speech.
01:02:40.000 I have a feeling that he's still of the mindset that free speech means that he has to moderate the network to allow people to do what they want.
01:02:47.000 But God gives us the free speech, not you, Elon.
01:02:50.000 What we need to do is free the software code so people can control their own network.
01:02:53.000 So, Elon Musk, uh, did I actually lose that one?
01:02:57.000 Here we go.
01:02:57.000 Where is it at?
01:02:58.000 Elon Musk responding to Ben Shapiro.
01:03:00.000 Ben Shapiro said Twitter should be, uh, Ben Shapiro quoting Elon Musk, Twitter should be politically neutral.
01:03:05.000 Washington Post and every left-wing blue check-ree.
01:03:08.000 You guys are giving away your game.
01:03:09.000 Elon Musk said attacks are coming thick and fast.
01:03:12.000 Primarily from the left, which is no surprise.
01:03:14.000 However, I should be clear that the right will probably be a little unhappy, too.
01:03:18.000 My goal is to maximize area under the curve of total human happiness, which means the 80% of people in the middle.
01:03:26.000 I would like to give a shout-out to our good friend, Shoe0nHead,
01:03:29.000 who in response to a tweet from Elon Musk, he said, for Twitter to deserve public trust, it must be politically
01:03:35.000 neutral, which effectively means upsetting the far right and the far
01:03:38.000 left equally.
01:03:39.000 Shoe0nHead said, Elon Musk waging war on both my friend groups.
01:03:42.000 She did respond to his other tweet saying, but those are the least funny people on this website.
01:03:49.000 You need to be catering to the fringe schizos.
01:03:52.000 I disagree with Elon on that.
01:03:53.000 Or put it this way.
01:03:55.000 I think what he's saying is unachievable.
01:03:57.000 Because you don't see people on the far right saying that they want people on the far left banned.
01:04:04.000 But that's the inverse when it comes to the far left.
01:04:06.000 They do want the other side banned.
01:04:08.000 That's the whole issue.
01:04:10.000 So, to the greatest extent of this, you are not going to be able to get an equal level of, you know, quote-unquote anger on both sides.
01:04:19.000 It's just not going to happen, you know, unless you're adding some, like, arbitrary rules on top of this thing.
01:04:25.000 Because, essentially, the right just wants to be left alone, number one.
01:04:30.000 Or, number two, wants to be able to have an even playing field.
01:04:34.000 The left doesn't want an even playing field.
01:04:36.000 Elon doesn't understand.
01:04:38.000 That Twitter's policies that he doesn't like were specifically to make the left and the right both unhappy.
01:04:45.000 The left was demanding censorship.
01:04:47.000 They're saying, Twitter is overtly supporting white supremacists by letting them say words.
01:04:52.000 Can you believe that people misgendered me?
01:04:54.000 They should be banned.
01:04:55.000 And Twitter said, okay, we'll ban some of them.
01:04:58.000 The right was like, well, they banned them and I'm angry, but I'll stay on the platform because I can tolerate views I don't like.
01:05:05.000 So Twitter said, how do we make both sides happy?
01:05:07.000 Ban a bunch of conservatives because conservatives can tolerate it and the left can't.
01:05:11.000 If Elon Musk actually restores free speech, as he's saying, they're primarily coming from the left, that will always be the case.
01:05:17.000 There will be no circumstance where someone will do something overtly egregious on the right, and the right's going to be like, that's unfair, they should be allowed to do those things.
01:05:29.000 Okay, I'm not going to get into it.
01:05:31.000 But when Alex Jones, right, was out there and across every platform, he was one of the biggest shows.
01:05:39.000 He just was.
01:05:40.000 And if you ran tracking on this, he was getting more views a day than I think anybody outside the mainstream media.
01:05:47.000 I saw lots of people on the right.
01:05:48.000 I saw conservatives up and down attacking him for things, mocking him, ridiculing him, saying they disagreed with various takes that he had on various issues that YouTube probably doesn't want me to get into.
01:06:01.000 That's not my point.
01:06:03.000 I don't remember anybody on the right ever saying that Alex Jones should be taken off the air.
01:06:08.000 I don't remember ever hearing that said once.
01:06:12.000 What if conservatives came out and said, you should not be allowed to have your own pronouns, and Twitter should enact a policy that if you change your pronouns, you'll be banned?
01:06:22.000 Then they'd become pretty extreme.
01:06:24.000 Can your pronouns be grandfathered?
01:06:26.000 Well, hold on there a minute.
01:06:27.000 Right now, Twitter will ban you if you misgender someone.
01:06:29.000 Right.
01:06:30.000 So I'm saying, what if conservatives said, okay, we'll do the inverse.
01:06:34.000 If you use pronouns that are not in line with your biology, we'll ban you.
01:06:38.000 That is equally as extreme.
01:06:41.000 My point is Elon Musk is not proposing that and conservatives are happy.
01:06:44.000 hold the infinity gauntlet. Everyone knows the answer is no one. You're supposed to break
01:06:48.000 it apart. My point is Elon Musk is not proposing that and conservatives are happy. He's saying
01:06:53.000 you can have whatever pronouns you want but you can't make someone else say them and the
01:06:57.000 right's like we can live with that even though we don't like it.
01:07:00.000 And the left's like, no, we can't.
01:07:02.000 Ban them.
01:07:03.000 There's no way to make the left happy.
01:07:04.000 You can block whoever you want.
01:07:04.000 That's why you block.
01:07:06.000 Right.
01:07:06.000 But that makes the right happy, but not the left.
01:07:08.000 Elon Musk will not be able to achieve that goal.
01:07:10.000 Life is not about being happy.
01:07:11.000 And I've had to explain this so many times to people.
01:07:13.000 This is the difference between cancel culture and boycotts.
01:07:17.000 Right?
01:07:17.000 Completely different things.
01:07:20.000 So a boycott is when one concerted group says, we are no longer going to support a certain industry, organization, company, or figure, right?
01:07:32.000 Conservatives, I believe right now, are generally, I don't know if anyone's actually said this, but they're essentially boycotting Disney, right?
01:07:38.000 Over the groomer situation, that whole scandal.
01:07:40.000 And so that's going on as a boycott, but then I'll always see this and people will say, hold on, hold on.
01:07:47.000 I thought you said you were against cancel culture.
01:07:49.000 No, that's different.
01:07:50.000 We're not saying that they should go out of business.
01:07:53.000 We're not saying that no one can do business with them.
01:07:56.000 That's what cancel culture is, is destroying someone's ability to even have a livelihood whatsoever.
01:08:03.000 It's an inverted boycott.
01:08:04.000 Yes.
01:08:04.000 Whereas boycott, just to reiterate, is people saying, we all have decided we're not going to buy from you.
01:08:11.000 The cancel culture is, don't let them sell to anyone.
01:08:13.000 Exactly.
01:08:14.000 But here's why.
01:08:14.000 The cancel culture is afraid that, without that, the popular sentiment would not be on their side.
01:08:22.000 And when we talk about conservative versus liberal on the Internet, for example, and social media, I think part of that is because there's a control issue, where certain factions, which are actually quite a small minority of people in my view, have been able to use social media in a controlled way to portray it as if a greater section of the public feels that way.
01:08:48.000 And I think they're afraid if people were to actually see what percentage of the public feels a certain way on certain topics, they would find that many liberals side along with conservatives on some issues.
01:09:00.000 And sometimes these are very fringe, fringe discussions and issues getting a lot of attention because the social media is controlled.
01:09:08.000 When left to their own devices, this stuff would fall along the fringe and would not look like it does today with this controlled conversation elevating certain issues and views to a level that's far beyond, I think, what they really are.
01:09:21.000 Yeah, the baizuo.
01:09:23.000 I think you've brought this up.
01:09:24.000 Baizuo.
01:09:25.000 Baizuo.
01:09:25.000 What is it, Mandarin?
01:09:28.000 It's Mandarin for white left.
01:09:30.000 It means white left.
01:09:30.000 They have a word to define the white left, the white liberals.
01:09:34.000 That's, to me, indicative that there's a focus on the white left, the white liberal.
01:09:40.000 And then you start to see the manipulation in the social media.
01:09:42.000 And I wonder how involved the creators of the baizuo are in... Here's what happened.
01:09:52.000 In 2014 and 2015, when everything was just open, Twitter had a much smaller user base than it does today.
01:09:59.000 And Twitter gained a massive user base, huge influx in 2016 because of Trump and because of his ubiquitous and singular use of Twitter.
01:10:08.000 The way he used that platform like no other person at his level ever had before, right?
01:10:13.000 There's always staff tweets, Katy Perry or, you know, Barack Obama, Hillary.
01:10:18.000 These were always staff tweets.
01:10:19.000 I don't even think he knows what Twitter is to an extent.
01:10:19.000 Joe Biden, right?
01:10:21.000 He's at some website the kids use, right?
01:10:24.000 And so prior to this, I mean, there were really no rules.
01:10:30.000 The idea of someone even being banned on Twitter was almost unthinkable.
01:10:36.000 It was unheard of that people would get suspended.
01:10:38.000 It's certainly not for speech, anything like this.
01:10:40.000 But there was a place on the internet where crazy people dwelt and the crazy met with more crazy and they combined to create exponential crazy.
01:10:52.000 And this place, no, it is the upside down of Tumblr. And Tumblr essentially, in that time space, through things like Gamergate, the SJW wars, and then eventually Trump coming onto Twitter, essentially occupied Twitter and specifically occupied the headquarters of Twitter.
01:11:15.000 And by maintaining that.
01:11:18.000 To use a cliche, high ground.
01:11:20.000 By maintaining the high ground, they were able to impose their will across Twitter.
01:11:25.000 And so something I've been saying a lot lately is what's going on here is the liberation of Twitter from the Tumblr occupation.
01:11:33.000 The Tumblr occupation.
01:11:34.000 The Tumblr occupation.
01:11:36.000 Let's not forget that after Trump won in that unexpected race, which was unexpected to almost everybody except I did.
01:11:36.000 Yes.
01:11:45.000 I was a national journalist who predicted repeatedly on national TV that he would win, just because I was listening outside the Beltway to regular people. But once he won, the left admitted, including Media Matters, the propaganda group,
01:12:00.000 that they went and held meetings with Facebook and convinced Facebook to take this new tack,
01:12:05.000 which was brand new: to do the fake fact checks, the moderations.
01:12:10.000 They didn't call them fake fact checks, of course, but the notion that they would have
01:12:13.000 to come on and prevent something like this from happening again.
01:12:17.000 That whole notion was raised in a very organized fashion shortly after the 2016 election going
01:12:22.000 into 2017.
01:12:28.000 So there's also a Time News article where they talk about fortifying the election.
01:12:31.000 I think it's for 2020.
01:12:33.000 They're planning it for a year and a half, I think.
01:12:35.000 Well, it was Time Mag.
01:12:37.000 Who was talking about it?
01:12:38.000 So this is the great Time article that Ian's referring to, which was essentially... that's when the serial killer has, you know, conducted their killings.
01:12:47.000 That's the Zodiac sending the cipher to, what was it, the San Francisco Chronicle?
01:12:53.000 You know, sending the letter off to let him know, let them know that it's been done.
01:12:57.000 They needed credit for it.
01:12:58.000 It was so good.
01:12:58.000 And they needed credit.
01:13:00.000 So they needed to know, they needed the world to know.
01:13:02.000 And humans have this innate desire for credit, for their esteem to be stoked.
01:13:09.000 So for that ego power, because so many people live off of ego alone because they're not in touch with, as I would say, they're not in touch with God.
01:13:18.000 They're not in touch with the spiritual side of things.
01:13:20.000 And so they live for this world.
01:13:22.000 They're very worldly and they don't understand that, uh, you know, this world is ephemeral.
01:13:28.000 This world will, you know, we'll leave it behind.
01:13:29.000 But anyway, the point being is that article is a confession.
01:13:33.000 We got you.
01:13:33.000 It's just a confession.
01:13:34.000 Well, in terms of controlling political outcomes and information, that's just the game for them.
01:13:40.000 It was, it was brilliant.
01:13:42.000 And like you said, they wanted credit for it.
01:13:46.000 Well, they got it.
01:13:47.000 Thanks Time Magazine for blowing that one off the bow.
01:13:50.000 People are... The secret history of the shadow campaign that saved the 2020 election.
01:13:56.000 That's what it's called, the shadow campaign.
01:13:59.000 I just want to mention, just a moment ago I said I was browsing Twitter and it's because I was in a message that Tucker Carlson was having a unique conversation.
01:13:59.000 Right.
01:14:06.000 Ladies and gentlemen, I am going to play for you this clip I just pulled up.
01:14:11.000 Tucker Carlson, some good news before we go, a little justice finally.
01:14:14.000 The Twitter account libsoftiktok just surpassed a million followers on Twitter.
01:14:18.000 It's about a minute long.
01:14:19.000 I grant you this clip from Tucker Carlson.
01:14:21.000 It just aired tonight.
01:14:22.000 Some good news before we go.
01:14:23.000 A little justice, finally.
01:14:24.000 The Twitter account Libs of TikTok just surpassed a million followers on Twitter.
01:14:28.000 Now, it was just last week that the Washington Post and its reporter Taylor Lorenz tried to destroy the person who runs that account.
01:14:34.000 But it turns out a lot of people actually want the account because they want to know what their teachers are doing in the classroom.
01:14:39.000 It's not an attack on anybody.
01:14:40.000 It's called transparency.
01:14:42.000 So Tim Pool of YouTube fame, with the help of The Daily Wire's Jeremy Boreing, just put up a billboard
01:14:47.000 in Times Square highlighting what the Washington Post tried to do to the founder of Libs of TikTok,
01:14:51.000 and you're seeing it on your screen now. Of course, as predicted, Taylor Lorenz says she's
01:14:55.000 the victim here. Of course, this billboard is so undeniably idiotic it's hilarious, but don't forget
01:15:01.000 these campaigns have a much darker and more violent side. I'm grateful to be at the newsroom that
01:15:05.000 recognizes these bad faith, politically motivated attacks and has a strong security team. So she
01:15:10.000 exposes the founder of Libs of TikTok to violence, but when you say her name, you're a
01:15:14.000 terrorist. In other words, stop hurting me, she says, as she punches you in the face. These people...
01:15:20.000 This is what they were saying about Vijaya, right?
01:15:22.000 They were saying that Saagar and Cernovich got contacted by the Washington Post, saying, what was your intention by including the name of the Twitter official in your reporting?
01:15:38.000 First of all, they're all commenting on this political story, that she apparently had broken down in tears during one of the meetings.
01:15:44.000 So commenting on that story and then having Elon Musk respond to them somehow became them attacking her.
01:15:52.000 I actually looked this up earlier today, and a lot of those websites, it's hard to tell when they do these estimates.
01:15:57.000 She's worth anywhere between $30 and $70 million.
01:15:59.000 Who is?
01:16:02.000 Wow.
01:16:02.000 78.
01:16:03.000 I have the numbers actually right here.
01:16:04.000 Yeah, but each website has, like, a different, you know, estimation of it.
01:16:09.000 So it depends on which one you look at.
01:16:10.000 So I'm giving the swath, right?
01:16:13.000 This is a major public figure who has major control in the world, who's worth tons of money, but can't talk about her because that's an attack.
01:16:22.000 Well, from wallmine.com, she has over $38 million worth of Twitter stock.
01:16:29.000 Her stock's worth over $28 million.
01:16:31.000 She makes $8 million a year as chief legal officer and secretary at Twitter.
01:16:35.000 It's $7.9 million.
01:16:36.000 $8 million a year.
01:16:38.000 Don't you talk about her, Ian.
01:16:39.000 Don't you talk about her, Ian.
01:16:41.000 She's about to get fired and have a nice package.
01:16:43.000 Goodbye.
01:16:44.000 Maybe he'll keep her on, but you're harassing her.
01:16:46.000 Wow.
01:16:47.000 The misogyny.
01:16:49.000 I only take issue with the things she's done and said.
01:16:52.000 She's cool.
01:16:53.000 Here's a prediction.
01:16:55.000 I think if Twitter really is allowed to become an organic thing, the big story is going to be how everything changes when you see it reflect what people really think and feel.
01:17:08.000 Instead of this balanced conversation as maybe Tim suggested.
01:17:13.000 It's not so balanced and I'm not saying all views shouldn't be heard.
01:17:16.000 They should be heard.
01:17:18.000 But I think people will be surprised how small some of this group becomes and how big other groups become that you haven't heard much from because they've been suppressed.
01:17:29.000 A problem that comes up on social media since the beginning, it seems like, is that the most popular thing gets more traction than everything else because it goes up on trending and then they're like, what's that?
01:17:37.000 And then it just snowballs things, like crazy things can snowball, like the swastika.
01:17:42.000 Hitler didn't, you know, nothing.
01:17:43.000 That's true always.
01:17:46.000 Unless you have time, like chronological feeds, because then you're just seeing what just got posted by the people you're following.
01:17:53.000 But people can share.
01:17:54.000 People share popular stuff.
01:17:57.000 And then when they do, the more you share it, the more people see it.
01:17:59.000 Across time, you're right.
01:18:01.000 The most famous people get free stuff.
01:18:03.000 That makes it so weird.
01:18:04.000 Oh, I mean, look, if you're rich, the bank gives you money.
01:18:06.000 If you're poor, they take it from you.
01:18:08.000 Same with social media attraction.
01:18:09.000 It's very weird.
01:18:10.000 It's almost like physics.
01:18:12.000 I mean, look, I've heard the joke where, you know, they say when you're rich, the bank will give you money.
01:18:17.000 But when you're poor, they say you owe us money.
01:18:18.000 And I'm like, well, here's why that happens.
01:18:21.000 When you have very little money in the account, they have to pay to maintain it.
01:18:24.000 And so it's draining their, you know, draining their money to run an account for you that's not being utilized enough.
01:18:31.000 So they charge you for it.
01:18:32.000 It seems counterintuitive.
01:18:33.000 If you're very wealthy, you're giving them access to capital for whatever it is they want to do.
01:18:37.000 And the popular posts on Twitter are getting more eyeballs on Twitter, so that's why they want the popular stuff at the top similar to having rich people.
01:18:45.000 And it creates a big cycle.
01:18:46.000 So, it's always that way.
01:18:49.000 There's some attempt at raising things up.
01:18:51.000 The problem is, Twitter wasn't doing it.
01:18:54.000 Twitter inverted it.
01:18:54.000 Twitter was deciding what was going to be.
01:18:57.000 YouTube is deciding for political reasons.
01:19:00.000 If everything was based off merit, I'll tell you, YouTube would be a wacky place.
01:19:06.000 All of the thumbnails would be big-tittied women, because that's- I'm not trying to be crass, that's literally what it was.
01:19:13.000 All of these big creators, in the early days of YouTube, realized that if I want views, the thumbnail's gotta get people to click on it.
01:19:20.000 Well, wasn't this the thing where that was the thumbnail, but the video actually had nothing to do with it?
01:19:23.000 Yes, exactly.
01:19:25.000 Or what they would do is they'd be like, I'm gonna comment on this video game story, but first, yo, we got this story about the supermodel.
01:19:32.000 Isn't she looking great?
01:19:33.000 Yes, the supermodel did these things.
01:19:35.000 Anyway, on to the video games.
01:19:37.000 That way they could justify, well, I did talk about her.
01:19:39.000 Because YouTube tried cracking down, saying, okay, you can't use thumbnails that do not represent the video.
01:19:44.000 It's causing us problems.
01:19:46.000 So, YouTube slowly starts making rules to try and deal with the insanity that is this platform.
01:19:51.000 And eventually, they say, okay, the other problem we have is people are just posting two-minute clips.
01:19:57.000 We need people to post long-form stuff.
01:20:00.000 So they make this algorithm that promotes videos over 10 minutes, videos with high engagement rates, and what do they get?
01:20:06.000 They were hoping for Game of Thrones.
01:20:08.000 They got Culture Wars.
01:20:11.000 And they don't like it.
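To make the shift being described concrete, here's a toy comparison; the weights and fields are invented and this is not YouTube's actual algorithm, just the general idea of ranking by expected watch time with a bonus for long-form uploads instead of by recency.

```python
# Toy contrast between a chronological feed and an engagement-weighted ranking
# like the one described above. Fields and weights are invented; this is not
# any platform's real algorithm.
from dataclasses import dataclass

@dataclass
class Video:
    title: str
    minutes: float
    avg_watch_fraction: float  # 0..1: how much of the video viewers actually watch
    posted_at: int             # unix timestamp

def chronological(feed: list[Video]) -> list[Video]:
    """Old-style feed: newest first, nothing else considered."""
    return sorted(feed, key=lambda v: v.posted_at, reverse=True)

def engagement_ranked(feed: list[Video]) -> list[Video]:
    """Favor expected watch time, with an extra bonus for videos over 10 minutes."""
    def score(v: Video) -> float:
        watch_time = v.minutes * v.avg_watch_fraction
        long_form_bonus = 1.5 if v.minutes >= 10 else 1.0
        return watch_time * long_form_bonus
    return sorted(feed, key=score, reverse=True)

if __name__ == "__main__":
    feed = [
        Video("two-minute clip", 2, 0.9, posted_at=1_000_200),
        Video("90-minute culture war panel", 90, 0.4, posted_at=1_000_000),
    ]
    print([v.title for v in chronological(feed)])      # clip first (it's newer)
    print([v.title for v in engagement_ranked(feed)])  # long panel first
```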
01:20:12.000 Then they realized, hey, wait a minute, all of these conservative, anti-woke, and libertarian personalities are getting a lot of traction because people like their ideas.
01:20:19.000 Ban them.
01:20:20.000 And they did.
01:20:21.000 This show was shadowbanned.
01:20:23.000 You could not Google search it for a year and a half.
01:20:26.000 And then one day I mentioned on this show, I was like, oh yeah, Google shadowbanned us.
01:20:29.000 Like, if you were to Google the title of one of our clips, the Facebook version would come up and the YouTube would not.
01:20:37.000 On Google.
01:20:38.000 That seems to make no sense.
01:20:39.000 And then one day everyone's like, Tim, they removed it.
01:20:41.000 You can be searched again.
01:20:43.000 There was a hit piece in the media.
01:20:45.000 And then all of a sudden, a wave of YouTubers who were deemed wrong thinkers were purged from the recommendation algorithm.
01:20:51.000 Because someone just said, I don't like it.
01:20:56.000 I'm like a peacetime strategist.
01:20:58.000 I'm not a wartime strategist.
01:20:59.000 But in my mind, I think we have to build a system where that can never happen again.
01:21:02.000 But then I'm like, well, in times of war, you have to be able to censor the enemy.
01:21:06.000 At least, if you don't, the enemy's propaganda will convince your people to kill you.
01:21:11.000 That's what's... I honestly think Chinese propaganda is destroying us.
01:21:15.000 Russia tried, they overreact as to what Russia was actually doing, but I think China's actually way more successful.
01:21:21.000 And the reality is, yeah, we had an office of censorship in World War II because there were concerns that journalists, particularly, would put out information that would harm the war effort.
01:21:31.000 Speaking of that, I got, this is amazing, Life Magazine from March 1944, which is, I think it's 44, but it's a magazine from two months, three months before D-Day.
01:21:45.000 In the magazine, they show all of the American military power in the UK.
01:21:51.000 It's fascinating.
01:21:52.000 They said, the United States is helping the UK in the event Germany may try to invade.
01:21:59.000 Fake news.
01:22:02.000 They were really there to invade the beaches of Normandy, to stage D-Day.
01:22:04.000 But they could not say that in the press. So in this magazine at the
01:22:07.000 time, they're like, we're here with our military to protect Britain.
01:22:10.000 Fake news. So it was fake news.
01:22:12.000 So, I mean, this is actually something I studied when I was in
01:22:15.000 the I.C. I went through classes on this. They call it denial and
01:22:17.000 deception operations.
01:22:18.000 So Germany knew that there was an invasion coming
01:22:23.000 and it was going to be the Americans.
01:22:25.000 The question was, where would the invasion come from?
01:22:30.000 And they conducted so many deception operations because they knew that the Germans had spies throughout Europe.
01:22:39.000 So what did they do?
01:22:40.000 They knew that they sent Patton to Europe because they knew Patton.
01:22:43.000 They sent him to the UK because they knew that the Germans would be watching him.
01:22:47.000 But where do they send him?
01:22:48.000 They don't send him across from Normandy.
01:22:51.000 They send him to Dover.
01:22:53.000 Now, Dover is directly across from Calais.
01:22:55.000 This is the shortest point on the English Channel.
01:22:59.000 This is actually where the Channel Tunnel is, because it's the shortest point.
01:23:02.000 Normandy is where the invasion actually was.
01:23:04.000 It's not where you'd expect.
01:23:06.000 They send Patton over to Dover.
01:23:09.000 What do they do?
01:23:10.000 They get soldiers and they make fake uniform patches for them depicting fake units.
01:23:18.000 They have these soldiers go into town and pretend that there's a massive buildup.
01:23:22.000 They start renting hotel rooms.
01:23:24.000 They start buying food.
01:23:25.000 They start sending messages.
01:23:27.000 They're going all throughout town And they act as if they're all members of various units that are there in the town, and yet none of it's real.
01:23:35.000 So the German high command is getting all these reports back saying, hey, there's a ton of stuff going on right across from Calais, from this place called Dover.
01:23:44.000 We think it's coming.
01:23:45.000 What do the Germans do?
01:23:47.000 They realize that they can't put all their eggs in one basket.
01:23:50.000 They split the forces along the North Atlantic wall, from Normandy to Calais.
01:23:56.000 The deception, and there's way more to it than this.
01:23:58.000 There's so much to get into.
01:23:59.000 They actually, they used a dead body at one point and they put like fake documents.
01:24:04.000 The British did this.
01:24:05.000 They put fake documents in the body, drop him off of the Rock of Gibraltar.
01:24:09.000 He washes up on Spain.
01:24:10.000 The Spanish find the guy.
01:24:12.000 They hand him to the German embassy.
01:24:13.000 They say, look, we've got these documents.
01:24:15.000 See, we know all about this invasion.
01:24:18.000 It's coming.
01:24:18.000 And then that makes its way up.
01:24:20.000 Hitler splits the forces.
01:24:23.000 The deception operation was so extensive that even after D-Day, They kept the forces split because they still thought another invasion was going to be coming.
01:24:36.000 They thought Normandy was a feint.
01:24:39.000 They thought that was the decoy and an even bigger one was still coming.
01:24:42.000 Wow.
01:24:43.000 It's incredible.
01:24:44.000 This is one of the greatest military deception operations that you're talking about there that's ever been conducted.
01:24:50.000 You think if Hitler wasn't a meth head that he would have taken the troops out of Calais?
01:24:54.000 Well, no, I mean, I think it's game theory, right?
01:24:56.000 I think it's game theory that if you don't have... Keep in mind, this isn't an age of satellites and, you know, thermal, you know, resonance imaging, IR, FLIR, any drones, etc.
01:25:06.000 So you really don't know.
01:25:07.000 So you're going based off of, you know, you've got the Enigma machine to send encrypted messages, but that's already broken, right?
01:25:14.000 Even though he didn't know that.
01:25:15.000 And then you've got these human intelligence reports.
01:25:19.000 So game theory suggests that if you've got what appears to be credible information of
01:25:24.000 two invasion forces, you've got to prepare for both.
01:25:27.000 All right.
01:25:29.000 Well, how about the theory that, and I understand some of this information has to be controlled,
01:25:35.000 Who do you trust to control it?
01:25:36.000 Do you trust your own government to be doing the right thing during times of war?
01:25:40.000 No.
01:25:41.000 And I'm going back to CNN.
01:25:42.000 I worked there back when it was a news organization in 1990.
01:25:45.000 Wow.
01:25:45.000 Long time ago.
01:25:48.000 And Gulf War I was happening, actually 1989, I think it was around August, when Iraq invaded Kuwait.
01:25:54.000 And we spent three years, I think, doing great coverage, but listening to, almost every day, Saddam Hussein's press guy give his view.
01:26:04.000 And I think it would be, you know, not accurate to say that the people watching CNN were convinced by Iraq's press guy to take Iraq's viewpoint.
01:26:14.000 They weren't.
01:26:15.000 But it was valuable to see How they were portraying their side and their viewpoint.
01:26:21.000 I think we have a right to see it.
01:26:22.000 I think it's in for it informs us to see and hear that.
01:26:26.000 And the notion that I've heard lately that we shouldn't even hear certain viewpoints or enemies or we wouldn't want to hear from Hitler.
01:26:32.000 I'd want to hear from Hitler.
01:26:34.000 I'm not saying that I would want to believe what Hitler says.
01:26:38.000 I would love to hear his justifications and his explanations.
01:26:42.000 I don't think that should be banned.
01:26:44.000 I mean, maybe there's some stuff that should be.
01:26:46.000 Yeah, the cuties.
01:26:47.000 The argument is cuties.
01:26:48.000 Like it's so vile that in order to just to see it is the corruption.
01:26:53.000 So, like, just to listen to Joseph Goebbels speak about the Jews was, like, enough to have a weak mind corrupted by it.
01:27:00.000 So they're like, some people can handle it.
01:27:02.000 Like you are able to look at it logically.
01:27:05.000 But it's not up to us to decide which people can handle it in my view.
01:27:10.000 Cuties actually had little girls doing things.
01:27:13.000 And you could say that what Joseph Goebbels was doing was, you know, vile, the way he was dehumanizing people.
01:27:19.000 Nazis literally killed the people.
01:27:22.000 Cuties wasn't speech.
01:27:23.000 That's a completely different thing.
01:27:24.000 I don't know.
01:27:24.000 I think a movie could be considered a form of speech.
01:27:27.000 What about an internet video of you talking?
01:27:29.000 Is that speech?
01:27:30.000 Yes.
01:27:31.000 In the film, they took little girls, these are real human beings, and they had them perform lewd dances for an extended period of time and be trained to do it.
01:27:40.000 That is not speech.
01:27:42.000 Oh, uh, well, self-expression, I'm not sure.
01:27:44.000 No, no, no, no, no, no, no, no, no, no, no, no, no, no.
01:27:46.000 It's graphic, it's imagery.
01:27:47.000 If an adult human male takes a little girl into a backroom and teaches her extensive lewd dancing... that is not speech.
01:27:54.000 Are you suggesting that imagery is not a form of speech?
01:27:56.000 Ian, what are you talking about?
01:27:57.000 Because if someone posts a picture on Twitter... Alright, let's slow down again.
01:28:01.000 A man, several of them, took little girls into a room to teach them lewd, sexualized dancing.
01:28:07.000 Wait, wait, don't forget.
01:28:07.000 They were paid.
01:28:09.000 And they were paid to do it.
01:28:11.000 That is not speech.
01:28:13.000 Yeah, but I'm not talking about that.
01:28:14.000 I'm talking about the portrayal of the movie.
01:28:16.000 Right, we are.
01:28:17.000 And that's the problem.
01:28:19.000 Ian, I know you are, but that's not what I brought up Cuties for.
01:28:22.000 It's not about that.
01:28:23.000 You cannot.
01:28:25.000 sexually abuse children to make a piece of speech.
01:28:29.000 Well, showing the movie is not the abuse.
01:28:33.000 This comes up in the child porn arguments, and this was in Ketanji Brown Jackson's stuff, is that every time you show the image, every time another person sees it, the child is then re-exploited.
01:28:44.000 Okay, so that argument could be taken to Joseph Goebbels talking about the dehumanization of Jews.
01:28:48.000 Every time you listen to him say that, he's doing something that's more than speech.
01:28:54.000 No, no, no, Ian.
01:28:55.000 This is the argument anyway.
01:28:56.000 You don't understand.
01:28:58.000 If you go out and you say, murder is wrong.
01:29:02.000 That's speech.
01:29:03.000 Okay.
01:29:04.000 If you go out, film yourself killing someone, and then show the video saying, that was wrong, what I did.
01:29:10.000 It's like, okay, that video is a depiction of someone being murdered, of you committing a crime.
01:29:15.000 That is not speech.
01:29:18.000 Journalistic integrity.
01:29:19.000 If you as a journalist saw a video of me killing someone and showed people, you wouldn't be on the hook.
01:29:23.000 That's not what we're talking about.
01:29:25.000 I'm talking about other people showing video that might be harmful to the mind.
01:29:29.000 You need to... I don't know if you need to censor it, but if you don't censor it, you can get really dangerous.
01:29:33.000 The difference between Goebbels and Cuties is that, first of all, he was a Nazi and the Nazis did those things.
01:29:39.000 But if you're referring to someone who is advocating for genocide, The question is, are they actively participating in doing it or just expressing themselves?
01:29:47.000 When you're talking about cuties, you're talking about people who actively participated in exploiting little girls.
01:29:52.000 So we're talking about genocide, just talking about it versus doing it.
01:29:54.000 I agree they're different.
01:29:55.000 How about this then?
01:29:56.000 Let's take it out of the abstract.
01:29:58.000 Facebook said that in certain areas of Europe, because of the Russia-Ukraine conflict right now, they are going to take off their normal ban on calls for violence, as long as you're calling for violence up to and including murder of anyone of Russian ethnicity.
01:30:16.000 Yeah.
01:30:17.000 Yeah, these are weapons.
01:30:18.000 These social media networks are weapons.
01:30:19.000 So they're basically saying that, no, I don't know whether or not that includes genocide, but that certainly sounds like genocide, right?
01:30:27.000 If you're, you know, basing this on, you know, someone's ethnicity.
01:30:31.000 So Facebook has come out and said, we will agree with this and we will allow this under these circumstances.
01:30:36.000 So correct me if I'm wrong, but I think that alluding to genocide is legal under our free speech laws.
01:30:42.000 But as long as it's not an imminent threat, like, on Wednesday at 2 p.m., go do that, then you're not inciting anything.
01:30:48.000 But if you're just saying, I want them to be gone, that I believe is legal.
01:30:53.000 It is legal speech.
01:30:54.000 So on Twitter, if that goes up and then it gets a million retweets, and then all these people are like, yeah, I think that too, you're creating a dangerous precedent.
01:31:01.000 Well, this was in the TED talk that Musk was doing, the guy interviewing him said, you know, some hate speech is legal, right?
01:31:08.000 I hate broccoli.
01:31:09.000 Right?
01:31:09.000 You know, I hate ice cream.
01:31:11.000 Whatever, right?
01:31:12.000 Like, you're allowed to say that you hate certain things, so you take the emotion out of it.
01:31:16.000 So Elon, I don't think will, he says he's a free speech absolutist, but I don't think Twitter will be absolutely free speech.
01:31:22.000 That's absurd.
01:31:23.000 Doxing's not going to be allowed.
01:31:24.000 And you going on Twitter and advocating for genocide, I do not believe Elon will allow it.
01:31:29.000 He's going to be like, no, because you're calling for violence.
01:31:32.000 And it's just it's on the line.
01:31:33.000 But what happens then when it comes into the Israel Palestine question?
01:31:38.000 Because Palestinians constantly say that they want to see the country of Israel wiped off the map from the river to the sea.
01:31:46.000 Yep.
01:31:46.000 Right.
01:31:46.000 And prominent blue-check journalists have said this.
01:31:50.000 I was told that that was a mistranslation, that they actually want to legally remove the borders and undo the country's formation, not kill the people and destroy it all.
01:32:00.000 This is what Ahmadinejad, I think that's his name, the president of Iran for a while, was saying, I want Israel wiped off the map, but he was saying that he wants it literally redrawn and taken away because it was done unjustly.
01:32:11.000 Not that he wanted to kill them, but the media was like, oh, he wants them wiped out?
01:32:14.000 Now let's make him a villain and say that he wants to hurt people.
01:32:16.000 I don't think there's any point in trying to dissect the Israel-Palestine conflict in the span of a few minutes.
01:32:23.000 That's too much.
01:32:24.000 You know, of course there's going to be a guy who says, this is what we really mean.
01:32:26.000 Rest assured, the Al-Qassam brigades are not agreeing with that assessment.
01:32:31.000 And when they fire a rocket out of a children's hospital at civilians in Israel, it's because they want to wipe them out.
01:32:37.000 Not just that, out of the basement of the AP's headquarters, where they were operating, and the AP had an office in Gaza City, co-located with one of these rocket bases, and never once reported on it.
01:32:53.000 So these are tough questions.
01:32:55.000 Some of these are hard calls.
01:32:58.000 But let's go back to the absurdity of what's become with social media and the internet.
01:33:02.000 You brought up Goebbels.
01:33:04.000 One of the only things I know I was banned for on Facebook was fairly recently, I posted a Goebbels quote that simply said something like, with no context, it is quite possible knowing the psychology of the audience to convince people that a square is in fact a circle.
01:33:23.000 That could mean anything.
01:33:24.000 It could be a criticism.
01:33:25.000 I knew what I was aiming it as, but I simply did a historic quote, and my account got pulled down.
01:33:33.000 And I thought it was a mistake, and there was no way to appeal, and I kept checking back.
01:33:37.000 And a week later, I'm back up, and I said, there must have been some mistake.
01:33:41.000 All I did was quote, historically, Hitler's propagandist saying that people can be convinced.
01:33:47.000 And I got banned again.
01:33:48.000 I was gone for another 10 days.
01:33:50.000 So you can't even say something like that.
01:33:53.000 Tim, how many lights are there in this room?
01:33:55.000 What are the lights?
01:33:59.000 There are four! Four lights!
01:34:01.000 Yeah, so what I do is I like to tweet, under no pretext shall the right of the people to keep and bear arms be infringed. It must be frustrated by force if necessary.
01:34:15.000 And that's from Karl Miss Jeffermarks.
01:34:20.000 Does that get you banned?
01:34:22.000 Well, I feel like if you quote the Second Amendment, you might.
01:34:24.000 But if you quote Marx, you probably won't.
01:34:26.000 So Karl Marx has the quote, under no pretext shall the workers relinquish the right to bear arms.
01:34:32.000 It should be frustrated.
01:34:33.000 Any attempt to take this, you know, weapons from the people should be... Chairman Mao, political power grows from the barrel of a gun.
01:34:38.000 Yeah.
01:34:38.000 So my point is I mix them together and I'm like, hey, look, it's not left or right.
01:34:43.000 It's liberty.
01:34:44.000 You know, but I guess the authoritarians say that, too.
01:34:46.000 And then take your guns once they gain power.
01:34:48.000 Yeah.
01:34:49.000 Alright, let's read superchats!
01:34:50.000 If you haven't already, smash that like button, subscribe to this channel, share the show with your friends, head over to TimCast.com and become a member.
01:34:58.000 Why?
01:34:59.000 As a member, we will do things like get billboards in Times Square calling out the establishment and, uh...
01:35:05.000 You know, we're working on some other campaigns.
01:35:07.000 We'll do more culture jamming as marketing.
01:35:09.000 And you will keep our journalists employed, you will help fund the show we are doing here, and you will get access to exclusive segments of this show.
01:35:16.000 We're gonna have a half an hour segment coming up at 11 p.m.
01:35:19.000 tonight over at TimCast.com.
01:35:21.000 Members only.
01:35:21.000 Don't miss it!
01:35:22.000 And let's read what you guys have to say in these Super Chats.
01:35:26.000 Alright.
01:35:26.000 GoneFall says, do chickens fart?
01:35:29.000 Technically, yes.
01:35:31.000 Uh, usually what happens is, I don't know if it's a legitimate fart, but chickens, when they poop, they're, there's like, you know, it's, it's, so it's a, it's a shart.
01:35:41.000 Oh, okay.
01:35:41.000 Because you'll hear it when they, you know, so chickens have one hole, one, well, two, their mouth, but you know, in the back end, one, even the roosters, it's called the cloaca.
01:35:51.000 So they've got the liquid stuff, the solid stuff and the eggs all going through the same place.
01:35:56.000 So the eggs often come out covered and, you know.
01:35:58.000 I think there was a kid who used to sit next to me in, like, fifth grade who had that.
01:36:02.000 Yeah.
01:36:02.000 So sometimes the chickens, the chickens will begin their squat and you'll hear a, you know, so call it whatever you want.
01:36:10.000 And it could be an egg.
01:36:11.000 It could be anything.
01:36:13.000 He would be doing that all through class.
01:36:14.000 That's what I'm saying.
01:36:15.000 The chicken's about to lay an egg.
01:36:17.000 Sometimes the chickens randomly lay eggs where they stand.
01:36:19.000 I guess it just happens.
01:36:20.000 But usually they'll go into a box and they'll get ready and they'll look stressed out and then they'll start singing.
01:36:26.000 Or they'll sing before they do it.
01:36:28.000 It's called the egg song.
01:36:31.000 All right, let's grab some more Super Chats.
01:36:34.000 Sev says, shout out to SpaceX!
01:36:35.000 The Dragon capsule is currently docking to the ISS over the Pacific.
01:36:39.000 Trunan on Jabba the Pressure!
01:36:41.000 Yes, Elon Musk also tweeted about that.
01:36:43.000 That's really cool.
01:36:44.000 I love how there was a report that came out that said on the day that he purchased Twitter, it was the same day that he has like a weekly meeting over, I think it was with Tesla over some valve issues or maybe it was SpaceX, I'm not sure which one, but they pointed out that he went to this meeting at 10 p.m., had this super high level engineering meeting about valves and they're improving the efficiency.
01:37:07.000 No one in the room mentioned anything about Twitter.
01:37:11.000 And Elon is just going way down the rabbit hole of this like highly technical process.
01:37:17.000 And folks, he's a multitasker.
01:37:19.000 He's just completely able to compartmentalize this stuff.
01:37:22.000 And so his Twitter account will literally be, you know, trolling the media, trolling the media, trolling the media, and then rocket launch.
01:37:30.000 Good for him.
01:37:37.000 And then he says, heart Andy and Lydia, a true wish of happiness and love to you both and children.
01:37:42.000 Thank you.
01:37:43.000 I mean, I don't really need to park a truck in front of WAPO or anything like that.
01:37:47.000 That might be a little much, but I was just thinking about this.
01:37:50.000 Doxing's free speech.
01:37:51.000 We could dox, if a journalist wants to publish an address of somebody and then lie about it.
01:37:59.000 Why?
01:38:00.000 So I'm mixed on that too.
01:38:02.000 I was thinking the same thing.
01:38:03.000 I'm not for it, but if these are publicly available addresses, people can type in.
01:38:08.000 I think it's unethical.
01:38:09.000 I agree.
01:38:09.000 I think it's unethical.
01:38:10.000 There's an ethics problem.
01:38:11.000 But these journalists, you know, Taylor Lorenz goes on MSNBC and she's like, doxing is wrong!
01:38:16.000 Tweets, doxing is wrong!
01:38:18.000 And then literally publishes Libs of TikTok's address and then lies about it.
01:38:23.000 Now they removed it after the fact.
01:38:24.000 I wonder if it was a mistake.
01:38:25.000 I don't know.
01:38:26.000 I'm not going to give the benefit of the doubt.
01:38:27.000 Yeah.
01:38:27.000 If you've, um, I mean, if you've committed a crime, the only time that I've ever, that I would ever even say that it's not unethical is if someone's committed a crime and you need to identify the person.
01:38:40.000 There was this case, you know, it's not even political, but out of Cedar Rapids, this Lily Peters case.
01:38:46.000 I don't know if any of you have seen it.
01:38:47.000 It's just a horrific 10-year-old girl.
01:38:50.000 And there were people who, based on Facebook, were pretty sure that they knew who the perpetrator was.
01:38:56.000 It later came out that it was the cousin, who's also underage.
01:38:59.000 Really disgusting stuff.
01:39:00.000 But the point is, that's not doxxing, because that's identifying the perpetrator.
01:39:05.000 You know, alleged, alleged, alleged perpetrator of a crime.
01:39:08.000 If it's just a suspect of a crime, is it ethical?
01:39:12.000 Ah, yeah, that's tough.
01:39:13.000 All right.
01:39:14.000 Joshua French says, Lieutenant Jack, I am an IS-1, retired.
01:39:19.000 Need to correct you from the last time you were on.
01:39:21.000 You claimed that the CPO was not an officer.
01:39:23.000 A CPO is a non-commissioned officer, and they help mold the JOs.
01:39:28.000 I think everybody knows what I was saying of the distinction between enlisted and officer.
01:39:31.000 Oh, okay.
01:39:32.000 I mean, you can be petty about it if you want, but that's a pun.
01:39:37.000 It's petty officer versus officer.
01:39:38.000 But there is a distinction within military ranks of the E ranks of enlisted all the way up to the MCPON in the Navy, and then the O ranks, which start at O-1 and then go all the way up to the CNO.
01:39:50.000 All right.
01:39:52.000 James Smith says, Ian, I am sorry for what I wrote in chat yesterday.
01:39:56.000 It's okay, Zeke.
01:39:57.000 I still love you, man.
01:39:58.000 James, James.
01:39:59.000 No problem, bro.
01:40:00.000 What did he say about you?
01:40:01.000 I don't know.
01:40:02.000 We forgive him.
01:40:03.000 I got you back, bro.
01:40:05.000 Let's grab some more Super Chats.
01:40:09.000 John Kristen says, T-shirt idea that says, freedom of speech is a musk, with cartoon musk releasing Twitter birds as doves.
01:40:16.000 It's an excellent idea, but I don't think we're allowed to use Elon Musk's likeness for merchandise.
01:40:20.000 I don't think we can do that.
01:40:24.000 We can say freedom of speech is a musk, and then show a person from behind throwing Twitter birds in the air.
01:40:30.000 Well, you better do it, because if you don't do it, someone's going to now, because that's a great slogan.
01:40:34.000 That's pretty great.
01:40:34.000 That's a really good one.
01:40:35.000 Oh, they can take it.
01:40:38.000 All right.
01:40:40.000 Amber Black says, the irony of Nina being in charge of the Ministry of Truth and being a Harry Potter fan is that, in the books, the Ministry and Voldemort censoring news and speech wasn't exactly supposed to be a good thing.
01:40:49.000 Yeah, I'm not like, I'm not a huge Harry Potter person, but I've seen, I have seen the movies and that's like a whole thing in Harry Potter, right?
01:40:56.000 It's the, what's the woman's name?
01:40:57.000 Umbridge?
01:40:58.000 Umbridge, yeah.
01:40:59.000 So she, it's like, is this a person who reads those books and just totally missed the point?
01:41:07.000 So my question actually is now, because J.K. Rowling, of course, is famously against the speech codes when it comes to the trans issue.
01:41:16.000 And so will this person who's now the head of the Ministry of Truth, as a major Harry Potter fan herself, ban J.K. Rowling because J.K. Rowling does not uphold the speech codes?
01:41:31.000 Probably.
01:41:33.000 Ben Thomas says, Tim, a UK citizen wanting to move, what state should I move to?
01:41:38.000 Georgia.
01:41:39.000 Why Georgia?
01:41:39.000 I don't know.
01:41:40.000 Some East Coast.
01:41:43.000 I mean, for a UK citizen?
01:41:44.000 You know, I mean, it depends on like, if you're looking for, you know, it depends, I guess, what part of UK you're from, right?
01:41:50.000 Because if you're from London, you might like New York because they're, you know, they're somewhat comparable.
01:41:54.000 But if you're from like, if you're from one of the more rural areas, then you're going to want a more rural area of the United States.
01:42:01.000 Oklahoma.
01:42:03.000 It's hot over there, yeah.
01:42:05.000 I was in Nebraska recently.
01:42:07.000 Really liked it.
01:42:07.000 Especially western Nebraska.
01:42:08.000 Ogallala.
01:42:09.000 It was awesome.
01:42:11.000 There's so much land in the United States.
01:42:13.000 I was in Kenosha last weekend, too.
01:42:15.000 Back to Kenosha.
01:42:16.000 That town's been through so much, man.
01:42:18.000 Lost Valley says, the left think they're Dumbledore, but in reality they're Dolores Umbridge.
01:42:22.000 Yeah, that's true.
01:42:24.000 That's what it is.
01:42:25.000 Kicked Johnny Depp out of the latest one over Amber Heard, by the way.
01:42:31.000 C Scott says, New theory.
01:42:34.000 The sudden change to the censorship is due to an insider at Twitter who changed, updated the source code.
01:42:42.000 This is why Twitter locked it down.
01:42:44.000 I disagree.
01:42:46.000 If somebody at Twitter went rogue and removed the restrictions, and so they locked it down, they would have reversed the change.
01:42:52.000 They wouldn't have been like, oh no, he did it, leave it I guess.
01:42:54.000 Maybe though.
01:42:55.000 Who's that?
01:42:59.000 Oh my goodness, you're on my Siri.
01:43:03.000 He's listening to you.
01:43:05.000 We live in a spy state.
01:43:06.000 Wow.
01:43:07.000 That's no joke either.
01:43:08.000 That has happened.
01:43:12.000 Thanks robot with no emotions.
01:43:14.000 The Chronicles of Chris says they don't have good intentions, just pretend to.
01:43:17.000 Their intentions are evil, don't ever think otherwise.
01:43:20.000 Amen.
01:43:21.000 Sometimes their intentions are good and it's even worse.
01:43:24.000 But I guess what is good and evil really?
01:43:26.000 Well, for me, it's a combination of things that make up what would be evil or good.
01:43:36.000 I think evil is a combination of things.
01:43:38.000 For these people, they are motivated by their own self-interest, motivated by their own ego, and they're willing to use force and violence and cause harm to other people.
01:43:47.000 These things combine, and I'm like, yeah, that's evil.
01:43:50.000 There's also money behind it that use those kind of people that are true believers to accomplish the goal.
01:43:56.000 There's tiers to this.
01:43:57.000 And you've outlined this in your work that, you know, there's a tier of people who are just kind of hired because they are, as you say, the true believers that they're never going to question this.
01:44:07.000 You know, this is a lot of the frontline people that you see on TV, especially at CNN.
01:44:11.000 If you try to actually talk to them about any of these issues, they would never be able to have the discussion that we're having right now.
01:44:18.000 Christopher says, yes, they are purging their code, but they are also bringing followers to make their earnings report look better to tell Elon he can't buy.
01:44:25.000 Now the problem there is, if the shareholders vote no on the sale, they have to pay Elon Musk $1 billion.
01:44:31.000 Yeah.
01:44:33.000 So I wonder how much Twitter has available and how bad that would be for Twitter.
01:44:36.000 It seems like Elon boxed them in.
01:44:38.000 There's no way out.
01:44:39.000 The shareholders are like, if we back out of the deal, we're going to have to pay Elon a billion bucks, and then that's going to cause the stock to tank.
01:44:47.000 And that comes out of operating.
01:44:49.000 Right.
01:44:49.000 But the stock won't tank just because the deal's being canceled; also, because they're losing a billion dollars and will struggle to operate, the company would implode.
01:44:57.000 They have no choice.
01:44:58.000 They have to sell.
01:44:59.000 It's amazing.
01:45:02.000 All right.
01:45:03.000 JS Fahler says, any changes Twitter makes to their source code is tracked by their version control system, GitHub.
01:45:08.000 They can mess with the history, but unless they are very careful, it will look suspicious to an expert.
01:45:12.000 I was just going to say, if someone made a change based on that theory or wrote, you know, change back to what's real, and then someone came back in and changed it back, that's going to show.
01:45:22.000 That's going to leave a fingerprint.
01:45:24.000 It'll be logged.
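Strictly speaking, Git is the version control system and GitHub is just one place it can be hosted, but the superchat's point stands: rewriting history tends to leave traces. Below is a minimal sketch of one generic heuristic, assuming only local clone access to some hypothetical repository (the path is made up, and this is not Twitter's actual tooling): commits that get amended or rebased normally pick up a fresh committer date while keeping their original author date, so a wide gap between the two is worth a closer look.

```python
# A minimal sketch of the "history rewrites leave fingerprints" idea.
# Assumptions: you have a local clone of some repo (REPO_PATH is hypothetical),
# and this is only one generic signal: rewritten commits (rebase, amend,
# filter-branch) usually get a new committer date but keep the author date.

import subprocess
from datetime import timedelta

REPO_PATH = "/path/to/some/repo"   # hypothetical local clone
GAP_THRESHOLD = timedelta(days=7)  # arbitrary cutoff for "committed long after authored"

def rewritten_looking_commits(repo_path: str):
    """Return (commit hash, gap) pairs where the committer date trails the author date."""
    out = subprocess.run(
        ["git", "-C", repo_path, "log", "--all", "--pretty=format:%H %at %ct"],
        capture_output=True, text=True, check=True,
    ).stdout
    flagged = []
    for line in out.splitlines():
        commit_hash, author_ts, committer_ts = line.split()
        gap = timedelta(seconds=int(committer_ts) - int(author_ts))
        if gap > GAP_THRESHOLD:
            flagged.append((commit_hash, gap))
    return flagged

if __name__ == "__main__":
    for commit_hash, gap in rewritten_looking_commits(REPO_PATH):
        print(f"{commit_hash} was committed {gap} after it was authored")
```

A careful rewrite can forge dates too, so in practice you would pair this with the reflog and server-side audit logs; the sketch only shows why sloppy edits tend to show up, as the panel says.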
01:45:26.000 Mikael Isaacson says, y'all are hornet.
01:45:29.000 Twitter is now in the hands of the good guys and out of the bad guys hands.
01:45:33.000 Swedish deep state, the global empire of Wallenberg is crumbling.
01:45:37.000 Is that what it is?
01:45:38.000 I'm also fighting a sneeze.
01:45:39.000 Give me a second.
01:45:40.000 There we go.
01:45:41.000 No sneeze.
01:45:42.000 Yeah, I was going to sneeze in the middle of reading.
01:45:44.000 Well, there's clearly a fight going on.
01:45:45.000 I don't know if I would be so far as to say that it's completely in the hands yet.
01:45:50.000 And that's what we're talking about.
01:45:51.000 We're also hearing these issues about a potential margin call coming in on Tesla.
01:45:56.000 Now, keep in mind, Tesla and Elon Musk's net worth is directly tied to his vast ownership of Tesla stock.
01:46:03.000 I think something like 20 percent.
01:46:05.000 So if there's a margin call on this, what does that do?
01:46:09.000 Because those shares are his collateral with Morgan Stanley.
01:46:12.000 Yep.
01:46:13.000 Why does he need collateral if he's got all that money?
01:46:15.000 I don't understand that.
01:46:18.000 He doesn't have the money.
01:46:19.000 So it's all in assets.
01:46:20.000 It's all shares.
01:46:21.000 Yeah.
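The collateral point works roughly like this: a share-backed loan comes with an agreed maximum loan-to-value ratio, and if the stock falls far enough that the ratio is breached, the lender calls for more collateral or cash. A minimal worked example with invented numbers (these are not Musk's or Morgan Stanley's actual terms, which aren't stated here):

```python
# Illustrative only: how a share-backed loan can trigger a margin call when the
# stock drops. All numbers are made up for the example and are NOT the actual
# terms of Musk's financing; the point is just that the loan is secured by shares,
# so a falling share price shrinks the collateral until more must be posted.

def margin_call_price(loan_amount: float, shares_pledged: float, max_ltv: float) -> float:
    """Share price below which the loan-to-value ratio exceeds the agreed maximum."""
    return loan_amount / (shares_pledged * max_ltv)

loan = 12.5e9    # hypothetical loan size, in dollars
shares = 60e6    # hypothetical number of pledged shares
max_ltv = 0.25   # hypothetical maximum loan-to-value ratio

trigger = margin_call_price(loan, shares, max_ltv)
print(f"Collateral call triggered if the share price falls below ${trigger:,.2f}")
```

With those assumed inputs the call triggers around $833 a share; the only takeaway is that the trigger scales with how much is borrowed against how many shares, which is why a falling Tesla price matters to the financing.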
01:46:22.000 So Grand Kai says, Tim, this is quarter two.
01:46:24.000 Them reporting in the first quarter would be Enron level.
01:46:27.000 Good point.
01:46:27.000 That was my mistake.
01:46:29.000 Yeah.
01:46:29.000 Any new users coming in would be for quarter two, not quarter one.
01:46:32.000 So quarter one's already abysmal.
01:46:36.000 All right.
01:46:38.000 PardonWill says, I created a Victorian-era tabletop.
01:46:41.000 Thought TimCast might be interested in taking a look before I publish.
01:46:44.000 Send an email to SpinTheUFO.
01:46:45.000 Love y'all.
01:46:46.000 Very cool.
01:46:47.000 I saw it.
01:46:49.000 So a lot of people are pointing out I was wrong: earnings are calculated through March for the first quarter and reported in April.
01:46:56.000 It has nothing to do with the upcoming report from Twitter.
01:46:58.000 The numbers will tell what was in March, not what's happening now.
01:47:01.000 I stand corrected.
01:47:04.000 I stand corrected.
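The correction being read is a calendar-quarter point: March activity belongs to Q1 and shows up in the late-April report, while anything happening now, in late April, only appears when Q2 is reported. A tiny sketch, assuming standard calendar quarters:

```python
# A small sketch of the quarter-boundary point, assuming standard calendar
# quarters (Jan-Mar = Q1, Apr-Jun = Q2, and so on): user activity in March lands
# in the Q1 report, while late-April activity only shows up in the Q2 report.

from datetime import date

def calendar_quarter(d: date) -> str:
    """Map a date to its calendar quarter, e.g. 2022-03-15 -> '2022 Q1'."""
    return f"{d.year} Q{(d.month - 1) // 3 + 1}"

print(calendar_quarter(date(2022, 3, 15)))   # 2022 Q1 -> covered by the April earnings report
print(calendar_quarter(date(2022, 4, 27)))   # 2022 Q2 -> reported months later
```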
01:47:06.000 Sparky says, why Elon Musk buys Twitter now? U.S. government needs Elon Musk for SpaceX because U.S. won't use Russia for space launches anymore. Otherwise, U.S. government would conspire to block his purchase of Twitter.
01:47:17.000 Ah, interesting.
01:47:18.000 That's why he's winning.
01:47:20.000 Elon, I think you really planned this out more than people realize.
01:47:24.000 Well, I think he also kind of realizes where the chess... He's fantastic at always sort of knowing when it's the right time.
01:47:30.000 His timing is impeccable.
01:47:32.000 That he understood that... So he threw Starlink up for Ukraine.
01:47:36.000 He understood that there was... Obviously, he understands the space industry, right?
01:47:40.000 That's the industry that he's in.
01:47:42.000 But it's, in essence, it's actually a very small industry, right?
01:47:45.000 There's only a few players.
01:47:46.000 You take Russia out, now suddenly the U.S. needs Elon Musk.
01:47:53.000 All right.
01:47:54.000 Ayabat says, it was just revealed this evening that Edward Snowden was one of the original creators of Zcash, a cryptocurrency that enables encrypted transactions.
01:48:02.000 I thought you should, you all should know.
01:48:03.000 Ooh, we should fact check that one.
01:48:05.000 That sounds interesting.
01:48:08.000 All right.
01:48:09.000 Pinch Me says, Tim, coffee shop name ideas.
01:48:12.000 Timoffee Pool, no, Timoffee, Pool of Caffeine, Coffee Roost, Cluckabean.
01:48:18.000 Very, very nice.
01:48:19.000 All good.
01:48:20.000 Yeah, Michael Mouse had a good idea for the Coffee Beanie.
01:48:22.000 The only problem is the coffee bean is already a chain.
01:48:25.000 We can't, it's too similar.
01:48:27.000 So, you know.
01:48:29.000 Pool of Caffeine's interesting.
01:48:30.000 That's nice.
01:48:31.000 It sounds interesting.
01:48:32.000 I don't know if it's a good name for a restaurant.
01:48:33.000 Beanie Town.
01:48:34.000 Caffeine Beanie.
01:48:35.000 Cluck-a-whatever I like.
01:48:37.000 I like that too, yeah.
01:48:38.000 Cluck-a-waka-waka.
01:48:38.000 Cluck a coffee.
01:48:40.000 Little Pressure Washing says, Tim, me, my wife, two daughters, 15 chickens, five goats, three dogs, one cat, all watch you here and are loyal members and proud to continuously support your amazing work, bro.
01:48:52.000 From Little Tails Farm.
01:48:53.000 We love you all over at Little Tails Farm.
01:48:55.000 Thank you for your support and for your funny video.
01:48:58.000 I think they're the ones who posted the video where the chicken was in the window.
01:49:01.000 That sounds right, yeah.
01:49:01.000 Yeah, they made a little chicken house and then like the camera came up and a chicken's looking out the window.
01:49:05.000 Chickens are hilarious, man.
01:49:07.000 Yeah, this looks like it's from Forbes: Edward Snowden revealed as key participant in mysterious ceremony creating $2 billion anonymous cryptocurrency.
01:49:15.000 What?
01:49:16.000 That's 5 hours ago.
01:49:19.000 NextPack says, we need people who are unbanned to start doing stuff Twitter normally bans conservatives for, i.e. learn to code, and see if it still bans them.
01:49:27.000 If not, then yes, the system has changed and they are hiding it.
01:49:31.000 And will this have a ripple effect on other social media platforms?
01:49:35.000 It may.
01:49:36.000 It's going to be... Election Wizard tried to do that.
01:49:39.000 And I forget exactly what the tweet was, but he was taken down for it. It was regarding Lia Thomas and Rachel Levine.
01:49:49.000 Now, here's what's interesting.
01:49:50.000 If you... this is something that we noted in one of my group chats, we were talking about this.
01:49:56.000 If you tweeted, right, and this is totally anecdotal, I have no idea if this is true, so don't mess with this, you know, mess with this at your own, you know, risk, whatever the, you know, at-home kids thing is.
01:50:06.000 If you just tweet about Lia Thomas, you weren't getting banned.
01:50:10.000 But if you included Rachel Levine, you were getting banned.
01:50:13.000 And why?
01:50:14.000 Because she's a member of the administration.
01:50:16.000 Yep, yep.
01:50:16.000 Very interesting.
01:50:18.000 All right.
01:50:19.000 Private A says, if you want to understand how communists are working in the U.S. and beyond, look up Counterpunch with Trevor Loudon on Epoch TV and YouTube.
01:50:27.000 He has Obama's roots, too.
01:50:29.000 Trevor Loudon is great.
01:50:30.000 We did a video together about Antifa a few years ago.
01:50:35.000 All right.
01:50:35.000 Ron Quay says, I find it hilarious that people have been asking for government help when it comes to social media bias for years with no results.
01:50:41.000 And now that Elon bought Twitter, they're jumping on the opportunity.
01:50:45.000 Babylon Bee had a headline, um, eccentric billionaire does more for freedom of speech than the Republican party has in 20 years.
01:50:52.000 Raymond G. Stanley Jr. says, lightly tap the like button for Ian.
01:50:56.000 Spin the UFO.
01:50:57.000 All right, Raymond, I'm spinning it.
01:50:59.000 I'm gonna spin it with my hands.
01:51:01.000 Ooh, that's dangerous.
01:51:02.000 You're gonna knock it off.
01:51:03.000 Look at that wobble.
01:51:06.000 We have some more of the desktop keyboard cleaner things coming.
01:51:11.000 But I got the electric ones.
01:51:14.000 That guy says, Tim, your billboard just featured on Tucker.
01:51:17.000 We saw it.
01:51:18.000 It's funny that he called you of YouTube fame.
01:51:20.000 Of YouTube.
01:51:20.000 It's like this, sometimes what you do, it's so real.
01:51:23.000 Like you, I'm saying you generally, but what we're doing here and what you're doing, it's like, they don't know how to respond to it.
01:51:28.000 So it's almost like it's this, this cognitive dissonance of like accepting that the paradigm has shifted.
01:51:33.000 I constantly get- I've been on Tucker's show.
01:51:35.000 Yeah, I constantly get Jack Posobiec of Twitter.
01:51:39.000 Of Twitter, yeah.
01:51:40.000 Like I'm actually just a human being as well, like- Do other stuff.
01:51:43.000 Of Pennsylvania, you know.
01:51:44.000 What if Tucker was like, and a billboard was put up by Tim Pool, who formerly worked for American Eagle Airlines at O'Hare.
01:51:52.000 That's true, too.
01:51:53.000 Correct, yeah.
01:51:54.000 But I don't know how it's relevant.
01:51:55.000 But I guess we're big on YouTube, so, you know, it's not incorrect.
01:51:59.000 Big in Japan.
01:52:01.000 But I suppose it's good he said it because people can search for me now.
01:52:03.000 That's true.
01:52:03.000 You know, so.
01:52:05.000 But, uh, could have had me come on the show.
01:52:09.000 That's nice.
01:52:09.000 The shout-out's good enough.
01:52:11.000 Are we gonna get Tucker on the show, man?
01:52:12.000 Is he in D.C.?
01:52:13.000 Does he do a show in D.C.?
01:52:15.000 No, no, no.
01:52:15.000 He hasn't done a show in D.C. for a while.
01:52:17.000 Really?
01:52:17.000 No, he's out.
01:52:18.000 He's long gone.
01:52:18.000 Is he in L.A. or something?
01:52:20.000 He's got a couple places where he goes.
01:52:23.000 No, it's not New York as well.
01:52:25.000 He's in a secret location.
01:52:25.000 I'd love to have him on the show.
01:52:26.000 We do know, I think a lot of people, and it is public, that he does Maine, actually.
01:52:31.000 Oh, cool.
01:52:32.000 Maine?
01:52:33.000 Build his own studio up there.
01:52:34.000 It is so nice up there.
01:52:35.000 Oh, that's awesome.
01:52:36.000 Yeah, up in Western Maine.
01:52:37.000 And that's public.
01:52:38.000 I'm not, like, revealing anything.
01:52:39.000 Ducks.
01:52:40.000 And then he's got another one, and I'll just say it's in the South.
01:52:43.000 But we do the show, uh, our show overlaps.
01:52:45.000 Exact same time.
01:52:46.000 So he's at 8 to 9, right?
01:52:47.000 Right.
01:52:48.000 We could do like a simulcast.
01:52:50.000 Yeah, that's what we did with Daily Wire a couple weeks ago.
01:52:52.000 Make the universe fold in.
01:52:52.000 That'd be so good.
01:52:53.000 That'd be really cool.
01:52:54.000 Yeah, we could prerecord with him.
01:52:55.000 Yeah, you could prerecord.
01:52:55.000 We did that with Ben.
01:52:56.000 We put it up on Sunday and it's actually one of our biggest shows of the past few months or whatever.
01:53:00.000 It's got like, I don't know, 600 or so thousand.
01:53:03.000 Yeah, 670 I think.
01:53:04.000 Yeah, that'd be great.
01:53:05.000 Yeah, we could do, we could definitely do prerecords for Sundays.
01:53:08.000 Make them longer or whatever.
01:53:11.000 All right.
01:53:11.000 Ryan Grisaf says, to quote the philosophical Beanie aficionado, quote, now they'll face the consequences they held themselves above.
01:53:21.000 This is the will of the people.
01:53:22.000 Ah, yes.
01:53:23.000 Love it.
01:53:23.000 It truly is.
01:53:25.000 All right.
01:53:26.000 Probable cause says, hey, Tim, sorry I'm late.
01:53:28.000 Here's money.
01:53:29.000 Trudeau violated the Conflict of Interest Act in 2016.
01:53:33.000 RCMP said charging a prime minister would cause damage that would outweigh the negative effects of charging an ordinary person.
01:53:40.000 So much for that higher standard.
01:53:42.000 Too big to fail.
01:53:43.000 Welcome to the real world in politics, man.
01:53:44.000 That's how they play these dirty games.
01:53:45.000 Yeah, right.
01:53:47.000 There was the thing with Jacinda Ardern as well.
01:53:49.000 There was some court decision against her today.
01:53:51.000 I need a chance to look into it, though.
01:53:53.000 Hmm.
01:53:53.000 New Zealand.
01:53:54.000 All right, we got, oh yeah, yeah, that's right.
01:53:57.000 We got a bunch of stuff coming in.
01:53:58.000 JWM says, the big spring poultry swap and farmer's market is happening this Saturday.
01:54:03.000 Just up the road from you in Sharpsburg, Maryland.
01:54:06.000 Send someone up.
01:54:07.000 You can get some great content for your chicken channel.
01:54:09.000 Ooh, that's really interesting.
01:54:11.000 Yeah, the chicken poultry swap.
01:54:13.000 Everybody wants Roberto.
01:54:14.000 I kind of want to watch that video right now.
01:54:16.000 Like, I just want to see everybody going up to the poultry swap.
01:54:20.000 You would trade because you want to switch.
01:54:22.000 I don't know exactly what they're doing, but I've had people say, hey, would you would you trade roosters?
01:54:25.000 Right.
01:54:26.000 Because and I'm like, I got too many roosters already, dude.
01:54:29.000 So I think we've got we've got a plan for Roberto.
01:54:32.000 He's actually going to be retiring not to the boys dormitory, but to a smaller chicken city to go out to the country with only a few hens and and live out his days.
01:54:42.000 That's awesome.
01:54:43.000 Yeah, retirement, retirement.
01:54:45.000 We have the boys dorm, which is only gonna be roosters because you can house roosters together as long as there's no girls, you put a girl in there and they're gonna be fighting.
01:54:52.000 But no girls, they all hang out there.
01:54:54.000 They're bros.
01:54:55.000 Yeah, that's like real life.
01:54:56.000 I'm happy for Roberto.
01:54:57.000 That's a that's that's a tough pill to swallow because he screams outside my window.
01:55:01.000 Every day.
01:55:02.000 Every day.
01:55:03.000 Because he misses you.
01:55:04.000 That's probably why.
01:55:04.000 Why Sarah still run the best for him?
01:55:07.000 You know Sarah, our Brahma, has two sons and a daughter who were just born, and we want... Brahmas are large.
01:55:12.000 So all within the first week, the Brahma baby was bigger than all the other babies.
01:55:17.000 So we're like, he's gonna get really, really big, and we're gonna have him, you know, do the thing with all the chickens, and then we're gonna have bigger and bigger chickens.
01:55:25.000 And then what we're going to do is we're going to pick the biggest rooster and the biggest hen, and we're going to have a bunch of those.
01:55:29.000 Super chickens.
01:55:30.000 Okay, I just thought you were talking about cows, and then you mixed it with the chickens, and it didn't make sense.
01:55:35.000 No, all chickens.
01:55:36.000 Brahmas are chickens.
01:55:38.000 Brahmas are big chickens.
01:55:40.000 Brahman.
01:55:40.000 And so then we're going to have, we're just going to, I'm going to try and make, you know, six foot tall chickens you can ride.
01:55:47.000 Because a chicken generation is, I think, seven months. It's fast.
01:55:52.000 It's like GMOs of chickens.
01:55:54.000 Yeah.
01:55:54.000 Yeah, basically.
01:55:56.000 Super chickens.
01:55:56.000 We'll figure it out.
01:55:58.000 Every seven months, we're going to have bigger and bigger chickens.
01:56:01.000 So maybe in 30 years, we'll be riding on chicken back.
01:56:03.000 Can you put, like, weights on the chicken so it builds muscle over its life?
01:56:08.000 Like, it's working out, you know?
01:56:09.000 I wonder.
01:56:09.000 Like, hang a five-pound weight on its back or something?
01:56:11.000 Five pounds is probably way too much.
01:56:14.000 Maybe, like, ten ounces, you know?
01:56:14.000 That's probably, yeah.
01:56:17.000 Just like those fake chicken arms.
01:56:19.000 So it's like funny, but it's also building up his muscle a little bit.
01:56:22.000 Is that abuse to do that to a non-consenting animal?
01:56:24.000 It's a good idea.
01:56:26.000 Probably.
01:56:26.000 From a non-consenting animal.
01:56:28.000 They're animals.
01:56:30.000 It is kind of crazy how quickly they're born and grow up.
01:56:32.000 Seven months until they're adults having their own kids.
01:56:34.000 Little brains.
01:56:35.000 Seven months.
01:56:36.000 Little brains.
01:56:37.000 But they dream.
01:56:38.000 Chickens have dreams.
01:56:39.000 And you can watch them dream on ChickenCityLive.com.
01:56:42.000 Oh, there it is.
01:56:43.000 Yeah, they're dreaming right now.
01:56:45.000 Matt Nill says, Tim, I live in Florida.
01:56:47.000 I listen to IRL on my drives throughout the state.
01:56:50.000 As a truck-roving fresh fish, Governor DeSantis vetoed HB 741, which would have limited the amount of power a citizen can sell back to FPL if they have solar panels.
01:57:00.000 Interesting.
01:57:01.000 Check it.
01:57:02.000 Will do.
01:57:03.000 I like the idea that if you have solar panels and you make too much energy, they kick you back some, but there's limits in most places.
01:57:09.000 Well, let me say, this is a topic of my TV show Sunday.
01:57:13.000 In California, they're supposedly reimbursing for the solar at a rate that's way out of whack with what's actually being provided, which means the electric customers are subsidizing the solar customers, which means poor people are subsidizing the people living in the new houses that are required to have solar in California.
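The cross-subsidy claim is basically arithmetic: if exported solar is credited at the full retail rate, the fixed grid costs bundled into that rate never get collected from the solar household and end up spread over everyone else. A minimal worked example with invented numbers (not California's actual rates or net-metering tariff terms):

```python
# Illustrative arithmetic for the cross-subsidy claim. All numbers are made up
# for the example and are NOT California's actual rates or NEM tariff terms; the
# point is only that crediting exports at the full retail rate leaves the fixed
# grid costs baked into that rate uncollected, shifting them to other customers.

retail_rate = 0.30      # $/kWh charged to customers (hypothetical)
grid_cost_share = 0.12  # portion of each retail kWh covering fixed grid costs (hypothetical)
exported_kwh = 400      # kWh a solar home exports in a month (hypothetical)
other_customers = 20    # non-solar households sharing the shortfall (hypothetical)

credit_at_retail = exported_kwh * retail_rate
uncollected_grid_costs = exported_kwh * grid_cost_share
per_household_shift = uncollected_grid_costs / other_customers

print(f"Solar home credited: ${credit_at_retail:.2f}")
print(f"Grid costs not collected from it: ${uncollected_grid_costs:.2f}")
print(f"Shifted to each of {other_customers} other households: ${per_household_shift:.2f}/month")
```

Whether the real numbers are as far out of whack as claimed is exactly the dispute; the sketch only shows where the shift would come from.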
01:57:33.000 Let's see, Briss Brofer says, Tim and Ian, would someone who works with federated identity protocols, such as SAML, OAuth, et cetera, and an enterprise level have a home with the TimCast team?
01:57:45.000 I suggest putting up a job board of some sorts.
01:57:47.000 I have not found one.
01:57:48.000 That is a good point.
01:57:48.000 Yeah, message me on Twitter or on Minds.
01:57:52.000 I'm actually looking for a UX developer that, and also, give me a minute.
01:57:57.000 I'm going to go grab some paperwork that I was writing earlier, and I'll let you know in a minute.
01:58:01.000 NewTekHD says, Elon's latest tweets are not free speech absolutist anymore, especially when he says he wants only 80% of the audience.
01:58:11.000 Is this controlled opposition to make sure Truth Social doesn't succeed and hide the ball when GOP investigations might start?
01:58:17.000 I don't think so.
01:58:18.000 I think Elon is a troublemaker.
01:58:22.000 Simple solutions.
01:58:23.000 Elon got mad.
01:58:24.000 He was friends with the guys at Babylon Bee.
01:58:25.000 He went on their show.
01:58:26.000 He's a fan of the Babylon Bee, and he has the ability to do these things, and he did it.
01:58:31.000 I think it's just crazy stuff happening.
01:58:35.000 It is kind of crazy, the assumption that Elon, because he's super rich, must be in some cabal or something, but you'd be surprised, man.
01:58:42.000 I've often said it, like, how is this show allowed to be successful?
01:58:44.000 Well, it's because the establishment isn't as powerful as people think they are.
01:58:48.000 So we can do it.
01:58:50.000 It's nodes.
01:58:51.000 I mean, there's nodes to it.
01:58:52.000 A lot of people mentioning Tucker Carlson shouting us out.
01:58:55.000 Very, very awesome to hear.
01:58:59.000 RoboCheezits with a huge super chat saying, I noticed a historical error on the show a while back, which is FDR knew about Pearl Harbor.
01:59:07.000 This is false because of the Japanese invasion of the Philippines, which would cause a war anyway.
01:59:11.000 And it does not make sense to lose your main battleship fleet right before a war with the naval power.
01:59:15.000 I agree with that.
01:59:16.000 I've heard these claims that's like, oh, they knew and they let it happen.
01:59:19.000 And I'm like, yeah, I've heard these theories.
01:59:20.000 I think you know, maybe they thought, maybe they heard, but I don't,
01:59:24.000 I, simple solutions, man.
01:59:25.000 Well, part of, part of it is also, so there were indications that an invasion was coming.
01:59:30.000 One of the issues though, and so the N2 for PACOM at the time was this guy by the name of Commander Layton, who later became Admiral Layton. And the issue wasn't that, like, obviously, we knew the Japanese fleet was no longer in Tokyo, right?
01:59:46.000 You know, this is something that I learned when I was Navy Intel training.
01:59:48.000 So we knew, obviously, we knew they had left, like, we're not that bad.
01:59:53.000 But the issue was, they weren't sure where the attack would be.
01:59:55.000 Was it going to be Hawaii?
01:59:56.000 Was it going to be Hong Kong?
01:59:58.000 Was it going to be the Philippines?
01:59:59.000 They knew those were the big three.
02:00:01.000 Well, what people didn't, what they didn't foresee, and this is what Layton had said, but they didn't listen to him, was that it was actually all three at once.
02:00:08.000 Massive blitzkrieg.
02:00:09.000 And they said, Oh, they thought they're never going to attack US territory.
02:00:12.000 There's no way that's beyond their capabilities, etc.
02:00:15.000 And keep in mind, prior to then, because aircraft carriers were kind of the big thing, or excuse me, kind of the new thing, all of naval combat was battleship-based.
02:00:28.000 Of course, we lost a lot of our battleships in that, but our carriers were out.
02:00:31.000 They used their carriers to great effect, but in doing so, that made U.S. Naval Command have to put more emphasis on the carriers and really create that carrier doctrine, which is what ended up winning the war in the Pacific.
02:00:46.000 Let's grab one more.
02:00:46.000 We got Legomathagayan saying, Ian, I speak Persian.
02:00:50.000 Ahmadinejad said, obliterated from the page of time, not wiped off the map.
02:00:55.000 The suggestion he was mischaracterized is widespread but a bald-faced lie.
02:00:59.000 Wow.
02:01:02.000 It's still a metaphor, obliterated from the page of time.
02:01:04.000 Like, what does that mean?
02:01:06.000 Erase it from the history books.
02:01:07.000 He didn't say let's burn people's bodies.
02:01:10.000 He wasn't like specifically talking about killing.
02:01:12.000 He said obliterated.
02:01:13.000 Yeah, he wanted it gone.
02:01:15.000 Like, obliterated.
02:01:16.000 Ladies and gentlemen, if you haven't already, please smash that like button.
02:01:19.000 Do it for Ian.
02:01:20.000 You heard him.
02:01:21.000 He needs those likes.
02:01:22.000 You know what I'm going to do with those likes?
02:01:24.000 I'm going to hire some developers and make the best technology on earth.
02:01:28.000 All right.
02:01:29.000 Smash the like button, subscribe to this channel, share the show with your friends, and head over to TimCast.com.
02:01:33.000 Become a member.
02:01:34.000 We're gonna have that members-only show coming up around 11 or so p.m.
02:01:37.000 tonight.
02:01:37.000 They're Monday through Thursday.
02:01:39.000 Don't forget to follow the show at TimCast IRL, basically.
02:01:41.000 Everywhere.
02:01:42.000 You can follow me personally at TimCast.
02:01:44.000 If you want to see me posting weird nonsense on Instagram or Twitter, follow me.
02:01:49.000 Sharyl, do you want to shout anything out?
02:01:52.000 Well, it was great to be here.
02:01:54.000 I'm so sorry I sounded like this, because I have a lot to say normally.
02:01:58.000 And thanks for having me.
02:02:01.000 Do you want to mention your Twitter or anything?
02:02:04.000 Sure.
02:02:04.000 Sharyl Attkisson, at Sharyl Attkisson, S-H-A-R-Y-L, A-T-T-K-I-S-S-O-N.
02:02:11.000 I have a Sunday TV show that feeds to 43 million households across the country on all kinds of affiliates, ABC, NBC, CBS.
02:02:18.000 And I try to cross-post everything on my website.
02:02:21.000 SharylAttkisson.com.
02:02:23.000 All right.
02:02:23.000 Thanks.
02:02:25.000 Very cool.
02:02:26.000 Human Events Daily, go check it out.
02:02:28.000 It's a podcast for people who don't like podcasts because we give you everything in 25 minutes or less.
02:02:34.000 I would be remiss if I didn't mention, go and actually look at the statements that Dr. Oz made in the debate in Pennsylvania this week.
02:02:41.000 My home commonwealth, this guy is lying to you.
02:02:44.000 He is a far left liberal.
02:02:46.000 This is a vanity project for him.
02:02:48.000 And ladies and gentlemen, look, it's simple.
02:02:51.000 Just buy the pillow.
02:02:52.000 Buy the pillow.
02:02:53.000 Just buy it.
02:02:54.000 What pillow?
02:02:54.000 Just buy the MyPillow.
02:02:56.000 You know you want one.
02:02:57.000 Just do it.
02:02:58.000 How would they do that if they were going to do that?
02:02:59.000 You know what I want to do?
02:03:00.000 Buy the MyPillow.
02:03:01.000 I'm not kidding.
02:03:01.000 I'm going to buy like 300 and I'm going to fill a room with them.
02:03:04.000 Let's do it.
02:03:04.000 And then, I mean, we actually should do that.
02:03:07.000 We should film it.
02:03:07.000 I'll put one behind my head.
02:03:09.000 It'll be the MyPillow.
02:03:10.000 So that'll be like when a guest gets too unruly, they get sent to the MyPillow room.
02:03:14.000 No, it's the room back here.
02:03:16.000 And then it's all padded, like a padded room.
02:03:19.000 One of our offices now is kind of vacant.
02:03:22.000 Like, legit, let's line the walls with MyPillows and then put like a hundred MyPillows on the floor.
02:03:27.000 We'll put cameras in it and push someone in there and just lock them in there.
02:03:31.000 And then that'll be a live cam too.
02:03:33.000 Actually, do you think Mike would sponsor this?
02:03:38.000 So we just built a three foot launch ramp and we've got a seven foot quarter pipe.
02:03:44.000 Do you think he would sponsor sending us a bunch of pillows for our foam pit?
02:03:48.000 That might take some explaining.
02:03:50.000 Yo, we're planning on building a foam pit at our new facility.
02:03:55.000 So the idea is we're gonna have a stage.
02:03:57.000 Well, that is the pillows that's interlocking foam.
02:03:59.000 Right.
02:03:59.000 So the top layer of the stage will fold up against the wall and it'll expose a foam pit so you can launch and do flips and land in the foam pit.
02:04:07.000 I would be so down to make the foam pit just a bunch of MyPillows.
02:04:09.000 We've got to call it the pillow pit.
02:04:11.000 The pillow pit.
02:04:12.000 Yeah.
02:04:12.000 The MyPillow pit.
02:04:13.000 We'll put MyPillow on the side of it.
02:04:14.000 The MyPillow pillow pit.
02:04:15.000 The MyPillow pillow pit.
02:04:17.000 That's actually not bad.
02:04:19.000 But that only works if you go to MyPillow.com and utilize promo code POSO for up to 65% off and sleep like Joe Biden sleeps through an international crisis.
02:04:27.000 That's promo code POSO.
02:04:28.000 P-O-S-O.
02:04:30.000 Just to be clear.
02:04:31.000 All day, every day, and twice on Sundays.
02:04:32.000 So we're gonna have 31 foot high ceilings.
02:04:35.000 And you're gonna be able to, like on the top of the studio, because the studio's gonna be second floor, there's gonna be a third floor.
02:04:40.000 It would be super cool to do a massive pile of my pillows and have someone jump, like get a pro to like jump off the third floor and land in the pillows.
02:04:47.000 You'd have to get the highest firmness of the pillows.
02:04:49.000 It could be done.
02:04:50.000 We'll do it.
02:04:51.000 We'll do it.
02:04:51.000 All right.
02:04:52.000 I'm excited.
02:04:53.000 Hey guys, Ian Crossland from iancrossland.net.
02:04:55.000 If you want to get in touch, and I am looking for a couple of developers where we have this open source project that we've been working on, a charity that we're building right now, and we need a couple of pieces.
02:05:03.000 I need, like I said earlier, a UX developer that wants to commit some time over the next couple of months.
02:05:10.000 A donation.
02:05:11.000 Donate your time.
02:05:11.000 It'd be great.
02:05:12.000 Get in touch with me on Twitter or on Minds.
02:05:14.000 And also I'm looking for a project manager that's familiar with Trello, Jira, Nextcloud, open source project management software.
02:05:21.000 So if that's something you want to do and you want to get involved with us, contact me on Minds or Twitter.
02:05:25.000 And I'll see you guys next time.
02:05:27.000 So fun having Sharyl.
02:05:29.000 I really wish she'd been able to speak more.
02:05:30.000 We just have to have you back again.
02:05:32.000 That's just the bottom line.
02:05:33.000 You live close enough.
02:05:34.000 Come on, let's do it again next week.
02:05:36.000 I'm just kidding.
02:05:36.000 We'll make it happen, though, for sure.
02:05:38.000 You guys may follow me on Twitter and at Minds.com at SourPatchLyds, or at sourpatchlyds.me.
02:05:44.000 We will see you all over at TimCast.com for the member segment, but don't forget to check out YouTube.com slash Chicken City.
02:05:51.000 Subscribe and literally just watch chickens.
02:05:54.000 It's relaxing though.
02:05:54.000 It's like nature sounds and you can feed the chickens.
02:05:57.000 And we'll see you all over at TimCast.com.
02:05:59.000 Thanks for hanging out.