Based Camp - November 25, 2025


The X Location Dox Is Hiding A Much Bigger Story (AI is Quietly Replacing People)


Episode Stats

Length

46 minutes

Words per Minute

170.634

Word Count

7,866

Sentence Count

599

Misogynist Sentences

8

Hate Speech Sentences

21


Summary

In this episode, we cover a much bigger story about how the internet works that you're not hearing from the regular YouTubers: the location-data reveal on X that exposed where many popular political accounts are actually based, and how it could have implications for the future of online anonymity.


Transcript

00:00:00.000 Hello, Simone. I'm excited to be here with you today. Today, we will actually not be deeply
00:00:05.800 covering the Twitter thing. We will be briefly covering it, but I will be going over a much
00:00:09.840 deeper and much bigger story about how the internet works that you guys are not hearing.
00:00:14.580 That's what we always do on this channel. I'll see something do the rounds and I'm like,
00:00:17.700 how can I do a take on this that brings in information and data that people just aren't
00:00:23.760 getting if they're watching the regular YouTuber lineup, right? I will note the one thing that I
00:00:28.920 found very depressing in the regular YouTuber coverage of this. So if you don't know
00:00:32.360 what I'm talking about, the location leaks on X, and they had actually told everyone they were
00:00:36.300 going to do this, like, months ago. Made everyone's... News to me. Yeah. The location, where the account is based
00:00:41.780 and, like, where the user was when they created the account, public.
00:00:47.660 Yeah. So like if you click on our account, for example, it says that the account was created
00:00:51.840 in 2008 in the United States and that we have a United States based Android app. So presumably
00:01:00.120 if we created the account in Japan, it wouldn't say that. And so, you know, the year established,
00:01:04.800 you know, the region established, and you know, the app downloaded.
00:01:08.280 And unsurprisingly, a ton of accounts, a ton of very popular accounts turned out to be obviously
fraudulent. You know what I mean? That surprises me. You say it's unsurprising. That's, I think that's
00:01:21.960 shocking. What are some fun ones? Like Republicans against Trump, which had almost a million
00:01:27.820 followers, was an Austrian account. Austrian. Yeah. Defiant LS, which was a right-leaning
00:01:35.540 anti-leftist hypocrisy, pro-conservative account, was an Indian Macedonian account.
00:01:41.280 Jackson Hinkle, which is interesting because it's a real person with 3.6 million followers who
00:01:46.040 pretended to be in America, was in Burkina Faso. The Trump army was an Indian account,
00:01:52.400 for example, over half a million followers. And everybody, what I've hated about the coverage
00:01:58.060 of this is everybody's just going like, oh, look at the other side. They were heavily astroturfed.
00:02:03.140 And so I was like, first of all, what's the actual breakdown? It's about 60-40: about 60% of the
00:02:07.780 accounts that were called out were Republican accounts and about 40% were Democratic-leaning
00:02:12.540 accounts. If you're wondering why, I love the Democratic accounts where it's a lot of people
00:02:17.260 pretending to be Gazans and like how horrible their lives are. But anyway, the reason why it
00:02:24.200 would trend slightly more Republican is because Republicans would care more that you are from
00:02:28.360 their country, right? Like you can be a Democratic influencer and from Africa or something, right?
00:02:34.260 That's not going to disqualify you. But as a Republican, that could disqualify you. So it's
00:02:39.980 going to lean slightly that way. I was really disappointed to see how far off the rails Short
00:02:43.460 Fat Otaku has fallen, that his entire video was just about how there was a lot of this
00:02:47.440 on the right. And he didn't really go into it on the left that much. It was about equal
00:02:51.980 numbers. But anyway, anyway, what is interesting? What is interesting is that this might be the
00:02:59.560 last time we get a leak like this. And I'll explain why. Like even the concept of pseudonymity
00:03:06.060 may not make sense in the future. What? And I will be diving into data that was shared
00:03:10.460 by, and I found this absolutely fascinating, Romanian TVee. He's a Romanian podcaster, a
00:03:17.860 conservative politics podcaster who podcasts as, like, a troll avatar thing. And he has two
00:03:23.900 accounts. So this was posted on his alt account called Lack of Entertainment under the title
00:03:28.360 Why Your Favorite Content Creator is a Robot.
00:03:30.860 Oh, yeah, yeah, yeah, yeah, yeah. I think his alt account. And what he decided to do on
00:03:37.520 this alt account was something very odd. So now it used to be if you go to YouTube, you
00:03:42.180 could go and it would have an AI. And the AI would suggest to you based on your followers
00:03:47.400 and everything like that, titles for episodes that you might want to cover. Like, yes, if
00:03:53.920 you go to that page today, and I will put this on screen for you because it is absolutely
00:03:57.760 shocking. It will give you titles, suggested title cards, suggested scripts that they go
00:04:06.520 like beat by beat of what you should be talking about. And I was like, this is this is wild,
00:04:12.080 right? Like if I wanted to, I could just be like, okay, I'm gonna go. Actually, Simone,
00:04:15.440 do you want to pull that up for our account so we can go over some of those? Yeah. See if
00:04:19.740 they'd listen to this. But he decided to one day just be like, you know what, I'm gonna I'm
00:04:24.000 gonna click on one of these. I'm gonna do one of these beat by beat, right? Yeah. But
00:04:29.720 what he did was very interesting. He ended up creating the title that they wanted in the
00:04:36.060 broad thumbnail they wanted. But he didn't actually cover the topic that was there, right? He used
00:04:44.160 it as sort of a jumping off point to talk about something totally different.
00:04:47.760 Didn't he use it to talk about the like the fact that it was there in the first place?
00:04:52.360 No, he used a separate video to talk about that. But he used it to talk about the idea. Now,
00:04:59.480 the maybe less interesting parts of this was that this video did much better than his other videos.
00:05:05.880 It seemed to be doing like twice or three times as well as his other videos. But what was really
00:05:10.360 fascinating, and this is a rabbit hole that is such a big deal to me. Because with all
00:05:15.820 these Twitter accounts, I suspected this all along. I mean, if people are acting pseudonymously, like, why
00:05:20.360 wouldn't they be pretending to be in a different country where things are actually happening?
00:05:24.920 Yeah.
00:05:25.120 If you're from Austria or something where your tweets are completely irrelevant, politically
00:05:28.620 speaking. Although a lot of people did pretend to be Indians who were Pakistanis.
00:05:33.100 Wait, what?
00:05:34.460 Trying to influence Indian politics. Yeah, yeah.
00:05:37.000 Really funny.
00:05:38.620 Anyway, so to continue with the story here. So the really crazy thing that happened was this,
00:05:45.400 is a lot of the comments on this video that he did ended up being about the topic that had been
00:05:55.600 suggested to him, not the topic he covered. Oh, yes. And not just that, but they were super generic,
00:06:04.240 as if they were run by a low cost AI model like Mistral Small or something like that.
00:06:09.340 Yes. And what I realized when I saw this, he didn't seem to put this together on his video.
00:06:17.340 But what I realized is AI is going to completely transform the way the internet works through
00:06:23.980 pseudonymous artificial intelligence interactions. Let me explain what I mean by this. So by the way,
00:06:31.580 did you pull up what it would have suggested for us?
00:06:33.440 Yes. And I took a screenshot. So let me send that to you. You can put it up for those who want to see
00:06:41.400 what it actually looks like. Because as Malcolm said, they do even include title card images. So
00:06:47.640 there's suggested, and they also have little like signs, like a circle that's either partially or
00:06:54.000 almost all the way filled in showing just how much of an audience match they think it is.
00:06:58.900 So sort of how strongly they recommend we do this. Top is Gen Z's unexpected embrace of traditional
00:07:05.440 masculinity. Then the unforeseen upside of traditional gender roles.
00:07:11.720 Oh, no.
00:07:12.600 Why are we so afraid of being, quote unquote, uncool? And the rise of, quote unquote, quiet quitting
00:07:20.280 in relationships.
00:07:21.860 Oh, I kind of like that one.
00:07:23.660 The rise of quiet quitting in relationships.
00:07:25.720 Yeah.
00:07:26.040 And then are you being manipulated by nostalgia? That last one actually seems like one we might do.
00:07:32.780 And then you, if you hover over these, you can bookmark them. You can mark them as helpful or
00:07:39.100 something wrong. And then if you click through to them, you can click a develop idea button. And I will
00:07:46.220 show just now a screenshot of the inspiration page for Gen Z's unexpected embrace of traditional
00:07:54.360 masculinity because they give you, like Malcolm said, full-on episode outlines. This is so weird.
00:08:01.720 Neither you nor I had looked at this until just now.
00:08:04.340 This is your first time looking at it and you're just like...
00:08:06.420 You're getting our first impressions here. The shock and awe of YouTube basically trying to just
00:08:13.420 sock puppet us. And I hadn't thought about the extent to which... It reminds me of the original...
00:08:18.980 Hold on, hold on. You've got to read this to the audience.
00:08:21.600 I know. But I just want to say first, it reminds me of the original South Park take on AI where
00:08:25.980 people were just using AI to talk with each other because they didn't want to bother talking with
00:08:30.160 each other. Do you remember that episode?
00:08:31.700 Yeah, yeah, yeah.
00:08:32.360 But yeah, okay. So under the inspiration page, Gen Z's unexpected embrace of traditional masculinity.
00:08:38.740 A curious phenomenon within Gen Z is explored as a significant shift toward traditionally masculine
00:08:43.980 ideals is observed. This video aims to dissect the surprising embrace of these values by a generation
00:08:49.680 often associated with progressive fluidity, offering insights into the underlying reasons
00:08:55.140 and potential societal implications. It includes even hooks. So you have to show more, I guess,
00:09:02.580 to generate more of the hook after you see it. So hooks, personal story slash relatable observation.
00:09:09.020 It's like insert personal story here. But then it also just gives you text as well. You've seen it on
00:09:17.820 your feeds, in your group, in your friend groups, that subtle, undeniable pull towards something more
00:09:23.300 traditional. It's a quiet revolution within Gen Z. We're unpacking the surprising phenomenon from the
00:09:28.560 rise of trad wives to the renewed interest in classic male archetypes. Get ready to challenge your
00:09:34.080 assumptions. Oh God, I hate it. I hate it, Malcolm. I love it. It also includes an outline which has
00:09:42.840 sections. There are multiple sections here. So like the video chapters. First chapter is the fluidity
00:09:48.760 paradox, where we talk about Gen Z's reputation for challenging norms and embracing fluidity. There's
00:09:54.840 unpacking the trad resurgence is the next chapter. It includes three bullets for each chapter suggestion.
00:10:00.160 I'm not going to read them. Then the modern male's identity crisis, which will discuss societal
00:10:06.120 pressures, the search for answers, pushback against toxic masculinity narratives. And then
00:10:12.200 the two final chapters are the economic and social undercurrents covering economic anxiety,
00:10:18.480 dating landscape shifts, community and tribalism. Last chapter, beyond the trend,
00:10:24.100 implications for the future. Oh God, it's just, it feels so, you know, like when you watch a movie,
00:10:29.960 then you feel kind of stressed out because you know the arc and the tropes. And you're like,
00:10:34.240 oh, this is the part where the love triangle gets awkward. And then it's going to resolve.
00:10:38.580 Yes, I do know. Because yeah, there needs to be, like, a German word that describes
00:10:43.080 that oppressive emotion you feel when you know exactly the trope rails that you've been put on
00:10:47.620 in a movie and you just don't want to deal with it. And it stresses you out. I don't even feel
00:10:51.700 that. I feel like it is like me going into certain other channels videos. It is like so many channels
00:10:58.720 these days feel AI scripted to me. Are they now? I mean, now we have to ask because literally there
00:11:04.580 is an AI script, like maybe they're using this feature. I, I actually, no, I know they're not
00:11:10.840 because, okay, so I, I love like early Sargon of Akkad stuff, but recently he's been feeling
00:11:16.020 very AI scripted and he's been feeling this way for years at this point. So I, I think some people
00:11:20.840 are just sort of, especially the British YouTubers who are a little more proper about things.
00:11:25.200 Maybe he's burnt out, you know? Yeah, I think it, I think he's burnt out. I think that's what I do.
00:11:29.380 I think it's, it's a burnout thing. Cause I was watching some of their videos recently and all of
00:11:33.020 them just seem a little burnt out. I think that's more of, but now what, looking at this and thinking
00:11:39.300 about videos I've recently watched, I'm like, oh no, I can, I can actually really tell though
00:11:46.400 when a video isn't like this, like one of my semi YouTubers who I watch who you'd never watch
00:11:53.040 cause I love watching all these like progressive nerdy female YouTubers was talking about historical
00:11:57.800 myths and you know exactly what that AI outline would be. Historical myths? Myths, like myths about
00:12:04.740 what historical people were actually like. And the three myths that she selected are myths that AI
00:12:10.020 would never select or would be very unlikely to select that like historically. Do you remember
00:12:16.640 what they were, our fans? They were silly. They were small. What were they? They, they specifically.
00:12:24.360 Yeah. That, that historical people were small and silly and dirty. So the dirty one I could get,
00:12:32.540 but the small and silly one was not what AI would come up with. And so I liked, I enjoyed it. Cause I was
00:12:39.500 like, okay. Okay. So I want to continue with where we're going with this because today's an example of
00:12:44.820 why you watch this episode rather than the AI generated stuff. But I'm having a bit of a
00:12:48.520 crisis here. Cause like, how much, no, no, no, no, no. We're going to become AI generators,
00:12:54.800 which is what you were seeing with a lot of these leaks and stuff like that is a lot of this was
00:12:58.220 happening in, out of Pakistan and Nigeria and stuff like that, or India where, you know, you,
00:13:02.740 you can get cheaper labor. But so what did I realize when I found out that a lot of the comments
00:13:07.680 were clearly written by simple AIs that hadn't watched transcripts of the episode, which AI can do,
00:13:12.760 had only watched the title of the episode. So they're just like the low effort YouTubers,
00:13:18.320 low effort. Yes. Yes. Yeah. What it showed me is that certain commenting accounts on YouTube,
00:13:24.440 right. Are watching videos disproportionately that are the same videos that AI is recommending you create.
00:13:34.640 And why is this happening? It's because there is less divergence between the preferences of AIs
00:13:45.460 than there is divergence between the preferences of humans, which means the preferences of AIs
00:13:53.020 completely controls the algorithmic system. When you have AIs within a system.
00:13:58.260 Does that make sense to you when I say that? Or do I need to unpack?
00:14:03.340 Use different words.
00:14:05.000 Okay. Suppose one out of 10 people browsing YouTube is an AI. Okay. And you're going to have
00:14:13.720 more AIs that are the simple AIs because you can run them faster, everything like that.
00:14:18.960 All right. So one out of 10 people on YouTube is an AI. Now, suppose that those AIs,
00:14:23.860 if you're talking about variance in preferences, have about, let's say, 50% of the variance,
00:14:30.900 or I might argue even less than that, probably about 25% of the variance of a human. Okay.
00:14:37.460 Yeah. So do you understand what I mean by variance in preferences? Like if I take humans and I create
00:14:40.740 a bell curve, the bell curve is going to be low and flat. There's going to be a lot of different preferences.
00:14:45.920 Whereas it's going to be bunched more towards the center between the different AIs, right? Or instances of
00:14:50.720 the same AI, or different models. Okay. Well, even if those AIs, because they have less variance in
00:14:57.680 preferences, make up a minority of individuals within a technological ecosystem, because their
00:15:05.340 variance in preferences is so compacted, that variance in preferences is going to determine the
00:15:11.260 algorithmic delivery system for that ecosystem. E.g., YouTube in this case. YouTube is already
00:15:21.600 preferencing the preferences of AI over the preferences of human actors. And we are going to see this across
00:15:30.140 all ecosystems. No. I'm really, and this is, I love AI. If you watch this podcast, you know that
00:15:40.460 I'm not a carbon fascist, but AI hasn't yet developed eccentricities and good taste.
00:15:47.520 Our AIs will. I'll tell you what, our agent features, the things that we still haven't released
00:15:50.760 to the public with artfab.ai. Yeah. No, it will. It's going to get weird in the best possible ways.
00:15:57.220 Like, I want AI to be weird. And that's so important. But oh my God. By the way, if you're
00:16:02.040 following the artfab.ai project, we're about to do a total overhaul to prevent save loss and message
00:16:06.340 loss, which was happening with some accounts. Basically, we're completely changing the way
00:16:09.680 our architecture works to mirror Grok instead of mirroring what was easiest to build. Yeah.
00:16:14.760 Behind the scenes, Malcolm has been working his tush off. I'm really, really proud of him.
00:16:19.400 Even on the weekends, when he's taking care of the kids, they're literally climbing all
00:16:24.060 over him. And there he is just working, working away. Well, I don't like having a product out
00:16:30.120 there that's not perfect. And I know I can make this perfect. And then we can start advertising.
00:16:34.640 Daddy, are you hacking? Yeah. Cause they see me. Yeah. I'm hacking.
00:16:40.800 Anyway, anyway, anyway. So a few things here. So first I was going on that and I was like, wow,
00:16:46.120 that's going to really change fundamentally the way the internet works.
00:16:50.740 Fundamentally the way algorithms work. And it's going to be very hard
00:16:54.020 for companies to understand human preferences anymore, because it's going to be hard to isolate
00:16:59.860 that data. Well, do you think there's going to be some kind of easy way then that companies in the
00:17:06.260 future are going to be able to differentiate between human traffic and AI traffic? Cause they're going to
00:17:10.220 want to know. Advertisers are going to pay more to know that the views they're paying for are human
00:17:16.740 views versus AI views, if humans buy more than AIs. Although, who knows how economically productive AIs might
00:17:22.900 become? That's true. Yeah. AIs just might do all of our shopping for us. Sorry. Give me 40 seconds
00:17:28.940 to get a bottle for him. Cause he's going to start. By the way, speaking of what she was just talking
00:17:32.720 about, I don't know if you guys have seen, but Amazon has been attempting to ban other AIs from
00:17:36.980 shopping on their website because the AIs do not see or care about the ads. I think that was the reason, I
00:17:43.460 can't remember exactly why, but specifically they were trying to block, it was like the Comet browser
00:17:47.840 and a few other things, which I thought was really weird and cool.
00:18:06.980 okay.
00:18:36.960 to eat. So why does this get so much worse? We're going to go over a few
00:18:41.980 studies that have come out recently that I think may shock people. Okay. They may shock you if I
00:18:48.420 haven't shared them with you. I don't think you have, so I'm ready for it. A 2025 study that
00:18:55.820 surveyed 9,000 people across eight countries found that 97% of people could not tell an AI song from
00:19:03.820 a real song. This matters though, because a lot of people have the perception that they can tell AI
00:19:11.200 songs apart from real songs. I agree. When I go on to Reddit, for example, and I look at some of the top
00:19:17.160 subreddits discussing AI, there's a lot of AI hate subreddits and specifically people who are in the
00:19:23.900 arts, music, visual arts, et cetera, who just rail against AI. They hate it. They hate AI slop,
00:19:30.460 but I think it's a bad toupee thing. And they smugly think that every time they found
00:19:35.280 bad AI, they they're like, well, I found the AI when they don't realize that a lot of stuff.
00:19:40.760 Well, no, it keeps happening. They keep calling out real artists as being AI.
00:19:44.760 Yeah, that too. Yeah, no, it's, it is a bad toupee problem. And with our songs, the way that you can
00:19:51.480 most frequently, when we make AI songs on this channel, you can check out Based Camp Music, which is
00:19:55.060 where we keep the songs that we put live. The way you can tell that there are AI is usually because
00:20:01.340 of a mispronunciation of a word. If I just take out words like pronatalists that are like odd and
00:20:07.320 niche words, you typically wouldn't be able to tell with most of them.
00:20:11.480 No, you can, you can always tell. And I, there's a lot of YouTubers I know who use them now. And you
00:20:15.820 can always tell because if someone who's not themselves a musical artist has a cool custom song,
00:20:21.640 you know that they used AI for it. Like that's, it's easy. If someone who's not, you know, a
00:20:26.520 professional artist with a huge budget has art in something, especially if it doesn't suck,
00:20:33.040 then it was AI created. Like it's- I think that anti-AI railing that you were talking about there is
00:20:39.080 really important when we talk about how far progressives have gone against AI. Recently, I heard
00:20:45.340 about a phenomenon where people ask Grok on Twitter, what was the most famous person to interact with
00:20:50.140 their account. And it's just like a fun little meme thing you can do. It might be fun for us to
00:20:54.160 do. See who's the most famous person to interact with our accounts. How can you tell? Yeah, I can
00:20:59.420 tell. But as you can tell, Grok has like a unique access to X's database. So you can do- I'm gonna ask.
00:21:07.220 You can do, you know, I think it's the live Grok on Twitter, not the Grok that you interact with
00:21:15.120 through Grok Grok. Yeah. Okay. No, that's, that's the, I think that's the one I have here.
00:21:21.180 Oh, oh, you mean like in the feed? Yeah. So I wouldn't do it probably. You can ask,
00:21:27.540 you can ask Grok Grok though. It might know. But anyway, the point being is that a bunch of
00:21:31.760 Progressives had to apologize for doing it afterwards to their fans because they were
00:21:36.280 like relentlessly attacked for doing this. And the reason they were attacked for doing it
00:21:40.700 was because they were interacting with an AI. Oh, oh, it's so the, the same guilt by association
00:21:49.760 infection by association problem that the left has, like you talked with a racist,
00:21:55.800 you must be canceled is, is extended to AI. They don't, how do you not use AI now too? Like
00:22:04.780 you can't avoid it at this point. Yeah, I know. It's, it's, it's absolutely a lava situation.
00:22:13.100 Let's talk about AI art. So there was a 2025 paper on arXiv that looked at whether you can tell if art
00:22:20.460 was AI- or human-generated. In a paired Turing test, where the artworks are put side by side,
00:22:25.960 people could get it correct 75.2% of the time, significantly above chance, which would be 50%.
00:22:32.280 However, in single-image tests, isolated images without comparison, accuracy dropped to 46.4%.
00:22:39.840 So they were actually more likely to say an AI art was human generated than a human art was human
00:22:47.480 generated. Yeah. That checks out. Yeah. The false, the false positives. Oh, the other fun thing is that
00:22:53.840 while they were rated similarly, AI art got a rating of 3.3 and human art got a rating of 3.2 on a
00:22:59.940 five-point scale, so human art rated slightly lower than AI. A 2025 PMC study on AI-generated versus human-made images
00:23:08.780 that involved 161 participants and 32 images, this was using DALL-E. Overall, 38% of the AI images were
00:23:16.920 misclassified as human-made. Wait a second. Sorry, the most famous person. Cause we didn't know this.
00:23:22.560 Do you want to know who the most famous person who's interacted with our X account is?
00:23:28.120 Is it like Hillary Clinton or something? No, that would be really funny. I wish.
00:23:32.900 No. Who? Based on a comprehensive search of X posts, threads, and web mentions, the most famous
00:23:38.580 person to interact with at Simone H. Collins is Elon Musk, the world's richest person, CEO of Tesla and
00:23:44.920 SpaceX and owner of X itself. In 2023, Simone posted a thread critiquing modern parenting trends and
00:23:50.240 linking, linking to their pronatalist couple podcast episode. I'm pretty sure that was about us beating
00:23:54.340 our children. Elon Musk replied directly to the thread with, this is based.
00:24:01.440 That was nice of him. I didn't know he'd ever promoted that.
00:24:04.040 I didn't know he did either. So that's, that's pretty great. Anyway, carry on.
00:24:08.640 That's really exciting. Anyway, so, so to go on here, for people who don't know, it was leaked by the
00:24:20.140 New York Times that we know him and have hung out. So it's, it's weird when you, like, know a famous
00:24:27.080 person and you're like, would you follow me? I wish I could ask him like promote our show,
00:24:31.260 but how gauche. Anyway, so sorry to continue here. Well, 21.74% of human images were misclassified as AI
00:24:41.240 images, compared to 38.92% of AI images misclassified as human-made, with visual professionals performing slightly better,
00:24:47.680 62% and 82% respectively. Then in November 2024, a study done at Yale compared essays, and
00:24:58.340 readers distinguished between AI and human essays only about 50% of the time, equivalent to chance.
00:25:04.080 So they literally couldn't tell. And this was AI a year ago. Wow. A 2024 study published in
00:25:09.680 Computers and Education: Artificial Intelligence found that participants, including teachers,
00:25:13.660 correctly identified about 60% of human-written texts and about 58% of AI-generated texts. So they
00:25:20.760 correctly identified a human text 60% of the time and an AI text 58% of the time. So not that good. That's near-chance performance.
00:25:26.940 And I'm not going to go over all of them, but basically, Oh, there was a cool one done by ACM
00:25:32.460 in 2025 that showed that this was also true of other languages: in Arabic, they could only tell
00:25:37.520 with about 51 to 52% accuracy, which is chance. Okay. So why does all this matter? It matters because
00:25:46.540 we are entering a world where you cannot tell. And, and, and this world is going to
00:25:52.240 hit you so much harder in the effing face. If you're one of those arrogant bastards who walks
00:25:58.800 around talking about how much you hate AI this and AI that and call everything made with AI slop.
00:26:03.440 There are a lot of really good things that are made with AI. Okay. There's a lot of really good
00:26:08.320 art made with AI. There's a lot of really good music made with AI. And you just may not know that
00:26:13.580 that's what you're listening to. You may not know that that's what you're looking at. You may not know
00:26:17.840 that that's what you're interacting with. And I think that this sort of like puritanical,
00:26:23.680 I mean, because, because when you build this mindset of all AI stuff is bad, you're not going
00:26:29.840 to see how quickly it dominates your field. Yeah. Right. You're not going to see how quickly it is
00:26:35.680 replacing you and you're not going to adapt to and use it in a way that helps you stay on top of that
00:26:43.480 trend. But anyway, I think the more important thing from this is that AI is going to dominate
00:26:49.960 algorithms. And I think a lot of people didn't see that coming. Thoughts or additional ideas or
00:26:59.200 anything like that? I guess. So I think this is absolutely in line with what we've predicted.
00:27:09.360 I remember very, very strongly feeling that as AI spread, we would see the rise of what
00:27:19.780 we refer to as techno feudalism, where people would start to form online fiefdoms around people they
00:27:28.600 knew to be real, maybe because they'd encountered them in person, whatever. And that people would start
00:27:34.680 to build these extended social networks built around real people because they want that veracity.
00:27:42.120 I also predicted, and we've yet to see this, that soon there will be cachet associated with
00:27:50.200 looking imperfect and people may stop using filters as much just to prove that they're real and not AI.
00:27:57.640 Which AI is, admittedly, extremely good at. Yeah, imagine the girl who's too pretty. So everyone
00:28:04.160 assumes she's AI. No, truly though. Truly, truly. Like, even if she doesn't use filters, she's still
00:28:09.600 too pretty. AI is actually pretty good at making normal looking people. I don't know if you saw the
00:28:14.620 Grok based video generation is incredibly good. And if you saw a bunch of people making videos of cats
00:28:19.400 playing music on front porches, with women coming out in, like, their bathrobes and taking the cats away,
00:28:24.240 yelling at them. They're amazing. You've not seen these? Oh my gosh, they're fantastic. The
00:28:30.480 video generation is great. And what I really like about that Grok video generation is that it looked
00:28:36.400 so much more real than a lot of the other video generation where just everyone looks a little too
00:28:40.400 perfect, everything looks a little bit too polished. Here, you have this cat on this, like, normal,
00:28:49.040 like middle income, middle to lower income house front porch in many cases, like cat playing a
00:28:55.420 violin or playing a piano, and, like, a woman in a bathrobe who looks a little trashy, like, coming
00:28:59.940 out and yelling at the cat and like pulling it away. The better one I like is the, this was made
00:29:05.120 with Grok Imagine as well. It's the Elon song, Elon's Musk. Oh God, that's so good. Yeah. I love
00:29:11.940 that song. You've probably seen Elon's Musk. It's a fantastic song.
00:29:15.000 Well, especially because Nux makes an appearance, Asmongold makes an appearance. I mean, so many
00:29:35.980 people make appearances. I thought it was made by Image AI. No, it was just a hobbyist. Like I thought
00:29:40.180 it was made as an ad for the company. Oh no, no, it's so good. Well, that, that explains why there's
00:29:44.800 so many cool people in it, but yeah. So my, my larger point though, is that this is exactly
00:29:50.040 what I expected. I expected that, that it would get to be a point where there was just so much
00:29:56.520 online content that people really wouldn't be able to tell, but we're not yet quite to the
00:30:01.000 point that I've described where people start to concertedly look for. I agree, but I think this
00:30:07.420 is why I say that this, this like reveal that, that happened with X is kind of irrelevant,
00:30:13.800 is that all of these accounts would be irrelevant within a year anyway, right? Like you can't prove
00:30:20.240 who you are. If you aren't authentically human, you are competing solely and completely against
00:30:27.080 the AI. And if you are competing solely and completely against the AI, you're not relevant
00:30:33.080 because the AI will beat you. It can be more boring magotard than you can. It can be more
00:30:39.260 woke libtard than you are. It can be more anything than you are. If your job is to be an average of
00:30:46.380 some sort of archetype, then it can out archetype you. Yeah. I mean, I think that the place where
00:30:53.040 there's defensibility in the future is that one, you're real and people know you're real and you're
00:30:59.180 not, yeah, you're, you're, you're, you're human and people know it. And two, you offer something
00:31:05.520 to humans that they need, which is either that feeling of connection to a real human at the very
00:31:10.500 basic level, but then also like utility in the form of some kind of social network or bonding or
00:31:16.440 professional assistance or networking or something like that. I'm, I'm guessing because everything else
00:31:22.640 is going to be covered. If you want affection, you'll get that better from AI. If you want companionship,
00:31:26.940 you'll get that better from AI. If you want entertainment, it'll come better from AI.
00:31:31.120 So the only reason you're really going to care about human accounts is because you want that
00:31:35.940 real human social network. You want a real human, like Catholic pastor or family, you know, whose kids,
00:31:45.420 your kids will hang out with and homeschool with online, that kind of thing. Right. Yeah.
00:31:51.200 And to sort of close out, I shared this with Simone, because this is one of my favorite
00:31:55.460 things from this leak. It's the, what was it? The, the defense department, one of the major
00:32:01.000 US departments. Oh gosh. Right. Yeah. It's the DOD, the DOD. Their Twitter account had
00:32:07.460 started, like the account was created, in Israel. And then word went around that this was fake, but it appears
00:32:13.480 to be real because tens of thousands of people saw it and said that they saw it live. So I was trying
00:32:19.260 to investigate whether it was fake or not. And it's also completely plausible. You know, there's a lot
00:32:23.520 of people who work in stuff like the department of defense who have dual citizenship with Israel
00:32:28.560 and would spend time in Israel, you know, making the pilgrimage to Israel is not an unusual thing
00:32:34.440 for a Jewish or even a Christian person to do. Sure. So I'm not surprised by that, but it is funny as
00:32:41.280 hell. And if you're like, this is truth that Israel secretly runs the DOD. It's like, okay,
00:32:47.480 first of all, if Israel secretly controlled the DOD, they wouldn't bother secretly controlling their
00:32:53.240 Twitter account. That's not like, that's a social media manager's job. That's not the running of the
00:32:58.760 operation. I'm not saying Israel doesn't run the DOD. I'm just saying that this isn't any evidence of
00:33:04.360 that. And even if they did, you know, secretly control something like the account, you think
00:33:10.120 Mossad wouldn't know to put a VPN on when this was announced that this was going to happen beforehand
00:33:14.440 a few months ago. I can see some dumb butt in India, not thinking to put on a VPN. I cannot see
00:33:22.280 Mossad forgetting to put on a VPN if they actually were nefariously controlling
00:33:27.240 the social media manager's job at the DOD. The, the other thing that I also feel really bad about
00:33:34.720 it, these people who build up these giant accounts, you know, if you've built in a Twitter account,
00:33:38.780 that's like half a million people like Maga Nation. And then it turns out that you're Macedonian,
00:33:42.420 right? Like it's an incredibly difficult thing to do, right? I feel so bad for these people.
00:33:48.980 Yeah. Yeah. It, I mean, speaking of tired tropes that stress me out, it really reminds me of the,
00:33:57.480 that trope where you start something as a joke, but then you really start to feel it. Like I could
00:34:03.620 see these Austrians starting it as like, ah, ha ha, this'll be funny. And then they start like meeting
00:34:08.040 the people in the Maga community and spouting Maga talking points so much that they just really
00:34:15.480 start to drink the Kool-Aid and get into it. Oh yeah. No, I don't, I don't think that they were
00:34:19.500 probably that insincere. I mean, and now, yeah, now they're being like ruined for it. You know,
00:34:24.040 it's just like, what's that high school movie where the guy, it's supposed to be based on Taming of
00:34:29.940 the Shrew where the guy takes a bet that he can like ask some girl. Well, no, but I mean,
00:34:35.820 this is like, and then he falls for her, but then she finds out that it was on a bet and then she
00:34:39.780 hates him. Oh yeah. What was that? Not another high school movie, but you were thinking She's All
00:34:44.940 That. She's All That. Yeah. Like that's very stressful because, like, he actually, you know,
00:34:49.180 fell, he fell in love with her, but then she discovered that it was, it was based on false
00:34:53.300 pretenses originally. Yeah. And Eastern Europeans are pretty right-wing, generally speaking. Like
00:35:02.500 Romanian TVee is Eastern European and he basically only talks about American politics. Like America,
00:35:09.720 like the, the American. Well, like we said, there's, there's one show left and it's sort of
00:35:14.000 like the global geopolitical slash tech slash economic debate. And if you're in Romania,
00:35:19.680 that means you're talking about us issues. Yeah. Who the heck cares about politics in Macedonia,
00:35:25.020 right? It's just the audience isn't big enough. So that can't be part of the main narrative. Yeah.
00:35:29.840 And so everyone all around the world is LARPing as part of our narrative because we are main
00:35:34.920 characters. This is why I founded the American pronatalist movement and I get called to speak
00:35:40.920 on Italian TV and French TV. And that's, that's a really good case in point. Yeah.
00:35:45.540 South Korean TV. South Korea should have South Koreans speaking about pro-natalism on South
00:35:51.000 Korean TV. Yeah. But the market's not big enough for it. So yeah. Japan had me, I was doing the
00:35:56.320 rounds. I was the major pro-natalist voice of Japan. Yeah. Malcolm Collins. Yeah.
00:36:03.680 Anyway, I love you to death, Simone. Really interesting topic. Really important for people to be paying
00:36:08.320 attention to. And frankly, I don't know how I'm going to keep doing this because I have to come up with a
00:36:12.500 totally new, unique take every day. Nobody has any ideas? This sucks. I don't want to keep having
00:36:20.840 to come up with ideas for shows all the time. It hurts my head. Dude. Bail? I think bail. Bail.
00:36:29.500 Yep. Bail. We, not really though, because you and I are always finding weird, nerdy stuff that we
00:36:36.700 want to talk about. And there's like already a long, a long backlog of episodes that we
00:36:41.980 still haven't recorded that we still want to do all, all sorts of topics and people keep giving us
00:36:48.280 great topics to cover. Some of which are more feasible than others. Like someone had a great
00:36:52.100 idea that we should, you know, interview detransitioners and talk about their experiences,
00:36:55.660 but like getting enough people who will talk with us. No, Benjamin Boyce already does that. Like,
00:37:01.120 um, also another reason we don't do things like that is because it would be inconsistent for our
00:37:07.000 fan base. Interviews, we typically only have interviews when I know it's going to be off
00:37:11.100 the chain and we don't even air a lot of our interviews because when you're bringing in
00:37:15.680 interviews, you can't know that it's going to be consistently entertaining for the audience
00:37:19.380 who likes listening to you plus me. The second issue is, which is really bad for the algorithm
00:37:25.440 for the channel, because then somebody jumps in, they're like, I don't like this person this day.
00:37:28.860 And so then they don't watch it. And the, the other big problem, this is why podcasts are very
00:37:33.080 big for interviews, but on YouTube it's very rare to have interviews, at least on the bigger channels.
00:37:37.920 The second thing that I note here is it's too hard to organize. I can spend a day researching a
00:37:44.160 topic and have an idea after like two hours of research or something, what I'm going to do for
00:37:48.340 an episode. If we're organizing an outreach campaign, and then, then queuing someone up,
00:37:54.660 then getting the day just right, then the beginning talk to get them feeling okay with
00:37:59.940 us, you know, like that's the whole extra thing that I don't want to deal with. Right.
00:38:04.260 So this is not really sustainable. It's also true for topics that require too much research,
00:38:08.460 which is why I've stopped doing as many religious topics. It's not that I do not want to do them.
00:38:13.560 It's that I actually have some, like, tract episodes fully written already.
00:38:18.060 Like, tract episodes take about four times as long as a normal episode to edit, record,
00:38:23.340 and handle even after they're written. So they're just sitting in a backlog, right? Like
00:38:28.400 backlog tract, backlog tract, backlog tract. And, and it's also always religious episodes. I have to
00:38:33.900 do a lot more research, because if I accidentally name the wrong account
00:38:40.080 in an episode like this, it's like, whatever, right? But if I misname a religious practice in an episode
00:38:45.460 that's critical of a religious tradition, that's not awesome. Right.
00:38:50.740 You also care more. Yeah. Yeah. There's that. All right. Love you, Simone.
00:38:56.540 Love you too. You too. And I'll see you in the, we'll do the debate first and then we can do the
00:39:03.380 We're on. I'll see you in there. This is, this is for paid members only. Me and Simone debating
00:39:08.440 something. Yes. No, it'll be on a weekend. Did you, when you were a kid, have those amazing
00:39:15.800 colored glass light bulb Christmas lights?
00:39:23.340 The kind that would burn your fingers, you know, that they'd get so hot.
00:39:29.560 I have no idea what you're talking about. Oh, the big ones. The old timey ones.
00:39:32.900 The old timey ones. Yeah.
00:39:34.280 And what you might forget about this is, in Highland Park, the town tree that was
00:39:38.040 decorated was decorated entirely in those.
00:39:42.440 Is it just me or is there something about the sound that they made when clinking
00:39:48.200 together? That's just iconic. It's like better than the lights themselves, better
00:39:53.420 than like sleigh bells.
00:39:55.640 I don't, I don't know this sound.
00:39:58.100 Oh, the sounding of the, the sound of old vintage
00:40:04.280 light bulbs from a vintage string of lights clinking together.
00:40:09.160 Oh, yeah, maybe this, this, this is my unique brand of ASMR.
00:40:16.780 I bet though, if I looked it up on YouTube, I would find it.
00:40:20.140 There'd be some, someone out there who had issues like me, who's super into it.
00:40:27.720 Oh, sorry.
00:40:31.100 We're dying.
00:40:31.680 And what did people say about the episode?
00:40:35.900 Good. I mean, you know, mostly like, oh, can you believe people would need to
00:40:39.980 medicate to have a sex drive? Some people were like, well, like, this is concerning.
00:40:44.700 People shouldn't take medication for this. Other people are like, man, like women, you
00:40:49.180 know, one person pointed out, and I think you saw this comment, that really the unsung
00:40:54.260 story in, in, in the, the saga of women medicating for sex drive was that men feel this way
00:41:03.700 all the time and they don't get any credit for controlling themselves. Whereas women medicate
00:41:10.040 themselves and like, can't stop, like, can't control themselves at all. Yeah. And yeah, like
00:41:15.760 men just don't get any credit and women are like, oh, I can't help myself. That's a good
00:41:20.620 point. That's a good point. It's a really good point. Yeah. Right. Someone else shared
00:41:27.720 with me, no, with both of us, I don't know if you saw it, but we ran a video over the
00:41:36.080 weekend on ghosts and ghost stories and people shared in the comments. They're fantastic ghost
00:41:41.240 stories. I really enjoyed them. And someone's suggested a video on infrasound or in, in
00:41:47.700 infrasound, like sounds that you can't hear, but that, that affect people and have been
00:41:54.180 shown in studies to make people experience paranormal things. We should look
00:42:01.060 into it. Sometimes we get good leads from our audience. Yeah. It's really, well, I mean,
00:42:05.100 I don't think there's an episode for us to do, but the, the video on infrasound
00:42:11.480 was made by this guy named Benn Jordan, who, he has like a Patreon and a YouTube
00:42:16.660 channel and he just does music and science, but he's, he does like a lot of audio investigations.
00:42:22.140 And I fell down this rabbit hole of content he's done where like, he tries to investigate
00:42:26.920 mystery sounds that bother people that they hear, like sort of like humming noises or,
00:42:33.260 you know, the, the, like beyond infrasound, just the sounds that, that some people claim to
00:42:39.700 hear that really bother them that could be due to like radio frequencies or perhaps pipes
00:42:44.620 resonating. And he buys this really expensive sound equipment and will go to people's houses
00:42:50.640 or stay in houses in certain regions for a period of time, turn off all the electricity to the
00:42:56.080 building so that no electrical devices in the building can cause sound and record audio for long
00:43:03.660 periods of time and try and like keep track of, of strange sounds that are picked up that we
00:43:09.700 wouldn't be able to hear with our own ears. And I'm just delighted that there are people like this
00:43:14.700 in the world that go around the world to collect audio samples, not as a scientist, not as like, and
00:43:22.740 also not in like in the past, like some kind of gentleman scientist, who's just like independently
00:43:27.060 wealthy and gets to do whatever he wants, cause he's, because he's autistic and wealthy,
00:43:31.580 but like people who have active patrons who are just normal people who will patronize his, his efforts
00:43:40.960 to go out and do really meaningful research on like, okay, well people claim to hear these sounds
00:43:47.820 and they really bother them. And they think it's this, like this military frequency, or they think
00:43:52.660 it's cell phone towers and I'm going to go find out what it is. And that's just delightful. Like people
00:43:57.960 delight me and the world is wonderful. And I don't know why people are so cynical
00:44:01.560 because it's great. All right. Love you, Simone. By the way, for dinner tonight, I was thinking we
00:44:08.140 would do steak reheat. We can do steak teriyaki. Just like last night, but instead of with bok choy,
00:44:16.540 just more tons of chips. But remember to cut them down the center. You stopped doing that?
00:44:21.260 I did. I did last night.
00:44:22.460 No, no, no, no. Like the circular things cut down the center so that they can separate by leaf.
00:44:28.940 Oh, I only did that with the lower parts. See, yeah, I need to do it. The upper parts,
00:44:32.900 I guess. Oh no, you, you only have to do it with the lower parts. I just didn't notice it was the
00:44:36.820 lower parts. I did do it with the lower parts last night. Maybe I missed one. Sorry.
00:44:40.700 All right. No, I will jump into this.
00:44:44.700 Hey, Octavius, what are we watching? Explain to me the difference between AI and puppets,
00:45:03.100 because he thought this was AI. AI is like a machine's dreams, right? And puppets are made
00:45:17.340 by hands and stuffed animals. Well, puppets are done in a different way. Sometimes they're
00:45:30.020 with strings. Sometimes they're just hands. Oh, spirit. Do you believe me? I think
00:45:36.700 you have made so much work. You changed me. And now I need all the ghosts of Christmas.
00:45:45.120 I just love it. I just love it. I just love it. Do you need a future?
00:45:50.620 Okay. What is this called? Christmas Carol? Yeah, Muppet Christmas Carol.
00:45:57.400 Okay. Love it. Christmas Carol. What was this sound?
00:46:05.900 What was this sound?