SNEAKO - July 19, 2022
Why Social Media Is Melting Your Brain
Episode Stats
Words per Minute
183
Summary
Is there a principal reason why you should get off social media? And if so, what is it? Is it for your own good? And is it for society's good? Or is it because you're being subtly manipulated by algorithms that are watching everything you do constantly, and then sending you changes in your media feed that are calculated to adjust you slightly to the liking of some unseen advertiser?
Transcript
00:00:00.000
This is an interview on how social media ruins your life.
00:00:03.700
Is there a principal reason why I should delete my social media?
00:00:11.660
One of them is for your own good, and the other is for society's good.
00:00:16.360
For your own good, it's because you're being subtly manipulated by algorithms
00:00:22.220
that are watching everything you do constantly,
00:00:24.660
and then sending you changes in your media feed, in your diet,
00:00:30.340
that are calculated to adjust you slightly to the liking of some unseen advertiser.
00:00:36.660
And so if you get off that, you can have a chance to experience a clearer view of yourself and your life.
00:00:43.540
But then the reason for society might be even more important.
00:00:47.940
Society has been gradually darkened by this scheme
00:00:52.080
in which everyone is under surveillance all the time,
00:00:54.660
and everyone is under this mild version of behavior modification all the time.
00:01:08.700
It doesn't mean that you need to get off social media, but you could pick an option.
00:01:14.880
Or you could step outside and see the bot behavior that it turns people into.
00:01:25.160
I'll clip up my shit, post it on TikTok, and don't be a bot.
00:01:28.660
And now everybody who's posting my shit on TikTok is seeing all the bot mentality.
00:01:33.120
Everybody in the comment sections, and yo, TikTok people vouch for me.
00:01:36.560
Everybody in the comment sections is saying the exact same thing.
00:01:48.540
Be a fucking consumer echoing all the same nonsense.
00:01:58.300
It's made teens especially depressed, which can be quite severe.
00:02:05.060
Every single teen, every Gen Z kid thinks they're depressed.
00:02:17.520
Are you being programmed by all the same opinions that everybody has?
00:02:30.340
No, I have a chemical imbalance and I've been scrolling for eight hours looking at girls
00:02:39.260
But it's made our politics kind of unreal and strange where we're not sure if elections
00:02:46.280
We're not sure how much the Russians affected Brexit.
00:02:50.120
We do know that it was a crankier affair than it might have been otherwise.
00:03:05.100
They want to dumb you down so you don't think for yourself.
00:03:11.720
If you don't believe me, start clipping up this stream and then watch everybody in the...
00:03:19.400
If you're in school, whatever the fuck, whatever age you are, and observe people scroll.
00:03:25.320
You remember WALL-E, when people were in wheelchairs sitting like this and didn't even move?
00:03:34.000
So they take all the power to keep you sedated, to domesticate you.
00:03:38.680
That's why every day they call them, you're immature.
00:04:10.180
And then people at the top living the fucking dream.
00:04:41.640
The founders of the great Silicon Valley spying empires like Facebook
00:04:46.500
have publicly declared that they intentionally included addictive schemes in their designs.
00:04:54.300
Now, we have to say, this is what I would call almost a stealthy addiction.
00:05:02.280
What it says is, we will get the broad population to use the services a lot.
00:05:08.760
We'll get them hooked through a scheme of rewards and punishment.
00:05:16.640
The algorithm is developed to keep you hooked as long as possible.
00:05:20.420
They want you to be on the app as long as possible.
00:05:27.680
You can see the algorithm in the way people talk.
00:05:30.480
Because everyone says exactly what they see on their feed.
00:05:41.220
Nah, just echoing the same TikTok personalities.
00:05:46.720
Everything, their whole personality, everything they say is echoed from an infographic or some...
00:05:59.140
If you start thinking for yourself, someone calls it toxic.
00:06:05.260
People think it's a mental illness now if you think for yourself.
00:06:13.380
The punishment is when you're treated badly by others online.
00:06:16.360
And then within that, we'll very gradually start to leverage that to change them.
00:06:36.340
So it's not as dramatic as a heroin addict or a gambling addict, but it is the same principle.
00:06:45.860
Because a heroin addict, at least crackheads get up and dance a little bit.
00:06:49.940
At least crackheads are like, they got stories.
00:07:02.840
One thing crackheads will do is they will get to it.
00:07:06.560
You algorithm people just fucking get in your moral high ground, echoing the same shit that everyone else is saying.
00:07:15.320
I mean, there isn't some master sort of Wizard of Oz sitting behind the screen, is there?
00:07:19.860
Well, this is the peculiarity of the situation.
00:07:22.560
The people who run the tech companies like Google and Facebook are not doing the manipulating.
00:07:27.980
But the manipulating, which rides on the back of the addicting, is done by the paying customers of such a company.
00:07:36.000
And many of those customers are not at all bad influences.
00:07:39.600
They might simply be trying to promote their cars or their perfumes or whatever.
00:07:43.920
And indeed, I have sympathy for them because they're concerned that if they don't put money into the system, nobody will know about them anymore.
00:07:51.740
How is it different to just television advertising or billboard advertising or anything else?
00:07:58.880
So when you watch the television, the television isn't watching you.
00:08:02.540
When you see the billboard, the billboard isn't seeing you.
00:08:12.500
Just the other day, I was talking about upgrading my Wi-Fi because my stream was lagging.
00:08:15.700
And then I got an email from like Spectrum Mobile.
00:08:19.920
I've never gotten an email from a Wi-Fi company in my life.
00:08:24.420
An hour after I was having a conversation about wanting to upgrade my Wi-Fi, they emailed me.
00:08:29.040
And people will still say, well, they're not listening.
00:08:39.760
When you're on TikTok all day, not only do they know your sexual orientation,
00:08:57.660
The ads you get now are personalized for your dumb brain.
00:09:05.200
And vast numbers of people see the same thing on television and see the same billboard.
00:09:09.780
When you use these new designs, social media, search, YouTube, when you see these things,
00:09:16.240
you're being observed constantly, and algorithms are taking that information and changing what
00:09:20.860
you see next, and they're searching and searching and searching, and they're just blind robots.
00:09:25.880
There's no evil genius here until they find those patterns, those little tricks that get...
00:09:33.120
In terms of society, I mean, you, you, you, chat, how many of you have had that same experience?
00:09:39.020
How many of you have gotten an ad that was something you were talking about two hours ago?
00:09:44.000
Maybe I'm just a paranoid conspiracy theorist, or maybe all of you have had the same experience.
00:09:51.380
Why do they know exactly when to send you a specific ad?
00:09:58.360
And you think that when I say bot, I'm just like a conspiracy theorist or some shit?
00:10:11.640
What's the first thing you do when you wake up?
00:10:13.420
The first thing you do, part of our brain is connected to this.
00:10:20.980
Everybody has had the same experience in the chat.
00:10:25.600
Every single one of you have had the same shit happen to you.
00:10:28.280
So now we have an option because you can't get rid of your phone.
00:10:45.220
Since I was a little kid, since I was a teenager, I noticed that everybody was on this.
00:10:49.600
So I invested all my time and energy into this.
00:10:52.820
Not as a bot program, but as somebody on the outside creating for it.
00:10:58.440
Since I was a little kid, I dropped out of school to do this shit, bro.
00:11:02.480
Threw in this, you know, it's making people depressed.
00:11:09.300
Unfortunately, there's a vast amount of evidence.
00:11:11.480
There have been dozens of studies at this point, including studies released by Facebook scientists.
00:11:21.700
And when Facebook releases such things, they say, oh, but we do all these good things, too, that balance it.
00:11:27.360
Anybody who trusts Mark Zuckerberg, you cannot trust any of these goddamn people.
00:11:31.760
All they want to do is continue getting richer.
00:11:37.580
To make more money off of you, you need to be more and more and more addicted.
00:12:01.320
Remember when everybody was echoing the same Vine jokes over and over again?
00:12:04.620
Now if you go on Tinder, half the girls' bios say the way to win my heart is quoting 2016 Vines.
00:12:17.200
If that's not some bot shit, I don't know what is.
00:12:20.600
There's a general acknowledgement that depression correlates with social media use.
00:12:25.360
The scariest example is a correlation between rises in teen suicide and the rise in use of social media.
00:12:47.900
But are you sure you can blame it on social media?
00:12:51.100
Or is it not just those two things may have happened at the same time for other reasons?
00:12:56.800
It's very similar to the problem of global climate change.
00:12:59.800
We can say statistically over the whole population, yes, the correlation is real.
00:13:04.480
But for any particular person, of course, we can't say.
00:13:06.700
Just as we can't blame any particular storm on global warming.
00:13:14.860
And this is something that's very well demonstrated.
00:13:18.640
So when the company's own scientists are publishing on this topic and come to the same conclusion,
00:13:33.500
I love Silicon Valley, and I do not at all feel that I've turned on my own kind.
00:13:37.660
And just to be clear, I'm very much a part of this.
00:13:43.860
They're just trying to farm as much data as possible.
00:13:47.060
They're all aware of what Edward Snowden was saying back in 2012.
00:14:08.200
So you need to act as if you're always being watched.
00:14:26.200
Why the fuck do you think my shit blew up in a day?
00:14:30.300
Once you realize how bot-minded people are, I don't even have TikTok.
00:14:34.860
Everyone's messaging, you're all over my For You page.
00:14:42.500
It's easy to capture your attention because all of you are dumb.
00:14:45.240
As an insider, I believe that what we're doing is not in our own self-interest.
00:14:58.020
If they destroy society, they destroy themselves.
00:15:00.140
I believe it's very clear that we could offer all of the good things.
00:15:05.260
And there are many, many good things in these services, in social media in particular.
00:15:09.700
I'm convinced we can offer them without this manipulation engine in the background.
00:15:14.360
There's a world of other business plans, and I think they'd be better for us.
00:15:18.060
So I don't think we're being evil so much as we're being stupid.
00:15:25.760
He just said the same shit I said, but I'm saying it.
00:15:32.040
He just called you stupid, but he said it in a nice way.
00:15:59.960
When it comes to Facebook, has Facebook made itself safe yet in terms of data harvesting and scraping and all that?
00:16:08.000
Well, Facebook's fundamental design is one that is, the business model is to addict you and then offer a channel to you to third parties to take advantage of that, to change you in some way without you realizing it's happening.
00:16:28.080
So I don't think any amount of tweaking can fully heal it.
00:16:41.100
We're all addicted to the algorithm and they don't care how dumb we get.
00:16:44.800
It's actually more beneficial the dumber you get.
00:16:47.560
The more bot-minded you are, the more time you spend on your phone, the more money they make.
00:16:52.640
It's getting harder to get to the top because they have a stronghold.
00:16:58.060
You think the people at the top, like, have gender pronouns?
00:17:04.880
They don't, they're not part of the same stupid shit that we're a part of.
00:17:10.480
You think Elon Musk is arguing about trans bathrooms?
00:17:16.060
Billionaires see us and they literally think that we're stupid.
00:17:19.620
They see us all at the bottom and we're like little ants.
00:17:24.840
They look down at us like we're ants running around in a hill arguing about this nonsense.
00:17:29.580
My body, my choice, holding these signs, protests, Black Lives Matter, Black Lives Matter, laughing at us.
00:17:42.420
To throw a barrage of rules at somebody who's following certain incentives and then expect them to really make a difference.
00:17:50.480
So when Mark Zuckerberg says he's taking action and, you know, he regrets what's happened and all the rest of it, you're saying he can't make his own product a safe and desirable product?
00:17:59.900
I believe that as long as his business incentives are contrary to the interests of the people who use it, who are different from the customers, then no matter how sincere he is, and I believe he's sincere, and no matter how clever he is, he can't undo that problem.
00:18:19.440
He has to go back to the basics and change the nature of the business.
00:18:23.820
They're just going to become—that's why when you see, like, Zuckerberg talk and everything, you're like, how is this guy even a real person?
00:18:29.960
Because they've committed so much—they've fucked our brains up so much that they're lost, and, like, they need to justify it a little bit, but there's a little bit—they know how much they're ruining everybody.
00:18:45.780
They have all empathy separated from—why do you think Mark Zuckerberg looks so scary now?
00:18:53.300
When I got back on it, I started feeling miserable.
00:18:55.220
Same feeling as eating fried Oreos after being healthy for a while.
00:19:02.120
And if he doesn't agree with that and says we're just going to carry on, how important is security of that data and the inability to repeat what has happened with Cambridge Analytica and all that kind of sort of data harvesting that went on?
00:19:19.940
I don't believe that this is—I don't believe that what happened with Cambridge Analytica is the worst of it.
00:19:28.760
Like, let's suppose that Facebook reforms itself so that the next Cambridge Analytica can't get access to that data.
00:19:55.080
Look at them staring into your soul, knowing everything—this dude right here knows every single thing about you.
00:20:07.080
—just to the same results, because the service Facebook offers is exactly what Cambridge Analytica—
00:20:16.080
Yeah, I mean, this is—you know, there are—bad actors are able to use Facebook in ways that Facebook can't understand,
00:20:26.080
because the way the service is designed is fundamentally to be manipulative.
00:20:30.080
So I think the data protection idea is a sincere and good idea, but it's certainly not adequate.
00:20:37.080
It doesn't address the core problem, which is the manipulation engine.
00:20:41.080
And as long as that is there, a bad actor can find a way to utilize it.
00:20:45.080
So, to me, this concern about data protection, while laudable, doesn't address the core problem.
00:20:55.080
I mean, you know, why is something like YouTube, which is basically just a way of watching video, bad for you?
00:21:06.080
Because you start talking about this, and they strike your channel for misinformation.
00:21:10.080
I started talking about the Coco, started talking about this.
00:21:12.080
They deleted the video, and they gave me a community guidelines warning.
00:21:15.080
If I talk about this again, they can strike it, and I cannot stream.
00:21:18.080
If this shit doesn't wake you up, I don't know what will.
00:21:21.080
I have a community guidelines strike for talking about the Coco.
00:21:24.080
If you say the wrong thing, if you have the wrong opinion, it's misinformation.
00:21:36.080
You can really look at this, and, like, nobody cares.
00:21:49.080
Community guidelines strike for misinformation?
00:21:52.080
Why do they get to— Why are they in charge of information?
00:21:57.080
Why are you okay with Zuckerberg and Susan telling you what information is correct or not?
00:22:03.080
They clearly don't have your best interest in mind.
00:22:06.080
So if you have the wrong opinion, if you say something that challenges the programming,
00:22:22.080
So for some percentage of people, it'll have an effect of making them crankier around election time
00:22:28.080
and feeling needier around the time they might be making a purchase and so forth.
00:22:32.080
And the way it works is that all the data Google can get on you, much of which comes from just your email or whatever else it might be,
00:22:41.080
is fed into an engine that compares you with other people who share some similar traits.
00:22:46.080
And YouTube's ordering of videos that are presented to you is designed to, on the one hand, maximize your engagement so you won't stop watching,
00:22:55.080
but that's achieved not just by observing you but by a multitude of people who are similar to you.
00:23:01.080
And then when you do get an ad, it's contextualized in a way that has been shown to be effective not only for you but for this whole population.
00:23:11.080
And it's bad for you because it leeches your free will.
00:23:17.080
It makes you all think you have a chemical imbalance in your brain.
00:23:21.080
And it's getting worse and it's getting faster and our attention spans are getting worse and worse and worse, bro.
00:23:26.080
You know that faint noise off in the background when you hear someone just playing TikToks?
00:23:47.080
You start talking about this shit, look at that.
00:23:56.080
Bro, I started the stream and he got 5,000 more followers since I started the stream.
00:24:06.080
It makes the world a little darker because you're not perceiving reality clearly anymore.
00:24:15.080
And it, the people who are paying or maybe not paying, just using the system to, in a clever way to get at you, are not necessarily pleasant people.
00:24:28.080
They're, they're, they're, they're sort of the worst actors.
00:24:31.080
But don't, don't some users think, look, I can handle advertising.
00:24:38.080
Uh, and you know, they think they're manipulating me, but I know what I'm doing.
00:24:45.080
The problem is that behaviorist techniques are often invisible to the person who's being manipulated.
00:24:55.080
Uh, it used to be that the only way to be subjected to continuous observation and modification was to either be in an experiment.
00:25:03.080
Uh, you could be in the basement of a psychology building and have students tweaking you for their projects.
00:25:14.080
And often the people who are in these situations do not realize it's happening to them.
00:25:18.080
In fact, the whole point is that it's, it's sneaky.
00:25:21.080
It's, it's a, it's a mechanical approach to manipulating people.
00:25:29.080
It doesn't involve direct communication and people don't get the cues to understand what's happening with them.
00:25:34.080
Why do you think, um, social media has had the effect on politics that it has?
00:25:39.080
You know, is it because of the way people respond to...
00:25:42.080
You know what happens when you step out of social media?
00:25:44.080
When you get off Twitter for a bit, you start realizing,
00:25:51.080
But now, because everyone feels important, they start posting a black square.
00:25:55.080
Everyone posts a black square when Black Lives Matter happens.
00:26:13.080
While I think about this, this is my opinion on Ukraine.
00:26:41.080
But then, this girl on TikTok posted an infographic.
00:26:44.080
I think I'm actually gonna call off the strike.
00:26:56.080
Well, I'd like to give you a slightly detailed answer as quickly as I can.
00:27:02.080
And that is that, in traditional behaviorism, you would give an animal or a person a little treat like candy,
00:27:09.080
or maybe an electric shock, and you'd go back and forth between positive and negative feedback.
00:27:14.080
And when researchers try to determine whether positivity or negativity is more powerful,
00:27:22.080
But the difference with social media is that the algorithms that are following you respond very quickly.
00:27:34.080
...the negative responses, like getting startled or scared or irritated or angry,
00:27:37.080
tend to rise faster than the positive responses, like building trust or feeling good.
00:27:47.080
So the algorithms naturally catch the negativity and amplify it,
00:27:51.080
and introduce negative people to each other and all of this.
00:27:54.080
And so what this does is it means that the algorithms discover there's more engagement possible,
00:28:00.080
say, by promoting ISIS than the Arab Spring.
00:28:03.080
And so ISIS gets more mileage, and the Ku Klux Klan more than Black Lives Matter.
00:28:09.080
Now, in the big picture, it's not true that negativity is more powerful.
00:28:13.080
But if you're doing this very rapid measurement of human impulses instead of accumulated human behavior,
00:28:22.080
So you tend to have elections that are more driven by rancor and abuse,
00:28:27.080
and you tend to have outcomes that are kind of crazy.
00:28:32.080
King J, do 10 push-ups right now. Don't skip your workout for the stream.
00:28:36.080
The effects on the media we consume, the news as well, are also alarming,
00:28:40.080
because then it'll be the news that makes people angry
00:28:44.080
that is the news that gets seen in the future or now,
00:28:47.080
rather than, you know, a more balanced diet of what's really going on in the world.
00:28:53.080
Well, I think what goes on on a show like this is that you have a bit of a longer time horizon
00:28:59.080
by which you measure success. So you have to impress your viewership enough that they tune in.
00:29:05.080
But this is over a process of days and weeks and months and years.
00:29:08.080
My advice? Stop caring about being likable so much.
00:29:12.080
When you start getting out of the bot shit, you're going to offend a lot of bots.
00:29:27.080
Nobody says what they think anymore because so many people get offended by it.
00:29:37.080
If you get that mad at what I think, I'm not going to change your mind.
00:29:48.080
Put the, anything you wanted me to react to, put it in the discord.
00:29:53.080
And you build up a sense of rapport with your, your viewership, right?
00:29:57.080
Um, if you're an algorithm that's just looking at instant responses,
00:30:16.080
You'll find that engagement more often by irritating people than by educating them.
00:30:24.080
Because they thought that by saying he was triggering every day in the news, he was going to lose.
00:30:35.080
He kept triggering people every day and got famous for triggering people.
00:30:38.080
And there's people who get it and people who don't.
00:30:42.080
All publicity really, at the end of the day, is good publicity in this game.
00:30:46.080
In this game of politics, in this game of entertainment.
00:31:05.080
Or, you know, any of the other populist leaders who are doing very well at the moment.
00:31:11.080
I have never known Trump, but I have met him a few times over a fairly long period.
00:31:16.080
Over 30 years, actually, through different circumstances.
00:31:19.080
And I will say that, while I never would have voted for him as president, and I always thought
00:31:25.080
he was somewhat untrustworthy and a bit of a showman and a bit of a scammer,
00:31:31.080
he never lost himself and became so strangely insecure and so weirdly irritable until he had
00:31:52.080
His character has been really damaged by his Twitter addiction.
00:31:55.080
Because of the reaction he gets from each tweet?
00:31:58.080
So, you know what happens in addiction is the addict becomes hooked, not just on the good part
00:32:04.080
of the addiction experience, but on the whole cycle.
00:32:07.080
You know those people who just argue on Twitter all goddamn day long?
00:32:09.080
I used to be one of those guys like going back and forth, like arguing about atheism
00:32:15.080
And then you step out of it, you're like, yeah, I destroyed his argument.
00:32:19.080
It's just like the same people in the chat now saying like, you're, you're dumb.
00:32:24.080
They leave the stream and they feel like, because they're just waiting for me to
00:32:30.080
And you just gotta, ha ha, whoop, whoop, whoop.
00:32:35.080
Because anybody who's like trying to get a reaction or something like that, anybody
00:32:38.080
who's engaging, this is, and too much negativity, it makes you miserable.
00:32:43.080
A gambler is not just addicted to winning, but to this whole process where they mostly lose.
00:32:49.080
And in the same way, the Twitter addict or the social media addict becomes addicted
00:32:54.080
to this engagement, which is often unpleasant, where they're engaged in these, you know, really
00:33:02.080
And only once in a while is that, you know, you'll, you can watch Trump.
00:33:06.080
Like every once in a while, there'll be this tweet where somebody likes him.
00:33:09.080
And that's when he gets his little, we call it in the trade, the dopamine hit.
00:33:13.080
That's what it's called in Facebook, for instance.
00:33:16.080
He gets his little dopamine hit and then he dives in for more negativity and things.
00:33:24.080
And do you think it's possible to create a do-gooding social network?
00:33:31.080
And the way to do it is to have a different business model where instead of...
00:33:34.080
So right now, we've created this bizarre society, it's unprecedented, where if any two
00:33:39.080
people wish to communicate over the internet, the only way that can happen, the only way it's
00:33:44.080
financed is through a third party who believes that those two...
00:33:54.080
So we can keep all the good stuff, and there is good stuff on social media, of course.
00:33:59.080
We can keep all that and just throw away the manipulation business model and substitute
00:34:05.080
And there are many alternatives that would be better.
00:34:10.080
It could be a paid service, like a Netflix, where you're paying for it.
00:34:18.080
It could become a public thing that isn't commercial at all.
00:34:24.080
But what we did in Silicon Valley is we wanted it both ways.
00:34:27.080
We wanted everything open and free, but we wanted hero entrepreneurs and hackers.
00:34:31.080
And so the only way to get that was this advertising thing that gradually turned into the manipulation
00:34:39.080
And this weird business plan, once you can see that there are alternatives, you realize how strange it is
00:34:54.080
If I have anything, I've been doing this for a while, there's no point in trying to be like,
00:35:12.080
And so you're not going to change people's mind.
00:35:19.080
So you can make people laugh, make some money off of it.
00:35:24.080
But if you're going to, well, you must click the GoFundMe link that I have here.
00:35:28.080
And this is the infographic that has the right thing.
00:35:39.080
We don't have to get rid of the idea of social media.
00:35:41.080
We just have to get rid of the manipulation machine that's in the background.
00:35:45.080
Just one last thing as well that is also obsessing parents at the moment.
00:35:50.080
Screen time itself, do you think that is a bad thing?
00:35:55.080
To be frank with you, I struggle with this question because I have an 11-year-old.
00:36:01.080
And so I tend to think that manipulation time when the kids are being observed by algorithms
00:36:08.080
and tweaked by them is vastly worse than just screen time by itself.
00:36:14.080
Someone just donated five and said, how do you get out of the bot mentality?
00:36:20.080
You see that we're all just little ants running around yelling about nothing.
00:36:40.080
Don't just be in here consuming, echoing this, this is the right thing to say.
00:36:50.080
Do you include video games in the social media?
00:36:53.080
You know, the things that are manipulating them?
00:36:55.080
Because they are similarly addictive, aren't they?
00:36:57.080
They're addictive but not manipulative typically.
00:37:00.080
Now, here I'm not sure how evil we've become lately because there might be some video games that are using behavior mod techniques for pay.
00:37:12.080
If you're thinking about it out there, don't do it, okay?
00:37:17.080
But the mainstream video games are not doing that.
00:37:22.080
So there are plenty of things that are addictive that aren't leveraging that for manipulation.
00:37:33.080
It's getting people to pay for things within the game.
00:37:37.080
No, but see, the thing is, getting them to pay is still not manipulating them for a third party.
00:37:44.080
I mean, Amazon does that to get you to buy stuff.
00:37:54.080
Black Lives Matter in Ukraine, every single situation, the pandemic, there's always people making money off of it.
00:38:00.080
Every time there's people enraged giving GoFundMe links and telling you what you do, there's always somebody who's making all the bread off of it.
00:38:07.080
Black Lives Matter, the guy who made all the money from all your dono links, is a white dude.
00:38:11.080
All the GoFundMe links to Ukraine, most of them were a scam.
00:38:15.080
The Pandy, people were really outside getting money while the rich people at the top looked at all this, laughing, while we stayed inside.
00:38:28.080
Handing out masks in the park, like making mean eye contact with you.
00:38:42.080
We walk on a plane, all of it on, take it off so that we all eat, breathing everywhere, put it back on.
00:39:02.080
So every time you see one of those situations, George Floyd, anything like that, when there's mob mentality, think about it this way.
00:39:10.080
Someone with way more power, someone with way more money is scamming you while you are trying to do the right thing and save the world.
00:39:17.080
But ultimately, we can't save the world, no matter what.
00:39:25.080
No matter how many of these you take, we have no control.
00:39:30.080
They're taking the money while we run around and get scared.
00:39:39.080
Especially if you feel your kids are wasting money, you might object to it.
00:39:42.080
You might feel it's not an ideal example of human behavior and character and maybe there could be a better business, whatever.
00:39:50.080
But it's not directly manipulating you, say, to influence an election.
00:39:55.080
It's not trying to change your behavior out in the larger world.
00:39:59.080
And that's the thing that's really tragic about designs like Facebook and Google.
00:40:05.080
But your advice tonight to everyone watching this is delete all your accounts.
00:40:20.080
The best manipulation tactics, if you read like any book about control or power, the best manipulation is with fear.
00:40:36.080
Because people will do anything to stop being scared.
00:40:45.080
Every time you're scared, someone's looking down.
00:40:55.080
One, if you're a young person and you've only lived with social media, your first duty is to yourself.
00:41:08.080
And you can't know yourself without perspective.
00:41:11.080
So at least give it six months without social media.
00:41:16.080
Don't like quit Facebook and keep another Facebook thing.
00:41:19.080
People are in the chat saying like, oh, but you make money on social media.
00:41:21.080
I didn't have Instagram until like a couple of years ago.
00:41:24.080
All throughout high school, I didn't have Instagram.
00:41:33.080
I took years to really learn myself and develop my own brain because I saw that it was manipulating people.
00:41:39.080
I needed to be on my own to really figure myself out before I could be on the internet all the time.
00:41:46.080
That's why I could stream every day and I could bob and weave all the fucking haters, all the people saying,
00:41:50.080
"You're so misogynist, you're, you're just spreading your, you're dying."
00:41:55.080
I know how to slip all that because I know myself.
00:41:58.080
It took me years of traveling, of talking to people, of being introspective, being alone, going to different places to finally develop the mental strength to do this shit every single day.
00:42:09.080
And now it's so confusing to people, because most people never get to that point.
00:42:13.080
They'll write essays and they'll say shit like, "You need to take your pills."
00:42:20.080
"One day you're going to take your therapy," and, you know, I'm not going to.
00:42:25.080
The more you say that, the more whoo, whoo, slip, slip, the more I'm going to keep making money streaming.
00:42:31.080
And the more money that y'all are going to make, bro, going viral on TikTok.
00:42:35.080
I guarantee it: if you don't believe me, start posting my stream clips and see all the bots in the comments section.
00:42:41.080
Like WhatsApp, because then it'll still be spying on you and manipulating you.
00:42:45.080
Get rid of the whole thing for six months and know yourself and then you can decide.
00:42:50.080
You have to decide, but you can't until you know yourself.
00:42:53.080
And then for the rest of society, I'd say as long as we can have some small percentage of people who are off it,
00:43:00.080
then the society can have voices to give perspective.
00:43:03.080
If everybody's universally part of this thing, we cannot have perspective.
00:43:11.080
You know, we need more people who are just outside of that loop,
00:43:17.080
And I think we'll find it extraordinarily valuable to have them.
00:43:24.080
I mean, see, that's what I used to say a couple of years ago.
00:43:27.080
People are going to call me a conspiracy theorist or a hippie or like I'm a boomer or some shit like that.
00:43:35.080
Anyone who's really, like, doubting what I'm saying right now,
00:43:38.080
I don't think that they're really happy with their lives.
00:43:40.080
People like in the chat who get it right now, they're saying, why are there so many goofy people in here?
00:43:44.080
You're seeing the goofy people because a lot of people don't want to change.
00:43:47.080
A lot of people are addicted and they don't want to see it.
00:43:49.080
And they're getting triggered by the stuff I'm saying.
00:43:51.080
But a lot of you, the reason you're still here is because, you know, I'm saying a little bit of truth.
00:43:55.080
You resonate a little bit with what I'm saying.
00:43:57.080
You know that this shit is not making you happy.
00:43:59.080
You know, when you lie for five hours on your side, scrolling, scrolling, scrolling, scrolling, scrolling.
00:44:09.080
Someone donated five and said, if you chose any voiceover commentary niche on YouTube that isn't saturated, needs more creators, what would it be?
00:44:21.080
That's going away because everyone's scared of getting canceled.
00:44:24.080
Have you just been through the mill and kind of worked it out?
00:44:35.080
I mean, you know, here's what I'll tell you.
00:44:39.080
The bind you've put me in is that I'd be happy to trash the New Age and demonstrate that I'm not part of that manner of thinking.
00:44:48.080
I think, I hope, I've come across as a non-utopian.
00:44:50.080
But the problem is many of my friends in California are quite New Age.
00:45:07.080
As soon as I start talking about this, they're going to find something to say that's misinformation.
00:45:12.080
Community guidelines strike for misinformation.
00:45:16.080
Why do you think that what I'm saying is called misinformation by all these people at the top with more money than all of us?