SNEAKO - July 19, 2022


Why Social Media Is Melting Your Brain


Episode Stats

Length

45 minutes

Words per Minute

183.01436

Word Count

8,364

Sentence Count

775

Misogynist Sentences

9

Hate Speech Sentences

18


Summary

Summaries generated with gmurro/bart-large-finetuned-filtered-spotify-podcast-summ.
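For reference, a summary like the one below can be produced by running the named checkpoint through the Hugging Face transformers summarization pipeline. This is a minimal sketch, not the exact script behind this page; the chunk size, generation lengths, and merge step are illustrative assumptions to work around BART's roughly 1,024-token input limit.

    from transformers import pipeline

    # Load the summarization checkpoint named above from the Hugging Face Hub.
    summarizer = pipeline(
        "summarization",
        model="gmurro/bart-large-finetuned-filtered-spotify-podcast-summ",
    )

    def summarize_transcript(text: str, chunk_words: int = 700) -> str:
        # BART-large reads roughly 1,024 tokens at a time, so split the
        # ~8,000-word transcript into fixed-size word chunks first.
        words = text.split()
        chunks = [" ".join(words[i:i + chunk_words])
                  for i in range(0, len(words), chunk_words)]
        # Summarize each chunk, then join the partial summaries.
        parts = [summarizer(c, max_length=150, min_length=30,
                            truncation=True)[0]["summary_text"]
                 for c in chunks]
        return " ".join(parts)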

Is there a principal reason why you should get off social media? And if so, what is it? Is it for your own good, or for society's good? Is it because you're being subtly manipulated by algorithms that watch everything you do constantly, then send you changes in your media feed calculated to adjust you slightly to the liking of some unseen advertiser?

Transcript

Transcript generated with Whisper (turbo).
Misogyny classifications generated with MilaNLProc/bert-base-uncased-ear-misogyny.
Hate speech classifications generated with facebook/roberta-hate-speech-dynabench-r4-target.
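Likewise, a minimal sketch of how the transcript and the per-sentence counts in the stats above could be generated, assuming the openai-whisper package (a release that includes the "turbo" checkpoint) and transformers. The audio filename, the regex sentence splitter, and the label-matching rules are assumptions; each classifier's actual label strings come from its own model config.

    import re
    import whisper
    from transformers import pipeline

    # Transcribe the episode audio with Whisper's "turbo" checkpoint.
    model = whisper.load_model("turbo")
    result = model.transcribe("episode.mp3")  # hypothetical filename

    # Naive regex sentence split; the real pipeline may use a proper tokenizer.
    sentences = re.split(r"(?<=[.!?])\s+", result["text"])

    misogyny = pipeline("text-classification",
                        model="MilaNLProc/bert-base-uncased-ear-misogyny")
    hate = pipeline("text-classification",
                    model="facebook/roberta-hate-speech-dynabench-r4-target")

    # Count flagged sentences; crude character truncation keeps inputs short.
    misogyny_count = sum(1 for s in sentences
                         if "misogyn" in misogyny(s[:512])[0]["label"].lower())
    hate_count = sum(1 for s in sentences
                     if hate(s[:512])[0]["label"] == "hate")
    print(misogyny_count, hate_count)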
00:00:00.000 This is an interview on how social media ruins your life.
00:00:03.700 Is there a principal reason why I should delete my social media?
00:00:07.200 And if so, what is it?
00:00:08.480 Mmm.
00:00:09.880 There are two.
00:00:11.660 One of them is for your own good, and the other is for society's good.
00:00:16.360 For your own good, it's because you're being subtly manipulated by algorithms
00:00:22.220 that are watching everything you do constantly,
00:00:24.660 and then sending you changes in your media feed, in your diet,
00:00:30.340 that are calculated to adjust you slightly to the liking of some unseen advertiser.
00:00:36.660 And so if you get off that, you can have a chance to experience a clearer view of yourself and your life.
00:00:43.540 But then the reason for society might be even more important.
00:00:47.940 Society has been gradually darkened by this scheme
00:00:52.080 in which everyone is under surveillance all the time,
00:00:54.660 and everyone is under this mild version of behavior modification all the time.
00:01:00.840 It's made people jittery and cranky.
00:01:04.280 And dumber.
00:01:06.060 Social media makes you dumber.
00:01:08.700 It doesn't mean that you need to get off social media, but you could pick an option.
00:01:11.840 You could be a bot scrolling all day.
00:01:14.880 Or you could step outside, see the bot behavior that it turns people into,
00:01:20.940 and become a creator.
00:01:22.500 So on this stream, what I've been saying is,
00:01:25.160 I'll clip up my shit, post it on TikTok, and don't be a bot.
00:01:28.660 And now everybody who's posting my shit on TikTok is seeing all the bot mentality.
00:01:33.120 Everybody in the comment sections. And yo, TikTok people, vouch for me.
00:01:36.560 Everybody in the comment sections is saying the exact same thing.
00:01:40.480 You're projecting.
00:01:41.840 You're an incel.
00:01:42.980 You're problematic.
00:01:44.500 You're da-da-da-da-da-da.
00:01:46.240 Retards.
00:01:47.460 So pick one.
00:01:48.540 Be a fucking consumer echoing all the same nonsense.
00:01:52.700 Or capitalize off of it.
00:01:58.300 It's made teens especially depressed, which can be quite severe.
00:02:02.920 Everybody's depressed now.
00:02:05.060 Every single teen, every Gen Z kid thinks they're depressed.
00:02:09.000 Every single one.
00:02:10.020 Is it a chemical imbalance?
00:02:12.780 Is it?
00:02:13.480 Are you just spending too much time on this?
00:02:15.680 Are you not moving your body enough?
00:02:17.520 Are you being programmed by all the same opinions that everybody has?
00:02:20.900 This is toxic.
00:02:22.240 I don't like.
00:02:22.900 This is misogynist.
00:02:24.640 Maybe that's the reason you're depressed.
00:02:26.420 Maybe you should go outside.
00:02:27.780 Maybe you should get some money.
00:02:29.040 Maybe you should improve your life.
00:02:30.340 No, I have a chemical imbalance and I've been scrolling for eight hours looking at girls
00:02:35.060 half-naked twerking.
00:02:36.400 Fucking idiots.
00:02:39.260 But it's made our politics kind of unreal and strange where we're not sure if elections
00:02:45.020 are real anymore.
00:02:46.280 We're not sure how much the Russians affected Brexit.
00:02:50.120 We do know that it was a crankier affair than it might have been otherwise.
00:02:53.540 You say it's bad for me as an individual.
00:02:55.420 Is it bad for me because I'm addicted?
00:02:57.160 Have I become chemically hooked?
00:02:59.220 Yes, it's terrible for you.
00:03:01.820 It's meant to control you.
00:03:03.300 They want you to be really dumb.
00:03:05.100 They want to dumb you down so you don't think for yourself.
00:03:07.460 And 95% of you all say the exact same thing.
00:03:11.720 If you don't believe me, start clipping up this stream and then watch everybody in the
00:03:16.040 comments say the exact same shit.
00:03:17.680 If you don't believe me, go outside.
00:03:19.400 If you're in school, whatever the fuck, whatever age you are, and observe people scroll.
00:03:24.100 Look how dumb they look.
00:03:25.320 You remember WALL-E, when people were in wheelchairs sitting like this and didn't even move?
00:03:31.780 You know why they want you to be like that?
00:03:34.000 So they take all the power to keep you sedated, to domesticate you.
00:03:38.680 That's why every day they tell you, you're immature.
00:03:40.980 Why don't you stop shouting?
00:03:42.420 Why don't you just be quiet like me?
00:03:45.260 Because then you're boring.
00:03:46.700 They want you to be boring.
00:03:48.080 They want you to do nothing.
00:03:49.920 They want you to sit there and consume.
00:03:53.020 More, more, more.
00:03:54.320 Buy, buy, buy.
00:03:55.780 Boobie, here's your money, OnlyFans girl.
00:04:00.320 Then you're not going to challenge anything.
00:04:02.460 And the people at the top keep getting richer.
00:04:05.140 The divide is getting bigger.
00:04:06.960 There's no more middle class.
00:04:08.380 It's fucking people here.
00:04:10.180 And then people at the top living the fucking dream.
00:04:14.800 Personally, I want to live the fucking dream.
00:04:17.240 I'm working on it.
00:04:18.600 I dropped out of school.
00:04:19.980 I'm here.
00:04:20.320 I don't have a job.
00:04:21.220 I'm working towards that.
00:04:22.400 I don't have it yet.
00:04:23.280 I'm trying to get to a bag.
00:04:24.940 And watch it keep coming back to my stream.
00:04:26.740 I'm going to get it.
00:04:27.840 Why?
00:04:28.360 Because I want it.
00:04:29.300 I have ambition.
00:04:30.420 I'm not going to sit down like you sad losers.
00:04:36.460 Pick one.
00:04:39.660 You have.
00:04:41.640 The founders of the great Silicon Valley spying empires like Facebook
00:04:46.500 have publicly declared that they intentionally included addictive schemes in their designs.
00:04:54.300 Now, we have to say, this is what I would call almost a stealthy addiction.
00:05:00.240 It's a statistical addiction.
00:05:02.280 What it says is, we will get the broad population to use the services a lot.
00:05:08.760 We'll get them hooked through a scheme of rewards and punishment.
00:05:13.440 And you know, this is 2018 before TikTok.
00:05:15.440 Look how it happens now.
00:05:16.640 The algorithm is developed to keep you hooked as long as possible.
00:05:20.420 They want you to be on the app as long as possible.
00:05:24.060 There's an algorithm that they feed you.
00:05:26.420 And you know what's so funny about it?
00:05:27.680 You can see the algorithm in the way people talk.
00:05:30.480 Because everyone says exactly what they see on their feed.
00:05:33.420 So many people have a lack of personality.
00:05:35.580 They all echo the same jokes.
00:05:37.200 They don't think for themselves.
00:05:38.780 Period.
00:05:39.760 That's sussy.
00:05:41.220 Nah, just echoing the same TikTok personalities.
00:05:44.420 You don't believe me?
00:05:45.500 Listen to people.
00:05:46.720 Everything, their whole personality, everything they say is echoed from an infographic or some
00:05:51.420 shit they read online.
00:05:52.260 It's predictable.
00:05:53.920 That's what I mean when I say bots.
00:05:56.320 You don't think for yourself.
00:05:58.360 You're not supposed to.
00:05:59.140 If you start thinking for yourself, someone calls it toxic.
00:06:02.000 Someone says you're immature.
00:06:03.240 They tell you to take pills.
00:06:05.260 People think it's a mental illness now if you think for yourself.
00:06:08.660 There's something wrong.
00:06:10.640 And the rewards are when you're retweeted.
00:06:13.380 The punishment is when you're treated badly by others online.
00:06:16.360 And then within that, we'll very gradually start to leverage that to change them.
00:06:23.060 So it's this very kind of stealthy...
00:06:26.160 Thank you, Lucky.
00:06:26.720 From the Czech Republic, donated 100.
00:06:27.940 Any chance on Quality restocking this drop?
00:06:29.640 He missed the XL on both in white and black.
00:06:31.520 We're restocking very soon.
00:06:32.640 Quality.clothing, link in description.
00:06:34.620 Manipulation of the population.
00:06:36.340 So it's not as dramatic as a heroin addict or a gambling addict, but it is the same principle.
00:06:41.440 But who's doing the manipulation?
00:06:42.740 I disagree.
00:06:43.540 I think it's as bad as a heroin addict.
00:06:45.860 Because a heroin addict, at least crackheads get up and dance a little bit.
00:06:49.940 At least crackheads are like, they got stories.
00:06:53.200 This, the algorithm addicts, they got nothing.
00:06:57.060 They're boring.
00:06:58.700 Crackheads got some jokes.
00:07:00.400 You know what crackheads do?
00:07:01.440 Crackheads go and get it.
00:07:02.840 One thing crackheads will do is they will get to it.
00:07:06.560 You algorithm people just fucking sit on your moral high ground, echoing the same shit that everyone else is saying.
00:07:12.600 Boring.
00:07:14.520 Manipulating.
00:07:15.320 I mean, there isn't some master sort of Wizard of Oz sitting behind the screen, is there?
00:07:19.860 Well, this is the peculiarity of the situation.
00:07:22.560 The people who run the tech companies like Google and Facebook are not doing the manipulating.
00:07:27.020 They're doing the addicting.
00:07:27.980 But the manipulating, which rides on the back of the addicting, is the paying customer of such a company.
00:07:36.000 And many of those customers are not at all bad influences.
00:07:39.600 They might simply be trying to promote their cars or their perfumes or whatever.
00:07:43.920 And indeed, I have sympathy for them because they're concerned that if they don't put money into the system, nobody will know about them anymore.
00:07:51.740 How is it different to just television advertising or billboard advertising or anything else?
00:07:55.740 The difference is the constant feedback loop.
00:07:58.880 So when you watch the television, the television isn't watching you.
00:08:02.540 When you see the billboard, the billboard isn't seeing you.
00:08:05.200 Now they literally farm all of your data.
00:08:09.600 And I'll bring this example up all the time.
00:08:11.280 You all know what I'm talking about.
00:08:12.500 Just the other day, I was talking about upgrading my Wi-Fi because my stream was lagging.
00:08:15.700 And then I got an email from like Spectrum Mobile.
00:08:18.780 Here's a new Wi-Fi thing.
00:08:19.920 I've never gotten an email from a Wi-Fi company in my life.
00:08:23.500 In my life.
00:08:24.420 An hour after I was having a conversation about wanting to upgrade my Wi-Fi, they emailed me.
00:08:29.040 And people will still say, well, they're not listening.
00:08:31.260 That's just a conspiracy.
00:08:32.560 Why do you not trust the government?
00:08:35.880 They're spying on you constantly.
00:08:37.720 They know literally everything about you.
00:08:39.760 When you're on TikTok all day, not only do they know your sexual orientation,
00:08:43.460 they know where you live.
00:08:44.460 They know where you're from.
00:08:45.400 They know what you like.
00:08:46.420 They know how dumb you are.
00:08:47.560 They know what you want to do.
00:08:48.680 They know who you don't like.
00:08:49.800 They know how much you fuck.
00:08:50.700 They know every single thing about you.
00:08:53.660 TV advertisements are taking a risk.
00:08:55.620 Coca-Cola, maybe you want to drink it?
00:08:57.660 The ads you get now are personalized for your dumb brain.
00:09:02.140 My dumb brain, too.
00:09:03.640 I'm just as dumb.
00:09:05.200 And vast numbers of people see the same thing on television and see the same billboard.
00:09:09.780 When you use these new designs, social media, search, YouTube, when you see these things,
00:09:16.240 you're being observed constantly, and algorithms are taking that information and changing what
00:09:20.860 you see next, and they're searching and searching and searching, and they're just blind robots.
00:09:25.880 There's no evil genius here until they find those patterns, those little tricks that get
00:09:31.340 you and make you change your behavior.
00:09:33.120 In terms of society, I mean, you, you, you, chat, how many of you have had that same experience?
00:09:37.760 Give me a WRL right now.
00:09:39.020 How many of you have gotten an ad that was something you were talking about two hours ago?
00:09:44.000 Maybe I'm just a paranoid conspiracy theorist, or maybe all of you have had the same experience.
00:09:48.660 Why do you think that is?
00:09:51.380 Why do they know exactly when to send you a specific ad?
00:09:54.760 It's mind control.
00:09:57.020 Literally.
00:09:58.360 And you think that when I say bot, I'm just like a conspiracy theorist or some shit?
00:10:01.560 But we're on this shit six hours a day.
00:10:04.300 We're connected into the matrix now.
00:10:06.580 Our whole lives are on here.
00:10:08.080 We cannot live without our phones.
00:10:10.200 We start getting anxious.
00:10:11.640 What's the first thing you do when you wake up?
00:10:13.420 The first thing you do, part of our brain is connected to this.
00:10:20.980 Everybody has had the same experience in the chat.
00:10:23.020 Everybody.
00:10:23.900 Are we all conspiracy theorists?
00:10:25.600 Every single one of you have had the same shit happen to you.
00:10:28.280 So now we have an option because you can't get rid of your phone.
00:10:31.240 Are you going to keep being a bot?
00:10:33.960 Right opinion, right opinion.
00:10:35.460 This is cute.
00:10:36.780 She's hot.
00:10:37.620 She's hot.
00:10:38.420 That's a good idea.
00:10:40.080 Or are we going to capitalize off of it?
00:10:43.420 I want to capitalize off of it.
00:10:45.220 Since I was a little kid, since I was a teenager, I noticed that everybody was on this.
00:10:49.600 So I invested all my time and energy into this.
00:10:52.820 Not as a bot program, but as somebody on the outside creating for it.
00:10:58.440 Since I was a little kid, I dropped out of school to do this shit, bro.
00:11:02.480 You threw in this, you know, it's making people depressed.
00:11:05.100 But is there any actual evidence for that?
00:11:08.640 Yeah.
00:11:09.300 Unfortunately, there's a vast amount of evidence.
00:11:11.480 There have been dozens of studies at this point, including studies released by Facebook scientists.
00:11:18.060 So this is something we can call a consensus.
00:11:21.700 And when Facebook releases such things, they say, oh, but we do all these good things, too, that balance it.
00:11:25.880 But there's...
00:11:26.400 They're lizard people.
00:11:27.360 Anybody who trusts Mark Zuckerberg, you cannot trust any of these goddamn people.
00:11:31.760 All they want to do is continue getting richer.
00:11:33.940 They do not have your best interests in mind.
00:11:35.960 They want to make more money off of you.
00:11:37.580 To make more money off of you, you need to be more and more and more addicted.
00:11:41.800 It's almost impossible to get off it now.
00:11:43.620 I'm addicted.
00:11:44.420 We all are.
00:11:45.840 I can't live without my phone.
00:11:47.260 I can't live without social media.
00:11:48.900 I need it.
00:11:50.140 But I don't want to be just getting dumber.
00:11:52.700 This shit makes you dumber.
00:11:54.500 That's why all of you have no personality now.
00:11:57.640 Period.
00:11:59.460 Yeah, that's sussy.
00:12:01.320 Remember when everybody was echoing the same Vine jokes over and over again?
00:12:04.620 Now if you go on Tinder, half the girls' bios say the way to win my heart is by quoting 2016 Vines.
00:12:11.980 All of you sound the same.
00:12:14.520 All of you.
00:12:17.200 If that's not some bot shit, I don't know what is.
00:12:20.600 There's a general acknowledgement that depression correlates.
00:12:25.360 The scariest example is a correlation between rises in teen suicide and the rise in use of social media.
00:12:32.800 And so, yes, unfortunately...
00:12:34.220 All of you think you're depressed.
00:12:35.180 It's not a chemical imbalance.
00:12:36.640 They're changing the chemicals in your brain.
00:12:38.620 They control your dopamine.
00:12:40.420 You're not born depressed.
00:12:42.160 It's taught.
00:12:43.320 You learn that shit from this.
00:12:46.960 Unfortunately, this is real.
00:12:47.900 But are you sure you can blame it on social media?
00:12:51.100 Or is it not just those two things may have happened at the same time for other reasons?
00:12:54.140 Well, here's a distinction we have to make.
00:12:56.800 It's very similar to the problem of global climate change.
00:12:59.800 We can say statistically over the whole population, yes, the correlation is real.
00:13:04.480 And any particular person, of course, we can't.
00:13:06.700 Just as we can't blame any particular storm on global warming.
00:13:10.120 But it's causality, isn't it?
00:13:11.840 Yeah.
00:13:12.500 I mean, it is causality.
00:13:14.860 And this is something that's very well demonstrated.
00:13:18.640 So when the company's own scientists are publishing on this topic and come to the same agreement,
00:13:24.400 I think it's time to say this is real.
00:13:27.560 Why have you sort of turned on your own kind?
00:13:33.500 I love Silicon Valley, and I do not at all feel that I've turned on my own kind.
00:13:37.660 And just to be clear, I'm very much a part of this.
00:13:40.120 Ask anybody who's worked in Silicon Valley.
00:13:42.940 They'll tell you the same thing.
00:13:43.860 They're just trying to farm as much data as possible.
00:13:47.060 They're all aware of what Edward Snowden was saying back in 2012.
00:13:51.160 They're taking all your data.
00:13:52.980 They know literally everything about you.
00:13:55.540 So accept it.
00:13:56.700 It's too late now.
00:13:58.040 They know every creepy video you watched.
00:14:01.140 They know how gay you are.
00:14:02.460 They know how straight you are.
00:14:03.700 They know how weird you are.
00:14:05.200 Every little thing.
00:14:06.160 They got so much dirt on you.
00:14:08.200 So you need to act as if you're always being watched.
00:14:10.520 That's all we can do.
00:14:11.800 So just act accordingly.
00:14:13.080 Stay focused.
00:14:14.540 Focus on yourself.
00:14:18.460 That's what I'm going to do.
00:14:19.740 I don't want to be out here.
00:14:20.660 I'm not a self-improvement guy.
00:14:21.540 I'm not a self-help guy.
00:14:22.580 That's what I'm going to do.
00:14:23.660 And it's working.
00:14:24.820 So far, it's working.
00:14:26.200 Why the fuck do you think my shit blew up in a day?
00:14:28.480 I figured this shit out.
00:14:29.400 This shit is easy.
00:14:30.300 Once you realize how bot-minded people are, I don't even have TikTok.
00:14:33.660 And I went viral overnight.
00:14:34.860 Everyone's messaging, you're all over my For You page.
00:14:38.080 You're all over my For You page.
00:14:39.440 Because it's easy.
00:14:40.320 It's easy to figure out what you want to hear.
00:14:42.500 It's easy to capture your attention because all of you are dumb.
00:14:45.240 As an outsider, I believe that what we're doing is not in our own self-interest.
00:14:55.540 Business interests are a part of society.
00:14:58.020 If they destroy society, they destroy themselves.
00:15:00.140 I believe it's very clear that we could offer all of the good things.
00:15:05.260 And there are many, many good things in these services, in social media in particular.
00:15:09.700 I'm convinced we can offer them without this manipulation engine in the background.
00:15:14.360 There's a world of other business plans, and I think they'd be better for us.
00:15:18.060 So I don't think we're being evil so much as we're being stupid.
00:15:24.000 He said the same shit.
00:15:25.760 He just said the same shit I said, but I'm saying it.
00:15:27.840 Why are you so loud?
00:15:29.960 You just shout so much.
00:15:29.960 Why are you so mean?
00:15:31.120 He just said the same shit.
00:15:32.040 He just called you stupid, but he said it in a nice way.
00:15:34.740 Yeah, it's not manipulation.
00:15:36.300 You guys are just stupid.
00:15:38.120 You guys are dumb.
00:15:38.900 But people hate me because of ma, ma.
00:15:40.700 But you need to hear that or you don't listen.
00:15:42.520 This guy said this shit in 2018.
00:15:45.160 2018 before TikTok.
00:15:46.740 How true is this now?
00:15:48.040 We're being evil so much as we're being stupid.
00:15:49.940 It'd be better for us.
00:15:51.460 So I don't think we're being evil so much as we're being stupid.
00:15:56.160 Stupid!
00:15:56.480 Stupid!
00:15:59.960 When it comes to Facebook, has Facebook made itself safe yet in terms of data harvesting and scraping and all that?
00:16:08.000 Well, Facebook's fundamental design is one that is, the business model is to addict you and then offer a channel to you to third parties to take advantage of that, to change you in some way without you realizing it's happening.
00:16:25.900 I mean, that's what it does.
00:16:28.080 So I don't think any amount of tweaking can fully heal it.
00:16:33.100 I think it needs a different business plan.
00:16:35.780 I mean, it's very hard.
00:16:37.140 They will make a new business plan.
00:16:39.120 They'll never change this model.
00:16:40.320 The model works.
00:16:41.100 We're all addicted to the algorithm and they don't care how dumb we get.
00:16:44.800 It's actually more beneficial the dumber you get.
00:16:47.560 The more bot-minded you are, the more time you spend on your phone, the more money they make.
00:16:51.540 That's the divide now.
00:16:52.640 It's getting harder to get to the top because they have a stronghold.
00:16:56.120 They know what you all think.
00:16:58.060 You think the people at the top, like, have gender pronouns?
00:17:01.140 You think Elon Musk took this?
00:17:03.040 You think Mark Zuckerberg took it?
00:17:04.300 You think Bill Gates?
00:17:04.880 They don't, they're not part of the same stupid shit that we're a part of.
00:17:09.440 They don't live in that.
00:17:10.480 You think Elon Musk is arguing about trans bathrooms?
00:17:13.980 No.
00:17:15.020 No.
00:17:16.060 Billionaires see us and they literally think that we're stupid.
00:17:19.620 They see us all at the bottom and we're like little ants.
00:17:24.140 Literally.
00:17:24.840 They look down at us like we're ants running around in a hill arguing about this nonsense.
00:17:29.580 My body my choice, holding these signs, protests, Black Lives Matter, Black Lives Matter, laughing at us.
00:17:35.420 It's all planned out.
00:17:36.560 It's all predictable.
00:17:37.920 While they get more and more and more money.
00:17:40.800 You think they're going to change?
00:17:41.920 Hell no.
00:17:42.420 To throw a barrage of rules at somebody who's following certain incentives and then expect them to really make a difference.
00:17:50.480 So when Mark Zuckerberg says he's taking action and, you know, he regrets what's happened and all the rest of it, you're saying he can't make his own product a safe and desirable product?
00:17:59.900 I believe that as long as his business incentives are contrary to the interests of the people who use it, who are different from the customers, then no matter how sincere he is, and I believe he's sincere, and no matter how clever he is, he can't undo that problem.
00:18:19.440 He has to go back to the basics and change the nature of the business.
00:18:22.180 And they never will.
00:18:23.160 They never will.
00:18:23.820 They're just going to become—that's why when you see, like, Zuckerberg talk and everything, you're like, how is this guy even a real person?
00:18:29.960 Because they've committed so much—they've fucked our brains up so much that they're lost, and, like, they need to justify it a little bit, but there's a little bit—they know how much they're ruining everybody.
00:18:41.620 And so they become evil.
00:18:43.700 They lose any sense of morality.
00:18:45.780 They have all empathy separated from—why do you think Mark Zuckerberg looks so scary now?
00:18:50.480 Suckabubu, I went off social media for a month.
00:18:53.300 When I got back on it, I started feeling miserable.
00:18:55.220 Same feeling as eating fried Oreos after being healthy for a while.
00:18:57.820 It feels gluttonous.
00:18:58.940 W, Sneako.
00:19:00.060 Exactly.
00:19:00.960 Business plan.
00:19:02.120 And if he doesn't agree with that and says we're just going to carry on, how important is security of that data and the inability to repeat what has happened with Cambridge Analytica and all that kind of sort of data harvesting that went on?
00:19:19.940 I don't believe that this is—I don't believe that what happened with Cambridge Analytica is the worst of it.
00:19:26.180 The whole system is designed for this.
00:19:28.760 Like, let's suppose that Facebook reforms itself so that the next Cambridge Analytica can't get access to that data.
00:19:34.780 They can still get access—
00:19:36.080 Look at them.
00:19:36.700 Look at them.
00:19:36.720 Look at them.
00:19:37.080 Look at them.
00:19:38.900 Look at them.
00:19:39.080 Look at them.
00:19:40.080 Look at them.
00:19:41.080 Is this the face of people who care about you?
00:19:43.080 Don't they literally look soulless and insane?
00:19:47.080 These look like the Get Out people.
00:19:50.080 Crazy.
00:19:51.080 All those smiles.
00:19:52.080 It's like algorithm faces.
00:19:54.080 Look at them.
00:19:55.080 Look at them staring into your soul, knowing everything—this dude right here knows every single thing about you.
00:20:07.080 —just to the same results, because the service Facebook offers is exactly what Cambridge Analytica—
00:20:15.080 They do it themselves.
00:20:16.080 Yeah, I mean, this is—you know, there are—bad actors are able to use Facebook in ways that Facebook can't understand,
00:20:26.080 because the way the service is designed is fundamentally to be manipulative.
00:20:30.080 So I think the data protection idea is a sincere and good idea, but it's certainly not adequate.
00:20:37.080 It doesn't address the core problem, which is the manipulation engine.
00:20:41.080 And as long as that is there, a bad actor can find a way to utilize it.
00:20:45.080 So, to me, this concern about data protection, while laudable, doesn't address the core problem.
00:20:53.080 Do you think they're all as bad as each other?
00:20:55.080 I mean, you know, why is something like YouTube, which is basically just a way of watching video, bad for you?
00:21:02.080 YouTube—
00:21:04.080 You know why YouTube is bad for you?
00:21:06.080 Because you start talking about this, and they strike your channel for misinformation.
00:21:10.080 I started talking about the Coco, started talking about this.
00:21:12.080 They deleted the video, and they gave me a community guidelines warning.
00:21:15.080 If I talk about this again, they can strike it, and I cannot stream.
00:21:18.080 If this shit doesn't wake you up, I don't know what will.
00:21:21.080 I have a community guidelines strike for talking about the Coco.
00:21:24.080 If you say the wrong thing, if you have the wrong opinion, it's misinformation.
00:21:29.080 They tell you now what you cannot—
00:21:31.080 They tell you now what you can and can't say.
00:21:33.080 That doesn't ring bells in your head.
00:21:36.080 You can really look at this, and, like, nobody cares.
00:21:39.080 I talk about this on a regular—
00:21:40.080 Well, YouTube, they have their rules.
00:21:43.080 They have their community guidelines.
00:21:45.080 What happened to freedom of speech?
00:21:47.080 Are you shitting me?
00:21:49.080 Community guidelines strike for misinformation?
00:21:52.080 Why do they get to— Why are they in charge of information?
00:21:57.080 Why are you okay with Zuckerberg and Susan telling you what information is correct or not?
00:22:03.080 They clearly don't have your best interest in mind.
00:22:06.080 So if you have the wrong opinion, if you say something that challenges the programming,
00:22:11.080 misinformation, community guidelines strike.
00:22:14.080 Warning.
00:22:15.080 Content deleted.
00:22:16.080 It's not necessarily bad for you.
00:22:20.080 Remember, this is a statistical distribution.
00:22:22.080 So for some percentage of people, it'll have an effect of making them crankier around election time
00:22:28.080 and feeling needier around the time they might be making a purchase and so forth.
00:22:32.080 And the way it works is that all the data Google can get on you, much of which comes from just your email or whatever else it might be,
00:22:41.080 is fed into an engine that compares you with other people who share some similar traits.
00:22:46.080 And YouTube's ordering of videos that are presented to you is designed to, on the one hand, maximize your engagement so you won't stop watching,
00:22:55.080 but that's achieved not just by observing you but by a multitude of people who are similar to you.
00:23:01.080 And then when you do get an ad, it's contextualized in a way that has been shown to be effective not only for you but for this whole population.
00:23:08.080 So it's this giant statistical thing.
00:23:11.080 And it's bad for you because it leeches your free will.
00:23:14.080 It makes you cranky.
00:23:15.080 It makes you depressed.
00:23:17.080 It makes you all think you have a chemical imbalance in your brain.
00:23:21.080 And it's getting worse and it's getting faster and our attention spans are getting worse and worse and worse, bro.
00:23:26.080 You know that faint noise off in the background when you hear someone just playing TikToks.
00:23:34.080 Just scroll, scroll, scroll, scroll.
00:23:35.080 And this is the noise they make.
00:23:36.080 So pick one.
00:23:46.080 I'm telling you, man.
00:23:47.080 You start talking about this shit, look at that.
00:23:49.080 This dude, 28,000 followers.
00:23:51.080 This dude, 80,000.
00:23:56.080 Bro, I started the stream and he got 5,000 more followers since I started the stream.
00:24:02.080 Posting these clips.
00:24:04.080 Insane, man.
00:24:06.080 The world a little darker because you're not perceiving reality clearly anymore.
00:24:10.080 You're being, it's being manipulated.
00:24:12.080 It's being tricked in a way.
00:24:15.080 And it, the people who are paying or maybe not paying, just using the system to, in a clever way to get at you, are not necessarily pleasant people.
00:24:28.080 They're, they're, they're, they're sort of the worst actors.
00:24:30.080 They're lizards.
00:24:31.080 But don't, don't some users think, look, I can handle advertising.
00:24:35.080 You know, I know what I'm doing here.
00:24:36.080 I'm getting a free service.
00:24:38.080 Uh, and you know, they think they're manipulating me, but I know what I'm doing.
00:24:43.080 Shout out to our new members.
00:24:44.080 Thank you, BVH.
00:24:45.080 The problem is that behaviorist techniques are often invisible to the person who's being manipulated.
00:24:51.080 And, and this has a long history.
00:24:53.080 This has been done for a long time.
00:24:55.080 Uh, it used to be that the only way to be subjected to continuous observation and modification was to either be in an experiment.
00:25:03.080 Uh, you could be in the basement of a psychology building and have students tweaking you for their projects.
00:25:08.080 Or you could join a cult.
00:25:10.080 Or you could be in an abusive relationship.
00:25:12.080 I mean, this has been done before.
00:25:14.080 And often the people who are in these situations do not realize it's happening to them.
00:25:18.080 In fact, the whole point is that it's, it's sneaky.
00:25:21.080 It's, it's a, it's a mechanical approach to manipulating people.
00:25:25.080 And because...
00:25:26.080 But!
00:25:27.080 It's, it's so algorithmic.
00:25:29.080 It doesn't involve direct communication and people don't get the cues to understand what's happening with them.
00:25:34.080 Why do you think, um, social media has had the effect on politics that it has?
00:25:39.080 You know, is it because of the way people respond to...
00:25:42.080 You know what happens when you step out of social media?
00:25:44.080 When you get off Twitter for a bit, you start realizing,
00:25:46.080 Oh shit, nobody cares about my opinion.
00:25:49.080 Nothing I'm saying really matters.
00:25:51.080 But now, because everyone feels important, they start posting a black square.
00:25:55.080 Everyone posts a black square when Black Lives Matter happens.
00:25:58.080 Because I am a champion of black people!
00:26:01.080 And I support black people!
00:26:03.080 No one gives a fuck!
00:26:04.080 You step outside of that, you're like,
00:26:06.080 Whoa!
00:26:07.080 I'm irrelevant!
00:26:09.080 I don't matter at all!
00:26:11.080 When you get back on it, this is...
00:26:13.080 While I think about this, this is my opinion on Ukraine.
00:26:17.080 Ukraine should...
00:26:19.080 Putin...
00:26:20.080 This is my message to Putin.
00:26:21.080 Putin, you should not start World War 3!
00:26:23.080 He doesn't give a fuck!
00:26:24.080 Dummy!
00:26:27.080 Everybody thinks they're a politician now!
00:26:29.080 Everybody thinks that their voice matters!
00:26:31.080 And it doesn't!
00:26:32.080 We're all just making noise!
00:26:34.080 It's just chatting!
00:26:36.080 None of this has any effect!
00:26:38.080 On nothing!
00:26:39.080 You think Putin's like,
00:26:40.080 Hmm, I was gonna nuke?
00:26:41.080 But then, this girl on TikTok posted an infographic.
00:26:44.080 I think I'm actually gonna call off the strike.
00:26:47.080 I think I'm not gonna invade Ukraine anymore,
00:26:49.080 because of this infographic.
00:26:51.080 ...to things on social media.
00:26:56.080 Well, I'd like to give you a slightly detailed answer as quickly as I can.
00:27:02.080 And that is that, in traditional behaviorism, you would give an animal or a person a little treat like candy,
00:27:09.080 or maybe an electric shock, and you'd go back and forth between positive and negative feedback.
00:27:14.080 And when researchers try to determine whether positivity or negativity is more powerful,
00:27:19.080 they're roughly at parity.
00:27:20.080 They're both important.
00:27:22.080 But the difference with social media is that the algorithms that are following you respond very quickly.
00:27:29.080 They're looking for the quick responses.
00:27:31.080 And the negative responses...
00:27:33.080 There's one big boy, though.
00:27:34.080 ...is like getting startled or scared or irritated or angry,
00:27:37.080 tend to rise faster than the positive responses, like building trust or feeling good.
00:27:45.080 Those things rise more slowly.
00:27:47.080 So the algorithms naturally catch the negativity and amplify it,
00:27:51.080 and introduce negative people to each other and all of this.
00:27:54.080 And so what this does is it means that the algorithms discover there's more engagement possible,
00:28:00.080 say, by promoting ISIS than promoting the Arab Spring.
00:28:03.080 And so ISIS gets more mileage. Or promoting the Ku Klux Klan rather than Black Lives Matter.
00:28:09.080 Now, in the big picture, it's not true that negativity is more powerful.
00:28:13.080 But if you're doing this very rapid measurement of human impulses instead of accumulated human behavior,
00:28:20.080 then it's the negativity that gets amplified.
00:28:22.080 So you tend to have elections that are more driven by rancor and abuse,
00:28:27.080 and you tend to have outcomes that are kind of crazy.
00:28:31.080 And so the...
00:28:32.080 King J, do 10 push-ups right now. Don't skip your workout for the stream.
00:28:34.080 Do 10 push-ups and listen to the stream.
00:28:36.080 The effects on the media we consume, the news as well, is also alarming,
00:28:40.080 because then it'll be the news that makes people angry
00:28:44.080 that is the news that gets seen in the future or now,
00:28:47.080 rather than, you know, a more balanced diet of what's really going on in the world.
00:28:53.080 Well, I think what goes on on a show like this is that you have a bit of a longer time horizon
00:28:59.080 by which you measure success. So you have to impress your viewership enough to tune in.
00:29:05.080 But this is over a process of days and weeks and months and years.
00:29:08.080 My advice? Stop caring about being likable so much.
00:29:12.080 When you start getting out of the bot shit, you're going to offend a lot of bots.
00:29:16.080 And you just got to bob and weave all of it.
00:29:18.080 I'm triggered.
00:29:19.080 Don't say that.
00:29:20.080 But you don't care about Ukraine?
00:29:22.080 But what about your empathy?
00:29:23.080 Who? Who? Who? Who? Who?
00:29:25.080 Say what you really think.
00:29:27.080 Nobody says what they think anymore because so many people get offended by it.
00:29:30.080 So just let them be offended.
00:29:32.080 They like being offended.
00:29:33.080 That's their thing.
00:29:34.080 Okay, go ahead, bot.
00:29:35.080 Be mad.
00:29:36.080 Be mad.
00:29:37.080 If you get that mad at what I think, I'm not going to change your mind.
00:29:41.080 You want me to sit here and argue with you?
00:29:43.080 I can't say what I think because of you?
00:29:45.080 Fuck that.
00:29:46.080 Dr. Two smooth said, thank you.
00:29:47.080 Please react to how to fix it.
00:29:48.080 Put the, anything you wanted me to react to, put it in the discord.
00:29:53.080 And you build up a sense of rapport with your, your viewership, right?
00:29:57.080 Um, if you're an algorithm that's just looking at instant responses,
00:30:01.080 you don't get that.
00:30:02.080 It's just like, how did I...
00:30:03.080 W, Howard in the chat.
00:30:04.080 ...engage this person and it'll be, you'll find that engagement more often
00:30:09.080 by irritating people than by educating them.
00:30:12.080 And so, is that how you create...
00:30:14.080 What are you saying?
00:30:15.080 ...then by educating...
00:30:16.080 And it'll be, you'll find that engagement more often by irritating people than by educating them.
00:30:21.080 Wrong.
00:30:22.080 Why do you think Trump won?
00:30:24.080 Because they thought that by saying he was triggering every day in the news that he was going to lose.
00:30:30.080 Every single day they tried to cancel him.
00:30:32.080 Every single day.
00:30:33.080 Instead, he just got more publicity.
00:30:35.080 He kept triggering people every day and got famous for triggering people.
00:30:38.080 And there's people who get it and people who don't.
00:30:40.080 People who like him and people who hate him.
00:30:42.080 All publicity really, at the end of the day, is good publicity in this game.
00:30:46.080 In this game of politics, in this game of entertainment.
00:30:49.080 That's why Tate blew up.
00:30:50.080 That's why Trump blew up.
00:30:51.080 That's why Kanye's still relevant.
00:30:52.080 It keeps him relevant.
00:30:53.080 Because the same amount of people are like,
00:30:54.080 Don't say so...
00:30:55.080 It keeps him relevant.
00:30:59.080 And so, is that how you create Trump?
00:31:03.080 Yeah.
00:31:04.080 Or Duterte.
00:31:05.080 Or, you know, any of the other populist leaders who are doing very well at the moment.
00:31:09.080 Partly from the internet.
00:31:11.080 I have never known Trump, but I have met him a few times over a fairly long period.
00:31:16.080 Over 30 years, actually, through different circumstances.
00:31:19.080 And I will say that, while I never would have voted for him as president, and I always thought
00:31:25.080 he was somewhat untrustworthy and a bit of a showman and a bit of a scammer.
00:31:31.080 He never lost himself and became so strangely insecure and so weirdly irritable until he had
00:31:41.080 his own addiction, in this case to Twitter.
00:31:43.080 And it's really damaged him.
00:31:45.080 I mean, I view Trump in a way as a victim.
00:31:48.080 Really?
00:31:49.080 Oh yeah, absolutely.
00:31:50.080 Trump's a victim?
00:31:52.080 His character has been really damaged by his Twitter addiction.
00:31:55.080 Because of the reaction he gets from each tweet?
00:31:57.080 Yeah.
00:31:58.080 So, you know what happens in addiction is the addict becomes, not just on the good part
00:32:04.080 of the addiction experience, but on the whole cycle.
00:32:06.080 Yeah.
00:32:07.080 You know those people who just argue on Twitter all goddamn day long?
00:32:09.080 I used to be one of those guys like going back and forth, like arguing about atheism
00:32:13.080 or 9-11 and shit like that.
00:32:15.080 And then you step out of it, you're like, yeah, I destroyed his argument.
00:32:18.080 I really got him.
00:32:19.080 It's just like the same people in the chat now saying like, you're, you're dumb.
00:32:22.080 I don't like you.
00:32:24.080 They leave the stream and they feel like, because they're just waiting for me to
00:32:28.080 react.
00:32:29.080 They want me to say something to it.
00:32:30.080 And you just gotta, ha ha, whoop, whoop, whoop.
00:32:34.080 Let them be mad.
00:32:35.080 Because anybody who's like trying to get a reaction or something like that, anybody
00:32:38.080 who's engaging in this, in too much negativity, it makes you miserable.
00:32:43.080 A gambler is not just addicted to winning, but to this whole process where they mostly lose.
00:32:49.080 And in the same way, the Twitter addict or the social media addict becomes addicted
00:32:54.080 to this engagement, which is often unpleasant, where they're engaged in these, you know, really
00:33:00.080 abusive exchanges with other human beings.
00:33:02.080 And only once in a while is that, you know, you'll, you can watch Trump.
00:33:06.080 Like every once in a while, there'll be this tweet where somebody likes him.
00:33:09.080 And that's when he gets his little, we call it in the trade, the dopamine hit.
00:33:13.080 That's what it's called in Facebook, for instance.
00:33:16.080 He gets his little dopamine hit and then he dives in for more negativity and things.
00:33:21.080 Then he gets it again.
00:33:22.080 And you can see the addiction playing out.
00:33:24.080 And do you think it's possible to create a do-gooding social network?
00:33:29.080 Yes.
00:33:30.080 I'm absolutely positive.
00:33:31.080 And the way to do it is to have a different business model where instead of...
00:33:34.080 So right now, we've created this bizarre society, it's unprecedented, where if any two
00:33:39.080 people wish to communicate over the internet, the only way that can happen, the only way it's
00:33:44.080 financed is through a third party who believes that those two...
00:33:47.080 Mods, clean up this chat.
00:33:48.080 What are you all doing, man?
00:33:49.080 ...can be manipulated in a sneaky way.
00:33:51.080 It's an insane way to structure civilization.
00:33:54.080 So we can keep all the good stuff, and there is good stuff on social media, of course.
00:33:59.080 We can keep all that and just throw away the manipulation business model and substitute
00:34:04.080 in a different business model.
00:34:05.080 And there are many alternatives that would be better.
00:34:08.080 They just have to be honest.
00:34:10.080 It could be a paid service, like a Netflix, where you're paying for it.
00:34:13.080 You're the genuine customer.
00:34:15.080 It has to keep your interest.
00:34:17.080 It could be like a public library.
00:34:18.080 It could become a public thing that isn't commercial at all.
00:34:22.080 That's an option.
00:34:24.080 But what we did in Silicon Valley is we wanted it both ways.
00:34:27.080 We wanted everything open and free, but we wanted hero entrepreneurs and hackers.
00:34:31.080 And so the only way to get that was this advertising thing that gradually turned into the manipulation
00:34:36.080 engine as the computers got faster.
00:34:39.080 And this weird business plan, once you can see that there are alternatives, you realize how strange it is
00:34:45.080 and how unsustainable it is.
00:34:47.080 Yeah.
00:34:48.080 This is the thing we must get rid of.
00:34:49.080 So stop being a bot.
00:34:50.080 Stop trying to be morally right all the time.
00:34:53.080 That's my advice.
00:34:54.080 If I have anything, I've been doing this for a while, there's no point in trying to be like,
00:34:58.080 this is the right thing.
00:34:59.080 And if you don't agree, then I'm sad.
00:35:01.080 And you don't have empathy.
00:35:02.080 You're a bad person.
00:35:04.080 Why would you?
00:35:05.080 Why would you?
00:35:06.080 Why would you say that?
00:35:07.080 Why would you?
00:35:08.080 Let's just get money on it.
00:35:09.080 Let's just profit off of it.
00:35:10.080 That's the best thing you can do off of this.
00:35:12.080 And so you're not going to change people's mind.
00:35:14.080 It's all just noise.
00:35:15.080 The world is the way it is.
00:35:17.080 Our voice doesn't really have any effect.
00:35:19.080 So you can make people laugh, make some money off of it.
00:35:22.080 That's it.
00:35:23.080 You can be entertained.
00:35:24.080 But if you're going to, well, you must click the GoFundMe link that I have here.
00:35:28.080 And this is the infographic that has the right thing.
00:35:30.080 And this is the information that's correct.
00:35:32.080 And if you don't put on this, then you're bad.
00:35:35.080 Ugh.
00:35:36.080 Ugh.
00:35:37.080 We don't have to get rid of the smartphone.
00:35:39.080 We don't have to get rid of the idea of social media.
00:35:41.080 We just have to get rid of the manipulation machine that's in the background.
00:35:45.080 Just one last thing as well that is also obsessing parents at the moment.
00:35:50.080 Screen time itself, do you think that is a bad thing?
00:35:53.080 Or is it just what's on the screen?
00:35:55.080 To be frank with you, I struggle with this question because I have an 11-year-old.
00:36:01.080 And so I tend to think that manipulation time when the kids are being observed by algorithms
00:36:08.080 and tweaked by them is vastly worse than just screen time by itself.
00:36:14.080 Someone just donated five and said, how do you get out of the bot mentality?
00:36:17.080 You see it for what it is.
00:36:20.080 You see that we're all just little ants running around yelling about nothing.
00:36:23.080 We all get dumber the more scrolling we do.
00:36:25.080 So don't be one of the bots.
00:36:28.080 Don't be one of the...
00:36:30.080 Don't be like the WALL-E people.
00:36:32.080 Make money off of it.
00:36:33.080 What are the people at the top doing?
00:36:35.080 They're making money off this shit.
00:36:37.080 They're profiting off of social media.
00:36:38.080 Be like them.
00:36:40.080 Don't just be in here consuming, echoing this, this is the right thing to say.
00:36:44.080 Don't do that.
00:36:45.080 Don't engage in arguments.
00:36:46.080 Just ignore it all and get money.
00:36:49.080 So...
00:36:50.080 Do you include video games in the social media?
00:36:53.080 You know, the things that are manipulating them?
00:36:55.080 Because they are similarly addictive, aren't they?
00:36:57.080 They're addictive but not manipulative typically.
00:37:00.080 Now, here I'm not sure how evil we've become lately because there might be some video games that are using behavior mod techniques for pay.
00:37:09.080 That's conceivable.
00:37:10.080 I can see how that could happen.
00:37:12.080 If you're thinking about it out there, don't do it, okay?
00:37:15.080 Find something better to do.
00:37:17.080 But the mainstream video games are not doing that.
00:37:20.080 They are addictive.
00:37:22.080 So there are plenty of things that are addictive that aren't leveraging that for manipulation.
00:37:26.080 So these are two different stages.
00:37:28.080 What do you think of Fortnite?
00:37:30.080 I have not played it.
00:37:31.080 You haven't played it?
00:37:32.080 Because Fortnite is exactly that.
00:37:33.080 It's getting people to pay for things within the game.
00:37:37.080 No, but see, the thing is, getting them to pay is still not manipulating them for a third party.
00:37:42.080 That's getting them to buy stuff.
00:37:44.080 I mean, Amazon does that to get you to buy stuff.
00:37:47.080 All kinds of people do that.
00:37:49.080 That might be annoying.
00:37:51.080 You might have...
00:37:52.080 Here's a good example of bot mentality.
00:37:54.080 Black Lives Matter, Ukraine, every single situation, the pandemic, there's always people making money off of it.
00:38:00.080 Every time there's people enraged giving GoFundMe links and telling you what to do, there's always somebody who's making all the bread off of it.
00:38:07.080 Black Lives Matter, the guy who made all the money from all your dono links, is a white dude.
00:38:11.080 All the GoFundMe links to Ukraine, most of them were a scam.
00:38:15.080 The Pandy, people were really outside getting money while the rich people at the top were looking at all this, like, laughing, while we stayed inside.
00:38:22.080 They were laughing at us.
00:38:23.080 And we were just like,
00:38:25.080 Stop the spread! Stop the spread!
00:38:28.080 Handing out masks in the park, like making mean eye contact with you.
00:38:33.080 Put it on!
00:38:35.080 People are laughing at us while we all walk into a restaurant or onto a plane, take it off so that...
00:38:42.080 We walk on a plane, all of it on, take it off so that we can all eat, breathing everywhere, put it back on.
00:38:47.080 The people at the top are just laughing.
00:38:50.080 We look goofy.
00:38:52.080 We look so stupid.
00:38:55.080 Get out of that.
00:38:57.080 Every single drought, people are making money.
00:39:02.080 So every time you see one of those situations, George Floyd, anything like that, when there's mob mentality, think about it this way.
00:39:08.080 Someone is scamming you.
00:39:10.080 Someone with way more power, someone with way more money is scamming you while you are trying to do the right thing and save the world.
00:39:17.080 But ultimately, we can't save the world, no matter what.
00:39:21.080 Our vote doesn't matter.
00:39:23.080 This doesn't change anything.
00:39:25.080 No matter how many of this you take, we have no control.
00:39:28.080 They have all the control.
00:39:30.080 They're taking the money while we run around and get scared.
00:39:33.080 So don't get scared.
00:39:34.080 Just bob and weave.
00:39:37.080 Object to it.
00:39:39.080 Especially if you feel your kids are wasting money, you might object to it.
00:39:42.080 You might feel it's not an ideal example of human behavior and character and maybe there could be a better business, whatever.
00:39:50.080 But it's not directly manipulating you, say, to influence an election.
00:39:55.080 It's not trying to change your behavior out in the larger world.
00:39:59.080 And that's the thing that's really tragic about designs like Facebook and Google.
00:40:03.080 They are succeeding at doing that.
00:40:05.080 But your advice tonight to everyone watching this is delete all your accounts.
00:40:11.080 I would like to make.
00:40:13.080 No, don't delete your accounts.
00:40:14.080 Just stop.
00:40:15.080 Just realize what it is.
00:40:17.080 Stop falling into the fear.
00:40:19.080 They're spreading fear.
00:40:20.080 The best manipulation tactics, if you read like any book about control or power, the best manipulation is with fear.
00:40:26.080 And what happened the past two years?
00:40:28.080 Everybody got scared.
00:40:33.080 And then you're easy to control.
00:40:35.080 You'll do whatever they say.
00:40:36.080 Because people will do anything to stop being scared.
00:40:42.080 Pick one.
00:40:43.080 Don't delete it.
00:40:44.080 Just see it for what it is.
00:40:45.080 Every time you're scared, someone's looking down.
00:40:48.080 Ha ha.
00:40:49.080 Dummy bot.
00:40:52.080 Two very quick pitches on that account.
00:40:55.080 One, if you're a young person and you've only lived with social media, your first duty is to yourself.
00:41:02.080 You have to know yourself.
00:41:03.080 You should experience travel.
00:41:05.080 You should experience challenge to yourself.
00:41:07.080 You need to know yourself.
00:41:08.080 And you can't know yourself without perspective.
00:41:11.080 So at least give it six months without social media.
00:41:14.080 Yeah.
00:41:15.080 And really quit them.
00:41:16.080 Don't like quit Facebook and keep another Facebook thing.
00:41:19.080 People are in the chat saying like, oh, but you make money on social media.
00:41:21.080 I didn't have Instagram until like a couple of years ago.
00:41:24.080 All throughout high school, I didn't have Instagram.
00:41:26.080 I didn't have Snapchat until too late.
00:41:28.080 I even took long breaks with YouTube.
00:41:29.080 I've been on YouTube since 2013.
00:41:31.080 I took years off.
00:41:33.080 I took years to really learn myself and develop my own brain because I saw that it was manipulating people.
00:41:39.080 I needed to be on my own to really figure myself out before I could be on the internet all the time.
00:41:44.080 Now I really feel like I know myself.
00:41:46.080 That's why I could stream every day and I could bob and weave all the fucking haters, all the people saying,
00:41:50.080 And so misogynist, you're, you're just spreading your, you're dying.
00:41:55.080 I know how to slip all that because I know myself.
00:41:58.080 It took me years of traveling, of talking to people, of being introspective, being alone, going to different places to finally develop the mental strength to do this shit every single day.
00:42:09.080 And now it's so confusing to people because people never get to that point every day.
00:42:13.080 They'll write essays and they'll say shit like you need to take your pills.
00:42:16.080 You need to calm down.
00:42:17.080 Stop doing one day, one day you'll mature.
00:42:20.080 One day you're going to take your therapy and you know, I'm not going to.
00:42:25.080 The more you say that, the more whoo, whoo, slip, slip, the more I'm going to keep making money streaming.
00:42:31.080 And the more money that y'all are going to make, bro, going viral on TikTok.
00:42:35.080 I guarantee you don't believe me, start posting my stream clips and see all the bots in the comments section.
00:42:40.080 Guarantee you.
00:42:41.080 Like WhatsApp, because then it'll still be spying and manipulating.
00:42:45.080 Get rid of the whole thing for six months and know yourself and then you can decide.
00:42:49.080 I can't tell you what's right.
00:42:50.080 You have to decide, but you can't until you know yourself.
00:42:53.080 And then for the rest of society, I'd say as long as we can have some small percentage of people who are off it,
00:43:00.080 then the society can have voices to give perspective.
00:43:03.080 If everybody's universally part of this thing, we cannot have perspective.
00:43:07.080 We cannot have a real conversation.
00:43:09.080 And it's too lonely right now.
00:43:11.080 You know, we need more people who are just outside of that loop,
00:43:15.080 who are thinking without the manipulation.
00:43:17.080 And I think we'll find it extraordinarily valuable to have them.
00:43:21.080 Are you just a new age hippie?
00:43:24.080 I mean, see, that's what I used to say a couple of years ago.
00:43:27.080 People are going to call me a conspiracy theorist or a hippie or like I'm a boomer or some shit like that.
00:43:32.080 Fine.
00:43:33.080 Keep going on TikTok.
00:43:34.080 Keep scrolling.
00:43:35.080 Anyone who's really like doubting what I'm saying right now.
00:43:38.080 I don't think that they're they're really happy with their lives.
00:43:40.080 People like in the chat who get it right now, they're saying, why are there so many goofy people in here?
00:43:44.080 You're seeing the goofy people because a lot of people don't want to change.
00:43:47.080 A lot of people are addicted and they don't want to see it.
00:43:49.080 And they're getting triggered by the stuff I'm saying.
00:43:51.080 But a lot of you, the reason you're still here is because, you know, I'm saying a little bit of truth.
00:43:55.080 You resonate a little bit with what I'm saying.
00:43:57.080 You know that this shit is not making you happy.
00:43:59.080 You know, when you lay for five hours on your side, scrolling, scrolling, scrolling, scrolling, scrolling.
00:44:03.080 You wake up out of it.
00:44:04.080 It's like, whoa, come on, man.
00:44:09.080 Someone donated five and said, if you chose any voiceover commentary niche on YouTube that isn't saturated, needs more creators, what would it be?
00:44:14.080 Just stick to whatever you believe.
00:44:16.080 Be more honest with yourself.
00:44:18.080 It's not any niche.
00:44:19.080 Just people who have a unique voice.
00:44:21.080 That's going away because everyone's scared of getting canceled.
00:44:24.080 Have you just been through the mill and kind of worked out?
00:44:28.080 I want to take out all this.
00:44:29.080 And let's just stop.
00:44:31.080 Do I seem new age to you?
00:44:34.080 I don't know.
00:44:35.080 I mean, you know, I mean, here's what I'll tell you.
00:44:39.080 The bind you've put me in is that I'd be happy to trash the new age and demonstrate that I'm not part of that manner of thinking.
00:44:47.080 I'm certainly not.
00:44:48.080 I think I hope I've come across as a non utopian.
00:44:50.080 But the problem is many of my friends in California are quite new age.
00:44:55.080 So I want to be kind to them.
00:44:57.080 Good.
00:45:00.080 Good.
00:45:01.080 Good video.
00:45:02.080 What did you think, chat?
00:45:04.080 What did you think?
00:45:06.080 And watch this.
00:45:07.080 As soon as I start talking about this, they're going to find something to say that's misinformation.
00:45:11.080 That's what happened yesterday.
00:45:12.080 Community guidelines strike for misinformation.
00:45:15.080 What do you think?
00:45:16.080 Why do you think that what I'm saying is called misinformation by all these people at the top with more money than all of us?
00:45:22.080 Right.