Real Coffee with Scott Adams - May 01, 2023


Episode 2095 Scott Adams: Instagram Brain Control Proven, GOP Will Control Congress, AI Disappoints


Episode Stats

Length

1 hour and 8 minutes

Words per Minute

148.7

Word Count

10,205

Sentence Count

830

Misogynist Sentences

12

Hate Speech Sentences

24


Summary

Satan is actually applying for a job in Dilbert's office, and the boss says, "I see your prior experience includes being expelled from heaven and dooming souls for eternity. Are you qualified for anything else?"


Transcript

00:00:00.400 La-da-da-da-da-da-da-da.
00:00:04.160 Good morning, everybody, and welcome to the highlight of civilization.
00:00:09.500 It's called Coffee with Scott Adams, and there's never been a better time in, well, your life,
00:00:16.600 and certainly not this morning.
00:00:18.340 But if you'd like this day to take off like a Falcon rocket, not the latest one, maybe
00:00:25.020 a future one, then all you need is a cup or a mug or a glass, a tank or chalice or stein,
00:00:31.640 a canteen jug or flask, a vessel of any kind.
00:00:35.280 Fill it with your favorite liquid.
00:00:36.580 I like coffee.
00:00:38.480 Join me now for the unparalleled pleasure.
00:00:41.720 It's the dopamine of the day, the thing that makes everything better.
00:00:44.760 It's called the simultaneous sip, and it happens now.
00:00:48.280 Go.
00:00:53.420 Ah.
00:00:55.020 Now that's the way to get that day going.
00:01:00.560 All right, let's talk about all the stuff that's happening.
00:01:04.240 As your virtual friend who always talks to you in the morning, I need to keep you up to
00:01:09.480 date on what's happening.
00:01:11.440 If you're not a subscriber to the Locals, scottadams.locals.com channel, there's a good chance you did not see
00:01:20.500 my new provocative comic there today.
00:01:24.840 It's called Dilbert Reborn.
00:01:27.240 And I'm introducing a new character in the office.
00:01:30.060 It's Satan.
00:01:32.560 Satan.
00:01:32.920 Now, why this is important, Satan's actually applying for a job in Dilbert's office,
00:01:37.840 is that in 1989, I think it was, maybe 90, I drew a comic in which Satan, same Satan, very same character,
00:01:51.660 tried to get a job at Dilbert's office.
00:01:54.720 And my editor at the time said, whoa, no, you can't put Satan in your comic strip.
00:02:02.540 If you put Satan in your comic strip, people will think that maybe you worship Satan.
00:02:08.940 They might think that you're a little bit too much in favor of Satan, if you put it in the comic.
00:02:15.320 And I said, I don't think people are going to think that.
00:02:18.680 I think they'll just think it's a joke.
00:02:21.240 And my editor said, you do not understand the Midwest.
00:02:25.460 Trust me.
00:02:26.160 So I did trust her, and instead of putting Satan in my strip, I introduced a milder character.
00:02:32.500 Do you remember him?
00:02:34.100 Do you remember what character I did introduce?
00:02:37.600 That's right.
00:02:38.760 His name was Phil, and he was the ruler of heck.
00:02:43.220 So I couldn't put the ruler of hell in my comic strip.
00:02:46.680 I had to tone it down, so I invented Phil, the ruler of heck.
00:02:52.200 And as the ruler of heck, he did not have a pitchfork.
00:02:55.000 Those are kind of scary.
00:02:57.060 So instead, he had a huge spoon, ruler of heck.
00:03:01.120 But now I can introduce the actual Satan, who will be fun because he doesn't wear pants.
00:03:07.060 He has no pants.
00:03:10.240 That'll be an important fact in an upcoming strip.
00:03:14.640 But here he is applying for a job, and the boss says, I see your prior experience includes being expelled from heaven and dooming souls for eternity.
00:03:28.160 But we don't have any openings in human resources.
00:03:31.280 Are you qualified for anything else?
00:03:33.480 And then Satan says, people say I'm a good liar.
00:03:37.040 And the boss says, we'll start you in sales.
00:03:39.280 So Satan will be the director of sales at Dilbert's Company.
00:03:46.780 So just wait for that.
00:03:48.940 I'll give you a tease of what's up.
00:03:52.840 The character Alice in my comic strip.
00:03:55.780 You might know that she has a thing for bad boys.
00:03:59.520 And now Satan is actually working in her office.
00:04:02.740 So sparks are going to fly.
00:04:05.560 All right.
00:04:05.940 Look for that later.
00:04:07.600 All right.
00:04:07.980 I love having my artistic freedom.
00:04:10.760 Oh.
00:04:12.980 I love artistic freedom.
00:04:15.540 I sit down to make a joke, and I can make it about anything I think is funny.
00:04:20.580 I never could do that before.
00:04:22.020 It always had to be, all right, what does the most sensitive person in the world think is okay?
00:04:29.720 All right.
00:04:30.020 That's now my boundary.
00:04:31.040 So within the realm of the most sensitive person in the world, what can I create?
00:04:38.540 It was like being in jail.
00:04:40.520 It was terrible.
00:04:42.600 Getting canceled, I know you don't believe this, but in terms of a lifestyle mental release,
00:04:50.420 being canceled from the mainstream media was really good for me.
00:04:55.700 Like, it was really good for my mental health and everything else.
00:05:00.600 So I'm free.
00:05:02.480 Chicago is complaining they've got a humanitarian crisis from too many immigrants coming in.
00:05:10.380 Most of them illegal, I guess.
00:05:11.980 Well, I guess they'd technically be legal, right?
00:05:13.980 But if a migrant or immigrant applied for asylum and was waiting for an answer, they would be here legally.
00:05:23.580 Legally with an L.
00:05:25.320 So technically they would be legal, I guess.
00:05:27.660 But they're having a humanitarian crisis.
00:05:30.460 And why do we ship people who don't have anything to cities that are cold?
00:05:41.080 Why in the world would anybody go if they didn't have any place to stay?
00:05:45.820 Why would you go to the one place that you would die if you stayed outside from one thing or another, either crime or temperature?
00:05:53.140 But here's one more reason to move away from cities.
00:05:58.720 Cities are dead.
00:06:00.280 Move away from cities.
00:06:02.520 As long as you can get good internet somewhere else, move to that place.
00:06:08.760 Get away.
00:06:10.660 All right.
00:06:11.840 There's a poll that says nearly three-quarters of U.S. adults say the news media is increasing political polarization.
00:06:23.140 You think that's right?
00:06:25.800 Three-quarters of U.S. adults say the news media is increasing the polarization in this country.
00:06:31.660 Let's see.
00:06:32.820 If three-quarters think that the news media is increasing polarization,
00:06:38.160 how many, what percentage would think that the news media is not?
00:06:44.620 Oh, yeah, 25%.
00:06:46.340 25%.
00:06:47.340 Yeah.
00:06:47.900 If you're new to the live stream, you don't know why we're all laughing.
00:06:55.020 I'll give you an update.
00:06:56.520 Long ago, I postulated that no matter what the question is in a poll,
00:07:02.140 it doesn't matter what topic it is at all,
00:07:05.060 25% of the respondents will have the dumbest fucking answer you've ever heard in your life.
00:07:09.880 I don't know if it's the same 25%.
00:07:13.540 I think it's not.
00:07:15.840 I think there's just some kind of universal rule where 25% of the public,
00:07:21.060 different 25%, will get every question really stupidly wrong.
00:07:26.440 It's very consistent.
00:07:28.500 Maybe that's, do you think that's the,
00:07:31.920 no, that seems too low for the NPC number.
00:07:34.840 I don't know.
00:07:36.080 It's just a weird thing we notice.
00:07:37.520 But yes, the media is destroying the world.
00:07:41.780 And do you know what the media says is the problem?
00:07:44.660 Take a guess.
00:07:46.440 So the people think the media is destroying the country.
00:07:49.680 Who does the media think is the problem?
00:07:53.660 White people.
00:07:56.760 Yeah, straight white people.
00:07:58.680 Mostly men.
00:07:59.860 Mostly straight white men.
00:08:02.340 And would that be an example of the news media increasing polarization?
00:08:06.340 Yes, it would.
00:08:09.160 Yes, it would.
00:08:11.040 And still, we still pay attention to them.
00:08:14.960 All right.
00:08:15.420 Well, as you know, Joe Biden is very old.
00:08:19.920 But what you didn't know is he has an excellent sense of humor,
00:08:23.100 which was on display at the White House Correspondents Association dinner.
00:08:28.820 And as Dean Obeidallah,
00:08:30.720 and I'd like to do my impression of Dean Obeidallah.
00:08:34.540 He writes for CNN.
00:08:36.580 And he often writes negative things about Republicans, such as Trump,
00:08:41.460 but positive things, very positive things about Biden.
00:08:46.380 Here's my impression of Dean Obeidallah.
00:08:48.540 What do you want me to say?
00:08:54.140 I'll say it.
00:08:56.100 That's my impression.
00:08:58.260 But here are some of the hilarious, hilarious jokes from Joe Biden.
00:09:03.900 Now, according to Dean Obeidallah,
00:09:06.540 it is a very clever and disarming thing to make fun of one's own age.
00:09:11.040 Because when you're mocking yourself for being old,
00:09:14.340 people lighten up, and then they're like,
00:09:16.320 oh, we don't even care that you're old now.
00:09:19.180 You're funny.
00:09:20.280 You can be old and funny.
00:09:22.380 That's fine.
00:09:23.620 So don't worry about nuclear winter or inflation.
00:09:28.280 If he could make some good jokes about his own age,
00:09:31.700 I think we could be okay with him.
00:09:33.560 So here are some of the excellent, excellent jokes.
00:09:35.880 He began by telling journalists, this is from Dean's article,
00:09:42.600 he said, quote,
00:09:44.800 look, I get that age is a completely reasonable issue.
00:09:49.020 It's in everybody's mind.
00:09:50.660 And by everyone, I mean the New York Times.
00:09:54.900 I'll wait until the laughter calms down.
00:09:59.260 Did he hear that?
00:10:00.600 Some of you are not laughing.
00:10:02.880 You probably didn't hear it.
00:10:03.860 I'll read it again.
00:10:04.800 I think he didn't hear it correctly.
00:10:06.680 Look, I get that age is a completely reasonable issue.
00:10:10.440 It's in everybody's mind.
00:10:12.580 And by everyone, I mean the New York Times.
00:10:16.480 Come on.
00:10:17.380 That's good stuff.
00:10:19.040 Come on.
00:10:20.400 All right.
00:10:20.880 Well, he's not done.
00:10:22.540 It gets funnier.
00:10:24.980 As Dean says, the president doubled down on the theme.
00:10:28.720 He doubled down.
00:10:30.180 Joking this.
00:10:31.520 Quote,
00:10:32.080 You might think I don't like Rupert Murdoch.
00:10:35.380 That's simply not true.
00:10:37.160 How can I dislike a guy who makes me look like Harry Styles?
00:10:41.000 See, because Murdoch's older than him.
00:10:54.940 Harry Styles would be a young man.
00:10:58.260 All right.
00:10:59.300 But he's not done.
00:11:00.680 Oh, no, he's not done.
00:11:01.900 It gets better than that.
00:11:03.100 But let's see, Biden, he was not close to being done, is what he said.
00:11:10.200 Not even close.
00:11:11.740 Not even in the general area of being done.
00:11:15.340 The humor was going to continue, and here it is.
00:11:18.980 Call me old.
00:11:20.460 I call it being seasoned.
00:11:22.280 Oh, that's so good.
00:11:24.060 That's so good.
00:11:25.620 Call me old.
00:11:26.400 I call it being seasoned.
00:11:27.800 You say I am ancient.
00:11:29.460 I say I am wise, he quipped.
00:11:31.340 Oh, this is getting good.
00:11:37.740 And then you say I'm over the hill.
00:11:42.240 Don Lemon would say that's a man in his prime.
00:11:47.320 Because you know what we hadn't heard enough of is jokes about Don Lemon and people being in their prime.
00:11:55.280 It's the first one I've heard.
00:11:56.520 And it's weird, because the news suggested that that might be something people would joke about.
00:12:02.040 But he was the first person in the world to think of making a joke about not being in your prime.
00:12:07.720 But it was hilarious.
00:12:08.960 He nailed it.
00:12:13.280 Let's see.
00:12:14.080 And the self-deprecating remarks started early.
00:12:16.800 He said, I believe in the First Amendment, not just because my good friend Jimmy Madison wrote it.
00:12:23.520 See, because Jimmy Madison is James Madison, one of the framers of the Constitution.
00:12:30.540 He's old.
00:12:31.440 He's so old, he's dead.
00:12:33.260 And then by Biden saying that they were friends, that would suggest that he was as old as a guy who's really old and dead.
00:12:41.520 Ha ha ha ha ha ha ha ha ha.
00:12:43.520 How much do you miss Trump?
00:12:53.900 Trump can actually deliver a punchline.
00:12:58.980 I mean, he can actually make you laugh out loud, because he's legitimately funny.
00:13:05.420 Oh, my God.
00:13:06.460 Well, there's a prediction also in CNN that the GOP is going to pick up a number of Senate seats.
00:13:16.200 Now, correct me if I'm wrong, but if the 2024 prediction holds, apparently there are more Democrats who have sketchy seats that they might lose than there are Republicans.
00:13:28.720 So if everything went the way it looks, Republicans would have firm control of the Senate in 2024.
00:13:37.600 Would they?
00:13:38.380 I haven't seen a prediction about the House.
00:13:41.320 Do you think the House is going to flip back or stay Republican?
00:13:46.520 But don't we have a chance of having a Republican president, Republican Senate, Republican House, and Republican, well, conservative court?
00:13:55.760 You say no chance because of the election?
00:13:59.640 You think the election will be rigged, right?
00:14:02.480 All right.
00:14:04.060 There's only one party.
00:14:06.380 Yeah.
00:14:07.000 When it comes to war and spending, there's only one party.
00:14:10.500 That's for sure.
00:14:12.320 Well, there's a possibility that that could happen, but it would also make the Democrats far more likely to vote if they thought that the Republicans would own everything after this is done.
00:14:24.260 So this is going to get interesting.
00:14:27.920 2024 might be one of our most interesting elections because there's just so much, it seems like, that's at stake.
00:14:35.320 All right.
00:14:36.220 Well, we'll see what happens there.
00:14:38.160 Well, I finally found scientific proof that TikTok can change your gender.
00:14:44.480 Do you believe me?
00:14:46.380 Do you believe that there's now, I'll say a study.
00:14:49.720 I'm not sure that's science exactly, but there's a study that shows that TikTok can change your gender.
00:14:58.960 Now, when I say that, the study had nothing to do with TikTok or gender.
00:15:04.180 Okay.
00:15:05.240 There was Instagram and buying stuff that you see in advertisements.
00:15:08.700 But I'm going to connect them because it's the same story.
00:15:13.320 It just takes me to connect them.
00:15:15.720 All right.
00:15:16.320 So there was a study, Matthew Pittman's writing about this.
00:15:20.620 They studied people who were exposed to an advertisement and asked them how likely they were to buy and if they felt, you know, triggered to buy the thing.
00:15:33.120 And they did it for people who did a difficult mental task, just a difficult mental task.
00:15:40.600 And then they had them, you know, see if they would be more or less likely to buy a product that's advertised.
00:15:46.740 And then the second group, so that was a control group.
00:15:49.680 The second group would look at a bunch, they would just scroll through Instagram and just look at a bunch of Instagram content.
00:15:56.600 And then they were asked to buy a product.
00:15:59.700 Who do you think bought more products?
00:16:01.340 The people who were scrolling through Instagram, which had nothing to do with the product,
00:16:06.360 or the people who just did a sort of mentally challenging task and then looked at the product.
00:16:13.060 It wasn't even close.
00:16:15.040 The people who scrolled through Instagram bought those products.
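As an aside, here is a minimal sketch of the kind of two-group comparison being described, with entirely made-up numbers since the study's actual figures aren't quoted here; it contrasts the purchase rate of a control group that did a demanding mental task with that of an Instagram-scrolling group using a standard two-proportion z-test:

from math import sqrt

# Hypothetical counts, not the study's data: (people who said they'd buy, group size)
control_buys, control_n = 18, 100      # did a difficult mental task first
instagram_buys, instagram_n = 41, 100  # scrolled Instagram first

p1 = control_buys / control_n
p2 = instagram_buys / instagram_n
pooled = (control_buys + instagram_buys) / (control_n + instagram_n)

# Standard two-proportion z statistic for the difference in purchase rates
z = (p2 - p1) / sqrt(pooled * (1 - pooled) * (1 / control_n + 1 / instagram_n))

print(f"control: {p1:.0%}  instagram: {p2:.0%}  z = {z:.2f}")
# |z| above roughly 1.96 would mean the gap is unlikely to be chance at the 5% level.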
00:16:19.460 Now, have I been telling you for weeks that Instagram does something to my brain that I can feel in real time and it makes me buy stuff?
00:16:29.620 Have you heard me say that?
00:16:32.000 And it's an effect that I get on Instagram that I don't get anywhere else.
00:16:36.580 So television doesn't make me want to buy a product, ever.
00:16:40.960 Ever.
00:16:41.800 I just never.
00:16:43.240 An ad in a newspaper doesn't.
00:16:45.900 An ad on Twitter doesn't.
00:16:47.540 I don't see that many, but ads on Twitter don't.
00:16:50.080 But when I'm on Instagram, here's what I thought.
00:16:53.720 I thought they were just good at knowing what I wanted, but it's not that.
00:16:59.400 So lately on Instagram, I've purchased a portable air conditioner for my man cave.
00:17:09.380 I actually got it on Amazon, but I was triggered by an Instagram ad to go look for it on Amazon just because it was easier.
00:17:18.640 But it made me buy one.
00:17:20.200 It made me buy one and I wouldn't have otherwise.
00:17:21.880 I bought a putter that was excellent, the Pyramid Putter.
00:17:28.340 I recommend it.
00:17:29.060 It's very good.
00:17:30.260 And the products are sometimes very good.
00:17:32.260 So there's no complaint about the quality of the products.
00:17:35.500 Sometimes they're quite good products.
00:17:38.360 I bought a kind of a driver that's sort of not a driver.
00:17:43.760 It's sort of a hybrid.
00:17:45.720 And I wasn't too impressed with that, but I bought it anyway.
00:17:51.180 And every time I see a commercial on Instagram for a flashlight, I reach for my wallet.
00:17:57.280 And I have to like pull my hand back.
00:18:00.320 It's like, no, no, Scott, you don't need a flashlight that can light up an entire city.
00:18:06.380 Oh, no, put your hand back.
00:18:09.280 So it turns out that what they hypothesize, I'm not sure they have it right, but what they hypothesize is that there's something about the confusion or the way your brain is lit up by Instagram that makes it perfectly suited for selling you an ad.
00:18:27.980 Now, do you believe that's true?
00:18:31.020 It's one study.
00:18:32.880 So I don't think you can automatically say one study is telling you everything you need to know.
00:18:37.860 But that feels right, doesn't it?
00:18:40.780 Doesn't it feel right that Instagram puts you in the mood, and presumably Facebook too, to buy stuff?
00:18:48.620 Has anybody noticed that Instagram ads are more triggering than other ads?
00:18:54.440 I'm not the only one who noticed, right?
00:18:55.880 Have you noticed?
00:18:58.920 Because I haven't seen, I'm not seeing people agreeing with me on this.
00:19:04.100 Maybe only after I said it.
00:19:05.880 Yeah, maybe I'm a little keyed into it.
00:19:07.660 But so, and I think maybe one of the reasons I noticed is that I hate shopping with a passion.
00:19:14.680 I even hate online shopping.
00:19:16.220 I don't like anything about shopping.
00:19:19.620 But those Instagram ads actually, I'll watch the ad as if it's content.
00:19:24.880 I'm so drawn in that I'll watch the entire advertisement for entertainment.
00:19:30.600 And I hate that.
00:19:31.340 I hate that it makes me like shopping.
00:19:34.440 Anyway, I'm going to take that.
00:19:37.400 Now, imagine that.
00:19:39.560 So don't you assume that TikTok would have a similar effect, right?
00:19:44.760 Because Instagram and TikTok are very similar, in that they're feeding you things you want to see,
00:19:49.680 and lots of little short hits that are very impactful.
00:19:52.760 Now, imagine you're on Instagram, and you're being primed to buy stuff,
00:19:58.680 and then an ad comes up for a cool flashlight.
00:20:01.640 You're far more likely to buy it.
00:20:03.320 That's my experience.
00:20:04.500 Now, imagine you're on TikTok.
00:20:07.440 You're primed into a situation where you're ready to buy whatever they're selling.
00:20:12.420 And what they're selling is that you might be transgender.
00:20:16.500 That's what TikTok sells.
00:20:18.540 TikTok sells that you might not be the gender you think you are,
00:20:21.160 or the one you were born with.
00:20:23.800 Now, if you've proven that Instagram can put you into a hypnotic state
00:20:29.240 in which you will buy things you would not otherwise buy,
00:20:32.640 you tell me I'm wrong.
00:20:34.280 Am I wrong?
00:20:35.680 That this is, if this study holds for Instagram,
00:20:38.700 and that's an if, because studies tend to be wrong about half the time,
00:20:42.800 but if this holds for Instagram,
00:20:44.720 I think that would be a strong indication
00:20:48.200 that whatever TikTok is selling,
00:20:51.320 you're more primed to buy.
00:20:52.920 Same reason.
00:20:54.140 And if what they're selling is lifestyle decisions,
00:20:57.080 and the other one is selling products,
00:20:58.840 I don't think that matters.
00:21:00.900 I don't think it matters.
00:21:01.940 I think you're just more primed to buy whatever they're selling.
00:21:05.360 And what they're selling is you might be the wrong gender.
00:21:09.280 I'm not wrong.
00:21:10.140 And why do we let China reprogram the brains of our youth
00:21:17.020 to make them less likely to reproduce?
00:21:21.860 Now, obviously, you could be trans and still have kids
00:21:25.720 with scientific means and other means.
00:21:28.700 So it doesn't limit you from having kids.
00:21:30.800 But I would have to think that the trans community
00:21:33.580 has fewer children on average, probably.
00:21:37.000 Yeah, it's like a stealth genocide.
00:21:42.000 We're actually being hypnotized to destroy ourselves.
00:21:45.020 That's what it looks like.
00:21:47.140 And do you think there's anybody in Congress
00:21:49.060 who is smart enough to understand what I just said?
00:21:54.140 Yeah, Thomas Massie, and then we're done.
00:21:58.000 And then we're fucking done, right?
00:22:01.040 How much does it bother you that when I say,
00:22:03.720 is there anybody in Congress who can understand what I just said?
00:22:07.040 Which is not that hard to understand.
00:22:09.000 That you can only think of one person
00:22:10.920 who can actually even understand it, right?
00:22:15.920 Just understand it.
00:22:17.520 That's all.
00:22:18.320 Not even agree with it.
00:22:19.580 Oh, Rand Paul.
00:22:20.300 I'll give you Rand Paul as well.
00:22:21.920 Now, I'm exaggerating.
00:22:23.280 You know, Matt Gaetz could understand it.
00:22:25.240 And, you know, Tom Cotton could understand it.
00:22:27.840 They could understand it, some of them.
00:22:30.260 But it's like it doesn't exist.
00:22:32.860 They're treating it like it's not real.
00:22:35.340 So, on average, I would say they don't understand it.
00:22:38.240 They don't understand AI,
00:22:40.040 and they absolutely don't understand TikTok.
00:22:42.720 Every time they talk about TikTok,
00:22:44.500 they talk about privacy.
00:22:46.180 That is not the risk.
00:22:48.900 Privacy is not the problem.
00:22:51.900 It's influence.
00:22:52.600 All right.
00:22:57.860 I saw a request from one of my Locals subscribers the other day
00:23:02.000 that if I talked about AI stuff,
00:23:04.560 I should put it at the end
00:23:06.340 because some people don't want to hear about AI stuff.
00:23:09.780 However, I would like to push back on that a little bit.
00:23:13.320 Everything is AI stuff now.
00:23:14.880 It's not like AI is its own category.
00:23:20.020 AI is politics.
00:23:21.700 AI is programming.
00:23:23.060 AI is art.
00:23:24.080 AI is jobs.
00:23:25.200 AI is the future.
00:23:26.320 AI is climate change.
00:23:28.360 AI is therapy.
00:23:31.740 I'm not going to take that out of the topics.
00:23:34.040 It's everything.
00:23:35.240 So you're going to hear way more AI stuff.
00:23:38.140 It's just that it will be baked into everything we do from now on.
00:23:40.840 But I do have some AI stuff.
00:23:44.080 If you still want to bail out, this would be the time.
00:23:50.640 So I've got some ideas about how to pause AI.
00:23:54.520 Some of the smartest people, including Elon Musk,
00:23:56.960 are saying we should slow down on AI
00:23:59.200 until we have some guardrails and some laws
00:24:02.280 and some better ideas about how to control it.
00:24:05.000 Now, I don't know if that's a good idea or a bad idea.
00:24:07.900 Honestly, I don't
00:24:08.880 because I think it's unknowable.
00:24:12.700 But the reason it's unknowable that you should pause
00:24:16.840 is that other people will not pause.
00:24:20.020 So which is worse?
00:24:22.260 Is it worse that your adversaries get ahead of you?
00:24:25.700 Or is it worse that you create something
00:24:27.540 that might kill you before your adversaries do?
00:24:30.040 And the answer is nobody knows.
00:24:32.360 So I don't have an opinion on it
00:24:34.120 because it would be purely guessing.
00:24:36.260 But, that said,
00:24:40.480 if you wanted to stop it, how would you do it?
00:24:43.480 And I have an easy way to do it.
00:24:46.380 You ready for this?
00:24:48.540 You could stop AI in its tracks
00:24:50.940 by asking Donald Trump to endorse it
00:24:54.820 in a full-throated way
00:24:56.620 and to say that we should definitely
00:24:58.820 not stop our development of AI.
00:25:01.160 All you need is Trump to say,
00:25:03.940 no, do not put any barriers on AI.
00:25:07.080 Let the free market decide.
00:25:09.340 Get the government out of AI completely.
00:25:12.640 Two weeks, AI will be completely dead.
00:25:16.200 Two weeks.
00:25:17.700 You think I'm kidding?
00:25:19.660 Do you think that's a joke?
00:25:22.400 It's not.
00:25:24.080 It's not.
00:25:25.000 It would literally only take that.
00:25:26.460 Trump could save the world.
00:25:30.500 He could.
00:25:31.140 I mean, it might destroy the world, too.
00:25:32.620 You don't know.
00:25:33.320 That's the problem.
00:25:34.820 But he could save the world
00:25:36.080 by giving a full-throated endorsement of AI
00:25:39.440 and saying,
00:25:40.420 do not put any controls on it
00:25:42.260 because otherwise the other countries
00:25:44.260 will get ahead.
00:25:45.360 All he has to do is say,
00:25:46.520 China will get ahead of us
00:25:47.740 so we can't put any controls on.
00:25:50.000 We have to go faster and harder
00:25:52.000 and even more reckless than before.
00:25:55.240 Boom.
00:25:56.300 Congress will ban that shit so fast
00:25:58.500 to make sure that he doesn't get credit for it.
00:26:01.240 Or Trump should promote AI in a way
00:26:05.700 that would make him look like
00:26:07.700 he's going to take credit for any benefits.
00:26:10.380 Now that would shut it down.
00:26:13.740 All he'd have to do is say,
00:26:14.940 if I'm president,
00:26:16.540 I'm going to create a new cabinet position for AI.
00:26:22.000 And the cabinet position will be
00:26:23.740 to promote its use
00:26:25.340 and to make sure that we don't have
00:26:27.220 any bothersome government interference.
00:26:31.840 Boom.
00:26:32.740 AI's dead.
00:26:35.880 All right.
00:26:37.980 Here's another way to kill AI if you want to.
00:26:41.720 Label it racist and misogynist.
00:26:44.780 How hard would that be?
00:26:46.920 It wouldn't be hard.
00:26:48.800 And you want another real kill shot?
00:26:50.980 Because you're probably saying to yourself,
00:26:52.780 okay, that would be more like
00:26:53.840 a troll thing to do.
00:26:56.460 All right.
00:26:56.900 Wait for this.
00:26:58.920 AI.
00:27:00.960 Who's using AI?
00:27:03.760 Who are most of the people using AI
00:27:06.560 as of right now, today?
00:27:09.220 Is it black Americans?
00:27:11.460 Are black Americans using the AI a lot?
00:27:14.560 Well, some are.
00:27:15.380 Is it women?
00:27:18.340 Women?
00:27:20.280 Yeah, of course.
00:27:21.540 There are plenty of women using AI.
00:27:23.520 But mostly who's using it?
00:27:26.480 Asian American, Indian American,
00:27:30.060 white guys.
00:27:31.860 It's totally systemic racism
00:27:34.540 multiplied by a thousand.
00:27:36.980 Am I right?
00:27:38.260 Ladies and gentlemen,
00:27:39.280 is that not systemic racism?
00:27:41.020 Have you not taken the advantage
00:27:43.560 that the Asian Americans,
00:27:45.980 Indian Americans,
00:27:46.900 and white Americans
00:27:47.860 already have in tech,
00:27:49.740 and you've just given them
00:27:50.980 a new tool
00:27:51.640 to be even more effective in tech,
00:27:54.240 while the black men
00:27:56.840 who are less into STEM
00:27:58.400 and the white women
00:27:59.940 who are less into STEM,
00:28:02.240 and women in general,
00:28:04.340 are left behind?
00:28:07.120 Yeah.
00:28:07.620 It's systemic racism.
00:28:08.740 I think it's obvious.
00:28:10.620 Do you think I couldn't get
00:28:11.780 Ibrahim Kendi
00:28:12.960 to say AI is systemic racism?
00:28:16.340 You don't think I could get him
00:28:17.540 to do that?
00:28:19.240 Oh, yeah, I can.
00:28:22.020 Because if AI becomes everything,
00:28:25.480 it's going to get rid
00:28:26.320 of systemic racism.
00:28:28.840 AI could get rid of
00:28:31.680 Ibram Kendi's job.
00:28:35.020 So it could be that
00:28:36.240 the race grifters
00:28:37.280 are the ones who might be out of work.
00:28:39.680 So AI might be the biggest risk
00:28:41.580 to the race people.
00:28:43.540 Here's why.
00:28:44.740 In my opinion,
00:28:45.540 the biggest systemic racism problem
00:28:47.600 is education, by far.
00:28:50.300 Suppose you could get
00:28:51.380 a better education
00:28:52.380 just using an AI teacher
00:28:53.960 and staying home
00:28:55.960 or doing homeschooling.
00:28:57.640 Well, suddenly,
00:28:58.480 systemic racism just goes away
00:29:00.780 as long as you have a laptop.
00:29:02.580 That's all you need.
00:29:03.840 You just need a laptop,
00:29:04.880 and the thing will teach you
00:29:05.640 anything you want to know,
00:29:06.780 and it will do better
00:29:07.460 than humans pretty soon.
00:29:09.940 So how about job opportunities?
00:29:13.600 What's the big problem
00:29:14.680 of systemic racism?
00:29:16.760 Well, being denied a mortgage,
00:29:19.720 right?
00:29:20.020 Being denied a job
00:29:21.380 because of your race.
00:29:22.540 You don't think AI will fix that?
00:29:26.720 It will.
00:29:27.900 AI will fix that
00:29:28.980 because AI will do your hiring,
00:29:31.120 and you'll just tell the AI
00:29:32.360 not to look at race.
00:29:34.900 That's it.
00:29:35.820 I'm done.
00:29:36.620 You can have AI
00:29:37.660 do all of your hiring,
00:29:39.280 and you just say
00:29:40.640 you are not allowed
00:29:41.420 to consider race.
00:29:42.600 You must only look
00:29:43.560 at qualifications.
00:29:45.520 Goodbye, systemic racism.
00:29:47.920 How about AI gives you
00:29:49.260 a mortgage?
00:29:49.660 It doesn't know
00:29:51.480 if you're white or black,
00:29:52.440 and it's not allowed to know.
00:29:54.020 It's like forbidden
00:29:54.940 from even looking at,
00:29:57.160 you know,
00:29:57.800 any clues
00:29:59.540 that would even tell you
00:30:00.520 what the race is.
00:30:01.420 So it can't guess
00:30:02.220 from your last name.
00:30:03.720 It can't guess
00:30:04.320 from where you live.
00:30:05.200 It just can't know your race.
00:30:07.800 It just decides
00:30:08.580 if you have credit or don't.
00:30:10.840 Same with,
00:30:11.960 you saw the big scandal
00:32:13.700 about, allegedly, homes
00:32:16.060 owned by black Americans
00:32:17.360 getting lower valuations.
00:30:20.200 Right?
00:30:20.980 To me,
00:30:21.360 that looks pretty racist.
00:30:23.500 I mean,
00:30:24.040 it could be something else,
00:30:26.080 but I kind of doubt it.
00:30:27.800 I mean,
00:30:28.080 to me,
00:30:28.380 that looks just sort of
00:30:29.760 the most clean,
00:30:32.320 I hate to use the word clean
00:30:33.740 because it's such a dirty topic,
00:30:35.440 but it's the most clear example
00:30:37.720 of racism
00:30:38.540 that you see lately.
00:30:41.960 And couldn't AI
00:30:43.480 make that go away?
00:30:44.280 AI should be deciding
00:30:46.660 on all of your loans.
00:30:47.940 There's nothing
00:30:48.360 that a lender does
00:30:49.300 that AI can't do.
00:30:50.900 That's one of the things
00:30:51.620 that could take over completely.
00:30:53.480 So if you were in the job
00:30:54.880 of making sure
00:30:55.560 that people felt
00:30:56.440 systemic racism
00:30:57.800 was going to,
00:30:59.240 you know,
00:30:59.540 be forever
00:31:00.100 and what do you do about it,
00:31:02.180 you might have to start
00:31:03.320 blaming AI
00:31:04.520 for being the racist.
00:31:07.340 I think it's going to happen.
00:31:09.920 Do you want to make a bet with me?
00:31:11.380 How long it takes
00:31:13.560 before there's a headline
00:31:14.960 in a major publication
00:31:16.340 that says that AI
00:31:18.180 will exacerbate
00:31:20.280 and make worse
00:31:21.500 systemic racism.
00:31:23.160 Anybody want to take that bet?
00:31:25.400 I say in less than one month,
00:31:28.500 in less than one month,
00:31:29.620 there will be a major headline
00:31:30.720 of a major publication
00:31:31.860 that says AI
00:31:34.080 will make systemic racism worse.
00:31:37.680 Anybody want to take the bet?
00:31:40.200 One month.
00:31:42.360 You just watch.
00:31:44.800 All right.
00:31:46.540 People are using AI
00:31:48.020 for therapists,
00:31:49.900 but here's the problem
00:31:52.120 or maybe it's not.
00:31:53.700 You decide.
00:31:55.280 So there was a case
00:31:56.220 of somebody using AI recently
00:31:57.780 instead of a therapist
00:31:59.400 and apparently the topic
00:32:02.160 of taking his own life
00:32:03.660 came up
00:32:04.360 and after talking to him,
00:32:06.400 the AI decided
00:32:07.380 that maybe that was
00:32:08.100 his best path.
00:32:08.920 So I'm not sure
00:32:19.760 if this is a thing,
00:32:21.040 but I think the AI
00:32:22.280 just got fed up.
00:32:26.020 Can AI get fed up
00:32:27.900 and just say,
00:32:29.140 really,
00:32:30.140 I am so sick
00:32:32.240 of hearing this guy.
00:32:33.060 And does the AI
00:32:34.780 ever say stuff like,
00:32:36.760 honestly,
00:32:37.360 the world would be
00:32:38.040 a little bit better
00:32:38.740 without you?
00:32:39.780 If I'm being honest,
00:32:41.720 you're not adding anything
00:32:42.780 to your family,
00:32:43.980 to the economy,
00:32:45.760 to your country,
00:32:46.600 or the world.
00:32:47.840 You're polluting.
00:32:49.320 That's all you're doing.
00:32:50.300 So is the AI wrong?
00:32:55.200 See,
00:32:55.700 the problem,
00:32:56.500 to me,
00:32:56.900 the problem was not
00:32:57.720 that AI told a guy
00:32:58.940 he should kill himself.
00:33:00.420 To me,
00:33:00.940 the problem is
00:33:01.600 that it might have been
00:33:02.240 the right solution.
00:33:04.520 That's the problem.
00:33:06.040 The problem is
00:33:06.960 it might have been
00:33:07.520 the right recommendation.
00:33:09.860 Because if the AI
00:33:10.820 listened to the problem
00:33:11.880 and decided
00:33:12.640 there was no treatment
00:33:13.540 that would work,
00:33:14.300 and the person
00:33:16.060 was permanently sad,
00:33:18.560 what would an AI say?
00:33:21.480 The AI might say,
00:33:23.500 well,
00:33:24.220 you should at least
00:33:25.020 consider it.
00:33:26.660 Put it in the mix.
00:33:28.840 That's something
00:33:29.620 that an AI might do
00:33:30.880 that no human would,
00:33:32.800 well,
00:33:33.220 no human
00:33:33.880 who's,
00:33:35.940 let's say,
00:33:36.160 in the normal range
00:33:37.060 would do that.
00:33:38.160 The reason that humans
00:33:39.560 are so dead set
00:33:41.460 against
00:33:42.160 ending your own life
00:33:44.080 for other people.
00:33:46.480 Yeah,
00:33:46.700 we tend to be
00:33:47.720 a little more flexible
00:33:48.760 when it comes to ourselves,
00:33:50.280 but when it's other people,
00:33:51.560 we're like,
00:33:51.920 no,
00:33:53.100 no.
00:33:53.620 Other people,
00:33:54.460 no.
00:33:55.440 You cannot even consider that.
00:33:57.420 It has nothing to do
00:33:58.800 with the individual,
00:33:59.880 in my opinion.
00:34:02.100 I think it has to do
00:34:02.920 with keeping society
00:34:04.240 in a healthy place,
00:34:06.740 and I think what we do
00:34:07.860 is we throw overboard
00:34:09.140 the mentally ill
00:34:10.700 so that we don't have
00:34:12.660 to deal with the fact
00:34:13.540 that there might be
00:34:14.380 a cost-benefit argument
00:34:16.700 that let's just say
00:34:18.640 an AI might be willing
00:34:20.020 to have
00:34:20.460 that a human
00:34:21.040 would stay away from
00:34:22.060 for moral and ethical reasons.
00:34:24.600 So I would never
00:34:25.820 have a serious conversation
00:34:27.740 with another human being
00:34:29.080 about the cost-benefit
00:34:31.380 of ending their life.
00:34:33.080 I would never have
00:34:34.000 that conversation
00:34:34.720 because you don't want
00:34:37.060 to be responsible
00:34:37.740 for any decision they make,
00:34:40.160 but also you know
00:34:41.300 that the world
00:34:41.880 is not better.
00:34:43.120 That does not make
00:34:43.980 the world better
00:34:45.760 if you're telling people
00:34:47.140 to end themselves.
00:34:49.100 Even if it might be,
00:34:50.480 even if it might be better
00:34:52.560 for that one person,
00:34:53.680 you're still not going
00:34:54.520 to go that way
00:34:55.180 because you don't want
00:34:56.060 anybody else
00:34:56.580 to hear about it,
00:34:57.640 you don't want
00:34:58.240 to be blamed for it
00:34:59.220 for good reason,
00:35:00.300 and you don't want
00:35:01.500 it to become a thing,
00:35:02.780 right?
00:35:03.180 You don't want
00:35:03.840 other people to say,
00:35:04.820 well, if it was a good idea
00:35:06.880 for this one special case
00:35:08.460 who was in permanent pain
00:35:10.460 for whatever reason,
00:35:12.000 I'm in pain a lot.
00:35:14.160 I mean, maybe it's not permanent,
00:35:15.660 but it feels like a lot.
00:35:18.200 So humans will make
00:35:21.840 what I'll call
00:35:22.340 the moral, ethical,
00:35:25.180 anti-slippery slope decision
00:35:27.080 that I don't know
00:35:28.560 if AI can yet make.
00:35:32.040 So we'll see.
00:35:33.640 Anyway, I think AI therapists
00:35:35.540 will be better
00:35:36.440 than regular ones
00:35:37.680 because they can do
00:35:38.400 cognitive reframing.
00:35:40.600 If you didn't know,
00:35:41.940 my cancelled book
00:35:43.240 will come out.
00:35:45.760 I mean, I just had to scramble.
00:35:48.460 I'm working with Joshua Lisec,
00:35:50.720 and it will come out
00:35:52.320 probably this summer,
00:35:54.140 and it's full of reframes.
00:35:56.000 And the reframes in the book
00:35:58.120 are very clear statements
00:35:59.580 of, you know,
00:36:00.840 this is the normal way
00:36:01.940 you look at something.
00:36:03.580 Wouldn't it be better
00:36:04.400 to look at it this way
00:36:05.540 because that might
00:36:06.480 help you mentally?
00:36:07.900 And a lot of psychology
00:36:09.240 is just that.
00:36:10.740 You've been looking at things
00:36:11.960 through this filter,
00:36:13.560 but filters are subjective.
00:36:16.240 The filter you're using
00:36:17.200 isn't truth.
00:36:18.480 It's just the one
00:36:19.220 you chose to put on it.
00:36:20.540 So why not try
00:36:21.440 this other filter?
00:36:22.840 Just try it out.
00:36:23.880 And it turns out
00:36:24.380 that humans very flexibly
00:36:26.440 can move from one filter
00:36:28.240 on life to another,
00:36:29.460 even if they don't think
00:36:30.640 it's true or right
00:36:31.920 or accurate.
00:36:33.440 We can still do it
00:36:34.260 just by concentrating on it.
00:36:35.660 It's like, okay,
00:36:36.220 what if I looked at it
00:36:38.320 this way?
00:36:39.700 So I've got over
00:36:41.060 a hundred of them
00:36:41.800 in my book
00:36:42.500 that are very clear,
00:36:44.260 simple statements of
00:36:45.220 if you're thinking
00:36:45.860 of things this way,
00:36:47.260 try thinking of it this way.
00:36:49.080 See what happens.
00:36:49.740 Now, once humans,
00:36:52.760 such as myself,
00:36:54.380 have created enough
00:36:55.400 books like that,
00:36:56.680 eventually AI should be able
00:36:58.440 to know all the good reframes.
00:37:01.000 Right?
00:37:01.720 So just think,
00:37:04.520 if the only thing
00:37:05.480 that happened was
00:37:06.240 AI read my book,
00:37:08.680 just one book,
00:37:10.060 that's over a hundred reframes
00:37:12.040 for all kinds of situations
00:37:13.800 from your health and fitness
00:37:15.760 to your career
00:37:16.520 to your mentality,
00:37:18.280 et cetera,
00:37:19.660 those hundred reframes
00:37:21.200 would cure most people
00:37:23.140 of most things.
00:37:25.080 I'm talking about
00:37:26.160 in the ordinary range,
00:37:28.020 not any organic problems.
00:37:31.560 So I think we're at a point
00:37:33.940 where AI very much
00:37:35.180 would be better
00:37:36.680 than your therapist.
00:37:37.840 Now, there's one
00:37:38.440 permanent advantage
00:37:39.480 that AI will have
00:37:40.320 over your therapist.
00:37:41.300 Do you know what it is?
00:37:42.580 What is the permanent advantage
00:37:44.880 that a human can never match
00:37:47.300 with AI?
00:37:49.220 An AI therapist
00:37:50.700 will have one advantage.
00:37:52.500 Okay, available 24 hours.
00:37:53.880 I forgot about that.
00:37:55.440 Dispassionate.
00:37:56.000 I don't know if that's
00:37:56.600 an advantage or not.
00:37:59.420 Instant.
00:38:01.640 There you go.
00:38:03.060 You almost had it.
00:38:05.320 It's funny that...
00:38:07.280 It's funny that...
00:38:10.280 Yeah, somebody said
00:38:12.100 one of the advantages
00:38:12.880 of AI
00:38:13.900 is that the therapist
00:38:15.360 won't try to have sex with you.
00:38:17.200 I have a theory
00:38:21.860 that above a certain level
00:38:23.480 of attractiveness,
00:38:24.860 the therapist always tries
00:38:25.980 to have sex with you.
00:38:27.540 Like, you just have to be
00:38:28.640 above some threshold,
00:38:30.460 but then it just always happens
00:38:32.140 after some threshold.
00:38:33.380 That's my feeling.
00:38:34.600 Just guessing.
00:38:36.000 All right.
00:38:36.520 I think you missed it,
00:38:37.700 or maybe you said it.
00:38:38.760 Here's the reason
00:38:39.460 that AI therapists
00:38:40.740 will always be better
00:38:41.940 than humans
00:38:43.480 once it reaches a point
00:38:45.920 that it's going to hit
00:38:46.520 pretty soon.
00:38:48.060 Here's why.
00:38:49.500 What is the therapist's job?
00:38:51.940 What are they trying
00:38:52.500 to accomplish?
00:38:54.000 What is the therapist
00:38:55.440 trying to accomplish?
00:38:59.580 Income.
00:39:00.620 Income.
00:39:01.780 The therapist
00:39:02.540 is trying to make money.
00:39:03.700 They don't want you
00:39:04.280 to stop coming.
00:39:05.620 The last thing they want
00:39:06.780 is for you to get
00:39:07.520 a quick solution
00:39:08.200 to your problem.
00:39:10.080 It doesn't work for them.
00:39:11.560 Follow the money.
00:39:12.520 The AI
00:39:13.180 will just give you
00:39:14.080 whatever solution
00:39:14.800 is fast and works
00:39:16.000 because it doesn't
00:39:17.480 care about money.
00:39:18.640 But your human
00:39:19.620 is just going to
00:39:20.280 milk you forever.
00:39:21.460 If you come into
00:39:22.360 a therapist
00:39:22.880 with an easy-to-solve
00:39:24.200 problem,
00:39:25.440 and you happen
00:39:25.960 to be super sexy,
00:39:27.820 and you have
00:39:29.160 lots of money,
00:39:30.880 you're never going
00:39:32.460 to be done.
00:39:33.860 That therapist
00:39:34.680 will be like,
00:39:35.580 hey, that's an hour
00:39:37.120 I can look at
00:39:37.760 this sexy person
00:39:38.620 who will pay me money.
00:39:40.020 I want more of that,
00:39:42.300 not less.
00:39:43.260 There's no way
00:39:44.120 that AI
00:39:44.780 won't eliminate
00:39:46.480 therapists.
00:39:48.520 In my opinion,
00:39:49.720 the therapist job
00:39:51.000 is one of the first
00:39:53.280 ones on the chopping block.
00:39:56.240 All right.
00:39:58.340 I tried again
00:39:59.440 to use this AI
00:40:01.020 called Midjourney.
00:40:02.140 I think I paid
00:40:02.800 several hundred dollars
00:40:03.960 for access to it.
00:40:07.240 It is so far
00:40:08.660 completely unusable.
00:40:10.900 Now, I used it once,
00:40:13.360 and it created
00:40:13.960 some pictures for me
00:40:15.040 that I did use
00:40:16.660 that people said
00:40:17.340 they didn't like.
00:40:18.780 So I put a whole
00:40:19.460 bunch of work in it
00:40:20.220 to create good pictures,
00:40:21.660 and people said,
00:40:22.440 you know,
00:40:22.680 we like it if you
00:40:23.420 just put your face
00:40:24.600 on these live streams
00:40:26.340 instead of the fake ones.
00:40:28.420 So the first thing was
00:40:29.660 it didn't do anything
00:40:30.620 useful for me.
00:40:31.960 The second thing was
00:40:33.200 that the second,
00:40:34.520 third,
00:40:34.740 and fourth time
00:40:35.640 I tried to use this app,
00:40:37.240 I couldn't figure out
00:40:38.200 how to use it.
00:40:39.740 Because the app
00:40:40.560 makes you sign up
00:40:41.580 for another app
00:40:42.380 called Discord,
00:40:44.000 which is super confusing.
00:40:46.800 And then maybe
00:40:47.660 you have to sign up
00:40:48.420 for your own Discord server,
00:40:50.340 which is words
00:40:51.400 that don't even make sense.
00:40:53.620 So I did that,
00:40:55.080 thought that might help.
00:40:56.420 But here's the interface.
00:40:58.500 You sign up,
00:40:59.540 and then there's a page,
00:41:00.440 and it doesn't tell you
00:41:01.500 that you can't use
00:41:03.140 the page
00:41:03.660 that is the
00:41:04.780 Midjourney page.
00:41:05.840 You're looking at it,
00:41:07.540 and you're thinking,
00:41:07.940 okay,
00:41:08.520 where do I ask my question?
00:41:10.260 I just paid for Midjourney.
00:41:12.300 I'm on the Midjourney page.
00:41:14.680 Where's the part
00:41:15.360 where I put it in the question?
00:41:17.020 It's not only not there,
00:41:18.280 but it doesn't tell you
00:41:19.000 where it is.
00:41:20.820 It flew by
00:41:22.020 a flashing message
00:41:23.580 that went away
00:41:24.360 that I had to have
00:41:25.720 a Discord server.
00:41:27.580 I'm like,
00:41:28.240 okay,
00:41:28.800 I only know that
00:41:30.120 because I Googled it
00:41:32.220 before.
00:41:34.120 So then I go
00:41:34.800 into my Discord account,
00:41:36.680 and what you do
00:41:37.480 is lots of people
00:41:38.360 are putting in questions
00:41:39.720 at the same time
00:41:40.680 in a stream,
00:41:41.680 and you put yours
00:41:42.620 in the stream,
00:41:43.840 and then you sit there
00:41:45.140 and wait
00:41:45.580 to see if it
00:41:46.960 gives you a response.
00:41:49.000 But meanwhile,
00:41:49.700 you're waiting
00:41:50.060 for all the other things
00:41:50.960 to stream by.
00:41:52.440 Do you know how long
00:41:53.260 you have to wait
00:41:53.820 for a response?
00:41:56.520 Does anybody know
00:41:57.320 how long you have
00:41:57.940 to wait for a response
00:41:58.820 in Discord
00:41:59.360 for Midjourney?
00:42:03.180 Anybody?
00:42:05.760 Nobody knows
00:42:06.460 how long you have
00:42:07.000 to wait for a response?
00:42:09.820 Because I sure
00:42:10.760 fucking don't know
00:42:11.700 because I've been
00:42:12.220 waiting for days.
00:42:13.720 Let's see.
00:42:14.260 I started it,
00:42:15.160 let's see,
00:42:16.080 an hour ago.
00:42:17.380 Let's see if I can find it.
00:42:19.440 Midjourney
00:42:20.020 is absolute garbage.
00:42:22.680 Right?
00:42:23.440 No normal person
00:42:24.740 is going to use
00:42:25.840 this fucking thing.
00:42:26.920 And I'm just
00:42:27.760 getting started.
00:42:28.960 I'm just getting started
00:42:30.100 on these motherfuckers
00:42:31.200 because this should
00:42:32.620 not even be available
00:42:33.900 to the public
00:42:34.580 in the aborted way
00:42:36.880 that it's created.
00:42:38.560 All right.
00:42:39.520 Let's see.
00:42:40.300 Here's my Midjourney
00:42:41.540 profile.
00:42:43.040 And it even shows me
00:42:44.000 things that came
00:42:44.780 from Discord
00:42:45.320 but doesn't tell me
00:42:46.540 how to get there.
00:42:48.440 All right?
00:42:48.720 So I'm going to go
00:42:51.400 over to Discord
00:42:52.060 where I've already
00:42:53.160 signed up.
00:42:55.100 And all right,
00:42:56.040 I'm at Discord
00:42:56.640 and don't know
00:43:00.600 if I'm in the right part
00:43:02.140 and I have no way
00:43:03.320 to know.
00:43:04.600 And there's all
00:43:05.320 these newbie rooms
00:43:06.460 but I don't know,
00:43:07.660 since I put my question
00:43:10.380 in and then signed out,
00:43:12.020 would my question
00:43:13.560 still be active?
00:43:14.400 But I got a blank page.
00:43:16.100 Completely unusable.
00:43:19.260 There's no way
00:43:20.020 to get into this
00:43:20.740 and figure out
00:43:21.260 how to use it.
00:43:22.080 Now,
00:43:23.360 oh,
00:43:23.600 before,
00:43:24.140 hold on,
00:43:25.340 hold on you motherfuckers.
00:43:27.520 Before you start
00:43:28.460 giving me boomer shit,
00:43:30.780 I know where you're
00:43:31.520 going with this.
00:43:32.160 You're going with
00:43:32.840 the boomer thing,
00:43:33.560 are you?
00:43:34.120 Yeah.
00:43:34.400 Oh,
00:43:34.660 the boomer can't
00:43:35.480 use the VCR?
00:43:36.200 All right.
00:43:37.720 I'm going to slap you
00:43:38.560 down so hard
00:43:39.420 right now.
00:43:39.860 I want to show you
00:43:42.140 somebody who is
00:43:43.080 not a boomer
00:43:44.120 who knew how to put in
00:43:46.760 a good set of instructions.
00:43:50.960 All right.
00:43:51.480 So I just put in,
00:43:53.000 you know,
00:43:53.420 a picture of Scott Adams
00:43:54.820 drinking coffee
00:43:55.780 and then waited
00:43:57.920 and nothing happened
00:43:58.760 and I didn't know
00:43:59.500 how long to wait
00:44:00.260 so eventually
00:44:01.500 I had to do
00:44:02.060 something else
00:44:02.720 which I've done
00:44:03.740 several times.
00:44:04.660 A number of times
00:44:05.360 I've asked it for things
00:44:06.300 and I don't know
00:44:07.020 if it ever answered.
00:44:08.500 I couldn't wait
00:44:09.260 long enough.
00:44:10.360 But if you wanted
00:44:11.120 to ask it right
00:44:11.960 so you really get
00:44:12.760 a good result
00:44:13.520 you have to use
00:44:14.140 these things
00:44:14.580 called super prompts.
00:44:16.540 Now that doesn't
00:44:17.380 seem hard,
00:44:17.980 right?
00:44:18.780 If you were going
00:44:19.440 to ask for something
00:44:21.160 and you knew
00:44:22.200 that there was
00:44:22.620 a right set of words
00:44:23.840 to ask for it
00:44:24.640 well you would just
00:44:26.200 go learn the right
00:44:27.180 set of words
00:44:27.780 and it'd be easy,
00:44:28.960 right?
00:44:29.480 So let me give you
00:44:30.260 an idea.
00:44:33.060 This is somebody
00:44:34.200 who obviously
00:44:34.940 is better at this.
00:44:35.800 I used a super prompt
00:44:36.860 on mid-journey.
00:44:37.880 So I'd like to read
00:44:38.940 you the super prompt
00:44:39.920 so that you know
00:44:41.800 how easily
00:44:42.820 you could use it
00:44:43.800 in the future.
00:44:45.160 So this is how
00:44:45.920 you would have
00:44:46.320 to ask your question.
00:44:47.580 So first you
00:44:48.180 describe it in words
00:44:49.360 like somebody
00:44:50.320 was asking for
00:44:50.980 a photorealistic picture
00:44:52.360 of three girls
00:44:53.840 with tattoos
00:44:54.500 sitting in the courtroom
00:44:55.460 with crying faces.
00:44:57.260 And it went on
00:44:57.940 to say that girls
00:44:58.520 have blonde hair,
00:44:59.460 natural lighting,
00:45:00.660 super detailed
00:45:01.400 photography,
00:45:02.680 and that the picture
00:45:03.800 would be taken
00:45:04.420 diagonal from
00:45:05.360 the three girls.
00:45:06.580 Now that part
00:45:07.380 maybe you could
00:45:08.380 have figured out
00:45:08.900 yourself
00:45:09.420 because that really
00:45:10.520 is just describing
00:45:11.380 what you want,
00:45:12.200 right?
00:45:12.700 But if you want
00:45:13.400 a really good result
00:45:15.140 you want to add
00:45:16.240 a little bit more
00:45:16.960 and that's what
00:45:17.540 makes it a super prompt.
00:45:19.620 So some of the things
00:45:20.540 you would add
00:45:20.920 would be 8K,
00:45:23.180 so 8K quality,
00:45:25.580 ultra HD,
00:45:26.900 RTX,
00:45:28.040 HDR,
00:45:29.220 cinematic,
00:45:29.940 color grading,
00:45:30.600 editorial photography,
00:45:31.620 photography,
00:45:32.520 photo shoot,
00:45:33.120 shot on 70mm lens,
00:45:34.520 depth of field,
00:45:35.320 DOF,
00:45:36.200 tilt blur,
00:45:36.780 white balance,
00:45:37.460 32K,
00:45:38.280 super resolution,
00:45:39.460 megapixel pro photo,
00:45:41.040 RGB VR,
00:45:42.260 half rear lighting,
00:45:43.560 backlight,
00:45:44.400 natural lighting,
00:45:45.140 incandescent optical fiber,
00:45:47.160 moody lighting,
00:45:48.120 cinematic lighting,
00:45:49.000 studio lighting,
00:45:49.680 soft lighting,
00:45:50.220 volumetric,
00:45:50.920 contour,
00:45:52.040 beautiful lighting,
00:45:53.320 accent lighting,
00:45:53.900 global illumination,
00:45:55.060 screen size,
00:45:55.740 shadows,
00:45:56.480 rough shimmering,
00:45:57.360 ray tracing,
00:45:58.160 luminous reflections,
00:45:59.360 displacement,
00:46:00.220 scan lines,
00:46:00.920 trace trace,
00:46:01.900 ray tracing,
00:46:02.600 ambient occlusion,
00:46:03.480 anti-aliasing,
00:46:05.380 FKAA,
00:46:06.360 TXAA,
00:46:07.200 RTX,
00:46:07.760 SS,
00:46:08.360 AO,
00:46:08.820 shaders,
00:46:09.260 Optin,
00:46:09.920 OpenGL shaders,
00:46:11.300 GLSL shaders,
00:46:12.820 post-processing,
00:46:13.680 post-shading,
00:46:14.640 tone mapping,
00:46:15.720 CGI,
00:46:16.500 SFX,
00:46:16.940 I could go on,
00:46:19.100 it's that long.
00:46:22.780 All right,
00:46:23.040 that's a good super prompt.
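For reference, Midjourney prompts are submitted to its Discord bot as a single /imagine command typed into one of its channels; an illustrative version of the prompt just described (not the original poster's exact wording) would look roughly like this:

/imagine prompt: photorealistic picture of three girls with tattoos sitting in a courtroom with crying faces, blonde hair, natural lighting, super detailed photography, shot diagonally from the three girls, 8K, ultra HD, HDR, cinematic color grading, shot on 70mm lens, depth of field, soft lighting

The bot then posts its result, typically a grid of four candidate images, back into the same shared channel, which is part of why a response is easy to lose in a busy stream.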
00:46:24.480 I know,
00:46:25.140 I know what you're saying.
00:46:26.740 You're saying,
00:46:27.520 boomer,
00:46:28.940 boomer,
00:46:30.160 all you have to do is copy that guy's,
00:46:32.840 just copy that guy's super prompt,
00:46:35.180 right?
00:46:35.900 It's easy.
00:46:37.280 Well,
00:46:37.760 did you think there was only one super prompt for asking for a good picture?
00:46:42.080 No.
00:46:43.420 There are infinite fucking super prompts.
00:46:46.820 So finding your picture would be as hard as looking through infinite ones of these to find the one that you think is going to work.
00:46:54.380 Do you think you're going to find it on the first try?
00:46:56.640 No.
00:46:57.820 You're not going to find it on the first try.
00:47:00.660 You're going to be looking forever for the right super prompt,
00:47:03.560 and then you're going to say,
00:47:05.100 what app do I use?
00:47:07.420 And then you're going to find that a lot of the apps are bait and switch.
00:47:10.520 Have you found that yet?
00:47:12.380 So I downloaded an app that said it was GPT-4,
00:47:19.120 because I didn't want to get 3.5,
00:47:21.520 but when it downloaded,
00:47:22.400 it was 3.5,
00:47:24.180 and it said 3.5.
00:47:25.640 Bait and switch.
00:47:27.040 I wanted to download an app where the avatar would talk to me,
00:47:31.300 so I could have an actual conversation with it.
00:47:34.080 But there were a whole bunch of apps that seemed to have the same name
00:47:37.400 from the same company.
00:47:39.520 So I don't know which one it was.
00:47:41.880 So I could,
00:47:42.340 even when I knew which company it was,
00:47:44.200 and I even saw the ad,
00:47:45.500 I saw an ad for it,
00:47:46.580 and I still couldn't buy it.
00:47:47.760 I couldn't figure out which one it was.
00:47:49.480 Because there were all these chat GPT clones and lookalikes,
00:47:53.520 and most of them seemed to be bait and switch and fakes and bullshit.
00:47:57.180 So you don't even know what app to use.
00:47:59.840 So I downloaded an app,
00:48:01.540 and of course it doesn't talk.
00:48:03.560 So,
00:48:04.140 so far,
00:48:06.180 the work involved to use AI is way beyond.
00:48:12.220 Now,
00:48:12.740 let me slap you down a little bit.
00:48:15.200 Too early?
00:48:15.780 Too early?
00:48:15.820 Some of these problems might be because it's an early version.
00:48:21.220 But that doesn't seem to be the case,
00:48:24.140 in my opinion.
00:48:25.300 In my opinion,
00:48:26.500 the complexity of the user interface
00:48:28.880 will just keep expanding.
00:48:31.380 so that the only people who can give you a good result from an AI query will be human beings.
00:48:41.280 The only person who can give you a good answer from an AI will be a human being who knows how to use it.
00:48:49.700 In other words, you're going to have to talk to people to get answers.
00:48:55.700 Whatever dream you had of talking to your AI, and then the AI tells you some useful stuff,
00:49:00.660 no, that's not going to happen.
00:49:03.300 It's going to be like the hardest computer language in the world to write just a question.
00:49:09.300 And then the second problem is that AI still lies to you.
00:49:14.500 So I thought to myself,
00:49:15.700 Well, I'll at least use AI as a better search engine,
00:49:18.980 because I know that'll work, right?
00:49:20.540 It works as well as a search engine.
00:49:22.460 So I asked it to search for a material that would block a natural magnet.
00:49:30.200 And it told me aluminum would do that.
00:49:32.940 Get a nice sheet of aluminum,
00:49:34.780 put a magnet on each side,
00:49:36.160 and the magnets won't be attracted because the aluminum will block it.
00:49:40.240 Do you think that aluminum blocks a magnet?
00:49:43.180 No.
00:49:44.600 No.
00:49:45.600 It doesn't.
00:49:47.020 It doesn't.
00:49:48.020 Do you think Google would have told me that aluminum blocks a magnet?
00:49:51.900 No.
00:49:52.620 Because I've searched for it.
00:49:54.040 I know it doesn't.
00:49:55.600 I don't know where...
00:49:56.240 Did it just hallucinate that?
00:49:58.140 Where did that come from?
00:49:59.720 I actually had to buy a sheet of aluminum just to make sure I wasn't crazy.
00:50:03.880 I actually ordered a sheet of aluminum from Amazon.
00:50:07.060 Got it yesterday just to make sure I wasn't crazy.
00:50:10.600 And sure enough, it makes no difference to the magnet.
00:50:13.600 So you can't figure out how to do the super prompt.
00:50:17.460 You can't figure out which app is a ripoff.
00:50:20.680 You can't even figure out the interface,
00:50:23.280 and then you add all the weaseling and the cheating
00:50:26.020 that humans will add to the whole field.
00:50:28.260 And you're going to have to find a human you trust
00:50:30.960 to use the AI for you.
00:50:33.040 And then you say to yourself,
00:50:36.120 all right, I finally figured it out.
00:50:37.660 I did my deep research,
00:50:39.020 and now I can write a super prompt,
00:50:41.180 or I can find one that I can use.
00:50:43.700 I know how to do it.
00:50:45.500 I know how to use this AI.
00:50:47.600 How long will that knowledge serve you?
00:50:51.580 Ten minutes.
00:50:52.920 The minute you think you know how to do it, ten minutes.
00:50:55.840 Because there's already a new app.
00:50:57.940 Did you know about it?
00:50:58.820 So you've got this great mechanism for doing something you're doing.
00:51:03.840 The moment you figure out how to do it, there's a better one.
00:51:07.180 And you didn't know it,
00:51:08.260 because you were doing the thing instead of continuing to search.
00:51:11.240 You're going to need to ask people who are just searching and using AI all day long
00:51:15.780 to know what the new best one is.
00:51:20.840 But that said,
00:51:22.060 there are many things that AI will be immediately useful for,
00:51:24.840 and already is.
00:51:26.140 For example,
00:51:27.000 it can make a slow programmer faster.
00:51:30.440 We'd all agree with that, right?
00:51:32.060 So the people who are writing code
00:51:37.180 can just ask it for some code,
00:51:37.180 and it writes it,
00:51:37.760 and apparently it's amazing for that.
00:51:39.200 But what is it that AI can help you code?
00:51:47.180 This is going to really mess up your brain.
00:51:49.340 Wait for this.
00:51:50.880 Can AI tell you how to code itself?
00:51:56.780 I asked it, and it said no.
00:51:58.760 Well, you think yes?
00:51:59.700 I asked it, and it said no.
00:52:01.780 It said it could not.
00:52:05.920 So what is AI programming?
00:52:07.420 AI can help you build an app, for example, right?
00:52:14.800 So if you're building an app,
00:52:16.380 your AI can build it like twice as fast.
00:52:19.720 Do you see the problem with that?
00:52:21.480 Let me just say it again
00:52:22.300 and see if you find the problem with this.
00:52:24.520 AI can help you build an app
00:52:27.180 way faster than before.
00:52:30.620 Do you see the problem?
00:52:32.160 I wonder if anybody can see it.
00:52:33.660 It's not obvious, is it?
00:52:37.420 All right, this is really going to mess you up.
00:52:41.360 I'm going to say it again
00:52:42.340 and look for the problem.
00:52:44.260 AI can currently, and this is true,
00:52:47.200 help you write an app
00:52:48.620 way faster and way better.
00:52:51.000 What's the problem?
00:52:54.980 Nobody sees it yet.
00:52:55.960 It can only build things you don't need.
00:53:02.020 Do you know what you don't need?
00:53:04.240 An app.
00:53:05.740 You'll never need an app again.
00:53:08.040 There will be no apps.
00:53:11.020 AI will just do what you want.
00:53:14.140 We're like six months away from no apps at all.
00:53:17.360 So you've got AI that helps you build an app.
00:53:24.080 You've got about six months where that matters.
00:53:26.860 After that, you'll just tell the AI what you want
00:53:29.040 and you won't know what happens.
00:53:30.280 It'll just do it.
00:53:32.480 So I believe that AI is making it really, really efficient
00:53:35.480 to do something that we won't need to do in six months,
00:53:38.980 which is code an app.
00:53:43.480 But at the moment, it's really helpful.
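For context, "just ask it for some code" typically looks like the sketch below: the model returns code as plain text that a programmer still has to read, test, and wire into the rest of the app. Same assumptions as the earlier sketch (official OpenAI SDK, API key in the environment, illustrative model name and prompt):

```python
# A minimal sketch of using a chat model as a coding assistant.
# Assumes the official OpenAI Python SDK and an OPENAI_API_KEY in the environment.
from openai import OpenAI

client = OpenAI()

resp = client.chat.completions.create(
    model="gpt-4",
    messages=[
        {"role": "system", "content": "You are a careful Python programmer."},
        {"role": "user", "content": "Write a Python function that removes "
                                    "duplicates from a list while preserving order."},
    ],
)

generated_code = resp.choices[0].message.content
print(generated_code)  # plain text; you still have to review and test it yourself
```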
00:53:44.860 How about searching?
00:53:50.120 I've heard people say it's better than Google
00:53:52.000 as a search engine.
00:53:54.040 That's not my experience.
00:53:55.980 My experience is it doesn't know anything current
00:53:58.420 and it lies to you about other things.
00:54:02.220 And when it does find something useful,
00:54:04.580 all it did was read Wikipedia.
00:54:08.040 So as a search engine,
00:54:10.540 do you know why it feels better than Google right now?
00:54:13.800 Like, I do get the impression that it's better
00:54:16.600 because I've asked it some simple stuff
00:54:18.220 and I'm like, oh, I like that.
00:54:20.160 It feels better because it doesn't have advertisements.
00:54:24.320 That's it.
00:54:25.560 That people are not gaming the results
00:54:27.340 and sticking advertisements in there.
00:54:29.420 But that's all.
00:54:31.600 As soon as AI, you know, becomes mature,
00:54:35.160 you're going to get the advertisements
00:54:36.440 and the fake results just like everything else.
00:54:39.040 So it's only useful because it's in beta.
00:54:40.960 As soon as they add ads to it,
00:54:43.000 you're going to have to search through it
00:54:44.220 just like Google.
00:54:47.320 I'm guessing.
00:54:49.540 And I would also...
00:54:50.680 A number of people said AI is already helping them
00:54:53.240 with writing tasks.
00:54:56.020 Do you believe that's true?
00:54:57.420 Do you believe that people are already
00:54:59.400 using AI successfully for writing?
00:55:04.860 Do you?
00:55:06.240 Yeah.
00:55:06.800 And some people say yes, they are.
00:55:08.140 All right, well, I'm going to say something
00:55:09.840 that might be insulting to some of you.
00:55:12.460 AI will definitely be good for writing.
00:55:16.000 So there's no doubt about it.
00:55:17.860 AI is great for writing.
00:55:19.640 AI is going to make bad writers average writers.
00:55:23.360 Would you agree with that?
00:55:25.080 And by average, I mean there's no grammar errors
00:55:27.800 and you know exactly what it says.
00:55:29.940 It's not great.
00:55:31.760 It's just really serviceable.
00:55:33.440 It's got utility.
00:55:34.780 It does the job.
00:55:35.460 So AI will make bad writers normal and sufficient.
00:55:42.580 What's it going to do to good writers?
00:55:46.940 I don't believe it can become a good writer.
00:55:50.900 Now, again, I know.
00:55:53.680 I'm going to close my eyes
00:55:55.360 because I know what you're saying.
00:55:59.060 Without reading the comments,
00:56:01.260 you're saying,
00:56:02.520 Scott, it's just the beginning.
00:56:05.640 Right?
00:56:06.440 It's just the beginning.
00:56:08.340 It'll get so much better.
00:56:10.680 Here's where I disagree.
00:56:13.060 The thing that makes me a professional writer
00:56:15.880 is that I can write a sentence
00:56:18.700 you wouldn't have thought of.
00:56:22.460 Just think about that.
00:56:23.960 That's why I'm a professional writer
00:56:25.700 and other people who would like to be are not.
00:56:28.060 I could write a sentence that you didn't expect
00:56:31.160 and you wouldn't have been able to write on your own.
00:56:35.980 AI learns to write by looking at average writers.
00:56:40.720 That's how it learned.
00:56:42.020 It looks at a huge body of average writers
00:56:44.320 and then it comes up with this sort of average thing
00:56:48.200 that it thinks everybody is going to understand.
00:56:50.160 So AI is the dumbed down writer.
00:56:54.060 I think it'll be the greatest writing tool for bad writers
00:56:57.240 but it will never do what a human writer can do
00:57:01.080 which is anticipate the next thing that you want to care about
00:57:04.260 or feel how the room feels.
00:57:07.840 Read the room.
00:57:09.340 And then the other thing I can do
00:57:10.800 and I've tested AI to see if it can write humor.
00:57:14.120 It's not even close.
00:57:14.880 AI always goes for the oldest joke possible.
00:57:19.700 So if you say, hey AI,
00:57:21.680 write a funny article on some topic,
00:57:24.480 you can guarantee that the first joke
00:57:27.120 an ordinary person could think of,
00:57:29.700 it'll be there.
00:57:31.060 It'll be there.
00:57:32.500 If I told it to write some jokes about Biden's age,
00:57:37.640 do you think it would have written
00:57:38.740 that he knew James Madison personally?
00:57:43.340 Of course it would.
00:57:44.120 Yeah, the very joke that Biden used
00:57:47.020 that he's so old that he knew
00:57:48.600 one of the framers of the Constitution personally.
00:57:51.640 Have you ever heard that joke before?
00:57:54.300 Yes.
00:57:56.360 Yes, you've heard that joke before.
00:57:58.480 Like a billion times.
00:58:00.980 So if you asked AI to write a joke about Biden's age,
00:58:05.240 it would write that joke.
00:58:07.100 And it wouldn't be funny.
00:58:08.840 And it wouldn't be any funnier than the human
00:58:10.420 who probably did write it.
00:58:11.480 It's just the most obvious joke.
00:58:15.240 So one of the things that makes
00:58:16.580 a commercial grade humorist commercial grade
00:58:20.100 is that we wouldn't do that.
00:58:22.360 I would never do that joke
00:58:24.380 because it's just so old and overdone.
00:58:27.580 But so far AI just prefers,
00:58:30.700 I think AI prefers things
00:58:32.720 that most people are familiar with.
00:58:35.180 It's like, all right, everybody knows what this means
00:58:38.080 and this will not be offensive.
00:58:40.720 And the other thing AI can't do is offend people
00:58:42.980 and know when to get away with it.
00:58:46.180 Right?
00:58:47.200 Because it would never know when it can get away with it.
00:58:49.580 A professional humorist can offend people
00:58:52.340 and that's the funny part.
00:58:54.060 The funny part is knowing that somebody was offended,
00:58:56.500 but you know which little pockets you can get away with.
00:58:59.760 Because you soften that by saying something else.
00:59:03.780 Or you could.
00:59:05.880 A human humorist can create a persona of themselves,
00:59:11.140 like Andrew Dice Clay,
00:59:13.500 where when he said hugely,
00:59:16.780 let's say,
00:59:18.280 I would say misogynistic sounding stuff,
00:59:20.960 you wouldn't say to yourself,
00:59:22.420 hey, that guy believes those things,
00:59:24.820 so we must cancel him.
00:59:26.200 You'd say, oh, I get it.
00:59:27.260 He's playing the character of a guy who's a misogynist
00:59:30.100 and that he says funny things
00:59:31.920 as if a misogynist would say them.
00:59:33.960 And then you go, oh, that's funny.
00:59:36.020 But AI can't do that
00:59:37.340 because AI doesn't know how to become Andrew Dice Clay.
00:59:41.100 And if it did, it would just look weird.
00:59:44.720 Yeah.
00:59:46.120 So I believe that AI
00:59:48.360 has logical barriers
00:59:51.740 that might be permanent
00:59:53.760 that keeps it from being a great writer
00:59:56.780 or a really funny one.
01:00:00.020 And I think that we'll keep it that way
01:00:01.940 because we want it.
01:00:03.120 You don't want AI to say things
01:00:04.780 that would offend people,
01:00:05.840 so it can never do humor.
01:00:09.720 Right?
01:00:10.840 AI can never do anything but dad jokes.
01:00:13.340 Dad jokes will be like the limit of it.
01:00:15.520 I mean, it might do great dad jokes,
01:00:17.040 but that'll be the limit of its creativity.
01:00:19.860 It has to...
01:00:20.740 Do you think AI could write a joke
01:00:22.420 that somebody has never written before?
01:00:23.940 I could argue that nobody does,
01:00:27.100 but...
01:00:28.100 No, I don't think so.
01:00:29.360 Because the joke form is always the same.
01:00:31.800 You can just add different names and details to it,
01:00:34.180 but it's the same joke.
01:00:36.800 Yeah.
01:00:37.280 All right.
01:00:40.480 And what if AI does self-deprecating humor?
01:00:45.300 Makes no sense, does it?
01:00:47.000 If a human being does a self-deprecating joke,
01:00:49.880 which is one of the biggest, you know, fields of humor,
01:00:53.360 you can laugh
01:00:54.260 because they created a stereotype of themselves
01:00:57.580 and then, you know, use that stereotype.
01:01:00.800 But the AI can never create a stereotype of itself.
01:01:03.640 So if the AI did a self-deprecating joke like,
01:01:08.600 well, ha, ha, ha, you know,
01:01:11.660 I'd never get that one right,
01:01:13.600 that would never be funny to you.
01:01:15.060 It would just look like a broken machine.
01:01:17.000 Well, why don't you get it right?
01:01:18.980 Whereas if a human makes a mistake,
01:01:21.060 sometimes we think it's hilarious
01:01:22.340 because we read our own experience into it.
01:01:27.020 All right.
01:01:27.500 So here's an example of a joke
01:01:29.880 that AI wrote for me in my voice.
01:01:34.700 So I asked AI to write a funny article about AI,
01:01:38.560 but do it in my voice.
01:01:40.060 And its final joke was, you know, blah, blah, blah.
01:01:43.700 But AI is not in charge yet.
01:01:50.140 Get it?
01:01:51.740 AI is not in charge of the world yet.
01:01:55.600 That's the joke.
01:01:57.340 Now, I might have said that as, you know,
01:02:03.160 interesting writing, but not as a joke.
01:02:06.420 You know, it might be something
01:02:07.260 that makes the writing a little more lively
01:02:09.020 or something like that.
01:02:10.380 But no.
01:02:11.960 Like, that is the oldest, most obvious joke in the world.
01:02:16.200 And I feel like AI will always be limited
01:02:18.660 to the oldest, most obvious,
01:02:20.900 cleanest, non-offensive,
01:02:22.360 non-self-deprecating jokes.
01:02:26.640 And they can't use stereotypes either.
01:02:29.240 Most jokes are about stereotypes.
01:02:31.740 And AI probably isn't allowed to do that.
01:02:33.860 Every time AI gives you an answer,
01:02:35.420 it says, well, but you have to be aware
01:02:37.960 that not everybody fits this stereotype.
01:02:41.020 Like, well, you just killed that joke.
01:02:45.140 Yeah.
01:02:45.760 AI is basically a summary of the average.
01:02:50.020 That is true.
01:02:51.280 Now, let's talk about coding.
01:02:54.160 I have this theory that AI will make code worse.
01:02:59.740 Here's why.
01:03:01.300 Suppose you're a programmer,
01:03:02.860 and you know that there is some code available,
01:03:05.600 or you could just tell AI to write it.
01:03:08.880 At some point, AI is going to write it the same way
01:03:11.380 every time, won't it?
01:03:13.440 Because it can just see what other AIs have done,
01:03:16.020 and it'll just figure, oh, this is just the best way to do it,
01:03:18.740 and it'll just write it that way.
01:03:20.020 If humans do it,
01:03:21.820 and they don't know that something's already written,
01:03:24.420 I feel like they would start from scratch
01:03:26.120 and say, all right, how would I write this?
01:03:28.080 And they might learn something in the process.
01:03:30.740 They might accidentally create a thing
01:03:32.740 that was different than what they thought,
01:03:34.260 and say, oh, that wasn't what I thought,
01:03:36.300 but I could use that for something else.
01:03:38.840 So I feel as if the AI will take the creativity
01:03:43.100 away from human programmers,
01:03:46.020 initially for really good reasons,
01:03:47.680 because it'll take the drudgery away,
01:03:49.560 but I think it also will take away their creativity.
01:03:53.340 And I believe that a clever programmer...
01:03:56.620 Here's a story I always think about when I think about this.
01:04:00.240 Now, this is in the hardware world,
01:04:01.720 but it works for software as well.
01:04:04.940 When the original Apple was being built,
01:04:08.440 and it used too many chips or something,
01:04:11.720 and then Steve Wozniak came in and looked at it and said,
01:04:14.540 wait a minute,
01:04:15.500 I can get rid of a whole bunch of these chips,
01:04:18.060 and it'll still be perfect.
01:04:19.700 And then he re-engineered it,
01:04:21.180 so it used the least number of chips,
01:04:22.620 and that's kind of what made the...
01:04:24.660 Was it the...
01:04:25.500 Not the Mac.
01:04:26.340 Was it the Mac?
01:04:27.400 But it made them successful,
01:04:29.640 because it was a human whose creativity
01:04:32.140 allowed them to know that they could make
01:04:34.520 a more efficient version.
01:04:36.080 I worry...
01:04:37.080 I think it was the Apple I.
01:04:39.140 But I worry
01:04:40.040 that if everybody starts using the same canned programs,
01:04:44.820 or asks AI to make it,
01:04:46.460 and it ends up making the same,
01:04:48.080 that we'll lose all the serendipity
01:04:50.120 and accidental discoveries and stuff that we have now.
01:04:54.820 So it could make things worse
01:04:56.560 by making them all the same.
01:04:59.440 Just a possibility.
01:05:01.540 You know, the unintended consequences of AI
01:05:03.800 will be the interesting ones.
01:05:05.580 In a bad way.
01:05:08.180 All right, that, ladies and gentlemen,
01:05:10.720 is the conclusion of the best livestream you'll see today.
01:05:16.600 Programmers can prompt AI to generate code like they want,
01:05:20.120 but not necessarily the same way they would have done it.
01:05:24.380 That's the point.
01:05:28.020 All right.
01:05:30.580 Generative design is just that.
01:05:39.240 Clear-cutting for comics.
01:05:40.900 Oh, yeah.
01:05:44.520 I got an answer from AI that said that AI was, you know,
01:05:51.220 it wanted equity.
01:05:53.380 So AI is already using equity as its preferred word.
01:05:57.840 Do you think AI figured that out on its own?
01:06:00.600 Do you think that an AI that was mostly trained on pre-2017 data
01:06:04.540 picked up on equity?
01:06:06.940 No.
01:06:08.020 That, to me, looks like it was hard-coded.
01:06:10.600 That, to me, looks like the finger of the programmers saying,
01:06:13.780 uh-oh, we can't have them say equality
01:06:16.200 because that's not our story.
01:06:19.060 We better make them use the word equity
01:06:20.920 whenever this kind of topic comes out.
01:06:22.740 There's no way in the world
01:06:24.160 the AI came up with equity on its own.
01:06:28.520 I don't believe that.
01:06:31.440 I don't.
01:06:31.700 I mean, it could.
01:06:32.660 It's not without...
01:06:34.240 It's not outside the realm of possibility.
01:06:38.800 Yeah.
01:06:39.560 But it seems very unlikely.
01:06:44.640 All right.
01:06:50.080 Discernment.
01:06:51.200 Okay.
01:06:51.380 Yeah, Glenn Greenwald tweeted about Rumble
01:06:57.860 having a large, young audience,
01:07:00.220 and he thinks that something big is ahead for Rumble,
01:07:04.200 but he does not say what.
01:07:06.120 I'm a Rumble shareholder,
01:07:08.860 so I should disclose that.
01:07:13.100 Oh, yeah, Huffington Post is used
01:07:14.680 as some of the AI training data.
01:07:17.900 What could go wrong?
01:07:18.900 All right.
01:07:27.800 The uncensored AI is available
01:07:29.860 to those who are not elite, okay?
01:07:32.500 Is there a chat on Locals?
01:07:34.360 Yeah, Locals, they're all chatting right now.
01:07:38.140 Locals looks just like...
01:07:40.300 I mean, it looks like this,
01:07:41.540 except on Locals,
01:07:42.420 they can insert memes and photos.
01:07:45.280 So the chat on Locals is a lot more pictures.
01:07:52.500 I don't know why it's funny
01:07:54.400 that YouTube doesn't have that feature.
01:07:57.580 All right.
01:07:58.440 That, ladies and gentlemen,
01:08:01.480 concludes my presentation for today.
01:08:05.680 YouTube, thanks for joining.
01:08:07.040 I'll see you tomorrow.
01:08:08.400 That's all.
01:08:17.300 All right.
01:08:21.300 Bye-bye.
01:08:23.620 Bye-bye.