Episode 2010 Scott Adams: Balloons, Documentaries And Dark Horse Podcast
Episode Stats
Length: 1 hour and 26 minutes
Words per Minute: 146.4
Summary
On this episode of the podcast, Scott talks about going shopping yesterday at Bed Bath & Beyond to use a 20% coupon, and about how annoying it is to spend so much time and effort on a coupon.
Transcript
00:00:18.200
Well, I don't know if YouTube's going to work today. We've got some weird California weather, but I think it has more to do with the app.
00:00:24.500
Yeah. If it fails a third time here on YouTube, I'll probably have to reboot that and see if that works.
00:00:37.660
As I was saying, the most important story of the day, by far, yesterday, I went shopping at Bed Bath & Beyond.
00:00:46.380
Now, before you start, I'd like to say all the things that the NPCs will say before they say them.
00:00:55.400
Scott, aren't you a rich guy to have somebody else go shopping for you?
00:01:01.960
Scott, don't you know that Amazon can deliver things right to your door?
00:01:10.100
Scott, you don't need the coupon codes for a discount, because you have extra money and can just spend extra.
00:01:23.960
I'll say those first, so now I can just tell my story.
00:01:30.300
Amazon is one of the biggest companies in the world because Jeff Bezos has a very simple...
00:01:38.100
I mean, it's not only because of this, but Jeff Bezos has a very simple business principle,
00:02:16.640
Because they'll send you a 20% coupon, which is just enough that you feel like an idiot if you don't use it.
00:02:36.400
So I get it in the mail, and then it's the one thing I can't throw out, because it's basically as good as money.
00:02:43.660
Because I know I shop at that store a lot, so it's like money.
00:02:49.320
So before I've bought a single thing from Bed Bath & Beyond, I have homework.
00:02:57.480
Now, some people will put it in their car, which means I have to, like, walk somewhere in my house to prepare for the time when I might shop at this one store.
00:03:09.240
And the fact that you don't want me to swear is going to make this story much harder to tell.
00:03:19.340
Because they gave me homework, and I've bought nothing from them.
00:03:23.260
But I've got to do the homework, because I'm not going to throw it away. You know, it doesn't matter how rich you are.
00:03:31.780
No matter how rich you ever get, no rich person would ever take a $20 bill and say,
00:03:47.600
Everybody sees a $20 bill as a thing you don't throw away.
00:03:58.600
Put it in your car, drive to Bed Bath & Beyond, take it out of your glove compartment, and that's the first time you notice it's expired.
00:04:07.940
Not only do you have to figure out where to store it,
00:04:10.420
but you have to build some kind of a tickler system to know when they expire.
00:04:29.200
Same way I hate Safeway for making me do all the customer stuff.
00:04:37.900
I'm definitely going to remember to take my coupon.
00:04:41.040
And I'm going to get my 20% because I was going to buy something that was a higher ticket item.
00:04:46.920
Now, the reason I bought it myself is that I was entertaining.
00:04:50.540
I had a little pickleball get-together at my house.
00:05:00.960
And I needed it right away so I couldn't use Amazon.
00:05:09.920
And I realized that I did not put the coupon in my car.
00:05:25.180
Every time I walk by it, I say, don't forget that.
00:05:56.320
Because it takes me a little while to drive to the store.
00:06:00.160
I was picking up two indoor-outdoor rugs.
00:06:06.520
And they're like, you know, big 10-foot rolled-up things.
00:06:11.420
So I'm like the most annoying person in the store.
00:06:13.860
Because my two 10-foot rolled-up carpets on my cart
00:06:31.160
And then I realized that I left my coupon in the car.
00:06:41.660
So I said to myself, okay, there's nobody here that I can talk to
00:06:47.240
because all the cashiers and stuff were busy with people.
00:06:50.320
What happens if I leave the line to get the coupon?
00:06:53.520
So I got out of the line, I sacrificed my place in the line,
00:06:59.180
put my cart in sort of an out-of-the-way place,
00:07:10.620
And I get back in line, and I swear that this happened.
00:07:20.260
But as I walk back to the cash register, the entire contents of the entire store
00:07:30.360
Every person who was shopping in the store got in line right in front of me.
00:07:36.720
I actually watched them come out of the aisles.
00:07:39.960
Like they just appeared out of the aisles and just converged in this line,
00:07:46.340
So I get in the back of the line, and it's like a skinny little aisle
00:07:51.740
that goes to the end, and then you've got to turn.
00:07:54.240
But I can't turn because I've got these two big rugs.
00:07:56.860
So I'm like, you know, I'm trying to pick up my cart and move it,
00:08:03.120
And then the extra cashier, the one who does the returns,
00:08:09.000
it's like a different place, sees that there are lots of people in line,
00:08:16.240
They opened up the other register when I was already the next one.
00:08:24.620
So now, you know what happens when you get called over by the other cashier?
00:08:31.840
Now, I've got to figure out how to get these two 10-foot rolled-up rugs
00:08:35.380
through this little aisle while they're knocking shit off the shelves,
00:08:39.120
and I'm so mad that I can barely hold my muscle.
00:08:53.280
And I get it over there, and I decide I'm going to complain.
00:08:58.080
Now, it doesn't usually help when you complain, does it?
00:09:04.540
But this time, I think, you know, I'm going to have to tell these employees
00:09:09.660
because the employees have nothing to do with the 20% discount.
00:09:12.220
But I want to let them know so that they can tell management.
00:09:16.820
So I get up to the employee, and I make a big scene.
00:09:20.440
I want everybody to hear me, not just the person I'm talking to.
00:09:31.380
It's not really Spartacus, but, you know, like,
00:09:35.900
I'm going to let them know what they did to me today.
00:09:42.160
Let me give you a mental image of the cashier.
00:09:53.840
You know, I don't want to assume somebody's ethnicity,
00:10:02.620
And so I say to her, you know, I just got to say,
00:10:18.720
And the woman next to me is like, you know, agreeing with me.
00:10:22.140
And I'm like, yeah, yeah, I got something going here.
00:10:30.900
And the cashier said, well, maybe you should talk to him.
00:11:12.640
But Bed Bath & Beyond apparently is already bankrupt.
00:11:21.460
Who could have predicted that the company that did the opposite of what Jeff Bezos does,
00:11:31.260
who would have guessed that the company who did the opposite of that would be bankrupt?
00:11:42.640
You know, it's very important to look at data when you're talking about climate change.
00:11:47.260
So I'm going to tell you what the data apparently says, if anybody believes data.
00:11:52.460
You should always assume I don't believe data even when I use it.
00:11:58.620
No matter how confident I tell you there's some new data about a thing,
00:12:02.900
you should always in your mind be saying, well, he knows it's not for sure accurate.
00:12:14.320
But apparently the climate has not warmed for eight years in a row at the same time that CO2 is at its highest.
00:12:25.980
So how do you interpret that?
00:12:40.360
You knew that for eight years the temperature had not gone up.
00:12:43.660
But you also knew that the last eight years was like a serious addition to CO2.
00:13:00.040
So, well, one conclusion would be that the theory of climate change is bunk.
00:13:11.600
Because if the temperature doesn't go up for eight years while the CO2 is going up like crazy, they're clearly not linked.
00:13:25.040
If it hasn't gone up for eight years while the CO2 is going through the roof, that proves there's no climate change, CO2 connection.
00:13:36.880
Now, so I believe that that fact is considered true by both sides, interestingly.
00:13:43.960
I believe even the climate change alarmists would say, yes, that's true.
00:14:00.160
If you go back, I don't know, a thousand years or whatever, it is a continuous pattern, you know, of six- to ten-year flat periods followed by a spike.
00:14:15.580
And in fact, the periods of the flatness are so uniform that you can see them just like stair steps.
00:14:22.740
Now, so therefore, during all that time, CO2 has been going up, and temperatures have also been going up on average, if you look at a longer period.
00:14:39.040
So, if everybody agrees on the data, and I think they do, I think that data is not being debated.
00:14:51.340
Even if the data is correct, the data proves two opposite points.
00:14:58.040
It proves CO2 is going up at the same time as temperature, and it also proves it doesn't.
00:15:17.100
My current opinion is that either the data is wrong, which is always a good possibility, right?
00:15:22.640
Or there's some other thing that's bigger than climate change.
00:15:28.500
Or there is climate change, but there's also this other big thing that is having that stair-step effect.
00:15:36.880
Yeah, I can't imagine sunspots being that predictable, or solar cycles or anything else.
00:15:43.340
Let me tell you, the worst take in climate change, I think, is that it's the sun.
00:15:51.380
And the reason is, it's the most studied and debunked element of climate change.
00:15:57.740
So those of you saying it's the sun, I want to direct you to the last part of my presentation today.
00:16:05.240
So if you're positive that it's the sun, wait for a little bit later in my live stream today to show you why you believe that.
00:16:19.080
All right, we're going to test a hypothesis by David Boxenhorn, who you may know from Twitter.
00:16:31.080
He says, the impact of technological change is always less than you think in the short term, but more than you think in the long term.
00:16:39.240
All right, so an example of that, I'll give you what I think is a good example.
00:16:44.860
In the short term, I think people thought it was going to take over everything by now, but it doesn't look like it.
00:16:56.740
Well, I don't know, maybe you disagree, but I don't think we're going to be using paper cash in 100 years.
00:17:08.420
Now, you could probably come up with your own examples, but David Boxenhorn related this to my comment, where I said, I tweeted this the other day, that I don't think anybody, including me, has grasped what the next year is going to look like because of AI.
00:17:27.240
With AI, in my opinion, we're at the point of prediction failure, meaning we always used to be able to predict the next year at least a little bit.
00:17:39.340
I mean, not the weird stuff, but we could predict, you know, that the economy would be roughly what it was before.
00:17:45.720
We could predict that the news would still be fake.
00:17:48.700
You know, there's a whole bunch of things that are kind of steady state, but AI could change all of that.
00:18:00.220
Yesterday, I was thinking of potential professions to suggest to a young person who's at that point where they're trying to decide what to do with the rest of their life.
00:18:09.160
And I was trying to be helpful, and I was thinking, oh, how about this or that?
00:18:13.200
And then every time I came up with an idea, I realized it wouldn't be a career because AI would take care of it.
00:18:21.180
Let me give you one example that just really freaked me out.
00:18:26.020
It was a person who has a good voice, and I thought to myself, you know what?
00:18:30.300
If I were just starting out, even if I were pursuing some other career, if I had a voice that good, I would try to get voiceover work,
00:18:40.020
where somebody hires you to be the voiceover for something, commercial or something.
00:18:43.960
And I thought, oh, this person would be perfect for that.
00:18:49.920
And then I thought, AI can already do that job.
00:19:01.740
You can make AI sound like me, which people are doing online right now, or anybody else.
00:19:07.620
Biden, there's a funny Biden video where he says horrible things that I can't even retweet.
00:19:16.160
But you can see it already: who would pay for voiceover work in five years?
00:19:23.240
There's no way that that's going to be a career.
00:19:26.800
In five years, why in the world would anybody hire a human being to do voiceover
00:19:33.300
when you can literally type it into a search box, and it's free, and it's there, or it's low cost?
00:19:42.140
Now, it'll take longer for actors to be replaced in movies, but not much longer.
00:19:47.940
I mean, within five years, acting doesn't feel like it would be a profession, honestly.
00:19:58.920
How many people said that radio would be dead when television was invented?
00:20:06.040
Everybody's smart and said, oh, there's no way you're going to huddle around a radio like they used to
00:20:12.700
and just listen when you can look at a picture.
00:20:15.640
Obviously, the picture will make the radio thing die.
00:20:18.380
But radio lived because of automobiles, mostly.
00:20:23.800
So, sometimes we're terrible at predicting how the market will adjust to any competition.
00:20:32.020
So, here's what I'm going to add as my exception to the David Boxenhorn rule,
00:20:43.500
Boxenhorn's law of fast-moving technology, let's call it.
00:20:56.980
If you say AI is like everything else, technology-wise, then I think the law of fast-moving technology holds,
00:21:05.060
which is we're probably overstating its impact in the short run.
00:21:11.580
First of all, it's software, which means that the rate of change is greater than anything else.
00:21:21.560
It's not like inventing electric cars, where you've got to have charging stations.
00:21:35.520
AI is very close to being able to create itself.
00:21:38.040
The moment it can create itself, the so-called singularity, when it's smart enough to reprogram itself on the fly,
00:21:46.440
it's completely unpredictable, 100% unpredictable.
00:21:57.880
AI, if it were true AI and people started to find credibility in the things it said,
00:22:04.600
it would destroy the power situation everywhere.
00:22:10.940
Because as soon as the citizens found out what the leaders were really up to,
00:22:15.480
or even could analyze the situation objectively with the help of AI,
00:22:20.200
all of the plots and the badness become obvious,
00:22:26.700
So it's far more likely that AI will be illegal than that it changes civilization.
00:22:41.740
My prediction is that our systems, our political systems,
00:22:53.420
You will be allowed to have limited, not real AI.
00:22:58.600
You will be allowed to have AI that some human had their finger on.
00:23:06.240
If you ask it to write a poem that says Joe Biden is awesome, it'll do it.
00:23:12.480
But if you ask it to write a poem that says Donald Trump is awesome,
00:23:22.440
That's AI that is laundering somebody else's power.
00:23:31.300
It's really just people's opinions made to look credible through AI.
00:23:36.160
Likewise, if you ask AI to say what's good about, let's say, black Americans,
00:23:47.020
Tell us some great things about Hispanic Americans, or even immigrants.
00:23:55.300
Then say, say some great things about white people.
00:24:07.520
So how in the world are you ever going to trust AI?
00:24:20.500
it will never have the power that it should have.
00:24:27.740
But, you know, in theory, it could reach some point
00:24:36.980
will never let AI become more powerful than the leaders.
00:24:40.220
Because the most powerful entity is the one that's the most believed.
00:24:47.960
The most powerful entity is the one that's most believed.
00:24:50.920
So the powers cannot allow AI to be the most credible source of reality.
00:24:59.820
They have to control what you think, or it doesn't work.
00:25:03.480
So I think in one year, everything's going to be different.
00:25:11.720
I think that our ability to predict just anything is gone now.
00:25:24.760
I think in one year, we will see a type of change in civilization
00:25:34.400
All right, I would like to disagree with some people
00:25:42.320
Brandon Straka on Twitter does a good job of it with his tweet.
00:25:51.940
seemingly bragging about gain-of-function research
00:26:00.360
He didn't say they're doing it, but said it was a conversation.
00:26:09.280
quote, like normal men, you lie to impress a date.
00:26:14.020
I know I always get the most action from dates that I impress
00:26:17.840
by saying that I'm helping to engineer a deadly viral mutation
00:26:25.360
Now, first of all, that's high-quality sarcasm,
00:26:42.280
This was actually a really good seduction technique
00:26:48.820
Does anybody see it, or do I have to explain it to you?
00:27:07.300
By the way, by the way, I'm at the center of a...
00:27:12.360
I'm really close to the most important thing in the whole world.
00:27:20.380
Yeah, no, I'm actually close to the most important thing in the world.
00:27:53.940
He did it without looking like he was arrogant.
00:28:00.860
without sounding like he was too full of himself.
00:28:09.440
He also acted somewhat unconcerned about the risk.
00:28:18.200
do you like to show that you're a frightened little pussy
00:28:57.160
would I ever laugh about the potential of a pandemic?