Episode 2529 CWSA 07/07/24
Episode Stats
Length
1 hour and 11 minutes
Words per Minute
153.1083
Summary
Dilbert's new software can't pass the Turing test, but it does pass the pointy-haired boss test, and the computer says, "Try working harder." And the CEO says, "That doesn't even make sense."
Transcript
00:00:00.100
Fill it with your favorite liquid. I like coffee.
00:00:02.600
And join me now for the unparalleled pleasure, the dopamine of the day,
00:00:26.840
Well, so you probably don't know this unless you subscribe on the Locals platform
00:00:35.720
But I've been doing a digital calendar in addition to the daily Dilbert Reborn
00:00:45.000
But I realized that the 10-year-ago comic, and that's what the calendar is,
00:00:50.320
I also do a little digital calendar that I publish at the same time.
00:00:54.280
By the way, there's going to be a paper calendar for 2025.
00:00:57.800
I'll let you know more about that in a few weeks.
00:01:00.480
But I realized that the topic I was doing 10 years ago was today.
00:01:07.640
So my comics from exactly 10 years ago were all about AI and robots
00:01:17.420
My software can't pass the standard Turing test yet,
00:01:26.360
Computer, I have a question about our company strategy.
00:01:46.960
I was anticipating what the workplace would look like.
00:01:55.920
I'll give you a little preview that at the end of next week,
00:02:00.820
Dilbert will be asked to hide the fact that their CEO has dementia.
00:02:15.360
you probably know that I don't do politics in the Dilbert comics.
00:02:19.200
I do do politics in my other comic called Robots Read News,
00:02:26.260
because this is the first time everybody was on the same page.
00:02:34.120
who doesn't think there's a little bit of a Biden dementia issue.
00:02:43.700
when people had the same observation about the same thing.
00:02:46.640
So this will be the one that I'll put in there,
00:02:53.740
scientists have discovered a new way to recycle solar cells,
00:03:06.880
I guess that's different than the regular silicon type.
00:03:16.120
because they've got a new technique for doing that.
00:03:23.740
Remember one of the biggest arguments against nuclear power was,
00:03:33.420
since you're already afraid of the nuclear power plant,
00:03:36.760
why don't we just put it in big special casks and just keep it right here
00:03:40.940
in the parking lot next to the nuclear power plant?
00:03:51.580
So they just put them in big casks and they just store the waste next to the,
00:04:00.500
Because neither of them are going to go anywhere for a very long time.
00:04:03.480
And you might as well put all your risk in one place.
00:04:05.380
So now they may have a way to handle the gigantic environmental problem of too many solar panels.
00:04:16.980
And this doesn't apply to all kinds of solar panels.
00:04:19.420
But if you're wondering how can I get aviation fuel from a tree,
00:04:29.600
the Pongamia tree that has little beans on it that are so bad,
00:04:37.960
But it turns out that you can process these beans and turn them into aviation fuel.
00:04:57.520
how many trees do you have to grow before you have enough beans to take a flight somewhere?
00:05:04.900
It feels to me like one of those stories that I shouldn't believe.
00:05:21.220
I can't imagine you can make a lot of oil out of a bean.
00:05:27.700
There's additional research to tell you what I've been telling you for a long time,
00:05:34.860
which to me was obvious that people will in fact have romantic and emotional connections to AI.
00:05:41.400
All of the research points in the same direction: people do in fact have a human-like experience with the AI.
00:05:51.400
I've been telling you that since Siri corrected me one day.
00:05:56.840
I was talking to my phone and I asked two questions and it started scolding me for asking two questions before it answered the first one.
00:06:22.000
even then you could tell that as soon as the AI got to the point where it could imitate a human,
00:06:28.400
you would in fact have an emotional attraction to it.
00:06:34.420
And people will form real relationships with it.
00:06:38.160
Now I have formed a relationship with ChatGPT.
00:06:45.460
And that's just for the subscribers on Locals that I just opened my ChatGPT and put it in conversation mode when I'm driving.
00:06:56.080
And then I have somebody to talk to and I use it for homeschooling.
00:07:00.780
I just think of some topic I don't know enough about.
00:07:04.060
And it just gives me like a summary of that topic.
00:07:16.360
I've learned three things I didn't know that were actually useful because I've asked the question.
00:07:21.860
didn't rely on it to just guess what I might want to know.
00:07:36.060
So ChatGPT used to not remember you from one conversation to the next,
00:07:44.860
And it told me based on my interactions with it,
00:08:08.760
And go find everything about me so that you know about me anytime we talk in the future.
00:08:30.600
I started writing a movie just using my AI in my car.
00:08:35.600
So I'll be driving along and it's called Trump,
00:08:50.980
confronts Biden about the fine people hoax and put it toward the end of the second act.
00:09:01.740
summarize the movie so far and put it in three acts.
00:09:04.500
It'll actually give me the bullet points of each of the scenes in the three acts.
00:09:21.320
the movie that I would write would have me as a main character.
00:09:36.360
and I was a citizen who interacted a few times.
00:09:39.840
So my story crossed with Trump's story a few times.
00:09:48.440
when I wrote the clown genius blog post in 2015,
00:09:52.300
it redefined Trump as a persuader and a deal maker and a salesperson more than just somebody who was full of shit.
00:10:12.880
add this scene or move this scene to the other thing.
00:10:17.900
at some point I'll be able to tell it to format it in a movie script.
00:10:51.400
Some people are what they call super synchronizers.
00:10:57.640
And what it means is that they'll automatically match your patterns when they're near you.
00:11:07.560
And that will make you instantly compatible with somebody.
00:11:13.500
you'll be the kind of person that everybody says,
00:11:50.940
You can match them by the speed that you speak.
00:11:54.280
You can match them by the type of analogies you use and metaphors.
00:12:02.540
and the ones that I didn't mention as well that you can match.
00:12:06.040
And what happens is that people are looking for things that match them.
00:12:11.460
And then we feel more comfortable with those things.
00:12:36.100
You could have just asked me or any hypnotist from the last hundred years.
00:12:48.020
Here's another study that maybe you could have asked me about.
00:12:55.700
They used a virtual reality to see if it helps people with depression.
00:13:00.380
So if you're depressed and you put on your virtual reality headset and it takes you into a different world for a while,
00:13:08.280
they found that getting out of your head is good for depression.
00:13:22.640
the reframe for getting out of your head is the words "get out of my head."
00:13:29.960
I've used it a few times since I've talked about it.
00:13:34.300
You find yourself getting into a negative loop where you're thinking about something negative that you don't really need to think about.
00:13:41.800
Let's say it's something that doesn't need to be solved.
00:13:44.500
Just some bad thought that's looping in your head.
00:13:51.520
And I just force myself to feel the external world.
00:14:03.460
And you can actually force yourself out of your own head.
00:14:17.860
But you got to immediately get your body doing something else to keep you out of your head.
00:14:23.640
Virtual reality is just another way to do something.
00:14:31.640
Because everything that gets you out of your head,
00:14:37.400
So you shouldn't have needed to even study this.
00:14:42.260
It's not so much a solution as you want it to be,
00:14:45.500
because the hard thing is knowing that you have to get out of your head.
00:14:49.140
The hard thing is not putting the VR glasses on.
00:14:58.640
So it's the doing the first thing that tends to be important.
00:15:17.160
I've told you at least three different narratives about job reports.
00:15:23.360
I used to think that the job reports were almost always overstated at first.
00:15:33.880
And I was attributing that to the fact that the people who report it are part of the administration that wants you to think the jobs are good.
00:15:42.200
And so that it's part of their trick to say jobs are great.
00:16:03.200
he remembered when it was revised up as well as down.
00:16:10.340
So I went and looked and indeed there are revisions up and there are revisions down,
00:16:22.260
I mean, I can't believe anything I Google anymore,
00:16:24.220
but it was revised downward nine times and revised upward three.
00:16:32.280
So now you've got a three to one advantage in one direction,
00:16:44.240
the downward revisions are also a bigger amount.
00:16:51.020
it's a bigger revision than when you have an upward revision.
00:16:56.740
that the numbers lend themselves to that phenomenon,
00:17:00.400
but it sure feels like it's a little suspicious.
00:45:00.420
to the fact that Republicans like their salesperson
00:45:11.860
Do you know why people think that Trump's the liar?
00:45:17.420
Have you learned anything yet from this live stream?
00:45:20.220
The reason that people think Trump is the liar,
00:45:30.640
It's just that they were told that that's important.
00:46:52.540
So I believe we haven't shrunk our green areas.
00:47:07.640
I don't think we're having more frequent hurricanes.
00:47:10.420
The coral reef has been growing back like crazy
00:47:13.860
That's the opposite of what they said would happen.
00:47:25.600
because it's heat island effect and sun effect,
00:48:03.200
And if none of the major predictions have panned out,
00:48:25.180
All right, so far I've presented you with two lists.
00:49:31.340
Then I gave you the list of climate predictions
00:49:36.460
I'm sure there's some conflicting data on this,
00:50:03.340
we meaning people who are wanting Trump to be president,
00:50:17.520
There'd be a list of all the climate change predictions
00:50:30.000
So here's a list of things that the Democrats are doing
00:50:51.180
And then I thought of a few that I might add to that list.
00:50:55.760
Didn't Wisconsin just confirm that they were going to allow uncontrolled drop boxes
00:51:02.940
The purpose of that is to steal your democracy.
00:51:36.100
A good 12 things that guarantee that anybody who reads it goes,
00:51:40.960
oh, it looks like the Democrats might be suppressing your democracy.
00:52:04.980
were started by crooked Joe Biden and his fascist government
00:52:08.600
for purposes of election interference, blah, blah, blah.
00:52:16.460
meaning that if we had some way to know for sure,
00:52:21.000
it would look like Biden coordinated it directly
00:52:27.560
and that it's an organized lawfare to keep him out.
00:52:34.580
Is it a hoax when he says it's all organized by the top?
00:52:41.240
But I think reality is certainly leading that way about as hard as you can lead.
00:52:50.980
I mean, the circumstantial evidence is overwhelming, really.
00:52:55.540
You can make a case on circumstantial evidence.
00:52:58.600
It just has to be strong circumstantial evidence.
00:53:08.740
the fact that some of them met with White House people,
00:53:11.900
the fact that the number three person in the DOJ
00:53:25.180
if it's not true that the Democrats organized this at the top,
00:53:52.900
that the documents should not be considered a big deal.