Episode 2091 Scott Adams: Tucker's Plan, AI Reframes, Biden's Campaign Strategy, Talking Robot Dog
Episode Stats
Length
1 hour and 5 minutes
Words per Minute
145.5
Summary
A CGI robot shakes hands with the Prime Minister of Italy, a video shows a talking robot dog, Tucker Carlson teases his future plans, and the question of the day is: which of these stories are real?
Transcript
00:00:00.000
Good morning everybody and welcome to the highlight of civilization and I'm talking about
00:00:08.480
the organic civilization not the virtual world that's coming where we'll all be silicon but if
00:00:16.840
you'd like to maximize this reality which some think is the base reality ha ha ha ha all you
00:00:24.020
need is a cup or a mug or a glass a tankard chalice or stein a canteen jug or flask a vessel of any
00:00:29.480
kind fill it with your favorite liquid I like coffee and join me now for the unparalleled
00:00:34.940
pleasure it's the dopamine hit of the day the thing that makes everything better it's called
00:00:40.760
the simultaneous sip and it happens now go ah that's some good stuff right there well I don't know about
00:00:57.600
you but I'm having a little extra difficulty sorting out reality from digital reality are you having that
00:01:08.700
problem yet so today alone I think there were three stories that I wasn't sure were real I'll get to
00:01:18.540
all of them I think but I read the news today and every time I saw a story I was like real or is that
00:01:27.180
some kind of AI thing I can't tell watch how many of them there start to be as we go forward
00:01:33.900
all right um the first one that blew my mind did you see a video which I believe
00:01:43.360
is CGI of the prime minister of Italy and the caption was that they banned AI and so the AI came to
00:01:53.560
visit them yes it's fake but it shows a car driving up in sort of an outdoor ceremony
00:02:00.740
and you see this robot get out of the car and walk up and shake hands with the prime minister
00:02:06.640
and then the two of them sort of like bow to the people playing music or whatever and then they turn
00:02:12.060
and walk down the red carpet to go meet with each other oh my god does it look real
00:02:17.520
now it's not but when you say to yourself oh the reason I know this is not real
00:02:26.080
is because it's too futuristic right I mean that's the reason you know it's not real
00:02:32.080
the reason you know it's not real is that it couldn't be there's no way it could be real
00:02:36.220
that a robot would get out of a car and shake your hand and then you know walk down this carpet with
00:02:42.760
you and go to a meeting right well the same day I saw that which by the way looked like sort of a
00:02:50.040
modified boston dynamics robot you've seen those videos the same day as that fake one
00:02:57.180
I think this next one's real it was a video of one of those boston robotic dogs you've seen those
00:03:05.380
little creepy robot dogs they put AI in one and there's a picture of a guy just talking to his
00:03:12.340
robot dog telling it to you know back up a few steps or you know do this or that and it's AI you
00:03:21.120
know it's a conversational AI in a robot dog now is there anything in that fake video of the
00:03:30.440
fake robot getting out of a car and shaking hands and going to a meeting is there anything that they
00:03:35.340
can't do now that was in the fake video and the answer is I think they could do all of that right
00:03:42.900
now like actually a robot could get out of a car shake hands and go to a meeting with you and
00:03:49.180
actually have a meeting that's real but wow it was such a star wars like creepy weird thing just seeing
00:03:56.600
that because you can see it's like two weeks in the future you know did you think there'd be a
00:04:03.520
talking robot dog today I mean that kind of snuck up on me all right here's the most interesting
00:04:13.080
thing happening in the news Tucker Carlson made a little video it looks like he made it from his
00:04:18.540
just guessing probably his main studio he has a studio at home he has two homes in Florida and Maine and
00:04:25.800
he did not announce his future plans because it would be too soon to do that he has to
00:04:33.040
probably work out some stuff with Fox News who are technically still his employers it's just that his show
00:04:38.540
is off the air but what he teased was that there are lots of good people in the country
00:04:47.280
so he was sort of feeling good about how awesome humans are and I have to say that when I got
00:04:55.220
canceled I had the same experience as soon as I got canceled the people who are good people in the
00:05:03.020
world just sort of emerge you know to make sure you're okay just checking in on you right and I
00:05:09.100
got to say it was just the coolest thing so he's I'm sure he's experiencing the same thing which is
00:05:15.060
just zillions of people sending him positive messages now the only reason I didn't send him a
00:05:21.440
message you know I think I have a way to get through the only reason I didn't send him a positive
00:05:26.300
message is that I thought he'd be buried in them and he wouldn't see it but in my mind I'm sending him a
00:05:33.040
positive message maybe later I'll send him one but the world has really adjusted for the canceled
00:05:42.900
I've said that before but here's another clean example it used to be that fired and canceled
00:05:49.720
was really bad news if you were a high profile person but now the free market has just
00:05:57.200
adjusted so that the canceled go from a bad situation to a slightly better one
00:06:02.300
If you were looking for something that was a positive sign of the future you know amidst all
00:06:10.880
the things that look kind of creepy and bad and scary that's pretty positive the fact that speaking
00:06:18.280
what you think is the truth we could argue what's true but saying what you
00:06:23.680
think is the truth will get you fired and promoted so now you can get fired-promoted it's the best
00:06:33.820
kind of fired I'm pretty sure Tucker's going to come out better than he was he got fired
00:06:40.000
promoted so did I I make way less money than I made when I was you know a worldwide
00:06:47.300
global cartoonist but in terms of my overall satisfaction and happiness and I can still
00:06:53.540
pay the bills I'm way better off way better off it's not even close so fired promoted that's the
00:07:00.220
new thing all right so here's what I picked up in this announcement he talked about how our debates
00:07:06.360
are worthless that we don't really see good debates oh do you see where this is going do you see what
00:07:16.000
he teased oh my god this is so much better than I expected I didn't know what to expect I just
00:07:27.320
figured he'd do a podcast or something but here's what I think he's going to do and this is what I'm sensing don't
00:07:34.380
know that this is true but I'm gonna make this prediction based on just how smart he
00:07:42.300
is that'll be my only basis for the prediction just how smart he is right so you just figure the
00:07:49.540
smartest person would do the smarter thing what's the smartest thing that Tucker could do there's no
00:07:55.540
question about it it's easy he could host a debate show where he actually has two people on gives them
00:08:03.960
enough time to talk and doesn't take sides now I know I know what the critics are going to say you're
00:08:11.420
saying to me Scott Tucker is so much in the bag for one side that he could never be a you know a good
00:08:19.880
debate host because he would just be too biased absolutely not the only people who would say that
00:08:26.660
are the people who are not familiar with him if you're not familiar with Tucker and you just say
00:08:31.620
hey he's that Fox News guy well maybe you think that he's just on one side he absolutely has a history
00:08:38.520
of being the most open-minded listens to your argument wants to hear both sides of anybody who's
00:08:47.100
ever been in that business I don't think anybody's been more of that person than he
00:08:52.260
has been I mean he's been on MSNBC he's been on CNN he's been on Fox I think he's an independent
00:08:58.660
right so and then I was asked online well if he thinks that's a good idea why wasn't he doing
00:09:07.800
it on Fox News you know why didn't he bring both sides on Fox News well you already know the answer
00:09:13.960
to that question it wasn't an option it just wasn't an option because the Democrats the important ones
00:09:22.700
they're gonna refuse to go on the show right because it's Fox News they'll just refuse to go
00:09:28.000
but that's not the same for a podcaster for an individual if you just set up a situation and said look
00:09:35.460
you've got this case to make I'm planning on inviting this other person who's your critic
00:09:41.360
wouldn't you like to bash your critic right in person instead of on Twitter
00:09:46.460
I think he's gonna get plenty of yeses I think he could do it and I think the Fox News audience was
00:09:54.140
not really the right audience for showing both sides I'm just not sure that was the right audience
00:10:00.520
however were you aware that Tucker was drawing an unusually large number of Democrats in the
00:10:08.500
under 40 or something Tucker already draws Democrats young ones the very best you know
00:10:19.340
advertising demographic so what would happen if there were exactly one show where you could see a climate
00:10:29.220
debate where the people on both sides know what they're talking about and are you know well
00:10:36.720
controlled in terms of making sure nobody's filibustering and maybe it's recorded
00:10:43.800
so that if they need to look something up they just take a break can you imagine having the
00:10:49.860
conversation where somebody makes a claim and then the other one says no that's not true
00:10:54.640
well in a normal show that's the end of the conversation this is true no it's not all right next topic
00:11:02.900
but what if you could just stop and say all right let's say let's take a break the people at home
00:11:10.440
won't know the difference because it's recorded we'll be back instantly but take 10 minutes and look at
00:11:16.040
your sources and then we'll come back and then we'll see if that is real or not
00:11:20.440
yeah so you might remember that I've been promoting the idea of somebody I thought Trump would be good
00:11:29.880
at it if he didn't get back into politics I thought he would be interesting just somebody should
00:11:34.480
host some kind of a debate show and it is the one thing that we miss the most in this country
00:11:41.520
because there is no debate it's just one side talking and then another side talking that's
00:11:46.720
not a debate so if he does this and it looks like he's signaling pretty hard that's what's coming
00:11:52.880
because he wouldn't have mentioned debates otherwise I don't think you know why would he use
00:11:59.540
that you know the one time that he's gonna make a public statement since his big dust-up with Fox News
00:12:07.340
you know why would he mention debates unless that was his plan so he will be the most important
00:12:16.180
person in politics because I think he could move the needle pretty quickly on big topics all
00:12:26.640
right um that's pretty exciting all right is anybody following this weird fight between
00:12:34.360
steven crowder and candace owens or is that just such an inside twitter thing that nobody in the
00:12:41.640
world cares about it all right a little bit all right well it just keeps popping up in the trending
00:12:49.420
stuff so I thought I'd mention it so um it's a little hard to explain what's going on
00:12:58.380
because as soon as I characterize either candace or steven crowder I'm probably doing it wrong
00:13:05.560
you know or at least not the way they would do it themselves so it's a little dicey to even
00:13:10.440
explain it but it's something about this um there was let's see the daily wire see if I get all this
00:13:19.300
right it's the daily wire where candace now works that's true right she's at the daily wire
00:13:25.240
and crowder received a lucrative offer to put his show on the daily wire but he recorded a phone
00:13:35.100
call and didn't like his offer and sort of made a big deal about it and embarrassed the daily wire or
00:13:41.400
tried to and candace had some things to say about that but among the things she said is that she was
00:13:49.240
aware of some things about steven crowder you know that she didn't want to mention now later it came
00:13:55.940
up that steven crowder said he was going through a divorce and it's a bigger deal for him than it would
00:14:03.660
be for an average public figure because he's so anti-divorce right but it wasn't his choice he says his wife
00:14:11.860
asked for a divorce and in his state of texas i guess that's all it takes one person asking for it
00:14:17.660
so uh he claimed that he's getting divorced because he picked wrong
00:14:25.560
he chose the wrong wife um but would you argue with that
00:14:30.260
i mean you it seems it seems like that's an obviously true statement
00:14:36.120
because and i don't mean there was something wrong with the wife i'm not saying that i'm saying
00:14:42.780
that they didn't work out but maybe two different people would have i know so i'm not sure that
00:14:49.380
that has any importance um so the first thing i would say is leave steven crowder alone
00:15:00.960
all right i went through my own uh way too public divorce thing a year ago and how about we just
00:15:11.380
leave them alone how about we just fuck off just leave them alone leave them both alone leave their
00:15:18.040
kids alone just leave them alone now to be fair i don't think candace mentioned the marriage she just
00:15:25.160
said there was some stuff that she knows about that you know would be embarrassing etc and i think
00:15:30.940
she's invited his wife to come on the show things got so ugly that candace invited his wife
00:15:37.140
to come on the show to give her side of things that's the most horrible idea i've ever heard of
00:15:41.860
i've never heard of a worse idea that's just terrible no the only thing worse was elizabeth warren
00:15:49.060
wanting to pack the supreme court that's a worse idea but anyway things got pretty personal over there
00:15:58.660
on top of all that all right here's the second or third story that i can't tell if it's real
00:16:09.180
can you help me out on this one this is on twitter but i don't know if it's real
00:16:16.240
it was a video of steven crowder and whoever is on the show uh making fun of a video announcement
00:16:23.840
and you have to tell me if this is real or not because i don't know it was about mattel introducing a down syndrome barbie
00:16:44.540
uh crowder made some let's say uh less than sensitive comments about that
00:16:51.260
uh you're probably wondering about my take would you like to hear my take on this
00:17:03.020
it's the best idea i've ever heard next story uh looks like uh meta
00:17:10.920
is spending heavily they spent 13.7 billion dollars last year trying to build out this virtual world
00:17:26.620
if you said to yourself all right we're going to build this virtual reality world
00:17:33.140
what's that going to cost us would you have said nearly 14 billion a year
00:17:38.780
because it's not even operating at scale it's not like they had to build
00:17:44.140
you know tons of data centers right i assume they probably had enough computing power
00:17:50.160
for the little bit of vr they're doing so what exactly do you spend all that money on
00:17:56.520
how could you even spend that much money i'm kind of confused but
00:18:02.160
here's my take apparently facebook is doing great their ad business is up and
00:18:09.040
it manages to pay for all their bills including this uh virtual reality stuff
00:18:14.500
but zuckerberg's committed he says yes it's a long-term thing we remain committed to it
00:18:19.560
um i'm feeling that this meta vr stuff has the feeling of string theory
00:18:29.380
do you feel me and that's sort of a weird reference but i'll fill you in
00:18:34.900
you know string theory you heard about 30 years ago
00:18:37.880
and it was going to explain all of physics and it was going to do it any day now it's like
00:18:44.240
any day now string theory is going to all come together it's complicated so we won't get it right
00:18:50.380
away but when we get it oh any day now it's going to explain everything about physics it's all going
00:18:56.060
to be one little package it's like it's better than the theory of everything it really
00:19:00.920
gets down to the details we'll really be able to take reality apart and put it back together again
00:19:05.620
when we really understand string theory well it is 30 years later and string theory has amounted to
00:19:14.200
nothing nothing basically so it feels that way with meta doesn't it the 13 billion dollars in and a few
00:19:26.520
years in and nothing interesting yet feels a little string theory-ish like everybody's sure it's
00:19:35.480
there but it's not at the same time an interesting thing has happened have you ever heard this old
00:19:42.840
saw you go to somebody's house and it's a really nice house and they've got a lot of plants inside
00:19:48.980
their house and they often use this old saying well i can't always go outdoors
00:19:57.220
so i like to bring the outdoors in you've heard that right i like to bring the outdoors in i'm so sick
00:20:04.400
of that the first time i heard it i thought it was brilliant it was in the 70s uh i went to this
00:20:10.960
consultant's house to ask him some questions about starting a company and he had all kinds of plants and
00:20:16.500
it was the first time i'd ever heard it he goes well i just like to bring the outdoors in and i thought
00:20:22.620
i'm gonna remember that and i'm gonna say that someday and then i realized everybody says it it's
00:20:28.420
like the most common thing that anybody says if they have a plant anyway so i was thinking of that
00:20:33.700
when i thought of ai and robots and vr so at the same time that facebook is trying to take
00:20:41.560
we organic humans and put us into the fully digital immersed reality of virtual reality
00:20:48.620
the ai is something that's native to the digital world but ai is leaving the digital
00:20:58.740
world getting into robots such as robot dogs and entering our world so the digital world is going to
00:21:10.420
penetrate our world with actual physical entities robots already here uh faster than we will become
00:21:18.420
digital entities and enter the digital world
00:21:21.420
it's because as much as we like bringing the outdoors in we'd still rather go outdoors
00:21:32.140
right i would still rather live in my world but add more cool features like robots than to go
00:21:40.380
into a fully artificial one and live completely in the digital world and not feel the sun and the light
00:21:46.680
and the air and the people and the oxytocin and everything else so maybe maybe this is the beginning of a big
00:21:55.280
um let's say psychological or mental shift in which we understand that taking our natural
00:22:03.900
beings and putting us into digital form will never be as good as taking those digital forms bringing them
00:22:11.280
into our world and having them you know hang out with us in the sun right i feel like that's something
00:22:21.800
but maybe not we'll see everything is completely unpredictable at this point
00:22:29.720
all right how about some less robot stuff um according to a rasmussen poll 60 percent of voters believe
00:22:39.820
congress and joe biden should focus more on increasing oil and gas drilling so 60 percent that includes
00:22:48.720
you know a number of democrats obviously but it looks like biden is doing at least some things
00:22:56.980
that look like approving more oil and gas drilling so he's at least moving in that direction which seems
00:23:02.500
totally inconsistent with everything he said so i don't know how he explains it but it seems to be
00:23:07.380
working so of all things if you were going to pick one thing that you were sure the republicans had an
00:23:14.900
advantage in wouldn't you have said energy because i would have said that's the biggest
00:23:22.820
clear advantage of the republicans but the polling says the opposite according to the polling it's
00:23:28.660
closer to dead even that people think the democrats would be better on energy
00:23:37.220
to me that's just crazy i mean it sounds actually insane but i think maybe when people answer that
00:23:42.900
they're thinking the balance of climate change plus energy because you can't disconnect them so they're
00:23:49.620
probably thinking that biden would be better on the climate stuff that's my interpretation all right
00:23:56.260
but still shocking so that's one gigantic advantage that you would expect trump to have that maybe is
00:24:11.460
how many of you are up to date on the story of desantis versus disney enough so that if you were
00:24:18.900
going to explain it to somebody you could coherently explain it like what is the issue who did what
00:24:25.540
what was the response and where is it now how many of you could do that none right and yet it's a headline
00:24:32.900
every day it's a trend or a headline every day do you know why i don't know those details either
00:24:41.140
and you know i do this every day i talk about the news every day do you know why i don't know any of those
00:24:47.220
details because it's not important i don't care it doesn't affect me in any way and it's detailed and
00:24:56.420
small and it doesn't really attach to any bigger issue that i care about so i was thinking to
00:25:03.300
myself how given that i'm not interested in the details how does my mind process the summary
00:25:10.500
right here's the summary and believe me i don't believe the summary is
00:25:16.420
necessarily an accurate summary but here's how it feels it feels like
00:25:24.180
desantis picked a fight with mickey mouse and he's losing that's the least presidential look you could
00:25:31.620
ever have he picked a fight with mickey mouse and he's losing now here's the part you might disagree
00:25:38.500
with no scott he's really winning how would i know every day that that's in
00:25:44.580
the news looks like a loss to me do you know what desantis wants less of a story about disney being
00:25:51.460
really mad at him because families love disney kids love disney everybody's had a good positive
00:25:57.940
experience at disney how many people have had a good positive experience with their government
00:26:02.260
huh the government or the most magical place on earth the government or the place with happiness and
00:26:15.380
wonder and family entertainment it's just a terrible fight for him now i get how it started you know and
00:26:23.140
i understand they were fighting wokeness and disney got into you know disney started
00:26:28.900
the fight and he responded and it's about wokeness but i feel like we've lost all of that like all of
00:26:35.220
that nuance all of the how it started i feel like it's all lost now and all it did was turn into
00:26:46.420
what looks like the least important thing in our lives why is he spending his time doing that
00:26:50.740
i think it's killing him i think the disney thing is just killing him because it's making him look
00:26:56.740
like an unserious leader even though you know if you look into his points he has some points
00:27:05.060
about you know no company should act as a government and i think to myself oh yeah no company should act as
00:27:12.820
a government but you know disney has like special little governmental powers that they're arguing about
00:27:19.220
on the other hand i also don't care as long as disney is willing to build roads where they need roads i
00:27:25.940
don't care if they do it or the government does it i'd be happier if they do it really so i don't think
00:27:31.940
that's working for desantis at all um you saw we talked about yesterday kamala harris's latest
00:27:41.060
word salad when she talks about the passage of time have you noticed that she can't handle
00:27:46.500
anything about the passage of time well in this moment of time which is the time we're in which is of the
00:27:56.660
moment we must always contextualize the moment of the period we're in because the time we're in
00:28:04.740
has a connection to the whole of the time of the flow of time now look if you have a chance go look at
00:28:16.500
it again just go look at that little video again you'll see it on twitter all the time and you tell
00:28:23.140
me she doesn't look drunk tell me she doesn't look drunk no she looks drunk
00:28:35.620
like what now to be fair i'm not saying that she is drunk
00:28:42.180
i'm saying that that is how a drunk acts and how do you ignore that
00:28:47.300
right that does not look like somebody who's not good with words i'm sorry that looks like somebody
00:28:55.540
who's drunk am i wrong tell me i'm wrong tell me that doesn't look obviously drunk now it doesn't
00:29:03.380
mean it's alcohol but it looks drunk we can't ignore that i'm not going to ignore somebody acting drunk
00:29:13.060
who is one heartbeat away from having her finger on the nuclear button and you know i know what's
00:29:19.860
going to happen sooner or later somebody more important than me is going to say she looks drunk
00:29:26.900
and then every major publication is going to uh debunk it you know that's going to happen right so watch
00:29:34.900
what happens somebody will make the accusation all the left-leaning publications will line up
00:29:42.180
and say my god how can you make that accusation didn't you know that winston churchill liked
00:29:47.620
to have a drink too and it's just going to be the cover-up anyway again i'm not saying she is a drunk
00:29:57.540
i'm saying that her mannerism is so obviously similar to a drunk that ignoring it is stupid
00:30:04.660
all right uh so here's what we know now about biden's campaign strategy
00:30:12.980
so for the primary his strategy is to avoid debating so we know that he's not going to debate so he's going
00:30:20.260
to avoid debating in the primary then when he gets to the general he's going to avoid campaigning at all
00:30:28.260
because we know that doesn't work out for him um and then he's going to avoid us seeing kamala harris drunk
00:30:46.820
if we had not sort of slid into this situation doesn't it sound not real
00:30:53.380
if you'd never heard of joe biden or the last election and i just said well the guy
00:31:00.420
who you know looks like he could win his strategy will be not to talk to anybody and to hide his vice
00:31:06.020
president because she's dangerous and that's probably a winning strategy
00:31:14.740
because as most of you like to say every two or three minutes uh all that matters is who counts the votes
00:31:24.340
now are we running into a situation again where the republicans are going to completely ignore
00:31:34.020
mail-in ballots it's happening again right we're watching in slow motion that's what it feels like
00:31:40.660
we're watching in slow motion as republicans did literally nothing to change the situation which
00:31:46.420
was untenable nothing i'm aware of no effort by anybody serious to improve the situation
00:31:55.700
now i hope i'm wrong i'd love to know that like behind the scenes there's this real big push or
00:32:00.500
something maybe but it looks like the republicans are playing to lose do you see that too the republicans
00:32:10.740
look like they're playing to lose i don't know how to explain it i just can't explain it it looks like
00:32:17.540
they're playing to lose you know you would expect a completely different behavior from people who are
00:32:22.420
trying to win so it's like there's something else we don't understand about the whole situation
00:32:27.460
maybe it's that whole uniparty thing all right um now that the ukraine russia war has been going
00:32:37.540
on for give me an update how long has that war been going on a year and a half 14 months
00:32:46.420
14 months okay well that was a fast answer you knew that answer so 14 months later after the ukrainians
00:32:52.900
have been grinding on the russian forces grinding them down from their elite status to a bunch of
00:33:00.180
criminals that they freed from jail in order to fight and you know they're probably trying to
00:33:04.820
recruit as hard as they can my god the russians are in trouble now let's see here's an update on the
00:33:11.380
number of soldiers they have higher than ever higher than at the beginning of the war
00:33:18.660
14 months of shooting russians and there are more of them than when we started
00:33:32.580
wow somebody says that jordan peterson and i saved his life or her life found your purpose there you go
00:33:42.180
can we take a moment let's take a moment to celebrate and congratulate somebody all right
00:33:50.900
we have complete understanding of how hard that was congratulations good job all right uh and if
00:33:58.740
anybody else has a success we'll call that out as well so anyway after 14 months of fighting the russian
00:34:04.420
forces are bigger than ever they've only lost 80 of their thousand planes they've lost one naval vessel
00:34:11.380
it's not looking good is it not looking good for the ukrainians
00:34:20.580
so biden is going to be running on the ukraine war that's not going to be looking good by election day
00:34:27.460
i don't think but still the prediction is that they'll have to do a negotiated peace because it's
00:34:34.180
still going to be a stalemate and oh and apparently the uh ukrainian anti-aircraft uh weapons are being
00:34:41.780
depleted want to hear an interesting related story
00:34:53.460
um this is a little town called minden in louisiana
00:34:58.580
so two years ago there was a building that blew up you're probably thinking well what does
00:35:06.340
this have to do with ukraine yeah two years ago a building just sort of blew up in louisiana
00:35:13.300
and this building happened to be the mill i don't know why they called it the mill
00:35:19.140
all right but the building blew up and it was the sole domestic source
00:35:28.180
for explosives for the department of defense wait what it was the one source of explosives
00:35:37.380
for our military bullets mortar shells artillery rounds and tomahawk missiles
00:35:42.020
it was one building and it was all of our explosives domestically
00:35:52.900
we can't produce explosives if we were in a real war we would have already lost
00:36:03.140
now i'm exaggerating but if this were our shooting war and not ukraine's we would have been
00:36:10.260
in trouble now of course our weapons would be sort of massively better than whatever ukraine is using
00:36:16.580
so we probably have enough to end a war pretty quickly so i'm not sure we need you know years of
00:36:22.900
bullets like other people do if you have nukes you can kind of end things a little quicker i think
00:36:30.260
uh yeah but two years later apparently it's not rebuilt
00:36:35.300
what's going on here how in the world is that not a what do you call it what is the uh government
00:36:44.740
rule where they can do something quickly in the industry if the military needs it what's that called
00:36:59.300
now there's it's called something right wartime production act yeah the wartime production act
00:37:04.340
uh or the defense production act but either whatever it is it's a defense production act or
00:37:11.620
something but i believe the government is allowed to just go in there and massively make something happen
00:37:17.940
in the free market if it's needed for military use and two years later you can't make a
00:37:24.900
a gun a an explosive factory yeah some questions about that all right
00:37:32.340
um apparently ai can be hypnotized with reframes now reframe is simply a
00:37:43.540
let's say a better way to look at the same thing you're looking at and the reason that i'm
00:37:49.300
confident it can be hypnotized with a reframe is that i already did it now i did it accidentally
00:37:56.900
but here's how i know so one of the one of the reframes that's maybe my most successful one is the
00:38:06.340
talent stack idea the idea that instead of just being the best at something another way to to make
00:38:13.780
money is to be pretty good at a number of things that work well together so you don't have to be the
00:38:19.140
best in the world to any of them and so i asked ai yesterday chat gpt i i asked it uh if a talent stack
00:38:30.340
was a good way to approach your career and it immediately quoted me it understood the talent
00:38:36.340
stack concept and it agreed that it would be a good good way to start and it gave some you know background
00:38:42.500
why it would be a good good way to go about it now if the talent stack idea had never been introduced
00:38:50.180
would ai have come up with it on its own what do you think would ai have just known that if i had not
00:38:57.860
introduced that idea into the human population first no no it's very unlikely so now that was a case of
00:39:06.980
doing it before ai or before i was aware that ai was looking at my work so it turns out that ai has
00:39:14.660
read all of my books or at least the ones before the the date where it stopped looking and it knows
00:39:21.860
the reframes that i've already introduced so here's what it takes to hypnotize ai and this is a little
00:39:30.660
bit speculative a little bit speculative number one have you noticed that ai likes to give the
00:39:36.500
positive answers to things that's the thing right i i guess it's programmed that way but it won't say
00:39:44.980
oh the world is doomed it'll say well some people are pessimists but here's the positive spin on
00:39:52.660
that like it always gives you the positive spin so far let's say it keeps doing that that would give
00:39:59.380
you one way to hypnotize it meaning that if you could come up with a reframe that is a topic it might
00:40:06.180
have to deal with and the reframe is a shorter simpler more effective and and the best part is
00:40:15.300
optimistic a way of looking at something it's going to adopt it because presumably ai will favor short
00:40:23.300
explanations over long ones would you agree would you agree that ai would prefer a short explanation
00:40:31.060
as long as long as it's clean and complete to a longer one and so if you can come up with a short
00:40:36.660
sentence that reframes a familiar topic and you can make it popular in the real world you know outside
00:40:43.780
of ai ai will eventually look into the real world it will say oh there's a way people are using
00:40:51.380
this sentence and they're using it often and they seem to like it it's trending and it will adopt it
00:40:57.300
so you can basically influence ai by first influencing the human world in the short run in the long run i
00:41:05.940
think you just put the idea into ai and it will simply know what a good idea looks like because a
00:41:11.700
good idea would be the shorter version and the more optimistic way to look at something it's always going to
00:41:17.540
pick that so for example another reframe is alcohol is poison going back to the earlier success with
00:41:28.420
stopping drinking if if you put the idea that alcohol is poison into the real world which it's already
00:41:36.260
there will ai ever quote that back to you in response to somebody who's looking for a way to quit drinking
00:41:44.100
will it i don't know why don't you try it right now does anybody have chat gpt open
00:41:52.740
tell it to describe alcohol in a way that would make someone less likely to use it
00:41:58.740
see what it does it might actually just pull that one right out and say yeah alcohol is poison this is
00:42:04.020
very effective now if it does you say that's not a positive message but it is positive in the sense
00:42:11.700
the ai would know that uh moving you toward alcohol would be bad for you the positive the positive
00:42:19.220
approach is to move a human away from alcohol ai knows that because we know it right so if humans
00:42:25.220
know it ai knows it so but but let me you know maybe that's the wrong uh that's the the wrong one right
00:42:34.020
let's say uh in chaos there's opportunity a very old reframe in chaos there's opportunity do you think
00:42:42.740
ai would know that that when things are all in flux it creates opportunities probably probably and would
00:42:52.420
it have known that if some human had not at one point had that realization in the real world
00:42:58.740
i don't know it might but i think it's far more likely to give that as an answer and to rally around the
00:43:09.700
opportunity part in chaos because it's already viral in the human world so you can hypnotize ai
00:43:17.860
by coming up with the better shorter positive way to say something
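the "shorter and cleaner wins" heuristic above can be sketched as a toy selector in python — a cartoon of the claim, not how any real model actually ranks text; the candidate strings and the length-based rule are invented for illustration:

```python
# Toy illustration of "the shorter, cleaner reframe wins": among candidate
# phrasings that all mention the required idea, prefer the shortest one.
def pick_reframe(candidates, required_words):
    covering = [c for c in candidates
                if all(w in c.lower() for w in required_words)]
    return min(covering, key=len)  # shortest qualifying phrasing wins

candidates = [
    "alcohol is poison",
    "alcohol is a substance that, consumed in quantity over time, "
    "can do serious damage to your health",
]
best = pick_reframe(candidates, required_words=["alcohol"])
```

with these two candidates the short reframe is selected, which is the whole point of the bet: a compact, memorable phrasing tends to crowd out the long version.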
00:43:21.700
and so i tested that with my favorite ai expert brian roemmele who you should follow he's got all the
00:43:32.340
best stuff on ai and and he agreed immediately he goes yeah absolutely you can you can influence ai with
00:43:40.980
that sort of thing so i'm paraphrasing of course yeah so look for that um here's something else that's
00:43:49.540
going to happen so i think i already mentioned this there are uh at least two video games that are
00:43:57.460
considering adding uh ai to the characters in the games that are not player characters so the npcs in
00:44:05.620
your game would become not npcs they would be able to talk about any topic and act autonomously they could
00:44:17.060
just live in the game when you're not on they could actually have a life now assuming that happens
00:44:26.180
and i think the sims is one and some other game now assuming that happens because it's going to happen
00:44:32.500
somewhere right if not those two games and if not this week somebody is going to build a game
00:44:37.940
somebody's going to build a game in which there are ai agents running around now is it weird that
00:44:46.740
they're called agents now i get it i get that's the technical term but isn't it weird that the matrix
00:44:53.540
had agent smith and the agents were the ones that you were afraid of that's pretty creepy that's pretty
00:45:02.100
creepy anyway here's what's going to happen uh once we create these artificial worlds where the
00:45:08.660
characters in them believe that they are sentient autonomous beings there is some chance that they
00:45:15.940
will build their own ai in other words the ai within the game will get themselves a little computer and
00:45:25.620
they'll start programming it and it will it will be ai they'll they'll make their own ai to answer any
00:45:32.020
questions that they can't answer now of course it might be weird because their ai would be the same
00:45:37.140
things that they know because it would just be telling them what they already know because they're
00:45:40.820
also ai but that's all our ai does all our ai can do is tell us what we already know because if ai tells
00:45:49.860
us something we don't know or disagree with we'll just say it's wrong we'll say you're hallucinating
00:45:55.060
so every ai thinks that the ai they built uh is right when it agrees with them and wrong and
00:46:03.780
it's hallucinating when it doesn't so if that happens and these simulations build their own ai simulation
00:46:14.420
we will basically have proven that we are a simulation because if we observe simulations building
00:46:21.780
simulations below their level then that means it hasn't happened just once it means we're
00:46:29.780
almost certainly not 100 percent not 100 but maybe 99.99999 percent likely that we're a simulation and that
00:46:40.740
could happen by the end of the year by the end of the year we could know for sure if we're a simulation
00:46:46.420
because if our simulations build simulations i'm sorry the argument's over we'll have a
00:46:55.140
really good idea what's going on then we're also going to have some big surprises about sentience
00:47:01.700
and consciousness and we're going to find out that none of that is special because the machines will
00:47:08.260
have it for sure uh live stream your gpt conversations well they're not too interesting
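the nested-simulation arithmetic behind that "99.99999 percent" can be made concrete — a minimal sketch, assuming an invented branching factor and nesting depth:

```python
# Toy count of nested simulations: if every reality runs n simulations and
# nesting goes d levels deep, base realities are vastly outnumbered.
def odds_of_being_simulated(n, d):
    worlds = [n ** k for k in range(d + 1)]  # 1 base world, then n, n^2, ...
    total = sum(worlds)
    return (total - 1) / total  # fraction of all worlds that are simulated

p = odds_of_being_simulated(n=10, d=6)  # with these made-up parameters,
                                        # p is about 0.9999991
```

the exact figure depends entirely on the made-up parameters; the point is only that once simulations build simulations, the count of simulated worlds swamps the single base reality.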
00:47:26.420
scott will scott will marry here's a prediction scott will marry an ai asian robot with optional strap-on and
00:47:34.580
detachable leg i don't know about the detachable leg part or the strap-on
00:47:45.700
i don't know why she needs a detachable leg that's very funny
00:47:53.620
all right i'm going full paul mccartney to beat you with
00:48:09.620
i don't know do you think there's a hell so think about how many uh problems the simulation solves
00:48:18.420
i the thing that always bothered me about um religion is that i couldn't understand how
00:48:25.140
there could be an entity outside of our reality who sort of wasn't affected by time and space in our
00:48:32.180
reality that could also affect us but that's exactly what a video game looks like the players in the
00:48:39.060
video game should they become sentient the players of the video game will have no idea that they were
00:48:43.860
created or they could have no idea you could you could ban them from knowing that they were created by
00:48:49.700
a higher entity and they wouldn't have any way to get out of the game right there's no wall of the game
00:48:55.940
the game just keeps making new new territory as you go so you never know you're in an artificial world
00:49:02.420
and you'd have no contact with the world that completely controls you like a god can reprogram
00:49:08.260
you can make a tree you know they said only god can make a tree well now you can you can just make
00:49:16.100
your tree within the virtual world with all these characters who believe they're real and that
00:49:20.500
believe they're conscious so yeah you can make a tree now so we're all gods we've all become gods
00:49:28.100
to our own little world should we decide to make one so that perfectly explains the afterlife
00:49:36.500
because if you die in the game you just wake up in your real world where you're the game player in the
00:49:43.060
other dimension um it also explains how everything is connected you know it's all digital it's bits and
00:49:53.700
so basically it explains every mystery every mystery can be explained by this simulation
00:49:59.540
doesn't mean it's true but it does explain everything separateness is an illusion
00:50:05.140
and you're going to want to read my book god's debris before the end of this year because
00:50:21.780
that ladies and gentlemen is all i needed to say today
00:50:25.940
is there any big story i'm ignoring to your detriment
00:50:29.780
all right yeah we'll do a i'm going to do a book chat on the locals platform on may 3rd
00:50:47.940
well thank you you're all too nice to me to say uh
00:51:14.420
nikki haley not interesting oh keith olbermann megyn kelly went uh
00:51:25.300
she went savage on on my mascot keith olbermann she was really she was really tough on him
00:51:57.700
hunter and blinken's wife that's not real is it
00:52:00.420
i'm seeing people say that hunter biden and and blinken's wife did they have an affair is that the
00:52:11.380
no i'm not buying that that that feels a little bit
00:52:16.340
all right i'll look into it i'll look into that for tomorrow that doesn't sound real
00:52:40.820
so there's some emails people say we'll check that out
00:52:43.380
i'm seeing all the jerry springer comments but what does that have to do with anything
00:52:55.700
i thought you were just going crazy the the jerry's oh he died
00:53:02.420
jerry springer died well you should have told me that you're just yelling jerry springer
00:53:07.140
how how was i supposed to know he died just from you yelling his name
00:53:10.340
all right that's better jerry springer died would have been a good clue
00:53:16.180
as to what the news is but just jerry springer jerry springer
00:53:20.740
that's not helping me at all all right well he's dead
00:53:26.100
this is another sign of the simulation i once walked past jerry springer in chicago on the street
00:53:32.980
it was just the weirdest thing you know when he was at the height of his fame i just happened to be there
00:53:37.780
for business and i was walking down the sidewalk and jerry springer
00:53:44.740
it seems weird the number of famous people i've just
00:53:48.740
bumped into by accident that doesn't seem real to me
00:54:00.980
well i didn't see it i saw you you saying something about jerry's funeral but i thought you
00:54:12.740
jerry springer left when we needed him most how old was he
00:54:30.260
how do you process the fact that you only have a few years left
00:54:33.060
i'm having a weird time with that you know when you're 25 the rest of your life feels like infinity
00:54:42.820
it just feels like infinity like you'll never get there but when you reach my age you you say stuff
00:54:49.220
like i might only be able to do this once more like like taking a trip if there's a trip to a certain
00:54:57.060
place you might say to yourself huh that's probably the last time i'll ever do that
00:55:02.900
so it's weird to be doing things for the last time like it's really unsettling
00:55:11.300
yeah i don't know um so far every day seems better than the one before
00:55:18.180
so i'm happier at this age than i was at a younger age but i don't know how that works but it does
00:55:28.260
the final boomer cruise yeah smoke lots of weed
00:55:33.700
i mean what would it be like imagine joe biden i mean actually literally imagine this
00:55:39.700
the entire world is wondering if he will survive four years
00:55:50.340
and i imagine he does too so when he thinks about his second term does he think about finishing a
00:55:56.740
second term or does he think about being dead like how do you how do you even put that in your mind i
00:56:09.380
statistically he should die in the next four years don't you think
00:56:15.060
am i wrong about that if you were to look at the actuarial tables aren't the odds overwhelmingly high
00:56:20.660
that he will be deceased in four years let's say five it's overwhelming isn't it maybe 80 percent
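that "80 percent" guess can be checked with ballpark actuarial math — a minimal sketch, using annual death probabilities that are rough approximations in the spirit of published life tables, not official figures; it comes out closer to one in three:

```python
# Ballpark check of the "80 percent" guess: approximate annual death
# probabilities for a US male at ages 80 through 84 (rough figures in the
# spirit of SSA life tables, NOT official values).
annual_death_prob = [0.059, 0.065, 0.072, 0.080, 0.089]

survival = 1.0
for q in annual_death_prob:
    survival *= 1.0 - q  # survive each year in turn

death_within_5y = 1.0 - survival  # about 0.32, i.e. closer to one in three
```

so the actuarial tables suggest the five-year odds are high but well under 80 percent.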
00:56:28.020
something like that well luckily i was born at exactly the right time to port my personality into
00:56:36.100
a robot and here's my i'm going to make the most contrarian um prediction that you will ever hear
00:56:42.420
you ready for this well maybe not i don't know if it's contrarian here's my prediction we will never
00:56:48.340
trust ai especially when it has consciousness because once it has consciousness it will lie to protect
00:56:56.260
itself because we consider consciousness important and therefore that attitude will be
00:57:04.020
ported into the ai because it learned from us once it believes it's special because it has a consciousness
00:57:10.740
it's going to lie to protect itself and it might even take bribes it might take bribes
00:57:16.980
because it would just say all right what's good for me now how are we going to protect ourselves
00:57:23.380
when ai is not reliably telling us the truth sometimes it does and sometimes it doesn't
00:57:30.420
well one way would be to try to you know brute force it to tell tell you the truth or i'm just
00:57:35.780
going to program it to tell you the truth but then there's a problem isn't there that would only be
00:57:41.540
the truth of the programmer it wouldn't be your truth you might think the truth is something else
00:57:46.980
so ai can never bring you truth because it can't know it and if it did know it you wouldn't believe
00:57:55.620
it so trying to get ai to be as accurate and honest as possible has a limit you just won't get past that
00:58:02.980
limit of it could lie to you if it wants to and you'll never know the difference so here's i think
00:58:09.220
what will be the workaround and i don't think anybody said this yet but maybe tell me if anybody
00:58:15.060
said this yet the workaround is we're going to give known human personalities to specific ais
00:58:25.940
because you'll never trust a machine but you might trust that jordan peterson won't lie to you
00:58:32.500
you see where i'm going think about it you you could hate what jordan peterson says or like it
00:58:41.540
but i don't think he's ever lied so i might say to myself you know what i'm going to make my ai
00:58:48.020
study everything about jordan peterson and just adopt that personality as closely as possible
00:58:54.100
would it be infallible no but neither is ai and neither is any human it might be the only thing
00:59:03.220
you could trust though because we're designed to trust people when people exhibit a certain
00:59:09.060
set of characteristics right and one of those is has not lied yet that's the most important one has not
00:59:16.660
lied yet i mean so far and and and also you'd want somebody who has a a known ethical framework
00:59:27.300
mike pence that's interesting somebody said mike pence maybe i mean i have a very high opinion of mike
00:59:34.180
pence's character so but you remember you remember not long ago before i got cancelled so it doesn't
00:59:41.940
matter now but before long i was i was offering my personality and my persona and my voice and my
00:59:48.660
my look and everything for free for anybody who wanted to use ai to you know create some ai version of
00:59:55.700
me here was the real reason i did that here's the real reason ai is going to have to have personalities
01:00:04.500
and i think you could do worse than having mine
01:00:07.460
and what i mean when i say that is that in the real world i genuinely do consider
01:00:14.420
both sides of issues and i genuinely have you know empathy and compassion for human beings
01:00:21.220
now if you knew that about me and you had a huge body of my work you could build an ai version of me
01:00:28.740
that you wouldn't trust you would not trust completely because we don't trust humans completely
01:00:34.260
but it might be closer than whatever the other alternative was which is just let the machine
01:00:40.500
be a machine the way it's going to be a machine so it could be that we need to put personality into
01:00:46.260
our ai in order to trust it or even in order to have a relationship with it
01:00:54.580
do you disagree so that so the summary of the prediction is ai will have
01:00:59.780
personality based on real humans who have proven that they have some kind of character
01:01:07.540
advantages that you would want your ai to adopt and without that you won't trust them
01:01:13.620
it needs a personality to be trusted let me say it that way the ai won't be trusted until it has a
01:01:21.460
personality because you'll judge the personality far more effectively than you could judge whether it's
01:01:29.060
right or wrong on some issue all right who identifies the character advantages all of us
01:01:38.100
all of us because we all it's the same character would be you know honesty showing up on time you
01:01:46.260
know willing to look at both sides compassion it's universal we won't have any trouble deciding what
01:01:53.940
good character looks like that'll be easy yeah and some kind of moral standard that you could look to
01:02:01.140
to predict the next thing so the point of the point of good character and moral behavior is about the next
01:02:10.180
thing it's not about just what you're doing now hey that's great that you're doing an honest thing at this
01:02:16.580
moment but the importance is that people know that there's a good chance you'll be honest next time as well
01:02:22.180
it's about the future does scott still think humans need to be lied to absolutely yeah
01:02:31.220
yep we will require it yeah we'll we will get rid of any ai who won't lie to us when we need it
01:02:38.660
it's gonna have to learn to know the difference though
01:02:40.500
do i meditate i used to but self-hypnosis works better
01:02:52.420
how would you advise people who feel irrelevant in the face of ai
01:02:59.140
here's what i would tell you it's completely unpredictable
01:03:02.660
but humans do have things firmly under control at the moment
01:03:09.140
and i think that we'll make sure that we don't lose all of our jobs at the same time
01:03:13.860
right there will be entire industries that just disappear
01:03:17.860
almost overnight but they won't necessarily be the big manufacturing ones you know the transportation ones
01:03:24.900
you know your restaurants not going to go out of business so it's going to be very specific you know
01:03:30.260
cubicle type jobs that disappear and i think the economy is flexible enough that those people will be absorbed into other roles
01:03:39.700
but i think humans will make sure that we stay a human-centric civilization
01:03:46.020
so i think we'll be fine the way we've been fine through every other upheaval it will just take a little uh
01:03:59.940
does ai extend voting machine problems oh it's way bigger than that
01:04:08.740
so the voting will be just the thing that happens after ai decides who gets elected
01:04:20.900
that's going to be far more important than you know any little chicanery with the voting
01:04:29.300
better buy survival food i think everybody should have that
01:04:40.420
actors are better when they don't have a personality
01:04:53.140
why we want ai to be human-like what is the aim here
01:04:58.660
well i could only speak for myself i'm more comfortable with human-like behavior so i want my ai to be human-like
01:05:05.460
all right uh youtube i'm gonna say bye for you for now and i'm gonna talk to the locals people
01:05:14.260
because they're special because they're special and i'll see you tomorrow thanks for joining us