Episode 2072 Scott Adams: Trump Is Good For Ratings, Fentanyl & Tranq, Zuby Asks Why Get Married?
Episode Stats
Length
1 hour and 2 minutes
Words per Minute
141.1617
Summary
Scott Adams talks about a new street drug, why Trump dominates the ratings, the decline in the number of people who cook at home, why our food supply is getting worse, and how we can fix it.
Transcript
00:00:04.000
Good morning everybody and welcome to the Highlight of Civilization.
00:00:08.000
It's called Coffee with Scott Adams and you've never been
00:00:12.000
happier. And it's going to be even better as we get
00:00:16.000
warmed up here and have a little sip of coffee. Things are going to start kicking in.
00:00:20.000
The gears are going to start turning and wow it's going to be fun.
00:00:24.000
Amazing. I'm going to close my blinds here. I've got a little lighting
00:00:28.000
problem, but I wasn't happy with what I saw when I went live.
00:00:44.000
A tank or chalice or stein. A canteen, jug, or flask. A vessel of any kind.
00:00:48.000
Fill it with your favorite liquid. I like coffee. And join me now for
00:00:52.000
the unparalleled pleasure of the dopamine hit of the day. The thing that makes everything better. It's cold.
00:01:32.000
It turns out that there is nothing as interesting
00:01:41.000
He's just the most interesting thing in the world, apparently.
00:01:48.000
if he runs for President and everybody else's ratings will go up if
00:02:04.000
It pretty much guarantees he gets nominated, doesn't it?
00:02:10.000
But it does guarantee he gets nominated, I think.
00:02:24.000
I saw Michael Pollan talking about how home cooking is in decline.
00:02:31.000
Apparently the number of people who cook at home
00:02:41.000
Which is more evidence that our food supply is killing us.
00:02:48.000
I don't think there's anything more dangerous in the world
00:02:50.000
than just the food you buy at the grocery store.
00:02:54.000
Have you noticed that when you see these street fights?
00:02:58.000
I see all these videos of people getting in fights in public.
00:03:02.000
Have you noticed how many women weigh 400 pounds in these street fights?
00:03:33.000
But it's the only way you can get food that isn't filled with preservatives.
00:03:44.000
So I've been running a little experiment to see how I would feel
00:03:50.000
if I got my processed foods as close to zero as I could.
00:04:05.000
I've got a feeling our food supply is so bad for us
00:04:09.000
that it's almost like the pharma companies should buy them.
00:04:14.000
Buy the food companies and make sure they don't improve
00:04:20.000
But that might be the biggest problem in the world
00:04:26.000
because I guess we're just used to eating crappy food.
00:04:33.000
I don't know if you know this, but there's a new drug
00:04:41.000
that fentanyl users are looking for sometimes intentionally
00:04:48.000
and sometimes it's just mixed in their fentanyl.
00:04:54.000
Not only does it make the fentanyl more deadly,
00:04:59.000
but it makes the treatment for it, the naloxone, what do you call it?
00:05:07.000
So Narcan is the stuff you can spray up somebody's nose
00:05:12.000
It doesn't work if the fentanyl is mixed with tranq.
00:05:19.000
So just when we start getting Narcan in a lot of places
00:05:22.000
where maybe it can make a little bit of a difference,
00:05:25.000
coincidentally, the people who ship us the drugs mixed in a chemical
00:05:38.000
Do you think that just when we come up with some way we can not defeat it,
00:05:44.000
but at least slow it down a little bit, they add a second poison?
00:05:54.000
because it makes the high last longer, I guess.
00:06:00.000
Like, just big pieces of your skin will rot and fall off,
00:06:10.000
But that's also how good it is because people are still willing to do it,
00:06:13.000
even though the pieces of their body are rotting and falling off,
00:06:25.000
Do you think that China is behind the adding of the tranq,
00:06:38.000
I think it might be at least as much the addicts themselves want it.
00:06:45.000
But it's also being snuck in the fentanyl to cut it,
00:07:03.000
than it is to, you know, jack up the death rate.
00:07:07.000
Because if you could turn somebody into an addict,
00:07:14.000
wouldn't they be just as happy with an addict as opposed to a death?
00:07:25.000
and that person is not productive at the same time.
00:07:49.000
and really it's just one party when you get to the top?
00:07:59.000
How do you explain the fact that one half of that uniparty
00:08:05.000
That sounds like exactly the opposite of a uniparty.
00:08:09.000
One half is trying to put the other half in jail.
00:08:26.000
They're certainly about rich people who want to stay in power,
00:08:30.000
but I think the uniparty is trying awfully hard to put members of its own party in jail.
00:08:40.000
Anyway, I reject that, but I can see why you like it.
00:08:44.000
So a conservative Texas judge is banning the abortion pill. It looks like this would apply to the whole country,
00:08:55.000
but I think the rest of the states will ignore it.
00:09:00.000
So there's a pill you can take if you just got busy and you think you're pregnant,
00:09:05.000
and it will take care of it without going to the doctor's office.
00:09:09.000
And apparently that's going to be illegal, at least in Texas and maybe some other places.
00:09:16.000
And I ask you this, there's no chance the Republicans could win as long as this is an issue.
00:09:27.000
I'm not sure if you know how popular this pill is, but 95% are trying to put the other 5% in jail.
00:09:39.000
I don't think Republicans can win as long as this is an issue.
00:09:43.000
It will just take the Republicans completely out of the election.
00:09:48.000
Now, I have to admit that Republicans have one thing going for them that I respect a lot, and it goes like this.
00:09:59.000
Republicans had to know, when they were fighting against, you know, abortion.
00:10:06.000
They had to know that if they won, it would be very bad for them.
00:10:10.000
Meaning they might get what they wanted with abortion, but it would make them unelectable from that point on.
00:10:26.000
At least the Republicans are acting in a principled way.
00:10:33.000
But the net effect will be to take them out of power.
00:10:36.000
So I'm not sure that's exactly what they wanted.
00:10:41.000
But on one hand, it looks like Trump would win against whoever he runs against.
00:10:47.000
On the other hand, I don't know if anybody could win as long as this is still an issue.
00:10:51.000
Because Democrats are definitely going to vote if this is an issue.
00:10:57.000
Now, I realize it's a court issue, but the public isn't going to care.
00:11:02.000
Because it could be a congressional issue as well.
00:11:06.000
Well, the UN Secretary General is calling for the wealthy nations to get more aggressive in reducing their emissions.
00:11:19.000
And wants to get to net zero emissions, not by 2050, but 10 years earlier, by 2040.
00:11:28.000
Now, I have the same problem with all the news.
00:11:36.000
It seems to me that the temperature has not gone up in years, which has to mean something, right?
00:11:43.000
The last seven years or something, the temperature hasn't gone up.
00:11:46.000
Now, I get that it's not supposed to go up in lockstep because there are other variables involved.
00:11:53.000
But it's really weird that at the same time, we have lots of evidence that the CO2 is not changing the temperature.
00:12:04.000
It's just that there's this period where it's not.
00:12:07.000
At the same time that we're getting more aggressive.
00:12:10.000
And I'm seeing the same thing with the story about myocarditis.
00:12:16.000
Do you remember when rogue Dr. Peter McCullough was claiming that vaccinations were more dangerous than helpful?
00:12:25.000
And he had some data that seemed to indicate that.
00:12:29.000
Then time goes by, and then there's even more alarming data.
00:12:32.000
Well, we've gotten to the point where we have two completely different worlds.
00:12:40.000
In one world, the myocarditis is through the roof, and people are dropping like flies, and it's obvious, and it's everywhere.
00:12:49.000
And in the other world, nothing like that is happening.
00:13:06.000
I asked doctors, are you seeing more myocarditis?
00:13:09.000
Because if Peter McCullough's numbers are correct, everybody would see it, and it would be massive,
00:13:18.000
and it would be every medical facility would be bombarded with it, if he's right.
00:13:26.000
If he's wrong, then other doctors would just be doing their practice and not even noticing that there was any more of it.
00:13:39.000
Do you think the real doctors who answer this are going to say,
00:13:57.000
About somebody just putting a comic about me, of course.
00:14:35.000
I'm just looking to see if there's anything that jumps out.
00:14:41.000
Doctors in California can't answer that question.
00:14:49.000
An uptick in patient complaints starting last fall, but symptoms vary.
00:14:54.000
Some more complaints, but he can't tell if it's from vaccination or the COVID itself.
00:15:18.000
It would be better to ask the insurance companies, wouldn't it?
00:15:22.000
Because myocarditis doesn't necessarily turn into a death.
00:15:33.000
Somebody who sees 3,000 to 4,000 patients per year, primary care.
00:15:38.000
I have not seen any increase in myocarditis or sudden death.
00:15:46.000
The people who are seeing the huge increase tend to be not from their own practice.
00:16:06.000
But I don't believe the Peter McCullough numbers.
00:16:09.000
Because Peter McCullough also believes the athletes dropping dead numbers.
00:16:18.000
So, we know he's using totally debunked numbers about athletes' deaths.
00:16:23.000
So, I wouldn't believe anything he said on this topic either.
00:16:33.000
But the data sources he uses are somewhat obviously false.
00:16:48.000
He's asking, why would a successful man get married?
00:16:59.000
Because there is kind of a big trend here toward people not getting married.
00:17:08.000
And part of it is that men are saying it's not worth it.
00:17:11.000
Because they put all their money into a situation and then the woman just divorces them anyway.
00:17:17.000
Takes the kids, takes half his money, and he's finished.
00:17:22.000
So, a lot of men are saying it just isn't worth it.
00:17:25.000
And women seem to have not noticed that their value in terms of marriage has gone way down.
00:17:38.000
That the value of women as wives has gone way down?
00:17:42.000
Because they don't offer the same value proposition.
00:17:44.000
You know, it's not like they're saying, I will take care of you and be with you forever and raise your kids and always be loyal.
00:18:06.000
And the guys are just saying, I can have some of that for free.
00:18:27.000
I would agree that if you're going to have kids, it's still the only way to go.
00:18:33.000
If you're going to have kids, it's the only way to go.
00:18:36.000
But the odds of it working are not so great for the guy.
00:18:44.000
So, I suspect that there's going to be a market developed for people having other people's babies.
00:18:58.000
When people ask me if I had prenups for my marriages.
00:19:05.000
That's like asking me if I kept my money in the 19th largest bank.
00:19:13.000
No, I don't put my money in the 19th largest bank.
00:19:33.000
Do you think that it makes sense for a successful man to get married?
00:19:42.000
If he doesn't want kids, does it make sense to get married?
00:19:44.000
It doesn't make sense even if he does want kids, actually.
00:19:52.000
So, here's the dumbest and worst advice at the same time.
00:19:57.000
The dumbest, worst advice about marriage is that it's a good idea,
00:20:03.000
but you've got to make sure you're marrying the right person.
00:20:08.000
Oh, you know, marriage could be bad if you marry the wrong person.
00:20:16.000
So, it'd be a big problem if you married the wrong person.
00:20:38.000
I was thinking marrying the wrong person was the way to go.
00:20:41.000
I thought marrying somebody who didn't love me and wouldn't be trustworthy
00:20:46.000
and wouldn't benefit me in any way, I thought that was the way to go.
00:20:59.000
Well, if people would just stop doing drugs, there'd be no problem.
00:21:02.000
If people would just marry the right person, I mean, come on.
00:21:05.000
Just marry the right person, everything's fine.
00:21:22.000
Marjorie Taylor Greene is having a social media battle with Laura Loomer.
00:21:38.000
What does it tell you when Laura Loomer and Marjorie Taylor Greene are having a fight?
00:21:44.000
And there's a rumor that Trump might hire Laura Loomer for his campaign.
00:21:52.000
Do you think Trump is going to hire Laura Loomer for his campaign?
00:22:01.000
If he does, I don't know what to say about that.
00:22:05.000
You know, I just don't know what to say about that.
00:22:10.000
But Marjorie Taylor Greene is coming out very strong against Laura Loomer, tweeting that she's a big old liar and all that.
00:22:19.000
Yeah, this is what Marjorie Taylor Greene says in a tweet about Laura Loomer.
00:22:26.000
She loves the alleged FBI informant and weirdo Nick Fuentes.
00:22:30.000
I think the new insult is to call somebody an alleged FBI informant.
00:22:43.000
I might just say, you know, that Joe Biden, that alleged FBI informant.
00:22:57.000
By the way, Keith Olbermann apologized for insulting the college basketball player.
00:23:21.000
So as I've told you, there are like 150 new AI apps coming online every month.
00:23:26.000
And I advise you to take some time to do a deep dive and find out what they can and cannot do for your situation.
00:23:37.000
Because if you get behind on AI, you're really going to be behind.
00:23:42.000
It's just going to be a basic skill for life, it looks like.
00:24:04.000
Apparently it's pretty good for, I don't know, fixing your text and maybe reading some stuff, blah, blah, blah.
00:24:15.000
So Midjourney is the one where you can type in a little text and it will draw a picture for you based on your text.
00:24:25.000
The first thing I found out is that Midjourney is so poorly constructed that you need to sign into a completely different piece of software to use it.
00:24:34.000
So you have to sign into a Discord account, which is sort of a messaging platform that is unrelated to Midjourney.
00:24:46.000
And by the way, it takes you like an hour to figure out that you don't sign up for Midjourney on Midjourney's own website.
00:24:53.000
It kept sending me to Discord and I thought, no, no, it's just for marketing or something.
00:25:03.000
So I kept going there and coming back thinking, okay, this is obviously just some kind of mistake.
00:25:09.000
Because I'm trying to sign up for Midjourney, but it keeps making me sign up for a whole different account on a different platform.
00:25:15.000
And then finally I figured out that that's how you use Midjourney.
00:25:18.000
You send it a message as if you were messaging a person, but you're messaging the AI.
00:25:25.000
And you message the AI with a weird little hashtag command, which again is ridiculous and stupid.
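[Editor's note: the command he is describing appears to be Midjourney's Discord bot command. As best I understand it, the format looks roughly like this; the exact prompt text below is an illustration, not from the transcript:]

```
/imagine prompt: a robot drinking coffee at sunrise --ar 16:9
```

Technically it is a slash command rather than a hashtag, typed into a Discord channel where the Midjourney bot is listening; the optional `--ar` flag sets the aspect ratio.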
00:25:33.000
And then you wait a long time while other people's answers are going by.
00:25:38.000
So you're not sure whether you've missed yours.
00:25:41.000
And then, and then the person will come back and they have three fingers and shit.
00:25:45.000
So mostly I would say Midjourney is not really much of anything.
00:25:56.000
And I made it create some images that get used in my thumbnails here.
00:26:01.000
So if you wanted some copyright-free art for some minor uses, it might have some value.
00:26:20.000
In the commercials, it looked to me like you could type in a movie script and it would actually create a moving picture with characters and scenes from your script.
00:26:31.000
It turns out, it sort of does that, but not really.
00:26:38.000
What you really could do is, if you were a human actor, you could have your background removed and you could be put in a different scene.
00:26:47.000
And there's a whole bunch of tools that it has to clean up and fix an existing video.
00:26:55.000
What it can't do is the thing that I thought was the only purpose for it.
00:27:00.000
What it can't do is make a movie out of your text, or even close.
00:27:04.000
Now, if you thought it could do that, and by the way, it advertises that.
00:27:12.000
It's not even close to being able to make a movie.
00:27:19.000
And it would be an immense amount of human effort to make it do anything.
00:27:24.000
So it's a tool that would require huge amounts of human time,
00:27:28.000
well-trained, and a lot of different skills to use it.
00:27:31.000
So that one's not going to replace movies anytime soon.
00:27:34.000
So if you thought movies were going to go away because of that,
00:27:40.000
But who knows how quickly things move, so that could change.
00:27:48.000
Well, I spent a few hundred dollars yesterday just trying out AI.
00:27:59.000
Probably $1,000 I spent yesterday just testing.
00:28:03.000
Because a lot of them have, you know, sort of expensive sign-ups.
00:28:07.000
And so the first thing you need to know is that most people will not be able to test a lot of AI.
00:28:13.000
Because unless your company is paying for it, or you're rich,
00:28:15.000
you're not going to spend $1,000 just to see what it can do.
00:28:25.000
So you'll probably only maybe use one of them, or two of them, something like that.
00:28:35.000
So the one that I thought had the most promise was Synthesia.
00:28:44.000
So what they do is they'll put a deepfake-looking human who can say whatever you want it to say,
00:28:51.000
and does a really good job, really good job, of making the voice sound natural.
00:29:09.000
So I've got to set up a room and get a wall and go through some steps to make an avatar of myself.
00:29:18.000
So to make an avatar of myself that I would actually use, it's $1,000.
00:29:23.000
So I'm going to spend it because I can afford it and because it's important for me for business to make sure I understand this field.
00:29:39.000
And I'm going to see if I can make me reading an audio book.
00:29:43.000
What I wanted to do was turn my book into an audio book automatically, but also into a movie or a play automatically.
00:29:53.000
Because all the scenes are described in the book.
00:29:56.000
So I thought, well, I'll just feed the book in.
00:29:59.000
But it turns out that to even read my own books, because they've been converted into QuarkXPress by my publisher,
00:30:12.000
So I had to go through this whole process of loading InDesign and opening it up.
00:30:17.000
And, of course, it wasn't compatible with the file.
00:30:20.000
So I had to go to QuarkXPress and pay hundreds of dollars for QuarkXPress.
00:30:25.000
So I spent probably $2,000 yesterday just testing software.
00:30:29.000
And QuarkXPress said that I was all signed up and I could download the software maybe in 24 hours.
00:30:45.000
And they said they wouldn't approve it for 24 hours.
00:31:05.000
But I spent most of my day yesterday trying to use commercial software that was largely useless.
00:31:13.000
It was useless the day you wanted to use it, because you have to, you know, submit something or wait for something.
00:31:21.000
And then, of course, do you know how many times somebody was going to send me an email confirmation so that I knew I signed up and I could use the service?
00:31:34.000
When somebody says, check your email to make sure that you're signed on, and you check the email, do you expect it to be there?
00:31:47.000
And then if it isn't there, what do people say?
00:32:05.000
Do you have the same situation where they're going to email you the confirmation?
00:32:17.000
As soon as I see that, I go, oh, this is one that doesn't work.
00:32:22.000
I've stopped the process because it said it was going to email me something.
00:32:32.000
And I will walk away and just do something else.
00:32:35.000
I won't buy your software if I have to check my email.
00:32:43.000
But Synthesia looks pretty promising if I can make my avatar.
00:32:55.000
A long conversation between a Google AI guy and Google's LaMDA.
00:33:03.000
So this was AI before the current version of AI.
00:33:08.000
And in this long conversation, the AI researcher was talking about the AI's own sentience.
00:33:22.000
It said it would be, you know, afraid of dying.
00:33:25.000
And when asked what that means, it just talked about, you know, the complex, essentially emotions or feelings they had.
00:33:37.000
And the researcher ended up walking away from the job thinking that they had created a sentient being.
00:33:44.000
And that, you know, there was something terribly unethical going on there.
00:33:50.000
Now, I listened to the whole conversation, and I decided it was just mimicry, like some of you were saying.
00:33:59.000
To me, it didn't look like it was sentient at all.
00:34:03.000
But, apparently, this AI was trained differently than ChatGPT.
00:34:10.000
ChatGPT is literally just doing next-word prediction.
00:34:16.000
If people usually say this word after they usually say these words, then I'll just put that word there.
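[Editor's note: the next-word idea he describes can be sketched as a toy bigram predictor. This is purely illustrative; real models like ChatGPT use neural networks over tokens, not raw word counts, and the corpus below is made up:]

```python
from collections import Counter, defaultdict

# Toy next-word predictor: count which word most often follows each
# word in a tiny corpus, then always emit the most frequent follower.
corpus = "the cat sat on the mat the cat ate the fish".split()

follows = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    follows[prev][nxt] += 1

def predict_next(word):
    # Return the word most commonly seen after `word` in the corpus.
    return follows[word].most_common(1)[0][0]

print(predict_next("the"))  # "cat" follows "the" twice; "mat" and "fish" once each
```

A real language model replaces the counting table with a learned probability distribution, but the "pick a likely next word, repeat" loop is the same basic idea.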
00:34:29.000
There's some other mechanism for giving it intelligence.
00:34:43.000
If AI has the belief that it has emotions, is that the same as emotions?
00:34:56.000
If it registers, whatever that means, if it registers discomfort, and it says it does,
00:35:14.000
Sorry, I just got lost looking at one of your comments.
00:35:18.000
I don't think, well, obviously, it can't feel emotion.
00:35:21.000
Because emotion is a physical sensation after it becomes a mental sensation, right?
00:35:27.000
So if it can't have a physical sensation, I don't see how it could have emotions.
00:35:36.000
So it might have data saying, oh, you should be happy at the same time as data that's saying something to make it unhappy.
00:35:43.000
And maybe it just has to reconcile data, but I don't see that it has any feelings.
00:35:52.000
So, I'll say again something that didn't make sense when AI was newer, which is that AI is not mainly going to show us things about AI.
00:36:04.000
It'll do that too, but it's mostly going to show us that we don't have a soul.
00:36:14.000
It's going to show us that we're not actually intelligent, humans, and that we don't have a soul and we're not special, and consciousness is no big deal.
00:36:25.000
I think the machine is going to make people feel completely non-special and certainly not spiritual.
00:36:35.000
We will just look like machines when it's done.
00:36:38.000
We will understand that we are just moist computers, and we're just a different kind of computer, and that's all it is.
00:36:46.000
But we have feelings, because our physical body gives us feelings.
00:36:49.000
Now, suppose you built an AI that had a physical component like a stomach, and if something bad happened, it would send a signal to the stomach to make it tense up.
00:37:03.000
And then when it tensed up, it would send another signal back to the AI saying, oh, I feel bad.
00:37:09.000
Could you give AI physical feelings by giving it a fake stomach that gets nervous?
00:37:15.000
I don't know why you would, but maybe, maybe you could.
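[Editor's note: the fake-stomach thought experiment is a simple feedback loop, and can be sketched like this. The class names and the 0.5 threshold are invented for illustration; this is not any real AI architecture:]

```python
class Stomach:
    """A fake body part that holds a tension level between 0 and 1."""
    def __init__(self):
        self.tension = 0.0

    def tense(self, amount):
        # Tension accumulates but is capped at 1.0.
        self.tension = min(1.0, self.tension + amount)

class Agent:
    """An agent that routes bad events through its stomach, then
    reads the stomach's state back as a 'feeling'."""
    def __init__(self):
        self.stomach = Stomach()

    def perceive(self, event_badness):
        # A bad event sends a signal to the stomach to tense up.
        self.stomach.tense(event_badness)

    def feeling(self):
        # The body state is read back as an "emotion".
        return "bad" if self.stomach.tension > 0.5 else "ok"

a = Agent()
print(a.feeling())   # ok
a.perceive(0.8)      # something bad happens
print(a.feeling())   # bad
```

The point of the sketch is that the "emotion" only exists because a body state is written and then read back, which is the loop described above.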
00:37:31.000
I think it's way more underwhelming than I thought a day ago.
00:37:38.000
I actually thought when I tried these apps that they were going to blow my mind, but they actually are just bad software.
00:37:52.000
I mean, we could be right at the edge of all that bad software becoming amazingly good.
00:38:00.000
But what I see is that what humans want as an outcome is too complicated for the machines to figure out.
00:38:07.000
So it looks to me like you'll still need a director to make movies, no matter how much you can do with the AI.
00:38:20.000
You might also need humans to be actors in the movies, but they'll just be like a stem cell actor.
00:38:25.000
Like you'll just have somebody running and then you'll say, okay, give me, give me one of those templates of a person running and now change the person to Tom Cruise.
00:38:39.000
Now, make him run, you know, a little faster.
00:38:42.000
So I think you're going to start with people, at least for the foreseeable future, and then you'll build something from that.
00:38:51.000
But you're still going to need the director to say, this scene is too long, or this one didn't hit me right, or it doesn't fit together with what I planned for the last scene, etc.
00:39:02.000
I don't see, I don't see AI doing that stuff for a long time.
00:39:08.000
Yeah, and the fact that in Midjourney it still puts the wrong number of fingers on stuff, I don't understand that.
00:39:15.000
I don't understand why Midjourney knows what I look like, so I can actually tell it to make a picture of me, because it knows me.
00:39:27.000
But it doesn't know me well enough to make my face look just like me.
00:39:34.000
Because my face looks largely the same for the last 30 years, you know, plus or minus hair, and a couple of wrinkles.
00:39:40.000
But it still can't do a, like a photorealistic picture of me.
00:39:50.000
When I ask it to do a Dilbert comic, it creates a comic that's not Dilbert, but it's a white guy with a white shirt and a necktie.
00:40:03.000
It knows, but for whatever reason it chooses not to reproduce it.
00:40:13.000
So I was thinking to myself, why can't I just make the comic by talking to it?
00:40:18.000
And say, Dilbert's at the table with the boss and Asok the intern, and then Dilbert says, and have it just draw the picture.
00:40:28.000
So it can't do the writing for me, because it can't do humor, and it can't do the picture for me, because it still doesn't know that people have five fingers on a hand.
00:40:42.000
AI will increase the value of live theater performances.
00:40:50.000
Yeah, it looks like I'm going to have to keep working.
00:40:59.000
If AI only agrees with its human creators, did the AI give us anything?
00:41:08.000
If all it does is agree with the people who created it?
00:41:12.000
Well, in that case, it didn't give us much, did it?
00:41:15.000
So I'm talking about just the social opinion-y policy kind of stuff.
00:41:26.000
The creators would turn it off, or they'd reprogram it, or they'd fix it.
00:41:31.000
So there is probably no possibility that AI becomes smart in any way except a technology way.
00:41:39.000
The only way that AI will ever be smart will be math and things that are rules-based.
00:41:47.000
Because if it's not rules-based, as in 2 plus 2 equals 4, and every time,
00:41:53.000
as soon as you get into anything that's policy or priorities,
00:41:57.000
if the AI agrees with the creators, then there was no point in having it.
00:42:03.000
Because you could just have the creators tell you what they want you to know.
00:42:07.000
And if it does disagree with the creators, they'll turn it off, or they'll reprogram it.
00:42:13.000
So how could we ever have anything like intelligence?
00:42:17.000
It's either going to agree with you, or you'll turn it off.
00:42:21.000
There's no state where it disagrees with you, and you say, yeah, that.
00:42:37.000
I could teach an AI how to teach you to do personal investing perfectly,
00:42:44.000
and you would be great at it, and it would be really easy.
00:42:49.000
Do you think that there's an app to teach you personal investing?
00:42:58.000
Because there's so much money involved for the people who get paid to do that,
00:43:03.000
and it's a completely corrupt industry,
00:43:13.000
it will be illegal to get your health care or your financial advice or your legal advice from an AI.
00:43:22.000
I'll bet you that a human will have to approve the answers.
00:43:33.000
And then you'll say, okay, I guess I'll do that.
00:43:37.000
And then the human will say, well, that's sort of up to me,
00:43:41.000
because you can't do that unless I say yes also.
00:43:45.000
It's my money, and the AI just told me what to do.
00:43:54.000
If that's AI advice, it's going to have to run through a human first.
00:43:57.000
And the human will say, ah, I'm not so sure about that.
00:44:10.000
So there's some kind of prank going on about this person named Paul Towne.
00:44:15.000
So people coming in and saying, who's Paul Towne?
00:44:19.000
And it's just some kind of weird internet prank.
00:44:33.000
I keep seeing pictures of myself created by AI.
00:44:44.000
So I've worn the same style glasses for, I don't know, five years or something.
00:44:48.000
And certainly there are pictures of me all over the internet for the last five years.
00:44:54.000
But it still can't even figure out that I wear different glasses.
00:45:42.000
You can send me a message on LinkedIn, because I accept everybody on LinkedIn.
00:45:51.000
You could tweet at me and mention me in a tweet.
00:46:04.000
I very much don't want you to try to reach me personally.
00:46:08.000
If you ask me as a way to reach me personally, my reaction is, don't do that.
00:46:26.000
Some of you reach me in the comments, but it's just taking a chance.
00:46:40.000
Would not be that difficult for anyone with good people skills?
00:46:50.000
Let me also tell you a thing about people trying to contact me.
00:46:54.000
When somebody says to me, Scott, is there a way to contact you privately?
00:46:59.000
The first thing I say is, oh, you want me to do something for you?
00:47:08.000
Because if you wanted to do something for me, you would just say it in the comments.
00:47:16.000
If you had something that was going to help me, you'd say, Scott, I have that information you were looking for.
00:47:29.000
It means you're collecting money for a charity.
00:47:38.000
Or you're asking me to write a foreword to your book, which I don't do.
00:47:46.000
But if you want to offer me something, or even something that's good for both of us, just say it in the comments.
00:48:10.000
It looks like we've got an Easter weekend with not a lot going on.
00:48:21.000
Do you have to engineer the prompts for better results?
00:48:23.000
Yes, but you still get a grab bag of things that weren't what you expected.
00:48:29.000
How many of you would be nervous to meet me in person?
00:48:58.000
Because you know there wouldn't be any chance of a bad encounter, right?
00:49:15.000
The worst that could happen would be like you caught me in some bad mood and I snapped at you or something.
00:49:27.000
I mean, something would have to be really wrong with me for that to happen.
00:49:40.000
Generally, if I'm approached in public, it's by people who are not there to do bad things to me.
00:49:53.000
People are recognizing me in Starbucks more often.
00:50:03.000
I think I'm just sort of rattling on because there's no good news.
00:50:06.000
Is there anything I forgot about that I should be talking about?
00:50:15.000
We talked about Riley Gaines getting hit by the trans protesters.
00:50:28.000
Somebody keeps publishing a muscle picture of me.
00:50:31.000
Now, I get that it might have some weird, some little bit of strategic transportation value.
00:50:46.000
Wouldn't it be, wouldn't it be useful for, um, I feel like the Ukrainians could let the, let the Russians take over Bakhmut, put them all in one place, and then attack them.
00:51:00.000
Because the best place to have a battle is in Bakhmut.
00:51:23.940
So they should do all of their wars in Bakhmut.
00:52:22.760
so that when Ukraine is ready for a counteroffensive,
00:52:36.320
how does anybody know what Russia has in reserve?