Episode 1840 Scott Adams: If Democrats Tell The Truth About Trump, It Makes Them Look Like Monsters
Episode Stats
Words per Minute
145.08688
Summary
On today's show, Scott Adams talks about the mid-term elections, college censorship, and why he thinks Sam Harris is wrong about his views on the election. Plus, a new update on the DeSantis campaign.
Transcript
00:01:58.740
Now, why did nobody see this coming? How can the
00:02:06.660
get away with it? They can't, apparently. Apparently
00:02:09.220
it's the most obvious thing you can't do. So while
00:02:14.120
DeSantis had been on this long winning streak of
00:02:45.800
terms of speech. They can certainly change what
00:02:56.140
it looks like colleges will get to say whatever
00:03:03.180
situation again, because there's an update. So as
00:03:08.160
you know, famous rational person, Sam Harris, some
00:03:23.880
laptop was suppressed by the left as a conspiracy
00:03:28.520
basically, and that it probably made a difference in the
00:03:31.920
election. And he thought that that was fine because the, there
00:03:37.500
had been similar, you know, laptop related things. Hillary's,
00:03:42.400
what's-his-name, Weiner's laptop, you know, was a factor in the first
00:03:46.560
election with Trump versus Hillary. And so Sam Harris was
00:03:50.840
saying, well, if Trump lost because of a laptop thing, but maybe he also won
00:03:56.780
because of a laptop thing, it's not the biggest thing in the world. And it's
00:04:01.360
probably fair because, in Sam Harris's view, Trump is so dangerous that
00:04:08.320
gaming the system a little bit was probably appropriate in terms of,
00:04:13.060
specifically in terms of that laptop thing. Now, of course, people got on him
00:04:18.860
and said to him, does that mean that you would justify the election being
00:04:25.360
rigged? Which is the obvious next question, right? And so he wanted to, to
00:04:32.400
tell us that he did not think that. So here's, here's his full answer. He said,
00:04:38.660
there's a podcast clip. Now, here's what you should listen for. I want you to
00:04:43.880
listen for the specificity of his answer. It's a little too specific, all
00:04:50.940
right? But wait for it, see if you catch it. So it's a, it's a
00:04:54.780
multi-part tweet. He said, there's a podcast clip circulating that
00:05:00.760
seems to be confusing many people about my views on Trump, which is
00:05:05.620
understandable because I wasn't speaking very clearly. So for what it's
00:05:10.180
worth, here's what I was trying to say. All right. So now I, I'm a proponent of
00:05:17.260
forgetting what he said before if a clarification is being offered. So forget
00:05:23.860
about what he did say. He's got 48 hours, according to my 48 hour rule that I've
00:05:29.180
made up to clarify. And if he does try to clarify, you should just take the
00:05:33.860
clarification. Don't, don't try to beat somebody up about what they said if they've
00:05:39.340
already clarified. All right. So I'm on board with him so far. Let's look at the
00:05:44.020
clarification and sort of ignore what he said before. I like that as a standard,
00:05:48.940
even if you think maybe it causes people to lie or backtrack or weasel or
00:05:54.120
whatever. But I think we should just keep that as a social standard. If somebody
00:05:58.880
clarifies, just take the clarification. All right. So he goes on. He said, I was
00:06:05.900
essentially arguing for a principle of self-defense. Interesting. So the idea here
00:06:11.080
is that Trump is dangerous and therefore our citizens would have a right of
00:06:15.320
self-defense. And then he goes on where there's a continuum of proportionate force
00:06:21.640
that is appropriate and necessary to use. So again, I think he's going back to if the
00:06:27.440
Weiner laptop made a difference in one election, maybe the Biden laptop makes a
00:06:32.740
difference in the other direction and that those would be roughly equivalent. I think
00:06:36.580
that's the argument. I've always viewed Trump as a very dangerous person to elect
00:06:40.900
as president of a fake university, talking about Trump University, let alone the U.S.
00:06:47.420
And when he became a sitting president who would not commit to a peaceful transfer of
00:06:52.040
power, I viewed him as more dangerous still. However, I've never been under any
00:07:01.980
So here he's saying that he's not under the illusion that he's Orange Hitler, which is
00:07:09.740
interesting because when I talked to him on his show, he did, he did allude, he did make
00:07:14.900
a Hitler, Hitler analogy with Trump. So I guess the Hitler analogies are not to be taken
00:07:21.200
seriously. Okay. So he doesn't compare Trump to Orange Hitler. And then on the podcast,
00:07:31.980
he goes on, I was speaking narrowly about the wisdom and propriety of ignoring the Hunter
00:07:37.480
Biden laptop story until the election. I've always thought that was a very hard call,
00:07:43.220
ethically and journalistically. But given what happened with the Anthony Weiner laptop in
00:07:48.620
the previous election, I think it was probably the right call. Now here's the part you want
00:07:54.020
to look for the specificity. All right. Look what he says. And then what he doesn't say.
00:08:01.260
All right. So watch for what he doesn't say. This is the important part. He goes on. Nothing I said
00:08:07.820
on that podcast was meant to suggest that the Democrats would have been right to commit election
00:08:14.200
fraud. All right. So what he's saying is that he didn't mean to suggest that Democrats would have
00:08:32.140
been right to commit fraud. He didn't mean to suggest it. Does that mean he doesn't think it?
00:08:38.760
Because that's different, right? It's one thing to say I didn't mean to suggest it,
00:08:46.300
which is an interesting way to say it. How about I didn't mean to say it? But, you know,
00:08:54.620
what I would say is this: if it was confusing, I'd change it. But interesting. He didn't mean to suggest it
00:09:01.760
doesn't really give his opinion about it, does it? I didn't mean to suggest it is very different from
00:09:08.760
that wasn't my opinion. Am I wrong? I feel like when you say I didn't mean to suggest it,
00:09:17.100
it means I didn't mean to say it. It doesn't mean he doesn't think it. And we don't know what he
00:09:23.560
thinks, right? I'm not putting a thought into his mind there with my imagination. I'm just saying
00:09:30.200
that he didn't specify that it wasn't his opinion. He simply specified he didn't mean
00:09:38.580
to suggest it. Slightly different. You can imagine them being the same, or you can imagine
00:09:45.800
that he meant them to be the same. But they don't sound exactly the same, do they? Sound a little
00:09:51.320
different. And he says, nor do I think that they did that. Now, whether he thinks they did that or not
00:09:59.320
is irrelevant to whether they did that or not. But didn't mean to suggest that it would have been
00:10:07.060
right to commit election fraud or other illegal means. Didn't mean to suggest it. It feels like
00:10:14.180
we still don't know if he thinks it would have been a good idea. Like, why not tell us directly?
00:10:21.980
Because here's the thing. If it's not a good idea to rig an election to keep Trump out of office,
00:10:27.720
why isn't it? Why isn't it a good idea? If you believe what you say about his danger,
00:10:34.800
it would definitely be a good idea. I would do it. If I believed what they believe,
00:10:39.700
I would rig the election. Would you? Would you? If you were keeping a monster out of the office,
00:10:50.500
or yeah, I think he actually compared Trump to an asteroid barreling toward the earth. Because if
00:10:56.700
you imagine, let's say you imagine from the left point of view, if climate change is going to, you
00:11:04.120
know, destroy the world in some fashion, and you think Trump's going to make climate change
00:11:11.460
worse, it's kind of like an asteroid coming toward earth. All right. And somebody says that if you
00:11:20.300
break the law, they'll do it to you. What about that argument? If you cheated on the election,
00:11:26.580
they'll do it to you. How does that deal with the fact that everybody who can get away with it does
00:11:35.480
it already? I mean, we live in a world where everybody who can get away with stuff does it.
00:11:41.200
It's just whether they think they can get away with it. That's the only thing that limits people.
00:11:46.100
In the real world, that's it. If they thought they could get away with it, pretty much everybody
00:11:51.740
would do it. Pretty much. Now, you think you're the exception. Okay, I get it. You're the exception.
00:11:57.840
And people in your family are awesome. No, people do what they can get away with. That's the world
00:12:03.880
you live in. Now, getting away with it includes God watching you, right? If you're a religious person,
00:12:13.000
you believe God will judge you. So that might not be getting away with it. In your own point of view,
00:12:19.120
that would not be getting away with it because you're being watched by God. But if you didn't
00:12:23.600
believe God was watching, and you didn't believe anybody else was watching, and you didn't think
00:12:27.260
there was any chance of getting caught, you would all rig the election. I know you think I'm wrong,
00:12:34.000
but I'll give you my opinion. It's just an opinion. I can't prove it. So it's just my view of the
00:12:41.080
world. 100% of the world would rig the election if they thought they were saving the world.
00:12:46.100
If they thought they were saving the world, 100% would rig the election. If they thought
00:12:52.720
they could get away with it, nobody was watching. No God was watching. Your conscience, well,
00:12:58.920
your conscience, I guess you still have that. Yeah. Now, I accept that you don't agree with
00:13:05.160
me. I accept that you don't agree. But I'm positive. I'm positive in my opinion.
00:13:15.180
Very rarely am I this sure about an opinion, but I'm positive about this one. Now, you could say
00:13:20.920
there would be some exception, right? If you took a million people, yeah, you could get somebody to
00:13:26.420
not do it, definitely. But the ordinary person would save the world if there was no risk.
00:13:34.960
That's what I'm saying. Now, the ordinary person might not destroy the world for their own benefit.
00:13:44.700
That's true. I can see that they wouldn't destroy the world just to get a little extra gain. Some would,
00:13:51.120
some wouldn't. But everyone would break the rule to save the world if they thought that's what was
00:13:59.700
happening. Everybody would. The exceptions would be so rare that you wouldn't even discuss them.
00:14:05.980
It would be crazy people, stuff like that. And so I want to see if I can get somebody to say,
00:14:12.820
say right here in the comments, under this scenario, that you wouldn't break the rule,
00:14:19.160
because, you know, the rules are the rules, and we're a rules-based people. But you wouldn't break,
00:14:24.900
tell me you wouldn't break the rule even if you thought it was the end of the world.
00:14:30.040
It's the end of the world if, unless you break the rule. Would you break the rule?
00:14:35.940
How many would break the rule to prevent the end of the world? You would not. So there are people
00:14:42.700
on here who would not, who would not break a rule made by humans, and they would let all the humans die.
00:14:50.460
You would let all the, there are people actually saying that on the locals platform. Yeah. And,
00:14:58.880
and people are confirming, yep. People are saying yes. They would let the entire world die,
00:15:04.620
because following the rule is more important. You're actually saying that, right? I'm not
00:15:11.260
misinterpreting you, right? Yeah. Yeah, there's, so there are actually a number of people
00:15:19.860
who would destroy the world to, to protect themselves, because that's what that is. Yeah,
00:15:28.800
if what you're doing is, is not breaking the rule, because you can't break the rule,
00:15:32.920
that's really protecting yourself at the expense of the entire planet. And there are people saying
00:15:38.760
they would do that. They would protect their own feeling about, you know, being a protector of rules.
00:15:44.820
More important, apparently, than all of the people who are alive, all 7 billion of them.
00:15:53.300
Yeah. Now, am I supposed to take you seriously?
00:15:59.220
How can I take that seriously? Because you're, you're either lying or you're so stupid that you
00:16:06.140
wouldn't be able to log on and have an account. I mean, are you lying? No, nobody would do that.
00:16:13.500
Unless you were like mentally ill or something. Yeah. Now, remember, my, my situation is that,
00:16:23.240
you know, the world is going to be destroyed. It's, there's no question about it. And you would
00:16:26.780
still do it. Now, if you were uncertain whether you were right, that would make sense. If you said,
00:16:33.100
well, I don't know if Trump's, you know, the, the end of the world, but I worry about it. In that
00:16:38.900
case, I wouldn't break a rule. Would you? Suppose you just had a suspicion that things would go wrong
00:16:46.060
with some candidate. I wouldn't, I wouldn't throw an election in that case. I'm not even close. I'm
00:16:53.480
talking about, you're sure that this candidate is just going to destroy the world. You're sure of it.
00:16:59.020
In that case, sure. I would break any rule there was. All right. So the people who were saying that
00:17:08.300
they would not break the rule, even at the risk of the entire planet, now I'm seeing the real
00:17:14.200
thinking is being revealed here. They don't believe this setup. So in other words, they rejected the
00:17:19.320
hypothetical that there could ever be a case when there would be any danger because of who got elected.
00:17:25.640
Right. If that's what you're saying, then you're not good at answering questions. So that's the
00:17:31.380
problem. Right. The reason that I ask hypotheticals is to force you to clarify your thinking. If you
00:17:39.820
refuse to clarify your thinking, that's a message too. That says something. All right. But here's the
00:17:49.320
problem with the Sam Harris opinion. And again, I guess I would be speculating and reading some minds
00:17:57.920
a little bit here. It's a little bit unfair, but it's hard to avoid. How can you be honest
00:18:05.220
in this situation? And Sam Harris, I don't think he has the capability of just lying.
00:18:14.280
I think it's sort of beyond his framework. Because first of all, I don't think I've seen an example
00:18:22.180
of it ever. I don't think he's even been accused of lying, has he? Like ever? I mean, it's pretty
00:18:32.120
amazing when there's a public figure that we can all sort of have some familiarity with. And as far
00:18:39.200
as you know, that person has never tried to lie to you. You could disagree with them all day long,
00:18:44.720
but they haven't tried to lie to you. Right. So I don't think that Sam Harris is lying. Do you?
00:18:52.700
No evidence of that. But what do you do if your honest opinion is that somebody is so dangerous
00:18:59.660
that bending some rules might make some sense, if that's his opinion? You really couldn't express
00:19:05.560
that, could you? Because it would be so damaging. Suppose you were a person who only
00:19:11.820
tells the truth, and it's really important to your, I don't know, your psychological makeup,
00:19:16.840
your brand, your legacy. It's just real important that you tell the truth all the time. And I think
00:19:21.940
Sam Harris is probably the best example of that. His entire, you know, being is wrapped around
00:19:30.100
rationality and being honest. So how does he deal with the fact that if he believes
00:19:37.500
what he says about Trump, it does make perfect moral sense to rig an election. But I think that
00:19:46.440
it's so dangerous to say that, because that gets taken out of context, of course. Of course,
00:19:54.060
it would be taken out of context. And of course, it would be used as evidence that it was rigged
00:19:59.920
when we don't have proof of that. So wouldn't it be really dangerous for Sam Harris to be honest
00:20:07.440
about the hypothetical? That if you believe that Trump was as bad as he believes, and you have the
00:20:14.960
ability to rig the election, then maybe it wouldn't be the dumbest thing in the world.
00:20:19.260
I don't think that they can say out loud what they're actually thinking. Imagine having a point
00:20:27.700
of view that you're afraid to say out loud. It's common to all of us, I think. But I think the
00:20:37.220
problem is that they actually can't say their actual opinion, because it would be, it would sound
00:20:44.920
disgusting, even to themselves when they said it out loud. So if you have an opinion that you
00:20:50.620
can't, a political opinion that you can't express in public, that's a problem.
00:21:00.860
He's afraid of being canceled? I suppose we all are. But anyway, this is more interesting than any of
00:21:08.180
the other people in the news, because you know, well, you don't know, but there's a strong assumption
00:21:15.800
he's not lying. So think about how interesting this is. The reason he's such a good discussion
00:21:22.800
case. Not lying, not stupid, not under-informed. When do you ever see that? In all of politics,
00:21:34.640
when do you ever see not lying, not stupid, and not misinformed? And then he has a different
00:21:45.240
opinion than I do. Now, you don't know it, but I can know it of myself. I'm not lying. I think
00:21:54.280
I'm smart enough to be in the conversation, and I think I'm well-informed enough to be in
00:22:01.000
the conversation, too. So how do you explain, of course, this is me complimenting myself here,
00:22:07.720
but how do you explain two people who are not lying, they're both smart, they're both well-informed,
00:22:13.860
and have completely different views on Trump? How do you explain that?
00:22:21.760
Well, the explanation I would give is that one of us is experiencing cognitive dissonance.
00:22:27.300
And then how do you tell which one it is? Do you remember my trick for doing that?
00:22:34.220
What's the trick for seeing who has the cognitive dissonance? The trick is if you can take the other
00:22:41.520
side of the argument and give a full-throated argument for it to show that you understand it.
00:22:48.140
But if you can't do that, you're probably in some other kind of a mental situation.
00:22:52.680
Now, I believe I could take Sam Harris's argument and completely explain it. I don't know if he could
00:23:00.300
do that with mine. Probably could, but I don't know it. And that would be interesting, wouldn't it?
00:23:07.260
How would you like to see the two of us, you know, say, talk, split screen, and we just do that
00:23:14.040
exercise, where I try to take his point of view and explain it as well as I can, and then he tries to
00:23:19.540
take my point of view and explain it as well as he can. Do you think we could do it?
00:23:25.740
Here's the interesting thing. I'll bet we could. Probably could. I don't think I've ever seen it
00:23:31.060
before. I've never, actually, I've never seen it. I've never seen anybody do that before. It would be
00:23:35.960
fascinating. But I suspect also maybe we might think the other one hadn't quite nailed it, you know, maybe
00:23:42.820
left something out. But we'd probably end up pretty close on that. All right, here's my theory. I think
00:23:51.040
that, in my opinion, it looks obvious that he's experiencing cognitive dissonance. To me, it looks
00:24:00.940
obvious. But it might look obvious to him that I'm experiencing cognitive dissonance. And how would
00:24:09.620
you know who's right? You wouldn't, really, unless you do that exercise. And we might both pass that
00:24:14.860
as well. So here's what I would say about that. Before Trump was elected the first time, I think it
00:24:29.540
was a reasonable fear that maybe he was as bad as the worst, the worst impressions. Right? You
00:24:38.260
didn't know. I didn't know. In the first election, it wasn't impossible that he could have turned
00:24:44.600
into some kind of monster once he got in office. It wasn't impossible. I didn't think it would
00:24:49.240
happen. But it wasn't impossible. So four years go by, and it doesn't happen. The one thing
00:24:56.700
that many of us would disagree with Trump on is how he handled the losing. If handling the losing is the
00:25:05.420
only, you know, major thing that justifies their opinion, I feel like that's clearly cognitive
00:25:13.100
dissonance. So here's the part that I would like to see Sam Harris respond to. What would be his
00:25:20.620
response to the fact that a Harvard study showed that two-thirds, at least two-thirds of the people
00:25:26.560
who were at the protest believed that they were saving the republic? They honestly believed the
00:25:32.600
election had been rigged, and not necessarily because Trump told them. They felt that from the
00:25:39.040
moment they saw the result. It wasn't because Trump told them. Everybody saw it. So how does he explain
00:25:45.900
that two-thirds of the people there, genuinely, in their deepest feelings, felt something had gone
00:25:54.060
wrong and they were there to fix it, to actually repair the republic? Not to overthrow it, to repair it.
00:26:00.440
Why would two-thirds of the people there who believed they were there to repair the republic, or more,
00:26:05.980
I think? Why would you imagine that Trump was not one of those? Because the majority opinion was that
00:26:14.520
something was wrong and they needed to fix it. It was the minority opinion that nothing was wrong and we
00:26:21.760
need to overthrow the country anyway. Yeah. Well, why would you imagine that Trump would be signing on to
00:26:30.140
the dumbest of the opinions when the smartest one is just right there? It's right there. People thought
00:26:36.320
they were saving the republic. How do you rule that out? You can't. Now, today's live stream will be a
00:26:47.960
little truncated because my Dilbert career is in great jeopardy at the moment, so I'm spending all my
00:26:54.740
time trying to fix that. Problem is, I've been drawing for, I don't know, 15 years or something on a
00:27:02.820
Wacom tablet in Photoshop, and I think Photoshop either changed how the program works, or there's some
00:27:12.280
setting, or I've got a bug, I don't know, but I can no longer paste something and then draw on top of it. I mean, I
00:27:21.620
can manipulate some things so I can get it done, but the manipulation would take too long. I just
00:27:26.420
couldn't use the tool if I had to do that. So I've published my problem on Twitter for everybody to
00:27:36.320
give me advice. What do you think happens when you ask for technical advice? Let me tell you.
00:27:42.940
They will tell you to reboot, which I've already done. They'll tell you to make sure you have the new
00:27:48.140
version of the software, which I've already done. They will also look at the screen that you give
00:27:53.400
them, and they'll say, oh, your problem is that you have that icon pressed, even though the picture
00:27:59.700
shows it's clearly not indicated. So some will hallucinate that there's a lock that's set when
00:28:06.060
it's not. You can see it clearly. Others will tell you, Scott, why don't you Google it? Now, you could
00:28:12.600
Google all day long. You'll never find a technical solution by Googling. Have you ever found a technical
00:28:18.940
solution by Googling it? I've tried. I've tried about a billion times. When you do it with consumer
00:28:28.060
software, it might be different if you're programming. You know, a developer probably does find answers,
00:28:33.500
but if you're just using a commercial piece of software, what happens when you look for your
00:28:37.480
answer online? It's all different versions. What happened when I asked people, they sent me
00:28:44.100
screenshots of exactly what menu I should select. Do you think there's any chance that the screenshots
00:28:51.020
that people sent me actually exist on my software? Of course not. It's a different version that they
00:28:56.800
solved. It's either for Windows or it's an old one or something. Then somebody will say, well,
00:29:02.760
have you tried the Framajan blah, blah menu? And I'll say, I would try that if it existed,
00:29:11.180
but mine does not have a Framajan menu. I don't know what you're looking at. So basically,
00:29:17.500
all of your technical help is going to be: read the manual, which doesn't help; Google it, that
00:29:23.120
doesn't help; reboot, that doesn't help; update the software, that doesn't help. And then there's
00:29:28.360
always this one bastard. There's always one bastard who says the only thing you can do
00:29:34.680
is delete all of your software and reload it. I'm not going to do that. I will sell my computer
00:29:42.940
before I do that. Because you know it's just going to be problems. Yeah. So the problem is that when
00:29:48.880
I paste a layer, I can't draw on the layer unless I send it to background. And I didn't ever have to
00:29:56.160
send it to background before. You should be able to write on the layer you just pasted. That's just
00:30:01.760
basic. Yeah. So if I have to change to a new software, there's going to be a big learning curve.
00:30:08.940
I've got a deadline I'm up against. So here's the problem. I don't know that I can solve this
00:30:14.360
problem before my deadline. And I don't know that it's solvable because the solution might be to stop
00:30:21.020
using Photoshop and use something else. Now, some people said, why don't you contact Adobe
00:30:27.120
technical support? I would never even try. I'd never even try. Do you think there's any chance
00:30:37.000
that's going to work? Yeah, it's not a Wacom problem. It's definitely not a Wacom problem.
00:30:42.360
It's definitely a Photoshop problem. No, it's not going to work. Have you ever tried to get
00:30:48.940
technical support on the phone? That's not a thing. No, you'll just be sent through the phone
00:30:55.560
trees and then it'll disconnect. And then you get somebody who started yesterday. Yeah. And it would
00:31:03.520
take me probably two hours even to find a contact number. I mean, I'd Google it. It'd be the wrong
00:31:09.720
number. I'd try this. They'd tell me that I got to use the other number. Yeah. Customer service isn't
00:31:16.340
really a thing that happens anymore. So there's not really any customer service from a tech company.
00:31:23.500
Right? I mean, you could try, but it's just going to be a waste of half a day sitting on hold and then
00:31:30.260
getting nothing. What would Dilbert do? I don't know, but I know what Dogbert would do.
00:31:40.280
Dogbert would make it a huge public problem, so that Adobe has to fix it for me. Because it's
00:31:50.040
probably, here's what I think. I think they did it intentionally. I think it's an intentional
00:31:56.820
change to the software. I think. I don't think it's a bug. I think it's intentional. Now it could be
00:32:03.420
that somebody requested it because there's some advantage about the way they're doing it.
00:32:07.060
So I think somebody benefits from this change, but it makes it useless to me. It would make my job
00:32:14.640
so hard. I don't know if I could even bother doing it with the software anyway.
00:32:25.420
Yeah, you can't really reinstall the old version because it's a subscription service.
00:32:29.900
Yeah. If I tried to do that, it would take longer than I have. So I'm either going to have to miss
00:32:37.940
my deadline or a miracle has to happen. Brush mode "Behind." The culprit is brush mode "Behind."
00:32:52.460
Well, I don't know what that means. That's what most of the people would give me. All right, so here's a good
00:32:59.040
example. The culprit is brush mode "Behind." What do I do with that? I don't know what to do with
00:33:04.560
that. Those are just words. Brush mode and behind. What's that mean? I know the layer needs to be
00:33:12.940
unmoved, but I don't want to move the layer. That's the problem.
00:33:27.760
But most of your advice is going to be for Windows. That's the problem.
00:33:36.040
Even compressing, even flattening the layers, you can't draw on it.
00:33:44.660
Now, there's lots of things I can do to hack it, but it doesn't just work.
00:33:54.680
You can't debug it, pick up the pen, draw, debug it, pick up the pen.
00:34:01.260
You can't get closer to drawing or you can't even use it.
00:34:10.900
So what happened was when I paste a layer, I can't draw on the new layer.
00:34:17.620
And don't say that I got something locked because nothing's locked.
00:34:27.720
Now you're saying a two-finger click on the trackpad.
00:34:31.500
But when you tell me Windows commands for my Mac, I don't know what to do with that.
00:34:54.840
Make a backup of the computer, erase and install over.
00:35:00.520
I'm not going to erase my whole hard disk and reinstall all my software for this one bug.
00:35:10.720
If you think it's a bug, that's a whole different problem.
00:35:15.480
This appears to be a software change, which maybe I can unchange with some kind of a setting.
00:35:36.580
Now, the problem with this is that I'll get so many suggestions that I won't know which ones are right,
00:35:45.580
and I could pore through the wrong ones all day long.
00:35:52.940
I don't think there's any kid that could solve this, honestly.
00:35:56.480
I think that maybe Adobe just made a change and there's nothing I can do about it.
00:36:30.120
Google, how to revert to previous software versions.
00:36:32.660
No, I'm not going to revert to a previous software version.
00:36:40.080
I'm going to fix it with this version or some future one, but I'm not going to revert.
00:36:47.140
Because that's just asking for a whole bunch of new problems.
00:37:03.380
So there's nothing else I can do today except work on this one problem until it's fixed.
00:37:08.680
So that will be all I'm going to do until that's fixed.