Episode 1919 Scott Adams: Two Days Before Elections And The News Is Delicious & Stimulating. Join Us
Episode Stats
Length
1 hour and 32 minutes
Words per Minute
146
Summary
The pre-election sip that you've all been waiting for, the one you've been hoping for, and the one that you've all been dreading. It's the pre-election sip of the day, and it's a good one.
Transcript
00:00:00.880
Good morning, everybody, and welcome to the highlight of civilization, the day before
00:00:06.120
Election Day, and our minds are, I would say, focused.
00:00:12.620
I would say that things are starting to move in the right direction, maybe.
00:00:20.200
How would you like to take it up to a new level of awareness, to a higher dimension,
00:00:25.140
a higher level of performance than you have ever experienced before?
00:00:29.240
Well, all you need is a cup or a mug or a glass, a tankard, a chalice, a stein, a canteen jug
00:00:44.100
The dopamine hit of the day, the pre-election sip that you've been waiting for, it's all
00:01:04.100
Well, I had been working pretty hard, but I managed to take some of my workload off by
00:01:20.060
So thankfully, a lot of the chores that one normally does, I managed to put off on my part-time
00:01:29.180
assistant, who immediately had a sporting accident and is on vacation right now.
00:01:35.840
So that didn't work out as well as I'd hoped, but it will.
00:01:44.280
Have you seen the video of the Iranian women knocking the headgear off of clerics in Iran?
00:01:55.360
So, you know, you're watching the, I guess the government is cracking down on the women
00:02:01.460
who are trying to not wear the traditional, you know, face coverings and stuff.
00:02:12.020
I thought they would be stamped out by now, but they continue.
00:02:19.980
You can see a compilation where the young women, it looks like, will run up behind a cleric who's
00:02:27.500
What's the name of the headgear that a cleric wears?
00:02:55.800
But the young women are, you know, they can just knock the hats off the clerics and just
00:03:00.820
Because there's nothing the cleric could do about it.
00:03:03.900
And I thought to myself, the mullahs or the clerics?
00:03:28.320
If I were the women in Iran, I would start to all dress alike everywhere.
00:03:36.240
I would wear all black bottoms, all black tops of the same kind.
00:03:51.100
Because I'd be knocking the hat off of every fucking cleric within miles.
00:03:57.060
And they wouldn't be able to catch anybody because they'd all look the same.
00:04:01.220
So I think that the Iranian women should embrace and amplify.
00:04:05.760
They should over-cover themselves and all look the same so they could be knocking hats
00:04:11.560
off of clerics all day long and nobody could get caught.
00:04:32.440
They're asking for you to be completely unidentified.
00:04:38.880
Show them what happens when you're unidentified.
00:04:47.360
I've told you this before, but man, remember when we were waiting for AI artwork to someday be as good as a human's?
00:05:04.120
I've seen enough examples now, one after another, of AI art I want to put on my wall.
00:05:12.820
Like, I'll see it and I'll go, I would put that on my wall.
00:05:16.840
I don't really see human art that makes me do that.
00:05:19.260
I don't see human art that makes me want to put it on a wall.
00:05:32.620
But Brian Machiavelli did another piece of artwork in which he just asked it to do a Dilbert comic.
00:05:41.860
I'll show you which AI he was using here in a moment.
00:05:44.140
And the artwork, in my opinion, as a professional cartoonist, is equal to the best human cartoonist.
00:05:56.600
But there's an interesting quality about it that blows my frickin' mind.
00:06:03.080
So you won't be able to see it as clearly as I want it.
00:06:06.340
But the request from Machiavelli's Underbelly was to make a Dilbert cartoon about a zebra.
00:06:24.280
And look at the quality of the cartoon at the bottom.
00:06:30.120
That is 100% as good as the best human cartoonist right there.
00:06:39.860
And like this art, like that bottom cartoon, that is so well designed.
00:06:54.520
Do you notice that the humans have weird characteristics?
00:06:59.660
Like they don't quite look like they have the right features and stuff.
00:07:03.440
They look all interesting, but even from one panel to the next, the AI is making the person look different.
00:07:10.940
Like one has, you know, three lenses and blah, blah, blah.
00:07:25.600
You know how the human brain gives you a perfect image of things that you don't see perfectly?
00:07:36.020
So, for example, you will have a memory of something that didn't even happen.
00:07:42.640
You also, if you're watching, let's say, a tennis ball being hit hard and it bounces near a line,
00:07:59.260
But did you know you don't see that tennis ball?
00:08:02.120
Your brain, or your eyes, depending on the speed it's going,
00:08:06.980
can only pick up the tennis ball every five feet or something.
00:08:09.960
But all the stuff in between, your brain filled in and it wasn't there.
00:08:15.260
But your memory is you saw the ball the whole way.
00:08:19.920
Your brain allowed you to imagine you saw it the whole way.
00:08:23.160
Now, do you accept that your brain is, in real time, translating things into things that are not true?
00:08:35.440
Your brain is always translating your approximate environment into a specific picture, which is not true.
00:08:44.240
Now, what happens when you take AI, high intelligence, and you train it to look at human faces,
00:08:53.240
and then you tell it to reproduce a human face in some artwork?
00:08:56.600
And then the AI makes the human face look different in each panel.
00:09:05.900
It doesn't seem to matter which AI you're using.
00:09:09.980
And you can train the AI to really know what a face looks like.
00:09:14.600
Because faces are approximately, you know, the same, they're approximately symmetrical, right?
00:09:27.300
Are you telling me that AI can't make a human face?
00:09:41.100
It could be that human faces don't look the same from minute to minute, and that we can't tell.
00:09:54.260
Because my human brain is making your face look the same to me as it looked a minute ago.
00:10:00.760
What if our faces don't look the same from moment to moment?
00:10:05.040
And that it's actually like a, you know, it's more like a wave function.
00:10:16.520
And when somebody looks at it, they're picking up the potential.
00:10:19.560
They're forming in their mind their own face, and then they lock it in.
00:10:24.180
And so every time they look at you, they see that face that they've locked in.
00:10:33.040
Because give me one other reason that AI can't make a face the same twice in a row.
00:10:44.160
The technology is telling you what is actually there.
00:10:48.420
The AI is telling you that our faces are not the same from moment to moment.
00:10:59.600
That is within the realm of something that could be true, which is freaky by itself, even if it's not true.
00:11:06.540
But I can't think of another reason that AI can't make a face.
00:11:15.840
RealClearPolitics had a poll asking people if we thought the country was heading in the right direction or the wrong direction.
00:11:26.280
But I want to see if you can use your powers of deduction.
00:11:32.300
What percentage of the population thinks that things are moving in the right direction?
00:11:52.960
I've never seen a group that could guess things better than you.
00:11:56.160
But once again, once again, with no help whatsoever, you got the right answer.
00:12:05.360
About a quarter of the country thinks things are fine and in the right direction.
00:12:19.360
I had a realization the other day that two things are true.
00:12:29.240
I think you'll agree with the first one, but not the second one.
00:12:31.780
The first one, everything appears to be broken.
00:12:39.660
All of our systems, all of the way we think about things, education, government, supply chains, military.
00:12:57.380
When everything looks broken, you could be wrong.
00:13:04.680
Because, remember, the central fact of our human cognition is that we're looking for problems so we can fix them, so we can survive.
00:13:23.700
Because if it keeps you happy and stupid, you're also going to be dead, right?
00:13:38.420
A news model that gets more clicks for bad news, right?
00:13:46.340
So now, people are automatic problem identifiers and solvers.
00:13:50.860
And then the news model gives you nothing but problems.
00:13:54.040
What in the world would you think about your world?
00:14:04.140
But when was the last time you saw a big news story about good news?
00:14:08.460
Hey, here's a news story of something that's working great.
00:14:11.540
I try to give you those stories, but even when you see them in the press, they don't really frame them that way.
00:14:18.380
But, you know, I'll tell you, hey, things are moving in the right direction.
00:14:23.440
So number one, I think we all agree everything looks broken.
00:14:29.960
I would say it's not just that it looks broken.
00:14:37.960
Would you take it further and say it doesn't just look broken?
00:14:44.040
In the sense of it's not where we want it to be.
00:14:46.300
No, not maybe 100% broken, but everything's not where we want it to be.
00:14:54.960
Humans love it when things are not where we want them to be.
00:15:00.220
You're continually like, ah, got to build this thing, got to fix this thing.
00:15:32.420
That our politics and our free markets and our, you know, flow of information, which isn't as free as it could be, but it's getting better.
00:15:48.860
Now, I told you about, you know, Germany was going to freeze over the winter because they didn't have Russian gas.
00:15:55.420
And then the latest report was, oh, somehow they figured out how to store enough gas so they could make it through the winter.
00:16:09.340
Because everything's broken, that gives you license to improve it.
00:16:14.360
Did you ever work in a company where something's working fine?
00:16:20.460
Everybody would say, why are you focusing on that?
00:16:23.600
But everything looks broken, which causes us to try to fix everything.
00:16:30.260
We're in this frenzy, you know, in society, in the world in general.
00:16:39.660
Everything's getting redesigned, usually by the free market, right?
00:16:43.920
Absolutely everything's being redesigned from almost the bottom up.
00:16:47.640
We're in a reinvention phase that you won't recognize until it's, you know, toward the end.
00:16:55.800
But you don't recognize we're in a creative, inventive phase that will be one of the great ones of all human history.
00:17:07.040
Now, I would argue that after World War II, we had a creative, you know, explosion.
00:17:14.440
You know, and then there was the industrial revolution, creative explosion, et cetera.
00:17:19.680
But I feel like we're in one, but you don't feel it.
00:17:23.620
When you're in it, it just feels like everything's broken because you're just trying to fix everything.
00:17:33.380
And I think that applies to everything from our energy policy to the way we run elections to Twitter to you name it.
00:17:42.540
Literally everything is trending positive if you look at it as a system that's, you know, in the phase of correcting.
00:17:51.960
It does not say we should not be vigilant about all of our problems.
00:17:55.620
We'll do that automatically because we're good at being vigilant.
00:17:58.740
How many of you saw the video of John Fetterman giving his speech outdoors?
00:18:04.780
He had a bunch of American flags behind him and the wind blew them all down while he was talking.
00:18:13.920
And Fox News had an article about how social media was mocking it and, you know, had some tweets there from some notable people saying funny things.
00:18:24.480
And I could not be more angry at Fox News today.
00:18:29.000
Fox News was saying that people were saying clever things about that Fetterman flag thing and they did not include my tweet.
00:18:41.280
And so, since they only mentioned some lesser clever tweets, which were pretty good, but they were lesser clever than the one I did.
00:18:53.340
Hypothetically, if God had been trying to warn us for months not to vote for a particular candidate, what would that look like?
00:19:03.700
I'm a professional joke writer and I declare that was the best joke written on this topic.
00:19:13.720
This is a joke writing lesson for you, so you can take this way.
00:19:17.840
I've told you this before, but every time you see an example, it helps.
00:19:21.440
If you can do a punchline or a joke in which the audience has to fill in the details, that's your best joke.
00:19:35.100
If you over-specify, you know, what you want somebody to imagine, well, that can be a good joke, too.
00:19:41.060
But the great jokes are where it takes you a moment to fill it in.
00:19:45.480
And when you fill it in, you're filling it in with your own stuff.
00:19:48.700
So when I said God had been trying to warn you for months, you can fill it in with everything that happened to John Fetterman, which was a tragedy.
00:19:59.720
But if you had, you tried to fill it in, you're like, okay.
00:20:04.540
Yeah, that sounds very much like that old joke.
00:20:09.000
The floodwaters are rising and a neighbor knocks on the elderly person's door and says, you know, hey, the water's rising.
00:20:18.060
And the woman says, no, no, you know, God will provide.
00:20:22.080
And the water rises and, you know, the woman has to go up to the second floor of her house.
00:20:26.140
And then a boat comes up to the second floor window and says, get in the boat.
00:20:34.780
And she ends up on the roof of her house because the water has risen.
00:20:38.600
And she's on the roof of the house and a helicopter comes by and says, get on the helicopter.
00:21:21.260
Anyway, it sounded so much like the joke, I couldn't resist.
00:21:30.200
It's sort of a Twitter competitor, which I'm not going to tell you is credible.
00:21:40.220
Because if I told you it's credible, I would be kicked off of social media.
00:21:51.640
So, there's a woman who claims that she was doing construction on a home that she and her husband own for a couple of years.
00:22:02.140
But you can reach the mailbox, even though it's behind a chain link fence.
00:22:06.760
So, you can sort of reach in and get to the mailbox, I guess.
00:22:09.340
And, she found it was stuffed with address changes.
00:22:15.360
So, a whole bunch of people, allegedly, had changed their address to her address, which is an abandoned building.
00:22:23.700
Now, obviously, the implication is that people are doing things to be able to vote by mail fraudulently.
00:22:35.900
They were just using the mailbox as a change of address place.
00:22:38.680
Now, anecdotally, I heard other reports of this being a ballot fraud thing that people do, etc.
00:22:50.460
You have seen one real person, apparently, describing what looks like a pretty credible claim.
00:22:59.140
Use your critical thinking to answer this question.
00:23:24.020
So, do you remember what I predicted would be true after the 2020 election?
00:23:29.560
If you remember, I said a whole bunch of times, I said this over and over again.
00:23:34.640
I said, every claim you hear about election fraud is at least 95% likely to be false.
00:23:45.200
Now, what was the final outcome of all the 2020 claims?
00:23:53.520
I told you that even if something is true, and I didn't have an opinion at that time, I didn't know if there was some true stuff.
00:24:00.300
But I told you that 95% of what people surface will definitely not be true.
00:24:05.480
So, when you see this new claim, just keep the 95% thing in your mind.
00:24:15.760
If you put a gun to my head and said, all right, you're going to have to bet whether this particular story was true, I want to think it is.
00:24:26.000
But then I have to retreat to my rational mind and say, okay, the fact that it looks true means what?
00:24:33.680
How much credibility do you put on, it's on video, it's a real person, and it looks true.
00:24:47.940
And the first thing you ask yourself is, why is it only on Gettr?
00:24:53.520
Is there some reason that would be not on other platforms?
00:24:57.340
So, it could be, you know, from a different year.
00:25:01.200
It could be an operative who's just, you know, doing an op.
00:25:20.200
Being one of its biggest critics for years, I feel like I have some kind of social obligation
00:25:30.120
And not only today do they have Daniel Dale doing a hard fact check on Biden, and when
00:25:38.520
I say a hard fact check, he just goes right at him.
00:25:42.980
Daniel Dale went at Biden the way you saw him go after Trump.
00:25:58.380
I'm talking about the major claims of the president.
00:26:04.360
So congratulations to CNN for what looks like a successful business pivot.
00:26:15.200
The day before elections, CNN's website ran the following headline.
00:26:22.580
Democrats are out of touch with American voters.
00:26:24.920
That's a headline on CNN the day before the election.
00:26:30.040
Democrats are out of touch with American voters.
00:26:33.000
And then it goes on in the article to say that Republicans are running for office on the
00:26:49.280
Today feels like a Monday to me for some reason.
00:26:57.080
My mind is completely blown by the fact that something's going right.
00:27:14.260
And do you see that they're redesigning in a way that's useful to the country?
00:27:29.160
I mean, it's a big mystery where things will go.
00:27:45.520
So I'm going to root for CNN's ratings to improve because I think they're acting like patriots.
00:27:54.800
In my opinion, CNN is taking a patriot perspective on their business, which is to try to give you the actual news, which is different.
00:28:08.180
But that's another example of something that's going right.
00:28:11.740
Biden, of course, gaffed again and said he wanted to close all the coal plants.
00:28:17.620
And then somebody from his own party, two days before elections, Joe Manchin, is asking the president to apologize.
00:28:24.360
You know, if members of your own party are asking you to apologize two days before the midterms, that's not a good look.
00:28:34.980
Now, a year ago, do you think that Democrats were willing to admit that Biden is brain damaged and useless?
00:28:45.320
A year ago, even the Democrats would have said, oh, stop making fun of his speech defect.
00:28:56.580
There is a completely different tone from the Democrats about their own president.
00:29:10.460
Now, you're going to get people who are still supportive because it's a political world.
00:29:14.240
But let me give you a little story of something that happened to me.
00:29:20.600
I want to see if any of you have had a similar experience.
00:29:25.400
So last night, I went to a large-ish social gathering of people in my town.
00:29:34.400
So I don't want to be any more specific than that.
00:29:39.480
And I live in Northern California, not far outside of San Francisco.
00:29:46.560
So what do you assume is the political nature of my town?
00:29:55.880
I spent an entire night at an event and heard a number of pro-Republican statements from people privately.
00:30:15.820
I don't think there was anybody there who was favorable to the Democrat platform at the moment.
00:30:26.560
If you could put on goggles and see everybody's voting party, I'll bet you they're mostly Democrats.
00:30:43.260
And you don't even see people agreeing with wokeness.
00:30:47.840
At least within, let's say, the suburban family kind of people.
00:30:53.200
And I used to feel like I was in a little bit of danger when I just went out in public.
00:31:03.240
I mean, if you're notable as a Trump supporter, there was a time when it would have been dangerous; I was a little wary even just going to the grocery store.
00:31:14.760
Because I thought I'd be confronted with somebody.
00:31:19.380
But I thought somebody would get in my face, you know, throw a milkshake on me or some bullshit like that.
00:31:24.940
But I don't get any pushback anywhere where I live.
00:31:41.700
There is not a single instance of somebody in my own community who's told me I've, you know, gone too far or I'm a Nazi or I'm on the wrong side.
00:31:56.400
Are you seeing that in mixed gatherings there is no Biden support or even Democrats support, really?
00:32:12.740
And a lot of people don't care about that as much as you imagine they do.
00:32:16.600
Yeah, there was, you know, a little talk of Trump, but even he's not the important thing these days.
00:32:32.100
There's a clip on Twitter of MSNBC interviewing a Democrat governor candidate or governor, Kathy Hochul.
00:32:42.900
Now, imagine MSNBC interviewing a Democrat governor two days before an election.
00:32:58.660
You think that would be really just a friendly, right?
00:33:01.360
They'd be trying to get Hochul over the finish line?
00:33:10.160
I didn't know what I was looking at there for a moment.
00:33:16.040
Yeah, I didn't catch the name of the host, but credit to the host.
00:33:21.800
The host said, you know, I guess the context was a crime.
00:33:32.020
I walk into my pharmacy and everything is on lockdown because of shoplifters.
00:33:45.480
Did MSNBC just drive a stake through her fucking heart?
00:33:51.820
It looked like even MSNBC is like, we can't even do this anymore.
00:34:01.600
I think they just drove a stake right through her heart on live TV.
00:34:06.460
Because, and again, if somebody knows, if you saw the clip, I'd like to give a shout out to the host.
00:34:26.940
I wish somebody would say her name just so we could give her credit.
00:34:34.400
Jack Dorsey has apologized for growing Twitter too quickly.
00:34:42.900
And therefore, you know, being a participant, I guess he would say, in how many people had to get laid off.
00:34:49.640
Because he thinks it grew a little too quickly.
00:34:57.180
That's sort of a, it's kind of the context you wanted to hear, right?
00:35:01.360
Didn't you wonder why there were so many people?
00:35:05.220
And could you actually get rid of a lot of people and still run the company?
00:35:11.700
If grew too fast is a true statement, and it certainly looks that way,
00:35:16.840
then probably there's a little bit of, or maybe a lot, of, you know, cushion, stuff they can cut.
00:35:26.040
You know, I was trying to imagine what it would be like to be Jack Dorsey and know that all these layoffs are happening.
00:35:37.440
Like, that's got to be, it's just got to be crushing.
00:35:46.920
But I had to close a restaurant that wasn't working out.
00:35:50.500
And, you know, I had to tell the staff that they were all fired, you know, with severance and stuff like that.
00:36:02.700
It was doubly a bad day because as I was telling them that I would, you know, treat them right and everything,
00:36:09.060
some others were actually robbing the storeroom of all the electronics.
00:36:13.600
As I was talking to the staff, they were robbing, they were actually robbing the restaurant.
00:36:27.660
Musk met with some civil rights leaders, the Anti-Defamation League and NAACP.
00:36:35.620
And I saw an interview by the head of the NAACP, Derek Johnson.
00:36:41.360
Now, interestingly, the head of the NAACP appears to be, and I don't want to make an assumption here,
00:36:52.400
It's a weird thing about 2022 when I find out that the head of the NAACP is at least apparently black.
00:37:04.820
But I would like to give him a shout-out because you know how you get an immediate reaction to some public figures?
00:37:14.420
Do you remember when, maybe you don't remember it, but when Barack Obama burst on the scene,
00:37:20.500
the first time you saw him give a speech, even if you were a Republican,
00:37:25.620
you probably said to yourself, ooh, there's something special there, right?
00:37:33.440
And the special thing was, he wasn't doing the thing you expect.
00:37:42.560
He didn't run as a black guy, which he did better than anybody ever did, right?
00:37:49.080
Obama was just the master of using race without using it.
00:37:54.820
Because it was there, you could see it, you could make your own decision.
00:37:58.440
But if he'd even said once, you should vote for me because I'm black, he would be done.
00:38:05.920
That's absolutely, you're off the list if you say that.
00:38:10.580
But he never did, because he's very good at this stuff.
00:38:13.700
So I want to give, like, a similar shout-out to this Derek Johnson.
00:38:17.800
I'd never seen him before, but he was talking about the meeting,
00:38:22.340
and he first said that he thought Musk agreed with the people who were there.
00:38:28.820
So no confrontational anything, just it looks like he agreed.
00:38:32.400
But there was a question of whether Musk would be able to execute,
00:38:35.580
which is exactly the best question you could ever have.
00:38:44.920
That's like what he does better than anybody who's ever done anything,
00:38:51.380
maybe anybody in the whole world.
00:38:56.020
But here's why I want to give Derek a shout-out.
00:39:01.060
and he talked about how he wanted to make sure that, you know,
00:39:03.180
Twitter was protecting communities that, you know, were marginalized.
00:39:26.500
without me hating him for saying I'm bad or something.
00:39:34.680
And he didn't even talk about it like in the normal team play.
00:39:44.020
And I've got to say, I want to see more from him, Derek Johnson.
00:39:48.660
Because one of the things I've been saying forever
00:39:50.440
is that the black American population doesn't have a good leader.
00:39:54.800
You know, somebody who would appeal to whoever they're trying to persuade
00:39:59.400
at the same time as appeal to the black community.
00:40:12.660
This is another thing that, in my opinion, is going right.
00:40:15.080
In my opinion, and this will be really controversial,
00:40:20.340
racial, what would you say, harmony, is actually improving.
00:40:36.160
Because we're wising up to why it happened in the first place.
00:40:39.260
We're wising up that it was the media that was causing it.
00:40:42.020
And I think people are wising up that everybody wants everybody to do well.
00:40:49.540
If there's one thing that you can say about Republicans
00:41:00.280
There are no Republicans who want black people not to do well.
00:41:07.720
They don't want to necessarily be the ones that pay for it.
00:41:17.100
if there was something that made sense to help on.
00:41:26.720
Because, you know, a black parent and a white parent,
00:41:30.680
when they're talking about educating their kids, same page.
00:41:42.560
I don't think about race unless I'm in social media.
00:41:47.500
When was the last time you had a real-world racial anything?
00:41:57.100
Now, I live in a pretty harmonious place in the country.
00:42:01.560
But when was the last time you had a racial confrontation?
00:42:12.380
It's literally not on my consciousness any time except social media.
00:42:18.860
So this is weird and wonderful and mind-blowing
00:42:30.300
that Twitter had sold somebody a blue check for $15,000
00:42:36.280
and that there was a system, sort of an underground system,
00:42:40.340
where people could get to some Twitter employee
00:42:43.300
who would illegally, I don't know if it's illegal,
00:42:46.980
but would offer to sell them a blue check, you know,
00:42:52.980
And apparently a number of people took advantage of it.
00:43:04.820
Absolutely confirmed that Twitter had been selling blue checks.
00:43:12.800
Because, and it blows my mind on several levels.
00:43:17.660
Watching Elon Musk fact-check the world in real time
00:43:26.820
you have to wait until they give a speech, right?
00:43:29.800
Or wait until the statement comes out or something.
00:43:34.580
there'll be something that's just like a complete lie,
00:43:37.140
or in this case, something that's weird but true,
00:44:07.520
the most important thing that's happening lately.
00:44:35.780
and I love that he apparently cares about the country
00:44:38.520
and people and wants good things for all of us,
00:45:25.000
Is that the worst take you've ever seen in your life?
00:45:56.300
he's saying that it's not a good look for Elon.
01:18:25.020
Well, that's the first question you're going to ask me
01:18:36.620
By modern retired, I mean I've simply stopped doing
01:18:40.300
the things I found unpleasant that I did for money.
01:18:49.220
I was adding up how many hours per week I work now
01:19:02.580
because, you know, 60 to 70 would be more normal.