Episode 2400 CWSA 03/01/24
Episode Stats
Length
1 hour and 38 minutes
Words per Minute
140.5
Summary
In this episode of Coffee with Scott Adams, I talk about the Octopus Murders, the CIA takeover of Dilbert's company, and Pope Francis. I also talk about artificial intelligence and why we should be worried about it.
Transcript
00:00:00.000
Now I want to tell you that you don't have to be a racist to use YouTube or any of the other
00:00:08.980
Google products, but it does help. It does help. If you'd like to take this experience up to levels
00:00:17.120
that even racists like Google can't understand, all you need is a cup or a mug or a glass of
00:00:22.340
tank or chalice or stein, a canteen jug or flask, a vessel of any kind, and fill it with your
00:00:26.140
favorite liquid. I like coffee. Yay. Join me now for the unparalleled pleasure of the dopamine
00:00:32.980
hit of the day. The thing that makes everything better. It's called the simultaneous sip. It happens now.
00:00:37.220
Oh, that's good. That's good. Well, let's see what's happening. So Netflix apparently has some
00:00:51.180
news series. Texas Lindsay is posting about this. And apparently it's a reality-based thing
00:00:58.200
based on the allegation that there are eight ex-CIA people who run the world.
00:01:09.860
You'd better watch it quickly, before Netflix coincidentally goes out of business, or possibly there's a terrible terrorist attack on
00:01:19.580
their data center. Because if any of this is true, they're all going to be killed. And it's called
00:01:25.760
The Octopus Murders. So I'm guessing, after watching enough Mike Benz material about how the country is
00:01:34.920
really being run, that this will be really interesting. Now, the allegation is that the octopus, which is all the
00:01:43.160
tentacles of the, you know, the shadowy people, apparently they can just murder anybody they want.
00:01:49.300
You know, I'm going to watch that. I'm all in on that. Netflix. I'd be, well, I'm just kind of amazed
00:02:00.420
that it even exists. Are you kind of surprised that that even exists and got on, got onto Netflix? Anyway,
00:02:09.060
speaking of that, by coincidence, if you were a subscriber to the Dilbert comic, which is the only
00:02:14.180
way you can watch it now, either on X you can subscribe or you can get lots of my political
00:02:19.100
stuff plus the comic on Locals, scottadams.locals.com. But today you'd be watching the CIA doing a
00:02:27.560
takeover of Dilbert's company. How long do you think it takes the CIA to completely dominate and take over
00:02:34.480
Dilbert's company? It goes like this. "Have you ever been to any sketchy islands?" "Not since Epstein died."
00:02:45.260
"Here's a video of you in a blue dress." And we're done. Ten seconds. That's all it took the CIA to
00:02:54.920
take over Dilbert's company. Now, while the robots are coming, I guess OpenAI is going to use a company
00:03:02.400
called Figure to make the robots that their AI will go into. So they're calling it Embodied AI.
00:03:09.940
Embodied AI. Embodied AI. We don't need a new name for robots, unless they're smart robots. I mean,
00:03:18.720
do we really have to call them smart robots? That's sort of like, do you remember years ago when they
00:03:24.420
called computers multimedia computers? And you knew that would go away because all computers are
00:03:32.140
multimedia, you know, eventually. So it seems to me since all robots are going to be AI, we don't need
00:03:38.940
the word Embodied AI pretty soon. Maybe by the end of the year, that one will be gone. And what do you
00:03:46.400
think? Is this the beginning of the robots taking over? Are you worried that there will be real robots
00:03:55.880
that are doing their own thinking? I'm going to say no. I'm going to say no. I think that we will
00:04:04.880
easily handle that risk. I'm not worried about the robots. I think, you know, we know how to make
00:04:13.840
consumer goods that, you know, get rid of most of the risk. I worry about AI that's in the system.
00:04:20.180
You know, some AI that gets loose in the internet in general, and it's disembodied. So I'm not worried
00:04:26.580
about embodied AI, because I think we can stop them by stopping the body. We'll probably have,
00:04:31.940
you know, a kill switch on the physical body. But I do worry about AI that's a virus someday.
00:04:39.320
That seems like it'd be a problem. And I think there's some report now that there is an AI virus
00:04:44.920
that can hop from one AI to another. That's pretty scary. So I'd be worried about that.
00:04:54.780
The Pope has made a strong stand against gender ideology, he calls it. Pope Francis.
00:05:02.660
And I realized that if I had not read that the Pope's name is Francis, I don't think I would have
00:05:08.000
remembered it. How many of you, how many of you, if you were quizzed on the street, would know that the
00:05:14.140
name of the Pope is Francis? It's Pope Francis. I'm just curious. How many of you would have known
00:05:18.420
that? Now, is it because I'm just not paying attention and all Popes look like the same guy
00:05:25.800
to me? A lot of you would have known it. Okay. I'm kind of impressed. I suppose if you're Catholic,
00:05:31.940
it's easier. Yeah. But I didn't even know his name. It's weird. Is that because the Pope is that
00:05:39.740
much less important? Or this one's less of a superstar? Oh, it's probably because he's less
00:05:46.080
of a superstar. He's not going around, you know, doing a lot of fancy trips and stuff.
00:05:52.760
But anyway, he says that gender ideology is the worst danger of our time. It makes everything the
00:05:58.540
same. And I guess it gets rid of the special, what do you call it? The special tension between the
00:06:04.920
sexes that is productive. Well, that's the Pope's point of view. Do you think gender ideology is the
00:06:12.740
worst danger of our time? That feels like a bit of an exaggeration. Really? I would say DEI is more
00:06:21.780
dangerous. Because I think there's a natural limit on the gender ideology stuff. I think it's mostly
00:06:28.880
limited to the mentally ill and, you know, some very small percentage of people who have something
00:06:35.320
that's more legitimate, you know, like they're born different. Would you agree? I think gender
00:06:42.120
ideology has a natural cap. There's a limit to how brainwashed you can get. I don't think you're
00:06:51.480
going to brainwash the average person. And I think here's another, I don't know if this is a good
00:07:00.100
comparison, but did you know that the percentage of young women who are vegetarians in college
00:07:07.040
is like super high? Did you all know that? When young women go to college, the number, this is like
00:07:14.380
a third, like one third of them are vegetarians or something. It's a real high number. But then as soon
00:07:19.580
as they get out, you know, it normalizes with the rest of the population. Probably the same with
00:07:24.460
same-sex experimentation. You know, there's a lot of stuff that happens when you're young that you
00:07:30.140
decide not to do later. Anyway, I see his argument. I don't know that it's the biggest problem. I feel like
00:07:39.420
it's naturally limited, but we'll see. I think DEI is the bigger problem.
00:07:47.200
Glenn Greenwald is talking about something that could be called Russia derangement syndrome,
00:07:52.040
but he hasn't used that phrase. That's my own. And he's talking about a few current examples. So
00:07:59.220
there's somebody on the X platform who says that Senator Schumer should investigate Speaker Johnson
00:08:07.820
because he's working with Putin. So now we're blaming the Speaker of the House, you know, who's
00:08:14.680
never had any kind of suggestion of that kind of impropriety. Suddenly he's working with Putin
00:08:21.060
because everything you don't like is Putin. And then separately, Pelosi recently called on the FBI,
00:08:29.340
this is Greenwald saying this, to investigate pro-Palestinian protesters to see if they have Kremlin
00:08:35.780
ties. So now we think the pro-Palestinian people are maybe Russian inspired. Maybe Johnson is being
00:08:45.320
run by the Russians. And as Greenwald points out, it's a form of mass mental illness. But is it?
00:08:54.060
Do you think this mass mental illness is sort of the same as Trump derangement syndrome? Because
00:08:59.540
this feels a little different. You know what it feels like? It feels like more evidence that humans
00:09:06.840
are no different than AI. And that what we think is our thinking is really large language model patterns.
00:09:17.120
If you just dropped in, you know, an unprogrammed person, like a human being, you just dropped him into
00:09:24.220
2024. And you just say, you know, listen to how people are talking, you know, try to form some
00:09:30.200
opinions. You would find that the discussion is so thick with lies that usually people know they're
00:09:39.360
lying. You know, remember the original Russia collusion thing, the people who created it knew
00:09:46.560
they were lying. I imagine a lot of the people who spread it knew they were lying. So the large language
00:09:53.360
model would be trained on nothing but lies about Russia. Would you agree? If you trained a large
00:10:02.000
language model, explain Russia to us, it would be trained on 100% lies, because that's all we ever
00:10:10.120
hear about Russia. We never hear anything real about Russia. It's all lies. Or it's all, you know,
00:10:14.940
put into context that's friendly to the people telling the story. So how would people be different?
00:10:21.300
Everything that a human learns is by looking at what else is going on. You know, they're looking
00:10:27.520
at the total society, what people say. And it's exactly the same as AI. So I'm not sure this is a
00:10:34.020
form of mass hysteria, the way Trump derangement syndrome definitely is. This feels different.
00:10:41.340
It feels that there are so many lies on the topic, that it has become essentially embedded in our
00:10:49.580
intelligence, that the first thing you suspect is Russia. It's just on the top of your mind. So I
00:10:57.020
think it's really a top of mind problem. And the fact that there's so much repetition of these similar
00:11:03.220
kinds of lies, but it doesn't look like mental illness to me. Trump derangement
00:11:08.740
syndrome literally is mental illness. Literally, it's a mass hysteria. This doesn't look like it.
00:11:15.040
The Russia stuff looks like some combination of pattern and lying and, you know, something a little
00:11:21.780
more ordinary, but terrible. Terrible. It's a problem, but more ordinary. It feels. Well, here's something
00:11:31.980
that's not ordinary. And apparently I became a semi-viral clip because I doubted that this could
00:11:40.400
possibly be real. And I still do because I can't wrap my head around it. I just can't wrap my head
00:11:48.520
around the idea that this could be real. End Wokeness on the X platform says this: Canada is
00:11:58.540
trying to pass a new law that will crack down on hate on the internet and other telecommunications.
00:12:04.380
And the bill would allow judges to sentence a person to life in prison for advocating genocide in words.
00:12:18.460
So if you went on Twitter and said everybody in some category should be killed, you would go to jail for
00:12:27.340
life, for your opinion. Now, I think you're a very bad person if you think everyone should
00:12:34.300
be killed. But let's game this out a little bit. So let's say that you say that the situation in Gaza is a
00:12:45.720
genocide. Now, I'm not going to agree or disagree. I'll say, for example, that Erdogan from Turkey says it's
00:12:54.600
approaching a genocide or is a genocide. A lot of commentators are calling it a genocide. Again,
00:13:01.560
not my opinion. I'm not giving you any opinion on that. I'm just describing it. So suppose we get to
00:13:08.280
the point where it's a common opinion that what's happening in Gaza, let's say it's a common opinion in
00:13:17.480
Canada, hypothetically, that whatever's happening at some point starts to look like a genocide.
00:13:24.440
And people start using that word. And then what happens, let's game it a little bit further.
00:13:31.240
If somebody goes on and says, you know what, goes on social media and says, I think what the Israeli
00:13:37.820
military is doing is exactly right, because in my opinion, let's say it saves more lives later,
00:13:44.940
or there was nothing else you could do, whatever the opinion is. Again, not my opinion. I'm giving
00:13:49.980
you some examples. Would you be very close at that point to having advocated genocide?
00:13:57.020
Because other people, including the head of Turkey, have said this thing's a genocide. And then you just
00:14:05.420
said you're in favor of it. You don't have to use the word genocide. Apparently, you just have to be in
00:14:12.300
favor of it on social media. Now, did I just describe a case where everybody who said a pro-Israeli thing
00:14:19.900
could be sentenced to life in prison? I think so. Is that too much of a stretch? If it is commonly
00:14:30.380
understood to be a genocide, I'm not saying it is. I'm saying if Canada, you know, decides government-wise
00:14:36.940
it's totally a genocide, and that I say Israel is doing the right thing, do I go to jail for life?
00:14:45.340
Because I accidentally was in favor of a genocide, according to somebody else's definition of that
00:14:53.100
word? Well, this is the worst looking thing I've ever seen in my life. Now, let's take me.
00:15:03.420
If you follow the news, you know that the news said that I said something terribly hateful about black
00:15:10.060
Americans. That's the way the news reported it. What's the truth? What I said was, oh,
00:15:17.740
based on this professional survey, a very large percentage of black Americans seem to have a bad
00:15:24.140
opinion about me for being white. Now, that's either true or not, and so I said if. If it's true,
00:15:33.020
you should stay away from people who have a bad opinion of you. Do I go to jail in Canada for saying
00:15:38.620
that? That you should stay away from people who are trained by DEI and CRT and DSJ, actually trained
00:15:45.900
by the government, officially trained by the government to see me as the problem, and that
00:15:52.300
I have their money and need to give it back, reparations, etc. Now, you should stay away from
00:15:57.900
that situation. You should run away from it if you can. I don't know how you could, really. It's hard to
00:16:02.620
get away. You'd have to cease that or something. But it has nothing to do with any quality of black Americans.
00:16:08.220
It's not about genetics or culture. It's literally about white people brainwashing a segment of the
00:16:15.180
population to hate me. Because if it weren't for the white people, there wouldn't be any ESG and CRT
00:16:23.820
and DEI. It would be this marginal little idea that people ignored. It's white people I was complaining
00:16:31.500
about. Did anybody understand that, by the way? I don't know. I don't think I've ever said it as
00:16:36.540
clearly as I'm saying it now. There would be no problem if white people weren't the main cause.
00:16:44.620
So I think people took it as an attack on black America, because that's sort of the
00:16:49.100
bad wording I used. I said, stay away. But not stay away because I don't like them.
00:16:54.460
I love black people. I've had only positive experiences in person. Only. In person, 100%
00:17:03.740
positive experience. I like anybody who's got an interesting story. I like anybody who's nice to me.
00:17:10.540
I like anybody who's got a sense of humor. And that usually covers most people. Most people are nice to
00:17:16.540
you if you're nice to them. So I love black people. But if the government is going to weaponize them
00:17:22.940
against me, and when I say the government, I'm not talking about black people in the government.
00:17:27.900
I'm talking about mostly white people promoting it and letting it happen. So my problem is specifically
00:17:34.620
with my people. My complaint is my people. The problem that my people are causing is a rift
00:17:42.940
between me and black Americans that I would very much like to enjoy, you know, a full relationship
00:17:48.060
with. But it's very hard when my people are teaching black Americans that I'm the problem.
00:17:57.100
How am I supposed to navigate that? So I say I'll stay away from it because I don't know how to
00:18:01.740
navigate a situation where my people are telling some other group they should hate me. What do I do
00:18:08.380
about that? You should get away. Try to find some way not to be in that situation, whatever it takes.
00:18:16.460
Now, according to me, I've said nothing that is hate speech, only good advice that basically
00:18:25.660
anybody would agree with. If you explained it the way I just explained it, literally nobody would
00:18:31.180
disagree. Nobody. But if I traveled to Canada or became a Canadian citizen, let's say I'm just a
00:18:39.180
visitor. Could I go to jail? How does that work? I mean, it's a law in Canada. I don't have to be a
00:18:48.380
Canadian citizen, do I? What if I'm simply in Toronto and I send out a post from Toronto?
00:18:56.540
Do I go to jail? The fact that I don't know the answer to that question, at the same time we're
00:19:05.020
watching Russia, I'm sorry, I called Canada Russia because it's so similar. Watching them go after
00:19:14.220
Jordan Peterson for his free speech, trying to take away his license. I would never travel to Canada
00:19:21.420
for tourism. Canada is actually completely off my list as of today. Before that, I would have said,
00:19:29.100
eh, they're not going to bother me. I'm just a tourist. I feel sorry for the citizens, but
00:19:35.420
if I just want to visit, that's no problem. But at the moment, if this were to pass,
00:19:41.820
I would never go to Canada. I would feel that that would be literally a risk of jail,
00:19:47.180
and it would be a high risk. It wouldn't be a low risk. It looks like it would be a pretty high risk
00:19:51.420
because if you're a political person, the feelers are out.
00:19:58.300
So Canada, you just destroyed your tourism for people like me. Elon Musk says that Grok, the AI he's
00:20:07.660
created for the X platform, will soon be able to summarize complicated legislation so you know what's in it.
00:20:20.220
You might actually understand what you're getting. And it's funny that we would have to use AI
00:20:28.300
to untangle the confusopoly that our government intentionally creates, it's all intentional,
00:20:35.340
so that we don't understand what they're doing. We actually have a government that's a confusopoly,
00:20:42.300
meaning that we can't really participate because they're lying to us about everything.
00:20:47.900
We don't even know what we're talking about most of the time. We don't have real facts.
00:20:52.700
What they tell us is too confusing to understand. All the lawfare against Trump is so confusing now.
00:20:59.100
If you're not actually a lawyer, it's really hard to follow 91 indictments,
00:21:05.260
four cases, several venues, different schedules. Yeah. So what if Grok could summarize the bills?
00:21:15.500
Do you think he can? Do you think Grok at any time in the future,
00:21:22.940
with any amount of correction and tweaking, could summarize legislation?
00:21:29.740
I'm going to say no. I'm going to say it's going to fail.
00:21:35.900
It's going to fail because AI is trained on humans, and humans can't do this.
00:21:50.380
In other words, if a human looked at one of these bills, and let's say we were smart enough,
00:21:57.500
and AI is smart enough, to know what all the parts are, so we can at least sort out what it's doing.
00:22:05.580
What would you do then? Once you understood it, what would you do?
00:22:10.220
Suppose you had to describe it. As soon as you go to describe it, which is the useful part,
00:22:17.900
like this does this or this does that, it's a narrative.
00:22:23.660
So AI can't do narratives, or if it did, it shouldn't, because the narrative is the brainwashing,
00:22:30.940
the propaganda, the framing. We don't want it to do that, but it can't avoid it. It doesn't have the option.
00:22:40.460
So you're either going to have one that's biased by its creators in one way, or biased by its creators
00:22:45.740
in another way. Suppose the bill said something like Biden is saying.
00:22:54.460
So Biden is saying the Republicans are to blame because they didn't approve the border bill that would
00:22:59.740
give funding that would be very useful to reducing the illegal flow. So what would Grok say about that?
00:23:12.620
Would Grok say they're asking for some money for the border and just leave it at that?
00:23:20.460
Because you wouldn't know if that's good or bad.
00:23:23.660
Right? Suppose it said it's money for the border that definitely will make it easier to stop illegals.
00:23:30.140
Is that accurate? Because I thought they do a clever thing where they say everybody's legal.
00:23:42.140
You see where I'm going with this? We play this game where we call the illegal migrants legal,
00:23:49.660
because we just make them go through a certain doorway, where if you say, can you say the word asylum?
00:23:56.060
Ah, asylum. Good enough. You're legal now. You see what I mean? So we turn them from illegal to legal
00:24:06.460
simply by saying you're legal now. That's all we do. Now, would Grok pick that up?
00:24:12.540
Would it know that the purpose of the bill, the increased border funding,
00:24:19.900
would it know that the secret real reason for that is not to decrease anybody? It's basically just to
00:24:29.180
bring them in under the legal umbrella, but it's all the same. Whether they came in and gave their
00:24:35.740
name or not, it's all the same. So do you think Grok could ever learn to suss out what is the trick
00:24:44.380
and what is the real implication? I say no, because the only access it would have
00:24:51.660
would be narratives that we've created. So the best it could do is say, here's what it says.
00:24:58.220
I've simplified it. Here's what the Democrats tend to say about it. Here's what the Republicans
00:25:04.780
tend to say about it. But it's not going to go further than that. And you know that's all that
00:25:09.980
we have on TV right now. If I turn on the TV and say, hey, tell me about this border bill,
00:25:16.700
I'm going to hear Republicans say this, Democrats say that, and I don't know what's true.
00:25:22.620
I just know what they say. So I think the problem is that we imagine that AI is intelligent in some
00:25:30.860
independent way. It's not. It's just a reflection of us. And we imagine that it can solve a problem
00:25:37.500
that is sort of data factual when that was never the problem. The problem is not just understanding
00:25:46.300
the details of what it says. It's all the narrative interpretation, you know, who's in on this,
00:25:53.420
who gets helped by it, who gets elected because of it. That's all the real stuff. That's the stuff
00:25:59.580
you've got to understand, and Grok's not going to get you there. All right. Tucker Carlson
00:26:08.620
is saying that the Biden administration helped install a pro-Chinese government in Brazil,
00:26:15.500
which immediately shut down opposition media and began arresting dissidents.
00:26:19.020
They talked to two of its victims. All right. Every part of the statement makes your eyes go,
00:26:27.340
whoa. Wait a minute. The Biden administration installed a government in Brazil?
00:26:35.020
We installed the government of Brazil? Of course we did. Because we install all the governments in
00:26:42.300
our hemisphere. And if we don't, the CIA has to get fired because that's their job.
00:26:47.820
Yeah. The job of the CIA is to make sure that our hemisphere
00:26:51.660
is kind of on our team. And everybody knows it, right?
00:26:56.860
So I guess, you know, nobody should be surprised at that. But it's just funny to see it in writing.
00:27:02.060
You know, you're not supposed to write it down. You're just supposed to sort of know it.
00:27:05.900
But was it a pro-Chinese government? I don't know about that. I suppose that would be an opinion.
00:27:17.340
But I would like to add that why do we think that our government is better than that?
00:27:24.300
You know, the way it's framed is, oh, you know, too bad they don't have our freedoms here.
00:27:28.700
Am I wrong that we don't have anything like a free press? We don't have freedom of speech. Not
00:27:37.180
even close. We have in America the freedom to say things that don't bother people who matter.
00:27:43.420
We have the freedom to say things that don't bother the people who matter.
00:27:49.260
If it bothers them, you will get canceled. And you just won't be able to talk to anybody anymore.
00:27:55.580
Because it bothers the wrong people. We don't have any freedom of speech here.
00:28:01.820
Not even a little bit. We have only the freedom not to bother important people.
00:28:12.460
So we have fake news here that, I mean, basically, most of the press is controlled by the CIA,
00:28:20.700
it looks like, at least for the geopolitical stories. And so we have the freedom to be part
00:28:27.820
of a fake news entity. And we have the freedom to listen to fake news. And on social media,
00:28:34.140
we know that we're being suppressed if the government doesn't like us. So in my case,
00:28:41.180
I'd love to know if Elon Musk could even answer the question of why I'm in such a silo
00:28:46.220
that no Democrat ever sees my material now on social media. I have over a million
00:28:53.260
followers. And I don't think anybody who's a Democrat ever sees my material. The crossover is so little,
00:28:58.220
it's trivial. So what is my point of even talking? I have no persuasive capabilities,
00:29:05.180
because I don't cross over to anybody who disagrees with me. I end up talking to people who largely agree
00:29:10.380
with me. And maybe we'll learn something or a better way to say it. But yeah. Is there anybody
00:29:16.700
here who's a Democrat that is here because they want to hear a different point of view?
00:29:27.340
I'm just looking at, see if there's anybody here. Oh, we have an ex-Democrat. That's as close as we
00:29:34.540
can get. Yeah. No. So we have fake freedom of speech because you can't get your speech to anybody
00:29:42.700
who disagrees with you. And talking to people who agree with you is, like, not even
00:29:48.380
close to free speech. What other fake freedoms do we have? Well, let's see. Catherine Herridge,
00:29:58.140
the reporter who was fired from CBS. And she was one of the people who reported on the Hunter laptop
00:30:05.660
story, et cetera. And so she might be too much of a truth teller, people are suspecting. And now
00:30:12.940
there's a judge that's holding her in civil contempt for refusing to divulge her sources. Does that sound
00:30:21.900
like freedom of the press? Freedom of the press? Does it sound like the courts are doing their job,
00:30:31.980
just objectively doing their job? Well, let's find out a little bit more about this court.
00:30:38.380
Let's see. Chuck Ross, who's at ChuckRossDC on the X platform,
00:30:49.740
he says the same judge, an Obama appointee, so the judge who's holding Catherine Herridge in civil
00:30:58.860
contempt, that same judge, an Obama appointee, blocked special counsel John Durham from entering evidence
00:31:05.500
against Clinton lawyer Michael Sussmann that would have detailed their plot to leak false Trump-Russia
00:31:11.500
info into the media. Huh. Okay. Well, you know, it's a small town. You know, everybody is an important
00:31:19.260
lawyer, an important judge. They've probably had contact with other important cases. So I mean,
00:31:24.540
by itself, that's not too alarming, is it? That's not too alarming. So why would we be worried about
00:31:31.100
this judge being some kind of like biased judge? Oh, also from Chuck Ross, the judge, his name is
00:31:38.460
Christopher Cooper. He's married to a Democrat lawyer, Amy Jeffress, who represented Lisa Page.
00:31:51.980
From the Russia collusion situation? Yeah, Strzok's girlfriend.
00:32:00.780
So now let me ask you this. When was the last time you could not predict the outcome of a trial
00:32:13.020
just by knowing who appointed the judge? I don't remember a time. I'll say it again. When was the last time you didn't know the result of the trial
00:32:19.900
just by knowing who appointed the judge? It's pretty much every time, you know, right? For the political
00:32:27.340
stuff. You know, not so much for the ordinary crimes. But why do we have a court
00:32:34.540
if we always know how it's going to go based on who took the case as the judge?
00:32:41.100
That's not even close. That's nothing like freedom.
00:32:45.420
That's not even in the general neighborhood of any kind of a justice system that's like real.
00:32:52.860
I don't know what it is, but it's not any kind of real justice system.
00:33:14.240
And he found out the gym where Judge Engoron works out.
00:33:25.760
And his disguise is he puts on a Tim Pool beanie.
00:33:36.240
I'd like to give you now my impression of James O'Keefe going undercover.
00:34:03.520
I swear to God, he could go undercover just by getting a haircut.
00:34:11.200
He could put like an earring in his ear and go undercover.
00:34:18.880
And I don't know, it just gets funnier and funnier.
00:34:21.520
Every time he goes undercover, there's something about it that just gets funnier.
00:34:29.120
The scoop he came away with is that the judge is kind of scrawny
00:34:40.020
You know, he's a certain age, so I mean, it's fine.
00:34:43.360
But then the other part of the story is that somebody else in the gym
00:34:48.420
says that the judge is always creeping on the women.
00:34:52.220
And there's this grainy video with no audio, I think,
00:34:58.720
where a woman put her hand out like she's, you know, telling him to go away.
00:35:06.220
Probably it was just a hand gesture for whatever she was talking about.
00:35:29.520
So, of course, Biden and Trump visited the border
00:35:57.160
because they were chanting Trump, Trump, Trump?
00:36:19.520
Trump could stand there with voter registration forms
00:36:41.560
just stand there and hand out Republican ballots.
00:37:03.140
You can sign up to be a Democrat or a Republican.
00:37:25.120
where your teacher can change the gender of your child
00:38:00.840
And the Democrats would like the right to kill you
00:38:48.800
And it would scare the crap out of the Democrats.
00:40:24.140
could make three out of four migrants Republicans,