Episode 798 Scott Adams: The Clearview AI app that Identifies Criminals, With David Scalzo, Plus Impeachment
Episode Stats
Words per Minute
167.8
Summary
In this episode, we're joined by David Scalzo, founder and Managing Partner of Kirenaga Partners, a venture capital firm and investor in Clearview AI, an app that allows law enforcement to identify a person from a photograph.
Transcript
00:00:00.000
bum bum bum bum bum bum bum hey everybody come on in we got a good one today it's gonna be
00:00:07.420
it's gonna be lit it's gonna be off the hook it's gonna be amazing today we're gonna have
00:00:13.560
a special guest which i'll introduce in a moment but first things first right that's right first
00:00:21.340
things first and i think you know what's coming up first um we're a little low on followers for some
00:00:30.940
reason so let me see if there's uh i think i've got a bug here i'm gonna tweet going live
00:00:39.360
going live now on periscope i'm gonna tweet that out because i think the notification failed or
00:00:48.800
something well happy weekend and i think i know why you're here at least part of the reason you're
00:00:56.980
here yes i do it's because you'd like to be here for the simultaneous sip i usually wait for a thousand
00:01:04.220
people but is everybody watching the uh maybe you're watching the impeachment or something
00:01:09.220
somebody says i didn't get a notification yeah i think there's something wrong with the notifications
00:01:15.340
and it might have something to do with having a guest on so i don't know exactly but we're going
00:01:22.360
to find out anyway first first if you'd like to participate in a simultaneous sip all you need
00:01:30.960
is a cup or mug or a glass a tankard chalice or stein a canteen jug or flask a vessel of any kind
00:01:35.640
fill it with your favorite liquid i like coffee and join me now for the unparalleled pleasure the
00:01:43.080
dopamine hit of the day the thing that makes everything better the simultaneous sip go
00:01:47.220
yeah yeah that's a good one yep just as good as i thought now if technology is working
00:01:58.200
and i believe it is i'm going to activate my guest and give you a little uh let's see i'm gonna pick my
00:02:08.560
guest and we're going to be talking to david scalzo founder and managing partner of kirenaga i hope i
00:02:19.500
am pronouncing it correctly if you're wondering why we're going to be talking to david scalzo
00:02:24.580
it's because he's an investor in a company called clearview ai which is all over the news front page
00:02:32.160
of the new york times controversy everywhere it's the app that allows law enforcement at the moment
00:02:39.720
law enforcement is using it to take a picture of anybody could be a dead body a john doe a person on
00:02:46.440
the street that they've stopped and it will tell you tell the police who it is it will give the
00:02:51.740
identification from a photograph scary we'll find out david can you hear me i can hear you good
00:03:00.700
morning how are you doing scott i'm i'm doing great good morning now i'm correct that you are
00:03:06.240
an investor in clearview ai right yeah absolutely yeah we're uh we're venture capitalists uh the name
00:03:12.680
of our firm is kirenaga it's actually a japanese uh name a japanese term but uh before we get started
00:03:19.060
i just want to say you know i'm an engineer by training and when you're an engineer by training
00:03:23.820
you know there's two people in the world you have a man crush on one is elon musk and the other is scott
00:03:28.860
adams so you know i i really appreciate the fact that i'm able to be here with you today
00:03:34.060
well well thank you uh i did notice from your biography that you've got two degrees in engineering
00:03:39.780
engineering and then engineering management and then you've got a an mba from a top school in the
00:03:45.500
country so you're sort of an example of what i talk about in terms of talent stacks you know you've
00:03:50.880
you've combined uh different skills together so that you've just got a greater view of the world which
00:03:57.080
brings us to clearview ai you put money into that your firm did tell tell us uh well first
00:04:04.380
describe what the app is doing today so that we can get a little bit of context here absolutely so
00:04:11.900
there is an explosion of digital information out in the world uh more and more people are being
00:04:17.340
are connecting to the internet and everyone or a lot of people are posting information it could be
00:04:22.920
information on academic journals it could be uh videos of uh it could be people doing tiktoks
00:04:28.800
um it could be philosophy sports religion charity everything people are putting on there and some of
00:04:34.680
the most powerful tools for humans to improve ourselves to live a happier healthier life
00:04:41.100
is using things like search engines like google for instance to search for words and find where
00:04:48.820
those certain keywords come up on the internet so what clearview does is something very similar
00:04:54.220
you take a picture a photo of a person and then it directs you to links essentially all across the
00:05:01.300
internet showing where that person shows up in various contexts so so in short it's a search engine for
00:05:09.580
faces so that you can put an identification with the face absolutely absolutely which is now let's talk
00:05:16.480
about the the social implications but but tell us where it's being used how many people are using
00:05:20.920
it who's using it who who who has access to it is it the public or is it just police force yeah so
00:05:27.620
right now it's a startup and whenever you do a startup you know startup 101 whether you're at
00:05:32.200
harvard northwestern stanford business school is you target a specific industry vertical someone who is
00:05:38.500
a super user and where they're using right now is law enforcement so there's over a thousand
00:05:44.000
independent uh independent law enforcement agencies and so this is everything from the federal
00:05:48.660
government all the alphabet agencies to state police departments to county sheriffs who are elected
00:05:54.120
directly by voters to um local municipal um police officers and they're all making independent
00:06:01.060
decisions to use clearview now is there uh it's not limited to the united states right well right now it is
00:06:08.940
but um you know the technology is definitely powerful enough and broad enough to uh to be
00:06:13.820
used elsewhere as well all right so how give us some uh use cases tell us what uh crimes are solved
00:06:21.220
why do police like using this thing well if you think about a detective any crime show you watch what's
00:06:27.880
the number one thing they want to do they want to generate leads who are possible suspects and then they
00:06:32.840
want to track them down using their skills and they use credit card data and cell phone data and all
00:06:37.940
sorts of other information fingerprints so what clearview does is it helps them generate leads
00:06:43.480
more quickly so if you have a suspect in a human trafficking child exploitation case and you only
00:06:50.180
have a photo of someone you can use clearview clearview the photo and then it will show where that person may
00:06:57.640
be in uh on the internet whether they have a twitter profile or a facebook profile or whether they're
00:07:03.800
uh you know at some sort of basketball game and that generates leads and ideas for the police officers
00:07:09.820
and detectives where it's exceptionally powerful is when you're working with um gangs whether they are
00:07:16.760
um uh terrorist networks whether they are drug cartels whether they're mob gangs and where you may have
00:07:22.960
an inkling of one person in the gang but you may not know the the other five six seven people and so if you
00:07:28.900
take a group photo then what you can do is identify the other people or at least give you some leads
00:07:34.300
and then of course our law enforcement will follow those leads and determine whether uh they're you
00:07:40.440
know whether the person should be uh implicated or not all right leads are fun but how many cases have
00:07:46.320
been solved tell tell us about uh any happy endings that we know about so far well the new york times
00:07:53.140
article um specifies one where the indiana state police have really talked about where um they um
00:07:59.560
busted a whole network of um unfortunately individuals who were taking advantage of children
00:08:04.220
um and so uh getting those people off the street is very good there was another one in uh in florida that
00:08:10.660
they talked about where a person had a certain type of tattoo and that tattoo was then uh being was able
00:08:17.420
to be matched uh based on that those are some examples now what what's what types of crimes are the ones
00:08:25.660
that are most likely to be usefully solved well it's it's clearly the most vile crimes out there right
00:08:33.320
it's uh you know it's the um the child exploitation the human trafficking the domestic violence the uh
00:08:39.820
the drug the uh the um um drug cartels terrorists those are the ones more likely to be solved but
00:08:48.300
it's also you know for shoplifting petty thieves assault subway gropers is one where um it's used a
00:08:56.000
lot where you have a grainy photo of someone that's uh you know pickpocketing on a subway um those are
00:09:01.600
those are different types of things where clearview can be used all right i'm watching the audience the
00:09:07.280
audience is going crazy in the comments because they want me to push you on the social risks so
00:09:14.580
let's talk about that so accepting that i would imagine law enforcement would be drooling over this
00:09:20.840
product because it allows them to identify somebody who may not want to identify themselves
00:09:25.700
or it might be in a situation where you're not necessarily in a conversation with them
00:09:29.940
but now it can also kind of be used to identify let's say a dead body that doesn't have
00:09:36.220
identification it would do that right oh it could it could be helpful in that you know what um i've
00:09:41.980
talked to paramedics and emt people and a lot of times what you have is someone who
00:09:45.680
has had a heart attack or who has fallen ill and is you know unconscious and you wonder who is this person do
00:09:52.300
we have any context and so uh to be used with uh paramedics would be great well what if somebody
00:09:58.420
is i just saw a question a really good question go by in the comments what if somebody was an
00:10:02.920
undercover undercover cop and and the bad guys got this app could they find their undercover cops
00:10:08.640
potentially sure absolutely so so there are uh like every new technology there's going to be the
00:10:16.360
the good and the bad now people are asking where did the data come from are you scraping only from
00:10:22.320
public legal sources do you have secret sources where do you get your database of faces from
00:10:28.560
so we have to you know let's start out what i talked about there's an explosion of digital
00:10:33.140
information that people are voluntarily putting up on the internet right and so there's millions of
00:10:38.540
faces up there and people have public personas on facebook on linkedin on tiktok on instagram uh in
00:10:46.260
addition to all the other places that information's put for instance almost every single uh employer or
00:10:52.440
corporation out there has a public facing website that has the pictures of for instance all the
00:10:58.440
lawyers in their firm or all the consultants in their firm or their leadership so what we do is we
00:11:03.420
index all that public information and then just connect the dots essentially link it all together
00:11:09.080
so it is only public sources now there are other things that are public sources and that the government
00:11:14.100
wants to get out there information like the most wanted list which you know we we link that
00:11:20.380
together they have um the center for uh child exploitation and trafficking we link those stuff we link those
00:11:27.760
together but it's all publicly available information um that is helpful to everyone so i um i noticed that
00:11:35.900
i saw a story that the new jersey is the new jersey police department decided to not use it to ban it
00:11:42.240
is that do i have that information right so there was a pronouncement by one of the ags of something to
00:11:48.540
that extent but let's let's take a broader view you know there's autonomous vehicles that are taking photos
00:11:53.760
there's going to be fedex uh drones flying over our heads that are going to be taking photos
00:11:58.040
the best thing we can do is get this technology out to everyone because what it does is it increases
00:12:04.700
transparency and when you increase transparency you increase trust with people and when you have
00:12:09.920
trust you increase the opportunity for more meaningful relationships now here's what's going on in in
00:12:15.680
new jersey and you know this is obviously very early so i don't want to sound too hyperbolic on
00:12:21.540
how this happens but let's say they ban this technology and they um only allow government
00:12:29.000
to use it what's going to happen is you're going to have a two-class you know a hunger games type
00:12:33.480
society where the politicians can live in security and the rest of us wander among
00:12:40.740
thieves and exploiters and so what will happen in new jersey is more people will move to places where
00:12:46.440
it's safer like north carolina and florida um and we'll you know that is the dystopian future when
00:12:52.260
it's actually banned if we allow it to be used freely it creates a better society for all of us all right
00:12:58.580
let me let me uh let me tell you my uh personal uh stake in this in the sense that as a semi-famous
00:13:06.780
person my my reality is that when i go in public people recognize me by face quite often especially
00:13:15.020
since i've been doing the live streaming so pretty much everywhere i go there's going to be somebody
00:13:18.960
who recognizes me by face but they also know because i'm a public figure they know something
00:13:24.620
they read about me that could be bad usually not true something you know something about my background
00:13:30.640
my education my career so i've lived in this world you know 20 years ahead of the rest of you for a long
00:13:37.720
time and so you're probably wondering what would it be like if every time i went out in public
00:13:43.800
you know we're going to assume that this technology creeps into the public because it's inevitable
00:13:48.480
wouldn't you say it's inevitable that the public will have this tool well peter diamandis says there's
00:13:53.940
going to be a billion internet connected sensors in the next five years which means we're going to know
00:13:59.740
everything at any time in anywhere um and so it is inevitable that this digital information will be
00:14:08.460
out there um so let's continue yeah so let's uh just complete the thought so if you're wondering
00:14:15.400
those of you watching this you're wondering about yourself right you're thinking okay what does this
00:14:19.960
mean to me you know what would happen to me how would it feel what's life like if every time i went out
00:14:26.220
anybody who saw my face could know also my biography and the answer is that's my life that's exactly my
00:14:33.780
life every time i go outside people recognize me i don't recognize them and they know my whole
00:14:39.540
biography and they and they can treat me differently if they if they choose to i'm sure some people
00:14:45.460
discriminate against me because they think oh he said good things about the president that one time i
00:14:51.060
you know i've got to be mean to him so i've lived in that world and i got to tell you
00:14:55.520
doesn't make much difference so here's here's the weird thing about it if i could snap my fingers and
00:15:01.920
change it and make all of my privacy come back i don't know if i'd care you know believe it or not
00:15:09.760
if you had asked me would i choose this i probably would have said no i want my privacy because everybody
00:15:15.240
that's everybody's first reaction right there's nobody who wants to give up privacy if if you ask
00:15:21.500
them but nobody asked me i just lost my privacy because of my life choices and it didn't make any
00:15:29.700
difference i can tell you i go through life and it makes absolutely no difference now if let's say i
00:15:35.200
had a criminal record um would it make a difference to me if people i i interacted with knew i had a
00:15:44.220
criminal record but i i paid my time i'm clean with society what do you say about that world where
00:15:51.480
somebody who really has paid their debt followed the rules but now it's going to follow follow them
00:15:57.000
around in a way that it couldn't have before what about that person what do you say to that
00:16:01.460
yeah so those so the question is whether information should be hidden from people and then the second
00:16:08.860
part is whether it's relevant to decision making or or being left alone and so we're against
00:16:14.380
discrimination it's very simple so we have laws in this country for instance that have statutes of
00:16:19.900
limitations for instance let me let me stop you right there nobody cares about the laws in the real
00:16:25.800
world people are going to react the way they react and they don't give a fuck about the law
00:16:30.180
so you know people are either going to kick you out of the restaurant or not or or give you bad
00:16:36.480
service and dirty looks or or not hire you but they won't say it's because of your criminal record
00:16:42.020
although what what is the law on that is the current law that if you have a criminal record you have
00:16:49.060
to disclose it or you you can choose not to what is the law on that well i i i'm not an expert in
00:16:55.740
here and it very much depends on context let's say let's say you're applying for a job if you're
00:17:00.540
applying for a job can the employer ask that question i don't know what the current law is on
00:17:04.560
well david koch and charles koch of koch industries very famously have an initiative called ban the
00:17:10.820
box where they want to make it illegal for employers to ask that question because they want
00:17:15.220
to encourage people who have had felonies or prison sentences and have reformed to be able to be
00:17:22.460
on a level playing field and get hired based on their skills and and and you know initiative
00:17:28.140
rather than being uh discriminated against because they have to disclose something that may have
00:17:33.400
happened a long time ago what what if their crime was uh let's say a sex crime you know let's say they
00:17:39.000
had uh raped somebody don't you think that the other i'm putting this on you like you have to solve
00:17:45.160
all of the world's uh moral and ethical dilemmas but uh would the employees of that company have a
00:17:52.340
right to be to know that somebody who had a history of let's say sexual assault which is unique it's not
00:17:59.140
like anything else um should they have a right to know that hey scott i'll just say these are very
00:18:06.960
complex issues that a lot of people need to think about but i'll put it this way you know based on your
00:18:11.640
writing that humans are biased right that we that we take information and we filter it through our
00:18:16.980
own lens and decide how to interact so the question is do we hide what people can see do we try to manipulate
00:18:23.960
their bias so they act a certain way or do you give them all the information all the data points
00:18:29.940
so that they can have a better sense of how they want to make a decision and we're of the belief that
00:18:36.440
transparency is better for everyone than trying to manipulate people by hiding information so so let
00:18:43.060
me let me do what uh i always caution uh people to do which is to show both sides because typically
00:18:49.380
these conversations one somebody's an advocate for one thing or the other so they're just going to say
00:18:54.500
the costs or just going to say the benefits so if i could list the benefits it would be massive
00:18:59.960
improvement i mean really substantial improvement in law enforcement catching people who need to get
00:19:05.860
caught so that would be one benefit another benefit would be um more information people would simply
00:19:12.280
have better more information deeper information about other people that could protect the people
00:19:17.980
who get the information it could be bad for the person whose whose privacy has been given up because
00:19:23.940
they might want to hide some things until later but certainly good for the person who who's getting the
00:19:29.060
information that person has benefited at somebody else's expense but and and you make this point which
00:19:34.840
is fascinating and i'm trying to think throughout human history can anybody come up with an example
00:19:41.460
where more transparency was worse and i think the the worst case scenario which you hinted at
00:19:49.040
is that when somebody has full privacy and somebody does not that that's your very worst situation
00:19:55.820
especially if the people who have the privacy are the government and the people who do not are the citizens
00:20:01.000
that's horrible but it's also just as bad if your neighbor knows everything about you but you don't
00:20:06.540
know anything about your neighbor so this is the sort of tool that sort of just opens pandora's box
00:20:12.840
and allows all of us to know way too much about each other and i've long predicted i'm i'm in writing
00:20:18.940
i have predicted this a number of times that partial uh privacy is the worst situation where somebody has
00:20:27.700
it and somebody doesn't the best situation which we may never get to people will resist because
00:20:33.220
it's just natural to resist it is where we all know enough about each other that we no longer give
00:20:39.560
a shit about each other's flaws and i know that's hard it's a hard concept to hold amen but if you
00:20:45.960
actually know a lot about another person and you know all their flaws and they know a lot about you
00:20:51.800
and they also know your flaws and they still say hey scott you want to go to lunch i am so cool with
00:20:58.620
that person i'm as cool with that person as you could possibly be because i know you you know me
00:21:04.820
if you're cool with me and i'm cool with you i'm gonna understand you're a human being
00:21:10.380
you know we've all messed up this is why in part why i have the 20-year rule and the 48-hour rule
00:21:16.800
about forgiving people's past behaviors i'm a real big proponent of judging people by who they are
00:21:23.460
right now not judging people by that thing they did that's that follows them around forever and i feel
00:21:30.180
like we all have to get to that place and this may force us there because if you think about it
00:21:36.540
you wouldn't like anybody if you knew all of their mistakes you wouldn't like anybody it would be the
00:21:45.440
end of friendships the end of hiring the end of marriage everything but if you all know that the
00:21:51.120
other is just as flawed as you are different flaws but it just exposes your humanity you say
00:21:57.320
huh i think i'm okay with that and by the way just tying everything back to president trump because we do
00:22:02.500
that um i think the fact that all of his let's say his personal life peculiarities some would say
00:22:09.900
they're flaws um are so known that we're kind of comfortable with them there's something about
00:22:16.600
knowing people's flaws that allows you to get past them it's the not knowing that can be kind of scary
00:22:22.460
so given all that what what is the biggest pushback what do you think is the what do you think society is
00:22:30.340
going to gel around to say we don't want this because you know there's going to be pushback what do you
00:22:35.640
think they're going to focus on as the thing that's the big expense of this well i i think the biggest
00:22:41.520
complication is when people say the word privacy it's really split into two things anonymity and
00:22:48.160
autonomy or can you be hidden and can you be free and those can be separated being hidden maybe everyone
00:22:54.880
knows everything about you but the most important thing is is just leave me alone and those are the that's
00:23:01.040
the complication and that's where we need to work as a society in figuring out what the rules are
00:23:05.680
about being hidden which i think is is being eliminated by technology and being left alone
00:23:12.260
which i think as a government as a society we come together and figure out what the rules of engagement
00:23:16.900
are you know we're always afraid of the unknown and one of the things i like to point out is if you
00:23:24.040
could if you could rewind to 20 years ago and i said to any one of you how would you feel if you lost
00:23:30.700
all of your privacy about where you go and what you buy you know all of your transactions and all
00:23:37.580
of your physical location what if what if that was all knowable or known how would you feel and most
00:23:43.500
people would say my god i could not live in that world but we live in that world we live in a world
00:23:48.540
where all of our transactions many of us have our dna already on some database i do i've got my dna
00:23:55.160
in several databases i would think and it doesn't really make much difference i still wake up drink
00:24:02.200
my coffee have my periscope and unless i'm committing a crime and somebody's looking for me it makes no
00:24:07.780
difference so i think we're approaching a time when crime might just go away because we would give up so
00:24:14.720
much privacy not to every single person but at least to the government and law enforcement should
00:24:20.180
they want to check and i'm looking for the costs you know i keep looking for the tragic societal costs
00:24:28.420
of either me who has no privacy when i go outside people know who i am and the government knows
00:24:35.580
everything i buy and everything they could probably tell everything i browse all of it and it doesn't
00:24:42.140
seem to make any difference in my life now i don't know how unique that is i guess it's but if i were
00:24:48.060
involved in criminal activity it might make a pretty big difference but i'm not so um i think
00:24:55.380
for most of you you would be more afraid than you need to be it's just natural to be afraid of the
00:25:00.980
unknown and it's natural to be afraid of giving up any kind of privacy but uh what what would you say
00:25:06.600
is the the most valid uh argument by the people who say hey there's some costs coming at us some
00:25:13.240
societal costs what what is their best argument on the other side i like to ask this to to see how
00:25:19.940
um see how unbiased you are yeah i think the best argument is always about historical data you know
00:25:26.360
what happens if someone accused me of something i didn't do or was found um innocent what happens if
00:25:32.040
it's a long time in the past um is that information readily available should it be available what friction
00:25:38.240
is there and then who controls the past right who decides whether data can be erased or not shown
00:25:44.120
and who shows um you know who decides what can be shown and that is a very difficult question to
00:25:52.720
answer and a um and there's there's a cost to to doing it both ways there's a cost to letting people
00:25:59.420
see everything that happened in the past and there's a cost to allowing some group to manipulate the
00:26:05.400
past about what can be shown so there it's not an easy answer for that one so so i'm seeing uh some
00:26:12.720
good questions going by here in in the comments uh people are saying what if you're not a multi-millionaire
00:26:18.200
i assume that's i assume that's directed at me you know and pointing out that i don't have as much to
00:26:23.980
lose so i would imagine if you were trying to you know get a foothold in life get you know get a job
00:26:30.220
after some bad situation in your past that would be pretty devastating but what how do you see that
00:26:36.880
evolving do you think there's some people who will just be just totally left by the side of the road
00:26:41.860
by this kind of technology because just no nobody will nobody will interact with them because of
00:26:47.020
something they did in the past so i have three teenagers and they upload hundreds of photos every
00:26:52.960
week and they're interacting with hundreds of friends on instagram and tiktok and snap
00:26:57.460
and it's interesting that they realize most of the stuff is transitory and even dumb and they do
00:27:03.760
allow their friends to evolve and change positions and they're still friends with them kind of like
00:27:09.100
what you said is that they can be authentic and vulnerable and out there without having long-term
00:27:16.020
biases against these friends and the other thing that people need to understand is most people don't give
00:27:21.840
a crap about you or all your flaws they really don't i mean i think we all like to have this huge ego that
00:27:28.880
i am so important and that if anyone knows that i did something wrong 10 years ago they're gonna care
00:27:35.100
most people don't care and if someone does care find a different friend it's very simple yeah let me give
00:27:40.840
you two two other filters to look through certainly that's that's one of the most important things that
00:27:47.100
people don't care about you so much but let's say everybody knew that um that you were a furry you
00:27:54.540
know you i'm just making this up let's say they do that you like to dress up in animal costumes sure
00:28:00.160
now now your neighbors would immediately say oh i don't know if i can let the kids come over there
00:28:05.660
anymore because you like to dress up in animal costumes but here's the thing people never never think
00:28:11.060
about all of the other people who dress up in animal costumes would also be able to identify you
00:28:17.760
and next thing you know your life is 10 times better because you don't care too much about your neighbor
00:28:22.740
who doesn't like your habit you just met a hundred friends that you're hanging out with and they all
00:28:27.300
have the same interests so the first thing is if you're the only person they know has some weird
00:28:33.620
you know or let's say unusual i'm not even going to call it weird let's say a non-standard
00:28:37.460
practice you can find all the other people like you and suddenly life's actually better not worse
00:28:43.620
the second thing is i'm going to give you this real life example my late stepson when he was uh
00:28:50.020
i think he was 18 or so i got him one of the greatest jobs you could ever have which is a job
00:28:55.740
at a dj company and he was going to be an apprentice to be a dj kind of a cool life etc he lasted one week
00:29:03.220
on the job because somebody sent his boss a picture of somebody else who had taken a photo
00:29:08.900
and put it on snapchat in which he was in a room with some marijuana paraphernalia and the boss quite
00:29:15.720
reasonably i don't disagree with the boss at all said i don't want any employees who have this kind
00:29:21.540
of a picture on social media and he fired him it was a great job lasted one week and got fired now
00:29:27.940
imagine if he also knew what was happening with his other employees what he would have found is he
00:29:35.280
would have had to fire his whole damn staff only one of them got caught every one of them did
00:29:41.320
something that would be you know roughly equivalent to this level of you know bad behavior if you can
00:29:47.940
call it that and i would say that probably my stepson would have kept his job if your technology was
00:29:55.740
ubiquitous i would agree with you yeah i mean what did what did the boss do when he was in college
00:30:00.980
and uh high school you know what if there anyways i agree because because remember the boss did not
00:30:07.500
fire him for his behavior he fired him for being caught in his behavior it was very specific he it
00:30:14.880
wasn't a moral it wasn't anything moral ethical or anything he just said that's that's not the
00:30:19.980
reputation i want associated with my company because we deal with the public he would have soon found out
00:30:25.540
that was pretty standard with all of his employees i think all right so uh give us give us some uh
00:30:31.280
wrap up here i've got a few more topics i want to share with my peeps so look i know i know on your show
00:30:37.700
you talk a lot about america being great and and the reason why america is the greatest and the most
00:30:42.840
prosperous is because of our bill of rights and the first amendment says we do not have to be
00:30:48.400
hidden to be free we do not have to be hidden to say what we want to share ideas to share
00:30:54.080
information and to be with people and so if we just embrace that and understand that that's why
00:30:59.540
we're the greatest we don't need anonymity we can be free because of those bill of rights and clear
00:31:05.640
view helps us you know make a happier healthier safer place i would even go so far as to say it might
00:31:14.600
help even race relations because you're going to start seeing people for what they do instead of
00:31:21.600
your first impression but that's that's maybe a little little too optimistic so thank you david
00:31:28.340
scalzo tell us again the name of your investment firm uh kirenaga partners spelled k-i-r-e-n-e-g-a
00:31:37.820
a-g-a yep a-g-a sorry no problem and where can they find you on twitter uh at scalzo underscore david
00:31:46.940
all right thank you so much for joining us and uh we're gonna talk about some other topics and i'll
00:31:54.140
talk to you later all right that was fun um this this topic fascinates the heck out of me because
00:32:01.760
it's going to change everything but let's talk about some other stuff let's talk about impeachment
00:32:06.800
and adam schiff and all that stuff um scott jennings wrote an interesting piece for cnn in which he
00:32:15.860
notes that adam schiff is essentially doing putin's work because if what if what we were worried about
00:32:22.200
is that russia was trying to undermine trust in our system and that's what we're worried about right
00:32:28.280
we're worried about those pesky russians interfering with our elections and undermining our faith in our
00:32:35.280
own system and as scott jennings points out is there anybody who's eroding faith in our system faster
00:32:42.760
than adam schiff i mean that's all he's doing so we can't really care about eroding faith in our system
00:32:51.540
if we're all engaged in doing exactly that i mean even just watching it you're part of that
00:32:57.340
um and as uh joel pollak pointed out uh the democrats are literally obstructing their own
00:33:06.760
their own election because the democrats by forcing this impeachment vote have taken uh several of the
00:33:14.400
people running for president as democrats off the field and they they basically are ceding iowa
00:33:21.020
to the people who are not already employed you know joe biden and uh and buttigieg
00:33:27.180
so it turns out that being unemployed just turns into a big advantage for running for president if
00:33:33.180
you're a democrat so i don't think we can claim that there's any kind of moral superiority going on
00:33:40.500
with any of this impeachment stuff because everybody involved is doing nothing but chasing their own
00:33:46.060
political gain you know in lawsuits how if you sue somebody and i don't know if this works in every
00:33:53.640
case but if you sue somebody and it turns out that you're wrong and the other party wins they can
00:34:00.340
often if not always i don't know how this works but they can often recoup their legal fees so in other
00:34:06.580
words there should be a cost to impeaching and failing you you feel me if you impeach and win
00:34:14.360
then the the side that starts the impeachment wins politically and every other way i guess
00:34:18.840
and but what if they try to impeach and fail what if they fail should there be any blowback
00:34:27.080
any any any cost to that and i think maybe there should be because tucker carlson always says this
00:34:37.040
and i swear i probably heard tucker carlson say this for a solid year because it's one of those
00:34:42.840
things he says regularly and every time i shook my head and i said to myself tucker tucker that's crazy
00:34:50.180
you know i agree with a lot of stuff you say but that's crazy and here's what he says
00:34:55.080
he says that every time the democrats accuse the republicans of doing something it's because they
00:35:01.620
themselves are doing it and i thought to myself there's no logical reason that's true and i haven't
00:35:08.480
really noticed it and it's sort of a crazy thing to say and then i started paying attention
00:35:15.160
and i don't know what causes it i don't know if there's a cause and effect i don't know if it's a
00:35:21.940
perceptual thing but damn it's consistent we're watching it again it is just time after time after
00:35:31.240
time it is exactly that the democrats are complaining that trump has put our system at risk by degrading
00:35:41.180
its you know trust and integrity but it's what they're doing they're spending 12 hours a day
00:35:49.160
eroding our trust in our system they've actually just destroyed part of the constitution while we
00:35:56.640
were watching you know that part of the constitution that said impeachment was a a real solemn thing
00:36:03.100
they just took a real solemn tool probably the most one of the most important maybe the most important
00:36:11.100
it could be the most important part of the constitution that there's a way to remove the top person
00:36:15.900
and they've they've degraded it they've turned it into a joke
00:36:20.820
you can't make a cleaner argument that they're doing actively at this moment the thing they're
00:36:27.840
accusing somebody of doing you've never seen a cleaner example i i wish i could tell you
00:36:33.020
why you know when tucker says it it makes me think that it's actually a strategy
00:36:40.440
no he doesn't say that but it but it feels like a strategy even if it isn't just the way it plays out
00:36:48.080
all right and so i ask you this if schiff's claim is that the only reason that trump did what he did
00:36:57.540
with ukraine asking them for the investigation the claim is that the only reason it was done
00:37:03.080
was for his personal political benefit the only reason now that borders on crazy
00:37:11.060
because there's obviously something to be worried about if your next president might have some blackmail
00:37:18.980
material or some corruption entanglement anything that's a problem over in ukraine so obviously
00:37:25.060
there's a little bit of interest or or should be um but what is the standard for how much a decision
00:37:35.480
by a government official how much of it is politically motivated in other words personally
00:37:41.000
beneficial versus good for the country what if it's 90 10 what if it's 90 percent for their own good
00:37:48.700
and 10 percent for the public can they be impeached what if it's 99 and one what if one percent of it
00:37:57.280
is for the public good yeah it's good for the public too in some trivial way but 99 percent of it is just
00:38:04.660
for me personally is that the standard 50 50 what if it's 49 51 the point is there is no rational
00:38:15.040
logical way to make a standard for how much you know what percentage of the reason is personal
00:38:23.040
versus what percentage of the reason is good for the public and certainly with this uh biden and
00:38:29.420
burisma situation it is trivially easy to show that there's some national interest i mean you could
00:38:38.360
argue how big that national interest is but i don't think you can argue it exists it clearly exists let
00:38:46.060
let me prove it to you let me prove that there's some national interest in the burisma biden thing
00:38:52.740
here this is an absolute proof right i'm a citizen of the united states i have an interest in knowing
00:39:00.480
what happened with burisma and biden i'm not lying absolutely honestly i'm interested in knowing that
00:39:08.180
because i think it could be important if i had to guess probably not that much but i'm interested i would
00:39:15.420
like to eliminate that as a risk now can it be said that there's a national interest yes i just proved
00:39:22.660
it i proved that one out of 300-and-whatever million americans is legitimately interested in that
00:39:30.800
question would it matter if there are two of me does it make a difference if there's there are a million
00:39:38.920
people like me at what point can you say you've satisfied the question of national interest because
00:39:47.560
i'm part of the nation i just said i had an interest so there's some percentage that certainly has an
00:39:53.360
interest so my point is you could not create a standard where you're trying to parse out what
00:39:58.880
percentage is national interest and what percentage is personal and that's the entire case their entire
00:40:07.200
case is that that's a standard which they can recognize and act on and it can't be done it is rationally
00:40:14.900
logically impossible to parse those out now did our founders know that to be the case yes they did they
00:40:23.340
designed a system in which that those those decisions of what percentage was personal and what percentage
00:40:32.020
was for the nation they designed a system where that doesn't matter does not matter because you can vote
00:40:39.640
and you can you can change it uh lawyers just said mind reading
00:40:45.960
is there somebody i would love this to be true i just saw it in the comments and i assume that that means
00:40:54.680
that the impeachment defense has started have are you talking about officially as part of the defense
00:41:02.200
did they accuse schiff and their side of mind reading is that what happened i'll have to check on
00:41:07.760
that but it looks like somebody's saying that um all right we'll look for confirmation there anyway so
00:41:16.100
it's a it's a standard that can't be enforced which is what percentage was for your own good um
00:41:22.240
let me ask you this there are two things that people say just like it's certainly true
00:41:32.060
and it could be but i'm going to push back on both of them one is that russia wants to undermine
00:41:40.320
the trust in our system so that's one claim another claim is that china has this strategy of something
00:41:49.180
called total war in which it's not just military but they're in a current war with the united states
00:41:55.880
goes goes this line of thinking in which they are trying every possible avenue to damage the united
00:42:02.300
states and lessen us for their own benefit and that total war would include everything from
00:42:08.980
spying and stealing intellectual property sending us fentanyl you know you name it it's like
00:42:15.140
everything you know messing with our elections just everything it's all on the table so so the the
00:42:22.540
two claims are that russia and china our two biggest rivals in terms of military prowess and that
00:42:30.880
they're engaged in a current war with us and i say i'm not convinced i'm not convinced now i do believe
00:42:42.720
that all the things we talk about are you know they're probably relatively true russia probably hacks us
00:42:49.560
china does all the things that it's been said but do they have do they have a some kind of a
00:42:57.700
comprehensive plan or strategy that starts at the top it only counts if it's the leaders of the
00:43:04.400
country who want to do this it are the leaders of the country saying to themselves uh-huh if i can
00:43:11.760
degrade trust in the united states by 20 percent russia's gdp will go up what how do they connect the dots
00:43:21.440
can somebody explain to me what a rational putin or a rational president xi what would they actually
00:43:31.220
be thinking in which this would be smart because i can't think of it now if they were really dumb
00:43:39.380
people then you could explain it you'd say oh they're so dumb they think they're going to put the
00:43:44.980
united states out of business with their clever tricks and then once we're out of business they'll have all
00:43:50.100
our they'll take over and they'll have all our resources and then they'll be richer or something
00:43:54.220
do you think they're thinking that i mean it doesn't pass any kind of a sniff test we are clearly in a
00:44:02.000
world of abundance meaning that we don't really have shortages of stuff we just have systems that
00:44:09.200
are not optimized to get that stuff to the right people in some cases but we don't really have a
00:44:14.500
shortage of anything and so when you move from a world in which you have shortages of stuff and
00:44:21.460
maybe you need a war because if you don't have food you don't have resources maybe you need a war
00:44:26.500
but if everybody can get everything they want which is our current world if they if they play their
00:44:33.180
systems right and they work their economy right what reason do we have to be anything like an enemy to
00:44:39.940
china or anything like an enemy to russia and and vice versa there simply is no reason there isn't
00:44:47.840
and it seems to me that somebody like a president trump could change the frame on this and let me
00:44:54.520
suggest a frame i believe that the united states russia and china have a common enemy maybe more than one
00:45:02.320
and the common enemy is anything that disrupts the system in other words anything that could put
00:45:09.480
any one of the three of us out of business is something that all three of us probably ought to fight
00:45:14.240
against for example how about a major pandemic wouldn't we be on the same side i think so don't
00:45:23.120
you think russia and the united states are going to fight as hard as we can to help china stop this
00:45:29.520
latest coronavirus thing i think we will we got a common enemy there there's no such thing as one
00:45:35.520
side wins in this thing what about climate change now some of you say it's a hoax blah blah blah but
00:45:41.980
you can at least agree that you want a cleaner environment and cheaper energy so don't we have
00:45:48.860
a common interest there to make sure the planet doesn't get destroyed don't we have a common interest
00:45:53.680
in making sure that a terrorist state doesn't attack any one of us don't we have a common interest in
00:45:59.980
that uh what if aliens attack we definitely have a common interest there so my point is that we
00:46:04.780
should reframe what russia china and the u.s. care about and we should be on the same team instead of spending
00:46:12.720
all our money and wasting it being enemies of each other why don't we find some common enemies
00:46:19.420
you know some we can fight on the same team for because there's just no strategic
00:46:26.200
advantage to being enemies and picking at each other like this if it's happening some of it may
00:46:33.620
be fake news i would also like to know what is it the united states is doing to russia and doing to
00:46:42.900
china that we the public don't know about are we poking them just as hard as they're poking us and
00:46:48.760
what's our reason has anybody ever told us our reason is it revenge is it to make sure they don't
00:46:55.900
poke us so hard you know we'll poke you if you poke us so that's how we keep you in check what's the
00:47:00.580
reason i mean maybe there's a good one i'm not even saying there's no reason i just i feel like i should
00:47:06.220
know it as a citizen of this country can somebody explain to me why the the countries that absolutely
00:47:13.120
should be on our team fighting with us as hard as they can against common challenges why are we at
00:47:21.300
each other just doesn't make any sense i think maybe that'll change maybe we'll rethink that
00:47:28.200
um let's talk about schiff a little more he's talking about uh future crimes of the imagination
00:47:34.740
so schiff wants to get rid of president trump according to his uh eloquent speeches that he's
00:47:42.160
been giving to the senate he wants to get rid of president trump not so much because of what he's done
00:47:48.180
because what he's done has no measurable impact so everything he's done so far or is alleged to have done
00:47:57.660
everything that schiff and team alleges president trump has done if you added it all together you
00:48:04.720
couldn't you couldn't fill a tablespoon there's no measurable negative in anything the president has
00:48:13.940
already done and so the democrats knowing that there's no measurable damage at all are trying to
00:48:21.860
make the case that it's the future somebody says you're kidding no i'm not kidding do you see any
00:48:29.800
measurable harm that has come from the president's actions nobody's even alleging any nobody has made
00:48:37.280
the case nobody said we lost this money we lost this deal we we got attacked none there's there's zero
00:48:45.420
alleged cost and so they have to make the case that this president is just going to get worse
00:48:52.820
we're going to encourage him we're going to be encouraging this president to future unspecified
00:49:00.240
bad behavior what could be less american is there anything less american than uh trying
00:49:08.180
to remove somebody from a job for future potential problems i don't think the country is going to buy
00:49:15.040
that but maybe they will um so this is how schiff argues it he said quote can you have the least bit
00:49:24.240
of confidence talking about trump that he'll stand up and protect our national interest over his own
00:49:30.060
you know you can't which makes him dangerous to this country really
00:49:35.680
you don't trust that the president who will be watched more than anybody has ever been watched
00:49:44.980
including when he made the phone call he had all kinds of witnesses and he released the transcript
00:49:51.180
it's the most transparent presidency you've ever had in your life he has already lost all of his
00:49:57.400
all of his uh privacy essentially uh so am i are any of you worried about
00:50:04.700
the president doing something so purely um personal that it would damage the country
00:50:11.580
because i don't think he has any incentive for that what incentive does the president have to do a
00:50:18.900
bad job as president i mean let me frame it that way to show you how ridiculous it is so schiff is
00:50:29.660
arguing that in the future the president would be in a position in which he would choose he would
00:50:37.060
rationally choose according to schiff what is good for trump personally and is bad for the country
00:50:44.200
can you see any any imaginary world in which president trump could even imagine a situation
00:50:54.960
where doing the worst job you could do as a president would benefit him personally
00:51:01.040
because because he would allegedly be doing this selfish thing how do you even what kind of world is
00:51:08.960
that the president is watched all the time if he did something that was purely for him and had no benefit
00:51:15.900
whatsoever not even one that you could argue that would be terrible for his legacy why would he do that
00:51:23.180
it doesn't make any sense all right uh the funniest thing about it was when schiff was getting
00:51:29.360
choked up with emotion and and uh all of the uh the schiff lovers on cnn and other networks
00:51:38.760
were saying that his his uh presentation was emotional and passionate and they were saying that that was a
00:51:46.640
positive because he was so passionate and emotional and i looked at it and all i saw was a bad actor
00:51:53.080
trying to act like he was emotionally distraught did any of that look genuine to me yeah he almost cried
00:52:01.840
to me that didn't even look slightly believable now i hope he's better at writing screenplays than he is
00:52:10.760
at acting but seriously did anybody is there anybody in this country who was so stupid that they would watch
00:52:19.880
those days of presentations and at the end see that little fake emotional thing and imagine that that
00:52:25.740
was real could anybody imagine that was real oh my god i don't think so
00:52:30.940
um i would like to defend myself from uh the future problems i'm going to have uh i have a i have a crime
00:52:42.460
in progress right now crime against humanity uh let's see if i can show it to you it's better as a visual
00:52:50.360
so in order to understand this what you have to know is that uh i after the dilbert comics are drawn
00:52:59.200
they are sent off to my syndication company who has i think it's a third party that they
00:53:05.880
hire and the third party adds the color so you can see the strip well you can't see it but
00:53:12.820
let me see if i can lower the temperature on this so you can see it take it way down
00:53:18.380
um and so all the characters are filled in with color as you can sort of see here
00:53:27.000
so i actually don't see the the colored version until it runs so the first time i see it uh colorized
00:53:35.220
is when it runs on the internet and in papers and i was tweeting this out today
00:53:40.840
and i noticed that whoever adds the color had decided that the character who's talking to dilbert
00:53:48.700
should um you know display some diversity because most of the characters are generic white people
00:53:54.880
except for asok the intern and they decided to add a little diversity now i think to myself
00:54:00.900
good choice you know especially when i have you know one of the non-regular characters
00:54:07.440
let's let's have that character as you know somebody who's not a generic white guy
00:54:12.900
show a little flavor in the strip that's a good idea right except the person they decided to add
00:54:20.520
the darker skin color to is the character that i had decided to depict as an idiot
00:54:27.900
that's right so my decision was that the second character would be the idiot and dilbert would call
00:54:36.340
him out for being an idiot i'll read you the comic so the character comes up behind dilbert and says
00:54:40.660
this data can mean only one thing and then dilbert says actually it can mean any one of
00:54:46.560
about seventeen things and then the other character says then why can i only think of one
00:54:52.480
and dilbert says please don't make me answer that now of course uh you may find some correlation with
00:55:01.360
current events in this but i promise you i was not the one who decided that the dumb guy
00:55:07.220
would have the darker skin can we put that on the record not me all right so i have accidentally
00:55:14.620
been turned into a racist by somebody i don't know who decided to colorize that comic so there you go
00:55:21.980
um the simultaneous simultaneous sip did happen but i'm sorry that some of you missed it
00:55:31.420
um why are why are we using leftist framing yeah well i'm just telling you what the public is going
00:55:43.080
to say all right um it's all right no big deal well those of you who say it's no big deal don't know
00:55:54.680
the history uh this cartoonist's experience when i introduced asok the intern so if you follow the
00:56:02.580
dilbert strip you know that the intern is named asok a-s-o-k which is the less common spelling
00:56:08.900
it's usually a-s-h-o-k but i knew somebody i worked with who spelled it differently a-s-o-k so i named my
00:56:17.820
character after that and now asok was uh born in india but he's uh an american citizen in the strip
00:56:26.080
and uh as soon as i introduced that character what do you think happened that's right the african
00:56:33.580
american community attacked the newspapers and said why is this why is this african american
00:56:40.580
character so stupid when all of the other characters you know are smarter now first of all
00:56:48.280
it wasn't and isn't an african american character so the the basis of the complaint was wrong it was
00:56:55.640
a somebody born in india who's an american citizen now secondly it's not the dumb character in the strip
00:57:04.280
i make a comic strip in which all of the characters are acting dumb on different days sometimes they're
00:57:11.060
all acting dumb it's only a comic about dumb people it's that's all it is everybody in the comic except
00:57:18.460
dogbert who i guess is not even a human is dumb it's a whole comic about dumb people now what do i do
00:57:25.800
i introduce one character who's got browner skin who's an indian american
00:57:33.720
and the african american community tries to get me canceled for not even talking about them in any way
00:57:42.100
whatsoever now many people have asked and actually a lot of a lot of black fans of the strip say can you
00:57:49.900
add can you add an african american character i would love to i would love to do you think that
00:57:57.960
i'm perfectly happy having a comic strip with a whole bunch of white people in it in 2020 absolutely
00:58:03.840
not i am not happy with that and it's because the strip is designed to sort of you know mirror
00:58:11.700
civilization and it doesn't you know if you go to silicon valley are you going to see a bunch of only
00:58:17.740
white faces in the in the technical staff no you are not you're not going to see only a bunch of
00:58:23.860
white faces you're going to see you're going to see the whole world represented there so my strip is
00:58:28.860
completely out of touch with the real world on diversity i would love to fix that can i no we don't live
00:58:38.020
in a world in which i can introduce a character easily who would be an african american regular character
00:58:44.080
and have any bad flaws because to to make a comic character interesting they have to have flaws
00:58:50.180
it's the flaws that make it a comic you know dilbert is is socially inept and he's a you know
00:58:58.040
he's a little too trusting he's gullible you know he's got his flaws alice is angry and combative
00:59:04.140
and wally's lazy and asok i tried to give asok the intern the least objectionable flaw
00:59:11.920
i mean it's the smallest little flaw a human can have inexperience because he's young
00:59:20.820
that's it asok's only flaw is that he's not yet very experienced because he's young
00:59:28.900
and still i almost got canceled for that little flaw so imagine me introducing a a black character into
00:59:36.840
the strip and i just think i'm going to have a good time i'm adding a little diversity
00:59:41.100
trying to give people what they want and the first time that black character
00:59:45.800
exhibits any kind of a flaw doesn't even matter what it is i'm canceled that's it so that's the
00:59:52.640
world we live in you could have your diversity or you can have your outrage but you can't have both
00:59:57.640
i can't give you both that's all for today talk to you tomorrow