Valuetainment - December 11, 2023


Big Data Industry: The Hidden Ways Your Data is Manipulated


Episode Stats


Length: 13 minutes

Words per minute: 215.7

Word count: 2,913

Sentence count: 226

Harmful content

Misogyny: 2 sentences flagged

Toxicity: 1 sentence flagged

Hate speech: 1 sentence flagged


Summary

Summaries generated with gmurro/bart-large-finetuned-filtered-spotify-podcast-summ.

Data is a new asset class, projected to be worth more than $365 billion by 2029. According to the World Privacy Forum, the average data broker has information on approximately 1,500 data points for every consumer. They probably know that one girl you kissed that no one knows about. They have it in here.

Transcript

Transcript generated with Whisper (turbo).
Misogyny classifications generated with MilaNLProc/bert-base-uncased-ear-misogyny.
Toxicity classifications generated with s-nlp/roberta_toxicity_classifier.
Hate speech classifications generated with facebook/roberta-hate-speech-dynabench-r4-target.
00:00:00.000 Do you ever find yourself talking to your spouse or friend about a camera you want to buy or a car
00:00:03.800 you want to buy or a specific thing you want to buy, and for the next 24 hours, all you see is all
00:00:08.140 these ads? You never searched it. You never did anything. Are they watching? Are they listening
00:00:11.900 to you? What's your data worth? Do you know what percentage of Americans actually know anything
00:00:16.560 about how marketing companies track your data, my data? Do you know what the number is? Less than
00:00:21.600 3% know exactly. They did a sample of 2,000 Americans asking 17 questions to see how many
00:00:27.920 of these questions they would get right. This is the result. 77% got anywhere between 0 to 9 right.
00:00:33.900 That's a failing grade. 15% got 10 to 11 right. 6% 12 to 13. Only 1% got 14 to 15 right. 0.03% got 16
00:00:44.180 right. Perfect score was pretty much impossible out there. Why? Most people don't know how this
00:00:48.200 industry works. The whole industry, the global data broker industry, meaning if you're selling other
00:00:52.100 people's information, you know how big that was in 2022? $268 billion. And it's estimated to be $365
00:00:59.100 billion in 2029. By the way, most people don't even know how this industry works. We're going
00:01:03.560 to talk about that today.
00:01:10.560 If you get value out of this video, give it a thumbs up and subscribe to the channel. Let
00:01:17.180 me kind of share with you what World Economic Forum said in 2011. They said it was theorized
00:01:22.280 that personal data will become a new asset class and will become one of the most valuable resources
00:01:27.940 on the planet. So valuable that 68% of businesses worldwide in 2021 were buying third-party data
00:01:35.860 from data brokers. And by the way, if you want to see which countries were leading the way,
00:01:39.360 here's what it looks like. At the top, US 73%. EU, not including UK, was 71%. Non-EU Europe, 69%.
00:01:48.120 Canada, 59%. Asia, 56%. Latin America, 51%. Middle East, 48%. UK, 43%. And you got the rest of them.
00:01:54.360 But the US, very interested in what you and I do. Very, very interested. And just to kind of give
00:01:59.460 you a visual of what this really means, in 2021, it was estimated that data brokers held information
00:02:04.780 about customers that would fill, you know how many filing cabinets? Ready? 16 trillion filing
00:02:11.120 cabinets with all the data they gather on you and me. By the way, do you know how much information
00:02:15.780 they have on you? Like what different data points? You ready? Is it like 50, you think? 100,
00:02:19.220 maybe 200. That's crazy. 200, right? You ready? According to World Privacy Forum,
00:02:23.120 the average data broker has information on approximately 1,500 data points for every
00:02:28.480 consumer. They probably know that one girl you kissed that no one knows about. They have it in
00:02:32.020 here. They probably got it in there when they're selling that information. A study done by the FTC found that
00:02:36.760 one data broker segment alone held information on 1.4 billion consumer transactions and over 700
00:02:41.960 billion raw data elements. To put it in perspective, it's about 90 times the number of tweets sent globally
00:02:47.700 in a year. So you may be saying, Pat, how is this even allowed? It's because we give them permission.
00:02:51.920 How do we give them permission? Have you seen the terms of service of different companies? By the
00:02:55.380 way, do you want to take a wild guess? Which one of these big online companies, I don't know, Amazon,
00:03:00.100 Facebook, Instagram, Twitter, Microsoft, Apple, which one do you think has the longest terms of
00:03:05.160 service that you and I sign? I'm going to break it down for you here in a minute, but there is no
00:03:09.320 federal law in the U.S. that regulates the data broker industry. We give them permission and here's
00:03:15.440 what it looks like. Terms of service. Take a look at this here on which one's the longest. When you look at it
00:03:19.540 here, they got Facebook, Instagram, Spotify, Twitter, LinkedIn, YouTube, Apple, Amazon, TikTok,
00:03:25.680 Netflix, Microsoft. We got a bunch of them, right? The shortest ones are actually Instagram and
00:03:30.620 Netflix. Instagram is 2,451 total words. Netflix is 2,628. You know who the longest one is?
00:03:37.640 Microsoft. You know where Microsoft is at? 15,260 words. That's their terms of service.
00:03:45.960 When's the last time you read all the terms of service? When's the last time we read them? Do we
00:03:49.080 go through them? You know how long it would take if we went through all of these? That's what happens
00:03:52.180 when we just say agree, they get all our data. By the way, just a bonus for you, just to put that
00:03:55.980 in perspective, you know the Microsoft terms of service, the amount of words that it has? The book
00:03:59.680 Art of War by Sun Tzu is only 12,035 words. That means Microsoft's terms of service is longer
00:04:06.240 than The Art of War by Sun Tzu. Pretty wild. By the way, another thing to consider here when it
00:04:11.660 comes down to data, you ever see all these websites that at the bottom, like, I want to really read this
00:04:15.440 article. Accept all cookies. And what do we do? Accept all cookies. That's also data they're
00:04:19.600 getting. You know what percentage of websites have cookies? 42.4% of websites globally use cookies.
00:04:27.100 65% of respondents in a survey conducted by Deloitte expressed profound concerns regarding the excessive
00:04:31.840 use of cookies and its potential impact on their personal data. So this industry is such a big
00:04:36.800 industry. Do you know how many companies in America just focus on selling data? What do you think it is?
00:04:41.280 Like, how many real estate companies are out there? How many insurance companies are out there?
00:04:44.440 How many companies do you think in America just focus on selling data? You ready? You think it's
00:04:47.760 220? 580? How about 1,100? That's crazy. Try 4,000. 4,000 companies that specialize in selling
00:04:55.180 your data. And by the way, you know, they say in 2019, 45% of the information that they sold was for
00:05:01.240 marketing and advertising, the data brokers' market demand. But the question is, what do they do with
00:05:05.280 the other 55%? What's the information being sold? No one knows. There's a lot of people speculating,
00:05:08.960 but no one knows. A Pew Research survey revealed that 91% of adults agree or strongly agree that
00:05:15.440 consumers have lost control over their personal data. And by the way, which industry do you think
00:05:20.880 is the fastest growing in data brokerage? What industry do you think? Just take a wild guess.
00:05:25.420 By 2022, the health and pharma industry, anyone surprised, was projected to have the fastest growth
00:05:32.020 in data brokerage services. And by the way, whether you're in the military now or were previously,
00:05:36.440 or you have friends and family that are in it, this is a very interesting report from Duke
00:05:40.440 Sanford School of Public Policy. Companies are gathering, inferring, aggregating, and then
00:05:44.740 selling, licensing, and sharing data on Americans, as well as providing technological services based on
00:05:49.960 that data. The Duke report also said data brokers are selling data on U.S. military personnel. The data
00:05:55.960 brokerage ecosystem poses risk to national security by compiling large, detailed databases on military
00:06:02.200 personnel, and subsequently selling that data on the open market. Here's what they're gathering. Ready?
00:06:07.880 I'll go through it fairly quickly. Detailed information on service members, government records,
00:06:11.580 medical conditions, medical records, financial situation and credit scores, political affiliation,
00:06:15.800 religious identity, gender and sexuality, address and contact information, children and families,
00:06:20.480 surveys, healthcare directories, active military occupational data, on-base housing information,
00:06:25.140 prominent Veterans Affairs mortgage data, hobbies such as gambling,
00:06:28.200 or international travel. They want to know whether you gamble or not if you're in the
00:06:35.960 military. That's pretty wild. Detailed geolocation data that can be used to identify military locations
00:06:41.980 and movements. Let me get this straight. We're selling the data to tell whoever wants to buy it
00:06:47.600 where our military personnel is. I wonder who would want to buy that. I mean, I can't think of any
00:06:52.040 countries. Definitely China wouldn't be interested. I know Iran wouldn't be interested. Russia wouldn't be
00:06:56.600 interested in things like this. They don't want to know where our military personnel are. Not at all.
00:07:00.880 They're trying to mind their own business. Right. Utility and new phone connection records,
00:07:04.660 code forms, order forms, sweepstake forms, partnership with list providers, commissaries
00:07:09.320 on military approved buyers, partnership with over 900 sources, including data gathered from public
00:07:13.860 records, social media accounts, online purchase records,
00:07:19.320 public tax documents. Doesn't that have your social on it?
00:07:23.680 Credit reports, national clearing house records, phone, email, postal surveys, call center,
00:07:28.160 compilation, live feed, voter data, data from commercial resources, medical records, government
00:07:33.500 records, nonprofits, serving military and veteran causes, public records, all of this stuff.
00:07:38.500 They are selling that data. And by the way, what do you think your data is worth? If you're in the
00:07:43.700 military watching this right now saying, I'm a Navy SEAL. I'm an Army Green Beret. I'm a sergeant.
00:07:48.380 I'm an officer. I'm a Colonel. I'm a one-star general. You know what they're selling this per
00:07:52.640 military service member? What do you think it is? 200 bucks, a hundred bucks. You're like,
00:07:57.320 it's got to be worth more than that. What if I told you 22 cents? That's what they're selling it for.
00:08:02.360 You may be watching and say, I would want to buy it for my company, right? They're selling
00:08:07.100 it for only 22 cents a pop. And by the way, I could give you so many other weird stories. I want to give
00:08:12.700 you a couple of random stories. FYI, every year, you know, like your information gets stolen. Do you know
00:08:16.900 how many children's information gets stolen on a yearly basis? Children's identity. 915,000. By the
00:08:23.140 way, did you hear the story about the mother that stole her daughter's identity to go become a high
00:08:28.880 schooler so she can have pom-poms and be a cheerleader and she gets caught and goes to jail?
00:08:34.780 Of course, she's got issues. 33 years old to do this and she steals her daughter's information. By the
00:08:40.300 way, luckily, this is the mom doing it to the daughter. Imagine if a stranger is doing it to
00:08:43.640 somebody else. This is the story of Wendy Brown, who stole her daughter Jamie's information. That's
00:08:48.060 mom-daughter. 915,000 per year. But let me give you another one. I'm going to take you back to a story
00:08:52.320 in 2004. Back in the days, people have been stealing information for a long time. In 2004,
00:08:57.840 Facebook, I think, came out in 04, give or take. That's like when these guys come out. We're not
00:09:01.700 using social regularly. This guy named Philip Cummings is working at this company. He's got a regular
00:09:07.540 desk job at Teledata Communications in Long Island, New York. And he helped people run routine
00:09:13.080 credit checks. When he left his job, he packed up his belongings along with the confidential
00:09:17.940 information that belonged to 33,000 people. He and his accomplice ended up selling this information to
00:09:24.020 other people that were interested. You know how much they made in the end? The U.S. Department
00:09:28.400 of Justice estimated they made somewhere between $50 and $100 million. Data has been important for
00:09:36.200 a very, very long time. Back in the days, people did it this way. Today, it's a lot easier because you
00:09:41.720 use all these apps regularly and you're always accepting cookies and you're always accepting
00:09:45.580 terms of service. You're not sitting there for 10 hours reading all these terms of service. So
00:09:48.940 guess what? We're essentially giving them the permission to use all our data. And military
00:09:52.840 information is being sold for 22 cents apiece. So let me give you an idea with all this data stuff
00:09:58.480 where I'm not concerned and where I am concerned. So for example, if you're using the data to retarget and
00:10:03.960 sell me a product that I showed a little bit of interest in, cool. I get it. It's on me to say yes or no.
00:10:09.440 Now, some people are like, but that's not, that's not cool. They're targeting and all this. Stop it.
00:10:14.160 You know, if you want to go to a market, supermarket, go to Ralph's, go to Kroger's.
00:10:18.180 Where do you think they have their milk? Right in the front? The number one product people buy when
00:10:21.840 they go to a market is what? Milk. They put that shit all the way in the back because they want us
00:10:25.560 to do what? Walk all the way to the back. You're like, I'm only going to buy milk. And then you're
00:10:29.520 like, bam, bam, bam, bam, bam, bam, bam. You went to buy milk for five bucks. You end up leaving
00:10:33.980 having spent $88. They use it. You fell for it. I fall for it all the time. It's normal. Costco does it. Where do they put
00:10:39.040 all this stuff that we have to buy? Meat is where? 17 miles all the way down. You got to walk up and
00:10:43.940 then you got another 45 minutes of stuff you're going to pick up. Next thing you got a case of
00:10:47.500 mustard as if you're going to start a side business of selling mustard. Why do you need a 17-year
00:10:51.780 supply of mustard? We picked it up. You know what? I'm going to save some money, as if you're going to
00:10:55.000 use all that mustard in a month. So I don't mind if they're retargeting to sell me a new product or the
00:10:59.960 same product I'm interested in. I'm okay with that. What I am concerned about is the movie Minority Report,
00:11:04.440 if you remember it, which was about future behavior, future crimes you could commit. There's
00:11:09.160 this show on HBO called Westworld where they have this AI system that based on this AI, do they hire
00:11:15.300 you? Do they not hire you? Are you good to come on board? What if they do this? And what if they do
00:11:19.780 that? And you know, you're not a good person to hire because of what your behavior in the future is going to be.
00:11:22.940 That's the stuff that gets a little bit interesting where can people get arrested on a future crime? Can
00:11:28.640 people get held back for certain things? You know, China's already using the social credit system to track
00:11:33.580 what you can get, what you can't get, voting, where you're going to be at, and they put you in
00:11:37.240 a box. That's the part that's a little fishy to me. I have no problem with capitalists trying to put
00:11:41.980 their product in front of me to buy it over and over again. I'm okay with that. I'm a big boy. If I
00:11:46.280 buy it, great. If I don't buy it, I have to be able to control it. It's the other things that I'm a
00:11:50.880 little bit concerned about. So I'll give you a couple things as a solution while you're going through
00:11:54.260 this because I know some of you guys that are paranoid people, you're freaking out right now. Like,
00:11:57.500 oh my God, what do I do? Take all the apps down. Delete every app you got. Get off of everything,
00:12:02.240 right? So I'm not telling you that, but you have to know it's a different world we're living in.
00:12:05.580 It's better to be aware. Here's a few things on how to protect your personal data. Freezing your
00:12:09.660 credit is one way to do it. Being alert for scams. There's a lot of different apps you can use to give
00:12:14.960 you alerts when somebody is using your accounts or cards. I get so many alerts when somebody
00:12:19.220 buys something. How'd you know? Because I get alerts, so I know who's buying and who's using what. Practicing
00:12:23.680 good password hygiene, protecting your devices, locking up your stuff. A sound data security plan is
00:12:28.960 built on five key principles, according to the FTC. Number one, take stock. Know what personal
00:12:32.840 information you have in your files and on your computers. Number two, scale down. Keep only what
00:12:37.380 you need for your business. Number three, lock it. Protect the information that you keep. Number four,
00:12:42.020 pitch it. Properly dispose of what you no longer need. And last but not least, plan ahead. Create a plan
00:12:46.840 to respond to security incidents. So if you got value out of this video, give it a thumbs up. Subscribe to
00:12:51.120 the channel. By the way, this is probably an episode you may want to watch with your spouse, mom, dad,
00:12:55.160 grandparents. It's probably something everybody needs to watch or learn more about. But
00:12:58.340 I interviewed somebody a few years ago, a guy named Matthew Cox. You've probably seen him on TV.
00:13:02.320 This was the FBI's most wanted guy in the area of loopholes in the system. He actually breaks down
00:13:10.160 how he would defraud banks using other people's credit. Absolutely wild what he did. If you want
00:13:16.980 to learn exactly how they use it, you may as well learn it from somebody that did it for many years
00:13:20.680 and made millions of dollars doing this. If you've never watched this before, click here to watch it.
00:13:24.900 Aside from that, have a good day, everybody. Take care. Bye-bye, bye-bye.