Valuetainment - December 20, 2019


Episode 404: Misconception About Automation And Artificial Intelligence


Episode Stats

Length: 7 minutes
Words per Minute: 208.13
Word Count: 1,604
Sentence Count: 172
Misogynist Sentences: 1


Summary

Automation has given us thousands of hours back to do whatever we want to do with our lives. What are you doing with all that time? What do you do with it? What is it doing to you?


Transcript

00:00:00.000 How often have you heard experts and politicians talk about the fact that AI and automation
00:00:21.960 could be the biggest threat to society?
00:00:23.840 How often have you heard that?
00:00:25.060 Now here's the reality of it.
00:00:26.060 There is a community that's going to pay a big price for automation and AI, and I'll
00:00:30.680 get into that community here in a minute, but let me give you a completely different perspective
00:00:33.720 about automation and AI.
00:00:35.460 What if I told you automation and AI have given us thousands of hours back to do whatever
00:00:41.180 we want to do with our lives?
00:00:42.200 What do you mean, Pat?
00:00:42.940 Let me explain.
00:00:44.380 There used to be a time when we would spend Sundays putting our photo albums together. We'd go take
00:00:49.740 the pictures, then go to Costco or whatever store you went to, print the pictures,
00:00:54.280 come back, buy a couple of albums, and spend hours putting it together.
00:00:57.940 Then remember when the albums would tear?
00:00:59.660 You would need to put the tape over it, and then the kid tears your pictures.
00:01:02.860 You got to go back again, drive.
00:01:04.220 Oh my gosh, I don't want to do this again.
00:01:06.060 Or how about writing letters?
00:01:07.040 Remember writing letters?
00:01:07.720 You would write a letter to somebody, and then you really wanted to say what was on your mind.
00:01:11.740 They're like, oh no, I don't like this.
00:01:13.140 Then you would have to erase it.
00:01:14.020 And then you're like, I wrote a page and a half.
00:01:15.420 Then you would have to tear it apart, rewrite it.
00:01:17.140 You don't want them to see that you erased, because the word can still be read.
00:01:19.820 And then you have to do that again.
00:01:20.840 And then you have to go drive around the community, where are these blue boxes at?
00:01:24.100 Oh my gosh, where's the mailbox at?
00:01:25.940 And then you would get out of the car in the rain, drop it in the box, and then get back.
00:01:30.460 And finally, a week later, it gets out there, right?
00:01:33.380 Today, what do you do?
00:01:34.200 You text?
00:01:34.800 You email?
00:01:35.440 You don't like a sentence?
00:01:36.280 You just go back.
00:01:37.080 And then you send.
00:01:37.800 A second later, the person has it.
00:01:39.040 They can think about it, get back to you.
00:01:40.120 And they're like, oh my gosh, that's so heartfelt.
00:01:41.480 Thank you for this message.
00:01:42.700 Gym membership cancellations.
00:01:43.860 You have to send it in.
00:01:44.600 And gyms would always say, oh, we didn't receive your letter of cancellation.
00:01:48.140 Yes, you did.
00:01:49.500 No, we never received your letter of cancellation.
00:01:52.620 Nowadays, you go online, you unsubscribe, you cancel, whatever you want to do, right?
00:01:56.400 Automation.
00:01:57.240 Automation has given us so much.
00:01:58.420 Now, let's go worst case into automation.
00:02:00.140 Because back in the days, even TV is automation.
00:02:02.640 We had to get out of the car.
00:02:04.280 Your great-grandparents had to go to a place, drive.
00:02:07.200 You know, you didn't have AC or a heater back in the day.
00:02:09.520 And you'd go to a place and get entertained.
00:02:12.340 Now, you sit in your room, watch a show from the comfort of your bedroom, your living room.
00:02:18.580 You can Netflix and chill all you want.
00:02:21.060 If you chill in public, they're going to put you in prison.
00:02:24.800 You're being entertained.
00:02:25.760 That's automation.
00:02:26.720 How much time did that give back to you and your girl, or you and your boyfriend or your husband
00:02:29.620 or wife?
00:02:29.900 How much more?
00:02:31.240 It's automation.
00:02:32.680 But here's the thing.
00:02:33.760 Let's go to automation today with cars.
00:02:35.580 Oh my gosh, what if we're not going to drive cars anymore?
00:02:39.560 Cars are automated.
00:02:40.560 They take us places.
00:02:41.840 Planes, trains, a lot of things have been automated for a long time.
00:02:45.520 We're not panicking about that, but cars we are.
00:02:47.660 Here's the best thing about cars being automated.
00:02:50.100 What is it, Pat?
00:02:51.000 Let me explain.
00:02:52.200 If cars are automated, say you drive an hour a day.
00:02:56.060 Say that's 400 hours a year, right?
00:02:58.620 Now, when you're driving, what are you doing?
00:03:00.840 You're driving, hopefully you're paying attention.
00:03:02.360 Maybe you can respond to a text or two when you're at a red light.
00:03:04.380 But you can't really be doing a lot of different things, right?
00:03:06.720 Okay.
00:03:07.140 Now, let's say automation is here with cars.
00:03:08.820 You don't have to drive anymore.
00:03:09.740 You put an address, it takes you there, right?
00:03:12.100 Gives you 400 hours per year.
00:03:13.800 Here's the real question with automation.
00:03:15.880 Here's why I tell you none of this is new.
00:03:18.440 What are we going to do with that 400 hours that's given back to us?
00:03:22.500 Uh-huh.
00:03:23.440 What do you mean by that?
00:03:24.360 Here's what I mean by that.
00:03:25.800 We can easily use that 400 hours to watch more TV on our phone.
00:03:29.560 We can use that 400 hours to follow more sports on our phone.
00:03:33.300 We can use that 400 hours to play games on our phone.
00:03:35.640 We can use that 400 hours to just look at everybody's newsfeed on our phone.
00:03:39.420 We can use all that stuff.
00:03:40.680 Or we can use that 400 hours to buy a master class, to watch a YouTube channel that's going
00:03:45.320 to teach you some stuff, to read articles, to do research, to take a new course, to study,
00:03:49.400 to get smarter, to read books, to listen to audio books.
00:03:52.440 We can do that.
00:03:53.540 So the challenge is not automation.
00:03:55.540 The challenge isn't AI.
00:03:56.700 Because the reality of it is, automation and AI are going to make it so easy to create wealth.
00:04:03.320 Look how many rags-to-riches stories you're going to see in the next 10 to 20 years.
00:04:07.260 But do you want to know why politicians don't talk about the habits of people?
00:04:10.780 Because they lose voters.
00:04:12.660 That's what they do.
00:04:14.000 See, they don't want to talk about the habits of people because then people are going to
00:04:16.960 say, I don't like that guy.
00:04:18.440 He called me lazy, said I don't work.
00:04:21.040 I don't like that guy.
00:04:22.220 So what if I want to watch TV while the car is driving?
00:04:26.240 I want to entertain myself.
00:04:27.920 It's better to be healthy.
00:04:29.080 Why should I always be working?
00:04:30.500 I'm not telling you to always be working.
00:04:32.040 All I'm telling you is politicians are selling you what you want to hear, not what we need
00:04:36.480 to hear.
00:04:37.320 Unfortunately, the same people they're talking to are the people they're hurting the most.
00:04:41.500 It's so wild.
00:04:43.400 And we are so naive and we fall for this stuff because they're just wanting votes.
00:04:49.500 They just want to make sure they become presidents.
00:04:51.400 That's what they want.
00:04:52.960 How often do you think they're going to get out and actually give an action plan that
00:04:56.500 requires us to change something about ourselves?
00:04:59.620 It's us.
00:05:01.120 You know, it's us moving.
00:05:02.380 Stephen A. Smith's mother asked him, what could you have done differently?
00:05:05.840 Not what society could have done differently.
00:05:08.680 I mean, you go back and look at people who ended up doing something very big with their
00:05:11.920 lives.
00:05:12.520 They took responsibility.
00:05:14.560 They asked: what can I do differently?
00:05:16.600 What habits can you change?
00:05:17.780 What habits can I change?
00:05:18.860 I had terrible habits.
00:05:19.980 I had to change the habits.
00:05:21.740 It wasn't society's fault.
00:05:23.640 It wasn't my parents' fault.
00:05:25.480 It wasn't my friends' fault.
00:05:26.960 I control my decisions.
00:05:29.520 I am free.
00:05:30.740 You are free.
00:05:31.900 See, I believe in the betterment of people.
00:05:35.260 I believe you're capable.
00:05:36.920 I believe deep down inside you want to go out there and do something big with your life
00:05:39.600 so you own it.
00:05:41.280 So people say, you did it.
00:05:43.380 You found a group of people.
00:05:44.760 You got to work.
00:05:45.580 You became a good boss.
00:05:46.760 You became a good leader that people wanted to be around.
00:05:50.160 And guess what?
00:05:51.120 People behind your back say, that's a flippin' leader, man.
00:05:54.000 I respect that guy.
00:05:55.340 That guy gets things done.
00:05:56.840 That girl gets things done.
00:05:58.280 Those guys get things done.
00:05:59.580 And then you have that pride to say, I did something in my life.
00:06:02.660 So the next time you hear about automation and AI, take a completely different perspective.
00:06:07.180 What can you do to have automation and AI work in your favor for you to become smarter,
00:06:11.820 wiser, better, with new skills so you can have an edge in the marketplace?
00:06:15.280 So the next opportunity that comes up, you can pivot and boom!
00:06:18.520 Another rags-to-riches story.
00:06:19.840 You.
00:06:20.320 Not somebody you read about in Forbes magazine.
00:06:22.440 I'm talking about you.
00:06:24.240 Your story.
00:06:25.300 That's what this whole thing is really all about.
00:06:27.060 So don't let them fool you.
00:06:28.320 Don't be naive.
00:06:28.900 Keep your ears open, your eyes open, go do your own research, and then come to your
00:06:33.000 own conclusion.
00:06:33.640 Don't even believe me.
00:06:34.680 Go do your own research based on what I'm saying.
00:06:35.940 Say, Pat, you're wrong.
00:06:36.560 Great.
00:06:37.260 Go do your own research about it.
00:06:38.720 But don't stop there just because I offended you or you're hurt because I called you out on
00:06:43.360 something.
00:06:44.280 Do something about it.
00:06:45.780 Anyways, having said that, have a killer week this week.
00:06:48.400 Thanks, everybody, for listening.
00:06:49.580 And by the way, if you haven't already subscribed to Valuetainment on iTunes, please do so.
00:06:54.320 Give us a five-star.
00:06:55.720 Write a review if you haven't already.
00:06:57.060 And if you have any questions for me, you can always find me on Snapchat,
00:07:01.280 Instagram, Facebook, or YouTube.
00:07:03.240 Just search my name, Patrick Bet-David.
00:07:05.140 And I actually do respond back when you snap me or send me a message on Instagram.
00:07:10.140 With that being said, have a great day today.
00:07:11.880 Take care, everybody.
00:07:12.580 Bye-bye.