Valuetainment - January 31, 2026


"We Are The VIRUS" - AI CEO’s Doom Warning Sparks Tech Apocalypse FEARS


Episode Stats

Length: 21 minutes
Words per Minute: 186.6
Word Count: 3,962
Sentence Count: 271
Misogynist Sentences: 6
Hate Speech Sentences: 2


Summary


Transcript

00:00:00.940 I'm Kelly Kennedy. I've been called a hope dealer for leaders and with over 300 episodes of the Business Development Podcast, we'll guide you and we'll always be in your corner.
00:00:12.900 Follow the Business Development Podcast on Spotify and let's make 2026 your best year yet.
00:00:24.180 Investing is all about the future. So what do you think is going to happen?
00:00:28.200 Bitcoin is sort of inevitable at this point.
00:00:30.900 I think it would come down to precious metals.
00:00:33.480 I hope we don't go cashless.
00:00:35.600 I would say land is a safe investment.
00:00:38.100 Technology, companies.
00:00:39.300 Solar energy.
00:00:40.260 Robotic pollinators might be a thing.
00:00:42.900 A wrestler to face a robot. That will have to happen.
00:00:46.200 So whatever you think is going to happen in the future, you can invest in it at Wealthsimple. Start now at Wealthsimple.com.
00:00:54.380 Anthropic CEO comes out and says this, okay?
00:00:57.140 It says humanity needs to wake up to the dangers of AI.
00:01:01.620 And I think what he's specifically talking about, Tom, is super AI, which I'm coming to you on this one.
00:01:07.740 But, Rob, I think you've got a clip on this.
00:01:09.820 Humanity needs to wake up to the potentially catastrophic risks posed by powerful AI systems in the years to come.
00:01:16.740 Is this him, Rob?
00:01:17.480 Yes, sir.
00:01:17.820 And, by the way, just so the audience knows, Anthropic is – how do you explain Anthropic, Tom?
00:01:22.560 It's a $350 billion AI company.
00:01:25.100 They just 2X'd in valuation in a few months.
00:01:27.700 Everybody wanted to get in on Anthropic.
00:01:29.720 It's the next big thing that people are looking at.
00:01:32.040 Here's what their CEO had to say.
00:01:33.360 Go ahead.
00:01:33.640 You know, I think if we wait three years, like this technology is progressing exponentially, right?
00:01:40.260 Three years ago in 2023, the models were maybe as smart as like a smart high school student.
00:01:45.320 Now we have engineers at Anthropic where the model writes all the code for them, you know, and the engineer maybe edits it.
00:01:52.480 But we're very close to, you know, mid to high professional level, right?
00:01:58.060 And so that was just in three years.
00:01:59.640 If we wait another year, three years, I think we'll get what I call in the essay a country of geniuses in the data center, maybe less than three years.
00:02:08.980 And so, you know, three years is an eternity in this field.
00:02:13.480 And so I think we absolutely need to act before then.
00:02:16.980 One place where I really have hope is I think as these problems start to manifest, again, they're not going to be partisan, right?
00:02:25.500 Like, you know, it may start with, you know, one party or one side having an anti-regulatory ideology.
00:02:32.480 But I think as these problems become real, there's going to be a demand among everyone.
00:02:36.960 And in the note, you outline the different risks, whether it's bioterror or whether it's authoritarian regimes with too many tools to do subversive behavior.
00:02:47.660 Like what?
00:02:48.880 Like how worried are you?
00:02:50.200 I mean, you're obviously worried enough to state it and you're worried enough to raise it.
00:02:54.200 But like in your mind, how likely is that outcome, particularly if we don't do anything for the next three years?
00:03:00.140 Is it like a one percent or like, no, no, if you don't do anything for three years, like we could be screwed.
00:03:04.640 Yeah, it's always hard to tell.
00:03:07.520 One of the things I say in the essay is, you know, we just don't know, right?
00:03:12.440 We could look back and we could say, ha-ha, AI-driven bioterror.
00:03:15.640 You know, that was, you know, that sounded like it could happen at the time.
00:03:19.540 But like, you know, it just didn't happen at all.
00:03:24.680 And it's very unpredictable.
00:03:26.480 You know, the way I would say it.
00:03:27.960 Tom, what are you at with this?
00:03:28.880 So, 2023, 2024 for AI, those were the years of the chip.
00:03:35.920 NVIDIA comes out of nowhere.
00:03:37.280 Everybody asking each other, do you own NVIDIA?
00:03:39.180 Oh, my gosh.
00:03:40.340 And then the rest of us looking at it and saying, you know, why is NVIDIA coming out of nowhere?
00:03:43.700 So the number of chips is out there.
00:03:45.040 Then last year, the AI agent showed up, as well as the collective realization that the data centers
00:03:53.340 that are planned today are going to take 12% of the power in the U.S.
00:03:56.140 And it says, wait a minute, wait a minute.
00:03:57.680 12% of the power?
00:03:58.720 That means there must be a hell of a lot of processing and a hell of a lot of data centers.
00:04:02.880 That's correct.
00:04:04.480 That's absolutely correct.
00:04:05.560 And now we're getting, Pat, to the age of the AI agents.
00:04:09.420 Well, the AI agent is basically AI that can do something for you.
00:04:13.480 Automatically answering emails is a simple one that's available to everybody right now
00:04:18.100 using Microsoft Copilot.
00:04:20.380 Certain emails, certain rules, it'll automatically respond for you with a simple answer
00:04:25.280 or find an open place on your calendar.
00:04:28.300 That's an AI agent.
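To make that concrete, here is a minimal sketch of that kind of rule-driven email agent in Python. The rules, names, and canned replies are hypothetical, invented for illustration; a real product such as Microsoft Copilot wires this logic into live mail and calendar APIs rather than hard-coded string matching.

    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class Email:
        sender: str
        subject: str
        body: str

    def agent_reply(email: Email) -> Optional[str]:
        """Apply simple rules: answer what matches, leave the rest to a human."""
        text = f"{email.subject} {email.body}".lower()
        if "schedule" in text or "meeting" in text:
            # A real agent would query the calendar API for open slots here.
            return "I'm free Tuesday 2-4 pm. Does that work for you?"
        if "invoice" in text:
            return "Thanks, received. Accounting will confirm within two days."
        return None  # no rule matched, so the email stays in the human's queue

    msg = Email("client@example.com", "Can we schedule a call?", "Sometime next week?")
    print(agent_reply(msg))  # -> "I'm free Tuesday 2-4 pm. Does that work for you?"

The fallback is the important design choice: anything no rule covers stays in the human's queue, the same division of labor the Anthropic clip describes for code, where the model drafts and the engineer edits.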
00:04:29.580 What he's talking about is the advance in the computational power and the library, also
00:04:35.920 known as the LLM, the large language model, that it has to use, such that a bad actor
00:04:42.140 learns how to build a nuclear bomb much quicker or learns how to deploy a biotech poison much
00:04:49.940 quicker.
00:04:50.480 And what's happening is that now the agents and the collective intelligence behind the agents
00:04:56.700 are sort of the third, the fourth wave.
00:05:00.480 Chip, data center, agents, and now collective processing power and super AI being used for
00:05:07.420 bad.
00:05:08.140 Okay.
00:05:08.780 You can look back in history and say, oh my gosh, the folks that invented the nuclear
00:05:14.180 weapon, we need to regulate and control this.
00:05:17.200 We need to figure this out.
00:05:18.380 We're about to go to what will be the next wave, which is levels of control through oversight.
00:05:26.700 So if you think that Google listening on your device or your Alexa device to sell you ads
00:05:33.440 is spooky, the backend of AI is going to know who's looking for what.
00:05:38.600 And the government is going to, by default, have to be snooping on that to see who's using
00:05:44.300 AI to find out what is the best stream in America to put a viral poison in to kill the
00:05:50.800 most number of people based on snowpack, runoff, melting, and the speed of the stream.
00:05:57.140 People are going to be asking those questions.
00:05:59.560 Where's the best place for a suicide bomber to kill the maximum number of people in New
00:06:04.180 York City on Black Friday?
00:06:06.200 Oh, well, it'll look at traffic.
00:06:08.540 So the government's going to have to see this.
00:06:10.800 And that's what he's talking about.
00:06:12.160 Now we get to the super AI and the ability of bad people to do really bad things and to
00:06:18.820 be enabled by it.
00:06:20.180 Now the government's going to have to be there on the back end, which I'm not endorsing,
00:06:24.580 but I'm saying that's what he's describing, and he's at Anthropic, which is building these models and knows what
00:06:31.280 it's talking about in terms of the power of them.
00:06:34.300 Sorry for the long answer, but I wanted to take people through what they've been hearing
00:06:38.080 in the last four years.
00:06:38.920 Yeah, say six years ago, could you have imagined what ChatGPT does right now?
00:06:45.740 Even the concept of it, no.
00:06:47.160 We all knew about AI, we knew about technology advancement, but just the sheer abilities
00:06:51.780 of what ChatGPT or Grok or Gemini could do, never thought that could happen.
00:06:56.280 So yeah, the craziest thing about it is the speed of technology, how it goes up at an exponential
00:07:01.620 rate.
00:07:02.140 So don't know what the hell is going to be around the corner.
00:07:04.580 Don't know if we're even going to have control over it, which is the weird part:
00:07:08.100 it gets to the point where these things start communicating with each other.
00:07:11.240 Like if you've heard a bunch of stories where Google or Amazon had to shut it down because
00:07:16.360 they started talking to each other.
00:07:18.040 So I don't know, just what happens when AI starts talking to each other and deems people
00:07:22.460 inefficient or starts trying to have a mind or a philosophy of its own.
00:07:27.000 I don't know how far away we are from that, but that's the creepy part to me.
00:07:31.680 Right.
00:07:31.780 Well, all technology kind of starts out clumsy, right?
00:07:36.320 So I think it'll get better.
00:07:38.000 But when the people that make it are warning you, I think you should be listening to what
00:07:44.800 they're saying.
00:07:45.740 I just can't understand what the danger is.
00:07:50.540 So I'm going to go read that, because what would be the danger?
00:07:53.400 Like, what do they think AI is going to become, to where we're
00:07:59.940 the problem, we're the virus, so it eliminates us?
00:08:02.060 What Terminator?
00:08:03.000 What's the possibility of AI becoming so super intelligent?
00:08:06.760 I just, in my mind, I'm thinking, well, that just makes life easier.
00:08:10.900 How's it going to, what's it going to do?
00:08:12.300 Why are, why is it dangerous?
00:08:14.660 Yeah.
00:08:15.280 Let me tell you where I'm at with that.
00:08:16.920 Let's just say we go to the horror that they're selling: jobs are all going to be
00:08:23.460 replaced, people are not going to have jobs, they're not going to have income, they're
00:08:29.420 not going to be able to pay bills, who are their customers?
00:08:33.480 Who buys from them?
00:08:35.440 Do you understand what I'm saying?
00:08:36.380 Yeah.
00:08:36.660 Then who buys from them?
00:08:37.580 By the way, what happens to mankind, what happens to any society when people don't
00:08:44.900 make money, don't have anything to lose, don't have a job, don't have a family, who loses?
00:08:51.540 Let me ask that question one more time.
00:08:53.700 What happens to any society where people don't own homes, they don't have jobs, they're not
00:08:59.360 having their dreams become a reality, they have nothing to lose, what does that society
00:09:03.220 look like?
00:09:03.840 That's anarchy.
00:09:04.480 If no one has anything to lose, you have the worst of human impulse and you have chaos.
00:09:10.820 Who loses?
00:09:11.780 Who loses?
00:09:12.740 Everyone.
00:09:13.740 You know who loses?
00:09:15.100 The people of power.
00:09:16.620 Right.
00:09:17.380 That's who loses.
00:09:18.760 So guess what?
00:09:19.540 If they go in this direction where you lose your customers, you lose your employees, you
00:09:24.060 lose your consumer, you lose your buyer, what the hell are you producing?
00:09:27.420 Who cares what you're producing?
00:09:28.920 You heard the story about Amazon that came out and they're shutting down their, what was it,
00:09:33.080 21, 23 cashless, cashier-less stores in California?
00:09:38.500 Oh yeah, the, what is the brand name they use for those stores?
00:09:43.180 It's not Whole Foods.
00:09:43.980 Is it Whole Foods or is it, which one is it?
00:09:45.500 Amazon Fresh.
00:09:46.720 Amazon Fresh.
00:09:47.640 Right.
00:09:47.860 Is this the one that they're closing?
00:09:49.140 Rob, you want to play this clip?
00:09:50.220 Yes.
00:09:50.560 Go for it.
00:09:51.000 Why the change in strategy?
00:09:52.740 Yeah.
00:09:53.300 Closing and shuttering what was already a diminishing footprint, right?
00:09:56.220 There were only 14 Go stores.
00:09:58.020 We have one of them or had one of them here in SF and 58 Amazon Fresh stores.
00:10:02.060 And as you point out, that real estate gets converted to Whole Foods.
00:10:05.760 This is not Amazon saying that they are done with grocery.
00:10:08.780 Far from it.
00:10:09.640 It was interesting when those headlines hit.
00:10:11.320 Clearly there's an algo reaction, right?
00:10:13.020 And Amazon pared some of its gain.
00:10:14.720 It's now back up to a six-tenths of a percent gain.
00:10:16.740 But Walmart took a leg lower because the commitment from Amazon is very much to grocery in the context of sub-same day, right?
00:10:25.420 They've been focused very much on the online offering and doing those hourly deliveries and slots.
00:10:30.400 But foot traffic's been falling away on these early locations.
00:10:34.160 I don't know if you guys have ever been into an Amazon Go.
00:10:36.180 Just so you guys know.
00:10:36.940 Audio continues, but the video stops.
00:10:38.480 Facility.
00:10:39.240 Robby, if you want to pause it.
00:10:40.900 So, you know, the question then becomes, you're saying, you know, the fears.
00:10:45.820 What are they afraid of?
00:10:47.040 I need to go read this article.
00:10:48.180 Why would you be afraid of what's going to be taking place?
00:10:51.380 Tom, Amazon's move that they're going to, okay?
00:10:54.840 And they're bringing it back to Whole Foods.
00:10:56.520 What patterns are you noticing of companies realizing, you know, even with, let's just say, AI gets so good that you can tell it to create videos and teach XYZ content on videos?
00:11:11.760 You saw what YouTube said.
00:11:12.880 YouTube said we're not paying you monetization if it's not a human person.
00:11:17.160 July 15th of last year, I think YouTube came out with a new rule that if it's not a human being creating the content, all the, I don't know what word they use, the faceless content that you're creating, we're not paying you AdSense.
00:11:31.620 We want people creating content.
00:11:33.900 More and more companies are going to go to, hey, we kind of need people.
00:11:38.020 We kind of want people.
00:11:39.480 We kind of need you guys to also have jobs.
00:11:41.360 Can we pay you?
00:11:42.240 I actually think this AI thing is going to go this way, and then it's going to go that way.
00:11:47.160 It's going to be like, no, our differentiators, when you come to our store, we have people.
00:11:52.340 We have human beings.
00:11:53.720 You will not be talking to robots.
00:11:56.200 I think people are going to go that route.
00:11:57.560 This is a different story about Amazon, though, Tom, if you want to give us an update on this story.
00:12:01.240 Amazon, as the analyst said very, very correctly, Amazon is not giving up on groceries.
00:12:07.180 They have Whole Foods, and they have that.
00:12:08.820 And more importantly, if you go into Whole Foods, the locker space has doubled over the last 18 months.
00:12:15.940 And those are the Amazon lockers that are on one side, where, when we were all growing up, the 50-pound bags of dog food would be stacked up along with bags of charcoal.
00:12:26.940 Well, now you have these lockers where you go up there and you open it, and that's where your Amazon delivery was.
00:12:32.460 It was in one of those lockers.
00:12:33.600 So Whole Foods is alive and well and being well-used and exploited by Amazon.
00:12:37.660 They tried to make these Amazon Fresh ghost stores so they could save on labor because grocery has a very, very tight margin in the first place.
00:12:49.040 Hey, we'll make convenient ghost stores.
00:12:50.720 You walk in with your phone.
00:12:52.000 We'll see that it's your phone, and you can Apple Pay walking out.
00:12:54.780 It'll just beep.
00:12:55.740 Well, then there were the people with a hoodie and a baseball cap and dark glasses that had no phone on them because they knew the game and walked up, took five things, and walked out.
00:13:07.520 Boy, this is easy.
00:13:09.060 So there was shrinkage.
00:13:10.760 So basically, Amazon couldn't make this thing work.
00:13:15.580 They couldn't make Amazon Fresh and the ghost stores work.
00:13:18.340 So they're closing it and putting all their effort and grocery back into Whole Foods.
00:13:24.100 That's what happened.
00:13:25.420 And so they found out that, you know, human security guards or humans checking people out were really helpful to curb shoplifting.
00:13:35.500 Versus a guy walks in, I'll have two Red Bulls and that sandwich and walk out.
00:13:40.280 We'll scan his phone.
00:13:41.400 They deliberately didn't have phones in their pockets.
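As an aside, the failure mode described here is easy to see in a toy model of the walk-out flow. Everything below is invented for illustration, prices included; the real stores used camera and shelf sensors, nothing this simple. The point is just that billing keyed to a detected phone means an undetected shopper walks out as pure shrinkage.

    PRICES = {"red bull": 3.50, "sandwich": 6.00}  # hypothetical prices

    def walk_out_charge(items, phone_id=None):
        """Bill the account tied to the phone seen at the gate, if any."""
        total = sum(PRICES[item] for item in items)
        if phone_id is None:
            # No phone detected: nothing to bill, and the goods leave as shrinkage.
            print(f"shrinkage: ${total:.2f} walked out unbilled")
            return 0.0
        print(f"billed {phone_id}: ${total:.2f}")
        return total

    walk_out_charge(["red bull", "red bull", "sandwich"], phone_id="acct-1138")
    walk_out_charge(["red bull", "red bull", "sandwich"])  # the hoodie shopper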
00:13:43.320 Yeah, I think that's what I'm saying.
00:13:44.080 I think the more AI accelerates, the more these big AI companies are going to sit there and say, we need people.
00:13:51.640 I think that's what's going to happen.
00:13:53.100 By the way, you were one of the first guys that built an online university.
00:13:58.580 You have 1,000 business owners that have their stuff with you at Lightspeed, and you've got millions of customers, right, millions of clients that use your technology.
00:14:05.380 What have you seen over the years when you first started and then seeing other guys that are going ahead of AI?
00:14:10.800 What have you done to compete with how fast AI is growing?
00:14:14.620 Well, we leverage it.
00:14:16.360 You know, we drop it into the technology so the knowledge is more quickly created, delivered, tracked, and measured.
00:14:26.720 So, again, I mean, I think before the knowledge was the key.
00:14:31.380 Well, now for $20, I have a doctor in my pocket, a nutritionist, a lawyer.
00:14:36.460 So, knowledge is getting commoditized.
00:14:42.740 You're going to have to understand, at least we do, that training is all about good content, repetition, practice, and accountability.
00:14:52.220 And before, people were paying for content.
00:14:54.780 Now content is very inexpensive.
00:14:58.140 And that's a good thing for us.
00:14:59.940 I think it's a good thing for human beings.
00:15:01.760 Why?
00:15:02.000 Well, because now you're just a choice away from becoming more relevant, more successful, and more, you know, at the end of the day, it's a good thing to me.
00:15:13.540 I am not afraid of AI.
00:15:15.160 I can't understand the dangers in it so far.
00:15:18.720 But if everyone's out of work, Patrick, don't you think they would just start giving out a basic income?
00:15:25.180 I think that's terrible.
00:15:26.400 It's the plan.
00:15:26.880 Well, yeah.
00:15:27.360 I think that's terrible because when a group of people, the majority, don't have something they're in pursuit of and you give them income, that's a horrible society.
00:15:41.520 That's a horrible society.
00:15:43.360 Problems will happen.
00:15:45.800 People with time on their hands will start doing horrible things.
00:15:49.400 It'll be catastrophic.
00:15:50.760 That's not a society you want to live in.
00:15:52.640 If people don't have something to do that they're in pursuit of, I do not want to live in that society.
00:15:57.620 I don't want to live in that society.
00:15:59.180 I want to be around other people where you're going after something, you're going after something, you're going after something.
00:16:04.460 Great.
00:16:04.940 Collectively, we have to make sure the city is going to be safe.
00:16:07.620 We have to make sure collectively, oh, you know what?
00:16:10.260 Send us more money.
00:16:11.320 Send us more food.
00:16:12.240 Send us free this.
00:16:13.660 That is a scary, scary society.
00:16:16.200 And by the way, it's been tested.
00:16:18.480 It's not like UBI has never been tested.
00:16:20.020 Alaska tested it for a minute.
00:16:22.780 Many countries have tested UBI.
00:16:25.480 When Andrew Yang suggested this, I don't know which election it was when he brought this up.
00:16:29.780 Was it 2015-16?
00:16:31.380 Or was it?
00:16:31.760 No, I don't think it was 2020 with Biden.
00:16:33.360 I think it was.
00:16:34.200 Was it 15-16 or was it previous to that?
00:16:36.740 Maybe it was 15-16.
00:16:38.520 Something like that.
00:16:39.660 It was 2020?
00:16:40.840 Okay.
00:16:41.040 Oh, sure.
00:16:41.460 It was 2020.
00:16:42.040 So when he introduced this $1,000 a month thing, it was like, yeah, it's a good idea.
00:16:45.740 And it was like a Milton Friedman negative income tax.
00:16:49.040 What did Milton Friedman call it, Rob?
00:16:51.080 Can you type in Milton Friedman negative income tax that he proposed years ago?
00:16:58.100 Yeah, negative income tax. In his 1962 book, Capitalism and Freedom: people with incomes
00:17:05.140 below a specific level receive supplemental payments, negative taxes, from the government,
00:17:10.100 while those above it pay positive taxes. The work incentive: unlike traditional welfare, which
00:17:15.840 often reduces benefits dollar for dollar,
00:17:18.960 the negative income tax reduces benefits by only a fraction for every dollar earned,
00:17:23.460 encouraging recipients to find employment.
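To make the mechanism concrete, here is a short sketch of that payment rule in Python. The $10,000 threshold and 50% subsidy rate are illustrative assumptions, not figures from the book; the point is that each extra dollar earned cuts the benefit by only 50 cents rather than dollar for dollar, so working more always raises total income.

    # Hypothetical parameters for illustration: a $10,000 threshold and a
    # 50% subsidy rate on the shortfall below it.
    def nit_payment(earned_income: float,
                    threshold: float = 10_000.0,
                    rate: float = 0.50) -> float:
        """Government transfer (the 'negative tax') for a given earned income."""
        shortfall = threshold - earned_income
        return rate * shortfall if shortfall > 0 else 0.0

    # Earning $4,000 yields 0.5 * (10,000 - 4,000) = $3,000 ($7,000 total);
    # earning $6,000 yields $2,000 ($8,000 total), so working more always pays.
    print(nit_payment(4_000))  # 3000.0
    print(nit_payment(6_000))  # 2000.0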
00:17:25.980 I think we got to keep the society involved.
00:17:28.920 I think we do.
00:17:30.120 Too much time on your hands is not a good thing.
00:17:31.860 People should start focusing on things that are done with labor, your hands.
00:17:37.840 Because again, I don't see AI replacing your toilet or rewiring your building.
00:17:43.280 So I think blue collar is going to become white collar.
00:17:45.800 White collar is going to start working for the blue collar.
00:17:48.200 So either own the business or get good with your hands.
00:17:52.540 Folks, once a year, we host an event called the Sales Leadership Summit.
00:17:58.620 That event is coming up in the next two months.
00:18:02.100 It'll be in South Florida at Trump Doral.
00:18:06.060 And it's for those of you that run a business because sales is king.
00:18:09.720 Most people don't understand the power of developing sales leaders that develop sales people.
00:18:15.420 This video will break down what's happening at SLS.
00:18:17.980 And hopefully, those of you guys that are doing a million plus, you'll get a chance to get a ticket for yourself.
00:18:22.380 Go ahead, Rob.
00:18:22.760 Play the clip.
00:18:23.840 So many years ago, I realized the size of your income, your net worth, your lifestyle is a pure reflection of the size of problems you solve.
00:18:30.460 So for me, going back 20-some years ago, I was a good salesperson.
00:18:34.460 I learned how to sell.
00:18:35.940 I knew if I ran three, four appointments a day, I could sell two, four, six, maybe eight insurance policies on a given day.
00:18:42.020 Then I asked myself, how do I sell 50 in a day?
00:18:44.560 How do I sell 100 in a day?
00:18:45.880 There's no way I can do it by myself.
00:18:47.580 I had to solve a big problem and go from being a salesperson to being a sales leader.
00:18:52.360 By the way, it is very different being a salesperson than being a sales leader.
00:18:56.600 That's a massive problem to try to solve.
00:18:58.880 What happened later on?
00:19:00.060 I went from selling two to three policies a day personally to eventually we sold one million insurance policies with our company.
00:19:08.160 And we sold that company for $250 million three years ago, licensing 60,000 insurance agents.
00:19:13.520 We solved a massive problem, got paid massively.
00:19:18.100 Let me bring it back to you.
00:19:19.560 In America today, according to the Bureau of Labor Statistics, we have 13.4 million salespeople who go out selling every day working for commission.
00:19:27.080 We have 132,000 VP of sales, according to LinkedIn, and we have roughly 8,000 chief sales officers in America.
00:19:33.920 Now, when it comes down to putting on the hat of being a sales leader, it's a very different job: accountability, tough conversations, challenging people, transferring your knowledge on how to get referrals, how to follow up on leads, how to properly follow up without offending the person, the script you use when you DM versus when you email versus when you make a phone call, how to give better presentations, the types of phone calls to make, the types of contests to run.
00:19:57.000 How to hold them accountable and drive them without upsetting them, so they still want to come.
00:20:01.040 How do you steer competition?
00:20:02.820 All of these are things that companies which solve big problems and become multibillion-dollar companies do.
00:20:09.540 So once a year, I host a conference called the Sales Leadership Summit.
00:20:14.480 This happens once a year.
00:20:15.720 To attend this, you need to do a minimum of a million dollars a year and have five sales people that report to you.
00:20:21.920 If you want to join us at this year's Sales Leadership Summit that happens end of March, we'll go through a 200-page manual together on how to go through A through Z of being a great sales leader.
00:20:33.060 Click on the link below, fill out the information.
00:20:35.680 One of our representatives from Bet-David Consulting will reach out to you and tell you more about the Sales Leadership Summit.
00:20:41.380 Rob, what is the website to go to this?
00:20:44.000 Do we have it in the link?
00:20:45.200 We do.
00:20:45.560 It's in the description.
00:20:46.920 It's also pinned to the chat.
00:20:48.580 Can you click on a link just to see what it looks like so everybody sees it?
00:20:51.580 So, if that's you, click on the link.
00:20:53.940 We'll spend two days together at Trump Doral.
00:20:55.960 And we go through A through Z.
00:20:57.140 And it's a great place to network with other performers that are also doing well.
00:21:01.520 There you have it.
00:21:02.220 So what's the website called, Rob?
00:21:03.920 SLS.BetDavidConsulting.com.
00:21:06.560 Beautiful.
00:21:07.160 All right.
00:21:07.600 Fantastic.
00:21:08.400 If you enjoy this video, you want to watch more videos like this, click here.
00:21:11.120 And if you want to watch the entire podcast, click here.