In this episode, we discuss the current state of the U.S. political system and how to deal with it. We also discuss the impact of the immigration reform bill and how it could shape the future of the country.
00:13:21.580You know, everything is going to be radically transformed because anything that we apply thinking to is going to be very much transformed by it.
00:13:30.400We did an interview with a woman called Casey Means.
00:13:34.480She's a Stanford-educated surgeon and really one of the most remarkable people I have ever met.
00:13:40.760In the interview, she explained how the food that we eat, produced by huge food companies, big food, in conjunction with pharma, is destroying our health, making this a weak and sick country.
00:13:54.240The levels of chronic disease are beyond belief.
00:13:57.180Casey Means, who we've not stopped thinking about ever since, is the co-founder of a healthcare technology company called Levels.
00:14:06.120And we are proud to announce today that we are partnering with Levels.
00:14:12.580Levels is a really interesting company and a great product.
00:14:15.640It gives you insight into what's going on inside your body, your metabolic health.
00:14:20.300It helps you understand how the food that you're eating, the things that you're doing every single day are affecting your body in real time.
00:14:52.140But the bottom line is big tech, big pharma, and big food combine to form an incredibly malevolent force, pumping you full of garbage, unhealthy food with artificial sugars, and hurting you and hurting the entire country.
00:15:07.820So with Levels, you'll be able to see immediately what all this is doing to you.
00:15:11.160You get access to real-time personalized data, and it's a critical step to changing your behavior.
00:15:17.100Those of us who like Oreos can tell you firsthand.
00:15:20.380This isn't talking to your doctor in an annual physical, looking backwards about things you did in the past.
00:15:25.660This is up to the second information on how your body is responding to different foods and activities, the things that give you stress, your sleep, et cetera, et cetera.
00:15:52.880This is the beginning of what we hope will be a long and happy partnership with Levels and Dr. Casey Means.
00:15:58.860They speak of darkness and danger, but totalitarian novels also give us hope,
00:16:04.820showing us how to defend our society from the horrors of tyranny.
00:16:09.280In Hillsdale College's free online course, Totalitarian Novels, Hillsdale President Larry Arnn teaches us lessons from classic novels like George Orwell's 1984 that are as relevant today as ever.
00:16:22.240Sign up now for Hillsdale College's free online course, Totalitarian Novels, at tuckerforhillsdale.com.
00:18:34.580People just haven't yet experienced all of that.
00:18:37.660And so you're very quickly going to be in a situation where the problems are going to be given to it.
00:18:45.500You're going to ask it for strategies and so on that can take into consideration all of the things that are happening everywhere and how the cause-effect relationships work.
00:20:43.800Can you take kind of the art and the guessing out of it at this point?
00:20:46.620So you're saying, you've said this many times and written a lot about it, the need for governments to get their budget deficits down to 3% of GDP.
00:22:29.980And there'll be maybe five measures of inflation.
00:22:33.520And we're using the term inflation because our minds are limited in their capacity for the number of things we can think about.
00:22:42.540Now that we're in this new reality, which we are, you can go down to essentially a molecular level and say, I can see all the different transactions, what was bought and what was sold and why.
00:22:57.940And now I can really have a level of understanding.
00:23:01.460We don't have to be at this grand level anymore.
00:23:04.820We're going to be at the molecular level of understanding individual transactions and what's affecting them and be able to deploy resources at the individual molecular level, just like we can do it in biology or physical existence and so on.
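Here is a minimal sketch of the idea described above: instead of relying on one aggregate inflation figure, price change can be computed from individual transactions and rolled up by category. The categories, prices, and the price_change helper below are all invented for illustration; this is not any particular system's actual method.

```python
from collections import defaultdict

# (category, price_last_year, price_this_year, quantity) for individual purchases;
# all values are invented placeholders
transactions = [
    ("groceries", 4.00, 4.40, 10),
    ("groceries", 2.50, 2.60, 5),
    ("rent", 1500.00, 1620.00, 1),
    ("fuel", 3.20, 3.00, 30),
    ("services", 80.00, 88.00, 2),
]

def price_change(rows):
    """Spending-weighted price change for a set of transactions."""
    old = sum(p_old * q for _, p_old, _, q in rows)
    new = sum(p_new * q for _, _, p_new, q in rows)
    return (new - old) / old

by_category = defaultdict(list)
for row in transactions:
    by_category[row[0]].append(row)

# one "molecular" reading per category, plus the single aggregate number
for category, rows in sorted(by_category.items()):
    print(f"{category:10s} {price_change(rows):+.1%}")
print(f"{'aggregate':10s} {price_change(transactions):+.1%}")
```

The same transaction-level records support both the single aggregate number and arbitrarily finer readings, which is the "molecular level" point being made here.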
00:23:22.800So, I mean, there will be lots of downsides to AI.
00:24:20.480What I mean is, it's going to feel like, whew, what you're going through over the next five years.
00:24:25.860And that environment comes from these five major forces, all of these things, and the changes in technologies, particularly artificial intelligence and related technologies.
00:24:38.840So the world five years from now is going to be a radically different world.
00:24:44.120And I don't know what that's going to look like.
00:24:45.900When you go into the world of quantum computing and what quantum is like in so many different ways, it raises questions of, you know, what is that like?
00:24:54.820I'm not smart enough to tell you what that world is going to look like.
00:25:09.420Well, that's, I'd say my business is to try to predict, but I'd say, first thing, whatever success I've had in life has been due more to my knowing how to deal with what I don't know than to anything I know.
00:25:32.460So, yes, my business in a nutshell is that I try to find a bunch of bets that I think are good bets, but I diversify well so that I have a bunch of diversified bets, because I do not know.
00:25:46.620I mean, in terms of my actual track record, I've probably been right about 65% of the time.
00:26:38.380I can place some bets that allow me, you know, they're not the certain bets, but I can place enough bets and have enough diversification that I can be relatively confident of some things.
00:26:52.280But never absolutely, totally confident.
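Here is a minimal sketch of the diversification logic just described, assuming each bet is independent and right about 65% of the time (the figure mentioned above). The ±1 payoff, the 25-bet portfolio size, and the simulation itself are invented for illustration; real bets are never fully independent.

```python
import random

random.seed(0)
HIT_RATE = 0.65      # assumed probability any one bet is right
N_BETS = 25          # independent bets in the hypothetical portfolio
N_TRIALS = 100_000   # simulated runs

def portfolio_return(n_bets):
    # each bet gains +1 when right, loses -1 when wrong
    return sum(1 if random.random() < HIT_RATE else -1 for _ in range(n_bets))

single_positive = sum(portfolio_return(1) > 0 for _ in range(N_TRIALS)) / N_TRIALS
many_positive = sum(portfolio_return(N_BETS) > 0 for _ in range(N_TRIALS)) / N_TRIALS

print(f"one bet positive:           {single_positive:.1%}")   # roughly 65%
print(f"{N_BETS} diversified bets positive: {many_positive:.1%}")  # well above 90%
```

No single bet is reliable, but a portfolio of many such bets comes out ahead far more often, which is the sense in which one can be relatively confident without being certain.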
00:26:56.160But, coming back to it anyway, that's the reality.
00:26:59.380I'm just describing our reality the best I can.
00:27:03.160That's why people who are confident about the future are really just experiencing the present. Right now, people, all of them, are describing how things are.
00:27:16.940And almost everybody thinks the future is going to be a modified version of the present.
00:27:30.000Well, I'll guarantee you, there will be big changes.
00:27:33.220So those are dumb people, you're saying, making those comments.
00:27:35.700I'm not. I'm saying it's understandable, but when you study change and the nature of change, you see that the world changes in dramatic ways because of causes that we can look at and get a good understanding of.
00:27:54.360But we can't be sure about anything because of the nature of change.
00:28:00.080But in this specific case, I mean, that's always true and wise people understand that.
00:28:03.860Like, you don't, you're not in control of the future.
00:28:06.420But in this specific case where there are specific technologies whose development we understand because we're watching it, it almost feels like there's no human agency here.
00:28:16.300Like, not one person ever suggests, like, well, why don't we just stop the development of the technologies by force?
00:28:20.880Well, I think you're being theoretical again, you know.
00:29:30.380Okay, how does the machine work to make decisions?
00:29:34.560Okay, so if we can agree that this person makes these types of decisions about these things and it works this way, then we can say we, the collective we, can do that.
00:29:46.260But this theoretical collective we that is going to make decisions, like we could sit here and be very theoretical.
00:30:04.120Is it true that one country will be completely dominant by the end of this race and that that will be meaningful?
00:30:10.340No, I think that what's going to happen is, and again, I'm speaking now probabilistically, I think that there will be different types of developments.
00:30:20.880But by and large, it's very difficult to keep intellectual property contained when you take the products of that intellectual property and put them out in public.
00:32:29.520You know, so I would say we're not going to have competitive advantages in those things.
00:32:38.740What we're competitive in is that small percentage of the population that is uniquely inventive.
00:32:52.300You know, look at the number of Nobel Prize winners in the United States.
00:32:56.460The United States dominates the world in Nobel Prize winners.
00:33:00.260The inventiveness, best universities and so on.
00:33:02.580We have a legal system and a capital markets system, and we can bring the best from around the world to the United States to create that environment.
00:33:14.240If we can work well together on that inventiveness, with rule of law and all of that working, those things are our competitive advantage.
00:33:25.140We do not have manufacturing, and we're not going to go back and be competitive in manufacturing with China in our lifetimes, I don't believe.
00:33:34.900OK, so now the question is how we deal with that.
00:33:38.800Our inventiveness, you just said, and many have said, comes from our education system, from our universities.
00:33:44.480But then you began the conversation by saying that AI is already.
00:33:58.640And there's a lot in the United States to attract the best and the brightest, because we are a country of all of these different people operating this way.
00:34:07.540And we create these equal opportunities.
00:34:10.040Look who's running some of the country's companies.
00:34:13.680If we can have the best in the world come into that kind of environment to be creative and so on, we can invent and so on, but we can't produce.
00:34:24.820But those, the people you're describing have come to our universities.
00:35:50.420But you began the conversation by saying that AI is now at the point where, you know, the machines have the equivalent knowledge of a PhD in every different topic.
00:36:00.240So, like, at some point, are you going to have universities?
00:36:04.160We'll redefine what universities are like.
00:36:06.780But you're going to have that combination of things working together.
00:36:11.340Because, still, we're a long way from, well, not a long way, but we're away from the point where the decision-making will be made by the AI.
00:36:22.000And where the wisdom will come from the AI.
00:36:26.180Like, you're not going to have the AI determine how you raise your kid.
00:36:30.720And different people will raise their children differently and so on.
00:36:33.700But in actuality, you'll rely on it, but really, the magic for the foreseeable future is remarkable people with remarkable technologies producing remarkable changes.
00:36:47.700And then we're going to have the consequences of that.
00:36:51.500As long as human decision-making plays a role, I'm totally fine with it.
00:36:54.820But you say that the university is going to change.
00:39:03.000It's online free for anybody who wants to look at it.
00:39:04.980And I rated different powers, economic power, military power, education power, and so on.
00:39:13.520Then I rated health, how long you live, the diseases you're encumbered by, and so on.
00:39:20.180And happiness, what your happiness level is.
00:39:23.520And what's interesting about that is that, past a certain level of living standards, the measures of power don't have a correlation with health, which is amazing, because you have all the money to produce the health, and they have no correlation with happiness.
00:39:48.100Like, for example, in the United States, which is the most powerful country in the world by these measures, our life expectancy is five years less than Canadians'.
00:40:03.560They're right next to us, and it's five years less than countries of equal income levels, okay?
00:40:11.140So with health, there's poor correlation.
00:40:14.880And in happiness, there's no correlation.
00:40:20.420Like, Indonesia has the second highest happiness rating, you know?
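Here is a minimal sketch of the kind of comparison described above: score countries on power, life expectancy, and happiness, then compute Pearson correlations. Every number below is an invented placeholder rather than data from the study mentioned, and the pearson helper is written out only for illustration.

```python
from math import sqrt

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# hypothetical rows: (power score 0-1, life expectancy in years, happiness 0-10)
countries = [
    (0.90, 79, 6.9),
    (0.75, 84, 7.0),
    (0.55, 83, 7.4),
    (0.40, 76, 7.8),
    (0.25, 72, 6.2),
    (0.15, 68, 7.1),
]

power = [c[0] for c in countries]
life = [c[1] for c in countries]
happy = [c[2] for c in countries]

print(f"power vs. life expectancy: r = {pearson(power, life):+.2f}")
print(f"power vs. happiness:       r = {pearson(power, happy):+.2f}")
```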
00:40:26.620So all I'm saying is, think about also how we work with each other, how we are with each other.
00:40:33.880The highest determinant of happiness is community.
00:42:34.180I'm answering your question about how we deal with this.
00:42:38.040I'm saying we have to deal with it together.
00:42:39.920It's only in an environment where there's harmony rather than fighting that you're going to be able to address the types of questions that you're raising, right?
00:42:50.120How do you, you know, when you ask me, you know, what's going to happen with AI?
00:42:56.740And then you say, we need to do this and we need to do that.
00:43:01.380It strikes me that how the we's deal with each other, to be able to deal with those things, is the most important thing.
00:43:09.840And there are basics of how we deal with each other.
00:43:35.360If you get past your basics, you know, health, education, habitat, and you get past those, you've got everything you need.
00:43:47.340And then if you have community, you have everything you need.