Robert Tracinski - The Problem of Expertise

March 17, 2022 01:01:02
The Atlas Society Chats

Show Notes

Join Senior Fellow Robert Tracinski for "The Problem of Expertise" where he will ask: How does a non-expert figure out how to rely on the advice of experts? How do we figure out who is a "real" expert and who isn't? How can we be "independent thinkers" if we have to rely on the knowledge of others?


Episode Transcript

Speaker 0 00:00:00 I expect to see more people as we get going, but we can go ahead and get started. We're going to be talking today about the problem of expertise, where Rob will discuss a question that's been brought up by the pandemic: How does a non-expert figure out how to rely on the advice of experts? How do we figure out who is a real expert and who isn't? How can we be independent thinkers if we have to rely on the knowledge of others? That's from the original promotion. Thank you all for joining our Clubhouse room today. I'm Scott Schiff, hosting The Atlas Society senior fellow Robert Tracinski on the very timely topic of the problem with expertise. We encourage people in the room to ask questions, so as Rob is giving his opening, if you have comments or questions, raise your hand and we'll bring you up. I want to quickly encourage people to sign up for The Atlas Society's weekly newsletter, which has some excellent content; I'll put up a link as we get started. This is being recorded for educational purposes. Rob, thanks so much for doing this topic today. So, what's the problem with expertise?

Speaker 1 00:01:14 Well, I put it as the "problem" of expertise because that's sort of a classic philosopher's formulation. It's one I don't particularly like as a formulation, because oftentimes you talk to a philosopher and they'll say "the problem of X," and what it really means is: I don't want X to be true, so I'm going to declare it to be a problem and then refuse to accept any solutions to the problem. But there is actually a legitimate sort of problem, or quandary, raised by this issue of expertise. And that is that often — and not just because of the pandemic; I'm going to give a whole bunch of other examples — we come across an issue we need to address, an important issue in our life with real consequences we can't avoid, but one that requires years of study.
Speaker 1 00:02:02 It requires advanced knowledge. It requires experience. It requires knowledge that you cannot possibly gain in anything like the necessary timeframe. Even if you had the background for it and could say, "Oh, well, I can brush up on epidemiology," you can't get the necessary base of knowledge in the timeframe you need to make the decision. So you inherently have to say: okay, I'm going to have to go listen to the people who have that advanced knowledge, who spent the years studying it. And that raises the question I posed in the promotional materials: How can you be an independent thinker, a first-hand thinker, when you have to rely on knowledge that you cannot have yourself firsthand? The prime examples are things like: a global pandemic happens and suddenly you need to know something about epidemiology. How does the disease spread?

Speaker 1 00:03:00 What are the mechanisms you can use to protect yourself from it, et cetera? Or some damn fool starts a war in Europe and you suddenly need a whole bunch of knowledge about what's going on in Ukraine. What's the history? Who is Vladimir Putin? Where did all that come from? What are the issues behind this? And also, what's going on on the battlefield? The joke on Twitter usually is that you have a bunch of people who got their degrees in epidemiology from Twitter University, and then they seamlessly go from being Twitter experts in epidemiology to being Twitter experts on warfare and military affairs, right? So the question is: what's going on with the offensive? Are the Russians really failing as badly as they seem to be failing?
Speaker 1 00:03:53 The answer to that, by the way, is yes, but it's not obvious. You have to go to the experts and figure out: is this really what it seems to be? Or global warming — it's a perennial one, right? It goes way back; they've been making claims about global warming for more than 40 years, and it comes from a quote-unquote "consensus of experts." And you have to assess: is there really a consensus of experts? What's really going on here? What's the basis for it? Or the financial crisis comes up and they say, "Oh my God, we have to have bailouts or else the economy's going to collapse." Or they say, "This crisis was caused by the greed of big corporations, and we have to have government bailouts or the economy's going to collapse."

Speaker 1 00:04:35 And you have to take those claims and ask: what really caused the financial crisis? Are the bailouts actually necessary? Are they actually going to help? Are they going to make things worse? That requires expertise in finance that most of us don't have. Now, these are all sort of public, political things, but this also applies a lot in our personal lives. You're saving for retirement — well, where should you invest it? Your cousin's friend says one thing, somebody else says "put it all in Bitcoin," and you get all this contradictory advice from different people. And without yourself being an expert in finance — because you have a job, you have a life — you have to come up with some strategy for where to put your money if you want it to still be there so you can retire on it.

Speaker 1 00:05:27 Or you go to the doctor and you get a diagnosis, right?
It's cancer, you've got six months to live, but there's this treatment. You want to know: do I really have this? Is this really the best treatment? How should I deal with this health problem? Again, you don't have the medical degree, but it's a life-or-death issue, and you need to know what's the best treatment. Or education for your kids. I see this a lot — I send my kids to a Montessori school, and I'm often amazed by how many parents come in with no clue about different approaches to education, different theories of education. I was aware of Montessori, and thinking about these issues, a long time before I had kids; most people are just thinking about it for the first time when their kid is two years old.

Speaker 1 00:06:15 So there are all these issues where there's advanced expertise you need to make a decision, and you don't have the time, or the timeframe, in which to acquire that expertise. So you have to go out there and say: okay, who knows what they're talking about on this? Who can I go to for advice? All right, so that's the problem of expertise. What's the solution to it? Well, the solution — at least on the first level, to the question of how you can be an independent thinker if you have to rely on the expertise of others — is that you use your independent thought to pick which experts you're listening to. You use your independent thought to figure out who seems to know what they're talking about and who doesn't. So then the question is: how do you do that?

Speaker 1 00:07:05 The reason I wanted to do this topic is that I think I have a particularly good background for it, because this is basically my job.
I'm somebody who writes about politics, but I don't have one particular niche. I've got a few things I know better than others, but I'm not, say, an energy guy or a foreign policy guy or a finance guy who has covered one field, really gotten to become an expert at it, and stayed only in that field. I'm a generalist. Whatever's happening in the news, whatever the top stories are, it's my job to go figure out what's going on. And that means every year, or every six months, there's a new topic that comes up, and I have to do precisely what I was just talking about: I have to go out and figure out, of all the people out there talking and arguing, all the people with Twitter threads going, who actually knows what they're talking about on this, and who is just blowing smoke.

Speaker 1 00:08:08 So whenever another big issue comes along — epidemiology, counterinsurgency, financial sanctions, whatever it is that's happening in the news — I have to go out and figure out who knows what they're talking about on it, so I know who I can read to stay informed, and who I can recommend to the readers of my newsletter: "Here's a really good article you should read if you want to know what's going on on this topic." It requires getting up to speed, learning, and figuring out who the experts are. So based on my 25 years of experience doing this, I came up with five quick rules of thumb, or guidelines, for how to do it.

Speaker 1 00:08:53 This is my first pass on this — I want to write something much more in-depth — but I thought talking it out here would be helpful. Okay.
So the first guideline is: don't assume you already know. That is, don't assume that you can take the existing knowledge you have in whatever field and apply it immediately to this new thing that has just come up. I mention this because I know it's a problem for philosophers, and I've had some issues with Objectivists in particular, because we have this idea that ideas move history, that philosophical ideas are the most important thing. So therefore, if I'm an expert on philosophy, I should be able to assess counterinsurgency. Well, no — you have to know counterinsurgency to assess counterinsurgency. This is a pet peeve of mine, because years ago, during the Iraq war, there were a bunch of people who said, oh, well, there are principles of counterinsurgency —

Speaker 1 00:09:44 One of the principles of counterinsurgency is that you put conditions on how you fight: you want to make sure you minimize civilian casualties. And they said, "See, that's altruism." Well, no, that's actually how counterinsurgency warfare works. You can't just say it's a philosophical principle you disagree with; you have to understand how fighting this type of war actually works and how you win. All right. The more common example I've seen outside of the philosophers and outside of Objectivism — especially on social media, especially on Facebook for some reason — is, I hate to say it, and I don't want to offend anyone in the audience, because I have very good friends who are engineers, but engineers are the worst on this. Because an engineer actually has acquired expertise: systematic, rigorous knowledge in the fields of science and engineering.

Speaker 1 00:10:33 And oftentimes that gives them a little bit of hubris. You've all met this guy on Facebook or online somewhere, or maybe you know somebody like this, who says: I'm an engineer.
I know how systems work, I know how to assess knowledge, therefore I can instantly and very quickly acquire expertise on whatever new big topic comes up. So there's a certain hubris that comes with this. And the important thing here is that even if you have a certain base of knowledge — which is very, very helpful — don't assume your existing knowledge is going to be adequate. When a new issue comes up, you always have to be active in searching out the limits of your knowledge and finding the new knowledge that you don't have.

Speaker 1 00:11:22 And oftentimes it's the new knowledge that you don't know you don't have. Don Rumsfeld used to talk about the "unknown unknowns" — you've got to search for the unknown unknowns and not assume that your existing knowledge is necessarily going to be adequate to the new situation. That's the first principle. The second is: assess for prior politicization and bias. Now, you cannot come in assuming that the experts are wrong or biased. You should not come in with an inherent or blanket suspicion of elites and experts, because the fact is that most fields are not politicized. In fact, most fields are neglected. That's the problem I have with people saying, "Oh, well, you can't trust epidemiology because it's the government," or "the epidemiologists on COVID have an agenda and a bias."

Speaker 1 00:12:19 To my knowledge — and I know some people who are epidemiologists — prior to 2020, epidemiology was not a highly politicized field. It was an extremely neglected field. The politicians didn't want to pay any attention to it at all.
Now, that doesn't mean that bias doesn't exist, and bias sometimes isn't political bias — it can be a bias toward some existing prevailing theory that may or may not be true. But you have to have evidence for the bias. So, for example, as opposed to epidemiology, climate science is the one case where I would say you could definitely look for political bias, because it's been a politicized issue for 40 years.

Speaker 1 00:13:09 And we've actually seen the mechanism of bias operating in real time: you've seen scientists being demonized. Just in the last year, a guy was disinvited from a conference because he said something skeptical about global warming. So you can see how groupthink is being actively enforced. So when somebody says there's a consensus of scientists in favor of global warming, you can say: well, of course there's a consensus, because you don't let anybody say anything else. But you have to have evidence for that bias. You can't just say, "Oh, experts are always biased." You have to look for specific evidence of bias. That's the second principle. And the third one I'll put out here is: you assess experts the same way you would assess anyone else.

Speaker 1 00:14:01 If you're buying a car, or if you're talking to anybody about any common issue, you talk to people, you question them, or you read what they have to say, and you look for: do they have command of the facts? Do they have clear explanations? I look for people who have what I call "knowledge on tap" — meaning any time you ask them a question, they'll have an explanation: oh, well, here, you have to understand...
They have a fund of knowledge on the topic that is vast enough that you can ask them about any unexplained aspect of it and they'll know what they're talking about. The most fun I have with this: I have an in-house expert on architecture at home — my wife — and the great thing about her is that every time I ask her a question about how something works in a house, or about construction, or about colors, or whatever, she has all this knowledge.

Speaker 1 00:14:54 I didn't even know she had it, and I've been living with her for 30 years. And then there's also the fact that when asked questions or pressed for explanations, they react with candor, with honesty. If you ask a question and they don't know the answer, they'll tell you, "I don't know the answer." If it's something where there's uncertainty or differences of opinion, they'll say, "There are differences of opinion; we're not a hundred percent certain about this." They will not try to shout down objections or evade them. The way I put it is: look for people who react with facts and arguments, not with attitude. Then the fourth thing I say is: look for people with a previous record, but within limits. The general rule here is that you don't want to put too much trust in any one person, because I have seen people who have good things to say, or who are experts in one area,

Speaker 1 00:15:55 and then they go off the rails in other areas and come up with a crackpot theory. So you do look for a previous record, but you give a previous record respect; you don't give it blind trust. The fifth principle — the last one, and it's really important — is: beware of your biases. There's a tendency toward what they call "expert shopping." You have a certain conclusion you want to be true,
so you go looking around for the guy who has the right letters after his name, the right credentials, the right position — somebody who gives you an excuse to say, "Oh, well, I trust this guy, he has a PhD." You look for somebody who will reinforce your biases, and you glom onto that one expert who tells you what you want to hear.

Speaker 1 00:16:41 And you ignore all the other experts who are telling you something you don't like. So you always have to be on guard against your own biases, your own predilections toward answers you want to hear or are predisposed to hear. You should always mistrust yourself. You should always be looking for the arguments that go against what you want to hear and considering them seriously — not just pretending to consider them seriously before you go back to the expert you like, but actually considering them seriously. All right. One of my favorite internet memes comes from an old article in The Onion: "worst person you know makes a good point." It captures the tragedy we've all experienced, where somebody we absolutely hate says something that makes sense,

Speaker 1 00:17:24 and we have to admit that he's right. So look for the worst person in the world who makes a good point, and be open to that. Now, those are my five principles, but there's one final thing I can add: sometimes there's no substitute for simply knowing a lot about the topic yourself. You may not be able to become an expert, but you can do a lot of research, gain a lot of knowledge, and become a sort of quasi-expert. My dad is an amateur historian of the Roman Empire. He studied it for many years, and he said:
Speaker 1 00:18:10 "I knew I was really doing well when I got to a point where I would see a historian being interviewed and I'd be able to tell whether they knew what they were talking about or not." And the only way he can tell who knows what they're talking about is that he spent years gathering that knowledge on his own — so he could tell when somebody was leaving out important facts, or when somebody was glossing over a complicated topic with a glib generalization. So sometimes there's no substitute for simply knowing a lot about the topic yourself and learning a lot, if you have the time and ability to do so. And really, how much you need to know about a topic depends on how important it is to you.

Speaker 1 00:18:57 It depends on its consequences for your life. If you got that cancer diagnosis, I suggest you go out and learn a lot about cancer, because that's going to matter a lot. If there's a war in Ukraine, you're going to be naturally interested in wanting to know about it, but your opinion on it is not going to make all that much difference, so it may not be worth your effort to become a quasi-expert on a war that might be over in six months. That's my last note on that. So with that, I want to open this up to wider discussion and questions.

Speaker 0 00:19:31 Great. And we do want to invite people up with questions and comments. I took the liberty of jotting down some notes on your five principles, so I can try to get things started — very in-depth for a first pass. You were talking about engineer hubris, and I'm curious: in your second example, is it hubris where they just don't want to be called wrong on anything? So just a kind of arrogance, and that's even beyond the political.
I think we've seen that with some of the CDC people.

Speaker 1 00:20:08 Well, it's interesting when you get government agencies involved. An expert in a government agency — it's sort of like how groups oftentimes make worse decisions than individuals. Well, actually, I've had some pushback on this. I just recorded a podcast interview yesterday with Steven Pinker on his book Rationality, which just came out, and there's some pushback in there: apparently there have been studies showing that people actually make better decisions in groups than they do as individuals, under certain rules and conditions. The reason being that one person might have a bias, or might just make a mistake, but if you have to discuss it with other people, you're going to be forced to refine your arguments, you'll spot the errors, and you might make a better decision.

Speaker 1 00:20:57 And I think there's truth to that. But oftentimes it's the case that — what's the saying? — if you have a committee, the IQ of the committee is the IQ of the lowest person in it divided by the number of people on the committee. And that definitely tends to be true of government agencies. This is why, throughout the pandemic, I would say: okay, here's what the CDC says, here's what Fauci says — that's all off to one side — but there are plenty of actual individual experts you can go to. I think there's a certain fallacy with the pandemic of saying: well, there's the CDC, and they say this, and if they're wrong, then the experts don't know what they're talking about.

Speaker 1 00:21:42 Well, the CDC is not "the experts." The CDC is a government agency.
It takes the knowledge of the experts, homogenizes it, and reduces it to the safest, bureaucratic lowest common denominator. That's the problem with the CDC. And I think the bigger problem the CDC has — I need to do some follow-up on this, but I get this impression from some of the epidemiology people I talk to — is that the CDC is always trying to tell you not the truth, but the thing they think will get you to act the way they want you to act. They take into account the real fact that people tend not to listen to them: if you tell people to do one thing, they'll misunderstand it, or they'll oversimplify it, or they'll do something similar to it but not exactly the same thing.

Speaker 1 00:22:34 So if you tell people there's an 80% chance this will kill you, some of them will say, "Oh, but there's a 20% chance I'll live" — people will do dumb things. So if there's an 80% chance, they'll tell you it's a 100% chance, to forestall that group of people who say, "Oh, there's a 20% chance I'll make it." You see what I mean? They're always adjusting their messaging for how they think you're going to screw it up. And that's another thing that distorts the CDC in particular. At the same time, you also get the TV-expert types, the Dr. Oz types, who have some degree of expertise but are always over-exaggerating it in order to titillate the audience, right?

Speaker 1 00:23:24 Dr. Oz always has the miracle cure for you that's going to make you feel 10 years younger, and 99% of the time it's bogus — but it sells well on TV.
So this is why we need to assess the experts: you always have these problems — individual experts who are wrong, or individual government agencies that have weird, perverse incentives. You can't treat experts as a homogeneous group, because there are always differences and disagreements between them. That's why you have to look through and say: okay, who are the people within this group who don't seem to be doing these things — who don't seem to have the arrogance, or the weird institutional incentives, or the urge to exaggerate everything because it sounds good on TV? That's why you have to do that work of figuring out who's who.

Speaker 0 00:24:26 All right, good. We've got some people on stage. Brian?

Speaker 3 00:24:33 Hey, thank you all. I love this topic, and my question is: we see a lot of people in real life, and especially here on the app, who I think mistake education for wisdom. They get so full of knowledge, facts, and figures that they can quote you page and paragraph of some obscure, esoteric reading, but when you talk to them about solutions, or strategic-level, almost philosophical-level distinctions, they really struggle. They have a hard time separating, or integrating, those facts into a coherent message. And that's a lot of what I appreciate about Rob's work. I call that wisdom, or sense-making. So I'm wondering your thoughts on that, and how we can, or should, start prioritizing that in our education system.
I know we have kind of a liberal arts education where we try to get breadth at the K-through-12 and college level, so we can talk and connect dots at different levels. But I just see a big disconnect between that education and the wisdom or sense-making piece.

Speaker 1 00:25:56 Well, I think there's something to that. Like I said, I'm sort of workshopping a future article on this, and one thing I think I need to add there is that when you're rationally using experts, you're using them not just to give you a yes or a no, or a very specific conclusion — "you should do X." You are looking for those solutions, but what you're really looking for, if you want to rationally use experts, is not somebody to tell you what to do, but somebody who can give you, in a digestible way, an idea of the reasons why you're doing this — somebody who gives you an explanation along with the advice and guidance: here's why we're doing this.

Speaker 1 00:26:45 Here are the grounds, the reasons behind it, the evidence behind it. Not that they're going to make you an expert too, but they can take the knowledge the experts have and put it into a form that you can have some grasp of, so you're not just flying blind. So when you talk about wisdom: what you're looking for from an expert is, "Here are the reasons for this recommendation," which you can then assess on your own and use to make your own decisions. But also, as I mentioned, I just interviewed Steven Pinker, and I think his book on rationality is very interesting.
And one of the things I like about it — I have some criticisms of it, but one of the things I like — is how it talks about rationality.

Speaker 1 00:27:38 He doesn't just say, "Here's Aristotelian logic." He also goes into some of the material on probability and Bayesian reasoning — these techniques for figuring out how to make decisions under conditions of uncertainty. You don't know what the outcome is, you have certain limits to your knowledge, you have some idea of what the probable outcome is: how do you balance those probabilities and make a decision? I think that's a really valid thing people need to learn. And you talked about trying to do this with education — our educational system, alas, is very far from producing a good general base of knowledge to help us do that. I think that's the goal of education, but unfortunately our system is very far from actually achieving it for most people.

Speaker 3 00:28:30 Yeah. I just see a different approach to the way we teach, and to trying to have people slide up and down that spectrum. If you think of the top of that spectrum as, say, the abstract — connecting those dots in a coherent, internally consistent way — and the bottom as all the different facts and figures, the minutiae, then being able to slide up and down that spectrum to interact with someone at a calibrated, or commensurate, level is extremely rare. I've taught decision-making, and I just see a disconnect: people get locked at one tier, if you will. They're either down in the weeds or they're up in the clouds, and being able to slide up and down is extremely rare.

Speaker 1 00:29:24 Yeah.
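[Editor's note: the Bayesian updating discussed above can be sketched in a few lines of code. This is a generic illustration with hypothetical numbers — the function name and the example probabilities are assumptions for the sketch, not anything taken from Pinker's book or the discussion.]

```python
def bayes_update(prior, p_evidence_if_true, p_evidence_if_false):
    """Return P(hypothesis | evidence) via Bayes' rule."""
    numerator = prior * p_evidence_if_true
    denominator = numerator + (1 - prior) * p_evidence_if_false
    return numerator / denominator

# Hypothetical numbers: you think there's a 30% chance a treatment
# works (your prior). A specialist endorses it. Suppose specialists
# endorse working treatments 80% of the time and non-working ones
# 20% of the time. The endorsement shifts your estimate up, but
# nowhere near to certainty:
posterior = bayes_update(0.30, 0.80, 0.20)
print(round(posterior, 3))  # 0.632
```

The point of the sketch is the balancing act described above: the expert's opinion is evidence that moves your estimate, not a verdict that replaces it.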
There's a whole 2,500-year philosophical history on that: the rationalists versus the empiricists. To put it in Renaissance or Enlightenment-era terms, circa 1600-1700, it was the rationalists versus the empiricists, right? The rationalists were the ivory-tower guys who, like Descartes, would say: you look into your head, you examine your concepts, and you gain all knowledge purely by deducing from the highest, rarest, most abstruse abstractions. And on the other side you have the empiricists, who pile up all the data but can't draw conclusions — that sort of David Hume type, who piles up all this data but can't say for certain whether the sun's going to rise tomorrow. So there's this long history in philosophy of being either down in the weeds or up in the clouds. And being able to connect abstractions to facts, the big picture to the minutiae, has been one of the biggest problems — in terms of just figuring out that it can be done and how it can be done, and then propagating that out into the educational system.

Speaker 0 00:30:36 Good issues, thank you. Roger?

Speaker 4 00:30:40 Hey, what's up? So, yeah, I want to kind of walk you through my experience with expertise and then tie it into a question directly on your five propositions. My background is in enterprise sales; I've been an individual contributor and I've led large teams. And when you're in sales, you gain expertise that you believe is relevant, but you have to rely on other experts. For example, it's believed that the best way to communicate with people is to present a slide deck to them. Well, I can talk to people and convince them and be persuasive, but I'm not good at making those slide decks pretty.
So I have to go to the marketing team — and you don't go to the head of marketing, you find the person who can actually make the slides pretty. Speaker 4 00:31:31 And people at the Atlas Society would appreciate this, because you guys have very elegant marketing, and whoever's doing that is amazing. So as a salesperson, I go to the person who can make this thing look elegant for me. Then I go to the product team and ask: okay, what does this stuff actually do? What problem does it solve? So I can gain that level of expertise and then weave it into my own words to communicate it to an end user. Well, what's interesting is that then you ask clients the question: hey, when you buy things, what is your preferred buying process? And all of a sudden you start realizing — and this is where it came from, and it took an embarrassing amount of time — you find out that the thing they hate the most about a selling process, or a buying process, is sitting through the damn slide deck presentation. Speaker 1 00:32:28 That was what I was going to say. When you first said "slide deck," I said, I hate slide decks. Speaker 4 00:32:32 Yeah. So when you realize that they don't like slide decks, you realize: well, I've been asking all the wrong questions, and I've been exercising this muscle to get better at presenting slide decks when that was never really the thing that was going to be the force multiplier that could really change the business. So once you realize, okay, that's not what they like — well, what is it that they want? So, tying it back to what you were talking about: what's the framework you could follow to figure out the unknown unknowns?
Because we were just chasing after getting better at making slide decks, presenting slide decks, and measuring the success of the slide deck presentation, when the unknown unknown was that nobody cared about slide decks. Speaker 1 00:33:21 I see. That's a great question. My first answer is: like we say in AA, the first step is knowing that you have a problem. It's being aware that too many people take the approach of — there's a conventional wisdom, this is how this is done, and I'm just going to plug into it; I will learn to be really good at doing things the way they're supposed to be done. And a lot of times that works. If your ambition is just to be one of the guys who takes orders and does stuff, that's probably fine for you. Speaker 1 00:34:06 But oftentimes the real productive work is in asking: is the way things are always done the way they ought to be done, or need to be done? Is there a better way of doing them? Or maybe the way things are done is wrong, and people have gone chasing off in a wrong direction. Probably the most recent notorious example of this was the "pivot to video," in my own field, the media. There was this one presentation given by Facebook where the people at Facebook said: when you put out video, people's engagement goes way up — we're seeing this huge response, you get way more engagement.
Speaker 1 00:34:54 That was one presentation made by Facebook, and then everybody in the media business said: oh, we have to pivot to video. And it was terrible for somebody who's not a video guy — somebody who's a writer. The written word is my medium. I do a little video, but it's not my big thing. This was a disaster for people like me, because everybody was saying: we're not going to hire more writers, we're going to hire videographers — and we're going to hire pretty young Instagram influencers who look good on camera instead of dumpy middle-aged guys who like to talk to people on apps where there are no pictures. Speaker 1 00:35:37 But then the funny thing is, it turned out to be completely wrong. Facebook had screwed up — they'd gotten their measurements wrong and produced a wrong result, and there actually wasn't this huge burst in engagement if you had video. But in the meantime, everybody had just spent two years pivoting everything, investing a huge amount of effort into more video material on their media websites. So that's another example of how sometimes the conventional wisdom — the new big thing that everybody thinks is right — is not right. And the main thing is simply, I think my first rule of thumb applies here: don't assume you already know how things work. Always be active-minded, actively seeking out the unknown unknowns, the things you're not aware of. And sometimes that includes questioning the very basics — everybody says this is how you do it, but maybe that's actually not that effective, or maybe there's something better. Speaker 0 00:36:44 Awesome. Thanks, Rob. Good. Yeah.
At some point, you or someone else got to that piercing question of: well, do they even like the slide deck? Speaker 1 00:36:58 And sometimes it's really simple — you just go ask people questions. Speaker 0 00:37:04 Uh, Chris, you have a question for Rob? Why don't we go to Brian, and Chris, you'll have to unmute yourself when we come back to you. Speaker 1 00:37:17 Is this Brian with an I? Speaker 0 00:37:18 Yes. Speaker 1 00:37:19 Okay. Speaker 6 00:37:21 Thanks, Scott. Hi, Rob. So, just a couple of things that I caught — I missed the first few minutes of the talk. One, it sounds like you're actually saying to question people's motivations, whether it's Dr. Oz or whoever; so I take that as one point. The second is what I call "ask for the math" — in other words, what's their methodology? What rubric are they prescribing for making a decision? What are the decision criteria? What's the rationale? So I take that. Here's my question. There were a number of talks here on the app, quite a while ago, on public intellectuals — the role of public intellectuals in discourse, and what makes a public intellectual. Certainly a public intellectual would be a unique kind of expert. Do you have any thoughts on what you would consider the role of a public intellectual to be? Should we have them, and how do we determine who qualifies — whether it's Edward Said or whoever it is? That's my question. Speaker 1 00:38:23 Yeah, "public intellectual" is a phrase, kind of like "thought leader," that I roll my eyes at a little bit, partly because nobody used it until about 10 years ago. Back in the day we just called them pundits: George Will got up there, he was a pundit. We didn't call him a public intellectual, which is a pretentious term for it.
And the fact is, what makes someone a public intellectual is that they get up on TV and talk. That's basically all there is to it, right? Or they have a column somewhere. I suppose you could call me a public intellectual, even though I'm very rarely on television. So it's a term that has become kind of meaningless. Speaker 1 00:39:10 Where I think there's a role for the idea of the quote-unquote public intellectual is this: there are a lot of people out there with tremendous expertise — people, often in academia, who have studied a topic in depth for 20 years, with tremendous detailed knowledge and profound ideas — who don't do any public-facing work, to use one of these newfangled terms. Meaning: they write in academic journals, they teach their students, but as far as the general public is concerned, they might as well not exist. They're totally unknown. So you do need people who form those bridges, who say: here is this knowledge that exists and has been developed by experts in academia, or by experts in private research. Speaker 1 00:40:14 You need somebody to translate that knowledge so it can be made comprehensible to the average person who's reading the newspaper — or, God forbid, watching cable TV. Now, there's a certain Murray Gell-Mann effect that you're trying to avoid there, right?
Murray Gell-Mann is the famous physicist who talked about the phenomenon where, if you read an article in the newspaper about a topic you know well, you spot all the errors, all the problems, all the oversimplifications — just howlers of factual errors. And then, he says, the problem is that you turn the page and read the next article, on a subject you don't know, and you forget everything you learned from that last article and assume you're getting valid information. Speaker 1 00:41:08 So there is a necessary role for somebody who actually has the expertise, or knows the experts, and who can convey that valid knowledge to the public. I think that's the role required of a quote-unquote public intellectual, if you're going to use that term. And it is a real skill to take arcane knowledge that requires years of study and condense it down in a way that somebody without those years of study can get something useful out of — not that they're going to become an expert just by listening to you, but they will get some useful sense of why the expert knowledge leans in one direction or another. Speaker 0 00:41:59 Great. And again, I want to encourage everyone: in the link up top, you can get updates from the Atlas Society. Uh, Chris, are you there and able to ask your question? Speaker 7 00:42:11 Yeah, I'm here. Sorry about that — I had some issues last week. Anyway, one of the problems, and I think it's almost becoming a crisis with expertise now, is that there are certain incentives that experts have that might bias them. The best example, of course, is the doctor who gets free samples of a prescription drug and then has to give them out.
But they can even be things that are more innocent. For example, let's say I have a travel agent book a trip for me, and then I find out: oh, her husband's a pilot with Delta Airlines — so that's why she always books Delta. It seems as though we have a real problem, and I don't know how we're going to address it. And it raises the question of how we trust these experts if they can be compromised in this manner. Speaker 1 00:43:10 Okay. So in my experience with experts, outside of a few fields, the problem oftentimes isn't that a field is politicized but that it has been neglected. In most cases, the imagined idea that somebody has this incredible financial incentive to lean one direction or another is very exaggerated, because usually they don't make that much money. There's not that much money in leaning one way or the other; there's very little mechanism for some of these experts to actually monetize leaning in one direction or another. I think the much greater danger, the one you have to look for and guard against, is the groupthink phenomenon: the sense that you will be looked down upon or dismissed by your professional colleagues if you have a viewpoint that differs from theirs, so you have an incentive to cling to the status quo view. Speaker 1 00:44:13 And I may be doing an upcoming clubhouse at some point about this issue of consensus — I think it's very interesting. My very brief statement on it is this: a consensus can be useful. It can actually point you in the right direction — if by a consensus you mean a group of people independently coming to the same conclusion when they are free to disagree, right?
If you have a bunch of people who are free to disagree, and they're engaging in their own independent lines of inquiry, and they all come to the same conclusion — that tells you there's a good chance it could be true, if people come to it independently. But it's the independent part that's the problem. Because oftentimes, when you have very clubbish academic circles or whatever, people are not coming to it independently; there are subtle social pressures. Speaker 1 00:45:07 And it's not about getting invited to Georgetown cocktail parties, which is what people usually think — that I write what I write because I want to get invited to Georgetown cocktail parties. I've never been to a cocktail party in Georgetown. Or they think: oh, you're paid by big oil, or you're in the pocket of the big corporations, or big pharma, or whatever. Most of the time, that's not it. To the extent that there's bias, it's going to be that vague pressure to say what everybody else is saying, so that you aren't looked at askance as a crackpot. And oftentimes there are cases where everybody believes one thing, and it turns out it's the contrarian guy with a completely different notion who's right — though, on the other hand, I'll come back to the contrarian in a minute. Speaker 1 00:45:54 So you definitely have to look for those biases and try to adjust for them. And if you find everybody repeating a certain conventional wisdom, you want to ask: how well grounded is that conventional wisdom? Is it something everybody is just repeating — like Roger and all the salesmen Roger knows saying, oh, you have to have a slide deck — while nobody has actually asked the client: do you like slide decks?
And you find these things sometimes. In the pandemic there was one that came up early, about a particular standard that was being used for the size of particles — that is, if you exhale a liquid particle, the size of particle needed to transmit infectious disease. Speaker 1 00:46:41 And it turns out, when you looked at it, the standard that had been set and accepted — a certain size of particle — had been set some 60 years ago, based on one study about tuberculosis, where that number was valid. Then somebody had come along, taken the study of tuberculosis, and generalized it to all infectious diseases, not realizing that tuberculosis has its own specifics — I think the idea with tuberculosis is that it has to get really far down in the lungs to infect you. So it was something unique to tuberculosis; it didn't apply to other diseases. And yet that number was still being used as if it were a general rule, when it was actually a misinterpretation of a study — a wrong interpretation that just got accepted for 50 or 60 years as the conventional wisdom. Speaker 1 00:47:36 And nobody had gone back and checked it. So you do have those things. Now, on the other hand, I want to say there is also the danger of the professional contrarian. Because if there's a comfortable niche to be made in any area of expertise for somebody who repeats the status quo, there's also a niche to be made for the guy who's always the contrarian. And that's especially true for the Substack intellectual, right? You can do really well on Substack by being the guy who's always out there saying everybody else is wrong.
There are certain characters I have in mind who are like this: they start out as, oh, that guy's a really interesting contrarian. Speaker 1 00:48:21 And then you realize after a while — no, he's just a guy who likes to say the opposite of whatever everybody else is saying, because it gets him attention. So you always have to be careful, each time you look at this, that you don't accept the status quo just because it's the status quo — you ask why it's the status quo — and you don't accept the contrarian just because he's contrarian. You have to ask: is the consensus there for a reason, or do you have good reasons for doubting it? Speaker 0 00:48:55 I agree with much of that. Just going back to what Chris was saying for a moment: I think of how, several years ago, there were some Sears mechanics who were over-recommending repairs on cars. And sometimes I feel like my doctor's doing that to me. Speaker 1 00:49:17 What I found with my last doctor — my old doctor retired, and I have a new one I'm going to now — is that I had to adjust for his biases, but they weren't a matter of what he was selling to you. It was a matter of certain predilections he had. The big thing was that he would tell me, now that I'm over 50, I shouldn't be doing anything too strenuous. And I'm like: oh, so I shouldn't put 300 pounds on my back? And I shouldn't bend my knees down too far? Sorry — I shouldn't put 300 pounds on my back and do exactly that? Speaker 1 00:49:58 And of course he knew that's exactly what I'd been doing.
So we had very different attitudes toward heavy exercise, and I had to adjust for that bias that he had. You always have that: whenever somebody is telling you something, you know they've got the upsell they have to do, and you have to take that into account. But again, that ties into this question of how you make decisions in the face of uncertainty. Sometimes you might think: okay, fine, maybe he's selling me this particular part because he gets extra money if I buy it. On the other hand, maybe it's not worth my trouble to go acquire the expertise in automotive repair to know exactly which is the best part. Maybe, if he wants to sell me a Sears part, fine, I'll use the Sears part, because it's not worth my effort to become a great enough expert to know which part I should really be getting. Speaker 0 00:51:04 Yeah, I can appreciate that. We worry that it can happen at higher scales in society. But again, I want to invite people up on stage — raise your hand and we'll be glad to bring you up in the last few minutes. Just to recap a little for JAG and others who may have joined late: Rob shared five basic principles for dealing with experts — things like, don't assume you know; assess for politicization, but don't assume they're politicized. The third one I had a question about: do they have command of the facts and clear explanations? And the question I had about that is, wouldn't someone with a political narrative always have a quick answer, too? Speaker 1 00:51:49 Well, yeah, that's what makes this difficult, right? There is always the glib guy who has an answer for everything, even if it's not true.
And that's where I think you are always assessing somebody in a context. The great thing about living in a free society with freedom of speech — the reason we have to have freedom of speech — is so that it's never just one guy who says he's an expert. You know, when Comrade Lysenko has his theory on genetics — for those who don't know, Trofim Lysenko was a Soviet scientist who had a crackpot theory of genetics that was completely wrong, and he set back Soviet genetic science — Soviet biology and agriculture — by 40 years with these crackpot theories. But the big problem is, you couldn't question Comrade Lysenko, because he was friends with Stalin. Speaker 1 00:52:46 He had the backing of Stalin, and you didn't question somebody who had the backing of Stalin — some people who did question him literally ended up in the Gulag for asking too many questions about his crackpot theories of genetics. So the point here is that the great thing about our free society is that you have this variety of voices. When you assess one guy and say, well, he says this and this and this, and he seems to have really good answers — you get to go to the next guy, who says, no, that's all completely wrong, and here's why. And then you have a choice: which guy seems to have the better answers? And like I said, it's difficult sometimes, because you will have people with seemingly convincing explanations on one side and another guy with a seemingly convincing explanation on the other. Speaker 1 00:53:38 Sometimes that's because one guy is just really good at flimflamming; sometimes it's because there's a legitimate disagreement among the experts, or legitimate uncertainty on the topic — like masks versus no masks.
You know, one of the advantages I had when the pandemic first started — and I think one of the really good things to do when you're faced with something like this — is knowing somebody you already knew before the issue became a political football, somebody you already had contact with and had a good opinion of, whom you can talk to. Because then it's not just somebody who emerged when this became a public controversy. I had a subscriber, somebody I'd known and talked to before, who is a well-regarded epidemiologist, and I was able to call him up and ask: Speaker 1 00:54:29 So what's going on with this coronavirus stuff? And when I asked him things like, what about masks, he said: there are people who have literal shouting matches one way or the other on masks. I can tell you what I think is the best position, but acknowledge that the evidence is ambiguous. There are not enough controlled studies, and of the controlled studies that exist, some go one way and some go the other. I can tell you what I think is the balance of probability on this. So sometimes, when you have that issue of trying to assess between two different experts and figure out who can answer questions well, it's just going to be difficult, because there are different views on it. I'm going through that right now with this question of the response to Ukraine. Some people are saying we should impose a no-fly zone —
Speaker 1 00:55:22 I think that's wrong — but some people are saying, well, we shouldn't do a no-fly zone, but we could do this and that. And other people are saying, no, we have to be more cautious, because with Russia being a nuclear power, there are certain stricter limits to what we can do if we want to help Ukraine. And there's a lot of back and forth on that going on among people who I think are well-regarded experts — people who I think know what they're talking about — but even they have disagreements on how far we can push this, how far we should push this, because it's a legitimately difficult topic. So one thing you always have to do as a non-expert assessing this is to realize that there is not necessarily one proper, true expert answer. There may be — and frequently is — legitimate controversy among the experts, and you then have to weave in amongst that and make your judgment on what you think the best balance of probabilities is. Speaker 0 00:56:20 Good stuff. Uh, Liberty Shamrock, or Tommy — good to see you. Quick question for Rob? Speaker 9 00:56:31 Actually, yes. Just a quick moment here. Speaker 0 00:56:34 Sure. You know, I just wanted to mention quickly that we've got two great TAS events. Tomorrow at 5:00 PM Eastern, TAS interviews the legend Peter Diamandis across most social media channels and here on Clubhouse. And at 4:00 PM Eastern on Thursday, Richard Salsman will be talking about at least four different types of equality — I like shows about making finer distinctions. Are you ready to continue? Speaker 9 00:57:07 Yes, thank you so much. I was trying to get to my terrace and set my things down — I'm multitasking, of course. Speaker 9 00:57:17 Um, basically my question was about conformity. To give an example, it's something that my husband and I have done all across the country.
Just to be funny — basically an experiment: if one person looks up at the sky, do others come and join or not? And then how many more join — like, if there were already two, how much more does it expand, and so forth. That's just something that has always interested me, reading back to Solomon Asch's writings and so forth. My question is: do you think people in today's society conform as much as they did when this was studied back in the fifties and sixties? Would the results have changed in today's world, since so many people consider themselves critical thinkers? Do you think this still happens? Speaker 1 00:58:22 Well, I think it always happens. Conformity is always the problem: when other people believe something, there's a sort of pressure on you to say, well, maybe they're right. Or, if somebody else claims to see something, especially something ambiguous, it's very easy to convince yourself that you see it too. But now I want to point out something — I don't know if you were here earlier — in Steven Pinker's book Rationality, he refers to a study. And by the way, about those studies from the fifties and sixties: the funny thing is that there's a reproducibility crisis in psychology. A lot of studies that people have assumed — oh, this proves this point, and it's definitely true — people have gone back and tried to reproduce and found they can't. Speaker 1 00:59:08 So it would be very ironic if a study on conformity was something people kept repeating because they were all conforming to it, and it wasn't valid in the first place.
But this goes back to the point Steven Pinker was making: people can actually make better decisions in groups than they do as individuals, if it's done in the right way, according to the right rules — where it's not just everybody conforming. If it's less like Twitter and more like, say, Wikipedia — where there's a process, you have debates, different points of view are aired, and there are certain rules for how you edit a piece — not that Wikipedia always gets it right, but it's more accurate than the groupthink you get on Twitter, where all the incentives are: how many likes can I get? Speaker 1 01:00:00 How many retweets can I get? How can we all agree with each other and gang up on somebody who doesn't agree with us? So people in groups can actually make very good decisions — because they're raising objections, having to convince each other, and able to spot errors you might have missed — but it has to be under the right rational rules. And I think that's the real trick: how do you figure out the rules so that you get better decisions by taking other people and their ideas into account, rather than producing groupthink? Speaker 0 01:00:37 Great. Well, this has been a great topic, Rob. You know, I agreed with you more than I thought I was going to. And I want to thank everyone for joining us. You can sign up for the Atlas Society newsletter at atlassociety.org. I'm Scott Schiff for the Atlas Society — we hope to see you again next time.
