By KAYLA WEBLEY
Reading the factoids David Freedman rattles off in his book Wrong is terrifying. He begins by writing that about two-thirds of the findings published in the top medical journals are refuted within a few years. It gets worse. As much as 90% of physicians’ medical knowledge has been found to be substantially or completely wrong. In fact, there is a 1 in 12 chance that a doctor’s diagnosis will be so wrong that it causes the patient significant harm. And it’s not just medicine. Economists have found that all studies published in economics journals are likely to be wrong. Professionally prepared tax returns are more likely to contain significant errors than self-prepared returns. Half of all newspaper articles contain at least one factual error. So why, then, do we blindly follow experts? Freedman has an idea, which he elaborates on in his book Wrong: Why Experts Keep Failing Us – and How to Know When Not to Trust Them. Freedman talked to TIME about why we believe experts, how to find good advice and why we should trust him – even though he’s kind of an expert.
You say that many experts are wrong, yet you quote many experts in your book. Are these experts wrong too?
They very well may be, but these are people who study expertise. They know how other experts go wrong because this is what they study, so maybe they’re better at avoiding some of these problems. Maybe they’re a little more careful with their data and they work a little harder to not mislead people. That’s just a suggestion. I mean, who knows? But that’s the best I can do to defend myself here.
In Wrong you write about the “Wizard of Oz” effect. Basically, from a young age we’re taught to think that someone else always knows best. First our parents, then our teachers, and so on.
The fact of the matter is, unless you’re the smartest person in the world, there is someone out there who knows more than you do. So it’s not that we want to discard expertise – that would be reckless and dangerous. The key becomes, how do we learn to distinguish between expertise that’s more likely to be right and expertise that’s less likely to be right?
And how do we go about that?
It would be nice if we could look at the experts’ track record and look at all their pronouncements to see what percentage were right. But we can’t do that, so you have to play a sort of statistics game here and ask the question, “What does better advice have in common?” so we can look for those features. Or, conversely, “What does bad advice have in common?” so we can avoid it.
What have you learned about bad advice?
Bad advice tends to be simplistic. It tends to be definite, universal and certain. But, of course, that’s the advice we love to hear. The best advice tends to be less certain – those researchers who say, “I think maybe this is true in certain situations for some people.” We should avoid the kind of advice that tends to resonate the most – it’s exciting, it’s a breakthrough, it’s going to solve your problems – and instead look at the advice that embraces complexity and uncertainty.
But it’s not really natural to take less-certain studies and advice seriously, is it?
You’re exactly right, and that’s part of the problem. It goes against our intuition, but we have to learn to force ourselves to accept, understand and even embrace that we live in a complex, very messy, very uncertain world. Therefore, the experts who are more likely to steer us in the right direction are the ones who acknowledge that. It probably would be helpful if all study reports came with a little warning label like cigarette packs that simply spelled out generically that, by the way, experts are usually wrong.
You say brain scans show that when presented with expert advice, we actually lose our ability to make our own decisions.
Yes. Now, let me point out, I always feel a little funny when I quote the results of a brain-scan study or even quote the findings of any study because, of course, my book is all about pointing out the problems with studies. But for what it’s worth, people have actually looked at this question of what happens to brain activity when people are given expert advice, and sure enough, you see that the brain activity dies out in a way that suggests the person is thinking for themselves less. The brain actually shuts down a bit in the face of expert advice. When we hear an expert, we surrender our own judgment.
So we essentially just blindly follow experts?
That’s exactly what it is. And with certain experts, not only is the advice very resonant; the experts themselves are very resonant. Some experts project tremendous confidence. They have marvelous credentials. They can be very charismatic – sometimes their voice just projects it. Some experts get very, very good at this stuff. And what do you know? It really sort of lulls us into accepting what they say. It can take a while to actually think about it and realize their advice makes no sense at all.
You found some cases of experts who willingly discarded data that didn’t fit with the conclusion they were after?
That is a huge understatement – it is almost routine. Now, let me point out that it’s not always nefarious. Scientists and experts have to do a certain amount of data sorting. Some data turns out to be garbage, some just isn’t useful, or it just doesn’t help you answer the question, so scientists always have to edit their data, and that’s O.K. The problem is, how can we make sure that when they’re editing the data, they’re not simply manipulating the data in the way that helps them end up with the data they want? Unfortunately, there really aren’t any safeguards in place against that. Scientists and other experts are human beings, they want to advance their careers, they have families to support, and what do you know, they tend to get the answers they chase.
So you’re saying, if I set out to prove that wine is good for you, I can find the data to back up that claim?
You can. We see that all the time. In fact, we’re seeing it constantly. There are studies that come out that say obesity is actually good for you and those that say exercise doesn’t do you any good. If there’s a certain answer that you want, for example, an exciting research finding that’s going to get published in a research journal, then you will probably find some way to achieve it.
You say that some advice is good and even critically important. So how do we go about picking out the good from the bad? It seems like finding a needle in a haystack.
It is a needle in a haystack. Part of the problem is, we’re kind of lazy about it. We would like to believe that experts have the answer for us. And what we pay the most attention to are the most recent, most exciting findings. Newspapers, magazines, TV and the Internet oblige us by constantly reporting the stuff. We face this sea of advice all the time. So where is that needle in the haystack? I think the best thing to do is to discount as much as possible the more recent findings and pay more attention to the findings that have been floating around for some years. With a little bit of work, I think most of us can figure out how to answer some of these basic questions about whether advice seems to be pointing in the right direction or whether it seems to be falling apart.
What about studies that test on animals and then apply the findings to humans? Is that really an effective way to determine what we should eat or which cancer treatments will work?
There are some things we just can’t study in humans because it would be incredibly unethical. Of course, whether it’s ethical to experiment on animals is itself a much-debated question, but putting that aside, animal research clearly can help science move forward. However, the fact of the matter is, the majority of animal research does not translate well to human beings. Scientists love to point out that we share anywhere from 90% to 99% of our genes with various mammals, but we know we’re really different from mice, and we’re even really different from apes. Again and again and again we see that drugs, behaviors and almost anything else you want to look at in animals turn out not to apply well to human beings. So, yes, it advances basic science to ask these questions, but does it result in good advice for us? In general, the answer is no.
O.K., this question has to be asked. You’re kind of an expert of experts, so should we not trust you either?
Yes, you should not trust me either. I mean, how could I possibly claim that I have some foothold on the truth that these other people I’m talking about don’t have? I don’t. Of course, I’m biased. I want a nice sexy story. How boring would it be for me to come out and be like, “You know, those experts, they’re pretty good, they’re right a lot of the time.” We wouldn’t even be having this conversation if I said that. There are all kinds of reasons why I might fudge the data myself or mislead people about this. But I’m not trying to give people answers here. What I am trying to do is provoke thinking, raise awareness and point out that there are real questions here that we all should be asking. We should all try to be smarter about how we pick our advice. How could I possibly be wrong about that?