It's been said that the problems you encounter in life stem not so much from what you don't know, but from what you know for sure that isn't so. Who said it? We don't know, although many people are certain that it was Mark Twain. More on that later. For now, why would it be less hazardous—to your health, to productivity, to happiness—to not know a whole bunch of things than to believe things that aren't true? Because if you're sure that you know something, you act on it with the strength of conviction and resolve.
If you're sure that an alternative treatment will help cure your cancer better than "Western medicine," you'll forego the traditional treatment. This is exactly what happened to Steve Jobs—after being diagnosed with pancreatic cancer, he pursued a kind of new age, Northern California alternative diet in lieu of medical treatment. By the time he realized it wasn't working, it was too late for medicine to help him.
If you're sure that your choice of political candidate is right, you're not likely to be open-minded about any new evidence that might come in that could—or should—cause you to change your mind.
I am a college professor, and one of the things I do for a living is train PhD students in science. They come into my laboratory full of confidence. After all, they have been at the top of every class they've been in all throughout their school lives. If they hadn't been, they wouldn't have gotten into a first-rate college, and if they hadn't been at the top of their classes there, they wouldn't have gotten into the very competitive graduate programs at the universities where I've taught and those like them—the Stanfords, Berkeleys, Dartmouths, and McGills. But here's the problem: They come in thinking that they are hot stuff. They have learned massive amounts of information, and unfortunately, they are so sure that their knowledge is correct that they are wont to add new knowledge without questioning the foundations of the old. During their years under my tutelage, I spend most of my effort trying to teach them that they don't know what they think they do. I don't teach graduate students so much as unteach them. This takes four to six years. In some cases, eight.
When a graduate student comes to me and says, "I just realized I don't know anything about cognitive neuroscience," I congratulate them and tell them they're now ready to receive the PhD. The PhD is effectively a license to become a lifelong learner, certifying the kind of open-mindedness and critical thinking skills necessary to become a creator of knowledge. Knowledge can't be created in an environment where everything is already known. It can only be created in an environment where we're open to the possibility that we're wrong. Those of you steeped in Eastern philosophy will recognize the Zen connection; the philosopher Alan Watts wrote a book about it, The Wisdom of Insecurity.
I wrote A Field Guide to Lies because I think that all of us are capable of this kind of critical thinking, regardless of our educational background. The kind of inquisitiveness and curiosity I'm talking about is innate. Every four-year-old asks a series of incessant "why" questions: Why is there rain? Because of condensation. Why is there condensation? Because of changing temperature conditions. Why are there changing temperature conditions? Etcetera. We have this beaten out of us early on by worn-down parents and teachers. But this "why" mode is the key to all critical thinking. Think like a four-year-old. Ask "why" and "how." Ask them often.
Don't believe something just because everyone else does. If you like Latin, this is called argumentum ad populum. Yes, there is such a thing as the wisdom of crowds, but it has limited applicability, especially when the crowds aren't thinking critically. (Think slavery.) Believe something because you find the evidence compelling. As Tolstoy said, “Wrong does not cease to be wrong because the majority share in it.” Or St. Augustine: “Right is right even if no one is doing it; wrong is wrong even if everyone is doing it.”
Don't believe something just because it is backed by a fancy website, or by scientific terms or equations. Pseudo-science hijacks the words of science, without using its methods, to get you to believe things that aren't so. Too many of us are bamboozled by fancy terms, bold headlines, and testimonials. Take a moment to look more carefully at the evidence being presented. There is no miracle pill that will enhance brain function, no magnet bracelet that will enhance stamina.
Don't reject a source just because it is occasionally wrong. Don't accept a source just because it got one or two high-profile things right. The New York Times is one of the most reliable and rigorously fact-checked news sources in the world. It does make mistakes, and it prints corrections every day. But on the whole, if you read something there, it has a very high likelihood of being true. Supermarket tabloids do occasionally get stories right, but on the whole, if you read something there, it is unlikely to be true. Elvis is not alive on a spaceship circling the moon, and Michelle Obama does not have a newly discovered identical twin sister.
Check for plausibility. Many claims are simply impossible; many more are improbable. A car that needs no fuel and can generate its own power contradicts the laws of physics. A 200-year-old woman living in China whose secret to longevity is smoking two packs of cigarettes a day flies in the face of medical science. One widely reported statistic was that 150,000 girls in the U.S. die each year of anorexia. That can't be true: The total number of deaths for girls from all causes in a single year is only about 8,500 (or 55,000 if your definition of "girls" includes women under the age of 44). You'd find that out by checking reputable sources, such as the CDC.
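The plausibility check above is just arithmetic, and it can be sketched in a few lines. The figures below are the rounded values quoted in the text, not official CDC numbers, and the `plausible` helper is a hypothetical name:

```python
# Hypothetical sanity check of the anorexia statistic, using the
# rounded mortality figures quoted in the text (not official CDC data).

claimed_anorexia_deaths = 150_000   # the widely reported claim
all_cause_deaths_girls = 8_500      # approx. all-cause deaths, girls, per year
all_cause_deaths_under_44 = 55_000  # approx. all-cause deaths, women under 44

def plausible(claim: int, ceiling: int) -> bool:
    """Deaths from one cause can never exceed deaths from all causes."""
    return claim <= ceiling

print(plausible(claimed_anorexia_deaths, all_cause_deaths_girls))     # False
print(plausible(claimed_anorexia_deaths, all_cause_deaths_under_44))  # False
```

However "girls" is defined, the claimed single-cause total exceeds the all-cause ceiling, so the claim fails before any detailed fact-checking is needed.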
Correlation is not causation.Two things can change together, but it doesn't mean that one caused the other. Ice-cream sales tend to increase during months when people are wearing short pants, but you wouldn't want to conclude that eating ice cream causes people to wear shorts, or that wearing shorts causes people to eat ice cream. A third factor, high temperatures, could be said to cause both. But not all things that occur together are influenced by a third factor, either: The day that the stock market reached an all-time peak, I saw a whale jump out of the ocean in Washington state. I don't think he was celebrating the increase in his portfolio. The two events are probably unrelated.
Does the evidence actually support the conclusion? Fast-talking, loose purveyors of information may flummox you with a whole bunch of data that aren't related to the claim. Sometimes they do this intentionally; sometimes they don't know that they're doing it. Consider an investment manager's claim that he can double your money in three years. He cites as evidence his academic degrees and a new system he has developed. Those may add credibility, but they are not evidence that he can do what he says. Even previous high-yield performance is not evidence—investments are inherently risky, conditions can change, and the economic climate is a complex and unpredictable system.
Look for a missing control condition. A new pill claims to cure headaches within four hours. A look at the evidence reveals that people with headaches were given the pill and reported that their headaches got better. What we don't know is how many headaches would have gotten better on their own in that time. To know that, you'd need a controlled experiment—one in which a randomly selected control group receives a placebo pill instead of the treatment and is compared with the treatment group. A wealthy white socialite may not believe claims of police brutality because her interactions with those nice officers have always been so pleasant. But she is not accounting for the fact that her economic class, neighbourhood, and race may be contributing to those interactions. In the language of science, she has not controlled for those factors in forming her opinion. White journalists Ray Sprigle and John Howard Griffin pioneered the study of such interactions by posing as African Americans and documenting very different treatment.
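One way to see why the missing control matters is a toy simulation. All numbers here are hypothetical: assume many headaches resolve on their own within four hours and the pill does nothing at all. A pill-only study still looks like a success:

```python
import random

random.seed(42)

# Hypothetical assumption: 60% of headaches resolve on their own within
# four hours, and the pill has no effect whatsoever.
BASE_RECOVERY_RATE = 0.60

def recovery_fraction(n: int) -> float:
    """Fraction of n headache sufferers whose headache resolves in four hours."""
    return sum(random.random() < BASE_RECOVERY_RATE for _ in range(n)) / n

treated = recovery_fraction(10_000)  # group given the (inert) pill
control = recovery_fraction(10_000)  # group given a placebo

# Looking only at the treated group, "60% got better!" sounds like evidence
# that the pill works. The control group reveals the same recovery rate,
# so the pill adds nothing.
print(f"treated: {treated:.2f}  control: {control:.2f}")
```

Both groups recover at roughly the same rate, which is exactly the comparison a pill-only study throws away.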
Allowing ourselves to realize that we don't always know what we think we know opens our minds to new knowledge, and allows us to navigate the world more effectively, choosing among options (or political candidates) that are more likely to maximize our success and well-being. In the current election climate, many people decided early on which candidate they wanted to support, based either on a gut feeling or the information they had back then. If they're not open to new information as it becomes available, they may support someone who is unlikely to embody the principles they value.
Mark Twain is widely cited as stating some version of the phrase that opened this article: that it ain't what you don't know, but what you know for sure that ain't so, that will get you in trouble. Many people believe he said it. A thorough search of sources reveals that he not only didn't say it but didn't say anything like it. The source of the quote is unknown. Sometimes you don't know what you think you do.
This article was originally written by Daniel J. Levitin, Ph.D., and full credit goes to Psychology Today, which published it over a year ago. It has been reprinted here for educational purposes.
People face many pitfalls on the path to making reasonable decisions—whether they're consumers making a purchase or employees meeting performance goals. How do you make it more likely that they'll land on the decision that is most beneficial for them and for your organisation?
Our understanding of the unconscious mind has come a long way since Sigmund Freud, grounded in decades of research into what drives ordinary, everyday human behaviour. Today’s behavioural scientists like to say that we are predictably irrational. And what can be predicted can be managed, at least to some degree.