Saturday, April 15, 2017

For Those Who Love Truth - Interview with Lee C. McIntyre - with update

Yes, we've been doing a lot of interviews here, because there are so many cool writers doing great work! Here's one that's a bit different, as Lee has so far been publishing in the non-fiction world.
But we'll be talking about fiction - specifically, why so many people believe things that aren't true, even when shown the truth.
Disclaimer: those who are offended by facts and prefer their opinions over verified science and reality should not read further. They are likely to have some beliefs challenged by a rational thinker.


Bio:
Lee McIntyre is a Research Fellow at the Center for Philosophy and History of Science at Boston University and an Instructor in Ethics at Harvard Extension School. He holds a B.A. from Wesleyan University and a Ph.D. in Philosophy from the University of Michigan (Ann Arbor). He has taught philosophy at Colgate University, Boston University, Simmons College, Tufts Experimental College, and Harvard Extension School.

His most recent book is Respecting Truth: Willful Ignorance in the Internet Age (Routledge, 2015), in which he explores the problem of why people sometimes refuse to believe something even when they have good evidence that it is true. In a forthcoming book, Post-Truth (which will be part of the “essential knowledge” series at MIT Press), he explores the recent attack on facts and truth since the 2016 Presidential election.

*****

Q. Lee, you wrote this book well before the current climate of mass disbelief in facts. Did you see all this coming? Please tell us a bit about the book's origin.

A. I wouldn’t say I “saw it coming” because I had hoped it wouldn’t get to this point, but the idea of “denialism” was certainly out there and I was fighting against it. One of the most maddening things is that the tactics which were successfully used to obfuscate the truth about things like evolution, climate change, and vaccines have now made the jump to ALL factual topics. It used to be that political ideology was keeping people from believing the truth about science. Now it’s about things like whether it rained during Trump’s inauguration or whether the murder rate is going up. This is distressing because we’re moving in the wrong direction. 

Q. For some of us, it's frightening and impossible to understand how millions of people can just choose not to believe in reality, and still function. Tell us how this comes about.

A. It’s pretty frightening to me too, even though I’m trying to understand it better. In my new book Post-Truth, I’m examining some psychological research that has explored the question of disbelief in the face of evidence. What they’ve found is that we are wired with cognitive biases that can smooth the path toward irrationality. I don’t think anyone can really explain how evolution allows this (what’s the reward for disbelieving the truth?), but it is there, at the neural level. Of course, we’ve known for years that emotion, desire, and motivation can color our beliefs. Way back in the 1950s, Solomon Asch was doing work that showed that if you put someone in a room with others who all gave the wrong answer to a factual question, he’d give the wrong answer too. These were situations where it was easy to tell that the answer was wrong, but there is a strong human desire to conform. Unfortunately, this is exacerbated by getting positive feedback for your mistakes, so when people hunker down in a news silo or a chat room where they believe wild things and everyone around them does too, they don’t get the kind of negative feedback that is necessary to change their beliefs. Belief becomes tribal. More recent psychological work has shown that once we get to this level, it is very hard to convince someone to change their mind, even when the facts are right in front of them. They just don’t see it. It’s not that they are being stubborn: they literally can’t see the truth anymore.

Q. Can you give an example?

A. Sure. During the 2016 election, conservative voters in Pennsylvania, Michigan, and Wisconsin were targeted with a deluge of fake news in their Facebook feeds. This was a coordinated attack by Russia, which hired thousands of hackers to produce ridiculously false and horrible stories about Hillary Clinton. You’d hope that people would be able to apply some critical reasoning skills and know that these weren’t true, but when your friends are passing the same stories back and forth, one might begin to wonder: “Does Hillary have a brain tumor?” “Did she organize a child sex slave ring out of a Washington DC pizza restaurant?” This sounds absurd, but in politics you ignore absurd stories at your peril. Remember the “Swift Boat Veterans for Truth” during the 2004 election, who tried to make the case that John Kerry was a coward in Vietnam? He didn’t want to “dignify” it with a comment for two weeks. By then it was too late. When people hear false stories over and over again, and their friends are talking about them, they are more likely to believe them.

Q. What do you feel is the main reason people believe something obviously untrue?

A. It’s called motivated reasoning. In short, they want to believe it. If someone wants to believe something, then there is an easy pathway in their brain to try to make it true. Daniel Kahneman talks about this in his wonderful book Thinking, Fast and Slow. When we hear something that we want to be true, we engage in something called “confirmation bias,” which is when we go out and look for reasons to think that the belief is right. But the problem is that if you’re on the hunt for reasons to believe something, you’re probably going to find them, even if the belief is wrong. This is why science has so much invested in testing a hypothesis – in trying to disconfirm a theory. You don’t learn much by examining evidence that one of your beliefs is true; you learn by trying to find evidence that it’s not. But who is going to take the time to do this? When we all got our news from the same media sources, there was more opportunity to work from the same set of facts. Now a lot of the alternative media are simply making things up, and no one can tell what the facts are anymore.

Q. Why is this mindset dangerous?

A. The main reason is that it’s so easy. Like I said, it’s wired in. Whether we’re liberal or conservative, our brains are set up to engage in a process that feels a lot like thinking, but it really isn’t. In the past, it may have felt safe to dismiss the kind of people who believed in conspiracy theories about climate change or government surveillance. Now those people are running the White House. And it’s dangerous at a general societal level too. Remember that fake news story about Hillary Clinton running a child sex slave ring? A deranged man read the story, showed up at the pizza restaurant where it was allegedly taking place, and fired off a few rounds from his rifle. An even better example occurred a few weeks later, when the Pakistani Defense Minister read a fake news story claiming that Israel would nuke Pakistan if it sent any ground troops to Syria. He immediately threatened nuclear retaliation against Israel. Fake news can get people killed.

Q. When public officials go on record with "alternative facts," do they know they're lying, or are they blinded by their ideology?

A. That’s a good question and it’s hard to know. There is a long tradition in American politics of “spinning,” which is putting the most favorable face on a set of facts. But I think we’re way beyond that now. It’s not necessarily that they know they’re lying, but maybe they’re not really sure what’s a lie and what’s the truth anymore. When you watch Kellyanne Conway, I think that most of the time she’s lying and she knows it. She’s too good at what she does to be avoiding the truth that assiduously by accident. It must be a deliberate campaign of obfuscation. That said, some have argued that the best way to deceive others is to deceive yourself first. We saw this back on election night in 2012. Remember when Karl Rove was doing the color commentary for FOX News and he just wouldn’t accept that Obama had won the election? Even though FOX had already called it, he kept insisting that the numbers were wrong and that when a few more counties came in from Ohio, Romney would win “in a landslide.” That is delusion. That is someone who is so deep into their ideology that they can’t see the facts anymore. The goal is to stop people from getting to that point. Every lie has an audience. Even if you can’t convince the liar, what about the people who are listening? If we can stop someone before they make that slide from ignorance to “willful ignorance” to full-blown denialism (or delusion), then we’ve done a good thing. But Karl Rove? I think he’s a lost cause. Kellyanne Conway too, because even if she’s aware of what she’s doing, she’ll never admit it.

Q. Tell us what rational, thinking people can do to counter this mass hysteria.

A. Fight back. Don’t let a lie go unchallenged. Keep relentlessly pushing the truth. The problem occurs when people are only hearing one side of the narrative. Propagandists have known this since Joseph Goebbels, and probably before. It’s called the “reiteration effect”: if you hear something over and over, you are more likely to believe it’s true. There is also something called “source amnesia,” which is when you remember the message but forget whether it came from a reliable source. People who want to get others to believe their lies capitalize on this, and they have to be fought. One of the most encouraging things I’ve read recently comes out of psychological research showing that if you just keep hammering people “right between the eyes” with the facts, it eventually has an effect. At first they resist, and it may even backfire, but you can also break through. Also, remember that if someone hears the same facts from more than one source, it will help them to believe them. The reiteration effect works both ways. Truth is a powerful weapon. And remember: all of these “irrational” people don’t think they’re being irrational. In their minds, they’re looking for the truth too. (We know this from fiction, right? The villain is the hero of his own narrative.) We can capitalize on this. Give them some facts that challenge the narrative of lies they’re being fed by the ideologues and the propagandists.

Q. Is there anything we can do to keep this from happening in the first place?

A. Teach critical thinking! And teach it early. I just read about a fifth-grade teacher in California who was teaching his students how to spot fake news. He made a game out of it. He gave them a rubric with checks such as “look for a copyright” and “look for a date on the story.” Simple things. Things a fifth grader could do. And they LOVED it. He said he can’t get them to go out for recess now until they play “the fake news game.” That is the right track. Also, we forget that the expectation of objectivity in a news source is a fairly recent luxury. The concept of objectivity didn’t even exist for American news until about the 1830s, and it didn’t really catch on until the scourge of “yellow journalism” in the 1890s. People need to learn how to be skeptical of what they are reading again. We need to engage our brains and expect to question things. And if we want more objective, fact-checked, double-sourced, investigative journalism, we should darn well be prepared to pay for it. I bought subscriptions to both the New York Times and the Washington Post just after the election. I hear a lot of other people had the same idea.

Q. If someone came to you for advice on how to deal with the current political situation, how would you help?

A. This is a tough one, because I’m a philosopher and not a political activist. But the one thing I’d say is don’t give up. Stand up for what you believe in and make sure your elected representatives know how you feel. Truth matters. Facts matter. But your voice matters too. If you don’t make a statement about your beliefs they will get drowned out.

Q. Give us a bit of hope, some good news about all this.

A. It may seem that we have given up on truth, but that is not true. Even when people are going to be personally hurt by something that is true, they are reluctant to destroy it. Nixon kept the Watergate tapes. Criminals keep souvenirs. Why do we do this? Because I think that at some level people have a deep desire to know that the truth exists, even if they want to ignore it for a while. It’s like taking a kayak out into the ocean. It’s fun and exhilarating, but you want to make sure to keep the shoreline in sight.

Q. When you wrote Respecting Truth, did you map a good deal out in your head (or even outline) before crafting, or did you piece together ideas until a form came about?

A. I had been working on the issue of science denial for quite some time, so a lot of the outline was already there. But then I had to really dig into the examples and figure out how to make them accessible for a general audience. Another challenge was to figure out how to write a book where I was offering some perspective on the topic while still telling a story. Philosophy is so argument-oriented that we sometimes forget people are more convinced by an example or a story than by a syllogism. I always outline. I can’t help it. But when I sit down to write, it’s an act of pure serendipity. I’ve got all of these sources and pieces of things I want to say, and I just draw on them and put them together. I guess it’s sort of like quilting (though I’ve never done that). You have the pieces, but you have to be ready for snags and surprises along the way.

Q. What would you want a reader to take away from reading this book?

A. That it is possible to understand why science denial is happening and that we can do something about it. My goal in writing these days is to engage the general reader. I still write some technical philosophy that’s primarily for my professional colleagues, but I enjoy the challenge of trying to reach a wider audience as well. In Respecting Truth, I want to think clearly about issues that are important to all of us, and draw the public into debates that might have seemed closed off. Truth and reason have been the subject of philosophy for the past 2400 years. All of a sudden they’re sexy topics. I think we need to embrace that.

Q. Who should we be reading and listening to now? Are there writers with similar themes to yours? Who are your influences (can be writers, or even artists, musicians, or others) and what is it about their work that attracts you?

A. Everyone who is interested in the story of how we came to be at a point where facts and truth are in question should read Naomi Oreskes and Erik Conway’s book Merchants of Doubt. It is a devastating history of how denialism over scientific topics (like smoking, acid rain, ozone, and climate change) has been manufactured by those who had money at stake. Ideology often has deep roots in economic interests. I don’t write about that aspect of it myself, but this book is great background for anyone who thinks it is all ideology. One of the most profoundly important books I’ve read in recent years is Robert Trivers’s The Folly of Fools, which talks about the role of deception and self-deception in human behavior. Trivers is a biologist, but he writes like a dream, and his insights are marvelous. In philosophy, I enjoy the work of Michael Lynch, Noretta Koertge, and Alex Rosenberg. In social science, there’s been some stunning work done by Sheena Iyengar, Brendan Nyhan, Jason Reifler, Daniel Kahneman, and Cass Sunstein. Some of my favorite “general audience” writers about similar topics are Robert Wright and Michael Shermer.

Q. Any goals you've set for yourself, professionally or personally? What's the next step in your writing world?

A. In addition to Post-Truth, I’m currently working on a book about scientific reasoning. At heart, I’m a philosopher of science, and I have a theory of what’s so special about scientific reasoning. That’s not quite a general-audience issue, but I’m writing it as clearly as possible, because I think that, these days especially, it’s an important issue for all of us. Post-Truth is a book that I’m really excited about. It’s short, pointed, and goes right to the heart of what I think is the main threat to our society today. But it’s also much more political than anything I’ve ever written. I’ve joked with my wife that if they ever start having political prisoners in the USA, they’ll have a cell waiting for me. I’m also an aspiring novelist. I love to read Joe Finder, Harlan Coben, and Linwood Barclay. I remember sitting on a beach one day reading John Grisham and saying, “I’ll bet I could do that…how hard can it be?” That was in 2004.

Q. Tell us a fun fact about yourself.

A. I once sat in the cockpit of an F-15 Eagle and got a perfect score on the Secret Service test to detect counterfeit money (not on the same day). I’ve also had a painting rejected by the Museum of Bad Art (not because it was too good, I can assure you).

Addendum: Lee now has a piece that has been accepted into the permanent collection of the Museum of Bad Art.
Congratulations! Keep after your dreams to make them happen!

Q. Any other information you'd like to impart?

A. I believe that reading is our strongest weapon against tyranny.

---

Web page: leemcintyrebooks.com/

Where to buy:  https://www.amazon.com/Respecting-Truth-Willful-Ignorance-Internet/dp/1138888818/ref=sr_1_1?ie=UTF8&qid=1492098049&sr=8-1&keywords=lee+mcintyre
