Truth matters in politics, culture, and personal beliefs

Some big questions have demonstrably true answers. But when those answers don’t fit powerful narratives, some powerful people declare the questions irrelevant.

Or they come up with pragmatic answers: whatever works at the moment to dodge the truth.

As the Planned Parenthood president just did this week, saying that when life begins is not really relevant to the abortion debate.

“It is not something that I feel is really part of this conversation,” Cecile Richards of Planned Parenthood told Fusion’s Jorge Ramos on Thursday. “I don’t know if it’s really relevant to the conversation.”

When pressed, Richards said that in her view life began for her three children when she delivered them.

She explained that the purpose of her organization is not to answer a question that “will be debated through the centuries,” but to provide options for pregnant women.

People who choose to deny the facts may call them debatable, or beyond debate, or simply dismiss them as an incoherent diversion.

But it is not debatable when life begins. It is scientific fact.

Planned Parenthood’s Cecile Richards dodging the question of human life by calling it irrelevant to the abortion debate is seriously dishonest and disingenuous, at best. It provides the occasion to recall former abortionist Dr. Bernard Nathanson, one of the original architects of the abortion movement in America, who for many years after his conversion told the story behind the lies and deceptions. Late in his life, in a dramatic effort to help secure legislation in South Dakota that would strengthen informed consent laws, he made a video admission that he and the other original founders of NARAL made up the numbers and the ‘facts’ to ‘save abortion at all costs.’

His lesson about the importance of devising and driving a narrative “at all costs” applies to the whole choice movement, and Richards’ response reveals where incoherence inevitably leads.

It happens too often in other kinds of politics. Remember Hillary Clinton facing a congressional inquiry into what really happened in the notorious Benghazi attacks, finally and angrily shouting, ‘What difference does it make?’

Political commentator Charles Krauthammer says there’s all the difference.

There’s a difference between the truth and a lie. The difference is that people in high office with public trust ought not lie. And if it was a lie, for whatever political or other reason, it shouldn’t have happened, and the administration itself should have traced it down and corrected it. And they didn’t. And that’s what is disturbing and remains disturbing.

And some people are still seeking the truth about that.

And, as Dr. Martin Luther King Jr. taught, there is an eternal truth, and it applies to all social issues. And those who seek it will find it.

The life of the mind

Opinions are individual and subjective. But facts are facts. Trouble is, they’re usually communicated or interpreted by someone. That’s where opinion comes back in…

And, says the Boston Globe, the facts backfire.

Recently, a few political scientists have begun to discover a human tendency deeply discouraging to anyone with faith in the power of information. It’s this: Facts don’t necessarily have the power to change our minds. In fact, quite the opposite. In a series of studies in 2005 and 2006, researchers at the University of Michigan found that when misinformed people, particularly political partisans, were exposed to corrected facts in news stories, they rarely changed their minds. In fact, they often became even more strongly set in their beliefs. Facts, they found, were not curing misinformation. Like an underpowered antibiotic, facts could actually make misinformation even stronger.

What?

…most voters — the people making decisions about how the country runs — aren’t blank slates. They already have beliefs, and a set of facts lodged in their minds. The problem is that sometimes the things they think they know are objectively, provably false. And in the presence of the correct information, such people react very, very differently than the merely uninformed. Instead of changing their minds to reflect the correct information, they can entrench themselves even deeper.

“The general idea is that it’s absolutely threatening to admit you’re wrong,” says political scientist Brendan Nyhan, the lead researcher on the Michigan study.

“Cognitive dissonance” is more than mumbo-jumbo. This is where we are, culturally. Academically. People are increasingly afraid to be wrong, claims First Things’ R.R. Reno.

For a long time as a young teacher, I believed the danger of prostituting their minds by believing falsehoods was the preeminent, or even singular, intellectual danger my students faced. So I challenged them and tried to teach them always to be self-critical, questioning, skeptical. What are your assumptions? How can you defend your position? Where’s your evidence? Why do you believe that?

I thought I was helping my students by training them to think critically. And no doubt I was. However, reading John Henry Newman has helped me see another danger, perhaps a graver one: to be so afraid of being wrong that we fail to believe as true that which is true. He worried about the modern tendency to make a god of critical reason, as if avoiding error, rather than finding truth, were the great goal of life.

That tendency has only grown in modern times, as the Boston Globe piece suggests.

Most of us like to believe that our opinions have been formed over time by careful, rational consideration of facts and ideas, and that the decisions based on those opinions, therefore, have the ring of soundness and intelligence. In reality, we often base our opinions on our beliefs, which can have an uneasy relationship with facts…

Worst of all, they can lead us to uncritically accept bad information just because it reinforces our beliefs. This reinforcement makes us more confident we’re right, and even less likely to listen to any new information. And then we vote.

Or take some other highly consequential action.

Reno says we have to want to know ‘the truth’ and “risk error as we leap forward to grasp” it.

Sometimes, the dangers of failing to affirm the truth are far greater than the dangers of wrongly affirming falsehood.

If we see this danger—the danger of truths lost, insights missed, convictions never formed—then the complexion of intellectual inquiry changes, and the burdens of proof shift.

As Chesterton said, it’s not what you look at, but what you see.