Mind Matters Natural and Artificial Intelligence News and Analysis

Information Today Is Like Water in the Ocean. How Do We Test It?

Often, we must sort through many layers of bias in information to get at the facts that matter

Last week, I wrote about the recent Netflix documentary, The Social Dilemma, focusing on the importance of objective truth in directing our lives and relationships. Equally important is freedom.

Freedom of speech is one of the most important freedoms that we have in the United States. Our First Amendment states: “Congress shall make no law respecting an establishment of religion, or prohibiting the free exercise thereof; or abridging the freedom of speech, or of the press; or the right of the people peaceably to assemble, and to petition the Government for a redress of grievances.”

Many countries have attempted to copy the First Amendment’s general intent. The problem we immediately face when we are free to speak should be obvious: we are free to tell lies and others are free to believe us. The vast majority of us can spot obvious snake oil. But the devil is in the details: the subtle half-truths that put a slight twist on reality.

For example, we might be told that a local contractor has often been accused of violating building codes — which creates a certain impression. Later, we may learn that everyone in that industry has been accused of violating building codes a number of times. It’s a pattern, and we cannot deduce information about a specific individual from that pattern alone. We need more information than the half-truth we were told.

Then of course there is our personal “lens,” the one through which we view existence and reality. Everyone has their own lens, their own framework for understanding and interpreting what they see, hear, feel, and experience. The older we get, the thicker our lens gets and the more we have a pre-existing layer of interpretation already in place to explain what we see.

When we put these two effects together — the half-truths and the lens through which they are viewed — we experience some interesting and troubling downstream effects. What we want to believe matters more than what is truly real! And this scares me.

The Social Dilemma addressed the impact of social media bubbles on the growing division between political factions in the United States. Another possible resource is Scientific American’s recent e-book, Deception in the Digital Age, which offers some definitions that may be helpful when sorting through the confusion:

Cognitive bias: The “Einstellung effect” — the human brain’s propensity to continue trying known solutions to new issues and ignore alternative options that are better. We have a hammer so everything looks like a nail…

Confirmation bias: We ignore data that does not agree with preconceived notions or desired outcomes. For example, we may want to believe that a person we admire is trustworthy so we don’t examine a decades-long history of betrayals.

Conformism: We prefer to look and act like others in our community. The desire to be the same is a very deep part of what makes us human, but it can lead to going along with things that we personally know to be false or wrong. As a result, we often behave worse than if we were left to ourselves.

Epistemic relativism: The notion that truth is relative and changes depending on the perspective of the individual. That view is quite prominent today and, for some, it includes a literal demand that 2 + 2 = 5.

Epistemic trespassing: An expert in a given field may opine on another field without adequate understanding of that field — and make mistakes. For example, in 2017, Science reported (and received broad coverage for doing so) that a DNA study had disproved the Bible’s suggestion that the Canaanites were wiped out in antiquity. But the Bible repeatedly states the opposite, that the Canaanites were not wiped out. Apparently, the DNA experts and the media did not feel the need to consult experts on the Bible before they wrote about it.

False balance: Validating extremist positions in an attempt to be open to all ideas and opinions. G.K. Chesterton (1874–1936), a defender of common sense, warned that “Merely having an open mind is nothing. The object of opening the mind, as of opening the mouth, is to shut it again on something solid.”

Implicit bias: The human brain is extremely good at recognizing patterns, and does so often unconsciously. Thus we may stereotype others automatically. A person might assume, based on life experience, that a dentist would be male and a kindergarten teacher would be female—and thus be surprised if, in a new neighborhood, the reverse is true.

Post-truth: In a society where obvious lies are not only told but tolerated, especially from politicians, we must assume that statements do not necessarily convey only information. British sociologist Steve Fuller has written a book about this, Post-Truth: Knowledge as a Power Game, in which he poses questions for reflection that could be summarized as: Who is allowed to decide what counts as science, as opposed to non-science or anti-science? On what basis? How did they achieve their positions? What events might change the way the situation is viewed?

Social learning: We learn most from those we trust—like parents, teachers, or good friends. Even if those we trust do not teach truth, they are still trusted. As with cognitive bias above, we may be unwilling to face the emotional impact of realizing that they misled us.

Social trust: We believe what we learn socially and we accept those sources of evidence as more reliable than others, despite the fact that other data sources may be more accurate. We may learn, for example, a variety of social myths about alcohol, some of which could cause us to fail a liver function test or a roadside sobriety test.

Most of what we communicate to others (including what you are reading now) is designed to do at least two things: (a) impart information and (b) influence understanding of, and opinion about, that information. If communication is not truthful, or only partly truthful, then influence becomes manipulation.

The World Economic Forum Report (2017) stated that two major issues facing our society today are “deepening social and cultural polarization” and the “post-truth political debate”:

Free speech and the lively contest of ideas are a fundamental part of the democratic process, but they depend on all participants accepting each other’s good faith and a shared set of underlying facts. Historically, relatively small numbers of media outlets provided a widely trusted common foundation for national debates. Increasingly, however, the media landscape is characterized by fragmentation, antagonism, and mistrust, with individuals tending to segregate themselves according to their values and beliefs. Online “echo chambers” reinforce rather than challenge people’s existing biases, making it easier for misinformation to spread.


So, when humans work together as a group to find a solution, they should reach a consensus based on objective data. However, what sometimes happens is the opposite: polarization, entrenched disagreement about two or more possible solutions or outcomes. For example:

  • We should ban plastics for the ocean’s sake! vs. No, plastics are important in the fight against infectious diseases!

or

  • We should ban high-school football to prevent head injuries! vs. No, football is what keeps many kids in school and off drugs!

So, freedom is important. The truth, equally so. It’s hard enough for our human minds to discern the truth when we have our own biases and worldviews. We don’t need others lying to us, even a little, nor should we lie to ourselves. So what do we do? Here are just a few things that will make things a bit easier:

  • Be nice. Be kind. Be gentle. Be polite. It is absolutely amazing how powerful this simple tool can be, and how troubling it is when not used. It’s hard to do sometimes.
  • Calm down. The division between “us” and “them” is growing, and it should be shrinking. “We” are all human. We all live on earth. We need to remain human, and keep living here, and to do that well, we need to do that together.
  • Accept that our thoughts, feelings, opinions, and beliefs are not universally shared, and realize that we might be wrong about something. Importantly, what we want to believe may be wrong. If we find out that we are wrong—own up to it. Going deeper down the wrong path won’t help anyone.
  • Verify. Realize that others may really believe that what they are telling us is true. But take the time to verify before repeating. And don’t just verify with one source—especially one that we really like. Pass it through an opposing filter.
  • Listen. The best way to communicate is to listen well, and to listen with the goal of finding a solution, not a way to win. Dialogue is only possible if one listens.

You may also enjoy:

The Social Dilemma: You’re not the customer, you’re the product. A new Netflix documentary explores the techniques used to explore, then shape and sharpen, our attitudes, values, and beliefs. One real danger of letting social media shape a bubble for us to live in is that we may not know enough of the real world to have a genuine, personal opinion. (James C. Patterson II)

Are you trapped in a news bubble? The news filtered to you might leave out important things you need to know. But how can you tell? (Russ White)

and

Escaping the news filter bubble: Three simple tips Spoiler: Reduce the amount of information big providers have about YOU (Russ White)


James C. Patterson II

James C. Patterson II is the Chairman of the Department of Psychiatry and Behavioral Medicine at LSU Health Shreveport. Dr. Patterson received his undergraduate degree in biology from Lamar University, followed by his combined MD and PhD in Neuroscience from the University of Texas Medical Branch in Galveston, where he also completed his internship and residency in psychiatry. He completed a Psychiatry/Functional Neuroimaging Fellowship at the National Institute of Mental Health in Bethesda, Maryland. He is board certified by the American Board of Psychiatry and Neurology. He and his wife of many years have two children, two grandchildren, three cats, and two dogs. He has multiple hobbies, including science apologetics, carpentry, landscaping, and computers.
