On a warm afternoon in June, two men have appointments with a psychiatrist. The first has been dragged to the office by his wife, much to his irritation. He is a biologist who suffers from schizophrenia, and his wife insists that his meds are not working. “No,” says the biologist, “I’m actually fine. It’s just that because of what I’m working on right now, the CIA has been bugging my calls and reading my email.” Despite his wife’s skepticism and his own understanding of his illness, he calmly insists that he is sure, and he lines up evidence to support his claim. The second man has come on his own because he is exhausted and desperate. He shows the psychiatrist his hands, which are raw to the point of bleeding. No matter how many times he washes them (up to a hundred in a day) or what he uses (soap, alcohol, bleach, or scouring pads), he never feels confident that they are clean.
In both of these cases, after brain biochemistry is rebalanced, the patient’s sense of certainty falls back in line with the evidence. The first man becomes less sure about the CIA thing and gradually loses interest in the idea. The second man begins feeling confident that his hands are clean after a normal round of soap and water, and the cracks begin healing.
How do we know what is real? How do we know what we know? We don’t, entirely. Research on psychiatric disorders and brain injuries shows that humans have a feeling or sense of knowing that can be activated by reason and evidence but can be activated in other ways as well. Conversely, when certain brain malfunctions occur, it may be impossible to experience a sense of knowing no matter how much evidence piles up. V. S. Ramachandran describes a brain-injured patient who sees his mother and says, “This looks like my mother in every way, but she is an impostor.” The connection between his visual cortex and his limbic system has been severed, and even though he sees his mother perfectly well, he has no sense of rightness or knowing, so he offers the only explanation he can find (a condition known as the Capgras delusion).
From malfunctions like these, we gain an understanding of normal brain function and how it shapes our day-to-day experience, including the experience of religion. Neurologist Robert Burton explains it this way: “Despite how certainty feels, it is neither a conscious choice nor even a thought process. Certainty and similar states of knowing what we know arise out of involuntary brain mechanisms that, like love or anger, function independently of reason.”[i] This “knowing what we know” mechanism is good enough for getting around in the world, but not perfect. For the most part, it lets us explain, predict, and influence people or objects or events, and we use that knowledge to advantage. But as the scenarios above show, our ability to tell what is real can also get thrown off.
Burton says that the feeling of knowing (rightness, correctness, certainty, conviction) should be thought of as one of our primary emotions, like anger, pleasure, or fear. Like those feelings, it can be triggered by a seizure, a drug, or direct electrical stimulation of the brain. Research after the Korean War (e.g., the work of Robert Jay Lifton) suggested that the feeling of knowing or not knowing can also be produced by so-called brainwashing techniques: repetition, sleep deprivation, and social and emotional manipulation. Once triggered, for any reason, the feeling that something is right or real can be incredibly powerful, so powerful that when it goes head to head with logic or evidence, the feeling wins. Our brains make up reasons to justify our feeling of knowing rather than following the logic to its conclusion.
For many reasons, religious beliefs are usually undergirded by a strong feeling of knowing. Set aside for the moment the question of whether those beliefs tap some underlying realities. Conversion experiences can be intense, hypnotic, and transformative. Worship practices, music, and religious architecture have been optimized over time to evoke sensations of transcendence and euphoria. Social insularity protects a community consensus. Repetition of ideas reinforces a sense of conviction or certainty. Forms of Christianity that emphasize right belief have built-in safeguards against contrary evidence, doubt, and the assertions of other religions. Many a freethinker has argued a smart, educated fundamentalist into a corner only to have the believer utter some form of “I just know.”
Does this mean that rational argumentation about religion is useless? The answer may be disappointing. Religious belief is not bound to ordinary standards of evidence and logic. It is not about logic but about something more intuitive and primal. Arguments with believers start from a false premise: that the believer is bound by the rules of debate rather than by the belief itself. The freethinker assumes that the believer is free to concede, but this is rarely true. At best, the bits of logic or evidence put forth in an argument go into the hopper with a whole host of other factors. And yet each of us who is a former believer (we number in the millions) reached some point in our lives when we simply couldn’t sustain our old certainties. Our sense of knowing either eroded over time or abruptly disappeared. So sometimes those hoppers do fill up.
Given what I’ve said about knowing, how can anybody claim to know anything?
We can’t, with certainty. Those of us who are not religious could do with a little more humility on this point. We all see “through a glass darkly” and there is a realm in which all any of us can do is to make our own best guesses about what is real and important. This doesn’t imply that all ideas are created equal, or that our traditional understanding of “knowledge” is useless. As I said before, our sense of knowing allows us to navigate this world pretty well—to detect regularities, anticipate events and make things happen. In the concrete domain of everyday life, acting on what we think we know works pretty well for us. Nonetheless, it is a healthy mistrust for our sense of knowing that has allowed scientists to detect, predict, and produce desired outcomes with ever greater precision.
The scientific method has been called “institutionalized doubt” because it forces us to question our assumptions. Scientists stake their hopes not on a specific set of answers but on a specific way of asking questions. Core to this process is falsification: narrowing down what might be true by ruling out what can’t be true. To date, that approach has had enormous payoffs. It is what has made the difference between the nature of human life in the Middle Ages and in the 21st century. But knowledge in science is provisional; at any given point in time, the sum of scientific knowledge is really just a progress report.
When we overstate our ability to know, we play into the fundamentalist fallacy that certainty is possible. Burton calls this “the all-knowing rational mind myth.” As scientists learn more about how our brains work, certitude is coming to be seen as a vice rather than a virtue. A claim of certainty is a confession of ignorance about our capacity to be passionately mistaken. Humans will always argue passionately about things that we do not know and cannot know, but with a little more self-knowledge and humility we may get to the point that those arguments are less often lethal.
[i] Robert Burton, On Being Certain: Believing You Are Right Even When You’re Not (New York: St. Martin’s Press, 2008), xi.