The Risky Mix of Aging, Public Visibility and Social Media

Tweeting, like driving, creates perils that become more dangerous as we age, and the more stature or visibility people attain, the more they have to lose.

In history books, Oxford zoologist Richard Dawkins will be known for his contributions to science. But you’d never know that from the last two years of his media coverage, which has centered on a series of controversial tweets, mostly about women and Islam. The tweets, and Dawkins’ attempts to defend them, have provoked fierce debate among atheists about whether his visibility as an advocate for secularism has become a liability to the cause. The story raises challenging questions for aging public figures in an era of instantaneous communications.

What’s at Stake

Richard Dawkins received an endowed chair at Oxford and honorary doctorates from 12 other universities for good reason. His scholarship and writing about biological and cultural evolution changed the thinking of his generation. Two game-changing ideas he advanced were that evolution operates at the level of genes rather than individuals, and that natural selection operates outside of biology, determining, for example, which bits of culture spread or get handed down to future generations and which don’t. Dawkins popularized these ideas in his book The Selfish Gene, in which he coined the term meme, defined as a unit of cultural information that replicates, like a melody, fad, joke, or religious belief. The term meme itself replicated widely along with the concept it represents, spawning a new field of study.

Today concepts like viral marketing are commonplace, and engineers write computer code that uses natural selection to generate novel solutions to complex problems. But at the time Dawkins first spelled them out, his ideas were radical. They built on the work of scientific greats including Charles Darwin, Ronald Fisher, George C. Williams, Bertrand Russell and Nikolaas Tinbergen, but Dawkins combined their findings to advance novel lines of thinking. If citation counts are any indicator, his scholarship laid the foundation for a generation of scientists who have gone on to test or critique his ideas, replacing some and refining others, which is exactly how scientific inquiry functions at its best.
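
For readers curious what "computer code that uses natural selection" might look like, here is a minimal sketch of a toy genetic algorithm in Python. It is purely illustrative, not drawn from Dawkins’ work or from any particular library; the target string, population size, and mutation rate are arbitrary assumptions.

    import random

    # A toy genetic algorithm: evolve a bit string toward a target using
    # selection, crossover, and mutation. All parameters here are illustrative.

    TARGET = [1] * 20            # the "solution" we hope selection will discover
    POP_SIZE = 50
    MUTATION_RATE = 0.02
    GENERATIONS = 200

    def fitness(genome):
        # Higher score for genomes that match more of the target.
        return sum(g == t for g, t in zip(genome, TARGET))

    def mutate(genome):
        # Flip each bit with a small probability, mimicking random mutation.
        return [1 - g if random.random() < MUTATION_RATE else g for g in genome]

    def crossover(a, b):
        # Splice two parents at a random cut point, mimicking recombination.
        cut = random.randrange(1, len(a))
        return a[:cut] + b[cut:]

    # Start from a random population and let selection do the searching.
    population = [[random.randint(0, 1) for _ in TARGET] for _ in range(POP_SIZE)]

    for generation in range(GENERATIONS):
        population.sort(key=fitness, reverse=True)   # selection: fittest first
        if fitness(population[0]) == len(TARGET):
            print("Target reached in generation", generation)
            break
        parents = population[: POP_SIZE // 2]        # only the fitter half breeds
        population = [
            mutate(crossover(random.choice(parents), random.choice(parents)))
            for _ in range(POP_SIZE)
        ]

Real evolutionary algorithms run this same loop of variation and selection over things like circuit layouts or schedules rather than bit strings, which is what makes them useful for problems too messy for hand-crafted solutions.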

After the September 11, 2001, attacks by Islamic terrorists who used airliners full of people and jet fuel as bombs, killing nearly 3,000 people, Dawkins shifted focus and took up trailblazing of another sort. He used (and to some extent spent) his standing as an academic scientist to create a vigorous public critique of religion as a source of falsehood and harm. Together with journalist Christopher Hitchens, philosopher Daniel Dennett and neuroscientist Sam Harris, he became known as one of the “four horsemen” of the New Atheist movement. As in biology, Dawkins and his colleagues set in motion waves of change, and as in biology their work has provoked fierce critique and debate among other intellectuals, many of whom agree in essence with their thinking but find parts oversimplified, overgeneralized or incomplete.

A View From Gen Y

As someone who believes that we each get one precious life, and someone fascinated with understanding the intricacies of the natural world and the human mind, I owe Dawkins a debt of gratitude. And once, when I found myself passing time with him before a speaking event, his demeanor as a seemingly genteel but awkward introvert made me even more grateful that he has been willing to tolerate such a high level of conflict and publicity for so long in the service of change.

But when I said as much to my college-age daughter, Brynn, she wrinkled her nose. “I think he’s a bit of an ass,” she shot back. I winced, but I knew what she was talking about. Brynn’s primary impression of Dawkins comes from those awkward comments, often about women, that make their way from Twitter to Facebook to news and opinion sites, where indignant commentators call him a bigot or sexist and suggest that he should step down.

As an introvert and nerd-girl, Brynn calls things as she sees them. If our strengths and weaknesses are two sides of the same coin, one strength of introverts is that they have less need for social approval than extroverts, and so they speak uncomfortable truths when others won’t. (A corresponding downside is that they often exhibit less social finesse and sensitivity in how and what they say.) One of the strengths of nerds, including scientists, is that their obsession with figuring out what’s real makes them less prone to self-delusion than many of us. (A corresponding downside is that they often don’t consider the potential harms of their hypotheses and discoveries.)

Brynn is just a college student, and possibly an aspiring scientist, but her blunt assessment of Dawkins mirrored the very qualities and behaviors that prompted her reaction. Part of the reason Dawkins has been so influential in positive ways is that he has a compelling drive to say what he thinks, whether his opinion is popular or not. But coupled with Twitter, this quality has created a public relations storm that threatens to undermine his legacy as a pioneer, his public influence, and the ability of his foundation to advance important secular values, including the study of religion as a natural phenomenon.

The Peril of Social Media

All of us have thoughts that are insensitive, or reactive and driven by emotion, or simply not well thought out, or a product of our blind spots. Twitter is the ultimate means of blurting those thoughts to the world, and once they are out, you can’t take them back. Kids who come of age with social media learn this young, often by trial and error. Childhood and adolescence can be thought of as laboratories for trying on social behaviors and figuring out what works, and what doesn’t, with the goal of getting through a bunch of the errors before they really count. Mercifully, at age 12 or 15, the consequences of blurting 140 characters to the world are usually minor.

But consider those of us who grew up with party-line phones and manual typewriters. For us, the trial-and-error starts at a time when our utterances have far heftier consequences, especially if we have been successful enough in some arena to attract a following. If we care about reaching an audience, we probably have some hazy notion that social media are where it’s at, that we should be tweeting and posting to Facebook and Instagram (but who knows what and when).

As old dogs, we find it tougher to learn new tricks. Add to this the fact that the brain’s executive function, which was immature but growing during those adolescent years (one reason we forgive teen peccadillos), begins to decline as middle age winds down. It gets harder to pay attention, and we become less adept at editing our impulses.

Social media have inherent social risks, regardless of age, but the intersection between aging and social media is particularly perilous, and the more stature and visibility a person has attained, the more he or she has to lose. Many public figures, like politicians and corporate executives, have staff to vet their communications and institutional rules intended to prevent the kind of reputation damage that comes from broadcasting ideas that are unseemly, not thought out, or badly expressed. The rest of us, freelance writers like me, documentary filmmakers, actors, and Oxford professors emeritus, are largely on our own.

Realities We’d Like to Ignore

Sometime between middle age and end of life, as bone and muscle composition change, most people make self-protective decisions to scale back physical activities that feel risky: skiing, surfing, lifting big rocks, climbing ladders. At some point most people stop driving as reaction time, focus, and visual acuity decline, although the age at which the risks exceed the benefits varies widely.

Cognitive changes related to aging are less obvious and less well understood than physical changes, but no less significant. Some aspects of information processing, such as processing speed, begin to decline in our 30s. Verbal ability remains stable the longest, which can mask changes in other kinds of processing (and allow politicians to remain in office indefinitely). In one small pilot study, 90 percent of adults over 70 showed some measurable evidence of cognitive decline; more conservative and complex measures suggest that 22 to 50 percent experience a condition that researchers term Mild Cognitive Impairment. Although researchers are frantically searching for medications or exercises that might reverse cognitive decline, so far the results are disappointing. On our current course, almost a third of us will eventually die with dementia, and given the indicators I assume that includes me.

Engaged for the Long Haul

Prudence dictates that at some point each of us with an audience should step out of social media and eventually the public eye. But between cognitive peak in middle age and the point at which even the most innovative thinker or eloquent orator should focus on long walks and loved ones lies a stretch as long as 20 to 40 years.

Developmental theorist Erik Erikson described middle adulthood, roughly the years from age 35 to 60, as the time of “generativity,” the stage at which we are most creative and productive and make our mark. He believed that the primary goal of later years was “integrity,” an inward process of synthesizing values and experiences and preparing for mortality.

But defining clean stages oversimplifies the matter. During the integrity years, those of us who care about discovery and society can contribute in powerful, important ways by staying engaged, and in fact for some of us that will be our most generative time. However, this means actively managing and minimizing the risks associated with the psychology and biology of aging. It can be done–with a little help from our friends (and, perhaps, other editorial staff).

Who Has Your Back?

Managing aging wisely and well may have some parallels with managing drinking. Back in my youth, people drove themselves home after parties and sometimes paid the price. My daughter’s peers have a totally different ethic. They agree in advance who is taking care of whom; they trade off; and they step it up when one of their circle is going through a rough patch. When you are a college student at a party, it really, really helps to have friends watching out for you–friends who know you well enough to recognize when you’re not at your best, who trust you enough to tell you so, who are smart and strong enough to take your keys if need be, and who love you enough to see that you get home safe.

The presence of an honest, protective, confrontive community of love and support makes a world of difference during college and in our waning years, but also at every point in between. As poet Gwendolyn Brooks put it, “We are each other’s business; we are each other’s harvest; we are each other’s magnitude and bond.”

When I mentioned to my husband (and chief supporter and critic) that I was wrestling with this article, he asked, “Are you now adding a theme of aging and cognition to your writing topics?”

“It seems so,” I responded. “This is the fourth time it has come up.”

“That’s good,” he said. “You’ve been paying a lot of attention to this issue lately.” He’s right. With our daughters off to college, I find myself keenly aware of the next phase, and how quickly it will be over. “Besides,” he added, “if you keep writing about it, then as you start declining, your readers will recognize what’s happening.” We both laughed. For some reason, it was a comforting thought.

Maybe I look forward to having more of an excuse for my inevitable failures of logic or compassion or fact-checking. As an independent writer, I find myself cringing regularly about something that I’ve said in print, something that I wish I could clarify or nuance or even retract, but can’t because my words have replicated across the internet. Occasionally I have to chase from site to site, confessing in a series of humiliating emails that something I wrote is factually inaccurate or patently unfair, and requesting corrections. I keep reminding myself that it would feel even worse to let the errors stand, and that the power of the web to replicate ideas is a good thing, and that I am more than my lapses.

The Lucky Ones

Recently I sent Brynn a video based on an excerpt from Richard Dawkins’ book, Unweaving the Rainbow. I wanted her to know that Dawkins is more than his worst moments and to remind her, by analogy, that she is more than hers, as I am more than mine.

The video opens with a small brown bird falling from a nest and floundering. “We are going to die,” Dawkins begins, “and that makes us the lucky ones.

“Most people are never going to die because they are never going to be born. The potential people who could have been here in my place but who will in fact never see the light of day outnumber the sand grains of Arabia. Certainly those unborn ghosts include greater poets than Keats, scientists greater than Newton. We know this because the set of possible people allowed by our DNA so massively exceeds the set of actual people. In the teeth of these stupefying odds it is you and I, in our ordinariness, that are here.”

Brynn messaged me back a couple of days later: “The Dawkins video was pretty beautiful.” I knew she would get it. Even in our ordinariness, amidst our faults and foibles and glitches and blurts, we create beauty and knowledge and love, because that is what it means to be alive. And as Richard Dawkins said, this complicated and awkward but ordinary mix makes us the lucky ones.

Valerie Tarico is a psychologist and writer in Seattle, Washington. She is the author of Trusting Doubt: A Former Evangelical Looks at Old Beliefs in a New Light and Deas and Other Imaginings, and the founder of www.WisdomCommons.org.  Her articles about religion, reproductive health, and the role of women in society have been featured at sites including AlterNet, Salon, the Huffington Post, Grist, and Jezebel.  Subscribe at ValerieTarico.com.

Related:
Dementia Care? No Thanks!
Two Liars for Jesus and an Aging Philosopher
Atheist Arrogance – Beneath the Stereotype, Things Get Complicated

12 Responses to The Risky Mix of Aging, Public Visibility and Social Media

  1. richardzanesmith says:

    Thank you, thoughtfully written….one amazing thing about brilliance is its budding, flower-like nature. The chart shared was a little scary (I’m edging to the end of my 35-60 prime), but a healthy flower blooms into the sun, invites its pollinators, and spreads its essence standing there being awesome. Eventually it withers, fades and falls, becoming absorbed back into the earth, having done what it was here to do. With Dawkins, maybe it’s our natural tendency to expect someone we admire to blossom longer, or have more overarching authority on a greater array of subjects beyond fallible limitation, incapable of bias or hypocrisy…even if we aren’t witnessing a bit of flower shriveling? But feeling a little disappointed in someone we admire is also pretty dang human too.
    “Rely not on the teacher/person, but on the teaching. Rely not on the words of the teaching, but on the spirit of the words….” attributed to the Buddha


  2. archaeopteryx1 says:

    As long as the world can tolerate a Pat Robertson, I can live with Richard Dawkins’ few foibles.


  3. Ron Taska says:

    I found the answer to my question by Googling “Richard Dawkins Gets Into Comments War With Feminists” describing “Elevatorgate.” How do we know that this is due to Dawkins getting old rather than to his being a jerk?


    • My apologies. I hesitated to link a bunch of inflammatory rants, but I realize that this has been covered more in the British than the U.S. press. I have added links to the controversies now. I don’t know for sure that the problem is age plus Twitter (or either alone), but it’s hard to imagine that he would have attained the stature and recognition he did if he had been this prone to simplistic blurts throughout his professional career.


  4. Allan Avery says:

    Affecting. Both of you in your hugely valuable and beautiful ways. I’m 19 years past that “prime” span, and experiencing the symptoms. (All six of the cognitive factors in your chart. Along with rapid decline in strength and mobility, that I must stop “fighting.”) But I’m not ready to Settle yet. (EITHER with my “Lucky Stars” or my life’s “work.”) Maybe there still is a blog in my future.
    First learn how.


  5. MB says:

    Valerie, the New Atheism is male chauvinistic. Just admit it and write an article about it. If Katha Pollitt can call out other atheists on their misogyny, you can too.


  6. richardzanesmith says:

    ahh…….love the smell of frying fish…especially big ones. I think one way to challenge patriarchy is to refuse to use patriarchal tactics. I’m speaking as a tribal member involved in the revitalization of our traditional matriarchal clan systems. In our traditional systems clan mothers were the bedrock, the foundation of society, not the ones to always get their weapons bloody (though they certainly had the strength to). They ordered defenses, farming, and village construction; they supervised the adoption of captives and marriages; they chose (and fired) representatives to speak for their respective clans. I know it’s not the same today, but within these systems was a pattern that created an amazing democracy that ol’ Ben Franklin seemed to have overlooked when he visited Longhouse communities. He didn’t see the clan mothers. He didn’t notice the women at all.


    • I think of humanity moving into a period when strengths traditionally (and perhaps to some extent biologically) associated with female gender are becoming increasingly important. I hope we will be able to draw on the residual wisdom of tribal cultures like the one you describe.


  7. Thanks for reminding us to focus on the best in others, especially famous writers, who also err (and more so it seems as they, too, age). I know I shouldn’t have lost sight of such a point, but as one ages, it’s all too easy to become focused on the negative.

    Richard Dawkins has tweeted some seriously bad stuff lately, but I will always be thankful for him writing The Ancestor’s Tale, a great user-friendly science book with his amazing prose and humorous asides.

