One afternoon back in the late ’80s, the publisher of the weekly newspaper I edited herded me and my staff into an abandoned cubicle to introduce us to the future of journalism: a personal computer. We stood, slightly bemused, as he tried mightily to spark some interest in the operational nuances of this exotic machine. After he devoted several minutes to a breathless explanation of RAM, ROM, and DOS, I believe I spoke for my colleagues when I raised my hand and timidly inquired, “How do you turn it on?”
Like many of my boomer compatriots, I traversed those early days of PCs, floppy disks, dial-up modems, and listservs with a combination of diffidence and delight. It was indeed a technological revolution and one that made my job easier — once I figured out how to take advantage of its efficiencies. But as the years passed and I grew increasingly comfortable with (and dependent upon) search engines, news feeds, social media, and texting, mounting evidence began to suggest that the more pertinent question had become, “How do you turn it off?”
German neuroscientist Manfred Spitzer, PhD, first sounded the alarm about the cognitive dangers of an excessive reliance on digital devices in his 2012 book Digital Dementia. Spitzer essentially argued that the brain is a muscle that atrophies when we rely too heavily on the internet and other computer-based tools. His work focused primarily on the effects of screen time on children, but subsequent research has extended those warnings to older adults.
As Jared Benge, PhD, and Michael Scullin, PhD, note in a study published last month in Nature Human Behaviour, it’s reasonable to presume that seniors would be even more vulnerable than youngsters to a digital dumbing down. “[G]iven that older adults are at greater risk for cognitive control difficulties,” they write, “device-driven distractions could hypothetically worsen the real-world impacts of normal age-related cognitive deficits.”
For those of us who came of age in a completely analog world, the digital revolution offered unique challenges, they argue. The tools we once relied upon to navigate ordinary life — checkbooks and encyclopedias, physical maps and handwritten letters — had all been rendered obsolete, replaced by devices designed to make everything easier. “What is the cognitive impact of such dramatic changes in the environment?” they ask.
To answer that question, the two neuroscientists reviewed data from 136 studies tracking the use of digital technology and the cognitive health of some 400,000 older adults. Accounting for a range of demographic and socioeconomic variables, they found that those who regularly engaged with their digital devices were about half as likely as nonusers to develop cognitive impairment over an eight-year follow-up period. In other words, the impact of our digital revolution may be more salutary than sobering.
That’s because the daunting process of adapting to a new way of operating in the world requires a certain amount of brainpower, and that effort builds “cognitive reserve” — a key to brain health as we grow older. “One of the first things that middle-age and older adults were saying [in these studies] is that ‘I’m so frustrated by this computer. This is hard to learn,’” Scullin says. “That’s actually a reflection of the cognitive challenge, which may be beneficial for the brain even if it doesn’t feel great in the moment.”
And even if your mastery of digital devices extends only to an ability to email, text, or make a FaceTime call, your brain may benefit. Because loneliness is a well-documented risk factor for dementia, connections like these can support your cognitive health. “Now you can connect with families across generations,” he explains. “You not only can talk to them; you can see them. You can share pictures. You can exchange emails, and it’s all within a second or less. So that means there’s a great opportunity for decreasing loneliness.”
It’s clear that Scullin and Benge are not recommending that seniors spend hours doomscrolling on X or frantically searching for the next big thing on TikTok. And their study doesn’t offer much advice to those who have adapted to the technology and find their brains less challenged by the digital universe than ruled by it. Yes, technological change is inevitable and often perplexing, but at a certain point it may become more of an annoyance than an opportunity to flex our cognitive muscles. And that’s when those muscles may need a different form of exercise.