Dementia is relative

The last time I visited my mother she expressed great surprise. And not because she hadn’t been warned in advance: I stopped doing that around the time she forgot that phones were an option. She isn’t surprised when ‘waiters’ bring meals and meds or put her to bed, so she did seem to know who I was… until she asked me about the weather ‘where you come from’.

Maybe because memory failed her, she sidestepped questions about her own life by politely inquiring after mine. How did I feel about ‘changing countries’? What did I do ‘in England’? I explained I’d spent only two weeks there more than 25 years ago. So then she just stared at the mute TV, confused as to why she couldn’t hear it, while a report on a TV in the next room attributed the decline of Earth’s oceans to humans.

At one point my mother said she hadn’t seen any magazines, though four lay within easy reach. The copy of Hello! open in front of her gave me a clue: lacking recall of me, she’d taken cues from a glossy British weekly.

This is what dementia looks like. And yet not even her doctors can say exactly when my mother’s ignorance of me began to signify brain changes rather than mere disdain. When did chronic depression tip over into ‘dementia’? Apparently, professionals use the label loosely (also true of the word ‘hysteria’ during the Victorian era; labels tend to shift the focus off the context and onto their subject). And for the purposes of these musings, I’ll use the word loosely too.

So if dementia can look like, e.g., someone reduced to viewing others exclusively through the lens of whatever media they’ve been consuming, the condition could be said to afflict our whole society, the severity of its symptoms in each of us just a matter of degree.

Since my mother’s move into residential aged care, I’ve watched the progress of her dementia speed up. Carers are quick to locate the cause somewhere in her brain – but what about the system? Surrounded by residents with vacant expressions who don’t interact, free only to refuse social contact (such as it is), and denied any privacy, who wouldn’t slide towards mindless oblivion? And maybe my mother, who’s never read widely, let alone travelled far, while warding off challenges to her values, had a head start?

In our society it’s not just the aged who face deepening isolation, with more and more activities that used to involve direct human contact – banking, shopping, studying, chatting etc. – migrating online. Touch, smell, and visual and aural complexity are fading from the field of daily social interaction, especially for those who live alone (an increasingly common phenomenon).

Once upon a time, our species had respect (palely echoed today by green activism) for the vast web of terrestrial life. But as our brains grew in size we began to move up the food chain. And now, having enjoyed a brief interlude on top, we’ve been displaced by technology. In the vast artificial web that newly connects us, people are disposable, places are negotiable, and things are the new religion. Things – machines – make life more convenient, so who’s complaining? But today, many bank staff can’t add a few figures manually, never mind manage long division. Social media is training our brains to respond to ‘likes’, infantile emojis and throwaway comments the way a lab rat’s brain responds to a hit of cocaine. And as social media moguls like Mark Zuckerberg know, reward for next to no effort promotes compliance and docility.

According to psychiatrist and brain researcher Iain McGilchrist, in The Master and His Emissary (2009), our whole society demonstrates a left-hemisphere emphasis, evident in its focus on detail to the detriment of the big picture. Yet long before I began to read his paradigm-busting tome, even I could see that our reality is atomised. For the sake of analogy, take the plastic polluting our air, earth and oceans, its degrading microparticles hidden in the countless creatures (marketed as seafood) that eat and breathe them, so that humans keep consuming what they’ve tried to get rid of.

Like plastic waste, dementia can’t be neatly corralled. And yet our society lumps sufferers of the latter together (in facilities that bury them like landfill) as if their slippage were their identity. A common response when I volunteer facts about my mother to nursing home staff is the single, simple word ‘Dementia’ – an institutionally sanctioned excuse for not paying too much attention.

Of course my take is generalised and simplistic. But so is the standard medical narrative, which sometimes seems designed above all to disguise its own ignorance. And to what extent does ageism stop society at large from inquiring more closely? Like most, if not all, mental health diagnoses, dementia carries a stigma, which predisposes those not yet affected to dissociate – as if we weren’t dissociated already.

‘Dementia’ as an explanation is like an injunction not to think. And given similarly simplistic accounts in other areas of life, too many tend to take them at face value. Not that our education system equips us to challenge official versions of reality. Rewarded in the short term for conforming, the majority adopt prescribed roles, a compromise that seldom pays off if or when they get old.

Maintaining optimal cognitive function means bypassing well-worn paths through the brain to engage with the external world in multiple, improvised ways – such as if, due to injury, you were to use your left hand instead of the right, or toes or teeth instead of fingers. According to psychiatrist Norman Doidge in The Brain That Changes Itself (2010):

Anything that requires highly focused attention will help [the control system for plasticity]—learning new physical activities that require concentration, solving challenging puzzles, or making a career change that requires that you master new skills and material (p. 87).

Does solving challenging puzzles include grappling with new kinds of literature? Of course any suggestion that dependency on social media, the popular press and/or formulaic fiction etc. is stunting our neural potential can be criticised as elitist. And it seems unlikely that lovers of difficult reading (Ulysses? Infinite Jest?) run less risk of developing dementia. Reading texts that bend and stretch the mind is no kind of substitute for reading the world and deeply comprehending it. Nonetheless, a diet of The Australian Women’s Weekly, Reader’s Digest and Hello! doesn’t come recommended.


2 Responses to Dementia is relative

  1. interesting…yes…given the difficulty of facing and solving the disasters we are creating for our world and our future selves, switching off/dementia could be a viable option…one which unfortunately bites us back, allowing the body to survive without cognition…

  2. Thanks for your comment, Annette. Lately, I’m seeing what could pass for very early symptoms in my ageing estate agent. A little unnerving. And yet from what I know, this person has recently undergone bereavement. Even my (slightly younger) solicitor has displayed mental rigidity &, maybe as a result, disconcerting forgetfulness… not something people want to acknowledge in our high-pressure, ego-driven environment…
