Do you ever long for a time when the world was a simpler and more enjoyable place, when food tasted better, people were more pleasant and tolerant, and summers were longer?

Sliding into a warm bath of nostalgia can be seductive and convincing, but none of it is remotely true. Our positive perceptions of the past are mostly shaped by the excitement and novelty of youth.

Memories of a first love, a foreign holiday, or Christmas can have a flawless, almost mystical quality because our senses at the time were overwhelmed by the joy of discovery, and our memories have since filtered out and disposed of the mundane, negative bits.

My grandmother used to talk dreamily about the sense of togetherness and common purpose she remembered from living through the Second World War, while ignoring the austerity, the drabness and, of course, the death and destruction.

We spend an increasing amount of time online (Image: free)

She perpetually railed against price increases, harking back to the glory days when you could get the weekly shop and the bus fare home for less than five bob, forgetting that wages were also correspondingly lower, that everyone had rickets and most people lived in rat-infested slums.

The reality is that the passage of time has brought advances in technology and more enlightened governance, which have led to higher living standards and a better quality of life for more people.

In short, the world has become a progressively more affluent and tolerable place to live than at any time in human history. So why doesn’t it feel like it?

Why are more people than ever taking antidepressants to relieve symptoms of stress and anxiety? Why is there a sudden explosion of people seeking a diagnosis of attention deficit hyperactivity disorder (ADHD)? Why have suicide rates risen by more than a third in the past 25 years?

Much of the disillusionment and alienation people feel may be caused by the times we are living through. No previous generation has experienced such rapid acceleration of technological advancement in such a short period.

In an era of unprecedented information overload, our brains are grappling with a constant influx of data from various sources, including computers, smartphones, and connected devices.

While the convenience of having information at our fingertips is undeniable, research suggests that this might not necessarily be a wholly positive thing.


When I was growing up, the greatest threat - if we survived the impending nuclear Armageddon - was thought to come from totalitarian governments. Dystopian novels like Nineteen Eighty-Four, Brave New World, Darkness at Noon, We, and Fahrenheit 451 foretold the grim fates that awaited us if we allowed malign and unstoppable political ideologies, usually from the left, to prevail.

The collapse of the Soviet Union and the supposed ‘end of history’, followed by the digital revolution of the past 20 years, ushered in a new era, where the greatest threat appears to come not from governments, but from global, information-rich organisations. To paraphrase the nefarious O’Brien in Nineteen Eighty-Four, whoever controls the data, controls the future.

A generation ago, handling big data was the preserve of actuaries and statisticians; suddenly we have all become data handlers. Although we might not realise it, we all receive and organise flows of information from multiple channels every day.

For many people, it can feel overwhelming, and the reality is that it’s only going to intensify. By next year, it is estimated that we will be generating 175 zettabytes – or 175 trillion gigabytes – of data globally every year.

Since one gigabyte is equal to one thousand million (10⁹) bytes, 175 trillion gigabytes can be written as 175,000,000,000,000,000,000,000, or 1.75 × 10²³ bytes, representing a fivefold increase in data generation since 2018, and 180 times more than was generated 20 years ago.

To put that figure into some kind of perspective, the Apollo 11 moon landing was achieved with an onboard guidance computer that had around 4,000 bytes of working memory. A modern smartphone typically has four gigabytes of random-access memory (RAM).

Rather than George Orwell or Aldous Huxley, perhaps the most prescient prognosticator was EM Forster, whose 1909 short story The Machine Stops envisioned a dystopian future, where a machine controls every aspect of society, from food provision to human interaction.

Foreshadowing the arrival of remote communication platforms very like Teams and Zoom, Forster imagined a world in which all direct human communication has been eradicated and face-to-face meetings are obsolete. The machine exerts influence over collective consciousness, fostering a universal dependence on its functionality. In the narrative, societal breakdown follows when the machine malfunctions.

Anyone who has tried to have a conversation with an AI-generated chatbot on an insurance or retail website will know what that feels like.

George Orwell (Image: free)

Such machines have become so convincing that it can take several exchanges before you realise you are not actually speaking with a human being.

While the technology will, no doubt, improve and become more effective, what is already apparent is that machines can only ever be an inferior, ‘one-size-fits-all’ alternative to human interaction because the programmers can never factor in empathy.

If your requirement is for anything outwith the most common scenarios for which the organisation has provided a standard response, then you will always be left dissatisfied and exasperated. At least communicating with a human offers some reassurance that your concerns have been noted and that something will be done to address them. We may be living through the mobile phone age, but trying to get hold of a human at the end of a phone in any organisation is like trying to find a single binary digit in a mainframe supercomputer.

Meanwhile, concerns about the effects of digital media on brain function, structure, physical and mental health, education, social interaction, and politics are increasing.

In 2019, the World Health Organization (WHO) published guidelines on children's screen time, and laws restricting smartphone use in schools have been introduced in several countries.

Studies have linked intensive digital media use to reduced working memory capacity, psychological problems, and poorer text comprehension when reading on screens.

The developing brain is also influenced by digital media, with studies showing that extensive early screen use in preschoolers affects language networks. Diffusion tensor MRI indicates a correlation between intensive digital media use and poorer white-matter integrity in the brain tracts that are crucial for language development.

The kind of multitasking associated with heavy digital media use has been shown to affect attention span, concentration, and working memory capacity, with the heaviest users showing poorer memory function, increased impulsivity, less empathy, and higher anxiety levels.

Wallowing in nostalgia may be a fool's game, but not everything in the past was necessarily worse and, particularly where technology is concerned, not every development presages a brighter future.