Storytelling in technology: The myth of progress

"Imaginary flying machines" by Source. Licensed under Fair use via Wikipedia - https://en.wikipedia.org/wiki/File:Imaginary_flying_machines.jpg#/media/File:Imaginary_flying_machines.jpg

A system is an imaginary machine, invented to connect together in the fancy those different movements and effects which are already in reality performed. – Adam Smith

In his book Technology’s Storytellers: Reweaving the Human Fabric, John M. Staudenmaier describes how the Lakota people in North and South Dakota, USA, did not use clocks to measure time. Instead, they used events and experience as reference points, as they had done since before time was measured systematically, that is, before clocks were invented.

However, this different timekeeping system conflicted with the modern world, and so society decided that the Lakota people were unreliable and incapable of managing their time. The cognitive dissonance of this belief sadly led to raised levels of depression and dysfunction amongst the Lakota people. Staudenmaier questioned the belief that keeping time by the clock is a superior way of living, and began to ask why we always perceive technological invention as progress.

To answer this question, Staudenmaier analysed all of the articles published in the journal Technology and Culture from 1957 to 1980, and concluded that storytelling and mythmaking are as prevalent in technology as they are everywhere else.

The myth of progress

The historian Reinhard Rürup has said that technology is treated as an independent force holding sway over humans, which may contrast with what actually happened. Historiographically speaking (historiography being the study of how history is written), technology is always a success: historians interpret history with the presumption that any advance is progress.

There are no documents or eye-witness accounts from the Iron Age and Stone Age, so historians have created a story to interpret the past and describe the advancement of humans, and it is this story which is taught in schools today: the Iron Age was better than the Stone Age, a more advanced age and society, because of the tools archaeologists have found.

We don’t really know that for sure though. In fact, we have no idea. Perhaps there were other tools which were far more sophisticated but didn’t endure through time.  As it stands, the tools which have been found are what the story of history is based on. It is possible that history didn’t happen like that at all.

For we assume, when we look back, that the best tools were adopted and the others discarded. However, this may not be the case. If we think about recent history and two Sony inventions, Betamax and the MiniDisc, we can see how these were good products. Betamax was superior in quality to VHS, but VHS was cheaper, as were recordable CDs compared with the MiniDisc, and this is what ultimately influenced consumers to choose VHS and CDs – cost, not superior technology.

Once society has embraced a specific technology, it gains momentum and society adapts its working systems around it. Think about it: how many times have we updated and changed our music systems in the last 30 years? Vinyl to cassette to CD and now MP3. Each time we have lost sound quality, which makes me imagine Bronze Age old-timers sitting amongst the Iron Age entrepreneurs reminiscing about bronze tools: none of this iron-tools rubbish, we had great bronze hammers…

Rarely do we question if we are making the right sort of progress.

Humans against technology

In 1811, textile workers known as Luddites began systematic attacks on the expanding factories and mills, smashing up the wide frames, machines which had begun to replace skilled workers with unapprenticed factory hands who worked long, dangerous hours and produced cheaper cloth.

The attacks continued for two years and were punishable by hanging, and troops were sent in to protect the factories. Ultimately, the Luddites failed, and the Industrial Revolution caused no end of misery, replacing one way of working with another for financial gain. Human satisfaction was not factored into the equation. Factory owners did not care if their workers were happy or safe, or if the new system suited them; rather like the Lakota people, the mill workers had to put up and shut up in order to survive.

The term Luddite was not really used again until the 1950s, when publicists adopted it as a term of insult for people who did not want to adopt new technology; it was ultimately a way of shaming people into conforming.

Invention, Innovation, Development

In his quest to identify how progress takes place, Staudenmaier classified technological advances in three ways: invention, innovation and development.

  • Invention is a personal, mysterious act which challenges what we do and how we might do it differently. The success of an invention depends on how persuasive the inventor can be. If the inventor doesn’t have a compelling argument, then the invention goes the way of Betamax.
  • Innovation is always linked with entrepreneurs and is driven by economic factors. And, like in the case of the Lakota people or the Luddites, there is always a tension between tradition and innovation. Businesses will squeeze costs to measure success. From call centres to farmers feeling the squeeze, money talks.
  • Development is a group endeavour, step-by-step, built on what is feasible rather than what is hoped for. Eventually, what was hoped for is forgotten and the feasible becomes the success.

In each one of these approaches, failure is rarely dwelt upon. Businesses rewrite their stories constantly to tell everyone about their triumphs, and to persuade everyone that technology makes things better, even when it causes deep unhappiness.

Science Fiction

Science fiction (SF) has for centuries been a way for writers to criticise governments, institutions and businesses without getting into trouble, and as such there are recurring themes which reflect our worries about technology: humans destroying the world, living in a post-apocalyptic world or dystopia, robots taking over, mind control (or dumbing down).

However, for every story there is about the horrors of technology, something humans have invented but can’t control, there are equally as many stories about how technology will save us and create a cosmic bliss where we will all live happily ever after. And there are many areas – medicine, sanitation, electricity, communication – where life is infinitely better than it was, even 20 years ago, albeit not for everyone. In some countries, the above remain scarce and as far out of reach as the moon.

However, as the great SF writer Jules Verne himself said:

While there is life there is hope. I beg to assert…that as long as a man’s heart beats, as long as a man’s flesh quivers, I do not allow that a being gifted with thought and will can allow himself to despair. – Journey to the Centre of the Earth

We just have to make sure when we are recording new stories of technology and advancement, we include everyone, so that we can all attain cosmic bliss, not just the persuasive ones.

Love the machine, don’t rage against it

Humans C4 courtesy of The Guardian

The future is here. It’s just not widely distributed yet. – William Gibson

I was glued to the telly during the Channel 4 series Humans which is set in our present day but with a fictional history of robotics. In this alternate present, robots, who are known commonly as synths, have advanced to the point that they look, walk and talk like humans.

However, they have replaced many humans in the workforce, causing high unemployment, protests and rioting. (They even have their own Twitter hashtag, #WAP – We Are People.) Smart, computer-savvy teenager Matilda rebels at school because, if the synths do all the jobs, what is the point of her working hard to try and get one?

But it is not all bad: synths do all the chores around the house. How fabulous is that? Looking at the tie-in Persona Synthetics website, I could get Sally the synth to do childcare, cooking and personal training.

What a shame household synths are just fiction. Even hoovering robots, which do exist and look very cool, wouldn’t do much to alleviate the repetitive household tasks of cooking and cleaning. Alas, I just don’t see a robot coming onto the marketplace anytime soon to keep my home running efficiently. Nor do I see them taking over the world and turning me into a battery.

Derek Thompson in The Atlantic is not so sure. He thinks it won’t be long before technological advances have made such an impact on our society that there are no jobs left for people.

In his article A World Without Work, he says that robots are everywhere: operating theatres, fast-food counters, checkout screens, and in the sky flying as drones. Currently in the US, manufacturing is on a cyclical upturn, so we can’t really see where else robots may be stealing jobs until recession hits, which is when employers turn to technology to cut costs. The effects of replacing humans may not be seen until the next recession, or the recession after that. But in the meantime, Thompson says, Airbnb has cut hotel jobs and Google’s self-driving car threatens the most common American job of all – driving.

As humans, we adapt very quickly. Ask yourself: Would I trust a car without a driver? I trust the DLR and that doesn’t have one. What about black cabs? Would I miss the friendly banter of a London cabbie? I think I’d manage.

And research has shown that even in areas where we imagine robots wouldn’t be as useful, such as psychology, people are very happy with them. This is because they believe that robots don’t judge them the way humans naturally do.

Sociologist Sherry Turkle took robots into old people’s homes and found it heart-wrenching to witness one woman talk to an emo-seal about the loss of her daughter. However, I have to agree with Genevieve Tran’s comment below Turkle’s TED talk:

The elderly person confiding in an electronic emo-seal is no different from a person praying to a god, who may or may not be there, or talking to a pet that definitely doesn’t have a grasp of life or death, but can give comfort by its presence.

And that is the point of inventing anything: to give comfort and to make life more comfortable for humans.

Making life better

Since the beginning of recorded time, humans have created things, or artefacts, to make life easier and/or better: tools for butchering meat, engines for transport, systems for carrying away waste.

These solutions probably created lots of new jobs, such as butchers, engineers, drivers and night soil collectors – jobs which still exist today. Ghanaian night soil collectors, I am sure, would welcome robots and technology to help solve their sanitation crisis, and would worry less about being replaced or about robots taking over.

The fear of humans being replaced by computers

Joel Lee is worried too and has written a blog post to reassure himself that humans will always be needed in the creative arts, professional sports, healthcare and medicine, education, quality assurance, politics and law.

Poor Joel! The comments below his blog say that computers can do these things already. I haven’t checked all the links, but they sound reasonable enough: computers create art, an IBM system is working with doctors to diagnose cancer better than doctors do, and neural networks are reasoning up a storm in many areas. As for sports, I remember when Chris Coleman was manager at Fulham FC and was asked why his team had no one English in it one Saturday. He answered by saying that he would put out a team of aliens if it allowed him to win a game. So I am sure he would definitely have been open to a team of robots.

Technology creates jobs too

Technology may take away jobs, but there are new jobs which could not be done without a computer: biomedical scientists, quantitative analysts, and anyone working with big data – big data engineers are needed in fields from manufacturing right through to food production and hospitality, along with the big data architects who structure the big data, to name but a few.

However, these are highly skilled jobs in which you have to be skilled in the domain and skilled in computing. So, for example, in haematology within biomedical science you have to know everything about blood and a lot about computing.

But never fear, there are loads more jobs with varying skill sets which didn’t exist before computers, such as: Twitter feed manager, video game designer, website manager, usability consultant.

I guess if machines got clever enough they could do these too. A quick Google around the Internet shows me that a lot of people are upset about the idea that computers may one day do away with all jobs. But really, if we are so advanced, why do so many boring jobs still exist today? And why are new boring jobs springing up all the time?

Humans do jobs computers should do

In one of writer Elizabeth Gilbert’s podcasts, she talks to Missy, a Florida call centre worker who has to follow a script when talking to people who phone up to sort out their insurance. Missy is not allowed to deviate from the script or engage with the human on the end of the line in any empathetic way, otherwise she is reprimanded. Consequently, Missy describes her job as the most boring job in the world.

Surely this is a perfect job for automation – it doesn’t seem to have been designed with humans in mind, inside or outside of the call centre.
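To see just how mechanical a job like Missy’s is, here is a toy sketch in Python of a rigid call-flow script. The states, prompts and branches are entirely invented for illustration, not Missy’s real script; the point is that when every branch is fixed in advance, a machine can follow it as well as a person can.

```python
# A toy scripted call flow. Each state maps to a prompt and a table of
# allowed answers; anything off-script just repeats the current step,
# exactly as Missy is required to do.

SCRIPT = {
    "start": ("Is this call about an existing policy?",
              {"yes": "policy", "no": "quote"}),
    "policy": ("Please read out your policy number.", {}),
    "quote": ("Which product would you like a quote for?", {}),
}

def respond(state, answer):
    """Follow the script: no deviation, no empathy, just branching."""
    _, branches = SCRIPT[state]
    next_state = branches.get(answer, state)  # off-script answers repeat the step
    return SCRIPT[next_state][0], next_state

# Usage: the caller says "yes", the script moves on; say anything
# unexpected and the same prompt comes straight back.
prompt, state = respond("start", "yes")
```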

The paradox of work

Sadly though, Missy is not alone. Investors in People published a survey at the beginning of this year which said that 60% of UK workers are unhappy in their jobs, citing lack of job satisfaction. The majority of people who work are doing so for the money to pay for the things we need: food, shelter, etc. – the things at the bottom of Maslow’s hierarchy of needs.

Two years ago the Swiss voted no to a universal wage, a scheme which would ensure that everyone legally entitled to work in Switzerland, whether working or not, would be paid a basic income. Key supporter Enno Schmidt’s argument was that a society in which people work only because they have to have money is no better than slavery. Instead, a universal income would allow people more freedom to decide what they really want to do.

The Guardian ran an article about writers on the dole saying that unemployment benefits have given many writers the freedom to learn their craft without starving. Imagine, if everyone got paid something without the need to explain themselves at the job centre. Oooh – no more jobs for the job centre workers.  Interesting.

It wouldn’t be enough though, would it? Because we define ourselves using a premise which is false: the more we do, the more we are worth. And so those people who used their universal wage to lie on the sofa and watch telly – very happily indeed, thank you very much – rather than tackle the upper levels of Maslow’s hierarchy of needs, such as status, reputation and self-actualisation, would sadly be judged lacking. We judge everybody, including ourselves.

And, this is perhaps where robots and computers can teach us something new and liberating, like the robot psychologists who don’t pass judgment. If we could all just be more flexible with our interpretation of worthiness and our expectations of how things like call centres should work (especially those ones in which humans are forced to behave like robots),  then perhaps we could learn to love the machine and not rage against it.

Is the future of technology to be found in fiction?

copyright of http://www.history.com/minisites/space/images/space_image.jpg

When Jurassic Park was on at the cinema, I remember laughing out loud with a couple of my computing mates when the young girl, Lex, looks at a computer screen and says: “It’s a UNIX system. I know this.” At the time, UNIX didn’t have much in the way of a graphical user interface (GUI), unless you wanted to write one yourself, and it definitely looked nothing like the screen she recognised. Nowadays, a quick look around the many Linux and UNIX distributions demonstrates that GUIs are everywhere. There are probably some as fancy as the screen she was looking at before she got the Jurassic system up and running again to save them all from being eaten by dinosaurs.

Continue reading “Is the future of technology to be found in fiction?”

Using patterns to shape our world

Escher picture

In the 1990s, Erich Gamma changed the way I thought about software engineering forever! Gamma visited the Ecole Polytechnique Federale de Lausanne, where I was a PhD student, to give a seminar on design patterns.

The idea of extracting a solution template from a piece of software and turning it into a pattern which can be reused was, to me, an exciting step forward in software engineering. Instead of reusing software from a library that needs to be maintained and ported as necessary, abstracting the solution and creating a pattern repository gives software engineers a toolbox of meta-level solutions.
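The difference is easiest to see with an example. Below is a minimal sketch of Observer, one of the classic patterns Gamma and his co-authors catalogued, written here in Python purely for illustration (the class names are mine, not from any library). The reusable thing is not this code but the template itself: a subject notifies whatever observers have registered with it, regardless of the domain.

```python
# Observer pattern sketch: the Subject keeps a list of observers and
# notifies each of them when something happens. The same template can
# be re-implemented in any language, for any domain.

class Subject:
    def __init__(self):
        self._observers = []

    def attach(self, observer):
        # Register an observer; it only needs an update() method.
        self._observers.append(observer)

    def notify(self, event):
        # Push the event to every registered observer.
        for observer in self._observers:
            observer.update(event)

class LogObserver:
    """A concrete observer that simply records events."""
    def __init__(self):
        self.events = []

    def update(self, event):
        self.events.append(event)

# Usage: wire an observer to a subject and fire an event.
subject = Subject()
log = LogObserver()
subject.attach(log)
subject.notify("state changed")
```

The code is trivially small, and that is the point: what the pattern repository stores is the shape of the solution, not a particular library to maintain and port.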

Continue reading “Using patterns to shape our world”

Are computers making us stupid?

artificial intelligence copyright atariarchives.org

In 1996, I listened to Lotfi Zadeh, the daddy of fuzzy logic, give his keynote speech at the Artificial Intelligence in Design conference at Stanford University. He described the excitement around artificial intelligence in the 1950s, and how Marvin Minsky, father of frames, told a press conference that 50 years on, computers would read and understand Shakespeare. When Zadeh asked Minsky what possessed him to make such a claim, Minsky said that he didn’t know; he had just gotten carried away.

Continue reading “Are computers making us stupid?”