The ghosts of AI

I fell in love with Artificial Intelligence (AI) back in the 1990s when I went to Aberdeen University as a postgraduate student, even though I only signed up for the MSc in AI because it had an exchange programme which meant that I could study in Paris for six months.

And, even though they flung me and my pal out of French class for being dreadful students, and I ended up living in Chambéry (which is so small it mentions the launderette in the guidebook) instead of Paris, it was a brilliant experience – most surprisingly of all because it left me with a great love of l’intelligence artificielle: robotics, machine learning, knowledge-based systems.

AI has many connotations nowadays, but back in 1956, when the term was coined, it was about thinking machines: how to get computers to perform tasks which normally require human intelligence.

The Singularity is nigh

Lately, I have been seeing lots of news about robots and AI taking over the world, and the idea that the singularity – that moment when AI becomes so powerful that it self-evolves and changes human existence – is soon upon us. The singularity is coming to get us. We are doomed.

Seriously, the singularity is welcome round my place to hold the door open for its pal and change my human existence any day of the week. I have said it before: Yes please, dear robot, come round, manage my shopping, wait in for Virgin Media because they like to mess me about, and whilst you are there do my laundry too, thank you.

And, this got me thinking. One article said the singularity is coming in 2029, which reminded me of all those times the world was going to end according to Nostradamus, Old Mother Shipton, the Mayan Calendar, and even the Y2K bug. As we used to say in Chambéry: Plus ça change, plus c’est la même chose – the more things change, the more they stay the same. To be honest, I never, ever said that, but my point is that our fears don’t change, even when dressed up in a tight shiny metallic suit. Nom d’une pipe!

We poor, poor humans, we are afraid of extinction, afraid of being overwhelmed, overtaken, and found wanting. True to form, I will link to Maslow’s hierarchy of needs and repeat that we need to feel safe and we need to feel that we are enough. Our technology may be improving – not fast enough as far as I am concerned – but our fears, our hopes, our dreams, our aspirations remain the same. As I say in the link above, we have barely changed since Iron Age times, and yet we think we have, because we buy into the myth of progress.

We frighten ourselves with our ghosts. The ghosts which haunt us: In the machine, in the wall, and in our minds where those hungry ghosts live – the ones we can never satisfy.

The ghost in the machine

The ghost in the machine describes the Cartesian view of the mind–body relationship: that the mind is a ghost in the machine of the body. It gets quoted in AI because, after all, it is a philosophical question: What is the mind? What is intelligence? And it remains a tantalising possibility, especially in fiction, that somewhere in the code of a machine or a robot there is a back door or a cellular automaton – a thinking part which, like natural intelligence, is able to create new thoughts and new ideas as it develops. The reality is that the philosopher Gilbert Ryle coined the term to mock Cartesian dualism, and Arthur Koestler later borrowed it for a book about the human ability to destroy itself, with its constantly repeating patterns in the arena of political–historical dynamics, using the brain as the structure. The idea that there is a ghost in the machine is an exciting one, which is why fiction has hung onto it like a will-o’-the-wisp and often uses it as a plot device, for example, in The Matrix (there’s lots of odd bits of software doing their own thing) and I, Robot (Sonny has dreams).
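As an aside, here is a toy illustration of my own (nothing from the philosophy above): a one-dimensional cellular automaton. Each cell updates from just three bits of context, yet Rule 110 is famously Turing complete – trivially simple code whose behaviour looks like it is doing its own thing.

```python
def step(cells, rule=110):
    """Apply one generation of an elementary cellular automaton (wrap-around)."""
    n = len(cells)
    new_cells = []
    for i in range(n):
        left = cells[(i - 1) % n]
        centre = cells[i]
        right = cells[(i + 1) % n]
        pattern = (left << 2) | (centre << 1) | right  # neighbourhood encoded as 0..7
        new_cells.append((rule >> pattern) & 1)        # the rule's bit for that pattern
    return new_cells

# Start from a single live cell and watch structure emerge.
cells = [0] * 31
cells[15] = 1
for _ in range(8):
    print("".join("#" if c else "." for c in cells))
    cells = step(cells)
```

No back door, no ghost: just a lookup table applied over and over, which is rather the point.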

Arthur C Clarke touched on it when he said that any sufficiently advanced technology is indistinguishable from magic – something I say all the time, not least of all because it is true. When I look back to the first portable computer I used and compare it to the power of the phone in my hand today, well, it is just magic.

That said, we want the ghost in the machine to do something: to haunt us, to surprise us, to create for us, because we love variety, discoverability, surprise, and the idea that we are so clever we can create life. Actually, we do create life – mysteriously, magically, sexily.

The ghost in the wall

The ghost in the wall is that feeling that things change around us for no discernible reason. HCI professor Alan Dix uses the term here: if HCI experts don’t follow standards and guidelines, the user ends up confused in an app without consistency, which gives the impression of a ghost in the wall moving things, ‘cos someone has to be moving the stuff, right?

We may love variety, discoverability and surprise, but it has to be logical, fitting within certain constraints and within the consistency of the interface with which we are interacting, so that we say: I am smart, I was concentrating, but yeah, I didn’t know that that would happen at all – in the same way we do after an excellent movie, when we leave thrilled at the cleverness of it all.

Fiction: The ghost of the mind

Fiction has a lot to answer for. Telling stories is how we make sense of the world; stories shape society and culture, and they help us feel truth.

Since we started storytelling, the idea of artificial beings which were given intelligence, or just came alive, has been a common trope. In Greek mythology we had Pygmalion, who carved a woman from ivory and fell in love with her, so Aphrodite gave her life, and Pervy Pygmalion and his true love lived happily ever after. It is familiar – Frankenstein’s bride, Adam’s spare rib, Mannequin (1987). Other variations, less womeny-heterosexy focused, include Pinocchio, Toy Story, Frankenstein, Frankenweenie, etc.

There are two ways to go: The new life and old life live happily ever after and true love conquers all (another age-old trope), or there is the horror that humans have invented something they can’t control. They messed with nature, or the gods; they flew too close to the sun. They asked for more and got punished.

It is control we are after, even though we feel we are unworthy, and if we do gain control we fear that we will become power-crazed. And then there are the recurring themes about technology – humans destroying the world, living in a post-apocalyptic world or dystopia, robots taking over, mind control (or dumbing down) – because ultimately we fear the hungry ghost.

The hungry ghost

In Buddhism, the hungry ghosts appear when our desires overtake us and become unhealthy and insatiable: we become addicted to what is not good for us and miss out on our lives right now.

There is also the Hungry Ghost Festival, which remembers the souls who were once on earth and couldn’t control their desires, so they have become lost in the ether, searching, constantly unsatisfied. They need to be fed so that they don’t bother the people still on earth, who want to live and have good luck and happy lives. People won’t go swimming because the hungry ghosts will drown them, dragging them down with their insatiable cravings.

A lovely blog explains that the Chinese character which represents ghost – in English it looks like gui, which is very satisfying given this is a techyish blog, though I can’t reproduce the beautiful character here – is actually nothing to do with ghosts or disincarnate beings. It is more like a glitch in the matrix: a word to explain when there is no logical explanation. It also explains when someone behaves badly – you dead ghost. And perhaps it is linked to when someone ghosts you: they behave badly. No, I will never forgive you, you selfish ghost. Although when someone ghosts you, they do the opposite of what you wish a ghost would do, which is hang around, haunt you, and never leave you. When someone ghosts you, you become the ghost.

And, for me, the description of a ghost as a glitch in the matrix works just as well for our fears, especially about technology and our ghosts of AI – those moments when we fear and don’t know why we are afraid. Or perhaps we do really? We are afraid we aren’t good enough, or perhaps that we are too good and have created a monster. It would be good if these fears ghosted us and left us well alone.

Personally, my fears go the other way. I don’t think the singularity will be round to help me any time soon. I am stuck in the Matrix doing the washing. What if I’m here forever? Please come help me through it. There’s no need to hold the door – just hold my hand and let me know there’s no need to be afraid, because even if the singularity is not coming, change is. Thankfully it always is; it’s just around the corner.

Human-computer interaction, cyberpsychology and core disciplines

A heat map of the multidisciplinary field of HCI © Alan Dix

I first taught human-computer interaction (HCI) in 2001. I taught it from a viewpoint of software engineering. Then, when I taught it again, I taught it from a design point of view, which was a bit trickier, as I didn’t want to trawl through a load of general design principles which didn’t absolutely boil down to a practical set of guidelines for graphical user interface or web design. That said, I wrote a whole generic set of design principles here: Designing Design, borrowing the great title of Herb Simon’s The Sciences of the Artificial. Then I revised my HCI course again and taught it as a practical set of tasks, so that my students went away with a specific skill set. I blogged about it, in a revised applied-just-to-web-design version, in a blog series here: Web Design: The Science of Communication.

Last year, I attended an HCI open day, Bootstrap UX. The day in itself was great, and I enjoyed hearing some new research ideas until we got to one of the speakers, who gave a presentation on web design – at least I think he did; it’s hard to say really, as all his examples came from architecture.

I have blogged about this unsatisfactory approach before. By all means use any metaphor you like, but if you cannot relate it back to practicalities then ultimately all you are giving us is a pretty talk or a bad interview question.

You have to put concise constraints around a given design problem and relate it back to the job that people do and which they have come to learn about. Waffling on about Bucky Fuller (his words – not mine) with some random quotes on nice pictures is not teaching us anything. We have a billion memes online to choose from. All you are doing is giving HCI a bad name and making it sound like marketing. Indeed, cyberpsychologist Mary Aiken, in her book The Cyber Effect, seems to think that HCI is just insidious marketing. Anyone might have been forgiven for making the same mistake listening to the web designer’s empty talk on ersatz architecture.

Cyberpsychology is a growing and interesting field, but if it is populated by people like Aiken, who don’t understand what HCI is nor how artificial intelligence (AI) works, then it is no surprise that The Cyber Effect reads like the Daily Mail (I will blog about the book in more detail at a later date, as there’s some useful stuff in there, but too many errors). Aiken quotes Sherry Turkle’s book Alone Together, which I have blogged about here, and it makes me a little bit dubious about cyberpsychology. I am still waiting for the book, written by a neuroscientist with lots of brain-scan pictures, which tells me exactly how our brains are being changed by the Internet.

Cyberpsychology is the study of the psychological ramifications of cyborgs, AI, and virtual reality, and I was like: Wow, this is great, and rushed straight down to the library to get the books on it, to see what was new and what I might not know. However, I was disappointed, because if the people who are leading the research anthropomorphise computers and theorise about metaphors for the Internet instead of the Internet itself, then the end result will be skewed.

We are all cyberpsychologists and social psychologists now, baby. It’s what we do

We are all cyberpsychologists and social psychologists now, baby. It’s what we do. We make up stories to explain how the world works. That doesn’t mean the stories are accurate. We need hard facts, not Daily Mail hysteria (Aiken was very proud to say she made it onto the front page of the Daily Mail with some of her comments). However, the research I have read about our behaviour online says it’s just too early to tell how we are being affected, and as someone who has been online since 1995, I only feel enhanced by the connections the WWW has to offer me. Don’t get me wrong, it hasn’t been all marvellous; it’s been like the rest of life, some fabulous connections, some not so.

I used to lecture psychology students alongside the software engineering students when I taught HCI at Westminster University in 2004. They were excited when I covered cognitive science, as it was familiar to them – and all the cognitive science tricks make it easy to involve everyone in the lectures and make them fun – but when I made them sit in front of a computer and design and code up software as part of their assessment, they didn’t want to do it. They didn’t see the point.

This is the point: If you do not know how something works, how can you possibly talk about it without resorting to confabulation and metaphor? How do you know what is and what is not possible? I may be able to drive a car, but I am not a mechanic, nor would I give advice to anyone about their car, nor write a book on how a car works. And if I did, I would not just think about a car as a black box; I would have to put my head under the bonnet, otherwise I would sound like I didn’t know what I was talking about. At least I drive a car, and use a car; that is something.

Hey! We’re not all doctors, baby.

If you don’t use social media, and you just study people using it, what is that then? Theory and practice are two different things. I am not saying that theory is not important – it is – but you need to support your theory; you need some experience to evaluate it. Practice is where it’s at. No one has ever said: Theory makes perfect. Yep, I’ve never seen that on a meme. You get a different perspective, as Jack Nicholson’s character says to his doctor, played by Keanu Reeves, in Something’s Gotta Give: Hey! We’re not all doctors, baby. Reeves has seen things Nicholson hasn’t, and Nicholson is savvy enough to know it.

So, if you don’t know the theory and you don’t engage in the practice, and you haven’t any empirical data yourself, you are giving us conjecture, fiction, a story. Reading the Wikipedia page on cyberpsychology, I see that it is full of suggested theories, like the one about how Facebook causes depression. There are no constraints around the research. Were these people depressed before going on Facebook? I need more rigour. Aiken’s book is the same, which is weird since she has a lot of references; they just don’t add up to a whole theory. I have blogged before about how I was fascinated that some sociologists perceived software as masculine.

In the same series I blogged about women as objects online, the main point being that social media reflects our society, and that with technology we have a chance to impact society in good ways. Aiken takes the opposite tack and says that technology encourages and propagates deviant sexual practices (her words) – some I hadn’t heard of – but for me it raises the question: If I don’t know about a specific sexual practice, deviant or otherwise, until I learn about it on the Internet (Aiken’s theory), then how do I know which words to google? It is all a bit chicken-and-egg and doesn’t make sense. Nor does Aiken’s advice to parents, which is: Do not let your girls become objects online. Women and girls have been objectified for centuries; technology does not do anything by itself, it supports people doing stuff they already do. And, like the HCI person I am, I have designed and developed technology to support people doing stuff they already do. I may sometimes inadvertently change the way people do a task when it is supported by technology, for good or for bad, but to claim that technology is causing people to do things they do not want to do is myth-making and fear-mongering at its best.

The definition of HCI that I used to use in lectures at the very beginning of any course was:

HCI is a discipline concerned with the design, evaluation and implementation of interactive computing systems for human use and with the study of major phenomena surrounding them (ACM, 1992).

For me, human-computer interaction was and still remains Gestaltian: the whole is greater than the sum of the parts. By this I mean that the collaboration of a human and a computer is more than a human typing numbers into a computer and then waiting for the solution – or indeed typing sexually deviant search terms into a search engine to find a tutorial. And, with the advent of social media, HCI is more than one person connecting to another, or broadcasting online, which is why the field of cyberpsychology is so intriguing.

But the very reason why I left the field of AI and went into HCI is this: AI reasons in a closed world, within the limits of whatever computational power you have available. There are limits. With HCI, that world opens up and the human gets to direct the computer to do something useful. Human-to-human communication supported by technology does something else altogether, which is why you might want the opinion of a sociologist or a psychologist. But you don’t want the opinion of a sociologist on AI when they don’t understand how it works, have watched a lot of sci-fi, and think that robots are taking over the world. Robots can do many things, but it takes a lot of lines of code. And you don’t want the opinion of a cyberpsychologist who thinks that technology teaches people deviant sexual practices and encourages us all to literally pleasure ourselves to death (Aiken’s words – see what I mean about the Daily Mail?) ‘cos she read one dodgy story and linked it to a study of rats in the 1950s.

Nowadays, everyone might consider themselves to be a bit of an HCI expert, able to judge the original focus of HCI, which is the concept of usability: easy to learn, easy to use. Apps are a great example of this, because they are easy to learn and easy to use – mainly, though, because they have limited functionality; that is, they focus on one small task, like getting a date, ordering a taxi, or sharing a photo or a few words.

However, as HCI professor Alan Dix says in his reflective Thirty years of HCI and also here about the future: HCI is a vast and multifaceted community, bound by the evolving concept of usability, and the integrating commitment to value human activity and experience as the primary driver in technology.

He adds that sometimes the community can get lost, and says that Apple’s good usability has been sacrificed for aesthetics, so users are not supported as well as they should be. Online, we can look at platforms like Facebook and Twitter and see that they do not look after their users as well as they could (I have blogged about that here). But again, it is not technology, it is people who have let the users down. Somewhere along the line someone made a trade-off: economics over innovation, speed over safety, or aesthetics over usability.

HCI experts are agents of change. We are hopefully designing technology to enhance human activity and experience, which is why the field of HCI keeps getting bigger and bigger and has no apparent core discipline.

It has a culture of designer-maker, which is why at any given HCI conference you might see designers, hackers, techies and artists gathering together to make things. HCI has to exist between academic rigour and exciting new tech; no wonder it is not easy to define. But as we create new things, we change society, and we have to keep debating areas such as intimacy, privacy, ownership, and visibility, as well as what seems pretty basic, like how to keep things usable. Dix even talks about human–data interaction: as we put more and more things online, we need to make sense of the data being generated and interact with it. There is new research being funded into trust (which I blogged about here). And Dix suggests that we could look into designing for solitude, supporting users so they do not respond immediately to every text, tweet, or digital flag. As an aside, I have switched off all notifications, my husband just ignores his, and it boggles my mind a bit that people can’t bring themselves to be in charge of the technology they own. Back to the car analogy: they wouldn’t have the car telling them where they should be going.

Psychology is well represented in HCI, AI is well represented in HCI too. Hopefully we can subsume cyberpsychology too, so that the next time I pick up a book on the topic, it actually makes sense, and the writer knows what goes on under the bonnet.

Technology should be serving us, not scaring us. So if writers could stop behaving like 1950s preachers who think society is going to the dogs – viewing how people embrace technology the same way preachers once viewed rock ’n’ roll and the television – we could be more objective about how we want our technological progress to unfold.

Is this progress? Humans, computers and stories

As a computer scientist, I have to say my job has changed very little in the last twenty-odd years. The tech has, admittedly, but I am still doing what I did back then: sitting in front of a computer, thinking about how computers can make people’s lives easier, what makes people tick, and how we can put the two together to make something cool. Sometimes I even program something up to demonstrate what I am talking about.

It seems to me, though, that everyone else’s jobs (non-computer scientists’) have changed, and not necessarily for the better. People do their jobs and then they do a load of extras – social media, blogging, content creation, logging stuff in systems; the list is endless – on top of their workload.

It makes me wonder: Is this progress?

Humans and stories

As a teenager, on hearing about great literature and the classics, I figured that it must be something highfalutin. In school we did a lot of those kitchen-sink, gritty dramas (A Kind of Loving, Billy Liar, Kes, etc.). So, when I found the library section labelled Classics, Literature, or whatever, it was a pleasant surprise to see that they were just stories about people, and sometimes gods, often behaving badly, and I was hooked. Little did I know that reading would be the best training I could receive to become a computer scientist.

Human and computer united together

In my first job, as systems analyst and IT support, I found that I enjoyed listening to people’s stories in and amongst their descriptions of their interactions with computers. My job was to talk to people. What could be better? I then had to capture all the information about how computers were complex and getting in the way, and try to make them more useful. Sometimes I had to whip out my screwdriver and fix things there and then. Yay!! Badass tech support.

The thing that struck me the most was that people anthropomorphised their computers – talking about them needing time to warm up, being temperamental, and being affected by circumstances, as if they were in some way human and not just a bunch of electronic circuits. And that the computer was always the way of progress, even if they hated it and didn’t think so.

I think this is partly because it was one person working with one computer, so the computer was like a companion – the office worker you love or hate, who helps or hinders. There was little in the way of email or anything else unless you were on the mainframe, and even then it was used sparingly, especially in huge companies. Memos were still circulated around. The computer was there to do a task – crunch numbers, produce reports, run the Caustic Soda Plant (I did not even touch the door handles when I went in there) – the results of which got transferred from one computer to another by me, and sometimes by that advanced user who knew how to handle a floppy disk.

Most often, information was transferred orally, by presentation in a meeting, or on paper with that most important of tools, the executive summary, whilst the rest of the document was a very dry, long-winded explanation – hardly a story at all.

Human and computer and human and computer united

Then the Internet arrived, and humans (well, mainly academics) began sharing information more easily, without needing to print things out and post them. This was definitely progress. I began researching how people with different backgrounds, like architects and engineers, could work together with collaborative tools, even though they use different terminology and different software. How could we make their lives easier when working together?

I spent a lot of time talking to architects and standing on bridges with engineers in order to see what they did. Other times I talked to draftsmen to see if a bit of artificial intelligence could model what they did. It could, up to a point, but modelling all that information in a computer is limiting in comparison to what a human knows instinctively, which is when I realised that people need help automating the boring bits, not the instinctive bits.

I was fascinated by physiological computing, that is, interacting using our bodies rather than typing – so using our voices or our fingerprints. However, when it was me, with my Northern accent, and my French colleagues, all speaking our fabulous variations of the English language into some interesting software – written by some Bulgarians, I believe – on a slow-running computer, well, the results were interesting, to say the least.

Everyone online

The UK government’s push to get everything electronic seemed like a great idea, so everyone could access all the information they needed. It impacted Post Offices, but it seemed to free up the time spent waiting in a queue and to provide more opportunities to do all those things like pay a TV licence, get a road tax disc, renew a passport, etc. This felt like progress.

I spent a lot of time working on websites for the government, with lovely scripts to guide people through forms like self-assessment, so that life was easier. We all know how daunting a government form can be, so what could be better than being told by a website which bit to fill in? Mmm, progress.
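Under the bonnet, those guiding scripts are mostly branching logic. Here is a minimal sketch of the idea – the field names are my own invention, nothing like the real self-assessment rules: earlier answers decide which sections the user is shown, so nobody wades through questions that do not apply to them.

```python
def relevant_sections(answers):
    """Return the form sections a user actually needs to complete."""
    sections = ["personal_details"]            # everyone fills this in
    if answers.get("self_employed"):
        sections.append("self_employment_income")
    if answers.get("rental_income", 0) > 0:
        sections.append("property_income")
    if answers.get("gift_aid_donations"):
        sections.append("charitable_giving")
    return sections

print(relevant_sections({"self_employed": True, "rental_income": 0}))
# → ['personal_details', 'self_employment_income']
```

Nothing clever, but multiply it across a few hundred questions and you have a form that holds your hand instead of daunting you.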

Lots of businesses came online, and everyone thought that Amazon was great way back when. I know I did: living in Switzerland, being able to order any book I wanted was such a relief, as opposed to waiting, or reading it in French. (Harry Potter in French, although very good, is just not the same.) Progress.

Then more businesses joined in and wanted to be seen, which led to banners, ads, popups, bought links for self-promotion, and lots of research into website design so sites were all polished and sexy – even though the point of the Internet is that it is a work in progress, constantly changing, never finished.

I started spending my time in labs rather than in situ, watching people use websites and asking them how they felt. I was still capturing stories, but in a different way – more clinical, less of a natural habitat – which of course alters what people say, and which I found a bit boring. It didn’t feel like progress. It felt businessy – a means to an end, like – and not much fun.

Human – computer – human

Then phones became more powerful and social media was born, and people started using computers just to chat, which felt lovely, and like progress. I had always been in the privileged position of being able to chat to people the world over, online, whatever the time, with the access I had to technology; now it was just easier and available to everyone – definitely progress. Until, of course, companies wanted in on that too. So now we have a constant stream of ads on Facebook and Twitter, and people behaving like they are down the market jostling for attention, shouting out their wares 24/7, with people rushing up asking: Need me to shout for you?

And, then there are people just shouting about whatever is bothering them. It’s fantastic and fascinating, but is it progress?

The fear of being left behind

The downside is that people all feel obliged to jump on the bandwagon and be on multiple channels without much to say, which is why they have to do extras like creating content as part of their ever-expanding jobs. The downside is that your stream can contain the same information repeated a zillion times. The upside is that people can say whatever they like, which is why your stream can contain the same information repeated a zillion times.

Me, I am still here, wondering about the experience everyone is having when all this is happening on top of doing a job. It feels exhausting, and it feels like we are being dictated to by technology instead of the other way around. I am not sure what the answer is. I am not sure if I am even asking the right question. I do know how we got here. But is this where we need to be? Do we need to fix it? Does it need fixing? And where should we go next? I think we may need a course correction, because when I ask a lot of people, I find that they agree. If you don’t, answer me this: how do you feel when I ask: Is this progress?

Designing story (4): Women

When they write about you do they talk about your thighs? Or your girlfriend? They validate me through having a boyfriend, someone wants me – Abby Whelan, Scandal.

[Part 4 of 5: 1) The intimacy of the written word, 2) Structure, 3) Archetypes and aesthetics, 4) Women, 5) Possession, the relations between minds]

Scandal is extraordinary precisely because the women in it, like Abby Whelan above, articulate exactly how society views them in 2016, and depressingly enough, she is spot on. Women are still judged by the way they look and by the men with whom they are associated.

It is said that Jesus had a whole entourage of women who travelled with him. But if the women were there, we don’t know anything about them when we read the stories in the Bible. If they held his hand, uttered words of wisdom, or stood in the light receiving the same appreciative words of confirmation that God uttered over him, no one cared to write it down.

Prostitutes and saints

The one time they had to was when Mary Magdalene went to Jesus’s grave on Easter morning to find him resurrected. The men had fled, so she was the only one there to meet him. History has rewarded her by calling her a prostitute, and even though historians have said that wasn’t the case at all, the label has stuck. All the men got sainthoods, btw.

It reminds me of Joseph Campbell’s monomyth. Women only appear in it as temptresses or goddesses, and they only have supporting roles. We don’t hear their stories or their trials and tribulations. Instead, they are silent.

In his book The Writer’s Journey, Christopher Vogler tries to demonstrate how the hero’s quest could apply equally well to women, like this:

The masculine need to overcome obstacles to achieve, conquer and possess may be replaced in the woman’s journey by the drive to preserve the family and the species, make a home, grapple with emotions, come to an accord or cultivate beauty.

Cheers, thanks for that Chris!

Busy women

Campbell himself said that we only find women in fairytales because women have always been too busy to sit around telling stories. And when Frank McConnell analysed how heroes’ stories make us better, in his book Storytelling and Mythmaking, it was men who did the self-actualisation, whilst women played prostitutes with hearts of gold, or endured like Penelope whilst Odysseus was off chasing glory.

It is the same with the archetypes discussed in the previous blog. We have women playing the shadow or the trickster purely as plot devices to move the story along: the damsel in distress, the old crone jealous of the fair maiden, or the jilted lover. These are all tropes which the hero battles and conquers. The poor women are never the heroine, never the mentor, and they are never allowed to self-actualise. In the rare cases in which they do, they become outcasts (don’t be taken in by the sexy pic above of the goddess trinity), shunned and lonely, or punished. Because they are not there to be anything but decoration, and to soothe a man’s brow.

Women in the movies

Thankfully, things are changing. In previous blogs I have talked about Rey in Star Wars, and the women’s worlds of Spy and Suffragette. And, to this I want to add Ghostbusters (2016).

I watched it last night for the first time, and thought it was brilliant. I have never watched the original Ghostbusters, because I never wanted to. The first time I was aware of it on TV, I was a teenager, and as it started, I thought: Huh, blokes, and went upstairs and read a book.

Last night was totally different. I loved every second, it made me laugh out loud, and as someone who has decided not to dye her grey hair anymore, the riff on hair dye was really funny, because that was happening to me a lot. And when Sigourney Weaver turned up at the end to high-five and utter the immortal line: Safety lights are for dudes… well, my life felt complete.

A room of one’s own

There was no patronising female quest of creating a home or attracting a man to make a woman feel validated; it was just smart women being themselves and saving the world. They didn’t need recognition, just a nice space to carry on doing what they love. Virginia Woolf would be so proud.

I can’t wait to see more stories like this one. Lots more.

Part 5: Possession, the relations between minds

Designing story (3): Archetypes and aesthetics

I’ve learned that people will forget what you said, people will forget what you did, but people will never forget how you made them feel

– Maya Angelou

[Part 3 of 5: 1) The intimacy of the written word, 2) Structure, 3) Archetypes and aesthetics, 4) Women, 5) Possession, the relations between minds]

I used to believe that my time was the most precious commodity that I had to give to someone else. Now, I know it is the energy that I bring with me.

Our energy is what stays with other people long after we have left and vice-versa. You never forget how people made you feel, and even if you never see them again, the very thought of them can enrich or deplete you.

Archetypes can do the same in any given story. They bring energy based on the story patterns we have known since we humans first started telling stories. They are ritualistic and encourage the reader to infuse the narrative with their own emotions. Archetypes can arouse fears and anxieties, or yearnings and desires.

The field of archetypal literary criticism looks at how all narratives have intertextual elements. This means that we recognise story patterns and symbolic associations from other texts. For example, we know what a black pointy hat signifies. We know a night-time setting is quite different from a daytime one. We have learnt this from all the other stories we have read prior to the one we are reading now. Meaning does not exist independently.

They broke the mould when they made you

In design, archetypes are the first examples of their type, and may be reused to inspire a new design, but they are never simply copied. And so it is in storytelling. We don’t want the same character, but we want the new character to be moulded in a similar way to the original. The Greek word archetupos means first-moulded, which probably inspired the idiomatic, yet delightful, compliment: They broke the mould when they made you.

Archetypes are not just characters but can be settings such as caves, mazes, deserts, mountains, etc. Each one suggests certain ritual experiences such as the discovery of self (cave), spiritual quests (maze or mountain), or spiritual wastelands (desert). Each setting brings a certain energy which primes our expectations and how to respond. Archetypes can also be events: birth, death, first love, leaving home, those rites of passage which define our lives.

Common patterns and ways of living

It was Carl Jung who first suggested that, since these common patterns and ways of living have emerged in many cultures, they are hardwired in our brains, predisposing us to think a certain way about them, even unconsciously.

Jung worked on his theory of archetypes for many years, often without pinning down exactly what he meant. Finally, he compared the form of the archetype to the lattice of a crystal, which dictates the crystal’s shape and matter even though it has no material existence of its own – rather like the form-follows-function principle in design, where something is pared down, beautiful, and develops organically.

Today, it is common to refer to a specific number of archetypes, like the 12 in the image above. Variations of this image are used in marketing, as marketers like to tap into stories already known in order to make their products seem familiar, which they then wrap up in beautiful aesthetics, because research has shown that art rewards us: it lights up the reward centre in our brain.

Archetype + aesthetic = irresistible

A familiar archetype wrapped in lovely packaging – known as the art infusion effect – is irresistible. We want to consume shiny new things, and if they seem a little familiar and they resonate, what could be better?

I like the above diagram because it has echoes of Maslow’s hierarchy of needs, and we can put the four groups of archetypes alongside the five groups of needs (plus the extra transcendence needs Maslow later added to his pyramid) like so:

  • Physiological and safety needs = Provide structure to the world.
  • Social needs = Connect with others.
  • Esteem needs = Leave a mark on the world.
  • Self-actualisation and transcendence needs = Yearn for paradise.

I first realised that our needs and motivations haven’t changed since the Iron Age when I visited a crannog one summer. We may look like we are doing things differently, but we still need to feel warm and fed, loved and connected, esteemed and self-actualised, which raises the question: How much have our stories changed through time when our needs haven’t?

Motifs, symbols and facets

Scriptwriter Christopher Vogler uses seven archetypes because he has based his on the hero’s journey, or monomyth, which was first identified by mythologist Joseph Campbell.

Inspired by Soviet folklorist Vladimir Propp’s 31 story functions and seven character functions, which are used in the study of semiotics, Vogler says that the archetypes in a good story function like motifs or symbols, or as facets of the main character’s personality. Propp himself said that they were like the masks we adopt in life to fit in, to protect ourselves, to not be vulnerable, until finally we become the version of ourselves we think others want us to be, and we forget who we really are.

The seven archetypes are often used in psychotherapy. The theory is that if we know which archetypes we have adopted, and how they make up the thought patterns and beliefs in our minds, then we can work with them to overcome limiting beliefs and engage in personal growth.

Seven archetypes for storytelling

  1. The hero is the one with whom the audience identifies; the hero’s journey involves growth, action, and sacrifice. The hero represents Freud’s ego and our search for identity and wholeness.
  2. The mentor stands for our highest aspirations or higher self: a teacher or a gift-giver. The mentor is our conscience and a plot device which plants information. There are also dark mentors who mislead us.
  3. The shapeshifter is the one who brings doubt and suspense into a story. It may be a lover who seems close and loving but turns out to betray us.
  4. The threshold guardian represents our barriers or neuroses. They test us, and a successful hero learns that threshold guardians can become useful allies.
  5. The trickster is the one who cuts us down to size, teaches us humility, and brings about healthy change and transformation. The trickster also provides comic relief, releasing tension and making us laugh at ourselves in order to create change.
  6. The shadow can represent suppressed emotions, such as a hero damaged by doubt. The shadow may also be external: an opponent who challenges the hero and gives them a worthy adversary. They may even be a love interest – someone who is bad for us, who doesn’t have our best interests at heart. Sometimes the shadow represents feelings such as anger, which is a healthy emotion until it is suppressed and turned inward, when it can lead to depression.
  7. The herald brings word of challenge, signalling change and moving the plot on whenever needed.

How do we feel when we meet these archetypes in the characters, in our familiar story structures, in the spaces that writers create? Do we feel the energy they have to impart? Does it suppress or empower us? Delight or dismay us? We enter into that shared heart space and come out the other end a little different to when we went in, carrying a new energy that we get to keep. Storytelling: the most powerful way to communicate, ever.

 Part 4: Women