Game theory in social media (1): Fate and power

Early dice made from knucklebone in Ancient Greece © British Museum

[Part 1 of 4: Game theory & social media: Part 2, Part 3, Part 4]

Humans love games. We just love playing them.

The earliest evidence we have so far dates back to 3600 BCE: six-faced dice carved from the heel bones of sheep and deer, along with coloured pebbles used as counters, have been found on archaeological digs in Assyria, Sumeria, and Egypt.

Today, a walk around the British Museum, one of my favourite places (which you can now do on Google), shows us the places where guards sat, probably for hours. At the entrance to the palace of King Ashurnasirpal II (883–859 BC) at Nimrud, a game board and a tally are scratched into the side of one of the enormous winged human-headed lions.

By the time of the birth of Christ, many kinds of random number generator, including dice, were common, and were used for betting on or with board games. They were often spoken of as the instruments of the blind goddess of fate, fortune, or destiny. The Bible tells us that soldiers cast lots to decide how to divide up Jesus’s garments (Matthew 27:35). Even nowadays we talk about the roll of the dice when we talk about chance and the things which happen to us.

By the time playing cards reached Europe in the fourteenth century, they had become the most popular thing with which to play games. There might be some skill involved, but really, a lot of it is up to chance, and don’t we all know that cliché about playing the hand you were dealt?

Highs and lows on the roll of a dice

The first formal attempt at analysing games, especially games of chance, was written in the sixteenth century (but only published in 1663) by Gerolamo Cardano, and has been recognised as the first step towards probability theory. Cardano was a compulsive gambler, so he would have felt the highs and lows of the roll of the dice more than most. His work anticipated that of Pascal and Fermat, whose famous 1654 correspondence continued it. And it was Fermat’s Last Theorem which remained unsolved until 1994. Imagine: it took over three hundred and fifty years to solve one puzzle.

Later, the writer Fyodor Dostoyevsky described our love of excitement and chance when playing games, and how our fortunes can flip in an instant. He wrote about it in letters to his sister and in his short novel The Gambler. He was convinced that you needed to detach and keep a clear head, but had difficulty doing either, for it is much easier said than done. Little wonder, then, that gambling and games are ubiquitous, from church bingo to nationwide lotteries. Life can really change with a roll of the dice – or so it seems.

Game theory not gamification

It was in 1928 that game theory was first written about, by (rock star) John von Neumann, who among many other things designed the first stored-program computer architecture in 1945.

But, it has to be said, game theory is not the same as gamification at all. Please don’t mix them up. Gamification is about turning things, such as business objectives, into games: anything we want to make more engaging and more fun. When gamification is well designed, it works really well. But game theory is much bigger, and about much more than just games.

In 1944, von Neumann and Oskar Morgenstern translated and expanded von Neumann’s theories to produce Theory of Games and Economic Behavior. His 1928 paper had been mainly about two people playing a game together with only one winner (known as a two-person zero-sum game), but game theory is much bigger than this, and it is not just about games and game playing.

It might be based in mathematics, but game theory has people in it, of course, which is why it can be used to think about so much else: economics, political science, and psychology. It does rest on the bold assumption that people behave rationally, and if there is one thing I know about life, it is that people never behave rationally, nor should you expect them to. The other caveat is that we can only ever partially model any prescription, because the world is huge and constantly changing, and we can never model everything in a computer, however clever computers get. We have a long way to go yet in modelling humans and behaviour, but game theory is a start.

That said, power is the name of the game: group voting, economic theory, and how to influence people, especially in areas like interpersonal cooperation, competition, conflict, labour negotiations, and economic duopolies, can all be understood in terms of game theory.

Game theory for explaining social media

Social media is the Internet’s big new tool for business, politics, and more, and as yet no one knows how it works. So this series is going to take a look at some of the big hitters of game theory: the prisoner’s dilemma, the Nash equilibrium, and so on, to see if these strategies can help us better understand how social media works. Are people on social media cooperating or conflicting in the ways these models describe? If yes, can we understand and anticipate behaviour? If not, what other theories could we come up with?

Let’s take a look.

[Part 2]

Gaming: Storytelling and ludology

Minecraft, from PlayStation
Source: PlayStation

Playing video games is, I am sorry to say, not my favourite pastime. My first attempt at playing was The Hobbit back in the 1980s on my brother’s Spectrum 48K, and it frustrated me no end. Though I did like playing Pac-Man a couple of years later, on a handheld device which only had Pac-Man on it. In the late 1990s I was introduced to MOOM – one of the first massively multiplayer online games – which I was very excited to take part in (I was asked by one of the creators), but alas, I didn’t really persevere because I didn’t have the patience.

Now when I play video games (and I get asked every day) I last about 10 minutes, because I hate learning all the rules just to find out what to do. However, I love watching others play, especially my girls, because the games are fantastic entertainment. So I understand completely how the likes of Stampy became so popular, and I love thinking about what gaming means. Apparently this means that I like thinking about fun rather than having fun: a sort of theory of fun.

Storytelling

You know what kind of gamer I am? When we come to a cinematic, I jump it. I go ‘I’m not watching a movie’ – Guillermo Del Toro

Video games can be viewed in the context of storytelling, or narratology: the way we construct meaning by creating stories about the world around us. Games have cinematic effects, great plots, soundtracks and super cool music, as well as cut scenes which explain backstory, reward players, or move the story along.

I love cut scenes, and enjoy watching whole movies of cut scenes, like LEGO Lord of the Rings. But film directors Guillermo del Toro and Steven Spielberg have criticised cut scenes, saying that, being non-interactive, they interrupt the flow of the game.

With or without cut scenes, video games have structure and tell a story to engage players emotionally, which then motivates them to perform certain actions. Consequently, they have been analysed in the humanities as interactive storytelling or electronic literature, a field which began before the WWW and focuses on readers interacting with stories to change the outcome of the narrative. Games can be played many times, and each time is different. Narratives generally, unless they are our favourites, are read once, and don’t change each time we read them. We change, though, and our interpretations change too (which is a different, though equally interesting, phenomenon to blog about).

Each experience in a game is different and we can be surprised and delighted with what happens next, like the time my girls went swimming in the Los Angeles River in Grand Theft Auto and were eaten by a shark.

Once dead, they could start that level again and follow another outcome, not necessarily following the prescribed narrative, because they love unstructured play and often choose open-world settings. This desire to play without structure is another area of gaming study, and has led to many video games set in real-world simulations, like The Sims and Second Life.

Simulation and simulacra

In these virtual worlds, we can explore and make; we can all be designers, and we can have different experiences which fulfil our basic needs, but in an immersive environment. That is to say, we feel we have left our world and are present in a simulated one. When we are so immersed, there are fewer blanks we need to fill in to make sense of that world. It feels normal to walk about The Shire, drive a car round LA, or ride a horse in Red Dead Redemption.

The stronger the narrative is, and the more the environment demands of us while giving our senses all the information they need – sight, sound, touch (haptic feedback) – the more complete it feels. And our minds don’t really know, or care, whether it is real or not. So we feel like we are stealing cars in Los Angeles or being a superhero in New York. And often we interact with simulated humans in video games: non-player characters, behavioural algorithms that look like humans.

It was The Matrix which first got us all talking about algorithms which aren’t human, as well as about simulation and simulacra. A simulation is a copy or version of something, say the real world; a simulacrum is a copy which has no original. For example, a digital file is not physical until it is printed out, and music recorded in a studio one instrument at a time is not a performance and never has been. It is a simulacrum of a performance.

Ludology

However, Espen Aarseth, Professor of Humanistic Informatics, has contested the idea of describing video games as storytelling narratives, simulated or not, and proposed the term ludology, because, after all, in video games we, via our avatars, are normally action-driven and want to win.

Ludology is the study of games. When playing a game, we need to: 1) learn the rules, 2) play the game, 3) win or lose. In terms of ludology, we play to win.

However, Aarseth proposed this before World of Warcraft (WoW), which was released in 2004 and became one of the most popular massively multiplayer online games, with more than 10 million active subscribers worldwide at its peak (numbers have since dropped). WoW allows gamers to play however they want: you choose which class you want to be in the land of Azeroth, and the quests come from that choice.

Minecraft carries this idea further. Released in 2009, it has been in development ever since, and allows players to be and do whatever they want. Players can build extraordinary works of architecture, or live in villages and interact with villagers (non-player characters) who grunt instead of talking. It is an amazing construct, and really popular.

In 2015, Minecraft: Story Mode was released, which is very much like an interactive novel: you choose to play a girl or a boy who, with a small group of friends, tries to win a building competition. Unlike the original Minecraft, it is a game of levels, cut scenes and branching conversations, with little in the way of exploration or creativity. The theory behind it seems to be that people who have an emotional attachment to Minecraft might enjoy experiencing a story in it. Rather like fan fiction backwards, I guess.

Video games defy categorisation: just when we find a way of thinking about them, a new game comes along to challenge it. And video games remain the fastest-growing entertainment sector, so it is hard to label constant change. One constant remains, though: most gamers, when asked, tell you that they play for fun. There even exists a theory of fun, whose purpose is to let game designers change the face of game design even further by creating more fun. The theory of fun at its best.

Conclusions: The limits of the social animal on social media (9)

Personifications of social media sites
Source: www.techweez.com

[Part 9 of 9: The Social Animal on Social Media, Part 1, Part 2, Part 3, Part 4, Part 5, Part 6, Part 7, Part 8]

Social media may be changing the way we do business and how we connect with others, but I don’t believe it is changing us fundamentally as humans. My theory, after writing this series, is that social media reflects the way we behave, and we behave the way we do because we are human. And, because we are human, we just can’t get enough of social media, which really isn’t our fault; it is just the way we are made.

Social media not only lights up the nucleus accumbens, the part of our brain which deals with rewards, but does so randomly, on what is called a variable interval reinforcement schedule. Rats or birds trained to receive rewards at random will work harder for them, and take longer to give up checking once all rewards for the behaviour are removed. We are the same: we will keep checking all our social media for a very long time before it stops rewarding us.
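That persistence is easy to see in a toy simulation: an agent rewarded at random keeps checking far longer than one which is never rewarded at all. A minimal sketch, with made-up probabilities and patience values chosen purely for illustration:

```python
import random

def checks_before_giving_up(reward_prob, patience, rng):
    """Count how many times an agent checks a feed before giving up.

    The agent tolerates `patience` consecutive unrewarded checks;
    a random reward resets its patience, so intermittent rewards
    keep it checking far longer than no rewards would.
    """
    checks = 0
    dry_streak = 0
    while dry_streak < patience:
        checks += 1
        if rng.random() < reward_prob:   # variable (random) reward
            dry_streak = 0
        else:
            dry_streak += 1
    return checks

rng = random.Random(42)
trials = 1000
# Intermittent rewards vs. no rewards at all:
intermittent = sum(checks_before_giving_up(0.3, 5, rng) for _ in range(trials)) / trials
never = sum(checks_before_giving_up(0.0, 5, rng) for _ in range(trials)) / trials
print(intermittent, never)
```

With a 30% chance of reward per check, the agent averages roughly three times as many checks before giving up as it does when rewards are switched off entirely, which is the variable-schedule effect in miniature.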

One reason is that social media is much easier on us than the face-to-face contact of daily life, and that in itself is a reward, for we much prefer people who are nice to us without us having to make a massive effort. For once we are tangled up with other people – as we have seen throughout this blog series – we conform and betray ourselves, we behave aggressively and act with prejudice, all so we can avoid feeling rejected. Then we feel so bad about our shoddy behaviour that we have to find ways to feel better by reducing our cognitive dissonance: the gap between who we are (good people) and the things we do (behaving badly towards others, or towards ourselves, like when we agree to do favours for people we don’t like).

We all need connection

It really isn’t our fault. Brene Brown, Professor of Sociology, says, that we are neurobiologically wired to want to connect with our fellow human beings. We all want to feel that we matter. So, of course we would choose social media. Why not choose the quickest and easiest way possible to feel connected to others? It seems like less of an emotional investment, but as this series has demonstrated, it really isn’t.

It might have been okay if social media had stayed as it began: easy and quick ways to share pictures, videos, and text between groups of friends, or networks for sharing interests across time and space. But once we realised that anyone could be a star in the land of digital culture, we all spent more time there trying to be loved or trying to make money. It amounts to the same thing, after all: money = influence, influence = feeling loved and valued.

And then, once news could be delivered the way we liked it – via, for example, the Huffington Post, which serves up the same article under two different headlines and then goes with the headline which attracts the most hits (aka A/B testing) – we never stood a chance. Web media started giving us what we want, around the clock, which encouraged traditional news outlets to try to keep up. Consequently, in-depth coverage and accuracy seem to have suffered. Facts are cherry-picked for nice-looking memes, which can remain unsubstantiated assertions because the rest of the facts don’t get checked half as much as what the crowd says. Journalism is engaging in groupthink.
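The headline trick is a plain A/B split: serve each variant to a random share of readers, count the clicks, and keep the winner. A minimal sketch, with invented click rates rather than anything from the Huffington Post:

```python
import random

def ab_test_headlines(click_rates, visitors, rng):
    """Serve each headline to a random share of visitors and
    return per-headline [impressions, clicks]."""
    stats = {h: [0, 0] for h in click_rates}
    for _ in range(visitors):
        headline = rng.choice(list(click_rates))   # random split
        stats[headline][0] += 1
        if rng.random() < click_rates[headline]:   # simulated click
            stats[headline][1] += 1
    return stats

rng = random.Random(0)
stats = ab_test_headlines({"Headline A": 0.05, "Headline B": 0.12}, 10_000, rng)
# Keep the variant with the higher click-through rate:
winner = max(stats, key=lambda h: stats[h][1] / stats[h][0])
print(winner)
```

With enough visitors, the higher-click-rate headline wins essentially every time, which is why publishers can run the test live and switch over within minutes.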

Limits on friendship

The marketer Marcus Sheridan wrote a funny blog post called Chris Brogan unfollowed me on Twitter and now I hate my life. How many of us measure our worthiness by the number of followers (and unfollowers) we have on Twitter, or friends on Facebook? How many of these people do we actually know?

Dunbar’s number, proposed by the anthropologist Robin Dunbar, is a limit to the number of people with whom one can maintain stable social relationships: relationships in which an individual knows who each person is. In reality the number is a series of layers, each roughly three times the size of the last: around 5 close friends, then roughly 15 people you might have to dinner, and so on, up to a maximum of about 150 people you know well enough to invite to a wedding or party to celebrate an event in your life. Dunbar’s number is tiny compared to the numbers seen on Facebook.
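The layering is simple enough to write down: each circle is roughly three times the size of the one inside it, stopping at the ~150 limit. A sketch, assuming a base circle of 5 and the usual scaling factor of 3:

```python
def dunbar_layers(base=5, factor=3, limit=150):
    """Generate Dunbar's friendship circles: each layer is roughly
    `factor` times the previous one, up to the ~150-person limit
    for stable relationships."""
    layers = []
    size = base
    while size <= limit:
        layers.append(size)
        size *= factor
    return layers

print(dunbar_layers())  # → [5, 15, 45, 135], close to the canonical 5, 15, 50, 150
```

The pure geometric series gives 5, 15, 45, 135; Dunbar rounds the outer layers to 50 and 150, but the tripling pattern is the point.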

I’d rather be anywhere than here

Google designer Jake Knapp wiped social media and email off his iPhone because he felt that by constantly checking social media apps he wasn’t present in the present moment, which is so true. If we are constantly distracted by our apps, or eager to share or capture a moment, then we are not really present in that moment.

This got me thinking: if we are constantly looking at other people’s moments and memes on social media, then when we get to experience a moment for ourselves, aren’t we having a second-hand experience? Will the landscape remind us of a photograph? Will an emotion remind us of a meme? Are we experiencing what we feel we should, rather than what would make us feel good?

Get nuanced

Meditation teacher davidji has said that the voices ranting on Facebook are usually the loudest voices (not normally the most accurate or uplifting, just the most influential), and they have an impact on us. We react to what other people are saying and doing online instead of following our own agenda. davidji believes that we need to get nuanced, and know what we are feeling, so we do not get hijacked by other people’s opinions. Otherwise we don’t stand a chance of not being influenced, which is why we are endlessly fascinated by people who influence us. We want to know how they do it, so we can wrestle back our power, or try to influence others so we can be heard.

I still believe that social media has the capacity to augment us, even though I have seen throughout this blog series the many ways it can diminish us. But that is because we are humans who haven’t yet realised that we all count and are all connected anyway. Social media just can’t do that for us. It is not a brave new world; it is the same old world on a small screen. To find a brave new world we have to make one ourselves, and we have to start by looking inside ourselves, instead of inside our phones.

Social media explained

Corey Smith on social media

The above image, by Corey Smith, is great. It has many variations – doughnuts, wee, even a piss-poor explanation of social media – which have been doing the rounds for years now.

This is because, when humans are presented with anything, new or old, they have to categorise it, classify it, and wrestle it to the ground in order to understand and manage the world around them. And then they like to tell others how to do it properly. Sometimes these humans are wise leaders, the culture carriers of society. Other times they are not, like the two people who told me, this week, that I am doing Twitter wrong.

The main problem they have with my wrong approach is that I like to read every tweet. What a weirdo! Consequently, I don’t follow many people, because I find it hard to keep up. Also, I don’t like everything I read, so if that happens repeatedly, I unfollow the tweeters who are filling up my feed. And, normally, they unfollow me. This seems to me a realistic approach. Isn’t that how it’s supposed to work? Isn’t that like life?

Apparently not. According to my Twitter advisors, I am supposed to follow zillions of people and dip in and out, and there is lots of software to help me do this, including the who-unfollowed-me app. I tried it out; apparently @Oprah, @DalaiLama, and @DeathStarPR unfollowed me. What? They never followed me in the first place. There has never been any reciprocation of my fandom, and I didn’t expect it either.

But, like all things in life, the more you do, the more you are worth. On Twitter, the more followers you have, the more you are worth, especially if you are influential, because you can turn that into money. And then the more money you have, the more you are worth, until you have an epiphany, give back to society, and become truly worthy. It is all about Maslow’s hierarchy of needs.

The writer Jennifer Werner is a social media influencer, and has written brilliantly in the New York Times about having her influence monetised.

I too had an influential moment, when I reviewed the first iPhone on the eve of its launch and had loads of companies contacting me to buy up links on my page. This was 2007, when SEO was the main way to get seen. Being influential in that tiny window of the iPhone launch was heady stuff! Not really: the companies weren’t anyone I wanted to dilute my brand for (*guffaws*).

Last summer, I went to a day of Sharing is Caring – social media strategies – at Campus London. It was a very interesting (and exhausting) day, full of how-to advice such as:

  • You have 15 minutes on Twitter to build momentum.
  • You have 1 hour on Facebook.
  • Speed is imperative.
  • Try to latch onto a world event to get noticed.
  • Peer-to-peer content is more valuable than anything else.

Everyone was furiously scribbling it down and tweeting away on the hashtag #campuslondon. Just remembering that day makes me want to tweet, facebook, and generally rush about to get noticed, even though I figured out a while ago that blogging is what I like to do.

One of the speakers was Malcolm Bell of Zaggora.com, whose success is used by Harvard Business School as a case study in, I guess, social media success. He talked a lot about different strategies, in particular using influencers like Jennifer Werner. But the thing he said which struck me the most was this:

No one has any idea how social media works.

Nobody. Not the CEOs of Facebook. Not the influencers of Twitter. No one.

And like most things which we poor humans don’t understand, we need an explanation, especially when there are people making money from it. It is fascinating, which is why there is big business in being a social media strategist.

The Twitter hashtags #contentmarketing and #socialmedia are full of:

  • Seven ways to get more ….
  • Use #contentmarketing to grow your…
  • Social media explained, etc.,

But for me this all leads back to the thing I always say in every blog about social media – actually, in every blog about anything – which is: whether you are a big business selling a product to make money, or an individual wandering around the Internet cocktail party looking for good conversation, it is all the same. We all want to be heard, we all want to feel like someone is listening to our story, and we all want to hear a good story.

And for those of us who want to be rich and famous, well, that is just a variation on being seen and heard. Money = power, power = people listening to us. Right now, social media seems to be the latest thing to make that possible.

The sociologist Sherry Turkle has said that there is no proper conversation on Twitter. But I disagree. I think there is; it’s just that I haven’t completely found the conversation of my dreams yet.

But when I do, I will let the world know, well 64 of them anyway.

Alone together: Is social media changing us?

Technology disconnect. Source: vortaloptics.com

The Information Superhighway is just a f***ing metaphor! Give me a break!
-Randy Waterhouse, Cryptonomicon (1999)

During a 2012 TED talk based on her book Alone Together, Sherry Turkle, Professor of the Social Studies of Science and Technology, said: ‘Technology is taking us places we don’t want to go.’

Turkle admits that this contradicts what she has said before, especially circa 1996 in another TED talk, when she celebrated life on the Internet. As a psychologist, she went online to learn about herself in the virtual worlds of chat rooms and online communities, so that she could unplug and use this knowledge in the real world. Nowadays, she regretfully admits, she sleeps with her phone.

As a human-computer interaction researcher, I have watched users anthropomorphise computers, and sociologists theorise about metaphors for the Internet rather than the Internet itself. Listening to Turkle, she seems to be doing both, which makes me ask:

  • What sort of information was she looking to learn about herself online?
  • Why did she think she would/could only learn this online?
  • And why does she feel the need to sleep with her phone? (What does she even mean by that anyway? Is it on her pillow?)

Prior to the Information Superhighway of online communities, computers sat on your desk and you, as a human, interacted with them in order to achieve an end result, e.g. an answer to a calculated problem, a neatly typed document, or a graph to explain some figures. As technology evolved, we shared this information over the Net, as this documentary shows: The Internet in 1995, for work and for fun.

With our smartphones we now have the ability to interact with people at the other end of ‘communicating technologies’, as Turkle refers to them – aka social media – in real time, wherever and whenever we want. And particularly with social media, we are often not interacting with a computer to solve a problem; we are just interacting with groups of people to share different types of information, for any number of reasons.

Turkle has found people using mobile technology during board meetings, lectures, meals, and funerals. She says this is bad because people are removing themselves not only from the situation, e.g. a parent texting instead of listening to a child during dinner, but also from their feelings, such as grief during a funeral.

The last funeral I went to was my Dad’s. I didn’t grieve the whole time. Some of the time I laughed and chatted to people as we remembered my Dad and his great gift of being able to make you laugh no matter what. And since then, even on the saddest days when grief has felt unbearable, I have found that it is impossible to grieve non-stop. You don’t do full-on, full-time. Grief is exhausting.

Researchers are still trying to understand how many emotions we feel in one day, and where one emotion ends and another starts. When you are deep in grief and hit by an intense wave of it, the wave lasts on average 90 seconds, and you have to hang on in there until it passes. You don’t get a choice.

So for me, people go into their phones not because they are escaping their emotions, but because they are choosing to stop one interaction and start a new one, like finding someone different to talk to at a cocktail party, or turning to the other side when seated at dinner. Turkle in some ways concurs with this, saying that people want to feel they have control over their attention. But that is not because of technology. A few years ago, I was invited to attend a one-hour meeting which went on for SEVEN HOURS. All I can say is that I wish I had had a distraction that day. As it was, I was grateful for my zoning-out abilities.

Turkle has obviously never had to suck it up, otherwise she would not believe that tolerating the boring bits of meetings helps us. She says that when we are in communication with others we are in communication with ourselves. But we are not all psychologists, and we don’t always want to examine ourselves as Turkle did in her 1996 brave new world. Often we just want a break, like I did from the guy who really had no respect for himself or anyone else in the room as he droned on incessantly. The only thing I learnt that day was that I wasn’t ever going to work with him again.

Turkle goes on to say that if we don’t reflect on ourselves, we don’t learn or know how to be alone. Again, I disagree. People have long found ways to avoid solitude and other people: work, alcohol, food, TV, radio, smoking – the list goes on. People have also long chosen meaningless interactions with many over quality time with few, and inattentive parents and friends are nothing new. Technology has not made us like this; we have always been seekers of distraction and stimulation.

One of the main disappointments of this talk is that Turkle doesn’t believe that people on Twitter have meaningful conversations, or that they are learning and knowing about each other. Instead, she thinks that we use online others as spare parts of ourselves, which makes me believe that she hasn’t really engaged with people on Twitter in normal conversation – especially when she says that we believe technology will listen when others don’t. Argh! It is not the technology which is listening. It is a person at the other end of the phone who is listening and responding. Which begs the question: who are all these people Turkle has interviewed, who are desperate to escape their lives? Why are they seeking an audience? Are they surrounded by people who are not listening and not giving them what they need?

Perhaps, or perhaps not. They could be a certain type of person who enjoys taking a couple of hours off work to get paid to sit in a lab and talk about themselves. I have often met people like this whilst performing usability research, and they are a perfect example of how we all want to be heard, we all want to be seen.

The world is changing, and technology is letting us connect with people regardless of time and space, which does change how we behave to a certain extent – many people, for instance, have learnt to text on their phones whilst maintaining eye contact and a conversation with the person in front of them. However, I do not believe it is making us incapable of connecting in a meaningful way. All the people in Turkle’s study demonstrated that: if they found the connections they had in meetings and at dinner so stimulating, they would put down the technology which makes it easy for them to connect elsewhere.

Turkle began the talk by saying that she got a text from her daughter and it felt like a hug! I was a bit surprised by this assertion. Perhaps hugs and texts do feel the same to her, although I would want to see her brain scanned during both events to see whether it lights up in the same way. Personally, a text from one of my daughters – symbols on a digital screen – could never feel like their arms around my neck and their beautiful faces next to mine. Never.

We have seen the end of society predicted many times, with the advent of rock ’n’ roll and of television. With hindsight, we realise that the bad thing in question is not an agent of change but an agent of reflection. So perhaps the question is not: is social media changing us? But: how is social media reflecting us?

Alone together (2)