Human-computer interaction, cyberpsychology and core disciplines

A heat map of the multidisciplinary field of HCI © Alan Dix

I first taught human-computer interaction (HCI) in 2001, from a software engineering viewpoint. When I taught it again, I taught it from a design point of view, which was a bit trickier, as I didn't want to trawl through a load of general design principles that didn't boil down to a practical set of guidelines for graphical user interface or web design. That said, I wrote a whole generic set of design principles here: Designing Design, borrowing from Herb Simon's great title The Sciences of the Artificial. Then I revised my HCI course again and taught it as a practical set of tasks, so that my students went away with a specific skill set. I blogged a version of it, revised and applied just to web design, in the series here: Web Design: The Science of Communication.

Last year, I attended an HCI open day, Bootstrap UX. The day itself was great and I enjoyed hearing some new research ideas, until we got to one of the speakers, who gave a presentation on web design. At least I think he did; it's hard to say, really, as all his examples came from architecture.

I have blogged about this unsatisfactory approach before. By all means use any metaphor you like, but if you cannot relate it back to practicalities then ultimately all you are giving us is a pretty talk or a bad interview question.

You have to put concise constraints around a given design problem and relate it back to the job that people do and have come to learn about. Waffling on about Bucky Fuller (his words, not mine) with some random quotes over nice pictures is not teaching us anything. We have a billion memes online to choose from. All you are doing is giving HCI a bad name and making it sound like marketing. Indeed, cyberpsychologist Mary Aiken, in her book The Cyber Effect, seems to think that HCI is just insidious marketing. Anyone might be forgiven for making the same mistake after listening to the web designer's empty talk on ersatz architecture.

Cyberpsychology is a growing and interesting field, but if it is populated by people like Aiken, who understand neither what HCI is nor how artificial intelligence (AI) works, then it is no surprise that The Cyber Effect reads like the Daily Mail (I will blog about the book in more detail at a later date, as there is some useful stuff in there, but too many errors). Aiken quotes Sherry Turkle's book Alone Together, which I have blogged about here, and it makes me a little dubious about cyberpsychology. I am still waiting for the book by the neuroscientist with lots of brain-scan pictures telling me exactly how our brains are being changed by the Internet.

Cyberpsychology is the study of the psychological ramifications of cyborgs, AI, and virtual reality. When I first heard that, I thought: Wow, this is great, and rushed straight down to the library to get the books on it, to see what was new and what I might not know. However, I was disappointed, because if the people who are leading the research anthropomorphise computers and theorise about metaphors for the Internet instead of about the Internet itself, then the end result will be skewed.

We are all cyberpsychologists and social psychologists now, baby. It’s what we do

We are all cyberpsychologists and social psychologists now, baby. It's what we do. We make up stories to explain how the world works. That doesn't mean the stories are accurate. We need hard facts, not Daily Mail hysteria (Aiken was very proud to say that she made it onto the front page of the Daily Mail with some of her comments). The research I have read about our behaviour online says it is simply too early to tell how we are being affected, and as someone who has been online since 1995, I only feel enhanced by the connections the WWW has to offer me. Don't get me wrong, it hasn't all been marvellous; it has been like the rest of life: some fabulous connections, some not so.

When I taught HCI at Westminster University in 2004, I lectured psychology students alongside the software engineering students. They were excited when I covered cognitive science, as it was familiar to them, and all the cognitive science tricks make it easy to involve everyone in the lectures and keep them fun. But when I made them sit in front of a computer and design and code up software as part of their assessment, they didn't want to do it. They didn't see the point.

This is the point: if you do not know how something works, how can you possibly talk about it without resorting to confabulation and metaphor? How do you know what is and is not possible? I may be able to drive a car, but I am not a mechanic. I would not give anyone advice about their car, nor write a book on how a car works, and if I did, I would not treat the car as a black box; I would have to put my head under the bonnet, otherwise I would sound like I didn't know what I was talking about. At least I drive a car, and use a car; that is something.

Hey! We’re not all doctors, baby.

If you don't use social media, and you just study people using it, what is that then? Theory and practice are two different things. I am not saying that theory is not important, it is, but you need to support your theory; you need some experience against which to evaluate it. Practice is where it's at. No one has ever said: Theory makes perfect. Yep, I've never seen that on a meme. You get a different perspective, as Jack Nicholson's character says to his doctor, played by Keanu Reeves, in Something's Gotta Give: Hey! We're not all doctors, baby. Reeves has seen things Nicholson hasn't, and Nicholson is savvy enough to know it.

So, if you don't know the theory, you don't engage in the practice, and you haven't any empirical data of your own, then what you are giving us is conjecture, fiction, a story. Reading the Wikipedia page on cyberpsychology, I see that it is full of suggested theories, like the one about Facebook causing depression. There are no constraints around the research. Were these people depressed before going on Facebook? I need more rigour. Aiken's book is the same, which is odd given that she has a lot of references; they just don't add up to a whole theory. I have blogged before about how fascinated I was that some sociologists perceived software as masculine.

In the same series I blogged about women as objects online, with the main point being that social media reflects our society, and that with technology we have a chance to impact society in good ways. Aiken takes the opposite tack and says that technology encourages and propagates deviant sexual practices (her words), some of which I hadn't heard of. But for me this raises the question: if I don't know about a specific sexual practice, deviant or otherwise, until I learn about it on the Internet (Aiken's theory), then how do I know which words to google? It is all a bit chicken and egg and doesn't make sense. Nor does Aiken's advice to parents, which is: Do not let your girls become objects online. Women and girls have been objectified for centuries. Technology does not do anything by itself; it supports people doing stuff they already do. And, like the HCI person I am, I have designed and developed technology to support people doing stuff they already do. I may sometimes inadvertently change the way people do a task when it is supported by technology, for good or for bad, but to claim that technology is causing people to do things they do not want to do is myth-making and fear-mongering at its best.

The definition of HCI that I used to use in lectures at the very beginning of any course was:

HCI is a discipline concerned with the design, evaluation and implementation of interactive computing systems for human use and with the study of major phenomena surrounding them (ACM, 1992).

For me, human-computer interaction was and still remains Gestaltian: the whole is greater than the sum of the parts. By this I mean that the collaboration of a human and a computer is more than a human typing numbers into a computer and then waiting for the solution, or indeed typing sexually deviant search terms into a web crawler to find a tutorial. And, with the advent of social media, HCI is more than one person connecting to another, or broadcasting online, which is why the field of cyberpsychology is so intriguing.

But the very reason I left the field of AI and went into HCI is this: AI reasons in a closed world, within the limits of the computational power you have available. There are limits. With HCI, that world opens up and the human gets to direct the computer to do something useful. Human-to-human communication supported by technology does something else altogether, which is why you might want the opinion of a sociologist or a psychologist. But you don't want the opinion of a sociologist on AI who doesn't understand how it works, has watched a lot of sci-fi, and thinks that robots are taking over the world. Robots can do many things, but it takes a lot of lines of code. And you don't want the opinion of a cyberpsychologist who thinks that technology teaches people deviant sexual practices and encourages us all to literally pleasure ourselves to death (Aiken's words; see what I mean about the Daily Mail?) because she read one dodgy story and linked it to a study of rats in the 1950s.

Nowadays, everyone might consider themselves a bit of an HCI expert, able to judge the original focus of HCI, which is the concept of usability: easy to learn, easy to use. Apps are a great example of this, because they are easy to learn and easy to use, though mainly because they have limited functionality; that is, they focus on one small task, like getting a date, ordering a taxi, or sharing a photo or a few words.

However, as HCI professor Alan Dix says in his reflective Thirty years of HCI and also here about the future: HCI is a vast and multifaceted community, bound by the evolving concept of usability, and the integrating commitment to value human activity and experience as the primary driver in technology.

He adds that sometimes the community can get lost, and says that Apple's good usability has been sacrificed for aesthetics, so users are not supported as well as they should be. Online, we can look at platforms like Facebook and Twitter and see that they do not look after their users as well as they could (I have blogged about that here). But again, it is not the technology; it is people who have let the users down. Somewhere along the line someone made a trade-off: economics over innovation, speed over safety, or aesthetics over usability.

HCI experts are agents of change. We are hopefully designing technology to enhance human activity and experience, which is why the field of HCI keeps getting bigger and bigger and has no apparent core discipline.

It has a culture of designer-maker, which is why at any given HCI conference you might see designers, hackers, techies and artists gathering together to make things. HCI has to exist between academic rigour and exciting new tech; no wonder it is not easy to define. But as we create new things, we change society, and we have to keep debating areas such as intimacy, privacy, ownership and visibility, as well as things that seem pretty basic, like how to keep things usable. Dix even talks about human-data interaction: as we put more and more things online, we need to make sense of the data being generated and interact with it. There is new research being funded into trust (which I blogged about here). And Dix suggests that we could look into designing for solitude, and into supporting users in not responding immediately to every text, tweet or digital flag. As an aside, I have switched off all notifications, my husband just ignores his, and it boggles my mind a bit that people can't bring themselves to be in charge of the technology they own. Back to the car analogy: they wouldn't have the car telling them where they should be going.

Psychology is well represented in HCI, and so is AI. Hopefully we can subsume cyberpsychology too, so that the next time I pick up a book on the topic, it actually makes sense and the writer knows what goes on under the bonnet.

Technology should be serving us, not scaring us. If writers could stop behaving like 1950s preachers who think society is going to the dogs, viewing how people embrace technology the same way they once viewed rock'n'roll and the television, we could be more objective about how we want our technological progress to unfold.

Women in Tech: Society, Storytelling, Technology (7)

Ada Lovelace and her laptop

The world’s first programmer, Ada Lovelace. Source: Mashable

We cannot live in a world that is not our own, in a world that is interpreted for us by others. An interpreted world is not a home. – Hildegard of Bingen

[Women Part 7 of 9: 1) Introduction, 2) Bodies, 3) Health, 4) Work, 5) Superwomen, 6) Religion, 7) In Tech, 8) Online 9) Conclusions]

A couple of years ago, one of the dads at my girls' school, following an initiative at his workplace, wanted help setting up an after-school coding club to teach kids to program. He asked me if I would come along and help, because there was a bit about Ada Lovelace and the guidelines said it would preferably be a woman giving that presentation. I said I would be pleased to be a role model to guide young girls into IT. I said I would bring my girls, and yep, sign me up, show me the materials.

One of my girls was at the time one year too young for the club (following his guidelines), but I said that it would be fine: she's smart, with a love of mathematics, and she should come. Indeed, she had to come, as I look after her. But this man was insistent that she couldn't come. He didn't want me childminding, not that I would have been; I would have been teaching and doing a job. His own wife, who had worked in IT, stayed at home and looked after his children whilst he ran the code club.

So there you have it. If there hadn't been a mention in his materials about needing a woman to talk about her job in IT, I doubt he would even have asked me. Male groupthink is prevalent in IT, as well as in lots of parts of society. He certainly never felt the need to explain his reasons for not updating me on his plans; he ran the club regardless with other dads, never mentioned it to me again, and never showed me any of the materials. The worst bit of all in this troubling tale is that this man is an IT manager. A manager!

This anecdote, for me, sums up many experiences I have had in the world of IT: a socially awkward male cannot imagine what it is like to be a woman, nor can he bend a tiny rule for something bigger than himself.

I am so used to this sort of nonsense in society that I just let it slide. His individual lack of initiative and imagination can be found everywhere. There are a million stories of women being treated as unimportant in the computing industry and other domains, as I discussed in the blog on Women's Work, and that is before we mention the purposeful aggression, sexism and appalling behaviour directed towards women too.

The picture above is a mashup of Ada Byron, Countess of Lovelace, who worked with Charles Babbage on his computing machine, making her officially the first computer programmer. A lot of computing pioneers were women. According to National Public Radio, which looked at the statistics for women in computing, the number of women studying computer science grew faster than the number of men until 1984, when the home computer arrived and was marketed to boys, inventing the nerd stereotype and overwriting all the true stories of women in IT.

I was a final-year undergraduate the first time I heard about Ada Lovelace, and the only reason I learnt about her was that the programming language Ada is named after her. Sitting in a lecture hall full of men, the story of a woman was so invigorating that I taught myself Ada and wrote my final-year project in it. It only took a few facts of her life to make me feel excited, included, inspired. What other things might I have decided to do had I known about NASA programmer Margaret Hamilton, whose code put men on the moon, and who brought her daughter with her to the lab too, or about Grace Hopper, whose machine-independent language ideas led to COBOL? I learnt COBOL in my second year, but no one ever thought she was worth a mention. I tell you, COBOL and I might have got along much better had I known about Grace.

Female computer scientists were not mentioned during my many years of formal education. It is rather like the early 19th-century women scientists Caroline Herschel, Jane Marcet, and Mary Somerville, who in their lifetimes were recognised as being at the forefront of European science, but who were no longer spoken about by the end of the 19th century, because all women had been barred from graduating from university. Written out of history, and not given the legitimacy of belonging that men enjoy. What message does that send a woman?

Our culture sends messages whether we like it or not, and mass culture likes to give us what we already like, because it is based on economics. So, the moment the male computing geek stereotype was invented, that narrative excluded women; it overwrote those great female stories. Like sells like, and fiscal reasoning doesn't care about telling new stories, especially when it comes to women. Progress is a myth where technology is concerned: we think that any change is an advancement, but it is not. Semiotically speaking, we look for a how, not a what, and we choose and reject stories based on how true they feel, which is based on familiarity, i.e. the stories we already know. So, if the constant narrative is that girls don't do computing and boys do, then this must be true.

It encourages a cultural devaluation of women across society, and in particular in technology. Take Stuff Magazine, a magazine for men who are interested in technology. It made me so cross with its objectifying of women that I had to write a whole blog slagging it off, and I only slag things off when I am angry. A Menkind shop, a gadget shop, has just opened up near me. Why is it called Menkind? When I passed it, it had a Harry Potter cutout in the window. Harry Potter, eh? We all know that J K Rowling chose her pen name so that she would appeal to young boys. Heaven forbid that society encourages little boys to take women seriously and to listen to whatever story they might have to tell. The bottom line is like sells like, and the bottom line is hard cold cash. Progress is a myth and women's stories are unimportant.

New Scientist news editor @PennySarchet wrote in a tweet how she was advised during her PhD to explain everything really simply, as if you were talking to a child or your mother. The original tweet she quotes, which has since been deleted, said grandmother. The cultural devaluation of women starts at home, with the mother.

And yet there is hope. There is always hope. Recently, I read Good Night Stories for Rebel Girls by Elena Favilli and Francesca Cavallo. In the Guardian review linked there, the female reviewer says her daughter was disappointed not to find J K Rowling, and the reviewer herself was disappointed to find Margaret Thatcher. J K Rowling writes books, yes successfully, whereas Thatcher was the first female UK Prime Minister, so the book has made the right choice. You can't edit Thatcher out of history just because you don't want to hear her story. She is, historically speaking, an incredibly important figure. Rowling, we can't say yet; time will tell. But we can say this: she wasn't the first woman writer in UK history. She is just one that the reviewer's daughter has heard of, because she hasn't heard many women's stories. Why? Because many women have been written out of history. Am I repeating myself?

I read the book with my daughter, who was really interested in the coders and physicists because of me. She kept showing me them and having a chat about it, because she is looking for stories which make sense of her world (even though she was excluded from code club, miaow), a world in which, luckily for her, her mother loves computing and takes up space in that field. But what about those girls whose mothers don't, and for whom only the dads do computing in after-school code club?

Lillian Robinson, in Wonder Women, says that feminism in stories is about the politics of stories. Each time a story is told about a woman doing something in a domain that society has traditionally defined as a man's world, that narrative becomes part of the information we women, and our girls coming after us, use to process our experiences, and that man's world becomes less male and more populated by women. Hopefully, an equal world of equal opportunity. And the opposite is true: if all the sources of narrative tell the same story about women, then nothing will ever change. Like sells like, remember.

Let us know as truth that the narratives behind the field of computer science need to be rewritten. Let's stop dealing in stereotypes, lazy journalism, and the misogyny aimed at female prime ministers (which is a whole other blog in itself). Let us look at the big picture, the bright one, which stops telling us that only men do IT. In Living a Feminist Life, Sara Ahmed says:

Feminism helps you to make sense that something is wrong; to recognise a wrong is to realise that you are not in the wrong.

Don’t make our girls wrong about computing.

[8) Online]

Is this progress? Humans, computers and stories

As a computer scientist, I have to say my job has changed very little in the last twenty-odd years. The tech has, admittedly, but I am still doing what I did back then: sitting in front of a computer, thinking about how computers can make people's lives easier, what makes people tick, and how we can put the two together to make something cool. Sometimes I even program something up to demonstrate what I am talking about.

It seems to me, though, that everyone else's jobs (non-computer scientists') have changed, and not necessarily for the better. People do their jobs and then they do a load of extras, like social media, blogging, content creation, logging stuff in systems (the list is endless), on top of their workload.

It makes me wonder: Is this progress?

Humans and stories

As a teenager, on hearing about great literature and the classics, I figured that they must be something highfalutin. In school we did a lot of those kitchen-sink, gritty dramas (A Kind of Loving, Billy Liar, Kes, etc.). So, when I found the relevant section in the library, Classics, Literature, or whatever it was called, it was a pleasant surprise to see that they were just stories about people, and sometimes gods, often behaving badly, and I was hooked. Little did I know that reading would be the best training I could receive to become a computer scientist.

Human and computer united together

In my first job, as a systems analyst and IT support, I found that I enjoyed listening to people's stories in and amongst their descriptions of their interactions with computers. My job was to talk to people. What could be better? I then had to capture all the information about how computers were complex and getting in the way, and try to make them more useful. Sometimes I had to whip out my screwdriver and fix a machine there and then. Yay! Badass tech support.

The thing that struck me most was that people anthropomorphised their computers, talking about them needing time to warm up, being temperamental, and being affected by circumstances, as if they were in some way human and not just a bunch of electronic circuits. And the computer was always the way of progress, even if they hated it and didn't think so.

I think this was partly because it was one person working with one computer, so the computer was like a companion, the office worker you love or hate, who helps or hinders. There was little in the way of email or anything else unless you were on the mainframe, and even then it was used sparingly, especially in huge companies. Memos were still circulated around. The computer was there to do a task – crunch numbers, produce reports, run the caustic soda plant (I did not even touch the door handles when I went in there) – the results of which got transferred from one computer to another by me, and sometimes by that advanced user who knew how to handle a floppy disk.

More often, information was transferred orally, by presentation in a meeting, or on paper with that most important of tools, the executive summary, whilst the rest of the document was a very dry, long-winded explanation, hardly a story at all.

Human and computer and human and computer united

Then the Internet arrived and humans (well, mainly academics) began sharing information more easily, without needing to print things out and post them. This was definitely progress. I began researching how people with different backgrounds, like architects and engineers, could work together with collaborative tools even though they use different terminology and different software. How could we make their lives easier when working together?

I spent a lot of time talking to architects and standing on bridges with engineers in order to see what they did. Other times I talked to draftsmen to see if a bit of artificial intelligence could model what they did. It could, up to a point, but modelling all that information in a computer is limiting in comparison to what a human knows instinctively, which is when I realised that people need help automating the boring bits, not the instinctive bits.

I was fascinated by physiological computing, that is, interacting using our bodies rather than typing, so using our voices or our fingerprints. However, when it was me, my Northern accent, and my French colleagues, all speaking our fabulous variations of the English language into some interesting software (written, I believe, by some Bulgarians) on a slow-running computer, well, the results were interesting, to say the least.

Everyone online

The UK government's push to get everything online seemed like a great idea, so that everyone could access all the information they needed. It impacted Post Offices, but it seemed to free up the time spent waiting in a queue and to provide more opportunities to do all those things like pay a TV licence, or get a road tax disc or a passport. This felt like progress.

I spent a lot of time working on websites for the government, with lovely scripts to guide people through forms like self-assessment, so that life was easier. We all know how daunting a government form can be, so what could be better than being told by a website which bit to fill in? Mmm, progress.

Lots of businesses came online, and everyone thought that Amazon was great way back when. I know I did: living in Switzerland, being able to order any book I wanted was such a relief, as opposed to waiting, or reading it in French. (Harry Potter in French, although very good, is just not the same.) Progress.

Then businesses wanted to be seen, leading to banners, ads, popups, bought links for self-promotion, and lots of research into website design so that sites were all polished and sexy, even though the point of the Internet is that it is a work in progress, constantly changing and never finished.

I started spending my time in labs, rather than in situ, watching people use websites and asking them how they felt. I was still capturing stories, but in a different way: more clinical, less of a natural habitat, which of course alters what people say, and which I found a bit boring. It didn't feel like progress. It felt businessy, a means to an end, and not much fun.

Human-computer-human

Then phones became more powerful and social media was born, and people started using computers just to chat, which felt lovely, and like progress. I had always been in the privileged position of being able to chat to people the world over, online, whatever the time, with the access I had to technology; now it was just easier and available to everyone. Definitely progress. Until, of course, companies wanted in on that too. So now we have a constant stream of ads on Facebook and Twitter, and people behaving like they are down the market jostling for attention, shouting out their wares 24/7, with people rushing up asking: Need me to shout for you?

And, then there are people just shouting about whatever is bothering them. It’s fantastic and fascinating, but is it progress?

The fear of being left behind

The downside is that people all feel obliged to jump on the bandwagon and be on multiple channels without much to say, which is why they have to do extras like creating content as part of their ever-expanding jobs. The downside is that your stream can contain the same information repeated a zillion times. The upside is that people can say whatever they like, which is why your stream can contain the same information repeated a zillion times.

Me, I am still here, wondering about the experience everyone is having when all this is happening on top of doing a job. It feels exhausting, and it feels like we are being dictated to by technology instead of the other way around. I am not sure what the answer is. I am not sure if I am even asking the right question. I do know how we got here. But is this where we need to be? Do we need to fix it? Does it need fixing? And where should we go next? I think we may need a course correction, because when I ask a lot of people, I find that they agree. If you don't, answer me this: how do you feel when I ask: Is this progress?

Web design (5): Structure

A collaborative medium, a place where we all meet and read and write.
Tim Berners-Lee

[Part 5 of 7 : 0) intro, 1) story, 2) pictures,  3) users, 4) content, 5) structure, 6) social media, 7) evaluation]

Many designers have adopted a grid structure for web pages because a) it lends itself well to responsive design and b) it allows a design which is easy for users to understand. Designers have about five seconds before a user will click away to find a different service/page/content provider if the page is laid out in a way which is difficult to understand.
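Such a grid can be sketched in a few lines of CSS. This is a minimal illustration only, not taken from any particular framework; the class names are made up:

```css
/* A twelve-column responsive grid: header and footer span the full
   width, navigation and main content share the middle. */
.page {
  display: grid;
  grid-template-columns: repeat(12, 1fr);
  gap: 1rem;
}
.page > header,
.page > footer { grid-column: 1 / -1; }
.page > nav    { grid-column: 1 / span 3; }
.page > main   { grid-column: 4 / -1; }

/* On narrow screens, stack everything into a single column. */
@media (max-width: 600px) {
  .page > nav,
  .page > main { grid-column: 1 / -1; }
}
```

Because the columns are defined once on the container, the same markup reflows for phones and desktops, which is exactly why grids suit responsive design.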

In a great talk for An Event Apart, Jen Simmons, Designer and Developer Advocate at Mozilla, looks offline at magazines for inspiration and remembers how much experimentation and creativity there was online until everyone adopted grids and fell into a rut.

But it is easy to understand why everyone adopted grids: users create their own understanding of a webpage from its structure. A text is complete within itself, and meaning comes from its structure and language rather than from the ideas it contains. This is a fundamental principle of semiotics, the study of meaning.

Managing expectations

When a webpage is judged to be useless, it is often because it does not behave in the way the user is expecting, particularly if it is not very attractive.

Designers either need to manage a user's expectations by giving them what they are expecting in terms of the service they are looking for, or they need to make the page super attractive. Attractive things don't necessarily work better, but we humans perceive them as doing so, because they light up the brain's reward centre and make us feel better when we are around them. We are drawn to attractive things, an effect explained by certain Gestalt principles such as unity and symmetry, and by the golden ratio.

Gestalt: similarity, proximity

Good design is one thing, but we also have specific expectations about any given webpage. We scan for headings and white space, and interpret a page in those terms. This is because, according to Gestalt theory, we interpret items according to their proximity (items which are close together, we group together) and their similarity (items which are similar, we interpret as belonging together).

And also because we have been to other sites: we transfer our experiences from one site to another and anticipate where certain functions should be.

Where am I? Where have I been? Where am I going?

Main menus are usually at the top of the page, grouped together, and are used for navigation through the site. Secondary navigation may take place in drop-down menus, or in left- or right-hand columns. Specific housekeeping information can be found in the footer, or the common links bar if there is one.
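As a sketch, that convention maps neatly onto HTML5’s sectioning elements (the link targets here are placeholders, not real pages):

```html
<body>
  <!-- Global navigation: the main menu, grouped at the top of the page -->
  <nav aria-label="Main">
    <ul>
      <li><a href="/">Home</a></li>
      <li><a href="/blog/">Blog</a></li>
      <li><a href="/about/">About</a></li>
    </ul>
  </nav>

  <main>
    <!-- Secondary (local) navigation in a side column -->
    <aside>
      <nav aria-label="Section">
        <ul>
          <li><a href="/blog/design/">Design</a></li>
        </ul>
      </nav>
    </aside>
    <article>
      <h1>Page content goes here</h1>
    </article>
  </main>

  <!-- Housekeeping information lives in the footer -->
  <footer>
    <a href="/privacy/">Privacy</a>
    <a href="/contact/">Contact</a>
  </footer>
</body>
```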

If users are completely lost they will use the breadcrumbs, which Google now displays instead of the URL of a site as part of its search results. Therefore, it is in a designer’s interest to put breadcrumbs at the top of the page.
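A minimal breadcrumb trail is commonly marked up as an ordered list inside a labelled `nav` element, like this sketch (the category names are illustrative; Google additionally recommends structured data, omitted here for brevity):

```html
<nav aria-label="Breadcrumb">
  <ol>
    <li><a href="/">Home</a></li>
    <li><a href="/lifestyle/">Lifestyle</a></li>
    <li aria-current="page">How to floss your teeth</li>
  </ol>
</nav>
```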

Users will stay longer and feel better if they can answer the three questions of navigation as articulated by usability consultant Steve Krug:

  1. Where am I?
  2. Where have I been?
  3. Where am I going?

Often this is answered by styling visited and unvisited links differently and by enforcing the consistency of the design with a sensible approach to colour. There is a theory of colour in terms of additive and subtractive mixing, whether digitally or on a palette, but there is, alas, no theory of how to use colour to influence branding and marketing, as personal preferences are impossible to standardise.
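In CSS this takes one rule per link state: the `:link` and `:visited` pseudo-classes answer “where am I going?” and “where have I been?” respectively. The colours below are arbitrary choices, not a recommendation:

```html
<style>
  a:link    { color: #0645ad; } /* not yet visited: where am I going? */
  a:visited { color: #663399; } /* already visited: where have I been? */
  a:hover   { text-decoration: underline; } /* feedback before clicking */
</style>
```

Note that, for privacy reasons, browsers only allow a limited set of properties (mainly colours) to differ between `:link` and `:visited`.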

HTML 5 & CSS 3

As discussed earlier in part 1 of this series, we separate our content from our presentation, which is styled using CSS 3. Once we know what we want to say, we use HTML 5 to structure our text and give it meaning to the reader, whether that reader is a screen reader or a human being.

HTML 5 breaks a page into its header and body, and the body is broken down further: headings from <h1> to <h6>, paragraphs, lists, sections, etc., so that we can structure a nice layout. There are thousands of tutorials online which teach HTML 5.
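A bare-bones example of that structure, with the presentation kept in a separate stylesheet (the file name `style.css` is just a placeholder):

```html
<!DOCTYPE html>
<html lang="en">
<head>
  <meta charset="utf-8">
  <title>Page structure</title>
  <!-- Presentation lives in CSS, separate from the content -->
  <link rel="stylesheet" href="style.css">
</head>
<body>
  <header>
    <h1>Main heading</h1>
  </header>
  <section>
    <h2>A subheading</h2>
    <p>A paragraph of content.</p>
    <ul>
      <li>A list item</li>
    </ul>
  </section>
</body>
</html>
```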

The nice thing about sections is that we can use them to source linked data from elsewhere and fill our pages that way, but still keep a consistent appearance.

One page is easy, and a couple of pages are fine, but once we get into hundreds of pages we need to think about how to present everything consistently and evenly across a site while still giving users the information they came for.

Information architecture

Information architecture (IA) is the way to organise the structure of a whole website. It asks: how do you categorise and structure information? How do you label it so that users can navigate or search through it in order to find what they need?

The first step is to perform some knowledge elicitation of the business or context and find out what the stakeholders (owners, customers) expect from the proposed system. This may include reading all the official documentation a business has (yawn!).

If there is a lot of existing information the best way to organise it is to perform a card sort. A card sort is when a consultant calls in some users, gives them a stack of index cards with content subjects written on them, along with a list of headings from the client’s site—“Business and News,” “Lifestyle,” “Society and Culture”— then users decide where to put “How to floss your teeth”.

This can take a few days and several goes until a pattern is found. We humans love to impose order on chaos; we love to find a pattern to shape and understand our world.

Once we have a structure from the card sort, it becomes easier to start designing the structure across the site and we begin with the site map.

The site map reflects the hierarchy of a system (even though Tim Berners-Lee was quite emphatic that the web should not have a hierarchical structure).
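A site map is essentially the card-sort categories written down as a hierarchy, which in HTML is just a nested list. Using the illustrative headings from the card sort above:

```html
<!-- Site map: the card-sort hierarchy as a nested list -->
<ul>
  <li>Home
    <ul>
      <li>Business and News</li>
      <li>Lifestyle
        <ul>
          <li>How to floss your teeth</li>
        </ul>
      </li>
      <li>Society and Culture</li>
    </ul>
  </li>
</ul>
```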

Then, once a site map is in place, each page layout can be addressed, along with the way users will navigate it. Thus we get main menus (global navigation), local navigation, content types to put in sections and paragraphs, etc., along with the functional elements needed to interact with users.

Other tools created at this time to facilitate the structure are wireframes, or annotated page layouts: if it is a big site, lots of people may be working on it, and clear tools for communication are needed so that the site structure remains consistent.

Mock-up screenshots and paper prototypes may be created, and sometimes, in the case of talented visual designers, storyboards. Storyboards are sketches showing how a user could interact with a system; they sometimes take a task-based approach, showing how a user would complete a common task.

Depending on the size of a project, information architects will work with content strategists, who will have asked all the questions in the last section (part 4) on content, and/or usability consultants, who will have spoken to lots of users (part 3) to get an understanding of their experiences, above and beyond their understanding of the labelling of information, in order to answer questions such as:

  • Does the website have great usability which is measured by being: effective and efficient; easy to learn and remember; useful and safe?
  • How do we guide users to our key themes, messages, and recommended topics?
  • Is the content working hard enough for our users?

Sometimes, it may just be one person who does all of these roles and is responsible for answering all of these questions.

It takes time to create great structure; often it takes several iterations of these steps, until it is time to go on to the next stage (part 6) and start sharing this beautiful content on social media.

[Part 6]

Game theory & social media marketing (4): Conclusions

The Royal Game of Ur, Early Dynastic III, c. 2600 BC, British Museum

[Part 4 of 4: Game theory & social media: Part 1, Part 2, Part 3]

No, I’m no super lady, I don’t have no game whatsoever,
I put my high heels on and see how that goes, yeah
– Pauline, Sucker for love

Ask a mathematician why they like maths, and they will tell you that mathematics gives a definite yes or no. There is beauty in clarity. And, everyone likes to feel that they understand and have control over what is happening in their world. This feeling of certainty is reflected in the bottom two rows of Maslow’s hierarchy of needs: physiological and safety needs.

Tapping into fear and belonging

That said, we also love variety and surprise, which is the kind of information most often shared on social media. We crave new stimuli, which is why we love games. We love the idea of chance or fortune transforming our lives for the better, and surely, if we learn the rules, then we will succeed. And that is why marketing has such a pull on us. Marketers tell us that we will have improved lives if we do, buy, or have what they are selling; and marketers themselves will have improved lives too if we do.

There are many ways to market something; this link lists 52 types of marketing strategy. The most effective, of course, aims at the bottom of Maslow’s hierarchy of needs, safety, which is why fear so often drives the news, and why fear coupled with specific instructions produces a compliant society.

Tapping into belonging is another way to market, which is why the connection economy and building friendship with your customers is gaining so much traction as a marketing strategy.

Modelling emotion and what-ifs

Modelling human emotion is impossible to do with game theory, especially on social media, a fluid, still unknown type of communication. We will never quite know who our audience is. We may target our demographic, but if they retweet or share something outside of it, then we never know exactly who is looking at our content, or how they will react to it. All game theory can do is offer interesting and potentially useful partial explanations, modelling a selection of what-if scenarios for different strategies.

In the last post (part 3), we looked at various game theory strategies from the aggressive to the altruistic, and saw that people generally behave like the people around them (hawk-dove) and that Kermit was in a bit of a hurry to get together with his girl, which caused him to behave passive-aggressively, and probably not get what he wanted.

 Don’t be like Kermit

Game theory is a tool for social media marketing, and its best application is recording trial-and-error attempts (with statistical significance) whilst using our emotional intelligence.

Be aware of your emotions and triggers (your personal competence) so you don’t get involved in a big wrangle, either privately, which could damage a relationship, or publicly, which might be retweeted everywhere and could wreck your brand or reputation. Even in the mathematics of game theory we need to understand other players’ moods and motives (social competence) and not assume anything. We need to ask for clarification, so that when we do make a move, we do so with clarity and certainty that we are doing the right thing. And, as any mathematician will tell you, there is beauty in clarity, for it gives us certainty and a sense of control, things which are harder to come by in our ever-changing world.