User or Used? Human-Computer Interaction: Dialogue, Conversation, Symbiosis (3)

If you torture the data enough, it will confess to anything.
– Darrell Huff, How to Lie With Statistics (1954).

[ 1) Introduction, 2) Dialogue or Conversation, 3) User or Used, 4) Codependency or Collaboration, 5) Productive or Experiential, 6) Conclusions]

In the last blog I wrote about human dialogue with a computer versus the conversation between humans via a computer. The dialogue with a computer is heavily designed, whereas human conversation, especially via social media, has come about serendipitously. For example, Twitter grew out of texting, which was invented by engineers as a tool to test mobile phones.

This is an example of what I call serendipitous design: users employ systems to do whatever they want them to do – the no-function-in-structure principle – and then designers find ways to support them. In contrast, the way to create systems which support users in doing their jobs better follows the cardinal rule: know your user, with all the various tools and techniques UX designers have borrowed from anthropology. You design with your user in mind, you manage their expectations, and you keep them at the front of your mind as a major factor of the design, so that the system has specific goals.

But, however hard you try, with each new system or software or form of communication, you often end up changing how people work, and the dialogue becomes less about a field of possibilities with insight, intuition, and creativity, and more about getting people to do extra stuff on top of what they already do. And, because people are keen to get in on whatever new thing is happening, they buy into what I call the myth of progress and adopt new ways of working.

This raises the question: are we creating systems for users or the used?

This raises the question: are we creating systems for users or the used? Today, I was chatting to a road sweeper. He told me that last year he was driving a lorry, but the council's initiative to reduce carbon emissions means that 80 lorries were taken off the road and the drivers are now out sweeping the streets on foot. He showed me his council-issue mobile phone, which tracks his every move and presumably reports his whereabouts back to the office at all times. Not that he needs it: if he sits on a wall too long, local residents will phone the council to complain that he is sitting down and not working hard enough.

Tracking is not new. Smart badges, invented at Xerox PARC, were trialled in the 1990s in the early days of ubiquitous computing (ubicomp). The idea was to move computers off the desktop and embed them into our infrastructure so that we interact with them without our knowledge, freeing the user from the need to learn complex systems. In the badges' case, everyone could be located by everyone else in the building, rather like the Harry Potter Marauder's Map. However, it smacks rather too much of surveillance, especially if your boss decides you are spending too long in the toilet or by the water cooler and that your behaviour needs to change. The road sweeper, instead of a badge, has a mobile phone, and people who spy on him and grass him up, in part because they lack context and don't know that he is entitled to a 20-minute break.

Must I really run all my important ideas past my fridge?

But it's not just as part of a job: we have Google Maps recording every journey we make. And yet, ubicomp was less about having a mobile device or about surveillance. It was the forerunner of the Internet of Things, the ambient life, which is there to make things easier, so the fridge talks to your online shopping to say that you need more milk. But what if I go vegan? Do I need to inform my fridge first? Must I really run all my important ideas past my fridge? This is not the symbiotic relationship psychologist and mathematician J.C.R. Licklider had in mind in his vision of man-computer symbiosis.

I was speaking to someone the other day who monitors their partner's whereabouts. They think it's useful to see where the partner is at any given time and to check that the partner is where they said they would be. No biggie, just useful. I mentioned it to another person, who said that they had heard of several people doing the same. I wonder why I am so horrified when other people just think it's practical.

Insidious or practical? I feel we are manipulated into patterns of behaviour which maintain the status quo.

Last week, I woke up and checked my Fitbit to see how I had slept, which is slightly worrying in itself – I never needed anything to tell me how I slept before – and there was a new box in there: Female Health. I clicked on it. It asked me about birth control, when my next period is due, how long it lasts, and so on. Intrigued, I entered the requested data. The resulting box said: Your period is due in eight days. Really? I mean, really? It was wrong, even though I had tinkered with the settings. So then it had a countdown: Your period will last four more days, three more days… etc. Wrong again. And now it is saying: Four days to most fertile days. This is so irritating. It feels like Argos, you know, how the system and the reality of you getting something you've ordered never quite match up. I know together me and my Fitbit can build up data patterns. Will they be insightful? Time will tell. The bit which really concerns me is that it said it wouldn't share this information with anyone, okay… but then it added that I couldn't share this information either. What? I am guessing that it wants me to feel safe and secure. But what if I wanted to share it? What does this mean? Menstrual cycles are still taboo? I can share my steps but not my periods? My husband and I laughed about the idea of a Fitbit flashing up a Super Fertile, proceed with caution message when out on date night.

I regularly lie in bed pretending to be asleep to see if I can fool my Fitbit

But it's not just me and my Fitbit in a symbiotic relationship, is it? Someone is collecting and collating all that data. What are they going to do with that information, prying into me and my Fitbit's symbiotic space? It rather feels like someone is going to start advertising in there, offering birth control alternatives and sanitary protection improvements. It feels invasive, and yet I signed up to it – me, the person who thinks a lot about technology and privacy and data and oversharing. And even now, as I sit here and think about my mixed feelings about my Fitbit, the idea of wearing something on my arm which only tells me the time, and not my heart rate, nor the number of steps I am taking, feels a bit old-fashioned – I myself am a victim of the myth of progress. I am user and used. Confession: I regularly lie in bed pretending to be asleep to see if I can fool my Fitbit. It's changing my behaviour all the time. I never used to lie in bed pretending to be asleep.

Back in 2006, I watched Housewife 49. It was so compelling, I bought the books. Nella Last was a housewife in Barrow-in-Furness who kept a diary, along with 450 other people, during and after the war. It was part of the Mass Observation project set up by an anthropologist, a poet, and a filmmaker, which sounds rather like the maker culture of HCI today. They didn't believe the newspapers' reporting of the abdication and marriage of King Edward VIII, so went about collecting conversations, diary entries, and observations. Rather like today: we have social media with endless conversation and diary entries and observations. The newspapers are scrambling to keep up and curate other people's tweets, because they have traditionally been the only ones who shape our society through propaganda and mass media. Now we have citizens all over the world speaking out their version of events. We don't need to wait for the newspapers.

We are living through a mass observation project of our own, an enormous social experiment, and it is a question worth asking: User or used? Who is leading this? And what is their goal? And then we have the big companies, like Google, collecting all our data. We all know the deal: we give them our data, they give us free platforms and backups and archives. However, it doesn't necessarily mean that they are right about the results of their research on our data, or that they have the right to use every last piece of information, even if you give it freely, because there is a blurring of public and private information about me and my steps and periods and birth control.

Anthropologist Agustín Fuentes has written a thoughtful article about the misuse of terms such as biology in Google's manifesto and, consequently, the sweeping generalisations it draws. Fuentes says we have no way of knowing what happened before we collected data, and even now as we collect data, we have to maintain our integrity and interpret it correctly by using terms and definitions accurately. Otherwise, we come to believe that data tells the truth, and stereotypes and bias and prejudices are maintained. I love the quote:

If you torture the data enough, it will confess to anything.

Information is power. Hopefully, though, there are enough anthropologists and system designers around to stop the people who own the technology from telling us what to think, claiming insights into our lives whilst peddling old ideas. We need to pay attention to truth and transparency before we trust, so that we can have more open dialogue in the true sense of the word – an exploration of a field of possibilities – to lead to real and effective change for everyone.

Let us all be users not the used.

[Part 4]

Human-computer interaction, cyberpsychology and core disciplines

A heat map of the multidisciplinary field of HCI © Alan Dix

I first taught human-computer interaction (HCI) in 2001. I taught it from the viewpoint of software engineering. Then, when I taught it again, I taught it from a design point of view, which was a bit trickier, as I didn't want to trawl through a load of general design principles which didn't boil down to a practical set of guidelines for graphical user interface or web design. That said, I wrote a whole generic set of design principles here: Designing Design, borrowing Herb Simon's great title: The Sciences of the Artificial. Then, I revised my HCI course again and taught it from a practical set of tasks so that my students went away with a specific skill set. I blogged about it in a revised, applied-just-to-web-design version in a blog series here: Web Design: The Science of Communication.

Last year, I attended an HCI open day, Bootstrap UX. The day in itself was great, and I enjoyed hearing some new research ideas until we got to one of the speakers, who gave a presentation on web design. I think he did, anyway – it's hard to say really, as all his examples came from architecture.

I have blogged about this unsatisfactory approach before. By all means use any metaphor you like, but if you cannot relate it back to practicalities then ultimately all you are giving us is a pretty talk or a bad interview question.

You have to put concise constraints around a given design problem and relate it back to the job that people do and have come to learn about. Waffling on about Bucky Fuller (his words – not mine) with some random quotes on nice pictures is not teaching us anything. We have a billion memes online to choose from. All you are doing is giving HCI a bad name and making it sound like marketing. Indeed, cyberpsychologist Mary Aiken, in her book The Cyber Effect, seems to think that HCI is just insidious marketing. Anyone might have been forgiven for making the same mistake listening to the web designer's empty talk on ersatz architecture.

Cyberpsychology is a growing and interesting field, but if it is populated by people like Aiken, who don't understand what HCI is nor how artificial intelligence (AI) works, then it is no surprise that The Cyber Effect reads like the Daily Mail (I will blog about the book in more detail at a later date, as there's some useful stuff in there, but too many errors). Aiken quotes Sherry Turkle's book Alone Together, which I have blogged about here, and it makes me a little bit dubious about cyberpsychology. I am waiting for the book written by the neuroscientist with lots of brain-scan pictures to tell me exactly how our brains are being changed by the Internet.

Cyberpsychology is the study of the psychological ramifications of cyborgs, AI, and virtual reality, and I was like: wow, this is great. I rushed straight down to the library to get the books on it, to see what was new and what I might not know. However, I was disappointed, because if the people who are leading the research anthropomorphise computers and theorise about metaphors for the Internet instead of the Internet itself, then it seems that the end result will be skewed.

We are all cyberpsychologists and social psychologists now, baby. It’s what we do

We are all cyberpsychologists and social psychologists now, baby. It's what we do. We make up stories to explain how the world works. It doesn't mean to say that the stories are accurate. We need hard facts, not Daily Mail hysteria (Aiken was very proud to say she made it onto the front page of the Daily Mail with some of her comments). However, the research I have read about our behaviour online says it's just too early to tell how we are being affected, and as someone who has been online since 1995, I only feel enhanced by the connections the WWW has to offer me. Don't get me wrong, it hasn't been all marvellous; it's been like the rest of life, some fabulous connections, some not so.

I used to lecture psychology students alongside the software engineering students when I taught HCI in 2004 at Westminster University. They were excited when I covered cognitive science, as it was familiar to them, and all the cognitive science tricks make it easy to involve everyone in the lectures and make them fun. But when I made them sit in front of a computer and design and code up software as part of their assessment, they didn't want to do it. They didn't see the point.

This is the point: if you do not know how something works, how can you possibly talk about it without resorting to confabulation and metaphor? How do you know what is and what is not possible? I may be able to drive a car, but I am not a mechanic, nor would I give advice to anyone about their car, nor write a book on how a car works. And if I did, I would not just think about the car as a black box; I would have to put my head under the bonnet, otherwise I would sound like I didn't know what I was talking about. At least I drive a car, and use a car; that is something.

Hey! We’re not all doctors, baby.

If you don't use social media, and you just study people using it, what is that then? Theory and practice are two different things. I am not saying that theory is not important – it is – but you need to support your theory; you need some experience to evaluate the theory. Practice is where it's at. No one has ever said: Theory makes perfect. Yep, I've never seen that on a meme. You get a different perspective, as Jack Nicholson says to his doctor, played by Keanu Reeves, in Something's Gotta Give: Hey! We're not all doctors, baby. Reeves has seen things Nicholson hasn't, and Nicholson is savvy enough to know it.

So, if you don't know the theory, and you don't engage in the practice, and you haven't any empirical data yourself, you are giving us conjecture, fiction, a story. Reading the Wikipedia page on cyberpsychology, I see that it is full of suggested theories, like the one about how Facebook causes depression. There are no constraints around the research. Were these people depressed before going on Facebook? I need more rigour. Aiken's book is the same, which is weird since she has a lot of references; they just don't add up to a whole theory. I have blogged before about how I was fascinated that some sociologists perceived software as masculine.

In the same series I blogged about women as objects online, with the main point being that social media reflects our society and we have a chance with technology to impact society in good ways. Aiken takes the opposite tack and says that technology encourages and propagates deviant sexual practices (her words) – some I hadn't heard of – but for me this raises the question: if I don't know about a specific sexual practice, deviant or otherwise, until I learn about it on the Internet (Aiken's theory), then how do I know which words to google? It is all a bit chicken and egg and doesn't make sense. Nor does Aiken's advice to parents, which is: Do not let your girls become objects online. Women and girls have been objectified for centuries; technology does not do anything by itself, it supports people doing stuff they already do. And, like the HCI person I am, I have designed and developed technology to support people doing stuff they already do. I may sometimes inadvertently change the way people do a task when supported by technology, for good or for bad, but to claim that technology is causing people to do things they do not want to do is myth making and fear mongering at its best.

The definition of HCI that I used to use in lectures at the very beginning of any course was:

HCI is a discipline concerned with the design, evaluation and implementation of interactive computing systems for human use and with the study of major phenomena surrounding them (ACM, 1992).

For me, human-computer interaction was and still remains Gestaltian: the whole is greater than the sum of the parts. By this I mean that the collaboration of a human and a computer is more than a human typing numbers into a computer and then waiting for the solution, or indeed typing sexually deviant search terms into a search engine to find a tutorial. And, with the advent of social media, HCI is more than one person connecting to another, or broadcasting online, which is why the field of cyberpsychology is so intriguing.

But the very reason why I left the field of AI and went into HCI is this: AI reasons in a closed world, within the limits of the computational power you have available. There are limits. With HCI, that world opens up and the human gets to direct the computer to do something useful. Human-to-human communication supported by technology does something else altogether, which is why you might want the opinion of a sociologist or a psychologist. But you don't want the opinion of a sociologist on AI when they don't understand how it works, have watched a lot of sci-fi, and think that robots are taking over the world. Robots can do many things, but it takes a lot of lines of code. And you don't want the opinion of a cyberpsychologist who thinks that technology teaches people deviant sexual practices and encourages us all to literally pleasure ourselves to death (Aiken's words – see what I mean about the Daily Mail?) 'cos she read one dodgy story and linked it to a study of rats in the 1950s.

Nowadays, everyone might consider themselves to be a bit of an HCI expert and can judge the original focus of HCI, which is the concept of usability: easy to learn, easy to use. Apps are a great example of this, because they are easy to learn and easy to use, mainly, though, because they have limited functionality; that is, they focus on one small task, like getting a date, ordering a taxi, sharing a photo, or a few words.

However, as HCI professor Alan Dix says in his reflective Thirty years of HCI and also here about the future: HCI is a vast and multifaceted community, bound by the evolving concept of usability, and the integrating commitment to value human activity and experience as the primary driver in technology.

He adds that sometimes the community can get lost, and says that Apple's good usability has been sacrificed for aesthetics, so users are not supported as well as they should be. Online, we can look at platforms like Facebook and Twitter and see that they do not look after their users as well as they could (I have blogged about that here). But again, it is not technology, it is people who have let the users down. Somewhere along the line someone made a trade-off: economics over innovation, speed over safety, or aesthetics over usability.

HCI experts are agents of change. We are hopefully designing technology to enhance human activity and experience, which is why the field of HCI keeps getting bigger and bigger and has no apparent core discipline.

It has a designer-maker culture, which is why at any given HCI conference you might see designers, hackers, techies, and artists gathering together to make things. HCI has to exist between academic rigour and exciting new tech; no wonder it is not easy to define. But as we create new things, we change society, and we have to keep debating areas such as intimacy, privacy, ownership, and visibility, as well as what seems pretty basic, like how to keep things usable. Dix even talks about human–data interaction: as we put more and more things online, we need to make sense of the data being generated and interact with it. There is new research being funded into trust (which I blogged about here). And Dix suggests that we could look into designing for solitude and supporting users in not responding immediately to every text, tweet, or digital flag. As an aside, I have switched off all notifications, my husband just ignores his, and it boggles my mind a bit that people can't bring themselves to be in charge of the technology they own. Back to the car analogy: they wouldn't have the car telling them where they should be going.

Psychology is well represented in HCI, AI is well represented in HCI too. Hopefully we can subsume cyberpsychology too, so that the next time I pick up a book on the topic, it actually makes sense, and the writer knows what goes on under the bonnet.

Technology should be serving us, not scaring us. So if writers could stop behaving like 1950s preachers who think society is going to the dogs – viewing how people embrace technology the same way they once viewed rock'n'roll and the television – we could be more objective about how we want our technological progress to unfold.

Web design (5): Structure

A collaborative medium, a place where we all meet and read and write.
Tim Berners-Lee

[Part 5 of 7 : 0) intro, 1) story, 2) pictures, 3) users, 4) content, 5) structure, 6) social media, 7) evaluation]

Many designers have adopted a grid structure for designing web pages because a) it lends itself well to responsive design and b) it allows a design which is easy for users to understand. Designers have about five seconds before a user will click away to find a different service/page/content provider if the page is laid out in a way which is difficult to understand.
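To make this concrete, here is a minimal sketch of a responsive grid in CSS 3; the class name is invented for illustration, not taken from any particular site:

```css
/* A responsive grid: each column is at least 16rem wide, and the
   browser fits as many columns as the viewport allows, so the
   layout reflows by itself on smaller screens. */
.page {
  display: grid;
  grid-template-columns: repeat(auto-fit, minmax(16rem, 1fr));
  gap: 1.5rem;
}
```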

In a great talk for An Event Apart, Jen Simmons, Designer and Developer Advocate at Mozilla, looks to offline magazines for inspiration and remembers how there was much experimentation and creativity online until everyone adopted grids and fell into a rut.

But it is easy to understand why everyone adopted grids: users create their own understanding of a webpage from its structure. Text is complete within itself, and meaning comes from its structure and language rather than the ideas it contains. This is a fundamental principle of semiotics, the study of meaning.

Managing expectations

When a webpage is judged to be useless, it is often because it does not behave in the way the user is expecting, particularly if it is not very attractive.

Designers either need to manage a user's expectations by giving them what they are expecting in terms of the service they are looking for, or they need to make the page super attractive. Attractive things don't necessarily work better, but we humans perceive them as doing so because they light up the brain's reward centre and make us feel better when we are around them. We are attracted to attractive things, an attraction explained by certain Gestalt principles such as unity, symmetry, and the golden ratio.

Gestalt: similarity, proximity

Good design is one thing, but we also have specific expectations about any given webpage. We scan for headings and white space and interpret a page in those terms. This is because, according to Gestalt theory, we interpret items by their proximity – items which are close together, we group together – and by their similarity – items which look similar, we interpret as belonging together.

And also because we have been to other sites: we transfer our experiences from one site to another and anticipate where certain functions should be.

Where am I? Where have I been? Where am I going?

Main menus are usually at the top of the page, grouped together, and are used for navigation through the site. Secondary navigation may take place in drop-down menus, or in left- or right-hand columns. Specific housekeeping information can be found in the footer, or the common links bar if there is one.

If users are completely lost, they will use the breadcrumbs, which Google now uses instead of the URL of a site as part of the results its search engine serves up. Therefore, it is in a designer's interest to put breadcrumbs at the top of the page.
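As a sketch, a breadcrumb trail is just an ordered list of links; the page names and paths here are invented for illustration:

```html
<!-- Breadcrumbs: an ordered list of links from the home page down
     to the current page, which is marked as current but not linked. -->
<nav aria-label="Breadcrumb">
  <ol>
    <li><a href="/">Home</a></li>
    <li><a href="/lifestyle/">Lifestyle</a></li>
    <li aria-current="page">How to floss your teeth</li>
  </ol>
</nav>
```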

Users will stay longer and feel better if they can answer the three questions of navigation as articulated by usability consultant Steve Krug:

  1. Where am I?
  2. Where have I been?
  3. Where am I going?

Often this is answered by styling links as visited or not visited, and by enforcing the consistency of the design through a sensible approach to colour. There is a theory of colour in terms of adding and subtracting colour to create colour, either digitally or on a palette, but there is, alas, no theory about how to use colour to influence branding and marketing, as personal preferences are impossible to standardise.
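As a minimal sketch, the Where have I been? question can be answered with two CSS rules; the colours are arbitrary, the point is only that the two states look consistently different across the site:

```css
/* Distinguish visited from unvisited links so users can see
   where they have already been. */
a:link    { color: #0645ad; } /* not yet visited */
a:visited { color: #6b3fa0; } /* already visited */
```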

HTML 5 & CSS 3

As discussed earlier in part 1 of this series, we separate our content from our presentation, which is styled using CSS 3. Then, once we know what we want to say, we use HTML 5 to structure our text and give it meaning for the reader. That reader may be a screen reader or a human being.

HTML 5 breaks a page into its header and body, and then the body is broken down further into specific elements: headings from <h1> to <h6>, paragraphs, lists, sections, etc., so that we can structure a nice layout. There are thousands of tutorials online which teach HTML 5.
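Putting those elements together, a minimal page skeleton might look like this; the headings and text are placeholders rather than content from any real site:

```html
<!DOCTYPE html>
<html lang="en">
  <head>
    <meta charset="utf-8">
    <title>Page title</title>
    <!-- Presentation is kept separate, in a stylesheet. -->
    <link rel="stylesheet" href="style.css">
  </head>
  <body>
    <h1>Main heading</h1>
    <section>
      <h2>A section heading</h2>
      <p>A paragraph of content.</p>
    </section>
  </body>
</html>
```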

The nice thing about sections is that we can use them to source linked data from elsewhere and fill our pages that way, but still keep a consistent appearance.

Theoretically one page is great, or a couple of pages fine, but once we get into hundreds of pages, we need to think about how we present everything consistently and evenly across a site and still provide users the information for which they came.

Information architecture

Information architecture (IA) is the way to organise the structure of a whole website. It asks: How do you categorise and structure information? How do you label it so that users can navigate or search through it in order to find what they need?

The first step is to perform some knowledge elicitation about the business or context and about what everyone involved (owners, customers) – known as stakeholders – expects from the proposed system. This may include reading all the official documentation a business has (yawn!).

If there is a lot of existing information, the best way to organise it is to perform a card sort. A card sort is when a consultant calls in some users, gives them a stack of index cards with content subjects written on them, along with a list of headings from the client's site – “Business and News,” “Lifestyle,” “Society and Culture” – and then the users decide where to put “How to floss your teeth”.

This can take a few days each time, and a few goes, until a pattern is found. We humans love to impose order on chaos; we love to find a pattern to shape and understand our world.

Once we have a structure from the card sort, it becomes easier to start designing the structure across the site and we begin with the site map.

The site map reflects the hierarchy of a system (even though Tim Berners-Lee was quite emphatic that the web should not have a hierarchical structure).

Then, once a site map is in place, each page layout can be addressed, along with the way users will navigate it. Thus, we get main menus (global navigation), local navigation, content types to put in sections and paragraphs, etc., along with the functional elements needed to interact with users.
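As a sketch, global navigation is usually just a labelled list of links; the menu items below reuse the card-sort headings from earlier purely for illustration, and the paths are invented:

```html
<!-- Global navigation: the site-wide main menu, one link per
     top-level category from the information architecture. -->
<nav aria-label="Main menu">
  <ul>
    <li><a href="/news/">Business and News</a></li>
    <li><a href="/lifestyle/">Lifestyle</a></li>
    <li><a href="/culture/">Society and Culture</a></li>
  </ul>
</nav>
```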

Other tools created at this time to facilitate the structure are wireframes, or annotated page layouts, because if it is a big site, lots of people may be working on it, and clear tools for communication are needed so that the site structure remains consistent.

Mock-up screenshots and paper prototypes may be created, and sometimes, in the case of talented visual designers, storyboards. Storyboards are sketches showing how a user could interact with a system; sometimes they take a task-based approach, so that users can complete a common task.

Depending on the size of a project, information architects will work with content strategists, who will have asked all the questions in the last section (part 4) on content, and/or usability consultants, who will have spoken to lots of users (part 3) to get an understanding of their experiences, above and beyond their understanding of the labelling of information, in order to answer questions such as:

  • Does the website have great usability which is measured by being: effective and efficient; easy to learn and remember; useful and safe?
  • How do we guide users to our key themes, messages, and recommended topics?
  • Is the content working hard enough for our users?

Sometimes, it may just be one person who does all of these roles and is responsible for answering all of these questions.

It takes time to create great structure; often it takes several iterations of these steps until it is time to go on to the next stage (part 6) and start sharing this beautiful content on social media.

[Part 6]

Web design (7): Evaluation


A collaborative medium, a place where we all meet and read and write.
Tim Berners-Lee

[Part 7 of 7 : 0) intro, 1) story, 2) pictures, 3) users, 4) content, 5) structure, 6) social media, 7) evaluation]

Even though evaluation is the final part of this series, it should not be left to the end of any software project. Ideally, evaluation should be used throughout the life cycle of a project in order to assess the design and user experience, and to test system functionality and whether it meets user requirements without creating unexpected results or confusion.

Expert analysis

Expert (or Theoretical) analysis uses a detailed description of the design, which doesn’t have to be implemented. This creates a model of the user’s activity and then analysis is performed on that model.

It is one way of assessing whether a design has good usability principles. It cannot guarantee anything, but it can hopefully flag up any design flaws before time and money are spent on implementation.

Expert analysis is best used during the design phase and experts can assess systems using:

Heuristics, which are rules of thumb rather than true usability guidelines. Usability expert Jakob Nielsen developed 10 usability heuristics in 1995, and they are still widely used and quoted today. Design consultant Ari Weissman says that heuristics are better than no testing at all, but that to say they can replace getting to know your users and understanding them is just silly. Researchers at the University of Nebraska found that heuristic evaluation and user testing complement each other and are both needed.

Review-based evaluation uses principles from experimental psychology and human-computer interaction (HCI) literature to provide evaluation criteria such as menu design, command names, icons and memory attributes to support/refute design decisions. Reviews may even use style guidelines provided by big companies such as Microsoft and Apple.

Model-based evaluation uses a model to evaluate software. This model might be taken from HCI literature such as Stuart Card’s GOMS and Ben Shneiderman’s Eight golden rules of dialog design.

Cognitive walkthroughs are step-by-step inspections which concentrate on what the user is thinking whilst learning to use the system. Alas, it is the analysts who act as the user and try to imitate what the user is thinking. Walkthroughs can be used to help develop user personas.

However, the main criticism is that novice users are often forgotten, because analysts have lots of experience, and their pretending to be users can introduce all sorts of bias into your system. The advantage of this approach is that areas which are unclear in the system design can be easily flagged up and fixed cheaply, early in the life cycle.

Using your user: user testing

The most informative types of evaluation always take place with the user. This can happen in the laboratory or in the field. In the laboratory, usability consultants have a script, such as this one by usability expert Steve Krug. The usability consultant asks the user either to do whatever they are drawn to do, or to perform a specific task, such as buying a product on the site, whilst talking aloud. This thinking-aloud protocol not only identifies what the problem is, but also why. The best thing about usability testing is that clients can hear a user saying something which may be obvious to the consultant but not to the client, and which the client might not believe if the consultant just told them. Co-operative evaluation is a very similar technique to usability testing.

Outside the laboratory, you can follow the user about and shadow them in the workplace, to see how they interact with your software, or with the current software that your new software will hopefully improve upon. This is ethnography, a way of learning about the context in which your users work. It can be very expensive and time-consuming to hire ethnographers to go into users' workplaces.

A cheap and cheerful way of reproducing this shadowing is to get the users to keep a diary or blog, known as a cultural probe. These are quick and easy to put together, using open-ended questions which encourage users to say all the things they might not say during a testing session.

Empirical evaluation

Another relatively cheap and cheerful method is to get your user group to fill out a questionnaire or a survey in order to get their feedback.

The questionnaire needs to be designed very carefully, following these instructions, otherwise you can end up with a lot of information but nothing tangible. The main advantage is that you get your users' opinions and you can measure user satisfaction quite easily.

The disadvantage is that it is hard to capture certain types of information in a questionnaire, such as the frequency of a system error, or the time taken to complete a task.

Logging

Computers can collect usage statistics to tackle exactly those sorts of questions: time taken and frequency of system errors. Web stats are a great way of seeing this sort of information, as well as which pages are the most attractive and most useful to users. Eye-tracking software and click captures are also useful ways of collecting data. However, care needs to be taken not to introduce any bias in the interpretation of this data.

Informal evaluation

Informal evaluation methods can be useful, in the design stage for example, but are better suited to the context of research, as they do not always yield results which can be used to guide design.

Focus groups: this is when you get a group of users together to discuss subjects, led by a moderator. Focus groups can be useful. However, they can lead to users telling you what they think they want, rather than what they need. As this 2002 paper asks: Are focus groups a wealth of information or a waste of resources?

Controlled experiments test a hypothesis. Take this great example: college students (population) type (task) faster (measurement) using an iPad's keyboard (feature) than using a Kindle's keyboard. You identify independent and dependent variables that you can collect data on, and then test in a simulation of real-world situations, such as a college where iPads and Kindles are used.

No matter how great your website or software system is, it can always be improved by some method of evaluation. There are many methods involving users and experts to make your system as good as it can be throughout the whole lifecycle of your website or your software. Evaluation is the only way to identify and correct those design flaws.

Web design (6): Sharing and caring on social media


A collaborative medium, a place where we all meet and read and write.
Tim Berners-Lee

[Part 6 of 7 : 0) intro, 1) story, 2) pictures, 3) users, 4) content, 5) structure, 6) social media, 7) evaluation]

Nowhere is Berners-Lee’s vision of the World Wide Web more true than on social media. We all have access to as many conversations as we want. We can instigate new conversation, listen to other people talking, and dip in and out of art, music, video, and other amazing creations.

Most of the articles on social media for web design are about content marketing and content marketing strategy, which is a way for businesses to raise their profiles, create brand awareness, generate new sales and new customers, and keep existing customers loyal.

The main way to do this is by creating targeted content which is valuable and useful to the user/customer, who then trusts the company and is more likely to buy from it. Content marketing is big business, and getting bigger every year, according to i-scoop.eu.

Moz.com has published a set of best practices for social media marketing, saying that before businesses promote their products and news, they must also build relationships with their customers so that customers feel like they are part of a community. Sharing different types of content, not just information about products and promotions, is one way of starting new conversations and creating new experiences with customers to encourage a feeling of trust.

Newspapers like the FT let their customers do some of their marketing for them by providing tweetable quotes throughout their articles which link back to the news item on their website. This is a great way of using all the best content on your website.

A hierarchy of social media?

The types of information we share on social media fit nicely into Maslow's hierarchy of needs, in what I call Maslow's hierarchy of social media.

Content marketers believe that the further down Maslow's triangle you are, the more likely it is that you are fulfilling customers' basic needs, which may encourage customer loyalty. However, customer needs aside, the information type which is shared more than anything else on social media is surprising information, in the form of stories, short videos, and images. Apparently, we all seek that twist in the tale.

Alone together

Spiritual thinker Deepak Chopra believes we are connected and raised up by social media. In contrast, sociologist Sherry Turkle feels that social media is changing us, and not in a good way. Writers Jennifer Weiner and Jonathan Franzen concur, believing that social media encourages the worst in us. Social media, of course, offers both experiences: enriching and depressing. It can be a feeding frenzy of attack, but also an amazing way of augmenting humans with others' talents, skills, and knowledge.

Of course, the reality is that no one really knows how social media works, which is why companies spend billions each year trying to find better and faster ways of reaching their target market by tweeting, facebooking, instagramming, and blogging.

A masterclass

One of the best brands online is OWN: The Oprah Winfrey Network. During Oprah Winfrey's 25-year TV series, she created a community. Her message was: You are not alone. With it, Oprah tapped into one of our deepest needs – we all want to feel that we matter. We want to be included in a community and to be heard in conversation. We want to feel connected, so that we can be open and participate in life with others.

Since ending her award-winning show, Oprah and her network OWN have reached out to their audience via social media to give information, courses, and communitas. They have given us all a masterclass in how these tools should be used to satisfy both the customer and the business, and they continue to go from strength to strength.

Social media is an exciting way of instantly connecting to your customers and creating community, in order to direct people to your website. Done well, your users will happily co-create alongside you on your website, enriching you in ways only Tim Berners-Lee had the vision to see.

[Part 7]