Cognitive Science: What makes your users tick

[Image: phrenology diagram, from www.hfac.uh.edu]

Like many usability consultants, I have spent hours locked in rooms with strangers asking: “What do you think about this web page?” It is a boring way to earn a living, especially as you often know the answers and could tell clients without asking the questions.

Alas, most clients only believe opinions about their websites when they come from random users – not you, the expert. Luckily the industry takes Jakob Nielsen’s advice on testing: five users only to establish a pattern of responses (and because it’s cheap). Although, if we were really serious, we would need 30 users before we could talk about the statistical significance of our results.

Some of this boredom could be avoided (and client money saved) if everyone employed the patterns that already exist in users’ heads to create more intuitive web pages and GUIs. Cognitive science, the study of mind and intelligence, enables us to understand what makes our users tick.

Security and usability: Don’t let your users get you down

[Image: a padlock, a monitor, and software]

After my first year at university I spent the summer working in a delicatessen in Putney. One morning during my first week, whilst in the middle of carefully carving six slices of Parma ham for some lady’s dinner party, we were told to evacuate the building as security had been warned that there was a bomb. I dropped everything and ran for my life. We stood around in the car park until we got the all clear and I arrived back at the counter to find the same woman ready to berate me for abandoning her dinner party plans.

After that there were on average three or four scares a week, and I became like the dinner party woman: nonchalant. Instead of leaving the building immediately, I would go upstairs to the locker room and get my purse – just in case we went to the pub. Sometimes people would shout up, “Bring my fags/jacket/bag down, will you?” Bomb scares became part of my daily routine, so much so that they were no longer life-threatening traumatic events but welcome tea-breaks.

Humans are adaptable, which is why we have survived so long. However, this flexibility means that the security systems specialists work hard to put into place are undermined by users’ perceptions of risk. As the old adage says, familiarity breeds contempt. As I did in the delicatessen, users get used to a certain level of risk and become blasé about warnings, until security and safety-awareness procedures are ignored. Even the simplest routines can be forgotten. How many of us carry out weekly backups? Hands up if you check that your automated backups work. Most people don’t even think about backups until they lose an important document. The same goes for security systems. When there is a security breach, 80–90% of the time it is the human or organisational element that is responsible, not the algorithms on which the system is built.
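The point about checking that your automated backups actually work can be made concrete. Here is a minimal sketch in Python (all paths and file names are hypothetical): make the archive, then read it back – an unverified backup is no backup.

```python
# Sketch of "check that your automated backups work" (hypothetical paths):
# create an archive, then verify it can actually be read back.
import os
import tarfile
import tempfile

src_dir = tempfile.mkdtemp()  # stand-in for a real documents folder
with open(os.path.join(src_dir, "notes.txt"), "w") as f:
    f.write("important notes")

backup_path = os.path.join(src_dir, "backup.tar.gz")
with tarfile.open(backup_path, "w:gz") as tar:
    tar.add(os.path.join(src_dir, "notes.txt"), arcname="notes.txt")

# Verification step: listing the archive forces it to be read;
# a corrupt or truncated backup raises an exception here.
with tarfile.open(backup_path, "r:gz") as tar:
    names = tar.getnames()

print("backup verified:", names)  # backup verified: ['notes.txt']
```

In a real weekly routine the verification step is the part most people skip – the archive gets written on schedule, but nobody ever checks it can be restored.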

Users are not the enemy

Sometimes this leads managers to view users as the enemy, and they spend all their time and energy hating stupid users. By all means track their behaviour, collect data on their log-ins and log-outs, and check what they are doing. Information is power. But rather than drowning in data and hating everyone, use the information to learn about your user group and visualise their needs. Redesigning log-in procedures and reducing human error can go a long way towards reducing security breaches. Your users are not the enemy: they need to be supported and viewed as part of the system.

Psychological responsibility

Another step is to give users the tools and information necessary so that they can become part of the solution and not the weakest link. Since we first started interacting with machines, researchers have studied what makes us tick. Cognitive science in particular concentrates on what humans are thinking when they are presented with a computer. And humans have been practising cryptography – hiding and transmitting sensitive information – since time began.

The best way to get users not to violate the security built into systems is to encourage psychological acceptability, so that users learn to apply security procedures correctly. Psychological acceptability involves good usability, feedback, system transparency, and a sense that users are responsible for their actions.

Give users transparent interfaces and good feedback

Usability is the effectiveness, efficiency and satisfaction with which people can carry out a given task using a computer. A system should be easy to use. Normally the graphical user interface (GUI), or simply the user interface, is responsible for the way users interpret what is going on and is the main factor influencing how users feel. Good GUIs should be transparent – familiar or somehow intuitively comprehensible – and the internal workings of the system should match the users’ mental models. If they can’t, then users need to be given the appropriate guidance and level of feedback so that they can learn how the system works and what it does.

By giving good feedback and creating transparency of interface, so that it is easy to understand how a system came to a specific conclusion, users begin to have confidence in their computers and will trust what computers say. Loss of trust, conversely, means that users start ignoring warnings and overriding defaults.

Allow users to accurately estimate risk

With good feedback and a sense of trust users get a more accurate perception of the risks to security. And if the GUI is a good one – easy to use, with a clear uncluttered design – users can more easily concentrate on the information they are being given.

Some researchers have suggested user profiling as a way of improving users’ estimation of risk. Target users are split into groups, and system responses are designed for a particular user profile. Each profile ideally captures an individual’s cognitive style, based on behaviour that has been tracked and analysed. If a given user’s behaviour changes and a security breach is anticipated, the user is assigned a new profile so that the level of information security increases.
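The profiling idea above can be sketched in a few lines of Python. This is purely illustrative – the profile names, the anomaly rule, and the security levels are all assumptions, not anything the researchers specify – but it shows the shape of the mechanism: behaviour is tracked, and an anomaly reassigns the user to a stricter profile.

```python
# Illustrative sketch of behaviour-based user profiling (all names and
# thresholds are hypothetical): anomalous behaviour moves a user to a
# stricter profile, raising the level of information security.

PROFILES = {"relaxed": 1, "standard": 2, "strict": 3}  # security levels

class UserProfile:
    def __init__(self, name, profile="standard"):
        self.name = name
        self.profile = profile
        self.recent_failed_logins = 0

    def record_failed_login(self):
        self.recent_failed_logins += 1
        # Crude anomaly rule: repeated failures suggest a possible
        # breach, so reassign the user to the strict profile.
        if self.recent_failed_logins >= 3:
            self.profile = "strict"

    def security_level(self):
        return PROFILES[self.profile]

user = UserProfile("alice")
for _ in range(3):
    user.record_failed_login()
print(user.profile, user.security_level())  # strict 3
```

A real system would of course use richer behavioural signals than failed log-ins, but the design choice is the same: the profile, not the user, decides how much the system demands.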

Avoid the erosion of security awareness

Jose J Gonzalez, a professor of system dynamics based in Norway, argues that the erosion of security awareness – users’ inability to follow safety standards – is comparable to the erosion of safe-sex practices that we are seeing in teenagers across the globe. We all want to interact safely, but when the moment arrives, circumstances, time pressures, or lack of cooperation may cause users to deviate from safe practices.

When dealing with users whose compliance with security measures declines as perceived risk declines, Gonzalez recommends that users be kept aware of the ever-present dangers. He suggests running refresher training courses during risk-free periods to remind users and managers to carry out safety procedures. In this way users avoid behaviour built on superstitious learning – learning how systems work in an almost cross-your-fingers way instead of viewing their interactions as a logical process with a finite number of steps.

It is also good to tell users when a security breach has been averted. That way users know that although they are protected, the systems they use are not invincible and are constantly under attack. Reminding users that they are part of the system, that they are key to its smooth running, and that they need to follow safety standards is just good practice.

Make your users responsible so that they don’t become the ones who inadvertently let your system down.

Getting your hands on Apple’s iPhone

[Image: the iPhone]

Another Apple marketing frenzy has led to the UK bracing itself for the launch of the iPhone tomorrow. The Carphone Warehouse is expecting large queues and Scotland Yard are warning customers to hide their new handsets so that they don’t get mugged.

Aside from the excitement there are criticisms. The main ones centre on the iPhone’s choice of network: O2. O2’s coverage isn’t great – apparently not even in the Apple store on Regent Street. And unlocked iPhones that early adopters are already using, thanks to eBay, won’t be able to download new software without being damaged. Vendor lock-in experts Apple are as bad as Microsoft in their need to dictate to customers how their products should be used, which ultimately is a big problem when you talk about the iPhone’s user experience and usability.

Checklists


  • [pdf] Designing web usability
  • [pdf] Ensuring web accessibility
  • [pdf] Using software for accessibility and usability
  • [pdf] Usability tools and techniques
  • [pdf] Measuring and increasing website return on investment

(Stalker-Firth, R., July 2003)