Social media security: Sharing is caring?

Recently, YouTube prankster Jack Vale searched Twitter, Facebook, and Instagram for the posts made closest to his current location, then introduced himself to the people behind them.

The resulting video is really interesting. Most of the people were amazed that a random stranger knew so much about them, and one man even felt threatened enough to say he would call the police. Yet all of the information Vale had ‘on’ this man had been put into the public domain by the man himself.

The dichotomy of people wanting to keep their personal lives personal whilst posting it all online shows that we are still on a learning curve when it comes to sharing via social media.

In the past, users may have been posting and inadvertently geotagging their location, but as Wikipedia notes, enough celebrities have been mobbed at a specific location after posting online, and enough eBay sellers have had stuff nicked whilst on holiday, to make even the most security-unconscious user turn off location tagging on their smartphone.
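For the slightly tech-savvy, it is also easy to check whether a photo is carrying a geotag before it goes online. Here is a minimal sketch in Python, assuming the Pillow imaging library is installed; the filename is purely illustrative.

```python
# Check whether an image file embeds GPS coordinates in its EXIF data.
from PIL import Image

GPS_INFO = 34853  # the standard EXIF tag id for the GPSInfo block

def has_geotag(path: str) -> bool:
    """Return True if the image at `path` contains GPS metadata."""
    exif = Image.open(path).getexif()
    return GPS_INFO in exif

# Illustrative filename only.
if has_geotag("holiday_snap.jpg"):
    print("Careful: this photo reveals where it was taken.")
```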

When I lectured in IT security, I would use Jose J Gonzalez’s example of teenagers not practising safe sex as an analogy for users compromising system security. Everyone wants to practise safe interaction, but when the moment arrives, circumstances, time pressures, and the thought that others are getting on down without worrying too much about the consequences all cause safe practice to slip.

The teenage sex comparison was useful when we were worried about users inadvertently breaching security systems. Nowadays the worry is more about users themselves becoming the target of a security breach. What is a useful analogy for that?

I have given many a lecture saying don’t share your address, phone number, date of birth, place of birth, mother’s maiden name, favourite pet, first job, and so on – all things that systems ask for when creating user accounts online. This information is often used to hack accounts and, in the worst case, to commit identity theft. But today, in the brave new world of social media, this advice seems quite quaint. Take a quick look about on Google+: how much of this info is revealed?

The problem with social media is that we are sharing and caring with our friends who know all this information already, so why not have it online? Facebook is always telling me that I won’t forget another birthday if I use the relevant app and let others know when I was born too. Great! It only gets a little weird when complete strangers come up to you in the street and wish you ‘Happy Birthday’.

We are human, we want to be heard, we want to bear witness, we want to share. I know. When my daughter was born with kidney failure and it was super difficult for a very long time, I kept a blog to explain things to friends and family, and to myself. One day she might not thank me for the overshare. But hopefully, she will acknowledge that I stopped well before I typed: ‘Today, J got her first bra.’
And before each post, I thought carefully about an older girl reading her history online. I vetoed some media coverage of her which to me was insensitive. My imagined sense of her comfort with what was shared mattered more to me that day than the help someone might have got from reading that article about her. Who knows, though? Growing up in a social media world, perhaps she won’t feel the same way about privacy as I do. I have blogged before that information is power, but it only becomes powerful when you wield it. You might ask: why would anyone wield it? How could they use certain information? If people know things about you, so what?

When I had breast cancer, a few of my friends said: ‘Oh Ruth, why don’t you keep a blog about breast cancer?’. But I didn’t want to share. I didn’t want anyone thinking about my breasts. I didn’t even want to think about my breasts. Even now, typing ‘my breasts’ makes me blush (my breasts, my breasts, my breasts). But at the same time, reading other people’s blogs on breast cancer helped me in so many ways. Their sharing was caring. Some of those people were so candid and funny, they brightened my dark days. Did they overshare? I don’t think so; they shared what I wasn’t willing to. That wasn’t oversharing – to me it was bravery.

The boundaries online are as fuzzy as they are in real life, except that, as I have blogged before, in real life we know exactly who our audience is. Online, it is hard to know to whom we speak, and harder still, unless we are at least a little bit tech-savvy, to be conscious of what exactly we are putting out there.

Psychological acceptability has traditionally accompanied system design, especially in IT security: good usability, feedback, system transparency, and a sense that users are responsible for what they do. On social media, those principles seem to be intentionally blurred.

In an article on www.national.ae from 2010, Mark Zuckerberg is described as ‘Dr Evil’ for encouraging the idea that privacy is an old-fashioned concept. It also mentions that Facebook’s privacy settings change all the time, so that users have a hard time keeping information private. In contrast, Zuckerberg’s quotes on thoughtcatalog.com make him sound completely naive and just idiotically ignorant of the need for user safety and security.

Knowing what to keep private can be a hard call, and one that changes from day to day. However, not empowering users to take personal responsibility for feeling safe and secure (needs that sit near the base of Maslow’s hierarchy) is irresponsible. Social media moguls have a duty to make this really easy for everyone, so that when users press that post button, they know what they have posted and who is reading it.

Until that happens, Jack Vale has definitely got me thinking about what I share on Facebook, and I have changed a few settings so that I feel more comfortable.

Sharing is caring, definitely. But, in the heat of the moment, a deep breath and a little bit of safety compliance never did anyone any harm.

Security and usability: Don’t let your users get you down

After my first year at university I spent the summer working in a delicatessen in Putney. One morning during my first week, while I was in the middle of carefully carving six slices of Parma ham for some lady’s dinner party, we were told to evacuate the building as security had been warned that there was a bomb. I dropped everything and ran for my life. We stood around in the car park until we got the all clear, and I arrived back at the counter to find the same woman ready to berate me for abandoning her dinner party plans.

After that there were on average three or four scares a week, and I became like the dinner party woman: nonchalant. Instead of leaving the building immediately, I would go upstairs to the locker room and get my purse – just in case we went to the pub. Sometimes people would shout up, “Bring my fags/jacket/bag down will you?” Bomb scares became part of my daily routine, so much so that they were no longer life-threatening traumatic events but welcome tea-breaks.

Humans are adaptable, which is why we have survived so long. However, this flexibility means that the security systems that specialists work hard to put into place are undermined by users’ perceptions of risk. As the old adage says, familiarity breeds contempt. As I did in the delicatessen, users get used to a certain level of risk and become blasé about warnings, until security and safety awareness procedures get ignored. Even the simplest routines can be forgotten. How many of us carry out weekly backups? Hands up if you check that your automated backups work. Most people don’t even think about backups until they lose an important document. The same goes for security systems. When there is a security breach, 80–90% of the time the human or organisational element is responsible, not the algorithms on which the system is built.
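As a nudge in the backup direction, here is a minimal sketch in Python of what ‘checking that your backups work’ can look like: compare a checksum of the original file with its backed-up copy. The file paths are illustrative assumptions.

```python
# Verify a backup by comparing SHA-256 checksums of original and copy.
import hashlib

def sha256(path: str) -> str:
    """Checksum a file in chunks so large files don't exhaust memory."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Illustrative paths: an important document and its automated backup.
original = "documents/thesis.docx"
backup = "/mnt/backup/documents/thesis.docx"

if sha256(original) == sha256(backup):
    print("Backup verified.")
else:
    print("Backup is stale or corrupt - fix it before you need it!")
```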

Users are not the enemy

Sometimes this leads managers to view users as the enemy, and they spend all their time and energy hating stupid users. By all means track their behaviour, collect data on their log-ins and log-outs, and check what they are doing. Information is power. But rather than drowning in data and hating everyone, use the information to learn about your user group and visualise their needs. Redesigning log-in procedures and reducing human error can go a long way towards reducing security breaches. Your users are not the enemy; they need to be supported and viewed as part of the system.
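To make that concrete, here is a hedged little sketch in Python that tallies failed log-ins per user from a plain-text log. The log file name and line format are assumptions; the point is that the same data you collect for policing can tell you which users are struggling with the log-in procedure.

```python
# Count failed log-in attempts per user to find who needs support.
from collections import Counter

failures = Counter()
with open("auth.log") as log:  # hypothetical log file
    for line in log:
        # Assumed line format: "2014-01-17 09:12:03 LOGIN_FAILED alice"
        parts = line.split()
        if len(parts) >= 4 and parts[2] == "LOGIN_FAILED":
            failures[parts[3]] += 1

# Users with many failures may need a redesigned log-in flow, not blame.
for user, count in failures.most_common(5):
    print(f"{user}: {count} failed log-ins")
```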

Psychological responsibility

Another step is to give users the tools and information necessary so that they can become part of the solution and not the weakest link. Since we first started interacting with machines, researchers have studied what makes us tick. Cognitive science in particular concentrates on what humans are thinking when they are presented with a computer. And humans have been practising cryptography – hiding and transmitting sensitive information – since time began.

The best way to get users not to violate the security built into systems is to encourage psychological acceptability, so that users learn to apply security procedures correctly. Psychological acceptability involves good usability, feedback, system transparency, and a sense that users are responsible for their actions.

Give users transparent interfaces and good feedback

Usability is the effectiveness, efficiency, and satisfaction with which people can carry out a given task using a computer. A system should be easy to use. Normally the graphical user interface (GUI), or simply the user interface, is responsible for the way users interpret what is going on and is the main factor influencing how users feel. Good GUIs should be transparent – familiar, or somehow intuitively comprehensible – and the internal workings of the system should match users’ mental models. If they can’t, then users need to be given appropriate guidance and feedback so that they can learn how the system works and what it does.

Good feedback and a transparent interface – one that makes it easy to understand how a system came to a specific conclusion – give users confidence in their computers, and they will trust what those computers say. Lose that trust, and users start ignoring warnings and overriding defaults.
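As a contrived illustration – the message texts below are invented – contrast a warning that hides how the system reached its conclusion with one that explains it and says what to do next:

```python
# Two ways to refuse the same action: opaque versus transparent feedback.
def warn_opaque() -> None:
    print("Error 0x80070005: access denied.")

def warn_transparent() -> None:
    print("This attachment was blocked because it came from an unknown "
          "sender and tried to run a script. You can open it in protected "
          "view, or mark the sender as trusted to allow it next time.")
```

Only the second kind of message lets users build a mental model they can trust.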

Allow users to accurately estimate risk

With good feedback and a sense of trust, users get a more accurate perception of the risks to security. And if the GUI is a good one – easy to use, with a clear, uncluttered design – users can more easily concentrate on the information they are being given.

Some researchers have suggested user profiling as a way of improving users’ estimation of risk. Target users are split into groups, and system responses are designed for a particular user profile. Each profile ideally captures an individual’s cognitive style, based on behaviour that has been tracked and analysed. If a given user’s behaviour changes and a security breach is anticipated, the user is assigned a new profile and the level of information security increases.
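As an illustration only – the features, thresholds, and profile contents below are all invented – such a scheme might look something like this in Python:

```python
# Toy risk profiling: escalate a user's security level when their
# behaviour drifts too far from the recorded profile.
from dataclasses import dataclass

@dataclass
class Profile:
    usual_hour: int        # typical log-in hour, 0-23
    usual_country: str
    security_level: str = "standard"

def reassess(profile: Profile, login_hour: int, country: str) -> Profile:
    """Assign a stricter profile if behaviour suggests a possible breach."""
    drift = 0
    if abs(login_hour - profile.usual_hour) > 6:
        drift += 1
    if country != profile.usual_country:
        drift += 2
    if drift >= 2:
        # Behaviour has changed enough to anticipate a breach:
        # a "high" profile might demand re-authentication, for example.
        profile.security_level = "high"
    return profile

alice = Profile(usual_hour=9, usual_country="UK")
alice = reassess(alice, login_hour=3, country="Elsewhere")
print(alice.security_level)  # -> "high": extra checks before access
```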

Avoid the erosion of security awareness

Jose J Gonzalez, a professor of system dynamics based in Norway, argues that the erosion of security awareness – users’ failure to follow safety standards – is comparable to the erosion of safe sex practices that we are seeing in teenagers across the globe. We all want to interact safely, but when the moment arrives, circumstances, time pressures, and lack of cooperation may cause users to deviate from safe practices.

When dealing with users whose compliance with security measures declines as perceived risk declines, Gonzalez recommends keeping users aware of the ever-present dangers. He suggests running refresher training courses during risk-free periods to remind users and managers to carry out safety procedures. In this way users avoid behaviour built on superstitious learning, which is when they learn how systems work in an almost cross-your-fingers way instead of viewing their interactions as a logical process with a finite number of steps.

It is also good to tell users when a security breach has been averted. That way, users know that although they are protected, the systems they use are not invincible and are constantly under attack. Reminding users that they are part of the system, that they are key to the smooth running of the place, and that they need to follow safety standards is just good practice.

Make your users responsible so that they don’t become the ones who inadvertently let your system down.