Privacy

Privacy is shorthand for breathing room to engage in the process of … self-development. – Julie E. Cohen

The writer Muriel Spark kept her own archives. Every bus ticket, theatre ticket, diary, shopping list and cheque stub was stored away in boxes for years, until she sold the lot to the National Library of Scotland.

When I first read about Spark’s archive, I loved her chutzpah. But in Appointment in Arezzo, Alan Taylor explains that the archive was not about keeping one eye on posterity. Spark kept it so that she had irrefutable proof of who she was and of the experiences which had shaped her. She could use that archive to know the truth about herself and her past, especially when people she had known and loved wrote about her unfavourably.

Nowadays we all have a similar archive, online. It boggles my mind that Google has recorded every journey I have ever made when using its maps. Elsewhere I am in databases at the workplace, in pension plans, at the doctor’s, the dentist’s, the TfL Oyster card system, and so on. My offline archive is just a mountain of old diaries.

Personal information, like the fields found in a database, wasn’t really collected until after WWII, and even then it didn’t become a commodity until much later, when businesses began to collect it to sell us things. Before that, there wasn’t much anyone in your community, say your village, didn’t know about you. Where I grew up, everyone knew everything about me. But there is a massive difference between the facts my neighbours knew about me and the journals I have kept.

It is the same today. I mean, I don’t care if you know where I go, or what I buy, or how old I am. I don’t publicise these things, and definitely not online, but even so, if you asked me I would probably tell you. However, if you were to come round to my house and read my diaries I would be mortified. They are private.

Privacy is a social construct. Historically people lived closely together, so there was no privacy. It was only in the US in 1890 that it came to mean the right to be let alone, as defined in Samuel D. Warren and Louis Brandeis’ article The Right to Privacy.

UK and EU law is more piecemeal: we have privacy of information and the right to respect for our private and family life, but nothing as clear as the US torts.

There might be lots of personal information about us in databases, or in other people’s heads about where we fit demographically, but that is not the same as our hopes, our dreams, or our irritating habits. That is why, when someone shares that sort of information about us, or reads it in a diary, without our permission, especially if it is something we wouldn’t want the whole village, or indeed the whole Internet, to know, it can feel like a horrible betrayal and a violation of our privacy.

That said, our everyday lives are a constant trade-off between privacy and intimacy, between sociability and creating relationships. Privacy is not an absolute state, and it can be doubly difficult to figure out where we stand when we are the ones who offered up our private space in the first place, which is what we do when we put pictures of our houses, or our lunch, or ourselves, online.

Knowing yourself in the face of others

Knowing what to keep private can be a hard call, and it can change from day to day. With the people we chat to online, we tend to fall into immediate trust and share more readily, because trusting and sharing is what builds intimacy. With little to go on about a virtual someone else, we may violate our own privacy to drum up a sense of intimacy and trust, and if the other person turns out not to be what they said they were, we may feel a bit foolish. That’s if we are lucky!

We all wear masks, and the time comes when we cannot remove them without removing some of our own skin. – André Berthiaume

But it is not our fault. Laurence Scott says in The Four-Dimensional Human that the modern message is that we are fundamentally isolated from each other, and that when we go online we have the abstract promise of going home; it has become part of the rhythms of almost every waking hour to look for a sign or word from elsewhere.

In other words, connection gives our lives meaning, and we will readily trade some privacy for the promise of not feeling socially excluded. And, if Scott is to be believed, technology has trained him to be permanently online, hoping for some connection.

The hoped-for self

And, if that is true, it is no wonder that Scott remains frustrated that people do not share the things which he feels really need to be shared, and instead curate their lives carefully to make themselves look like they are living a life well lived. In his words, we gentrify our web presence; he describes social media as a bit of a stage performance.

But how else are we to behave? Being honest and vulnerable, online or off, takes courage, so if the person, or indeed the whole gang of people, with whom you are sharing doesn’t understand or empathise, and in the worst case lets you know it, you can feel crushed and ganged up on. Only with a strong sense of self can you recover.

Privacy provides you with a space in which to discover that sense of self, but if you are never offline then how can you cultivate one? You cannot do it online if you are relying on randomers to satisfy your painful yearnings for connection.

I read something today saying that the optimum number of friends on Facebook is 300. Any more and you look like you have no friends. Elsewhere, like Twitter or LinkedIn, lots of followers makes you look fabulous. Connectedness is a commodity and we work hard to keep our numbers up. We cannot win. Emotional Intelligence author Daniel Goleman has said that we are under siege in this pervasive digital culture, and there are a lot of rules made up by social media experts for us to follow if we want to manage and succeed online. We need to be authentic, unless of course we are not very nice, in which case we have to hide that and pretend to be nice, authentic, and the same as everyone else.

We like rules to make sense of things, and we have long been told by the media how we should live our lives; with social media there are just more ways to be told how to conform.

In Cave in the Snow, author Vicki MacKenzie describes how the Buddhist nun Tenzin Palmo moved into a cave high in the Himalayas so that she could meditate in peace:

She could begin to unravel the secrets of the inner world – the world that was said to contain the vastness and wonder of the entire universe.

More and more I am beginning to think this aptly describes privacy. We could all do with a bit of solitude to build our emotional and digital resilience. The Internet is fabulous: it compresses time and space, which is great for maintaining friendships, keeping in touch with loved ones, running businesses, and so on. But if all we do is constantly look online to find meaning, connection and validation, then we will never give ourselves the time and space to give those things to ourselves.

We don’t have to go mad like Tenzin Palmo and sit in a cave for 12 years, or emulate Christopher Knight, the man who lived alone in the woods for 27 years and experienced deep transcendental moments in nature. We don’t even need to delete our social media accounts, as Jaron Lanier warns us we must. But we do need to protect our inner world, our privacy, so that even if we never unravel the secrets of the entire universe, or transcend ourselves watching the fog lift at sunrise, we know enough to love and respect our own dear selves. Then we are able to connect with love and respect to our fellow human beings, by transcending the painful yearning we sometimes feel when our needs are not being met.

Christopher Knight’s observation about the mobile phone is fascinating: why, he wonders, would a person take pleasure in using a telephone as a telegraph machine? “We’re going backwards,” he says.

Privacy is the space in which we come on home to ourselves. There’s no need to camp out online in the hope of making a home in a stranger’s photo album.

Human-Computer Interaction Conclusions: Dialogue, Conversation, Symbiosis (6)

[ 1) Introduction, 2) Dialogue or Conversation, 3) User or Used, 4) Codependency or Collaboration, 5) Productive or Experiential, 6) Conclusions]

I love the theory that our brains, like computers, use binary to reason; when I was an undergraduate I enjoyed watching NAND and NOR gates change state.
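For anyone who never sat through those labs: a NAND gate outputs false only when both inputs are true, and a NOR gate outputs true only when both inputs are false. Here is a quick toy sketch, purely for illustration, of the change of state as the inputs flip:

```python
# Toy NAND and NOR gates: the output changes state as the inputs flip.
def nand(a, b):
    return not (a and b)

def nor(a, b):
    return not (a or b)

print(" a b | NAND NOR")
for a in (False, True):
    for b in (False, True):
        print(f" {int(a)} {int(b)} |   {int(nand(a, b))}    {int(nor(a, b))}")
```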

As humans, we are looking for a change of state. It is how we make sense of the world: as in semiotics, we divide the world into opposites, good and bad, light and dark, day and night. Then we group information together into archetypes and symbols to imbue it with meaning, so that we can recognise things more quickly.

According to the binary-brain theory, our neurons do this too. They form little communities that work together to recognise food and not-food, shelter and not-shelter, friends and foes: the things which preoccupy us all and are classed as deficiency needs in Maslow’s Hierarchy of Needs.

Over on ResearchGate, there was a discussion about moving beyond binary which used this example:

Vegetarian diet vs Free Range Animals vs Battery Farmed Meat

If it were just a vegetarian diet versus battery farming, it would be binary and an easy choice, but add in free range and we see the complexities of life, a sliding continuum from left to right. We know life is complex, but in decision making it is easier to have just two options; we are cognitive misers and hate using up all our brainpower. We want to see a change of state or a decision made. Binary also reflects the natural rhythms of life: the ebb and flow of the tide, the growing and dying of the seasons. It’s not just our neurons, it’s our whole bodies which reflect the universe, so patterns in nature resonate with us.

I began this series with an end in mind. As human-computer interaction (HCI) is an ever expanding subject, I wanted to pin it down and answer this question: What am I thinking these days when I think about human-computer interaction?

For me, HCI is all about the complexities of the interaction between a human and a computer, which we try to simplify in order to make it a self-service thing that everyone can use. But with the growth of the Internet, HCI has become less about creating a fulfilling symbiosis between human and computer, and more about economics. Throughout history, economics has been the driving force behind technological progress, often at the cost of human suffering. It is often in the arts where we find a social conscience.

Originally, though, the WWW was conceived by Tim Berners-Lee to connect one computer to another so everyone could communicate. That idea has been replaced by computers connecting through intermediaries owned by large companies, with investors looking to make a profit. The large companies not only define how we should connect and what our experience should be, they then take all our data. And it is not just social media companies; it is government and other institutions who make our data available online without asking us first. They are all in the process of redefining what privacy and liberty mean, because we don’t get a choice.

I have for some time now gone about saying that we live in an ever-changing digital landscape, but it’s not really changing. We live the same lives; we are just finding different ways to achieve things, without necessarily reflecting on whether it is progress or not. Economics is redefining how we work.

And whilst people talk about community and tribes online, the more that services get shifted online, the more communities get destroyed. For example, by putting all post office services online, the government destroyed the post office as a local hub for community, and yet at the time it seemed like a good thing: more ways to do things. But by forcing people to do something online you introduce social exclusion. Basically, either you have a computer or you miss out. If you don’t join in, you are excluded, which taps into so many human emotions that we will give anything away to avoid feeling lonely and shunned. So any psychological responsibility we have towards technology is eroded, especially as many online systems are binary: give me this data or you cannot proceed.

Economic-driven progress destroys things to make new things. One step forward, two steps back. Mainly it destroys context and context is necessary in our communication especially via technology.

Computers lack context, and if we don’t give humans a way to add context then we are lost. We lose meaning and we lose the ability to make informed decisions, and this is the same whether it is a computer or a human making them. Humans absorb context naturally; robots need to ask. That is the only way to achieve symbiosis: by making computers reliant on humans, not the other way round.

And not everything has to go online. Some things, like me and my new boiler, don’t need to be online. It is just a waste of WiFi.

VR pioneer Jaron Lanier said in the FT’s Out to Lunch section this weekend that social media causes cognitive confusion because it decontextualises, i.e., it loses context: all communication is chopped up into algorithm-friendly shreds and loses its meaning.

Lanier believes in the data-as-labour movement, whereby huge companies would have to pay for the data they take from people. I guess if a system were transparent enough for users to see how and where their data goes, they might choose more carefully what to share, especially if they could see how it is taken out of context and used willy-nilly. I have blogged in the past about how people get used online and feel powerless.

So, way back when, I wrote in my post Alone Together: Is social media changing us? that social media reflects us rather than taking us places we don’t want to go. I would now add that it is economics which changes us: progress driven by economics and the trade-offs humans think it is ok for other humans to make along the way. We are often seduced by cold hard cash, as it does seem to be the answer to most of our deficiency needs. It is not social media per se, nor the Internet, which is taking us places we don’t want to go; it is the trade-offs of economics, and how we lose sight of the other humans around us when we feel scarcity.

So, since we work in binary, let’s think on this human v technology conundrum. Instead of viewing it as human v technology, what about human v economics? Someone is making decisions on how best to support humans with technology but each time this is eroded by the bottom line. What about humans v scarcity?

Lanier said in his interview, “I miss the future”, talking about the one in which he thought he would be connected with others through shared imagination, which is what we used to do with stories and with the arts. Funny, I am starting to miss it too. As an aside, I have taken off my Fitbit; I am tired of everything it is taking from me. It is still possible online to connect imaginatively, but it is getting more and more difficult when every last space is prescribed and plastered with adverts, as people feel that they must be making money.

We need to find a way to get back to a technological shared imagination which allows us to design what’s best for all humanity, so that any economic gain lines up with social advancement for all, not just for the ones making a profit.

Codependency or Collaboration? Human-Computer Interaction: Dialogue, Conversation, Symbiosis (4)

[ 1) Introduction, 2) Dialogue or Conversation, 3) User or Used, 4) Codependency or Collaboration, 5) Productive or Experiential, 6) Conclusions]

The fig tree is pollinated only by the insect Blastophaga grossorun. The larva of the insect lives in the ovary of the fig tree, and there it gets its food. The tree and the insect are thus heavily interdependent: the tree cannot reproduce without the insect; the insect cannot eat without the tree; together, they constitute not only a viable but a productive and thriving partnership. This cooperative “living together in intimate association, or even close union, of two dissimilar organisms” is called symbiosis. (Licklider, 1960).

The above quotation is from JCR Licklider’s seminal paper Man-Computer Symbiosis (1960). I had to grit my teeth as he kept going on about man and men and computers. To distract myself from the onset of a patriarchal rage, I decided that I needed to know the precise definition of seminal. Google gave me seed of semen.

The original meaning of the word computer was a person who did calculations, like the women in 1731 who ran households and were advised to be clever with their husbands’ wages. Whilst I am wondering who is the tree and who the insect in this scenario of man-computer symbiosis, I am thinking that it really isn’t a good idea to aspire to have computers and people functioning as natural living organisms which cannot survive without each other. I love technology, I really do, but the idea that I cannot function without it is, at the very least, disturbing.

We got a new boiler the other day, complete with a smart thermostat. The thermostat uses an app, now installed on everyone’s phone with location sharing switched on, to see whether we are home or not, and it corrects the temperature accordingly. It will also build up patterns of our daily routine: what time we get up, have a shower, take a bath, and so on, so that it can be ready for us. But only if we are home and have our phones charged. Thankfully there are buttons on the thermostat if the WiFi goes down, apparently the earlier versions didn’t have them, and we can also log into a web browser to make changes to its routine.
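As far as I can tell, the presence-sensing part boils down to a simple rule: if any registered phone is within some radius of home, hold the comfort temperature; otherwise drop to an eco setpoint. Here is a minimal sketch of my guess at that logic, with made-up coordinates, radius and setpoints, emphatically not the vendor’s actual code:

```python
from math import radians, sin, cos, asin, sqrt

HOME = (51.5074, -0.1278)        # hypothetical home coordinates (lat, lon)
HOME_RADIUS_KM = 0.5             # "at home" if any phone is within this radius
COMFORT_C, ECO_C = 20.0, 16.0    # made-up setpoints

def distance_km(a, b):
    """Great-circle (haversine) distance between two (lat, lon) points."""
    lat1, lon1, lat2, lon2 = map(radians, (*a, *b))
    h = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371 * asin(sqrt(h))

def target_temperature(phone_locations):
    """Hold the comfort setpoint if any phone looks at home, otherwise drop to eco."""
    anyone_home = any(distance_km(loc, HOME) <= HOME_RADIUS_KM for loc in phone_locations)
    return COMFORT_C if anyone_home else ECO_C

# One phone near home, one away: the thermostat stays at the comfort setpoint.
print(target_temperature([(51.5080, -0.1280), (48.8566, 2.3522)]))   # -> 20.0
```

Which is to say: the “smart” bit is mostly where your phone happens to be.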

Theoretically I should be thrilled. It is better than my Fitbit asking when my period is due so that it can tell me when my period is due – and since that blog post, I have been given Google adverts for different sanitary products, I told you so! – or Google Nest, which needs you to tell it how you want your house to run, rather like my Fitbit. And I do like data and patterns, so I am interested in what it collects, what it does, and whether it is clever enough to respond to a cold snap or whatever.

But old habits die hard: so far we have got out an old room thermometer to check whether the smart thermostat is measuring the temperature correctly, as it seemed a bit high. It was right. (Just checked again: the thermostat says 25.8°C, the room thermometer says 22.8°C, quite a big difference.) I guess I have just worked in computing too long and I have technology trust issues. If the Sonos is anything to go by, when the WiFi goes down we are completely without music, well, digital music at any rate. Last time, we got out the guitar and turned into the von Trapps, I am not even joking. The alternative would be to keep other music formats and a player, but that idea doesn’t do a lot for me; I am more of a donor than a collector. I hate stuff filling up my space.

When I read Licklider, I am reminded of ubiquitous computing rather than any other technology. I know I would rather my tech be ubiquitous than make me feel codependent on my mobile phone. All these apps for my heating, my music, my blogging, my Bikram, my electricity, my gas: my phone is slowly but surely turning into the Precious and I feel like Gollum. I worry about my phone and can’t stand anyone touching it. Ubicomp had the idea of one person and lots of computers interacting, if I were doing it, with the physiology of a person and making changes accordingly, rather than with the location of the person’s mobile phone, which strikes me as being a bit simplistic and not smart at all. (Just ’cos you call it smart doesn’t make it smart.) And then collecting and sharing over the Internet, which gives us all the symptoms in the codependency link above: my phone does make me anxious and I try to please it all the time. I am forever staring at it and hoping it is doing ok, and I don’t like that feeling.

I have spent a lot of time writing about what social media can do for us, and how it can make us feel connected. But in this scenario, when it is not other people but an app, or sometimes other people via an app, if we are not in control we become disconnected from ourselves, and then we become addicted to the app or to the person. We give away our power and our data. The problem with these apps is that we have no control and we are barely a factor in the design. Our mobile phone is the factor in the design, and it trains us to be codependent, addicted, anxious. Warmth, after all, is the bottom rung of Maslow’s hierarchy of needs, and I am staring at my bloody phone hoping for the best. This is not symbiosis, this is codependency.

But back to Licklider and his seed-of-semen paper: a lot of what he was trying to imagine was probably the cooperation or collaboration of the kind I have blogged about before, a space of solutions to explore, with the computer doing the grunt work and the human doing the thinking. And I believe it is possible to do that, even with apps.

In a post, David Scott Brown looks at what Licklider suggested in Man-Computer Symbiosis and what is possible today: shared processing, memory, better input-output, different languages, and so on. I would add that in fields where the language is precise and limited, for example trading – think buy, sell, high, low, often over the phone – applications of AI are useful and will be able to do all sorts. All the data and conversations can be recorded, mined, and used. It is extremely exciting, and memory and storage are, it seems, infinite, which would make Licklider’s mind boggle.
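To make that concrete: with a vocabulary that tiny, even a crude parser can turn a transcribed phone call into structured orders, which is roughly where mining such conversations starts. A toy sketch with an invented message format, nothing to do with any real trading system:

```python
import re

# Toy parser for a deliberately tiny trading vocabulary: side, quantity, symbol, price.
ORDER = re.compile(r"\b(buy|sell)\s+(\d+)\s+([A-Z]{1,5})\s+at\s+(\d+(?:\.\d+)?)\b",
                   re.IGNORECASE)

def parse_orders(transcript):
    """Pull structured orders out of free text; everything else is ignored."""
    return [
        {"side": side.lower(), "quantity": int(qty), "symbol": sym.upper(), "price": float(px)}
        for side, qty, sym, px in ORDER.findall(transcript)
    ]

print(parse_orders("Morning. Buy 100 ACME at 12.50, then sell 40 XYZ at 13 if it spikes."))
# [{'side': 'buy', 'quantity': 100, 'symbol': 'ACME', 'price': 12.5},
#  {'side': 'sell', 'quantity': 40, 'symbol': 'XYZ', 'price': 13.0}]
```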

As an undergraduate I had to learn sparse matrices; memory was something not to waste, it was expensive. In his paper, Licklider says:

The first thing to face is that we shall not store all the technical and scientific papers in computer memory. We may store the parts that can be summarized most succinctly-the quantitative parts and the reference citations-but not the whole. Books are among the most beautifully engineered, and human-engineered, components in existence, and they will continue to be functionally important within the context of man-computer symbiosis.

Imagine his face looking at the data centres all over the world storing memes and cat pictures and abusive tweets, repeatedly, without checking whether they have already been saved, without indexing, without any sort of check on redundancy: an endless stream of drivel. I am sure even Tim Berners-Lee wonders sometimes about the monster he has created.
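That frugality was the whole point of those sparse matrices: when memory is expensive, you store only the non-zero entries and their coordinates, never the whole grid. A minimal dictionary-of-keys sketch, just to illustrate the idea rather than reproduce what we actually used:

```python
# Dictionary-of-keys sparse matrix: store only non-zero entries as (row, col) -> value.
class SparseMatrix:
    def __init__(self, rows, cols):
        self.rows, self.cols = rows, cols
        self.data = {}                         # only non-zeros live here

    def set(self, r, c, value):
        if value == 0:
            self.data.pop((r, c), None)        # zeros take up no space at all
        else:
            self.data[(r, c)] = value

    def get(self, r, c):
        return self.data.get((r, c), 0.0)

m = SparseMatrix(1_000_000, 1_000_000)         # a trillion cells, but...
m.set(3, 7, 2.5)
m.set(999_999, 0, -1.0)
print(m.get(3, 7), m.get(0, 0), len(m.data))   # 2.5 0.0 2 -- only two entries stored
```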

And books take so long to write. Beautifully engineered they are; we lose ourselves in them and learn from them, and they take us out of ourselves in the same way our phones do. But we are addicted to our phones and our social media, to that little hit of dopamine which social media gives us and which our books don’t always deliver. Books are work, and with them we are passive, whereas on our phones we feel active; and because our phones are controlling our homes and training us to be codependent and anxious and powerless, it is a vicious circle of more phones, fewer books.

In these times when I look at where we are going and I am not feeling good about it, like Licklider I turn to nature, as the best designs are in nature. I also look to the wisdom of yoga. So this is what I have:

When a bee gathers pollen, it also gathers a small amount of poison along with the pollen, both get transformed into nectar in the hive. The yogis say that when we learn to take negative situations and turn them into wisdom, it means we are progressing, and becoming skilful agents of the positive.

So, I worry about what happens when my whole life is literally on my phone, when the world’s nature reserves are full of data centres which contain every last terrible expression of humanity, when we are so disconnected from the nature around us that the oceans are filled with plastic, and when many of us are in offices far away from the natural world, staring into our bloody phones, often in order to create technology. Surely we can create technology to change where we are. If we want a symbiosis, we must make it a human-planet one, not a human-computer one. I don’t care what my Fitbit says, I don’t want any technology in my ovaries, thank you very much.

So, with that thought and the amazing technology I have at my fingertips today, I want to share an animated gif of my cat drinking from his water fountain. Licklider said:

Those years should be intellectually the most creative and exciting in the history of mankind

And, they are. I remain hopeful that we can collect enough data on ourselves to become self-aware enough to transform it into wisdom and create something better for all humanity and our planet. In the meantime I will be enjoying watching my cat have a drink of water, and I am sure that Licklider in 1960 would have been just as amazed to see my cat drinking from the fountain, on a phone more powerful technologically and psychologically than he could ever have imagined. It remains to be seen whether this is progress or not.

[Part 5]

User or Used? Human-Computer Interaction: Dialogue, Conversation, Symbiosis (3)

If you torture the data enough, it will confess to anything.
– Darrell Huff, How to Lie With Statistics (1954).

[ 1) Introduction, 2) Dialogue or Conversation, 3) User or Used, 4) Codependency or Collaboration, 5) Productive or Experiential, 6) Conclusions]

In the last blog post I wrote about human dialogue with a computer versus the conversation between humans via a computer. The dialogue with a computer is heavily designed, whereas human conversation, especially via social media, has come about serendipitously. For example, Twitter grew out of texting, which was invented by engineers as a tool to test mobile phones.

This is an example of what I call serendipitous design, which works by users employing systems to do whatever they want them to do – the no-function-in-structure principle – and then designers finding ways to support them. In contrast, the way to create systems which support users to do their job better uses the cardinal rule: know your user, with all the various tools and techniques UX designers have borrowed from anthropology. You design with your user in mind, you manage their expectations, and you keep them at the front of your mind as a major factor in the design, so that the system has specific goals.

But however hard you try, with each new system or piece of software or form of communication, you often end up changing how people work, and the dialogue becomes less about a field of possibilities, with insight, intuition, and creativity, and more about getting people to do extra stuff on top of what they already do. And because people are keen to get in on whatever new thing is happening, they buy into what I call the myth of progress and adopt new ways of working.

This begs the question: are we creating systems for users or the used? Today, I was chatting to a road sweeper. He told me that last year he was driving a lorry, but the council’s initiative to reduce carbon emissions means that 80 lorries were taken off the road and the drivers are now out sweeping the streets on foot. He showed me his council-issue mobile phone, which tracks his every move and presumably reports his whereabouts back to the office at all times. Not that he needs it: if he sits on a wall too long, local residents will phone the council to complain that he is sitting down and not working hard enough.

Tracking is not new: smart badges, invented at Xerox PARC, were trialled in the 1990s in the early days of ubiquitous computing (ubicomp). The idea was to move computers off the desktop and embed them into our infrastructure, so that we interact with them without our knowledge, freeing the user from the need to learn complex systems. In the badges’ case, everyone could be located by everyone else in the building, rather like the Harry Potter Marauder’s Map. However, it smacks rather too much of surveillance, especially if your boss decides you are spending too long in the toilet or by the water cooler and that your behaviour needs to change. The road sweeper, instead of a badge, has a mobile phone and people who spy on him and grass him up, in part because they lack context and don’t know that he is entitled to a 20-minute break.

But it’s not just as part of a job: we have Google Maps recording every journey we make. And yet ubicomp was less about having a mobile device or about surveillance; it was the forerunner to the Internet of Things, the ambient life, which is there to make things easier, so the fridge talks to your online shopping to say that you need more milk. But what if I go vegan? Do I need to inform my fridge first? Must I really run all my important ideas past my fridge? This is not the symbiotic relationship psychologist and mathematician J.C.R. Licklider had in mind when he had his vision of man-computer symbiosis.

I was speaking to someone the other day who monitors their partner’s whereabouts. They think it’s useful to see where the partner is at any given time and to check that the partner is where they said they would be. No biggie, just useful. I mentioned it to another person, who said that they had heard of several people doing the same. I wonder why I am so horrified when other people just think it’s practical.

Insidious or practical? I feel we are manipulated into patterns of behaviour which maintain the status quo.

Last week, I woke up and checked my Fitbit to see how I had slept, which is slightly worrying in itself – I never needed anything to tell me how I slept before – and there was a new box in there: Female Health. I clicked on it. It asked me about birth control, when my next period is due, how long it lasts and so on. Intrigued, I entered the requested data. The resulting box said: Your period is due in eight days. Really? I mean, really? It was wrong, even though I had tinkered with the settings. Then it had a countdown: Your period will last four more days, three more days… etc. Wrong again. And now it is saying: Four days to most fertile days. This is so irritating. It feels like Argos, you know, how the system and the reality of you getting something you’ve ordered never quite match up. I know that together me and my Fitbit can build up data patterns. Will they be insightful? Time will tell. The bit which really concerns me is that it said it wouldn’t share this information with anyone, okay… but then it added that I couldn’t share this information either. What? I am guessing that it wants me to feel safe and secure. But what if I wanted to share it? What does this mean? Menstrual cycles are still taboo? I can share my steps but not my periods? My husband and I laughed about the idea of a Fitbit flashing up a Super Fertile, proceed with caution message when out on date night.
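My guess, and it is only a guess, is that underneath this is little more than date arithmetic on the numbers you typed in, which is exactly why it drifts from reality so quickly. A hypothetical sketch of that naive calculation, emphatically not Fitbit’s actual model:

```python
from datetime import date, timedelta
from typing import Optional

def naive_cycle_prediction(last_period_start: date, cycle_length_days: int = 28,
                           period_length_days: int = 4, today: Optional[date] = None):
    """Predict the next period purely from user-entered averages: no sensing involved."""
    today = today or date.today()
    next_start = last_period_start + timedelta(days=cycle_length_days)
    while next_start < today:                        # roll forward to the next future cycle
        next_start += timedelta(days=cycle_length_days)
    return {
        "days_until_next_period": (next_start - today).days,
        "predicted_period_end": next_start + timedelta(days=period_length_days),
        "most_fertile_start": next_start - timedelta(days=14),   # crude mid-cycle guess
    }

# With a guessed start date and a default 28-day average, everything downstream is wrong
# as soon as the real cycle varies, which is exactly what I found.
print(naive_cycle_prediction(date(2018, 3, 1), today=date(2018, 3, 20)))
```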

But it’s not just me and my Fitbit in a symbiotic relationship, is it? Someone is collecting and collating all that data. What are they going to do with that information, prying into me and my Fitbit’s symbiotic space? It rather feels like someone is going to start advertising in there, offering birth control alternatives and sanitary protection improvements. It feels invasive, and yet I signed up to it, me, the person who thinks a lot about technology and privacy and data and oversharing. And even now, as I sit here and think about my mixed feelings about my Fitbit, the idea of wearing something on my arm which only tells me the time, and not my heart rate or the number of steps I am doing, feels a bit old-fashioned. I myself am a victim of the myth of progress. I am user and used. Confession: I regularly lie in bed pretending to be asleep to see if I can fool my Fitbit. It’s changing my behaviour all the time. I never used to lie in bed pretending to be asleep.

Back in 2006, I watched Housewife, 49; it was so compelling that I bought the books. Nella Last was a housewife in Barrow-in-Furness who kept a diary, along with 450 other people, during and after the war. It was part of the Mass Observation project, set up by an anthropologist, a poet, and a filmmaker, which sounds rather like the maker culture of HCI today. They didn’t believe the newspapers’ reporting of the abdication and marriage of King Edward VIII, so they went about collecting conversations, diary entries, and observations. Rather like today, where we have social media with endless conversation, diary entries, and observations. The newspapers are scrambling to keep up and curate other people’s tweets, because they have traditionally been the only ones who shape our society through propaganda and mass media. Now we have citizens all over the world speaking out their version. We don’t need to wait for the newspapers.

We are living through a mass observation project of our own, a great enormous social experiment, and it is a question worth asking: user or used? Who is leading this? And what is their goal? Then we have the big companies, like Google, collecting all our data. We all know the deal: we give them our data, they give us free platforms and backups and archives. However, it doesn’t necessarily mean that they are right about the results of their research on our data, or that they have the right to use every last piece of information, even if you give it freely, because there is a blurring of public and private information about me and my steps and periods and birth control.

Anthropologist Agustín Fuentes has written a thoughtful article about the misuse of terms such as biology in the Google manifesto and, consequently, the sweeping generalisations to which it comes. Fuentes says we have no way of knowing what happened before we collected data, and that even now, as we collect data, we have to maintain our integrity and interpret it correctly by using terms and definitions accurately. Otherwise, we come to think that data tells the truth, and stereotypes and biases and prejudices are maintained. I love the quote:

If you torture the data enough, it will confess to anything.

Information is power. Hopefully, though, there are enough anthropologists and system designers around who can stop the people who own the technology from telling us what to think by claiming insights into our lives whilst peddling old ideas. We need to pay attention to truth and transparency before we trust, so that we can have more open dialogue in the true sense of the word – an exploration of a field of possibilities – leading to real and effective change for everyone.

Let us all be users not the used.

[Part 4]

Human-computer interaction, cyberpsychology and core disciplines

A heat map of the multidisciplinary field of HCI © Alan Dix

I first taught human-computer interaction (HCI) in 2001. I taught it from the viewpoint of software engineering. Then, when I taught it again, I taught it from a design point of view, which was a bit trickier, as I didn’t want to trawl through a load of general design principles which didn’t boil down to a practical set of guidelines for graphical user interface or web design. That said, I wrote a whole generic set of design principles here: Designing Design, borrowing Herb Simon’s great title, The Sciences of the Artificial. Then I revised my HCI course again and taught it from a practical set of tasks, so that my students went away with a specific skill set. I blogged about it in a revised, applied-just-to-web-design version in the blog series Web Design: The Science of Communication.

Last year, I attended an HCI open day, Bootstrap UX. The day in itself was great and I enjoyed hearing some new research ideas, until we got to one of the speakers, who gave a presentation on web design. I think he did, anyway; it’s hard to say really, as all his examples came from architecture.

I have blogged about this unsatisfactory approach before. By all means use any metaphor you like, but if you cannot relate it back to practicalities then ultimately all you are giving us is a pretty talk or a bad interview question.

You have to put concise constraints around a given design problem and relate it back to the job that people do and have come to learn about. Waffling on about Bucky Fuller (his words – not mine) with some random quotes on nice pictures is not teaching us anything. We have a billion memes online to choose from. All you are doing is giving HCI a bad name and making it sound like marketing. Indeed, cyberpsychologist Mary Aiken, in her book The Cyber Effect, seems to think that HCI is just insidious marketing. Anyone might have been forgiven for making the same mistake after listening to the web designer’s empty talk on ersatz architecture.

Cyberpsychology is a growing and interesting field, but if it is populated by people like Aiken who don’t understand what HCI is, nor how artificial intelligence (AI) works, then it is no surprise that The Cyber Effect reads like the Daily Mail (I will blog about the book in more detail at a later date, as there’s some useful stuff in there, but too many errors). Aiken quotes Sherry Turkle’s book Alone Together, which I have blogged about here, and it makes me a little bit dubious about cyberpsychology. I am still waiting for the book written by the neuroscientist with lots of brain-scan pictures to tell me exactly how our brains are being changed by the Internet.

Cyberpsychology is the study of the psychological ramifications of cyborgs, AI, and virtual reality, and I was like: wow, this is great, and rushed straight down to the library to get the books on it, to see what was new and what I might not know. However, I was disappointed, because if the people who are leading the research anthropomorphise computers and theorise about metaphors for the Internet instead of the Internet itself, then it seems the end result will be skewed.

We are all cyberpsychologists and social psychologists now, baby. It’s what we do. We make up stories to explain how the world works. It doesn’t mean to say that the stories are accurate. We need hard facts, not Daily Mail hysteria (Aiken was very proud to say she made it onto the front page of the Daily Mail with some of her comments). However, the research I have read about our behaviour online says it’s too early to tell. It’s just too early to say how we are being affected, and as someone who has been online since 1995, I only feel enhanced by the connections the WWW has to offer me. Don’t get me wrong, it hasn’t been all marvellous; it’s been like the rest of life, some fabulous connections, some not so.

I used to lecture psychology students alongside the software engineering students when I taught HCI in 2004 at Westminster University, and they were excited when I covered cognitive science, as it was familiar to them. All the cognitive science tricks make it easy to involve everyone in the lectures and make the lectures fun, but when I made them sit in front of a computer and design and code up software as part of their assessment, they didn’t want to do it. They didn’t see the point.

This is the point: if you do not know how something works, how can you possibly talk about it without resorting to confabulation and metaphor? How do you know what is and what is not possible? I may be able to drive a car, but I am not a mechanic; I would not give advice to anyone about their car, nor write a book on how a car works. And if I did, I would not just think about the car as a black box: I would have to put my head under the bonnet, otherwise I would sound like I didn’t know what I was talking about. At least I drive a car, and use a car; that is something.

If you don’t use social media, and you just study people using it, what is that then? Theory and practice are two different things. I am not saying that theory is not important, it is, but you need to support your theory; you need some experience with which to evaluate it. Practice is where it’s at. No one has ever said: theory makes perfect. Yep, I’ve never seen that on a meme. You get a different perspective, as Jack Nicholson says to his doctor, Keanu Reeves, in Something’s Gotta Give: Hey! We’re not all doctors, baby. Reeves has seen things Nicholson hasn’t, and Nicholson is savvy enough to know it.

So, if you don’t know the theory, you don’t engage in the practice, and you haven’t any empirical data yourself, you are giving us conjecture, fiction, a story. Reading the Wikipedia page on cyberpsychology, I see that it is full of suggested theories, like the one about how Facebook causes depression. There are no constraints around the research. Were these people depressed before going on Facebook? I need more rigour. Aiken’s book is the same, which is weird, since she has a lot of references; they just don’t add up to a whole theory. I have blogged before about how I was fascinated that some sociologists perceived software as masculine.

In the same series I blogged about women as objects online, with the main point being that social media reflects our society, and that we have a chance with technology to impact society in good ways. Aiken takes the opposite tack and says that technology encourages and propagates deviant sexual practices (her words), some of which I hadn’t heard of, but for me this begs the question: if I don’t know about a specific sexual practice, deviant or otherwise, until I learn about it on the Internet (Aiken’s theory), then how do I know which words to google? It is all a bit chicken and egg and doesn’t make sense. Nor does Aiken’s advice to parents, which is: do not let your girls become objects online. Women and girls have been objectified for centuries; technology does not do anything by itself, it supports people doing stuff they already do. And, like the HCI person I am, I have designed and developed technology to support people doing stuff they already do. I may sometimes inadvertently change the way people do a task when it is supported by technology, for good or for bad, but to claim that technology is causing people to do things they do not want to do is myth making and fear mongering at its best.

The definition of HCI that I used to use in lectures at the very beginning of any course was:

HCI is a discipline concerned with the design, evaluation and implementation of interactive computing systems for human use and with the study of major phenomena surrounding them (ACM, 1992).

For me, human-computer interaction was and still remains Gestaltian: the whole is greater than the sum of the parts. By this I mean that the collaboration of a human and a computer is more than a human typing numbers into a computer and then waiting for the solution, or indeed typing sexually deviant search terms into a web crawler to find a tutorial. And with the advent of social media, HCI is more than one person connecting to another, or broadcasting online, which is why the field of cyberpsychology is so intriguing.

But the very reason why I left the field of AI and went into HCI is this: AI reasons in a closed world, within the limits of the computational power you have available. There are limits. With HCI, that world opens up and the human gets to direct the computer to do something useful. Human-to-human communication supported by technology does something else altogether, which is why you might want the opinion of a sociologist or a psychologist. But you don’t want the opinion of a sociologist on AI when they don’t understand how it works, have watched a lot of sci-fi, and think that robots are taking over the world. Robots can do many things, but it takes a lot of lines of code. And you don’t want the opinion of a cyberpsychologist who thinks that technology teaches people deviant sexual practices and encourages us all to literally pleasure ourselves to death (Aiken’s words – see what I mean about the Daily Mail?) ’cos she read one dodgy story and linked it to a study of rats in the 1950s.

Nowadays, everyone might consider themselves to be a bit of an HCI expert, able to judge the original focus of HCI, which is the concept of usability: easy to learn, easy to use. Apps are a great example of this, because they are easy to learn and easy to use, mainly because they have limited functionality; that is, they focus on one small task, like getting a date, ordering a taxi, or sharing a photo or a few words.

However, as HCI professor Alan Dix says in his reflective Thirty years of HCI and also here about the future: HCI is a vast and multifaceted community, bound by the evolving concept of usability, and the integrating commitment to value human activity and experience as the primary driver in technology.

He adds that sometimes the community can get lost and says that Apple’s good usability has been sacrificed for aesthetics and users are not supported as well as they should be. Online we can look at platforms like Facebook and Twitter and see that they do not look after their users as well as they could (I have blogged about that here). But again it is not technology, it is people who have let the users down. Somewhere along the line someone made a trade-off: economics over innovation, speed over safety, or aesthetics over usability.

HCI experts are agents of change. We are hopefully designing technology to enhance human activity and experience, which is why the field of HCI keeps getting bigger and bigger and has no apparent core discipline.

It has a culture of designer-maker, which is why at any given HCI conference you might see designers, hackers, techies and artists gathering together to make things. HCI has to exist between academic rigour and exciting new tech; no wonder it is not easy to define. But as we create new things, we change society, and we have to keep debating areas such as intimacy, privacy, ownership, and visibility, as well as what seems pretty basic, like how to keep things usable. Dix even talks about human-data interaction: as we put more and more things online, we need to make sense of the data being generated and interact with it. There is new research being funded into trust (which I blogged about here), and Dix suggests that we could look into designing for solitude and supporting users to not respond immediately to every text, tweet, or digital flag. As an aside, I have switched off all notifications, my husband just ignores his, and it boggles my mind a bit that people can’t bring themselves to be in charge of the technology they own. Back to the car analogy: they wouldn’t have the car telling them where they should be going.

Psychology is well represented in HCI; AI is well represented in HCI too. Hopefully we can subsume cyberpsychology as well, so that the next time I pick up a book on the topic, it actually makes sense, and the writer knows what goes on under the bonnet.

Technology should be serving us, not scaring us. So, if writers could stop behaving like 1950s preachers who think society is going to the dogs, viewing how people embrace technology the way they once viewed rock’n’roll and the television, we could be more objective about how we want our technological progress to unfold.