Productive or Experiential? Human-Computer Interaction: Dialogue, Conversation, Symbiosis (5)

[ 1) Introduction, 2) Dialogue or Conversation, 3) User or Used, 4) Codependency or Collaboration, 5) Productive or Experiential, 6) Conclusions]

Recently, I met up with an old friend and as we reminisced about our university days, she wondered if I still went about asking people really nosy questions. Now, I don’t exactly remember asking people really nosy questions, but I do like things to make sense and in my experience, people like to fill in the gaps in their stories and show me things because they know I care.

That said, back in April, I was in Naples where a man came out of his booth to ask me to stop staring at his funicular. I told him that he shouldn’t have it out in public if he didn’t want me staring at it. I remain very pleased to have managed that in Italian, though I still can’t understand why he was so upset about my admiration.

Italian funicular employees aside, I still believe, as I have said here many, many times before, that we all want to be seen, we all want to be heard, we all want to matter. We make sense of ourselves, of others, and of the world around us with stories. And we do this even when we are not trying to write software; we do it to tidy up ourselves and our minds.

The thing is, though, I thought I went into computing to get away from humans, but really all I have done in my job is gravitate towards people, to ask them about their life experiences and figure out how technology could make their lives easier, faster, better.

So, I was taken by HCI Professor Brenda Laurel’s division of what software does for us. In her book Computers as Theatre, she said that there are two types of HCI:

  1. Experiential computing for fun and games.
  2. Productive computing, which is measured by its outcome or serious implications, such as writing a book and transmitting knowledge.

This chimes with anthropologist Lionel Tiger’s descriptions of designing for pleasure (experiential) or designing for achievement (productive).

But, don’t we do both? If something is designed well and is pleasurable to use, doesn’t it increase our productivity? Isn’t that what Apple has been super successful at doing with aesthetics, discoverability, and user experience? And, isn’t that the point of gamification? To make not fun things fun.

I’ve always wanted to help humans harness the power of computers, to help make their lives easier by automating the grunt work to free up more time to be creative in. I know that creativity is our life force. It keeps us expanding. It keeps us young. And, like J. C. R. Licklider, I believe that the best relationship between computing and humans is a creative one of collaboration, not codependency.

I have blogged about eliciting knowledge for web design as a way to get all the information a designer might need. And, my favourite part has always been shadowing people at work. I have done this around building sites, on bridges, in chemical factories, exhibition centres, architects’ offices and half-built apartments, steel rolling mills, and print factories, and with people using mobile phones. I love to see what happens in a day in the life of people doing jobs I will never have the opportunity to do. I am fascinated by people.

Ever since the first time I was in charge of changing some software, which involved users needing more fields in a database, I have loved helping people with their tech. However simple that job was, it was my first insight into how the database was there to be manipulated by the user to give new insight into the information they had. Nowadays we tell stories with databases. But the database must always serve the user, not the other way round. I think we forget this sometimes.

When I worked in the field of artificial intelligence, I purposely put errors into various parts of a knowledge-based system, the idea being that the test cases I wrote to find my errors should also uncover similar errors which were there inadvertently. It took extensive training for a user to understand what the system was calculating, so the code was precious and had to be error free. And, if it needed to be changed, because things are always changing in the real world, it needed a computer scientist to add more code. This I didn’t like so much. It was not empowering. Here the user and the computer scientist served the code, not the other way round.
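
The technique has a name, defect seeding (sometimes called “bebugging”): if your tests catch most of the faults you deliberately planted, you can estimate how many unplanted faults remain. Here is a minimal sketch of the arithmetic behind the idea, with invented numbers rather than anything from the original knowledge-based system:

```python
# Defect seeding ("bebugging"): plant known faults, run the test suite,
# and use the fraction of planted faults found to estimate how many
# real, unplanted faults are still lurking. Numbers here are illustrative.

def estimate_remaining_defects(seeded_total, seeded_found, real_found):
    """Classic seeding estimate: real faults scale like seeded faults."""
    if seeded_found == 0:
        raise ValueError("Tests found no seeded faults; estimate is undefined.")
    detection_rate = seeded_found / seeded_total        # how good the tests are
    estimated_real_total = real_found / detection_rate  # scale up real finds
    return estimated_real_total - real_found            # still undiscovered

# Example: 10 faults planted, tests caught 8 of them plus 6 genuine bugs.
print(estimate_remaining_defects(seeded_total=10, seeded_found=8, real_found=6))
# -> 1.5, i.e. roughly one or two genuine faults probably remain.
```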

Also, it was difficult to model and represent things which experts knew inherently. In the case of exhibition planning, the software I worked on used a constraint solver which could easily allocate correctly sized booths with the required utilities, such as electricity and water, but it couldn’t easily model or reason about exhibitor A wanting to be by the door, or not near exhibitor B, without a human. This is also a common problem for seating plans at fundraising dinners, so I am told. The software has to be told the nuances of human life, but you don’t want to hard-code them, as they are forever changing, which is why you either need a human or a super good graphical user interface; otherwise it is quicker by hand.
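
For flavour, here is a sketch of that split between hard constraints (booth size, utilities) and soft human preferences (near the door, not next to a rival), written as a toy brute-force assignment rather than a real constraint solver; the booths, exhibitors, and the “adjacent booth ids” rule are all invented:

```python
from itertools import permutations

# Invented data: each booth has a size and utilities; booths 0 and 1 are by the door.
booths = [
    {"id": 0, "size": 20, "utilities": {"power", "water"}, "near_door": True},
    {"id": 1, "size": 10, "utilities": {"power"},          "near_door": True},
    {"id": 2, "size": 20, "utilities": {"power", "water"}, "near_door": False},
    {"id": 3, "size": 10, "utilities": {"power"},          "near_door": False},
]
exhibitors = [
    {"name": "A", "size": 20, "utilities": {"power", "water"}, "wants_door": True,  "avoid": "B"},
    {"name": "B", "size": 10, "utilities": {"power"},          "wants_door": False, "avoid": None},
    {"name": "C", "size": 20, "utilities": {"power"},          "wants_door": False, "avoid": None},
    {"name": "D", "size": 10, "utilities": {"power"},          "wants_door": False, "avoid": None},
]

def satisfies_hard(ex, booth):
    # Hard constraints: the booth must be big enough and have the needed utilities.
    return booth["size"] >= ex["size"] and ex["utilities"] <= booth["utilities"]

def soft_score(assignment):
    # Soft preferences: +1 for a wanted door spot, -1 for sitting next to a rival.
    # Consecutive booth ids stand in for physical adjacency in this toy example.
    by_name = {ex["name"]: booth for ex, booth in assignment}
    score = 0
    for ex, booth in assignment:
        if ex["wants_door"] and booth["near_door"]:
            score += 1
        rival = ex["avoid"]
        if rival and abs(booth["id"] - by_name[rival]["id"]) == 1:
            score -= 1
    return score

best = None
for order in permutations(booths):
    assignment = list(zip(exhibitors, order))
    if all(satisfies_hard(ex, b) for ex, b in assignment):
        if best is None or soft_score(assignment) > soft_score(best):
            best = assignment

for ex, booth in best:
    print(ex["name"], "-> booth", booth["id"])
```

The hard constraints prune the search; the soft score is where the human nuance lives, and it is exactly the part that keeps changing.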

For a while I thought 3D applications and visualisation were the way forward, especially for bridges. Bridges are enormous, last a long time, information gets lost, and the data needed to understand them is extensive, so why not visualise it? I got very excited about augmented reality: overlaying a bridge with the original plans, proposed changes, and predicted future behaviour. It was much harder to do back then, as you needed to measure and calibrate the exact camera angle with the AR software in order to overlay, by hand, the original view (i.e. the bridge) with all the extra information. I remember being out on a bridge for ages fiddling away. These days it would be much easier with an app written for a phone and its native camera.
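
These days much of that fiddly manual calibration can be replaced by matching a handful of reference points between the drawing and the camera frame and letting a homography do the work. A rough sketch with OpenCV, assuming you already have four or more corresponding points; the file names and coordinates are placeholders, not data from the original bridge work:

```python
import cv2
import numpy as np

# Placeholder inputs: a photo of the bridge and a scanned plan to overlay.
frame = cv2.imread("bridge_photo.jpg")
plan = cv2.imread("bridge_plan.png")

# Corresponding points: where plan features sit in the plan image (src)
# and where they appear in the camera frame (dst). In a real app these
# would come from feature matching or from the user tapping the screen.
src_pts = np.float32([[0, 0], [plan.shape[1], 0],
                      [plan.shape[1], plan.shape[0]], [0, plan.shape[0]]])
dst_pts = np.float32([[420, 310], [1480, 335], [1450, 720], [400, 700]])

# Estimate the homography mapping plan coordinates into the photo.
H, _ = cv2.findHomography(src_pts, dst_pts, cv2.RANSAC)

# Warp the plan into the photo's perspective and blend it on top.
warped = cv2.warpPerspective(plan, H, (frame.shape[1], frame.shape[0]))
overlay = cv2.addWeighted(frame, 0.7, warped, 0.3, 0)

cv2.imwrite("bridge_overlay.jpg", overlay)
```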

But still, inputting new information is not easy, especially on a mobile phone in 3D. I was playing games this morning on my mobile phone and I had trouble putting pizzas in boxes using 3D direct hand manipulation. More functionality equates to more complexity and constantly changing instructions, which can be clever but requires a learning curve, as it is not always intuitive. If you are having fun, as I was, you don’t mind the learning curve; if it’s not fun, then we all need to be aiming for simplexity.

Experience impacts productivity, and why wouldn’t it? Websites and apps are a bit like self-service instruments. As a user you figure out what is going on yourself. The better and easier it is to figure out, the more likely you are to come back and the more you enjoy yourself. If not, you will go elsewhere, to where someone is listening, who wants to hear your story, to make you feel that you count and that your experiences matter. As Danielle La Porte said:

Design is love.

And what is love if it is not the best experience? Experiential HCI makes everything better. Let’s share the love!

 

[Part 6]

Codependency or Collaboration? Human-Computer Interaction: Dialogue, Conversation, Symbiosis (4)

[ 1) Introduction, 2) Dialogue or Conversation, 3) User or Used, 4) Codependency or Collaboration, 5) Productive or Experiential, 6) Conclusions]

The fig tree is pollinated only by the insect Blastophaga grossorun. The larva of the insect lives in the ovary of the fig tree, and there it gets its food. The tree and the insect are thus heavily interdependent: the tree cannot reproduce without the insect; the insect cannot eat without the tree; together, they constitute not only a viable but a productive and thriving partnership. This cooperative “living together in intimate association, or even close union, of two dissimilar organisms” is called symbiosis. (Licklider, 1960).

The above quotation is from JCR Licklider’s seminal paper Man-Computer Symbiosis (1960). I had to grit my teeth as he kept going on about man and men and computers. To distract myself from the onset of a patriarchal rage, I decided that I needed to know the precise definition of seminal. Google gave me seed of semen.

The word computer originally meant a person who did calculations, like the women in 1731 who ran a household and were advised to be clever with their husband’s wages. And whilst I am wondering who is the tree and who is the insect in this scenario of man-computer symbiosis, I am thinking that it really isn’t a good idea to aspire to have computers and people functioning as natural living organisms which cannot survive without each other. I love technology, I really do, but the idea that I cannot function without it is, at the very least, disturbing.

We got a new boiler the other day, complete with a smart thermostat. The thermostat uses an app, now installed on everyone’s phone with location sharing switched on, to see whether we are home or not, and it corrects the temperature accordingly. It will also build up patterns of our daily routine, what time we get up, have a shower, take a bath, and so on, so that it can be ready for us. But only if we are home and have our phones charged. Thankfully, there are buttons on the thermostat if the WiFi goes down (apparently the earlier versions didn’t have them), and we can also log into a web browser to make changes to its routine.
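
As far as I can tell, the logic boils down to a geofence plus a learned routine as the fallback. A minimal sketch of how such presence-plus-routine logic might work, with invented coordinates and no real thermostat API:

```python
from datetime import datetime
from math import radians, sin, cos, asin, sqrt

HOME = (53.3811, -1.4701)          # invented home coordinates
GEOFENCE_KM = 0.5                  # "home" if any phone is within 500 m
COMFORT, ECO = 20.0, 16.0          # target temperatures in Celsius

def distance_km(a, b):
    # Haversine distance between two (lat, lon) pairs.
    lat1, lon1, lat2, lon2 = map(radians, (*a, *b))
    h = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371 * asin(sqrt(h))

def target_temperature(phone_locations, learned_schedule, now=None):
    """Presence wins; otherwise fall back to the routine learned from past weeks."""
    now = now or datetime.now()
    anyone_home = any(distance_km(loc, HOME) < GEOFENCE_KM for loc in phone_locations)
    if anyone_home:
        return COMFORT
    # learned_schedule: hour -> probability someone is usually home at that hour.
    return COMFORT if learned_schedule.get(now.hour, 0) > 0.5 else ECO

# Example: nobody's phone is home, but the household is usually in by 18:00.
schedule = {7: 0.9, 8: 0.6, 18: 0.8, 19: 0.9, 22: 0.95}
print(target_temperature([(53.4808, -2.2426)], schedule))
```

Which also makes the failure mode obvious: if the phones are flat or the WiFi is down, the whole “smart” part collapses back to the buttons on the wall.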

Theoretically, I should be thrilled. It is better than my Fitbit asking when my period is due so that it can tell me when my period is due – and since that blog, I have been given Google adverts for different sanitary products, I told you so! – or Google Nest, which needs you to tell it how you want your house to run, rather like my Fitbit. And, I do like data and patterns, so I am interested in what it collects, what it does, and whether it is clever enough to respond to a cold snap or whatever.

But old habits die hard: so far we have got out an old room thermometer to check whether the smart thermostat is measuring the temperature correctly, as it seemed a bit high. It was right. (I have just checked again: the thermostat says 25.8°C, the room thermometer says 22.8°C, quite a big difference.) I guess I have just worked in computing too long and I have technology trust issues. If the Sonos is anything to go by, when the WiFi goes down we are completely without music, well, digital music at any rate. Last time, we got out the guitar and turned into the Von Trapps, I am not even joking. The alternative would be to keep other music formats and a player, but that idea doesn’t do a lot for me; I am more of a donor than a collector. I hate stuff filling up my space.

When I read Licklider, I am reminded of ubiquitous computing rather than any other technology. I know I would rather my tech be ubiquitous than have it make me feel codependent on my mobile phone. All these apps for my heating, my music, my blogging, my Bikram, my electricity, my gas: my phone is slowly but surely turning into the precious and I feel like Gollum. I worry about my phone and can’t stand anyone touching it. Ubicomp, by contrast, had the idea of one person and lots of computers, interacting with (if I were doing it) the physiology of a person and making changes accordingly, rather than with the location of the person’s mobile phone, which strikes me as a bit simplistic and not smart at all. (Just ‘cos you call it smart doesn’t make it smart.) And then there is the collecting and sharing over the Internet, which causes us all to have the symptoms in the codependency link above – my phone does make me anxious and I try to please it all the time. I am forever staring at it and hoping it is doing ok, and I don’t like that feeling.

I have spent a lot of time writing about what social media can do for us, and how we can feel connected. But in this scenario, when it is not other people but an app, or sometimes other people via an app, if we are not in control we become disconnected from ourselves, and then we become addicted to the app or to the person. We give away our power and our data. The problem with these apps is that we have no control and we are barely a factor in the design. Our mobile phone is the factor in the design, and it trains us to be codependent, addicted, anxious. Warmth, after all, is on the bottom rung of Maslow’s hierarchy of needs, and I am staring at my bloody phone hoping for the best. This is not symbiosis; this is codependency.

But back to Licklider and his seed-of-semen paper: a lot of what he was trying to imagine was probably cooperation or collaboration of the kind I have blogged about before – a space of solutions to explore, with the computer doing the grunt work and the human doing the thinking. And I believe it is possible to do that even with apps.

In a post, David Scott Brown looks at what Licklider suggested in Man-Computer Symbiosis and what is possible today: shared processing, memory, better input-output, different languages, and so on. And I would add that in fields where the language is precise and limited – trading, for example: buy, sell, high, low, often over the phone – applications of AI are useful and will be able to do all sorts. All the data and conversations can be recorded, used, and mined. It is extremely exciting, and memory and storage, it seems, are infinite, which would make Licklider’s mind boggle.
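
Precisely because the vocabulary is so small, even a crude pattern match gets you a long way. A toy sketch; the transcript lines and the pattern are invented, not taken from any real trading system:

```python
import re

# Toy transcript lines; a real pipeline would get these from call transcription.
transcript = [
    "ok buy 500 VOD at 212.4",
    "can you sell 1000 BP at the high",
    "hold on, nothing on GSK today",
    "sell 250 AZN at 10310",
]

# buy/sell, a quantity, a ticker, and optionally a price.
ORDER = re.compile(r"\b(buy|sell)\s+(\d+)\s+([A-Z]{2,5})(?:\s+at\s+([\d.]+))?", re.I)

def extract_orders(lines):
    orders = []
    for line in lines:
        m = ORDER.search(line)
        if m:
            side, qty, ticker, price = m.groups()
            orders.append({"side": side.lower(), "qty": int(qty),
                           "ticker": ticker, "price": float(price) if price else None})
    return orders

for order in extract_orders(transcript):
    print(order)
```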

As an undergraduate I had to learn sparse matrices; memory was something not to waste, it was expensive. In his paper, Licklider says:

The first thing to face is that we shall not store all the technical and scientific papers in computer memory. We may store the parts that can be summarized most succinctly – the quantitative parts and the reference citations – but not the whole. Books are among the most beautifully engineered, and human-engineered, components in existence, and they will continue to be functionally important within the context of man-computer symbiosis.

Imagine his face looking at the data centres all over the world storing memes and cat pictures and abusive tweets, repeatedly, without checking if they have already been saved, without indexing, without any sort of check on redundancy, an endless stream of drivel. I am sure even Tim Berners-Lee wonders sometimes about the monster he has created.

And, books take so long to write. Beautifully engineered as they are, we lose ourselves in them and learn from them; they take us out of ourselves in the same way our phones do. But we are addicted to our phones and to our social media, to that little hit of dopamine that social media gives us, which our books don’t always do. Books are work and we are passive, whereas on our phones we feel active; but because our phones are controlling our homes and training us to be codependent, anxious, and powerless, it is a vicious circle of more phones, fewer books.

In these times when I look at where we are going and I am not feeling good about it, like Licklider I turn to nature, as the best designs are in nature. I also look to the wisdom of yoga. So this is what I have:

When a bee gathers pollen, it also gathers a small amount of poison along with the pollen, both get transformed into nectar in the hive. The yogis say that when we learn to take negative situations and turn them into wisdom, it means we are progressing, and becoming skilful agents of the positive.

So, yes, I worry about what happens when my whole life is literally on my phone; when the world’s nature reserves are full of data centres which contain every last terrible expression of humanity; when we are so disconnected from the nature around us that the oceans are filled with plastic; and when many of us sit in offices far away from the natural world, staring into our bloody phones, often in order to create technology. But surely we can create technology to change where we are. If we want a symbiosis, we must make it a human-planet one, not a human-computer one. I don’t care what my Fitbit says, I don’t want any technology in my ovaries, thank you very much.

So, with that thought and the amazing technology I have at my fingertips today, I want to share an animated gif of my cat drinking from his water fountain. Licklider said:

Those years should be intellectually the most creative and exciting in the history of mankind.

And, they are. I remain hopeful that we can collect enough data on ourselves to become self-aware enough to transform it into wisdom and create something better for all humanity and our planet. In the meantime I will be enjoying watching my cat have a drink of water, and I am sure Licklider in 1960 would have been just as amazed to see my cat drinking from the fountain on a phone more powerful, technologically and psychologically, than he could ever have imagined. It remains to be seen whether this is progress or not.

[Part 5]

User or Used? Human-Computer Interaction: Dialogue, Conversation, Symbiosis (3)

If you torture the data enough, it will confess to anything.
– Darrell Huff, How to Lie With Statistics (1954).

[ 1) Introduction, 2) Dialogue or Conversation, 3) User or Used, 4) Codependency or Collaboration, 5) Productive or Experiential, 6) Conclusions]

In the last blog I wrote about human dialogue with a computer versus the conversation between humans via a computer. The dialogue with a computer is heavily designed, whereas human conversation, especially via social media, has come about serendipitously. For example, Twitter came from texting, which was invented by engineers as a tool to test mobile phones.

This is an example of what I call serendipitous design, which works by users employing systems to do what they want them to do – the no-function-in-structure principle – and then a designer finding ways to support them. In contrast, the way to create systems which support users to do their job better follows the cardinal rule: know your user, with all the various tools and techniques UX designers have borrowed from anthropology. You design with your user in mind, you manage their expectations, and you keep them at the front of your mind as a major factor of the design so that the system has specific goals.

But, however hard you try, with each new system or piece of software or form of communication you often end up changing how people work, and the dialogue is less about a field of possibilities with insight, intuition, and creativity, and more about getting people to do extra stuff on top of what they already do. And, because people are keen to get in on whatever new thing is happening, they buy into what I call the myth of progress and adopt new ways of working.

This raises the question: are we creating systems for users or the used?

This raises the question: are we creating systems for users or the used? Today I was chatting to a road sweeper. He told me that last year he was driving a lorry, but the council’s initiative to reduce carbon emissions means that 80 lorries were taken off the road and the drivers are now out sweeping the streets on foot. He showed me his council-issue mobile phone, which tracks his every move and presumably reports his whereabouts back to the office at all times. Not that he needs it: if he sits on a wall too long, local residents will phone the council to complain that he is sitting down and not working hard enough.

Tracking is not new. Smart badges, invented at Xerox PARC, were trialled in the 1990s in the early days of ubiquitous computing (ubicomp). The idea was to move computers off the desktop and embed them into our infrastructure so that we interact with them without our knowledge, freeing the user from the need to learn complex systems. In the badges’ case, everyone could be located by everyone else in the building, rather like the Harry Potter Marauder’s Map. However, it smacks rather too much of surveillance, especially if your boss decides you are spending too long in the toilet or by the water cooler and that your behaviour needs to change. The road sweeper, instead of a badge, has a mobile phone and people who spy on him and grass him up, in part because they lack context and don’t know that he is entitled to a 20-minute break.

Must I really run all my important ideas past my fridge?

But it’s not just as part of a job: we have Google Maps recording every journey we make. And yet, ubicomp was less about having a mobile device or about surveillance; it was the forerunner to the Internet of Things, the ambient life, which is there to make things easier, so the fridge talks to your online shopping to say that you need more milk. But what if I go vegan? Do I need to inform my fridge first? Must I really run all my important ideas past my fridge? This is not the symbiotic relationship psychologist and mathematician J.C.R. Licklider had in mind in his vision of man-computer symbiosis.

I was speaking to someone the other day who monitors their partner’s whereabouts. They think it’s useful to see where the partner is at any given time and to check that the partner is where they said they would be. No biggie, just useful. I mentioned it to another person, who said that they had heard of several people doing the same. I wonder why I am so horrified when other people just think it’s practical.

Insidious or practical? I feel we are manipulated into patterns of behaviour which maintain the status quo.

Last week, I woke up and checked my Fitbit to see how I had slept, which is slightly worrying in itself – I never needed anything to tell me how I slept before – and there was a new box in there: Female Health. I clicked on it. It asked me about birth control, when my next period is due, how long it lasts, and so on. Intrigued, I entered the requested data. The resulting box said: Your period is due in eight days. Really? I mean, really? It was wrong, even though I had tinkered with the settings. Then it had a countdown: Your period will last four more days, three more days… etc. Wrong again. And now it is saying: Four days to most fertile days. This is so irritating. It feels like Argos, you know, how the system and the reality of you getting something you’ve ordered never quite match up. I know that together my Fitbit and I can build up data patterns. Will they be insightful? Time will tell. The bit which really concerns me is that it said it wouldn’t share this information with anyone, okay… but then it added that I couldn’t share this information either. What? I am guessing that it wants me to feel safe and secure. But what if I wanted to share it? What does this mean? Menstrual cycles are still taboo? I can share my steps but not my periods? My husband and I laughed about the idea of a Fitbit flashing up a Super Fertile, proceed with caution message when out on date night.
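
I have no idea what model sits behind that box, but the kind of countdown it produces is exactly what you would get from naively averaging the handful of data points it has. A toy illustration with invented dates, not Fitbit’s actual algorithm:

```python
from datetime import date, timedelta

def predict_next_period(start_dates, default_cycle=28):
    """Naive prediction: average the gaps between past start dates.

    With only one or two data points this is basically a guess, which is
    roughly how the real countdown felt. Not Fitbit's actual algorithm.
    """
    if len(start_dates) < 2:
        return start_dates[-1] + timedelta(days=default_cycle)
    gaps = [(b - a).days for a, b in zip(start_dates, start_dates[1:])]
    avg_cycle = round(sum(gaps) / len(gaps))
    return start_dates[-1] + timedelta(days=avg_cycle)

# Example with invented dates: two past cycles, 30 and 26 days apart.
history = [date(2017, 6, 1), date(2017, 7, 1), date(2017, 7, 27)]
print(predict_next_period(history))   # 2017-08-24, using the 28-day average
```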

I regularly lie in bed pretending to be asleep to see if I can fool my Fitbit

But it’s not just me and my Fitbit in a symbiotic relationship, is it? Someone is collecting and collating all that data. What are they going to do with that information, prying into my and my Fitbit’s symbiotic space? It rather feels like someone is going to start advertising in there, offering birth control alternatives and sanitary protection improvements. It feels invasive, and yet I signed up to it, me, the person who thinks a lot about technology and privacy and data and oversharing. And even now, as I sit here and think about my mixed feelings about my Fitbit, the idea of wearing something on my arm which only tells me the time, and not my heart rate or the number of steps I am doing, feels a bit old-fashioned – I myself am a victim of the myth of progress. I am user and used. Confession: I regularly lie in bed pretending to be asleep to see if I can fool my Fitbit. It’s changing my behaviour all the time. I never used to lie in bed pretending to be asleep.

Back in 2006, I watched Housewife 49. It was so compelling that I bought the books. Nella Last was a housewife in Barrow-in-Furness who kept a diary, along with 450 other people, during and after the war. It was part of the Mass Observation project set up by an anthropologist, a poet, and a filmmaker, which sounds rather like the maker culture of HCI today. They didn’t believe the newspapers’ reporting of the abdication and marriage of King Edward VIII, so they went about collecting conversations, diary entries, and observations. Rather like today: we have social media with endless conversation and diary entries and observations. The newspapers are scrambling to keep up and curate other people’s tweets, because they have traditionally been the only ones who shape our society through propaganda and mass media. Now we have citizens all over the world speaking out their version. We don’t need to wait for the newspapers.

We are living through a mass observation project of our own, a great enormous social experiment, and it is a question worth asking: user or used? Who is leading this? And what is their goal? And then we have the big companies, like Google, collecting all our data. We all know the deal: we give them our data, they give us free platforms and backups and archives. However, it doesn’t necessarily mean that they are right about the results of their research on our data, or that they have the right to use every last piece of information, even if you give it freely, because there is a blurring of public and private information about me and my steps and periods and birth control.

Anthropologist Agustín Fuentes has written a thoughtful article about the misuse of terms such as biology in Google’s manifesto and, consequently, the sweeping generalisations it reaches. Fuentes says we have no way of knowing what happened before we collected data, and even now, as we collect data, we have to maintain our integrity and interpret it correctly by using terms and definitions accurately. Otherwise, we think that data tells the truth, and stereotypes and bias and prejudices are maintained. I love the quote:

If you torture the data enough, it will confess to anything.

Information is power. Hopefully, though, there are enough anthropologists and system designers around who can stop the people who own the technology from telling us what to think, claiming insights into our lives whilst peddling old ideas. We need to pay attention to truth and transparency before we trust, so that we can have more open dialogue in the true sense of the word – an exploration of a field of possibilities – to lead to real and effective change for everyone.

Let us all be users not the used.

[Part 4]