Replace User, Strike Any Key?

 
Author: 'No Bugs' Hare
Job Title: Sarcastic Architect
Hobbies: Thinking Aloud, Arguing with Managers, Annoying HRs, Calling a Spade a Spade, Keeping Tongue in Cheek
 
 

Our jobs would be so much easier if it weren’t for all those pesky users.

There is a common perception in the IT industry that the user is the primary source of all problems. In a sense, it is true: if not for users, there wouldn't be any IT jobs, and therefore there wouldn't be any IT problems; I am not sure, though, that this is what IT professionals should really want. While having too many problems on the job is not that good, having the single problem of 'how to find a job' is IMNSHO a significantly worse alternative.

The question of the relationship between users and developers has already been touched on in [NoBugs2011], which establishes that it is the user who has the upper hand in the user-developer relationship, and that it is the responsibility of the developer to make the user happy. This article aims to analyze the problem in more detail and from different angles.


There is always somebody else to blame

Nihil humani a me alienum puto
(Nothing human can be alien to me)

— Publius Terentius Afer, 2nd century BC —

First of all, let's start with a trivial observation: rabbits (as well as people) are in general rarely willing to admit their own mistakes; one very good book on this subject is [MistakesWereMade]. Here we will not go into the details of this phenomenon, but will merely admit that IT professionals (including us developers) are still human (or sometimes rabbit) beings, and therefore have a natural tendency to blame others for our own mistakes. When we didn't expect that the user would press that specific button in that specific situation, it is very natural to shout 'You need to be an idiot to <place whatever we didn't expect the user to do here>!!!'

Murphy's Law Anything that can go wrong, will go wrong.— Wikipedia —In fact, the tradition of blaming users for whatever happens to what we’ve made, is not specific to IT. For example, famous Murphy’s Law has originated when Edward Murphy blamed a technician from MX981 team for incorrectly connecting sensors for the system Murphy designed [HistoryOfMurphysLaw]; it obviously didn’t cross Murphy’s mind (at least not at that time) that a robust system should have connectors which don’t allow for incorrect connection. This tradition of blaming users for our own mistakes has flourished in IT world.

On reasonable expectations

If I ordered a general to fly from one flower to another like a butterfly,
… and if the general did not carry out the order that he had received,
which one of us would be in the wrong?
… The general, or myself?

— Antoine de Saint-Exupéry, The Little Prince —

One huge mistake developers (and even business analysts) tend to make is to expect users to behave rationally at all times. Nothing could possibly be further from reality. From time to time, everybody is entitled to make a mistake; and users are entitled to make them much more often than developers, because usually it is the users who are paying.

Even if a developer is right 99% of the time, he's still wrong in 1% of cases. This means that even if users make mistakes at the very same rate (which, as explained above, is a very optimistic estimate), and your application has 1 million users making 100 operations per day each, you'll still get 1 million user mistakes per day. Most of those mistakes will be trivial, but given the sheer volume of attempts, users will almost certainly try pretty much every erroneous scenario, including those we haven't thought about. Who is to blame here? IT tradition assumes that 'it is those stupid users'; users (sometimes joined by management) tend to say 'it is those idiot developers'. In this debate, we take a third position: the blame is on those rabbits who expected that, under these circumstances, mistakes (on both sides) wouldn't happen.
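
For the sceptics, the arithmetic above can be spelled out explicitly (the numbers are, of course, purely illustrative):

    #include <cstdio>

    int main() {
        // Purely illustrative numbers: 1 million users, 100 operations per
        // user per day, and a 1% mistake rate (i.e. 'developer-grade' 99%
        // accuracy, which is optimistic for users)
        const double users = 1e6;
        const double ops_per_user_per_day = 100;
        const double mistake_rate = 0.01;

        const double mistakes_per_day = users * ops_per_user_per_day * mistake_rate;
        std::printf("Expected user mistakes per day: %.0f\n", mistakes_per_day);
        // Prints: Expected user mistakes per day: 1000000
        return 0;
    }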

Whenever a system of this scale is first launched, it is reasonable to expect that some of the erroneous scenarios won't be covered in the original release, and to be prepared to identify problems and to fix them within a time frame which is reasonable for the users (not merely for the developers). Still, while expecting flawless programs from the very beginning is not exactly reasonable, it is important to realize that while some bugs are inevitable, they are still bugs (and not the users' fault), and therefore must be fixed.

In addition, it should be noted that in many cases, mistakes are a direct result of the fact that developers and users have very different perspectives on the software, which results in miscommunication. We feel that it is the developer's responsibility at least to try to look at the program from the user's point of view, which leads us to…

Trying on the user’s hat

He that increaseth knowledge increaseth sorrow.

— Ecclesiastes 1:18 —

Another important thing for the developer is to understand how users will use the program; and here developers tend to have major problems. In [NoBugs2011] it has already been noted that developers are notoriously bad at creating UIs; here we will elaborate on it.

In fact, it seems that developers are not only bad at creating UIs, but are also bad at any task which requires the programmer to put herself in the user's shoes. So, we asked ourselves: is this because developers are inherently different, or because they are already involved with the project in another role?

To find out, we tried a small-scale experiment with a few of our fellow rabbits. We asked the very same developers who were notorious for creating pretty bad UIs to design a UI for a project in which they were not involved as developers. The result was rather obvious (though due to the small scale it is unclear whether it is statistically significant, and further research is suggested): the very same rabbits who designed bad UIs when they were involved as developers created very decent UIs when they designed while completely in the user's shoes; in particular, not being involved in the project in any other way, and without any knowledge about the implementation.

If further research confirms this hypothesis, it will mean that it is knowledge of the system implementation that causes developers to fail to look at things from the user's perspective. This calls for creating a separate team of business analysts (BAs); while this practice is already rather common in the industry, what is new is that our research suggests that the same rabbit might be able to work both as a developer and as a BA, as long as her work as a developer and her work as a BA occur only in completely separate projects.

EU cookie directive

The road to hell is paved with good intentions

— proverb —

For a long time we thought that developers were the worst rabbits to design UIs. Recently, we've found there are rabbits out there who can do even worse, and they are bureaucrats. One recent example is the infamous EU Cookie Directive (its formal name is the 'Directive on Privacy and Electronic Communications', but here we will be dealing with one aspect of it, namely websites being required to ask users' consent before storing a cookie on their computers ([2009/136/EC], [ICOGuidance])).

This whole document (as well as the preceding directive 2002/58/EC) is based on a fatal lack of understanding of the underlying technologies, while attempting to regulate those technologies directly. We will now analyze what this leads to.

Without going into the details of the legalese in the related documents, what it essentially requires from websites (in the UK, starting from May 2012) is to ask for user confirmation before 'storing' a cookie on an end-user's computer. While obviously well-intended (the idea was to protect users' privacy), both the proposed requirements and their interpretations are fatally flawed. Essentially, what is required is to ask the user before the site places any cookie; many sites have already started changing their UIs just to comply with the directive. What will happen when enough sites implement it is obvious: once users get used to such requests (usually phrased as 'to access this site, you need to enable cookies, please confirm <yes>/<no>'), they will start pressing the 'Yes, I want a cookie' button every time they see it. This phenomenon (known as capture errors) is well known in the security industry (see, for example, [SecurityEngineering], section 2.3.1), and is clearly unavoidable here. As soon as it happens, the whole point of the directive will be lost, and it will merely create a nuisance for users, without any perceivable benefit.

This is not the only problem with the directive. Whoever made it tried to think it through a bit further; unfortunately, without understanding the technology involved, the attempt to formalize requirements at the technology level has had exactly the opposite effect. The directive provides an exemption 'where such storage or access is strictly necessary for the provision of an information society service requested by the subscriber or user'. Once again, the intentions were good. Unfortunately, the 'strictly necessary' wording makes the exemption perfectly useless (and actually even worse than that, as described below). As we know, strictly speaking, cookies are never 'strictly necessary': you can always, for example, put all the information you need into a dynamic URL; it is a major hassle, but it is still possible, and therefore cookies are not 'strictly necessary'.

Now things become even worse. The Information Commissioner's Office has provided an interpretation of the EU directive where 'strictly necessary' is not really 'strictly' necessary, but rather 'essential, rather than reasonably necessary' [ICOGuidance]; here 'strictly necessary' has degraded to 'essential' (which is still not exactly defined). What this travesty means in practice is that those who interpret it on the cautious side will still ask for a confirmation, annoying their users and losing business and money; and those who don't care about privacy at all will improve their business even further. The overall result seems to be the exact opposite of the good intentions behind the directive.
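
To illustrate the 'dynamic URL' alternative mentioned above, here is a minimal sketch (the function and parameter names are ours, and do not come from any particular framework):

    #include <iostream>
    #include <string>

    // Session state carried in a dynamic URL instead of a cookie. Every
    // link the server emits has to be rewritten this way -- a major
    // hassle, but perfectly possible, which is exactly why cookies are
    // never 'strictly necessary'.
    std::string add_session_to_url(const std::string& url,
                                   const std::string& session_id) {
        // Append the session id as a query parameter
        const char separator = (url.find('?') == std::string::npos) ? '?' : '&';
        return url + separator + "sid=" + session_id;
    }

    int main() {
        std::cout << add_session_to_url("https://example.com/cart", "abc123")
                  << "\n";  // prints: https://example.com/cart?sid=abc123
        return 0;
    }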

One should never try to formalize things at a level one doesn't understand. To do a reasonably good job, the members of the European Parliament had two options: (a) to specify privacy requirements without going into this level of detail, avoiding references to specific technologies (admittedly, they tried to, but the level of requirements they chose was apparently still too low), or (b) to understand how cookies really work, and to take several less drastic and more reasonable measures, probably including a prohibition on third-party cookies (which is where most privacy leaks reside). In fact, the members of the European Parliament decided to take a middle ground between these two options, which (as we've seen above) has failed miserably.

Options: why more is less

Carving is easy, you just go down to the skin and stop.

— Michelangelo —

A designer knows he has achieved perfection
not when there is nothing left to add,
but when there is nothing left to take away.

— Antoine de Saint-Exupéry —

Actually, there is one useful thing about the EU Cookie Directive: it shows us that shifting responsibility to the user is not always a good thing. In fact, it is rarely a good thing. Obviously, it is always very convenient for a developer, whenever he has any doubts, not to make any decision at all, but to provide the user with an option instead, shifting responsibility from the developer to the user.

In practice, such 'passing the buck' is often not a good idea, for one simple reason: any decision should belong to whoever is in the better position to make it, and very often it is the developer who has the better understanding of the issue at hand. This is especially true when we're speaking about the technical side of the program: 'what do you want to pay for?' is clearly a user question, but 'how much cache do you want this program to use?' is a developer question, whether we like it or not. In addition, shifting responsibility to users contributes to the anxiety customers feel when faced with too many choices (for details, see the [ParadoxOfChoice] book).
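
To show what answering such a 'developer question' in code (rather than in an options dialog) might look like, here is a minimal sketch; all the constants and names are hypothetical:

    #include <cstddef>
    #include <cstdio>

    // Derive a reasonable cache size from the environment instead of
    // asking the user. The constants are hypothetical -- the point is
    // that the developer, not the user, is in the better position to
    // pick them.
    std::size_t default_cache_size(std::size_t available_ram) {
        const std::size_t min_cache = 16u * 1024 * 1024;   // floor: 16 MB
        const std::size_t max_cache = 512u * 1024 * 1024;  // ceiling: 512 MB
        std::size_t cache = available_ram / 16;  // heuristic: 1/16th of available RAM
        if (cache < min_cache) cache = min_cache;
        if (cache > max_cache) cache = max_cache;
        return cache;
    }

    int main() {
        // e.g. with 2 GB of available RAM, the heuristic picks 128 MB
        const std::size_t ram = 2048u * 1024 * 1024;  // 2 GB
        std::printf("%zu MB\n", default_cache_size(ram) / (1024 * 1024));
        return 0;
    }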

Still, developers often 'pass the buck' merely because they don't want any responsibility (with a common argument being 'I've already provided you with all the options, what else do you want?'). Unfortunately, this tendency is often aggravated by pressure from users (usually via managers) who keep asking for slightly different things, usually for no really good reason. Here we should point out that all modern programming languages are Turing-complete, and are therefore able to do absolutely everything (out of the tasks which can possibly be done). It means that writing any program is essentially a process of reducing this ability to do absolutely everything into an ability to do something useful. Paraphrasing Michelangelo's famous quote about carving, we can say: 'Programming is easy, you just keep restricting choices until you get what the user really needs, and stop.'
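
As a toy illustration of such restricting (names and formats are ours): rather than accepting an arbitrary free-form format string, offer only the few choices the application actually needs.

    #include <cstdio>
    #include <string>

    // Maximum choice (an arbitrary format string) means maximum
    // opportunity for mistakes; restricting the user to the formats the
    // application actually needs is the 'carving' in action.
    enum class DateFormat { ISO8601, US, European };

    std::string format_date(int year, int month, int day, DateFormat fmt) {
        char buf[16] = "";
        switch (fmt) {
            case DateFormat::ISO8601:
                std::snprintf(buf, sizeof buf, "%04d-%02d-%02d", year, month, day);
                break;
            case DateFormat::US:
                std::snprintf(buf, sizeof buf, "%02d/%02d/%04d", month, day, year);
                break;
            case DateFormat::European:
                std::snprintf(buf, sizeof buf, "%02d.%02d.%04d", day, month, year);
                break;
        }
        return buf;
    }

    int main() {
        std::puts(format_date(2012, 8, 15, DateFormat::ISO8601).c_str());  // 2012-08-15
        return 0;
    }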



Acknowledgements

This article was originally published in Overload Journal #110 in August 2012 and is also available separately on the ACCU web site. Re-posted here with the kind permission of Overload. The article has been re-formatted to fit your screen.

Cartoons by Sergey Gordeev from Gordeev Animation Graphics, Prague.


Comments

  1. "No Bugs" Hare says

    Probably this is one of the least coherent articles of mine. Well, I suppose everybody has their ups and downs. If somebody has struggled to make some sense out of it – my apologies.
