There’s been a bit of a thing lately about software user experience going off the rails. Some people don’t like cross-platform software, and think that it isn’t as consistent, as well-integrated, or as empathetic as native software. Some people don’t like native software, thinking that changes in the design of the browser (Apple), the Start menu (Microsoft), or everything (GNOME) herald the end of days.
So what’s going on? Why did those people make that thing that you didn’t like? Here are some possibilities.
My cheese was moved
Plenty of people have spent plenty of time using plenty of computers. Some short-sighted individual promised “a computer on every desktop”, made it happen, and this made a lot of people rather angry.
All of these people have learned a way of using these computers that works for them. Not necessarily the one that you or anybody else expects, but one that’s basically good enough. This is called satisficing: finding a good enough way to achieve your goal.
Now removing this satisficing path, or moving it a few pixels to the left, might produce something that’s supposedly better than what was there before, but it’s actually worse, because the behaviour people have learned no longer achieves what they want.
It may even be that the original thing is really bad. But because we know how to use it, we don’t want it to change.
Consider the File menu. In About Face 3: The Essentials of Interaction Design, written in 2007, Alan Cooper described all of the problems with the File menu and its operations: New, Open, Save, Save As…. Those operations are implementation-focused. They tell you what the computer is going to do, which is something the computer should take care of by itself.
He described a different model, based on how people think their documents work. Anything you type in gets saved (that’s true of the computer I’m typing this into, which works a lot like a Canon Cat). You can rename a document if you want to give it a different name, and you can duplicate it if you want to give it a different name while keeping the version at the original name.
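To make the difference concrete, here’s a minimal sketch, in Python, of roughly what that document model looks like. The names are mine, for illustration only, not Cooper’s design or any shipping API: there’s no Save command at all, every edit goes to storage as it happens, and the only name-related operations a person ever sees are rename and duplicate.

```python
from pathlib import Path

class Document:
    """Illustrative document model: no Save command, no dirty state."""

    def __init__(self, path: Path):
        self.path = path
        self.path.touch(exist_ok=True)  # the document exists as soon as you start it

    def append(self, text: str) -> None:
        # Anything you type goes straight to storage; there is nothing to "save".
        with self.path.open("a", encoding="utf-8") as f:
            f.write(text)

    def rename(self, new_name: str) -> None:
        # Give the document a different name.
        self.path = self.path.rename(self.path.with_name(new_name))

    def duplicate(self, new_name: str) -> "Document":
        # Keep the version at the original name, and carry on under a new one.
        copy = self.path.with_name(new_name)
        copy.write_bytes(self.path.read_bytes())
        return Document(copy)
```

Compare that with New/Open/Save/Save As…, where it’s the person’s job to remember that their work lives in memory until they explicitly tell the computer to write it to disk.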
This should be better, because it makes the computer expose operations that people want to do, not operations that the computer needs to do. It’s like having a helicopter with an “up” control instead of separate cyclic and collective controls.
Only, replacing the Open/Save/Save As… stuff with the “better” stuff is like taking away the cyclic and collective and giving a trained helicopter pilot with years of experience the “up” button. It doesn’t work the way they expect; they have to think about it, which they no longer had to do with the cyclic and collective; therefore it’s worse (for them).
Users are more experienced and adaptable now
But let’s look at this a different way. More people have used more computers now than at any earlier point in history, because that’s literally how history works. And while they might not like having their cheese moved, they’re probably OK with learning how a different piece of cheese works because they’ve been doing that over and over each time they visit a new website, play a new game, or use a new app.
Maybe “platform consistency” and “conform to the human interface/platform style guidelines” were ideas that made sense in 1984, when nobody who bought a computer with a GUI had ever used one before and would have to learn how literally everything worked. But people are more sophisticated in their use of computers now: they regularly flit between desktop applications, mobile apps, and websites, across different platforms, and so are more flexible and adaptable in using different software with different interactions than they were in the 1980s, when you first read the Amiga User Interface Style Guide.
We asked users; they don’t care
At first glance, this explanation seems related to the previous one. We’re doing the agile thing, and talking to our customers, and they’ve never mentioned that the UI framework or the slightly inconsistent controls are an issue.
But it’s actually quite different. The reason users don’t mention the extra cognitive load is that these kinds of mental operations are tacit knowledge. If you’re asked “how can we improve your experience filing taxes”, you’ll start thinking about tax-related answers long before you think “I couldn’t press Ctrl-A to get to the beginning of that text field”. I mean, unless you’re a developer who goes out of their way to look for that sort of inconsistency in software.
The trick here is to stop asking and start watching. Users may well care, even if they don’t vocalise that caring. They may well suffer, even if they don’t notice it clearly enough to care.
We didn’t ask users
Yes, that happens. I’ve probably gone into enough depth on why it happens in various places, but here’s the summary: the company has a customer proxy who doesn’t proxy customers.