Wednesday, July 20, 2016

Apple's continued success, despite lousy product design


Surveys and articles like this continue to make it look like focusing on low-effort consumer products (even to the detriment of content-creator markets) and ugly, research-defying design is working out just fine.

Maybe my fellow "believers in actual UI research" and I really are just a tiny minority, doomed to suffer the tyranny of fads and apathy (rather than everyone getting the benefits of informed design). If the majority doesn't give a damn about readability and functional elegance, then there's no serious loss of money for companies still pushing hard-to-read, flaky, flat junk that requires less investment to produce.


Did the industry wean people off quality design, or would consumers have embraced iOS 7 just as strongly in 2007 as they embraced iOS 1?


Back to design as a visual thing... We have the cult of "geek chic". These are the tech people who support technology because it's new, not because it's actually great. These are the people who pursue change for the sake of change, as if a new presentation were equal to a revolution. The easiest way to promote a technology as new is to change its appearance. Very little engineering is required for a quick bump in consumer attention. Microsoft has made an art out of changing the package design and offering the same product with little functional improvement (over a couple of decades, the actual improvement is certainly notable, but not as much as it would appear on the surface).
Trying to explain the irrationality of it is like trying to explain the placebo effect to people who are unwilling to learn the reasoning and process behind the scientific method, while they swear that homeopathy has made them feel better. Yet tech people... supposed to be more scientific... Aargh. Myths!!


Fads don't stay new, but belief is very difficult to shake. The human brain seems wired to reject data that contradicts belief. So the decades of data rot away in boxes while the fads cycle on. 

The weaning notion has historical precedent: the computer industry has a long history of forcing difficult-to-use gadgetry onto a populace, simply because there was no alternative to clunky tools for doing new things (however painfully). Society adapts and becomes normalized to bugs, EULAs, and generally terrible design because not having the tool is more undesirable than the discomfort of actually using it. The leg up that businesses and people get over competitors who refuse bad design speaks volumes about capitalism: "use this crap or perish".

In 2007, the iPhone was the alternative to the miserable, yet standard, computer industry junk. Mac OS was, at the time, not taking the industry by storm, but it was continuing to slowly erode the Microsoft dominance of the 90s. Along came the iPhone, shocking the world into realizing how much tech sucked, because this new, easy-to-use (and beautifully designed) OS and hardware made the Internet a truly accessible resource for all (who could afford to buy one; interestingly, a huge number of people found a way to afford one). No geek cred was required.

Now that the competitors have caught up (mostly by aping Apple's researched design choices), there's no further need to continue to cater to ease of use; the pushing and shoving can resume. With Apple's design language now seemingly adopted (surface features only) by everyone (even used to create "geek chic" glitter on non-technology product marketing), Apple apparently wanted to differentiate itself again.

It did that by... trashing everything it had already done, design-wise, and adopting a fad already in progress, thanks to the very competitors Apple had recently and seriously bloodied. From desperation and arrogance come bad decisions.

This isn't just about appearance; functional design is a critical part of design. Steve Jobs refused to take any blame for the iPhone 4's antenna problem, telling customers they were "holding it wrong", but the design was flawed, and Apple ended up giving away free cases to correct the signal attenuation caused by... holding the phone. Now the antennas are placed differently (but we still need cases; see below).

After Jobs, Apple made phones so large (and so thin) that one-handed operation is uncomfortable or even impossible for some people. Apple didn't give consumers a 4" option again until after a few generations of awkward product. It was a purely MBA-expert-driven choice to sell larger phones, despite ergonomics showing that larger was not better. Stemming the media abuse (and the consumer self-abuse of buying the largest phone possible from Google's partners) by making a larger phone was sensible. Making it the only option was not. Oh, there were options: too large and much too large.

Apple finally released a new 4" phone, after storms of criticism. "Surprise! 4-inch phones are selling well!" becomes the new news.

Back to the software: Apple has soiled ease of use with lousy visual design. It irritates me to no end. However, it would be tolerable if the tools did what they are supposed to do, reliably. They don't.

Apple has introduced countless behavioral problems into the product lines, starting in the iOS 7 era (late 2013). There was apparently a lot of UI code that needed to be rewritten to change the UI so dramatically. We've not seen the end of that even today, with the latest revision of iOS 9.3 (July 2016). It's not just the visual glitches, and it's not just on iOS.

There have been several cases of feature regression in Mac software. Final Cut Pro is a well-publicized case; I'll leave you to Google to learn more about that. Less publicized was the damage done to iWork.

Getting iWork to run on iOS seems to have required totally new software. The apps on iOS are feature-incomplete. Instead of bringing them up to feature parity with the more mature and robust Mac OS versions, Apple chose to back-port the iOS versions of iWork to the Mac. They now have feature parity, because the stronger offering on the Mac was crippled to match the iOS version. Could Apple have made them both just as functional as Mac OS iWork was in the '09 edition? Absolutely, but that requires developer attention (money). With a limited number of developers at Apple, any attention expended on iWork is less attention spent on getting the next iOS release onto the market, and on the next iteration of pushing customers to buy a new device. iWork languishes.

Its dedicated users have either refused to upgrade (including avoiding upgrading their OS or computer) or abandoned it for Microsoft's Office suite (and some who did the latter have moved to Windows... or back to Windows). Apple's response? Promote Microsoft Office.

Then there's iCloud. The iCloud feature bullet points are mounting, and a lot of those features are conceptually sound (the shared clipboard being the best example to come from the WWDC 2016 announcements), but the execution is inconsistent and the results are unreliable. I continue to get duplicate Notes for no user-caused reason. Notes was one of the earliest features of iCloud synchronization, yet this still happens.

Worse is that my devices aren't all communicating equally about Reminders. The alerts for to-do list items appear on all three of my supported devices, because each device runs the necessary service locally. Marking items as complete, however, is a different story. If I mark an item as complete on the iPhone, the iPad Pro shows it as overdue. The reverse also occurs. Which device has "the truth" is inconsistent, but it's usually the one where I made the change. The same goes for deleting items. I found that I can force the laggy device to go ask iCloud by entering the iCloud settings and viewing my account (it triggers a login request, and the login forces a data synchronization).
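For the technically curious: I have no visibility into Apple's sync code, so this is entirely speculative, but one plausible explanation for devices disagreeing like this is naive "last writer wins" conflict resolution, where a stale device keeps showing its old copy until it actually pulls and merges. Here's a minimal Swift sketch of the idea, with made-up type and field names (an illustration, not Apple's actual iCloud schema):

import Foundation

// A hypothetical, simplified reminder record. Names are illustrative only.
struct ReminderItem {
    var title: String
    var isComplete: Bool
    var modifiedAt: Date   // timestamp of the last local edit
}

// A naive "last writer wins" merge: whichever copy was edited most recently
// wins. Clock drift or delayed pushes can leave devices disagreeing.
func merge(local: ReminderItem, remote: ReminderItem) -> ReminderItem {
    return local.modifiedAt >= remote.modifiedAt ? local : remote
}

// Device A marks the item complete; device B still holds an hour-old copy.
let onDeviceA = ReminderItem(title: "File taxes", isComplete: true,
                             modifiedAt: Date())
let onDeviceB = ReminderItem(title: "File taxes", isComplete: false,
                             modifiedAt: Date(timeIntervalSinceNow: -3600))

// Until device B actually pulls and merges, it keeps showing "overdue".
let reconciled = merge(local: onDeviceB, remote: onDeviceA)
print("Complete after sync: \(reconciled.isComplete)")  // true, once the pull happens

Until that merge step actually runs on every device, each one happily displays whatever copy it holds, which is exactly the "overdue on one device, complete on another" behavior described above.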

We shouldn't be required to figure out workarounds. It should just work. That's Apple's old mantra. Workarounds are Microsoft territory. Didn't we Apple users become Apple users to avoid that crap? 

Why is it like this? Money. It costs less, and makes more profit, to move on to the next product ASAP, rather than make the existing one work correctly. The computer industry has successfully trained consumers to believe that bugs are an inevitable and unavoidable necessity of computing. This is a meme and a lie, not a fact. Properly executed, software can be made reliable enough that problems are very rare. 

The most serious issue with this isn't the minor annoyances. It's the back-end of iCloud. If these sync problems are never fully worked out, why do we trust Apple with our data in the first place?

Apple today has the corporate culture of most huge businesses: admit no failing. That was Steve Jobs' thing: admitting to mistakes by blaming the engineers working for him (except he was apparently so married to the design of the iPhone 4 that he chose to blame the customers instead; proof that he was no technology god). It was mostly in-house, but it always seemed to leak to the media (no such stories with Apple's current leadership, which raises the question: does Apple leadership pay attention to its products, or are they just mindless consumers themselves?).

Maybe the engineers were only following Jobs' lead. Maybe they "didn't get it". Either way, it was a pressure-relief valve for avoiding getting stuck in the sunk cost fallacy (see: the current visual design of iOS). The poor Apple Newton finally got it mostly right at the end, but Jobs killed it because the damage had long been done and Apple was still losing face (and money, so the official story goes).

Who at Apple has the authority and the wisdom to do the same with the current design and execution trends?

This is one of the oldest tricks in the book of capitalism: redesign the packaging, catch market attention again... until people get used to it and you have to do it again to elicit the same spike in attention. Repeat chorus. Psychological studies have already noted this as a side effect of industrial efficiency experiments: so long as the subjects are aware of a change being made, they behave differently, because they think they should. Their changed behavior drops off after a while. That's when you should measure results, no earlier.

When people respond to new package design, the managers who demanded it say "see, I told you that new design was better". No. It is just different. In fact, it might be much worse, but if you're not going to stick around and do real studies, you'll be able to preserve your ignorance and ego.

So Apple changed the package design and the average consumer response was positive: "yay, change!" The articles about the consequences of the design changes, written by educated experts in human interface design, go ignored by Apple, and Apple doubles down on the flat ugliness, bringing it to the Mac too.

It wouldn't be so bad if appearance were all there was to it, but major functionality deficits appeared with that new package design. It might not be a direct consequence. It might be coincidence that the visual redesign came at the same time as the architectural changes that created all the problems. One way or another, the confluence sticks out as a concentrated failure of leadership.

I wonder: how long will it take for hipster geeks to get over their 2013-era boredom with detailed visual design, and get bored with the current obsession with flat, low-contrast, featureless dullness? (They call this "clean", because every irrational fad needs a marketing term, just like music-equipment fetishists use the term "warmth" to justify analog equipment and valves.)

To call research-backed quality design "dated" is irrational and uneducated; all such people care about is a sense of newness, a "fresh image". The seemingly logic-oriented technology geek community has many members subscribing to this illogical judgment.

Hipster tech geeks. What an historically ironic concept.

This happens in every field of human occupation. My focus here is the computer tech industry, so that's the context I'm describing this behavior from. An automotive designer or a kitchen appliance designer might scream about the same problem in their own industries. A medical professional might scream about fad diets and unproven drugs getting undue credibility (and they do).

Lording over all of the designers and scientists are the same people, though: the business administration "experts" (who are apparently just "mindless consumers" themselves). So long as the few at the top are prosperous, the rest of us will have no choice but to stay the course. Supporting current business trends is what keeps people employed (though at lower wages, and with less demand for actual training and knowledge, because cheap employment is an obsession of MBA "experts" and shareholders, no matter how destructive to their society these obsessions are).

Eventually, something disrupts the status quo. Apple itself is an impressive historical example. 

Twice.

We're waiting, Apple. Do you no longer have the people and leadership with the vision? 

Maybe someone else does. They just have to wait until the status quo becomes "old enough" (and painful enough) to upset it by moving a large portion of the market to something comparatively more cozy and pretty.

Until then, society will just continue along as if there's no dissenting opinion with merit. Just like environmentalists have been screaming about our destruction of delicate natural support systems for ages...

Wow, human beings are stupid. 
