Wednesday, December 21, 2016
I'm glad to see there continues to be critical examination of Apple's foolish behavior around its Mac line...
How Apple Alienated Mac Loyalists
Apple has obsessed over the iOS products because of a lucky streak of success (yes, the original design had a lot to do with it, but they've since disposed of that, so...). Apple's only loyalty is to shareholders, and the iPhone and iPad have been very good for Apple's primary stock owners. So the logic seems to have been to make more of the same and cut funds spent on anything else (except for the ludicrous car project, another fad chase, born of Google's obsession with the pathological technology of self-driving cars).
Some day the iPhone will no longer be the killer income generator it has been. Things like phones and watches are fad products, and fads are irrational and unreliable. Someone comes out with something that creates a new fad, regardless of its true value, and the old fad ends. Consumers move on abruptly.
It's the core users, the loyalists, the professionals and content creators, that are the consumers Apple should be serving (again, if Apple actually cared to serve customers rather than Wall Street). If Apple disposes of the Mac in pursuit of capitalism's religious delusion of perpetual profit growth, it will discover that its core user base has moved on by the time it's done shitting on them and once again needs them to keep Apple alive.
The iPhone brought Apple to dominance, but it may also prove to be Apple's eventual downfall. Hopefully, some new leadership will take over at Apple before the company suffers too much.
Wednesday, November 2, 2016
Apple now run by "tonerheads"?
No new Mac Pro announcement, lackluster and overly expensive MacBook Pro upgrade, no displays from Apple for Apple machines... Yeah... Not feeling it, Apple.
Thursday, September 1, 2016
The Fall of the Designer: The Fad of Laziness and Ignorance Prevails
I just read a fantastic series of articles that, unfortunately, showed me that things are even worse than I realized with regard to the business of design. Instead of summarizing it here, I'm just going to point you to the original content. It's in multiple parts and I recommend all of them.
Wednesday, July 20, 2016
Apple's continued success, despite lousy product design
Surveys and articles like this continue to make it look like focusing on low-effort consumer product (even to the detriment of content creator markets) and ugly, research-defying design is working out just fine.
Maybe my fellow "believers in actual UI research" and I really are just a tiny minority, doomed to suffer the tyranny of fads and apathy (rather than everyone getting the benefits of informed design). If the majority doesn't give a damn about readability and functional elegance, then there's no serious loss of money for companies still pushing hard-to-read, flaky, flat junk that requires less investment to produce.
Did the industry wean people off quality design, or would consumers have embraced iOS 7 just as strongly in 2007 as they embraced iOS 1?
Back to design as a visual thing... We have the cult of "geek chic": the tech people who support technology because it's new, not because it's actually great. These are the people who pursue change for the sake of change, as if a new presentation were equal to a revolution. The easiest way to promote a technology as new is to change its appearance; very little engineering is required for a quick bump in consumer attention. Microsoft has made an art of changing the package design and offering the same product with little functional improvement (over a couple of decades the actual improvement is certainly notable, but not as much as it would appear on the surface).
Trying to explain the irrationality of it is like trying to explain the placebo effect to people who are unwilling to learn the reasoning and process of the scientific method while they swear that homeopathy has made them feel better. And yet tech people are supposed to be more scientific... Aargh. Myths!!
Fads don't stay new, but belief is very difficult to shake. The human brain seems wired to reject data that contradicts belief. So the decades of data rot away in boxes while the fads cycle on.
The weaning notion has historical precedent: the computer industry has a long history of forcing difficult-to-use gadgetry onto a populace that had no alternative to clunky tools for doing new things (however painfully). Society adapts and becomes normalized to bugs, EULAs, and generally terrible design, because not having the tool is more undesirable than the discomfort of actually using it. That the businesses and people who accept bad design get a leg up over competitors who refuse it speaks volumes about capitalism: "use this crap or perish".
In 2007, the iPhone was the alternative to the miserable, yet standard, computer industry junk. Mac OS was, at the time, not taking the industry by storm, but it was continuing to slowly erode the Microsoft dominance of the 90s. Along came the iPhone, shocking the world into realizing tech sucked, because this new, easy-to-use (and beautifully designed) OS and hardware made the Internet a truly accessible resource for all who could afford to buy one (interestingly, a huge number of people found a way to afford one). No geek cred was required.
Now that the competitors have caught up (mostly by aping Apple's researched design choices), there's no further need to continue to cater to ease of use; the pushing and shoving can resume. With Apple's design language now seemingly adopted (surface features only) by everyone (even used to create "geek chic" glitter on non-technology product marketing), Apple apparently wanted to differentiate itself again.
It did that by... trashing everything it had already done, design-wise, and adopting a fad already in progress, thanks to the competitors Apple had recently bloodied so badly. From desperation and arrogance come bad decisions.
This isn't just about appearance; functional design is a critical part of design. Steve Jobs refused to take any blame for customers "holding it wrong", but the design was flawed, and Apple gave out free cases to correct the signal attenuation caused by... holding the phone. Now the antennas are placed differently (but we still need cases; see below).
After Jobs, Apple made phones so large (and so thin) that one-handed operation is uncomfortable or even impossible for some people. Apple didn't give consumers a 4-inch option again until after a few generations of awkward product. It was a purely MBA-expert-driven choice to sell larger phones, despite ergonomics showing that larger was not better. Stemming the media abuse (and the consumer self-abuse of buying as large a phone as possible from Google's partners) by making a larger phone was sensible. Making it the only option was not. Oh, there were options: too large and much too large.
Apple finally released a new 4-inch phone after storms of criticism. "Surprise! 4-inch phones are selling well!" became the new news.
Back to the software: Apple has soiled ease of use with lousy visual design. It irritates me to no end. However, it would be tolerable if the tools did what they are supposed to do, reliably. They don't.
Apple has introduced countless behavioral problems into the product lines, starting in the iOS 7 era (late 2013). There was apparently a lot of UI code that needed to be rewritten to change the UI so dramatically. We've not seen the end of that even today with the final revision of iOS 9.3 (July 2016). It's not just the visual glitches, and it's not just on iOS.
There have been several cases of feature regression in Mac software. Final Cut Pro is a well-publicized case; I'll leave you to google to learn more about that. Less publicized was the damage done to iWork.
Getting iWork to run on iOS seems to have required totally new software, and the iOS apps are feature-incomplete. Instead of bringing them up to feature parity with the more mature and robust Mac OS versions, Apple chose to back-port the iOS versions of iWork to the Mac. They now have feature parity, because the stronger offering on the Mac was crippled to match the iOS version. Could Apple have made both as functional as Mac OS iWork was in the '09 edition? Absolutely, but that requires developer attention (money). With a limited number of developers at Apple, any attention expended on iWork is attention not spent getting the next iOS release onto the market and pushing customers to buy a new device. So iWork languishes.
Its dedicated users have either refused to upgrade (including avoiding upgrading their OS or computer) or abandoned it for Microsoft's Office suite (and some who did the latter have moved to Windows... or back to Windows). Apple's response? Promote Microsoft Office.
Then there's iCloud. The iCloud feature bullet points keep mounting, and a lot of those features are conceptually sound (the shared clipboard being the best example from the WWDC 2016 announcements), but the execution is inconsistent and the results are unreliable. I continue to get duplicate Notes for no user-caused reason. Notes was one of the earliest features of iCloud synchronization, yet this still happens.
Worse, my devices aren't all communicating equally with Reminders. Alerts for To-Do list items appear on all three of my supported devices, because each device runs the necessary service locally. Marking items as complete, however, is a different story. If I mark an item as complete on the iPhone, the iPad Pro shows it as overdue. The reverse also occurs. Which device has "the truth" is inconsistent, but it's usually the one where I made the change. The same goes for deleting items. I found that I can force the lagging device to ask iCloud for fresh data by entering the iCloud settings and viewing my account (this triggers a login request, and the login forces a data synchronization).
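To illustrate why a stale device can "win", here's a hypothetical sketch in Swift (my own toy code with made-up names, certainly not Apple's actual iCloud implementation): if a sync layer merges records with a naive last-writer-wins rule keyed on device-local timestamps, then the last device to write, not the last user action, determines the truth.

```swift
import Foundation

// Hypothetical model of a synced reminder; illustrative names only,
// not Apple's actual iCloud schema.
struct Reminder {
    var isComplete: Bool
    var modifiedAt: Date  // stamped by whichever device made the change
}

// Naive "last writer wins" merge: the record claiming the newer
// modification time survives, even if that device never fetched the
// latest state before writing.
func merge(_ a: Reminder, _ b: Reminder) -> Reminder {
    return a.modifiedAt >= b.modifiedAt ? a : b
}

// Device A marks the item complete and pushes it to the cloud.
let cloudCopy = Reminder(isComplete: true, modifiedAt: Date())

// Device B still holds the old, incomplete copy. If its write carries
// a timestamp that sorts later (clock skew, or a re-save of stale
// local state), the completion is silently undone.
var staleCopy = cloudCopy
staleCopy.isComplete = false
staleCopy.modifiedAt = cloudCopy.modifiedAt.addingTimeInterval(5)

print(merge(staleCopy, cloudCopy).isComplete)  // false: "overdue" again
```

If something like this is going on, forcing a fresh fetch (as the login trick does) would indeed paper over it, which matches what I see.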
We shouldn't be required to figure out workarounds. It should just work. That's Apple's old mantra. Workarounds are Microsoft territory. Didn't we Apple users become Apple users to avoid that crap?
Why is it like this? Money. It costs less, and makes more profit, to move on to the next product ASAP, rather than make the existing one work correctly. The computer industry has successfully trained consumers to believe that bugs are an inevitable and unavoidable necessity of computing. This is a meme and a lie, not a fact. Properly executed, software can be made reliable enough that problems are very rare.
The most serious issue here isn't the minor annoyances; it's the back end of iCloud. If these sync problems are never fully worked out, why should we trust Apple with our data in the first place?
Apple today has the corporate culture of most huge businesses: admit no failing. Admitting mistakes was Steve Jobs' thing, usually by blaming the engineers working for him (except he was apparently so married to the design of the iPhone 4 that he chose to blame the customers instead; proof that he was no technology god). That blame was mostly kept in-house, but it always seemed to leak to the media. No such stories leak under Apple's current leadership, which raises the question: does Apple's leadership pay attention to its products, or are they just mindless consumers themselves?
Maybe the engineers were only following Jobs' lead. Maybe they "didn't get it". Either way, it was a pressure-relief valve that kept Apple from getting stuck in the sunk cost fallacy (see: the current visual design of iOS). The poor Apple Newton finally got it mostly right at the end, but Jobs killed it because the damage had long been done and Apple was still losing face (and money, so the official story goes).
Who at Apple has the authority and the wisdom to do the same with the current design and execution trends?
This is one of the oldest tricks in the book of capitalism: redesign the packaging and catch the market's attention again... until people get used to it and you have to do it again to elicit the same spike in attention. Repeat chorus. Psychological studies noted this long ago as a side effect of industrial efficiency experiments (the Hawthorne effect): so long as the subjects are aware that a change has been made, they behave differently, because they think they should. The changed behavior drops off after a while. That's when you should measure results, no earlier.
When people respond to a new package design, the managers who demanded it say, "See, I told you the new design was better." No. It is just different. In fact, it might be much worse, but if you're not going to stick around and do real studies, you'll be able to preserve your ignorance and your ego.
So Apple changed the package design, and the average consumer response was positive: "yay, change!" The articles about the consequences of the design changes, written by educated experts in human interfaces, go ignored by Apple, and Apple doubles down on the flat ugliness, bringing it to the Mac too.
It wouldn't be so bad if appearance were all there was to it, but major functionality deficits appeared with that new package design. That might not be a direct consequence; it might be coincidence that the visual redesign arrived at the same time as the architectural changes that created all the problems. One way or another, the confluence stands out as a concentrated failure of leadership.
I wonder: how long will it take for hipster geeks to get over their 2013-era boredom with detailed visual design and get bored with the current obsession with flat, low-contrast, featureless dullness? (They call this "clean", because every irrational fad needs a marketing term, just as music equipment fetishists use the term "warmth" to justify analog gear and valves.)
To call research-backed quality design "dated" is irrational and uneducated; all such people care about is a sense of newness, a "fresh image". The seemingly logic-oriented technology geek community has many members subscribing to this illogical judgment.
Hipster tech geeks. What an historically ironic concept.
This happens in every field of human occupation; my focus is the computer tech industry, so that's the context I describe it from. An automotive designer or a kitchen appliance designer might scream about the same problem in their own industries. A medical professional might scream about fad diets and unproven drugs getting undue credibility (and they do).
Lording over all the designers and scientists are the same people, though: the business administration "experts" (who are apparently just mindless consumers themselves). So long as the few at the top are prosperous, the rest of us will have no choice but to stay the course. Supporting current business trends is what keeps people employed (though at lower wages, and with less demand for actual training and knowledge, because cheap labor is an obsession of MBA "experts" and shareholders, no matter how destructive these obsessions are to their society).
Eventually, something disrupts the status quo. Apple itself is an impressive historical example.
Twice.
We're waiting, Apple. Do you no longer have the people and leadership with the vision?
Maybe someone else does. They just have to wait until the status quo becomes "old enough" (and painful enough) to upset it by moving a large portion of the market to something comparatively more cozy and pretty.
Until then, society will just continue along as if there were no dissenting opinion worth hearing. Just as environmentalists have been screaming about our destruction of delicate natural support systems for ages...
Wow, human beings are stupid.
Tuesday, July 12, 2016
OS "upgrades" degrade performance
Planned obsolescence. It's not a myth or a conspiracy theory. It is the basis upon which the computer industry makes money each fiscal year by selling you the same product repeatedly. Oh yes, many other industries do the same, but I'm going to talk about the one that is most relevant to me.
Each year, computer makers try to sell you the same thing you bought last year, with marginal increases in memory and CPU speeds.
Marginal change? What happened to that racing progress and Moore's law?
That's over. It's getting very hard to produce the kinds of gains seen through the 80s and 90s. Physics is a bitch. Well, not a bitch. It's just nature, and the way we build counting machines is running into physical limits.
So what incentive is there to buy merely marginally improved computers? To accommodate the bloating of software; most notably, the operating systems.
Have you noticed that Apple now provides their operating system upgrades for free? How nice of them. Then again, they sell hardware. It's in their best interest to pack in new features that inspire new hardware purchases. It is also in their best interests to make some of those features be available only to owners of new hardware. They're in business to make money, okay, fair enough. So we can just choose to stop buying new hardware for a while if we're strapped for cash, right?
Well, maybe. If you don't want the new features of the new OS versions that might slow down your existing hardware. But maybe they've promised you a bug fix that isn't going to be put into the existing version, because they're working on the latest one. Or maybe your existing laptop will die from heat-related materials fatigue.
Apple's previous-generation Mac Pro was well regarded for being upgradable and providing many years of use without replacement. If certain components failed, they could often be easily replaced, or even upgraded, with third-party product. Notice that doesn't happen any more?
Monolithic construction, with glued-in batteries and soldered-on memory and CPUs... Portability and design elegance explain it to a degree, but the obsession with thinness has become seemingly pathological at Apple. Disposable computers are not environmentally friendly, regardless of how many toxic materials you remove from their manufacture. (The elimination of lead worked out nicely for computer manufacturers: lead-free solder joints are more prone to heat-related fatigue, so consumers now have to replace laptops far sooner than ever before, and nobody is paying those consumers for this design flaw.)
What about the average consumer who never cared about modular, "upgradable" computers anyway? (And honestly, upgrading was never what it was cracked up to be; by the time you wanted to improve the CPU or memory, the rest of the market had abandoned your architecture and you were forced to buy a new system anyway.) How does the industry incentivize the average consumer to buy their phone or laptop again when they're plenty content with what they already have? (Contrary to popular geek opinion, it's not the end consumers screaming for features; it's mostly just the geeks and professionals, who currently make up a tiny fraction of Apple's customer base.)
By giving them something for free! "Ooo, a free upgrade!"
Making the OS free, and convincing users that they're getting more functionality and convenience from that free upgrade, earns Apple emotional kudos. Letting them experience creeping performance losses over time (a slippery slope rather than a sudden shock) makes it much easier to inspire buying "a faster one" in a year or two. Otherwise, people would mostly just keep using what they have, and Apple would only make rapid repeat sales to the people who can't keep a piece of glass intact for more than six months at a time.
Each new version of iOS is slower than its predecessor. Don't believe me? Go watch the video: http://bgr.com/2016/02/16/apple-ios-9-vs-ios-6-performance/
The only reason I'm typing this on an iPhone 6s is that the number of apps (and websites) usable on my iPhone 4 with iOS 6 was approaching zero. Apple does not maintain Safari as an app on older iOS versions the way it used to maintain Safari on older Mac OS versions (and dropping Safari support is another thing they've done to shorten the life of Mac OS versions).
The Internet is no longer about HTML. Now it's about bloated pages with unnecessary scripting and pop-up advertising. Not only do web developers require features that Apple doesn't add to old versions of Safari, their pages are just plain bloated, so an old device with new feature support wasn't the answer either (and iOS permits no true third-party browser to fill the gap).
iOS 7 would've degraded the usability of the phone beyond my tolerance threshold (and it's an ugly mess; even uglier without translucency on older phones). I still use my iPhone 4 as a music player and for various abandoned apps. Every time I use it, I marvel at the beauty of the GUI design in iOS 6, which is utterly gone from iOS as of version 7. But this isn't about the ugliness of iOS 7-9; I have plenty of other posts about that. This is about performance and "upgrading". Each version of the OS is slower than its predecessor. This is fact.
Geeks will come pouring from the cracks of the Internet to justify this and proclaim me ignorant of basic geek-cred knowledge. "There are hundreds of new features!" they'll shout. Yes, indeed, this is true. But who asked for those features, and why do they cause degradation in places where they're not in use? Why does it make sense to consume CPU cycles (and battery) doing things that many users might never desire, in support of features not currently called to task? Why, for example, are the keyboard and app loading slower when the user is not calling on any of those "hundreds of new features"? Poor software architecture, I propose, is the answer. Laziness, or a lack of skill or time, isn't really the core problem, though.
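To make the architectural point concrete, here's a toy Swift sketch (hypothetical service names, my own code, certainly not Apple's): subsystems built eagerly at launch bill every user for every feature, while lazily built subsystems cost nothing until actually used.

```swift
import Foundation

// Hypothetical feature services, stand-ins for the "hundreds of new
// features"; the sleep simulates real initialization work.
final class HandoffService   { init() { Thread.sleep(forTimeInterval: 0.2) } }
final class SuggestionEngine { init() { Thread.sleep(forTimeInterval: 0.3) } }

// Eager architecture: every subsystem is built at launch, so startup
// (and the battery) pays for features the user may never touch.
struct EagerApp {
    let handoff = HandoffService()
    let suggestions = SuggestionEngine()
}

// Lazy architecture: a subsystem is built only on first use, so idle
// features cost nothing at launch.
struct LazyApp {
    lazy var handoff = HandoffService()
    lazy var suggestions = SuggestionEngine()
}

_ = EagerApp()           // blocks for the sum of all service setups
var lazyApp = LazyApp()  // returns immediately
_ = lazyApp.handoff      // the cost is paid here, on first actual use
```

Whether Apple's slowdowns come from exactly this pattern, I can't prove from the outside; the point is that "hundreds of new features" need not cost anything when they're idle, if the architecture is done right.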
Throughout my decades dealing with the computer industry, I've observed many tech geeks and developers promoting tolerance for software bloat by declaring "buy a new computer!" Not all developers; there are plenty of programmers who are appalled by this, just as I'm not the only person screaming that flat design is horrible bullshit. But the excuses are out there in play, and they are very much the majority.
"Your computer is older than god; buy a new one and stop bitching about the software that developers are making for you!"
This is entrenched dogma, and it benefits the hardware makers. Why spend the time and money to optimize the code in your own software when its degrading performance benefits your business by inspiring users to buy newer hardware from you?
The same performance degradation can be found in Mac OS. Snow Leopard boots within a few seconds on my MacBook Pro 5,5 (mid-2009); Mavericks boots in a metric of minutes. The same difference can be seen in application startups and in Safari, and Mavericks adds considerable latency and spinning-beach-ball activity on various tasks (yes, I already upgraded the RAM to 8GB). I'm okay with it when I've not used Snow Leopard for a while, but when I switch back to Mavericks from Snow Leopard, it's egregious.
Mavericks is a decent OS all the same, and was what Lion and Mountain Lion failed to be. Hell, we even LOST features in Lion (Rosetta), so Lion should've been a speed boost, right? Ha ha ha... But still, the overall experience has suffered. Has the gain compensated for the loss? It doesn't feel that way, especially when the loss of Rosetta, and the Logic 9 bugs that came with Mavericks, keep me in Snow Leopard most days. I'm not a Luddite; I'm just poor. If I had endless cash, I'd upgrade all the software and hardware that keeps me on Snow Leopard when I'm working on audio projects.
I'm pretty sure the features leading to this slowdown are related to iOS integration; those are the major differences between the promoted features of the two systems (the iOS integration stuff really started with Lion, but Lion and Mountain Lion were unfinished versions of Mavericks, IMO). El Capitan, which is less dramatically different in its feature changes, isn't noticeably slower than Mavericks (but El Capitan's Safari bugs and ugliness keep me away).
There's an end point not under my control. Mac OS Sierra looks like a useful, if underwhelming, addition of features, some of which interest me (shared clipboard). I won't be seeing it any time soon, though: Sierra has left my MacBook Pro 5,5 in the dust, and I will not be able to upgrade. Whatever Apple added has either finally made even Apple embarrassed at the performance degradation... or they just want to stop supporting the hardware.
Fair enough. The supposed benefit of limited hardware support is greater reliability and stability. But the problem isn't the lack of support; it's that a machine claimed to be supported is severely degraded by "upgraded" software, and that it gets progressively worse version after version. And that's just performance, not even counting the PLETHORA of bugs and glitches. If they claim to support a machine, the performance should be equivalent. At the very least, there should be a warning prior to the moment of upgrade that informs the user of this potentially undesirable result, giving them informed consent before committing to an irreversible act. There is no such warning. Worse, Apple supplies propaganda to users of old systems claiming the opposite: my Mavericks system regularly gets unwanted popups declaring that I can "improve the performance of [my] computer" by upgrading to El Capitan. I've run El Capitan on this machine. That statement is a lie.
I'll fully admit, El Capitan runs well under the circumstances: Mavericks runs from an internal drive, while my El Capitan test volume was an external USB 2 drive, and it ran rather well considering. But the Safari bugs were unacceptable. The loss of hardware support was unacceptable, and there's no warning of that possibility; it's left up to the user to discover, some being far better equipped to deal with it than others. Average end users certainly aren't.
What hardware losses? El Capitan added System Integrity Protection. SIP is a good idea, and I support it. However, it has the consequence of disabling many third-party drivers on many people's systems, and many makers of such hardware chose not to update their drivers (because they too wish to sell me the same thing a second, third, fourth time, despite the hardware in question still operating perfectly well with the supported drivers on the supported OS; another case of planned obsolescence).
The bloating of software cannot go on indefinitely. Moore's law has an end point, and we've been standing very close to it for some time. CPU speeds have barely increased at all in many years. More memory is still forthcoming and will always be welcome. (Frankly, we ought to have solid-state flat storage for all uses, not the continued separation of long-term, short-term, and virtual storage, but the tech isn't quite there yet, and it takes many software engineers seemingly forever to realize when an age-old "that's the way it's done" design needs to be swept aside... like shared libraries...)
More memory, however, will not solve the problem of CPU cycles wasted on activities the user knows nothing of (nor should be expected to know). The result is serious slowdowns for what amounts to no good reason from the perspective of the user. Perception and experience are what matter here, not technical excuses. However logical the explanations, they're just justifications for bad tool design serving an unethical capitalist gravy train. Capitalism should be an economy, a system of exchange; it should not be an antisocial institution that slowly destroys everything around itself (such as society) while consuming all available resources to make a few people super wealthy.
How many cores can we cram into a CPU package before power usage and heat put a crimp on that line of "improvement"? Given how terribly software is optimized, how multithreading is still not pervasively utilized (and some tasks cannot be broken into threads to begin with), it seems to me that the constant bloat has an inevitable end point. Nothing is perpetual (and perpetual growth is usually cancer).
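There's even a classic formula for that end point: Amdahl's law. If only a fraction of a task can be spread across threads, the serial remainder caps the speedup no matter how many cores you cram in. A quick back-of-the-envelope sketch:

```swift
// Amdahl's law: if a fraction p of a task can run in parallel on n
// cores, the best possible speedup is 1 / ((1 - p) + p / n).
func amdahlSpeedup(parallelFraction p: Double, cores n: Double) -> Double {
    return 1.0 / ((1.0 - p) + p / n)
}

// Even with 90% of the work parallelized, no core count beats 10x:
for cores in [2.0, 4.0, 8.0, 64.0, 1024.0] {
    print("\(Int(cores)) cores: \(amdahlSpeedup(parallelFraction: 0.9, cores: cores))x")
}
// 2 cores:    ~1.82x
// 8 cores:    ~4.71x
// 1024 cores: ~9.91x  (the serial 10% dominates; the curve goes flat)
```

So piling on cores buys less and less while the bloat keeps growing linearly. The math doesn't care how the marketing slide is worded.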
At some point the industry will have to get off the gravy train and start optimizing code and offering fewer new features. They will have to find some other inspiration for selling the same thing to their customers again. In fact, the best feature I can imagine being offered right now is "optimized for speed and efficiency, with less storage space and memory consumption".
Remember the last time Apple marketed that at us and it meant something?
That was Snow Leopard.
You know, the OS that made Mac OS a serious contender and led many people to ditch Windows (and their Windows software investment) for the Apple platform, alongside their amazing new iPhones with their beautiful and intuitive GUI. Rumor has it that Snow Leopard was a nice bonus from optimizing Mac OS X enough to run on the hardware that would become the iPhone. The birth of the iPhone may have given us the most efficient Mac OS ever.
It's too bad that the rest of the iOS legacy is the abandonment of said optimization and maybe even the destruction of Mac OS itself.
iOS has become abysmal (bug video demonstrations)
Here is a playlist of iOS 9 bugs that someone put together (thank you). I've reported some myself, many months and iOS versions ago. Apple does nothing. iOS 7 broke so much in iOS that the bug list never shrinks; the developers can't catch up while being driven to add new features (many seemingly designed by marketing, so Apple can announce new versions every nine months).
iOS 2-6 sold me on this device family, and iOS 7-9 have made me wish droid weren't a total pile. So long as Apple knows droid is worse, iOS can continue to become more bloated and buggy without losing much market share.
Share price is all that matters to the "masters of business administration" management and leadership at Apple. The board of directors is not there for the customer. I was willing to give Tim Cook the benefit of the doubt for a while, but I'm now fairly certain he is not capable of being the kind of CEO that Jobs was. They're different people, surely, and I like Cook's public social politics, but Apple needs a strong visionary (hell, this whole fucking nightmare industry needs a strong visionary, and one who's not a fucking geek), not just a supply chain management genius. Jobs was an arrogant and abusive ass, but he also had a very strong and solid vision for the computer industry, with the forceful personality required to bully the board of directors and major shareholders into allowing the company to focus on product quality, where form resulted from function, rather than the current hipster Jony Ive flat/thin minimalist trend bullshit of function being sacrificed for form.
The look of the software is bad, but the behavior is far worse. I can excuse bad visual style in something that works very well. I cannot excuse iOS today. Just try editing text on iOS on pages like Blogger or many forums. If you cannot even rely on text selection working correctly on websites, why would you trust your data to be cared for by iCloud?
https://m.youtube.com/playlist?list=PLVQFi5tAhOgeBCheLWnovZExMXFRhQnu-
Edit: the bugs from the videos that I encounter most are the Safari page previews being wrong (the same goes for task-switching previews, which aren't shown in the video) and the swipe-up Control Center and swipe-down Notification Center spontaneously failing to appear when the appropriate edges of the screen are swiped.
Tuesday, February 2, 2016
Why software sucks (and is there anything you can do about it??)
If I were still into buying books that are merely tomes of rants, or piles of salient facts I already know and have therefore purchased only to throw at other people, I would buy this book:
"Why software sucks, and what you can do about it"
The thing is, books like this aren't aimed at the people who need to learn these very important facts. The geeks writing code won't read it, would scoff at it in their ignorance (calling the author ignorant, because they don't know what they don't know and he does), and would simply reject its lessons. This is why programmers need managers. Programmers come in all different flavors, some of which are even indistinguishable from regular human beings (meaning, they agree with my assessment of how horrific computers are). The problem is that the most extreme programmers, the ones who write incredible DSP code and 3D modeling/rendering packages, generally don't fit into that group (and are usually the ones responsible for the horrors, though marketing has a lot to do with this too). These kinds of programmers make neat tools with shitty GUIs and antagonistic user experiences. They also make for lousy support and service personnel.
The solution is basically to put project managers over these programmers. This is why commercial/proprietary software tends to have better user-experience design and support than the awfulness that is open source (open source projects being composed of programmers beholden to no one, who do the work because they enjoy it, not because it pays their living wages). So if any of you readers (because I'm sure at least one person other than me has accidentally read something here) are project managers for software products, you should already understand everything in this book, and your only real excuse for shitty product is upper management and the marketing team wrecking all your great efforts...
Read any number of Google Books' excerpted pages from this book and you will learn multiple salient facts about why computers are shit... unless you're a programmer that thinks everyone thinks like you (or should think like you). Step one in correcting your distorted perspective is admitting you have this distorted perspective.
"Why software sucks, and what you can do about it"
The thing is, books like this aren't aimed at the people that need to learn these very important facts. The geeks that are writing code won't read it, would scoff at it in their ignorance (calling the author ignorant, because they don't know what they don't know and he does) and simply reject its lessons. This is why programmers need managers. Programmers come in all different flavors, some of which are even indistinguishable from regular human beings (meaning, they agree with my assessment of how horrific computers are). The problem is that the most extreme programmers, the ones that do incredible DSP code and 3D modeling/rendering packages, generally don't fit into that group (and are usually the ones responsible for the horrors, though marketing has a lot to do with this too). These kinds of programmers make neat tools with shitty GUIs and antagonistic user experiences. They also make for lousy support and service personnel.
The solution is to basically put project managers over these programmers. This is why commercial/proprietary software tends to have better user experience design and support, compared to the awfulness that is open source (because open source projects are composed of programmers beholden to no one, who do the work because they enjoy it, not because it pays their living wages). So if any of you readers (because I'm sure at least one person other than me has accidentally read something here) are project managers for software products, you should already understand everything in this book, and your only real excuse for shitty product is upper management and the marketing team wrecking all your great efforts...
Read any number of Google Books' excerpted pages from this book and you will learn multiple salient facts about why computers are shit... unless you're a programmer that thinks everyone thinks like you (or should think like you). Step one in correcting your distorted perspective is admitting you have this distorted perspective.
Thursday, January 21, 2016
Another great rant about Apple destroying design
"Minimalism in software is achieved by simplifying feature sets, not stripping away pixels. In software, affordances are everything. And all affordances are made of pixels. It’s not minimalism to rip away the very things your users need. It’s sadism.THE DIRECTION OF IOS 7, 8 AND 9 IS SIMPLY WRONG.This is not an aesthetic argument. It’s wrong based on 40+ years of computer-human interaction research. It’s wrong based on 30+ years of Apple HIG."
Go read "Destroying Apple’s Legacy..."
PS: Hey Apple: WHY does text selection on websites suck unbelievable amounts of ass in iOS??