Tuesday, July 12, 2016

OS "upgrades" degrade performance

Planned obsolescence. It's not a myth or a conspiracy theory. It is the basis upon which the computer industry makes money each fiscal year by selling you the same product repeatedly. Oh yes, many other industries do the same, but I'm going to talk about the one that is most relevant to me.

Each year, computer makers try to sell you the same thing you bought last year, with marginal increases in memory and CPU speeds. 

Marginal change? What happened to that racing progress and Moore's law? 

That's over. It's getting very hard to produce the kinds of gains seen through the 80s and 90s. Physics is a bitch. Well, not a bitch. It's just nature, and the way we build counting machines is running into physical limits.

So what incentive is there to buy merely marginally improved computers? To accommodate the bloating of software, most notably the operating system.

Have you noticed that Apple now provides their operating system upgrades for free? How nice of them. Then again, they sell hardware. It's in their best interest to pack in new features that inspire new hardware purchases. It's also in their best interest to make some of those features available only to owners of new hardware. They're in business to make money; okay, fair enough. So we can just choose to stop buying new hardware for a while if we're strapped for cash, right?

Well, maybe. If you don't want the new features of the new OS versions that might slow down your existing hardware. But maybe they've promised you a bug fix that won't be put into the existing version because they're busy working on the latest one. Or maybe your existing laptop will simply die of heat-related materials fatigue.

Apple's previous-generation Mac Pro became well regarded for being upgradable and providing many years of use without replacement. If certain components failed, they were often easily replaced, or even upgraded with third-party products. Notice that doesn't happen any more?

Monolithic construction with glued-in batteries and soldered-on memory and CPUs... Portability and design elegance explain it to a degree, but the obsession with thinness has become seemingly pathological at Apple. Disposable computers are not environmentally friendly, regardless of how many toxic materials you remove from their manufacture. (And the elimination of lead worked out nicely for computer makers: consumers now have to replace laptops dying from heat-related materials fatigue far sooner than ever before, and the manufacturers will never pay out for that design flaw.)

What about the average consumer who never cared about modular, "upgradable" computers anyway? (And honestly, upgrading was never what it was cracked up to be; by the time you wanted a faster CPU or more memory, the rest of the market had abandoned your architecture and you were forced to buy a new system anyway.) How does the industry incentivize the average consumer to buy their phone or laptop again when they're perfectly content with what they already have? (Contrary to popular geek opinion, it's not end consumers screaming for features; it's mostly the geeks and professionals, who currently make up a tiny fraction of Apple's customer base.)

By giving them something for free! "Ooo, free upgrade!"

Making the OS free, and convincing the user they're getting more functionality and convenience from that free upgrade, earns emotional kudos from the user. Letting them experience creeping performance losses over time, where the change is a slippery slope rather than a sudden shock, makes it much easier to inspire buying "a faster one" in a year or two. Otherwise, people would mostly just keep using what they have, and Apple would only make rapid repeat sales to the people who can't manage to keep a piece of glass intact longer than six months at a time.

Each new version of iOS is slower than its predecessor. Don't believe me? Go watch the video: http://bgr.com/2016/02/16/apple-ios-9-vs-ios-6-performance/

The only reason I'm typing this on an iPhone 6s is that the number of apps (and websites) usable on my iPhone 4 with iOS 6 was approaching zero. Apple no longer maintains Safari as an app on older iOS versions the way it once did with Safari on older Mac OS versions (dropping Safari support is another way they've shortened the life of Mac OS versions).

The Internet is no longer about HTML. Now it's about bloated pages with unnecessary scripting and popup advertising. Web developers require features that Apple doesn't add to old versions of Safari, and their pages are just plain bloated, so an old device with new feature support wasn't the answer either (and there's no third-party browser to fall back on).

iOS 7 would've degraded the usability of the phone beyond my tolerance threshold (and it's an ugly mess; even uglier without translucency on older phones). I still use my iPhone 4 as a music player and for various abandoned apps. Every time I use it, I marvel at the beauty of the GUI design in iOS 6, which is utterly gone from iOS as of version 7. But this isn't about the ugliness of iOS 7-9. I've plenty of other posts about that. This is about performance and "upgrading". Each version of the OS is slower than its predecessor. This is fact.

Geeks will come pouring from the cracks of the Internet to justify this and proclaim me ignorant of basic geek-cred knowledge. "There are hundreds of new features!" they'll shout. Yes, indeed, that's true. But who asked for those features, and why must they degrade performance even where they're not in use? Why does it make sense to consume CPU cycles (and battery) on things many users might never desire, in support of features not currently being called to task? Why, for example, are the keyboard, app loading, and so on slower when the user isn't touching any of those "hundreds of new features"? Poor software architecture, I propose, is the answer. But laziness, or a lack of skill or time, isn't really the core problem.

Throughout my decades in and around the computer industry, I've observed many tech geeks and developers promoting tolerance for software bloat by declaring "buy a new computer!" Not all developers. Plenty of programmers are appalled by this, just as I'm not the only person screaming that flat design is horrible bullshit. But the excuses are out there in play, and they're very much the majority.

"Your computer is older than god; buy a new one and stop bitching about the software that developers are making for you!"

This is entrenched dogma and it benefits the hardware makers. Why spend the time and money to optimize code in your own software when its degrading performance benefits your business by inspiring users to buy newer hardware from you?

The same performance degradation can be found in Mac OS. Snow Leopard boots within a few seconds on my MacBook Pro 5,5 (mid-2009). Mavericks takes minutes. The same difference can be seen with application startups and Safari usage, and there's considerable latency and spinning-beach-ball activity on various tasks in Mavericks (yes, I upgraded the RAM to 8GB already). I'm okay with it when I've not used Snow Leopard for a while, but when I switch back to Mavericks from Snow Leopard, it's egregious.

Mavericks is a decent OS all the same, and was what Lion and Mountain Lion failed to be. Hell, we even LOST features in Lion (Rosetta), so Lion should've been a speed boost, right? Ha ha ha... But still, the overall experience has suffered. Has the gain compensated for the loss? It doesn't feel that way, especially when the loss of Rosetta and the Logic 9 bugs acquired with Mavericks keep me in Snow Leopard most days. I'm not a Luddite. I'm just poor. If I had endless cash, I'd upgrade all the software and hardware that keeps me on Snow Leopard when I'm working on audio projects.

I'm pretty sure the features causing this slowdown are related to iOS integration; those are the major differences between the promoted features of the two systems (the iOS integration push really started with Lion, but Lion and Mountain Lion were unfinished versions of Mavericks, IMO). El Capitan, whose feature changes are less dramatic, isn't noticeably slower than Mavericks (but El Capitan's Safari bugs and ugliness keep me away).

There's an end point not under my control. Mac OS Sierra looks like a useful, if underwhelming, addition of features, some of which interest me (shared clipboards). I won't be seeing it any time soon, though. Sierra has left my MacBook Pro 5,5 in the dust. I will not be able to upgrade. Whatever Apple added has either finally made even Apple embarrassed by the performance degradation... or they just want to stop supporting the hardware.

Fair enough. The supposed benefit of limited hardware support is greater reliability and stability. However, the problem isn't the lack of support. It's that a system claimed to be supported is severely degraded by "upgraded" software, and that it gets progressively worse version after version. And that's just performance, not even counting the PLETHORA of bugs and glitches. If they claim to support a machine, the performance should be equivalent. At the very least, there should be a warning prior to the moment of upgrade that informs the user of this potentially undesirable result, giving them informed consent before committing to an irreversible act. There is no such warning. Worse, Apple supplies propaganda to users of old systems claiming the opposite: my Mavericks system regularly gets unwanted popups declaring that I can "improve the performance of [my] computer" by upgrading to El Capitan. I've run El Capitan on this machine. That statement is a lie.

I'll fully admit, El Capitan runs well under the circumstances: Mavericks runs on an internal drive, while my El Capitan test volume was an external USB 2 drive, and it still ran rather well considering. But the Safari bugs were unacceptable. The loss of hardware support was unacceptable, and there's no warning of that possibility. It's left up to the user to discover, and some are far better equipped to deal with it than others. Average end users certainly aren't.

What hardware losses? El Capitan added System Integrity Protection. SIP is a good idea and I support it. However, it has the consequence of disabling many third-party drivers on many people's systems, and many developers of such hardware chose not to update their drivers (because they too wish to sell me the same thing a second, third, and fourth time, despite the hardware in question still operating perfectly well with the supported drivers on the supported OS, which is itself another case of planned obsolescence).
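For anyone curious whether SIP is the reason an old driver stopped loading on their own machine, Apple ships a `csrutil` command-line utility with El Capitan and later. A quick check from Terminal might look like this (a sketch of the inspection step only, not a fix for an abandoned driver):

```shell
# Check whether System Integrity Protection is active (OS X 10.11+).
# Typical output: "System Integrity Protection status: enabled."
csrutil status

# SIP can only be changed from the Recovery partition's Terminal
# (reboot holding Cmd-R); it cannot be toggled from a normal session:
#   csrutil disable    # lets older unsigned third-party kexts load again,
#                      # at the cost of the protection SIP provides
```

Disabling SIP is a workaround, not a solution; the real fix is an updated, signed driver from the vendor, which is exactly what many of them declined to ship.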

The bloating of software cannot go on indefinitely. Moore's law has an end point, and we've been standing very close to it for some time. CPU speeds have barely increased at all in many years. More memory is still forthcoming and will always be welcome (frankly, we ought to have solid-state flat storage for all uses, not the continued separation of long-term, short-term, and virtual storage, but the tech still isn't quite there yet, and it takes many software engineers seemingly forever to realize when an age-old "that's the way it's done" design needs to be swept aside... like shared libraries...).

More memory, however, will not solve the problem of wasted CPU cycles on activities the user knows nothing of (nor should they be expected to). The result is serious slowdowns for what amounts, from the user's perspective, to no good reason. Perception and experience are what matter here, not technical excuses. However logical the explanations, they're just justifications for bad tool design in service of an unethical capitalist gravy train. Capitalism should be an economy, a system of exchange. It should not be an antisocial institution that slowly destroys everything around itself (such as society) while consuming all available resources to make a few people super wealthy.

How many cores can we cram into a CPU package before power usage and heat put a crimp in that line of "improvement"? With software so terribly optimized, multithreading still not pervasively utilized (and some tasks that cannot be broken into threads to begin with), it seems to me that the constant bloat has an inevitable end point. Nothing is perpetual (and perpetual growth is usually cancer).

At some point the industry will have to get off the gravy train and start optimizing code and offering fewer new features. They will have to find some other inspiration for selling the same thing to their customers again. In fact, the best feature I can imagine being offered right now is "optimized for speed and efficiency, with less storage space and memory consumption".

Remember the last time Apple marketed that at us and it meant something?

That was Snow Leopard. 

You know, the OS that made Mac OS a serious contender and led many people to ditch Windows (and their Windows software investment) for the Apple platform, alongside their amazing new iPhones with their beautiful, intuitive GUI. Rumor has it that Snow Leopard was a nice bonus from optimizing Mac OS X enough to run on the hardware that would become the iPhone. The birth of the iPhone may have given us the most efficient Mac OS ever.

It's too bad that the rest of the iOS legacy is the abandonment of said optimization and maybe even the destruction of Mac OS itself.
