Thursday, September 1, 2016

I just read a fantastic series of articles that, unfortunately, showed me that things are even worse than I realized with regard to the business of design. Instead of summarizing it here, I'm just going to point you to the original content. It's in multiple parts and I recommend all of them.

Wednesday, July 20, 2016
Apple's continued success, despite lousy product design
Surveys and articles like this continue to make it look like focusing on low-effort consumer products (even to the detriment of content-creator markets) and ugly, research-defying design is working out just fine.
Maybe my fellow "believers in actual UI research" and I really are just a tiny minority, doomed to suffer the tyranny of fads and apathy (rather than everyone getting the benefits of informed design). If the majority doesn't give a damn about readability and functional elegance, then there's no serious loss of money for companies still pushing hard-to-read, flaky, flat junk that requires less investment to produce.
Did the industry wean people off quality design, or would consumers have embraced iOS 7 just as strongly in 2007 as they embraced iOS 1?
Back to design as a visual thing... We have the cult of "geek chic". These are the tech people who support technology because it's new, not because it's actually great. These are the people who pursue change for the sake of change, as if a new presentation were equal to a revolution. The easiest way to promote a technology as new is to change its appearance. Very little engineering is required for a quick bump in consumer attention. Microsoft has made an art out of changing the package design and offering the same product with little functional improvement (over a couple of decades, actual improvement is certainly notable, but not as much as it would appear on the surface).
Trying to explain the irrationality of it is like trying to explain the placebo effect to people who are unwilling to learn the reasoning and process of the scientific method while they swear that homeopathy has made them feel better. Yet tech people are supposed to be more scientific... Aargh. Myths!!
Fads don't stay new, but belief is very difficult to shake. The human brain seems wired to reject data that contradicts belief. So the decades of data rot away in boxes while the fads cycle on.
The weaning notion has historical precedent: the computer industry has a long history of forcing difficult-to-use gadgetry onto a populace because there was no alternative to clunky tools for doing new things (however painfully). Society adapts and becomes normalized to bugs, EULAs, and generally terrible design because not having the tool is more undesirable than the discomfort of actually using it. The leg up that businesses and people get over competitors who refuse bad design speaks volumes about capitalism: "use this crap or perish".
In 2007, the iPhone was the alternative to the miserable, yet standard, computer industry junk. Mac OS was, at the time, not taking the industry by storm, but it was continuing to slowly erode Microsoft's dominance of the '90s. Along came the iPhone, shocking the world into realizing tech sucked, because this new, easy-to-use (and beautifully designed) OS and hardware made the Internet a truly accessible resource for all (who could afford to buy one; interestingly, a huge number of people found a way to afford one). No geek cred was required.
Now that the competitors have caught up (mostly by aping Apple's researched design choices), there's no further need to continue to cater to ease of use; the pushing and shoving can resume. With Apple's design language now seemingly adopted (surface features only) by everyone (even used to create "geek chic" glitter on non-technology product marketing), Apple apparently wanted to differentiate itself again.
It did that by... trashing everything it had already done, design-wise, and adopting a fad already in progress thanks to the competitors Apple had recently and seriously bloodied. From desperation and arrogance come bad decisions.
This isn't just about appearance; functional design is a critical part of design. Steve Jobs refused to take any blame for customers "holding it wrong", but the design was flawed and Apple gave free cases to correct the signal attenuation caused by ... holding the phone. Now the antennas are placed differently (but we still need cases; see below).
After Jobs, Apple made phones so large (and so thin) that one-handed operation is uncomfortable or even impossible for some people. Apple didn't give consumers a 4" option again until after a few generations of awkward product. It was a purely MBA-expert-driven choice to sell larger phones, despite ergonomics showing that larger was not better. Stemming the media abuse (and the consumer self-abuse of buying as large a phone as possible from Google partners) by making a larger phone was sensible. Making it the only option was not. Oh, there were options: too large and much too large.
Apple finally released a new 4" phone, after storms of criticism. "Surprise! 4-inch phones are selling well!" becomes the new news.
Back to the software: Apple has soiled ease of use with lousy visual design. It irritates me to no end. However, it would be tolerable if the tools did what they are supposed to do, reliably. They don't.
Apple has introduced countless behavioral problems into the product lines, starting in the iOS 7 era (late 2013). There was apparently a lot of UI code that needed to be rewritten to change the UI so dramatically. We've not seen the end of that even today with the final revision of iOS 9.3 (July 2016). It's not just the visual glitches, and it's not just on iOS.
There have been several cases of feature regression in Mac software. Final Cut Pro is a well-publicized case; I'll leave you to google to learn more about that. Less publicized was the damage done to iWork.
Getting iWork to run on iOS seems to have required totally new software. The apps on iOS are feature-incomplete. Instead of bringing them to feature parity with the more mature and robust Mac OS versions, Apple chose the course of back-porting the iOS versions of iWork to the Mac. They now have feature parity because the stronger offering on the Mac was crippled to match the iOS version. Could Apple have made them both just as functional as Mac OS iWork was in the '09 edition? Absolutely, but that requires developer attention (money). With a limited number of developers at Apple, any attention expended on iWork is less attention spent on getting the next iOS release onto the market, and on the next iteration of pushing customers to buy a new device. iWork languishes.
Its dedicated users have either refused to upgrade (including avoiding upgrading their OS or computer) or abandoned it for Microsoft's Office suite (and some who did the latter have moved to Windows... or back to Windows). Apple's response? Promote Microsoft Office.
Then there's iCloud. The iCloud feature bullet points are mounting, and a lot of those features are conceptually sound (shared clipboard being the best example to come from the WWDC 2016 announcements), but the execution is inconsistent and the results are unreliable. I continue to get duplicate Notes for no user-caused reason. Note synchronization was one of the earliest features of iCloud, yet this still happens.
Worse, my devices aren't all communicating equally with Reminders. The alerts for to-do list items appear on all three of my supported devices, because each device runs the necessary service locally. Marking items as complete, however, is a different story. If I mark an item as complete on the iPhone, the iPad Pro shows it as overdue. The reverse also occurs. Which device has "the truth" is inconsistent, but it's usually the one where I made the change. The same goes for deleting items. I found that I can force the lagging device to go ask iCloud by entering the iCloud settings and viewing my account (that triggers a login request, and the login forces a data synchronization).
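That symptom smells like record-level "last write wins" synchronization, where the most recently modified copy of an item silently replaces the whole record. This is a guess on my part; iCloud's actual protocol isn't public, and every structure and timestamp below is invented for illustration. A toy sketch of the failure mode, and of the field-level merge that avoids it:

    # A toy model of why "complete here, overdue there" happens under naive
    # record-level last-write-wins. All structures & timestamps are invented.

    def lww_record_merge(a, b):
        # Whole-record last-write-wins: the newer record wins outright,
        # discarding any fields only the older record changed.
        return a if a["modified"] >= b["modified"] else b

    def field_merge(a, b):
        # Field-level merge: keep the newer value of each field independently.
        return {k: a["fields"][k] if a["stamps"][k] >= b["stamps"][k]
                else b["fields"][k] for k in a["fields"]}

    # The iPhone marks the item complete at t=100...
    iphone = {"modified": 100,
              "fields": {"title": "Pay rent", "done": True},
              "stamps": {"title": 10, "done": 100}}
    # ...but the iPad touched the title at t=105 (or simply has a skewed clock).
    ipad = {"modified": 105,
            "fields": {"title": "Pay rent!", "done": False},
            "stamps": {"title": 105, "done": 10}}

    print(lww_record_merge(iphone, ipad)["fields"])  # done: False -- completion lost
    print(field_merge(iphone, ipad))                 # done: True  -- both edits survive

Whatever the real mechanism is, a merge that can drop a completion flag on the floor is the kind of thing a sync service has to get right before anything else.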
We shouldn't be required to figure out workarounds. It should just work. That's Apple's old mantra. Workarounds are Microsoft territory. Didn't we Apple users become Apple users to avoid that crap?
Why is it like this? Money. It costs less, and makes more profit, to move on to the next product ASAP, rather than make the existing one work correctly. The computer industry has successfully trained consumers to believe that bugs are an inevitable and unavoidable necessity of computing. This is a meme and a lie, not a fact. Properly executed, software can be made reliable enough that problems are very rare.
The most serious issue with this isn't in minor annoyances. It's in the back-end of iCloud. If these sync problems are never fully worked out, why do we trust Apple with our data in the first place?
Apple today has the corporate culture of most huge businesses: admit no failing. That was Steve Jobs' thing: admit to mistakes by blaming the engineers working for him (except he was apparently so married to the design of the iPhone 4 that he chose to blame the customers instead; proof that he was no technology god). It was mostly in-house, but it always seemed to leak to the media (no such stories with Apple's current leadership, which raises the question: does Apple's leadership pay attention to its products, or are they just mindless consumers themselves?).
Maybe the engineers were only following Jobs' lead. Maybe they "didn't get it". Either way, it was a pressure relief valve for avoiding getting stuck in the sunk cost fallacy (see the current visual design of iOS). The poor Apple Newton finally got it mostly right at the end, but Jobs killed it because the damage had long been done and Apple was still losing face (and money, so the official story goes).
Who at Apple has the authority and the wisdom to do the same with the current design and execution trends?
This is one of the oldest tricks in the book of capitalism: redesign the packaging and catch market attention again... until people get used to it and you have to do it again to elicit the same spike in attention. Repeat chorus. Psychological studies have already noted this as a side effect in industrial efficiency experiments: so long as the subjects are aware of a change being made, they behave differently, because they think they should. Their changed behavior drops off after a while. That's when you should measure results, no earlier.
When people respond to new package design, the managers who demanded it say "see, I told you that new design was better". No. It is just different. In fact, it might be much worse, but if you're not going to stick around and do real studies, you'll be able to preserve your ignorance and ego.
So Apple changed the package design and the average consumer response was positive: "yay change!" The articles about the consequences of the design changes, written by educated experts in human interfacing, go ignored by Apple and Apple doubles down on the flat ugliness, bringing it to the Mac too.
It wouldn't be so bad if appearance were all there was to it, but major functionality deficits appeared with that new package design. It might not be a direct consequence. It might be coincidence that the visual design came at the same time as architectural changes that created all the problems. One way or another, the confluence sticks out as a concentrated failure of leadership.
I wonder: how long will it take for hipster geeks to get over their 2013-era boredom with detailed visual design, and get bored with the current obsession with flat, low-contrast, featureless dullness? (They call this "clean", because every irrational fad needs a marketing term, just like music equipment fetishists use the term "warmth" to justify analog equipment and valves.)
To call research-backed quality design "dated" is an irrational, uneducated opinion; all such people care about is a sense of newness, a "fresh image". The seemingly logic-oriented technology geek community has many members subscribing to this illogical judgment.
Hipster tech geeks. What an historically ironic concept.
This happens in every field of human occupation. My focus here is on the computer tech industry, so I talk about this behavior from that context. An automotive designer or a kitchen appliance designer might scream about the same problem in their own industries. A medical professional might scream about fad diets and unproven drugs getting undue credibility (and they do).
Lording over all of the designers and scientists are the same people, though: the business administration "experts" (who are apparently just "mindless consumers" themselves). So long as the few at the top are prosperous, the rest of us will have no choice but to stay the course. Supporting current business trends is what keeps people employed (though at lower wages, and with less demand for actual training and knowledge, because cheap employment is an obsession with MBA "experts" and shareholders, no matter how destructive these obsessions are to their society).
Eventually, something disrupts the status quo. Apple itself is an impressive historical example.
Twice.
We're waiting, Apple. Do you no longer have the people and leadership with the vision?
Maybe someone else does. They just have to wait until the status quo becomes "old enough" (and painful enough) to upset it by moving a large portion of the market to something comparably more cozy and pretty.
Until then, society will just continue along as if there's no dissenting opinion of merit. Just as environmentalists have been screaming about our destruction of delicate natural support systems for ages...
Wow, human beings are stupid.
Labels:
Apple,
bugs,
capitalism,
computers,
design,
ease of use,
flat,
flaws,
GUI,
ignorance,
iOS,
iPhone,
Jony Ive,
operating systems,
Steve Jobs,
Tim Cook,
ugly,
usability
Tuesday, February 1, 2011
Photoshop Owns Me
[this is the rant that inspired this blog's existence...]
It's amazing how one tool's methods can become a dominant paradigm in "how to" for everything i want to do. i think Photoshop is pretty logical, but surely there is an amount of "i'm just used to it" going on. i'll be generous & assume that there could be alternate ways of doing some concepts we're familiar with in Photoshop (how to manage layers & selections, etc). Some things are OS-dependent, like cut/copy/paste (& it really gets up my nostrils when apps do not obey OS conventions for stuff like that).
So... i've been watching training videos about Lightwave & ZBrush... Every tool, or manipulation of a tool, that has any analog to something in Photoshop gets judged by me as either being "sensible" or "stupid" (or, in ZBrush's case, as stated by a person on a forum: "created by crackheads").
This is where user customizations can be very helpful. Ideally. If you KNOW the tool already, yes, go for it. Customize it to suit YOU. If you're learning the tool for the first time, forget about customizing. You have to get through the learning curve all the way before you can start taking charge of the UI. Tutorials & documentation are based on factory defaults. This means that customization means nothing to new users of a tool. It's a tease.
There is a concept i really like among some DAWs. The idea is that the creators have realized a user might have become very used to working with one brand but would like to move to another. For example: you may want to move from Cubase to Sonar. Sonar's keyboard shortcut configuration lets you do "sets." Theoretically, you just choose the "Cubase Set" & any common function between Cubase & Sonar is mapped to keys you're already used to. Ok, that's sensible. Some software even goes as far as creating color sets with similar intent. Nice.
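A keymap like that is just data, which is why it's cheap for a developer to offer. A minimal sketch in Python of how a "set" could work — every command & key name here is invented for illustration, not taken from Sonar or Cubase:

    # Hypothetical keymap "sets": map abstract commands to keystrokes so a
    # switcher can keep the muscle memory of their old DAW. All names invented.
    KEYMAP_SETS = {
        "native": {"transport.play": "Space", "edit.split": "S", "track.mute": "M"},
        "cubase": {"transport.play": "Enter", "edit.split": "X"},
    }

    def resolve(command, active_set):
        # Fall back to the native map for commands the borrowed set doesn't cover.
        return KEYMAP_SETS[active_set].get(command, KEYMAP_SETS["native"][command])

    print(resolve("edit.split", "cubase"))   # X -- the key a Cubase user expects
    print(resolve("track.mute", "cubase"))   # M -- falls back to the native binding

The fallback is the whole trick: common functions get the familiar keys, & everything else keeps working with the tool's own defaults.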
Sadly, very few product makers consider this concept. They want you to use THEIR tool, only & forever. You're to learn THEIR methods, or nothing. What's worse are the complex concepts. Sure, you can map keys easily enough to similar commands, but what if you need to work with something like content selection or layers?
In image editors, Photoshop has definitely dominated the landscape. If you can deal with layers & selection in Photoshop, you can pretty much walk right into Painter or Paint Shop Pro without much ass-scratching. i even remember when Painter was changed so that the layers feature (the way the GUI looked & behaved) was specifically changed to be like Photoshop. Before that, Fractal Design did the same thing to make Painter's selection tools work like Photoshop. It was smart because the two programs are highly complementary & co-exist very well in an artist's workflow. It's sensible to make them behave similarly. But, again, the developers have to give a shit about users in this way. Painter's creators were motivated by market share & expansion of user base. What if a product maker has no care in the world? What if they have had their own way of implementing a pretty common tool in a pretty unique way for the last ten years? They can either change or ... not change.
Software that started as non-Windows or non-MacOS software is pretty bad in this regard (same with Linux-land, but it's even worse there & i won't talk about it here because you're probably already wondering why you're reading this). Before there were established norms on how to do things with predefined GUI elements, people made their own methods & systems. These programs were eventually adapted to work in standardized windowing GUI environments like Windows or Mac OS (or they died out). Some of them literally redesigned their GUI to use the new environment to its best native advantages. Others... like Lightwave... stayed exactly the same damn way as always because they felt their methods were efficient & sensible.
They might be sensible ways... or might have been... back before GUI design was largely standardized. Back before computer users became accustomed to a set of common conventions. Things like Lightwave continue to rely on extremely heavy keyboard shortcut memorization. Its environment (windows, controls, objects, etc) is still rather non-visual, especially in workflow. They haven't adapted to the new windowing GUI concepts. These concepts aren't even "new" any more. Yet, they're still being dismissed as "cluttery" & "just eye candy." The reality is somewhere in between the extreme viewpoints.
Then there are things that are developed with entirely custom GUIs anyway, despite always having been inside a windowing GUI OS. The idea is: "If we create our OWN GUI environment, we don't have to do anything differently to have this program on Windows & MacOS." Ideally. Ok. Once again, i turn to Photoshop as the sanity example. Photoshop has always used as much of the OS's appearance & behaviors as possible, even though the GUI is custom. It has tended to use the "best of breed" concepts when Windows & Mac OS differed (sometimes giving more credit to "the Mac way," as that was the first OS Photoshop appeared on). Photoshop LOOKS & BEHAVES like you expect most programs to look & behave. From program to program, things are very standardized, on the OS of your choice (that huge choice between two). It's cross-platform AND it's familiar to you, regardless of that OS choice (but not in SPITE of it, like Carrara). Smart. There are plenty of examples of this. Sadly, in the graphics world, there are plenty of examples of criminal behavior in this regard.
Carrara. Lightwave. ZBrush. To name a few recent mindfucks.
Lightwave at least can blame its birth on the Amiga as an excuse. A thin excuse, but at least it has its reasoning. Carrara is just a reinvention of the GUI wheel for the sake of arrogance (i can't really find a justification for it to look & behave any differently from the OS API other than the simple desire to make it stand out & be "pretty" in its own "unique" ways). ZBrush... Also no such excuse.
ZBrush wasn't even cross-platform at first. Windows-only. Some of the choices were to make it quick to use with a stylus (instead of a mouse), but it really boils down to the fact that the original developer created something for himself, reinventing the wheel whenever he added something that was kind of expected in graphics apps (like layers & selection tools), but he did it to service HIS way of working. You can see how he was probably not at all accustomed to Photoshop & its brethren when he started down this path... or he chose to spite them. Afterward, probably accidentally, ZBrush stumbled onto marketability... then years of new, added-on development. All the while never changing those "unique" methods that are so contrary to what users from most other graphics apps are used to. Hell, it wasn't even a 3D tool; it was a "unique, 2.5D" painting app (3D was shoehorned onto it, yet that's become ZBrush's primary function; the evolution & original focus show badly in the overall product design).
This is "crackhead" software design. i didn't coin it, but it's fun, so i'm sticking with it.
Lightwave is being rebuilt by NewTek so that it steps into a modern GUI world. Rough task, since they have to attract new users in the world of 3D, which is quickly passing them by... AND retain their hard-core, loyal, rabid fan base of existing users who are quite used to the current design. No longer will it be two separate programs (Modeler & Layout). It'll have modern ideas on how to implement tool functions... etc. It's got a shiny new marketing name called "CORE" (yes, all capital letters, because that's somehow cool). The problem is, they've been at this for years & it's still not even feature complete for a first version, let alone a BETA. But they sold it to the hard-core users anyway. Yeah, CORE is aptly named. People got in on some crazy subscription-like service... "buy in now & get access to advanced builds, etc..." blah. Fuck that. i'll never see it. It'll take forever to be released & its first official release will probably lack 50% of the functionality of current Lightwave. They're planning to sell Lightwave in two forms concurrently for some time because of this transitional period: CORE & Classic. i won't be getting it because i'm busted-ass broke.
That means i'm having to learn, all over again, many concepts that i have become so used to: layers, selection tools... just to learn Lightwave. These ideas should cross-pollinate. There should be a shallower learning curve. Nope. It's all "unique." i have to do this TWICE, in different ways, for i'm also trying to learn ZBrush. Without very specific projects, goals & handholding (short of forums, i have no people to talk to), i'm pretty much floundering, flailing, stumbling & cussing my ass off. And creating blogs to rant on.
It's not that Adobe is god or made "THE ONLY RIGHT CHOICES;" they've simply made a lot of effective/good choices which have been adopted by the majority of other developers as a "standard" way of doing things.
Oh yeah... since i'm talking about 3D tools... we have to rant about the "crackheaded" 3D tool of all crackheaded 3D tools: Blender. Not only is it off on "planet independent arrogant asshole" in its core design, it's open-source freeware, originally created on Unix/Linux. It's feature-rich, but clumsy, ugly & so independently-minded that you have to re-learn everything you may have learned in other software (except for basic concepts common to all 3D tools, such as "this is a vertex, this is a polygon, this is a bump map...") AND you have to deal with the fact that it was ported from a GUI-insane OS to two other OSes that are standardized. Like Carrara & ZBrush, there's NOTHING native in the GUI of Blender on ANY OS. This is largely because a Unix barely HAS a GUI to begin with. Every *nix app with a GUI historically had to create its own API. The X Window System isn't really a GUI. X sits on top of *nix the way Windows used to sit on DOS. Then, on top of X sits the desktop environment, which, in addition to providing a file system browser, provides a common API for applications. It's utter insanity in the *nix world, no matter how many zealots tell you otherwise. But i said i'd not talk about that.
So Blender developers finally decided to do something about the ugliness & insanity... they rebuilt the GUI in version 2.5. i tried it. Hey, there's better usage of screen space, it's more readable, there are meaningful (& nice) icons (NewTek fanboys like to claim that Lightwave's more efficient than the competition because it's not "cluttered with icons..." no, instead you have to memorize hundreds of text commands & keyboard shortcuts that you cannot discover on a perusal of the GUI). So, Blender's becoming easy to look at & very customizable... BUT, in standard Linux open-source mentality, it goes too far with customization. The ways of operating the customizing controls are inconsistent with ANYTHING else (drag & drop is about as much commonality as they maintain). Forget anything Painter, Adobe or the DAWs taught you. In Blender, you don't have tabs & palettes. You have movable views and panels that are controlled by tiny handles that look like resize handles (but aren't). It's too easy to accidentally tear a palette or window off into an unwanted duplicate or scroll or resize it (yeah, you can scroll or size a panel in useless ways - it's up to YOU the user to make sure you don't drag too far & there's no automatic adjustment to ensure the minimum space is used to view all the controls without them being clipped).
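What i want from those panels is boring & simple: clamp the drag to what the content needs. A tiny sketch of the idea (my own invented function, not anything from Blender's actual code):

    # Clamp a panel drag so a panel can never shrink below what its controls
    # need, nor grow past the window. Invented function; not Blender code.
    def clamp_panel_height(requested, content_min, window_max):
        return max(content_min, min(requested, window_max))

    print(clamp_panel_height(12, content_min=140, window_max=900))   # 140 -- snapped up
    print(clamp_panel_height(400, content_min=140, window_max=900))  # 400 -- honored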
So, it's prettier & more ... approachable... until you get to MOUSE controls. It seems they touched nothing here. Overhauling & making things more user-friendly in Linux-land = prettying surface features but not re-thinking much of the core ideology. Do it original. Don't mimic established norms. Except for when you do (Linux is a varied land of not-quite-carbon-copying of everything, including competing or mutually exclusive concepts... a nightmare mishmash of every graphical OS ever made... the bigger names have made huge inroads on cleaning this up, but it requires you to keep yourself isolated to software that uses a particular desktop API). It's amazing how many developers STILL seem to misunderstand what "primary" & "secondary" mouse buttons really mean & mix up the common action with the uncommon control. Oh, & my favorite: the activation of a control on the mouse_down event instead of the mouse_up event (important if you've ever stopped an accidental click halfway through by dragging off of the control before letting go of the mouse button).
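For the record, the convention being violated is simple: a control arms on mouse-down but only fires on mouse-up, & only if the pointer is still over it. That's exactly what makes the drag-off escape work. A minimal sketch (invented class & event names, no real toolkit):

    # Minimal button state machine: arm on mouse-down, fire on mouse-up,
    # & only if the pointer is still over the control. Invented API.
    class Button:
        def __init__(self, on_click):
            self.on_click = on_click
            self.armed = False

        def mouse_down(self, inside):
            self.armed = inside        # arm only if the press started on the control

        def mouse_up(self, inside):
            if self.armed and inside:  # fire only if released while still over it
                self.on_click()
            self.armed = False         # releasing anywhere disarms

    btn = Button(lambda: print("clicked"))
    btn.mouse_down(inside=True)
    btn.mouse_up(inside=False)  # dragged off before release: nothing fires
    btn.mouse_down(inside=True)
    btn.mouse_up(inside=True)   # a real click: prints "clicked"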
Just in case you're willing to learn this new tool because of how free it is, you should know that the developers have cleverly removed the radiosity lighting model from the new version of Blender (apparently so they could "re-factor" the surfacing & lighting features' code). It's a back-step. Back to convoluted, non-real-world workarounds & faking.
crackheads.
i'm finding that i want Adobe to start up a side business that actually has value (instead of all the pseudo-useful sidetracks of webby content library/modification/sharing/marketing tools stuck into their product lineup): making crackhead software act like "industry standard" software. They could make billions from companies whose crackheaded GUIs keep new customers beyond arm's length...
Ok. i'm done. This has been my distraction from the maddening process of learning another tool i don't know if i'll ever really use (mostly because i can't seem to get past the hurdles of learning it WHILE running into bugs & totally unintuitive technical geekoid bullshit required to mimic reality with insane simulations & a bajillion numerical settings on physics concepts no artist gives a rat's fucking asshole about). As such, it's opinionated & maybe somewhat factually inaccurate. If you post comments taking me to task on how wrong i am about how right your tool or OS is, take note: i might utterly ignore you.
Labels:
Blender,
Carrara,
crackheads,
design,
ease of use,
GUI,
Lightwave,
Photoshop,
ZBrush