Friday, October 12, 2012

Accessibility: Not Just For People with Disabilities

This is an old post from my old blog. It belongs here.

How many accounts do you have?
  • Banks (savings, checking, ATM, etc.)
  • Credit cards (more than one? More than four?)
  • Car care clubs
  • Insurance companies (multiple?)
  • Mortgage(s)
  • Maintenance service contracts (car, home, appliances, etc.)
  • Doctors (yours, your kids' and your pets')...
What other items have i missed here?
Do you access these services on-line for bill paying and general management and tracking purposes? What about other on-line accounts?
  • Special interest forums.
  • Registration and licensing accounts (for multiple product vendors).
  • Technical support (for multiple product vendors).
(Some companies that make you register your product purchase on-line will force you to use a totally different user name and password for the registration system AND their on-line forums. Let's ignore the technical reasons (differing systems that are not in any way integrated) and just say that this is simply dumb.)
i have so many accounts to keep track of, with different on-line URLs and different user name and password combinations, that i cannot remember more than a couple of them. My web browsers automatically fill in the fields for me. When that fails, i have a note with each account and its specific details and passwords on my Tapwave Zodiac (my choice of PDA, still, despite Tapwave ceasing to exist).
Yes, not using the information from my head makes it easier to forget. Maybe you would do better, if you didn't grow up with learning disabilities... but would you be able to remember 46 accounts and their associated passwords and IDs? That's how many i currently track on my Zodiac.
Points of interest:
  • Security pundits of the world will tell you not to use the same password for every account. Apparently these pundits are of super-human ability, able to remember more than 7 (plus or minus two) unique identifiers and which accounts they go to. i may have some autistic genius traits, but keeping track of 46 accounts with only my brain is far beyond my abilities. Maybe if the information was meaningful to me, but...
  • There is a set of criteria devised to inform users of the strength of their password. It usually goes by the number of characters used, mixed case (yes, capitals and lower case letters are DIFFERENT things in computer land, folks... this is called "case sensitivity") and the use of numeric and special characters.
  • Those super human security pundits recommend 8 characters or more... with mixed case... and numbers... and, yeah, don't forget the special characters, too.
  • Security pundits also tell you not to use a real word that exists in a dictionary, especially in your native language, or to intermix numbers and special characters among the letters of a word, at the very least.
  • Certainly you should never use a word that people who know anything about you might be able to guess, even if THAT word is not in an official dictionary.
So, to sum up the requirements:
  • The real recommendation is that you use nonsense passwords...
  • With mixed case letters
  • And numbers
  • And special characters
  • At least 8 characters, but "more is better"...
  • Hell, you better make that 14 characters, just to be safe.
  • And never, ever use the same password more than once.
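To put the pundits' checklist in concrete terms, here's a rough sketch of the kind of strength test those rules describe. This is my own illustration, not any official standard; the minimum length and the definition of "special character" are assumptions:

```python
import string

def password_meets_pundit_rules(password: str, min_length: int = 8) -> bool:
    """Check a password against the usual pundit checklist:
    minimum length, mixed case, digits and special characters."""
    if len(password) < min_length:
        return False
    has_lower = any(c.islower() for c in password)
    has_upper = any(c.isupper() for c in password)
    has_digit = any(c.isdigit() for c in password)
    has_special = any(c in string.punctuation for c in password)
    return has_lower and has_upper and has_digit and has_special
```

My six character standby fails on the very first check, of course.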
To make security pundits, network administrators and *nix geeks around the world gasp in disgust, i proudly admit that i use the same six character password for each account, wherever possible. "Wherever possible" is a mighty big variable here...
At least 65% of my accounts demand a password format which my favourite standby password fails to accommodate. About 50% of those accounts will not allow me to make the same generic modification to my standby password. Still other accounts require me to regularly enter personal information, just so they can be sure it's really me. (don't forget your dead parrot's middle name, kids)
One company, astoundingly, and quite opposed to the beliefs of super human security pundits, limits the user's log-in name to eight characters. This is because they are still living in the past. The 8-bit past. The *nix past. The real point, though, is that this leaves me with one single account that accepts neither my first initial and last name nor my email address, in full, as a user ID (the two most common account name types for a person to use, based on current web standards that none of us voted for).
So, let me get this straight... in order to have "respectable" security in place with your accounts (unless you're using a card swiping mechanism for an ATM which only demands FOUR digit "PINs")... You have two options:
  1. Be super human
  2. Write them all down somewhere
i don't think that many of us fit into category one, above. Yet, the pundits, network admins and the geeks damn us mere mortals for all the security risks and breaches of the world. "Problem Exists Between Computer And Chair" they say. How many of those "experts" follow their own rules, i wonder?
To those same pundits, admins and geeks, i go so far as to declare (not suggest, but DECLARE):
Your security demands CAUSE security breaches by REQUIRING human beings to write things down!!

The goal: Secure Computing.
Where is humane computing?

Back in the day, you know, when everything was limited to 8 characters (and PINs of four numbers were not marks of shame) there were no raging disputes about these things. Mostly because:
  1. The systems were few and far between
  2. The technology did not exist for 256-bit encryption of 14-character passwords with mixed content
  3. No one had yet decided to sponge their fortune from "clients" who needed "Security Advisors," or other such titles, to solve a problem not yet invented
  4. "Hackers" were programmers and "Crackers" were busting copy protection on 8-bit games.
  5. People didn't really care (largely because of points one and three).
These days, though... it's all about security. i mean, we have EVERYTHING on-line! "On-line" used to mean "on computer" but now it means "The Internet." That must mean that any "hacker" with access to a networked computer (or not, if you believe the crap you see on The X-Files) is able to get at your data whenever he (or, sigh, she) feels particularly evil. Ooooo, HACKERS... that sounds so... evil!
Yes, we must make it all secure, now. It didn't matter before (except to some crazy wingnuts playing with something called the "CLI" and something called "ARPANET" on some archaic computer systems created in the 60's), but there is nothing more important today than security. Ask anyone who recently moved to Windows Vista and they will tell you just how much they like the new security features of their computer's latest operating system. Yes, yes, it's all about SECURITY!
 
(Oh, and privacy. How many dead trees do your service providers mail to you and make you read at the office, defining their privacy policies, again and again... and again, despite the fact that pretty much nothing has changed since the laws they must follow were established in the first place. It's all about covering corporate butt. Take a look at my recent set of articles on flickr and ask me how many of those complaints are "covered" by flickr's claim of protecting users' privacy... not MY privacy, per se... just... users in general)

Hell, screw reason, sensibility and rationality. Screw the human beings trying to use these systems!
  • It's not about you.
  • It's not about service.
  • It's all about "SECURITY!"
  • It's all about "PRIVACY!"
  • It's all about Covering Corporate Ass! (and making money doing it)

If you are actually able to get to your data, it's just not secure enough!


Full Disclosure:
i'm a former computer geek. Or so said the flame to which... i mean, the standards to which i was held way back in middle school. i was a computer geek when it was uncool and could get you a punch in the gut, just for fun. Today, thanks to people at advertising agencies working for Apple and several other technology companies who are constantly desperate to widen their market and user-base, computers and geekery are somehow "cool."
Sorry.
"Kool."
Now that it's kool to be a computer geek and "hot" girls wear tiny t-shirts with "i love nerds" on them... i've given it all up (as much as i can, given that i cannot hire my own technical support geeknerd to fix things for me while i go outside and enjoy the sunshine).
i worked in "the industry" for almost two decades (almost). i did a little of pretty much everything at one point or another: programming, customer service, technical support, network management, etc. (not that the network managers i was filling in for would admit that, as i have no magical certificate that declares me a "specialist").
i even crusaded (rather intensely) for an "alternative operating system" called BeOS. i briefly crusaded for Haiku. During my BeOS/Haiku crusades, i started to recognize that computers are really just junk, made by geeks, for geeks. The attitude of most programmers (not all) and companies (not all) was "RTFM." (wiki that one)
i discovered that this was not at all about making good stuff that would solve problems and make life easier for everyone. It was an elite club and normal people were not allowed (but they were expected to buy the stuff and shut up when it didn't work, because it must be the user's fault).
The computer industry used to be a fascination to me, but now i just want the tool to do what it says on the tin. If it's broke, out of the box, i shouldn't have to fix it. It should have a warranty. Not a statement in the "End User License Agreement" (which you never read, let alone agree to) saying "The entire risk of the purchase is on you. No warranty is expressed or implied, including fitness for a particular purpose."
i call myself a "born again USER." i used to have a career making the lives of other users easier when they came to me saying "I just don't get this computer stuff." i loved to tell them that it wasn't their fault.
This article is probably like walking through a room full of ex-cons with all of my personal information printed, legibly, on my t-shirt, while giving them the finger(s) and calling their mums whores. But, you know what?
A computer is supposed to be a tool, people.
Make it work,
use it,
and
GET OVER IT!

Wednesday, August 8, 2012

Still No Accountability In the Computer Industry?

i'm not going to rant on about EULAs (which deserve many rants). Instead, i direct you to a paper written by someone with more credibility than myself, who has written a well-thought-out argument for why there ought to be accountability in the computer industry.

"Accountability In a Computerized Society," By Helen Nissenbaum.

It's not the easy read of your c|nets and Foxes, and it was published in 1996, an eternity in computer industry time. Yet, it's thoughtful and all the more relevant today, especially as online security violations make software bugs look passé. But why should any of this ever become passé? To quote Nissenbaum:
"...if experts in the field deny that such a distinction can be drawn, in view of the inevitability of bugs and their potential hazard, it is reasonable to think that the field of computing is not yet ready for the various uses to which it is being put."
Well said.


...and here's a cartoon (click for original):

Wednesday, August 1, 2012

Websites Should Behave Like Websites

I don't do much with this blog because, being a GUI perfectionist, I would rant all day about things no one seems to care about. But, well, maybe some people DO care:

Pretenders: Why mobile web apps should stop trying to act like native apps

Tuesday, February 8, 2011

Good Manuals Matter

Due to growing dissatisfaction with Lightwave's clumsy GUI and Carrara's utter brokenness, i've been trying out some new 3D graphics tools over the last couple of weeks. In some cases, all i have access to is the developer's on-line documentation. While this allows me to get an idea of what a tool may be like prior to purchase, it's nowhere near as useful as a trial or demo would be. Still, there's much to learn from documentation... including just how detail-oriented the developers AREN'T.

Of particular note is Modo, by Luxology. While at the Carrara & Lightwave forums, i've seen a lot of users pointing to Modo as an alternative to the buggy & less-than-robust offerings from DAZ & NewTek (especially in the case of DAZ's Carrara). Luxology has a link for "Try Modo" at the top of their web page... and it tells you that a trial will be available "at a future date." Hm, not helpful. So i've proceeded to read their online documentation. Sadly, i'm rather unimpressed with a few things. i'd love to rant about the "Shader Tree" concept, but i don't know enough about it yet, so i'll focus on a more subtle issue that people tend to ignore these days: Readability and language use.

While reading Luxology's documentation for Modo, i've discovered that they really need to hire a proofreader. Now, i have a similar complaint with magazines such as Future Publishing's Computer/Future Music magazines, but, in this case, the problems are a bigger issue than simply embarrassing flubs; they're obstructing the learning process. Worse, the clumsy writing suggests to me a developer-wide lack of attention to detail. It's far easier to correct a document than to debug software, so why not do it? It shouldn't be considered a lower priority. Not only does it reflect poorly on the company as a whole, it makes the product less accessible.

Take a read of this page, for example: http://docs.luxology.com/modo/501/help/pages/modotoolbox/modoMindset.html
  • Run-on sentences
  • Sentence fragments
  • Missing commas
  • Possessives (apostrophes in possessives are almost entirely missing)
  • Complex sentences where multiple simple sentences would be better
  • Various other typographical errors
  • Lack of breaking procedural content into steps or bulleted lists
The documentation doesn't read as though it was written by a non-native English speaker. Even if it were, i think that would be no excuse, though it would at least be an explanation, & the solution is to hire a native English-speaking proofreader. Even with a native English speaker (or native [insert documentation language here] speaker), you still have to take care. There are plenty of native English speakers who have horrendous language skills, and that just will not do for imparting information in an efficient and effective manner.

It seems that in today's "do more with less" market mentality, it is assumed a proofreader isn't necessary so long as "spell check" is used. The problem with that attitude is that none of the above bulleted items can be caught by spell checkers. At best, a grammar checker might provide some feedback, but (from my own personal experience using MS Word) they're still a far cry from the abilities of a human being whose native language is that used by the documentation (MS Word's grammar checker also tends to flag a lot of non-problem areas).

Worse, it seems that a dedicated technical writer is also considered "a luxury item" that companies would rather do without. When i read documentation like this, i can almost visualize one of the programmers being told to sit down and explain how the software works. Some people may think that this is ideal; who else knows the product better than the programmers? The problem is one of specialization. Programmers tend to be great at code and even work-flows, but not so great with communication & language (sometimes they're utterly horrible at it). Why make the developers write the documentation when their time is better spent writing (and debugging) code? Put a dedicated technical writer on the team. Someone whose specialty is the conveyance of information to a wider audience (wider than just "the people building the tool").

Clarity in technical writing is essential to the goal: conveying information to the reader. For that reason, i might even argue that readability is even more important for documentation than most other forms of writing.

Most disconcerting of all: if this is the state of their attention to detail in their documentation... what might i infer about the attention to detail in the software design and coding? Looking at postings on the forum, it sounds as though the release of Modo 501 was premature (at best; at worst it's just damned sloppy). Modo 501 is apparently the first Modo release to be 64-bit on the Macintosh. As witnessed with 64-bit "first versions" of other math-heavy products, there are a lot of "shouldn't be" problems in the commercial release being reported by users at the forums.

Maybe this is why Luxology doesn't actually offer a trial version for download, currently; it would give them a black eye in PR. Still, they're selling the product... Not very encouraging.

Tuesday, February 1, 2011

Photoshop Owns Me

[this is the rant that inspired this blog's existence...]

It's amazing how one tool's methods can become the dominant paradigm in "how to" for everything i want to do. i think Photoshop is pretty logical, but surely there is an amount of "i'm just used to it" going on. i'll be generous and assume that there could be alternate ways of doing some concepts we're familiar with in Photoshop (how to manage layers & selections, etc.). Some things are OS-dependent, like cut/copy/paste (& it really gets up my nostrils when apps do not obey OS conventions for stuff like that).

So... i've been watching training videos about Lightwave & ZBrush... Every tool, or manipulation of a tool, that has any analog to something in Photoshop gets judged by me as either being "sensible" or "stupid" (or, in ZBrush's case, as stated by a person on a forum: "created by crackheads").

This is where user customizations can be very helpful. Ideally. If you KNOW the tool already, yes, go for it. Customize it to suit YOU. If you're learning the tool for the first time, forget about customizing. You have to get through the learning curve all the way before you can start taking charge of the UI. Tutorials & documentation are based on factory defaults. This means that customization means nothing to new users of a tool. It's a tease.

There is a concept i really like among some DAWs. The idea is that the creators have realized a user might have become very used to working with one brand but would like to move to another. For example: you may want to move from Cubase to Sonar. Sonar's keyboard shortcut configuration lets you do "sets." Theoretically, you just choose the "Cubase Set" & any common function between Cubase & Sonar is mapped to keys you're already used to. Ok, that's sensible. Some software even goes as far as creating color sets with similar intent. Nice.
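The "sets" idea is simple enough to sketch. This is just an illustration of the concept, not Sonar's actual configuration format; the command names and key bindings here are made up:

```python
# Hypothetical keymap "sets": each set maps a common editing command
# to the shortcut a user of that other DAW already knows.
KEYMAP_SETS = {
    "default": {"split_clip": "S", "quantize": "Q", "loop_toggle": "L"},
    "cubase":  {"split_clip": "Alt+X", "loop_toggle": "Num+/"},
}

def shortcut_for(command: str, active_set: str = "default") -> str:
    """Look up a command's shortcut in the active set, falling back
    to the default mapping for commands the set doesn't override."""
    keymap = KEYMAP_SETS.get(active_set, {})
    return keymap.get(command, KEYMAP_SETS["default"][command])
```

The point is that only the lookup table changes; the program's commands stay the same, so honoring a competitor's muscle memory costs the developer almost nothing.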

Sadly, very few product makers consider this concept. They want you to use THEIR tool, only & forever. You're to learn THEIR methods, or nothing. What's worse are the complex concepts. Sure, you can map keys easily enough to similar commands, but what if you need to work with something like content selection or layers?


In image editors, Photoshop has definitely dominated the landscape. If you can deal with layers & selection in Photoshop, you can pretty much walk right into Painter or Paint Shop Pro without much ass-scratching. i even remember when Painter was changed so that the layers feature (the way the GUI looked & behaved) was specifically changed to be like Photoshop. Before that, Fractal Design did the same thing to make Painter's selection tools work like Photoshop. It was smart because the two programs are highly complementary & co-exist very well in an artist's workflow. It's sensible to make them behave similarly. But, again, the developers have to give a shit about users in this way. Painter's creators were motivated by market share & expansion of user base. What if a product maker has no care in the world? What if they have had their own way of implementing a pretty common tool in a pretty unique way for the last ten years? They can either change or ... not change.


Software that started as non-Windows or non-MacOS software is pretty bad in this regard (same with Linux-land, but it's even worse there & i won't talk about it here because you're probably already wondering why you're reading this). Before there were established norms on how to do things with predefined GUI elements, people made their own methods & systems. These programs were eventually adapted to work in standardized windowing GUI environments like Windows or Mac OS (or they died out). Some of them literally redesigned their GUIs to use the new environment to its best native advantages. Others... like Lightwave... stayed exactly the same damn way as always because they felt their methods were efficient & sensible.


They might be sensible ways... or might have been... back before GUI design was largely standardized. Back before computer users became accustomed to a set of common conventions. Things like Lightwave continue to rely on extremely heavy keyboard shortcut memorization. Its environment (windows, controls, objects, etc.) is still rather non-visual, especially in workflow. They haven't adapted to the new windowing GUI concepts. These concepts aren't even "new" any more. Yet, they're still being dismissed as "cluttery" & "just eye candy." The reality is somewhere in between the extreme viewpoints.
Then there are things that are developed with entirely custom GUIs anyway, despite always having been inside a windowing GUI OS. The idea is: "If we create our OWN GUI environment, we don't have to do anything differently to have this program on Windows & MacOS." Ideally. Ok. Once again, i turn to Photoshop as the sanity example. Photoshop has always used as much of the OS's appearance & behaviors as possible, even though the GUI is custom. It has tended to use the "best of breed" concepts when Windows & Mac OS differed (sometimes giving more credit to "the Mac way" as that was the first OS Photoshop appeared on). Photoshop LOOKS & BEHAVES like you expect most programs to look & behave. From program to program, things are very standardized, on the OS of your choice (that huge choice between two). It's cross platform AND it's familiar to you, regardless of that OS choice (but not in SPITE of it, like Carrara). Smart. There are plenty of examples of this. Sadly, in the graphics world, there are also plenty of examples of criminal behavior in this regard.


Carrara. Lightwave. ZBrush. To name a few recent mindfucks.


Lightwave can at least blame its birth on the Amiga as an excuse. A thin excuse, but it at least has its reasoning. Carrara is just a reinvention of the GUI wheel for the sake of arrogance (i can't really find a justification for it to look & behave any differently from the OS API other than the simple desire to make it stand out & be "pretty" in their own "unique" ways). ZBrush... also has no such excuse.
ZBrush wasn't even cross-platform at first. Windows-only. Some of the choices were to make it quick to use with a stylus (instead of a mouse), but it really boils down to the fact that the original developer created something for himself, reinventing the wheel whenever he added something to it that was kind of expected in graphics apps (like layers & selection tools), but he did it to service HIS way of working. You can see how he was probably not at all accustomed to Photoshop & its brethren when he started down this path... or he chose to spite them. Afterward, probably accidentally, ZBrush stumbled onto marketability... then years of new, added-on development. All the while never changing those "unique" methods that are so contrary to what users of most other graphics apps are used to. Hell, it wasn't even a 3D tool; it was a "unique, 2.5D" painting app (3D was shoehorned onto it, yet that's become ZBrush's primary function; the evolution & original focus show badly in the overall product design).


This is "crackhead" software design. i didn't coin it, but it's fun, so i'm sticking with it.

Lightwave is being rebuilt by NewTek so that it steps into a modern GUI world. Rough task, since they have to attract new users in the world of 3D, which is quickly passing them by... AND retain their hard-core, loyal, rabid fan base of existing users who are quite used to the current design. No longer will it be two separate programs (Modeler & Layout). It'll have modern ideas on how to implement tool functions... etc. It's got a shiny new marketing name called "CORE" (yes, all capital letters, because that's somehow cool). The problem is, they've been at this for years & it's still not even feature complete for a first version, let alone a BETA. But, they sold it to the hard-core users anyway. Yeah, CORE is aptly named. People got in on some crazy subscription-like service... "buy in now & get access to advanced builds, etc..." blah. Fuck that. i'll never see it. It'll take forever to be released & its first official release will probably lack 50% of the functionality of current Lightwave. They're planning to sell Lightwave in two forms concurrently for some time because of this transitional period: CORE & Classic. i won't be getting it because i'm busted-ass broke.

That means i'm having to learn, all over again, many concepts that i have become so used to: layers, selection tools... just to learn Lightwave. These ideas should cross-pollinate. There should be a shallower learning curve. Nope. It's all "unique." i have to do this TWICE, in different ways, for i'm also trying to learn ZBrush. Without very specific projects, goals & handholding (short of forums, i have no people to talk to), i'm pretty much floundering, flailing, stumbling & cussing my ass off. And creating blogs to rant on.


It's not that Adobe is god or made "THE ONLY RIGHT CHOICES;" they've simply made a lot of effective/good choices which have been adopted by the majority of other developers as a "standard" way of doing things.

Oh yeah... since i'm talking about 3D tools... we have to rant about the "crackheaded" 3D tool of all crackheaded 3D tools: Blender. Not only is it off on "planet independent arrogant asshole" in its core design, it's open source freeware, originally created on Unix/Linux. It's feature-rich, but clumsy, ugly & so independently-minded that you have to re-learn everything you may have learned in other software (except for basic concepts common to all 3D tools such as "this is a vertex, this is a polygon, this is a bump map...") AND you have to deal with the fact that it was ported from a GUI-insane OS to two other OSes that are standardized. Like Carrara & ZBrush, Blender has NOTHING native in its GUI on ANY OS. This is largely because a Unix barely HAS a GUI to begin with. Every *nix app with a GUI historically had to create its own API. The X Window System isn't really a GUI. X sits on top of *nix the way Windows used to sit on DOS. Then, on top of X sits the desktop environment, which, among providing a file system browser, provides a common API for applications. It's utter insanity in the *nix world, no matter how many zealots tell you otherwise. But i said i'd not talk about that.


So Blender developers finally decided to do something about the ugliness & insanity... they rebuilt the GUI in version 2.5. i tried it. Hey, there's better usage of screen space, it's more readable, there are meaningful (& nice) icons (NewTek fanboys like to claim that Lightwave's more efficient than the competition because it's not "cluttered with icons..." no, instead you have to memorize hundreds of text commands & keyboard shortcuts that you cannot discover on a perusal of the GUI). So, Blender's becoming easy to look at & very customizable... BUT, in standard Linux open-source mentality, it goes too far with customization. The ways of operating the customizing controls are inconsistent with ANYTHING else (drag & drop is about as much commonality as they maintain). Forget anything Painter, Adobe or the DAWs taught you. In Blender, you don't have tabs & palettes. You have movable views and panels that are controlled by tiny handles that look like resize handles (but aren't). It's too easy to accidentally tear a palette or window off into an unwanted duplicate or scroll or resize it (yeah, you can scroll or size a panel in useless ways - it's up to YOU the user to make sure you don't drag too far & there's no automatic adjustment to ensure the minimum space is used to view all the controls without them being clipped).
So, it's prettier & more ... approachable... until you get to MOUSE controls. It seems they touched nothing here. Overhauling & making things more user-friendly in Linux-land = prettying up surface features but not re-thinking much of the core ideology. Do it original. Don't mimic established norms. Except for when you do (Linux is a varied land of not-quite-carbon-copying of everything, including competing or mutually exclusive concepts... a nightmare mishmash of every graphical OS ever made... the bigger names have made huge inroads on cleaning this up, but it requires you to keep yourself isolated to software that uses a particular desktop API). It's amazing how many developers STILL seem to misunderstand what "primary" & "secondary" mouse buttons really mean & mix up the common action with the uncommon control. Oh, & my favorite: the activation of a control on the mouse_down event instead of the mouse_up event (important if you've ever found yourself stopping an accidental control click halfway through by dragging off of the control before you let go of the mouse button).
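That mouse_down versus mouse_up distinction is worth spelling out. Here's a minimal sketch of the conventional behavior, using a toy widget model of my own invention (not any real toolkit's API):

```python
class Button:
    """A button that activates on mouse-up, and only if the pointer is
    still over the control. Dragging off before release cancels the click."""

    def __init__(self, action):
        self.action = action
        self.pressed = False

    def on_mouse_down(self, pointer_inside: bool):
        # Arm the button, but don't fire the action yet.
        self.pressed = pointer_inside

    def on_mouse_up(self, pointer_inside: bool):
        # Fire only if the press started here AND the pointer is still
        # inside; releasing after dragging off the control is a cancel.
        if self.pressed and pointer_inside:
            self.action()
        self.pressed = False
```

Activating on mouse_down removes that escape hatch entirely: the action fires the instant you press, accident or not.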


Just in case you're willing to learn this new tool because of how free it is, you should know that the developers have cleverly removed the radiosity lighting model from the new version of Blender (apparently so they could "re-factor" the surfacing & lighting features' code). It's a back-step. Back to convoluted, non-realworld workarounds & faking.

crackheads.

i'm finding that i want Adobe to start up a side business that actually has value (instead of all the pseudo-useful sidetracks of webby content library/modification/sharing/marketing tools stuck into their product lineup): Making crackhead software act like "industry standard" software. They could make billions for companies whose crackheaded GUIs keep new customers beyond arm's length...
Ok. i'm done. This has been my distraction from the maddening process of learning another tool i don't know if i'll ever really use (mostly because i can't seem to get past the hurdles of learning it WHILE running into bugs & totally unintuitive technical geekoid bullshit required to mimic reality with insane simulations & a bajillion numerical settings on physics concepts no artist gives a rat's fucking asshole about). As such, it's opinionated & maybe somewhat factually inaccurate. If you post comments taking me to task on how wrong i am about how right your tool or OS is, take note i might utterly ignore you.