Most content management professionals know very well the importance of user-based acceptance testing, and understand the high stakes involved. If users fail to embrace a system (e.g., due to poor usability), generally speaking the project fails, heads (often) roll, and it's back to Square Zero, with a greatly diminished budget.
You'd think people in the software world would have learned the importance of acceptance testing (and in particular, usability testing) by now, but memories are short and hubris trumps humility, hence history repeats itself all too often, and you end up with things like... Vista.
A recent sampling of more than three thousand machines connected to xpnet found (get this) that fully 35 percent of "Vista machines" are, in fact, running Windows XP. These are machines that originally shipped with Windows Vista, but have since been downgraded to XP.
It's possible to read many things into this finding (I'll leave that as an exercise for you the reader), but I think at a minimum it's a cautionary tale for people in the software business (at all levels: software vendors large and small; implementation specialists; individual programmers) who think they know what users want, but who don't have the slightest scintilla of data to back their assumptions up.
I'm no usability expert, but I've seen more than one enterprise portal application turn into a million-dollar train wreck. (Oh, the humanity...) Based on that, here's some advice (to be taken with a grain of sodium chloride):
- If you're in the early stages of planning a Web CMS or ECM rollout, don't underestimate the importance of verifying your design assumptions through usability testing.
- Don't wait until the end of the R&D process to find out your design stinks. By then, the cement has set. Create mockups and wireframes of dialogs, wizards, tabs, and UI motifs early on, before programming gets underway. Don't leave it to developers (who are not usability experts) to design your interfaces and usage metaphors.
- The process of arriving at a usable design is just that: a process. Plan for iterations. Have regular UI reviews. Don't assume you'll get things right in one pass.
- If you don't have in-house human factors expertise, bring in an expert from the outside.
- Likewise, if you don't have a usability lab in-house, set one up -- even a simple one. Gather behavior data, not answers to survey questions.
- Have a Plan B. If differences of opinion over usability begin to snowball out of control, putting the project at risk, you may want to stop and perform a post-mortem before the patient is dead. Have options at that point.
- After the system is in production, do follow-up testing to see how close you came to meeting users' needs. This will either verify your original assumptions, or give you some things to think about for Version 2.0.
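To make the "behavior data, not survey answers" point concrete, here's a minimal sketch of what instrumenting a usability session might look like. The `Session` and `task_metrics` names are my own invention for illustration, not part of any particular usability tool; the idea is simply that you record timestamped events of what participants actually did, then compute metrics like task completion rate and time-on-task from them.

```python
from dataclasses import dataclass, field

@dataclass
class Session:
    """One participant's test session: raw behavior events, not survey answers."""
    participant: str
    events: list = field(default_factory=list)

    def log(self, event: str, timestamp: float) -> None:
        # Record what the user actually did (clicks, errors, task markers).
        self.events.append((timestamp, event))

def task_metrics(sessions):
    """Completion rate and mean time-on-task across sessions.

    A session counts as complete if it logged both 'task_start' and 'task_done'.
    """
    times, completed = [], 0
    for s in sessions:
        stamps = {e: t for t, e in s.events}
        if "task_start" in stamps and "task_done" in stamps:
            completed += 1
            times.append(stamps["task_done"] - stamps["task_start"])
    rate = completed / len(sessions) if sessions else 0.0
    mean_time = sum(times) / len(times) if times else None
    return rate, mean_time

# Example: two recorded sessions; one participant completed the task, one gave up.
a = Session("P1"); a.log("task_start", 0.0); a.log("error", 12.0); a.log("task_done", 45.0)
b = Session("P2"); b.log("task_start", 0.0); b.log("gave_up", 90.0)
rate, mean_time = task_metrics([a, b])
print(rate, mean_time)  # 0.5 45.0
```

Numbers like these are exactly what survey questions can't give you: they show where users actually stall or bail out, which is what the follow-up testing above should be looking for.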
Microsoft, are you listening?