Via the E-consultancy E-business Briefing newsletter, I came across a forum post asking site managers to stop relying on Bobby automated testing for accessibility. I was quite surprised, as I had assumed accessibility had reached a "critical mass" in terms of industry perception.
It's simple: automated tools can point you in the right direction, pick up on issues you might not have noticed and, above all else, are quick and easy to use. However, they are no substitute for a decent expert review and testing with your target audience. The value of decent alt descriptions, and why "click here" links are so bogus, just will not sink in until you observe, for instance, a visually impaired user merrily zipping through your site but missing the majority of the content because it is "inaccessible". Automated tools are great for getting you started, and for acceptance testing at the end of a project, but they are not the be-all and end-all: human input is still required, particularly in some of the greyer areas of legislation and guidelines.
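To make the "click here" point concrete, here is a rough illustration (hypothetical markup, not taken from the forum post). An automated checker will happily pass both versions, since each image has an alt attribute and each link contains text, yet only the second makes any sense to someone tabbing through the links or listening to a screen reader's links list:

    <!-- Passes an automated check, but a links list just reads "Click here" -->
    <p>Our annual report is ready. <a href="/report.pdf">Click here</a> to download it.</p>
    <img src="chart.gif" alt="chart.gif">

    <!-- Also passes, and actually makes sense out of context -->
    <p><a href="/report.pdf">Download the annual report (PDF, 1.2MB)</a></p>
    <img src="chart.gif" alt="Bar chart: site visits doubled between January and June">

No tool can judge whether "chart.gif" is a decent description; only a human can.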
Above all else, ensure that your developers learn from the experience: they'll probably appreciate it from an intellectual-curiosity and career-development perspective, and you'll benefit from more compliant sites henceforward!
UPDATE - 14/07/2005: On this subject, here is an apposite demonstration of how knowledge of accessibility good practice should be developed (using guidelines and testing with users) rather than relying on automated indications: "Accessible Data Tables" from Web Usability.
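For the impatient, the gist of that sort of advice is markup along these lines (a minimal sketch of the general technique, not an excerpt from the linked article): header cells marked up as th with a scope attribute, plus a caption, so a screen reader can announce which row and column a given data cell belongs to rather than reading out a bare grid of numbers:

    <table>
      <caption>Quarterly site visits by browser</caption>
      <tr>
        <!-- scope="col" ties each data cell below to its column header -->
        <th scope="col">Browser</th>
        <th scope="col">Q1</th>
        <th scope="col">Q2</th>
      </tr>
      <tr>
        <!-- scope="row" ties the cells in this row to their row header -->
        <th scope="row">Internet Explorer</th>
        <td>12,400</td>
        <td>13,100</td>
      </tr>
      <tr>
        <th scope="row">Firefox</th>
        <td>3,200</td>
        <td>5,800</td>
      </tr>
    </table>

Again, an automated tool can only tell you that th elements exist; whether the associations actually help a user is something you find out by testing.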