
Inclusive Design: Accessibility Guidelines Only Part of the Picture

(interactions magazine July/August 2004)

City University (London) recently completed its accessibility survey of 1,000 British Web sites on behalf of the UK's Disability Rights Commission. It is probably the largest of its kind ever undertaken, but sadly the results are no surprise - certainly not to anyone who has ever tried to navigate a Web site without a mouse. However, some of the findings are quite interesting and have even managed to spark a minor conflict with the W3C’s Web Accessibility Initiative, which may increase interest in the topic, if nothing else.

First, the predictable information: All 1,000 sites were tested against the WAI’s Web Content Accessibility guidelines with an automatic testing tool. The results are as follows:

  • Fewer than 19 percent of home pages were Level A compliant (assuming that some would also fail the manual tests required by the guidelines).
  • Only six of the 1,000 home pages passed the automatic testing for Level AA, with four of those failing the manual tests. That left only two sites as AA compliant (0.2 percent).
  • No home page was Level AAA compliant.

Of the 1,000 sites originally chosen, 100 were tested by a panel of disabled users and a further 20 underwent expert inspection. The results of these further studies were compared with the automated tests, with some slightly more interesting outcomes:

  • There was no correlation between the number of WAI checkpoint violations and the results of the user tests.
  • 69 percent of the warnings raised by automated testing needed to be checked manually, but only 5 percent of those resulted in violations (so 95 percent of the warnings that required manual checking were false positives).

These are pretty discouraging results, though automated testing does have some compensating strong points:

  • The tests are quick and inexpensive to run.
  • They are (or can easily be) incorporated in development tools.
  • If developers are given the responsibility of ensuring that all their pages pass accessibility tests, they will make it their business to ensure that any manual checking needed is minimal.
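The simplest of these automated checkpoint tests are easy to build into a development workflow. As a minimal sketch (not the tool used in the City University study), the following Python script flags images with missing or empty ALT text, corresponding to WCAG 1.0 checkpoint 1.1 and problem eight in the table below:

```python
from html.parser import HTMLParser

class AltChecker(HTMLParser):
    """Flag <img> tags whose alt attribute is missing or empty
    (WCAG 1.0 checkpoint 1.1, Level A)."""

    def __init__(self):
        super().__init__()
        self.violations = []  # (line, column) of each offending <img>

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            alt = dict(attrs).get("alt")
            if alt is None or not alt.strip():
                self.violations.append(self.getpos())

checker = AltChecker()
checker.feed('<p><img src="logo.gif">'
             '<img src="photo.jpg" alt="Staff photo"></p>')
print(len(checker.violations))  # 1: only the first image lacks alt text
```

Note that even this trivial check can only confirm that ALT text is present; whether the text is *helpful* (the other half of problem eight) still requires human judgement, which is exactly the limitation the study highlights.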

The usability tests found 585 issues. The majority of these (425) were attributed to just 10 problems as shown in the table below.

  #   Problem                                                                        Count      %
  1   Unclear and confusing layout of pages                                            101   17.3
  2   Confusing and disorienting navigation mechanisms                                  96   16.4
  3   Inappropriate use of colours and poor contrast between content and background     59   10.1
  4   Incompatibility between accessibility software and web pages                      45    7.7
  5   Graphics and text size too small                                                  44    7.5
  6   Incorrect or non-existent labelling of links, form elements and frames            24    4.1
  7   Cluttered and complex page structures                                             23    3.9
  8   ALT tags on images non-existent or unhelpful                                      16    2.7
  9   Lack of alternative for audio media and complex terms/language                    10    1.7
 10   Complicated language or terminology                                                7    1.2
      Other                                                                            160   27.4
      Total                                                                            585  100.0

This is where the controversy begins. City University says in its report that 45 percent of these problems cannot be attributed to WAI guidelines. The WAI in its defence claims that “95% of the barriers reported are indeed covered by existing checkpoints in WAI Guidelines.” Personally, I agree that problems one and two identified in the table are not directly covered by WAI guidelines. They account for about a third of the total (33.7%). It may be that some of the problems grouped in “other” are also not addressed by the guidelines, but problems three through ten certainly are.

So who is right? The correct answer is that it does not matter. What distinguishes problems one and two from most of the others is that they are fundamentally usability problems. We have become used to thinking of accessibility as an isolated issue, when of course it is simply a prerequisite to usability. And as we know, prescriptive usability guidelines only skim the surface of a very complex problem. This guarantees that the guidelines will never be enough, since only usability testing can provide the larger picture.

In a sense, we are doing a disservice to users with disabilities by focussing on accessibility as if it were important in its own right. Most disability discrimination legislation suffers a similar flaw. Service providers, employers, and others need only ensure that they do not discriminate unfairly against disabled people. There is no requirement to provide usable systems or services to anyone—as long as it is done fairly!

What is needed is something that we might be tempted to call “universal usability” if that term were not already in use for a slightly broader domain (across hardware and delivery platforms, as well as user communities). But we do already have an alternative in use in the U.K., known as “inclusive design.” My hope is that we will eventually see a focus on usability across a broad cross-section of the population, taking disabilities and age-related issues into account. And the quicker we can move from accessibility to inclusive design, the better.

References

DRC Web Accessibility Report Produced by City University: http://hcid.soi.city.ac.uk/research/Drc.html

WAI Response to UK Web Accessibility Report: http://www.w3.org/2004/04/wai-drc-statement.html

The Author

William Hudson is principal consultant for Syntagm Ltd, based near Oxford in the UK. His experience ranges from firmware to desktop applications, but he started by writing interactive software in the early 1970s. For the past ten years his focus has been user interface design, object-oriented design and HCI.

Other free articles on user-centred design: www.syntagm.co.uk/design/articles.htm

© 2001-2005 ACM. This is the author's version of the work. It is posted here by permission of ACM for your personal use. Not for redistribution. The definitive version was published in interactions, {Volume 11, Issue 4, July + August 2004} http://doi.acm.org/10.1145/1005261.1005278
