University website accessibility revisited

Dey Alexander [HREF1], Principal Consultant, Dey Alexander Consulting [HREF2], PO Box 2655, Cheltenham, Victoria, 3192. dey@deyalexander.com.au

Scott Rippon [HREF1], Associate Consultant, Dey Alexander Consulting [HREF2], PO Box 2655, Cheltenham, Victoria, 3192. scott@deyalexander.com.au

Abstract

A 2003 accessibility audit of Australian university websites found that 98 per cent failed to meet the most basic requirements for web accessibility. Four pages from each site were evaluated for conformance with Level-A of the Web Content Accessibility Guidelines, an international standard for web accessibility. Only one university's set of four pages met these standards.

This paper presents the findings from a second audit, completed in early 2007. Using a similar methodology, we aimed to determine whether there had been any improvement in university website accessibility. We found that, overall, accessibility has worsened slightly: 100 per cent of sites and 92 per cent of pages failed to meet the basic standards. The biggest problem is still the failure to provide equivalent text alternatives for content presented in non-text formats. Although this is a relatively easy issue to resolve, it has proved persistent.

1. Introduction

In 2003 an accessibility audit of Australian university websites found that 98 per cent failed to meet basic requirements for web accessibility (Alexander, 2003). Four pages from each of the sites were evaluated for conformance with Level-A of the Web Content Accessibility Guidelines 1.0, an international standard for web accessibility [HREF3]. Only one university's set of four pages met these standards.

Since that time, most of the sites have been redesigned or updated. There has also been a range of web accessibility-related initiatives in the Australian university sector, including the formation of the Web Accessibility Network for Australian Universities [HREF4] and its 2005 web accessibility forums [HREF5], the introduction of the Disability Standards for Education [HREF6], and the publication of sector guidelines on information access for students with a print disability (Australian Vice-Chancellors Committee, 2004) and on students with a disability (Australian Vice-Chancellors Committee, 2006).

There have also been accessibility-related activities in the broader web design community. We have seen the popularisation of standards-based design and the creation of web developers' networks such as the Web Standards Group [HREF7]. This group is highly active in Australia and has organised three conferences on web standards since 2004. Web accessibility has also remained in focus through the protracted (and still unfinished) development of the second version of the Web Content Accessibility Guidelines [HREF8].

We wondered if these initiatives had had any impact on the accessibility of university websites.

2. Methodology

There were some minor differences between the methodology used in this study and that of the 2003 audit. Our scope was slightly different, we used some different tools, and two evaluators reviewed the sites. However, the approach we took was sufficiently similar to allow a comparison between our findings and those of the 2003 study.

2.1 Goal of the research

The key goal of this research was to see whether there had been any measurable improvement in the accessibility of university websites since the last audit. Our benchmark was the same as the earlier study's - the priority one checkpoints of the Web Content Accessibility Guidelines 1.0. These are the checkpoints that web pages must satisfy for a site to claim Level-A conformance. They are the minimum standard for web accessibility in Australia, and most Australian university websites would either claim conformance with this standard or be aiming to meet it.

2.2 Scope of the research

2.2.1 Sites and pages included

We audited four pages from the 41 university websites listed on the Australian Education Network website [HREF9]. In the earlier study, a list of 45 university websites published on the former Department of Education, Training and Youth Affairs website was used. The pages evaluated in both studies were:

  1. The home page
  2. The main prospective students page (or an alternative and roughly equivalent page where there was no prospective students page)
  3. An orientation page for incoming students in 2007 (or alternative where necessary)
  4. A student accommodation page (or alternative where necessary).

These pages were chosen in the original study for three key reasons. First, it is important to compare pages that are of similar significance to each institution, and which have a similar function and target audience. Second, each page was important enough within the context of the whole university web presence to have been given significant design and maintenance attention. The home page and prospective students pages are key entry points for users of university websites. The orientation and accommodation pages were likely to have been used most often at the time of year that both audits were done - in the lead-up to the start of the new academic year. Finally, it was necessary to examine pages that did not require a login.

We audited the pages between January and March 2007. Some may have been redesigned or updated since then.

2.2.2 Design checkpoints evaluated

Each page was evaluated against 16 of the 17 priority one checkpoints set out in the Web Content Accessibility Guidelines (Chisholm, Vanderheiden and Jacobs, 1999):

  1. Provide a text equivalent for every non-text element (checkpoint 1.1)
  2. Ensure information conveyed with colour is also available without colour (checkpoint 2.1)
  3. Identify changes in natural language for text and in text alternatives (checkpoint 4.1)
  4. Organise documents so they can be read without stylesheets (checkpoint 6.1)
  5. Update dynamic equivalents as content changes (checkpoint 6.2)
  6. Do not make the screen flicker (checkpoint 7.1)
  7. Provide redundant links for image map hotspots (checkpoint 1.2)
  8. Provide client-side image maps unless you cannot define a hotspot with a geometric shape (checkpoint 9.1)
  9. Identify row and column headers in data tables (checkpoint 5.1)
  10. Use markup to link data cells with their headers in complex tables (checkpoint 5.2)
  11. Give frames titles to help with orientation and navigation (checkpoint 12.1)
  12. Make sure pages work when scripts and applets are turned off. Otherwise provide an accessible equivalent (checkpoint 6.3)
  13. Provide an auditory description of the important information in the visual track of a multimedia presentation (checkpoint 1.3)
  14. For any time-based multimedia presentation synchronise equivalent alternatives with the presentation (checkpoint 1.4)
  15. Make scripts and applets directly accessible if they affect important functionality (checkpoint 8.1)
  16. As a last resort, provide an accessible equivalent page that is updated as often as the original (checkpoint 11.4).

As with the last audit, we did not evaluate checkpoint 14.1, "Use the clearest and simplest language appropriate for a site's content". This checkpoint is difficult to apply in an evaluation that does not include user testing because of the degree of subjectivity in deciding what counts as the 'clearest' or 'simplest' language. In any case, a recent study has shown that most university websites use language that is overly formal and often confusing to prospective students (Alexander, 2005).

2.3 Evaluation tools and process

We used a combination of automated and manual checks for the audit. The tools we used included Internet Explorer version 6 running on Windows XP, together with the HERA accessibility evaluation tool [HREF10], Vision Australia's Web Accessibility Toolbar [HREF11], and WebAIM's evaluation tool, The WAVE [HREF12].

We first viewed the page in Internet Explorer and saved an image of the screen so we could refer back to it later if needed. We then ran HERA and reviewed its automated report. Next, we looked at page elements that required a manual check, using The WAVE and the Web Accessibility Toolbar to disable stylesheets, generate a text-only view of the page, and produce The WAVE's visual report of page structure and text alternatives.

We saved images of the page with stylesheets disabled, a text-only view and the visual report from The WAVE. These provide a record of the pages at the time of the audit and a reference for the decisions we made during the audit.

2.4 Conformance rating

Each page was given a rating for conformance with each of the priority one checkpoints. The ratings were as follows:

  1. Passed - the page met the requirements of the checkpoint
  2. Partial - the page met the requirements of the checkpoint, but used design practices that are undesirable from an accessibility perspective
  3. Failed - the page did not meet the requirements of the checkpoint.

The partial rating was not used in the 2003 study. The sorts of design practices we have given a partial rating to in this study were noted in 2003 as undesirable, but given a pass rating. We decided to use the partial rating in this study to highlight minor problems so they might be addressed in future by designers or site managers. We did not fail any pages or sites that were given only partial ratings.

3. Findings

3.1 Overview of the results

Our audit found that the overall level of basic accessibility of university websites has worsened slightly since the 2003 study. Every site and 92 per cent of pages failed in this audit, compared with 98 per cent of sites and 85 per cent of pages in 2003. The best result in this audit was from the University of the Sunshine Coast: one page passed, two were given partial ratings and one failed. The following table shows the results for each site and page tested.

Table 1: Results by university and page
University Home page Prospective students page Orientation page Accommodation page Overall rating
Australian Catholic University Failed Failed Failed Failed Failed
Australian Defence Force Academy Failed Failed Failed Failed Failed
The Australian National University Failed Failed Failed Failed Failed
The University of Adelaide Failed Failed Failed Failed Failed
Bond University Failed Partial Failed Failed Failed
The University of Ballarat Failed Failed Failed Failed Failed
Central Queensland University Failed Partial Failed Failed Failed
Charles Darwin University Failed Failed Failed Failed Failed
Charles Sturt University Failed Failed Failed Failed Failed
Curtin University of Technology Failed Failed Passed Failed Failed
The University of Canberra Failed Failed Failed Failed Failed
Deakin University Partial Failed Failed Failed Failed
Edith Cowan University Failed Failed Failed Failed Failed
Flinders University Failed Failed Passed Failed Failed
Griffith University Failed Failed Failed Failed Failed
James Cook University Failed Failed Failed Passed Failed
La Trobe University Failed Passed Failed Partial Failed
Macquarie University Failed Failed Failed Failed Failed
Monash University Partial Partial Failed Failed Failed
Murdoch University Failed Failed Failed Failed Failed
The University of Melbourne Passed Failed Partial Partial Failed
The University of New England Failed Passed Failed Passed Failed
The University of Newcastle Failed Failed Failed Failed Failed
The University of New South Wales Failed Failed Failed Failed Failed
The University of Notre Dame Failed Failed Failed Failed Failed
Open Universities Australia Failed Failed Failed Failed Failed
Queensland University of Technology Failed Passed Failed Passed Failed
The University of Queensland Failed Failed Failed Failed Failed
RMIT University Failed Failed Failed Failed Failed
Southern Cross University Failed Failed Failed Failed Failed
Swinburne University of Technology Failed Partial Partial Failed Failed
The University of South Australia Failed Passed Passed Failed Failed
The University of Southern Queensland Failed Failed Failed Failed Failed
The University of Sydney Failed Failed Failed Failed Failed
The University of Tasmania Failed Failed Passed Failed Failed
The University of Technology, Sydney Failed Failed Failed Failed Failed
The University of The Sunshine Coast Failed Passed Partial Partial Failed
Victoria University Failed Partial Failed Partial Failed
The University of Western Australia Failed Failed Failed Failed Failed
The University of Western Sydney Failed Partial Failed Partial Failed
The University of Wollongong Failed Failed Failed Failed Failed
TOTALS 38 failed 30 failed 34 failed 33 failed 135 pages failed

3.2 Identifying and comparing problems across the two studies

We found failures against seven checkpoints compared with eight in the 2003 study. There were no new checkpoint failures - all seven issues were problems in the last study. Table 2 shows these checkpoints and the number of pages that failed against them. It presents these data alongside the figures from the 2003 audit. Percentages are also shown as the number of pages evaluated in each audit was different (164 this time; 180 in 2003).

The biggest accessibility problem was again related to checkpoint 1.1, which requires page authors to provide an equivalent text alternative for non-text elements. There was, however, a slight increase (0.7 per cent) on the earlier figures. As might be expected given trends in web design, problems related to frames declined sharply, while problems related to scripting and to content published only in PDF format increased.

In the rest of this section we discuss these problems and provide examples found in this audit. Detailed reports for each page from each university website are available on our website [HREF13].

Table 2: Checkpoint failures by type, compared across the 2003 and 2007 audits
Checkpoint 2003 - no. of pages 2003 - % of pages 2007 - no. of pages 2007 - % of pages Difference
1.1 Provide text equivalents for non-text elements 138 76.7 127 (+22 partial) 77.4 +0.7
2.1 Do not convey information with colour alone 1 0.6 0 0.0 -0.6
5.1 Identify row and column headers in data tables 2 1.1 3 1.8 +0.7
6.1 Organise documents so they can be read without stylesheets 59 32.8 27 16.4 -16.4
6.2 Update dynamic equivalents as content changes 1 0.6 1 0.6 0.0
6.3 Make sure pages work when scripts and applets are turned off 23 12.8 38 (+15 partial) 23.2 +10.4
11.4 Provide an accessible equivalent page, updated as often as the original 9 5.0 3 1.8 -3.2
12.1 Give frames titles to help with orientation and navigation 26 14.4 2 1.2 -13.2

3.2.1 Problems complying with checkpoint 1.1

Design elements that caused pages to fail against checkpoint 1.1 are shown in Table 3 and discussed in detail below.

Table 3: Checkpoint 1.1 failures by element, compared across the 2003 and 2007 audits
Page element 2003 - no. of pages 2003 - % of pages 2007 - no. of pages 2007 - % of pages Difference
Images 133 74.0 106 (+30 partial) 64.6 -9.4
Scripts 6 3.3 12 7.3 +4.0
Multimedia/Flash 4 2.2 10 6.1 +3.9
Frames 22 12.2 2 1.2 -11.0
PDF documents 13 7.2 28 17.1 +9.9

3.2.1.1 Checkpoint 1.1 requirements for images

All image elements must have an ALT attribute as part of the IMG element used to insert the image into the page. The ALT attribute is the place where the text alternative for the image is included (and for images that require a longer description, the LONGDESC attribute can be used). Images can also be included on a page as background elements. There is no ALT attribute for a background image so if the image contains content, designers need to include a text alternative elsewhere on the page.
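As a minimal sketch (the file name and text are hypothetical), the basic markup looks like this, with LONGDESC pointing to a separate page carrying a longer description:

    <img src="open-day.gif" alt="Open Day: Sunday 5 August, 10am to 4pm"
         longdesc="open-day-description.html">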

When writing text alternatives for images, designers or page authors need to consider the reason the image is being used. If it is providing content, then an equivalent alternative to that content needs to be written. If it is for decorative or layout purposes, it is usually best to use an empty ALT attribute. Descriptive text alternatives for these kinds of images are often a distraction or noise in the background for blind users who hear them read out when using a screen reader to access web pages.
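A minimal sketch of the two cases (image names and text are hypothetical):

    <!-- Content image: the text alternative conveys the same information -->
    <img src="campus-map.gif" alt="Map of the campus showing the main entrance on College Road">

    <!-- Decorative image: an empty ALT attribute so screen readers skip it -->
    <img src="border-flourish.gif" alt="">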

We found several types of problems with text alternatives for images used on university websites. In the last study these problems were categorised into seven types. We have used the same categorisation to allow a comparison between the two studies.

For content images:

  A: the text alternative was not equivalent to the content of the image
  B: the text alternative included unnecessary text
  C: the ALT attribute was blank
  D: the ALT attribute was missing
  E: a background image had no text alternative
  E2: a background image had a text alternative that was not equivalent.

For layout or decorative images:

  F: an unnecessary text alternative was included
  G: the ALT attribute was missing.

In the 2003 study, problems B and F were recorded as a pass. In this study, we have given them a partial rating for the reasons stated earlier.

Table 4: Problems with image text alternatives, compared across the 2003 and 2007 audits
Image text alternatives 2003 - no. of pages 2003 - % of pages 2007 - no. of pages 2007 - % of pages Difference
Images used to provide content
A: Not equivalent 41 22.8 73 44.5 +21.7
B: Included unnecessary text 21 11.7 11 6.7 -5.0
C: Blank 11 6.1 9 5.5 -0.6
D: No ALT attribute 65 36.1 22 13.4 -22.7
E: Background image - none 1 0.6 14 8.5 +7.9
E2: Background image - not equivalent 0 0.0 1 0.6 +0.6
Images used for layout or decoration
F: Included unnecessary text 53 29.4 82 50.0 +20.6
G: No ALT attribute 89 49.4 32 19.5 -29.9

The most interesting data here are the figures for problem types D and G, and then A and F. It is likely that the increased use of content management systems and better web page authoring tools has led to a reduction in the problems with missing ALT attributes (problem types D and G). But the increase in problem types A and F shows that designers and authors still do not properly understand the role of text alternatives, or do not take enough care when writing them.

There was a small decrease in problem type B. In the last study this problem related mainly to the use of sliced images. Page authors would repeat the text alternative for each slice. Slicing images was linked to the use of the TABLE element to control layout. Now that layout is more commonly handled by stylesheets, sliced images are not used as often.

Some examples of the image-related problems with text alternatives are outlined below. Each is relatively easily resolved. None require technical re-engineering or major changes to the way that pages are designed. Page designers and authors simply need to take more time to consider the function of the image and write an appropriate alternative that will have the same function or role when the page is used by someone who cannot see the image.

Problem A: text alternative is not equivalent
Problem B: text alternative includes unnecessary text
Problem C: ALT attribute was blank
Problem D: ALT attribute was missing
Problem E: background image had no text alternative
Problem E2: background image text alternative was not equivalent
Problem F: decorative or layout image includes an unnecessary text alternative
Problem G: decorative or layout image has missing ALT attribute
3.2.1.2 Checkpoint 1.1 requirements for PDF documents

Although there have been improvements to this format, most PDF documents remain at least partly inaccessible. This is because most are not created properly and so cannot make use of these improvements. PDF documents are often generated from poorly structured Word documents, and usually converted to PDF using the print option in Word. This means they are not 'tagged' for accessibility. And most PDF documents are not checked to ensure that accessibility features are in place.

In this audit we found more pages where content was published only in PDF format - up 9.9 per cent on 2003. We failed each page that did this because of the high incidence of poorly created documents, and also because the Australian Human Rights and Equal Opportunity Commission recommends against publishing in this way. The Commission says:

"The Commission's view is that organisations who distribute content only in PDF format, and who do not also make this content available in another format such as RTF, HTML, or plain text, are liable for complaints under the DDA" (2002).

It is also worth noting that while PDF is popular with content publishers, many users prefer to avoid it. In a study of prospective students - none with a disability - 23 per cent had difficulty reading or finding information within a PDF document and 15 per cent said they preferred not to use them (Alexander, 2005). Some comments from users in that study were:

"It's so big it's hard to find anything."

"It's frustrating me a bit".

"I have to click on this unfortunately… I don't know why I need to click on an Adobe Acrobat file."

"I know that Acrobat would have details of course that I’m looking for, but I would like to find out the easy way… Acrobat right now for me is not necessary... I would like to find out the easy way."

"I don't like Acrobat… I think it's difficult to read."

3.2.1.3 Checkpoint 1.1 requirements for scripts

When content is included in pages using scripts, a text alternative must be provided using the NOSCRIPT element. This allows equivalent content to be made available when scripting is not available or disabled. We found 12 pages where no text alternative was provided - up 4 per cent on 2003.
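A minimal sketch of the technique, assuming a hypothetical script that writes the page's last-updated date into the footer:

    <script type="text/javascript">
      // Writes the date the page was last modified
      document.write('Last updated: ' + document.lastModified);
    </script>
    <noscript>
      <!-- Equivalent content for users without scripting -->
      <p>Last updated: 2 March 2007</p>
    </noscript>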

Some pages were using JavaScript in the footer of the page to include content such as the date on which the page was last updated, the name of the person responsible for maintaining the page, or links to copyright or disclaimer notices. No text alternative was provided in the following cases:

Several pages used JavaScript to insert a Flash object on the page. There was either no text alternative provided or the alternative was not equivalent. Examples include:

3.2.1.4 Checkpoint 1.1 requirements for multimedia

Text alternatives need to be provided for multimedia content, so that users who cannot see or access the multimedia can make use of the alternative. Most problems with multimedia in this study were with the use of Flash. Examples where no text alternative was provided include:
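One common technique, sketched here with hypothetical file names, is to nest the text alternative inside the OBJECT element so that browsers without Flash render the fallback content instead:

    <object type="application/x-shockwave-flash" data="campus-tour.swf"
            width="400" height="300">
      <param name="movie" value="campus-tour.swf">
      <!-- Rendered only when the Flash movie cannot be displayed -->
      <p>Take the <a href="campus-tour.html">text and photo version
      of the campus tour</a>.</p>
    </object>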

3.2.2 Problems complying with other checkpoints

Problems with pages that became unusable when scripting was not enabled increased since 2003 (up 10.4 per cent). Problems with the implementation of stylesheets, although less common than in 2003 (down 16.4 per cent), still affected around one in six pages.

3.2.2.1 Stylesheets and checkpoint 6.1

Checkpoint 6.1 requires designers to make sure that pages can still be read when stylesheets are disabled or when users apply their own stylesheets. People with colour blindness or low vision may want to use their own styles to avoid colours that they cannot see well, to increase the contrast between text and background colours, or to make font sizes larger.
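A minimal sketch of the kind of user stylesheet such a reader might apply (the colour and size values are illustrative only):

    /* User stylesheet: larger text and high-contrast colours,
       overriding the site's own styles */
    body {
      font-size: 150% !important;
      color: #000 !important;
      background-color: #fff !important;
    }
    a:link, a:visited {
      color: #00c !important;
      text-decoration: underline !important;
    }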

The main problem observed in this and the earlier study was the legibility of navigation elements. Many designers are still using HTML to colour some page elements, so that when stylesheets are turned off, the HTML-coded colours remain. We found several pages where link text was very difficult to read because it was dark blue (the default colour when stylesheets are disabled) on a dark HTML-coloured background. Examples of this problem were found on:
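In general terms, the problem and its fix look like this (a hypothetical sketch, not markup from any audited site):

    <!-- Problem: the colour is coded in HTML, so it remains
         when stylesheets are disabled -->
    <td bgcolor="#003366"><a href="courses.html">Courses</a></td>

    <!-- Fix: move the colour to the stylesheet, so that disabling
         styles restores the browser defaults -->
    <td class="nav"><a href="courses.html">Courses</a></td>

    /* In the stylesheet */
    td.nav { background-color: #003366; }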

We saw two cases where the page reading order became a problem when stylesheets were turned off:

3.2.2.2 Scripts and checkpoint 6.3

Checkpoint 6.3 requires designers to make sure that pages are still usable when scripting is turned off or not supported. The main problem found in both studies was the use of scripts to activate dropdown menus for "quick links". These menus are unusable if scripting is turned off or not supported. Examples of this problem were found on:
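One remedy is progressive enhancement: build the quick links as a standard form that submits to a server-side redirect script, and let JavaScript enhance it. A minimal sketch (the script URL and destinations are hypothetical):

    <form action="/cgi-bin/redirect" method="get">
      <label for="quicklinks">Quick links</label>
      <select id="quicklinks" name="url">
        <option value="/library/">Library</option>
        <option value="/timetables/">Timetables</option>
        <option value="/maps/">Campus maps</option>
      </select>
      <!-- The submit button keeps the menu usable without JavaScript;
           a script can hide it and act on the SELECT's onchange event instead -->
      <input type="submit" value="Go">
    </form>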

A new problem we observed this time was the use of JavaScript to generate key navigation. Important content areas of a site could not be accessed if JavaScript was not available. Examples include:

And on some sites, the search functionality disappeared or did not work when JavaScript was disabled. See:

We gave a partial rating to several pages that used JavaScript to control text resizing features. These are a new design element that has appeared since 2003. Such features aim to improve accessibility for those who cannot read text at the size set by the designer. We did not fail these pages because it is still possible to resize text using built-in browser functions or user-defined stylesheets.

We also saw a couple of pages that used JavaScript to activate page printing. Again, we gave these pages a partial rating because the built-in browser function can be used.

In each of these cases, the potential accessibility problem can easily be avoided with techniques that do not depend on client-side scripting: text resizing can be offered through alternate or server-generated stylesheets, and print links can point to a printer-friendly version of the page produced on the server.
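A sketch of the stylesheet approach, using alternate stylesheets that browsers expose through their own menus (the file names are hypothetical):

    <link rel="stylesheet" type="text/css" href="base.css" title="Default">
    <link rel="alternate stylesheet" type="text/css" href="large-text.css"
          title="Larger text">

Browsers such as Firefox list alternate stylesheets under View > Page Style, and a server-side style switcher can make the reader's choice persistent without any client-side scripting.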

We did not see any advanced scripting techniques of the kind known as AJAX (Asynchronous JavaScript and XML) [HREF14]. These are techniques designed to speed up interactions with web-based interfaces: interactions with a web page are processed on the fly without the need to wait for pages to reload. These techniques are believed to improve usability, and are likely to become more popular as a result. They do, however, present the risk of introducing accessibility problems - mainly for blind users using screen readers. This is because screen readers do not deal well with page elements that are updated without reloading the page (Edwards, 2006).

4. Conclusions and recommendations

The trend towards standards-based web design may have had some positive effects on web accessibility in the Australian university sector. Problems related to the use of frames and table-based layouts have decreased since 2003. Stylesheet-related problems have also decreased but persist, because designers have not moved all colour-related presentation elements to the stylesheet.

Other trends in web design have had a negative impact. The use of PDF documents has increased. Use of scripting is also on the rise, and this may increase further as the popularity of AJAX scripting techniques spreads.

However, the main problems identified in this and the 2003 study have little to do with changing trends in web design or the popularity of standards-based approaches. Page designers and content authors still appear to lack an understanding of the role of text alternatives for non-text elements used on a page.

Making university websites more accessible does not require a huge investment in time or technical skills. And it does not require any significant redesign or re-engineering of existing websites. It simply requires a better understanding of the role of text alternatives, particularly for images.

We recommend that universities:

  1. Train page designers and content authors in the role of text alternatives and how to write them
  2. Avoid publishing content only in PDF format, and provide an HTML or other accessible alternative
  3. Check that pages remain readable and usable when stylesheets or scripting are disabled.

It is also important that universities include accessibility checking as part of their web publishing quality assurance process. As mentioned in the conclusion of the last study, "sign-off authorisation should include not just the acceptance of responsibility for the accuracy of page content, but for the accessibility of that content as well" (Alexander, 2003).

References

Alexander, Dey (2003) "How accessible are university websites?", Ausweb 03 [HREF15]

Alexander, Dey (2005) "How usable are university websites? A report on a study of the prospective student experience", Ausweb 05 [HREF16]

Australian Vice-Chancellors Committee (2004) "Guidelines on Information Access for Students with a Print Disability" [HREF17]

Australian Vice-Chancellors Committee (2006) "AVCC Guidelines Relating to Students with a Disability" [HREF18]

Australian Human Rights and Equal Opportunity Commission (2002) "World Wide Web Access: Disability Discrimination Act Advisory Notes", Version 3.2 [HREF19].

Chisholm, Wendy, Vanderheiden, Greg and Jacobs, Ian (1999) "Web Content Accessibility Guidelines 1.0" [HREF3]

Edwards, James (2006) "AJAX and screenreaders: when can it work?" [HREF20]

Hypertext references

HREF1
http://deyalexander.com.au/about/people.html
HREF2
http://deyalexander.com.au/
HREF3
http://www.w3.org/TR/1999/WAI-WEBCONTENT-19990505/
HREF4
http://www.wanau.org/
HREF5
http://www.wanau.org/forums2005/
HREF6
http://www.dest.gov.au/sectors/school_education/programmes_funding/forms_guidelines/disability_standards_for_education.htm
HREF7
http://webstandardsgroup.org/
HREF8
http://www.w3.org/TR/WCAG20/
HREF9
http://www.australian-universities.com/list/
HREF10
http://www.sidar.org/hera/
HREF11
http://www.visionaustralia.org.au/ais/toolbar/
HREF12
http://dev.wave.webaim.org/index.jsp
HREF13
http://www.deyalexander.com.au/publications/ausweb07/
HREF14
http://en.wikipedia.org/wiki/Ajax_(programming)
HREF15
http://ausweb.scu.edu.au/aw03/papers/alexander3/paper.html
HREF16
http://ausweb.scu.edu.au/aw05/papers/refereed/alexander/paper.html
HREF17
http://www.universitiesaustralia.edu.au/documents/publications/GuidelinesOnInfoAccessForStudentsWithDisablilities.pdf
HREF18
http://www.universitiesaustralia.edu.au/documents/publications/policy/statements/DisabilityGuidelinesMay06.pdf
HREF19
http://www.hreoc.gov.au/disability_rights/standards/www_3/www_3.html
HREF20
http://www.sitepoint.com/print/ajax-screenreaders-work

Copyright

Dey Alexander and Scott Rippon © 2007. The authors assign to Southern Cross University and other educational and non-profit institutions a non-exclusive licence to use this document for personal use and in courses of instruction provided that the article is used in full and this copyright statement is reproduced. The authors also grant a non-exclusive licence to Southern Cross University to publish this document in full on the World Wide Web and on CD-ROM and in printed form with the conference papers and for the document to be published on mirrors on the World Wide Web.
