Dey Alexander [HREF1], Principal Consultant, Dey Alexander Consulting [HREF2], PO Box 2655, Cheltenham, Victoria, 3192. email@example.com
Scott Rippon [HREF1], Associate Consultant, Dey Alexander Consulting [HREF2], PO Box 2655, Cheltenham, Victoria, 3192. firstname.lastname@example.org
A 2003 accessibility audit of Australian university websites found that 98 per cent failed to meet the most basic requirements for web accessibility. Four pages from each site were evaluated for conformance with Level-A of the Web Content Accessibility Guidelines, an international standard for web accessibility. Only one university's set of four pages met these standards.
This paper presents the findings from a second audit completed in early 2007. Using a similar methodology we aimed to see if there had been any improvement in university website accessibility. We found that overall, accessibility has slightly worsened. 100 per cent of sites and 92 per cent of pages failed to meet the basic standards. The biggest problem is still the failure to provide equivalent text alternatives for content presented in non-text formats. This is a relatively easy issue to resolve, but has proved intractable.
In 2003 an accessibility audit of Australian university websites found that 98 per cent failed to meet basic requirements for web accessibility (Alexander, 2003). Four pages from each of the sites were evaluated for conformance with Level-A of the Web Content Accessibility Guidelines 1.0, an international standard for web accessibility [HREF3]. Only one university's set of four pages met these standards.
Since that time, most of the sites have been redesigned or updated. There has also been a range of web accessibility-related initiatives in the Australian university sector:
There have also been accessibility-related activities in the broader web design community. We have seen the popularisation of standards-based design and the creation of web developers' networks such as the Web Standards Group [HREF7]. This group is highly active in Australia and has organised three conferences on web standards since 2004. Web accessibility has also been in focus through the protracted (and still unfinished) development of the second version of the Web Content Accessibility Guidelines [HREF8].
We wondered if these initiatives had had any impact on the accessibility of university websites.
There were some minor differences in the methodology we used for this study. Our scope was slightly different, we used some different tools, and two evaluators reviewed the sites. However, the approach we took was sufficiently similar to allow a comparison between our findings and those of the 2003 study.
The key goal of this research was to see if there had been any measurable improvement in the accessibility of university websites since the last audit. Our benchmark was the same as the earlier study's - the priority one checkpoints of the Web Content Accessibility Guidelines 1.0. These are the design guidelines that pages must comply with in order to claim Level-A conformance. They are the minimum standard for web accessibility in Australia, and most Australian university websites would either claim conformance with this standard or be aiming to meet it.
We audited four pages from the 41 university websites listed on the Australian Education Network website [HREF9]. In the earlier study, a list of 45 university websites published on the former Department of Education, Training and Youth Affairs website was used. The pages evaluated in both studies were:
These pages were chosen in the original study for three key reasons. First, it is important to compare pages that are of similar significance to each institution, and which have a similar function and target audience. Second, each page was important enough within the context of the whole university web presence to have been given significant design and maintenance attention. The home page and prospective students pages are key entry points for users of university websites. The orientation and accommodation pages were likely to have been used most often at the time of year that both audits were done - in the lead up to the start of the new academic year. Finally, it was necessary to examine pages that did not require a log in.
We audited the pages between January and March 2007. Some may have been redesigned or updated since then.
Each page was evaluated against 16 of the 17 priority one checkpoints set out in the Web Content Accessibility Guidelines (Chisholm, Vanderheiden and Jacobs, 1999):
As with the last audit, we did not evaluate checkpoint 14.1, "Use the clearest and simplest language appropriate for a site's content". This checkpoint is difficult to apply in an evaluation that does not include user testing because of the degree of subjectivity in deciding on what is the 'clearest' or 'simplest' language. In any case, a recent study has shown that most university websites use language that is overly formal and often confusing to prospective students (Alexander, 2005).
We used a combination of automated and manual processes for the audit. The tools we used included Internet Explorer version 6 running on Windows XP and the following:
We first viewed the page in Internet Explorer and saved an image of the screen so we could refer back to it later if needed. We then ran HERA and reviewed its automated report. Next, we looked at page elements that required a manual check, and used The WAVE and the Web Accessibility Toolbar to:
We saved images of the page with stylesheets disabled, a text-only view and the visual report from The WAVE. These provide a record of the pages at the time of the audit and a reference for the decisions we made during the audit.
Each page was given a rating of Passed, Failed or Partial for conformance with each of the priority one checkpoints. A Partial rating indicates a minor problem only - for example, unnecessary descriptive text on a decorative image, where an empty ALT attribute would have been more appropriate.
The partial rating was not used in the 2003 study. The sorts of design practices we have given a partial rating to in this study were noted in 2003 as undesirable, but given a pass rating. We decided to use the partial rating in this study to highlight minor problems so they might be addressed in future by designers or site managers. We did not fail any pages or sites that were given only partial ratings.
Our audit found that the overall level of basic accessibility of university websites has worsened slightly since the 2003 study. Every site and 93 per cent of pages failed in this audit, compared with 98 per cent of sites and 85 per cent of pages in 2003. The best result in this audit was from the University of Sunshine Coast: one page passed, two were given partial ratings and one failed. The following table shows the results for each site and page tested.
| University | Home page | Prospective students page | Orientation page | Accommodation page | Overall rating |
|---|---|---|---|---|---|
| Australian Catholic University | Failed | Failed | Failed | Failed | Failed |
| Australian Defence Force Academy | Failed | Failed | Failed | Failed | Failed |
| The Australian National University | Failed | Failed | Failed | Failed | Failed |
| The University of Adelaide | Failed | Failed | Failed | Failed | Failed |
| The University of Ballarat | Failed | Failed | Failed | Failed | Failed |
| Central Queensland University | Failed | Partial | Failed | Failed | Failed |
| Charles Darwin University | Failed | Failed | Failed | Failed | Failed |
| Charles Sturt University | Failed | Failed | Failed | Failed | Failed |
| Curtin University of Technology | Failed | Failed | Passed | Failed | Failed |
| The University of Canberra | Failed | Failed | Failed | Failed | Failed |
| Edith Cowan University | Failed | Failed | Failed | Failed | Failed |
| James Cook University | Failed | Failed | Failed | Passed | Failed |
| La Trobe University | Failed | Passed | Failed | Partial | Failed |
| The University of Melbourne | Passed | Failed | Partial | Partial | Failed |
| The University of New England | Failed | Passed | Failed | Passed | Failed |
| The University of Newcastle | Failed | Failed | Failed | Failed | Failed |
| The University of New South Wales | Failed | Failed | Failed | Failed | Failed |
| The University of Notre Dame | Failed | Failed | Failed | Failed | Failed |
| Open Universities Australia | Failed | Failed | Failed | Failed | Failed |
| Queensland University of Technology | Failed | Passed | Failed | Passed | Failed |
| The University of Queensland | Failed | Failed | Failed | Failed | Failed |
| Southern Cross University | Failed | Failed | Failed | Failed | Failed |
| Swinburne University of Technology | Failed | Partial | Partial | Failed | Failed |
| The University of South Australia | Failed | Passed | Passed | Failed | Failed |
| The University of Southern Queensland | Failed | Failed | Failed | Failed | Failed |
| The University of Sydney | Failed | Failed | Failed | Failed | Failed |
| The University of Tasmania | Failed | Failed | Passed | Failed | Failed |
| The University of Technology, Sydney | Failed | Failed | Failed | Failed | Failed |
| The University of The Sunshine Coast | Failed | Passed | Partial | Partial | Failed |
| The University of Western Australia | Failed | Failed | Failed | Failed | Failed |
| The University of Western Sydney | Failed | Partial | Failed | Partial | Failed |
| The University of Wollongong | Failed | Failed | Failed | Failed | Failed |
| TOTALS | 38 failed | 30 failed | 34 failed | 33 failed | 135 pages failed |
We found failures against seven checkpoints compared with eight in the 2003 study. There were no new checkpoint failures - all seven issues were problems in the last study. Table 2 shows these checkpoints and the number of pages that failed against them. It presents these data alongside the figures from the 2003 audit. Percentages are also shown as the number of pages evaluated in each audit was different (164 this time; 180 in 2003).
The biggest accessibility problem was again related to checkpoint 1.1, which requires page authors to provide an equivalent text alternative for non-text elements. There was, however, a slight increase (0.7 per cent) on the earlier figures. As might be expected given trends in web design:
In the rest of this section we discuss these problems and provide examples found in this audit. Detailed reports for each page from each university website are available on our website [HREF13].
| Checkpoint | 2003 - no. of pages | 2003 - % of pages | 2007 - no. of pages | 2007 - % of pages | Difference |
|---|---|---|---|---|---|
| 1.1 Provide text equivalents for non-text elements | 138 | 76.7 | 127 (+22 partial) | 77.4 | +0.7 |
| 2.1 Do not convey information with colour alone | 1 | 0.6 | 0 | 0.0 | -0.6 |
| 5.1 Identify row and column headers in data tables | 2 | 1.1 | 3 | 1.8 | +0.7 |
| 6.1 Organise documents so they can be read without stylesheets | 59 | 32.8 | | 49.2 | +16.4 |
| 6.2 Update dynamic equivalents as content changes | 1 | 0.6 | 1 | 0.6 | 0.0 |
| 6.3 Make sure pages work when scripts and applets are turned off | 23 | 12.8 | 38 (+15 partial) | 23.2 | +10.4 |
| 11.4 Provide an accessible equivalent page, updated as often as the original | 9 | 5.0 | 3 | 1.8 | -3.2 |
| 12.1 Give frames titles to help with orientation and navigation | 26 | 14.4 | 2 | 1.2 | -13.2 |
Design elements that caused pages to fail against checkpoint 1.1 are shown in Table 3 and discussed in detail below.
| Page element | 2003 - no. of pages | 2003 - % of pages | 2007 - no. of pages | 2007 - % of pages | Difference |
|---|---|---|---|---|---|
| Images | 133 | 74.0 | 106 (+30 partial) | 64.6 | -9.4 |
All image elements must have an ALT attribute as part of the IMG element used to insert the image into the page. The ALT attribute is the place where the text alternative for the image is included (and for images that require a longer description, the LONGDESC attribute can be used). Images can also be included on a page as background elements. There is no ALT attribute for a background image, so if the image contains content, designers need to include a text alternative elsewhere on the page.

When writing text alternatives for images, designers or page authors need to consider the reason the image is being used. If it is providing content, then an equivalent alternative to that content needs to be written. If it is for decorative or layout purposes, it is usually best to use an empty ALT attribute. Descriptive text alternatives for these kinds of images are often a distraction, or noise in the background, for blind users who hear them read out when using a screen reader to access web pages.
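The distinction can be illustrated with a short HTML sketch (the file names and text here are invented for illustration):

```html
<!-- Content image: the ALT text conveys the same information as the image -->
<img src="open-day-banner.gif"
     alt="Open Day: Sunday 19 August, 10am to 4pm">

<!-- Decorative image: an empty ALT attribute, so screen readers skip it -->
<img src="corner-curve.gif" alt="">

<!-- Complex content image: a brief ALT plus a pointer to a longer description -->
<img src="enrolments-chart.gif"
     alt="Chart of enrolments, 2003 to 2007"
     longdesc="enrolments-chart-description.html">
```

A background image set through a stylesheet has no ALT attribute at all, so any content it carries needs to be repeated as ordinary text in the page.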
We found several types of problems with text alternatives for images used on university websites. In the last study these problems were categorised into seven types. We have used the same categorisation to allow a comparison between the two studies.
For content images:

- A: the text alternative was not equivalent to the content of the image
- B: the text alternative included unnecessary text
- C: the ALT attribute has been left blank
- D: the ALT attribute is missing from the IMG element
- E: no text alternative was provided for a background image that contains content

For layout or decorative images:

- F: the text alternative included unnecessary text
- G: the ALT attribute is missing from the IMG element
In the 2003 study, problems B and F were recorded as a pass. In this study, we have given them a partial rating for the reasons stated earlier.
| Image text alternatives | 2003 - no. of pages | 2003 - % of pages | 2007 - no. of pages | 2007 - % of pages | Difference |
|---|---|---|---|---|---|
| Images used to provide content | | | | | |
| A: Not equivalent | 41 | 22.8 | 73 | 44.5 | +21.7 |
| B: Included unnecessary text | 21 | 11.7 | 11 | 6.7 | -5.0 |
| E: Background image - none | 1 | 0.6 | 14 | 8.5 | +7.9 |
| E2: Background image - not equivalent | 0 | 0.0 | 1 | 0.6 | +0.6 |
| Images used for layout or decoration | | | | | |
| F: Included unnecessary text | 53 | 29.4 | 82 | 50.0 | +20.6 |
The most interesting data here are the figures for problem types D and G, and then A and F. It is likely that the increased use of content management systems and better web page authoring tools has led to a reduction in the problems with missing ALT attributes (problem types D and G). But the increase in problem types A and F shows that designers and authors still do not properly understand the role of text alternatives, or do not take enough care when writing them.

There was a small decrease in problem type B. In the last study this problem related mainly to the use of sliced images: page authors would repeat the text alternative for each slice. Slicing images was linked to the use of the TABLE element to control layout. Now that layout is more commonly handled by stylesheets, sliced images are not used as often.
Some examples of the image-related problems with text alternatives are outlined below. Each is relatively easily resolved. None require technical re-engineering or major changes to the way that pages are designed. Page designers and authors simply need to take more time to consider the function of the image and write an appropriate alternative that will have the same function or role when the page is used by someone who cannot see the image.
- the same text alternative repeated in the ALT attribute on each slice of a sliced image
- an ALT attribute that was left blank on a content image
- an ALT attribute that was missing from a content image
Although there have been accessibility improvements to the PDF format, most PDF documents remain at least partly inaccessible. This is because most are not created properly and so cannot make use of these improvements. PDF documents are often generated from poorly structured Word documents, and usually converted to PDF using the print option in Word, which means they are not 'tagged' for accessibility. And most PDF documents are not checked to ensure that accessibility features are in place.
In this audit we found more pages where content was published only in PDF format - up 9.9 per cent on 2003. We failed each page that did this because of the incidence of poorly-created documents, and also because the Australian Human Rights and Equal Opportunity Commission recommends against publishing in this way. They say:
"The Commission's view is that organisations who distribute content only in PDF format, and who do not also make this content available in another format such as RTF, HTML, or plain text, are liable for complaints under the DDA" (2002).
It is also worth noting that while PDF is popular with content publishers, many users prefer to avoid it. In a study of prospective students - none with a disability - 23 per cent had difficulty reading or finding information within a PDF document and 15 per cent said they preferred not to use them (Alexander, 2005). Some comments from users in that study were:
"It's so big it's hard to find anything."
"It's frustrating me a bit".
"I have to click on this unfortunately… I don't know why I need to click on an Adobe Acrobat file."
"I know that Acrobat would have details of course that I'm looking for, but I would like to find out the easy way… Acrobat right now for me is not necessary... I would like to find out the easy way."
"I don't like Acrobat… I think it's difficult to read."
When content is included in pages using scripts, a text alternative must be provided using the NOSCRIPT element. This allows equivalent content to be made available when scripting is not available or is disabled. We found 12 pages where no text alternative was provided - up 4 per cent on 2003.
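A minimal sketch of this technique, using an invented script and news list for illustration:

```html
<!-- Script inserts a scrolling list of news headlines -->
<script type="text/javascript" src="news-ticker.js"></script>

<!-- Equivalent content for users whose browsers do not run the script -->
<noscript>
  <ul>
    <li><a href="/news/orientation.html">Orientation week begins 19 February</a></li>
    <li><a href="/news/enrolment.html">Enrolment dates for semester one</a></li>
  </ul>
</noscript>
```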
Text alternatives need to be provided for multimedia content, so that users who cannot see or access the multimedia can make use of the alternative. Most problems with multimedia in this study were with the use of Flash. Examples where no text alternative was provided include:
The incidence of two other problems increased since 2003: problems with the implementation of stylesheets (up 16.4 per cent) and pages that became unusable when scripting was not enabled (up 10.4 per cent).
Checkpoint 6.1 requires designers to make sure that pages can still be read when stylesheets are disabled or when users use their own stylesheets. People with colour blindness or low vision may want to use their own styles to avoid colours that they cannot see well, increase the contrast between text and background colours, or to make font sizes larger.
The main problem observed in this and the earlier study was the legibility of navigation elements. Many designers are still using HTML attributes to colour some page elements, so that when stylesheets are turned off, the HTML-coded colours remain. We found several pages where link text was very difficult to read because it was dark blue (the default link colour when stylesheets are disabled) on a dark HTML-coloured background. Examples of this problem were found on:
We saw two cases where the page reading order became a problem when stylesheets were turned off:
Checkpoint 6.3 requires designers to make sure that pages are still usable when scripting is turned off or not supported. The main problem found in both studies was the use of scripts to activate dropdown menus for "quick links". These menus are unusable if scripting is turned off or not supported. Examples of this problem were found on:
In each of these cases, the potential accessibility problem can easily be avoided using techniques based on server rather than client technology:
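For example, a "quick links" menu built as an ordinary form remains usable when scripting is turned off, provided the server performs the redirect (the action URL and destinations below are hypothetical):

```html
<form action="/cgi-bin/quicklinks" method="get">
  <label for="dest">Quick links</label>
  <select id="dest" name="dest">
    <option value="/library/">Library</option>
    <option value="/timetables/">Timetables</option>
    <option value="/maps/">Campus maps</option>
  </select>
  <!-- The submit button works with or without scripting; the server
       reads the "dest" parameter and issues the redirect -->
  <input type="submit" value="Go">
</form>
```

Scripting can still be layered on top of this markup to submit the form as soon as a selection is made, but the non-script path remains available.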
The trend towards standards-based web design may have had some positive effects on web accessibility in the Australian university sector. Problems related to the use of frames and table-based layouts have decreased since 2003. Stylesheet-related problems have increased, but this is because designers have not moved all colour-related presentation elements to the stylesheet.
Other trends in web design have had a negative impact. The use of PDF documents has increased. Use of scripting is also on the rise, and this may increase further as the popularity of AJAX scripting techniques spreads.
However, the main problems identified in this and the 2003 study have little to do with changing trends in web design or the popularity of standards-based approaches. Page designers and content authors still appear to lack an understanding of the role of text alternatives for non-text elements used on a page.
Making university websites more accessible does not require a huge investment in time or technical skills. And it does not require any significant redesign or re-engineering of existing websites. It simply requires a better understanding of the role of text alternatives, particularly for images.
We recommend that universities:
It is also important that universities include accessibility checking as part of their web publishing quality assurance process. As mentioned in the conclusion of the last study "sign-off authorisation should include not just the acceptance of responsibility for the accuracy of page content, but for the accessibility of that content as well" (Alexander, 2003).
Alexander, Dey (2003) "How accessible are university websites?", Ausweb 03 [HREF15]
Alexander, Dey (2005) "How usable are university websites? A report on a study of the prospective student experience", Ausweb 05 [HREF16]
Australian Vice-Chancellors Committee (2004) "Guidelines on Information Access for Students with a Print Disability" [HREF17]
Australian Vice-Chancellors Committee (2006) "AVCC Guidelines Relating to Students with a Disability" [HREF18]
Australian Human Rights and Equal Opportunity Commission (2002) "World Wide Web Access: Disability Discrimination Act Advisory Notes", Version 3.2 [HREF19].
Chisholm, Wendy, Vanderheiden, Greg and Jacobs, Ian (1999) "Web Content Accessibility Guidelines 1.0" [HREF3]
Edwards, James (2006) "AJAX and screenreaders: when can it work?" [HREF20]
Dey Alexander and Scott Rippon © 2007. The authors assign to Southern Cross University and other educational and non-profit institutions a non-exclusive licence to use this document for personal use and in courses of instruction provided that the article is used in full and this copyright statement is reproduced. The authors also grant a non-exclusive licence to Southern Cross University to publish this document in full on the World Wide Web and on CD-ROM and in printed form with the conference papers and for the document to be published on mirrors on the World Wide Web.