How usable are university websites?
A report on a study of the prospective student experience

Dey Alexander [HREF1], Usability Specialist, Information Technology Services [HREF2], Building 3A, Monash University [HREF3], Victoria, 3800. Dey.Alexander@its.monash.edu.au

Abstract

This paper reports on a study of prospective student experiences of university websites. Thirty-nine participants took part in a usability study which examined 15 university websites (13 Australian sites, one site in the United States and one in the United Kingdom). The participants--all prospective students--were asked to find a course they were interested in taking, the cost and entry requirements for the course, where the course was taught from and whether there were any scholarships they would be eligible to apply for.

Only 62 percent of tasks were completed successfully. Participants had the most difficulty trying to find information about tuition fees and scholarships. The study highlighted five key usability problems that contributed to these results: poor information architecture; poor content; poor search results and/or search interfaces; a reliance on domain knowledge about the higher education sector that many prospective students do not have; and negative reactions to, or difficulty using, PDF documents.

1. Introduction

As part of the requirements analysis for the design of a new prospective students website for Monash University, we conducted a study of the prospective student experience of university websites. Thirty-nine prospective students took part in the study which examined 15 university websites (13 Australian sites, one site in the United States and one in the United Kingdom).

A series of usability tests was conducted between October and December, 2004. The test subjects were prospective students from various market segments including undergraduates (school leavers and non-school leavers), postgraduates, international students and those interested in distance education.

Participants performed a set of information-seeking tasks while being observed, with their activities recorded on video. This paper reports the findings of the study.

2. Methodology

2.1 Overview of the usability testing protocol

A typical usability testing methodology was employed (Barnum 2002, Dumas and Redish 1999, Rubin 1994). In the orientation phase, participants were greeted by the test facilitator and introduced to any observers. After being offered refreshments, they were shown into the office in which the testing was to be carried out. There they received a short, scripted verbal orientation explaining the purpose of the usability test. They were then asked to read and sign a consent form and complete a short background questionnaire.

The test facilitator then introduced them to the "think aloud" protocol (Nielsen 1994). With this technique, participants are asked to verbalise their thought processes while they are working through the assigned tasks. This helps identify the reasons for participants' actions, and elicits useful information about their reaction to design elements and content.

After checking that the default settings on the computer were suitable for them, participants were asked to perform a set of information-seeking tasks on two of the 15 websites that were studied. The tasks were written on a sheet of paper that included a space where participants were asked to indicate their answers. Participants were advised that they could attempt the tasks in any order and were taken to the home page of the first site they were to test. After completing the tasks on this site, participants repeated the tasks on a second site, again starting from the home page. The order in which sites were viewed was randomised.
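The randomisation of site order described above can be sketched as follows. This is an illustrative fragment only; the helper function and site names are hypothetical, not taken from the study's actual test plan:

```python
import random

# Hypothetical helper: given the two sites allocated to a participant,
# return them in a random viewing order to counterbalance order effects.
def randomise_site_order(site_a, site_b, rng=random):
    order = [site_a, site_b]
    rng.shuffle(order)  # in-place random permutation
    return order

# Example allocation for one participant (seeded for repeatability).
pair = randomise_site_order("Monash University", "Deakin University",
                            rng=random.Random(0))
```

Seeding the random generator, as above, makes an allocation reproducible for audit purposes; in a live study an unseeded generator would normally be used.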

Once the tasks were completed on one site, participants were asked to rate their level of confidence in the information they had found. They were then asked to complete a short user satisfaction questionnaire designed to collect and measure their own perceptions of the quality of their experience using that site. The confidence rating and user satisfaction questionnaire were administered again after the second site was tested.

2.2 Participant demographics

Tables 1 to 8 below provide a summary of the demographic information collected from participants via a background questionnaire that was administered prior to the test.

Table 1: Prospective student market segments
Number Group
12 Prospective international students
9 Prospective postgraduate students
9 Prospective distance education students
5 School leavers
4 Non school leavers

Table 2: Level of intended study
Number Study level
24 Postgraduate studies
15 Undergraduate studies

Table 3: Primary study interests
Number Study area
15 Business/commerce disciplines or degrees
5 Computing/information technology
4 Arts/law
3 Science
3 Education
3 Journalism/communication
2 Architecture
2 Social work
1 Engineering

Table 4: Gender
Number Gender
21 Female
18 Male

Table 5: Age
Number Age group
6 15 to 18
21 19 to 24
5 25 to 30
2 31 to 35
1 36 to 40
1 41 to 45
3 Over 45

Table 6: Eyesight
Number Eyesight
19 Did not need glasses or contact lenses
10 Routinely wore glasses or contact lenses
4 Wore glasses for reading only

Table 7: Computer usage
Number Frequency
12 More than 30 hours per week
11 Between 15 and 30 hours per week
9 Between 6 and 15 hours per week
7 Less than 6 hours per week

Table 8: Internet usage
Number Frequency
4 More than 30 hours per week
13 Between 15 and 30 hours per week
12 Between 6 and 15 hours per week
10 Less than 6 hours per week

2.3 Sites included in the study

The 15 university websites included in the study are listed in Table 9, below.

Each site was tested by five participants, except for the Monash University site, which was tested eight times. Some sites were allocated for testing by particular user groups. For example, prospective distance education students tested the websites of universities that are market leaders in distance education, and international students tested the sites of universities that target this market segment. Table 9, below, shows which participant groups tested which sites.

Table 9: Participant groups allocated to test sites
University School leavers Non school leavers Distance education Postgraduate International
Australian National University 1 1 - 3 -
Charles Sturt University - - 5 - -
Deakin University 2 2 1 - -
London School of Economics - - - - 5
Macquarie University - - - - 5
Monash University 2 - 2 1 3
RMIT University 2 1 - 2 -
University of California, Los Angeles - - - - 5
University of Melbourne 2 1 - 2 -
University of New England - - 5 - -
University of New South Wales 1 1 - 2 1
University of Queensland - 1 - 4 -
University of South Australia - - - - 5
University of Southern Queensland - - 5 - -
University of Sydney - 1 - 4 -
Total no. of tests (participants) 10 (5) 8 (4) 18 (9) 18 (9) 24 (12)

2.4 Environment

The test facilitator's office was used to conduct the testing sessions. A Pentium IV-based Acer laptop with a 17-inch external LCD display set at a resolution of 1024 x 768 pixels was provided for participants' use. The operating system was Windows XP (Service Pack 1), running Internet Explorer 6. Participants were asked if they wished to have the screen resolution, browser or font size changed. In all but one case, the default setup was used. One participant chose to use Mozilla instead of Internet Explorer.

A small webcam mounted on top of the monitor enabled us to capture participants' facial expressions and get a rough idea of which part of the screen they were focusing on during the study. Using Techsmith's Morae usability test recording software [HREF18], we were able to record the screen interaction and integrate that with sound and the webcam picture.

2.5 Tasks

Participants were asked to find several items of information related to a course of study. Information-finding tasks were written in the form of questions that participants attempted to answer.

  1. Does this university offer a degree/diploma that you are interested in?
  2. Do you have the qualifications needed to apply for this degree/diploma?
  3. Can you find out how much it will cost you to take this degree/diploma?
  4. Can you find out where this degree/diploma is taught from?
  5. Are there scholarships available for this degree/diploma that you could apply for?

Participants were advised that the universities whose websites they were testing might not offer the specific course they were interested in and, if that was the case, to look for another course that they might still consider.

For task 4, participants who indicated an interest in distance education were asked if they could identify whether the course was offered via distance education. For all other groups, the campus on which the course/degree was offered needed to be identified.

2.6 Data collected

In this study we were not concerned with quantitative usability measures such as task completion times, number of mouse clicks or number of errors. Our aim was to identify good and poor design practices. We did, however, record whether participants were able to complete each task successfully, against a set of predefined completion criteria.

We recorded participants' confidence in the information found on university websites using a five-point Likert scale. We used the same scale to record user satisfaction across a number of dimensions.

3. Discussion of findings

3.1 Usability metrics

One of the most important roles for university websites--some would say the most important--is to market courses to prospective students. This study shows that there is significant room for improvement since prospective students could only find course-related information 62 percent of the time. For some sites, this figure was as low as 40 percent. Table 10, below, shows the task success rates for each university.

Table 11 shows the task success rates for individual tasks. Fewer than half of the participants were able to determine correctly whether the university provided a scholarship for which they would be eligible. Only half managed to find the course fees. In many cases, fee and scholarship information was provided on separate sites, and participants had difficulty finding it as a result. This information was often presented in a formal writing style with heavy use of jargon, and tended to be longer than other course-related content. These issues are discussed, along with other qualitative findings, in section 3.2 below.

Some students (15 percent) were not even able to find a course on their assigned university websites. This was not because the university did not offer something of interest to them, but because some universities refer to subjects as "courses" (and to courses as "programs", though this was not as significant a factor in task failure). Apparently some universities use this terminology because they have purchased software produced in the United States that uses the "courses/programs" nomenclature rather than "courses/subjects". The latter appears to be the vernacular of most prospective students.

Participants' confidence in the information they found on university websites averaged 3.4 on a five-point Likert scale, and they were only marginally satisfied with the performance of these sites, with an average rating of 3.5. Only two university websites rated 4 or more for confidence, and only one rated 4 or more for user satisfaction. Table 12 shows the figures for each university.

Table 10: Successful task completion
University No. of tasks No. of tasks successfully completed Percentage of tasks successfully completed
Australian National University 25 14 56
Charles Sturt University 25 19 76
Deakin University 25 16 64
London School of Economics 25 10 40
Macquarie University 25 10 40
Monash University 40 26 65
RMIT University 25 16 64
University of California, Los Angeles 25 11 44
University of Melbourne 25 20 80
University of New England 25 19 76
University of New South Wales 25 15 60
University of Queensland 25 19 76
University of South Australia 25 18 72
University of Southern Queensland 25 14 56
University of Sydney 25 15 60
Overall 390 242 62
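The overall figure can be checked by recomputing the totals in Table 10. The script below is purely illustrative; the per-site numbers are taken directly from the table:

```python
# (tasks attempted, tasks successfully completed) per site, from Table 10.
results = {
    "Australian National University": (25, 14),
    "Charles Sturt University": (25, 19),
    "Deakin University": (25, 16),
    "London School of Economics": (25, 10),
    "Macquarie University": (25, 10),
    "Monash University": (40, 26),
    "RMIT University": (25, 16),
    "University of California, Los Angeles": (25, 11),
    "University of Melbourne": (25, 20),
    "University of New England": (25, 19),
    "University of New South Wales": (25, 15),
    "University of Queensland": (25, 19),
    "University of South Australia": (25, 18),
    "University of Southern Queensland": (25, 14),
    "University of Sydney": (25, 15),
}

attempted = sum(a for a, _ in results.values())   # 390
completed = sum(c for _, c in results.values())   # 242
rate = round(100 * completed / attempted)         # 62 percent overall
```

Note that attempted tasks total 390 (78 site tests of five tasks each), consistent with 39 participants each testing two sites.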

Table 11: Successful task completion, by task (percentage)
University Task 1
(Degree)
Task 2
(Entry reqs.)
Task 3
(Fees)
Task 4
(Location)
Task 5
(Scholarships)
Australian National University 80 100 60 40 0
Charles Sturt University 100 100 40 60 80
Deakin University 100 80 40 80 20
London School of Economics 80 40 60 0 20
Macquarie University 60 60 40 40 0
Monash University 88 63 75 50 50
RMIT University 100 80 20 80 40
University of California, Los Angeles 40 80 40 60 0
University of Melbourne 100 80 80 60 80
University of New England 100 80 60 80 60
University of New South Wales 80 40 40 80 60
University of Queensland 80 60 60 100 80
University of South Australia 100 100 60 80 20
University of Southern Queensland 100 60 0 80 40
University of Sydney 60 60 60 80 40
Overall 85 72 50 64 40

Table 12: User confidence and satisfaction (5-point Likert scale)
University Confidence in information found User satisfaction
Australian National University 2.4 3.3
Charles Sturt University 3.8 4.1
Deakin University 3.2 3.6
London School of Economics 3.0 3.0
Macquarie University 3.3 3.5
Monash University 3.8 3.6
RMIT University 3.0 3.7
University of California, Los Angeles 3.0 3.3
University of Melbourne 3.6 3.8
University of New England 3.4 3.4
University of New South Wales 3.4 3.7
University of Queensland 3.4 3.3
University of South Australia 4.4 3.9
University of Southern Queensland 2.8 2.9
University of Sydney 4.0 3.6
Overall 3.4 3.5
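The overall ratings in Table 12 can be reproduced, to one decimal place, as the unweighted mean of the per-site ratings. This is an illustrative check only; the paper does not state whether the published overall figures were weighted by number of participants:

```python
# Per-site ratings from Table 12, in the order the sites are listed.
confidence = [2.4, 3.8, 3.2, 3.0, 3.3, 3.8, 3.0, 3.0,
              3.6, 3.4, 3.4, 3.4, 4.4, 2.8, 4.0]
satisfaction = [3.3, 4.1, 3.6, 3.0, 3.5, 3.6, 3.7, 3.3,
                3.8, 3.4, 3.7, 3.3, 3.9, 2.9, 3.6]

def mean_to_1dp(ratings):
    """Unweighted mean of the per-site ratings, rounded to one decimal."""
    return round(sum(ratings) / len(ratings), 1)
```

The unweighted means come out at 3.4 for confidence and 3.5 for satisfaction, matching the "Overall" row.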

3.2 Design problems that negatively affected usability

Five main groups of design-related problems contributed to the poor usability metrics and had a negative impact on usability in general. These related to: information architecture; the quality of content; the usefulness of search; an assumption that prospective students have sufficient domain knowledge about the university sector; and the use of PDF documents.

3.2.1 Information architecture

Every participant in the study encountered at least one usability problem related either to the structure of the site or to the manner in which navigation systems were designed.

Often, course-related content was not co-located within the site (in a section or sub-site devoted to prospective students, or to one of the main prospective student sub-groups), but provided on another part of the site; 97 percent of participants were adversely affected by this.

The structure of many university websites mimics the structure of the organisations that own them. For example, there might be a Fees Office website and a Scholarships Office website. Each part of the organisation has its own website and the information that it is responsible for lives on that site. While this makes for ease of publishing and maintenance, it creates havoc for those who need to find the information. The publication of course guides and online course databases is an attempt to pull together course-related information into a single publication or sub-site, but much of the ancillary information that prospective students require is located in other places within the wider university web presence.

The findings of this study demonstrate that information that is not contained within a prospective students website--fees and scholarship information, for instance--is much harder to find. Sometimes this was because the information was not linked from course overview pages. Some students didn't know where to look and others simply gave up at this point, concluding that the university did not provide the information online. In other cases, the information was linked, but required the student to navigate through the hierarchy of a separate site and/or wade through content that was generic and left them feeling unsure about whether it was relevant to their particular course. Many participants were confused or annoyed by this, as the following comments indicate:

"I wouldn't have a clue where to go… they should have a link."

"They're not there on that page with all the other information."

"They should have that here… they have everything here about the course except the fees."

"If it doesn't have anything about prices it would haven't anything about scholarships… I don't see anything here."

"I'd imagine they'd have it underneath... that's the most logical."

"I don't know where it is... I really don't know where to go."

"I don't like this site… it's hard for me to find things."

"I'm stuck now, I feel there's nowhere else to go."

"This is going to take a bit extra… it would be nice if it was just on the thing."

"I'd expect it just to give me the advice straight away."

"It doesn't seem like it will tell me the cost here... this is really general fees really."

"Now I'm not in journalism any more, it's all broad."

"I think this is very general – maybe for all the courses."

"General pre-requisite, not very specific."

"I'm thinking this is a crappy site, an annoying site.... it's directing me to too many places."

"See, it's just taking me to too many sites… I don't want to go all over the place."

"When you want the answer to a question you don't want to be going from site to site to site".

"Do you want me to keep looking... I found that so frustrating".

Poorly-written link text presented another information architecture-related problem. This adversely affected 31 percent of participants. Two main problems were observed. First, some participants overlooked links to the information they were seeking. This seemed to be because the words used in the link text did not trigger the appropriate response; the links failed to present a "call to action" (McGovern 2005b). Given that participants usually only scan web pages, link text needs to stand out, and to stand out it must use words that are meaningful to the intended audience (Alexander 2000). Second, some participants had problems when the link text did not fully or accurately describe the linked content. For example, a link labelled "International students" was followed by two international students interested in postgraduate study. The link led to undergraduate information. Both students wasted some time on the linked page before realising they were looking at undergraduate information. Neither noticed the heading "Specialized information for undergraduates" above the link they had clicked.

For 18 percent of participants, poor placement of navigation elements led to delays or failure in finding information. In general, participants appeared to scan the central part of the screen when moving towards their desired content, looking only to the top or left of the screen--where navigation elements are generally located--when changing navigation strategy or beginning a new task. New navigation that was introduced at the top or on the side of the page often went unnoticed.

3.2.2 Content

It is hard to argue with Nielsen's statement that "Ultimately, users visit your website for its content. Everything else is just the backdrop" (Nielsen 2000:99). Unfortunately, the content on university websites often fails to deliver. The main reason for this is that the content does not appear to be written for its target audience. It uses a language that they are not familiar with, and does not adequately address their information needs. All participants in the study encountered at least one usability problem related to the content they found on the sites tested.

Some comments by participants illustrate these problems:

"What's the exact difference between programs and courses?"

"It's a degree called 'Introduction to Psychology'."

"I think this is a subject… oh no it says it's a course.. but here it's just a subject."

"I'm thinking this one are the subjects for the course… it's not actually a course itself… yeah, initially I thought it was a course."

"I just got some subjects rather than a course… I think these are subjects so I'm not sure that this helps me."

"There's a lot more journalism courses than I thought… just trying to remember which one I applied for" (participant was looking at a list of subjects).

"I don't know what coursework means… I guess I'll just go into one of the prospective things."

"I thought it [coursework] was colloquial to the other university."

"What the hell is compulsory non-academic fee?"

"I'm thinking what does this [EFTSU] mean… Electronic funds transfer something…"

"I'm thinking because they did not write postgraduate, do they say postgraduate another way."

"'Graduate Admissions'... maybe that means postgraduate."

"Faculty is supposed to be Faculty of Business, faculty of something."

"Full-time in 2 sessions.... is that a semester?"

"This looks like a lot of legal jargon here… it's not really about eligibility."

"I've not been to uni before so I'm not right up with all this."

"It says 'per 8 point subject'… I'm not quite sure what that means."

"I don't know if it's some of the lingo they use on these kind of websites."

"One of the things that a lot of universities don't provide is an idea of the subjects… TAFEs have a lot more in their course guides. When you get a TAFE course guide, you've got a list of your subjects that you're doing."

"It hasn't got a specific amount… hasn't got any amounts really."

"It's not really clear what the cost is. I think I'd need to do a bit of work to figure it out myself, but I'm not exactly sure how."

"It's directing me to the VTAC guide… would be nice if it listed what I needed on the site."

"This is quite confusing actually... it should say it on the site… that would make life easier" (page did not provide specific entry requirements but referred students to the VTAC Guide).

"I know all this already. I just want a table… I don't feel like I got the real price... what I wanted."

"I'd need a calculator."

"Too much detail for me to look at… I wouldn't waste my time on here trying to work it out for myself."

"Oh god, I don't want to read through all this."

"There's a lot of information here, that isn't necessarily a good thing all the time."

"A bit confusing all this scholarship information."

"Unsure which one I am… it doesn't mention HECS."

"It's very confusing."

"Oooh, this is very confusing... 'Commonwealth Supported Place' blah, blah, blah".

3.2.3 Search

Poorly designed site structure and navigation, together with the lack of clarity and detail in some content areas, led many users to try searching for information on university websites, though participants rarely used search as their primary information-seeking strategy. Seventy-seven percent of participants experienced difficulties when using search, and this figure may have been higher had all participants attempted to use it.

Several usability problems were noted when users attempted to find information by using the site's search engine.

While improved search technology can help users locate the information they are looking for, most poor search results observed in this study were caused by poor writing and publishing practices. Many universities do not have separate university-wide intranets and publish the vast majority of their content on their publicly-accessible websites. This fills the search engine index with pages of information that are largely irrelevant to external users, particularly prospective students. Another problem that plagues university websites is the practice of publishing information in several places: faculties publish information about courses, entry requirements, fees and scholarships and much of this is repeated on other faculty sites and again on the sites of administrative units. In a distributed publishing environment, where most content is produced by staff with little time or expertise in online publishing, metadata quality is generally poor. Many page titles encountered by participants when searching were inaccurate and/or inadequate, leading participants to click on pages linked in search results, only to find that the page content was not what they expected to find.

Some search user interfaces need improvement as well. Simplicity is the key (Nielsen 2005). Many participants stumbled through advanced search options and were tripped up by searches whose scope was limited to particular parts of the site. On one site, a box labelled "Quick Search" turned out to search courses only, yet participants used it to look for a wide range of course-related content. On another, the search was restricted to undergraduate information and several users failed to notice this.

The following are some of the participants' comments while trying to use search on university websites:

"It's come up with too much… got me confused now."

"I don't know which one to go to."

"Hmm… I need help."

"I'm thinking that would have been a very obvious site to come straight away and it's come up with European studies and linguistics and things like that."

"I'm looking for a master of business administration but it looks like they don't have it."

"This was pretty hard to find, because even if you put it in the search it doesn't come up."

"In this case, I'd love it if there was a drop list" (after several unsuccessful course searches).

"I'm thinking this site doesn't offer postgraduate courses" (after getting no results and not realising his search was limited to undergraduate information).

"It’s horrible... this is a not nice site."

3.2.4 Domain knowledge

Some participants (38 percent) struggled to complete tasks because they did not possess certain domain-specific knowledge about the higher education sector. Two main problems were observed.

These problems arose mainly in the design of advanced search interfaces, where users were given options to filter course searches by study level or faculty. But they also occurred in simple navigation structures.

Participant comments included:

"Diploma is different from undergraduate course isn't it?"

"I'm unclear if it's all one course."

"I'm wondering which faculty it's a part of".

"I thought there'd be an education (faculty) and there's not really".

3.2.5 PDF documents

The use of PDF documents on the web has been steadily increasing, presumably because conversion from the original document format is easy, because authors believe their documents cannot be altered or copied, and because PDF is believed to be widely accessible. Despite warnings from the Australian Human Rights and Equal Opportunity Commission that PDF documents may present significant accessibility barriers to disabled users [HREF19], they remain popular with publishers.

This research shows that PDF documents can cause significant problems for users, even users without disabilities that affect their use of the web. Almost a quarter (23 percent) of participants had difficulty reading or finding information within a PDF document, and 15 percent indicated a preference not to read PDF documents. These figures might have been higher still, since not all participants encountered PDF documents on the sites studied.

Participant comments on PDF documents included:

"It's so big it's hard to find anything."

"It's a bit awkward to read… I'll print that out because some of the formatting, it's not easy to read on screen."

"Come on, open."

"Why isn't this opening?"

"It's frustrating me a bit" (comment on a blurring effect while scrolling a PDF document with very large images).

"I have to click on this unfortunately…. I don't know why I need to click on an Adobe Acrobat file."

"I know that Acrobat would have details of course that I’m looking for, but I would like to find out the easy way… Acrobat right now for me is not necessary... I would like to find out the easy way."

"I don't like Acrobat… I think it's difficult to read."

3.2.6 Other usability problems

A number of other, less frequent usability problems were also observed during the study.

3.3 Positive design features

There were three features of sites that prospective students reacted positively to: concise summary tables presenting key course facts "at a glance"; pages that brought together related course information such as entry requirements, fees and application procedures; and listings of the subjects taught within a course.

Some participants' comments on these issues included:

"I like this little table rather than reading through other things."

"I like how it has the quick 'At a glance'."

"This is something I like… when I come to this page it gives me entry requirements, programs and fees, application procedure…"

"I like this. This is good because it's all laid out in front of me. I can see in the headings what's underneath them, they're very clear."

"Great that they have all the subjects here."

"It's got subjects which is really good." (And after seeing links to subject outlines) "excellent, I really like that".

4. Solving the design problems

If universities want to successfully market their courses online, their websites must be designed to meet the needs of prospective students. This study has revealed that the online experience offered to prospective students leaves room for significant improvement and suggests several strategies that are likely to bring positive results.

4.1 Design an information architecture that meets prospective student needs

The design of an information architecture has a significant impact on people's ability to find information on a website (Fleming 1998, Reiss 2000, Rosenfeld and Morville 2002, Wodtke 2003, Barker 2005). Information architects warn that organisational structures generally make for poor information classification schemes, even on intranets where some users may be familiar with the structure of an organisation and the various types of information each business unit may be responsible for (Maurer 2003). Despite this, most universities have allowed their websites to grow into a series of sub-sites, each confined to the business activity of a particular part of the organisation.

As reported above, many prospective students simply did not know where to look for information if it was not linked from or co-located with the page where they ended their previous task. They hadn't formed a mental model of a university site as a collection of independent sub-sites, each managed by different organisational groups. However, even if prospective students did understand that university websites were structured this way, the structure of universities is not standardised, so they may not know which sub-site to go to. Where links to information in other sub-sites were provided, users were often taken to the sub-site home page and had to work their way through another set of navigation links to find the desired information. Many were clearly frustrated by this. Others lost their way.

The optimal strategy to improve findability is to create a self-contained prospective students website, designing an information classification scheme that makes sense to prospective students. The creation of course guides and online course databases is a good start, but more effort and co-operation between organisational units is required to bring related content together from different parts of the website.

Navigation elements should not be introduced on peripheral areas of the screen. The primary screen real estate is the central part of the screen. This is the place where most people's attention is focused most of the time, and particularly when moving forward towards information. Once navigation options have been introduced in the centre of the screen, they can be moved off to the side or top into a navigation column or bar if required.

Navigation and findability would also be improved by the use of better link text. Links need to accurately describe the linked content, using terminology that is meaningful to prospective students. Link text should also be self-contained, so that it makes sense when read out of context, as users often scan pages, clicking quickly on links that seem relevant, without reading adjacent text or headings.

4.2 Create content that meets the needs of prospective students

"The biggest mistake in content management is writing for the organisation and not for the reader" (McGovern 2005a). Web content is only useful when it can be understood by its target audience and when it meets their information needs.

Universities need to make sure that content speaks the language of target users rather than being littered with institutional jargon and acronyms--"the lingo that they use on these sites"--that prospective students simply cannot fathom. It must be written concisely and formatted in a manner that makes it easy to scan, since prospective students are like most other users of the web--impatient and time-poor (Nielsen 1997). They scan text rather than reading it closely. Content must also cater to prospective students' information needs. Key content--like tuition fees--needs to be provided upfront, not buried under a mountain of policy.

To do this, editorial guidelines and review processes need to be implemented. Ideally, user information needs should be determined, and content evaluated, using user-centred design methods. Kuniavsky (2003) and Courage and Baxter (2005) discuss user research methods, tools and techniques that university web content teams could readily adopt. Most importantly, content development and maintenance need to be adequately resourced, rather than being tasks that staff must find time for in addition to their regular duties.

4.3 Improve the performance of search

Search results for prospective students could be improved by providing "best bets"--a list of key resources marked to appear as special selections at the top of the search results page when users enter certain keywords. This is a useful strategy for large websites with distributed authorship, where guidelines for accurate and useful page titles cannot be enforced. It is also important where universities do not publish internal information on a separate intranet, and where multiple versions of information are published on different publicly-accessible sub-sites.
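The "best bets" mechanism can be sketched as a simple editor-curated lookup consulted before the full-text engine runs. The keywords, titles and URLs below are hypothetical placeholders; a real deployment would derive them from search-log analysis.

```python
# Hypothetical best-bets table: curated keyword -> resource mappings.
BEST_BETS = {
    "fees": [("Tuition fees for prospective students", "/study/fees")],
    "scholarships": [("Scholarships office", "/study/scholarships")],
    "psychology": [("Psychology and behavioural sciences courses",
                    "/courses/psychology")],
}

def search(query, full_text_search):
    """Return best bets for any matched keyword, then ordinary results."""
    terms = query.lower().split()
    best = [hit for term in terms for hit in BEST_BETS.get(term, [])]
    ordinary = full_text_search(query)
    # Best bets come first, tagged so the interface can style them
    # as special selections at the top of the results page.
    return [("best bet", title, url) for title, url in best] + ordinary

# Usage with a stand-in full-text engine that returns nothing.
results = search("psychology fees", lambda q: [])
```

Because the table is consulted by exact keyword, a small amount of editorial effort (adding synonyms such as "HECS" or "tuition") goes a long way for the high-frequency queries prospective students actually type.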

Search results would also be improved by putting measures in place to improve metadata--page titles in particular. Guidelines, training and some sort of auditing process are likely to be necessary.
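An auditing process for page titles can start very simply: flag pages whose title is missing, generic, or duplicated across different pages. The sketch below illustrates the idea; the list of "generic" titles and the sample pages are assumptions, not data from the study.

```python
from collections import Counter

# Illustrative titles that say nothing about a page's content.
GENERIC_TITLES = {"", "untitled", "home", "new page 1"}

def audit_titles(pages):
    """`pages` maps URL -> <title> text.

    Returns (generic, duplicated): URLs with meaningless titles, and
    URLs sharing a title with at least one other page.
    """
    counts = Counter(t.strip().lower() for t in pages.values())
    generic = [url for url, t in pages.items()
               if t.strip().lower() in GENERIC_TITLES]
    duplicated = [url for url, t in pages.items()
                  if counts[t.strip().lower()] > 1]
    return generic, duplicated

# Hypothetical sample: one untitled page, two sharing a title.
pages = {
    "/fees": "Untitled",
    "/arts": "Faculty of Arts",
    "/arts/courses": "Faculty of Arts",
}
generic, duplicated = audit_titles(pages)
```

Reports like these give distributed authors concrete, fixable items, which is easier to act on than a general exhortation to "improve metadata".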

Search interfaces should be kept simple, using a keyword input box and an associated button, and search scoping should be implemented with great care.

4.4 Do not assume that prospective students have the relevant domain knowledge

Universities need to take care that they do not rely on prospective students' understanding of the structure of the university or its courses.

When designing browsing options, use discipline groups that clearly specify the cluster of related disciplines, rather than faculty names. In some universities, psychology may be taught in the Faculty of Science, while in others, it belongs to the Faculty of Arts. Offering browse options of "Science" or "Arts" requires prospective students to know the structure of the university in question. Presenting options like "Psychology and behavioural sciences" avoids the need for this domain knowledge. Do the same when designing search filters, and always provide the option to search in "All".
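One way to decouple browsing from organisational structure is to maintain a mapping from discipline-group labels to the courses behind them, so that what prospective students see never depends on which faculty happens to teach a subject. The group names and course codes below are hypothetical.

```python
# Hypothetical discipline-group index. Each group aggregates courses
# regardless of which faculty owns them internally.
DISCIPLINE_GROUPS = {
    "Psychology and behavioural sciences": ["B-PSYCH", "B-BEHSC"],
    "Computing and information technology": ["B-COMP", "B-IT"],
}

def browse_options():
    """Labels offered to users, with an 'All' option for search filters."""
    return ["All"] + sorted(DISCIPLINE_GROUPS)

def courses_in(group):
    # "All" spans every group, so users never need domain knowledge
    # about the university's faculty structure to find a course.
    if group == "All":
        return [c for courses in DISCIPLINE_GROUPS.values() for c in courses]
    return DISCIPLINE_GROUPS.get(group, [])
```

The same index can drive both the browse navigation and the search filter drop-down, keeping the two consistent with each other.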

4.5 Do not use PDF as the primary format for web content

While PDF can be a useful format for information that is intended to be printed, it can be highly unusable as a primary format for online information. The observations from this study send a strong signal to those who rely heavily on the use of PDF: your users will not be pleased, and will struggle to find information, if you only publish content as a series of PDF documents.

If PDF documents are used online, it is best to use them to provide an alternative, printable format. If they are used as the sole format for content--which is not recommended--ensure they are formatted in a way that makes them easy to read online. Where tables of data are provided, make sure column headings are repeated on each page. Do not use large images. Avoid using large areas of whitespace. If a table of contents is provided, ensure that each item acts as a link to the relevant page. If PDF documents are created from larger documents and page number references are given in the text, make sure the pages have been renumbered to reflect the size of the smaller document and that in-text page references are updated.

References

Alexander, D. (2000) "Don't 'click here': writing meaningful link text", [HREF26].

Australian Human Rights and Equal Opportunity Commission (2002) "World Wide Web Access: Disability Discrimination Act Advisory Notes", Version 3.2 [HREF19].

Barker, I. (2005) "What is information architecture?", KM Column, [HREF21].

Barnum, C.M. (2002) Usability Testing and Research, New York: Pearson Education.

Courage, C. and Baxter, K. (2005) Understanding Your Users: A Practical Guide to User Requirements Methods, Tools, and Techniques, San Francisco, CA: Morgan Kaufmann.

Dumas, J. and Redish, J.C. (1999) A Practical Guide to Usability Testing, revised ed., Portland, OR: Intellect.

Fleming, J. (1998) Web Navigation: Designing the User Experience, Sebastopol, CA: O'Reilly.

Kuniavsky, M. (2003) Observing the User Experience: A Practitioner's Guide to User Research, San Francisco, CA: Morgan Kaufmann.

Maurer, D. (2003) "Escaping the organisation chart on your intranet", KM Column, [HREF20].

McGovern, G. (2005a) "Do you make the most common mistake in content management?", New Thinking, [HREF22].

McGovern, G. (2005b) "Your website needs a call to action", New Thinking, [HREF25].

Nielsen, J. (1994) Usability Engineering, San Francisco, CA: Morgan Kaufmann, pp. 195-198.

Nielsen, J. (1997) "How users read on the web", Alertbox, [HREF23].

Nielsen, J. (2000) Designing Web Usability, Indianapolis, IN: New Riders.

Nielsen, J. (2005) "Mental models for search are getting firmer", Alertbox, [HREF24].

Reiss, E. (2000) Practical Information Architecture: A Hands-on Approach to Structuring Successful Websites, Harlow: Addison Wesley.

Rosenfeld, L. and Morville, P. (2002) Information Architecture for the World Wide Web, 2nd edition, Sebastopol, CA: O'Reilly.

Rubin, J. (1994) Handbook of Usability Testing: How to Plan, Design and Conduct Effective Tests, New York: Wiley.

Wodtke, C. (2003) Information Architecture: Blueprints for the Web, Indianapolis, IN: New Riders.

Hypertext references

HREF1
http://deyalexander.com/
HREF2
http://www.its.monash.edu.au/
HREF3
http://www.monash.edu.au/
HREF4
http://www.anu.edu.au/
HREF5
http://www.csu.edu.au/
HREF6
http://www.deakin.edu.au/
HREF7
http://www.lse.ac.uk/
HREF8
http://www.mq.edu.au/
HREF9
http://www.rmit.edu.au/
HREF10
http://www.ucla.edu/
HREF11
http://www.unimelb.edu.au/
HREF12
http://www.uq.edu.au/
HREF13
http://www.une.edu.au/
HREF14
http://www.unsw.edu.au/
HREF15
http://www.unisa.edu.au/
HREF16
http://www.usq.edu.au/
HREF17
http://www.usyd.edu.au/
HREF18
http://www.techsmith.com/products/morae/
HREF19
http://www.hreoc.gov.au/disability_rights/standards/www_3/www_3.html
HREF20
http://www.steptwo.com.au/papers/kmc_orgchart/
HREF21
http://www.steptwo.com.au/papers/kmc_whatisinfoarch/
HREF22
http://www.gerrymcgovern.com/nt/2005/nt_2005_01_31_readers.htm
HREF23
http://www.useit.com/alertbox/9710a.html
HREF24
http://www.useit.com/alertbox/20050509.html
HREF25
http://www.gerrymcgovern.com/nt/2005/nt_2005_05_16-call-to-action.htm
HREF26
http://deyalexander.com/papers/clickhere.html
 

Copyright

Dey Alexander, © 2005. The author assigns to Southern Cross University and other educational and non-profit institutions a non-exclusive licence to use this document for personal use and in courses of instruction provided that the article is used in full and this copyright statement is reproduced. The author also grants a non-exclusive licence to Southern Cross University to publish this document in full on the World Wide Web and on CD-ROM and in printed form with the conference papers and for the document to be published on mirrors on the World Wide Web.