Improving the student experience of a university web site

Maria Moore, Web Development Coordinator, email: Maria.Moore@utas.edu.au, Centre for the Advancement of Learning and Teaching (CALT) [HREF1], University of Tasmania [HREF2] , Private Bag 133, Hobart 7001

Ben Cleland, ICT Projects Officer, email: Ben.Cleland@utas.edu.au, The Graduate School, University of Tasmania, Private Bag 8, Hobart 7001

Marcus Eddy, Web Projects Coordinator, email: Marcus.Eddy@utas.edu.au, CALT

Abstract

This paper reports on a project undertaken to improve University of Tasmania (UTAS) web site resources for potential students, focusing on domestic undergraduate science students. The project comprised web site evaluation, task-based usability testing and a student survey, and aimed to improve the suitability of Faculty of Science, Engineering and Technology (SET) sites as recruitment tools. To achieve this, the project concentrated on improving information retrieval and site management, and on separating public or 'www' content from internal resources. Traditional marketing programs to recruit students have limited impact on web site usability because they do not address the needs of students, which this study revealed to be utilitarian. Usability testing confirmed that the usability of the UTAS web presence for prospective domestic undergraduate students was 'average', as defined by Alexander (2005) [HREF3], for the tasks tested in that study. However, moderate changes to search and navigation, and the application of a client-focused content model, produced increases in task success of 20.7-26.6% in tasks equivalent to those of Alexander (2005). Such gains will, however, be short-lived unless they are advocated and managed centrally, and supported by effective site structure and by management and authoring standards enforced at a University-wide level.

Introduction

UTAS has a distributed model of web site responsibility, with web sites considered an official publication of the section that maintains them. UTAS web resources consist of approximately 360 web sites, with 72 of these promoting course information to prospective students. The Centre for the Advancement of Learning and Teaching (CALT) [HREF1] is the only UTAS business unit that provides advice, training, a policy framework and support for UTAS web developers.

Faculty and School sites are delivered through a content management system, the Faculty Content Management System (FCMS), and through static pages. The current FCMS is based on an ASP-driven system built locally (by a company that no longer exists). It contains core information areas considered important for student recruitment. Additional pages may be offered using static templates that mimic the Faculty and School visual standard.

This paper reports on components of two projects: one to improve the student experience of university web services (ISEUWS), conducted by CALT in conjunction with the Student Centre [HREF4] (responsible for information for prospective students), and another seeking to improve the marketing information provided by the Faculty of Science, Engineering and Technology (SET) [HREF5]. In the SET project, a total of 16 SET School or Centre sites were evaluated using the UTAS 'Health Check' framework, which was designed to identify issues that detract from an optimal user experience [HREF6]. CALT's contribution to the ISEUWS project was to focus on improving the usability and accessibility of the 'prospective student' suite of web sites.

Web Site Evaluation

Methodology

Faculty of SET sites

The Health Check site evaluation process as described in Moore (2006) [HREF6] was used to examine a total of 16 SET School and Centre sites. For each issue found, a rectifying action was defined, given a priority, assigned to an appropriate person (CALT Web Services staff or Faculty of SET staff, depending on the task) and recorded in an Action Plan for each site. One action may address more than one issue.

Analysis of Faculty of SET sites

To help determine the impact of what was found in the Health Checks, the severity of each issue for the user was calculated according to the following equation from Nielsen and Loranger (2006):

severity = (frequency × impact × √persistence) / √10

That is, the frequency of the issue is multiplied by the impact, that number is multiplied by the square root of the persistence, and the result is divided by the square root of ten. This yields a number from 0 to 100, where 0 is very good and 100 is very bad.

Table 1: Severity Rating for the User
Factor Description Calculation
Frequency How many users will encounter the problem? If a relatively small number of users are hurt by it, it receives a lower severity rating. range 0-10
Impact How much trouble does this problem cause for people who encounter it? This can range from barely perceptible to losing hours of work or choosing to leave a web site. range 0-10
Persistence Is it a one-time problem or does it cause trouble repeatedly? Many usability problems have low persistence because people figure them out. Other design issues are so confusing that people get lost over and over again; these need a higher severity rating. range 0-10
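
As a worked illustration, this calculation is easily expressed in code. The following is a minimal sketch; the factor scores passed in are hypothetical, not taken from the Health Checks:

    import math

    def severity(frequency, impact, persistence):
        """Severity of a usability issue per Nielsen and Loranger (2006):
        (frequency x impact x sqrt(persistence)) / sqrt(10).
        Each factor is scored 0-10; the result runs from 0 (very good)
        to 100 (very bad)."""
        return frequency * impact * math.sqrt(persistence) / math.sqrt(10)

    # Hypothetical issue: most users encounter it (8), it is moderately
    # disruptive (5) and fairly persistent (6)
    print(round(severity(8, 5, 6), 1))  # 31.0

At the extremes, an issue scoring 10 on all three factors rates 100, and a score of 0 on any factor reduces the severity to 0.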

The Health Check process contains questions relating to site management, which have minimal or no direct impact on the user experience but are significant for the University. For example, the user will not be directly aware of the suitability of the site administrator's computer for performing site maintenance, unless it prevents this work from happening altogether. A similar severity rating system was derived for issues relevant to the University.

Table 2: Severity Rating for the University
Factor Description Calculation
Frequency How frequent is the business risk to the University? range 0-10
Impact How much potential trouble does this problem cause for the University? Is there evidence of bad site management or potential for litigation? range 0-10
Persistence Is it a one-time problem or does it cause trouble repeatedly? Many site management problems have low persistence because people figure out a way around them. range 0-10

The types of issues found by the Health Check process were classified according to the 'Scale of Misery' determined by Nielsen and Loranger (2006), which weights issues by how frequently they cause a user to fail a task and by how much confusion and dissatisfaction each issue causes. The classifications used (see Table 3) were: Search, Findability, Page Design, Information, Task Support and Fancy Design, with University Site Administration added for issues specific to the University.

For the purposes of this project, the following definition of information architecture was taken from Rosenfeld and Morville (2002), who define it as:

  1. The combination of organisation, labeling and navigation schemes within an information system.
  2. The structural design of an information space to facilitate task completion and intuitive access to content.
  3. The art and science of structuring and classifying web sites and intranets to help people find and manage information.
  4. An emerging discipline and community of practice focused on bringing principles of design and architecture to the digital landscape.

University and Student Centre Sites

Improvements to UTAS and Student Centre sites were derived from usability testing of the Faculty of SET sites and the University of Tasmania site as a whole. See below for the methodology.

Actions completed and results

Faculty of SET sites

CALT staff focused on efforts whose effectiveness would last longest and whose benefits would have the widest scope. Changes to the core functions of the FCMS will benefit all Faculty and School sites. In terms of site improvement, the authors focused on:

Faculty of SET staff focused on:

Information: content, product information, corporate information, depending on available time

Table 3: Overall reduction in the severity of 'Misery' found in SET sites (as percent of total severity found), number of issues rectified out of total
Source of Misery User University
Search 56.50% reduction, 25/45 issues rectified 35.40% reduction, 27/80 issues rectified
Findability 19.40% reduction, 7/67 issues rectified 19.14% reduction, 11/76 issues rectified
Page Design 8.48% reduction, 8/35 issues rectified 23.92% reduction, 8/35 issues rectified
Information 8.77% reduction, 10/101 issues rectified 14.98% reduction, 9/115 issues rectified
Task Support 9.47% reduction, 1/4 issues rectified 6.66% reduction, 1/5 issues rectified
Fancy Design none none
University Site Administration 0%, no issues having direct impact on user 4.65% reduction, 15/48 issues rectified
Overall 20.54% reduction, 51/252 issues rectified 17.45% reduction, 71/359 issues rectified

University and Student Centre Sites

Changes were made to:

Task-based Usability Testing

Methodology

A user testing methodology was developed by observing usability testing at the State Library of Tasmania, by consulting online resources on web usability such as Usability.gov [HREF8], and by attending the Usability in Practice: Three-day Intensive Camp [HREF9] in 2004. The same methodology has been used for all usability testing conducted to date, to keep testing consistent, to enable cross-test analysis, and to ensure all findings can be substantiated.

Each test requires, as a minimum, a participant and a facilitator. One or two observers may also be present, acting as additional recorders. Tasks are presented to participants on paper and also read out to them. Participants were timed, and their actions recorded in notes taken by the facilitator. These notes are sufficiently comprehensive to allow reconstruction of the actions of the user, to capture some of the comments made, and to substantiate the findings. For this project, only the participant and facilitator were present. Although the methodology is by necessity low-tech at this time, it is mobile, so all tests could be conducted on site. A private study room was used at each location, and the same facilities were used for the first and second tests, with the exception of one student (see below).

Participants were greeted, allowed to adjust the computer and chair, and were given the following information:

At the end of each task, participants were asked two questions requiring subjective ratings on a seven-point Likert scale: one on their confidence in their answer, and one on their satisfaction with the task (reported as 'Confidence' and 'Satisfaction' in Tables 7 to 9).

At the completion of each task, the browser was returned to the relevant home page and the participant was given the next task. At the conclusion of testing, each participant was given a short questionnaire (results described below). The facilitator remained neutral throughout the task.

Participants

Participants were recruited from science 'streams' at two local matriculation colleges and pre-screened through a questionnaire. They were required to have:

All participants were in the age group 16-19 and in year 11 or 12. All participants received a showbag of 'goodies' from SET at the end of each test.

Tests

Two sets of tests were conducted with two groups: one group started from the UTAS home page, the other from the Faculty of Science, Engineering and Technology home page. Various changes were made based on the results of the first test, and a second set of tests was conducted using different participants.

Table 4: Date, Location and Number of Participants in each Test
Test Date Location Number of Participants
SET - Test 1 2, 3, 10 August 2006 The Hutchins School 6
SET - Test 2 28 February 2007 The Hutchins School 6
UTAS - Test 1 11, 14 August 2006 Elizabeth College 6
UTAS - Test 2 1 March 2007 Elizabeth College 5

Computers used

Elizabeth College: All students used the same desktop computer with the screen set to a resolution of 1024x768, Windows XP Professional, Internet Explorer 6.029, a 2.8GHz Celeron processor, 504MB RAM, and a 10Mbps internet connection with no throttling but with a connection time-out of 2 minutes. Students have an account limit of 50MB, which refreshes weekly.

The Hutchins School: All but one student used the same desktop computer with the screen set to a resolution of 1024x768, Windows XP Professional, Internet Explorer 6.029, a 2.6GHz Celeron processor, 248MB RAM, and a 1Mbps internet connection that throttles back to 64Kbps for files above 2MB in the first test and 5MB in the second (Windows and security updates are exempt from this throttle). Students have an account limit of 30MB, which refreshes weekly. One student in the second test used an Acer laptop (Windows XP Professional, Internet Explorer 7, 1.6GHz Centrino processor, 512MB RAM) because the other computer was not available.

Tasks

SET Home Page Test

  1. Does this Faculty offer a degree/diploma that you are interested in?
  2. What are the qualifications needed to apply for this degree/diploma?
  3. Can you find out how much it will cost you to take this degree/diploma?
  4. Can you find out where this degree/diploma is taught from?
  5. Are there scholarships available for this degree/diploma that you could apply for?
  6. Find some information on the course that you can print out to show your family.
  7. What careers do your year 11 and 12 subjects best suit you for?
  8. You want to ask some questions about your course. Find the email address of a suitable contact.
  9. You are interested in studying the Bachelor of Biotechnology. What are your options if you do not have the necessary prerequisites?
  10. You would like to become a surveyor. Does this university offer this course?
  11. You are interested in the environment. What can you study at this university in this area?
  12. You want to study physics in first year. What topics will be taught?

UTAS Home Page Test

  1. Does this university offer a degree/diploma that you are interested in?
  2. What are the qualifications needed to apply for this degree/diploma?
  3. Can you find out how much it will cost you to take this degree/diploma?
  4. Can you find out where this degree/diploma is taught from?
  5. Are there scholarships available for this degree/diploma that you could apply for?
  6. Can you apply online?
  7. Find one of the units in which you can enroll in first year. Can you study this unit online or outside normal school hours?
  8. You want to do a course to help you prepare for study at University. Find out when the next course starts.
  9. You want to do a course at the Launceston campus and need to find the cost of University accommodation. How much does a single room cost per week?
  10. You are interested in a career in marine ecology. Which degree will allow you to specialise in this area?
  11. You are interested in being an exchange student. What do you need to do to qualify?
  12. If you wanted to study a combined Bachelor of Science and Bachelor of Arts degree, what percentage of science would you need to study in first year?

Analysis of Results

The following measures were calculated for each task:

The following information was derived from the notes taken during each test:

Results

Comparison with Alexander (2005)

The first five tasks in each test enable comparison with Alexander (2005) [HREF3]. Although the usability testing methods used in this project were different, it is useful to compare overall task success, average confidence and average satisfaction.

Table 5: Comparison of Task Success with Alexander (2005)
Task SET - Test 1 SET - Test 2 UTAS - Test 1 UTAS - Test 2 Alexander (2005) - averages across all tests
1. Does this University/Faculty offer a degree/diploma that you are interested in? 83.3% 100% 66.6% 100% 85%
2. What are the qualifications needed to apply for this degree/diploma? 83.3% 66.6% 66.6% 100% 72%
3. Can you find out how much it will cost you to take this degree/diploma? 16.6% 83.3% 50% 40% 50%
4. Can you find out where this degree/diploma is taught from? 83.3% 100% 66.6% 100% 64%
5. Are there scholarships available for this degree/diploma that you could apply for? 33.3% 83.3% 33.3% 80% 40%
Average for all five tasks: 60.0% 86.6% 63.3% 84.0% 62%

Improvements are shown in bold and green; reductions in task success are in italics and red.

For Test 1, the success rates of these five tasks were consistent with the overall average found by Alexander (2005) [HREF3]. In the second tests, task success improved by 20.7-26.6%.

Overall usability

Table 6: Overall Task Success, all 12 tasks in each test
SET - Test 1 SET - Test 2 UTAS - Test 1 UTAS - Test 2
59.7% 76.4% 59.7% 76.6%

Improvements are shown in bold and green. Nielsen and Loranger (2006) found that the average success rate for tasks commencing from a home page, where the task could be completed on that site, was 66%, and determined that task success above 70% indicates above-average usability.

Participant perceptions and ease of use - first five tasks

Table 7: Comparison of user perceptions and ease of use - SET tests
  Test 1 - averages Test 2 - averages
Task Confidence Satisfaction Changes of approach needed Confidence Satisfaction Changes of approach needed
1 6.5 6.16 0.166 6.5 6.0 0
2 6.16 6.0 0.66 5.16 4.8 0.83
3 2.0 2.16 3 5.5* 5.83* 0.16**
4 6.5 6.0 0.16 7 6.33 1.16
5 4.16 2.83 2.33 4.83 4.33 1.66
All five tasks 5.06 4.63 1.26 5.8 5.46# 0.76

* Task 3 Confidence: P=0.014, t(8.66)=3.05, 95% conf. interval 0.9435 to 6.057; Task 3 Satisfaction: P=0.004, t(7.4)=4.13, SET2-SET1=3.6, 95% conf. interval 1.688 to 5.646
** Difference between means = 2.83
# P=0.09, t(58)=1.687, SET2-SET1=0.83, 95% conf. interval 1.822 to 0.1557

Table 8: Comparison of user perceptions and ease of use - UTAS tests
  Test 1 - averages Test 2 - averages
Task Confidence Satisfaction Changes of approach needed Confidence Satisfaction Changes of approach needed
1 6.33 5.88 0.33 6.2 6.0 0
2 5.66 5.83 0.66 6.6 6.0 0
3 4.16 4.50 2.33 3.6 3.2 1.6
4 6.50 6.50 0.16 6.8 6.8 0
5 5.66 4.66 0.166 5.60 4.60 0.4
All five tasks 5.66 5.46 1.03 5.8 5.32 0.76
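
The fractional degrees of freedom reported in the Table 7 footnotes (for example, t(8.66)) are consistent with an unequal-variances (Welch) t-test. As a minimal sketch of how such a comparison between two small groups of Likert ratings can be run (the ratings below are hypothetical examples, not the study data):

    # Welch's t-test between two small groups of seven-point Likert ratings.
    # The ratings are hypothetical examples, not the study data.
    from scipy import stats

    test1_ratings = [2, 1, 3, 2, 2, 2]  # six participants, Test 1
    test2_ratings = [6, 5, 6, 5, 6, 5]  # six participants, Test 2

    # equal_var=False selects Welch's test, which yields fractional
    # degrees of freedom when the group variances differ
    t_stat, p_value = stats.ttest_ind(test2_ratings, test1_ratings,
                                      equal_var=False)
    print(f"t = {t_stat:.2f}, p = {p_value:.4f}")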

Over both tests, there is a trend towards increased confidence, satisfaction and ease of use in the second test for the first five tasks.

Overall Confidence, Satisfaction and ease of use

Table 9: Comparison of user perceptions and ease of use - averages across all tasks in all tests
  Confidence Satisfaction Changes of approach needed
SET Test 1 5.25 4.88 1.00
SET Test 2 5.68 5.37 1.07
UTAS - Test 1 5.43 5.13 0.88
UTAS - Test 2 5.33 4.85 0.68

Use of search

The use of search was compared across all tasks. The number of times search was used, and its contribution to task success were calculated.

Table 10: Instances of search use by participants across all tasks (counted once per task)
SET - Test 1 SET - Test 2 * UTAS - Test 1 UTAS - Test 2
38.8% 29.1% 20.8% 31.6%

Table 11: Search success out of total number of times search used
SET - Test 1 SET - Test 2 * UTAS - Test 1 UTAS - Test 2
39.2% 65.2% 31.2% 50.0%

Improvements are in bold and green. * One participant did not use search at all.

In SET - Test 2, the use of search fell, primarily because the FCMS search engine returned more relevant results, reducing the need for repeat searches.

Web site use

The contribution of a particular web site or web site group to task success was determined across all tasks. Following a link from a site or finding the relevant information using the search available for that site was considered a contribution to success.

Table 12: Contribution of web site(s) to task success as percentage of total task success
Web site SET - Test 1 SET - Test 2 UTAS - Test 1 UTAS - Test 2
SET (Faculty and School sites) 41.8% 36.4% 0% 2.2%
Course Unit Handbook 58.1% 30.9% 51.2% 54.3%
Other UTAS sites 0.1% 32.7% 48.8% 43.5%

In all tests, there was a heavy reliance on the Course Unit Handbook. When starting from the home page, the contribution of SET sites to success for the tasks tested was very low.

Type of 'Misery' encountered in each task

Table 13: Total Encounters with Misery - SET tests *
  Search Findability Page Design Information Bugs ** Total
Test 1 - first five tasks 5 19 3 12 4 43
Test 2 - first five tasks 0 10 1 6 1 18
Test 1 - all tasks 7 30 4 32 8 81
Test 2 - all tasks 1 13 2 20 2 38

Table 14: Total Encounters with Misery - UTAS tests *
  Search Findability Page Design Information Bugs ** Total
Test 1 - first five tasks 3 8 1 9 3 24
Test 2 - first five tasks 1 8 2 6 0 17
Test 1 - all tasks 6 12 9 30 8 65
Test 2 - all tasks 5 18 6 14 2 45

Tasks requiring domain knowledge

The following tasks required domain knowledge (as judged by comments made by participants) to be completed in an effective or efficient manner.

First five tasks (common across both tests):
SET Tasks:
UTAS Home Page Test

Student Survey

Methodology

Participants were given the following questionnaire at the conclusion of each usability test:

  1. What are the 5 main pieces of information you need when deciding to study at University?
  2. Where would you expect to go to find this information?
  3. What three things do you think are missing from the web site?
  4. Did you always know where you were within the site structure?
  5. What part of the site do you think is the easiest to use?
  6. What is your overall impression of the look of the web site?

The results were analysed by scoring the frequency of each type of comment; frequencies were compared between tests and then combined.

Results

The combined results of all tests are shown below, from a total of 23 participants:

1. What are the 5 main pieces of information you need when deciding to study at university?

Comment Frequency
prerequisites; approximate TCE score 20
place of study 15
costs 14
length of course 12
courses offered 10
scholarships 9
career paths; outcomes; percentage in a job in first year out 9
which topics taught each year; course content 4
accommodation information 4
which courses suit my liking; interest 2
how do I apply? 2
availability 2
benefits of going to uni; what else would I do if I didn't study at uni 2
which uni is better 1
course flexibility 1
workload 1
contact details for areas you are interested in 1
when they start 1

The responses to this question are similar to enquiries entered into the UTAS Enterprise Customer Relationship Management System, which records queries from phone calls, email, the web and appointments. Queries from all prospective undergraduate science students (not just college students) and their parents include:

2. Where would you expect to find this information?

Comment Frequency
web site; course unit pages 23
on campus; visiting or contacting people at the uni; open day 13
careers adviser; people from school 7
Print publications: handbook; brochures; brochures from different universities; course handouts; jobguide book 8

3. What three things do you think are missing from the web site?

Comment Frequency
links to detailed information in the handbook pages; scholarships link on course page; links to scholarships, fees, accommodation, campus map on course pages; basic page for each degree with all of the information 8
entry scores; prerequisites, a table that makes finding the information easy; more information about how to get into courses; explanation of prerequisites 4
proper quick links section; clearer links; sitemap; categories of information 4
comparison between other unis; courses offered by other universities 3
email address of course contacts; faculty contacts; emails for people in charge of courses 3
more help for those new to uni; what is available to get you started; FAQ 3
costs; total costs of courses, books etc 2
blatantly obvious career or course choice button; in depth information on career outcomes 2
maps of campuses; buildings facilities; picture of campuses, residences 2
easier search option 2
scholarships 1
detailed accommodation information 1
contacts of people who have previously studied the course 1

4. Did you always know where you were within the site structure?

Comment SET - Test 1 SET - Test 2 UTAS - Test 1 UTAS - Test 2
yes 3 5 2 3
no 3 1 4 2

5. What part of the site was the easiest to use?

Comment Frequency
finding general info about courses 10
future students 3
search engine 2
main menu; courses link (SET home page) 1
none of it if you don't know specifically what you are looking for 1
home page, from there it is hard 1
obtaining contact details 1
quicklinks 1
accommodation information 1
home page 1
links, less complicated than searches 1
getting to each school 1
course cost and information 1
contacts for course supervisors 1

6. What is your overall impression of the look of the web site?

Positive:

Negative:

Ambivalent or neutral:

The Content Specification

The Content Specification ensures that, in a distributed system, information is placed wherever people will encounter it. A user should see a minimum level of information relevant to the subject when they encounter a page, either by navigation or search. For example, if a page is about facilities, then the location, opening hours, availability, resources and a contact should be listed. If a page is about research projects, then links to publications, and contact details of researchers should be provided. This additional information will also enhance the credibility of the web site [HREF10]. For prospective students the following links should be present on sites and pages promoting courses:

In the case of FCMS pages, some of these links may provide content specific to a particular Faculty or School, such as scholarships and graduate profiles, as long as general links are also included.
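
As an illustration only, a content specification of this kind lends itself to mechanical checking. The sketch below validates a page's links against a required set; the link labels are assumptions drawn from the survey responses above, not the actual UTAS Content Specification:

    # Minimal sketch of checking a course-promoting page against a
    # content specification. The required link labels are illustrative
    # assumptions, not the actual UTAS Content Specification.
    REQUIRED_COURSE_LINKS = {
        "prerequisites", "fees", "scholarships", "accommodation", "contact",
    }

    def missing_links(page_link_labels):
        """Return the required links absent from a course-promoting page."""
        present = {label.strip().lower() for label in page_link_labels}
        return sorted(REQUIRED_COURSE_LINKS - present)

    # Example: a page offering only two of the required links
    print(missing_links(["Fees", "Contact"]))
    # -> ['accommodation', 'prerequisites', 'scholarships']

A check of this kind could be run over FCMS pages as part of the Health Check process.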

Content Specification links relevant to courses were added to:

Project Costs and Benefits

Three staff formed the core of the project team. Two were current CALT employees providing an in-kind contribution to SET; the third was employed as a project officer for six months, funded by SET. In terms of cost recovery, the Faculty needs to attract and retain only one international student for three years to cover its costs for this project. Support from CALT for this project is ongoing.

It is difficult to quantify the improvement to the web site in dollars or increased enrolments, since enrolments are influenced by other factors such as the unemployment rate: prospective students may be more likely to enroll because they cannot find a job than because of whether or not they can use the University web site. Regardless of the factors that influence the decision to study at University, a web site should not be an impediment to prospective students or any other client group.

The number of site visits (from page statistics) may represent individual efforts to complete tasks (although a task may be completed on another site, and the reason for a site visit cannot be determined). The Prospective Students site receives approximately 120,000 visits a year, with several pages viewed or 'hit' in each visit. If the average task success is 60% for any given task, then 40% of tasks, or 48,000 visits, may have been unsuccessful. If task success can be increased by 25%, as has been shown in this project, this translates to 30,000 more successful visits. As long as clients are happy with what they find and it is not too difficult to find information, increasing task success can only benefit the University.
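
As a check on this arithmetic, a minimal sketch using the figures quoted above:

    # Back-of-envelope check of the visit-success arithmetic above.
    annual_visits = 120_000
    task_success = 0.60   # average task success rate
    improvement = 0.25    # increase in task success shown in this project

    unsuccessful = annual_visits * (1 - task_success)
    extra_successful = annual_visits * improvement

    print(f"Unsuccessful visits per year: {unsuccessful:,.0f}")      # 48,000
    print(f"Additional successful visits: {extra_successful:,.0f}")  # 30,000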

Conclusion

The Health Check identified a number of issues that required rectification, and these were also given a severity rating and priority. The number of issues rectified in this project was sufficient to result in trends towards increased student confidence and satisfaction across the first five tasks, with significant increases in both for Task 3 (course costs). Students were also more aware of where they were within the UTAS site structure. Task success was increased to 100% in some tasks, and overall, was increased by up to 26.6% to a level of usability considered exceptional when compared to the task success found by Nielsen and Loranger (2006). The total number of encounters students had with issues from the 'Scale of Misery' was reduced in both of the second sets of tests. This level of improvement confirms the Health Check as a worthwhile method for identifying usability issues and for improving web sites.

Overall, the increase in task success for both the UTAS and SET sites indicated a change from below to above average usability when compared with the findings of Nielsen and Loranger (2006).

Improvements to search within the FCMS were made across the whole system, and so will benefit all users of Faculty and School information. The contribution of SET information to success for tasks commencing from the UTAS home page was minimal, so changes made to the FCMS and student-related sites will also benefit potential non-science students and other client groups.

The advantage of analysing each issue as a 'Source of Misery' is that the actions that can potentially reduce the most misery can be determined. For this project, it proved relatively easy to improve search, navigation, links and some aspects of information architecture because these changes were mainly made to core university systems such as the UTAS search engine and to the FCMS. Improving search is definitely worthwhile, since search was used at least once by all but one participant in this study. The contribution of search to task success also increased.

It is equally important to improve navigation and links. The Health Check process identified issues that did not manifest during the usability testing, such as those affecting task support (from the Scale of Misery), and some of these were rectified. Since usability testing investigates a particular client group, it is worth attempting to rectify all significant issues in order to improve general usability. It was not practical to target content improvement in a project where project staff were unfamiliar with the content, since content is best improved by content custodians.

Like Alexander (2005), this project found that students were confused by the jargon used by universities, and that a significant level of domain knowledge is required for students to understand, and be confident in, what they find. The use of context-sensitive help in the Handbook would probably benefit students, as would ubiquitous contact information. The requirement for universities to format HECS information in a certain way influenced at least one task in this project: a student who was interested in Architecture could not find the course costs, because they are described in the DEST-compliant fees table under 'Built Environment'. Another student, interested in engineering, thought that this term referred to civil engineering. The needs of the University to describe technical information about courses, the requirements of external funding bodies and the expectations of clients are not always compatible.

The survey conducted in this project revealed a low reliance on brochures, either hard copy or online, as a source of information assisting or influencing the decision to go to university. In at least one instance in the usability testing, a student cancelled a brochure download because they did not want to have to 'download' something. In two other tests, the format of the brochure caused the browser to crash. Prior usability testing with prospective international students at UTAS also indicated that students prefer to print out information that looks 'official' to show their families, rather than brochures.

The development of a content specification and the placement of these links where students most needed them contributed significantly to the increase in success for the first five tasks in each test. The increase in confidence and satisfaction observed in the SET tests for these five tasks is likely to increase the effectiveness of marketing promoting the home pages of Faculties and Schools. Relevant supporting information should also be placed on every page where users encounter information about a particular subject, although this is not a substitute for effective site function and structure. It is of great value for any university to be able to identify where its courses are promoted on its web presence (UTAS has 72 such course-promoting sites), if only to attempt to standardise the additional information provided until such time as an effective information architecture can be implemented.

Students clearly want straightforward information, such as how to get in to a course, where they have to go, how long the course will take, what they will get out of it and what it costs. Although the results of the survey may have been influenced by the structure of the testing (students had used a web site to look for course information just prior to the survey), there was a strong indication that they expected the Course and Unit Handbook to be a 'one-stop shop' while performing tasks. This tendency to visit the Course and Unit Handbook as a first option could also be attributed to the order of the tasks in the usability testing, since the first question related to a course, as did most of the subsequent questions. However, this expectation that the Handbook should provide all course-related information has been observed in previous usability testing at UTAS with prospective international students, and also by Alexander (2005).

Costs should not be hidden, because universities are 'selling' a product. Students were significantly more confident and satisfied with the cost-related task (Task 3) when this information was easier to find, even though such costs are an unpleasant reality of modern university study. From a marketing perspective, emphasising anything that reduces the costs, such as scholarships or any reductions in HECS, is important. Students are also clearly interested in where their course can lead them, so providing career-outcome statistics and case studies is important. Students in this study were very unfamiliar with the concept of flexible learning options, and given that many students are employed while studying, course and unit flexibility may be another benefit to emphasise.

There are still multiple sources of course information available to prospective students at UTAS. Incidentally, this proliferation of course information, with student recruitment units and Faculties providing different versions, can be observed in six of the Group of Eight universities [HREF11] (surveyed in May 2007, looking for information on the Bachelor of Science in the Future Student site (or equivalent) and Faculty pages). The more versions there are, the more likelihood there is that prospective students will encounter conflicting and misleading information.

Content custodians often have limited resources to maintain information within their domain. In one case, a School and its associated Centre did not have a staff member allocated to maintain their web presence. In several cases, School sites were maintained by academic staff. While academics may have the interest and ability to maintain web sites, the performance management processes for academic staff are not generally compatible with operational tasks, since web site maintenance has to compete with teaching and research performance. This is not to say that general staff are in a better position to undertake this task, since most usually perform these tasks as part of wider duties. Only one School acknowledged web development in performance planning when surveyed for this study.

At UTAS, web administration is the responsibility of the sections that maintain the content. This responsibility competes with other operational needs of the section, a situation that is not unique to UTAS. A decentralised web management system is likely to be found in many universities and large organisations, such as the University of Melbourne [HREF12], and these organisations are likely to manifest the same issues with resourcing and fitness for purpose of web sites as were demonstrated in this project.

Although the Health Check process will identify for site custodians which areas of content need improvement, only an appropriate allocation of staff to this task, along with centrally supported content standards, will make it possible to improve content. It is essential to recognise that web site editing is a work area that requires as much support, from a human resources perspective, as any other technical skill. To assist staff in this area, web-related expertise should come from a properly resourced and supported central web services body working with staff to improve web site management. Unless this sort of support is present, the improvements made during this project are likely to decline, particularly those not directly related to changes to search or to navigation that is 'locked down' in some way (in the FCMS, in this case).

Understandably, Faculties and Schools wish to control marketing information, and therefore duplicate their own course-related information, albeit with limited resources and an associated cost in staff hours. In this project, the contribution of Faculty course marketing information, both as course descriptions and brochures, to successful tasks commencing from a university home page was minimal. Duplication of course information in Faculty and School sites may not add enough value to the user experience to justify the cost of its creation. Since users would probably benefit most from an information architecture that reduces duplicate and potentially out-of-date sources of information, content custodians should spend what time they have contributing content to a central source of course information rather than developing alternative versions. Schools and Faculties already contribute technical information to the Handbook and are in the best position to 'sell' their disciplines on behalf of the University, since they maintain research information and can create accurate and meaningful graduate profiles. The 'University', i.e. academic administration, should sell courses using information provided by its teaching and research entities. The Student Centre [HREF4] is planning to deliver marketing information from the Course and Unit Handbook as part of its project to improve the student experience of university web services.

Providing the same distributed model of site custodianship as currently exists at UTAS through a CMS will address content currency and provide appropriate authorisation processes. However, the cost of content creation or replacement can be considered if any university:

A very approximate cost of replacement for a web presence can then be calculated. Based on this calculation, the UTAS web presence may (again very conservatively) be worth approximately $15,000,000. For UTAS Course and Unit Handbook descriptions alone, assuming that Faculties and Schools also maintain alternative documents on their own web sites, a reduction in duplication of course descriptions has the potential to free $63,000 of HEO5 staff hours. This staff resource could be put towards entering marketing information in the Course and Unit Handbook.
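
The parameters of this calculation are not reproduced here. Purely as a sketch of its structure, with every figure below an illustrative assumption rather than a UTAS value:

    # Hypothetical sketch of a replacement-cost estimate for a web presence.
    # Every figure is an illustrative assumption, not a UTAS value.
    pages = 50_000          # assumed number of pages across all sites
    hours_per_page = 2.0    # assumed hours to research, write and approve a page
    hourly_rate = 40.0      # assumed loaded staff cost per hour (AUD)

    replacement_cost = pages * hours_per_page * hourly_rate
    print(f"Approximate replacement cost: ${replacement_cost:,.0f}")

    # A duplication saving works the same way: the number of duplicate
    # documents multiplied by the hours each takes to maintain and the rate.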

This project demonstrates what can be done by making improvements to findability as defined above. The next step, in addition to further testing, would be to see how much more can be achieved by improving information architecture and content quality as well. A content management system (CMS) may be the next step towards achieving this. However, Faculties and Schools (and the University) need to have the benefits of usability, improvements to search and web site effectiveness demonstrated to them. This can be achieved by projects such as these. A truly client-focused University web site is likely to need a completely different approach to content creation, in terms of who creates it, who is responsible for authorising it, and where it 'lives'.

References

Nielsen, Jakob and Hoa Loranger (2006). Prioritizing Web Usability. New Riders Publishing, Berkeley, California, USA.

Rosenfeld, Louis and Peter Morville (2002). Information Architecture for the World Wide Web. Second Edition. O'Reilly Media, Sebastopol, California, USA.

Hypertext References

HREF1
http://www.utas.edu.au/calt/
HREF2
http://www.utas.edu.au/
HREF3
http://ausweb.scu.edu.au/aw05/papers/refereed/alexander/paper.html
HREF4
http://www.studentcentre.utas.edu.au/sws/
HREF5
http://fcms.its.utas.edu.au/scieng/scieng/
HREF6
http://ausweb.scu.edu.au/aw06/papers/refereed/moore/index.html
HREF7
http://www.utas.edu.au/webservices/
HREF8
http://www.usability.gov/
HREF9
http://www.nngroup.com/events/tutorials/camp.html
HREF10
http://www.webcredibility.org/guidelines/
HREF11
http://www.go8.edu.au/
HREF12
http://ausweb.scu.edu.au/aw05/papers/edited/booth2/paper.html

Acknowledgements

The authors are very grateful to:

Copyright

Maria Moore, Ben Cleland, Marcus Eddy © 2007. The authors assign to Southern Cross University and other educational and non-profit institutions a non-exclusive licence to use this document for personal use and in courses of instruction provided that the article is used in full and this copyright statement is reproduced. The authors also grant a non-exclusive licence to Southern Cross University to publish this document in full on the World Wide Web and on CD-ROM and in printed form with the conference papers and for the document to be published on mirrors on the World Wide Web.