Ian Smissen, Senior Education Developer, Education Design and Research, Deakin University, Geelong Victoria 3217 ismissen@deakin.edu.au
Assoc Prof. Rod Sims Director, Education Design and Research, Deakin University, Geelong Victoria 3217 rsims@deakin.edu.au
The dynamics of teaching and learning in higher education are being affected by a combination of educational, social, political and economic factors, and one of the most important changes is the extent to which Learning Management Systems (LMS) are forming the basis for online teaching and learning environments. Deakin University has just completed an extensive evaluation of LMS products to select an enterprise-level online teaching and learning system. An important aspect of this process is that, unlike other evaluations which focused on systems comparison, this evaluation was user-centred, taking into account teaching and learning needs to determine the LMS that would best align with those needs. This paper examines the methods and results of this collection of staff and student needs in online teaching and learning.
The dynamics of teaching and learning in higher education are being affected by a combination of educational, social, political and economic factors, and one of the most important changes is the extent to which Learning Management Systems (LMS) are forming the basis for online teaching and learning environments. Their ability to deliver web-based content and assessment, provide a communication and collaboration environment and manage student records is increasingly being regarded as a standard requirement for teaching and learning at tertiary institutions.
Like most other Australian universities, Deakin University has invested in centrally managed and supported LMSs, currently FirstClass and TopClass, with WebCT Standard Edition also used and supported by one school. The outcomes of this investment have been realised through successful collaborative environments established with both on-campus and off-campus students, as well as the development of interactive content systems. However, the complexities of using online technology with teaching and learning communities have led Deakin University to adopt a user-based approach to determine the most appropriate technological infrastructure to support its future offering of quality educational resources. In particular, this need arose because the existing online teaching and learning environment was not meeting the increasingly sophisticated teaching and learning needs of academic staff and students. Consequently, in mid 2001 Deakin University embarked on a detailed evaluation of alternative LMS products to make a recommendation on the optimal LMS solution for Deakin's ongoing needs.
This paper describes the steps taken to achieve that recommendation (see the Evaluation Process below) and highlights the primary factors that influence the use of online technology in teaching and learning. An important aspect of this process is that, unlike other evaluations which focused on systems comparison (for example, Centre for Curriculum, Transfer & Technology, 2002 [1]), this evaluation was user-centred, taking into account teaching and learning needs to determine the LMS that would best align with those needs.
Figure 1. Evaluation Process

Prior to this process being implemented, the evolution of Deakin's existing online teaching and learning environment had provided a rich source of intelligence on the university's online teaching and learning requirements, mainly in the form of internal research projects and small-scale evaluations of how the existing systems were being used. These studies provided an essential framework for understanding what did and did not work, and what staff and students liked and disliked about the systems in use. However, the changing dynamics necessitated a systematic collection of the online teaching and learning requirements of students and academic staff, independent of the software used, to ensure that Deakin University was in a position to meet the challenges of operating in a global educational environment.
The first step in the process was to identify the key features of an online teaching and learning environment. This was achieved using focus groups facilitated by an independent consultant. Four two-hour sessions were attended by a total of approximately 100 people, including staff (teaching, technical and administration) and students (on- and off-campus, undergraduate and postgraduate). Participants were led through a process that clarified the key elements of teaching and learning and identified which aspects could be completed online, resulting in a list of approximately 650 features relevant to online teaching and learning environments.
A subsequent consolidation workshop was then organised to categorise the features systematically into broad functional groupings; these groupings became the feature categories rated in Part 2 of the surveys (see Tables 4 and 5).
To determine the relative importance of the features of an online teaching and learning system identified by the focus groups, a Staff Survey and a Student Survey were designed and delivered over the web. Comparisons among different user segments were completed to ensure that relevant staff and student groups had been represented and that the overall results could be used for a more detailed analysis of a short-list of LMS products without prejudicing the analysis for or against particular user groups. Information was collected in four parts: (i) information about the participants, (ii) ratings of the broad categories, (iii) ratings of specific features, and (iv) participant comments.
The principal reason for collecting this information was to rate and rank the various functional requirements of an online teaching and learning environment. A detailed statistical analysis of the data was therefore beyond both our needs and the timeframe under which the evaluation took place.
A total of 231 staff and 893 students responded to the surveys. Respondents were asked to provide basic information about themselves to enable more detailed analysis of the results of Parts 2 and 3 and to ensure coverage of the relevant segments of staff and students. Table 1 shows that the survey reached academic, technical and administrative staff, and undergraduate and postgraduate students studying in on- and off-campus modes.
Table 1. Distribution of types of staff and student respondents.
| | | n |
|---|---|---|
| **Students** | | **893** |
| Level of study | Undergraduate | 670 |
| | Postgraduate Coursework | 172 |
| | Postgraduate Research | 41 |
| | (no response) | 10 |
| Mode of study | Off campus | 435 |
| | On campus | 520 |
| | Both on and off campus | 72 |
| | (no response) | 10 |
| **Staff** | | **231** |
| Role | Lecturer | 109 |
| | Tutor/practical demonstrator | 9 |
| | Administrator | 27 |
| | Technical | 52 |
| | Education/Course Developer | 8 |
| | Other | 26 |
Table 2 indicates that the survey also included a broad representation of users accessing Deakin's online systems from within Deakin, at home, and at work outside Deakin. This latter group included a small proportion of staff (6%) but a significant proportion of student respondents (25%). This has implications for access to systems from workplaces with restrictions such as firewalls and the inability to install client and other software. The vast majority of respondents (95% of students and 73% of staff) use Windows as their main operating system, although a sizeable proportion of staff (20%) use Apple and a very small number of staff and students use Unix. Most respondents had used one or more online teaching and learning systems, but the survey did identify a notable number of both staff (57; 25%) and students (72; 8%) who had not used any such system.
Table 2. Distribution of staff and student respondents according to locations of access to Deakin's online systems, operating system and online teaching and learning systems used.
| | | Students | Staff |
|---|---|---|---|
| Access | Deakin | 447 | 228 |
| | Home | 762 | 155 |
| | Work (outside Deakin) | 221 | 13 |
| | Other | 38 | 11 |
| Operating system | Windows | 851 | 169 |
| | Apple | 31 | 46 |
| | Unix | 6 | 13 |
| | Other | 0 | 1 |
| Teaching and learning systems | FirstClass | 688 | 142 |
| | TopClass | 185 | 56 |
| | WebCT | 140 | 26 |
| | Blackboard | 11 | 5 |
| | Other | 26 | 15 |
| | None | 72 | 57 |
Respondents were asked to rate a set of broad feature categories on a four-way scale: not important, desirable, very important and essential. These were scored as 0, 1, 3 and 9 respectively.
To avoid the problem of non-discriminatory responses (e.g., rating all features as essential), respondents were asked to restrict their "essential" ratings to a maximum of 4 of the 11 categories for students and 6 of the 18 categories for staff. Unfortunately, there was insufficient time to build a preventative check for this restriction into the surveys, although the responses were later checked for "too many" essential ratings. Table 3 shows that 17% of students and 19% of staff did not follow this instruction.
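As a concrete illustration of the scoring scheme and the post-hoc "too many essentials" check described above, a minimal sketch follows. The weights and caps are those stated in the text; the response record layout (dicts with `cohort` and `ratings` fields) is hypothetical and assumed for illustration only.

```python
# Sketch of the Part 2 scoring scheme and the "too many essentials" check.
# The record layout (dicts with "cohort" and "ratings" fields) is hypothetical.

RATING_WEIGHTS = {"not important": 0, "desirable": 1, "very important": 3, "essential": 9}
MAX_ESSENTIALS = {"student": 4, "staff": 6}  # caps requested in the survey instructions

def weighted_scores(responses):
    """Mean weighted rating per category across all respondents."""
    totals, counts = {}, {}
    for resp in responses:
        for category, rating in resp["ratings"].items():
            totals[category] = totals.get(category, 0) + RATING_WEIGHTS[rating]
            counts[category] = counts.get(category, 0) + 1
    return {category: totals[category] / counts[category] for category in totals}

def exceeds_essential_cap(resp):
    """True if a respondent marked more categories 'essential' than requested."""
    n_essential = sum(1 for r in resp["ratings"].values() if r == "essential")
    return n_essential > MAX_ESSENTIALS[resp["cohort"]]
```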
Table 3. Frequency distribution of "essential" responses in staff and student surveys.
| | No. of "essential" ratings | Frequency |
|---|---|---|
| Students (n=893) | 0-4 | 738 |
| | 5 | 54 |
| | 6 | 40 |
| | 7 | 26 |
| | 8 | 14 |
| | 9 | 12 |
| | 10 | 5 |
| | 11 | 4 |
| Staff (n=231) | 0-6 | 187 |
| | 7 | 13 |
| | 8 | 5 |
| | 9 | 8 |
| | 10 | 3 |
| | 11 | 3 |
| | 12 | 4 |
| | 13 | 2 |
| | 14 | 1 |
| | 15 | 3 |
| | 16 | 1 |
| | 17 | 0 |
| | 18 | 1 |
To assess whether these anomalous responses affected the overall results for Part 2, the complete results were compared with the results obtained after excluding them. The outcome of this comparison, shown in Table 4, indicates that there was very little difference in the rankings of individual categories by either staff or students. Consequently, all further comparisons were done on the complete data.
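The include/exclude comparison reported in Table 4 amounts to re-ranking the weighted category scores after filtering out the anomalous respondents. A minimal sketch, reusing the hypothetical `weighted_scores` and `exceeds_essential_cap` helpers from the earlier sketch:

```python
def rank_categories(scores):
    """Rank categories by weighted score, highest first (1 = most important)."""
    ordered = sorted(scores, key=scores.get, reverse=True)
    return {category: position + 1 for position, category in enumerate(ordered)}

def include_exclude_rankings(responses):
    """Category rankings for all responses, and with anomalous responses removed."""
    compliant = [r for r in responses if not exceeds_essential_cap(r)]
    return (rank_categories(weighted_scores(responses)),
            rank_categories(weighted_scores(compliant)))
```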
Table 4. Comparison of overall rankings including and excluding responses with more than the maximum number of "essential" ratings.
| Category | Students: Include | Students: Exclude | Staff: Include | Staff: Exclude |
|---|---|---|---|---|
| Ease of use | 1 | 1 | 1 | 1 |
| Platform and browser compatibility | 4 | 4 | 2 | 2 |
| Synchronous communication | 9 | 9 | 17 | 17 |
| Asynchronous communication | 7 | 7 | 6 | 6 |
| Collaborative work | 5 | 5 | 4 | 5 |
| Assessment | 8 | 8 | 13 | 12 |
| Results management | 2 | 2 | 11 | 11 |
| Online help | 6 | 6 | 5 | 4 |
| Customise | 10 | 10 | 16 | 15 |
| Personal presence | 11 | 11 | 18 | 18 |
| Assignment submission | 3 | 3 | 8 | 8 |
| Surveys and evaluations | | | 15 | 16 |
| Easy course content creation | | | 3 | 3 |
| Course material to suit individual styles | | | 7 | 7 |
| Import 3rd party content | | | 12 | 14 |
| Groups | | | 9 | 9 |
| Database driven web-pages | | | 14 | 13 |
| Records management and reporting | | | 10 | 10 |
Table 5 and Figure 2 display the comparison of rankings and weighted scores for staff and student responses to Part 2. Results management and Assignment submission scored higher among students than among staff. The most desired feature for both staff and students was Ease of use, which ranked first with every user segment. Four categories (Synchronous communication, Ability to customise, Personal presence and Surveys and evaluations) were the least desired, scoring below 3.00 with all users.
Table 5. Comparison of rankings and weighted scores for staff and student responses to Part 2.
| Category | Students: Score | Students: Rank | Staff: Score | Staff: Rank | Staff: Rank* |
|---|---|---|---|---|---|
| Ease of use | 6.13 | 1 | 6.91 | 1 | 1 |
| Platform and browser compatibility | 4.86 | 4 | 5.54 | 2 | 2 |
| Synchronous communication | 2.44 | 9 | 2.01 | 17 | 10 |
| Asynchronous communication | 3.66 | 7 | 4.13 | 6 | 5 |
| Collaborative work | 4.46 | 5 | 4.59 | 4 | 3 |
| Assessment | 3.41 | 8 | 2.90 | 13 | 8 |
| Results management | 5.51 | 2 | 3.09 | 11 | 7 |
| Online help | 4.00 | 6 | 4.51 | 5 | 4 |
| Customise | 1.92 | 10 | 2.17 | 16 | 9 |
| Personal presence | 1.74 | 11 | 1.89 | 18 | 11 |
| Assignment submission | 4.97 | 3 | 3.80 | 8 | 6 |
| Surveys and evaluations | | | 2.36 | 15 | |
| Easy course content creation | | | 5.14 | 3 | |
| Course material to suit individual styles | | | 3.99 | 7 | |
| Import 3rd party content | | | 2.92 | 12 | |
| Groups | | | 3.63 | 9 | |
| Database driven web-pages | | | 2.76 | 14 | |
| Records management and reporting | | | 3.22 | 10 | |

\* Rank after staff-only categories are excluded.
Figure 2. Comparison of weighted scores for staff and student responses to Part 2.

Clearly, the ability to use the system (Ease of use, Platform and browser compatibility and Online help) is of paramount importance to all users. If these items are removed, the functional features, or teaching and learning "tools", can be compared. Table 6 lists these features in order of preference for students and staff. There is close similarity between the preferences of staff and students (taking into account the features that were staff-only).
Table 6. Comparison of staff and student rankings of online teaching and learning tools.
| Students | Staff |
|---|---|
| Results management | Easy course content creation* |
| Assignment submission | Collaborative work |
| Collaborative work | Asynchronous communication |
| Asynchronous communication | Course material to suit individual styles* |
| Assessment | Assignment submission |
| Synchronous communication | Groups* |
| Customise | Records management and reporting* |
| Personal presence | Results management |
| | Import 3rd party content* |
| | Assessment |
| | Database driven web-pages* |
| | Surveys and evaluations* |
| | Customise |
| | Synchronous communication |
| | Personal presence |

\* staff-only features
To assess whether these results differ among segments of staff and students, the following comparisons were made (each is essentially the same per-segment computation, sketched below):

- student responses by level of study: undergraduate, postgraduate coursework and postgraduate research (Figure 3);
- student responses by mode of study: off-campus and on-campus (Figure 4);
- student responses by the online teaching and learning systems previously used (Figure 5);
- staff responses by role (Figure 6); and
- staff responses by the online teaching and learning systems previously used (Figure 7).
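Each comparison partitions the respondents by a segment field and recomputes the weighted category scores per segment. A minimal sketch, again reusing the hypothetical `weighted_scores` helper and assuming illustrative segment field names:

```python
from collections import defaultdict

def scores_by_segment(responses, segment_field):
    """Weighted category scores computed separately for each respondent segment."""
    segments = defaultdict(list)
    for resp in responses:
        segments[resp.get(segment_field, "blank")].append(resp)
    return {segment: weighted_scores(group) for segment, group in segments.items()}

# e.g. scores_by_segment(student_responses, "level_of_study")  # Figure 3
#      scores_by_segment(student_responses, "mode_of_study")   # Figure 4
#      scores_by_segment(staff_responses, "role")              # Figure 6
```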
Figure 3 indicates that postgraduate research students rated Collaborative work, Assessment and Results Management much lower than did other students. It would be expected that postgraduate research students would rate these features low as they have no reason to use them.
Figure 3. Comparison of weighted scores for student responses to Part 2.
(subdivided by level of study: undergraduate, postgraduate coursework and postgraduate
research)

Figure 4 indicates that Assignment submission was rated higher by off-campus students than by on-campus students; ratings for all other categories are very similar. This result could be expected, as off-campus students require an efficient remote method for submitting and getting feedback on assignments. Online assignment submission offers off-campus students a rapid, secure mechanism for submitting work, with instant confirmation that the assignment has been received. Postal submission, by contrast, is subject to the vagaries of the postal system, requires several days' lead time to be sure of meeting deadlines, and leaves students uncertain of delivery. On-campus students have the traditional submission methods they are used to, so electronic assignment submission would be less attractive to them, though it is still rated highly.
Figure 4. Comparison of weighted scores for student responses to Part 2.
(subdivided by mode of study: off-campus and on-campus)

Figure 5 indicates that students who have used FirstClass and WebCT rated Asynchronous communication and Assignment submission higher than did other users. Non-users generally rated most features lower than did users of other systems. Otherwise there appears to be little difference in ratings among users of different systems. Users of FirstClass and WebCT at Deakin are much more likely than non-users or users of TopClass to have participated in online asynchronous communication and to have submitted work online. Successful participation in these activities is likely to influence students to rate them more highly.
Figure 5. Comparison of weighted scores for student responses to Part 2.
(subdivided by use of different online teaching and learning systems)

Figure 6 indicates that teaching staff (lecturers and tutors/practical demonstrators) rated Asynchronous communication, Collaborative work, Assignment submission, Ease of content creation and Ability to create material to suit individual styles much more highly than did non-teaching staff. Otherwise, there is little difference between the responses from staff in different roles. Each of these features is directly related to online teaching practice so it would be expected that those currently involved in teaching would be more likely to rate these features more highly.
Figure 6. Comparison of weighted scores for staff responses to Part 2.
(subdivided by role)

Figure 7 indicates that non-users rated most categories lower than did users of other systems. Both FirstClass and TopClass users rated Ease of content creation and Online help higher than did other users, which is consistent with the need for training and support identified in the focus groups. FirstClass users rated Collaborative work slightly higher than did other users. WebCT users rated Results management, Assignment submission and Records management and reporting higher than did other users. These results are not surprising, as FirstClass is currently used widely to facilitate online collaborative work while WebCT is used to manage results and submit assignments. Otherwise, there is little difference between the responses from staff using different systems.
Figure 7. Comparison of weighted scores for staff responses to Part 2.
(subdivided by use of different online teaching and learning systems)

These results indicate that, while there are minor differences among some segments, the rating and ranking of the broad feature categories is fairly consistent across students and staff. There are certainly no major clashes of opinion in which one group rates or ranks a particular feature very high while another rates the same feature very low. The differences that do exist are explicable by the segments' differing teaching and learning needs, their differing levels of experience, or the differing relevance of particular aspects of online teaching and learning.
Respondents were presented with a list of specific features and were asked to mark any that they considered essential for their use of an online teaching and learning system. Table 7 displays the count and % of total responses for each feature.
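The percentages in Table 7, and the >50% threshold used for bold marking, are simple proportions of respondents. A small sketch, using a worked figure taken from the table:

```python
def essential_percentage(count, n_respondents):
    """Percentage of respondents marking a feature essential, plus the >50%
    flag used for bold marking in Table 7."""
    percent = 100.0 * count / n_respondents
    return percent, percent > 50.0

# Worked example from Table 7: secure assignment submission was marked
# essential by 793 of the 893 student respondents.
print(essential_percentage(793, 893))  # (88.80..., True)
```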
Table 7. Comparison of staff and student responses to Part 3 (features that scored >50% are marked in bold).
| Feature | Students (n=893) | % | Staff (n=231) | % |
|---|---|---|---|---|
| **Synchronous communication** | | | | |
| text chat | 507 | **56.77** | 95 | 41.13 |
| live audio | 151 | 16.91 | 42 | 18.18 |
| live video | 120 | 13.44 | 36 | 15.58 |
| shared whiteboard | 252 | 28.22 | 56 | 24.24 |
| file sharing | 523 | **58.57** | 126 | **54.55** |
| application sharing | 298 | 33.37 | 63 | 27.27 |
| viewing another's desktop | 46 | 5.15 | 30 | 12.99 |
| remote control of another's desktop | 27 | 3.02 | 26 | 11.26 |
| private chat groups | 355 | 39.75 | 72 | 31.17 |
| embed web links | 308 | 34.49 | 93 | 40.26 |
| **Asynchronous communication** | | | | |
| announcements/bulletin board | 656 | **73.46** | 153 | **66.23** |
| threaded discussion | 400 | 44.79 | 117 | **50.65** |
| email | 754 | **84.43** | 171 | **74.03** |
| see who's read messages | 375 | 41.99 | 133 | **57.58** |
| link to course material | | | 135 | **58.44** |
| search, sort and manage messages | 463 | **51.85** | 133 | **57.58** |
| add attachments | 680 | **76.15** | 168 | **72.73** |
| redirect messages | 323 | 36.17 | 120 | **51.95** |
| ability to work offline | 441 | 49.38 | 121 | **52.38** |
| format text | 300 | 33.59 | 87 | 37.66 |
| embed media | 222 | 24.86 | 91 | 39.39 |
| formulae and special characters | 231 | 25.87 | 67 | 29.00 |
| embed web links | 261 | 29.23 | 108 | 46.75 |
| file sharing | 418 | 46.81 | 108 | 46.75 |
| **Assessment** | | | | |
| secure assignment submission | 793 | **88.80** | 153 | **66.23** |
| self-assessment quizzes | 552 | **61.81** | 96 | 41.56 |
| instant feedback on self-assessment | 551 | **61.70** | 87 | 37.66 |
| multiple choice q's | | | 103 | 44.59 |
| true/false q's | | | 73 | 31.60 |
| multiple correct answer q's | | | 68 | 29.44 |
| text entry q's | | | 79 | 34.20 |
| upload q's | | | 74 | 32.03 |
| image map q's | | | 35 | 15.15 |
| drag and drop q's | | | 42 | 18.18 |
| list matching q's | | | 32 | 13.85 |
| ordering q's | | | 37 | 16.02 |
| numerical q's | | | 42 | 18.18 |
| provide automatic feedback | | | 91 | 39.39 |
| reuseability of q's | | | 78 | 33.77 |
| multimedia | | | 57 | 24.68 |
| cascading answers | | | 42 | 18.18 |
| answer range for numerical q's | | | 35 | 15.15 |
| time restrictions | | | 89 | 38.53 |
| import q pools | | | 52 | 22.51 |
| randomised q's | | | 69 | 29.87 |
| anonymous surveys and evaluations | | | 75 | 32.47 |
| link to course material | | | 85 | 36.80 |
| embed q's in course material | | | 66 | 28.57 |
| auto response to submission | | | 84 | 36.36 |
| ability to mark online | | | 103 | 44.59 |
| **Tracking student progress** | | | | |
| student access to own marks | 823 | **92.16** | 105 | 45.45 |
| store grades from outside system | | | 109 | 47.19 |
| exportability | | | 118 | **51.08** |
| **Online help** | | | | |
| online user help | 647 | **72.45** | 166 | **71.86** |
| ability to customise online help | 174 | 19.48 | 68 | 29.44 |
| training materials online | 592 | **66.29** | 142 | **61.47** |
| training materials in print | 361 | 40.43 | 91 | 39.39 |
| training materials on CD | 270 | 30.24 | 62 | 26.84 |
| **Customise desktop** | | | | |
| add or remove features | 447 | **50.06** | 119 | **51.52** |
| customise look and feel | 272 | 30.46 | 85 | 36.80 |
| **Personal presence** | | | | |
| user personal profile | 357 | 39.98 | 90 | 38.96 |
| user website | 169 | 18.92 | 64 | 27.71 |
| personal portfolio | 387 | 43.34 | 92 | 39.83 |
| **Coursework creation** | | | | |
| build entire websites | | | 70 | 30.30 |
| no HTML expertise required | | | 91 | 39.39 |
| import courses, sections or pages | | | 97 | 41.99 |
| edit online | | | 121 | **52.38** |
| style sheet compatibility | | | 77 | 33.33 |
| javascript/java | | | 58 | 25.11 |
| multimedia | | | 84 | 36.36 |
| time restrictions on course material | | | 61 | 26.41 |
| hide material | | | 73 | 31.60 |
| content customisable for different groups | | | 73 | 31.60 |
| content can be managed automatically | | | 46 | 19.91 |
| reuseability of course material | | | 96 | 41.56 |
| standard file formats | | | 138 | **59.74** |
| **Groups** | | | | |
| private groups | | | 96 | 41.56 |
| groups within classes | | | 122 | **52.81** |
| groups independent of classes | | | 64 | 27.71 |
| students create/manage groups | | | 75 | 32.47 |
| tools for constructing levels of participation | | | 74 | 32.03 |
| assign roles and responsibilities within groups | | | 70 | 30.30 |
| **Reporting** | | | | |
| create reports | | | 124 | **53.68** |
| search within a report | | | 84 | 36.36 |
| sort within a report | | | 75 | 32.47 |
| export data from reports | | | 108 | 46.75 |
| modify report structure | | | 98 | 42.42 |
| usage statistics | | | 92 | 39.83 |
| **Other elements** | | | | |
| calendar | 314 | 35.16 | 91 | 39.39 |
| LOTE | 412 | 46.14 | 24 | 10.39 |
| 3rd party software | 415 | 46.47 | 60 | 25.97 |
| who's online | 517 | **57.89** | 87 | 37.66 |
| drag and drop upload and download | 374 | 41.88 | 95 | 41.13 |
| intranet environments | | | 80 | 34.63 |
| archive whole or part of content/discussion | | | 90 | 38.96 |
| guest accounts | | | 84 | 36.36 |
| bulk file manipulation | | | 66 | 28.57 |
| delegate permissions | | | 86 | 37.23 |
| search function | | | 110 | 47.62 |
| bookmarks | | | 81 | 35.06 |
| compliance and certification | | | 41 | 17.75 |
Features relevant to staff and students: the features most frequently marked as essential by both cohorts (more than 50% of both staff and students) were email, the ability to add attachments, announcements/bulletin boards, synchronous file sharing, the ability to search, sort and manage messages, secure assignment submission, online user help, online training materials, and the ability to add or remove desktop features.
Features relevant to staff only: of the staff-only features, standard file formats (60%), linking asynchronous messages to course material (58%), creating reports (54%), groups within classes (53%), online editing of course material (52%) and exportability of student progress data (51%) were marked essential by more than half of staff.
Standout features: the clear standouts were student access to their own marks, marked essential by 92% of students, and secure assignment submission, marked essential by 89% of students and 66% of staff.
Participants were provided with a text box to add comments at the end of the survey. 215 students (24%) and 63 staff (27%) provided comments. The majority of comments could be classified into three broad categories: (i) general comments on the pros and cons of the use of the online environment in teaching and learning, (ii) complaints and compliments about the software in current use, and (iii) complaints and compliments about support from teaching and technical staff. While these observations have proved useful for improving our understanding of online teaching and learning, they were considered by the evaluation working group to offer minimal information in terms of identifying the essential components of an LMS.
Not surprisingly, the ability to use the system (ease of use, platform and browser compatibility and online help) is of paramount importance to all users. This aside, there are clear preferences among staff and students in the teaching and learning tools of an online system and, while there are differences between the preferences of staff and students, there are high levels of similarity among different student cohorts and staff types.
All the LMS products commonly used within tertiary institutions include the majority of the features we identified as important. To distinguish between systems, we needed a detailed checklist of specific functionality and a way of deciding which general features were more important. The results of the focus groups and the staff and student surveys provided us with this information. Consequently, the university was able to examine the candidate products and eliminate those that either did not meet the functionality requirements or were not considered viable in terms of vendor support. Not surprisingly perhaps, the final decision came down to a choice between two products, Blackboard and WebCT/Vista, the two systems used by thirty-four of the thirty-seven Australian universities.
In addition, the process of focusing on user requirements has resulted in a sense of ownership by the university community of the new LMS and acceptance that those systems currently in use can be replaced without compromising the quality of teaching and learning. The information collected has also provided an insight into the important features of online teaching and learning, which will be used to further improve Deakin University's online environment and define imperatives for professional development of teaching staff and priorities for training and resources for students working in an online environment.
Having reached the final stage of the decision-making process, the challenge will be to implement the system and manage the migration of existing environments to the new LMS.
The authors would like to thank the staff and students of Deakin University for their participation in the focus groups and surveys, the results of which form the basis of this paper.
[1] Centre for Curriculum, Transfer & Technology (2002). Online educational delivery applications: A web tool for comparative analysis.