Web-based Assessment: Two UK Initiatives

Dave Whittington [HREF1], Robert Clark Centre for Technological Education, University of Glasgow, G12 8LS, UK. d.whittington@elec.gla.ac.uk

Joanna Bull, University of Luton, UK joanna.bull@luton.ac.uk

Myles Danson, Loughborough University, UK m.danson@lboro.ac.uk


Abstract

This paper describes two UK initiatives aimed at promoting and disseminating best practice in the use of computer assisted assessment. Both initiatives are tackling the technological, pedagogical and organisational aspects of using the Web to support student assessment. Some of the more interesting problems being tackled include the interoperability of assessment systems and the use of XML to represent questions and tests. Both groups have also looked at the security requirements of online student assessment and are beginning to develop strategies that are technically feasible, pedagogically sound and work within existing organisations.


Introduction

This paper describes two UK initiatives aimed at promoting and disseminating best practice in the use of computer assisted assessment (CAA). The CAA Centre [HREF2] and the Scottish CAA Network [HREF3] are both recently funded national projects that aim to promote and support more widespread use of the Web, and other technology, in the assessment of student achievement.

The CAA Centre is funded by the higher education funding councils of England, Wales and Northern Ireland under phase three of the Teaching and Learning Technology Programme (TLTP3 [HREF4]). The director of the centre is Dr Joanna Bull who is based at the University of Luton, collaborating institutions include the Universities of Glasgow and Loughborough and Oxford Brookes University.

The Scottish CAA Network is funded by the Scottish Higher Education Funding Council (SHEFC) under its Communications and Information Technology Programme [HREF5]. A consortium of the Universities of Glasgow and Strathclyde, and Heriot-Watt University are concerned with the dissemination, evaluation and appraisal of three different Web-based assessment engines and the development of an engine- independent logical definition of an assessment in the form of a document type definition (DTD).

The rest of this paper describes the various activities and achievements of the two initiatives all of which are available to the wider academic community and should be of considerable use to institutions, departments or individuals planning to use the Web to support student assessment.

Computer Assisted Assessment

We define CAA as any form of assessment in which the computer is an integral part of the delivery of questions, the storage of responses, the marking of responses, or the reporting of results from a test or exercise. This may be as simple as a data entry and graphing tool or a computer-generated simulation, or, more often, the use of a computer to mark responses (optical data capture systems) and to deliver assessments (screen-based CAA). Screen-based CAA may run over a LAN or be Web based; it is the latter which we discuss here.

Web-based Assessment

Web-based CAA shares a number of limitations with screen-delivered CAA in general. To achieve automated marking it is generally recognised that there must be a single definably correct answer, or a series of them. The CAA Centre has done much work on the pedagogical issues of test and question design and this will not be repeated here. Instead we focus on the advantages that Web delivery offers.
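The requirement for definably correct answers can be made concrete with a short sketch. The following is illustrative only: a minimal automated marker for objective questions, in no particular system's format. The question identifiers and the answer-key structure are our own invention, not those of any engine described in this paper.

```python
# Hypothetical answer key: a single correct option for q1, and a
# multiple-response item (all options required) for q2.
ANSWER_KEY = {
    "q1": "b",
    "q2": {"a", "c"},
}

def mark(responses):
    """Return the number of correctly answered questions."""
    score = 0
    for qid, correct in ANSWER_KEY.items():
        given = responses.get(qid)
        if isinstance(correct, set):
            # Multiple-response item: the full set of options must match.
            if given is not None and set(given) == correct:
                score += 1
        elif given == correct:
            score += 1
    return score

assert mark({"q1": "b", "q2": ["a", "c"]}) == 2   # both items correct
assert mark({"q1": "d", "q2": ["a"]}) == 0        # partial/wrong answers score nothing
```

The point of the sketch is that automation is only possible because every acceptable response is enumerable in advance; free-text answers fall outside this scheme, which is why question design receives so much attention in the pedagogical literature cited below.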

Background

Research into computer-assisted assessment is widely varied in emphasis. Pedagogical explorations of question and test design have been reported for a number of years, both generically and within specific subject disciplines (Issacs, 1994; Paxton, 1998; Farthing and McPhee, 1999). In parallel, research has been conducted on issues of statistical validity testing (Kingsbury and Houser, 1999) and the use of item response theory (van der Linden and Hambleton, 1997). Comparisons of results from computer and paper-based tests have been conducted (see Kniveton, 1996; Russell and Haney, 1997). More recently, US-based research has focussed on adaptive testing (Drasgow and Olson-Buchanan, 1999), text analysis (Landauer et al, 1998; Burstein et al, 1998, 1999) and the evaluation of Web-based testing systems in distance education (Brusilovsky and Miller, 1999). Australian research has included the implementation of large-scale systems for formative assessment (Sly and Rennie, 1999) and the development of computer-assisted peer review (Robinson, 1999). The development of WebMCQ, a Web-based assessment system, has been well documented and evaluated (Dalziel and Gazzard, 1998a, 1998b; Gazzard and Dalziel, 1998). The challenges associated with integrating CAA within the Australian higher education curriculum have been explored by Carbone (1997) and O'Byrne (1998).

www.CAACentre.ac.uk

The Computer-assisted Assessment (CAA) Centre is part of a TLTP3 project on the implementation and evaluation of CAA. Established in October 1998, the Centre aims to assist staff in higher education with the development and implementation of CAA. Ultimately the Centre will act as a focus for CAA in higher education. The Centre already offers a range of resources and services, and it is hoped that it will develop into a one-stop shop for anyone interested or involved in CAA.

Consortium Partner Involvement

In order to ensure that the project has generic significance for the sector, and that a wide range of expertise is embraced in the project, the CAA Centre comprises a consortium of four UK universities. Three of these have a particular specialism in CAA, whilst the fourth is committed to running trials. Membership of the consortium has allowed the pooling of high-quality resources as well as the option to trial software (in some cases commercial software) and the systems in place at other institutions. Each consortium member is currently involved in trialling various CAA systems.

University of Luton

The University of Luton has successfully implemented a University-wide CAA system that is used to summatively assess 9000 students each year in stage and final award examinations. Luton has particular expertise in the strategic implementation of CAA examinations and pedagogic expertise in integrating CAA within existing assessment strategies. The University of Luton trials include the use of various Web-based assessment systems in English, psychology and biomedical science.

University of Glasgow

The University of Glasgow is one of the lead sites for the Clyde Virtual University project (Whittington & Sclater, 1998), offering expertise in multi-disciplinary Web based education. Trials underway at Glasgow University include the use of TRIADS [HREF6] to provide both formative and summative assessments for genetics students and the use of Question Mark's Perception [HREF7] to provide continuous assessment of higher level learning in a third year honours class in electronics.

Loughborough University

The CAA Unit [HREF8] at Loughborough University has expertise in optical data capture for CAA (21,000 tests delivered and marked in two years) and in LAN-delivered screen-based CAA (22,000 tests delivered and marked in four months). Previous work at Loughborough includes an extensive national survey of CAA [HREF9], published in 1997. Loughborough also organises the International CAA Conference [HREF10]. The third in this series of conferences was held in June 1999, and brought together expertise from continental Europe and the UK, as well as Brazil and Australia. The proceedings are available online at the above URL. Loughborough is running two trials. The first involves a member of academic staff in the European Studies department and has fairly typical aims for the introduction of CAA.

The second trial is a two-tiered approach, not unusual for a head of department implementing CAA into teaching: an educational technologist is liaising with the academic and undertaking the material conversion.

Both trials are using the commercial Web CAA suite Perception from Question Mark Computing. The Perception suite includes both Windows and Web-based applications. The two Windows applications are used to author questions (Question Manager) and construct the tests (Session Manager). Once questions are authored they are stored in a local database and transferred via ODBC to the master server database (Oracle or SQL Server). Perception Server, which is installed on an NT server (it is an NT-only product), accesses the questions and serves out the tests. Candidates take tests through a version 4 or later Java-enabled browser, and tutors can process the results via a browser using the fourth element of the suite, Enterprise Reporter. The CAA Unit at Loughborough has recently acquired funding to scale up the use of Perception and will establish a central Web-based assessment service over the next three years.

Oxford Brookes University

Oxford Brookes University is playing the role of our chief pilot site. Brookes brings a well-developed strategy for implementing information technology into learning and teaching, and recognises that CAA will be one of the essential tools in the implementation of that strategy.

CAA Centre Web Site

The Web site for the CAA Centre project provides details of the project and a variety of CAA related resources. These range from a guide to writing objective tests (240 downloads between December 1999 and the end of January 2000) to a searchable bibliography of CAA related articles. Linked with the Web site and the project is a mailing list of over 650 members with much interest from outside the UK.

Blueprint

The blueprint has been created by the CAA Centre to provide a structured approach to the implementation of CAA. It seeks to document the range of issues that concern those responsible for developing and implementing CAA, whether on an individual, faculty or institutional basis. The blueprint provides guidance and good practice concerning the following: pedagogy, operational and procedural issues, quality assurance, evaluation, and strategies for effective implementation on an institutional level.

Security and Authentication

Many of the data security issues that are important within student assessment are similar to those within e-commerce; some of these are being addressed, others are not. The wider distribution of strong encryption is to be warmly welcomed: knowing that questions, answers and results cannot be eavesdropped on or interfered with is vital in high-stakes assessment. Most examination administrators can now accept that the use of strong encryption makes the browser-server connection secure. The weak point remains the authentication of the student. The need to correctly identify each candidate is well understood in traditional exams, and the requirement to produce ID which includes a photograph is now commonplace. To date this has been the only reliable way to authenticate who is actually being tested, and rooms of computers have had to be placed under 'exam conditions'. Advances in the field of biometrics offer some hope that in the future authentication can be carried out remotely. Keystroke dynamics, for example, offers the possibility of a candidate being "undeniably identified" (see Biopassword [HREF11]).
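The integrity requirement discussed above (that questions and answers cannot be interfered with in transit or at rest) can be sketched with a keyed message authentication code. This is an illustrative sketch only, using Python's standard library; the secret key is a placeholder, and key management and the transport encryption itself are out of scope here.

```python
import hmac
import hashlib

# Placeholder key for illustration; a real deployment would store and
# rotate this securely, never embed it in source code.
SECRET_KEY = b"replace-with-a-securely-stored-key"

def sign(payload: bytes) -> str:
    """Return a hex MAC binding the payload to the secret key."""
    return hmac.new(SECRET_KEY, payload, hashlib.sha256).hexdigest()

def verify(payload: bytes, mac: str) -> bool:
    """Check a payload against its MAC using a constant-time compare."""
    return hmac.compare_digest(sign(payload), mac)

paper = b"Q1: Which of the following best describes item response theory?"
mac = sign(paper)
assert verify(paper, mac)             # untampered paper is accepted
assert not verify(paper + b"x", mac)  # any modification is detected
```

A MAC of this kind detects tampering but, as the section notes, does nothing to establish *who* is sitting the test; that authentication problem remains the harder one.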

www.SCAAN.ac.uk

Although CAA is in widespread use, the use of Web-based assessment engines and rich question types is quite new and little serious comparative evaluation has taken place. SCAAN is taking three such engines, Miranda [HREF12], WebTest [HREF13] and TRIADS [HREF6], which have been developed at the Universities of Strathclyde, Heriot-Watt and Derby respectively. These engines are all in routine use for both formative and summative assessment at the SCAAN partner institutions.

The project involves three inter-locking themes for the exploitation of these engines:

The first phase of the project has linked the three lead centres and their close collaborators into a development network to create a number of substantial demonstrator projects. These projects will be evaluated (a) to determine the pedagogical roles and technical characteristics of the available question engines, and (b) to appraise the pedagogic and technical qualities required for question styles to be enthusiastically adopted by academic staff for formative and summative use. In the second phase, project resources will be used to disseminate methods and skills so that other Scottish HEIs can exploit the learning benefits and economies that accrue from these systems.

An Academic Needs Requirements Analysis

This document was produced by SCAAN and outlines the requirements for a CAA system. It has been used to critically present core academic issues that should be built into the provision of CAA. This document has been used to set the focus for subsequent activity in the project.

An XML DTD for reusable questions

A keystone of the Web since its inception has been the interoperability of different systems. Although 'flavours' of HTML have emerged over time, most browsers will happily render pages delivered by most servers. This level of interoperability, although not always perfect, does not yet exist for Web-based assessment systems: questions and tests produced for one system cannot easily be reused in another. Good questions are time consuming to produce, and the development of question banks cries out for agreement on a standard way to represent questions and tests.

An essential first step towards the development of interoperable systems has been the public release of the IMS project's [HREF14] draft Question and Test standard [HREF15]. The SCAAN project is working on the practicalities of implementing interoperability as it develops two Web-based assessment systems which need to be able to exchange questions and also be IMS compliant. The two systems, WebTest and Miranda (described below), have very different pedigrees and are quite unlike in the way they handle questions and tests internally. Work is progressing to develop an XML DTD for a subset of functionality that is common to both systems. This common DTD will then be used to share questions and tests between the two systems. It is not intended to change the way either engine represents questions and tests internally; the common DTD would be an export/import feature. This initial development will be IMS compliant as far as possible, but it may be necessary to alter the internal representations to fully accommodate IMS compliance. Details of the progress towards full IMS compliance and the issues this has raised will be presented at the conference.
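The export/import approach can be sketched as follows. This is illustrative only: the element names (question, stem, option) are invented for the purpose and are neither the SCAAN DTD nor the IMS Question and Test specification; the point is simply that each engine keeps its native internal representation and converts to and from a neutral XML form at its boundary.

```python
import xml.etree.ElementTree as ET

def export_question(qid, stem, options, correct):
    """Serialise one multiple-choice question to a neutral XML form."""
    q = ET.Element("question", id=qid)
    ET.SubElement(q, "stem").text = stem
    for key, text in options.items():
        opt = ET.SubElement(q, "option", key=key)
        opt.text = text
        if key == correct:
            opt.set("correct", "true")
    return ET.tostring(q, encoding="unicode")

def import_question(xml_text):
    """Parse the neutral XML back into a native (here, dict) structure."""
    q = ET.fromstring(xml_text)
    options = {o.get("key"): o.text for o in q.findall("option")}
    correct = next(o.get("key") for o in q.findall("option")
                   if o.get("correct") == "true")
    return {"id": q.get("id"), "stem": q.find("stem").text,
            "options": options, "correct": correct}

xml_text = export_question("q1", "2 + 2 = ?", {"a": "3", "b": "4"}, "b")
assert import_question(xml_text)["correct"] == "b"
```

Because the interchange format covers only the subset of functionality common to both engines, anything outside that subset (rich interaction styles, engine-specific scoring rules) is necessarily lost on export, which is precisely the trade-off the SCAAN work has to negotiate.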

Software trials

A big part of the SCAAN project is a comparative evaluation of three Web-based assessment systems. All have been created within UK academic institutions and WebTest is now a commercial system.

WebTest

WebTest [HREF13] has been developed at Heriot-Watt University and according to their Web site "WebTest is a flexible automated-assessment tool, which has been used across a range of departments at Heriot-Watt and in other universities in support of flexible and distance learning. It can also be used in schools and a wide range of training situations."

"WebTest has the following features:

TRIADS

TRIADS [HREF6] is based on Macromedia's Authorware [HREF16] and delivers Web-based assessments via the Authorware plugin. TRIADS was originally developed by a consortium consisting of the University of Derby, the University of Liverpool and the Open University. TRIADS is very flexible and "allows the production of questions in a very wide selection of interaction styles up to the level of full multimedia simulations."

Miranda

Originally referred to as the Clyde Virtual University Assessment Engine, Miranda [HREF12] has been gradually developed since 1995 to provide assessments and evaluations delivered over the Web in the context of a virtual university. Miranda has also been used as a test bed for innovative question types based on embedded applets (Whittington, 1998).

Workshops

Both the CAA Centre and SCAAN are running workshops around the UK to promote the use of CAA. Heriot-Watt University has held workshops in departments where WebTest was being trialled. These workshops brought together students, academics and administrators to discuss their requirements of a Web-based assessment system. The University of Strathclyde has been using Clyde Virtual University as a platform to introduce academics to new teaching technologies in a series of staff development workshops (Littlejohn and Sclater, 1999). Many academics across the West of Scotland now use Miranda on a regular basis with their students. The CAA Centre has held workshops at Oxford Brookes University where academics have been invited to take part in pilot projects; this work is ongoing and progress on this and all the other areas of development will be reported at the conference.

Conclusion

Use of the Web to provide educational content is now well established, as is its use to facilitate discussion between students. If the Web is to fulfil a complete set of educational roles then online assessment must also become an established practice.

However the true potential of CAA has yet to be realised. 'The capabilities of computers to deliver unique assessments should and will be exploited. The power of networks to transfer, upload and download data automatically should also be exploited. The benefits to student learning of receiving specific, timely and encouraging feedback are utilised by only a few. The advantages of detailed feedback on student and group performance, delivered rapidly enough to allow academics to deal with student misconceptions during a module and enhancing student learning, are yet to be achieved.' (Bull, 1999)

Both of these projects seek to assist in the further development of the Web to meet the need for innovative, authentic assessment that harnesses the potential of technology to deliver pedagogically sound assessments (Bennett, 1998).

And finally, this year's CAA conference [HREF17], which is organised by Loughborough University as part of its commitment to the CAA Centre, is being held a week later than in previous years specifically to allow AusWeb 2k participants to attend!

References

Bennett, R. (1998). Reinventing Assessment, Princeton: Educational Testing Service.

Brusilovsky, P. and Miller, P. (1999) Web-based testing for distance education. In: P. De Bra and J. Leggett (eds.) Proceedings of WebNet'99, World Conference of the WWW and Internet, Honolulu, HI, Oct. 24-30, 1999, AACE, pp. 149-154.

Bull, J. (1999). A glimpse of the future, in Brown, S., Bull, J. and Race, P. (Eds) Computer-Assisted Assessment in Higher Education, London: Kogan Page, p 193 - 197.

Burstein, J., Kukich, K., Wolff, S., Chi Lu, Chodorow, M., Braden-Harder, L., and Harris, M. (1998). Automated Scoring Using A Hybrid Feature Identification Technique. In the Proceedings of the Annual Meeting of the Association of Computational Linguistics, August, 1998. Montreal, Canada.

Burstein, Jill C., Susanne Wolff and Chi Lu. (1999). Using Lexical Semantic Techniques to Classify Free-Responses, in The Depth and Breadth of Semantic Lexicons. Edited by Nancy Ide and Jean Veronis. Kluwer Academic Press.

Carbone, A. (1997) Developing and Integrating a Web-based Quiz into the Curriculum, Proceedings of WebNet97, World Conference of the WWW and Internet , Toronto, Canada, November, 1997. Association for the Advancement of Computing in Education, Charlottesville, VA.

Dalziel, J. R. and Gazzard, S. (1998a) Assisting Student Learning using Web-based Assessment: An overview of the WebMCQ system, Poster presentation at the 15th Annual Conference of the Australasian Society for Computers in Learning In Tertiary Education. (http://www.webmcq.com/)

Dalziel, James R. and Gazzard, Scott (1998b) Using WebMCQ for Formative and Summative Assessment, Proceedings of University Science Teaching and the Web Workshop, (Ed. A. Fernandez), 25-27, Sydney: Uniserve Science. (http://www.webmcq.com/)

Drasgow, F. and Olson-Buchanan, J. (Eds) (1999) Innovations in Computerised Assessment, Lawrence Erlbaum Associates Inc., New Jersey.

Farthing, D. and McPhee, D. (1999) Multiple choice for honours-level students? A statistical evaluation, in Danson, M. and Sherratt, R. (Eds) Proceedings of the 3rd Annual Computer-Assisted Assessment Conference, Loughborough, June.

Gazzard, Scott and Dalziel, James R. (1998) Design Principles for Next Wave Software: The development of the WebMCQ system. Proceedings of the Fifteenth Annual conference of the Australasian Society for Computers in Learning in Tertiary Education. (http://www.webmcq.com/)

Issacs, G. (1994) About multiple choice questions, Multiple Choice Testing: Green Guide, 16, The Higher Education Research and Development Society of Australasia, Campbelltown, NSW, 4 - 22.

Kingsbury, G. and Houser, R. (1999) Developing Adaptive Tests for School Children, in Drasgow, F. and Olson-Buchanan, J. (Eds) (1999) Innovations in Computerised Assessment, Lawrence Erlbaum Associates Inc., New Jersey, p 93 - 116

Kniveton, B. (1996) A correlation of multiple-choice and essay assessment measures, Research in Education, 56, p 73 - 84.

Landauer, T.K., Foltz, P.W. and Laham, D. (1998) An Introduction to Latent Semantic Analysis, Discourse Processes, 25, 2/3, p 259 - 284.

Littlejohn, A., & Sclater, N. (1999). The virtual university as a conceptual model for faculty change and innovation, Journal of Interactive Learning Environments, Vol 7, Nos 2/3. pp. 209-226, 1999

O'Byrne, J. (1998) Computer-based Features of the Junior Astronomy Course [at The University of Sydney], Proceedings of University Science Teaching and the Web Workshop, (Ed. A. Fernandez), 25-27, Sydney: Uniserve Science.

Paxton, M. (1998) A Linguistic Perspective on Multiple Choice Questioning, The Higher Education Research and Development Society of Australasia Newsletter, 20, 1, April.

Robinson, J. (1999) Computer-assisted peer review, in Brown, S. Bull, J. and Race, P. (Eds) Computer-assisted Assessment in Higher Education, Kogan Page, p 95 - 102

Russell, M. and Haney, W. (1997) Test Writing on Computers: An Experiment Comparing Student Performance on Tests Conducted via Computer and via Paper-and-Pencil, Education Policy Analysis Archives, 5, 3, (http://olam.ed.asu.edu/epaa/v5n3.html)

Sly, L. and Rennie, L. (1999) Computer managed learning as an aid to formative assessment in higher education, in Brown, S. Bull, J. and Race, P. (Eds) Computer-assisted Assessment in Higher Education, Kogan Page, p 95 - 102

van der Linden W.J. and Hambleton R. K. (Eds.) (1997), Handbook of Modern Item Response Theory. New York, NY: Springer-Verlag New York Inc.

Whittington, C.D. (1998). There's more to the Web than Multichoice, Proceedings of the Second Annual Computer Assisted Assessment Conference, ISBN - 0953321010, pp. 117-123, Loughborough, UK

Whittington, C.D. & Sclater, N. (1998). Building and Testing a Virtual University Computers and Education, Vol. 30, Nos. 1/2, pp. 41-47.

Hypertext References

HREF1
http://i.am/davewhittington/
HREF2
http://www.caacentre.ac.uk/
HREF3
http://www.scaan.ac.uk/
HREF4
http://www.tltp.ac.uk/
HREF5
http://www.scotcit.ac.uk/
HREF6
http://www.derby.ac.uk/assess/newdemo/mainmenu.html
HREF7
http://www.qmark.com/perception/
HREF8
http://www.lboro.ac.uk/service/fli/flicaa/
HREF9
http://www.lboro.ac.uk/service/fli/flicaa/downloads/survey.pdf
HREF10
http://www.lboro.ac.uk/service/fli/flicaa/conferences.html
HREF11
http://www.biopassword.com/
HREF12
http://cvu.strath.ac.uk/ae/
HREF13
http://flex-learn.ma.hw.ac.uk/info.html
HREF14
http://www.imsproject.org/
HREF15
http://www.imsproject.org/question/
HREF16
http://www.macromedia.com/software/authorware/
HREF17
http://www.lboro.ac.uk/service/fli/flicaa/conf2000/


Copyright

Dave Whittington, Joanna Bull and Myles Danson, © 2000. The authors assign to Southern Cross University and other educational and non-profit institutions a non-exclusive licence to use this document for personal use and in courses of instruction provided that the article is used in full and this copyright statement is reproduced. The author also grants a non-exclusive licence to Southern Cross University to publish this document in full on the World Wide Web and on CD-ROM and in printed form with the conference papers and for the document to be published on mirrors on the World Wide Web.




AusWeb2K, the Sixth Australian World Wide Web Conference, Rihga Colonial Club Resort, Cairns, 12-17 June 2000 Contact: Norsearch Conference Services +61 2 66 20 3932 (from outside Australia) (02) 6620 3932 (from inside Australia) Fax (02) 6622 1954