The Development of a Multiple-Choice and True-False Testing Environment on the Web


Rod Byrnes, Southern Cross University, PO Box 157, Lismore, NSW 2480 Australia rbyrnes@scu.edu.au

Roger Debreceny, Southern Cross University, PO Box 157, Lismore, NSW 2480 Australia rdebrece@scu.edu.au

Peter Gilmour, Syme Department of Marketing, Faculty of Business and Economics, Monash University, PO Box 197, Caulfield East, Vic, 3145 pgilm@brother.cc.monash.edu.au


Keywords: Education, Assessment, Multiple-choice Testing, Computer Managed Learning, CML

Introduction

This paper describes the development of a project to provide a multiple-choice testing and delivery suite on the World Wide Web which can be integrated into a set of course materials for educational delivery on the network. The objective is to develop a testing environment that provides instant feedback to students and to staff, and that can be changed quickly and easily if problem areas are noted in users' responses.

This project is a pilot and has a defined path for further enhancement. Its purpose is to test the feasibility of WorldWideWeb testing, and its initial implementation is intended to deliver a number of immediate benefits.

Educational Issues

The project has been developed to support teaching both on- and off-campus, using the Web as a common delivery platform. Ramsden (1992) identifies thirteen "important properties" of good teaching as seen from "the individual lecturer's point of view", eight of which are relevant to this project. The role of feedback in any learning environment is clearly important. Over the years a variety of testing mechanisms have been employed in intramural teaching, ranging from in-class quizzes and regular tests to the employment of full computer managed learning systems. Many of these feedback mechanisms use multiple-choice testing methods, particularly in professional and scientific areas of study.

Ramsden has this to say about multiple-choice tests:

Multiple-choice questions provide another excellent opportunity to offer feedback in an efficient form. Feedback on multiple choice tests - if it is given at all - is usually limited to a score indicating the proportion of right answers obtained. Students do not know which questions they have got wrong, why they are wrong, or what the correct answers would be. Yet it is a relatively simple matter to provide students with the marking key for such a test and to provide short explanations of the basis for the correct answer (p. 195).
The time taken to provide feedback of the type indicated by Ramsden is an important issue (Rekkedal, 1983). When delivery is on campus, with regular class tests and mid-semester tests, prompt feedback to students is relatively straightforward. With a student population that is geographically distributed, as is the case with most open learning environments, this is not possible and other alternatives must be considered. The print, audio-visual and ancillary tools employed in open learning can provide structured feedback to the student, but they are fraught with difficulties. Solutions in printed materials cannot easily provide a structured path to further assistance or to further questions for reinforcement, as would normally be the case in computer managed learning environments. There is also a natural tendency for students to read ahead into the print materials, look directly at the solution and think "that was easy", when in fact working through the problem is often far from easy.

Crock and Dekkers (1994), in a review of computer managed learning in distance education in Australia, predict that the foundation for materials development and instructional approaches over the next decade will move increasingly from teacher-directed to student-centred learning. This shift will be facilitated by learning technologies such as electronic learning support, access to learning centres, CD-ROMs and nascent broadband services, as well as by reliance on traditional services such as the study package and residential school. They note, however, that the use of CML in distance education has to date been limited for a number of reasons.

The WorldWideWeb provides an opportunity to incorporate multiple-choice tests directly into teaching and background materials. Tests can range from a single question within the teaching material to a complete "end of topic" test for credit. Whilst the technology provides immediate feedback to students, it also, and equally importantly, provides immediate feedback to the faculty teaching the course.

The adoption of on-line Web-based testing meets both Ramsden's educational objectives and Crock and Dekkers' implementation objectives.

Existing Software Base

The hypertext nature of the WWW makes it possible to incorporate mastery testing directly into WWW materials. A student is able to hyperlink from the course materials to a test which gives immediate feedback to the student and summary statistics to staff. The questions can incorporate all the elements of WWW documents, including fully formatted graphics and sound. Similarly, the response to the student can provide hypertext links to other materials.

The project builds upon a multiple-choice testing package which has been developed at Carleton University in Canada and which is in the public domain. The package design is elegant: it provides extensions to the WWW markup language (HTML) to facilitate the testing suite. These non-standard extensions do not, however, interfere with either the WWW client or the server, as the multiple-choice preprocessor acts upon the instructions sent from the document and returns a response to the client.

The package provides several types of questions and a range of supporting facilities. While the primary program was designed at Carleton University, it was clear that there was much to do to make the package easier for academics to use and to improve its functionality. The project is designed to "value add" by focussing on easing the development of "reinforcement questions", which are embedded into the unit materials, and "end of topic" review questions, and on facilitating feedback to the lecturer. The pilot project was designed with a specific set of goals, all of which are being evaluated within the context of the Open Learning Agency of Australia (OLAA) unit MAR11 (Marketing Theory and Practice).

The Value-Added System

Overview of the WWW Testing Environment

The WWW testing environment comprises four separate modules. Although these will be examined in detail in the following sections, a brief description of each is presented here. The Test Generator allows students to access questions in a controlled manner. The Interactive Question Editor provides an online method of manipulating the question bank. The Question Loader allows large numbers of new questions to be added to the question bank in batches. The Reporting Module produces reports on various parts of the system.

The WWW testing environment is installed in its own directory tree under the WWW server's document root directory. The extended facilities of HTTP 2.0 are utilised to allow cgi-bin scripts to be executed from within these directories. The following list describes the layout of the directory tree.

/tutorial
The root directory for the WWW testing environment. This directory has no security mechanisms installed, and is the directory in which the test generator module is located.
/tutorial/questions
All units have their own directory beneath this one in which the questions pertaining to that unit are stored. Each question is stored in its own file. This directory is protected to prevent users from obtaining the actual question files, which contain the answers to the questions as well as additional, non-standard markup.
/tutorial/authorised
This directory is password protected, and is where the Interactive Question Editor, the Question Loader, and the Reporting Module are located. Only those users authorised to use these functions can access them. Furthermore, each module is designed in such a way that the authorised user can only access those units for which they are authorised. This additional security information is stored in a separate directory, which is described next.
/tutorial/authorised/.private
This directory is totally protected from any WWW accesses, and contains information that is used directly by the cgi-bin scripts. This includes the log file where attempts at questions are logged, the indexes of the various question directories, security information about each authorised user, and other housekeeping information.

The Question Bank Format

Except for a few additions and alterations, the format of each question is the same as that developed by Holtz at Carleton University. Holtz used additional, non-standard tags to indicate the different parts of a question. These are not described here; more information about them can be found in Holtz' original documentation [HREF 1].

Other tags have been added to Holtz' original specification to facilitate the indexing and searching methods required. These additional tags store indexing information such as the keyword(s) associated with each question.

These additional tags are stored in the following format:

<keyword>keyword=cat</keyword>

The begin and end tags are used to identify this as a keyword tag. However, the indexing program (which will be described shortly) ignores all the tags in an HTML document. For this reason the type of the tag is also reproduced with the contents of the tag. The indexing program has been modified slightly to include '=' in its indexes, so the string "keyword=cat" equates to one word. This is the string that is looked for when a search specifying the keyword "cat" is requested. One slight drawback of this method is that any occurrence of the string "keyword=cat" anywhere within any question will return a hit. Improved search mechanisms will be investigated further in the future.
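As a minimal illustration, the following Perl fragment (a sketch only; the assumption that the tags appear exactly in the form shown above, and the use of a command-line file argument, are ours) extracts the keyword values embedded in a question file:

# Sketch only: list the keyword values stored in a question file.
# Assumes tags of the exact form <keyword>keyword=cat</keyword> shown above.
open(QUESTION, $ARGV[0]) || die "Cannot open question file: $!\n";
while (<QUESTION>) {
    while (m|<keyword>keyword=([^<]+)</keyword>|g) {
        print "keyword found: $1\n";
    }
}
close(QUESTION);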

Test Generator Module

At the heart of the WWW testing environment is the test generator. Tests in this context are computer generated lists of questions which the user may or may not choose to answer (in harmony with the user-driven navigational paradigm of the web). The user requests a test by invoking a URL that points to the test generator module. This URL may be manually entered into the browser, but it is more likely to be embedded directly into learning materials to provide users with immediate feedback with regard to their understanding of the material just studied.

In order for the test generator to generate a test that contains questions relevant to the learning material, a number of arguments may be passed in the URL. Only one argument is mandatory - the unit being studied. The URL:

http://casmac.scu.edu.au/tutorial/tester.pl?unit=ausweb95

invokes the test generator module and specifies that the questions should be selected from those that belong to the unit known as AUSWEB95. The test generator module is a perl script.

Other arguments that can be specified fall into three categories: the number of questions to return, search terms used to select the questions, and a link back to the calling document.

As a further example of this, the URL:

http://casmac.scu.edu.au/tutorial/tester.pl?unit=ausweb95&num=5&any=canada
&from=http://www.scu.edu.au/

returns up to a maximum of 5 questions that contain the word 'Canada' within them, and provides a link back to the Southern Cross University home page.
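A minimal sketch of how tester.pl might decode these arguments appears below. The argument names are those used in the examples above; the variable names and the default of ten questions are illustrative assumptions only:

#!/usr/bin/perl
# Sketch only: decode the URL arguments passed to the test generator.
%args = ();
foreach $pair (split(/&/, $ENV{'QUERY_STRING'})) {
    ($name, $value) = split(/=/, $pair, 2);
    $value =~ tr/+/ /;                                   # undo form encoding
    $value =~ s/%([0-9A-Fa-f]{2})/pack("C", hex($1))/ge;
    $args{$name} = $value;
}
$unit = $args{'unit'} || die "No unit specified\n";
$num  = $args{'num'}  || 10;         # assumed default number of questions
$any  = $args{'any'};                # optional word(s) to search for
$from = $args{'from'};               # optional link back to the calling page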

The questions to be included in the test are selected by first making a complete list of all questions that match the search criteria. This is done using a program called swish [HREF 2] (Simple Web Indexing System for Humans). Like WAIS but simpler, swish indexes all the questions pertaining to a particular unit, and provides a search mechanism that returns the names of all files that contain words specified in the search string. Another program called wwwwais (World Wide Web Wais) provides an easy to use front end to swish, and creates a list of all matching questions. The required number of questions are chosen from this list in a random selection process, and are stored in a separate temporary html file which is returned to the user. This separate html file is necessary to ensure that the user encounters the same questions if the test is reloaded for any reason.
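The random selection and the writing of the temporary test file might, in outline, look like the sketch below. The list @matches is assumed to hold the question paths returned by the wwwwais search; the document root and the naming scheme for the temporary file are illustrative assumptions:

# Sketch only: pick $num questions at random and write the temporary test page.
srand(time ^ $$);
@selected = ();
while (@matches && @selected < $num) {
    push(@selected, splice(@matches, int(rand(@matches)), 1));
}

$docroot  = "/usr/local/etc/httpd/htdocs";       # assumed document root
$testfile = "/tutorial/tests/" . time . $$ . ".html";
open(TEST, "> $docroot$testfile") || die "Cannot create test file: $!\n";
print TEST "<title>Generated test</title>\n<ul>\n";
foreach $q (@selected) {
    print TEST "<li><a href=\"/tutorial/ask.pl$q?first=yes&from=$testfile\">Question</a>\n";
}
print TEST "</ul>\n";
close(TEST);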

When presenting the questions to the user, each question is referenced indirectly through a cgi-bin script. This script performs several useful functions: it parses the question file and translates the non-standard tags into standard HTML, it logs the first attempt at each question, and it provides a link back to the generated test form.

The form of the URL used to present a question to the user is as follows (note that the URL has been split across three lines for ease of description):

http://203.2.54.3/tutorial/ask.pl
/tutorial/questions/ausweb95/63.html
?first=yes&from=http://203.2.54.3/tutorial/tests/7954257056330.html

The first line of the URL is a reference to the actual cgi-bin script that parses the question file. The second line is the path to the actual question file, and the final line contains two pieces of information. The first field indicates that this is the first attempt at the question, and is changed to first=no after each succeeding attempt so that only the first attempt at each question is logged. The second field is a reference back to the temporary test form that was previously generated, and a link to this is included at the bottom of each question.
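A sketch of how ask.pl might handle such a URL is given below. The file paths and log layout are assumptions made for illustration, and Holtz' actual tag processing is not reproduced:

#!/usr/bin/perl
# Sketch only: present a single question and log the first attempt at it.
$question = $ENV{'PATH_INFO'};      # e.g. /tutorial/questions/ausweb95/63.html
($first, $from) = $ENV{'QUERY_STRING'} =~ /first=([^&]*)&from=(.*)/;

if ($first eq 'yes') {              # only the first attempt is logged
    open(LOG, ">> /usr/local/etc/httpd/htdocs/tutorial/authorised/.private/attempts.log");
    print LOG join("\t", time, $question), "\n";
    close(LOG);
}

print "Content-type: text/html\n\n";
# ... read the question file, translate the non-standard tags into standard
# HTML, and append a link back to the temporary test form held in $from ...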

Interactive Question Editor Module

The test generator is quite an involved set of cgi-bin scripts, html files, and other associated indexing programs. However, these programs are of no use unless there are questions available in a form that they can understand. Two methods have been provided for generating question files in a suitable format: an interactive method which will be described now, and a batch method which will be described in the next section.

The interactive programs make use of html forms to provide a question template that can be filled in by an authorised user. The standard WWW security mechanisms are used to password protect the entire directory in which the scripts are stored. Online readers can add [HREF 3] their own questions by giving the login name ausweb95 and the password online.

After completing the fields on the form, the contents are passed to a perl script which performs basic syntax checking on the different elements. These are then assembled into an html file and stored in the appropriate directory. This directory is then re-indexed by swish so that the new question is immediately available.
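The final stages of the add operation might look like the following sketch. The file paths, variable names and the swish command line are assumptions based on the standard swish distribution rather than the actual scripts:

# Sketch only: write the assembled question file and re-index the unit.
$qdir  = "/usr/local/etc/httpd/htdocs/tutorial/questions/$unit";   # assumed path
$qfile = "$qdir/$next_number.html";
open(OUT, "> $qfile") || die "Cannot write question file: $!\n";
print OUT $assembled_question;       # the html plus the non-standard tags
close(OUT);

# Re-index the directory so that the new question is immediately searchable.
system("swish -c $qdir/swish.conf");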

The additional elements within each question are incorporated by using non-standard html tags. This does not cause any problems with the browsers because no question file is ever directly accessed. Question files are always filtered through other programs which interpret the non-standard tags to produce standard html output.

Two other operations that are provided to interactively manipulate questions are the modify and delete operations. Both of these operations first present a search form to the user where the user can specify certain search criteria so as to identify the question(s) they wish to modify or delete.

Modifying a question is accomplished by using the same form as that used to add a question, except that the fields now contain an initial value. Markup tags are included in input fields by first translating each tag delimiter ('<' and '>') to its equivalent character entity ('&lt;' and '&gt;'). Deleting a question requires the user to check the boxes beside the names of the questions they wish to have deleted.
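The translation of tag delimiters amounts to the following (a sketch; $field stands for the contents of one form field):

# Sketch only: make markup safe for redisplay in a form input field.
# A more robust version would also translate '&' before the other characters.
$field =~ s/</&lt;/g;
$field =~ s/>/&gt;/g;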

Question Loader Module

The question loader provides a mechanism whereby large numbers of questions in one specific format are converted en masse into a format compatible with the WWW test system. It is envisaged that a different question loader will eventually be developed for each type of test bank that needs to be converted, but for the moment the question loader operates with one specific format based on Microsoft's RTF (Rich Text Format) specification.

Reporting Module

The reporting facilities provided by the WWW testing environment are by necessity quite simple. For instance, as the students taking the tests do not have to identify themselves, it is not possible to keep records of individual students' performance. What can be tracked is how well students on the whole tend to answer specific questions. This information gives the instructor valuable feedback as to which concepts the students are having the most difficulty with and which they are handling relatively easily, allowing the lecture material to be modified accordingly. Other reports will be added to the WWW testing environment in the future.
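Such a report can be produced by a simple pass over the attempt log. The sketch below assumes a tab-separated log in which each line records a timestamp, the question file and whether the first attempt was correct; the actual log format may well differ:

# Sketch only: summarise first attempts per question from the log file.
open(LOG, "/usr/local/etc/httpd/htdocs/tutorial/authorised/.private/attempts.log")
    || die "Cannot open log: $!\n";
while (<LOG>) {
    chop;
    ($when, $question, $result) = split(/\t/);
    $attempts{$question}++;
    $correct{$question}++ if $result eq 'correct';
}
close(LOG);

foreach $q (sort keys %attempts) {
    printf "%-45s %4d attempts  %5.1f%% correct at first try\n",
           $q, $attempts{$q}, 100 * $correct{$q} / $attempts{$q};
}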

Conclusion

This project is only at the first stage of its development. The concepts can be enhanced to provide a full computer managed learning environment which tracks and guides the performance of individual students. The approach taken in the original work by Holtz and in the subsequent enhancements is clearly extensible. The task now is to develop a major live implementation of the software and to undertake a process of formal evaluation.

References

Crock, M. and Dekkers, J. (1994) Issues in the use of CML for distance education in Australia. In Computer Managed Learning, Melbourne, Vic.

Lauzon, A. and Moore, G. A. (1989) A Fourth Generation Distance Education System: Integrating Computer-Assisted Learning and Computer Conferencing. American Journal of Distance Education, 3 (1)

Lauzon, A. (1992) Integrating computer-based instruction with computer conferencing: An evaluation of a model for designing online education. The American Journal of Distance Education, 6 (2)

Ramsden, P. (1992) Learning to teach in higher education. London: Routledge.

Rekkedal, T. (1983) The written assignment in correspondence education: Effects of reducing turn-around time. An experimental study. Distance Education, 4 (2).

Hypertext References

HREF 1
http://www.civeng.carleton.ca:80/~nholtz/tut/doc/doc.html
HREF 2
http://www.scu.edu.au/ausweb95/papers/education3/byrnes/swish.txt
HREF 3
http://casmac.scu.edu.au/tutorial/authorised/add.pl

Copyright

© Southern Cross University, 1995. Permission is hereby granted to use this document for personal use and in courses of instruction at educational institutions provided that the article is used in full and this copyright statement is reproduced. Permission is also given to mirror this document on WorldWideWeb servers. Any other usage is prohibited without the express permission of Southern Cross University.