Online Automated Essay Assessment:
Potentials for Writing Development

Tan Bee Hoon
Department of English
Faculty of Modern Languages & Communication
Universiti Putra Malaysia
E-mail: tanbh@fbmk.upm.edu.my

Abstract

This exploratory study examines the application of an online automated assessment system trademarked MyAccess. The aim is to explore the viability of MyAccess for automated essay assessment and student writing development. The main strength of MyAccess is its capability to give immediate holistic and analytical feedback. The system also provides a rich collection of online resources and tools to help teachers and students in the writing classroom. The integration of the instructor and student support tools and resources in a single system enables the use of MyAccess as a writing management system, and also gives it the potential to develop students’ writing competence.
 

1.0       Background to the Study

Evaluating and grading student essays is often a burden for writing instructors, and giving timely pedagogic feedback is a trying challenge in many writing classrooms. Yet this diagnostic advice is necessary for student writers to learn and progress. With the advent of ICT, some writing instructors wished for an intelligent machine or software program that could reduce their marking load. This wish motivated research initiatives into automated essay assessment as early as 1966 (Whittington & Hunt, 1999; Williams, 2001).

To date, a number of such tools are being used with considerable success in education, e.g. E-rater created by Educational Testing Service and MyAccess created by Vantage Learning (Daniel & Cox, 2001). While one faction of writing practitioners is enthusiastically using these automated tools to extend their teaching capacity, another faction is doubtful about their usefulness. The debate on the value of automated essay assessment often recurs in popular journals and conferences (Hearst, 2000; Honan, 1999). The controversy points to the need for critical and informed adoption of any automated essay assessment tool.

At the moment, an online essay assessment tool trademarked MyAccess is being promoted in Malaysia. This online software has built up a substantial reputation in its country of creation, the USA (see http://www.vantagelearning.com). It claims a level of grading accuracy that equals or exceeds that of expert human graders, and it suggests that immediate assessment and feedback encourage learners to write and revise more frequently. A distinctive advantage that sets the software apart from its competitors is its ability to assess essays in languages other than English, which should appeal to educational institutions that teach a number of languages.

2.0       Identification of the Problem

A common problem among writing instructors is the difficulty of giving timely feedback to students, especially in large classes. Many instructors are reluctant to teach writing skills because of the heavy marking load. Besides writing skill courses, instructors of content courses are also reluctant to give essay or open-ended assignments. In written examinations too, under the pressure of meeting marking deadlines, instructors set more objective questions than open-ended ones. In the long run, students become incompetent in extended discourse and rhetoric due to insufficient practice. This is a common weakness among tertiary students and graduates that has often been highlighted in the mass media (Chapman, 2002).

In view of these problems, this research study was originally planned to explore the feasibility of using MyAccess, a web-based automated essay assessment system, to assist writing instructors in evaluating students’ written work. However, during the study it was discovered that the system also had a built-in class management tool to help a writing teacher manage a writing class, for example by keeping records of essays written and grades obtained, and by giving comments during a student’s writing process. This asset, together with the various instructional and writing support tools and resources and the immediate holistic and analytical feedback of the system, gives MyAccess the potential to develop the writing competence of students.

3.0       MyAccess: A Brief Introduction 

To date, due to constant upgrading and updating, MyAccess has evolved into an integrated writing support environment for developing student writing competence. Vantage Learning, the creator of MyAccess, is a leading provider of online educational and assessment solutions. Its headquarters is in the USA, and it also has offices in Taiwan, Korea, Indonesia, Thailand and Malaysia. The many awards won by Vantage Learning include Codie Award Finalist 2003 & 2004, Eduventures Top 10 Breakthrough Technologies in Education, and Technology & Learning Top 10 Smart Technologies. The automated writing development environment comprises two important electronic platforms: MyAccess and IntelliMetric (see Figure 1).

MyAccess provides students of various grade levels with appropriate writing tasks or prompts. Students maintain their work in an online portfolio that contains their initial drafts, evaluation scores and subsequent revisions. During the writing process, students have access to a variety of support tools, such as My Editor, My Tutor, a thesaurus and a graphic organizer.

Teachers too have a number of tools to help them manage a writing class, for example assigning students to various groups and setting writing assignments. They can view any student’s portfolio, provide comments and track the overall class portfolio to manage writing instruction. Teachers can also generate letters to parents and student performance reports in a variety of formats. In addition, teachers have ultimate control over the tools available to the students while they write essays. For example, if it is important that the students do not receive any help with spelling, the spell checker can be turned off for any particular assignment.

 

Figure 1: Automated Writing Development Environment

The platform that automatically scores students’ writing upon submission to the system is trademarked IntelliMetric. It is an intelligent scoring system that emulates the process carried out by human scorers. The system has been “trained” with various sets of previously scored responses or “known score” papers. These papers are used as a basis for the system to infer the rubric and the pooled judgments of the human scorers. The operating principle is that IntelliMetric internalizes the characteristics of the responses associated with each score point and applies this intelligence in subsequent scoring. The approach is consistent with typical K-12 and higher education expert writing scoring models.
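The training principle described above can be illustrated in miniature. The sketch below is a hypothetical nearest-neighbour scorer, not Vantage Learning’s proprietary IntelliMetric algorithm: it reduces each “known score” paper to a few crude surface features and scores a new essay by the papers it most resembles. All function names and features here are the author’s illustrative assumptions.

```python
# Illustrative sketch only: a scorer "trained" on previously scored
# essays infers, from surface features, how human raters mapped
# responses to score points. Real systems use far richer features.

def features(essay: str) -> list[float]:
    """Crude surface features: length, mean word length, vocabulary ratio."""
    words = essay.split()
    if not words:
        return [0.0, 0.0, 0.0]
    return [
        float(len(words)),                                # essay length
        sum(len(w) for w in words) / len(words),          # mean word length
        len(set(w.lower() for w in words)) / len(words),  # type-token ratio
    ]

def predict_score(essay: str, known_papers: list[tuple[str, int]], k: int = 3) -> int:
    """Score a new essay as the rounded mean score of the k most
    similar "known score" papers (Euclidean distance in feature space)."""
    f = features(essay)
    ranked = sorted(
        known_papers,
        key=lambda pair: sum((a - b) ** 2 for a, b in zip(f, features(pair[0]))),
    )
    nearest = ranked[:k]
    return round(sum(score for _, score in nearest) / len(nearest))
```

The toy model makes one point concrete: the quality of such a scorer depends entirely on the pool of previously scored responses it is trained on, which is consistent with the large number of authentic essays IntelliMetric reportedly requires for each new prompt.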

4.0       Methodology

The first stage of the study was to review MyAccess and to identify potential areas where the system might help in the teaching and learning of writing skills. The study followed a single-case ethnographic research design in which the instructor’s and the students’ interaction and reaction in using the software were observed throughout one whole semester. Thirty-four students volunteered or were encouraged to participate in the exploratory study. The students wrote an in-class impromptu short essay at the beginning of the semester, which served as a baseline for gauging their eventual writing performance. The course instructor identified writing tasks from the pool of MyAccess writing prompts that were relevant to this group of students. Students were taught to use the various online tools through a training workshop.

During the semester, students wrote and submitted two mandatory writing assignments for automated feedback and scoring. Besides the compulsory essays, students were encouraged to write and submit additional essays for the same online feedback and grading. When an essay was completed, the students submitted the final draft to the system to obtain a final grade. The final drafts were also printed out and submitted to the instructor for a separate evaluation. The course instructor observed and monitored the students’ reactions and progress. At the end of the semester, the instructor and the students answered a summative questionnaire. The data were then compiled and analyzed.

5.0       Discussion of Findings

5.1       Students’ Perception

After a semester of writing on the MyAccess platform, the students were asked to write an essay describing their experience of and opinions on using the system. They were also requested to answer a summative questionnaire. The following findings were extracted from their narratives and the questionnaire responses.

About 35% of the students used a computer daily. About 55% used a computer at least once a week, and the remainder at least once a month. 78% had experience of taking a test online, typically a driving or an IQ test. Slightly more than half of the students preferred writing on the computer to writing on paper. This information suggests that the students were fairly frequent computer users and that their computer literacy was good.

The majority of the students perceived the website as useful because it was interactive and the feedback was almost instant and helpful. All of them commented that this was their first experience of writing online, using the accompanying writing tools and receiving automated essay scoring. They found the online writing resources, such as the user and writer guides, the writer’s model catalogue, the quick reference guide and the instructional units, useful. They felt that they had improved their grammar and spelling by using My Editor, and their knowledge of word meanings by using the Thesaurus. The feedback provided by My Tutor helped them identify their weaknesses as it explained with examples. Students were happy to know the level of their writing ability as indicated by MyAccess. Their positive responses on the use of the writing platform included:

 

·        The marking scheme is systematic

·        It helps me produce a better essay by showing me the steps

·        It teaches me techniques and rules on effective writing

·        The graphic organizer helps me create and organize ideas

·        The system uses our names to talk to us

·        The website is user friendly as it is relatively easy to use even if one is not computer savvy 

·        I like the additional comments from my lecturer

·        The spell and grammar checks are better than MS Word

·        The software has made the writing experience more fun

·        Students of modern generations should use such a tool

 

Overall, students felt very positive about the use of MyAccess. More than 60% indicated that they would continue to use the system. With regard to writing improvement, nearly 80% of the students felt that MyAccess had helped them improve their writing, and more than 90% indicated that they used the feedback provided to improve their writing.

On the negative side, a small group of students commented that the software was not suitable for novice users lacking computer skills. The same students also could not understand some of the corrections made by My Editor. However, the majority agreed that the main problem with MyAccess was the slow log-on, or the futile effort to log on at all. They exclaimed, “Imagine having a deadline to meet!” Other related problems included the difficulty of resuming writing after a submission, certain PCs failing to connect, and the writing prompt taking a long time to appear. When this problem was reported to the service provider, the explanation given was that the system was undergoing a stage-by-stage upgrading, usually carried out during night hours in the USA, which happened to be daytime in Malaysia. The service provider suggested that the slow connection might also be caused by the existing server infrastructure at the client site.

5.2       Instructor’s Perception

On a scale of five, the instructor rated MyAccess 3, that is, somewhat easy to use in terms of class management, record keeping, setting writing assignments, inserting instructor comments, and generating performance reports. The online resources were helpful for both novice and experienced writing instructors, and the online feedback on student writing was helpful for preparing future lesson plans. The instructor agreed that the two most effective features of MyAccess were the immediate feedback with scores, and the potential to improve student writing through the provision of various writing support tools and online resources.

Overall, all the students improved in their writing performance. This was observed by comparing the qualitative assessment by the instructor and the quantitative measurement by IntelliMetric between the first essays written before the treatment and the final essays written at the end of the treatment. The quantifiable writing improvement showed an increase of between one and two bands in the five areas of writing assessment: Focus and Meaning, Content and Development, Organization, Language Use and Style, and Conventions and Mechanics. However, this positive observation needs to be tested again with a bigger sample before the gain can confidently be attributed to MyAccess, as maturation and other extraneous variables might also account for it.

With regard to the viability of using MyAccess for essay scoring, an inter-rating comparison between the system and the instructor was conducted. The study found a close correlation between the system’s and the instructor’s scoring based on similar evaluation criteria. A six-band score was given to each of the five areas of writing concern: Focus and Meaning, Content and Development, Organization, Language Use and Style, and Conventions and Mechanics. The final score was the sum of the bands given to the five areas, so the possible maximum score was 30. Nevertheless, the near-perfect correlation was based on inter-rating with only a single human rater, as the study involved only one class and thus one instructor. Hence, further research involving more human raters is needed.
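The scoring arithmetic lends itself to a short illustration. Assuming the band structure described above (five areas, bands of 1 to 6, totals out of 30), the snippet below totals an essay’s bands and computes the Pearson correlation coefficient commonly used for such inter-rater comparisons; the helper names are the author’s, and any numbers are illustrative rather than data from the study.

```python
# Sketch of the scoring arithmetic: five areas, each banded 1-6,
# summed to a total out of 30; agreement between the automated and
# human totals is summarised with a Pearson correlation coefficient.

AREAS = ["Focus and Meaning", "Content and Development", "Organization",
         "Language Use and Style", "Conventions and Mechanics"]

def total_score(bands: dict[str, int]) -> int:
    """Sum the five area bands; maximum possible total is 30."""
    assert set(bands) == set(AREAS) and all(1 <= b <= 6 for b in bands.values())
    return sum(bands.values())

def pearson(xs: list[float], ys: list[float]) -> float:
    """Pearson correlation between two raters' total scores."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)
```

A correlation near 1.0 over a set of essays would indicate that the system and the human rater rank and score the essays almost identically, which is the kind of agreement the study reports, subject to the single-rater caveat above.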

On the whole, the setbacks of MyAccess as a pedagogic tool for both teachers and students were the slow logging-in and the large number of authentic essays required to train the scoring engine, IntelliMetric, to assess an additional writing prompt accurately. At present, MyAccess has only 12 writing prompts for higher education. The prompt collection needs to be expanded to make the system truly versatile and interesting.

6.0       Suggestions and Conclusion

As this exploratory study involved only one instructor and a small group of student users, the system needs to be trialled again with a bigger sample in order to establish its true value. The instructor also had reservations about the near-perfect inter-rater reliability claim. The consistency can be tested when more human raters are involved in grading a large enough pool of similar writing prompts.

In conclusion, the online writing tools and resources of MyAccess, when integrated, can provide a supportive writing instructional and learning environment for developing students’ writing competence. Indeed, teachers do not need practice in grading, but students definitely need practice in writing. With the automated holistic and analytical feedback and the various supportive resources and tools, there is hope that the neglected “R” can now be given its due emphasis through the writing environment afforded by MyAccess.


References

Chapman, K. (2002). Vital to Master English. The Star Online, 6 October 2002. Retrieved January 26, 2004 [HREF1].

Daniel, G., & Cox, K. (2001). Automated Essay Grading. Webtools Newsletter. Retrieved January 26, 2004 [HREF2].

Hearst, M. (2000). The Debate on Automated Essay Grading. IEEE Intelligent Systems, September-October 2000.

Honan, W. (1999). High Tech Comes to the Classroom: Machines that Grade Essays. New York Times, January 27, 1999.

Whittington, D., & Hunt, H. (1999). Approaches to the Computerized Assessment of Free Text Responses. Proceedings of the Third Annual Computer Assisted Assessment Conference, Loughborough University, ISBN 0953321037, pp. 207-219.

Williams, R. (2001). Automated Essay Grading: An Evaluation of Four Conceptual Models. In A. Herrmann & M. M. Kulski (Eds.), Expanding Horizons in Teaching and Learning. Proceedings of the 10th Annual Teaching Learning Forum, 7-9 February 2001. Perth: Curtin University of Technology.

Hypertext References

HREF1
http://thestar.com.my/news/story.asp?file=/2002/10/6/education/
HREF2
http://wwwtool.cityu.edu.hk/newslett/automatedessaygrading.htm

Copyright

Tan Bee Hoon, © 2006. The author assigns to Southern Cross University and other educational and non-profit institutions a non-exclusive licence to use this document for personal use and in courses of instruction provided that the article is used in full and this copyright statement is reproduced. The author also grants a non-exclusive licence to Southern Cross University to publish this document in full on the World Wide Web and on CD-ROM and in printed form with the conference papers and for the document to be published on mirrors on the World Wide Web.