Cheryl Howard, Lecturer, Berwick School of Multimedia, Monash University [HREF1], 100 Clyde Rd Berwick, Victoria 3806. cheryl.howard@infotech.monash.edu.au
Michael Morgan, Senior Lecturer, Berwick School of Multimedia, Monash University michael.morgan@infotech.monash.edu.au
Kirsten Ellis, Lecturer, Berwick School of Multimedia, Monash University kirsten.ellis@infotech.monash.edu.au
This research project explores the effect that established delivery methods (lectures and tutorials) and interactive delivery methods (collaborative learning and game-based study tools) have on the learning outcomes of individual students when delivering the theoretical content of a university-level subject (MMS2403 Human Computer Interaction for Multimedia).
The study investigates the employment of a collaborative learning environment that allows students to contribute to the content within game-based study tools designed to enhance the study and review of complex and theoretical content. It provides students with an alternative method of learning the required content, allowing them to work collaboratively using tools with which they were familiar (eg: internet, discussion boards, discussion groups) and then to review and study the content using the game-based study tools. The treatment compares the results of students in each group (traditional vs collaborative) on their performance scores in a pre-test and post-test of the content area (short-term retention) and on their results in the end-of-semester examination (long-term retention). The initial results support the hypothesis that the alternative method is more effective; however, further investigation into the effects of preferred learning styles may affect this result.
The concept of learning through play has been around for millennia and continues to remain effective for a range of learning outcomes. Prensky (2001b) [HREF2] argues that the current generation of learners think and learn differently because they have grown up being exposed to digital gaming environments for both pleasure and learning. Prensky also argues strongly for changing the present learning environments, at Kindergarten to University level and beyond, from the predominantly “tell-test” framework currently employed by a majority of educational institutions to ones that harness the potential of Digital Games-based Learning (DGBL). He argues that students will learn better when the way they learn is stimulating, exciting and fun, and that the more traditional methods used (lecture, tutorial, textbook, etc.) do not adequately meet the needs of many of today's learners. Aldrich (2005) and Quinn (2005) strengthen Prensky's argument by stating that the key elements of motivation and engagement enhance learning as well as retention of the information being taught.
Research in the areas of DGBL is still relatively new and explores ways in which new methods can enhance the learning process. Oblinger (2004) states that games “offer potentially powerful learning environments” and many of their attributes “make them pedagogically sound learning environments.” Oblinger concedes that the results from using games to enhance traditional learning environments are encouraging but also warns that there are still questions “about how games will be developed, deployed and accepted in higher education” (2004:1) [HREF3].
The research objective is to provide insight into the deployment of an alternative teaching strategy using games as a tool to enhance the study of complex theoretical information. The project focuses on exploring the potential for enhancing learning by encouraging active learning through collaborative learning and game-based activities, in order to challenge the methods currently used in teaching in tertiary contexts.
While not a new concept, learning through computer games has met resistance, not so much from teachers, but from other school stakeholders (education boards, parents, employers, etc.) who do not see the potential or actual educational benefits of computer games within the curriculum (Kirriemuir & MacFarlane, 2004). However, DGBL is gaining strong support, with numerous large educational institutions undertaking research into “serious games” (eg: NESTA Futurelab [HREF4], BECTA [HREF5] and Abertay University's 'Play to Win' research centre [HREF6] in the UK, the MIT Games to Teach programme [HREF7] and Games2train.com [HREF8] in the US, and the Australian Flexible Learning Framework [HREF9]). Exploring ways to engage, stimulate, and motivate learners has become the main focus of the research, as well as “trying to develop digital environments that support new forms of learning” (Facer, 2003). The lecture-based model has long been shown to be generally less effective than teaching methods such as learning by doing and situated learning, where “games are seen as environments that could actively support these practices.” (Facer, 2003 [HREF10])
The influences of Behaviourist and Instructivist methodologies have given way to the Constructivist approach to teaching and learning (Aldrich, 2005; Harper & Hedberg, 1997; Haskell, 2001; Jonassen, 1999; Quinn, 2005; Reigeluth, 1999). This shift in educational philosophy moves students away from being passive learners, as is the case for the former two methodologies, towards learning environments where learners are encouraged to be actively involved in the construction of their knowledge and understanding of the subject matter. Research has shown that active and collaborative learning environments are quite effective for encouraging learning on many levels (Aldrich, 2005; Hedberg & Harper, 1995; Jonassen, 1999; Keyser, 2000; Quinn, 2005). It has also demonstrated that socialising the learning process (such as through small groups, discussion, debate, presentation, etc.) is more likely to increase transfer of the learning to other relevant contexts and to improve the retention of information for longer periods (Felder & Brent, 2005 [HREF11]; Grabinger & Dunlap, 2000; Haskell, 2001; Keyser, 2000).
Howard, Morgan & Ellis (2006) maintain that “good teachers have instinctively known that 'learning by doing' can be a very effective method, particularly for the application of subject matter that may be quite abstract and theoretical.” This premise provides the motivation for this project: exploring possible teaching strategies that can support and encourage active learning within collaborative learning environments. However, Kiili (2005) contends that the emphasis on using technology as an integral part of the teaching process often leads to it being a substitute teacher for information delivery rather than a set of “learning tools that support the active learning process” (Kiili, 2005:303). Technology cannot be seen as a panacea for every teaching or learning situation; its application must be matched to the required outcome(s), with computers supporting and enhancing the learning process rather than serving merely as a delivery vehicle for content (Facer, 2003; Grabinger & Dunlap, 2000; Quinn, 2005).
Numerous studies (Aldrich, 2005; Facer, 2003; Kirriemuir & MacFarlane, 2004 [HREF12]; Oblinger, 2004; Prensky, 2001b, 2001c [HREF13]; Quinn, 2005) have shown that growing up playing computer games has become an integral part of both leisure and learning time for the current generation of students. Further implications include the recognition that games could potentially support learning in a variety of contexts, that student attitudes towards education have changed their expectations of the types of learning activities they encounter, and that the traditional “tell-test” approach to teaching may no longer be sufficient. Prensky (2001a) challenges the assertion that games are not for learning by arguing that any subject matter (the more boring, complex or difficult the better) can be turned into a game because: 1) it will meet the needs and learning styles of current and future students; 2) games are motivational and fun; and 3) they can be “enormously versatile, adaptable to almost any subject, information or skill to be learned, and when used correctly, is extremely effective” (2001a:3).
The types of games required are moving beyond the ubiquitous simple drill-and-practice to the simulation of real-world contexts and complex problem-solving. However, rote learning (eg: drill-and-practice games and their ilk) can be a necessary part of a student's learning experiences because sometimes there is no other way to learn specific facts and concepts (eg: maths tables, spelling and grammar, dates, etc.). Prensky stresses that these games must be both compelling and “combined creatively with real content”, “not just drill with eye-candy” (Prensky, 2001b, 2001c). Using this premise, part of this research was to ascertain whether, by developing and providing a variety of simple games through which to deliver complex content, individual teachers could create a learning environment that is interesting, challenging and, above all, fun for their students.
Unfortunately, a significant limitation of simple games is that they do not address the issue of the transfer of learning, often providing memorisation of the content presented with little opportunity for its application in new situations. Aldrich (2005) , Oblinger (2004) and Quinn (2005) agree that the application of a learning-by-doing approach, particularly one that involves the use of games in some form, can encourage the transfer of learning to other related contexts. They also argue that game elements (eg: urgency, complexity, learning by trial-and-error and scoring points) are attributes that contribute to effective learning environments.
Teachers can implement different educational strategies to create appropriate learning environments for their students which, in itself, may often not be enough to achieve the required outcomes. Student motivation is the essential element that will either make or break the implemented strategy, especially when the content is perceived to be highly theoretical or challenging. Oblinger (2004) suggests that “for learners who are experiential, social, multi-taskers, games may provide a new freshness of approach and motivation to their studies.” Facer (2003) also supports this view, indicating that a range of different learning environments should be provided to support the learning processes of different types of students, with games being one way to actively support them because they can “both motivate and encourage diverse ways of engaging with learning.” (Facer, 2003)
A further characteristic of the learning process explored was that of preferred learning styles. Felder & Brent (2005:57) argue that students have “different levels of motivation, different attitudes about teaching and learning, and different responses to specific classroom environments and instructional practices”. By understanding these differences, teachers can devise more appropriate strategies to meet the needs of more students. Further, Soloman & Felder (1999) [HREF14] developed an Index of Learning Styles (ILS) classifying learners along four main dimensions: active/reflective, sensing/intuitive, visual/verbal and sequential/global. The implication is that teachers, having identified the general trend of their students using a learning styles indicator such as the ILS, could adapt their teaching style to suit the learning style(s) of the majority of their students, thereby potentially increasing the learning taking place. The conundrum is that rapid student turnover (6-12 months) makes such continual adaptation stressful for teachers, time-consuming and potentially very costly. With student populations becoming more diverse in any given educational institution (Sander, 2005), what can be done to meet the needs of these students that is both cost and learning effective? A proposed solution is the combination of a collaborative learning environment and the use of game-based study tools. However, the researchers acknowledge that “although a promising tool, games are not replacements for faculty involvement, direct experience or the hard work of learning.” (Oblinger, 2004)
The purpose of this study was to test the following hypothesis:
A. Collaboration and game-based study tools improve the learning of Human-Computer Interaction (HCI) content by 2nd year university students.
B. Current student learning styles conflict with current teaching styles, decreasing the motivation to learn complex theoretical content.
Key factors that signified the need for the study were a significant drop in attendance at both tutorials and lectures from early in the semester and the students' perceptions about the necessity of knowing the content. As with most tertiary institutions, the general mode of course delivery is via lecture and tutorial. The university also provides an online facility, Monash University Students On-line (MUSO), where all lecture notes and other tutorial and reference materials can be posted. The survey data indicated that MUSO may have contributed to the students' lack of attendance at lectures because of their perception that “if the lecture notes are posted on MUSO I don't need to go” and their assumptions that “lectures are so boring / such a waste of time” or “what's the point of learning this stuff”. Additionally, the data suggests that, with the majority of students enrolled in the course being male (84.54%) and aged less than 30 years, the lecture method held very little appeal as a learning environment; hence the necessity of identifying a viable alternative teaching strategy and/or learning environment for delivering the content. Providing opportunities for students to participate in a more hands-on approach was identified as the foundation for developing the teaching/learning strategies for the study, due to the subject being a core unit with highly theoretical content in a largely practical course (Bachelor IT – Multimedia). It is important to note that, despite lectures/tutorials being a typical delivery method for this course, student satisfaction for the unit remains quite good (eg: on a scale of 5, the University's Unit Evaluation surveys showed 3.57 for face-to-face teaching and 3.63 for unit presentation).
The research was conducted at the Berwick School of Information Technology of Monash University. The lectures were conducted in a large theatre supported by PowerPoint slides. Tutorials were conducted in computer labs with 20 networked computers (one per student); however, some students opted to use personal laptops connected to the network. During the experimental period participants used the studio labs, equipped with 5 paired networked workstations with informal seating and work areas (eg: beanbags, lounges and tables). The studios were designed to provide an environment more conducive to collaborative work, with access to resources (internet, other online resources, textbooks, etc.) as required.
The potentially prohibitive issues of a) working within budget, b) timetabling and c) ensuring that no student was deemed disadvantaged were indeed challenging. However, as indicated below (Table 1), the concerns of access and equity were addressed satisfactorily, ensuring that all students received the same content and level of instruction.
| Lecture / Tutorial Model | Collaborative / Game Model |
| Lecture (1 to 2 hours); Tutorial (2 hours) | Collaborative on-line research (1 to 2 hours); Tutorial (2 hours) |
Table 1: Teaching Delivery Models Used
Over the course of a semester, students enrolled in the 2nd Year unit MMS2403 (Human Computer Interaction in Multimedia) were presented with the topics required by the unit syllabus. The control format was introduced to all students and consolidated during the first three weeks of tutorials (weeks 2-4). This consisted of:
At the end of this period, all participants were administered a practice quiz, covering the topics presented during the previous three weeks, and a survey to ascertain their preferred learning styles. The participants were split into the control and experimental groups, with the control group continuing to use the control format (described above) and the experimental group using the combined collaborative game-based learning format (described in the next section). At the end of the three week experimental period (weeks 5-7) all participants were again administered a practice quiz and a survey about their attitudes towards the delivery model experienced. After the experimental period, the lecture/tutorial model was used to deliver content until the end of semester (Fig. 1). The practice quizzes were to provide feedback to the students on their current level of understanding of the content and were not part of the assessment for the unit.
The decision to use this approach was influenced by the students' preferred method of learning, initially indicated by students' verbal comments to lecturers and tutors in previous semesters, and later supported by the results from the surveys administered to this cohort.
The prepared Focus Question sheets were used by both groups; however, the experimental group did not attend the lectures but worked in small collaborative groups for the same period of time. The focus questions provided both a guide for study and appropriate starting points for finding the answers (eg: links to lecture notes, on-line readings and textbook chapters). To ensure compatibility with the study tools, the focus questions were divided into five categories related to the current topic, with four questions each. Each session used the following format:
The questions generated during the session were later incorporated into the game-based study tools, using the Question & Answer Generator. The students could access these questions during the first half hour of their assigned tutorial time, the same period allotted to the control group for answering the focus questions.
The initial concept for the development of the game-based study tools was inspired by Prensky's games2train [HREF8] website, which “marries computer games and educational content into a new 'Nintendo Generation' approach to learning … the underlying idea is that students learn better when they are having fun and are engaged on the learning process.” [HREF15]. The simple premise was that, using a game template and a prescriptive form of data entry, a game could be created to present any content in a novel way to help students study complex theoretical information. Analysis of this problem indicated that not only a game template but also a question creation tool was required.
The Question and Answer Generator (Q&A Generator) was developed to ensure that question creation would be a simple process and compatible with the games. The structure of the content is defined by a broad subject heading (eg: HCI), divided into specific topic areas (eg: What is HCI?, Task Analysis, etc.), and then distilled further into five categories breaking the content into manageable learning chunks. A significant limitation of the Prensky games was also identified: the question formats were very limited – True or False, Multiple Choice (single answer) and Multiple Choice (multiple answers). So that students could use a broader range of thinking skills, eight different question formats were created – True or False, Multiple Choice (single answer), Multiple Choice (multiple answers), Matched Pairs (eg: definition and meaning), Sequencing, Fill-in-the-Blanks, Short Answer, and a Likert Sliding Scale – all of which have the option to include graphics and/or sound. Other features include provision for references, as web links or electronic documents, and a scrollable space to provide comprehensive feedback. Comprehensive documentation was also created to provide both instructions on how to use the Q&A Generator and suggestions for maximising the effectiveness of the question formats by varying the complexity of the questions asked to promote higher order cognitive thinking and processing. This increases the value of the questions and challenges the students to think carefully before answering.
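As an illustration only, the content structure described above (a subject heading divided into topic areas, each broken into categories of questions in one of eight formats, with optional references and feedback) could be represented as follows. The Q&A Generator's actual data format is not documented here; all names, fields and sample content in this sketch are hypothetical:

```python
# Hypothetical sketch: subject -> topics -> categories -> questions.
# The eight question formats named in the text, under invented identifiers.
QUESTION_FORMATS = {
    "true_false", "mc_single", "mc_multiple", "matched_pairs",
    "sequencing", "fill_blanks", "short_answer", "likert_scale",
}

def make_question(fmt, prompt, answer, feedback="", references=None):
    """Build a question record; fmt must be one of the eight formats."""
    if fmt not in QUESTION_FORMATS:
        raise ValueError(f"unknown question format: {fmt}")
    return {
        "format": fmt,
        "prompt": prompt,
        "answer": answer,
        "feedback": feedback,            # comprehensive feedback text
        "references": references or [],  # web links or electronic documents
    }

# Sample hierarchy (content is illustrative, not taken from the unit).
subject = {
    "name": "HCI",
    "topics": {
        "What is HCI?": {
            "Definitions": [
                make_question(
                    "true_false",
                    "HCI is concerned with the design of interactive systems.",
                    True,
                    feedback="See the relevant lecture notes and textbook chapter.",
                ),
            ],
        },
    },
}
```

A prescriptive structure of this kind is what lets a single game template consume any subject's content unchanged.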
Games were chosen for development whose game play would be familiar to most students, so as not to add the burden of learning the games as well as the content: two arcade games for individuals (a Space Invaders style game and a variation on a Pac-Man style game) and one turn-based game (a Pick-a-Box style game) for those who prefer a controlled pace or playing against another player. The games became the vehicle by which the content created by the students was accessed, in an easy-to-use study tool to help learn and revise the large amount of complex content covered throughout the unit – experimental participants accessed the games during the same period that the control participants were working through the focus questions. By having the students create their own questions of varying complexity about what they have studied, they engage in higher order cognitive thinking and levels of understanding. The intention is for the questions to become more meaningful and to challenge the students to avoid guessing the answers.
The purpose of the research was to identify the current student cohort's preferred learning styles, to ascertain their attitude towards the lecture/tutorial mode of delivery and to analyse the implications these may have for providing more suitable learning environments in a higher educational institution. The MMS2403 (HCI for Multimedia) core unit of an undergraduate degree course in multimedia had 99 students enrolled, with only 19 choosing not to participate in the research. The results of the second survey and quiz may be skewed by the pressures of preparing other assessments due for submission at the time or by the reduced number of participants (Table 2). Analysis of the data (Fig. 4), using the Soloman & Felder ILS (1999), shows a greater number of students falling within the ASeViSq range of learners (Active, Sensing, Visual, Sequential), suggesting that a collaborative learning environment may be more suitable for them. However, the Soloman & Felder ILS indicates that the current method is better suited to RIVeSq learners (Reflective, Intuitive, Verbal, Sequential).
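The grouping used in this analysis, labelling each learner by the pole they favour on each of the four ILS dimensions, can be sketched as follows. This is a hypothetical illustration only: the scoring function and score ranges are invented, and the actual ILS derives each dimension score from forced-choice questionnaire items not reproduced here:

```python
# Hypothetical sketch: map four signed ILS dimension scores to a compact
# style label such as "ASeViSq". A non-negative score is taken to favour
# the first pole of each dimension (an assumption for illustration).
DIMENSIONS = [
    ("A", "R"),    # Active / Reflective
    ("Se", "I"),   # Sensing / Intuitive
    ("Vi", "Ve"),  # Visual / Verbal
    ("Sq", "G"),   # Sequential / Global
]

def style_label(scores):
    """scores: four signed numbers, one per ILS dimension."""
    return "".join(
        first if score >= 0 else second
        for (first, second), score in zip(DIMENSIONS, scores)
    )

# A learner favouring the Active, Sensing, Visual and Sequential poles:
print(style_label([5, 3, 7, 1]))     # ASeViSq
# A learner favouring the Reflective, Intuitive, Verbal and Sequential poles:
print(style_label([-3, -1, -5, 7]))  # RIVeSq
```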
It is important to understand how individual learning styles impact on learning and performance, in general or in specific learning environments, because the dynamics of student populations within universities are changing. Sander (2005) argues that with the diversity within student populations increasing, universities are required to consider, not only effectiveness and efficiency of their teaching, but also the quality of the learning environments they provide. In order to ensure teaching strategies keep pace with the changing student population, examining data using an ILS is one method that can provide indicators to help identify and develop any new strategies that will meet these goals.
Quiz 1, containing 15 questions, was administered before the treatment, covering the topics presented during the previous three weeks. The control group (n=23) had a min. score of 3, a max. score of 11.5 and a mean score of 7.72. The experimental group (n=40) had a min. score of 4.75, a max. score of 14 and a mean score of 9.06. Quiz 2, also containing 15 questions, was administered after the treatment, covering the topics presented during the previous three weeks. The control group (n=11) had a min. score of 6.5, a max. score of 9.75 and a mean score of 8.20. The experimental group (n=19) had a min. score of 6.25, a max. score of 12 and a mean score of 8.34. Although the maximum scores were lower for both groups in the second quiz, the minimum scores both improved, possibly indicating that both groups were experiencing better learning through the use of the focus questions. The examination, containing 30 questions, was administered at the end of the semester. The control group (n=32) had a min. score of 19, a max. score of 80 and a mean score of 61.64. The experimental group (n=43) had a min. score of 26.5, a max. score of 92 and a mean score of 66.06. This may indicate that the participants' long-term retention of the content was improved by the use of the collaborative learning environment, while the short-term results showed no statistically significant difference.
While the results are not conclusive, consistently higher results were achieved by the experimental group. It is unclear whether these results are due to the differences in the teaching strategies or to the students' attitude towards the subject. However, from the data it could be concluded that the change in delivery method may have contributed to improving both short-term and long-term retention of the content. More detailed analysis of the individual responses to the quizzes and the examination may identify which of the following factors influenced the results:
In general, the comments from the Control group were more negative, while those from the Experimental group, despite some negative comments, showed a more positive attitude to the subject and content (Table 4). Overall, the experimental group demonstrated increased motivation to interact with the content, enjoying the opportunities to explore the variety of resources and participating in discussions during the researching process. After the first week, the students were keen to start the activities and required very little guidance in the process (outlined previously). The significant difference in attitude between the groups continued after the treatment, with many of the experimental group students choosing to work in groups during the tutorials while answering the focus questions, suggesting that the method was perceived as useful for their learning. This aspect of the research needs further exploration to determine what benefits can be gained by students from employing this approach to learning within a tertiary environment.
| Question | Control | Experimental |
| What problems or issues did you experience while learning with this method? | Hard to fake interest because the subject is so dry; hard to apply material to practical tasks. | Because we only researched a portion of the content ourselves, we ended up having a disparity in levels of knowledge between different content sections. |
| What aspects of this method did you like? | Focus questions make sense and help learning. | It was an unusual experience to approach learning this way, particularly since the timeframe was so short. I enjoyed it a lot, but it would have been better if it had gone longer. |
| Compared to other subjects, how well do you think you have learned the content for this subject? Why (not)? | Not very well. The lectures are dull and not very motivating. | In the discussion group I learned quite a lot. |
| Do you think the method used for delivery of the content for this subject would be appropriate for other subjects? Why (not)? | I think the delivery of lectures and focus questions is a little boring, and I find it hard to get motivated. | The game based learning was new exciting way to learn and much better delivery. |
Table 4: Some sample comments from surveys
The students from both groups raised a number of issues with the focus questions – in the main, students were satisfied with the format, but a number of improvements were suggested. Upon reflection, implementation of these improvements will significantly improve the resource, providing a more focused approach for study and revision of the content and enhancing the experience.
The most notable demonstration of student understanding of the content was the evolution of the questions created for the game study tool and the quality of the focus question answers posted. Initially, all groups tended to be conservative, concentrating on the creation of simple right/wrong questions in the True/False or multiple choice formats, requiring very little cognitive processing. The answers posted also tended to be quite simplistic and lacking in depth. As the students' confidence with the process grew over the following weeks, they demonstrated an ability to interpret the content at a higher level. The quality of the answers posted began to show more depth, distinct links to the questions posed and, in some cases, extension of the ideas presented. They also began to experiment with other types of question formats (fill-in-the-blanks, sequencing, matched pairs and the Likert sliding scale) and showed improvement in the question content, ranging from simple to quite complex and thought-provoking.
Another observable difference between the groups was that, during the consolidation time allotted (the first half hour of tutorials), those within the control group tended to work by themselves, despite being encouraged to work with a partner if they chose, and tended to rely mostly on the lecture notes posted on MUSO. Those within the experimental group, while using the games study tool, often flicked through various resources (ie: the lecture notes, their own answers to the focus questions and the posted answers provided by the other groups) when they encountered difficulty answering the quiz questions. They were also more likely to question the correctness of the answers provided when they made mistakes – checking and re-checking the resources available.
This method worked quite well for the small experimental group, although its application to larger groups may be problematical in that the need for greater human and computer resources may prove cost prohibitive – one lecturer and theatre vs a number of computer labs with tutors. However, the benefit to the students, given their increased engagement and motivation with the content to be learned, may well offset this cost, potentially making this a viable alternative. The data suggests that increasing student interactivity with the delivery of the content may increase student motivation to learn it, because students can process it in ways that make sense to them, and may be significant in overcoming the perceived shortcomings of the current method.
The implementation of the Focus Questions was deemed successful, as many of the students felt that they helped them focus their study of the content. This aspect of the research would be quite easy and cost effective to implement, regardless of the student numbers. Other student feedback provided valuable insight into how this resource can be improved in the future and will be reviewed and implemented in further studies, such as:
The game-based study tools provided a novel way in which the students could engage with the complex theoretical content required for the unit. Although a few issues were identified by the students regarding the usability of the tools, generally they were well received. Additionally, a number of students have expressed interest in creating their own versions of them as part of their 3rd Year Studio Projects (ie: building applications and/or media resources for a “client”). A game template is being developed, for use by the students, to provide a consistent framework for the delivery of the content while allowing student creativity in the design of the game interface and game play.
This unexpected result from the use of the tools would support the initial premise of the research – increased engagement with the content can promote and motivate a student's desire to learn. Although this would be related to a different subject area, the principle is the same: learning will take place within the context of a collaborative environment where students are engaged with the manipulation, processing
Aldrich, C. (2005). Learning by doing: A comprehensive guide to simulations, computer games, and pedagogy in e-learning and other educational experiences. San Francisco, CA: Pfeiffer.
Facer, K. (2003). Computer games and learning. Retrieved 24/06/2005, from [HREF10]
Felder, R. M., & Brent, R. (2005). Understanding student differences. Journal of Engineering Education, 94 (1), 57-72. [HREF11]
Grabinger, R. S., & Dunlap, J. C. (2000). Rich environments for active learning: A definition. In D. Squires, Conole, G. & Jacobs, G. (Ed.), The changing face of learning technology (pp. 8-38). Cardiff: University of Wales Press.
Harper, B., & Hedberg, J. (1997, December 7-10). Creating motivating interactive learning environments: A constructivist view. Paper presented at the 13th ASCILITE Conference “Reflections in Learning with Technology”, Perth.
Haskell, R. E. (2001). Transfer of learning: Cognition, instruction and reasoning. San Diego: Academic Press.
Hedberg, J. G., & Harper, B. (1995). Exploration and investigation in information landscapes. Paper presented at the Apple University Consortium Conference.
Howard, C., Morgan, M., & Ellis, K. (2006). Games and learning. Does this compute? Paper presented at the ED-MEDIA 2006 Conference (June 26-30), Orlando, Florida, USA.
Jonassen, D. H. (1999). Designing constructivist learning environments. In C. Reigeluth (Ed.), Instructional-design theories and models - a new paradigm of instructional theory (Vol. 2, pp. 215-239). Mahwah, NJ: Lawrence Erlbaum Associates, Publishers.
Keyser, M. (2000). Active learning and co-operative learning: Understanding the difference and using both styles effectively. Research Strategies, 17 , 35-44.
Kiili, K. (2005). Participatory multimedia learning: Engaging learners. Australasian Journal of Educational Technology, 21(3), 303-322.
Kirriemuir, J., & MacFarlane, A. (2004). Literature review in games and learning (No. 8). Bristol: NESTA Futurelab. [HREF12]
Oblinger, D. (2004). The next generation of educational engagement. Journal of Interactive Media in Education (2004 Special Issue on the Educational Semantic Web). [HREF3]
Prensky, M. (2001a). Digital game-based learning . New York: McGraw-Hill.
Prensky, M. (2001b). Digital natives, digital immigrants. On the Horizon, 9 (5). [HREF2]
Prensky, M. (2001c). Digital natives, digital immigrants, part ii: Do they really think differently? On the Horizon, 9 (6). [HREF13]
Quinn, C. (2005). Engaging learning: Designing e-learning simulation games . San Francisco, CA: Pfeiffer.
Reigeluth, C. (1999). Instructional-design theories and models - a new paradigm of instructional theory (Vol. 2). Mahwah, NJ: Lawrence Erlbaum Associates, Publishers.
Sander, P. (2005). Reaching our students for more effective university teaching. Electronic Journal of Research in Educational Psychology, 5-3 (1), 113-130.
Soloman, B., & Felder, R. (1999). Index of learning styles (ILS). Retrieved from [HREF14]