Online Learning and Gaming: Seeking Evidence of Impact
Raymond Fleming, Professor, University of Wisconsin-Milwaukee
Laura Pedrick, Special Assistant to the Provost for Strategic Initiatives, University of Wisconsin-Milwaukee
Diane M. Reddy, Professor, University of Wisconsin-Milwaukee
This session will highlight effective methods for evaluating the impact of online instruction, based on lessons learned from the U-Pace projects, funded by Next Generation Learning Challenges and the Institute of Education Sciences. U-Pace is a technology-enabled instructional approach that uses any learning management system to reliably facilitate deep learning. Using the evaluation methods developed for the U-Pace projects as exemplars, participants will leave with strategies to comprehensively document student success and an understanding of why comprehensive documentation, including measurement of the fundamental psychological processes critical to deep learning, is essential. Participants will also learn about valuable U-Pace instructional resources freely available to them.
Cognition and Gaming: What Cognitive Science Contributes to the Development of Educational Games
Steven Ritter, Founder and Chief Scientist, Carnegie Learning, Inc.
To be effective, educational software needs to take into account how students learn. We will share techniques, including task analysis and verbal protocol analysis, to illustrate how an understanding of how students think about math has influenced our designs and evaluations of games focused on building math fluency. For example, understanding the specific strategies students use to compare fractions has helped us construct "levels" in a racing game and has also provided a model for understanding what students are learning when they play the game.
The Challenges of Evaluating Game Log Data
John Stamper, Systems Scientist, Carnegie Mellon University
The use of games for learning is a hot area for research and funding, but are students playing these learning games effectively? We are implementing a framework for the collection and analysis of game log data that borrows from our earlier work with data collected from more traditional educational technologies. Game data, however, has unique features that traditional data lacks, and these features present challenges for evaluating the effectiveness of games. We'll discuss these features and the metrics we've developed in the context of the Learning Game Analysis Framework, as well as how the framework's analyses of learning can be used to improve a game's effectiveness.