Study: Physical and Virtual Learning Combine to Be Effective

A study at Carnegie Mellon University has found that young students learn up to five times better when education bridges the real and virtual worlds.

Ninety-two students aged 6 to 8 took part in the study, which compared how well they learned from a mixed-reality game versus a screen-only version confined to a laptop or tablet. It also explored how physical interaction, such as shaking the screen, affected how much they enjoyed the game.

The game, called EarthShake, featured a gorilla and block towers, and had the educational goal of teaching children basic physics concepts such as stability and balance. Students were asked which of two towers would fall first when an “earthquake” hit, and why. The testing system, called NoRILLA, combined hardware and software to give students feedback on their real-world experiments: a Microsoft Kinect for Windows depth camera detected the physical blocks, making them part of the interactive game.

The first variation had real blocks on a specially designed earthquake table, with a mouse click used to trigger the quake and shake the blocks down. The second version let students trigger the earthquake table themselves with a physical button. The third version appeared only on a laptop, with no physical blocks or shaking table. A fourth version had students shake a tablet to simulate the quake.

The variations involving the physical table were almost five times more effective than the screen-only version and were reported to be more enjoyable. However, physical controls such as pressing the earthquake button or shaking the tablet did not improve learning or enjoyment, writes Jocelyn Duffy of Futurity.

The authors of the study concluded:

Immersing children in the real world with computer vision-based feedback appears to evoke embodied cognition that enhances learning. In the current experiment, we replicated this intriguing result of the mere difference between observing the real world versus a flat-screen. Further, we explored whether a simple and scalable addition of physical control (such as shaking a tablet) would yield an increase in learning and enjoyment. Our 2×2 experiment found no evidence that adding simple forms of hands-on control enhances learning, while demonstrating a large impact of physical observation. A general implication for educational game design is that affording physical observation in the real world accompanied by real-time interactive feedback may be more important than affording simple hands-on control on a tablet.

The experiment is set to be extended, with the goal of understanding how learning and enjoyment are affected when more hands-on activities are included, such as having children build the towers themselves. The researchers also plan to create new content for the game, such as a balancing scale, writes Dian Schaffhauser of The Journal.

The project is headed by PhD student Nesra Yannier, along with two professors from the university’s Human-Computer Interaction Institute, Kenneth Koedinger and Scott Hudson. Their paper on the experiment was presented at the Association for Computing Machinery’s Conference on Human Factors in Computing Systems in Seoul and can be read in its entirety in the ACM Digital Library.