Adaptive Teaching Technology Learning to Read Emotions

Even as they promise to be the future of education, adaptive learning technologies still have some substantial limitations. As Ed Horowitz of EdSurge explains, while these systems have become very good at tailoring the next question to each student's apparent skill level, they can't perceive why a student is getting questions right or wrong. They will always attribute answers to skill, when the reality could be that the student is tired, hungry, or bored.

In this way, adaptive learning has a long way to go to catch up to real teachers, but some early research is beginning to tackle the problem.

The future has the potential to be quite different, and that possibility is neatly illustrated by some new research led by Tilburg University’s Marije van Amelsvoort. The study, which will appear in the May issue of Computers in Human Behavior, examined whether intelligent tutoring systems have the potential to determine a student’s emotional state by observing their facial expressions.

The first step was to figure out what students' faces had to tell. Adults participating in the study observed the facial expressions of 5th- and 2nd-graders working on a set of math problems and were then asked to determine, based on those expressions, whether the problems were difficult or easy.

The results were surprising only to those who don't believe that our emotions are written on our faces. The adults were able to identify fairly accurately when the children were working on the more difficult problems. The next step was to use a computer program to replicate the results. Horowitz explains that only the first second of each video was used, so as not to taint the results with student pauses, which almost always indicate that a student is experiencing some difficulty.

Using a method called “Active Appearance Models,” a total of 111 video “fragments” were analyzed in Matlab. Instead of analyzing specific facial expressions, which is what the human participants attempted to do, the computer program focused on whether head movements were vertical or diagonal. The program would choose a single fragment as the “test fragment” and train itself on the remaining 110 fragments before classifying it. This process was repeated until each of the 111 fragments had taken a turn as the test fragment.
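This train-on-the-rest, test-on-one procedure is known as leave-one-out cross-validation. The sketch below shows the general idea in Python; the random feature values and the nearest-neighbor classifier are illustrative assumptions, not the study's actual Matlab pipeline.

```python
# Sketch of leave-one-out cross-validation over 111 video fragments,
# assuming simple head-movement features (e.g., amounts of vertical and
# diagonal motion). The actual study used Active Appearance Models in
# Matlab; this is only an illustration of the evaluation procedure.
import numpy as np
from sklearn.model_selection import LeaveOneOut
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)

# Hypothetical data: 111 fragments, 2 features each,
# labeled 0 = easy problem, 1 = difficult problem.
X = rng.normal(size=(111, 2))
y = rng.integers(0, 2, size=111)

correct = 0
for train_idx, test_idx in LeaveOneOut().split(X):
    clf = KNeighborsClassifier(n_neighbors=5)
    clf.fit(X[train_idx], y[train_idx])  # train on the other 110 fragments
    correct += int(clf.predict(X[test_idx])[0] == y[test_idx][0])

accuracy = correct / len(X)
print(f"Leave-one-out accuracy: {accuracy:.2%}")
```

With random data like this, accuracy hovers near the 50% chance level; the point of the design is that every fragment gets classified by a model that never saw it during training.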

Overall, across the 111 attempts, the program correctly determined the level of difficulty more than 70% of the time. Had the program been guessing at random, it would have been right about as often as a coin toss: 50% of the time.
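To see why roughly 70% over 111 fragments is meaningfully better than a coin toss, a quick binomial check helps. The exact number of correct classifications below is an assumption (the article reports only “more than 70%”), and this calculation is illustrative rather than part of the study's analysis.

```python
# Rough check: how likely is a coin-flipping classifier to get
# at least ~70% of 111 fragments right purely by chance?
from math import comb

n = 111        # number of video fragments
k = 78         # assumed number correct (~70% of 111)
# P(X >= k) for X ~ Binomial(n, 0.5)
p_value = sum(comb(n, i) for i in range(k, n + 1)) / 2**n
print(f"Probability of {k}+ correct out of {n} by chance: {p_value:.2e}")
# Well under one in ten thousand -- far beyond what a coin toss could explain.
```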

Both the researchers and Horowitz feel that this study represents a nice beginning, but that more work needs to be done in this area. For the moment, however, it's easy to see that human teachers still have a role to play in classrooms where adaptive technology is deployed. They alone have the tools to determine the root cause of a student's struggle and direct him or her back to work.
