A student in America has worked out how to turn his face into a remote control.
PhD student Jacob Whitehill, of UC San Diego, used facial recognition technology to monitor the expressions of test subjects watching video lectures. He believes that by detecting confusion, the system could slow a lecture down or even replay difficult sections.
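The idea is essentially a feedback loop: an estimated confusion score drives the playback controls. Here is a minimal illustrative sketch of that loop in Python. The confusion score, thresholds, and speed steps are all invented for illustration; nothing here reflects Whitehill's actual implementation.

```python
def adjust_playback(confusion_score, current_speed,
                    slow_threshold=0.6, replay_threshold=0.85):
    """Map an estimated confusion score (0 to 1) to a playback action.

    Returns a (speed, action) tuple. The thresholds and the 0.25
    speed step are hypothetical values, not taken from the study.
    """
    if confusion_score >= replay_threshold:
        # Viewer looks very confused: rewind and replay the section.
        return 1.0, "replay"
    if confusion_score >= slow_threshold:
        # Somewhat confused: slow playback, but not below half speed.
        return max(0.5, current_speed - 0.25), "slow"
    # Viewer seems comfortable: keep going as-is.
    return current_speed, "continue"

# Example: a moderately confused viewer watching at normal speed.
print(adjust_playback(0.7, 1.0))
```

In a real system the confusion score would come from a facial-expression classifier running on webcam frames, not from a hand-supplied number.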
“If I am a student dealing with a robot teacher and I am completely puzzled and yet the robot keeps presenting new material, that’s not going to be very useful to me,” says Whitehill (press release). “If, instead, the robot stops and says, ‘Oh, maybe you’re confused,’ and I say, ‘Yes, thank you for stopping,’ that’s really good.”
In a paper to be presented at an upcoming conference, he reports that his system correctly predicts subjects’ self-reported difficulty scores 42% of the time. It is less accurate at predicting preferred viewing speed, managing only 29%.
Only eight people took part in the pilot study, in which Whitehill also confirmed earlier research findings that people blink less during difficult parts of a lecture. So there’s a lot of work left to do, but Whitehill believes his system could be trained to react to individual users’ expressions.
Are Californian students really too lazy to use a remote? Any readers from that demographic are welcome to comment on this question below.
More
Video of the face-mote
Whitehill’s work on automatic attractiveness detection and its online dating potential
“Developing a Practical Smile Detector”
Image: UC San Diego Jacobs School of Engineering