A new study led by body-language expert Dr Harry Witchel, Discipline Leader in Physiology at Brighton and Sussex Medical School (BSMS), suggests that computers can read a person's body language to tell whether they are bored or interested in what they see on the screen, based on their movements while using the computer – such as scrolling and clicking.
The research suggests that it is possible to judge a person's level of interest by monitoring whether they display the tiny movements that people normally exhibit all the time, widely referred to as non-instrumental movements.
If a person is absorbed in what they are doing or watching on the computer – what Dr Witchel calls 'rapt engagement' – these involuntary movements decrease noticeably.
Dr Witchel said: "Our study showed that when someone is really highly engaged in what they're doing, they suppress these tiny involuntary movements. It's the same as when a small child, who is normally constantly on the go, stares gaping at cartoons on the television without moving a muscle."
The study involved 27 participants, who were presented with a range of three-minute stimuli on a computer – from fascinating games to tedious readings from EU banking regulation – while using a handheld trackball to minimise instrumental movements, such as moving the mouse. Their movements were quantified over the three minutes using video motion tracking. In two comparable reading tasks, the more engaging reading produced a significant reduction (42%) in non-instrumental movement.
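To give a rough sense of how such movement could be quantified from video, the sketch below scores a recording by simple frame differencing: the fraction of pixels that change between consecutive frames serves as a crude proxy for how much the viewer is fidgeting. The study used a dedicated video motion-tracking system, so this approach, the `movement_score` function, and the pixel-change threshold are illustrative assumptions rather than the researchers' actual method.

```python
# A minimal sketch of quantifying a viewer's movement by frame differencing.
# The function name and threshold are hypothetical, for illustration only.
import cv2
import numpy as np

def movement_score(video_path: str, diff_threshold: int = 25) -> float:
    """Return the mean fraction of pixels that change between frames."""
    cap = cv2.VideoCapture(video_path)
    ok, frame = cap.read()
    if not ok:
        raise ValueError(f"Could not read video: {video_path}")
    prev = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    scores = []
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        # Pixels whose intensity changes by more than the threshold
        # are counted as motion.
        diff = cv2.absdiff(gray, prev)
        moving = np.count_nonzero(diff > diff_threshold)
        scores.append(moving / diff.size)
        prev = gray
    cap.release()
    return float(np.mean(scores)) if scores else 0.0
```

Comparing such scores between an engaging and a tedious condition would mirror, in spirit, the study's comparison of the two reading tasks.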
The findings could have a significant impact on the development of future learning applications based on artificial intelligence, since a programme could adapt to a person's level of interest and attempt to re-engage them when signs of boredom appear. The work could even contribute to the development of companion robots, which would be better able to estimate a person's mental and emotional state.
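As a purely hypothetical illustration of that adaptive loop, a learning application might compare a learner's current fidgeting against a baseline measured during a known-engaging calibration task and intervene when it rises well above it. The baseline calibration, the ratio threshold, and the intervention here are assumptions, not anything described in the study.

```python
# A hypothetical re-engagement check for a learning application.
# The 1.5x fidget ratio and the intervention are illustrative assumptions.
def should_reengage(current_score: float, baseline_score: float,
                    fidget_ratio: float = 1.5) -> bool:
    """Flag likely boredom when fidgeting rises well above the learner's baseline."""
    return current_score > baseline_score * fidget_ratio

# Example: baseline measured during an engaging calibration task.
if should_reengage(current_score=0.08, baseline_score=0.04):
    print("Learner may be bored - switch to an interactive exercise.")
```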
"Being able to 'read' a person's interest in a computer program could bring real benefits to future digital learning, making it a much more two-way process," Dr Witchel said. "Further ahead it could help us create more empathetic companion robots, which may sound very 'sci fi' but are becoming a realistic possibility within our lifetimes."
Source: University of Sussex