
As social robots become more involved in daily human life, they will benefit from an improved understanding of human affective and emotional behaviour. One way to achieve this is to evaluate users' facial expressions to learn how these robots should interact with them. Here at the Affective Intelligence and Robotics (AFAR) Lab (https://cambridge-afar.github.io) in the Department of Computer Science and Technology, we are working to create emotionally intelligent robots that can sense and understand human affective behaviour, improving day-to-day human-robot interaction.

On Wednesday, come along for a brief one-to-one taster session with the Pepper robot and play a short game in which Pepper learns and adapts to your facial expressions. Pepper will explain the game, then talk with you over several interactions, learning to respond to you with its speech and body gestures based on your emotional behaviour. On Tuesday, we will run a very similar session looking at the software Pepper uses, but without the robot itself, as it is unavailable that day.

The AFAR Lab is also working on robots designed to evaluate the mental wellbeing of young people. In Thursday's exhibit, you will interact with the Nao robot and get acquainted with it through some fun activities together. You will also put on your design caps and come up with innovative robot designs: tackle the challenge as a designer, computer whiz, scientist, writer, or just about any other role that tickles your fancy!

Nida, a PhD student at the AFAR Lab, with the Nao robot.