Researchers at Facebook's AI lab have developed a bot: an animated face driven by a machine-learning algorithm. It was trained on hundreds of recordings of Skype video calls, from which it learned to recognize and reproduce the facial expressions people use in conversation.
To make learning tractable, the algorithm represents the human face as 68 key points that it tracks throughout each video call, New Scientist reports. People typically nod, blink and make similar small gestures to show they are engaged with the other person, and in the end the system learned these cues as well.
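The article does not say which tools Facebook used, but tracking 68 facial key points per frame is a standard computer-vision step. The sketch below is illustrative only, using dlib's pretrained 68-landmark shape predictor; the model file and input video path are assumptions.

```python
# Illustrative sketch: extract the standard 68 facial landmarks per video frame.
# The predictor model file and the input video are assumed, not from the article.
import cv2
import dlib

detector = dlib.get_frontal_face_detector()
predictor = dlib.shape_predictor("shape_predictor_68_face_landmarks.dat")

def landmarks_for_frame(frame):
    """Return a list of 68 (x, y) key points for the first detected face, or None."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = detector(gray)
    if not faces:
        return None
    shape = predictor(gray, faces[0])  # fit landmarks to the first detected face
    return [(shape.part(i).x, shape.part(i).y) for i in range(68)]

cap = cv2.VideoCapture("call_recording.mp4")  # hypothetical recorded call
while True:
    ok, frame = cap.read()
    if not ok:
        break
    points = landmarks_for_frame(frame)
    # ...the per-frame landmark trajectory is what a learning system would consume...
cap.release()
```

Tracking only these 68 points, rather than raw pixels, gives the learning system a compact description of nods, blinks and mouth movements over time.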
The bot then showed that, while watching a video of a person talking, it can choose in real time the facial reaction that fits best. If the person laughed, for example, the bot might open its mouth or tilt its head back.
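The article does not describe how the reaction is actually chosen; Facebook's system learns it from data. As a heavily simplified stand-in, the toy sketch below picks a reaction by nearest-neighbour lookup over recorded call snippets, pairing a speaker's landmark trajectory with the listener's reaction that followed it.

```python
# Toy stand-in for the learned model (the real method is not described in the article):
# return the recorded listener reaction whose paired speaker trajectory is closest
# to what the bot is currently observing.
import numpy as np

def choose_reaction(observed, examples):
    """observed: array of shape (frames, 68, 2) with the partner's landmark trajectory.
    examples: list of (speaker_trajectory, listener_reaction) pairs from recorded calls.
    Returns the listener reaction paired with the most similar speaker trajectory."""
    best_reaction, best_dist = None, float("inf")
    for speaker_traj, listener_reaction in examples:
        dist = np.linalg.norm(observed - speaker_traj)  # simple Euclidean distance
        if dist < best_dist:
            best_reaction, best_dist = listener_reaction, dist
    return best_reaction
```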
After that, the Facebook researchers tested the system on groups of people. Subjects watched animations showing either a bot interacting with a person or two people interacting with each other. The volunteers rated the bot's behaviour as just as natural and realistic as the humans'.
The scientists note, however, that facial expressions are not the whole story: knowing them is not enough to sustain full-fledged communication, because the thoughts and feelings behind those expressions remain hidden.
Facebook will present the study at the International Conference on Intelligent Robots and Systems, to be held this month in Vancouver, Canada.