How robots are becoming more like us
Researchers are working on personalised voices for humanoid robots
How should the voice of a humanoid robot sound in order to be perceived as pleasant by humans? This question is the subject of a project at the Chair for Human-Centred Artificial Intelligence at the University of Augsburg. The project aims to find out whether a personalised robot voice, or more precisely a voice that resembles that of the user, increases perceived sympathy towards the robot.

They can laugh, speak, and help us. Interaction with so-called social robots is becoming ever more important for our society. Humanoids, robots that look and behave similarly to humans, could be used more extensively in future as assistance systems or in fields such as personal care and therapy. Alongside appearance, the humanoid's voice is an important factor for successful communication with people. At the Chair for Human-Centred Artificial Intelligence at the University of Augsburg, researchers are investigating how different voice variations influence the perception of humanoid robots. "What interests us is whether a humanoid robot whose voice resembles that of its counterpart is perceived as particularly sympathetic. The more pleasant, meaning likeable, the robot seems to its counterpart, the better the interaction between humans and robots," explains Johanna Kuch, a doctoral candidate who is leading the project "EchoSync" at the chair headed by Prof Elisabeth André.

Voice cloning for personalised voices

For her study, Kuch worked with a gender-ambiguous robot head that could speak with various synthetic voices. A total of 50 people each had three conversations with the robot, each with a different voice: one that matched the robot's appearance and design, one that matched neither the test person nor the robot, and one generated by voice cloning from speech samples provided by the participants. This personalised voice imitated vocal characteristics such as pitch and timbre, without the participants knowing before the conversation that their voice had been cloned. "With voice cloning we could personalise the voices and adapt their characteristics to the robot's counterpart," explains Kuch. "We wanted to find out whether individually tailored voices are perceived as more sympathetic."

In her experimental study, Kuch used a very realistic-looking humanoid robot head that could be read as either male or female. Fourteen individually programmable pneumatic drives allow its facial expressions and movements to be controlled precisely: the head can wink at its conversation partner, raise its eyebrows, and laugh. Participants had a short conversation with the robot about planning a meeting and, after every interaction, rated the robot's voice for sympathy, familiarity, and humanness.

Personalised voices increase sympathy

"Our preliminary results show that both design-congruent voices and the individually cloned voices of participants were perceived as more sympathetic than randomly selected voices, even though most users did not directly recognise their own voice," says Kuch. "Voice cloning could therefore be a very promising alternative to elaborately designed robot voices, particularly in one-to-one interactions between a human and a machine."
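The article does not say which software the project used to create the cloned voices. As a rough illustration of the underlying idea, synthesising speech that imitates the pitch and timbre of a short recording of the user, the sketch below uses the open-source Coqui TTS library with its XTTS voice-cloning model. The model name, file names, and prompt text are assumptions for illustration only, not details of the EchoSync setup.

# Minimal voice-cloning sketch (assumption: Coqui TTS / XTTS, not the EchoSync toolchain).
# Given a short speech sample from a participant, synthesise an utterance for the robot
# in a voice that imitates the participant's pitch and timbre.
from TTS.api import TTS

# Load a multilingual voice-cloning model (downloaded on first use).
tts = TTS("tts_models/multilingual/multi-dataset/xtts_v2")

# "participant_sample.wav" is a hypothetical recording collected from the user;
# the text loosely matches the study's meeting-planning scenario.
tts.tts_to_file(
    text="Shall we schedule our meeting for Tuesday afternoon?",
    speaker_wav="participant_sample.wav",
    language="en",
    file_path="personalised_robot_voice.wav",
)

The resulting audio file could then be played back by the robot during the conversation; in the study described above, such a personalised voice was one of three voice conditions each participant experienced.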
Scientific contact

Prof Elisabeth André
Email: andre@informatik.uni-augsburg.de

Johanna Kuch
Email: johanna.kuch@uni-a.de

Media contact

Email: corina.haerning@presse.uni-augsburg.de

