Title: EMRES: A New EMotional RESpondent Robot
Authors: Sonmez, Elena Battini; Han, Hasan; Karadeniz, Oguzcan; Dalyan, Tugba; Sarioglu, Baykal
Date of issue: 2022
Date available: 2024-07-18
ISSN: 2379-8920; 2379-8939
DOI: https://doi.org/10.1109/TCDS.2021.3120562
Handle: https://hdl.handle.net/11411/7789
Scopus ID: 2-s2.0-85117791138
WoS ID: WOS:000809402600050
Volume: 14
Issue: 2
Pages: 772-780
Journal quartiles: Q1; Q2
Type: Article
Language: English
Access rights: info:eu-repo/semantics/closedAccess
Keywords: Robots; Computational Modeling; Face Recognition; Mood; Faces; Synchronization; Three-Dimensional Displays; Computational Affective Models; Deep Learning; Developmental Robotics; Facial Expressions; Virtual Human; Core Affect

Abstract: The aim of this work is to design an artificial empathetic system and to implement it in an EMotional RESpondent robot, called EMRES. Rather than mimic the expression detected in the human partner, the proposed system achieves a coherent and consistent emotional trajectory, resulting in a more credible human-agent interaction. Inspired by developmental robotics theory, EMRES has an internal state and a mood, which contribute to the evolution of the flow of emotions; at every episode, the next emotional state of the agent is affected by its internal state, mood, current emotion, and the expression read in the human partner. As a result, EMRES does not imitate, but synchronizes with the emotion expressed by its human companion. The agent was trained to recognize expressive faces from the FER2013 database and achieves 78.3% performance on in-the-wild images. Our first prototype was implemented in a robot built for this purpose. An empirical study conducted with university students judged the newly proposed artificial empathetic system positively.
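
Note: the episode-level update described in the abstract (next emotional state as a function of internal state, mood, current emotion, and the expression detected in the human partner) can be illustrated with a toy sketch. Everything below is an assumption for illustration only: the valence encoding, the class and field names, and the blending weights are hypothetical and are not taken from the paper, so this is not EMRES's actual affective model.

```python
# Illustrative toy sketch only: a hypothetical episode-level emotion update that
# blends internal state, mood, current emotion, and the expression detected in
# the human partner, rather than copying the human's expression outright.
from dataclasses import dataclass

# Hypothetical mapping of discrete expressions to a scalar valence in [-1, 1].
VALENCE = {"angry": -0.8, "sad": -0.6, "neutral": 0.0, "surprise": 0.3, "happy": 0.8}

@dataclass
class AgentAffect:
    internal_state: float   # slowly varying drive, in [-1, 1] (assumed)
    mood: float             # medium-term affect, in [-1, 1] (assumed)
    emotion: float          # current emotional state, in [-1, 1] (assumed)

    def step(self, human_expression: str) -> float:
        """One episode: blend the four influences instead of imitating the human."""
        target = VALENCE.get(human_expression, 0.0)
        # Hypothetical weights: the agent synchronizes toward the human's affect
        # while its own internal state, mood, and current emotion keep the
        # emotional trajectory coherent rather than purely imitative.
        self.emotion = (0.4 * self.emotion
                        + 0.2 * self.mood
                        + 0.1 * self.internal_state
                        + 0.3 * target)
        # Mood drifts slowly toward the current emotion (assumed dynamics).
        self.mood = 0.9 * self.mood + 0.1 * self.emotion
        return self.emotion

# Example: the agent's emotion moves toward, but does not mirror, the human's.
agent = AgentAffect(internal_state=0.1, mood=0.0, emotion=0.0)
for expr in ["happy", "happy", "sad"]:
    print(expr, round(agent.step(expr), 3))
```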