Han, H.; Karadeniz, O.; Sönmez, E. B.; Dalyan, T.; Sanoğlu, B.

Title: Facial Expression Recognition in the Wild with Application in Robotics
Type: Conference Object
Conference: 6th International Conference on Computer Science and Engineering (UBMK 2021), 15-17 September 2021 (conference code 176826)
Date issued: 2021
Record date: 2024-07-18
ISBN: 9781665429085
DOI: https://doi.org/10.1109/UBMK52708.2021.9558909
Handle: https://hdl.handle.net/11411/6439
Scopus EID: 2-s2.0-85125853804
Pages: 565-569
Language: English
Access: info:eu-repo/semantics/closedAccess

Abstract: One of the major problems with robot companions is their lack of credibility. Since emotions play a key role in human behaviour, their implementation in virtual agents is a conditio sine qua non for realistic models. In particular, correct classification of facial expressions in the wild is a necessary preprocessing step for implementing artificial empathy. The aim of this work is to implement a robust Facial Expression Recognition (FER) module in a robot. Based on an empirical comparison of the most successful deep learning algorithms used for FER, this study establishes a state-of-the-art performance of 75% on the FER2013 database with an ensemble method. With a single model, the best performance of 70.8% was reached using the VGG16 architecture. Finally, the VGG16-based FER module was implemented in a robot and reached a performance of 70% when tested with wild expressive faces. © 2021 IEEE

Keywords: Deep Learning; Facial Expressions Classification; Virtual Humans; Behavioral Research; E-Learning; Face Recognition; Reinforcement Learning; Robotics; Robots; Virtual Reality; Facial Expression Recognition; Facial Expressions; Facial Expressions Classifications; Human Behaviors; Performance; Realistic Model; Robot Companion; Virtual Agent; Learning Algorithms
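
Note (illustrative sketch, not the authors' code): the abstract reports a single VGG16-based model reaching 70.8% on FER2013. The minimal Python/PyTorch sketch below shows one common way a VGG16 backbone can be adapted to the seven FER2013 emotion classes. The use of ImageNet-pretrained weights, the replaced classifier head, and the preprocessing of FER2013's 48x48 grayscale images (channel replication, resizing to 224x224) are assumptions, since the abstract does not state these details.

    # Minimal sketch (assumptions noted above), using torchvision's VGG16.
    import torch
    import torch.nn as nn
    from torchvision import models, transforms

    NUM_CLASSES = 7  # FER2013: angry, disgust, fear, happy, sad, surprise, neutral

    def build_fer_vgg16(pretrained: bool = True) -> nn.Module:
        # Requires torchvision >= 0.13 for the weights enum.
        weights = models.VGG16_Weights.DEFAULT if pretrained else None
        model = models.vgg16(weights=weights)
        # Replace the final 1000-way ImageNet layer with a 7-way emotion head.
        model.classifier[6] = nn.Linear(model.classifier[6].in_features, NUM_CLASSES)
        return model

    # Assumed preprocessing: FER2013 images are 48x48 grayscale, while VGG16
    # expects 3-channel input, so the single channel is replicated and resized.
    preprocess = transforms.Compose([
        transforms.Grayscale(num_output_channels=3),
        transforms.Resize((224, 224)),
        transforms.ToTensor(),
    ])

    if __name__ == "__main__":
        model = build_fer_vgg16()
        dummy = torch.randn(1, 3, 224, 224)  # stand-in for a preprocessed face crop
        logits = model(dummy)
        print(logits.shape)  # torch.Size([1, 7])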