Towards Empathetic Social Robots: Investigating the Interplay between Facial Expressions and Brain Activity

Battisti L.; Fagioli S.; Ferrato A.; Limongelli C.; Mastandrea S.; Mezzini M.; Nardo D.; Sansonetti G.
2024-01-01

Abstract

Creating empathetic social robots that can understand and respond to human emotions is a critical challenge in Robotics and Artificial Intelligence. Social robots, designed to interact with humans in settings ranging from healthcare to customer service, require a sophisticated understanding of human emotional states to truly resonate with users and assist them effectively. Our research contributes to this ambitious goal by exploring the relationship between natural facial expressions and brain activity, as captured by electroencephalogram (EEG) signals, during human-robot interactions. This paper presents our initial steps towards this goal. We aim to identify which areas of the participant's brain are most activated and how these activations correlate with facial expressions. Understanding these correlations is essential for developing social robots that recognize and empathize with a wide range of human emotions. Our approach combines neuroscience and computer science, offering a novel perspective on enhancing the emotional intelligence of social robots. We share preliminary results on a new multimodal dataset that we are developing, providing insight into the potential of our work to improve the personalization and emotional depth of social robot interactions.
2024
Battisti, L., Fagioli, S., Ferrato, A., Limongelli, C., Mastandrea, S., Mezzini, M., et al. (2024). Towards Empathetic Social Robots: Investigating the Interplay between Facial Expressions and Brain Activity. In Joint Proceedings of the ACM IUI 2024 Workshops co-located with the 29th ACM Conference on Intelligent User Interfaces (ACM IUI 2024). Aachen: CEUR-WS.
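As a rough, purely illustrative sketch of the kind of analysis the abstract describes (relating EEG activity to facial expressions), the Python snippet below correlates per-channel alpha-band power with a per-epoch facial-expression intensity score. The channel names, sampling rate, epoch length, and the expression_score series are hypothetical assumptions made for illustration; they are not taken from the paper or its dataset.

```python
# Hypothetical sketch: correlate per-channel EEG alpha-band power with a
# facial-expression intensity score. Channels, sampling rate, and the
# expression_score values are illustrative stand-ins, not the paper's data.
import numpy as np
from scipy.signal import welch
from scipy.stats import pearsonr

fs = 256                                            # assumed sampling rate (Hz)
channels = ["Fp1", "Fp2", "F3", "F4", "O1", "O2"]   # assumed montage subset
n_epochs, epoch_len = 100, fs * 2                   # assumed 2-second epochs

rng = np.random.default_rng(0)
eeg = rng.standard_normal((n_epochs, len(channels), epoch_len))  # stand-in for real EEG epochs
expression_score = rng.random(n_epochs)                          # stand-in for per-epoch expression intensity

def alpha_power(signal, fs):
    """Mean power in the 8-13 Hz (alpha) band for one channel epoch."""
    freqs, psd = welch(signal, fs=fs, nperseg=fs)
    band = (freqs >= 8) & (freqs <= 13)
    return psd[band].mean()

# Correlate each channel's alpha power with the expression score across epochs.
for ci, ch in enumerate(channels):
    powers = np.array([alpha_power(eeg[e, ci], fs) for e in range(n_epochs)])
    r, p = pearsonr(powers, expression_score)
    print(f"{ch}: r = {r:+.2f}, p = {p:.3f}")
```

In practice, the analysis could use any frequency band or expression feature; the point of the sketch is only the structure of the comparison (one statistic per channel, correlated against a behavioural measure across epochs).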
Files in this record:
Battisti24_Towards Empathetic Social Robots - Investigating the Interplay between Facial Expressions and Brain Activity.pdf
  • Access: open access
  • Type: published version (PDF)
  • License: Creative Commons
  • Size: 397.81 kB
  • Format: Adobe PDF

Use this identifier to cite or link to this document: https://hdl.handle.net/11590/473187
Citations
  • Scopus: 2