This research focuses on enhancing public speaking training by combining an emotional speech dataset with Virtual Reality (VR) technology. The study recognizes the importance of emotions in effective communication and highlights the potential of AI techniques for emotion detection. To address the limitations of existing datasets, a high-quality emotional speech dataset is developed in French and English, combining recordings by professional actors with samples generated by Text-to-Speech software. The dataset is then validated through subjective evaluation experiments. By leveraging VR technology, the research aims to create an immersive and interactive training environment in which participants practice speaking in front of virtual avatars that react authentically to the speaker's emotional state. This work advances public speaking training and provides a valuable resource for emotional speech analysis in applications such as VR, robotics, and virtual assistants.
Disciplines: Marketing; Quantitative methods in economics & management; Social & behavioral sciences, psychology: Multidisciplinary, general & others
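The abstract refers to AI techniques for detecting emotion from speech. As a minimal illustrative sketch only (not the authors' method), the example below assumes mean-pooled MFCC features and an SVM classifier; the audio, labels, and parameters are hypothetical placeholders.

```python
# Illustrative speech emotion classification sketch (assumed pipeline,
# not the one used in this research): MFCC features + SVM classifier.
import numpy as np
import librosa
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

SR = 16000  # assumed sampling rate

def mfcc_features(waveform: np.ndarray, sr: int = SR) -> np.ndarray:
    """Mean-pooled MFCCs as a fixed-length utterance descriptor."""
    mfcc = librosa.feature.mfcc(y=waveform, sr=sr, n_mfcc=13)
    return mfcc.mean(axis=1)

# Toy stand-in data: random waveforms with hypothetical emotion labels.
rng = np.random.default_rng(0)
waveforms = [rng.standard_normal(SR) for _ in range(8)]
labels = ["neutral", "happy", "angry", "sad"] * 2

X = np.stack([mfcc_features(w) for w in waveforms])
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
clf.fit(X, labels)
print(clf.predict(X[:2]))
```

In practice, the utterance-level labels would come from a validated emotional speech dataset such as the one described above, and the classifier's output could drive the reactions of the virtual avatars.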