Automatic Nonverbal Behavior Generation from Image Schemas

Brian Ravenet (1, 2), Chloé Clavel (3, 2), Catherine Pelachaud (4, 1, 2)
1. MM - Multimédia, LTCI - Laboratoire Traitement et Communication de l'Information
3. S2A - Signal, Statistique et Apprentissage, LTCI - Laboratoire Traitement et Communication de l'Information
Abstract

One of the main challenges in developing Embodied Conversational Agents is giving them the ability to autonomously produce meaningful and coordinated verbal and nonverbal behaviors. The relation between these means of communication is more complex than the direct mapping often applied in previous models. In this paper, we propose an intermediate-mapping approach that we first apply to metaphoric gestures but that could be extended to other representational gestures. Building on previous work in text analysis, embodied cognition and co-verbal behavior production, we introduce a framework that articulates speech and metaphoric gesture invariants around a common mental representation: Image Schemas. We describe the components of our framework, detailing the steps leading to the production of metaphoric gestures, and we present preliminary results and demonstrations. We conclude by outlining perspectives for integrating, evaluating and improving our model.
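The core idea of the framework, a two-step mapping from speech to Image Schemas and from Image Schemas to gesture invariants, can be sketched as follows. This is a minimal illustration under assumptions, not the authors' implementation: the lexicon WORD_TO_SCHEMA, the schema labels, and the gesture parameters are hypothetical placeholders chosen for the example.

```python
from dataclasses import dataclass

# Hypothetical lexicon linking words to Image Schemas (Johnson-style labels).
# The real framework derives schemas from text analysis, not a flat lookup.
WORD_TO_SCHEMA = {
    "inside": "CONTAINER",
    "through": "PATH",
    "grow": "SCALE",
    "heavy": "WEIGHT",
}

# Illustrative gesture invariants associated with an Image Schema.
@dataclass
class GestureInvariant:
    hand_shape: str
    movement: str

SCHEMA_TO_GESTURE = {
    "CONTAINER": GestureInvariant("cupped", "static hold"),
    "PATH": GestureInvariant("pointing", "lateral trajectory"),
    "SCALE": GestureInvariant("flat", "upward stroke"),
    "WEIGHT": GestureInvariant("fist", "downward stroke"),
}

def gestures_for(utterance: str) -> list[tuple[str, GestureInvariant]]:
    """Return (word, gesture) pairs for words that evoke an Image Schema."""
    pairs = []
    for word in utterance.lower().split():
        schema = WORD_TO_SCHEMA.get(word.strip(".,!?"))
        if schema:
            pairs.append((word, SCHEMA_TO_GESTURE[schema]))
    return pairs

if __name__ == "__main__":
    for word, gesture in gestures_for("We move through the problem inside the model"):
        print(word, "->", gesture)
```

The point of the intermediate layer is that neither mapping depends on the other: the lexical analysis can be refined without touching the gesture repertoire, and new gesture invariants can be attached to an existing schema without reanalyzing the text.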

Metadata

https://hal.telecom-paristech.fr/hal-02287759
Submitted on: Friday, September 13, 2019 - 5:18:57 PM
Last modification on: Sunday, September 15, 2019 - 1:23:16 AM

Identifiers

  • HAL Id: hal-02287759, version 1

Citation

Brian Ravenet, Chloé Clavel, Catherine Pelachaud. Automatic Nonverbal Behavior Generation from Image Schemas. International Conference on Autonomous Agents and Multiagent Systems, Jul 2018, Stockholm, Sweden. ⟨hal-02287759⟩
