Learning Dialogue Dynamics with the Method of Moments
Document type:
Conference paper with proceedings
Title:
Learning Dialogue Dynamics with the Method of Moments
Author(s):
Barlier, Merwan [Author]
Sequential Learning [SEQUEL]
Orange Labs [Issy les Moulineaux]
Centre de Recherche en Informatique, Signal et Automatique de Lille - UMR 9189 [CRIStAL]
Laroche, Romain [Author]
Orange Labs [Issy les Moulineaux]
Pietquin, Olivier [Author]
Université de Lille, Sciences et Technologies
Institut universitaire de France [IUF]
Sequential Learning [SEQUEL]
Centre de Recherche en Informatique, Signal et Automatique de Lille - UMR 9189 [CRIStAL]
Conference title:
Workshop on Spoken Language Technology (SLT 2016)
City:
San Diego
Country:
United States of America
Conference start date:
2016-12-13
HAL discipline(s):
Computer Science [cs]/Machine Learning [cs.LG]
Computer Science [cs]/Human-Computer Interaction [cs.HC]
English abstract: [en]
In this paper, we introduce a novel framework to encode the dynamics of dialogues into a probabilistic graphical model. Traditionally, Hidden Markov Models (HMMs) would be used to address this problem, involving a first hand-crafting step to build a dialogue model (e.g. defining potential hidden states), followed by expectation-maximisation (EM) algorithms to refine it. Recently, an alternative class of algorithms based on the Method of Moments (MoM) has proven successful in avoiding issues of EM-like algorithms such as convergence towards local optima, intractability, sensitivity to initialization and the lack of theoretical guarantees. In this work, we show that dialogues may be modeled by SP-RFA, a class of graphical models efficiently learnable within the MoM framework and directly usable in planning algorithms (such as reinforcement learning). Experiments are conducted on the Ubuntu corpus, where dialogues are considered as sequences of dialogue acts represented with Latent Dirichlet Allocation (LDA) and Latent Semantic Analysis (LSA). We show that a MoM-based algorithm can learn a compact model of sequences of such acts.
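Illustration (not part of the record): the moment-based learning mentioned in the abstract can be sketched generically. The snippet below is a minimal, hypothetical example of estimating low-order observation moments and extracting a low-rank subspace from integer-encoded dialogue acts; it is not the paper's SP-RFA learner, and the function name and toy data are invented for illustration only.

import numpy as np

def spectral_moments(sequences, n_symbols, rank):
    # Generic Method-of-Moments sketch for sequence data:
    # estimate empirical unigram and consecutive-pair (bigram) moments,
    # then extract a low-rank subspace by SVD, as spectral algorithms do.
    # NOTE: illustrative only, not the SP-RFA algorithm of the paper.
    P1 = np.zeros(n_symbols)                 # first-order moment
    P21 = np.zeros((n_symbols, n_symbols))   # P21[y, x] ~ Pr(next=y, current=x)
    for seq in sequences:
        for t, x in enumerate(seq):
            P1[x] += 1.0
            if t + 1 < len(seq):
                P21[seq[t + 1], x] += 1.0
    P1 /= P1.sum()
    P21 /= P21.sum()
    # Left singular vectors give a rank-limited projection of the bigram moment.
    U, _, _ = np.linalg.svd(P21)
    return P1, P21, U[:, :rank]

# Toy usage with made-up dialogue-act ids.
toy = [[0, 1, 2, 1, 0], [2, 1, 0, 0, 1], [1, 1, 2, 0]]
P1, P21, U = spectral_moments(toy, n_symbols=3, rank=2)
print(P1, U.shape)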
Language:
English
Peer reviewed:
Yes
Audience:
International
Popular science:
No
Collections:
Source:
Files
- https://hal.inria.fr/hal-01406904/document
- Open access
- Access the document
- SLT_2016_MBRLOP.pdf
- Open access
- Access the document