
A Language Model Sensitive to Discourse Context

  • The paper proposes a meta language model that can dynamically incorporate the influence of wider discourse context. The model provides a conditional probability of the form P(text|context), where the context can be text of arbitrary length, and is used to influence the probability distribution over documents. A preliminary evaluation using a 3-gram model as the base language model shows significant reductions in perplexity when discourse context is incorporated.
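As an illustrative sketch only (not the authors' actual model), one common way to let discourse context influence a base n-gram model's distribution is to interpolate the base probability with a unigram cache estimated from the context; the interpolation weight `lam` and the smoothing fallback `1/vocab_size` are assumptions for this example:

```python
from collections import Counter

def unigram_probs(tokens):
    """Maximum-likelihood unigram estimates from the context tokens."""
    counts = Counter(tokens)
    total = sum(counts.values())
    return {w: c / total for w, c in counts.items()}

def contextual_prob(word, base_prob, context_tokens, lam=0.3, vocab_size=10000):
    """Sketch of P(word | context): interpolate the base LM probability
    (e.g. from a 3-gram model) with a cache distribution built from the
    wider discourse context. Hypothetical parameters, for illustration."""
    cache = unigram_probs(context_tokens)
    cache_p = cache.get(word, 1.0 / vocab_size)  # smoothed fallback for unseen words
    return (1 - lam) * base_prob + lam * cache_p

# Words frequent in the preceding discourse receive a boosted probability:
p = contextual_prob("the", base_prob=0.1,
                    context_tokens=["the", "cat", "sat", "the"])
```

Perplexity reductions of the kind reported in the paper arise because, over a whole document, the boosted probabilities of topically recurring words outweigh the small mass redistributed away from other words.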

Download full text files

  • Main Conference Proceedings of the 12th Konvens 2014


Metadata
Author:Tae-Gil Noh, Sebastian Padó
URN:https://nbn-resolving.org/urn:nbn:de:gbv:hil2-opus-2822
Parent Title (English):Proceedings of the 12th edition of the KONVENS conference
Document Type:Conference Proceeding
Language:English
Date of Publication (online):2014/10/23
Release Date:2014/10/23
Tags:Machine Learning; Statistical Methods
GND Keyword:Maschinelles Lernen (Machine Learning)
First Page:201
Last Page:206
PPN:Link to the catalogue
Institutes:Fachbereich III / Informationswissenschaft und Sprachtechnologie
DDC classes:400 Language / 400 Language, Linguistics
Collections:KONVENS 2014 / Proceedings of the 12th KONVENS 2014
Licence (German):Creative Commons Attribution (Namensnennung) 3.0