Publication Details

i-vectors in language modeling: An efficient way of domain adaptation for feed-forward models

BENEŠ, K.; KESIRAJU, S.; BURGET, L. i-vectors in language modeling: An efficient way of domain adaptation for feed-forward models. In Proceedings of Interspeech 2018. Proceedings of Interspeech. Hyderabad: International Speech Communication Association, 2018. p. 3383–3387. ISSN: 1990-9772.
Czech title
i-vektory pro jazykové modelování: efektivní způsob doménové adaptace s dopřednými modely
Type
conference paper
Language
English
Authors
Karel Beneš, Santosh Kesiraju, Lukáš Burget
URL
https://www.isca-speech.org/archive/Interspeech_2018/abstracts/1070.html
Keywords

language modeling, feed-forward models, subspace multinomial model, domain adaptation

Abstract

We show an effective way of adding context information to shallow neural language models. We propose to use the Subspace Multinomial Model (SMM) for context modeling, and we add the extracted i-vectors in a computationally efficient way. By adding this information, we shrink the gap between a shallow feed-forward network and an LSTM from 65 to 31 points of perplexity on the Wikitext-2 corpus (in the case of a neural 5-gram model). Furthermore, we show that SMM i-vectors are suitable for domain adaptation and that a very small amount of adaptation data (e.g., the last 5% of a Wikipedia article) brings a substantial improvement. Our proposed changes are compatible with most optimization techniques used for shallow feed-forward LMs.
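
The idea of conditioning a shallow feed-forward n-gram LM on a fixed-length context vector can be illustrated with the following minimal PyTorch sketch. This is not the authors' implementation; the layer sizes, the i-vector dimension, and the way the i-vector is concatenated with the history embeddings are illustrative assumptions, not values taken from the paper.

# Minimal sketch (assumed architecture, not the paper's code): a feed-forward
# 5-gram LM whose hidden layer also sees a per-document context vector
# ("i-vector"), e.g. one extracted with an SMM trained on the same corpus.
import torch
import torch.nn as nn

class IVectorFFLM(nn.Module):
    def __init__(self, vocab_size, order=5, emb_dim=100, ivec_dim=50, hidden_dim=200):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, emb_dim)
        # hidden layer input: concatenated history embeddings plus the i-vector
        self.hidden = nn.Linear((order - 1) * emb_dim + ivec_dim, hidden_dim)
        self.output = nn.Linear(hidden_dim, vocab_size)

    def forward(self, history, ivector):
        # history: (batch, order-1) word indices; ivector: (batch, ivec_dim)
        emb = self.embedding(history).flatten(start_dim=1)   # (batch, (order-1)*emb_dim)
        h = torch.tanh(self.hidden(torch.cat([emb, ivector], dim=1)))
        return self.output(h)                                 # unnormalized next-word scores

# Illustrative usage with random data (hypothetical vocabulary and i-vector size).
model = IVectorFFLM(vocab_size=10000)
history = torch.randint(0, 10000, (8, 4))   # 4-word histories for a 5-gram model
ivector = torch.randn(8, 50)                # per-document context vectors
logits = model(history, ivector)            # shape (8, 10000)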

Published
2018
Pages
3383–3387
Journal
Proceedings of Interspeech, vol. 2018, no. 9, ISSN 1990-9772
Proceedings
Proceedings of Interspeech 2018
Publisher
International Speech Communication Association
Place
Hyderabad
DOI
10.21437/Interspeech.2018-1070
UT WoS
000465363900706
EID Scopus
BibTeX
@inproceedings{BUT155102,
  author="Karel {Beneš} and Santosh {Kesiraju} and Lukáš {Burget}",
  title="i-vectors in language modeling: An efficient way of domain adaptation for feed-forward models",
  booktitle="Proceedings of Interspeech 2018",
  year="2018",
  journal="Proceedings of Interspeech",
  volume="2018",
  number="9",
  pages="3383--3387",
  publisher="International Speech Communication Association",
  address="Hyderabad",
  doi="10.21437/Interspeech.2018-1070",
  issn="1990-9772",
  url="https://www.isca-speech.org/archive/Interspeech_2018/abstracts/1070.html"
}