Publication Details
Parallel Training of Neural Networks for Speech Recognition
VESELÝ, K.; BURGET, L.; GRÉZL, F. Parallel Training of Neural Networks for Speech Recognition. Proc. Text, Speech and Dialogue 2010. Lecture Notes in Computer Science, LNAI 6231. Brno: Springer Verlag, 2010. p. 439-446. ISBN: 978-3-642-15759-2. ISSN: 0302-9743.
Czech title
Paralelní trénování neuronových sítí pro rozpoznávání řeči
Type
conference paper
Language
English
Authors
Veselý Karel, Ing., Ph.D. (DCGM)
Burget Lukáš, doc. Ing., Ph.D. (DCGM)
Grézl František, Ing., Ph.D. (DCGM)
URL
http://www.fit.vutbr.cz/research/groups/speech/publi/2010/vesely_tsd2010.pdf
Keywords
neural network, phoneme classification, posterior features, backpropagation training, data parallelization
Abstract
This paper deals with the parallel training of neural networks for speech recognition. A new parallel-training tool, TNet, was designed and optimized for multiprocessor computers. Training acceleration rates are reported on a phoneme-state classification task.
Annotation
Feed-forward multi-layer neural networks are of significant importance in speech recognition. A new parallel-training tool, TNet, was designed and optimized for multiprocessor computers. Training acceleration rates are reported on a phoneme-state classification task.
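
To illustrate the data-parallelization idea named in the keywords, the sketch below splits a minibatch gradient computation across threads and reduces the per-thread results before one update step. It is only a minimal illustration of that general technique, not TNet's actual implementation; all identifiers (PartialGradient, kThreads, the synthetic single-layer model and data) are hypothetical.

// Minimal sketch of data-parallel gradient computation for a single
// linear layer trained with squared-error loss. Illustrative only;
// not TNet's implementation.
#include <functional>
#include <iostream>
#include <numeric>
#include <random>
#include <thread>
#include <vector>

// Gradient of 0.5*(w.x - y)^2 w.r.t. w for one slice of the minibatch,
// accumulated into grad_out (one buffer per thread, so no locking).
void PartialGradient(const std::vector<std::vector<double>>& x,
                     const std::vector<double>& y,
                     const std::vector<double>& w,
                     size_t begin, size_t end,
                     std::vector<double>* grad_out) {
  for (size_t n = begin; n < end; ++n) {
    double pred = std::inner_product(w.begin(), w.end(), x[n].begin(), 0.0);
    double err = pred - y[n];
    for (size_t d = 0; d < w.size(); ++d) {
      (*grad_out)[d] += err * x[n][d];
    }
  }
}

int main() {
  const size_t kDim = 8, kBatch = 1024, kThreads = 4;
  std::mt19937 rng(0);
  std::uniform_real_distribution<double> dist(-1.0, 1.0);

  // Synthetic minibatch.
  std::vector<std::vector<double>> x(kBatch, std::vector<double>(kDim));
  std::vector<double> y(kBatch);
  for (auto& row : x) for (auto& v : row) v = dist(rng);
  for (auto& t : y) t = dist(rng);

  std::vector<double> w(kDim, 0.0);

  // Split the minibatch across threads; each worker fills its own buffer.
  std::vector<std::vector<double>> partial(kThreads,
                                           std::vector<double>(kDim, 0.0));
  std::vector<std::thread> workers;
  const size_t chunk = kBatch / kThreads;
  for (size_t t = 0; t < kThreads; ++t) {
    size_t begin = t * chunk;
    size_t end = (t + 1 == kThreads) ? kBatch : begin + chunk;
    workers.emplace_back(PartialGradient, std::cref(x), std::cref(y),
                         std::cref(w), begin, end, &partial[t]);
  }
  for (auto& th : workers) th.join();

  // Reduce the per-thread gradients and take one SGD step.
  const double kLearningRate = 0.01;
  for (size_t d = 0; d < kDim; ++d) {
    double g = 0.0;
    for (size_t t = 0; t < kThreads; ++t) g += partial[t][d];
    w[d] -= kLearningRate * g / kBatch;
  }

  std::cout << "w[0] after one parallel update: " << w[0] << std::endl;
  return 0;
}

Per-thread gradient buffers followed by a single reduction avoid contention during the backward pass; the same pattern generalizes to full multi-layer backpropagation on a multiprocessor machine.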
Published
2010
Pages
439–446
Journal
Lecture Notes in Computer Science, vol. 6231, ISSN 0302-9743
Proceedings
Proc. Text, Speech and Dialogue 2010
Series
LNAI 6231
ISBN
978-3-642-15759-2
Publisher
Springer Verlag
Place
Brno
BibTeX
@inproceedings{BUT35731,
author="Karel {Veselý} and Lukáš {Burget} and František {Grézl}",
title="Parallel Training of Neural Networks for Speech Recognition",
booktitle="Proc. Text, Speech and Dialogue 2010",
year="2010",
series="LNAI 6231",
journal="Lecture Notes in Computer Science",
volume="6231",
pages="439--446",
publisher="Springer Verlag",
address="Brno",
isbn="978-3-642-15759-2",
issn="0302-9743",
url="http://www.fit.vutbr.cz/research/groups/speech/publi/2010/vesely_tsd2010.pdf"
}