A novel bottleneck-BLSTM front-end for feature-level context modeling in conversational speech recognition

Publication: Contribution to book/report/conference proceedings › Conference paper › Peer-reviewed

9 citations (Scopus)

Abstract

We present a novel automatic speech recognition (ASR) front-end that unites Long Short-Term Memory context modeling, bidirectional speech processing, and bottleneck (BN) networks for enhanced Tandem speech feature generation. Bidirectional Long Short-Term Memory (BLSTM) networks were shown to be well suited for phoneme recognition and probabilistic feature extraction since they efficiently incorporate a flexible amount of long-range temporal context, leading to better ASR results than conventional recurrent networks or multi-layer perceptrons. Combining BLSTM modeling and bottleneck feature generation allows us to produce feature vectors of arbitrary size, independent of the network training targets. Experiments on the COSINE and the Buckeye corpora containing spontaneous, conversational speech show that the proposed BN-BLSTM front-end leads to better ASR accuracies than previously proposed BLSTM-based Tandem and multi-stream systems.
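The core idea of the front-end is that the BLSTM is trained on phoneme targets, but the Tandem features are read off a narrow hidden bottleneck layer, so the feature dimensionality can be chosen freely and is decoupled from the number of training targets. The following is a minimal illustrative sketch of that data flow in numpy, not the authors' implementation: the LSTM equations, random (untrained) weights, and all layer sizes here are stand-in assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def lstm_step(x, h, c, W, U, b):
    """One LSTM time step; gate order in z: input, forget, output, candidate."""
    n = h.shape[0]
    z = W @ x + U @ h + b              # (4n,) pre-activations
    i = 1 / (1 + np.exp(-z[:n]))       # input gate
    f = 1 / (1 + np.exp(-z[n:2*n]))    # forget gate
    o = 1 / (1 + np.exp(-z[2*n:3*n]))  # output gate
    g = np.tanh(z[3*n:])               # cell candidate
    c = f * c + i * g
    h = o * np.tanh(c)
    return h, c

def blstm_bottleneck_features(X, n_hidden=8, n_bottleneck=3, n_targets=40):
    """Map a (T, d) utterance to (T, n_bottleneck) Tandem features.

    n_bottleneck is picked freely; n_targets (phoneme classes) only
    shapes the training-time output layer, not the emitted features.
    Weights are randomly initialised stand-ins for trained parameters.
    """
    T, d = X.shape
    Wf = rng.normal(size=(4*n_hidden, d))          # forward-direction LSTM
    Uf = rng.normal(size=(4*n_hidden, n_hidden))
    Wb = rng.normal(size=(4*n_hidden, d))          # backward-direction LSTM
    Ub = rng.normal(size=(4*n_hidden, n_hidden))
    bf = bb = np.zeros(4*n_hidden)
    W_bn = rng.normal(size=(n_bottleneck, 2*n_hidden))  # narrow bottleneck layer
    W_out = rng.normal(size=(n_targets, n_bottleneck))  # phoneme output layer

    h, c = np.zeros(n_hidden), np.zeros(n_hidden)
    fwd = []
    for t in range(T):                 # left-to-right pass over the utterance
        h, c = lstm_step(X[t], h, c, Wf, Uf, bf)
        fwd.append(h)
    h, c = np.zeros(n_hidden), np.zeros(n_hidden)
    bwd = [None] * T
    for t in reversed(range(T)):       # right-to-left pass
        h, c = lstm_step(X[t], h, c, Wb, Ub, bb)
        bwd[t] = h

    H = np.stack([np.concatenate([f_, b_]) for f_, b_ in zip(fwd, bwd)])
    BN = np.tanh(H @ W_bn.T)           # bottleneck activations = Tandem features
    logits = BN @ W_out.T              # used only for the phoneme training targets
    return BN, logits

X = rng.normal(size=(50, 13))          # e.g. 50 frames of 13-dim cepstral features
feats, logits = blstm_bottleneck_features(X)
print(feats.shape, logits.shape)       # (50, 3) (50, 40)
```

Because every frame's bottleneck activation depends on the full forward and backward recurrences, each emitted feature vector reflects the entire utterance's temporal context, which is the property the abstract attributes to BLSTM modeling.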

Original language: English
Title of host publication: 2011 IEEE Workshop on Automatic Speech Recognition and Understanding, ASRU 2011, Proceedings
Pages: 36-41
Number of pages: 6
DOIs
Publication status: Published - 2011
Event: 2011 IEEE Workshop on Automatic Speech Recognition and Understanding, ASRU 2011 - Waikoloa, HI, United States
Duration: 11 Dec 2011 - 15 Dec 2011

Publication series

Name: 2011 IEEE Workshop on Automatic Speech Recognition and Understanding, ASRU 2011, Proceedings

Conference

Conference: 2011 IEEE Workshop on Automatic Speech Recognition and Understanding, ASRU 2011
Country/Territory: United States
City: Waikoloa, HI
Period: 11/12/11 - 15/12/11

