BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Talks.cam//talks.cam.ac.uk//
X-WR-CALNAME:Talks.cam
BEGIN:VEVENT
SUMMARY:Efficient Lattice Rescoring Using Recurrent Neural Network Languag
 e Models - Xunying (Andrew) Liu (University of Cambridge)
DTSTART:20140307T130000Z
DTEND:20140307T133000Z
UID:TALK51184@talks.cam.ac.uk
CONTACT:Rogier van Dalen
DESCRIPTION:Recurrent neural network language models (RNNLMs) have become a
 n increasingly popular choice for state-of-the-art speech recognition syst
 ems due to their inherently strong generalization performance.\nAs these m
 odels use a vector representation of complete history contexts\, RNNLMs ar
 e normally used to rescore N-best lists.\nMotivated by their intrinsic cha
 racteristics\, two novel lattice rescoring methods for RNNLMs are investig
 ated in this paper.\nThe first uses an n-gram style clustering of history 
 contexts. The second approach directly exploits the distance measure betwe
 en hidden history vectors.\nBoth methods produced 1-best performance compa
 rable with a 10k-best rescoring baseline RNNLM system on a large vocabular
 y conversational telephone speech recognition task.\nSignificant lattice s
 ize compression of over 70% and consistent improvements after confusion ne
 twork (CN) decoding were also obtained over the N-best rescoring approach.
LOCATION:Department of Engineering - LR6
END:VEVENT
END:VCALENDAR
