BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Talks.cam//talks.cam.ac.uk//
X-WR-CALNAME:Talks.cam
BEGIN:VEVENT
SUMMARY:A trip down long short-term memory lane - Petar Veličković (Univ
 ersity of Cambridge)
DTSTART:20170221T130000Z
DTEND:20170221T140000Z
UID:TALK70906@talks.cam.ac.uk
CONTACT:Mariana Marasoiu
DESCRIPTION:Extending neural networks to handle sequential inputs of arbit
 rary\nlengths (such as time series\, statements in natural language\, or\n
 mathematical expressions to evaluate) is hindered by the fact that\nstand
 ard fully-connected or convolutional neural networks typically\nrequire t
 he input to be fixed in size. To alleviate the issue\,\nrecurrent
  neural networks introduce a learnable unit of computation\,\ncapable of p
 rocessing inputs step-by-step in a length-independent\nmanner. In fact\, i
 f feedforward neural networks are concerned with\nlearning desirable _fun
 ctions_ that consume the input\, recurrent neural\nnetworks may be seen as
  learning desirable _programs_ that process the\ninput.\n\nIn this lecture
 \, I will introduce recurrent neural networks from first\nprinciples\, and
  illustrate the many issues that arise with applying them\nnaïvely. As a 
 popular solution to those issues\, the long short-term\nmemory (LSTM) cell
  will be introduced\, with a detailed intuitive and\ntheoretical descripti
 on of its mode of operation. This will be\nfollowed up with strategies for
  tackling several kinds of sequential\nproblems using LSTMs\, and a survey
  of real-world applications of this\nmodel. Finally\, a recently proposed 
 non-recurrent model that shows\npromising results on sequential tasks will
  be outlined\, to provide\nperspective for potential future trends in the 
 field. No prior knowledge\nof neural networks or machine learning is neede
 d\, but an entry-level\nknowledge of supervised learning principles will b
 e beneficial. Code\nexamples that demonstrate how recurrent models can be 
 constructed in\nonly a few lines of Python will also be provided.
LOCATION:Computer Laboratory\, William Gates Building\, Room SW01
END:VEVENT
END:VCALENDAR
