
Research seminar – Wang Ling, Friday, March 24, 11:00-11:45 (Alameda - 0.19, Pav Informática II) and 15:00-15:45 (Tagus park - 1.38)

20 March 2017, 13:40 - Ana Maria de Almeida Nogueira Marques

Title: Structured Neural Networks for Natural Language Processing

Speaker: Wang Ling, CMU/IST

Date and time: Friday, March 24, 11:00-11:45 (Alameda) and 15:00-15:45 (Tagus Park - 1.38)

Location: CSE meeting room (Informática II, Alameda), with videocast to the DSI room at Tagus, and Tagus Park room 1.38

Recent advances in deep learning have led to a new era in Natural Language Processing, where neural networks achieve state-of-the-art results on most mainstream tasks, such as machine translation, language modeling and parsing. In this talk, I will describe how structure plays an important role in the design of neural models for natural language processing tasks. First, I will describe a class of hierarchical models that employ a word composition model at the character level, in addition to a sentence composition component. By enabling morphological awareness, this class of models has shown remarkable improvements on a multitude of natural language processing tasks, such as language modeling, part-of-speech tagging and machine translation. Second, I will describe Latent Predictor Networks, a framework that allows the generation of sequences of tokens with multiple predictors at different levels of granularity. This model is applied to the task of generating programming code from natural language descriptions, where the model learns to generate programming language keywords at the character level and to copy strings and values from the natural language input. Third, I will describe a model that learns to solve high school math problems, where the model is required to understand a question in natural language and generate a natural language rationale describing the solution to the problem. I will conclude by discussing promising directions for future research.
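To give a flavor of the character-level word composition idea mentioned above, here is a minimal sketch, not the speaker's actual model: it builds a word vector from per-character vectors, so that unseen words still receive representations reflecting shared morphology. The toy "embeddings" and the position-weighted sum are illustrative stand-ins; real models learn character embeddings and compose them with a bidirectional LSTM.

```python
# Hedged sketch of character-level word composition (illustrative only;
# the actual models in the talk use learned embeddings and LSTMs).

def char_embedding(ch, dim=4):
    """Toy deterministic 'embedding': a fixed vector derived from the char code."""
    code = ord(ch)
    return [float((code >> i) & 1) for i in range(dim)]

def compose_word(word, dim=4):
    """Compose a word vector via a position-weighted sum of character vectors.

    The position weight is a crude stand-in for the order sensitivity that
    an LSTM composition would provide: 'pat' and 'tap' get different vectors.
    """
    vec = [0.0] * dim
    for pos, ch in enumerate(word, start=1):
        emb = char_embedding(ch, dim)
        vec = [v + pos * e for v, e in zip(vec, emb)]
    return vec

# Morphologically related words get related (but distinct) vectors,
# even if neither appeared in training data:
print(compose_word("cat"))
print(compose_word("cats"))  # differs from "cat" only by the suffix character
```

The key property this illustrates is open-vocabulary coverage: a word-level lookup table fails on out-of-vocabulary words, whereas a character-level composition function produces a vector for any string.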


Wang Ling is a research scientist at Google DeepMind. He received his dual-degree PhD in Language Technologies in 2015 from Carnegie Mellon University and the University of Lisbon. His research interests include Machine Translation, Natural Language Processing, Machine Learning and Deep Learning. He has published over 30 articles in top-tier conferences and journals (including Computational Linguistics, ACL and EMNLP).