+DS IPLE: Natural Language Processing with Attention-based Neural Networks

Thursday, November 14, 2019
4:30 pm - 6:30 pm
Larry Carin
+DS In-Person Learning Experiences

There has been a recent surge in the quality of natural language processing technology, and much of it has been driven by a new class of neural networks based on the concept of "attention." Attention networks localize (pay attention to) a portion of the input text when performing a task. For example, in translation from Language A to Language B, the network adaptively focuses its attention on the relevant subset of the input text in Language A as it produces each piece of output text in Language B. In this session we will discuss how attention networks are constructed, and we will use this insight to describe the recently developed Transformer Network, which is based entirely on the concept of attention. The Transformer Network has achieved state-of-the-art performance in many areas, including translation, summarization, and other text-synthesis tasks.
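For those who would like a concrete picture of the idea before the session, the core operation behind attention can be sketched in a few lines of NumPy. This is a generic illustration of scaled dot-product attention, the building block of the Transformer, not material from the session itself; the function name and array shapes are chosen purely for illustration.

```python
import numpy as np

def scaled_dot_product_attention(queries, keys, values):
    """Illustrative sketch of attention.

    queries: (num_queries, d_k)
    keys:    (num_keys, d_k)
    values:  (num_keys, d_v)
    Returns the attention-weighted output and the weight matrix.
    """
    d_k = queries.shape[-1]
    # Similarity of each query to each key, scaled to keep the softmax stable.
    scores = queries @ keys.T / np.sqrt(d_k)
    # Softmax over the keys: each row says how much "attention" a query
    # pays to each position of the input.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    # The output is an attention-weighted average of the values.
    return weights @ values, weights

# Toy usage: 3 output positions attending over 5 input positions.
rng = np.random.default_rng(0)
q = rng.normal(size=(3, 8))
k = rng.normal(size=(5, 8))
v = rng.normal(size=(5, 4))
output, attn = scaled_dot_product_attention(q, k, v)
print(output.shape, attn.shape)  # (3, 4) (3, 5)
```

In the translation example above, each row of the weight matrix would indicate which words of the Language A input the network is focusing on while generating a particular word of the Language B output.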

This session is part of the Duke+Data Science (+DS) program in-person learning experiences (IPLEs). To learn more, please visit https://plus.datascience.duke.edu/