+DS IPLE: Natural Language Processing with Attention-based Neural Networks

Date: Thursday, November 14, 2019
Time: 4:30 pm - 6:30 pm
Speaker: Larry Carin
Series: +DS In-Person Learning Experiences

There has been a recent surge in the quality of natural language processing technology, and much of it has been driven by a new class of neural networks based on the concept of "attention." Attention networks localize (pay attention to) a portion of the input text when performing a task. For example, in translation, as the network produces text in Language B it adaptively focuses its attention on the appropriate subset of the input text in Language A. In this session we will discuss how attention networks are constructed, and we will use this insight to describe the recently developed Transformer Network, which is based entirely on the concept of attention. The Transformer Network has achieved state-of-the-art performance in many areas, such as translation, summarization, and other text-synthesis tasks.
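
As an illustration of the idea described above, the following is a minimal sketch of scaled dot-product attention (the core operation inside attention networks and the Transformer), written in Python with NumPy. The function name, array shapes, and toy data are illustrative assumptions, not material from the session.

import numpy as np

def scaled_dot_product_attention(queries, keys, values):
    """queries: (n_q, d); keys, values: (n_k, d) -> (n_q, d) context vectors."""
    d = queries.shape[-1]
    # Similarity of each query to every key, scaled to keep values well-behaved.
    scores = queries @ keys.T / np.sqrt(d)
    # Softmax over the keys: the "attention" each query pays to each input position.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    # Each output is a weighted average of the values at the attended positions.
    return weights @ values

# Toy example: 2 output positions attending over 4 input positions.
rng = np.random.default_rng(0)
q = rng.normal(size=(2, 8))
k = rng.normal(size=(4, 8))
v = rng.normal(size=(4, 8))
print(scaled_dot_product_attention(q, k, v).shape)  # (2, 8)

In a translation setting, the queries would come from the Language B text being generated and the keys and values from the encoded Language A text, so each output token is formed as a weighted average over the input positions it attends to.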

This session is part of the Duke+Data Science (+DS) program's in-person learning experiences (IPLEs). To learn more, please visit https://plus.datascience.duke.edu/