+DS IPLE: Attention Networks for Natural Language Processing

Thursday, March 26, 2020
4:30 pm - 6:30 pm
Lawrence Carin
+DS In-Person Learning Experiences

Neural-network-based methods for natural language processing (NLP) constitute an area of significant recent technical progress, with many interesting real-world applications. The Transformer network is one of the newest and most powerful approaches of this type. Its architecture is built on the repeated application of attention networks within an encoder-decoder framework. In this presentation, the basics of all-attention models (the Transformer) for NLP will be described, with applications in areas such as text synthesis (e.g., suggesting email text) and language translation. This session is part of the Duke+Data Science (+DS) program's in-person learning experiences (IPLEs). To learn more, please visit https://plus.datascience.duke.edu/
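
For attendees who would like a concrete picture before the session, the following is a minimal, illustrative sketch of the scaled dot-product attention operation at the heart of the Transformer, written in Python with NumPy. It is not material from the presentation itself; the function and variable names are illustrative assumptions.

import numpy as np

def scaled_dot_product_attention(Q, K, V):
    # Q, K: arrays of shape (seq_len, d_k); V: (seq_len, d_v).
    # Each output row is a weighted average of the rows of V, with weights
    # given by the softmaxed, scaled similarity between queries and keys.
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                 # query-key similarity scores
    scores -= scores.max(axis=-1, keepdims=True)    # subtract max for numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax over the key dimension
    return weights @ V

# Toy self-attention example: a "sentence" of 4 token embeddings of size 8
rng = np.random.default_rng(0)
x = rng.normal(size=(4, 8))
out = scaled_dot_product_attention(x, x, x)         # self-attention: Q = K = V = x
print(out.shape)                                    # (4, 8)

In the full Transformer, this operation is applied many times in parallel (multi-head attention) and stacked in both the encoder and the decoder, which is the repeated application of attention networks referred to above.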