Accessible Foundation Models: Systems, Algorithms, and Science
Lunch will be served at 11:45 AM.
The ever-increasing scale of foundation models, such as ChatGPT and AlphaFold, has revolutionized AI and science more generally. However, increasing scale also steadily raises computational barriers, blocking almost everyone from studying, adapting, or otherwise using these models for anything beyond static API queries. In this talk, I will present research that significantly lowers these barriers across a wide range of use cases: inference algorithms used to make predictions after training, fine-tuning approaches that adapt a trained model to new data, and full training of foundation models from scratch.
Tim Dettmers' research focuses on making foundation models, such as ChatGPT, accessible to researchers and practitioners by reducing their resource requirements. He is a PhD candidate at the University of Washington and has won oral, spotlight, and best-paper awards at conferences such as ICLR and NeurIPS. He created the bitsandbytes library for efficient deep learning, which is growing at 1.4 million installations per month and has received Google Open Source and PyTorch Foundation awards.