An Introduction to Transformers
In this teaching lecture, I will give a high-level introduction to the popular transformer model architecture. To deepen understanding of how transformers work, I take a deep dive into applying Bidirectional Encoder Representations from Transformers (BERT) to electronic health records. Specifically, I introduce my work on Ped-BERT, which accurately predicts the most likely diagnosis codes for a pediatric patient’s next medical visit based on location, age, and history of previous diagnosis codes.
Cornelia Ilin is a lecturer in data science at UC Berkeley’s School of Information, where she teaches the MIDS Applied Machine Learning and Capstone courses. Her research sits at the intersection of health and the environment, employing a range of geospatial, causal inference, and machine learning (ML) methods. Before joining the School of Information, she was a research scientist at Stanford University and a postdoctoral fellow in the Data-Intensive Development Lab at UC Berkeley. She received her doctorate in applied economics from the University of Wisconsin–Madison.