Stanford CS25: Transformers United V6 | Overview of Transformers
For more information about Stanford’s graduate programs, visit: https://online.stanford.edu/graduate-education
April 2, 2026
This seminar covers:
• Overview of the history of ML/NLP, Transformers, and how they work
• Recent trends, breakthroughs, applications, and current challenges
Follow along with the seminar schedule. Visit: https://web.stanford.edu/class/cs25/
Instructors:
• Steven Feng, Stanford Computer Science PhD student and NSERC PGS-D scholar
• Karan P. Singh, Electrical Engineering PhD student and NSF Graduate Research Fellow in the Stanford Translational AI Lab
• Michael C. Frank, Benjamin Scott Crocker Professor of Human Biology; Director, Symbolic Systems Program
• Christopher Manning, Thomas M. Siebel Professor in Machine Learning, Professor of Linguistics and of Computer Science, Co-Founder and Senior Fellow of the Stanford Institute for Human-Centered Artificial Intelligence (HAI)