LoRA: Low-Rank Adaptation explained || A paper from Microsoft that made LLMs more efficient.

Paper in a Pod · Beginner · 📄 Research Papers Explained · 1y ago
Hi, today we are reviewing the paper LoRA: Low-Rank Adaptation, a matrix decomposition technique for fine-tuning LLMs. Link to the paper: https://arxiv.org/pdf/2106.09685

Do listen at 2x speed to get the most out of the video in the shortest time. I would also recommend diving deeper into the mathematical details.

Some more resources: explained by the author himself - https://www.youtube.com/watch?v=DhRoTONcyZE
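To make the core idea concrete before you listen: LoRA freezes the pretrained weight matrix W and learns only a low-rank update BA, so the adapted layer computes h = Wx + BAx. The sketch below is a minimal NumPy illustration of that decomposition; the dimensions, rank, and scaling value are made up for the example and follow the paper's notation (W, A, B, r, alpha), not any official implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

d_out, d_in, r = 64, 64, 4   # layer dimensions and LoRA rank (r << d)
alpha = 8                    # scaling factor, as in the paper
W = rng.standard_normal((d_out, d_in))  # frozen pretrained weight

# LoRA init: A is small Gaussian, B is zero, so at step 0 the
# adapted layer is exactly the pretrained layer.
A = rng.standard_normal((r, d_in)) * 0.01
B = np.zeros((d_out, r))

def forward(x):
    # h = W x + (alpha / r) * B A x  -- only A and B are trained
    return W @ x + (alpha / r) * (B @ (A @ x))

x = rng.standard_normal(d_in)
assert np.allclose(forward(x), W @ x)  # B = 0, so output is unchanged

# Why this saves memory: trainable parameters drop from d_in * d_out
# to r * (d_in + d_out).
print("LoRA params:", r * (d_in + d_out), "vs full:", d_in * d_out)
```

Note how B starting at zero means the model's behavior is untouched until training updates the adapter, which is one of the design choices discussed in the paper.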