MIT 6.S191: Recurrent Neural Networks, Transformers, and Attention

Published on 06/02/23

MIT Introduction to Deep Learning 6.S191: Lecture 2
Recurrent Neural Networks
Lecturer: Ava Amini
2023 Edition

For all lectures, slides, and lab materials: http://introtodeeplearning.com

Lecture Outline
0:00 - Introduction
3:07 - Sequence modeling
5:09 - Neurons with recurrence
12:05 - Recurrent neural networks
13:47 - RNN intuition
15:03 - Unfolding RNNs
18:57 - RNNs from scratch
21:50 - Design criteria for sequential modeling
23:45 - Word prediction example
29:57 - Backpropagation through time
32:25 - Gradient issues
37:03 - Long short-term memory (LSTM)
39:50 - RNN applications
44:50 - Attention fundamentals
48:10 - Intuition of attention
50:30 - Attention and search relationship
52:40 - Learning attention with neural networks
58:16 - Scaling attention and applications
1:02:02 - Summary
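For a taste of the "RNNs from scratch" and "Unfolding RNNs" segments above, here is a minimal sketch of a recurrent cell in NumPy. This is not the lecture's code: the weight names (W_xh, W_hh, W_hy) and dimensions are illustrative assumptions, showing only the core idea that one set of weights is reused at every time step.

```python
import numpy as np

# Minimal sketch of a recurrent cell: the hidden state h_t is computed
# from the previous state h_{t-1} and the current input x_t, and an
# output y_t is read off the hidden state. Names are illustrative.

rng = np.random.default_rng(0)
input_dim, hidden_dim, output_dim = 3, 4, 2

W_xh = rng.normal(size=(hidden_dim, input_dim)) * 0.1   # input -> hidden
W_hh = rng.normal(size=(hidden_dim, hidden_dim)) * 0.1  # hidden -> hidden (recurrence)
W_hy = rng.normal(size=(output_dim, hidden_dim)) * 0.1  # hidden -> output

def rnn_step(x_t, h_prev):
    """One recurrent update: h_t = tanh(W_xh x_t + W_hh h_{t-1}), y_t = W_hy h_t."""
    h_t = np.tanh(W_xh @ x_t + W_hh @ h_prev)
    y_t = W_hy @ h_t
    return y_t, h_t

# "Unfolding" the RNN over a short sequence: the same weights are
# applied at every step, with the hidden state carried forward.
h = np.zeros(hidden_dim)
sequence = rng.normal(size=(5, input_dim))
outputs = []
for x_t in sequence:
    y_t, h = rnn_step(x_t, h)
    outputs.append(y_t)
```

Training such a cell would use backpropagation through time (covered at 29:57), which is where the gradient issues discussed at 32:25 arise.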
Subscribe to stay up to date with new deep learning lectures at MIT, or follow us @MITDeepLearning on Twitter and Instagram to stay fully-connected!
