
Self Attention in Transformer Neural Networks (with Code!)

Follow
CodeEmporium

Let's understand the intuition, math, and code of Self Attention in Transformer Neural Networks.
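The video's own code lives in the linked repo; as a minimal standalone sketch of the idea, here is scaled dot-product self-attention in NumPy (the projection matrices are random placeholders for illustration, not trained weights):

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(x, w_q, w_k, w_v):
    """Scaled dot-product self-attention for one sequence.

    x: (seq_len, d_model) input embeddings
    w_q, w_k, w_v: (d_model, d_k) query/key/value projections
    """
    q, k, v = x @ w_q, x @ w_k, x @ w_v
    scores = q @ k.T / np.sqrt(k.shape[-1])   # (seq_len, seq_len)
    weights = softmax(scores, axis=-1)        # each row sums to 1
    return weights @ v                        # (seq_len, d_k)

# tiny example: 4 tokens, d_model = d_k = 8
rng = np.random.default_rng(0)
x = rng.standard_normal((4, 8))
w_q, w_k, w_v = (rng.standard_normal((8, 8)) for _ in range(3))
out = self_attention(x, w_q, w_k, w_v)
print(out.shape)  # (4, 8)
```

Each output row is a weighted mix of the value vectors, with weights given by how strongly that token's query matches every token's key, the core computation from the Transformer paper [2].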

ABOUT ME
⭕ Subscribe: https://www.youtube.com/c/CodeEmporiu...
Medium Blog: /dataemporium
Github: https://github.com/ajhalthor
LinkedIn: /ajayhalthor477974bb

RESOURCES
[1] Code for video: https://github.com/ajhalthor/Transfor...
[2] Transformer Main Paper: https://arxiv.org/abs/1706.03762
[3] Bidirectional RNN Paper: https://deeplearning.cs.cmu.edu/F20/d...


PLAYLISTS FROM MY CHANNEL
⭕ ChatGPT Playlist of all other videos: • ChatGPT
⭕ Transformer Neural Networks: • Natural Language Processing 101
⭕ Convolutional Neural Networks: • Convolution Neural Networks
⭕ The Math You Should Know: • The Math You Should Know
⭕ Probability Theory for Machine Learning: • Probability Theory for Machine Learning
⭕ Coding Machine Learning: • Code Machine Learning


MATH COURSES (7 day free trial)
Mathematics for Machine Learning: https://imp.i384100.net/MathML
Calculus: https://imp.i384100.net/Calculus
Statistics for Data Science: https://imp.i384100.net/AdvancedStati...
Bayesian Statistics: https://imp.i384100.net/BayesianStati...
Linear Algebra: https://imp.i384100.net/LinearAlgebra
Probability: https://imp.i384100.net/Probability

OTHER RELATED COURSES (7 day free trial)
⭐ Deep Learning Specialization: https://imp.i384100.net/DeepLearning
Python for Everybody: https://imp.i384100.net/python
MLOps Course: https://imp.i384100.net/MLOps
Natural Language Processing (NLP): https://imp.i384100.net/NLP
Machine Learning in Production: https://imp.i384100.net/MLProduction
Data Science Specialization: https://imp.i384100.net/DataScience
Tensorflow: https://imp.i384100.net/Tensorflow

TIMESTAMPS
0:00 Introduction
0:34 Recurrent Neural Networks Disadvantages
2:12 Motivating Self Attention
3:34 Transformer Overview
7:03 Self Attention in Transformers
7:32 Coding Self Attention
