
Stanford CS224N NLP with Deep Learning | Winter 2021 | Lecture 10 - Transformers and Pretraining

For more information about Stanford's Artificial Intelligence professional and graduate programs visit: https://stanford.io/3bDcbOJ

This lecture covers:
1. Quick review of Transformer model
2. Brief note on subword modeling
3. Motivating model pretraining from word embeddings
4. Model pretraining three ways:
   1. Decoders
   2. Encoders
   3. Encoder-Decoders
5. Very large models and in-context learning

To learn more about this course visit: https://online.stanford.edu/courses/cs224n-natural-language-processing-deep-learning
To follow along with the course schedule and syllabus visit: http://web.stanford.edu/class/cs224n/

John Hewitt
PhD student in Computer Science at Stanford University

Professor Christopher Manning
Thomas M. Siebel Professor in Machine Learning, Professor of Linguistics and of Computer Science
Director, Stanford Artificial Intelligence Laboratory (SAIL)

#naturallanguageprocessing #deeplearning
Stanford, CS224N, NLP, Deep Learning, AI
Video URL: https://www.youtube.com/watch?v=j9AcEI98C0o