LLM Foundations (LLM Bootcamp)

In this video, Sergey covers the foundational concepts for large language models: core ML, the Transformer architecture, notable LLMs, and pretraining dataset composition.

Download the slides from the bootcamp website here: https://fullstackdeeplearning.com/llm-bootcamp/spring-2023/llm-foundations/

Intro and outro music made with Riffusion: https://github.com/riffusion/riffusion

Watch the rest of the LLM Bootcamp videos here: https://www.youtube.com/playlist?list=PL1T8fO7ArWleyIqOy37OVXsP4hFXymdOZ

00:00 Intro
00:47 Foundations of Machine Learning
12:11 The Transformer Architecture
12:57 Transformer Decoder Overview
14:27 Inputs
15:29 Input Embedding
16:51 Masked Multi-Head Attention
24:26 Positional Encoding
25:32 Skip Connections and Layer Norm
27:05 Feed-forward Layer
27:43 Transformer Hyperparameters and Why They Work So Well
31:06 Notable LLM: BERT
32:28 Notable LLM: T5
34:29 Notable LLM: GPT
38:18 Notable LLM: Chinchilla and Scaling Laws
40:23 Notable LLM: LLaMA
41:18 Why Include Code in LLM Training Data?
42:07 Instruction Tuning
46:34 Notable LLM: RETRO

The post LLM Foundations (LLM Bootcamp) appeared first on AIPressRoom.