Review the core architecture of the Transformer. Learn how self-attention and positional encoding power modern generative AI language models.
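The two ideas named above can be sketched in a few lines of NumPy. This is a minimal illustration, not the source's own code: scaled dot-product self-attention over a single head, and the sinusoidal positional encoding from the original Transformer paper. All variable names (`Wq`, `Wk`, `Wv`, `d_model`) are illustrative assumptions.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax: subtract the row max before exponentiating.
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, Wq, Wk, Wv):
    """Single-head scaled dot-product self-attention (illustrative sketch).

    X: (seq_len, d_model) token embeddings.
    Wq, Wk, Wv: (d_model, d_k) projection matrices (hypothetical names).
    """
    Q, K, V = X @ Wq, X @ Wk, X @ Wv          # project to queries/keys/values
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)           # (seq_len, seq_len) similarities
    weights = softmax(scores, axis=-1)        # each row sums to 1
    return weights @ V                        # weighted mix of value vectors

def positional_encoding(seq_len, d_model):
    """Sinusoidal positional encoding: sin on even dims, cos on odd dims."""
    pos = np.arange(seq_len)[:, None]                      # (seq_len, 1)
    i = np.arange(d_model)[None, :]                        # (1, d_model)
    angles = pos / np.power(10000.0, (2 * (i // 2)) / d_model)
    return np.where(i % 2 == 0, np.sin(angles), np.cos(angles))
```

In a full Transformer block, the positional encoding is added to the token embeddings before attention, so that the otherwise order-agnostic attention operation can distinguish token positions.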