Table of Contents
- What Are Transformers?
- Getting Started with the Architecture of the Transformer Model
- Emergent vs Downstream Tasks: The Unseen Depths of Transformers
- Advancements in Translations with Google Trax, Google Translate, and Gemini
- Diving into Fine-Tuning through BERT
- Pretraining a Transformer from Scratch through RoBERTa
- The Generative AI Revolution with ChatGPT
- Fine-Tuning OpenAI GPT Models
- Shattering the Black Box with Interpretable Tools
- Investigating the Role of Tokenizers in Shaping Transformer Models
- Leveraging LLM Embeddings as an Alternative to Fine-Tuning
- Toward Syntax-Free Semantic Role Labeling with ChatGPT and GPT-4
- Summarization with T5 and ChatGPT
- Exploring Cutting-Edge LLMs with Vertex AI and PaLM 2
- Guarding the Giants: Mitigating Risks in Large Language Models
- Beyond Text: Vision Transformers in the Dawn of Revolutionary AI
- Transcending the Image-Text Boundary with Stable Diffusion
- Hugging Face AutoTrain: Training Vision Models without Coding
- On the Road to Functional AGI with HuggingGPT and its Peers
- Beyond Human-Designed Prompts with Generative Ideation