
Transformers for Natural Language Processing
Build, train, and fine-tune deep neural network architectures for NLP with Python, Hugging Face, and OpenAI's GPT-3, ChatGPT, and GPT-4
Publisher: Packt Publishing Limited
By: Denis Rothman
Table of Contents
- What are Transformers?
- Getting Started with the Architecture of the Transformer Model
- Fine-Tuning BERT Models
- Pretraining a RoBERTa Model from Scratch
- Downstream NLP Tasks with Transformers
- Machine Translation with the Transformer
- The Rise of Suprahuman Transformers with GPT-3 Engines
- Applying Transformers to Legal and Financial Documents for AI Text Summarization
- Matching Tokenizers and Datasets
- Semantic Role Labeling with BERT-Based Transformers
- Let Your Data Do the Talking: Story, Questions, and Answers
- Detecting Customer Emotions to Make Predictions
- Analyzing Fake News with Transformers
- Interpreting Black Box Transformer Models
- From NLP to Task-Agnostic Transformer Models
- The Emergence of Transformer-Driven Copilots
- The Consolidation of Suprahuman Transformers with OpenAI's ChatGPT and GPT-4
- Appendix I — Terminology of Transformer Models
- Appendix II — Hardware Constraints for Transformer Models
- Appendix III — Generic Text Completion with GPT-2
- Appendix IV — Custom Text Completion with GPT-2
- Appendix V — Answers to the Questions
PDF ISBN: 978-1-80324-348-1
Copyright owner: © 2022 Packt Publishing Limited
Publication date: September 2024
Language: English
Pages: 602
