
Transformers for Natural Language Processing

Build, train, and fine-tune deep neural network architectures for NLP with Python, Hugging Face, and OpenAI's GPT-3, ChatGPT, and GPT-4


Table of Contents

  1. What are Transformers?
  2. Getting Started with the Architecture of the Transformer Model
  3. Fine-Tuning BERT Models
  4. Pretraining a RoBERTa Model from Scratch
  5. Downstream NLP Tasks with Transformers
  6. Machine Translation with the Transformer
  7. The Rise of Suprahuman Transformers with GPT-3 Engines
  8. Applying Transformers to Legal and Financial Documents for AI Text Summarization
  9. Matching Tokenizers and Datasets
  10. Semantic Role Labeling with BERT-Based Transformers
  11. Let Your Data Do the Talking: Story, Questions, and Answers
  12. Detecting Customer Emotions to Make Predictions
  13. Analyzing Fake News with Transformers
  14. Interpreting Black Box Transformer Models
  15. From NLP to Task-Agnostic Transformer Models
  16. The Emergence of Transformer-Driven Copilots
  17. The Consolidation of Suprahuman Transformers with OpenAI's ChatGPT and GPT-4
  18. Appendix I — Terminology of Transformer Models
  19. Appendix II — Hardware Constraints for Transformer Models
  20. Appendix III — Generic Text Completion with GPT-2
  21. Appendix IV — Custom Text Completion with GPT-2
  22. Appendix V — Answers to the Questions
PDF ISBN: 978-1-80324-348-1
Publisher: Packt Publishing Limited
Copyright owner: © 2022 Packt Publishing Limited
Publication date: April 2022
Language: English
Pages: 602