Transformers for natural language processing : build, train, and fine-tune deep neural network architectures for NLP with Python, Hugging Face, and OpenAI's GPT-3, ChatGPT, and GPT-4 / Denis Rothman ; foreword by Antonio Gulli
Material type: Text
Series: Expert Insight
Publisher: [Birmingham, United Kingdom] : Packt Publishing, [2022]
Edition: Second edition
Description: xxiii, 565 pages : illustrations ; 23 cm
Content type:
- text
Media type:
- unmediated
Carrier type:
- volume
Classification:
- 006.3 ROT 2022 23
- Q336
Item type | Current library | Home library | Collection | Shelving location | Call number | Status | Date due | Barcode | Item holds
---|---|---|---|---|---|---|---|---|---
Open Collection | FIRST CITY UNIVERSITY COLLEGE | FIRST CITY UNIVERSITY COLLEGE | Open Collection | FCUC Library | 006.3 ROT 2022 | Available | | 00025125 |
Includes index
Available to OhioLINK libraries
Transformers are a game-changer for natural language understanding (NLU) and have become one of the pillars of artificial intelligence. Transformers for Natural Language Processing, 2nd Edition, investigates deep learning for machine translation, speech-to-text, text-to-speech, language modeling, question answering, and many more NLP domains with transformers. An Industry 4.0 AI specialist needs to be adaptable; knowing just one NLP platform is no longer enough. Different platforms offer different benefits depending on the application, whether it's cost, flexibility, ease of implementation, results, or performance. In this book, we analyze numerous use cases with Hugging Face, Google Trax, OpenAI, and AllenNLP. This book takes transformers' capabilities further by combining multiple NLP techniques, such as sentiment analysis, named entity recognition, and semantic role labeling, to analyze complex use cases, such as dissecting fake news on Twitter. You will also see how transformers can create code from just a brief description. By the end of this NLP book, you will understand transformers from a cognitive science perspective and be proficient in applying pretrained transformer models to various datasets.