 Transformers for Natural Language Processing: Build innovative deep neural network architectures for NLP with Python, PyTorch, TensorFlow, BERT, RoBERTa, and more
Take your NLP knowledge to the next level and become an AI language understanding expert by mastering the quantum leap of Transformer neural network models
Key Features
- Build and implement state-of-the-art language models, such as the original Transformer, BERT, T5, and GPT-2, using...
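As an illustrative taste of the workflow the title above covers, the sketch below loads a pretrained BERT model and produces contextual embeddings for a sentence. The Hugging Face transformers package and the bert-base-uncased checkpoint are assumptions chosen for illustration; the book's own examples may use different tooling.

    # Illustrative sketch only -- the `transformers` package and the
    # "bert-base-uncased" checkpoint are assumptions, not taken from the book.
    import torch
    from transformers import AutoModel, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
    model = AutoModel.from_pretrained("bert-base-uncased")

    # Tokenize a sentence and run it through the encoder.
    inputs = tokenizer("Transformers map tokens to contextual vectors.",
                       return_tensors="pt")
    with torch.no_grad():
        outputs = model(**inputs)

    # One hidden vector per token: (batch, sequence_length, hidden_size).
    print(outputs.last_hidden_state.shape)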
The Art of Error Correcting Coding
Building on the success of the first edition, which offered a practical, introductory approach to the techniques of error correcting coding, this book, now fully revised and updated, provides a comprehensive treatment of the subject and includes a wealth of additional features. The Art of Error Correcting Coding, Second Edition explores...
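To make the subject concrete, here is a minimal sketch (not taken from the book) of the classic Hamming(7,4) code, which protects four data bits with three parity bits and corrects any single-bit error; the bit layout (p1 p2 d1 p3 d2 d3 d4) is one common convention.

    # Illustrative sketch: Hamming(7,4) encoding and single-error correction.
    def hamming74_encode(d):
        d1, d2, d3, d4 = d
        p1 = d1 ^ d2 ^ d4
        p2 = d1 ^ d3 ^ d4
        p3 = d2 ^ d3 ^ d4
        return [p1, p2, d1, p3, d2, d3, d4]

    def hamming74_decode(c):
        # Syndrome bits give the 1-based position of a single flipped bit.
        s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
        s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
        s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
        syndrome = s1 + 2 * s2 + 4 * s3
        if syndrome:
            c = c[:]
            c[syndrome - 1] ^= 1  # correct the flipped bit
        return [c[2], c[4], c[5], c[6]]

    codeword = hamming74_encode([1, 0, 1, 1])
    codeword[5] ^= 1                   # inject a single-bit error
    print(hamming74_decode(codeword))  # [1, 0, 1, 1] -- error corrected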
Numerical Computing with IEEE Floating Point Arithmetic
Are you familiar with the IEEE floating point arithmetic standard? Would you like to understand it better? This book gives a broad overview of numerical computing, in a historical context, with a special focus on the IEEE standard for binary floating point arithmetic. Key ideas are developed step by step, taking the reader from floating point...
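A few lines of standard Python (an illustrative sketch, not an excerpt from the book) show the kind of IEEE 754 binary64 behaviour the book examines, such as the inexact representation of decimal fractions and machine epsilon:

    import sys

    # 0.1 has no exact binary representation; the nearest double is stored.
    print(f"{0.1:.20f}")           # 0.10000000000000000555
    print(0.1 + 0.2 == 0.3)        # False: each operand is rounded

    # Machine epsilon: the gap between 1.0 and the next representable double.
    print(sys.float_info.epsilon)  # 2.220446049250313e-16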