Transformers for Natural Language Processing: Build innovative deep neural network architectures for NLP with Python, PyTorch, TensorFlow, BERT, RoBERTa, and more
Take your NLP knowledge to the next level and become an AI language understanding expert by mastering the quantum leap of Transformer neural network models
Build and implement state-of-the-art language models, such as the original Transformer, BERT, T5, and GPT-2, using...
Python Data Analytics: With Pandas, NumPy, and Matplotlib
Explore the latest Python tools and techniques to help you tackle the world of data acquisition and analysis. You'll review scientific computing with NumPy, visualization with matplotlib, and machine learning with scikit-learn.
This revision is fully updated with new content on social media data analysis, image...
In Object Thinking, esteemed object technologist David West contends that the mindset makes the programmer—not the tools and techniques. Delving into the history, philosophy, and even politics of object-oriented programming, West reveals how the best programmers rely on analysis and conceptualization—on...
The Art of Error Correcting Coding
Building on the success of the first edition, which offered a practical introductory approach to the techniques of error concealment, this book, now fully revised and updated, provides a comprehensive treatment of the subject and includes a wealth of additional features. The Art of Error Correcting Coding, Second Edition explores...

Modelling and Reasoning with Vague Concepts
Vagueness is central to the flexibility and robustness of natural language descriptions. Vague concepts are robust to the imprecision of our perceptions, while still allowing us to convey useful, and sometimes vital, information. The study of vagueness in Artificial Intelligence (AI) is therefore motivated by the desire to incorporate this...
Methodology of Longitudinal Surveys (Wiley Series in Survey Methodology)
Longitudinal surveys are surveys that involve collecting data from multiple subjects on multiple occasions. They are typically used for collecting data relating to social, economic, educational, and health-related issues, and they serve as an important tool for economists, sociologists, and other researchers.
Focusing on the design,...
Foundations of Dependable Computing: Paradigms for Dependable Applications
Foundations of Dependable Computing: Paradigms for Dependable Applications presents a variety of specific approaches to achieving dependability at the application level. Driven by the higher-level fault models of Models and Frameworks for Dependable Systems, and built on the lower-level abstractions implemented in a third companion book...

Trustworthy Compilers (Quantitative Software Engineering Series)
The Most Complete, Real-World Guide to Compiler Development—and the Principles of Trustworthy Compilers
Drawing on the author's more than thirty years of expertise in compiler development, research, and instruction, Trustworthy Compilers introduces and analyzes the concept of trustworthy compilers and the principles...