 Transformers for Natural Language Processing: Build innovative deep neural network architectures for NLP with Python, PyTorch, TensorFlow, BERT, RoBERTa, and more
Take your NLP knowledge to the next level and become an AI language understanding expert by mastering the quantum leap of Transformer neural network models
Key Features
- Build and implement state-of-the-art language models, such as the original Transformer, BERT, T5, and GPT-2, using...
PyTorch Recipes: A Problem-Solution Approach
Get up to speed with the deep learning concepts of PyTorch using a problem-solution approach. Starting with an introduction to PyTorch, you'll become familiar with tensors, the data structure used to perform arithmetic operations, and learn how they work. You will then take a look ...
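To illustrate the kind of tensor arithmetic the book introduces, here is a minimal PyTorch sketch; the values and variable names are illustrative and not taken from the book.

    import torch

    # Tensors are PyTorch's core data structure for numerical data
    # and support element-wise and matrix arithmetic.
    a = torch.tensor([[1.0, 2.0], [3.0, 4.0]])
    b = torch.tensor([[5.0, 6.0], [7.0, 8.0]])

    print(a + b)   # element-wise addition:       [[ 6.,  8.], [10., 12.]]
    print(a * b)   # element-wise multiplication: [[ 5., 12.], [21., 32.]]
    print(a @ b)   # matrix multiplication:       [[19., 22.], [43., 50.]]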
Modelling and Reasoning with Vague Concepts
Vagueness is central to the flexibility and robustness of natural language descriptions. Vague concepts are robust to the imprecision of our perceptions, while still allowing us to convey useful, and sometimes vital, information. The study of vagueness in Artificial Intelligence (AI) is therefore motivated by the desire to incorporate this...