E-book
Author Gómez-Pérez, José Manuel.

Title A practical guide to hybrid natural language processing : combining neural models and knowledge graphs for NLP / José Manuel Gómez-Pérez, Ronald Denaux, Andrés García-Silva
Published Cham : Springer, 2020

Description 1 online resource (281 pages)
Contents Intro -- Foreword -- "Don't Read This Book. Use It!" by Ken Barker -- "Most of the Knowledge in the World Is Encoded Not in Knowledge Graphs but in Natural Language" by Denny Vrandecic -- Preface -- Purpose of the Book -- Overview of the Chapters in This Book -- Materials -- Relation to Other Books in the Area -- Acknowledgements -- Contents -- Part I Preliminaries and Building Blocks -- 1 Hybrid Natural Language Processing: An Introduction -- 1.1 A Brief History of Knowledge Graphs, Embeddings, and Language Models -- 1.2 Combining Knowledge Graphs and Neural Approaches for NLP
2 Word, Sense, and Graph Embeddings -- 2.1 Introduction -- 2.2 Distributed Word Representations -- 2.3 Word Embeddings -- 2.4 Sense and Concept Embeddings -- 2.5 Knowledge Graph Embeddings -- 2.6 Conclusion -- 3 Understanding Word Embeddings and Language Models -- 3.1 Introduction -- 3.2 Language Modeling -- 3.2.1 Statistical Language Models -- 3.2.2 Neural Language Models -- 3.3 Fine-Tuning Pre-trained Language Models for Transfer Learning in NLP -- 3.3.1 ELMo -- 3.3.2 GPT -- 3.3.3 BERT -- 3.4 Fine-Tuning Pre-trained Language Models for Bot Detection -- 3.4.1 Experiment Results and Discussion
3.4.2 Using the Transformers Library to Fine-Tune BERT -- 3.4.2.1 The Transformers Library -- 3.4.2.2 Download the Dataset -- 3.4.2.3 BERT Tokenization -- 3.4.2.4 Fine-Tuning the Model -- 3.4.2.5 Other Evaluation Metrics -- 3.4.2.6 Model Inference -- 3.5 Conclusion -- 4 Capturing Meaning from Text as Word Embeddings -- 4.1 Introduction -- 4.2 Download a Small Text Corpus -- 4.3 An Algorithm for Learning Word Embeddings (Swivel) -- 4.4 Generate Co-occurrence Matrix Using Swivel prep -- 4.5 Learn Embeddings from Co-occurrence Matrix -- 4.5.1 Convert tsv Files to bin File
4.6 Read Stored Binary Embeddings and Inspect Them -- 4.6.1 Compound Words -- 4.7 Exercise: Create Word Embeddings from Project Gutenberg -- 4.7.1 Download and Pre-process the Corpus -- 4.7.2 Learn Embeddings -- 4.7.3 Inspect Embeddings -- 4.8 Conclusion -- 5 Capturing Knowledge Graph Embeddings -- 5.1 Introduction -- 5.2 Knowledge Graph Embeddings -- 5.3 Creating Embeddings for WordNet -- 5.3.1 Choose Embedding Algorithm: HolE -- 5.3.1.1 Install scikit-kge -- 5.3.1.2 Install and Inspect holographic_embeddings -- 5.3.2 Convert WordNet KG to the Required Input
5.3.2.1 KG Input Format Required by SKGE -- 5.3.2.2 Converting WordNet 3.0 into the Required Input Format from Scratch -- 5.3.3 Learn the Embeddings -- 5.3.4 Inspect the Resulting Embeddings -- 5.3.4.1 skge Output File Format -- 5.3.4.2 Converting Embeddings to a More Manageable Format -- 5.4 Exercises -- 5.4.1 Exercise: Train Embeddings on Your Own KG -- 5.4.2 Exercise: Inspect WordNet 3.0 Pre-calculated Embeddings -- 5.5 Conclusion -- Part II Combining Neural Architectures and Knowledge Graphs -- 6 Building Hybrid Representations from Text Corpora, Knowledge Graphs, and Language Models
Summary This book provides readers with a practical guide to the principles of hybrid approaches to natural language processing (NLP) that combine neural methods and knowledge graphs. To this end, it first introduces the main building blocks and then describes how they can be integrated to support the effective implementation of real-world NLP applications. To illustrate the ideas described, the book also includes a comprehensive set of experiments and exercises involving different algorithms over a selection of domains and corpora in various NLP tasks. Throughout, the authors show how to leverage complementary representations stemming from the analysis of unstructured text corpora as well as the entities and relations described explicitly in a knowledge graph, how to integrate such representations, and how to use the resulting features to effectively solve NLP tasks in a range of domains. In addition, the book offers access to executable code with examples, exercises, and real-world applications in key domains, such as disinformation analysis and machine reading comprehension of scientific literature. All the examples and exercises proposed in the book are available as executable Jupyter notebooks in a GitHub repository, ready to be run on Google Colaboratory or, if preferred, in a local environment. A valuable resource for anyone interested in the interplay between neural and knowledge-based approaches to NLP, this book is a useful guide for readers with a background in structured knowledge representations as well as those whose main approach to AI is fundamentally based on logic. Further, it will appeal to those whose main background is in machine and deep learning and who are looking for ways to leverage structured knowledge bases to optimize results along the NLP downstream.
Notes 6.1 Introduction
Bibliography Includes bibliographical references
Notes Print version record
Subject Natural language processing (Computer science)
Artificial intelligence.
Application software.
Natural Language Processing
Artificial Intelligence
Form Electronic book
Author Denaux, Ronald
García-Silva, Andrés
ISBN 9783030448301
3030448304