Neural Network Methods for Natural Language Processing (Record no. 85012)

000 -LEADER
fixed length control field 04146nam a22005295i 4500
001 - CONTROL NUMBER
control field 978-3-031-02165-7
005 - DATE AND TIME OF LATEST TRANSACTION
control field 20240730163825.0
008 - FIXED-LENGTH DATA ELEMENTS--GENERAL INFORMATION
fixed length control field 220601s2017 sz | s |||| 0|eng d
020 ## - INTERNATIONAL STANDARD BOOK NUMBER
ISBN 9783031021657
-- 978-3-031-02165-7
082 04 - CLASSIFICATION NUMBER
Call Number 006.3
100 1# - AUTHOR NAME
Author Goldberg, Yoav.
245 10 - TITLE STATEMENT
Title Neural Network Methods for Natural Language Processing
250 ## - EDITION STATEMENT
Edition statement 1st ed. 2017.
300 ## - PHYSICAL DESCRIPTION
Number of Pages XX, 292 p.
490 1# - SERIES STATEMENT
Series statement Synthesis Lectures on Human Language Technologies,
505 0# - FORMATTED CONTENTS NOTE
Remark 2 Preface -- Acknowledgments -- Introduction -- Learning Basics and Linear Models -- From Linear Models to Multi-layer Perceptrons -- Feed-forward Neural Networks -- Neural Network Training -- Features for Textual Data -- Case Studies of NLP Features -- From Textual Features to Inputs -- Language Modeling -- Pre-trained Word Representations -- Using Word Embeddings -- Case Study: A Feed-forward Architecture for Sentence Meaning Inference -- Ngram Detectors: Convolutional Neural Networks -- Recurrent Neural Networks: Modeling Sequences and Stacks -- Concrete Recurrent Neural Network Architectures -- Modeling with Recurrent Networks -- Conditioned Generation -- Modeling Trees with Recursive Neural Networks -- Structured Output Prediction -- Cascaded, Multi-task and Semi-supervised Learning -- Conclusion -- Bibliography -- Author's Biography.
520 ## - SUMMARY, ETC.
Summary, etc Neural networks are a family of powerful machine learning models. This book focuses on the application of neural network models to natural language data. The first half of the book (Parts I and II) covers the basics of supervised machine learning and feed-forward neural networks, the basics of working with machine learning over language data, and the use of vector-based rather than symbolic representations for words. It also covers the computation-graph abstraction, which makes it easy to define and train arbitrary neural networks and is the basis for the design of contemporary neural network software libraries. The second half of the book (Parts III and IV) introduces more specialized neural network architectures, including 1D convolutional neural networks, recurrent neural networks, conditioned-generation models, and attention-based models. These architectures and techniques are the driving force behind state-of-the-art algorithms for machine translation, syntactic parsing, and many other applications. Finally, we discuss tree-shaped networks, structured prediction, and the prospects of multi-task learning.
856 40 - ELECTRONIC LOCATION AND ACCESS
Uniform Resource Identifier https://doi.org/10.1007/978-3-031-02165-7
942 ## - ADDED ENTRY ELEMENTS (KOHA)
Koha item type eBooks
264 #1 - PRODUCTION, PUBLICATION, DISTRIBUTION, MANUFACTURE, AND COPYRIGHT NOTICE
-- Cham :
-- Springer International Publishing :
-- Imprint: Springer,
-- 2017.
336 ## - CONTENT TYPE
-- text
-- txt
-- rdacontent
337 ## - MEDIA TYPE
-- computer
-- c
-- rdamedia
338 ## - CARRIER TYPE
-- online resource
-- cr
-- rdacarrier
347 ## - DIGITAL FILE CHARACTERISTICS
-- text file
-- PDF
-- rda
650 #0 - SUBJECT ADDED ENTRY--TOPICAL TERM
-- Artificial intelligence.
650 #0 - SUBJECT ADDED ENTRY--TOPICAL TERM
-- Natural language processing (Computer science).
650 #0 - SUBJECT ADDED ENTRY--TOPICAL TERM
-- Computational linguistics.
650 14 - SUBJECT ADDED ENTRY--TOPICAL TERM
-- Artificial Intelligence.
650 24 - SUBJECT ADDED ENTRY--TOPICAL TERM
-- Natural Language Processing (NLP).
650 24 - SUBJECT ADDED ENTRY--TOPICAL TERM
-- Computational Linguistics.
830 #0 - SERIES ADDED ENTRY--UNIFORM TITLE
-- Synthesis Lectures on Human Language Technologies,
-- 1947-4059
912 ## -
-- ZDB-2-SXSC
