000 03574nam a22005415i 4500
001 978-3-031-02177-0
003 DE-He213
005 20240730163829.0
007 cr nn 008mamaa
008 220601s2021 sz | s |||| 0|eng d
020 _a9783031021770
_9978-3-031-02177-0
024 7 _a10.1007/978-3-031-02177-0
_2doi
050 4 _aQ334-342
050 4 _aTA347.A78
072 7 _aUYQ
_2bicssc
072 7 _aCOM004000
_2bisacsh
072 7 _aUYQ
_2thema
082 0 4 _a006.3
_223
100 1 _aPilehvar, Mohammad Taher.
_eauthor.
_4aut
_4http://id.loc.gov/vocabulary/relators/aut
_980707
245 1 0 _aEmbeddings in Natural Language Processing
_h[electronic resource] :
_bTheory and Advances in Vector Representations of Meaning /
_cby Mohammad Taher Pilehvar, Jose Camacho-Collados.
250 _a1st ed. 2021.
264 1 _aCham :
_bSpringer International Publishing :
_bImprint: Springer,
_c2021.
300 _aXVIII, 157 p.
_bonline resource.
336 _atext
_btxt
_2rdacontent
337 _acomputer
_bc
_2rdamedia
338 _aonline resource
_bcr
_2rdacarrier
347 _atext file
_bPDF
_2rda
490 1 _aSynthesis Lectures on Human Language Technologies,
_x1947-4059
505 0 _aPreface -- Introduction -- Background -- Word Embeddings -- Graph Embeddings -- Sense Embeddings -- Contextualized Embeddings -- Sentence and Document Embeddings -- Ethics and Bias -- Conclusions -- Bibliography -- Authors' Biographies.
520 _aEmbeddings have undoubtedly been one of the most influential research areas in Natural Language Processing (NLP). Encoding information into a low-dimensional vector representation, which is easily integrable into modern machine learning models, has played a central role in the development of NLP. Embedding techniques initially focused on words, but attention soon shifted to other forms: from graph structures, such as knowledge bases, to other types of textual content, such as sentences and documents. This book provides a high-level synthesis of the main embedding techniques in NLP, in the broad sense. The book starts by explaining conventional word vector space models and word embeddings (e.g., Word2Vec and GloVe) and then moves to other types of embeddings, such as word sense, sentence and document, and graph embeddings. The book also provides an overview of recent developments in contextualized representations (e.g., ELMo and BERT) and explains their potential in NLP. Throughout the book, the reader can find both essential information for understanding a certain topic from scratch and a broad overview of the most successful techniques developed in the literature.
650 0 _aArtificial intelligence.
_93407
650 0 _aNatural language processing (Computer science).
_94741
650 0 _aComputational linguistics.
_96146
650 1 4 _aArtificial Intelligence.
_93407
650 2 4 _aNatural Language Processing (NLP).
_931587
650 2 4 _aComputational Linguistics.
_96146
700 1 _aCamacho-Collados, Jose.
_eauthor.
_4aut
_4http://id.loc.gov/vocabulary/relators/aut
_980708
710 2 _aSpringerLink (Online service)
_980709
773 0 _tSpringer Nature eBook
776 0 8 _iPrinted edition:
_z9783031001888
776 0 8 _iPrinted edition:
_z9783031010491
776 0 8 _iPrinted edition:
_z9783031033056
830 0 _aSynthesis Lectures on Human Language Technologies,
_x1947-4059
_980710
856 4 0 _uhttps://doi.org/10.1007/978-3-031-02177-0
912 _aZDB-2-SXSC
942 _cEBK
999 _c85019
_d85019