000 03563nam a2200493 i 4500
001 6267274
003 IEEE
005 20220712204616.0
006 m o d
007 cr |n|||||||||
008 151223s2007 maua ob 001 eng d
020 _a9780262256292
_qebook
020 _z0262072815
_qalk. paper
020 _z9780262072816
_qhardback
020 _z0262256290
_qelectronic
035 _a(CaBNVSL)mat06267274
035 _a(IDAMS)0b000064818b4260
040 _aCaBNVSL
_beng
_erda
_cCaBNVSL
_dCaBNVSL
050 4 _aQA276.9
_b.G78 2007eb
100 1 _aGrünwald, Peter D.,
_eauthor.
_921876
245 1 4 _aThe minimum description length principle /
_cPeter D. Grünwald.
264 1 _aCambridge, Massachusetts :
_bMIT Press,
_cc2007.
264 2 _a[Piscataway, New Jersey] :
_bIEEE Xplore,
_c[2007]
300 _a1 PDF (xxxii, 703 pages) :
_billustrations.
336 _atext
_2rdacontent
337 _aelectronic
_2isbdmedia
338 _aonline resource
_2rdacarrier
490 1 _aAdaptive computation and machine learning series
504 _aIncludes bibliographical references (p. [651]-673) and indexes.
506 1 _aRestricted to subscribers or individual electronic text purchasers.
520 _aThe minimum description length (MDL) principle is a powerful method of inductive inference, the basis of statistical modeling, pattern recognition, and machine learning. It holds that the best explanation, given a limited set of observed data, is the one that permits the greatest compression of the data. MDL methods are particularly well suited for dealing with model selection, prediction, and estimation problems in situations where the models under consideration can be arbitrarily complex, and overfitting the data is a serious concern. This extensive, step-by-step introduction to the MDL principle provides a comprehensive reference (with an emphasis on conceptual issues) that is accessible to graduate students and researchers in statistics, pattern classification, machine learning, and data mining; to philosophers interested in the foundations of statistics; and to researchers in other applied sciences that involve model selection, including biology, econometrics, and experimental psychology. Part I provides a basic introduction to MDL and an overview of the concepts in statistics and information theory needed to understand MDL. Part II treats universal coding, the information-theoretic notion on which MDL is built, and Part III gives a formal treatment of MDL theory as a theory of inductive inference based on universal coding. Part IV provides a comprehensive overview of the statistical theory of exponential families with an emphasis on their information-theoretic properties. The text includes a number of summaries, paragraphs offering the reader a "fast track" through the material, and boxes highlighting the most important concepts.
530 _aAlso available in print.
538 _aMode of access: World Wide Web
550 _aMade available online by EBSCO.
588 _aDescription based on PDF viewed 12/23/2015.
650 0 _aMinimum description length (Information theory)
_921877
655 0 _aElectronic books.
_93294
710 2 _aIEEE Xplore (Online Service),
_edistributor.
_921878
710 2 _aMIT Press,
_epublisher.
_921879
776 0 8 _iPrint version:
_z9780262072816
830 0 _aAdaptive computation and machine learning.
_921570
856 4 2 _3Abstract with links to resource
_uhttps://ieeexplore.ieee.org/xpl/bkabstractplus.jsp?bkn=6267274
942 _cEBK
999 _c72932
_d72932