000 04046nam a22005295i 4500
001 978-3-031-02350-7
003 DE-He213
005 20240730163911.0
007 cr nn 008mamaa
008 220601s2017 sz | s |||| 0|eng d
020 _a9783031023507
_9978-3-031-02350-7
024 7 _a10.1007/978-3-031-02350-7
_2doi
050 4 _aQA76.9.A25
072 7 _aUR
_2bicssc
072 7 _aUTN
_2bicssc
072 7 _aCOM053000
_2bisacsh
072 7 _aUR
_2thema
072 7 _aUTN
_2thema
082 0 4 _a005.8
_223
100 1 _aLi, Ninghui.
_eauthor.
_4aut
_4http://id.loc.gov/vocabulary/relators/aut
_981112
245 1 0 _aDifferential Privacy
_h[electronic resource] :
_bFrom Theory to Practice /
_cby Ninghui Li, Min Lyu, Dong Su, Weining Yang.
250 _a1st ed. 2017.
264 1 _aCham :
_bSpringer International Publishing :
_bImprint: Springer,
_c2017.
300 _aXIII, 124 p.
_bonline resource.
336 _atext
_btxt
_2rdacontent
337 _acomputer
_bc
_2rdamedia
338 _aonline resource
_bcr
_2rdacarrier
347 _atext file
_bPDF
_2rda
490 1 _aSynthesis Lectures on Information Security, Privacy, and Trust,
_x1945-9750
505 0 _aAcknowledgments -- Introduction -- A Primer on ε-Differential Privacy -- What Does DP Mean? -- Publishing Histograms for Low-dimensional Datasets -- Differentially Private Optimization -- Publishing Marginals -- The Sparse Vector Technique -- Bibliography -- Authors' Biographies.
520 _aOver the last decade, differential privacy (DP) has emerged as the de facto standard privacy notion for research in privacy-preserving data analysis and publishing. The DP notion offers a strong privacy guarantee and has been applied to many data analysis tasks. This Synthesis Lecture is the first of two volumes on differential privacy. This lecture differs from the existing books and surveys on differential privacy in that we take an approach balancing theory and practice. We focus on the empirical accuracy performance of algorithms rather than asymptotic accuracy guarantees. At the same time, we try to explain why these algorithms have those empirical accuracy performances. We also take a balanced approach regarding the semantic meanings of differential privacy, explaining both its strong guarantees and its limitations. We start by inspecting the definition and basic properties of DP, and the main primitives for achieving DP. Then, we give a detailed discussion on the semantic privacy guarantee provided by DP and the caveats when applying DP. Next, we review the state-of-the-art mechanisms for publishing histograms for low-dimensional datasets, mechanisms for conducting machine learning tasks such as classification, regression, and clustering, and mechanisms for publishing information to answer marginal queries for high-dimensional datasets. Finally, we explain the sparse vector technique, including the many errors that have been made in the literature using it. The planned Volume 2 will cover usage of DP in other settings, including high-dimensional datasets, graph datasets, the local setting, location privacy, and so on. We will also discuss various relaxations of DP.
650 0 _aData protection.
_97245
650 1 4 _aData and Information Security.
_931990
700 1 _aLyu, Min.
_eauthor.
_4aut
_4http://id.loc.gov/vocabulary/relators/aut
_981113
700 1 _aSu, Dong.
_eauthor.
_4aut
_4http://id.loc.gov/vocabulary/relators/aut
_981114
700 1 _aYang, Weining.
_eauthor.
_4aut
_4http://id.loc.gov/vocabulary/relators/aut
_981115
710 2 _aSpringerLink (Online service)
_981116
773 0 _tSpringer Nature eBook
776 0 8 _iPrinted edition:
_z9783031002359
776 0 8 _iPrinted edition:
_z9783031012228
776 0 8 _iPrinted edition:
_z9783031034787
830 0 _aSynthesis Lectures on Information Security, Privacy, and Trust,
_x1945-9750
_981117
856 4 0 _uhttps://doi.org/10.1007/978-3-031-02350-7
912 _aZDB-2-SXSC
942 _cEBK
999 _c85108
_d85108