
Subspace, Latent Structure and Feature Selection [electronic resource] : Statistical and Optimization Perspectives Workshop, SLSFS 2005, Bohinj, Slovenia, February 23-25, 2005, Revised Selected Papers / edited by Craig Saunders, Marko Grobelnik, Steve Gunn, John Shawe-Taylor.

Contributor(s): Saunders, Craig [editor.] | Grobelnik, Marko [editor.] | Gunn, Steve [editor.] | Shawe-Taylor, John [editor.] | SpringerLink (Online service).
Material type: Book
Series: Theoretical Computer Science and General Issues ; 3940
Publisher: Berlin, Heidelberg : Springer Berlin Heidelberg : Imprint: Springer, 2006
Edition: 1st ed. 2006
Description: X, 209 p. online resource
Content type: text
Media type: computer
Carrier type: online resource
ISBN: 9783540341383
Subject(s): Algorithms | Computer science -- Mathematics | Mathematical statistics | Computer science | Artificial intelligence | Computer vision | Pattern recognition systems | Probability and Statistics in Computer Science | Theory of Computation | Artificial Intelligence | Computer Vision | Automated Pattern Recognition
Additional physical formats: Printed edition: No title; Printed edition: No title
DDC classification: 518.1
Online resources: Click here to access online
Contents:
Invited Contributions -- Discrete Component Analysis -- Overview and Recent Advances in Partial Least Squares -- Random Projection, Margins, Kernels, and Feature-Selection -- Some Aspects of Latent Structure Analysis -- Feature Selection for Dimensionality Reduction -- Contributed Papers -- Auxiliary Variational Information Maximization for Dimensionality Reduction -- Constructing Visual Models with a Latent Space Approach -- Is Feature Selection Still Necessary? -- Class-Specific Subspace Discriminant Analysis for High-Dimensional Data -- Incorporating Constraints and Prior Knowledge into Factorization Algorithms - An Application to 3D Recovery -- A Simple Feature Extraction for High Dimensional Image Representations -- Identifying Feature Relevance Using a Random Forest -- Generalization Bounds for Subspace Selection and Hyperbolic PCA -- Less Biased Measurement of Feature Selection Benefits.
In: Springer Nature eBook
No physical items for this record
