
The Gaussian Process Latent Variable Model (GPLVM)



An introduction to the Gaussian Process Latent Variable Model (GPLVM)

Published in: Science


  1. Gaussian Process Latent Variable Model (GPLVM). James McMurray, PhD Student, Department of Empirical Inference. 11/02/2014
  2. Outline of talk: Introduction (Why are latent variable models useful?; Definition of a latent variable model; Graphical Model representation); PCA recap (Principal Components Analysis (PCA)); Probabilistic PCA based methods (Probabilistic PCA (PPCA); Dual PPCA; GPLVM; Examples; Practical points; Variants); Conclusion; References
  4. Why are latent variable models useful? • Data has structure.
  5. Why are latent variable models useful? • Observed high-dimensional data often lies on a lower-dimensional manifold. Example: the "Swiss Roll" dataset.
  6. Why are latent variable models useful? • The structure in the data means that we don't need such high dimensionality to describe it. • The lower-dimensional space is often easier to work with. • Allows for interpolation between observed data points.
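The "Swiss Roll" dataset mentioned above illustrates the point: the data lives in 3-D but is intrinsically 2-D. A minimal sketch, assuming scikit-learn is available:

```python
import numpy as np
from sklearn.datasets import make_swiss_roll

# 1000 points on the Swiss Roll: embedded in 3-D, intrinsically 2-D.
X, t = make_swiss_roll(n_samples=1000, noise=0.05, random_state=0)
print(X.shape)  # (1000, 3): the ambient dimension is 3
# t is the position along the roll, i.e. one of the two coordinates
# (together with height) that parameterise the underlying 2-D manifold.
print(t.shape)  # (1000,)
```

A nonlinear dimensionality-reduction method applied to `X` should recover a flat 2-D embedding ordered by `t`.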
  7. Definition of a latent variable model • Assumptions: we assume that the observed variables actually result from a smaller set of latent variables, and that the observed variables are independent given the latent variables. • Differs slightly from the dimensionality reduction paradigm, which wishes to find a lower-dimensional embedding in the high-dimensional space. • With the latent variable model we specify the functional form of the mapping: y = g(x) + ε, where x are the latent variables, y are the observed variables and ε is noise. • We obtain different latent variable models for different assumptions on g(x) and ε.
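The generative mapping y = g(x) + ε can be sketched for the simplest assumption, a linear g with Gaussian noise (the dimensions and noise level below are arbitrary choices for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
N, q, D = 200, 2, 5            # observations, latent dim, observed dim
X = rng.normal(size=(N, q))    # latent variables x
W = rng.normal(size=(D, q))    # a linear choice of mapping: g(x) = W x
eps = 0.1 * rng.normal(size=(N, D))  # noise term
Y = X @ W.T + eps              # observed variables y = g(x) + eps
print(Y.shape)  # (200, 5): high-dimensional data generated from 2-D latents
```

Different choices of g (e.g. a nonlinear function, or a Gaussian-process draw as in the GPLVM) and of the noise distribution give the different latent variable models discussed in this talk.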
  10. Graphical model representation. Graphical Model example of a Latent Variable Model. Taken from Neil Lawrence: http://ml.dcs.shef.ac.uk/gpss/gpws14/gp_gpws14_session3.pdf
  11. Plan: Introduction (Why are latent variable models useful?; Definition of a latent variable model; Graphical Model representation); PCA recap (Principal Components Analysis (PCA)); Probabilistic PCA based methods (Probabilistic PCA (PPCA); Dual PPCA; GPLVM; Examples; Practical points; Variants); Conclusion; References
  13. Principal Components Analysis (PCA) • Returns orthogonal dimensions of maximum variance. • Works well if the data lies on a plane in the higher-dimensional space. • Linear method (although variants allow non-linear application, e.g. kernel PCA). Example application of PCA taken from http://www.nlpca.org/pca_principal_component_analysis.html
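The orthogonal, variance-maximising directions that PCA returns can be computed with a plain SVD of the centred data; a minimal NumPy sketch (the data here is synthetic, just for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
Y = rng.normal(size=(100, 5)) @ rng.normal(size=(5, 5))  # correlated 5-D data
Yc = Y - Y.mean(axis=0)                  # centre the data first

# SVD of the centred data: rows of Vt are the principal directions,
# ordered by decreasing singular value (i.e. decreasing variance).
U, S, Vt = np.linalg.svd(Yc, full_matrices=False)
components = Vt[:2]                      # top-2 orthogonal directions
Z = Yc @ components.T                    # 2-D projection of the data
print(Z.shape)  # (100, 2)

var = S**2 / (100 - 1)                   # variance captured per component
assert var[0] >= var[1]                  # components sorted by variance
```

This is the linear method the slide describes; kernel PCA replaces the inner products implicit in this computation with a kernel to allow non-linear application.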
  14. Plan: Introduction (Why are latent variable models useful?; Definition of a latent variable model; Graphical Model representation); PCA recap (Principal Components Analysis (PCA)); Probabilistic PCA based methods (Probabilistic PCA (PPCA); Dual PPCA; GPLVM; Examples; Practical points; Variants); Conclusion; References
  16. Probabilistic PCA (PPCA) • A probabilistic version of PCA. • The probabilistic formulation is useful for many reasons: • Allows comparison with other techniques via a likelihood measure. • Facilitates statistical testing. • Allows application of Bayesian methods. • Provides a principled way of handling missing values, via Expectation Maximization.
  17. PPCA Definition • Consider a set of centered data of N observations and D dimensions: Y = [y1, ..., yN]^T. • We assume this data has a linear relationship with some embedded latent-space data, where Y ∈ R^(N×D) and X ∈ R^(N×q). • y_n = W x_n + ε_n, where x_n is the q-dimensional latent variable associated with each observation, and W ∈ R^(D×q) is the transformation matrix relating the observed and latent spaces. • We assume a spherical Gaussian distribution for the noise, with a mean of zero and a covariance of σ²I.
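Under these assumptions the latent variables can be integrated out analytically, giving the marginal likelihood y_n ~ N(0, W W^T + σ²I), which is the likelihood measure PPCA makes available for model comparison. A minimal sketch (the dimensions, seed and parameter values are arbitrary choices for illustration):

```python
import numpy as np

rng = np.random.default_rng(2)
N, D, q = 300, 4, 2
W_true = rng.normal(size=(D, q))
sigma2 = 0.1
X = rng.normal(size=(N, q))
Y = X @ W_true.T + np.sqrt(sigma2) * rng.normal(size=(N, D))  # y_n = W x_n + eps_n

def ppca_log_likelihood(Y, W, sigma2):
    """log p(Y | W, sigma^2) = sum_n log N(y_n | 0, C), with C = W W^T + sigma^2 I."""
    N, D = Y.shape
    C = W @ W.T + sigma2 * np.eye(D)
    _, logdet = np.linalg.slogdet(C)
    quad = np.trace(Y @ np.linalg.inv(C) @ Y.T)   # sum_n y_n^T C^{-1} y_n
    return -0.5 * (N * D * np.log(2 * np.pi) + N * logdet + quad)

# The generating parameters should score better than an unrelated random model.
ll_true = ppca_log_likelihood(Y, W_true, sigma2)
ll_bad = ppca_log_likelihood(Y, rng.normal(size=(D, q)), sigma2)
assert ll_true > ll_bad
```

This marginal likelihood is what enables the comparisons, statistical tests and Bayesian treatment listed on the previous slide.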
