NUMML 2009 Numerical Mathematics in Machine Learning ICML 2009 Workshop

dc.contributor.author: Murray, Iain
dc.date.accessioned: 2011-01-31T11:56:16Z
dc.date.available: 2011-01-31T11:56:16Z
dc.date.issued: 2009
dc.identifier.uri: http://homepages.inf.ed.ac.uk/imurray2/pub/09gp_eval/
dc.identifier.uri: http://hdl.handle.net/1842/4707
dc.description.abstract: Gaussian processes (GPs) provide a flexible framework for probabilistic regression. The necessary computations involve standard matrix operations. There have been several attempts to accelerate these operations based on fast kernel matrix-vector multiplications. By focussing on the simplest GP computation, corresponding to test-time predictions in kernel ridge regression, we conclude that simple approximations based on clusterings in a kd-tree can never work well for simple regression problems. Analytical expansions can provide speedups, but current implementations are limited to the squared-exponential kernel and low-dimensional problems. We discuss future directions.
dc.language.iso: en
dc.publisher: Numerical Mathematics in Machine Learning Workshop - International Conference on Machine Learning ICML 2009
dc.title: Gaussian processes and fast matrix-vector multiplies
dc.type: Conference Paper
rps.title: NUMML 2009 Numerical Mathematics in Machine Learning ICML 2009 Workshop
dc.date.updated: 2011-01-31T11:56:16Z
dc.date.openingDate: 2009-06-18
dc.date.closingDate: 2009-06-18
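The "simplest GP computation" the abstract refers to is the posterior mean at test inputs, which has the same form as a kernel ridge regression prediction: solve (K + sigma^2 I) alpha = y once, then each test prediction is a kernel matrix-vector product k_*^T alpha. A minimal NumPy sketch of this computation, using the squared-exponential kernel mentioned in the abstract (function names and the toy data are illustrative, not from the paper):

```python
import numpy as np

def sq_exp_kernel(A, B, lengthscale=1.0):
    """Squared-exponential kernel matrix between rows of A and rows of B."""
    sq_dists = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-0.5 * sq_dists / lengthscale**2)

def gp_predict_mean(X_train, y_train, X_test, noise_var=1e-2):
    """Posterior mean at test inputs: k_*^T (K + sigma^2 I)^{-1} y."""
    K = sq_exp_kernel(X_train, X_train) + noise_var * np.eye(len(X_train))
    alpha = np.linalg.solve(K, y_train)     # (K + sigma^2 I)^{-1} y, done once
    K_star = sq_exp_kernel(X_test, X_train)
    return K_star @ alpha                   # kernel matrix-vector multiply

# Toy 1-D regression: noisy samples of sin(x)
rng = np.random.default_rng(0)
X = rng.uniform(-3.0, 3.0, size=(40, 1))
y = np.sin(X[:, 0]) + 0.05 * rng.standard_normal(40)
X_star = np.array([[0.0], [1.5]])
mean = gp_predict_mean(X, y, X_star)
```

The fast-multiplication schemes the abstract evaluates aim to approximate the `K_star @ alpha` product (and the solve) without forming the kernel matrices explicitly, e.g. via kd-tree clusterings or analytical series expansions of the kernel.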