Please use this identifier to cite or link to this item: http://hdl.handle.net/1842/3647


Files in This Item:

File: Efficient Learning and Feature Selection in High-Dimensional Regression.pdf (1.17 MB, Adobe PDF)
Title: Efficient Learning and Feature Selection in High-Dimensional Regression
Authors: Ting, Jo-Anne
D'Souza, Aaron
Vijayakumar, Sethu
Schaal, Stefan
Issue Date: 2010
Journal Title: Neural Computation
Volume: 22
Issue: 4
Page Numbers: 831-886
Publisher: MIT Press
Abstract: We present a novel algorithm for efficient learning and feature selection in high-dimensional regression problems. We arrive at this model through a modification of the standard regression model, enabling us to derive a probabilistic version of the well-known statistical regression technique of backfitting. Using the expectation-maximization algorithm, along with variational approximation methods to overcome intractability, we extend our algorithm to include automatic relevance detection of the input features. This variational Bayesian least squares (VBLS) approach retains its simplicity as a linear model, but offers a novel statistically robust black-box approach to generalized linear regression with high-dimensional inputs. It can be easily extended to nonlinear regression and classification problems. In particular, we derive the framework of sparse Bayesian learning, the relevance vector machine, with VBLS at its core, offering significant computational and robustness advantages for this class of methods. The iterative nature of VBLS makes it most suitable for real-time incremental learning, which is crucial especially in the application domain of robotics, brain-machine interfaces, and neural prosthetics, where real-time learning of models for control is needed. We evaluate our algorithm on synthetic and neurophysiological data sets, as well as on standard regression and classification benchmark data sets, comparing it with other competitive statistical approaches and demonstrating its suitability as a drop-in replacement for other generalized linear regression techniques.
Keywords: Informatics
Computer Science
Robotics
URI: http://www.mitpressjournals.org/doi/abs/10.1162/neco.2009.02-08-702
http://hdl.handle.net/1842/3647
ISSN: 0899-7667
Appears in Collections: Informatics Publications
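
The abstract describes an iterative, EM-style Bayesian regression that estimates a relevance parameter for each input feature. As a rough illustration only, the Python sketch below uses the standard ARD evidence-approximation updates (MacKay/Tipping style) for Bayesian linear regression; it is not the authors' VBLS algorithm, and all function and variable names are hypothetical. It only shows the general mechanism the abstract alludes to: features whose precision hyperparameter grows very large are effectively pruned.

# Illustrative sketch only -- NOT the VBLS algorithm from this paper.
# Standard ARD (evidence-approximation) updates for Bayesian linear
# regression, used here as a stand-in for the per-feature relevance
# detection described in the abstract. All names are hypothetical.
import numpy as np

def ard_linear_regression(X, y, n_iter=100, tol=1e-6):
    """Iteratively re-estimate weights and per-feature precisions (alpha).

    Features whose precision alpha_d grows very large contribute almost
    nothing to the fit, i.e. they are effectively pruned.
    """
    n, d = X.shape
    alpha = np.ones(d)          # per-feature weight precisions
    beta = 1.0                  # noise precision
    m = np.zeros(d)
    for _ in range(n_iter):
        # Posterior over the weights given the current hyperparameters.
        A = np.diag(alpha) + beta * (X.T @ X)
        S = np.linalg.inv(A)                   # posterior covariance
        m = beta * (S @ (X.T @ y))             # posterior mean
        # Evidence-approximation re-estimates of the hyperparameters.
        gamma = 1.0 - alpha * np.diag(S)       # effective dof per feature
        alpha_new = gamma / (m ** 2 + 1e-12)
        resid = y - X @ m
        beta = (n - gamma.sum()) / (resid @ resid + 1e-12)
        if np.max(np.abs(alpha_new - alpha)) < tol:
            alpha = alpha_new
            break
        alpha = alpha_new
    return m, alpha, beta

# Toy usage: only 3 of 20 inputs are actually relevant.
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 20))
w_true = np.zeros(20)
w_true[:3] = [2.0, -1.0, 0.5]
y = X @ w_true + 0.1 * rng.standard_normal(200)
m, alpha, beta = ard_linear_regression(X, y)
print("retained features:", np.where(alpha < 1e3)[0])   # small alpha => kept

On this toy problem the three informative columns keep small precisions while the rest are driven to very large alpha values and effectively dropped. This is the qualitative behaviour the abstract refers to as relevance detection, which, per the abstract, VBLS aims to deliver with updates cheap and robust enough for high-dimensional, real-time settings.
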

Items in ERA are protected by copyright, with all rights reserved, unless otherwise indicated.

 
