Publications of Freek Stulp


Many regression algorithms, one unified model - A review
Freek Stulp and Olivier Sigaud. Many regression algorithms, one unified model - A review. Neural Networks, 2015.
Download: [PDF] (987.2 kB)
Abstract
Regression is the process of learning relationships between inputs and continuous outputs from example data, which enables predictions for novel inputs. The history of regression is closely related to the history of artificial neural networks since the seminal work of Rosenblatt (1958). The aims of this paper are to provide an overview of many regression algorithms, and to demonstrate how the function representations whose parameters they regress fall into two classes: a weighted sum of basis functions, or a mixture of linear models. Furthermore, we show that the former is a special case of the latter. Our ambition is thus to provide a deep understanding of the relationship between these algorithms, which, despite being derived from very different principles, use a function representation that can be captured within one unified model. Finally, step-by-step derivations of the algorithms from first principles and visualizations of their inner workings allow this article to be used as a tutorial for those new to regression.
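
The abstract's central claim, that a weighted sum of basis functions is a special case of a mixture of linear models, can be checked numerically. The sketch below is illustrative only and is not taken from the paper: the Gaussian basis functions, the function names, and the toy parameter values are all assumptions. It evaluates both representation classes and verifies that setting every local model's slope to zero makes the mixture collapse to the weighted basis sum.

    import numpy as np

    def gaussian_basis(x, centers, width):
        # Basis activations phi_b(x) for each center (assumed Gaussian).
        return np.exp(-((x[:, None] - centers[None, :]) ** 2) / (2.0 * width ** 2))

    def weighted_basis_sum(x, centers, width, weights):
        # Class 1: f(x) = sum_b w_b * phi_b(x)
        phi = gaussian_basis(x, centers, width)
        return phi @ weights

    def mixture_of_linear_models(x, centers, width, slopes, offsets):
        # Class 2: f(x) = sum_b phi_b(x) * (a_b * x + o_b)
        phi = gaussian_basis(x, centers, width)
        local = slopes[None, :] * x[:, None] + offsets[None, :]
        return np.sum(phi * local, axis=1)

    # Toy setup (assumed values, for illustration only).
    centers = np.linspace(0.0, 1.0, 5)
    width = 0.2
    weights = np.array([0.3, -0.1, 0.8, 0.2, -0.5])
    x = np.linspace(0.0, 1.0, 50)

    # With all slopes a_b = 0 and the offsets o_b playing the role of the
    # weights w_b, the mixture reduces to the weighted basis sum.
    zero_slopes = np.zeros_like(weights)
    y_basis = weighted_basis_sum(x, centers, width, weights)
    y_mixture = mixture_of_linear_models(x, centers, width, zero_slopes, weights)
    assert np.allclose(y_basis, y_mixture)

The offsets of the degenerate (zero-slope) local models act as the basis-function weights, which mirrors, in miniature, the reduction the paper formalizes within its unified model.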
BibTeX
@Article{stulp15many,
  title                    = {Many regression algorithms, one unified model -- A review},
  author                   = {Freek Stulp and Olivier Sigaud},
  journal                  = {Neural Networks},
  year                     = {2015},
  bib2html_pubtype         = {Journal},
  bib2html_rescat          = {Imitation Learning and Regression},
  url                      = {http://www.sciencedirect.com/science/article/pii/S0893608015001185}
}

This material is presented to ensure timely dissemination of scholarly and technical work. Copyright and all rights therein are retained by authors or by other copyright holders. All persons copying this information are expected to adhere to the terms and constraints invoked by each author's copyright.


Generated by bib2html.pl (written by Patrick Riley) on Mon Jul 20, 2015 21:50:11