TTIC Machine Learning Seminar

Andreas Argyriou

Nov 16, 2011, 5:01:28 PM
to Machine Learning Seminar 2011
Dear all,

There will be a machine learning seminar this Wednesday, Nov. 23, from
11 am to noon in room 526 at TTIC.
Nicholas Polson, from the Booth School of Business, will present the
following work.

Title: Sparse Bayes estimation in non-Gaussian models via data
augmentation

In this paper we provide a data-augmentation scheme that unifies many
common sparse Bayes estimators into a single class. This leads to
simple iterative algorithms for estimating the posterior mode under
arbitrary combinations of likelihoods and priors within the class. The
class itself is quite large: for example, it includes quantile
regression, support vector machines, and logistic and multinomial
logistic regression, along with the usual ridge regression, lasso,
bridge estimators, and regression with heavy-tailed errors. To arrive
at this unified framework, we represent a wide class of objective
functions as variance-mean mixtures of Gaussians involving both the
likelihood and penalty functions. This generalizes existing theory
based solely on variance mixtures for the penalty function, and allows
the theory of conditionally normal linear models to be brought to bear
on a much wider class of models. We focus on two possible choices of
the mixing measures: the generalized inverse-Gaussian and Polya
distributions, leading to the hyperbolic and Z distributions,
respectively. We exploit this conditional normality to find sparse,
regularized estimates using tilted iteratively re-weighted least
squares (TIRLS). Finally, we characterize the conditional moments of
the latent variances for any model in our proposed class, and show the
relationship between our method and two recent algorithms: LQA (local
quadratic approximation) and LLA (local linear approximation).

http://arxiv.org/abs/1103.5407
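For those who want a concrete feel for the ideas before the talk, here is a
minimal illustrative sketch. It is not the paper's TIRLS algorithm; it
implements the closely related LQA (local quadratic approximation) iteration
mentioned at the end of the abstract, for the simplest member of the class: a
Gaussian likelihood with a lasso penalty. Each step solves a re-weighted
ridge problem, which is the "conditional normality" idea in its most basic
form. The function name, parameters, and data below are hypothetical
examples, not taken from the paper.

import numpy as np

def lasso_mode_lqa(X, y, lam, n_iter=100, eps=1e-8):
    """Posterior mode for a Gaussian likelihood with a lasso
    (double-exponential) prior, via local quadratic approximation (LQA).

    At each iteration |b_j| is replaced by b_j**2 / (2*|b_j_old|), so the
    update reduces to a re-weighted ridge (conditionally Gaussian) solve.
    """
    beta = np.linalg.lstsq(X, y, rcond=None)[0]   # least-squares start
    XtX, Xty = X.T @ X, X.T @ y
    for _ in range(n_iter):
        # penalty weights from the current iterate; eps guards division by zero
        w = 1.0 / np.maximum(np.abs(beta), eps)
        beta_new = np.linalg.solve(XtX + lam * np.diag(w), Xty)
        if np.max(np.abs(beta_new - beta)) < 1e-10:
            beta = beta_new
            break
        beta = beta_new
    return beta

# Small synthetic example: 3 nonzero coefficients out of 10
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))
beta_true = np.zeros(10)
beta_true[:3] = [2.0, -1.5, 1.0]
y = X @ beta_true + 0.5 * rng.normal(size=200)
print(np.round(lasso_mode_lqa(X, y, lam=5.0), 3))

The paper's contribution is to extend this kind of conditionally normal
update well beyond the Gaussian-likelihood/lasso case, to the full class of
likelihoods and penalties described in the abstract.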

Andreas