
Lasso

Model

For a matrix \(\mathsf{A \in \mathbb{R}^{m\times n}}\), a vector \(\mathsf{b \in \mathbb{R}^m}\), and a scalar \(\mathsf{\lambda > 0}\), the model is

\[ \mathsf{\underset{x \in \mathbb{R}^n}{min} \ \|Ax - b\|_2^2 + \lambda \|x\|_1. } \]
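For intuition, consider the special case \(\mathsf{A = I}\) (an assumption made purely for this worked example). The problem then separates across coordinates, and each coordinate of the minimizer is given by soft-thresholding:

\[ \mathsf{ x_i^\star = sign(b_i)\, max\!\left( |b_i| - \tfrac{\lambda}{2},\, 0 \right), \qquad i = 1, \dots, n. } \]

The threshold is \(\mathsf{\lambda/2}\) rather than \(\mathsf{\lambda}\) because the quadratic term above carries no factor of \(\mathsf{1/2}\); coordinates with \(\mathsf{|b_i| \le \lambda/2}\) are set exactly to zero, which is the source of the sparsity-inducing behaviour discussed below.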

Overview

The Least Absolute Shrinkage and Selection Operator (Lasso) is widely used in statistics and machine learning for variable selection and regularization. Introduced by Tibshirani1, it is primarily applied to linear regression, but its principles extend to other models as well. The key idea is to add a penalty term to the least-squares objective equal to the sum of the absolute values of the coefficients, i.e. the \(\mathsf{\ell_1}\) norm \(\mathsf{\|x\|_1}\), which shrinks the coefficients and drives many of them exactly to zero; this is what enables variable selection.
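As a minimal sketch of what this objective looks like in code, the penalized least-squares value can be evaluated as below. The dense row-major matrix layout and the name lasso_objective are assumptions made for this illustration, not part of any library described on this page.

#include <cmath>
#include <cstddef>
#include <iostream>
#include <vector>

// Evaluate ||Ax - b||_2^2 + lambda * ||x||_1 for a dense row-major matrix A.
double lasso_objective(const std::vector<std::vector<double>>& A,
                       const std::vector<double>& x,
                       const std::vector<double>& b, double lambda) {
  double fit = 0.0;
  for (std::size_t i = 0; i < A.size(); ++i) {
    double r = -b[i];                             // residual (Ax - b)_i
    for (std::size_t j = 0; j < x.size(); ++j) r += A[i][j] * x[j];
    fit += r * r;                                 // squared l2 norm of the residual
  }
  double penalty = 0.0;                           // l1 norm of the coefficients
  for (double xj : x) penalty += std::abs(xj);
  return fit + lambda * penalty;
}

int main() {
  std::vector<std::vector<double>> A = {{1.0, 2.0}, {3.0, 4.0}};
  std::vector<double> b = {1.0, 2.0};
  std::vector<double> x = {0.5, 0.0};
  std::cout << lasso_objective(A, x, b, 0.1) << std::endl;
  return 0;
}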


Property
  • Convex
  • Strongly Convex 🟡
  • Unbiased Estimator2
  • Feasible Estimator 🟡
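
The convexity entries can be traced back to the curvature of the smooth part of the objective, whose Hessian is constant:

\[ \mathsf{ \nabla^2 \, \|Ax - b\|_2^2 = 2A^\top A \succeq 0. } \]

This matrix is always positive semidefinite, so the problem is convex for every \(\mathsf{A}\); it is positive definite, making the objective strongly convex, only when \(\mathsf{A}\) has full column rank, which in particular never happens when \(\mathsf{m < n}\). The \(\mathsf{\ell_1}\) penalty is convex but adds no curvature, so it does not change this.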


Code

#include <iostream>

int main(void) {
  std::cout << "Hello world!" << std::endl;
  return 0;
}
#include <stdio.h>

int main(void) {
  printf("Hello world!\n");
  return 0;
}
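
One standard way to solve the model above is proximal gradient descent (ISTA), which alternates a gradient step on the quadratic term with the soft-thresholding proximal operator of the \(\mathsf{\ell_1}\) term. Below is a minimal, self-contained sketch; the dense-matrix representation, step size, iteration count, and function names are assumptions made for illustration, not an API provided here.

#include <cstddef>
#include <iostream>
#include <vector>

using Matrix = std::vector<std::vector<double>>;  // row-major dense matrix
using Vector = std::vector<double>;

// Soft-thresholding: the proximal operator of t * ||.||_1.
static double soft_threshold(double v, double t) {
  if (v > t)  return v - t;
  if (v < -t) return v + t;
  return 0.0;
}

// Proximal gradient (ISTA) for  min_x ||Ax - b||_2^2 + lambda * ||x||_1:
//   x_{k+1} = soft_threshold(x_k - step * 2 A^T (A x_k - b), step * lambda).
// The step size should satisfy step <= 1 / (2 * ||A||_2^2) for convergence.
Vector lasso_ista(const Matrix& A, const Vector& b, double lambda,
                  double step, std::size_t iters) {
  const std::size_t m = A.size();
  const std::size_t n = A.empty() ? 0 : A[0].size();
  Vector x(n, 0.0);
  for (std::size_t k = 0; k < iters; ++k) {
    Vector r(m, 0.0);                                  // residual r = A x - b
    for (std::size_t i = 0; i < m; ++i) {
      for (std::size_t j = 0; j < n; ++j) r[i] += A[i][j] * x[j];
      r[i] -= b[i];
    }
    Vector g(n, 0.0);                                  // gradient g = 2 A^T r
    for (std::size_t j = 0; j < n; ++j)
      for (std::size_t i = 0; i < m; ++i) g[j] += 2.0 * A[i][j] * r[i];
    for (std::size_t j = 0; j < n; ++j)                // gradient step + prox
      x[j] = soft_threshold(x[j] - step * g[j], step * lambda);
  }
  return x;
}

int main() {
  // Tiny 3x2 instance; the exact numbers are arbitrary.
  Matrix A = {{1.0, 0.0}, {0.0, 1.0}, {1.0, 1.0}};
  Vector b = {1.0, 0.1, 1.1};
  Vector x = lasso_ista(A, b, /*lambda=*/0.5, /*step=*/0.1, /*iters=*/500);
  for (double xi : x) std::cout << xi << " ";
  std::cout << std::endl;
  return 0;
}

The step size is kept fixed for simplicity; in practice it is chosen from (or backtracked against) the Lipschitz constant \(\mathsf{2\|A\|_2^2}\) of the gradient, and accelerated variants such as FISTA converge faster.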

Applications

  • Statistics...
  • Compressed sensing...

See Also


  1. Tibshirani. "Regression shrinkage and selection via the lasso." Journal of the Royal Statistical Society, Series B: Statistical Methodology, 1996.

  2. Fan, Li. "Variable selection via nonconcave penalized likelihood and its oracle properties." Journal of the American Statistical Association, 2001.


Last update: October 12, 2023