CAR-LASSO
https://arxiv.org/abs/2012.08397
Abstract
Statistical models are needed to describe interactions among microbes. The authors propose a chain graph model with two sets of nodes (predictors and responses) whose solution yields a graph with edges representing conditional dependence. The model uses the Bayesian LASSO, so the solution is sparse.
[R package](https://github.com/YunyiShen/CAR-LASSO)
Introduction
What was lacking in microbiome analysis: statistical tools to simultaneously infer connections among microbes and their direct reactions to different environmental factors in a unified framework.
For the graphical model:
Nodes represent variables.
Edges represent conditional dependence between nodes; the absence of an edge represents conditional independence.
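A quick numpy sanity check of this point (made-up 3-node precision matrix): a zero off-diagonal entry of the precision matrix means zero partial correlation (conditional independence) even when the marginal correlation is nonzero.

```python
import numpy as np

# A 3-node Gaussian graphical model with structure 1 -- 2 -- 3 (no direct 1-3 edge).
# Zero precision entries <=> conditional independence.
Omega = np.array([[ 2.0, -0.8,  0.0],
                  [-0.8,  2.0, -0.8],
                  [ 0.0, -0.8,  2.0]])  # precision matrix; omega_13 = 0

Sigma = np.linalg.inv(Omega)  # covariance matrix

# The marginal correlation between nodes 1 and 3 is NOT zero...
marg_corr_13 = Sigma[0, 2] / np.sqrt(Sigma[0, 0] * Sigma[2, 2])

# ...but the partial correlation (conditioning on node 2) is zero:
part_corr_13 = -Omega[0, 2] / np.sqrt(Omega[0, 0] * Omega[2, 2])

print(marg_corr_13)  # nonzero: marginally dependent
print(part_corr_13)  # zero: conditionally independent given node 2
```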
Intuitively, a multiresponse linear regression with a LASSO prior on the regression coefficients, combined with a graphical LASSO prior on the precision matrix, can provide sparse regression coefficients between responses and predictors; in addition, the sparse graphical model estimates a sparse graphical structure among the responses. However, in that formulation the predictor coefficients represent marginal effects rather than conditional effects.
Goals:
Estimate the graphical structure between predictors (environment) and responses (microbes).
Estimate the graphical structure among responses, while keeping the conditional interpretation of both sets of parameters.
CAR-LASSO simultaneously estimates the conditional effects of a set of predictors on the responses and the connections among the responses. The model is represented by a chain graph with two sets of nodes: predictors and responses. Directed edges between predictors and responses represent conditional links, and undirected edges among responses represent conditional dependencies.
Sparse solutions via the Bayesian LASSO.
With a fixed penalty, the posterior is log-concave (hence unimodal).
The adaptive extension allows different shrinkage on different edges, so edge-specific knowledge can be incorporated into the model, using a Normal model to build hierarchical structures.
Handles both small and large data through a Gibbs sampling algorithm.
Methods
Let $Y_i \in \mathbb{R}^k$ be the multivariate response, with entries $Y_{il}$ for observations $i = 1, \dots, n$.
Let $X_i$ be the row vector of $p$ predictors for observation $i$; assume the design matrix is standardized so that each column has mean 0 and the same standard deviation.
Let $Y_i \mid X_i$ follow a normal distribution with mean vector $\Omega^{-1}(B^T X_i^T + \mu)$ and precision matrix $\Omega$ (positive definite), where $B$ corresponds to the regression coefficients connecting the responses to the predictors and $\mu$ corresponds to the intercept. The authors use the transpose because samples are encoded as row vectors in the design matrix, while by convention multivariate normal samples are column vectors.
The likelihood function of the model (up to constants) is:

$$p(Y \mid X, B, \mu, \Omega) \propto |\Omega|^{n/2} \exp\left(-\frac{1}{2}\sum_{i=1}^{n}\left(Y_i - \Omega^{-1}(B^T X_i^T + \mu)\right)^T \Omega \left(Y_i - \Omega^{-1}(B^T X_i^T + \mu)\right)\right)$$
$\Omega$ encodes the conditional dependence structure among the responses: if $\omega_{ll'} = 0$, then $Y_{il}$ and $Y_{il'}$ are conditionally independent given the rest; the off-diagonal entries encode the conditional dependence between pairs of responses. In the CAR parameterization the coefficients $B$ have a conditional interpretation, whereas the regression coefficients in ordinary multiresponse linear regression are marginal effects.
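As I understand the CAR parameterization, $E[Y \mid x] = \Omega^{-1}(B^T x^T + \mu)$, so the induced marginal effect of $x$ on $E[Y]$ is $\Omega^{-1} B^T$. A toy numpy sketch (all numbers made up) of how a sparse conditional effect still produces dense marginal effects:

```python
import numpy as np

# Sketch of the CAR-style mean: E[Y|x] = Omega^{-1} (B^T x + mu).
# B holds CONDITIONAL effects; the induced MARGINAL effect matrix is Omega^{-1} B^T.
Omega = np.array([[ 2.0, -0.8],
                  [-0.8,  2.0]])   # responses 1 and 2 are conditionally linked
B = np.array([[1.0, 0.0]])        # one predictor; conditionally it affects only response 1
mu = np.zeros(2)

marginal_effects = np.linalg.inv(Omega) @ B.T   # shape (2, 1)

print(marginal_effects.ravel())
# Both entries are nonzero: the predictor reaches response 2 *through* the
# response network, even though its conditional effect on response 2 is zero.
```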
Prior Specification:
Assume a Laplace prior on the entries of $B$ and a graphical LASSO prior on $\Omega$. Schematically (up to normalizing constants):

$$p(B \mid \lambda_\beta) \propto \prod_{j,l} \exp\left(-\lambda_\beta |\beta_{jl}|\right)$$

$$p(\Omega \mid \lambda_\Omega) \propto \prod_{l < l'} \exp\left(-\lambda_\Omega |\omega_{ll'}|\right) \prod_{l} \exp\left(-\frac{\lambda_\Omega}{2}\,\omega_{ll}\right) 1_{\Omega \succ 0}$$
Note: $\Omega$ must be positive definite.
Algorithm

Hyperparameters: the two shrinkage parameters, $\lambda_\beta$ (for $B$) and $\lambda_\Omega$ (for $\Omega$), must be determined.
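The paper derives a Gibbs sampler for the full chain-graph model; its conditionals are not reproduced here. As a sketch of the underlying machinery, here is the standard Park–Casella Gibbs sampler for the univariate Bayesian LASSO on simulated data with a fixed penalty $\lambda$ (an illustration of the same scale-mixture trick, not the CAR-LASSO sampler itself):

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate a small sparse regression problem.
n, p = 100, 5
X = rng.standard_normal((n, p))
beta_true = np.array([2.0, 0.0, 0.0, -1.5, 0.0])
y = X @ beta_true + rng.standard_normal(n)

lam = 1.0            # fixed LASSO penalty (hyperparameter)
sigma2 = 1.0
tau2 = np.ones(p)    # latent scales from the Laplace prior's scale-mixture form

samples = []
for it in range(2000):
    # beta | rest ~ Normal (conjugate under the scale-mixture representation)
    A = X.T @ X + np.diag(1.0 / tau2)
    A_inv = np.linalg.inv(A)
    beta = rng.multivariate_normal(A_inv @ X.T @ y, sigma2 * A_inv)

    # sigma^2 | rest ~ Inverse-Gamma
    resid = y - X @ beta
    shape = (n - 1) / 2 + p / 2
    rate = resid @ resid / 2 + beta @ (beta / tau2) / 2
    sigma2 = 1.0 / rng.gamma(shape, 1.0 / rate)

    # 1/tau_j^2 | rest ~ Inverse-Gaussian (Wald)
    mu_ig = np.sqrt(lam**2 * sigma2 / beta**2)
    tau2 = 1.0 / rng.wald(mu_ig, lam**2)

    if it >= 500:  # discard burn-in
        samples.append(beta)

beta_hat = np.mean(samples, axis=0)
print(np.round(beta_hat, 2))  # shrunk toward 0 on the truly-zero coefficients
```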
Learning:
Because the prior is continuous, the posterior probability that any parameter is exactly zero is zero, so sparsity must be decided after the fact.
Amount of shrinkage: $\kappa = \hat{\theta}^{\text{LASSO}} / \hat{\theta}^{0}$, where $\hat{\theta}^{\text{LASSO}}$ is the estimate of the parameter under the LASSO prior and the denominator $\hat{\theta}^{0}$ is the posterior mean of the parameter under a non-shrinkage prior. Use a small $\kappa$ (heavy shrinkage) to decide that the parameter is zero.
Extensions
Adaptive LASSO: incorporates prior knowledge of independence among certain nodes through edge-specific penalties.
Examples

CAR-LASSO's conditional effects can be more informative than marginal effects.
Discussion
Conditional dependence is important: the model distinguishes between marginal effects and conditional effects. The Bayesian formulation allows easier extension to different types of responses. Learning graphical structure remains hard, in part because of confounding within the structure itself.