sglOptim: Sparse group lasso generic optimizer (Martin Vincent)
We propose a two-stage analytic strategy combining functional principal component analysis (FPCA) and sparse-group LASSO (SGL) to characterize associations between biomarkers and 30-day mortality rates. Unlike prior reports, our proposed approach leverages: 1) time-varying biomarker trajectories, 2) m...
Integrative analysis based on the Sparse Group Lasso penalty. Abstract: Big data are often characterized by high dimensionality, sparsity, and heterogeneity of sources. How to reasonably and effectively mine and analyze the association information and the differences among such datasets, while also reducing the dimension of and denoising the data features, is a question that merits careful thought and study. Integrative analysis differs from previous single-dataset analysis and meta-analysis: it joins multiple independent datasets together and analyzes them simultaneously, so that, directly from the original ...
R implementation of our algorithm in the package SGL. This paper is a continuation of Friedman et al., a brief note on the criterion. This criterion was also discussed in Zhou et al. (2010). They applied it to SNP data for linear and logistic regression with ...
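As a concrete illustration of how such a package is typically called, here is a minimal sketch, assuming the interface of the CRAN package SGL (SGL(data, index, type, alpha, nlam)); the simulated data, group index, and tuning values below are purely illustrative and not part of the abstract above.

library(SGL)

set.seed(1)
n <- 100; p <- 20
x <- matrix(rnorm(n * p), n, p)
beta <- c(rep(2, 5), rep(0, p - 5))      # only the first group carries signal
y <- drop(x %*% beta + rnorm(n))

index <- rep(1:4, each = 5)              # group membership of the 20 covariates
fit <- SGL(data = list(x = x, y = y), index = index,
           type = "linear", alpha = 0.95, nlam = 20)

dim(fit$beta)                            # coefficient estimates along the lambda path

Setting type = "logit" would give the logistic-regression variant mentioned above; alpha controls the mix between the lasso and group-lasso parts of the penalty.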
This suggests that, whereas EN or SGL may achieve a similar JI to StablEN or StablSGL, they do so at the expense of selecting more uninformative features. Other SRMs offer advantages beyond adapting to different correlation structures. For example, AL, an extension of Lasso that demonstrates ...
Several machine learning methods, including sparsity-promoting regularization methods (SRMs), such as Lasso7, Elastic Net (EN)8, Adaptive Lasso (AL)9 and sparse group Lasso (SGL)10, provide predictive modeling frameworks adapted to p ≫ n omic datasets. Furthermore, data fusion methods, such ...
For problems in which covariates are grouped and a sparse structure is desired, both at the group and within-group levels, the sparse-group lasso (SGL) regularization method has proved to be very efficient. Under its simplest formulation, the solution provided by this method depends on two weight ...
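To make the role of these two weights concrete, the small helper below (a hypothetical illustration, not the cited method's code) evaluates the sparse-group lasso penalty for a given pair of weights: one multiplying the l1 term on individual coefficients and one multiplying the group-wise l2 term.

sgl_penalty <- function(beta, groups, lambda1, lambda2) {
  l1_term     <- lambda1 * sum(abs(beta))                        # within-group (individual) sparsity
  group_norms <- tapply(beta, groups, function(b) sqrt(sum(b^2)))
  sizes       <- table(groups)                                   # group sizes p_l, same ordering as tapply
  l2_term     <- lambda2 * sum(sqrt(sizes) * group_norms)        # group-level sparsity
  l1_term + l2_term
}

beta   <- c(1.5, -0.5, 0, 0, 0, 2)
groups <- c(1, 1, 1, 2, 2, 2)
sgl_penalty(beta, groups, lambda1 = 0.1, lambda2 = 0.2)

Increasing lambda1 relative to lambda2 pushes the fit toward a pure lasso (individual selection), while the opposite pushes it toward a pure group lasso (whole groups in or out).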
Sparse Group LASSO (SGL) is a regularized model for high-dimensional linear regression problems with grouped covariates. SGL applies ℓ1 and ℓ2 penalties on the individual predictors and group predictors, respectively, to guarantee sparse effects both on the inter-group and within-...
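For reference, in the widely used formulation of Simon et al. (with groups l = 1, ..., m of sizes p_l, mixing parameter α ∈ [0, 1], and overall penalty level λ), the linear-regression criterion combining these two penalties can be written as

  \min_{\beta}\; \frac{1}{2n}\Big\| y - \sum_{l=1}^{m} X^{(l)}\beta^{(l)} \Big\|_2^2
    + (1-\alpha)\,\lambda \sum_{l=1}^{m} \sqrt{p_l}\,\big\|\beta^{(l)}\big\|_2
    + \alpha\,\lambda\,\|\beta\|_1 .

Setting α = 1 recovers the lasso and α = 0 the group lasso; intermediate values yield the inter-group and within-group sparsity described above.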
sparse group lasso (SGL); multiclass sparse group lasso (MSGL). Decision trees are examples of easily interpretable models whose predictive accuracy is normally low. In comparison, decision tree ensembles (DTEs) such as random forest (RF) exhibit high predictive accuracy while being regarded as black-...
In this study, we propose the adaptive sparse group Lasso (adSGL) method, which combines the adaptive Lasso and adaptive group Lasso (GL) to achieve bi-level selection. It can be viewed as an improved version of sparse group Lasso (SGL) and uses data-dependent weights to improve selection...
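A rough sketch of the data-dependent weighting idea (an illustrative assumption, not the authors' adSGL implementation): form adaptive weights from a pilot estimate, so that coefficients and groups that look strong in the pilot fit are penalized less in the final bi-level selection.

adaptive_weights <- function(beta_init, groups, gamma = 1, eps = 1e-8) {
  w_coef  <- 1 / (abs(beta_init)^gamma + eps)                    # per-coefficient (adaptive lasso) weights
  g_norms <- tapply(beta_init, groups, function(b) sqrt(sum(b^2)))
  w_group <- 1 / (g_norms^gamma + eps)                           # per-group (adaptive group lasso) weights
  list(coefficient = w_coef, group = w_group)
}

set.seed(2)
n <- 80; p <- 12
x <- matrix(rnorm(n * p), n, p)
y <- drop(x %*% c(rep(1.5, 3), rep(0, p - 3)) + rnorm(n))
groups <- rep(1:3, each = 4)

beta_init <- coef(lm(y ~ x - 1))          # pilot estimate (ordinary least squares here)
w <- adaptive_weights(beta_init, groups)
str(w)

These weights would then rescale the two SGL penalty terms, so that the selection adapts to the data rather than penalizing every coefficient and group equally.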