Yuan, L.; Liu, J.; and Ye, J. 2011. Efficient methods for overlapping group lasso. In Advances in Neural Information Processing Systems 24 (NIPS 2011), 352-360.
We consider two widely adopted types of such penalties as our motivating examples: 1) the overlapping-group-lasso penalty, based on the ℓ1/ℓ2 mixed norm, and 2) the graph-guided-fusion penalty. For both types of penalties, due to their non-separability, developing an efficient optimization method ...
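For concreteness, here is a minimal Python sketch (ours, not from the quoted work) of the ℓ1/ℓ2 mixed-norm penalty over possibly overlapping groups; the group index sets and weights are illustrative:

```python
import numpy as np

def overlapping_group_lasso_penalty(beta, groups, weights=None):
    """l1/l2 mixed-norm penalty: sum_g w_g * ||beta_g||_2.

    groups: list of index lists; groups may share indices (overlap),
    which is exactly what makes the penalty non-separable.
    """
    if weights is None:
        weights = [1.0] * len(groups)
    return sum(w * np.linalg.norm(beta[np.asarray(g)])
               for g, w in zip(groups, weights))

# Example: two groups overlapping at coordinate 2.
beta = np.array([1.0, -2.0, 0.5, 0.0, 3.0])
groups = [[0, 1, 2], [2, 3, 4]]
print(overlapping_group_lasso_penalty(beta, groups))
```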
least-squares methods. Among the many regularized methods, the Lasso and MCP are evaluated for inducing sparse solutions. To incorporate the shared genetic effects across traits and thereby improve PA, we propose a cross-trait penalty that is a smooth function of pairwise genetic effects. ...
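The cross-trait penalty is specific to the cited work and is not reproduced here, but the two sparsity-inducing penalties it is compared against are standard. A hedged sketch of both, with illustrative parameter names:

```python
import numpy as np

def lasso_penalty(beta, lam):
    # l1 penalty: lam * sum_j |beta_j|
    return lam * np.sum(np.abs(beta))

def mcp_penalty(beta, lam, gamma):
    """Minimax concave penalty (MCP), applied coordinate-wise.

    For t = |beta_j| (with gamma > 1):
        lam*t - t^2/(2*gamma)   if t <= gamma*lam,
        gamma*lam^2 / 2         otherwise.
    Unlike the Lasso, the penalty flattens out for large |beta_j|,
    reducing the bias on large coefficients.
    """
    t = np.abs(beta)
    per_coord = np.where(t <= gamma * lam,
                         lam * t - t**2 / (2.0 * gamma),
                         0.5 * gamma * lam**2)
    return per_coord.sum()
```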
In this sense, observers seemed to lack regularization, an umbrella term for all processes that prevent overfitting by introducing additional information, for example by preferring as few nonzero parameters as possible during fitting. Essentially, regularization methods improve generalization ...
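As a standard illustration of that idea (not specific to the study quoted above), a sparsity-inducing regularizer adds a penalty on nonzero parameters to the data-fit term:

```latex
\min_{\beta}\;
\underbrace{\lVert y - X\beta \rVert_2^2}_{\text{data fit}}
\;+\;
\underbrace{\lambda \lVert \beta \rVert_1}_{\text{regularizer}},
\qquad \lambda > 0 .
```

Larger λ drives more coefficients to exactly zero, trading in-sample fit for generalization.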
Our algorithm retains the simplicity of contemporary methods without any restrictive assumptions on the smoothness of the loss function. We apply our proposed method to solve two challenging problems: overlapping group lasso and convex regression with sharp partitions (CRISP). Numerical experiments show that our ...
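As background for why the overlapping case is challenging: for non-overlapping groups the proximal operator of the group-lasso penalty has the closed-form block soft-thresholding below, but once groups overlap this closed form no longer applies, which is what motivates specialized solvers. A minimal sketch under those assumptions; the function names are ours:

```python
import numpy as np

def block_soft_threshold(v, tau):
    """Prox of tau*||.||_2 on one group: shrink the whole block toward zero."""
    nrm = np.linalg.norm(v)
    if nrm <= tau:
        return np.zeros_like(v)
    return (1.0 - tau / nrm) * v

def group_lasso_prox(beta, groups, tau):
    """Prox of the group-lasso penalty, valid for NON-overlapping groups only."""
    out = beta.copy()
    for g in groups:
        g = np.asarray(g)
        out[g] = block_soft_threshold(beta[g], tau)
    return out
```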
Argyriou, Micchelli, et al. 2011. Efficient first order methods for linear composite regularizers. arXiv:1104.1436v1. Citation context: "However, such a general approach does not fully exploit the structure of the problem and will not scale well to large-scale instances...."
3.2. State-of-the-Art Methods for Large-Scale, Non-Smooth MaxEnt Models
State-of-the-art methods for computing solutions to large-scale, non-smooth MaxEnt models are based on coordinate descent algorithms [61,62,63,64] and first-order optimization algorithms such as forward–backward splitting...
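A minimal sketch of forward–backward splitting (proximal gradient) for a generic composite objective f(x) + λ‖x‖₁; the ℓ1 regularizer, the toy quadratic f, and the step-size rule are illustrative stand-ins, not the MaxEnt-specific formulation of the surveyed methods:

```python
import numpy as np

def soft_threshold(x, tau):
    # Prox of tau*||.||_1 (coordinate-wise shrinkage).
    return np.sign(x) * np.maximum(np.abs(x) - tau, 0.0)

def forward_backward(grad_f, x0, step, lam, n_iter=500):
    """Iterate x_{k+1} = prox_{step*lam*||.||_1}(x_k - step * grad_f(x_k))."""
    x = x0.copy()
    for _ in range(n_iter):
        x = soft_threshold(x - step * grad_f(x), step * lam)
    return x

# Toy smooth term: f(x) = 0.5*||Ax - b||^2, so grad_f(x) = A.T @ (A @ x - b).
rng = np.random.default_rng(0)
A, b = rng.normal(size=(50, 20)), rng.normal(size=50)
grad_f = lambda x: A.T @ (A @ x - b)
step = 1.0 / np.linalg.norm(A, 2) ** 2   # 1/L, L = Lipschitz constant of grad_f
x_hat = forward_backward(grad_f, np.zeros(20), step, lam=0.1)
```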
A direct application of conventional 2SLS under an EPS design is inappropriate, as EPS severely violates the random-sampling assumption, an essential bias-reducing premise of standard statistical methods. This constraint limits further exploration of the causal effects between risk factors and health ...
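For reference, a textbook two-stage least squares (2SLS) sketch under the random-sampling setting the passage alludes to; variable names are ours, and this is exactly the estimator that cannot be applied directly under an EPS design:

```python
import numpy as np

def two_stage_least_squares(y, X, Z):
    """Textbook 2SLS: regress X on instruments Z, then y on the fitted X.

    y: (n,) outcome; X: (n, p) endogenous regressors;
    Z: (n, q) instruments with q >= p.
    Assumes a random sample -- the premise EPS designs violate.
    """
    # Stage 1: project X onto the column space of the instruments Z.
    X_hat = Z @ np.linalg.lstsq(Z, X, rcond=None)[0]
    # Stage 2: ordinary least squares of y on the fitted values.
    beta, *_ = np.linalg.lstsq(X_hat, y, rcond=None)
    return beta
```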