Generally speaking, a margin-based loss function pursues two goals. First, it tries to place all sample points on the correct side of the separating hyperplane, i.e., to ensure that every training sample satisfies the model's discrimination requirement for positive and negative samples. Second, it seeks to maximize the gap between the hyperplane and the nearest positive and negative samples, so as to improve the model's generalization ability. Commonly used margin-based loss functions include the Hinge Loss and the Squared Hinge...
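The two losses named above can be sketched directly from their standard definitions (a minimal illustration, not code from any of the cited papers): with labels in {-1, +1} and decision scores f(x), the hinge loss is max(0, 1 - y·f(x)) and the squared hinge loss is its square.

```python
import numpy as np

def hinge_loss(scores, labels):
    """Hinge loss: max(0, 1 - y * f(x)), with labels in {-1, +1}."""
    return np.maximum(0.0, 1.0 - labels * scores)

def squared_hinge_loss(scores, labels):
    """Squared hinge loss: max(0, 1 - y * f(x)) ** 2."""
    return np.maximum(0.0, 1.0 - labels * scores) ** 2

# Points classified correctly with margin >= 1 incur zero loss;
# points inside the margin or misclassified are penalized.
scores = np.array([2.0, 0.5, -0.3])
labels = np.array([1.0, 1.0, 1.0])
print(hinge_loss(scores, labels))
print(squared_hinge_loss(scores, labels))
```

The squared variant penalizes large violations more heavily and is differentiable at the margin boundary, which is why it is sometimes preferred for gradient-based training.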
We introduce the leaky hockey stick loss (LHS loss), the first negatively divergent margin-based loss function. We prove that the LHS loss is classification-calibrated. When the hinge loss is replaced with the LHS loss in the ERM approach for deriving the kernel support vector machine (SVM),...
To address the long-tailed multi-label problem in document-level relation extraction, this paper proposes an adaptive margin loss function based on the Hinge Loss. The idea is to learn a separator class between the positive and negative classes for each entity pair. The adaptive margin loss is triggered when a sample is misclassified or falls near the boundary of the separator class. Optimizing this loss enlarges the margin between the positive and negative classes via the separator class. In the ex...
This paper incorporates separation (misclassification) measures, conforming to the conventional discriminative training criterion, into the loss-function definition of a margin-based method to derive a mathematical framework for acoustic model parameter estimation, and discusses some important issues related to the hinge loss ...
A kernel slack variable is introduced into each base kernel to solve the objective function. Two kinds of soft-margin MKL methods, based on the hinge loss function and the squared hinge loss function respectively, are obtained when these loss functions are selected. The improved methods ...
Then, to address the challenge of adaptive anomaly detection thresholds, this research proposes a nonlinear model of support vector data description (SVDD) utilizing a 0/1 soft-margin loss, referred to as L0/1-SVDD. This model replaces the traditional hinge loss function in SVDD with a 0/1...
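The contrast between the hinge slack used in standard SVDD and the 0/1 soft-margin penalty mentioned above can be illustrated in isolation (this is only a comparison of the two penalties, not the L0/1-SVDD solver, which requires a specialized optimization scheme because the 0/1 loss is non-convex and discontinuous):

```python
import numpy as np

def hinge_penalty(sq_dist, r_sq):
    """Standard SVDD slack: penalty grows with how far a point
    lies outside the ball of squared radius r_sq."""
    return np.maximum(0.0, sq_dist - r_sq)

def zero_one_penalty(sq_dist, r_sq):
    """0/1 soft-margin penalty: every point outside the ball
    costs exactly 1, regardless of how far outside it is."""
    return (sq_dist > r_sq).astype(float)

# Squared distances of three points to the ball center; radius^2 = 1.
sq_dist = np.array([0.5, 1.5, 4.0])
print(hinge_penalty(sq_dist, 1.0))
print(zero_one_penalty(sq_dist, 1.0))
```

The 0/1 penalty simply counts outliers, so distant anomalies cannot drag the learned radius outward the way they can under the hinge slack, which is the usual motivation for replacing the hinge loss in this setting.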