from sklearn.metrics import roc_curve, auc, accuracy_score, precision_score, recall_score, f1_score

class Score:
    def __init__(self, y_output, y_label, y_pre):
        self.y_output = y_output
        self.y_label = y_label
        self.y_pre = y_pre

    def cal_roc(self):
        cls = len(self.y_output[...
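The fragment above is cut off inside cal_roc. A minimal sketch of how that method could be completed, assuming y_output holds per-class scores (one column per class) and y_label holds the matching one-hot ground-truth matrix; everything inside cal_roc beyond the first line is an assumption, not the original author's code:

import numpy as np
from sklearn.metrics import roc_curve, auc

class Score:
    def __init__(self, y_output, y_label, y_pre):
        self.y_output = np.asarray(y_output)  # per-class scores, shape (n_samples, n_classes) -- assumption
        self.y_label = np.asarray(y_label)    # one-hot labels, same shape -- assumption
        self.y_pre = np.asarray(y_pre)        # hard class predictions (unused in this sketch)

    def cal_roc(self):
        # One-vs-rest ROC curve and AUC for every class.
        cls = self.y_output.shape[1]
        rocs = {}
        for i in range(cls):
            fpr, tpr, _ = roc_curve(self.y_label[:, i], self.y_output[:, i])
            rocs[i] = (fpr, tpr, auc(fpr, tpr))
        return rocs

Each entry of the returned dict pairs a class index with its (fpr, tpr, auc) triple.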
from sklearn.datasets import make_regression
# define dataset
X, y = make_regression(n_samples=1000, n_features=10, n_informative=5, random_state=1)
# summarize the dataset
print(X.shape, y.shape)

Running the example creates the dataset and confirms the expected number of samples and features.
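As an illustrative next step (not part of the quoted example), the generated data could be fed straight into a model; the choice of LinearRegression and the cross-validation settings below are assumptions, only the make_regression call is taken from the text above:

from sklearn.datasets import make_regression
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score

# recreate the same synthetic regression dataset
X, y = make_regression(n_samples=1000, n_features=10, n_informative=5, random_state=1)

# evaluate a simple linear model with 10-fold cross-validation (illustrative choice)
model = LinearRegression()
scores = cross_val_score(model, X, y, scoring="neg_mean_absolute_error", cv=10)
print("MAE: %.3f" % -scores.mean())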
I would like to have a way to compute the PR-AUC for my models, given their scores and labels. The function's output should match sklearn's average_precision_score, just like ClickHouse's arrayAUC matches sklearn's roc_auc_score.

Describe the solution you'd like
In a similar fashion to the array...
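For reference, a minimal Python sketch of the requested computation (not the ClickHouse implementation itself); the name pr_auc is hypothetical, and the step-wise formula AP = sum_n (R_n - R_{n-1}) * P_n is the one average_precision_score uses, so the two can be compared directly when scores are untied:

import numpy as np
from sklearn.metrics import average_precision_score

def pr_auc(scores, labels):
    # Hypothetical PR-AUC (average precision) from raw scores and 0/1 labels.
    scores = np.asarray(scores, dtype=float)
    labels = np.asarray(labels, dtype=int)
    order = np.argsort(-scores)       # rank samples by descending score
    labels = labels[order]
    tp = np.cumsum(labels)            # true positives at each cut-off
    fp = np.cumsum(1 - labels)        # false positives at each cut-off
    precision = tp / (tp + fp)
    recall = tp / labels.sum()
    # AP = sum_n (R_n - R_{n-1}) * P_n, with R_0 = 0
    recall_prev = np.concatenate(([0.0], recall[:-1]))
    return np.sum((recall - recall_prev) * precision)

# sanity check against sklearn
y_true = np.array([0, 0, 1, 1])
y_score = np.array([0.1, 0.4, 0.35, 0.8])
print(pr_auc(y_score, y_true), average_precision_score(y_true, y_score))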
File                             Last commit message        Date
logistic_regression.cpp          compare with sklearn lr    Feb 15, 2021
logistic_regression_ckks.cpp     first commit               Feb 13, 2021
logistic_regression_kernel.cpp   clear code                 Mar 15, 2021
logistic_regression_new.cpp      test read data             Mar 14, 2021
lr.cpp                           test read data             Mar 14, 2021
matrix_mult_benchmark.cpp        ...