normalized_mutual_info_score(labels_true, labels_pred, *, average_method='arithmetic'): Normalized Mutual Information between two clusterings. Normalized Mutual Information (NMI) is a normalization of the Mutual Information (MI) score that scales the result between 0 (no mutual information) and 1 (perfect correlation). In this function, mutual information is normalized by some generalized mean of H(labels_true) and H(labels_pred)...
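A minimal sketch of how the `average_method` parameter changes the generalized mean used as the normalizer (the labelings here are made up for illustration; assumes scikit-learn >= 0.20, where this parameter was introduced):

```python
from sklearn.metrics import normalized_mutual_info_score

labels_true = [0, 0, 0, 1, 1, 1]
labels_pred = [0, 0, 1, 1, 2, 2]

# The denominator is mean(H(labels_true), H(labels_pred)); a smaller mean
# (e.g. "min") yields a larger NMI, a larger mean (e.g. "max") a smaller one.
for method in ("min", "geometric", "arithmetic", "max"):
    nmi = normalized_mutual_info_score(labels_true, labels_pred,
                                       average_method=method)
    print(f"{method:>10}: {nmi:.4f}")
```

Because min <= geometric mean <= arithmetic mean <= max, the printed NMI values are monotonically non-increasing down the list.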
Using sklearn 0.20.0, here is a synthetic example that reproduces the problem:

metrics.normalized_mutual_info_score([0]*100001, [0]*100000 + [1])
metrics.normalized_mutual_info_score([0]*110001, [0]*110000 + [1])

I expect both answers to be 0, but I got 7.999 and -7.999 respectively.
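A scaled-down sketch of the reporter's call (array sizes reduced for speed; the shape of the example is the same). On patched scikit-learn versions the result is numerically 0, since a labeling with a single cluster has zero entropy and shares no information with any other labeling; values like 7.999 on 0.20.0 point to a numerical bug in that release rather than a property of NMI:

```python
from sklearn.metrics import normalized_mutual_info_score

# One labeling puts everything in cluster 0; the other splits off a single point.
nmi = normalized_mutual_info_score([0] * 1001, [0] * 1000 + [1])
print(nmi)  # ≈ 0 on fixed versions
```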
import numpy as np
from sklearn.metrics import normalized_mutual_info_score

# Suppose we have two clustering results
true_labels = np.array([1, 1, 0, 0, 1, 0])
predicted_labels = np.array([1, 0, 0, 0, 1, 1])

# Compute the normalized mutual information
nmi = normalized_mutual_info_score(true_labels, predicted_labels)
print(f"Normalized mutual information (NMI): {nmi}")
...
Now I want to calculate normalized mutual information, but it is acting kind of weird:

In [13]: normalized_mutual_info_score(real.astype(int), test.astype(int), average_method='arithmetic')
Out[13]: 6.422893887289432e-16

In [14]: normalized_mutual_info_score(real.astype(int), test.astype(int), a...
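A value on the order of 1e-16, as in the session above, is floating-point noise for "zero shared information" rather than a bug. A small sketch of the two extremes (labels here are illustrative, not the questioner's `real`/`test` data):

```python
import numpy as np
from sklearn.metrics import normalized_mutual_info_score

y = np.array([0, 0, 1, 1, 2, 2])

# Identical labelings: MI equals the entropy, so NMI is 1.
print(normalized_mutual_info_score(y, y))                 # ≈ 1.0

# A constant labeling carries no information about y, so NMI is 0.
print(normalized_mutual_info_score(y, np.zeros_like(y)))  # ≈ 0.0
```

So an output near machine epsilon simply means the two labelings are (numerically) independent.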
This branch is 43 commits ahead of aaronmcdaid/Overlapping-NMI:master. GPL-3.0 license. An implementation of a Normalized Mutual Information (NMI) measure for sets of overlapping clusters, and the Omega Index. ...
t = normalized_mutual_info_score(mlp.predict(X[te]), y[te])
print("Fold training accuracy: %f" % t)
total += t

this_score = []
for i in mlp.oo_score:
    this_score.append(normalized_mutual_info_score(i, y[te]))
oo_score_bag.append(this_score)

from matplotlib import pyplot as plt
...
delta = 1.0 - normalized_mutual_info_score(cuda.to_cpu(c_data), y_pam)
loss_expected = f + gamma * delta - f_tilde
testing.assert_allclose(loss.data, loss_expected)

Developer: ronekko · Project: deep_metric_learning · Lines: 25 · Source: test_clustering_loss.py ...