Cohen's Kappa: What It Is, When to Use It, and How to Avoid Its Pitfalls - The New Stack
Generalized Cohen's Kappa: A Novel Inter-rater Reliability Metric for Non-mutually Exclusive Categories | SpringerLink
Cohen's Kappa • Simply explained - DATAtab
Cohen's Kappa and Classification Table Metrics 2.0: An ArcView 3x Extension for Accuracy Assessment of Spatially Explicit Models: USGS Open-File Report 2005-1363: Jenness, Jeff, Wynne, J. Judson, U.S. Department of the Interior
Weighted Kappa for Multiple Raters | Semantic Scholar
Multi-Class Metrics Made Simple, Part III: the Kappa Score (aka Cohen's Kappa Coefficient) | by Boaz Shmueli | Towards Data Science
Kappa – Model Evaluation and Performance Metrics with yardstick – Quantargo
Cohen's Kappa Explained | Built In
7 methods to evaluate your classification models | by Jin | Analytics Vidhya | Medium
Cohen Kappa — PyTorch-Metrics 1.1.0 documentation
Inter-Annotator Agreement: An Introduction to Cohen's Kappa Statistic | by Surge AI | Medium
Inter-rater agreement Kappas. a.k.a. inter-rater reliability or… | by Amir Ziai | Towards Data Science
Interrater reliability: the kappa statistic - Biochemia Medica
Four Key Metrics for Ensuring Data Annotation Accuracy | TELUS International