Quantifying uncertainty is crucial for assessing the trustworthiness of machine learning (ML) predictions. Factors such as data noise or transformations can influence the uncertainty of these predictions. Standard ML models such as Random Forest (RF), XGBoost (XGB), and Neural Networks (NN) provide accurate point predictions but lack statistical guarantees, often producing uncalibrated probabilities with no inherent representation of uncertainty.
While these methods perform well, they are not inherently designed to provide coverage guarantees.
Conformal prediction (CP) addresses these limitations by providing explicit coverage guarantees through prediction intervals or calibrated prediction sets, correcting miscalibrated probabilities and ensuring that the true value lies within the interval with a specified probability (e.g., 90\%).
A CP set is the set of candidate class labels into which a given data point could plausibly be classified.
Additionally, CP is a flexible, model-agnostic framework that can be applied to any ML model.
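To illustrate the split-conformal procedure described above, the following Python sketch builds classification sets targeting 90% coverage. It is not taken from the paper: the toy dataset, the Random Forest base model, and the softmax-based nonconformity score are assumptions chosen for illustration, and any recent scikit-learn/NumPy versions are assumed.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Minimal split-conformal classification sketch (illustrative assumptions throughout).
alpha = 0.10  # target miscoverage: aim for roughly 90% coverage

X, y = make_classification(n_samples=2000, n_classes=3, n_informative=6, random_state=0)
X_train, X_rest, y_train, y_rest = train_test_split(X, y, test_size=0.5, random_state=0)
X_cal, X_test, y_cal, y_test = train_test_split(X_rest, y_rest, test_size=0.5, random_state=0)

# Any point-prediction model can serve as the base model (CP is model-agnostic).
model = RandomForestClassifier(random_state=0).fit(X_train, y_train)

# Nonconformity score on the calibration set: 1 - predicted probability of the true class.
cal_probs = model.predict_proba(X_cal)
cal_scores = 1.0 - cal_probs[np.arange(len(y_cal)), y_cal]

# Conformal quantile with the finite-sample correction (n + 1 in the numerator).
n = len(cal_scores)
q_level = np.ceil((n + 1) * (1 - alpha)) / n
q_hat = np.quantile(cal_scores, q_level, method="higher")

# Prediction sets: include every label whose score falls below the threshold.
test_probs = model.predict_proba(X_test)
pred_sets = test_probs >= 1.0 - q_hat  # boolean matrix: rows = points, columns = labels

coverage = pred_sets[np.arange(len(y_test)), y_test].mean()
print(f"Empirical coverage: {coverage:.3f} (target {1 - alpha:.0%})")
print(f"Average set size: {pred_sets.sum(axis=1).mean():.2f}")
```

On held-out data, the empirical coverage should land near the 90% target regardless of how well the Random Forest is calibrated, which is the coverage guarantee the abstract refers to.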
Disciplines :
Computer science
Author, co-author :
Singh, Akash ; Université de Liège - ULiège > HEC Liège Research
Ittoo, Ashwin ; Université de Liège - ULiège > HEC Liège : UER > UER Opérations : Systèmes d'information de gestion ; Université de Liège - ULiège > HEC Liège Research > HEC Liège Research: Business Analytics & Supply Chain Mgmt
Ars, Pierre ; Ethias > Actuarial Innovation
Vandomme, Elise ; Université de Liège - ULiège > HEC Liège Research > HEC Liège Research: Business Analytics & Supply Chain Mgmt
Language :
English
Title :
Conformal Prediction: Calibrated Decision-Making
Publication date :
29 January 2025
Number of pages :
2
Event name :
Joint ORBEL - NGB conference on Operations Research