Quantifying the uncertainty of the predictions produced by classification and regression techniques is an important problem in the field of Machine Learning. Conformal Prediction is a recently developed framework for complementing the predictions of Machine Learning algorithms with reliable measures of confidence. The methods developed within this framework produce well-calibrated confidence measures for individual examples, assuming only that the data are independent and identically distributed (i.i.d.).
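To illustrate the kind of guarantee the framework provides, here is a minimal sketch of split (inductive) conformal prediction for regression; the function name and the use of absolute residuals as the nonconformity measure are illustrative choices, not part of the announcement above:

```python
import math

def split_conformal_interval(cal_residuals, y_hat, alpha=0.1):
    """Return a prediction interval around the point prediction y_hat.

    cal_residuals: absolute residuals |y - prediction| on a held-out
    calibration set (these serve as the nonconformity scores).
    Under the i.i.d. assumption, the interval covers the true value
    with probability at least 1 - alpha.
    """
    n = len(cal_residuals)
    # Conformal quantile: the ceil((n + 1) * (1 - alpha))-th smallest score.
    k = math.ceil((n + 1) * (1 - alpha))
    q = sorted(cal_residuals)[min(k, n) - 1]
    return (y_hat - q, y_hat + q)

# Example: ten calibration residuals and a point prediction of 5.0.
cal = [0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9, 1.0]
lo, hi = split_conformal_interval(cal, 5.0, alpha=0.1)
# Here k = ceil(11 * 0.9) = 10, so q = 1.0 and the interval is (4.0, 6.0).
```

The validity guarantee holds regardless of which underlying predictor produced `y_hat`; only the i.i.d. (more generally, exchangeability) assumption is needed.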
The symposium welcomes submissions introducing further developments and extensions of the Conformal Prediction framework, as well as submissions describing its application to interesting problems in any field. Topics of interest include, but are not limited to:
- Non-conformity measures
- Venn prediction
- On-line compression modeling
- Theoretical analysis of Conformal Prediction techniques
- Applications/usages of Conformal Prediction
- Machine learning
- Pattern recognition
- Regression estimation
- Density estimation
- Algorithmic information theory
- Measures of confidence
- Applications in Bioinformatics and Medicine
- Applications in Information Security and Homeland Security
- Data mining and visualization
- Big data applications
- Data analysis applications in science and engineering
- Uncertainty quantification
Special session: Novel Directions of Applying Machine Learning in Chemoinformatics