
Temperature Scaling: On Calibration of Modern Neural Networks

This chapter covers results from the paper *On Calibration of Modern Neural Networks*, available in the PMLR 2017 version.


Lecture

The lecture slides are available here.

Introduction and Motivation

---
primaryColor: steelblue
shuffleQuestions: false
shuffleAnswers: true
---

### According to this paper, model calibration is influenced negatively by

- [ ] depth and width of the neural network.
- [x] depth, width, and batch normalization.
  > Correct.
- [ ] batch normalization only.

Calibration Metrics

---
primaryColor: steelblue
shuffleQuestions: false
shuffleAnswers: true
---

### Which calibration metric is better suited for high-risk applications?

- [ ] Expected calibration error (ECE)
- [x] Maximum calibration error (MCE)
  > Correct.
- [ ] The average of MCE and ECE.
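To make the two metrics concrete, here is a minimal NumPy sketch (not code from the paper) of ECE and MCE computed over equal-width confidence bins. ECE is the bin-size-weighted average of the accuracy/confidence gap, while MCE is the worst-case gap; the 15-bin default follows the paper's experiments.

```python
import numpy as np

def calibration_errors(confidences, correct, n_bins=15):
    """Compute (ECE, MCE) over equal-width confidence bins.

    confidences: max-softmax probability per sample, shape (N,)
    correct: 1 if the prediction was right, else 0, shape (N,)
    """
    bins = np.linspace(0.0, 1.0, n_bins + 1)
    n = len(confidences)
    ece, mce = 0.0, 0.0
    for lo, hi in zip(bins[:-1], bins[1:]):
        # half-open bins (lo, hi]; a confidence of exactly 0 would need
        # special-casing, but softmax outputs are strictly positive
        mask = (confidences > lo) & (confidences <= hi)
        if not mask.any():
            continue
        acc = correct[mask].mean()        # accuracy within the bin
        conf = confidences[mask].mean()   # average confidence within the bin
        gap = abs(acc - conf)
        ece += (mask.sum() / n) * gap     # ECE: weighted average gap
        mce = max(mce, gap)               # MCE: worst-case gap
    return ece, mce
```

A perfectly calibrated model (bin accuracy equal to bin confidence) yields ECE = MCE = 0; an overconfident model inflates both, and a single badly calibrated bin raises MCE even when ECE stays small.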

Observations

---
primaryColor: steelblue
shuffleQuestions: false
shuffleAnswers: true
---

### Model miscalibration

- [ ] decreases as we use batch normalization.
- [x] increases as we use batch normalization.
  > Correct.
- [ ] does not depend on batch normalization.

Calibration Methods

---
primaryColor: steelblue
shuffleQuestions: false
shuffleAnswers: true
---

### Temperature scaling is the simplest extension of:

- [ ] ECE and MCE metrics.
- [x] Platt scaling.
  > Correct.
- [ ] BBQ and/or isotonic regression.

Conclusions

--- primaryColor: steelblue shuffleQuestions: false shuffleAnswers: true --- ### Temperature scaling - [ ] changes the model accuracy. - [x] has no impact on the model accuracy. > Correct. - [ ] is an algorithm to improve model accuracy.

Code and Assignment

There are no programming assignments for this lecture.