Commit 7146c6c

Add calib uncertainties in object localization
1 parent 696e8ac commit 7146c6c

2 files changed: +20 -1 lines changed

paper_notes/2dod_calib.md

Lines changed: 19 additions & 0 deletions
@@ -0,0 +1,19 @@
# [Calibrating Uncertainties in Object Localization Task](https://arxiv.org/abs/1811.11210)
_November 2019_
tl;dr: Proof of concept applying uncertainty calibration to an object detector.

#### Overall impression
For a more theoretical treatment, refer to [accurate uncertainty via calibrated regression](dl_regression_calib.md). A more detailed application is [can we trust you](towards_safe_ad_calib.md).

#### Key ideas
- Validating uncertainty estimates: plot the regressed aleatoric uncertainty $\sigma_i^2$ against the empirical squared error $(b_i - \bar{b_i})^2$; for well-calibrated uncertainties the two should match.
- To find the 90% confidence interval, the upper and lower bounds are given by $\hat{P}^{-1}(r \pm 0.9/2)$, where $r = \hat{P}(x)$ and $\hat{P}$ is the recalibrated CDF.
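The two steps above can be sketched end to end. This is a toy illustration, assuming a Gaussian output head $N(\mu_i, \sigma_i^2)$ and isotonic regression as the recalibration map; the regressor choice and all names here are my assumptions, not the paper's exact implementation:

```python
# Sketch of recalibration for a regression head (assumptions: Gaussian
# predictive distribution, isotonic regression as the recalibration map R).
import numpy as np
from scipy.stats import norm
from sklearn.isotonic import IsotonicRegression

rng = np.random.default_rng(0)

# Toy validation set: the detector is overconfident, predicting sigma = 0.5
# while the true localization noise has scale 1.0.
n = 2000
mu = rng.normal(size=n)
sigma = np.full(n, 0.5)
gt = mu + rng.normal(scale=1.0, size=n)

# Step 1: predicted CDF value of each ground truth, p_i = P_i(gt_i).
p = norm.cdf(gt, loc=mu, scale=sigma)

# Step 2: empirical frequency of each p_i, i.e. the fraction of p_j <= p_i.
emp = np.searchsorted(np.sort(p), p, side="right") / n

# Step 3: fit the recalibration map R, so that P_hat = R(P) is calibrated.
recal = IsotonicRegression(y_min=0.0, y_max=1.0, out_of_bounds="clip")
recal.fit(p, emp)

# Step 4: 90% interval for one box coordinate via P_hat^{-1}(r +/- 0.45),
# where r = P_hat(x) at the predicted mean. P_hat^{-1} is evaluated by
# scanning a grid (isotonic maps are piecewise constant, so no closed form).
def calibrated_interval(mu_i, sigma_i, level=0.9):
    grid = np.linspace(mu_i - 6 * sigma_i, mu_i + 6 * sigma_i, 4001)
    p_hat = recal.predict(norm.cdf(grid, loc=mu_i, scale=sigma_i))
    r = recal.predict([norm.cdf(mu_i, loc=mu_i, scale=sigma_i)])[0]
    lo = grid[np.searchsorted(p_hat, max(r - level / 2, 0.0))]
    hi = grid[min(np.searchsorted(p_hat, min(r + level / 2, 1.0)), len(grid) - 1)]
    return lo, hi

lo, hi = calibrated_interval(0.0, 0.5)
```

Since the true noise scale is twice the predicted one, the calibrated 90% interval comes out roughly twice as wide as the naive $\mu \pm 1.645\sigma$ interval.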
#### Technical details
- Summary of technical details
#### Notes
- Questions and notes on how to improve/revise the current work

paper_notes/towards_safe_ad_calib.md

Lines changed: 1 addition & 1 deletion
@@ -7,7 +7,7 @@ tl;dr: Calibration of the network for a probabilistic object detector
#### Overall impression
The paper extends previous works on the [probabilistic lidar detector](towards_safe_ad.md) and its [successor](towards_safe_ad2.md). It is based on the work of PIXOR.

-Calibration: a probabilistic object detector should predict uncertainties that match the natural frequency of correct predictions: 90% of the predictions with a 0.9 score from a calibrated detector should be correct. Humans have an intuitive notion of probability in a frequentist sense. --> cf [accurate uncertainty via calibrated regression](dl_regression_calib.md).
+Calibration: a probabilistic object detector should predict uncertainties that match the natural frequency of correct predictions: 90% of the predictions with a 0.9 score from a calibrated detector should be correct. Humans have an intuitive notion of probability in a frequentist sense. --> cf [accurate uncertainty via calibrated regression](dl_regression_calib.md) and [calib uncertainties in object detection](2dod_calib.md).

A calibrated regression is a bit harder to interpret: $P(gt \le F^{-1}(p)) = p$, where $F^{-1} = F_q$ is the inverse of the CDF, i.e., the quantile function.
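The identity can be checked numerically. A minimal sketch, using my own toy example with a correctly specified Gaussian model (not the paper's data or code):

```python
# Toy numerical check of P(gt <= F^{-1}(p)) = p: for a calibrated model,
# the fraction of ground truths falling below the predicted p-quantile
# should equal p. (My own example, not from the paper.)
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(1)
n = 5000
mu = rng.normal(size=n)
gt = mu + rng.normal(scale=1.0, size=n)  # predicted sigma = 1 matches the truth

for p in (0.1, 0.5, 0.9):
    q = norm.ppf(p, loc=mu, scale=1.0)  # F^{-1}(p) = F_q(p), the quantile function
    print(f"p={p:.1f}  empirical coverage={np.mean(gt <= q):.3f}")
```

For a miscalibrated model (predicted sigma not matching the true noise), the printed coverages would deviate from p, which is exactly what the recalibration map corrects.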