Updating probabilities with data and moments

Using the flow of the River Eden in Carlisle, Cumbria, UK as a case study, this paper develops a Bayesian model that combines historical flood estimates dating back to 1800 with gauge data available since 1967 to estimate the probability of low-frequency flood events for the area, taking account of uncertainty in the discharge estimates.

Results show that 95% confidence intervals shrink by roughly 50% for annual exceedance probabilities below 0.0133 (return periods over 75 years), compared with standard flood frequency estimation methods that use only the systematic gauge data.
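To make the idea concrete, here is a minimal sketch of Bayesian flood frequency estimation that combines a systematic gauge record with censored historical information. It is not the paper's actual model: it assumes a Gumbel distribution for annual maxima, a flat prior on a parameter grid, and uses made-up placeholder numbers rather than River Eden observations.

```python
# Minimal sketch: combine gauged annual maxima with censored historical flood
# information in a Bayesian flood frequency estimate. All values are made up.
import numpy as np
from scipy.stats import gumbel_r

rng = np.random.default_rng(0)

# Hypothetical systematic annual-maximum series (e.g. gauged record since 1967)
gauged = gumbel_r.rvs(loc=500, scale=150, size=50, random_state=rng)

# Hypothetical historical period: h years in which k floods are known to have
# exceeded a perception threshold x0 (estimates reaching back to 1800, say)
h, k, x0 = 160, 4, 1200.0

# Grid over Gumbel parameters (a flat prior over this grid)
locs = np.linspace(300, 800, 120)
scales = np.linspace(50, 400, 120)
L, S = np.meshgrid(locs, scales, indexing="ij")

# Log-likelihood: exact densities for the gauged data, plus a binomial term
# for the number of historical exceedances of the threshold x0
log_like = np.zeros_like(L)
for x in gauged:
    log_like += gumbel_r.logpdf(x, loc=L, scale=S)
p_exc = gumbel_r.sf(x0, loc=L, scale=S)   # annual exceedance probability of x0
log_like += k * np.log(p_exc) + (h - k) * np.log1p(-p_exc)

# Posterior over parameters (flat prior, so the normalised likelihood)
post = np.exp(log_like - log_like.max())
post /= post.sum()

# Posterior predictive annual exceedance probability of a large flood level q
q = 1500.0
aep = float(np.sum(post * gumbel_r.sf(q, loc=L, scale=S)))
print(f"Posterior mean AEP above {q:.0f}: {aep:.4f} "
      f"(return period ~ {1 / aep:.0f} years)")
```

The historical record enters only through the count of threshold exceedances, which is what lets imprecise pre-gauge estimates still tighten the tail of the distribution.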

Their chances of going all the way are up to only 18.5%.


Next Steps

If we want to follow the rest of the Red Sox playoff outcomes probabilistically, we’d take our revised prior – let’s say 18.5% after Game One – and come up with updated probabilities for variables Y and Z for Game Two, as in the sketch below.
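Here is a toy sketch of that sequential update in Python. The 18.5% prior comes from the post; the conditional probabilities for a Game Two win are illustrative assumptions, not anything derived from actual baseball data.

```python
# Toy sketch of sequential Bayesian updating on a "go all the way" hypothesis.
def bayes_update(prior, p_evidence_given_h, p_evidence_given_not_h):
    """Return P(H | evidence) via Bayes' theorem."""
    numerator = p_evidence_given_h * prior
    denominator = numerator + p_evidence_given_not_h * (1 - prior)
    return numerator / denominator

# Revised prior after Game One (from the post)
prior = 0.185

# Assumed likelihoods: probability of winning Game Two if the Red Sox are
# (or are not) a team that will go all the way -- illustrative numbers only
p_win_given_champs = 0.65
p_win_given_not = 0.45

posterior_if_win = bayes_update(prior, p_win_given_champs, p_win_given_not)
posterior_if_loss = bayes_update(prior, 1 - p_win_given_champs, 1 - p_win_given_not)

print(f"P(go all the way | win Game Two):  {posterior_if_win:.3f}")
print(f"P(go all the way | lose Game Two): {posterior_if_loss:.3f}")
```

Each game's posterior then becomes the prior for the next game, which is the whole trick: the arithmetic stays the same, only the evidence changes.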

And we hoped that God had plans for our redemption, some day.

We know now that, like the biblical story of Job, Red Sox Nation suffered for a reason. And yet, I insist we try to learn Bayesian Probability today. Fine, then.

Both the Game Six World Series loss in 1986 to the Mets and the 2003 ALCS loss to the Yankees[1] defied all semblance of probability – we didn’t need a mathematical theorem to tell us that.

At the time, all we knew was that God personally intervened in baseball outcomes and that she enjoyed torturing us.
