Chapter 6 Using the Standard Toolbox for Bayesian Deep Learning

As we saw in previous chapters, vanilla neural networks often produce poor uncertainty estimates and tend toward overconfident predictions; some architectures cannot produce uncertainty estimates at all. By contrast, probabilistic architectures offer principled ways to obtain high-quality uncertainty estimates. However, they have a number of limitations when it comes to scalability and adaptability.
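The overconfidence problem can be illustrated with a minimal NumPy sketch (the logits here are hypothetical, standing in for the output of a trained classifier): a single deterministic forward pass yields one softmax distribution, and a large gap between logits translates directly into near-certain probabilities, with no way to express model uncertainty about unfamiliar inputs.

```python
import numpy as np

def softmax(logits):
    # Subtract the max for numerical stability before exponentiating.
    z = logits - np.max(logits)
    e = np.exp(z)
    return e / e.sum()

# Hypothetical logits for a 3-class problem. A vanilla network produces
# a single point estimate like this even for out-of-distribution inputs.
logits = np.array([10.0, 0.0, 0.0])
probs = softmax(logits)
print(probs[0])  # ~0.9999: the network appears (over)confident
```

The softmax probabilities reflect only the relative sizes of the logits, not how much the model actually knows; this is the gap that the Bayesian methods in this chapter aim to close.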

While both PBP and BBB can be implemented with popular ML frameworks (as shown in our previous TensorFlow examples), the implementations are complex. As we saw in the last chapter, even a simple network isn't straightforward to implement. This means that adapting these methods to new architectures is awkward and ...
