Probabilistic Deep Learning: With Python, Keras and TensorFlow Probability

Summary
Probabilistic Deep Learning: With Python, Keras and TensorFlow Probability teaches the increasingly popular probabilistic approach to deep learning, which lets you refine your results more quickly and accurately without much trial-and-error testing. Emphasizing practical techniques that use the Python-based TensorFlow Probability framework, you’ll learn to build highly performant deep learning applications that can reliably handle the noise and uncertainty of real-world data.

Purchase of the print book includes a free eBook in PDF, Kindle, and ePub formats from Manning Publications.

About the technology
The world is a noisy and uncertain place. Probabilistic deep learning models capture that noise and uncertainty and bring it into their predictions about real-world scenarios. Crucial for self-driving cars and scientific testing, these techniques help deep learning engineers assess the accuracy of their results, spot errors, and improve their understanding of how algorithms work.

About the book
Probabilistic Deep Learning is a hands-on guide to the principles that support neural networks. Learn to improve network performance with the right distribution for different data types, and discover Bayesian variants that can state their own uncertainty to increase accuracy. This book provides easy-to-apply code and uses popular frameworks to keep you focused on practical applications.

What's inside

Explore maximum likelihood and the statistical basis of deep learning
Discover probabilistic models that can indicate possible outcomes
Learn to use normalizing flows for modeling and generating complex distributions
Use Bayesian neural networks to assess the uncertainty in the model (a minimal code sketch follows this list)
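
To give a flavor of the models the book builds, here is a minimal sketch of a probabilistic Keras regression network written with TensorFlow Probability. This is an illustration, not code from the book: the toy data, layer sizes, and training settings are our assumptions, and it presumes the tensorflow and tensorflow-probability packages are installed.

import numpy as np
import tensorflow as tf
import tensorflow_probability as tfp

tfd = tfp.distributions

# Toy 1-D regression data whose noise level grows with |x| (illustrative only)
x = np.linspace(-1.0, 1.0, 200).astype("float32").reshape(-1, 1)
y = 2.0 * x + np.random.normal(0.0, 0.1 + 0.4 * np.abs(x)).astype("float32")

# A network whose output is a Normal distribution: one head for the mean,
# one for the (softplus-transformed, hence positive) standard deviation
model = tf.keras.Sequential([
    tf.keras.layers.Dense(20, activation="relu", input_shape=(1,)),
    tf.keras.layers.Dense(2),
    tfp.layers.DistributionLambda(
        lambda t: tfd.Normal(loc=t[..., :1],
                             scale=1e-3 + tf.math.softplus(t[..., 1:]))),
])

# The negative log-likelihood serves directly as the loss function
model.compile(optimizer="adam",
              loss=lambda y_true, dist: -dist.log_prob(y_true))
model.fit(x, y, epochs=200, verbose=0)

dist = model(x)  # one Normal distribution per input point
print(dist.mean()[:3], dist.stddev()[:3])

Because the network outputs a full distribution rather than a point estimate, the negative log-likelihood can be minimized directly; this is the maximum likelihood principle that part 2 of the book develops.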

About the reader
For experienced machine learning developers.

About the author
Oliver Dürr is a professor at the University of Applied Sciences in Konstanz, Germany. Beate Sick holds a chair for applied statistics at ZHAW and works as a researcher and lecturer at the University of Zurich. Elvis Murina is a data scientist.

Table of Contents

PART 1 - BASICS OF DEEP LEARNING

1 Introduction to probabilistic deep learning

2 Neural network architectures

3 Principles of curve fitting

PART 2 - MAXIMUM LIKELIHOOD APPROACHES FOR PROBABILISTIC DL MODELS

4 Building loss functions with the likelihood approach

5 Probabilistic deep learning models with TensorFlow Probability

6 Probabilistic deep learning models in the wild

PART 3 - BAYESIAN APPROACHES FOR PROBABILISTIC DL MODELS

7 Bayesian learning

8 Bayesian neural networks
"1136602690"
Probabilistic Deep Learning: With Python, Keras and TensorFlow Probability
Probabilistic Deep Learning is a hands-on guide to the principles that support neural networks. Learn to improve network performance with the right distribution for different data types, and discover Bayesian variants that can state their own uncertainty to increase accuracy. This book provides easy-to-apply code and uses popular frameworks to keep you focused on practical applications.

Summary
Probabilistic Deep Learning: With Python, Keras and TensorFlow Probability teaches the increasingly popular probabilistic approach to deep learning that allows you to refine your results more quickly and accurately without much trial-and-error testing. Emphasizing practical techniques that use the Python-based Tensorflow Probability Framework, you’ll learn to build highly-performant deep learning applications that can reliably handle the noise and uncertainty of real-world data.

Purchase of the print book includes a free eBook in PDF, Kindle, and ePub formats from Manning Publications.

About the technology
The world is a noisy and uncertain place. Probabilistic deep learning models capture that noise and uncertainty, pulling it into real-world scenarios. Crucial for self-driving cars and scientific testing, these techniques help deep learning engineers assess the accuracy of their results, spot errors, and improve their understanding of how algorithms work.

About the book
Probabilistic Deep Learning is a hands-on guide to the principles that support neural networks. Learn to improve network performance with the right distribution for different data types, and discover Bayesian variants that can state their own uncertainty to increase accuracy. This book provides easy-to-apply code and uses popular frameworks to keep you focused on practical applications.

What's inside

Explore maximum likelihood and the statistical basis of deep learning
Discover probabilistic models that can indicate possible outcomes
Learn to use normalizing flows for modeling and generating complex distributions
Use Bayesian neural networks to access the uncertainty in the model

About the reader
For experienced machine learning developers.

About the author
Oliver Dürr is a professor at the University of Applied Sciences in Konstanz, Germany. Beate Sick holds a chair for applied statistics at ZHAW and works as a researcher and lecturer at the University of Zurich. Elvis Murina is a data scientist.

Table of Contents

PART 1 - BASICS OF DEEP LEARNING

1 Introduction to probabilistic deep learning

2 Neural network architectures

3 Principles of curve fitting

PART 2 - MAXIMUM LIKELIHOOD APPROACHES FOR PROBABILISTIC DL MODELS

4 Building loss functions with the likelihood approach

5 Probabilistic deep learning models with TensorFlow Probability

6 Probabilistic deep learning models in the wild

PART 3 - BAYESIAN APPROACHES FOR PROBABILISTIC DL MODELS

7 Bayesian learning

8 Bayesian neural networks
49.99 In Stock
Probabilistic Deep Learning: With Python, Keras and TensorFlow Probability

Probabilistic Deep Learning: With Python, Keras and TensorFlow Probability

Probabilistic Deep Learning: With Python, Keras and TensorFlow Probability

Probabilistic Deep Learning: With Python, Keras and TensorFlow Probability

Paperback

$49.99 
  • SHIP THIS ITEM
    Qualifies for Free Shipping
  • PICK UP IN STORE
    Check Availability at Nearby Stores

Related collections and offers


Overview

Probabilistic Deep Learning is a hands-on guide to the principles that support neural networks. Learn to improve network performance with the right distribution for different data types, and discover Bayesian variants that can state their own uncertainty to increase accuracy. This book provides easy-to-apply code and uses popular frameworks to keep you focused on practical applications.

Summary
Probabilistic Deep Learning: With Python, Keras and TensorFlow Probability teaches the increasingly popular probabilistic approach to deep learning that allows you to refine your results more quickly and accurately without much trial-and-error testing. Emphasizing practical techniques that use the Python-based Tensorflow Probability Framework, you’ll learn to build highly-performant deep learning applications that can reliably handle the noise and uncertainty of real-world data.

Purchase of the print book includes a free eBook in PDF, Kindle, and ePub formats from Manning Publications.

About the technology
The world is a noisy and uncertain place. Probabilistic deep learning models capture that noise and uncertainty, pulling it into real-world scenarios. Crucial for self-driving cars and scientific testing, these techniques help deep learning engineers assess the accuracy of their results, spot errors, and improve their understanding of how algorithms work.

About the book
Probabilistic Deep Learning is a hands-on guide to the principles that support neural networks. Learn to improve network performance with the right distribution for different data types, and discover Bayesian variants that can state their own uncertainty to increase accuracy. This book provides easy-to-apply code and uses popular frameworks to keep you focused on practical applications.

What's inside

Explore maximum likelihood and the statistical basis of deep learning
Discover probabilistic models that can indicate possible outcomes
Learn to use normalizing flows for modeling and generating complex distributions
Use Bayesian neural networks to access the uncertainty in the model

About the reader
For experienced machine learning developers.

About the author
Oliver Dürr is a professor at the University of Applied Sciences in Konstanz, Germany. Beate Sick holds a chair for applied statistics at ZHAW and works as a researcher and lecturer at the University of Zurich. Elvis Murina is a data scientist.

Table of Contents

PART 1 - BASICS OF DEEP LEARNING

1 Introduction to probabilistic deep learning

2 Neural network architectures

3 Principles of curve fitting

PART 2 - MAXIMUM LIKELIHOOD APPROACHES FOR PROBABILISTIC DL MODELS

4 Building loss functions with the likelihood approach

5 Probabilistic deep learning models with TensorFlow Probability

6 Probabilistic deep learning models in the wild

PART 3 - BAYESIAN APPROACHES FOR PROBABILISTIC DL MODELS

7 Bayesian learning

8 Bayesian neural networks

Product Details

ISBN-13: 9781617296079
Publisher: Manning
Publication date: 11/10/2020
Pages: 296
Product dimensions: 7.30 in (w) x 9.10 in (h) x 0.70 in (d)

About the Author

Oliver Dürr is a professor of data science at the University of Applied Sciences in Konstanz, Germany.

Beate Sick holds a chair for applied statistics at ZHAW and works as a researcher and lecturer at the University of Zurich and as a lecturer at ETH Zurich.

Elvis Murina is a research assistant, responsible for the extensive exercises that accompany this book.

Table of Contents

Preface xi

Acknowledgments xii

About this book xiv

About the authors xvii

About the cover illustration xviii

Part 1 Basics of deep learning 1

1 Introduction to probabilistic deep learning 3

1.1 A first look at probabilistic models 4

1.2 A first brief look at deep learning (DL) 6

A success story 8

1.3 Classification 8

Traditional approach to image classification 9

Deep learning approach to image classification 12

Non-probabilistic classification 14

Probabilistic classification 14

Bayesian probabilistic classification 16

1.4 Curve fitting 16

Non-probabilistic curve fitting 17

Probabilistic curve fitting 18

Bayesian probabilistic curve fitting 20

1.5 When to use and when not to use DL? 21

When not to use DL 21

When to use DL 22

When to use and when not to use probabilistic models? 22

1.6 What you'll learn in this book 23

2 Neural network architectures 25

2.1 Fully connected neural networks (fcNNs) 26

The biology that inspired the design of artificial NNs 26

Getting started with implementing an NN 28

Using a fully connected NN (fcNN) to classify images 38

2.2 Convolutional NNs for image-like data 44

Main ideas in a CNN architecture 44

A minimal CNN for edge lovers 47

Biological inspiration for a CNN architecture 50

Building and understanding a CNN 52

2.3 One-dimensional CNNs for ordered data 56

Format of time-ordered data 57

What's special about ordered data? 58

Architectures for time-ordered data 59

3 Principles of curve fitting 62

3.1 "Hello world" in curve fitting 63

Fitting a linear regression model based on a loss function 65

3.2 Gradient descent method 69

Loss with one free model parameter 69

Loss with two free model parameters 73

3.3 Special DL sauce 78

Mini-batch gradient descent 78

Using SGD variants to speed up the learning 79

Automatic differentiation 79

3.4 Backpropagation in DL frameworks 80

Static graph frameworks 81

Dynamic graph frameworks 88

Part 2 Maximum Likelihood Approaches For Probabilistic DL Models 91

4 Building loss functions with the likelihood approach 93

4.1 Introduction to the MaxLike principle: The mother of all loss functions 94

4.2 Deriving a loss function for a classification problem 99

Binary classification problem 99

Classification problems with more than two classes 105

Relationship between NLL, cross entropy, and Kullback-Leibler divergence 109

4.3 Deriving a loss function for regression problems 111

Using an NN without hidden layers and one output neuron for modeling a linear relationship between input and output 111

Using an NN with hidden layers to model non-linear relationships between input and output 119

Using an NN with additional output for regression tasks with nonconstant variance 121

5 Probabilistic deep learning models with TensorFlow Probability 128

5.1 Evaluating and comparing different probabilistic prediction models 130

5.2 Introducing TensorFlow Probability (TFP) 132

5.3 Modeling continuous data with TFP 135

Fitting and evaluating a linear regression model with constant variance 136

Fitting and evaluating a linear regression model with a nonconstant standard deviation 140

5.4 Modeling count data with TensorFlow Probability 145

The Poisson distribution for count data 148

Extending the Poisson distribution to a zero-inflated Poisson (ZIP) distribution 153

6 Probabilistic deep learning models in the wild 157

6.1 Flexible probability distributions in state-of-the-art DL models 159

Multinomial distribution as a flexible distribution 160

Making sense of discretized logistic mixture 162

6.2 Case study: Bavarian roadkills 165

6.3 Go with the flow: Introduction to normalizing flows (NFs) 166

The principal idea of NFs 168

The change of variable technique for probabilities 170

Fitting an NF to data 175

Going deeper by chaining flows 177

Transformation between higher-dimensional spaces* 181

Using networks to control flows 183

Fun with flows: Sampling faces 188

Part 3 Bayesian Approaches For Probabilistic DL Models 195

7 Bayesian learning 197

7.1 What's wrong with non-Bayesian DL: The elephant in the room 198

7.2 The first encounter with a Bayesian approach 201

Bayesian model: The hacker's way 202

What did we just do? 206

7.3 The Bayesian approach for probabilistic models 207

Training and prediction with a Bayesian model 208

A coin toss as a Hello World example for Bayesian models 213

Revisiting the Bayesian linear regression model 224

8 Bayesian neural networks 229

8.1 Bayesian neural networks (BNNs) 230

8.2 Variational inference (VI) as an approximate Bayes approach 232

Looking under the hood of VI* 233

Applying VI to the toy problem* 238

8.3 Variational inference with TensorFlow Probability 243

8.4 MC dropout as an approximate Bayes approach 245

Classical dropout used during training 246

MC dropout used during training and test times 249

8.5 Case studies 252

Regression case study on extrapolation 252

Classification case study with novel classes 256

Glossary of terms and abbreviations 264

Index 269

From the B&N Reads Blog

Customer Reviews