
Notes on "Probabilistic Programming & Bayesian Methods for Hackers"

It's kind of funny that, when first mentioning the Python library it uses, each version of the book says:

  • Introducing our first hammer: PyMC
  • Introducing our first hammer: TensorFlow Probability (Maybe consider putting a “second” hammer instead? Later I found they use “first” in all versions of the book…)

1. Introduction

A note on Big Data
Paradoxically, big data's predictive analytics problems are actually solved by relatively simple algorithms [2][4]. Thus we can argue that big data's prediction difficulty does not lie in the algorithm used, but instead in the computational difficulties of storage and execution on big data. (One should also consider Gelman's quote from above and ask, “Do I really have big data?”)

2. TFP and Bayesian Modeling

To distinguish tensors from their NumPy-like counterparts, we will use the convention of appending an underscore to the name of the evaluated version of a tensor, i.e., the version that can be used like a NumPy array.
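For example, a minimal sketch of this convention, assuming TF2 eager execution (the book itself defines its own evaluate() helper for this purpose):

  import tensorflow as tf

  x = tf.linspace(0., 1., 5)   # a TensorFlow tensor
  x_ = x.numpy()               # the evaluated version, a plain NumPy array
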
A general rule of thumb for programming in TensorFlow is that if you need to do any array-like calculations that would require NumPy functions, you should use their TensorFlow equivalents instead. This practice is necessary because NumPy can produce only constant values, whereas TensorFlow tensors are a dynamic part of the computation graph. Mixing and matching these the wrong way typically yields an error about incompatible types.
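A toy illustration of the rule (tf.math.log and tf.reduce_sum are the TensorFlow equivalents of np.log and np.sum; the values here are arbitrary):

  import tensorflow as tf

  probs = tf.constant([0.2, 0.3, 0.5])
  log_probs = tf.math.log(probs)     # use this, not np.log(probs)
  total = tf.reduce_sum(log_probs)   # use this, not np.sum(log_probs)
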
You can draw random samples from a stochastic variable. The samples you draw become tf.Tensor objects that behave deterministically from that point on.
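For instance, a sketch using TensorFlow Probability (the Exponential distribution here is just an arbitrary choice, not the book's example):

  import tensorflow_probability as tfp
  tfd = tfp.distributions

  exponential = tfd.Exponential(rate=1.0)
  samples = exponential.sample(3)   # a tf.Tensor of 3 draws; values are fixed once drawn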

TBD

3. MCMC

Secondly, knowing the process of MCMC gives you insight into whether your algorithm has converged (converged to what? We will get to that).

Thirdly, we'll understand why thousands of samples from the posterior are returned as the solution, which at first can seem odd.
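To make this concrete, a small sketch of how posterior samples act as the solution; the samples below are a fabricated stand-in for a real MCMC trace:

  import numpy as np

  # Hypothetical posterior samples (a stand-in for an actual MCMC trace)
  posterior_samples = np.random.exponential(scale=2.0, size=5000)

  # The samples themselves are the answer: summarize them for point
  # estimates and uncertainty.
  print("posterior mean:", posterior_samples.mean())
  print("95% credible interval:", np.percentile(posterior_samples, [2.5, 97.5]))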