Gamma-Poisson mixture

Note: This section contains notes on the derivations made while authoring my book. The derivation details originate from public Q&A platforms and will naturally differ from those in the book chapters.

The goal is to show that the marginal distribution of the count \(n\) is a negative binomial distribution when the likelihood is a Poisson distribution and the prior on the rate is a Gamma distribution.

Suppose the count \(n\) is drawn from a Poisson distribution \[ n\sim \mathrm{Poisson}(\mu) \] where \(E\) is a known exposure and the rate per unit exposure \(\lambda=\mu/E\) has a Gamma prior, \(\lambda\sim \Gamma(\alpha,\beta)\), with \(\beta\) the rate parameter.

Re-express the distribution of \(n\) as \[ \begin{gather*} n\mid\lambda\sim \mathrm{Poisson}(\lambda E)\\ f(n\mid\lambda,E)=\frac{(\lambda E)^n}{n!}e^{-\lambda E} \end{gather*} \] Suppose \(\alpha,\beta\) are known. Integrating over all possible values of \(\lambda\), we obtain the marginal distribution of \(n\) as \[ \begin{gather*} f(n\mid\alpha,\beta)=\int_0^{\infty}f(n\mid\lambda)\,\pi(\lambda)\,d\lambda\\ =\int_0^{\infty}\frac{(\lambda E)^n}{n!}e^{-\lambda E}\,\frac{\beta^{\alpha}\lambda^{\alpha-1}e^{-\lambda\beta}}{\Gamma(\alpha)}\,d\lambda\\ =\frac{E^n\beta^{\alpha}}{n!\,\Gamma(\alpha)}\int_0^{\infty}\lambda^n e^{-\lambda E}\lambda^{\alpha-1}e^{-\lambda\beta}\,d\lambda\\ =\frac{E^n\beta^{\alpha}}{n!\,\Gamma(\alpha)}\int_0^{\infty}e^{-\lambda(E+\beta)}\lambda^{n+\alpha-1}\,d\lambda\\ =\frac{E^n\beta^{\alpha}\,\Gamma(\alpha+n)}{n!\,\Gamma(\alpha)\,(\beta+E)^{n+\alpha}}\int_0^{\infty}\frac{(\beta+E)^{n+\alpha}}{\Gamma(\alpha+n)}e^{-\lambda(E+\beta)}\lambda^{n+\alpha-1}\,d\lambda\\ =\frac{\Gamma(\alpha+n)}{n!\,\Gamma(\alpha)}\left(\frac{E}{\beta+E}\right)^n\left(\frac{\beta}{\beta+E}\right)^{\alpha}\cdot 1\\ =\frac{\Gamma(\alpha+n)}{n!\,\Gamma(\alpha)}\left(\frac{\beta}{E}+1\right)^{-n}\left(\frac{E}{\beta}+1\right)^{-\alpha} \end{gather*} \] The remaining integral equals one because the integrand is exactly a \(\Gamma(\alpha+n,\beta+E)\) density. \(f(n\mid\alpha,\beta)\) is exactly the probability mass function of the negative binomial distribution, and it is also the likelihood function with \(\alpha,\beta\) as parameters.
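The marginalization above can be checked numerically: integrate the Poisson pmf against the Gamma prior by quadrature and compare with the closed-form negative binomial pmf, which has size \(\alpha\) and success probability \(\beta/(\beta+E)\). The parameter values below are illustrative, not from the text.

```python
import numpy as np
from scipy import integrate, stats

# Illustrative parameter values (assumptions, not from the derivation).
alpha, beta, E = 2.0, 1.5, 3.0

def marginal_pmf(n):
    """Marginal f(n | alpha, beta): integrate Poisson(lambda*E) * Gamma(alpha, rate=beta) over lambda."""
    integrand = lambda lam: stats.poisson.pmf(n, lam * E) * stats.gamma.pdf(lam, alpha, scale=1 / beta)
    val, _ = integrate.quad(integrand, 0, np.inf)
    return val

# The derivation says this equals a negative binomial with size alpha and
# success probability p = beta / (beta + E).
p = beta / (beta + E)
for n in range(10):
    assert np.isclose(marginal_pmf(n), stats.nbinom.pmf(n, alpha, p), atol=1e-10)
print("marginal matches negative binomial")
```

Note that scipy's `nbinom.pmf(k, n, p)` uses the \(\binom{k+n-1}{k}p^n(1-p)^k\) parameterization, which matches the final line of the derivation with \(p=\beta/(\beta+E)\).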

The derivation also shows that the negative binomial distribution is exactly a Gamma-Poisson mixture: a Poisson distribution whose rate is itself Gamma-distributed.

It is also straightforward to show that the posterior predictive distribution \(f(\hat{n}\mid n)\) is a negative binomial distribution. Since the Gamma prior on \(\lambda\) is conjugate to the Poisson likelihood, the posterior of \(\lambda\) given \(n\) is also a Gamma distribution, \(\lambda\mid n\sim\Gamma(\alpha+n,\beta+E)\). The likelihood of \(\hat{n}\) given the unknown but fixed \(\lambda\) is a Poisson distribution. We therefore obtain the posterior predictive distribution of \(\hat{n}\) given \(n\) as \[ \begin{gather*} p(\hat{n}\mid n)=\int_0^{\infty}p(\hat{n}\mid\lambda)\,p(\lambda\mid n)\,d\lambda\\ =\int_0^{\infty}(\mathrm{Poisson}\times \mathrm{Gamma})\, d\lambda \end{gather*} \] Thus \(\hat{n}\mid n\) is again a Gamma-Poisson mixture, hence negative binomial by the same calculation as above.
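This claim admits the same numeric check as the marginal, a sketch under illustrative parameters and the assumption that the new observation \(\hat n\) shares the same exposure \(E\): by the marginalization result applied to the posterior \(\Gamma(\alpha+n,\beta+E)\), the predictive should be negative binomial with size \(\alpha+n\) and success probability \((\beta+E)/(\beta+2E)\).

```python
import numpy as np
from scipy import integrate, stats

alpha, beta, E = 2.0, 1.5, 3.0   # illustrative prior and exposure
n_obs = 5                        # an observed count, chosen for illustration

# Posterior of lambda given n_obs: Gamma(alpha + n_obs, rate = beta + E).
a_post, rate_post = alpha + n_obs, beta + E

def predictive_pmf(n_new):
    """p(n_new | n_obs): integrate Poisson(lambda*E) against the Gamma posterior."""
    integrand = lambda lam: stats.poisson.pmf(n_new, lam * E) * stats.gamma.pdf(lam, a_post, scale=1 / rate_post)
    val, _ = integrate.quad(integrand, 0, np.inf)
    return val

# By the marginalization result, this is negative binomial with size a_post
# and success probability rate_post / (rate_post + E).
p_post = rate_post / (rate_post + E)
for k in range(10):
    assert np.isclose(predictive_pmf(k), stats.nbinom.pmf(k, a_post, p_post), atol=1e-10)
print("posterior predictive matches negative binomial")
```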