Applied Bayesian Inference - AM exercise
You can download the slides for the AM lecture here.
Logistic growth
In this example, we are going to perform inference for the logistic growth model commonly used in population biology to describe resource-limited growth. The population density $N(t)$ of a given bacterial population is supposed to follow:

$$\frac{\mathrm{d}N}{\mathrm{d}t} = r N \left(1 - \frac{N}{K}\right),$$

where we assume $N(0) = N_0 > 0$; $r$ is the initial (exponential) growth rate when resources are abundant; $K$ is the carrying capacity denoting the steady-state population density.
Logistic growth
Using Scipy's integrator, write a function which numerically solves the logistic equation. Plot the solution for $t$ from $0$ up to a final time long enough for the population to approach the carrying capacity $K$.
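A minimal sketch using `scipy.integrate.solve_ivp`; the parameter values ($r=0.2$, $K=100$, $N_0=10$) and the time span are illustrative assumptions.

```python
import numpy as np
import matplotlib.pyplot as plt
from scipy.integrate import solve_ivp


def solve_logistic(r, K, N0, t_eval):
    """Numerically solve dN/dt = r * N * (1 - N / K) with N(0) = N0."""
    rhs = lambda t, N: r * N * (1 - N / K)
    sol = solve_ivp(rhs, (t_eval[0], t_eval[-1]), [N0], t_eval=t_eval)
    return sol.y[0]


t = np.linspace(0, 50, 200)
plt.plot(t, solve_logistic(0.2, 100, 10, t))
plt.xlabel("t")
plt.ylabel("N(t)")
plt.show()
```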
Observed data
Assume that the observed data differ from the logistic model solution according to the measurement equation:

$$y_i = N(t_i) + \epsilon_i, \qquad \epsilon_i \overset{\text{i.i.d.}}{\sim} \mathcal{N}(0, \sigma^2),$$

where $\sigma$ controls the measurement variability. Generate 'observed' data at a set of measurement times $t_i$ assuming true values for $(r, K, N_0, \sigma)$. Plot these data overlaid on top of the true solution.
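One way to simulate the observations, reusing `solve_logistic` from above; the 11 equally spaced observation times and $\sigma = 5$ are illustrative assumptions.

```python
rng = np.random.default_rng(1)
t_obs = np.linspace(0, 50, 11)
sigma_true = 5.0
theta_true = (0.2, 100.0, 10.0)  # assumed true (r, K, N0)

# Add i.i.d. normal measurement noise to the ODE solution.
y_obs = solve_logistic(*theta_true, t_obs) + rng.normal(0, sigma_true, size=t_obs.shape)

plt.plot(t, solve_logistic(*theta_true, t), label="true solution")
plt.scatter(t_obs, y_obs, color="k", label="observed data")
plt.xlabel("t")
plt.ylabel("N(t)")
plt.legend()
plt.show()
```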
Logistic likelihood
The likelihood for this model is given by:

$$L(r, K, N_0, \sigma) = \prod_{i=1}^{n} \frac{1}{\sqrt{2\pi\sigma^2}} \exp\left(-\frac{\big(y_i - N(t_i)\big)^2}{2\sigma^2}\right),$$

where $n$ is the number of observations and $N(t_i)$ is the ODE solution at time $t_i$. Write a function which calculates the log-likelihood for $(r, K, N_0, \sigma)$ given the observed data.
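A sketch summing independent normal log-densities with `scipy.stats.norm`; it reuses `solve_logistic` from above.

```python
from scipy.stats import norm


def log_likelihood(r, K, N0, sigma, t_obs, y_obs):
    """Log-likelihood: independent normal observations around the ODE solution."""
    N_model = solve_logistic(r, K, N0, t_obs)
    return norm.logpdf(y_obs, loc=N_model, scale=sigma).sum()
```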
Plot likelihood
Using the simulated data you previously generated, plot the log-likelihood as a function of $r$, holding all other parameters at their true values.
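For example, sweeping $r$ over a grid while fixing the other parameters at the illustrative true values assumed above:

```python
r_grid = np.linspace(0.05, 0.5, 100)
ll = [log_likelihood(r, 100.0, 10.0, sigma_true, t_obs, y_obs) for r in r_grid]

plt.plot(r_grid, ll)
plt.xlabel("r")
plt.ylabel("log-likelihood")
plt.show()
```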
Plot likelihood contour
Plot the likelihood (not the log-likelihood, whose contours are harder to distinguish visually) as a contour surface over $(r, K)$, holding all other parameters at their true values.
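A sketch of the contour plot over an $(r, K)$ grid; the log-likelihood is shifted by its maximum before exponentiating so the surface does not underflow to zero.

```python
r_grid = np.linspace(0.1, 0.3, 60)
K_grid = np.linspace(80, 120, 60)
LL = np.array([[log_likelihood(r, K, 10.0, sigma_true, t_obs, y_obs)
                for r in r_grid] for K in K_grid])
L = np.exp(LL - LL.max())  # rescaled likelihood, max = 1

plt.contourf(r_grid, K_grid, L, levels=20)
plt.xlabel("r")
plt.ylabel("K")
plt.colorbar(label="likelihood (rescaled)")
plt.show()
```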
Prior predictive
Assume we have priors on the parameters $r$ and $K$, and we fix $N_0$ and $\sigma$ at their true values. Generate 100 prior predictive simulations of the ODE solution and plot these. Remember, a single prior predictive simulation is obtained by sampling each parameter from its prior and (in this case) solving the ODE for this parameter set.
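A sketch assuming, for illustration, zero-truncated normal priors $r \sim \mathcal{N}_{+}(0.2, 0.1^2)$ and $K \sim \mathcal{N}_{+}(100, 30^2)$:

```python
from scipy.stats import truncnorm


def sample_positive_normal(mu, sd, rng):
    # Normal(mu, sd^2) truncated to (0, inf).
    return truncnorm.rvs((0 - mu) / sd, np.inf, loc=mu, scale=sd, random_state=rng)


for _ in range(100):
    r = sample_positive_normal(0.2, 0.1, rng)
    K = sample_positive_normal(100, 30, rng)
    plt.plot(t, solve_logistic(r, K, 10.0, t), color="C0", alpha=0.2)
plt.xlabel("t")
plt.ylabel("N(t)")
plt.show()
```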
Initial condition
Now also allow $N_0$ to vary by placing a prior on it. How does the prior predictive distribution change?
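The only change to the previous loop is to draw $N_0$ from its prior as well; here, illustratively, $N_0 \sim \mathcal{N}_{+}(10, 5^2)$.

```python
# Identical loop, but N0 is now also drawn from its (assumed) prior.
for _ in range(100):
    r = sample_positive_normal(0.2, 0.1, rng)
    K = sample_positive_normal(100, 30, rng)
    N0 = sample_positive_normal(10, 5, rng)
    plt.plot(t, solve_logistic(r, K, N0, t), color="C1", alpha=0.2)
plt.xlabel("t")
plt.ylabel("N(t)")
plt.show()
```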
Metropolis Sampler: Proposal
We are now going to write a random walk Metropolis sampler. The first step is to write a method which takes as input the current parameter values $\theta = (r, K, N_0)$ and proposes new values using univariate normal distributions centered at the current values. So, for example,

$$r' \sim \mathcal{N}(r, \tau_r^2),$$

where the step size $\tau_r$ is a hyperparameter of the method. Write such a function.
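A minimal sketch, representing $\theta = (r, K, N_0)$ as a NumPy array so that one normal perturbation per component suffices:

```python
def propose(theta, steps, rng):
    """Random-walk proposal: perturb each component of theta = (r, K, N0)
    with an independent normal draw whose standard deviation is the step size."""
    return theta + rng.normal(0, steps)
```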
Metropolis Sampler: Prior
Assume priors on $(r, K, N_0)$ and fix $\sigma$ at its true value. Write a function that accepts $\theta = (r, K, N_0)$ as input and outputs the log-prior density $\log p(\theta)$.
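A sketch using the same illustrative zero-truncated normal priors as in the prior predictive section:

```python
def log_prior(theta):
    """Log-prior under the illustrative zero-truncated normal priors above."""
    r, K, N0 = theta
    if min(r, K, N0) <= 0:
        return -np.inf  # zero prior density off the positive orthant
    lp = truncnorm.logpdf(r, (0 - 0.2) / 0.1, np.inf, loc=0.2, scale=0.1)
    lp += truncnorm.logpdf(K, (0 - 100) / 30, np.inf, loc=100, scale=30)
    lp += truncnorm.logpdf(N0, (0 - 10) / 5, np.inf, loc=10, scale=5)
    return lp
```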
Metropolis Sampler: Posterior
Write a function which calculates the unnormalised log-posterior (i.e. the sum of the log-prior and log-likelihood), $\log p(\theta) + \log p(y \mid \theta)$, for a given parameter set.
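A sketch that short-circuits when the prior density is zero, so the ODE is never solved for inadmissible parameters; $\sigma$ is fixed at its illustrative true value.

```python
def log_posterior(theta, t_obs, y_obs, sigma=sigma_true):
    """Unnormalised log-posterior: log-prior plus log-likelihood."""
    lp = log_prior(theta)
    if not np.isfinite(lp):
        return -np.inf  # skip the ODE solve for inadmissible parameters
    r, K, N0 = theta
    return lp + log_likelihood(r, K, N0, sigma, t_obs, y_obs)
```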
Metropolis Sampler: Step Accept
The next step is to write a function called 'step_accept' which takes as input the current parameters $\theta$ and proposes new values $\theta'$. It then calculates the ratio:

$$a = \exp\big(\log p(\theta' \mid y) - \log p(\theta \mid y)\big),$$

then generates $u \sim U(0, 1)$ and does the following: if $a > u$, return $\theta'$; else return $\theta$.
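A sketch of `step_accept`; it works with log-densities and accepts when $\log u < \log p(\theta' \mid y) - \log p(\theta \mid y)$, which is equivalent to the ratio test above because the logarithm is monotonic.

```python
def step_accept(theta, steps, t_obs, y_obs, rng):
    """One Metropolis step: propose, then accept or reject."""
    theta_prop = propose(theta, steps, rng)
    log_a = log_posterior(theta_prop, t_obs, y_obs) - log_posterior(theta, t_obs, y_obs)
    # log(u) < log_a is equivalent to a > u in the ratio test above.
    if np.log(rng.uniform()) < log_a:
        return theta_prop
    return theta
```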
Metropolis Sampler: MCMC
Write a function which iterates 'step_accept', generating a chain of MCMC samples of $\theta = (r, K, N_0)$. Initialise $\theta$ using samples from the prior.
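A minimal sketch that stores the chain in a preallocated array:

```python
def run_mcmc(n_iter, theta_init, steps, t_obs, y_obs, rng):
    """Iterate step_accept, storing the chain as an (n_iter, n_params) array."""
    chain = np.empty((n_iter, len(theta_init)))
    theta = np.asarray(theta_init, dtype=float)
    for i in range(n_iter):
        theta = step_accept(theta, steps, t_obs, y_obs, rng)
        chain[i] = theta
    return chain
```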
Metropolis Sampler: Plot sampled values
Assuming suitable step sizes for each parameter, generate an MCMC sample of 1000 draws. Visualise the sampled values of $(r, K, N_0)$ over time.
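For example, with illustrative step sizes of $(0.02, 5, 1)$ for $(r, K, N_0)$ and an initial point drawn from the assumed priors:

```python
steps = np.array([0.02, 5.0, 1.0])  # illustrative step sizes for (r, K, N0)
theta_init = np.array([sample_positive_normal(0.2, 0.1, rng),
                       sample_positive_normal(100, 30, rng),
                       sample_positive_normal(10, 5, rng)])
chain = run_mcmc(1000, theta_init, steps, t_obs, y_obs, rng)

fig, axes = plt.subplots(3, 1, sharex=True)
for ax, series, name in zip(axes, chain.T, ["r", "K", "N0"]):
    ax.plot(series)
    ax.set_ylabel(name)
axes[-1].set_xlabel("iteration")
plt.show()
```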
Metropolis Sampler: Plot sampled values contour
Plot the pairwise distribution of $(r, K, N_0)$. How do the sampled values compare to the true parameters?
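A sketch built with plain Matplotlib (a dedicated library such as corner would also work); red lines mark the assumed true values.

```python
names = ["r", "K", "N0"]
truths = [0.2, 100.0, 10.0]

fig, axes = plt.subplots(3, 3, figsize=(8, 8))
for i in range(3):
    for j in range(3):
        ax = axes[i, j]
        if i == j:
            # Marginal histogram on the diagonal.
            ax.hist(chain[:, i], bins=30)
            ax.axvline(truths[i], color="r")
        else:
            # Pairwise scatter off the diagonal.
            ax.scatter(chain[:, j], chain[:, i], s=2, alpha=0.3)
            ax.axvline(truths[j], color="r")
            ax.axhline(truths[i], color="r")
        if i == 2:
            ax.set_xlabel(names[j])
        if j == 0:
            ax.set_ylabel(names[i])
plt.show()
```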
Metropolis Sampler: Convergence
Modify your MCMC routine to use the following half-normal distributions to sample initial points for your chains. Run four independent chains for 1000 iterations each and plot the samples over time. How long does it appear your chains must be run before they reach a stationary distribution?
Note, to do so, you can use the following function:
```python
import scipy.stats


def truncated_normal(mu, sd):
    # Normal(mu, sd^2) truncated to [0, 1e6]; scipy.stats.truncnorm expects
    # the clip points in standard-deviation units relative to the mean.
    myclip_a = 0
    myclip_b = 1000000
    a, b = (myclip_a - mu) / sd, (myclip_b - mu) / sd
    r = scipy.stats.truncnorm.rvs(a, b, loc=mu, scale=sd, size=1)
    return r[0]
```
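A sketch running four chains initialised with `truncated_normal`; the means and standard deviations of the initial distributions are illustrative assumptions.

```python
chains = []
for _ in range(4):
    theta_init = np.array([truncated_normal(0.2, 0.1),
                           truncated_normal(100, 30),
                           truncated_normal(10, 5)])
    chains.append(run_mcmc(1000, theta_init, steps, t_obs, y_obs, rng))

# Overlay the four traces per parameter to judge when they mix together.
fig, axes = plt.subplots(3, 1, sharex=True)
for chain_c in chains:
    for ax, series, name in zip(axes, chain_c.T, ["r", "K", "N0"]):
        ax.plot(series, alpha=0.7)
        ax.set_ylabel(name)
axes[-1].set_xlabel("iteration")
plt.show()
```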
Metropolis Sampler: Posterior predictive
Using a random subset of 100 samples from all four chains taken from after they appear to have reached the stationary distribution, draw from the posterior predictive distribution of the logistic equation solution and overlay the observations.
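A sketch assuming a burn-in of 200 iterations (adjust to whatever your trace plots suggest); it pools the post-burn-in samples from all four chains and plots the corresponding ODE solutions.

```python
burn_in = 200  # assumed burn-in; set from your convergence plots
pooled = np.concatenate([c[burn_in:] for c in chains])
idx = rng.choice(len(pooled), size=100, replace=False)

for r_s, K_s, N0_s in pooled[idx]:
    plt.plot(t, solve_logistic(r_s, K_s, N0_s, t), color="C0", alpha=0.1)
plt.scatter(t_obs, y_obs, color="k", zorder=3, label="observed data")
plt.xlabel("t")
plt.ylabel("N(t)")
plt.legend()
plt.show()
```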