Abstract: Markov chain Monte Carlo (MCMC) requires only the ability to evaluate the likelihood, making it a common technique for inference in complex models. However, it can mix slowly, requiring many samples to obtain good estimates and incurring a high overall computational cost. In this talk, I will present a multi-fidelity layered MCMC method, Shrek MCMC, that exploits lower-fidelity approximations of the true likelihood to improve mixing, leading to faster overall performance. Such lower-fidelity likelihoods are commonly available in scientific and engineering applications where the model involves a simulation whose resolution or accuracy can be tuned. Our technique uses recursive, layered chains and an automatically adapting uniform smoothing parameter; it requires no particular form or internal mathematical structure for the likelihood, and it achieves larger effective sample sizes in the same computational time across scientific domains including hydrology and cosmology.
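The core idea of screening proposals with a cheap likelihood before paying for the expensive one can be sketched as a two-level delayed-acceptance Metropolis-Hastings step. This is a simplified illustration under assumed stand-in likelihoods, not the actual Shrek MCMC algorithm: the recursive layered chains and the adaptive uniform smoothing parameter from the talk are not shown, and `log_like_hi` / `log_like_lo` are hypothetical toy densities standing in for an expensive simulation and its low-fidelity approximation.

```python
import math
import random

random.seed(0)

def log_like_hi(x):
    # "High-fidelity" target: standard normal log-density (a toy stand-in
    # for an expensive simulation-based likelihood).
    return -0.5 * x * x

def log_like_lo(x):
    # "Low-fidelity" surrogate: cheap and slightly biased (wider variance).
    return -0.5 * x * x / 1.2

def two_stage_step(x, step=1.0):
    """One delayed-acceptance step with a symmetric random-walk proposal:
    screen with the cheap likelihood first; evaluate the expensive
    likelihood only for proposals that survive the screen."""
    y = x + random.gauss(0.0, step)
    # Stage 1: accept/reject under the low-fidelity likelihood.
    a1 = math.exp(min(0.0, log_like_lo(y) - log_like_lo(x)))
    if random.random() >= a1:
        return x, False  # rejected cheaply; expensive model never called
    # Stage 2: correct with the high-fidelity likelihood so the chain
    # still targets the true (high-fidelity) posterior.
    a2 = math.exp(min(0.0, (log_like_hi(y) - log_like_hi(x))
                           - (log_like_lo(y) - log_like_lo(x))))
    if random.random() < a2:
        return y, True
    return x, True

def run_chain(n=20000, x0=0.0):
    x, hi_calls, samples = x0, 0, []
    for _ in range(n):
        x, used_hi = two_stage_step(x)
        hi_calls += used_hi
        samples.append(x)
    return samples, hi_calls

samples, hi_calls = run_chain()
mean = sum(samples) / len(samples)
```

Because stage 1 rejects poor proposals using only the surrogate, `hi_calls` is smaller than the chain length, which is the source of the computational savings when the high-fidelity likelihood dominates the cost.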