Matthew Galdo and Giwon Bahg developed a new algorithm for Bayesian inference that uses Differential Evolution (DE) to drive the optimization step of Variational Bayesian posterior estimation. Unlike typical automatic-differentiation approaches that rely on stochastic gradient descent, DE approximates the gradient through finite differences among particles in the population, giving the newly developed DEVI algorithm a leg up on the non-standard optimization problems often found in psychology.
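To get a feel for the idea, here is a minimal, illustrative sketch (not the authors' DEVI implementation) of standard DE/rand/1/bin minimizing a variational objective. The target posterior, the Gaussian variational family, and the closed-form KL objective are all assumptions chosen to keep the example self-contained; the key point is that the scaled difference between randomly chosen particles plays the role a gradient plays in stochastic gradient descent.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical target posterior: N(2, 1.5). Variational family: q = N(mu, sigma).
MU_P, SIGMA_P = 2.0, 1.5

def kl_q_p(theta):
    """Closed-form KL(q || p) between two Gaussians; serves as the DE fitness."""
    mu_q, log_sigma_q = theta
    sigma_q = np.exp(log_sigma_q)
    return (np.log(SIGMA_P / sigma_q)
            + (sigma_q**2 + (mu_q - MU_P)**2) / (2 * SIGMA_P**2)
            - 0.5)

def de_minimize(objective, dim=2, n_particles=20, n_gens=200, F=0.7, CR=0.9):
    """DE/rand/1/bin: particle differences stand in for a gradient step."""
    pop = rng.normal(0.0, 1.0, size=(n_particles, dim))
    fit = np.array([objective(p) for p in pop])
    for _ in range(n_gens):
        for i in range(n_particles):
            # Pick three distinct particles other than i.
            r1, r2, r3 = rng.choice(
                [j for j in range(n_particles) if j != i], size=3, replace=False)
            # Mutation: a scaled difference vector acts as a finite-difference step.
            mutant = pop[r1] + F * (pop[r2] - pop[r3])
            # Binomial crossover, guaranteeing at least one mutant coordinate.
            mask = rng.random(dim) < CR
            mask[rng.integers(dim)] = True
            trial = np.where(mask, mutant, pop[i])
            # Greedy selection: keep the trial only if it improves the objective.
            f_trial = objective(trial)
            if f_trial < fit[i]:
                pop[i], fit[i] = trial, f_trial
    return pop[np.argmin(fit)]

best = de_minimize(kl_q_p)
print(best[0], np.exp(best[1]))  # should approach the target (2.0, 1.5)
```

Because no gradients of the objective are ever computed, the same loop works even when the variational objective is noisy or non-differentiable, which is the regime where a DE-driven optimizer is claimed to help.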
Check it out!