Gen: probabilistic programming with fast custom inference via code generation
Probabilistic programming languages have the potential to make probabilistic modeling and inference easier to use in practice, but only if inference is sufficiently fast and accurate for real applications. Thus far, this has only been possible for domain-specific languages that focus on a restricted class of models and inference algorithms. This paper introduces Gen, a probabilistic programming language embedded in Julia that aims to be sufficiently expressive and extensible for general-purpose use. Gen is the first language to provide constructs for automatically generating optimized implementations of custom inference tactics based on static analysis of the target probabilistic model. Gen is also the first language to provide APIs for adding modeling constructs equipped with custom static analyses, and for adding inference tactics equipped with custom code generators. This paper uses two examples to show that Gen is more expressive than Stan, a widely used language for hierarchical Bayesian modeling. A first benchmark shows that a prototype implementation can be as fast as Stan, only ~1.4x slower than a hand-coded sampler in Julia, and ~7,500x faster than Venture, one of the few other probabilistic languages with support for custom inference.
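To make "custom inference tactics" concrete: a common tactic is importance sampling, where proposals are drawn from the prior and reweighted by the likelihood of the observed data. The sketch below is a minimal, language-agnostic illustration of that tactic in Python; it is not Gen's actual API (Gen's constructs are Julia macros and generative functions), and the model (a conjugate Normal-Normal pair) is a hypothetical example chosen so the exact posterior is known.

```python
import math
import random

random.seed(0)

def normal_logpdf(x, mu, sigma):
    """Log density of Normal(mu, sigma) at x."""
    return -0.5 * ((x - mu) / sigma) ** 2 - math.log(sigma * math.sqrt(2 * math.pi))

# Illustrative model (not from the paper):
#   mu ~ Normal(0, 1);  y ~ Normal(mu, 1);  observe y = 2.0.
# The exact posterior is Normal(1.0, sqrt(0.5)), so the posterior mean is 1.0.
Y_OBS = 2.0

def importance_sample_mean(n):
    """Estimate E[mu | y] by proposing mu from the prior and
    weighting each proposal by the likelihood of the observation."""
    samples, logws = [], []
    for _ in range(n):
        mu = random.gauss(0.0, 1.0)          # propose from the prior
        samples.append(mu)
        logws.append(normal_logpdf(Y_OBS, mu, 1.0))  # log importance weight
    m = max(logws)                            # stabilize the exponentials
    ws = [math.exp(lw - m) for lw in logws]
    total = sum(ws)
    return sum(w * s for w, s in zip(ws, samples)) / total

print(importance_sample_mean(50000))  # close to the exact posterior mean, 1.0
```

A system like Gen goes further than this hand-written loop: given the model, it can generate an optimized implementation of such a tactic automatically, using static analysis of the model to specialize the weight computation.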
Session: Mon 18 Jun, 14:00-15:30 (Eastern Time, US & Canada)