Differentiable Programming in HEP
Lukas Heinrich, TU Munich
The rise of machine learning within the last decade has, to a large degree, also been the success of differentiable programming and gradient-based methods, both in optimization and in statistical inference. Going beyond vanilla deep learning, differentiable programming allows physicists to inject domain knowledge throughout the ML workflow: from adding inductive bias to models via symmetries, to using physics models within the loss definition, to defining more informative label data. While this can significantly increase both the interpretability and the efficiency of ML applications in physics, challenges remain in casting key physics processes in the language of differentiable programming, particularly for the deeply hierarchical stochastic processes observed in high energy physics. In this talk I will review recent advances in applying differentiable programming as a paradigm to HEP and point out new research directions.
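To make the idea of "using physics models within the loss definition" concrete, here is a minimal sketch (not taken from the talk) using JAX: a toy physics model, a hypothetical exponential-decay law with lifetime parameter `tau`, is embedded directly in the loss, so automatic differentiation propagates gradients through the physics model itself and the parameter can be fit by gradient descent.

```python
# Sketch only: `decay_model`, `tau`, and the synthetic data are illustrative
# assumptions, not part of the talk. The point is that jax.grad differentiates
# through an arbitrary physics computation placed inside the loss.
import jax
import jax.numpy as jnp

def decay_model(tau, t):
    # Toy physics model: survival fraction after time t for lifetime tau.
    return jnp.exp(-t / tau)

def loss(tau, t, observed):
    # Physics model sits inside the loss; gradients flow through it.
    return jnp.mean((decay_model(tau, t) - observed) ** 2)

# Synthetic "measurements" generated with a true lifetime of 2.0.
t = jnp.linspace(0.1, 5.0, 50)
observed = jnp.exp(-t / 2.0)

# Gradient-based optimization of the physical parameter itself.
grad_loss = jax.grad(loss)
tau = 1.0
for _ in range(500):
    tau = tau - 1.0 * grad_loss(tau, t, observed)

print(float(tau))  # tau approaches the true lifetime 2.0
```

The same pattern generalizes: any differentiable simulation or detector-response surrogate can replace `decay_model`, turning physical parameters into directly optimizable quantities.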