References of "Louppe, Gilles"
Likelihood ratio estimation for statistical inference in physical sciences
Louppe, Gilles (ULiège)

Conference (2019, April 19)

Lectures on Deep Learning
Louppe, Gilles (ULiège)

Conference (2019, April 08)

Full Text
Peer Reviewed
Adversarial Variational Optimization of Non-Differentiable Simulators
Louppe, Gilles (ULiège); Hermans, Joeri (ULiège); Cranmer, Kyle

in Proceedings of Machine Learning Research (2019, April)

Complex computer simulators are increasingly used across fields of science as generative models tying parameters of an underlying theory to experimental observations. Inference in this setup is often difficult, as simulators rarely admit a tractable density or likelihood function. We introduce Adversarial Variational Optimization (AVO), a likelihood-free inference algorithm for fitting a non-differentiable generative model incorporating ideas from generative adversarial networks, variational optimization and empirical Bayes. We adapt the training procedure of Wasserstein GANs by replacing the differentiable generative network with a domain-specific simulator. We solve the resulting non-differentiable minimax problem by minimizing variational upper bounds of the two adversarial objectives. Effectively, the procedure results in learning a proposal distribution over simulator parameters, such that the Wasserstein distance between the marginal distribution of the synthetic data and the empirical distribution of observed data is minimized. We present results of the method with simulators producing both discrete and continuous data.
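The variational-optimization step at the heart of AVO can be illustrated with a toy, self-contained sketch. Everything below is an illustrative stand-in, not the paper's setup: the simulator, the squared-distance loss (which replaces the Wasserstein critic), and all constants are hypothetical. The point is only that a score-function (REINFORCE) gradient of the surrogate objective lets us fit a proposal over simulator parameters even though the simulator itself is non-differentiable:

```python
import numpy as np

rng = np.random.default_rng(0)

def simulator(theta):
    # Stand-in for a black-box, non-differentiable simulator:
    # we can sample from it but not differentiate through it.
    return theta + rng.normal(0.0, 0.1)

observed = 3.0  # summary statistic of the "real" data (illustrative)

def sim_loss(theta):
    # Distance between simulated and observed summaries; in AVO proper
    # this role is played by a Wasserstein critic, not a fixed loss.
    return (simulator(theta) - observed) ** 2

# Variational optimization: replace min_theta L(theta) by
# min_psi E_{theta ~ q_psi}[L(theta)] with q_psi = N(mu, sigma^2).
# The surrogate is differentiable in psi via the score-function
# gradient even though L itself is not.
mu, log_sigma = 0.0, 0.0
lr = 0.05
for _ in range(500):
    sigma = np.exp(log_sigma)
    eps = rng.normal(size=32)
    thetas = mu + sigma * eps
    losses = np.array([sim_loss(t) for t in thetas])
    adv = losses - losses.mean()              # baseline for variance reduction
    mu -= lr * np.mean(adv * eps / sigma)     # d log q / d mu = eps / sigma
    log_sigma -= lr * np.mean(adv * (eps**2 - 1.0))
    log_sigma = max(log_sigma, np.log(0.1))   # keep the proposal from collapsing
# mu has now moved from 0.0 toward the target value 3.0
```

The proposal mean converges to the parameter value whose simulated summaries match the observed one, which is the essence of learning a proposal distribution over simulator parameters.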

Likelihood-free inference in Physical Sciences
Louppe, Gilles (ULiège)

Conference (2019, March 22)

Intelligence artificielle [Artificial intelligence]
Louppe, Gilles (ULiège)

Conference given outside the academic context (2019)

Full Text
Likelihood-free MCMC with Approximate Likelihood Ratios
Hermans, Joeri (ULiège); Begy, Volodimir; Louppe, Gilles (ULiège)

E-print/Working paper (2019)

We propose a novel approach for posterior sampling with intractable likelihoods. This is an increasingly important problem in scientific applications where models are implemented as sophisticated computer simulations. As a result, tractable densities are not available, which forces practitioners to rely on approximations during inference. We address the intractability of densities by training a parameterized classifier whose output is used to approximate likelihood ratios between arbitrary model parameters. In turn, we are able to draw posterior samples by plugging this approximator into common Markov chain Monte Carlo samplers such as Metropolis-Hastings and Hamiltonian Monte Carlo. We demonstrate the proposed technique by fitting the generating parameters of implicit models, ranging from a linear probabilistic model to settings in high energy physics with high-dimensional observations. Finally, we discuss several diagnostics to assess the quality of the posterior.
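The underlying "likelihood ratio trick" can be sketched in a few lines. Here a hypothetical one-dimensional Gaussian simulator and a plain logistic-regression classifier stand in for the paper's parameterized neural classifier; the specific parameter values are illustrative only. A classifier trained to separate samples drawn at two parameter values recovers, through its logit, the log-likelihood ratio between them:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy implicit model: x | theta ~ N(theta, 1). We pretend the density
# is unknown and only samples from the simulator are available.
theta0, theta1 = 0.0, 1.0
x0 = rng.normal(theta0, 1.0, 20000)  # class label 0
x1 = rng.normal(theta1, 1.0, 20000)  # class label 1

# Logistic-regression classifier d(x) = sigmoid(w*x + b), trained by
# full-batch gradient descent to separate the two sample sets. Its
# logit converges to the exact log-likelihood ratio
# log p(x|theta1) - log p(x|theta0) = x - 1/2 for this toy model.
x = np.concatenate([x0, x1])
y = np.concatenate([np.zeros_like(x0), np.ones_like(x1)])
w, b = 0.0, 0.0
lr = 0.5
for _ in range(3000):
    p = 1.0 / (1.0 + np.exp(-(w * x + b)))
    w -= lr * np.mean((p - y) * x)
    b -= lr * np.mean(p - y)

def log_ratio(xq):
    # Classifier logit ~ approximate log p(x|theta1)/p(x|theta0).
    return w * xq + b
# w is now close to 1.0 and b close to -0.5
```

An amortized version of this approximator, conditioned on the parameters, is what can be plugged into the Metropolis-Hastings acceptance ratio in place of the intractable likelihood ratio.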

Doc'Café: "Siffler en travaillant? de Marx à l'intelligence artificielle" ["Whistle while you work? From Marx to artificial intelligence"]
Louppe, Gilles (ULiège)

Conference given outside the academic context (2019)

Lectures on Deep Learning
Louppe, Gilles (ULiège)

Conference (2019, January 14)

Parameter inference and data modelling with deep learning
Louppe, Gilles (ULiège)

Conference (2019, January 09)

Full Text
Peer Reviewed
QCD-Aware Recursive Neural Networks for Jet Physics
Louppe, Gilles (ULiège); Cho, Kyunghyun; Becot, Cyril et al.

in Journal of High Energy Physics (2019)

Recent progress in applying machine learning for jet physics has been built upon an analogy between calorimeters and images. In this work, we present a novel class of recursive neural networks built instead upon an analogy between QCD and natural languages. In the analogy, four-momenta are like words and the clustering history of sequential recombination jet algorithms is like the parsing of a sentence. Our approach works directly with the four-momenta of a variable-length set of particles, and the jet-based tree structure varies on an event-by-event basis. Our experiments highlight the flexibility of our method for building task-specific jet embeddings and show that recursive architectures are significantly more accurate and data efficient than previous image-based networks. We extend the analogy from individual jets (sentences) to full events (paragraphs), and show for the first time an event-level classifier operating on all the stable particles produced in an LHC event.
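The recursive structure can be sketched with a minimal, untrained toy. Everything here is hypothetical (random weights, an arbitrary embedding dimension, made-up four-momenta): the point is only the mechanism, in which shared weights are applied up a clustering tree whose shape changes event by event, like a parser combining words into a sentence embedding:

```python
import numpy as np

rng = np.random.default_rng(4)

# Hypothetical recursive embedding over a jet clustering tree: each
# leaf is a four-momentum, each internal node combines its children
# with shared weights.
D = 8                                    # embedding dimension (arbitrary)
W_leaf = rng.normal(0, 0.1, (D, 4))      # four-momentum -> embedding
W_comb = rng.normal(0, 0.1, (D, 2 * D))  # (left, right) -> embedding

def embed(node):
    if isinstance(node, tuple):          # internal node: (left, right)
        h = np.concatenate([embed(node[0]), embed(node[1])])
        return np.tanh(W_comb @ h)
    return np.tanh(W_leaf @ node)        # leaf: raw four-momentum

# A toy "jet" of three particles clustered as ((p1, p2), p3); the tree
# shape varies per event, the weights do not.
p1 = np.array([10.0, 1.0, 0.5, 9.9])
p2 = np.array([12.0, -0.5, 1.0, 11.9])
p3 = np.array([5.0, 0.2, -0.3, 4.9])
jet_embedding = embed(((p1, p2), p3))
print(jet_embedding.shape)  # (8,)
```

A fixed-size embedding comes out regardless of how many particles went in, which is what makes the representation usable for downstream jet- or event-level classifiers.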

Full Text
Peer Reviewed
Recurrent machines for likelihood-free inference
Pesah, Arthur; Wehenkel, Antoine (ULiège); Louppe, Gilles (ULiège)

Conference (2018, December 08)

Likelihood-free inference is concerned with the estimation of the parameters of a non-differentiable stochastic simulator that best reproduce real observations. In the absence of a likelihood function, most of the existing inference methods optimize the simulator parameters through a handcrafted iterative procedure that tries to make the simulated data more similar to the observations. In this work, we explore whether meta-learning can be used in the likelihood-free context, for learning automatically from data an iterative optimization procedure that would solve likelihood-free inference problems. We design a recurrent inference machine that learns a sequence of parameter updates leading to good parameter estimates, without ever specifying some explicit notion of divergence between the simulated data and the real data distributions. We demonstrate our approach on toy simulators, showing promising results both in terms of performance and robustness.

Full Text
Peer Reviewed
Deep Quality-Value (DQV) Learning
Sabatelli, Matthia (ULiège); Louppe, Gilles (ULiège); Geurts, Pierre (ULiège) et al.

in Advances in Neural Information Processing Systems, Deep Reinforcement Learning Workshop (2018)

We introduce a novel Deep Reinforcement Learning (DRL) algorithm called Deep Quality-Value (DQV) Learning. DQV uses temporal-difference learning to train a Value neural network and uses this network for training a second Quality-value network that learns to estimate state-action values. We first test DQV's update rules with Multilayer Perceptrons as function approximators on two classic RL problems, and then extend DQV with the use of Deep Convolutional Neural Networks, 'Experience Replay' and 'Target Neural Networks' for tackling four games of the Atari Arcade Learning Environment. Our results show that DQV learns significantly faster and better than Deep Q-Learning and Double Deep Q-Learning, suggesting that our algorithm can potentially be a better performing synchronous temporal difference algorithm than what is currently present in DRL.
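The two coupled update rules can be sketched in tabular form. A toy deterministic chain MDP replaces the Atari games, and lookup tables replace the neural networks; the MDP, step sizes, and episode counts are all illustrative. The key point is that the state-value estimate V supplies the bootstrap target r + γV(s') for both its own TD update and the Quality-value update:

```python
import numpy as np

rng = np.random.default_rng(2)

# Tiny deterministic chain MDP: states 0..3, state 3 is terminal.
# Action 0 = left, action 1 = right; reward +1 on reaching the goal.
N_STATES, GOAL = 4, 3

def step(s, a):
    s2 = min(s + 1, GOAL) if a == 1 else max(s - 1, 0)
    r = 1.0 if s2 == GOAL else 0.0
    return s2, r, s2 == GOAL

V = np.zeros(N_STATES)
Q = np.zeros((N_STATES, 2))
alpha, gamma, eps = 0.2, 0.9, 0.2

for _ in range(500):
    s, done = 0, False
    while not done:
        # epsilon-greedy behavior policy over the Quality values
        a = int(rng.integers(2)) if rng.random() < eps else int(np.argmax(Q[s]))
        s2, r, done = step(s, a)
        target = r if done else r + gamma * V[s2]
        # DQV's two coupled updates: TD learning for the state values,
        # and the same bootstrap target for the state-action values.
        V[s] += alpha * (target - V[s])
        Q[s, a] += alpha * (target - Q[s, a])
        s = s2

print([int(np.argmax(Q[s])) for s in range(GOAL)])  # greedy policy: [1, 1, 1]
```

After training, the greedy policy derived from Q moves right in every non-terminal state, i.e. straight to the goal.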

Likelihood-free inference for Physical Sciences
Louppe, Gilles (ULiège)

Conference (2018, November 19)

Likelihood-free inference in Physical Sciences
Louppe, Gilles (ULiège)

Conference (2018, November 16)

Likelihood-free inference, effectively
Louppe, Gilles (ULiège)

Conference (2018, October 26)

Constraining Effective Field Theories with Machine Learning
Louppe, Gilles (ULiège)

Conference (2018, October 17)

We present powerful new analysis techniques to constrain effective field theories at the LHC. By leveraging the structure of particle physics processes, we extract extra information from Monte-Carlo simulations, which can be used to train neural network models that estimate the likelihood ratio. These methods scale well to processes with many observables and theory parameters, do not require any approximations of the parton shower or detector response, and can be evaluated in microseconds. We show that they allow us to put significantly stronger bounds on dimension-six operators than existing methods, demonstrating their potential to improve the precision of the LHC legacy constraints.

Full Text
Peer Reviewed
Constraining Effective Field Theories with Machine Learning
Brehmer, Johann; Cranmer, Kyle; Louppe, Gilles (ULiège) et al.

in Physical Review Letters (2018)

We present powerful new analysis techniques to constrain effective field theories at the LHC. By leveraging the structure of particle physics processes, we extract extra information from Monte-Carlo simulations, which can be used to train neural network models that estimate the likelihood ratio. These methods scale well to processes with many observables and theory parameters, do not require any approximations of the parton shower or detector response, and can be evaluated in microseconds. We show that they allow us to put significantly stronger bounds on dimension-six operators than existing methods, demonstrating their potential to improve the precision of the LHC legacy constraints.

Full Text
Peer Reviewed
A Guide to Constraining Effective Field Theories with Machine Learning
Brehmer, Johann; Cranmer, Kyle; Louppe, Gilles (ULiège) et al.

in Physical Review D (2018)

We develop, discuss, and compare several inference techniques to constrain theory parameters in collider experiments. By harnessing the latent-space structure of particle physics processes, we extract extra information from the simulator. This augmented data can be used to train neural networks that precisely estimate the likelihood ratio. The new methods scale well to many observables and high-dimensional parameter spaces, do not require any approximations of the parton shower and detector response, and can be evaluated in microseconds. Using weak-boson-fusion Higgs production as an example process, we compare the performance of several techniques. The best results are found for likelihood ratio estimators trained with extra information about the score, the gradient of the log likelihood function with respect to the theory parameters. The score also provides sufficient statistics that contain all the information needed for inference in the neighborhood of the Standard Model. These methods enable us to put significantly stronger bounds on effective dimension-six operators than the traditional approach based on histograms. They also outperform generic machine learning methods that do not make use of the particle physics structure, demonstrating their potential to substantially improve the new physics reach of the LHC legacy results.

Full Text
Likelihood-free inference with an improved cross-entropy estimator
Stoye, Markus; Brehmer, Johann; Louppe, Gilles (ULiège) et al.

E-print/Working paper (2018)

We extend recent work (Brehmer et al., 2018) that uses neural networks as surrogate models for likelihood-free inference. As in that work, we exploit the fact that the joint likelihood ratio and joint score, conditioned on both observed and latent variables, can often be extracted from an implicit generative model or simulator to augment the training data for these surrogate models. We show how this augmented training data can be used to define a new cross-entropy estimator, which offers improved sample efficiency compared to previous loss functions exploiting the same augmented training data.

Full Text
Efficient Probabilistic Inference in the Quest for Physics Beyond the Standard Model
Gunes Baydin, Atilim; Heinrich, Lukas; Bhimji, Wahid et al.

E-print/Working paper (2018)

We present a novel framework that enables efficient probabilistic inference in large-scale scientific models by allowing the execution of existing domain-specific simulators as probabilistic programs, resulting in highly interpretable posterior inference. Our framework is general purpose and scalable, and is based on a cross-platform probabilistic execution protocol through which an inference engine can control simulators in a language-agnostic way. We demonstrate the technique in particle physics, on a scientifically accurate simulation of the tau lepton decay, which is a key ingredient in establishing the properties of the Higgs boson. High-energy physics has a rich set of simulators based on quantum field theory and the interaction of particles in matter. We show how to use probabilistic programming to perform Bayesian inference in these existing simulator codebases directly, in particular conditioning on observable outputs from a simulated particle detector to directly produce an interpretable posterior distribution over decay pathways. Inference efficiency is achieved via inference compilation where a deep recurrent neural network is trained to parameterize proposal distributions and control the stochastic simulator in a sequential importance sampling scheme, at a fraction of the computational cost of Markov chain Monte Carlo sampling.
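The importance-sampling idea behind inference compilation can be reduced to a minimal, non-sequential sketch. A conjugate Gaussian model (with a known analytic posterior) replaces the physics simulator, and a hand-picked proposal stands in for the trained recurrent inference network; all numbers are illustrative. Samples from the proposal are reweighted by prior times likelihood over proposal density to recover the posterior:

```python
import numpy as np

rng = np.random.default_rng(3)

# Conjugate toy model so the answer is known: theta ~ N(0, 1),
# x | theta ~ N(theta, 1), observed x = 2  =>  posterior N(1, 1/2).
x_obs = 2.0

# "Learned" proposal standing in for the trained inference network;
# here hand-picked to roughly cover the posterior.
q_mean, q_std = 1.0, 1.0
n = 200_000
theta = rng.normal(q_mean, q_std, n)

# Self-normalized importance weights: prior * likelihood / proposal
# (additive constants cancel after normalization).
log_w = (-0.5 * theta**2                         # log prior
         - 0.5 * (x_obs - theta)**2              # log likelihood
         + 0.5 * ((theta - q_mean) / q_std)**2)  # minus log proposal
w = np.exp(log_w - log_w.max())
w /= w.sum()

post_mean = float(np.sum(w * theta))
print(round(post_mean, 1))  # analytic posterior mean is 1.0
```

In the paper's framework the proposal is not hand-picked but emitted step by step by a trained network as the simulator runs, which is what makes the scheme sequential and far cheaper than MCMC per posterior sample.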
