Event Generation and Statistical Sampling with Deep Generative Models

Rob Verheyen
Introduction

Event generation is really hard!
Introduction

Can we use deep neural networks to do event generation? Possible applications:
• Faster event generation
• Data-driven generators
• Targeted event generation
Introduction

A study of different types of unsupervised generative models:
• Generative Adversarial Networks
• Variational Autoencoders
• Buffer Variational Autoencoder (B-VAE)

Can these networks be used for event generation?
Generative Adversarial Networks (GANs)
Generative Adversarial Networks

Two networks (generator and discriminator) that play a game against each other.
Generative Adversarial Networks

Loss function (the standard minimax game):

$\min_G \max_D \; \mathbb{E}_{x \sim p_{\text{data}}}\!\left[\log D(x)\right] + \mathbb{E}_{z \sim p_z}\!\left[\log\left(1 - D(G(z))\right)\right]$

At the Nash equilibrium: $p_{\text{data}}(x) = p_{\text{gen}}(x)$ and $D(x) = \tfrac{1}{2}$.
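To make the game concrete, here is a minimal sketch of one GAN training step in PyTorch. The network sizes, learning rates, and event dimensionality are illustrative assumptions, not the setup used in this work:

```python
# A minimal GAN training step (sketch); all hyperparameters are assumptions.
import torch
import torch.nn as nn

latent_dim, data_dim = 8, 4  # assumed: events as flat four-vectors

generator = nn.Sequential(
    nn.Linear(latent_dim, 64), nn.ReLU(), nn.Linear(64, data_dim)
)
discriminator = nn.Sequential(
    nn.Linear(data_dim, 64), nn.ReLU(), nn.Linear(64, 1), nn.Sigmoid()
)
opt_g = torch.optim.Adam(generator.parameters(), lr=1e-4)
opt_d = torch.optim.Adam(discriminator.parameters(), lr=1e-4)
bce = nn.BCELoss()

def train_step(real_batch):
    n = real_batch.size(0)
    ones, zeros = torch.ones(n, 1), torch.zeros(n, 1)

    # Discriminator step: push D(real) -> 1 and D(fake) -> 0.
    fake = generator(torch.randn(n, latent_dim)).detach()
    loss_d = bce(discriminator(real_batch), ones) + bce(discriminator(fake), zeros)
    opt_d.zero_grad(); loss_d.backward(); opt_d.step()

    # Generator step: fool the discriminator (the non-saturating variant
    # of the minimax loss, standard in practice).
    loss_g = bce(discriminator(generator(torch.randn(n, latent_dim))), ones)
    opt_g.zero_grad(); loss_g.backward(); opt_g.step()
```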
Generative Adversarial Networks

[Figure: GAN-generated images from StyleGAN, arXiv:1812.04948]
Variational Autoencoders (VAEs)
Autoencoders

• Data is encoded into a latent space
• The dimension of the latent space is often lower than the dimension of the data
Variational Autoencoders

Add a degree of randomness to the training procedure: instead of a point, the encoder outputs a Gaussian $\mathcal{N}(\mu_i, \sigma_i)$ in latent space.
Variational Autoencoders

Points in latent space are ordered.
Variational Autoencoders

Loss function:

$\mathcal{L}_{\text{VAE}} = (1 - \beta)\,\frac{1}{N} \sum_i \left(\vec{x}_i - \vec{y}_i\right)^2 + \beta\, D_{\mathrm{KL}}\!\left(\mathcal{N}(\mu_i, \sigma_i)\,\|\,\mathcal{N}(0, 1)\right)$

• Mean squared error (MSE): the Gaussians prefer being very narrow
• Kullback–Leibler (KL) divergence: the Gaussians prefer being close to $\mathcal{N}(0, 1)$
• $\beta$ is a hyperparameter: tune by hand
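A minimal sketch of this loss in PyTorch, assuming the encoder returns $(\mu, \log\sigma^2)$ and $\vec{y}$ is the decoder's reconstruction of $\vec{x}$; the reparameterization trick is included because it is what keeps the sampling step differentiable:

```python
# Sketch of the beta-weighted VAE loss from the slide:
# L = (1 - beta) * MSE + beta * KL( N(mu, sigma) || N(0, 1) ).
import torch
import torch.nn.functional as F

def reparameterize(mu, log_var):
    # z = mu + sigma * eps: a differentiable sample from N(mu, sigma).
    eps = torch.randn_like(mu)
    return mu + torch.exp(0.5 * log_var) * eps

def vae_loss(x, y, mu, log_var, beta):
    # Reconstruction term: mean squared error between input and output.
    mse = F.mse_loss(y, x, reduction="mean")
    # Closed-form KL divergence between N(mu, sigma^2) and N(0, 1),
    # summed over latent dimensions and averaged over the batch.
    kl = (-0.5 * torch.sum(1 + log_var - mu.pow(2) - log_var.exp(), dim=1)).mean()
    return (1 - beta) * mse + beta * kl
```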
Information Buffer

The latent space representations of our datapoints are now ordered. Normally, one would now sample from $\mathcal{N}(0, 1)$ in latent space. But we can do better: create an information buffer,

$p(z) = \frac{1}{n} \sum_{i=1}^{n} \mathcal{N}(\mu_i, \sigma_i)$

a representation of the distribution of the training data in latent space.
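A sketch of what sampling from the buffer could look like, assuming `mus` and `sigmas` hold the encoder outputs for the full training set (the variable names and shapes are hypothetical):

```python
# Sample latent points from the information buffer: a uniform mixture of
# the per-datapoint encoder Gaussians N(mu_i, sigma_i) instead of N(0, 1).
import numpy as np

def sample_buffer(mus, sigmas, n_samples, rng=None):
    """mus, sigmas: arrays of shape (n_train, latent_dim)."""
    if rng is None:
        rng = np.random.default_rng()
    # Pick a mixture component (i.e. a training point) uniformly per sample...
    idx = rng.integers(0, len(mus), size=n_samples)
    # ...then draw from that point's latent Gaussian.
    return rng.normal(loc=mus[idx], scale=sigmas[idx])

# Decoding these latent samples would then yield new events, e.g.
# events = decoder(sample_buffer(mus, sigmas, 10_000))
```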
Results
Toy Model

$1 \to 2$ decay with uniform angles and no exact momentum conservation. Trained on four-vectors.
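A sketch of how such toy training data could be produced, assuming massless daughters and a unit-mass parent at rest; these choices and the smearing width are illustrative, as the slide does not specify them:

```python
# Toy 1 -> 2 decay: two back-to-back massless four-vectors with
# uniformly distributed decay angles, in the parent rest frame.
import numpy as np

def toy_decay_events(n, m_parent=1.0, smear=0.01, rng=None):
    if rng is None:
        rng = np.random.default_rng()
    # Uniform angles on the sphere: cos(theta) ~ U(-1, 1), phi ~ U(0, 2*pi).
    cos_t = rng.uniform(-1.0, 1.0, n)
    phi = rng.uniform(0.0, 2.0 * np.pi, n)
    sin_t = np.sqrt(1.0 - cos_t**2)
    p = m_parent / 2.0  # momentum of each massless daughter
    px, py, pz = p * sin_t * np.cos(phi), p * sin_t * np.sin(phi), p * cos_t
    e = np.full(n, p)
    d1 = np.stack([e, px, py, pz], axis=1)   # (E, px, py, pz)
    d2 = np.stack([e, -px, -py, -pz], axis=1)
    out = np.concatenate([d1, d2], axis=1)   # shape (n, 8)
    # The slide notes "no exact momentum conservation"; a small Gaussian
    # smearing of the four-vectors is one assumed way to realize that.
    return out + rng.normal(0.0, smear, out.shape)
```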
Top pair production

• MG5_aMC@NLO 6.3.2 + Pythia 8.2 + Delphes 3
• One top required to decay leptonically
• Number of training points: $5 \times 10^5$
• Jets with $p_T > 20$ GeV

Event generation with the B-VAE is $\mathcal{O}(10^8)$ times faster!
Latent space distributions

• Distributions are still Gaussian-like
• Some have sharp cutoffs: unphysical events lie outside
• The information buffer is very important!
Latent Space Principal Component Analysis

[Figure: principal component analysis of the latent space]
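The PCA plots themselves are not reproduced here, but the analysis is easy to sketch: run a principal component analysis on the encoded training events to see which latent directions carry the variance. This assumes `mus` holds the encoder means with shape (n_train, latent_dim):

```python
# PCA of the latent representation of the training data (sketch).
import numpy as np

def latent_pca(mus):
    centered = mus - mus.mean(axis=0)
    # Eigendecomposition of the latent covariance matrix.
    eigvals, eigvecs = np.linalg.eigh(np.cov(centered, rowvar=False))
    order = np.argsort(eigvals)[::-1]  # sort by explained variance
    return eigvals[order], eigvecs[:, order]

# variances, directions = latent_pca(mus)
# projected = (mus - mus.mean(axis=0)) @ directions[:, :2]  # top-2 components
```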
Possible Applications

Most direct application: importance sampling for matrix element (ME) generation,

$\sigma \propto \int d\Phi\, |M(\Phi)|^2 = \int d\Phi\, p(\Phi)\, \frac{|M(\Phi)|^2}{p(\Phi)}$

Current methods: VEGAS
Recent ML techniques: latent variable models (1810.11509)

Unweighting efficiency for $e^+ e^- \to q \bar{q} g$:
• VEGAS: ~4%
• LVM: ~65%
• B-VAE: ???
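As a toy numerical illustration of why the choice of $p(\Phi)$ matters, here is a 1D stand-in for $|M(\Phi)|^2$ sampled with a flat proposal versus one adapted to the peak; the integrand, proposal, and numbers are assumptions for illustration only:

```python
# Unweighting efficiency <w> / max(w) for weights w = f(x) / p(x) (sketch).
import numpy as np

rng = np.random.default_rng(0)

def integrand(x):
    # Sharply peaked stand-in for |M(Phi)|^2.
    return np.exp(-5000.0 * (x - 0.5) ** 2)

def efficiency(samples, density):
    w = integrand(samples) / density(samples)
    return w.mean() / w.max()

# Flat proposal: most samples land away from the peak, so the
# efficiency is low (a few percent here).
x_flat = rng.uniform(0.0, 1.0, 100_000)
print(efficiency(x_flat, lambda x: np.ones_like(x)))

# Proposal matching the peak shape: efficiency is close to 1 here.
# (Clipping to [0, 1] is a negligible truncation for this width.)
x_adapt = np.clip(rng.normal(0.5, 0.01, 100_000), 0.0, 1.0)
gauss = lambda x: np.exp(-(x - 0.5) ** 2 / 0.0002) / (0.01 * np.sqrt(2 * np.pi))
print(efficiency(x_adapt, gauss))
```

A generative model trained on the target density plays the role of the adapted proposal, which is exactly where the B-VAE could enter.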
Applications?

• Data-driven event generators
• Targeted event generation
• Applications outside High Energy Physics?
• ???