However, even in a modest number of dimensions this starts to break.
Even modern ML (normalising flows) won't save you [2001.05478]
Algorithm | Efficiency | Efficiency |
---|---|---|
HAAG | 3.0% | 2.7% |
Vegas | 27.7% | 31.8% |
Neural Network | 64.3% | 33.6% |
Central problem:
MCMC kicks in as we go to high dimensions; there is a grey area between Importance Sampling and MCMC. Can ML help? (toy sketch below)
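A toy sketch (my own illustration, not taken from the cited papers) of why plain Importance Sampling struggles: even a mildly mismatched proposal makes the unweighting efficiency collapse as the dimension grows.

```python
# Toy illustration: importance sampling a unit Gaussian target with a Gaussian
# proposal whose width is mismatched by 20% in every dimension. The unweighting
# efficiency <w>/w_max collapses as the dimension grows.
import numpy as np

rng = np.random.default_rng(0)

def unweighting_efficiency(dim, sigma_proposal=1.2, n_samples=50_000):
    x = rng.normal(scale=sigma_proposal, size=(n_samples, dim))
    # log(target) - log(proposal); overall constants cancel in the ratio <w>/w_max
    log_w = (
        -0.5 * np.sum(x**2, axis=1)
        + 0.5 * np.sum((x / sigma_proposal) ** 2, axis=1)
        + dim * np.log(sigma_proposal)
    )
    w = np.exp(log_w - log_w.max())  # normalise so that w_max = 1
    return w.mean()

for dim in (2, 8, 32, 128):
    print(f"dim={dim:4d}  efficiency ~ {unweighting_efficiency(dim):.3g}")
```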
In neglecting the Evidence ($\mathcal{Z}$) we have neglected precisely the quantity we want.
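For reference, in standard notation (the symbols are my assumption): the Evidence is the normalising constant of Bayes' theorem,

$$
\mathcal{Z} = \int \mathcal{L}(\theta)\,\pi(\theta)\,\mathrm{d}\theta,
\qquad
\mathcal{P}(\theta) = \frac{\mathcal{L}(\theta)\,\pi(\theta)}{\mathcal{Z}}.
$$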
Nested Sampling [Skilling 2006], implemented in PolyChord [1506.00171], is a good way to generically approach this problem in high dimensions
Primarily an integration algorithm (largely unique among MCMC-style approaches)
Designed for multimodal problems from inception
Requires a construction that can sample under a hard likelihood constraint (sketched below)
Largely self-tuning
[yallup.github.io/bayeshep_durham] for animated versions
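A minimal sketch of the nested sampling loop (a deliberately naive toy under simplifying assumptions, not PolyChord's implementation; the constrained draw here is plain rejection sampling from the prior, which PolyChord replaces with slice sampling):

```python
# Toy nested sampling: Gaussian likelihood of width 0.1 centred at 0.5 on a
# 2D unit-cube uniform prior. Analytic evidence: log Z ~ log(2*pi*0.1^2) ~ -2.8.
import numpy as np

rng = np.random.default_rng(1)
d, n_live, n_iter = 2, 100, 800

def log_like(theta):
    return -0.5 * np.sum((theta - 0.5) ** 2, axis=-1) / 0.1**2

live = rng.uniform(size=(n_live, d))   # live points drawn from the prior
live_logl = log_like(live)
log_z, log_x = -np.inf, 0.0            # running evidence, log enclosed prior volume

for _ in range(n_iter):
    worst = np.argmin(live_logl)       # lowest-likelihood live point
    l_star = live_logl[worst]          # hard likelihood constraint L > L*
    # prior volume shrinks by ~exp(-1/n_live) per iteration;
    # the evidence increment is L* times the shell of prior volume removed
    dlog_z = l_star + log_x + np.log1p(-np.exp(-1.0 / n_live))
    log_z = np.logaddexp(log_z, dlog_z)
    log_x -= 1.0 / n_live
    # replace the worst point with a prior draw satisfying the constraint
    # (naive rejection sampling -- only viable for this low-dimensional toy)
    while True:
        candidate = rng.uniform(size=d)
        if log_like(candidate) > l_star:
            break
    live[worst], live_logl[worst] = candidate, log_like(candidate)

# add the contribution of the remaining live points
log_z = np.logaddexp(
    log_z,
    live_logl.max() + np.log(np.mean(np.exp(live_logl - live_logl.max()))) + log_x,
)
print("estimated log Z ~", log_z, " (analytic ~ -2.8)")
```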
Unweighted Events
Algorithm | Efficiency | Efficiency | Efficiency |
---|---|---|---|
HAAG | 3.0% | 2.7% | 2.8% |
Vegas (cold start) | 2.0% | 0.05% | 0.01% |
NS | 1.0% | 1.0% | 1.0% |
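For orientation (my assumed convention, which is the standard one): the efficiencies above are unweighting efficiencies, i.e. the acceptance rate when weighted events are accept/rejected against the maximum weight.

```python
# Sketch of unweighting: accept each weighted event with probability w / w_max.
# The average acceptance rate is the unweighting efficiency <w> / w_max.
import numpy as np

rng = np.random.default_rng(2)

def unweight(events, weights):
    """Return unit-weight events via accept/reject against the maximum weight."""
    keep = rng.uniform(size=len(weights)) < weights / weights.max()
    return events[keep]

# toy weighted sample: a broad weight distribution gives a low efficiency
events = rng.uniform(size=(10_000, 4))                      # stand-in for phase-space points
weights = rng.lognormal(mean=0.0, sigma=1.5, size=10_000)   # stand-in for event weights
unweighted = unweight(events, weights)
print("acceptance ~", len(unweighted) / len(events),
      "  <w>/w_max =", weights.mean() / weights.max())
```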
An end-to-end, stylised version of the problem has been demonstrated.
This is deeper than coming up with a new way of mapping phase space (see the dedicated section in the paper)
Physics challenges
Variants of NS algorithm
Prior information
Fitting this together with modern ML
The fundamental motivation for this work came from recognising not just an ML challenge but a physics challenge [2004.13687]
LO dijet isn't hard; NNNLO is. If your method isn't robust in these limits, it doesn't solve the right problem. Unique features of NS open up interesting physics:
In my opinion (your mileage may vary)
The fundamental problem for LHC event generation is trying to do Importance Sampling in high dimensions.
Machine learning can and will be useful, but this is not just a machine learning mapping problem.
This is a Bayesian inference problem: precisely calculating Evidences and sampling Posteriors.
Nested Sampling is a high-dimensional integration method, primarily from Bayesian Inference, that is an excellent choice for particle physics integrals.
#footer: Exploring phase space with Nested Sampling
Event generation is a Markov Chain sampling process (despite what you may have been told); the need for stochastic refinement is inevitable.
Event Generation is fundamentally the inversion of a conditional probability; Bayesian inference (and hence Markov Chains) is (IMO) the proper language for this problem
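Schematically (my sketch of the dictionary, ignoring flux and symmetry factors): the total cross section plays the role of the Evidence, the squared matrix element that of the likelihood, and the phase-space measure that of the prior,

$$
\sigma \;=\; \int \mathrm{d}\Phi\, |\mathcal{M}(\Phi)|^{2}
\quad\longleftrightarrow\quad
\mathcal{Z} \;=\; \int \mathrm{d}\theta\, \pi(\theta)\,\mathcal{L}(\theta).
$$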