Nonasymptotic bounds for suboptimal importance sampling
by Carsten Hartmann, Lorenz Richter
Year:
2021
Publication:
eprint arXiv:2102.09606
Abstract:
Importance sampling is a popular variance reduction method for Monte Carlo estimation, where a notorious question is how to design good proposal distributions. While in most cases optimal (zero-variance) estimators are theoretically possible, in practice only suboptimal proposal distributions are available, and it can often be observed numerically that these reduce statistical performance significantly, leading to large relative errors and thereby counteracting the original intention.
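The effect described in the abstract can be illustrated with a small numerical sketch (not taken from the paper; the Gaussian target, the rare event {X > 3}, and the proposal means are illustrative assumptions). We estimate the tail probability P(X > 3) for X ~ N(0, 1) via importance sampling with a shifted Gaussian proposal N(mu, 1): a proposal centred near the rare event reduces the relative error compared to plain Monte Carlo, while a proposal shifted away from it inflates the relative error.

```python
import numpy as np

rng = np.random.default_rng(0)


def log_normal_pdf(x, mu):
    """Log-density of N(mu, 1)."""
    return -0.5 * (x - mu) ** 2 - 0.5 * np.log(2.0 * np.pi)


def is_estimate(mu_proposal, n=500_000):
    """Importance sampling estimate of P(X > 3) for X ~ N(0, 1),
    using a N(mu_proposal, 1) proposal.
    Returns (estimate, estimated relative error)."""
    x = rng.normal(mu_proposal, 1.0, size=n)
    # Importance weights w = p(x) / q(x) for target p = N(0,1), proposal q.
    w = np.exp(log_normal_pdf(x, 0.0) - log_normal_pdf(x, mu_proposal))
    vals = (x > 3.0) * w
    est = vals.mean()
    # Sample-based relative error of the estimator (std of the mean / mean).
    rel_err = vals.std() / (np.sqrt(n) * est)
    return est, rel_err


# mu = 0 is plain Monte Carlo (all weights equal 1); mu = 3 centres the
# proposal on the rare event; mu = -1 shifts it away from the event.
est_mc, relerr_mc = is_estimate(0.0)
est_good, relerr_good = is_estimate(3.0)
est_bad, relerr_bad = is_estimate(-1.0)
for label, est, relerr in [("mu =  0 (plain MC)    ", est_mc, relerr_mc),
                           ("mu =  3 (good proposal)", est_good, relerr_good),
                           ("mu = -1 (poor proposal)", est_bad, relerr_bad)]:
    print(f"{label}: estimate {est:.3e}, relative error {relerr:.3e}")
```

The true value is P(X > 3) ≈ 1.35e-3; the well-chosen proposal attains it with a relative error orders of magnitude below plain Monte Carlo, whereas the poorly chosen proposal produces a few large, erratic weights and a correspondingly large relative error — the phenomenon the paper quantifies nonasymptotically.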
Link:
Read the paper
Additional Information
A brief introduction to the dida co-author(s) and the relevance for dida's ML developments.
About the Co-Author
Originally focused on stochastics and numerics (FU Berlin), the mathematician has been working on deep learning algorithms for some time. Alongside his interest in the theory, he has solved numerous practical data science problems over the last 10 years. Lorenz leads the machine learning team at dida.