Nonasymptotic bounds for suboptimal importance sampling
by Carsten Hartmann, Lorenz Richter
Year:
2021
Publication:
eprint arXiv:2102.09606
Abstract:
Importance sampling is a popular variance reduction method for Monte Carlo estimation, where a notorious question is how to design good proposal distributions. While in most cases optimal (zero-variance) estimators are theoretically possible, in practice only suboptimal proposal distributions are available and it can often be observed numerically that those can reduce statistical performance significantly, leading to large relative errors and therefore counteracting the original intention.
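To illustrate the effect described in the abstract, the following minimal Python sketch (not taken from the paper) estimates E_p[exp(X)] for X ~ N(0, 1) by importance sampling with a shifted Gaussian proposal N(mu, 1). The shift mu is a hypothetical tuning parameter: mu = 1 gives the optimal (zero-variance) tilt for this toy problem, whereas poorly chosen values show how a suboptimal proposal inflates the relative error of the estimator.

import numpy as np

rng = np.random.default_rng(0)


def importance_sampling(mu, n_samples=10_000):
    """Estimate E_p[exp(X)] for X ~ N(0, 1) using the proposal N(mu, 1)."""
    x = rng.normal(loc=mu, scale=1.0, size=n_samples)
    # Likelihood ratio p(x) / q(x) for the two unit-variance Gaussians.
    weights = np.exp(-0.5 * x**2 + 0.5 * (x - mu) ** 2)
    values = np.exp(x) * weights
    estimate = values.mean()
    # Relative error of the estimator: std of the mean divided by the estimate.
    rel_error = values.std(ddof=1) / (np.sqrt(n_samples) * estimate)
    return estimate, rel_error


# True value: E[exp(X)] = exp(1/2) ≈ 1.6487.
# mu = 0 is plain Monte Carlo, mu = 1 is the zero-variance proposal,
# mu = 5 is a badly suboptimal proposal with a large relative error.
for mu in [0.0, 1.0, 5.0]:
    est, err = importance_sampling(mu)
    print(f"mu = {mu:>4}: estimate = {est:.4f}, relative error ≈ {err:.4f}")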
Link:
Read the paper

Additional Information
A brief introduction to the dida co-author(s) and the relevance of this work for dida's ML developments.
Dr. Lorenz Richter
Coming from stochastics and numerics (FU Berlin), the mathematician has been working on deep learning algorithms for several years. Besides his fondness for theory, he has solved a variety of practical data science problems over the last 10 years. Lorenz leads the machine learning team.