This repository contains the code for the experiments in the AISTATS 2024 paper *Minimizing Convex Functionals over Space of Probability Measures via KL Divergence Gradient Flow* by Rentian Yao, Linjun Huang, and Yun Yang. We introduce an implicit scheme, called the implicit KL proximal descent (IKLPD) algorithm, for discretizing a continuous-time gradient flow relative to the Kullback–Leibler (KL) divergence when minimizing a convex target functional. The experiments validate the effectiveness of our approach in different scenarios.
- Directory `Gaussian_location` contains the code used in the experiments for the Gaussian location mixture model.
- Directory `Gaussian_location_scale` contains the code used in the experiments for the Gaussian location-scale mixture model.
- Directory `Bayesian_sampling` contains the code used in the experiments for Bayesian sampling.
- Directory `Comparisons_with_WGF` contains the code used in the experiments for the comparison with Wasserstein gradient flows.
- Directory `impact_of_tau_figure_5` contains the code used in the experiments for Figure 5 of the Impact of IKLPD Step Size in Appendix C.4.
- Directory `impact_of_tau_figure_6` contains the code used in the experiments for Figure 6 of the Impact of IKLPD Step Size in Appendix C.4.
- Directory `impact_of_tau_figure_7` contains the code used in the experiments for Figure 7 of the Impact of IKLPD Step Size in Appendix C.4.
- Directory `NF_short_flows` contains the code used in the experiments for the Composition of Short Flows and Teacher-Student Architecture in Appendix C.5.
- We use the `normflows` package to implement the IKLPD algorithm.
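To illustrate the kind of update IKLPD performs, here is a minimal numerical sketch (our own toy illustration, not the normalizing-flow implementation in the directories above): on a discrete grid, for the special choice of target functional F(mu) = KL(mu || pi), the implicit step mu_{k+1} = argmin_mu F(mu) + (1/tau) KL(mu || mu_k) has a closed form, mu_{k+1} proportional to mu_k^{1/(1+tau)} * pi^{tau/(1+tau)}.

```python
# Toy sketch of the implicit KL proximal descent (IKLPD) update on a
# discrete probability simplex. This is an illustrative assumption-laden
# example, not the paper's code: the closed form below holds only for the
# special target F(mu) = KL(mu || pi).
import numpy as np

def iklpd_step(mu, pi, tau):
    """One exact IKLPD step for F(mu) = KL(mu || pi) on a discrete grid."""
    a = 1.0 / (1.0 + tau)            # contraction factor in log space
    new = mu**a * pi**(1.0 - a)      # geometric mixture of mu_k and pi
    return new / new.sum()           # renormalize onto the simplex

rng = np.random.default_rng(0)
pi = rng.random(5)
pi /= pi.sum()                       # target distribution
mu = np.full(5, 0.2)                 # uniform initialization
for _ in range(50):
    mu = iklpd_step(mu, pi, tau=1.0)
```

With tau = 1.0 the deviation from pi contracts by a factor 1/(1 + tau) per step in log space, so after 50 steps `mu` agrees with `pi` to near machine precision, illustrating the geometric convergence of the implicit scheme.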