------

Asynchronously Parallel Optimization Solver for finding Multiple Minima
(APOSMM) coordinates concurrent local optimization runs to identify
many local minima faster on parallel hardware.

Supported local optimization routines include:

- DFO-LS_ Derivative-free solver for (bound constrained) nonlinear least-squares minimization
- NLopt_ Library for nonlinear optimization, providing a common interface for various methods
- `scipy.optimize`_ Open-source solvers for nonlinear problems, linear programming,
  constrained and nonlinear least-squares, root finding, and curve fitting
- `PETSc/TAO`_ Routines for the scalable (parallel) solution of scientific applications

Required: mpmath_, SciPy_

Optional (see below): petsc4py_, nlopt_, DFO-LS_

Configuring APOSMM
^^^^^^^^^^^^^^^^^^

APOSMM works with a choice of optimizers, some requiring external packages. Specify
them at the global level of the calling script before importing APOSMM::

    import libensemble.gen_funcs
    libensemble.gen_funcs.rc.aposmm_optimizers = <optimizers>

where ``optimizers`` is a string (or list of strings) from:

``"petsc"``, ``"nlopt"``, ``"dfols"``, ``"scipy"``, ``"external"``
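
For example, a minimal sketch that makes both the NLopt and SciPy local optimizers
available (assuming the ``nlopt`` and ``scipy`` packages are installed)::

    import libensemble.gen_funcs

    # Make the NLopt and SciPy local-optimization backends available to APOSMM
    libensemble.gen_funcs.rc.aposmm_optimizers = ["nlopt", "scipy"]

    from libensemble.gen_funcs.persistent_aposmm import aposmm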

.. dropdown:: Issues with ensemble hanging or failed simulations with PETSc?

    If using the MPIExecutor or other MPI routines inside a user function, and your
    MPI backend is Open MPI, then you must:

    - Use ``local`` comms for libEnsemble (no ``mpirun``, ``mpiexec``, ``aprun``, etc.)
    - Omit the *aposmm_optimizers* line shown above

    This is because PETSc imports MPI, and a global import of PETSc results in nested
    MPI (which is not supported by Open MPI). When the *aposmm_optimizers* line is
    omitted, APOSMM instead imports PETSc locally inside the optimization function,
    as sketched below.
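
    A minimal sketch of the relevant calling-script lines for this case; the
    remaining specs are omitted and ``nworkers`` is an arbitrary placeholder::

        # No libensemble.gen_funcs.rc.aposmm_optimizers line here: APOSMM will
        # import PETSc locally inside the optimization function instead.
        from libensemble.gen_funcs.persistent_aposmm import aposmm

        # Local (multiprocessing) comms avoid nesting PETSc's MPI inside an
        # mpi4py-launched ensemble.
        libE_specs = {"comms": "local", "nworkers": 4}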

For the supported optimization algorithms and their options, see `LocalOptInterfacer`_.

Persistent APOSMM
^^^^^^^^^^^^^^^^^

.. automodule:: persistent_aposmm
    :members: aposmm
    :undoc-members:
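
Below is a sketch of typical ``gen_specs`` for running persistent APOSMM with NLopt's
BOBYQA as the local optimization method. The dimension, bounds, sample size, and
tolerance are placeholders, and the exact fields required depend on the chosen
options, so treat this as illustrative rather than definitive::

    import numpy as np

    import libensemble.gen_funcs
    libensemble.gen_funcs.rc.aposmm_optimizers = "nlopt"
    from libensemble.gen_funcs.persistent_aposmm import aposmm

    n = 2  # problem dimension (placeholder)

    # History fields the generator produces for each point
    gen_out = [("x", float, n), ("x_on_cube", float, n), ("sim_id", int),
               ("local_min", bool), ("local_pt", bool)]

    gen_specs = {
        "gen_f": aposmm,
        "persis_in": ["f"] + [field[0] for field in gen_out],
        "out": gen_out,
        "user": {
            "initial_sample_size": 100,      # random points evaluated before local runs start
            "localopt_method": "LN_BOBYQA",  # NLopt's derivative-free BOBYQA
            "xtol_abs": 1e-6,                # local-run stopping tolerance
            "lb": np.array([-3.0, -2.0]),    # lower bounds (placeholder)
            "ub": np.array([3.0, 2.0]),      # upper bounds (placeholder)
        },
    }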

LocalOptInterfacer
^^^^^^^^^^^^^^^^^^

.. automodule:: aposmm_localopt_support
    :members: LocalOptInterfacer
    :undoc-members:

.. _DFO-LS: https://github.com/numericalalgorithmsgroup/dfols
.. _mpmath: https://pypi.org/project/mpmath
.. _nlopt: https://nlopt.readthedocs.io/en/latest/
.. _petsc4py: https://bitbucket.org/petsc/petsc4py
.. _SciPy: https://pypi.org/project/scipy
.. _PETSc/TAO: http://www.mcs.anl.gov/petsc
.. _scipy.optimize: https://docs.scipy.org/doc/scipy/reference/optimize.html