CHANGELOG.rst: 9 additions & 9 deletions
@@ -120,7 +120,7 @@ Output changes:
 
 Bug fixes:
 
-* Resolved PETSc/OpenMPI issue (when using the Executor). #1064
+* Resolved PETSc/Open-MPI issue (when using the Executor). #1064
 * Prevent `mpi4py` validation running during local comms (when using OO interface). #1065
 
 Performance changes:
@@ -380,7 +380,7 @@ Documentation:
 
 :Known issues:
 
-* OpenMPI does not work with direct MPI job launches in ``mpi4py`` comms mode,
+* Open-MPI does not work with direct MPI job launches in ``mpi4py`` comms mode,
   since it does not support nested MPI launches.
   (Either use local mode or the Balsam Executor.)
 * See known issues section in the documentation for more issues.
@@ -444,7 +444,7 @@ Other functionality changes:
 
 :Known issues:
 
-* OpenMPI does not work with direct MPI job launches in ``mpi4py`` comms mode,
+* Open-MPI does not work with direct MPI job launches in ``mpi4py`` comms mode,
   since it does not support nested MPI launches.
   (Either use local mode or the Balsam Executor.)
 * See known issues section in the documentation for more issues.
@@ -492,7 +492,7 @@ Documentation:
 
 :Known issues:
 
-* OpenMPI does not work with direct MPI job launches in ``mpi4py`` comms mode, since it does not support nested MPI launches
+* Open-MPI does not work with direct MPI job launches in ``mpi4py`` comms mode, since it does not support nested MPI launches
   (Either use local mode or Balsam Executor).
 * See known issues section in the documentation for more issues.
 
@@ -540,7 +540,7 @@ Documentation:
 :Known issues:
 
 * We currently recommend running in Central mode on Bridges, as distributed runs are experiencing hangs.
-* OpenMPI does not work with direct MPI job launches in mpi4py comms mode, since it does not support nested MPI launches
+* Open-MPI does not work with direct MPI job launches in mpi4py comms mode, since it does not support nested MPI launches
   (Either use local mode or Balsam Executor).
 * See known issues section in the documentation for more issues.
 
@@ -696,7 +696,7 @@ Release 0.5.0
 
 :Known issues:
 
-* OpenMPI does not work with direct MPI job launches in mpi4py comms mode, since it does not support nested MPI launches
+* Open-MPI does not work with direct MPI job launches in mpi4py comms mode, since it does not support nested MPI launches
   (Either use local mode or Balsam job controller).
 * Local comms mode (multiprocessing) may fail if MPI is initialized before forking processors. This is thought to be responsible for issues combining with PETSc.
 * Remote detection of logical cores via LSB_HOSTS (e.g., Summit) returns number of physical cores since SMT info not available.
@@ -728,7 +728,7 @@ Release 0.4.0
 
 :Known issues:
 
-* OpenMPI is not supported with direct MPI launches since nested MPI launches are not supported.
+* Open-MPI is not supported with direct MPI launches since nested MPI launches are not supported.
 
 Release 0.3.0
 -------------
@@ -749,7 +749,7 @@ Release 0.3.0
 
 :Known issues:
 
-* OpenMPI is not supported with direct MPI launches since nested MPI launches are not supported.
+* Open-MPI is not supported with direct MPI launches since nested MPI launches are not supported.
 
 Release 0.2.0
 -------------
@@ -765,7 +765,7 @@ Release 0.2.0
 :Known issues:
 
 * Killing MPI jobs does not work correctly on some systems (including Cray XC40 and CS400). In these cases, libEnsemble continues, but processes remain running.
-* OpenMPI does not work correctly with direct launches (and has not been tested with Balsam).
+* Open-MPI does not work correctly with direct launches (and has not been tested with Balsam).
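For context on the known issue repeated throughout these entries: with Open-MPI, a direct MPI launch of libEnsemble (``mpi4py`` comms) fails once the Executor starts its own MPI tasks, because Open-MPI does not support nested MPI launches; local (multiprocessing) comms avoid the nesting entirely. Below is a minimal sketch of the two launch styles, assuming a calling script named ``run_libe.py`` (a hypothetical name); ``parse_args`` and the ``--comms``/``--nworkers`` flags are the standard libEnsemble pattern:

# run_libe.py -- minimal libEnsemble calling-script sketch (illustrative only).
#
# Local comms (the workaround noted above -- no nested MPI launch):
#     python run_libe.py --comms local --nworkers 4
#
# Direct MPI launch (mpi4py comms) -- the mode that fails under Open-MPI,
# since the Executor's own mpirun would nest inside this one:
#     mpirun -np 5 python run_libe.py

from libensemble.tools import parse_args

# parse_args reads --comms/--nworkers from the command line and fills in
# libE_specs accordingly (comms type, number of workers).
nworkers, is_manager, libE_specs, _ = parse_args()

if is_manager:
    print(f"libEnsemble manager starting with {nworkers} workers")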