Add multivariate optimization #331
Conversation
- BFGS: Quasi-Newton method with automatic differentiation
- Nelder-Mead: Derivative-free simplex method
- Update optimize.livemd with multivariate section
- Add efficient_frontier.livemd portfolio optimization example
Force-pushed from f9d143a to f2404ad
And comparing them with Python.
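The Python cross-check presumably looks something like the following SciPy sketch (the exact contents of the `compare_*.py` scripts are not shown here, so the test function, start point, and tolerances below are assumptions):

```python
import numpy as np
from scipy.optimize import minimize

# Rosenbrock: a standard multivariate test function with minimum at (1, 1)
def rosenbrock(x):
    return (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2

x0 = np.array([-1.2, 1.0])

# BFGS (SciPy approximates the gradient numerically when jac is omitted)
res_bfgs = minimize(rosenbrock, x0, method="BFGS")

# Nelder-Mead: derivative-free simplex method
res_nm = minimize(rosenbrock, x0, method="Nelder-Mead",
                  options={"xatol": 1e-8, "fatol": 1e-8})

print(res_bfgs.x, res_nm.x)  # both land near [1.0, 1.0]
```

Running the same functions through both SciPy and the Scholar implementations gives a quick sanity check that the new Elixir code finds the same minima.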
@nyo16 I dropped two tiny comments. Please test that the notebooks work as expected and we should be good to go!
- BFGS line search: replace manual 10x unroll with `while ... unroll: true`
- Nelder-Mead: bundle candidate points/values into a map instead of passing 11 positional arguments to `nelder_mead_update`
- Migrate both files from `defn minimize` + `deftransformp transform_opts` to the `deftransform minimize` options pattern used by golden_section and brent
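For context on the second bullet: a Nelder-Mead iteration carries n+1 candidate points plus their function values, which is why threading them positionally gets unwieldy. A minimal generic Nelder-Mead in NumPy (an illustrative sketch of the algorithm's shape, not Scholar's implementation; coefficients are the textbook 1/2/0.5 reflection, expansion, and contraction values):

```python
import numpy as np

def nelder_mead(f, x0, tol=1e-10, max_iter=500):
    """Minimal Nelder-Mead simplex method (illustrative sketch only)."""
    n = x0.size
    # Initial simplex: x0 plus one slightly perturbed point per dimension
    simplex = [x0.astype(float)]
    for i in range(n):
        p = x0.astype(float)
        p[i] += 0.1 if p[i] == 0 else 0.05 * p[i]
        simplex.append(p)
    vals = [f(p) for p in simplex]
    for _ in range(max_iter):
        order = np.argsort(vals)                   # sort best -> worst
        simplex = [simplex[i] for i in order]
        vals = [vals[i] for i in order]
        if abs(vals[-1] - vals[0]) < tol:
            break
        centroid = np.mean(simplex[:-1], axis=0)   # centroid of all but worst
        xr = centroid + (centroid - simplex[-1])   # reflection
        fr = f(xr)
        if vals[0] <= fr < vals[-2]:
            simplex[-1], vals[-1] = xr, fr
        elif fr < vals[0]:
            xe = centroid + 2.0 * (centroid - simplex[-1])  # expansion
            fe = f(xe)
            if fe < fr:
                simplex[-1], vals[-1] = xe, fe
            else:
                simplex[-1], vals[-1] = xr, fr
        else:
            xc = centroid + 0.5 * (simplex[-1] - centroid)  # contraction
            fc = f(xc)
            if fc < vals[-1]:
                simplex[-1], vals[-1] = xc, fc
            else:                                  # shrink toward best point
                simplex = [simplex[0]] + [simplex[0] + 0.5 * (p - simplex[0])
                                          for p in simplex[1:]]
                vals = [vals[0]] + [f(p) for p in simplex[1:]]
    return simplex[0]
```

Bundling `simplex` and `vals` into one state container per iteration, as the refactor does with a map in Elixir, keeps the update step readable.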
optimize.livemd:
- Drop unreachable f32 tolerances in the Brent/GoldenSection internals blocks and the JIT demo (tol 1.0e-10/1.0e-8 on f32 inputs never met the gradient criterion, so blocks printed "Converged: false")

efficient_frontier.livemd:
- Express monthly returns as percentages so objective magnitudes match achievable gradient tolerances
- Replace the soft-penalty objective (which produced leveraged long/short weights with multi-thousand-% annualized returns) with a z^2/sum(z^2) simplex reparameterization that enforces w_i >= 0 and sum(w_i) = 1 by construction
- Rewrite the lambda sweep for the new scale; every point on the frontier now converges and returns are monotonic and realistic
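The reparameterization trick can be checked independently of Nx: squaring a free vector and normalizing by its sum always yields valid portfolio weights, so the optimizer can run fully unconstrained. A NumPy sketch (the helper name `to_weights` and all numbers below are made up for illustration, not the notebook's data):

```python
import numpy as np

def to_weights(z):
    """Map an unconstrained vector z onto the simplex: w_i >= 0, sum(w) = 1."""
    w = z**2
    return w / w.sum()

rng = np.random.default_rng(0)
z = rng.normal(size=5)                 # any real vector works
w = to_weights(z)
print(w.min() >= 0, np.isclose(w.sum(), 1.0))  # True True

# Mean-variance objective expressed over the free vector z:
# minimize -mu . w(z) + lam * w(z) . Sigma . w(z)
mu = np.array([0.8, 1.1, 0.9, 1.3, 1.0])   # fabricated monthly returns (%)
A = rng.normal(size=(5, 5))
Sigma = A @ A.T / 5                        # fabricated PSD covariance
lam = 1.0
objective = lambda z: -mu @ to_weights(z) + lam * to_weights(z) @ Sigma @ to_weights(z)
```

Because the constraints hold by construction for every `z`, no penalty term is needed, which is exactly what eliminates the leveraged long/short solutions the old soft-penalty objective produced.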
@josevalim I resolved the two comments :)
@nyo16 the issue description says: "[ ] Test livebooks manually in Livebook". Did you double-check the notebooks too? Thanks!
- Switch Mix.install to pull scholar from the feature branch on GitHub so the notebooks can be opened directly from Livebook without a local path checkout
- Add a prominent educational-use disclaimer to the efficient frontier notebook clarifying that the data is fabricated, the model ignores real-world frictions, and the notebook must not be used for actual investment decisions
@josevalim yes, I left `{:scholar, github: "nyo16/scholar", branch: "add-multivariate-optimization"}` for this branch to be testable.
I changed it in the notebooks to use the released scholar 0.4.0.
The notebooks need to use path dependencies, no? Otherwise they won’t work because they require unreleased code? |
Correct, sorry I will revert it :) |
BFGS and Nelder-Mead are unreleased, so a Hex version pin can't
resolve them. Fall back to {:scholar, path: "."} and leave the
intended ~> 0.4.1 pin commented out as a reminder to swap once the
release containing these modules ships.
@josevalim I reverted it.
@nyo16 thank you! Btw, if you could send a PR that replaces the use of Scidata with an in-memory data set or similar, it would be very welcome! ❤️
💚 💙 💜 💛 ❤️ |
@akoutmos you may find this interesting :)
This PR adds two multivariate optimization algorithms to Scholar, completing the optimization module:
Both implementations follow the patterns established in PRs #327 (Golden Section) and #328 (Brent):
- `defn` entry point with `deftransformp` for option validation
- Inputs as `Nx.tensor`
- `NimbleOptions` schema

Changes
- lib/scholar/optimize/bfgs.ex
- lib/scholar/optimize/nelder_mead.ex
- test/scholar/optimize/bfgs_test.exs
- test/scholar/optimize/nelder_mead_test.exs
- notebooks/optimize.livemd
- notebooks/efficient_frontier.livemd
- compare_*.py

Test plan
- `mix test`
- `mix test test/scholar/optimize/`
- `mix format`