
Commit c47e149

lxvm and stevengj authored

Set up Julia package and example SSP2 optimizations (#2)

* Set up Julia package and example SSP2 optimizations
* Delete examples/julia/Manifest.toml
* fix underflow in tanh projection
* use evalpoly
* fix projection typos/bugs
* refactor projection into a loop over target points
* refactor interpolation into another module
* update example to use a higher resolution target grid and reduce allocations
* correct the usage of cubic_adjoint deriv api
* correct the comparison to finite differences because arrays are reused
* avoid NaN in tanh projection when x == eta at beta = Inf
* change definition of ProjectionProblem and SSP2
* add grid as an optional argument to PaddingProblem
* add init! to interface
* fix typo
* add support for SSP1
* add linear interpolation and use bc in cubic interpolation
* rename R_smoothing_factor to smoothing_radius like python api
* add pythonic API
* add ChainRulesCore rrules for pythonic api
* run checks to make sure different apis match
* document apis of padding module
* document public api in kernels module
* document public api of convolution module
* Document public api of interpolation module
* document the public api of the projection module
* document public api of ssp package
* add package tests
* add README.md
* remove revise dep
* use rffts, precompute conv kernel, use perturb in arb direction, and make adjoint_solve use a structural tangent
* Apply suggestion from @stevengj

Co-authored-by: Steven G. Johnson <stevenj@alum.mit.edu>
1 parent eec3ded commit c47e149

23 files changed

Lines changed: 1890 additions & 0 deletions
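Several items in the commit message above concern numerical robustness of the tanh projection (underflow at large beta; NaN when x == eta at beta = Inf, where the formula hits Inf * 0). A minimal sketch of the idea, using a hypothetical `tanh_projection` helper rather than the package's actual code:

```julia
# Standard smoothed-Heaviside (tanh) projection from topology optimization:
#   proj(rho) = (tanh(beta*eta) + tanh(beta*(rho - eta))) /
#               (tanh(beta*eta) + tanh(beta*(1 - eta)))
# The generic formula yields NaN at beta = Inf when rho == eta (Inf * 0),
# so the beta = Inf limit is handled as an explicit branch.
function tanh_projection(rho, beta, eta)
    if isinf(beta)
        # beta -> Inf limit: sharp threshold; the formula's limit at
        # rho == eta is (1 + 0) / (1 + 1) = 1/2
        return rho < eta ? 0.0 : (rho > eta ? 1.0 : 0.5)
    end
    return (tanh(beta * eta) + tanh(beta * (rho - eta))) /
           (tanh(beta * eta) + tanh(beta * (1 - eta)))
end
```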

examples/julia/Project.toml

Lines changed: 6 additions & 0 deletions
```toml
[deps]
CairoMakie = "13f3f980-e62b-5c42-98c6-ff1f3baf88f0"
NLopt = "76087f3c-5699-56af-9a33-bf431cd00edd"
Random = "9a3f8284-a2c9-5f02-9a11-845980a1fd5c"
SSP = "e5b5d2ee-15bb-40cc-a0da-b305b842b7a8"
Zygote = "e88e6eb3-aa80-5325-afca-941959d7151f"
```

examples/julia/ssp2_example.jl

Lines changed: 201 additions & 0 deletions
```julia
using SSP: init, solve!, adjoint_solve!
using SSP: Kernel, Pad, Convolve, Project
using .Kernel: conickernel
using .Pad: FillPadding, BoundaryPadding, Inner, PaddingProblem, DefaultPaddingAlgorithm
using .Convolve: DiscreteConvolutionProblem, FFTConvolution
using .Project: ProjectionProblem, SSP1_linear, SSP1, SSP2

using Random
using CairoMakie
using CairoMakie: colormap
using NLopt


Nx = Ny = 100
grid = (
    range(-1, 1, length=Nx),
    range(-1, 1, length=Ny),
)
# Random.seed!(42)
# design_vars = rand(Nx, Ny)
# design_vars = [sinpi(x) * sinpi(y) for (x, y) in Iterators.product(grid...)]
design_vars = let a = 0.5, b = 0.499
    # Cassini oval
    [((x^2 + y^2)^2 - 2a^2 * (x^2 - y^2) + a^4 - b^4) + 0.5 for (x, y) in Iterators.product(grid...)]
end
radius = 0.1

kernel = conickernel(grid, radius)

padprob = PaddingProblem(;
    data = design_vars,
    boundary = BoundaryPadding(size(kernel) .- 1, size(kernel) .- 1),
    # boundary = FillPadding(1.0, size(kernel) .- 1, size(kernel) .- 1),
)
padalg = DefaultPaddingAlgorithm()
padsolver = init(padprob, padalg)
padsol = solve!(padsolver)

convprob = DiscreteConvolutionProblem(;
    data = padsol.value,
    kernel,
)

convalg = FFTConvolution()
convsolver = init(convprob, convalg)
convsol = solve!(convsolver)

depadprob = PaddingProblem(;
    data = convsol.value,
    boundary = Inner(size(kernel) .- 1, size(kernel) .- 1),
)
depadalg = DefaultPaddingAlgorithm()
depadsolver = init(depadprob, depadalg)
depadsol = solve!(depadsolver)

filtered_design_vars = depadsol.value

# projection points need not be the same as design variable grid
target_grid = (
    range(-1, 1, length=Nx * 1),
    range(-1, 1, length=Ny * 1),
)
target_points = vec(collect(Iterators.product(target_grid...)))
projprob = ProjectionProblem(;
    rho_filtered=filtered_design_vars,
    grid,
    target_points,
    beta = Inf,
    eta = 0.5,
)
# projalg = SSP1_linear()
# projalg = SSP1()
projalg = SSP2()
projsolver = init(projprob, projalg)
projsol = solve!(projsolver)

projected_design_vars = projsol.value

let
    fig = Figure()
    ax1 = Axis(fig[1,1]; title = "design variables", aspect=DataAspect())
    h1 = heatmap!(grid..., design_vars; colormap=colormap("grays"))
    Colorbar(fig[1,2], h1)

    ax2 = Axis(fig[1,3]; title = "SSP2 output", aspect=DataAspect())
    h2 = heatmap!(target_grid..., reshape(projected_design_vars, length.(target_grid)); colormap=colormap("grays"))
    Colorbar(fig[1,4], h2)
    save("design.png", fig)
end

function fom(data, grid)
    return sum(abs2, data) / length(data)
end
obj = fom(projected_design_vars, grid)

function adjoint_fom(adj_fom, data, grid)
    adjoint_fom!(similar(data), adj_fom, data, grid)
end
function adjoint_fom!(adj_data, adj_fom, data, grid)
    adj_data .= (adj_fom / length(data)) .* 2 .* data
    return adj_data
end

adj_rho_projected = adjoint_fom(1.0, projected_design_vars, grid)

adj_projsol = (; value=adj_rho_projected)
adj_projprob = adjoint_solve!(projsolver, adj_projsol, projsol.tape)
adj_depadsol = (; value=adj_projprob.rho_filtered)
adj_depadprob = adjoint_solve!(depadsolver, adj_depadsol, depadsol.tape)
adj_convsol = (; value=adj_depadprob.data)
adj_convprob = adjoint_solve!(convsolver, adj_convsol, convsol.tape)
adj_padsol = (; value=adj_convprob.data)
adj_padprob = adjoint_solve!(padsolver, adj_padsol, padsol.tape)
adj_design_vars = adj_padprob.data

let
    fig = Figure()
    ax1 = Axis(fig[1,1]; title = "SSP2 output", aspect=DataAspect())
    h1 = heatmap!(ax1, target_grid..., reshape(projected_design_vars, length.(target_grid)); colormap=colormap("grays"))
    Colorbar(fig[1,2], h1)

    ax2 = Axis(fig[1,3]; title = "design variables gradient", aspect=DataAspect())
    h2 = heatmap!(ax2, grid..., adj_design_vars; colormap=colormap("RdBu"))
    Colorbar(fig[1,4], h2)
    save("design_gradient.png", fig)
end

fom_withgradient = let grid=grid, padsolver=padsolver, convsolver=convsolver, depadsolver=depadsolver, projsolver=projsolver, adj_rho_projected=adj_rho_projected
    function (design_vars)

        padsolver.data = design_vars
        padsol = solve!(padsolver)
        convsolver.data = padsol.value
        convsol = solve!(convsolver)
        depadsolver.data = convsol.value
        depadsol = solve!(depadsolver)
        projsolver.rho_filtered = depadsol.value
        projsol = solve!(projsolver)

        _fom = fom(projsol.value, grid)
        adjoint_fom!(adj_rho_projected, 1.0, projsol.value, grid)

        adj_projsol = (; value=adj_rho_projected)
        adj_projprob = adjoint_solve!(projsolver, adj_projsol, projsol.tape)
        adj_depadsol = (; value=adj_projprob.rho_filtered)
        adj_depadprob = adjoint_solve!(depadsolver, adj_depadsol, depadsol.tape)
        adj_convsol = (; value=adj_depadprob.data)
        adj_convprob = adjoint_solve!(convsolver, adj_convsol, convsol.tape)
        adj_padsol = (; value=adj_convprob.data)
        adj_padprob = adjoint_solve!(padsolver, adj_padsol, padsol.tape)
        adj_design_vars = adj_padprob.data
        return _fom, adj_design_vars
    end
end

h = 1e-6
Random.seed!(0)
perturb = h * randn(size(design_vars))
fom_ph, = fom_withgradient(design_vars + perturb)
fom_mh, = fom_withgradient(design_vars - perturb)
dfomdh_fd = (fom_ph - fom_mh) / 2h

fom_val, adj_design_vars = fom_withgradient(design_vars)
dfomdh = 2sum(adj_design_vars .* perturb) / 2h
@show dfomdh_fd dfomdh

opt = NLopt.Opt(:LD_CCSAQ, length(design_vars))
evaluation_history = Float64[]
my_objective_fn = let fom_withgradient=fom_withgradient, evaluation_history=evaluation_history, design_vars=design_vars
    function (x, grad)
        val, adj_design = fom_withgradient(reshape(x, size(design_vars)))
        if !isempty(grad)
            copy!(grad, vec(adj_design))
        end
        push!(evaluation_history, val)
        return val
    end
end
NLopt.min_objective!(opt, my_objective_fn)
NLopt.maxeval!(opt, 50)
fmax, xmax, ret = NLopt.optimize(opt, vec(design_vars))

let
    padsolver.data = reshape(xmax, size(design_vars))
    padsol = solve!(padsolver)
    convsolver.data = padsol.value
    convsol = solve!(convsolver)
    depadsolver.data = convsol.value
    depadsol = solve!(depadsolver)
    projsolver.rho_filtered = depadsol.value
    projsol = solve!(projsolver)

    fig = Figure()
    ax1 = Axis(fig[1,1]; title = "Objective history", yscale=log10, limits = (nothing, (1e-16, 1e1)))
    h1 = scatterlines!(ax1, evaluation_history)

    ax2 = Axis(fig[1,2]; title = "Final SSP2 design", aspect=DataAspect())
    h2 = heatmap!(target_grid..., reshape(projsol.value, length.(target_grid)); colormap=colormap("grays"))
    Colorbar(fig[1,3], h2)
    save("optimization.png", fig)
end
```
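The example builds its filter with `conickernel(grid, radius)`. The conic (linearly decaying) kernel is the standard density-filter kernel in topology optimization; a generic 2-D sketch of the construction (not SSP's N-dimensional implementation, whose exact normalization may differ):

```julia
# Conic filter kernel: weight w(r) = max(0, 1 - r/R) for each grid point at
# distance r from the origin, normalized so the weights sum to 1. Convolving
# design variables with this kernel smooths features below the radius R.
function conic_kernel(grid, R)
    w = [max(0.0, 1 - hypot(x, y) / R) for (x, y) in Iterators.product(grid...)]
    return w ./ sum(w)
end
```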

examples/julia/ssp_comparison.jl

Lines changed: 97 additions & 0 deletions
```julia
using SSP: conic_filter, ssp1_linear, ssp1, ssp2

using Random
using CairoMakie
using CairoMakie: colormap
using NLopt
using Zygote


Nx = Ny = 100
grid = (
    range(-1, 1, length=Nx),
    range(-1, 1, length=Ny),
)
# Random.seed!(42)
# design_vars = rand(Nx, Ny)
# design_vars = [sinpi(x) * sinpi(y) for (x, y) in Iterators.product(grid...)]
design_vars = let a = 0.5, b = 0.499
    # Cassini oval
    [((x^2 + y^2)^2 - 2a^2 * (x^2 - y^2) + a^4 - b^4) + 0.5 for (x, y) in Iterators.product(grid...)]
end
radius = 0.1
beta = Inf
eta = 0.5
ssp_algs = (ssp1_linear, ssp1, ssp2)

ssp_projections = map(ssp_algs) do ssp
    rho_filtered = conic_filter(design_vars, radius, grid)
    rho_projected = ssp(rho_filtered, beta, eta, grid)
    return rho_projected
end

let
    fig = Figure(size = (1200, 400))
    for (i, (ssp, rho_projected)) in enumerate(zip(ssp_algs, ssp_projections))
        ax = Axis(fig[1,2i-1]; title = "$(string(nameof(ssp))) projection", aspect=DataAspect())
        h = heatmap!(grid..., rho_projected; colormap=colormap("grays"))
        Colorbar(fig[1,2i], h)
    end
    save("projection_comparison.png", fig)
end

function figure_of_merit(rho_projected)
    sum(abs2, rho_projected) / length(rho_projected)
end

ssp_projection_gradients = map(ssp_algs) do ssp
    design_vars_gradient = Zygote.gradient(design_vars) do design_vars
        rho_filtered = conic_filter(design_vars, radius, grid)
        rho_projected = ssp(rho_filtered, beta, eta, grid)
        return figure_of_merit(rho_projected)
    end
    return design_vars_gradient[1]
end

let
    fig = Figure(size = (1200, 400))
    for (i, (ssp, rho_projected_gradient)) in enumerate(zip(ssp_algs, ssp_projection_gradients))
        ax = Axis(fig[1,2i-1]; title = "$(string(nameof(ssp))) projection gradient", aspect=DataAspect())
        h = heatmap!(grid..., rho_projected_gradient; colormap=colormap("RdBu"))
        Colorbar(fig[1,2i], h)
    end
    save("projection_gradient_comparison.png", fig)
end

ssp_optimization_histories = map(ssp_algs) do ssp
    opt = NLopt.Opt(:LD_CCSAQ, length(design_vars))
    evaluation_history = Float64[]
    my_objective_fn = let evaluation_history=evaluation_history, design_vars=design_vars
        function (x, grad)
            fom, adj_design = Zygote.withgradient(x) do x
                rho_filtered = conic_filter(reshape(x, size(design_vars)), radius, grid)
                rho_projected = ssp(rho_filtered, beta, eta, grid)
                return figure_of_merit(rho_projected)
            end
            if !isempty(grad)
                copy!(grad, vec(adj_design[1]))
            end
            push!(evaluation_history, fom)
            return fom
        end
    end
    NLopt.min_objective!(opt, my_objective_fn)
    NLopt.maxeval!(opt, 50)
    fmax, xmax, ret = NLopt.optimize(opt, vec(design_vars))
    return evaluation_history
end

let
    fig = Figure()
    ax = Axis(fig[1,1]; title = "Optimization history", yscale=log10, limits = (nothing, (1e-16, 1e1)))
    for (i, (ssp, evaluation_history)) in enumerate(zip(ssp_algs, ssp_optimization_histories))
        scatterlines!(ax, evaluation_history; label=string(nameof(ssp)))
    end
    Legend(fig[1,2], ax)
    save("evaluation_history_comparison.png", fig)
end
```
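For readers unfamiliar with the `Zygote.withgradient` pattern used in the objective closure above: it returns the function value together with a tuple of gradients, one entry per differentiated argument. A tiny self-contained illustration (the figure of merit mirrors the one in the example):

```julia
using Zygote

# Mean of squares, as in figure_of_merit above; its gradient is 2x / length(x).
f(x) = sum(abs2, x) / length(x)

val, grads = Zygote.withgradient(f, [1.0, 2.0, 3.0])
# val is f(x); grads[1] is the gradient w.r.t. the first (and only) argument
```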

src/julia/SSP/Project.toml

Lines changed: 20 additions & 0 deletions
```toml
name = "SSP"
uuid = "e5b5d2ee-15bb-40cc-a0da-b305b842b7a8"
version = "0.1.0"
authors = ["Lorenzo Van Munoz <lorenzo@vanmunoz.com>"]

[deps]
FFTW = "7a1cc6ca-52ef-59f5-83cd-3a7055c09341"
FastInterpolations = "9ea80cae-fc13-4c00-8066-6eaedb12f34b"

[weakdeps]
ChainRulesCore = "d360d2e6-b24c-11e9-a2a3-2a2ae2dbcce4"

[extensions]
SSPChainRulesCoreExt = "ChainRulesCore"

[compat]
ChainRulesCore = "1"
FFTW = "1.10.0"
FastInterpolations = "0.4.4"
julia = "1.11"
```

src/julia/SSP/README.md

Lines changed: 63 additions & 0 deletions
````markdown
# SSP

A Smoothed Subpixel Projection (SSP) package for topology optimization in Julia.
Supports N-dimensional data and reverse-mode automatic differentiation with minimal allocations.

## Usage

This package provides a high-level API nearly identical to its Python cousin.
It provides a `conic_filter` routine for smoothing design variables as well as
three projections: `ssp1_linear`, `ssp1`, and `ssp2`.

### Example

```julia
using SSP: conic_filter, ssp2

Nx = Ny = 100
grid = (
    range(-1, 1, length=Nx),
    range(-1, 1, length=Ny),
)

design_vars = let a = 0.5, b = 0.499
    # Cassini oval
    [((x^2 + y^2)^2 - 2a^2 * (x^2 - y^2) + a^4 - b^4) + 0.5 for (x, y) in Iterators.product(grid...)]
end
radius = 0.1
beta = Inf
eta = 0.5

mapping = let radius=radius, grid=grid, beta=beta, eta=eta
    function (design_vars)
        rho_filtered = conic_filter(design_vars, radius, grid)
        rho_projected = ssp2(rho_filtered, beta, eta, grid)
        return rho_projected
    end
end

mapping(design_vars)
```

This API also supports reverse-mode automatic differentiation through packages
that rely on ChainRulesCore.jl, such as Zygote.jl. Therefore, calculating
gradients is straightforward:

```julia
using Zygote

fom = let mapping=mapping
    function (x)
        rho_projected = mapping(x)
        return sum(abs2, rho_projected) / length(rho_projected)
    end
end
Zygote.withgradient(fom, design_vars)
```

### Low-level API

We provide a SciML-like API that allows reduced allocations and finer control
over algorithm details, such as boundary conditions and padding for interpolation
and the selection of projection points. See `SSP/examples/julia/ssp2_example.jl`
for how to use this API.
````
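The SciML-like call pattern can be condensed from `examples/julia/ssp2_example.jl` as follows. This is a sketch, not documented API: names and module paths are taken from that example file, and the padding stage stands in for the full filter-then-project chain.

```julia
using SSP: init, solve!, adjoint_solve!
using SSP: Pad
using .Pad: PaddingProblem, BoundaryPadding, DefaultPaddingAlgorithm

design_vars = rand(8, 8)
pad = (2, 2)  # padding widths per dimension

# problem + algorithm -> solver; the solver caches workspace so that
# repeated solve! calls allocate little
prob = PaddingProblem(; data = design_vars,
                        boundary = BoundaryPadding(pad, pad))
solver = init(prob, DefaultPaddingAlgorithm())
sol = solve!(solver)  # sol.value: padded array; sol.tape: state for adjoints

# reverse pass: seed the output cotangent and pull it back to the inputs
adj_prob = adjoint_solve!(solver, (; value = ones(size(sol.value))), sol.tape)
adj_design_vars = adj_prob.data  # cotangent w.r.t. the `data` field
```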
