Adjoint Solver: Optimization problem with an objective function dependent on separate simulations. #2199
-
This is supported by the Meep JAX wrapper. Essentially, you can create multiple …
-
This is probably the easiest approach.
Luckily you can indeed do this in parallel using the approach described below...
If everything is linear, you can cascade things together rather easily. The trick is to compute the gradient of each independent FOM and then recombine them, like you suggest. We do this quite often with the current infrastructure (no changes to the code needed). We simply run:

```python
if my_group == 0:
    geometry = [<first geometry>]
    sources = [<first source>]
elif my_group == 1:
    geometry = [<second geometry>]
    sources = [<second source>]
```

Then you compute your gradients as expected and broadcast all the gradients from each group to the others, such that you can now optimize over all design fields, etc., using something like:

```python
f, df = opt([x])
all_grads = mp.merge_subgroup_data(df)
```

This is a bit intimidating at first, but it's a truly powerful form of parallelism (which we call "in-the-loop" parallelism in our paper) that dramatically simplifies multiobjective optimization.
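To make the recombination step concrete, here is a toy pure-Python sketch (no Meep; the FOMs and the `merge_subgroup_data` stand-in below are hypothetical). Each "group" computes the gradient of its own analytic FOM, and since the combined objective is $F = F_0 + F_1$, the total gradient is just the elementwise sum of the per-group gradients:

```python
# Toy sketch of "in-the-loop" gradient recombination (pure Python, no Meep).
# In practice each group runs its own simulation; here each FOM is an
# analytic function of the design variables x, so the gradients are exact.

def fom_grad_group0(x):
    # FOM_0 = sum(x_i^2)  =>  dFOM_0/dx_i = 2*x_i
    return [2.0 * xi for xi in x]

def fom_grad_group1(x):
    # FOM_1 = sum(3*x_i)  =>  dFOM_1/dx_i = 3
    return [3.0 for _ in x]

def merge_subgroup_data(grads):
    # Stand-in for mp.merge_subgroup_data: after broadcasting, every process
    # holds all per-group gradients; recombine by summing, since
    # d(F_0 + F_1)/dx_i = dF_0/dx_i + dF_1/dx_i.
    return [sum(g) for g in zip(*grads)]

x = [1.0, -2.0, 0.5]
all_grads = merge_subgroup_data([fom_grad_group0(x), fom_grad_group1(x)])
print(all_grads)  # [5.0, -1.0, 4.0]
```

For a weighted objective $F = \sum_s w_s F_s$, you would scale each group's gradient by $w_s$ before summing.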
-
I'm currently working on an optimization project involving an objective function that is dependent on the DFT fields of completely independent simulations.
That is, denoting the DFT fields of a specific region of simulation number $s$ by $\mathbf{E_s}$, I need the gradients with respect to the design-region permittivity $\epsilon$ of an objective function $\mathcal{F}$ of the form:
$$\mathcal{F}(\mathbf{E_1}, \mathbf{E_2}, \mathbf{E_3}, \ldots, \mathbf{E_N})$$
However, the Meep adjoint `OptimizationProblem` object only accepts an objective function whose arguments come exclusively from the output of a single simulation object. Is there a way to handle my problem without substantially changing the Meep adjoint module?
One possibility is computing the gradients via the following expression:
$$\frac{\partial \mathcal{F}}{\partial \epsilon_i} = \sum_{s} \frac{\partial \mathcal{F} }{\partial \mathbf{E_s}} \frac{\partial \mathbf{E_s}}{\partial \epsilon_i} $$
where the chain rule was used and $\frac{\partial \mathbf{E_s}}{\partial \epsilon_i}$ is the Jacobian of the DFT fields of simulation number $s$.
How could I compute that Jacobian by using meep adjoint module?
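As a numerical illustration of the chain-rule expression above (not Meep code: the two "simulations" are hypothetical linear maps $\mathbf{E_s} = A_s \epsilon$, so $A_s$ plays the role of the Jacobian $\partial \mathbf{E_s}/\partial \epsilon$), the gradient assembled as $\sum_s (\partial \mathcal{F}/\partial \mathbf{E_s})^T J_s$ can be checked against a finite-difference estimate:

```python
# Numerical check of dF/deps_i = sum_s (dF/dE_s) . (dE_s/deps_i)
# using two hypothetical linear "simulations" E_s = A_s @ eps
# (plain lists, no NumPy/Meep); F(E_1, E_2) = |E_1|^2 + sum(E_2).

A1 = [[1.0, 2.0], [0.0, 1.0]]   # Jacobian dE_1/deps (linear map)
A2 = [[3.0, -1.0]]              # Jacobian dE_2/deps

def matvec(A, v):
    return [sum(a * x for a, x in zip(row, v)) for row in A]

def objective(eps):
    E1, E2 = matvec(A1, eps), matvec(A2, eps)
    return sum(e * e for e in E1) + sum(E2)

def gradient(eps):
    E1 = matvec(A1, eps)
    dF_dE1 = [2.0 * e for e in E1]   # dF/dE_1 for F = |E_1|^2 + sum(E_2)
    dF_dE2 = [1.0]                   # dF/dE_2
    # Chain rule: grad_i = sum_s sum_k (dF/dE_s)_k * (J_s)_{k,i}
    grad = [0.0] * len(eps)
    for dF_dE, J in ((dF_dE1, A1), (dF_dE2, A2)):
        for k, row in enumerate(J):
            for i, Jki in enumerate(row):
                grad[i] += dF_dE[k] * Jki
    return grad

eps = [0.7, -0.3]
g = gradient(eps)

# Central finite-difference check of the chain-rule gradient
h = 1e-6
fd = []
for i in range(len(eps)):
    ep, em = list(eps), list(eps)
    ep[i] += h
    em[i] -= h
    fd.append((objective(ep) - objective(em)) / (2 * h))
print(g)   # matches fd to ~1e-10
```

In Meep itself you would not form the Jacobian explicitly: each per-simulation adjoint run gives you the product $(\partial \mathcal{F}/\partial \mathbf{E_s})^T (\partial \mathbf{E_s}/\partial \epsilon)$ directly, and the outer sum over $s$ is then just an elementwise sum of the per-simulation gradients.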