Allow both objective and solution sensitivities in reverse mode#338

Open
klamike wants to merge 13 commits into jump-dev:master from klamike:mk/allow_obj_and_sol

Conversation

@klamike
Contributor

@klamike klamike commented Feb 12, 2026

This one warrants some discussion. Allowing both is indeed error-prone; it may be better to add a flag to explicitly allow both, enable the currently commented-out warning, or just keep the current behavior and close this PR.

@andrewrosemberg

@codecov

codecov bot commented Feb 12, 2026

Codecov Report

❌ Patch coverage is 85.71429% with 3 lines in your changes missing coverage. Please review.
✅ Project coverage is 89.88%. Comparing base (56efbf0) to head (763ecf5).
⚠️ Report is 3 commits behind head on master.

Files with missing lines Patch % Lines
src/moi_wrapper.jl 81.81% 2 Missing ⚠️
src/NonLinearProgram/NonLinearProgram.jl 75.00% 1 Missing ⚠️
Additional details and impacted files
@@            Coverage Diff             @@
##           master     #338      +/-   ##
==========================================
+ Coverage   89.14%   89.88%   +0.73%     
==========================================
  Files          16       16              
  Lines        1963     2105     +142     
==========================================
+ Hits         1750     1892     +142     
  Misses        213      213              

☔ View full report in Codecov by Sentry.

@klamike
Contributor Author

klamike commented Feb 12, 2026

Codecov is marking the `begin` line as uncovered but the body of the block as covered...

@andrewrosemberg
Collaborator

Not opposed to having this! LGTM.

Should we add just a couple more tests: one with perturbations to both the dual and the objective, and another with all 3?

We are missing some coupled tests in general, and this could be a nice opportunity to add them.

@joaquimg
Member

I am not convinced about this one.
I have changed my opinion back and forth; that's why I have not done it myself.
If there is a realistic use case, we can add it with the warning and the flag.

@klamike
Contributor Author

klamike commented Feb 15, 2026

One setting where this would be useful is implementing training objectives of the form $\min_\theta \mathbb{E}_d \big[ f^\star(p(d; \theta)) + g(x^\star(p(d; \theta))) \big]$, where $\theta$ are e.g. neural network weights, $p(d; \theta)$ is a forward pass of the network taking features $d$ as input and returning a parameter vector $p$, $f^\star(p)$ and $x^\star(p)$ are the optimal objective value and solution vector at $p$, and $g$ is some "external" objective, e.g. $g(x) = \Vert x \Vert_2$.

Of course, users can just do it with the non-objective sensitivity API; it is just a bit less clean. A warning by default, which can be disabled via an attribute, seems like the best solution to me.
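
For concreteness, here is a minimal sketch of that training-loss pattern in JuMP, assuming the parameter-based DiffOpt reverse-mode API (`DiffOpt.ReverseVariablePrimal`, `DiffOpt.ReverseConstraintSet`, `DiffOpt.reverse_differentiate!`). The objective-sensitivity input attribute is written here as `DiffOpt.ReverseObjectiveSensitivity()`, which is an assumed name rather than something confirmed in this thread:

```julia
using JuMP, DiffOpt, Ipopt

# Toy parametric program standing in for the inner problem: p plays the role
# of p(d; θ), the output of the network's forward pass.
model = Model(() -> DiffOpt.diff_optimizer(Ipopt.Optimizer))
set_silent(model)
@variable(model, p in Parameter(1.0))
@variable(model, x >= 0)
@constraint(model, x >= p)
@objective(model, Min, x^2)
optimize!(model)

# Loss L = f⋆(p) + g(x⋆(p)) with g(x) = ‖x‖₂. Reverse mode is seeded with
# ∂L/∂f⋆ = 1 and ∂L/∂x⋆ = ∂g/∂x at x⋆ at the same time; this is the
# combination the PR allows (without the opt-in attribute discussed below,
# a warning is printed).
xstar = value(x)
MOI.set(model, DiffOpt.ReverseObjectiveSensitivity(), 1.0)       # assumed attribute name
MOI.set(model, DiffOpt.ReverseVariablePrimal(), x, sign(xstar))  # d‖x‖₂/dx for scalar x
DiffOpt.reverse_differentiate!(model)

# ∂L/∂p, to be chained through the network's backward pass to obtain ∂L/∂θ.
dLdp = MOI.get(model, DiffOpt.ReverseConstraintSet(), ParameterRef(p)).value
```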

@joaquimg
Member

Fair enough, let's do it. Let's go with a warning by default and a flag to disable it.

@klamike
Contributor Author

klamike commented Feb 16, 2026

Any recommendation for the attribute name? `AllowObjectiveAndSolutionInput()`?
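
If that name lands, the opt-in would presumably be a one-liner (a sketch, assuming it is a boolean attribute set through MOI; the name is the one proposed here and echoed by the warning below):

```julia
# Opt in to seeding both solution and objective sensitivities without a warning.
MOI.set(model, DiffOpt.AllowObjectiveAndSolutionInput(), true)
```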

@klamike
Contributor Author

klamike commented Feb 16, 2026

Not sure why the `@test_warn` is failing, and only on 1.6:

┌ Warning: Computing reverse differentiation with both solution sensitivities and objective sensitivities. Set `DiffOpt.AllowObjectiveAndSolutionInput()` to `true` to silence this warning.
└ @ DiffOpt ~/work/DiffOpt.jl/DiffOpt.jl/src/moi_wrapper.jl:561
test_ObjectiveSensitivity_model1: Test Failed at /buildworker/worker/package_linux64/build/usr/share/julia/stdlib/v1.6/Test/src/Test.jl:671
  Expression: contains_warn(read(fname, String), $(Expr(:escape, :warning)))
Stacktrace:
 [1] macro expansion
   @ /buildworker/worker/package_linux64/build/usr/share/julia/stdlib/v1.6/Test/src/Test.jl:671 [inlined]
 [2] test_ObjectiveSensitivity_model1()
   @ Main.TestNLPProgram ~/work/DiffOpt.jl/DiffOpt.jl/test/nlp_program.jl:681
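
Not necessarily what was done here, but one common way to make such a test robust across Julia versions is to assert on the log record itself with `@test_logs` instead of capturing stderr with `@test_warn`. A sketch, assuming the warning is emitted from `DiffOpt.reverse_differentiate!` and matching on the message text shown above:

```julia
using Test

# Match the :warn record directly, independent of how stderr is captured.
@test_logs (:warn, r"both solution sensitivities and objective sensitivities") match_mode=:any begin
    DiffOpt.reverse_differentiate!(model)
end
```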

@joaquimg
Member

We can drop 1.6 in favour of the current LTS.

@klamike
Contributor Author

klamike commented Feb 17, 2026

Done, though I managed to fix it for 1.6 too. The remaining CI failure is due to a coverage bug.
