Adding IPOPT as an alternative solver for MAgPIE #871
georg-schroeter wants to merge 16 commits into magpiemodel:develop
Conversation
pascal-sauer
left a comment
Seeing the code duplication now I think we need a better solution for the food demand model.
```
@@ -0,0 +1,2 @@
name,type,reason
vm_landdiff,input,questionnaire
```
This was just copied from the other realization (nlp_apr17). Does anyone know where this is coming from and if it's still needed?
The reason is that the lp_nlp_apr17 realization minimizes vm_landdiff instead of vm_cost_glo in its second solve while respecting the vm_cost_glo optimum as an upper bound.
Co-authored-by: Pascal Sauer <156898545+pascal-sauer@users.noreply.github.com>
The implementation settings were simplified, and unused options (additional optfile, conditional solve settings) were removed. The duplication of the food demand model with IPOPT as a solver was reverted; instead, an upcoming rewrite of the module will change the food demand model realization to use the general optimization settings.
pascal-sauer
left a comment
Nice and simple PR, looks good! Thanks @georg-schroeter !
```
);

p80_modelstat(t) = magpie.modelstat;
p80_num_nonopt(t) = magpie.numNOpt;
```
What is happening here, what are these variables? I assume this was copied from the conopt realization, so maybe @flohump knows?
Okay, found it in the GAMS docs:

> numNOpt (integer): Number of nonoptimalities
> Available: Attribute statement (use after solve)
> This model attribute returns the number of nonoptimalities after a solve.
A quick search on GitHub indicates that we never check p80_num_nonopt or do anything with it; do we actually need it? If p80_num_nonopt > 0, wouldn't that already be reflected in p80_modelstat?
From inspecting the full.lst of several IPOPT runs, "nonopt" entries do appear. They can happen, for example, when values slightly out of bounds are moved back in-bounds, pushing a solved equation slightly above the given tolerance so that it counts as non-optimal.
Okay, so you think there's enough upside to keep this in the code? My tendency is rather to delete if in doubt, but no strong feelings here
pascal-sauer
left a comment
How would the user switch to optfile2? Would that realistically ever be done? My feeling is no, so I'd much prefer to agree on/find the one configuration that works best
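For reference, GAMS selects between solver option files via the model's optfile attribute; a minimal sketch (the actual wiring through MAgPIE's cfg may differ):

```gams
* optfile = 1 reads ipopt.opt, optfile = 2 reads ipopt.op2
magpie.optfile = 1;
solve magpie using nlp minimizing vm_cost_glo;
```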
```
put 'quality_function_max_section_steps 4' /;
put 'nlp_scaling_method gradient-based' /;
put 'nlp_scaling_max_gradient 100' /;
put 'acceptable_tol 1e-6' /;
```
Huh, I wasn't even aware of the "acceptable" concept/feature of Ipopt. Did someone (not an LLM) think this through? This only makes sense if acceptable_tol > tol, right? Our usual tol (which is feasible) is 1e-7, so 1e-6 does not make sense here. Also, 1e-6 is the default; I'd not specify default settings explicitly unless there is a reason for it.
I'm in favor of dropping the acceptable stuff
Our tol is 1e-8, so considerably lower. "Solved to acceptable level" followed by a "warm start" from that point actually happened sometimes in my other IPOPT runs as well. Since it's the default value, it should be removed anyway.
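To illustrate the relationship being discussed, a sketch of an option file keeping only the non-default tolerance (tol 1e-8 as stated above; acceptable_tol defaults to 1e-6 in Ipopt):

```text
# ipopt.opt sketch: acceptable_tol (default 1e-6) only triggers the looser
# "solved to acceptable level" fallback; it has to stay above tol to matter
tol 1e-8
```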
Co-authored-by: Pascal Sauer <156898545+pascal-sauer@users.noreply.github.com>
Not related to this PR, but it's really hard to see the red line in the plots. Maybe we should think about making the lines a little transparent so it's easier to see overlaps?
pascal-sauer
left a comment
I suggest some cleanup for optfile2, but good to go otherwise :)
```
put 'quality_function_max_section_steps 4' /;
put 'nlp_scaling_method gradient-based' /;
put 'nlp_scaling_max_gradient 100' /;
put 'acceptable_tol 1e-6' /;
```
The other acceptable_* should also be deleted, no?
```
@@ -50,9 +47,6 @@ put 'honor_original_bounds yes' /;
put 'max_iter 10000' /;
put 'linear_solver mumps' /;
```
This is the default, no need to specify
```
@@ -50,9 +47,6 @@ put 'honor_original_bounds yes' /;
put 'max_iter 10000' /;
```
Is this relevant enough to keep?
```
@@ -40,7 +38,6 @@ put 'mu_oracle quality-function' /;
put 'quality_function_max_section_steps 4' /;
put 'nlp_scaling_method gradient-based' /;
```
Delete as this is the default
```
@@ -40,7 +38,6 @@ put 'mu_oracle quality-function' /;
put 'quality_function_max_section_steps 4' /;
put 'nlp_scaling_method gradient-based' /;
put 'nlp_scaling_max_gradient 100' /;
```
Delete as this is the default
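The quoted lines come from writing the solver option file via the GAMS put facility; a condensed sketch with the defaults dropped as suggested (option values taken from the quoted diff, the tol value from the discussion above):

```gams
* write a minimal ipopt.opt, keeping only non-default settings
file opt_ipopt / ipopt.opt /;
put opt_ipopt;
put 'tol 1e-8' /;
put 'mu_oracle quality-function' /;
put 'quality_function_max_section_steps 4' /;
putclose opt_ipopt;
```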
Oh, and as I was heavily involved in this PR, I think we should get another RSE review. Can you have a look, @tscheypidi?
🐦 Description of this PR 🐦
While MAgPIE and all pre- and post-processing libraries are open source, executing the MAgPIE core model has so far relied on CONOPT, a proprietary solver inside the GAMS framework. This is not ideal in several ways.
This PR alleviates part of this by adding the option to solve MAgPIE with the open-source NLP solver IPOPT instead of the proprietary CONOPT4. Note, however, that this is not (yet) an equivalent drop-in solution without downsides. First, it required several changes in the core model and post-processing formulations (see #869). Second, while we ultimately got it to work properly, it takes much longer than CONOPT4 (by about a factor of 12 right now), which can partially be attributed to CONOPT's extensive preprocessing of the model, cutting its size by about 60% after removing the pre- and post-triangular parts. We might be able to reduce that gap in the future by improving the model formulation, but it might also grow further with future changes in the input- or output-related parts.
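In GAMS terms, the switch boils down to selecting a different NLP solver before the solve statement; a minimal sketch (identifiers as used in MAgPIE, surrounding logic omitted):

```gams
* default would be: option nlp = conopt4;
option nlp = ipopt;
magpie.optfile = 1;
solve magpie using nlp minimizing vm_cost_glo;
```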
🔧 Checklist for PR creator 🔧
If a point is not applicable, check the checkbox anyway and write "non-applicable" next to the checkbox.
Label pull request from the label list.
Self-review own code
- magpie4 R library has been updated accordingly and is backwards compatible where necessary.
- scenario_config.csv has been updated accordingly (important if default.cfg has been updated)
Document changes
- CHANGELOG.md
- goxygen::goxygen() and verify the modified code is properly documented
Perform test runs
- Rscript start.R --> "compilation check"
- Rscript start.R --> "default"
- Rscript start.R --> "test runs"
- Reporting produces no errors and no new warnings
Get two approving reviews (at least one from RSE)
📉 Performance 📈
🚨 Checklist for reviewer 🚨
CHANGELOG is updated correctly