
Use a custom linear relaxation for fpump #255

Open

this-josh wants to merge 13 commits into lanl-ansi:master from this-josh:master

Conversation

@this-josh

Implementation of #254

@this-josh
Author

this-josh commented Nov 15, 2022

@Wikunia here is the implementation I have; as mentioned in #255, I'm not convinced it is a good approach. It works as follows:

    using JuMP, Juniper, Ipopt, HiGHS

    # build the MINLP
    _nl_solver = optimizer_with_attributes(Ipopt.Optimizer, "print_level" => 0)
    juniper_opt = optimizer_with_attributes(Juniper.Optimizer, "nl_solver" => _nl_solver, "mip_solver" => HiGHS.Optimizer)
    model = Model(juniper_opt)
    @variable(model, a, integer = true)
    @constraint(model, 0 <= model[:a] <= 10)
    @NLconstraint(model, model[:a] * abs(model[:a]) >= 3)
    @objective(model, Min, model[:a])
    juniper_opt = optimizer_with_attributes(Juniper.Optimizer, "nl_solver" => _nl_solver, "mip_solver" => HiGHS.Optimizer, "time_limit" => 64)

    # build the MILP
    mip = Model(juniper_opt)
    @variable(mip, a, integer = true)
    @constraint(mip, mip[:a] * mip[:a] == 0)
    @constraint(mip, mip[:a] <= 10)
    @objective(mip, Min, mip[:a])
    set_silent(mip)
    optimize!(mip)

    # set the MINLP to use the MILP model as its relaxation
    set_optimizer_attribute(model, "mip_model", mip)
    optimize!(model)
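For comparison, here is a minimal sketch of the same pattern with a purely linear relaxation built directly on HiGHS and handed over through the same "mip_model" attribute. The model name `relaxation` and the bound `a >= sqrt(3)` are my own hand-derived linearisation of `a * abs(a) >= 3` on `[0, 10]`; the PR does not derive this automatically.

    # sketch only: a genuinely linear relaxation supplied the same way
    relaxation = Model(HiGHS.Optimizer)
    set_silent(relaxation)
    @variable(relaxation, 0 <= a <= 10, Int)
    # a * abs(a) >= 3 with 0 <= a <= 10 implies a >= sqrt(3) (my derivation)
    @constraint(relaxation, a >= sqrt(3))
    @objective(relaxation, Min, a)

    # hand the relaxation to the Juniper model and solve the MINLP
    set_optimizer_attribute(model, "mip_model", relaxation)
    optimize!(model)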

Note that six test_linear_transform tests are failing, but they are also failing on main.

