Commit ad143a7

Authored by joaquimg, andrewrosemberg, odow, and blegat
Add JuMP-like API (#281)
* initial sketch
* rm method
* add JuMP API and test it
* add diff_model test
* Fix conic error (#284)
* fix conic error
* use jump psd problem
* [docs] update to Documenter@1 (#286)
* [docs] update to Documenter@1
* Update
* update
* format
* Pass attributes through Objective.FunctionConversionBridge (#287)
* Pass attributes through Objective.FunctionConversionBridge
* Fix
* add test
* fix tol
* fix test
* add reverse test

  Co-authored-by: joaquimg <[email protected]>

* bump POI
* cleanup

  Co-authored-by: Oscar Dowson <[email protected]>
  Co-authored-by: Benoît Legat <[email protected]>
  Co-authored-by: joaquimg <[email protected]>

* fix parameters support
* Update jump-api pr (#294)
* prepare more tests
* sketch
* temp examples
* remove printing
* remove printing
* remove duplicatie def
* move definition
* format
* format
* simplify conic model
* force nlp on nlp tests
* fix param tests
* cleanup parameter usage
* remove code
* cleanup usage of parameters
* format
* add temp dep
* format
* add temp dep
* fix PSDSquare
* temp fix for tests
* format
* Improve name handling, cleanup parameter error messages, remove slack
* re-enable constraint names
* Integrate new POI (#302)
* integrate new POI
* remove test
* rm adhoc pkg add
* fix docs
* add POI test and docs
* rm test_solve_conflict
* format
* add parameter in cone test
* add back name constraint vect
* Update Project.toml

  Co-authored-by: Joaquim <[email protected]>

* add tests
* format

Co-authored-by: Andrew Rosemberg <[email protected]>
Co-authored-by: Oscar Dowson <[email protected]>
Co-authored-by: Benoît Legat <[email protected]>
1 parent 2d346bb commit ad143a7

28 files changed: +2404 −498 lines

Project.toml

Lines changed: 1 addition & 1 deletion
```diff
@@ -23,5 +23,5 @@ JuMP = "1"
 LazyArrays = "0.21, 0.22, 1"
 MathOptInterface = "1.18"
 MathOptSetDistances = "0.2.9"
-ParametricOptInterface = "0.9.0"
+ParametricOptInterface = "0.12.1"
 julia = "1.6"
```
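A quick way to reproduce the updated environment locally is sketched below; this is not part of the commit, and the version number is simply the new compat lower bound from the diff above:

```julia
using Pkg

# Request the ParametricOptInterface release required by this commit's compat entry,
# then add DiffOpt itself and print the resolved environment.
Pkg.add(Pkg.PackageSpec(name = "ParametricOptInterface", version = "0.12.1"))
Pkg.add("DiffOpt")
Pkg.status()
```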

README.md

Lines changed: 18 additions & 36 deletions
````diff
@@ -33,15 +33,12 @@ examples, tutorials, and an API reference.
 
 ### DiffOpt-JuMP API with `Parameters`
 
+Here is an example with a Parametric **Linear Program**:
+
 ```julia
 using JuMP, DiffOpt, HiGHS
 
-model = Model(
-    () -> DiffOpt.diff_optimizer(
-        HiGHS.Optimizer;
-        with_parametric_opt_interface = true,
-    ),
-)
+model = DiffOpt.quadratic_diff_model(HiGHS.Optimizer)
 set_silent(model)
 
 p_val = 4.0
@@ -64,9 +61,9 @@ optimize!(model)
 
 # differentiate w.r.t. p
 direction_p = 3.0
-MOI.set(model, DiffOpt.ForwardConstraintSet(), ParameterRef(p), Parameter(direction_p))
+DiffOpt.set_forward_parameter(model, p, direction_p)
 DiffOpt.forward_differentiate!(model)
-@show MOI.get(model, DiffOpt.ForwardVariablePrimal(), x) == direction_p * 3 / pc_val
+@show DiffOpt.get_forward_variable(model, x) == direction_p * 3 / pc_val
 
 # update p and pc
 p_val = 2.0
@@ -82,45 +79,30 @@ optimize!(model)
 DiffOpt.empty_input_sensitivities!(model)
 # differentiate w.r.t. pc
 direction_pc = 10.0
-MOI.set(model, DiffOpt.ForwardConstraintSet(), ParameterRef(pc), Parameter(direction_pc))
+DiffOpt.set_forward_parameter(model, pc, direction_pc)
 DiffOpt.forward_differentiate!(model)
-@show abs(MOI.get(model, DiffOpt.ForwardVariablePrimal(), x) -
+@show abs(DiffOpt.get_forward_variable(model, x) -
     -direction_pc * 3 * p_val / pc_val^2) < 1e-5
 
 # always a good practice to clear previously set sensitivities
 DiffOpt.empty_input_sensitivities!(model)
 # Now, reverse model AD
 direction_x = 10.0
-MOI.set(model, DiffOpt.ReverseVariablePrimal(), x, direction_x)
+DiffOpt.set_reverse_variable(model, x, direction_x)
 DiffOpt.reverse_differentiate!(model)
-@show MOI.get(model, DiffOpt.ReverseConstraintSet(), ParameterRef(p)) == MOI.Parameter(direction_x * 3 / pc_val)
-@show abs(MOI.get(model, DiffOpt.ReverseConstraintSet(), ParameterRef(pc)).value -
-    -direction_x * 3 * p_val / pc_val^2) < 1e-5
+@show DiffOpt.get_reverse_parameter(model, p) == direction_x * 3 / pc_val
+@show DiffOpt.get_reverse_parameter(model, pc) == -direction_x * 3 * p_val / pc_val^2
 ```
 
-### Low level DiffOpt-JuMP API:
-
-A brief example:
+Available models:
+* `DiffOpt.quadratic_diff_model`: Quadratic Programs (QP) and Linear Programs
+  (LP)
+* `DiffOpt.conic_diff_model`: Conic Programs (CP) and Linear Programs (LP)
+* `DiffOpt.nonlinear_diff_model`: Nonlinear Programs (NLP), Quadratic Programs
+  (QP) and Linear Programs (LP)
+* `DiffOpt.diff_model`: Nonlinear Programs (NLP), Conic Programs (CP),
+  Quadratic Programs (QP) and Linear Programs (LP)
 
-```julia
-using JuMP, DiffOpt, HiGHS
-# Create a model using the wrapper
-model = Model(() -> DiffOpt.diff_optimizer(HiGHS.Optimizer))
-# Define your model and solve it
-@variable(model, x)
-@constraint(model, cons, x >= 3)
-@objective(model, Min, 2x)
-optimize!(model)
-# Choose the problem parameters to differentiate with respect to, and set their
-# perturbations.
-MOI.set(model, DiffOpt.ReverseVariablePrimal(), x, 1.0)
-# Differentiate the model
-DiffOpt.reverse_differentiate!(model)
-# fetch the gradients
-grad_exp = MOI.get(model, DiffOpt.ReverseConstraintFunction(), cons) # -3 x - 1
-constant(grad_exp) # -1
-coefficient(grad_exp, x) # -3
-```
 
 ## Citing DiffOpt.jl
 
````
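Of the constructors in the `Available models` list above, only `quadratic_diff_model` is exercised in the README snippet. Below is a minimal sketch, not taken from the README, of the same forward-mode workflow with `DiffOpt.conic_diff_model`; it assumes SCS (already a docs dependency, see docs/Project.toml below) and uses a toy problem chosen here purely for illustration:

```julia
using JuMP, DiffOpt, SCS

# Conic constructor, same differentiation API as the quadratic example above.
model = DiffOpt.conic_diff_model(SCS.Optimizer)
set_silent(model)

@variable(model, p in Parameter(2.0))  # parameter with value 2.0
@variable(model, x >= 0)
@constraint(model, x >= 3p)
@objective(model, Min, 2x)
optimize!(model)

# Sensitivity of x w.r.t. p: the constraint x >= 3p is active at the optimum,
# so dx/dp should be approximately 3 (up to solver tolerance).
DiffOpt.empty_input_sensitivities!(model)
DiffOpt.set_forward_parameter(model, p, 1.0)
DiffOpt.forward_differentiate!(model)
@show DiffOpt.get_forward_variable(model, x)
```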

docs/Project.toml

Lines changed: 2 additions & 0 deletions
```diff
@@ -9,6 +9,8 @@ LinearAlgebra = "37e2e46d-f89d-539d-b4ee-838fcccc9c8e"
 Literate = "98b081ad-f1c9-55d3-8b20-4c87d4299306"
 MLDatasets = "eb30cadb-4394-5ae3-aed4-317e484a6458"
 MathOptInterface = "b8f27783-ece8-5eb3-8dc8-9495eed66fee"
+ParametricOptInterface = "0ce4ce61-57bf-432b-a095-efac525d185e"
+Pkg = "44cfe95a-1eb2-52ea-b672-e2afdf69b78f"
 Plots = "91a5bcdd-55d7-5caf-9e0b-520d859cae80"
 SCS = "c946c3f1-0d1f-5ce8-9dea-7daa1f7e2d13"
 Statistics = "10745b16-79ce-11e8-11f9-7d13ad32a3b2"
```
Thermal_Generation_Dispatch_Example.jl (new file)

Lines changed: 191 additions & 0 deletions

```julia
# # Thermal Generation Dispatch Example

#md # [![](https://img.shields.io/badge/GitHub-100000?style=for-the-badge&logo=github&logoColor=white)](@__REPO_ROOT_URL__/examples/Thermal_Generation_Dispatch_Example.jl)

# This example illustrates the sensitivity analysis of a thermal generation dispatch problem.

# This problem can be described as the choice of thermal generation `g` given a demand `d`,
# a price for thermal generation `c`, and a penalty price `c_{ϕ}` for any unmet demand `ϕ`.

# ```math
# \begin{split}
# \begin{array} {ll}
# \mbox{minimize} & \sum_{i=1}^{N} c_{i} g_{i} + c_{\phi} \phi \\
# \mbox{s.t.} & g_{i} \ge 0 \quad i=1..N \\
#             & g_{i} \le G_{i} \quad i=1..N \\
#             & \sum_{i=1}^{N} g_{i} + \phi = d \\
# \end{array}
# \end{split}
# ```
# where
# - `G_{i}` is the maximum possible generation for a thermal generator `i`

# ## Define and solve the Thermal Dispatch Problem

# First, import the libraries.

using Test
using JuMP
import DiffOpt
import LinearAlgebra: dot
import HiGHS
import MathOptInterface as MOI
import Plots

# Define the model that will be constructed given a set of parameters.

function generate_model(
    d_data::Float64;
    g_sup::Vector{Float64},
    c_g::Vector{Float64},
    c_ϕ::Float64,
)
    ## Creation of the Model and Parameters
    model = DiffOpt.quadratic_diff_model(HiGHS.Optimizer)
    set_silent(model)
    I = length(g_sup)

    ## Variables
    @variable(model, g[i in 1:I] >= 0.0)
    @variable(model, ϕ >= 0.0)

    ## Parameters
    @variable(model, d in Parameter(d_data))

    ## Constraints
    @constraint(model, limit_constraints_sup[i in 1:I], g[i] <= g_sup[i])
    @constraint(model, demand_constraint, sum(g) + ϕ == d)

    ## Objectives
    @objective(model, Min, dot(c_g, g) + c_ϕ * ϕ)

    ## Solve the model
    optimize!(model)

    ## Return the solved model
    return model
end

# Define the functions that will get the primal values `g` and `ϕ` and the
# sensitivities with respect to the demand, `dg/dd` and `dϕ/dd`, from an
# optimized model.

function diff_forward(model::Model, ϵ::Float64 = 1.0)
    ## Initialization of parameters and references to simplify the notation
    vect_ref = [model[:g]; model[:ϕ]]

    ## Get the primal solution of the model
    vect = value.(vect_ref)

    ## Reset the sensitivities of the model
    DiffOpt.empty_input_sensitivities!(model)

    ## Pass the perturbation to the DiffOpt Framework
    DiffOpt.set_forward_parameter(model, model[:d], ϵ)

    ## Compute the derivatives with the Forward Mode
    DiffOpt.forward_differentiate!(model)

    ## Get the derivative of the model
    dvect = DiffOpt.get_forward_variable.(model, vect_ref)

    ## Return the values as a vector
    return [vect; dvect]
end

function diff_reverse(model::Model, ϵ::Float64 = 1.0)
    ## Initialization of parameters and references to simplify the notation
    vect_ref = [model[:g]; model[:ϕ]]

    ## Get the primal solution of the model
    vect = value.(vect_ref)

    ## Set variables needed for the DiffOpt Backward Framework
    dvect = Array{Float64,1}(undef, length(vect_ref))

    ## Loop for each primal variable
    for i in 1:I+1
        ## Reset the sensitivities of the model
        DiffOpt.empty_input_sensitivities!(model)

        ## Pass the perturbation to the DiffOpt Framework
        DiffOpt.set_reverse_variable.(model, vect_ref[i], ϵ)

        ## Compute the derivatives with the Reverse Mode
        DiffOpt.reverse_differentiate!(model)

        ## Get the derivative of the model
        dvect[i] = DiffOpt.get_reverse_parameter(model, model[:d])
    end

    ## Return the values as a vector
    return [vect; dvect]
end

# Initialize the parameters

g_sup = [10.0, 20.0, 30.0]
I = length(g_sup)
d = 0.0:0.1:80
d_size = length(d)
c_g = [1.0, 3.0, 5.0]
c_ϕ = 10.0;

# Generate models for each demand `d`
models = generate_model.(d; g_sup = g_sup, c_g = c_g, c_ϕ = c_ϕ);

# Get the results of the models in the DiffOpt forward and reverse contexts

result_forward = diff_forward.(models)
optimize!.(models)
result_reverse = diff_reverse.(models);

# Organization of results to plot
# Initialize data_results that will contain every result
data_results = Array{Float64,3}(undef, 2, d_size, 2 * (I + 1));

# Populate the data_results array
for k in 1:d_size
    data_results[1, k, :] = result_forward[k]
    data_results[2, k, :] = result_reverse[k]
end

# ## Results with Plot graphs
# ### Results for the forward context
# Result Primal Values:
Plots.plot(
    d,
    data_results[1, :, 1:I+1];
    title = "Generation by Demand",
    label = ["Thermal Generation 1" "Thermal Generation 2" "Thermal Generation 3" "Generation Deficit"],
    xlabel = "Demand [unit]",
    ylabel = "Generation [unit]",
)

# Result Sensitivity Analysis:
Plots.plot(
    d,
    data_results[1, :, I+2:2*(I+1)];
    title = "Sensitivity of Generation by Demand",
    label = ["T. Gen. 1 Sensitivity" "T. Gen. 2 Sensitivity" "T. Gen. 3 Sensitivity" "Gen. Deficit Sensitivity"],
    xlabel = "Demand [unit]",
    ylabel = "Sensitivity [-]",
)

# ### Results for the reverse context
# Result Primal Values:
Plots.plot(
    d,
    data_results[2, :, 1:I+1];
    title = "Generation by Demand",
    label = ["Thermal Generation 1" "Thermal Generation 2" "Thermal Generation 3" "Generation Deficit"],
    xlabel = "Demand [unit]",
    ylabel = "Generation [unit]",
)

# Result Sensitivity Analysis:
Plots.plot(
    d,
    data_results[2, :, I+2:2*(I+1)];
    title = "Sensitivity of Generation by Demand",
    label = ["T. Gen. 1 Sensitivity" "T. Gen. 2 Sensitivity" "T. Gen. 3 Sensitivity" "Gen. Deficit Sensitivity"],
    xlabel = "Demand [unit]",
    ylabel = "Sensitivity [-]",
)
```
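The example imports `Test` but the excerpt above never uses it. A possible sanity check, added here as a sketch rather than part of the committed file, is that forward and reverse mode should agree on the sensitivities stored in `data_results` wherever the LP has a unique optimal basis:

```julia
# Forward (row 1) and reverse (row 2) sensitivities should agree wherever the
# optimal basis is unique. Demands sitting exactly on a capacity breakpoint
# (0, 10, 30, 60) are skipped, since the derivative is not unique there.
breakpoints = [0.0; cumsum(g_sup)]
interior = findall(dk -> all(abs(dk - b) > 1e-6 for b in breakpoints), collect(d))
@test all(
    isapprox.(
        data_results[1, interior, I+2:2*(I+1)],
        data_results[2, interior, I+2:2*(I+1)];
        atol = 1e-5,
    ),
)
```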
