Custom constraints with Linopy


Ebbe Kyhl Gøtske

Apr 13, 2023, 8:58:19 AM
to pypsa
Hi,

I would like to define a constraint that bounds the dispatch of a StorageUnit to follow some predefined/historical seasonality profile. The idea could be to aggregate the dispatch on a monthly level and build the linear expression based on the array containing the monthly aggregates. 

However, I am running into some syntax issues/challenges when I try to implement this with Linopy. If one were to aggregate the dispatch on an annual timescale and use a single scalar as an upper/lower bound, this would be an easy fix in Linopy (here, exemplified with a hydro StorageUnit):

>>> lhs = n.model.variables["StorageUnit-p_dispatch"].sel(StorageUnit='hydro').sum()
>>> rhs = scalar_bound
>>> n.model.add_constraints(lhs <= rhs, name="hydro total constraint")

However, as described, instead of the `.sum()` over all snapshots, I would like to resample/aggregate the dispatch at a monthly level. This could, for example, be done with the `xarray.Dataset.resample(...).sum()` function:

>>> xr_data = n.model.variables["StorageUnit-p_dispatch"].sel(StorageUnit='hydro').data
>>> xr_data['snapshot'] = pd.DatetimeIndex(xr_data['snapshot'].values)
>>> xr_data.resample(snapshot='M').sum()

This leads to the desired array of 12 aggregated dispatch values. However, getting a linear expression from here does not seem very straightforward. I am not even sure whether Linopy is compatible with this approach.

Perhaps some of you have some valuable insight.

Best regards,
Ebbe Gøtske

Fabian Hofmann

Apr 19, 2023, 11:00:48 AM
to py...@googlegroups.com

Hey Ebbe,


what should work is:


p = n.model.variables["StorageUnit-p_dispatch"].sel(StorageUnit='hydro')

p.groupby("snapshot.month").sum()
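As a plain-pandas analogue (illustrative dummy data, not the Linopy API), the effect of grouping hourly values by calendar month and summing looks like this:

```python
import pandas as pd

# Hourly "dispatch" values for one year (dummy data: 1.0 per hour).
snapshots = pd.date_range("2015-01-01", "2015-12-31 23:00", freq="h")
dispatch = pd.Series(1.0, index=snapshots)

# Grouping by calendar month mirrors groupby("snapshot.month").sum():
monthly = dispatch.groupby(dispatch.index.month).sum()

print(monthly.loc[1])  # January has 31 * 24 = 744 hourly entries
```

In Linopy, the same grouping is applied to the variable itself, yielding twelve linear expressions (one per month) instead of twelve numbers.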


Could you give that a try?


Best

Fabian Hofmann

-- 
Fabian Hofmann 

Postdoctoral Researcher
Institute of Energy Technology
Technische Universität Berlin
http://fabianhofmann.org/

Group website: https://tub-ensys.github.io/

Ebbe Kyhl Gøtske

Apr 20, 2023, 3:20:08 AM
to pypsa
Hi Fabian,

It works very well. When applying `.groupby()`, the argument needs to be an xarray DataArray with a datetime index, so I added the following, and now it works.

p = n.model.variables["StorageUnit-p_dispatch"].sel(StorageUnit='hydro')

ds_months = pd.Series(n.snapshots.month, index=n.snapshots).to_xarray()

ds_months['snapshot'] = pd.DatetimeIndex(ds_months['snapshot'].values)

p.groupby(ds_months).sum()

Thanks a lot for your suggestion! :)

Best regards,
Ebbe

Fabian Hofmann

Apr 20, 2023, 3:45:29 AM
to py...@googlegroups.com

Hey Ebbe,


great that it works. Just out of curiosity, why not do it directly with the shorter `p.groupby("snapshot.month").sum()`? It yields the same result, or am I missing something?

Best

Fabian

Ebbe Kyhl Gøtske

Apr 20, 2023, 4:17:34 AM
to pypsa
My mistake. For some reason, the snapshot coordinate was not of datetime64 dtype in the Linopy model that I had initiated. I tried your command in another setup, and it works fine.



In that case, to implement a month-dependent constraint (perhaps others will find it useful):

Define left-hand side:

p = n.model.variables["StorageUnit-p_dispatch"].sel(StorageUnit='hydro')
lhs = p.groupby(ds_months).sum()

Define right-hand side:

snapshot = np.arange(1, 13)
limit = np.arange(1, 13)
rhs = xr.DataArray(
    data=limit,
    dims=["snapshot"],
    coords=dict(snapshot=snapshot),
    attrs=dict(description="monthly_limit", units=""),
)

Add constraint:

n.model.add_constraints(lhs <= rhs, name="hydro monthly upper bound")

Output:

Constraint `hydro monthly upper bound` (snapshot: 8772) - 8760 masked entries
-------------------------------------------------------------------------------
[1]: 1.0 StorageUnit-p_dispatch[2015-01-01 00:00:00+00:00, hydro] + 1.0 StorageUnit-p_dispatch[2015-01-01 01:00:00+00:00, hydro] + 1.0 StorageUnit-p_dispatch[2015-01-01 02:00:00+00:00, hydro] ... 1.0 StorageUnit-p_dispatch[2015-01-31 21:00:00+00:00, hydro] + 1.0 StorageUnit-p_dispatch[2015-01-31 22:00:00+00:00, hydro] + 1.0 StorageUnit-p_dispatch[2015-01-31 23:00:00+00:00, hydro] <= 1.0
[2]: 1.0 StorageUnit-p_dispatch[2015-02-01 00:00:00+00:00, hydro] + 1.0 StorageUnit-p_dispatch[2015-02-01 01:00:00+00:00, hydro] + 1.0 StorageUnit-p_dispatch[2015-02-01 02:00:00+00:00, hydro] ... 1.0 StorageUnit-p_dispatch[2015-02-28 21:00:00+00:00, hydro] + 1.0 StorageUnit-p_dispatch[2015-02-28 22:00:00+00:00, hydro] + 1.0 StorageUnit-p_dispatch[2015-02-28 23:00:00+00:00, hydro] <= 2.0
[3]: 1.0 StorageUnit-p_dispatch[2015-03-01 00:00:00+00:00, hydro] + 1.0 StorageUnit-p_dispatch[2015-03-01 01:00:00+00:00, hydro] + 1.0 StorageUnit-p_dispatch[2015-03-01 02:00:00+00:00, hydro] ... 1.0 StorageUnit-p_dispatch[2015-03-31 21:00:00+00:00, hydro] + 1.0 StorageUnit-p_dispatch[2015-03-31 22:00:00+00:00, hydro] + 1.0 StorageUnit-p_dispatch[2015-03-31 23:00:00+00:00, hydro] <= 3.0
[4]: 1.0 StorageUnit-p_dispatch[2015-04-01 00:00:00+00:00, hydro] + 1.0 StorageUnit-p_dispatch[2015-04-01 01:00:00+00:00, hydro] + 1.0 StorageUnit-p_dispatch[2015-04-01 02:00:00+00:00, hydro] ... 1.0 StorageUnit-p_dispatch[2015-04-30 21:00:00+00:00, hydro] + 1.0 StorageUnit-p_dispatch[2015-04-30 22:00:00+00:00, hydro] + 1.0 StorageUnit-p_dispatch[2015-04-30 23:00:00+00:00, hydro] <= 4.0
[5]: 1.0 StorageUnit-p_dispatch[2015-05-01 00:00:00+00:00, hydro] + 1.0 StorageUnit-p_dispatch[2015-05-01 01:00:00+00:00, hydro] + 1.0 StorageUnit-p_dispatch[2015-05-01 02:00:00+00:00, hydro] ... 1.0 StorageUnit-p_dispatch[2015-05-31 21:00:00+00:00, hydro] + 1.0 StorageUnit-p_dispatch[2015-05-31 22:00:00+00:00, hydro] + 1.0 StorageUnit-p_dispatch[2015-05-31 23:00:00+00:00, hydro] <= 5.0
[6]: 1.0 StorageUnit-p_dispatch[2015-06-01 00:00:00+00:00, hydro] + 1.0 StorageUnit-p_dispatch[2015-06-01 01:00:00+00:00, hydro] + 1.0 StorageUnit-p_dispatch[2015-06-01 02:00:00+00:00, hydro] ... 1.0 StorageUnit-p_dispatch[2015-06-30 21:00:00+00:00, hydro] + 1.0 StorageUnit-p_dispatch[2015-06-30 22:00:00+00:00, hydro] + 1.0 StorageUnit-p_dispatch[2015-06-30 23:00:00+00:00, hydro] <= 6.0
[7]: 1.0 StorageUnit-p_dispatch[2015-07-01 00:00:00+00:00, hydro] + 1.0 StorageUnit-p_dispatch[2015-07-01 01:00:00+00:00, hydro] + 1.0 StorageUnit-p_dispatch[2015-07-01 02:00:00+00:00, hydro] ... 1.0 StorageUnit-p_dispatch[2015-07-31 21:00:00+00:00, hydro] + 1.0 StorageUnit-p_dispatch[2015-07-31 22:00:00+00:00, hydro] + 1.0 StorageUnit-p_dispatch[2015-07-31 23:00:00+00:00, hydro] <= 7.0
[8]: 1.0 StorageUnit-p_dispatch[2015-08-01 00:00:00+00:00, hydro] + 1.0 StorageUnit-p_dispatch[2015-08-01 01:00:00+00:00, hydro] + 1.0 StorageUnit-p_dispatch[2015-08-01 02:00:00+00:00, hydro] ... 1.0 StorageUnit-p_dispatch[2015-08-31 21:00:00+00:00, hydro] + 1.0 StorageUnit-p_dispatch[2015-08-31 22:00:00+00:00, hydro] + 1.0 StorageUnit-p_dispatch[2015-08-31 23:00:00+00:00, hydro] <= 8.0
[9]: 1.0 StorageUnit-p_dispatch[2015-09-01 00:00:00+00:00, hydro] + 1.0 StorageUnit-p_dispatch[2015-09-01 01:00:00+00:00, hydro] + 1.0 StorageUnit-p_dispatch[2015-09-01 02:00:00+00:00, hydro] ... 1.0 StorageUnit-p_dispatch[2015-09-30 21:00:00+00:00, hydro] + 1.0 StorageUnit-p_dispatch[2015-09-30 22:00:00+00:00, hydro] + 1.0 StorageUnit-p_dispatch[2015-09-30 23:00:00+00:00, hydro] <= 9.0
[10]: 1.0 StorageUnit-p_dispatch[2015-10-01 00:00:00+00:00, hydro] + 1.0 StorageUnit-p_dispatch[2015-10-01 01:00:00+00:00, hydro] + 1.0 StorageUnit-p_dispatch[2015-10-01 02:00:00+00:00, hydro] ... 1.0 StorageUnit-p_dispatch[2015-10-31 21:00:00+00:00, hydro] + 1.0 StorageUnit-p_dispatch[2015-10-31 22:00:00+00:00, hydro] + 1.0 StorageUnit-p_dispatch[2015-10-31 23:00:00+00:00, hydro] <= 10.0
[11]: 1.0 StorageUnit-p_dispatch[2015-11-01 00:00:00+00:00, hydro] + 1.0 StorageUnit-p_dispatch[2015-11-01 01:00:00+00:00, hydro] + 1.0 StorageUnit-p_dispatch[2015-11-01 02:00:00+00:00, hydro] ... 1.0 StorageUnit-p_dispatch[2015-11-30 21:00:00+00:00, hydro] + 1.0 StorageUnit-p_dispatch[2015-11-30 22:00:00+00:00, hydro] + 1.0 StorageUnit-p_dispatch[2015-11-30 23:00:00+00:00, hydro] <= 11.0
[12]: 1.0 StorageUnit-p_dispatch[2015-12-01 00:00:00+00:00, hydro] + 1.0 StorageUnit-p_dispatch[2015-12-01 01:00:00+00:00, hydro] + 1.0 StorageUnit-p_dispatch[2015-12-01 02:00:00+00:00, hydro] ... 1.0 StorageUnit-p_dispatch[2015-12-31 21:00:00+00:00, hydro] + 1.0 StorageUnit-p_dispatch[2015-12-31 22:00:00+00:00, hydro] + 1.0 StorageUnit-p_dispatch[2015-12-31 23:00:00+00:00, hydro] <= 12.0

Fabian Hofmann

Apr 20, 2023, 6:59:32 AM
to py...@googlegroups.com

Nice. It would also be useful to rename the constraint's dimension "snapshot" to "month", since right now the constraint is re-indexed to the snapshot dimension when it is added to the model (indicated by the first printed line). This renaming is already done automatically by

lhs = p.groupby("snapshot.month").sum()


And to make the rhs more compact:

rhs = xr.DataArray(np.arange(1, 13), [lhs.indexes['month']])
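Note that the bounds array needs exactly one value per calendar month (twelve entries, aligned to the `month` coordinate). A minimal pandas sketch of such an aligned bounds array, using the illustrative limits 1..12 from the example output above:

```python
import numpy as np
import pandas as pd

# One upper bound per calendar month; the index is assumed to match
# lhs.indexes['month'] in the Linopy model (values 1..12 are placeholders
# for real historical monthly limits).
months = pd.Index(range(1, 13), name="month")
limit = pd.Series(np.arange(1, 13), index=months)

print(len(limit))  # 12 bounds, one per month
```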

Ebbe Kyhl Gøtske

unread,
Apr 21, 2023, 4:30:42 AM4/21/23
to pypsa
Even better!

Thanks once again.

Best,
Ebbe

Rizky Rahmani

May 19, 2023, 6:02:56 AM
to pypsa
Hi Fabian,

Is there any way to do this with a MultiIndex? I run a multi-year investment optimization, so my snapshots are a MultiIndex.

I tried this code, but it raises an error.

Miguel Ángel Martinez Pérez

May 31, 2023, 9:37:26 AM
to pypsa
Hi Fabian,

this was really useful, but I actually have the same question as Rizky. How should the `groupby()` be written in the case of a MultiIndex in the snapshots?
snapshots - MultiIndex
  • period (int64): 2025, 2030, 2035, 2040, 2045, 2050
  • timestep (datetime64): hourly resolution

I would like to aggregate by period.
Thank you so much in advance

Miguel

Fabian Hofmann

Jun 6, 2023, 4:59:46 AM
to py...@googlegroups.com

Hey Miguel,


unfortunately, I cannot come up with a good solution, as `groupby` with a MultiIndex is broken in the current xarray version. But the issue was solved in a recent PR, see https://github.com/pydata/xarray/issues/6836#event-9439829468. So you can expect the `groupby` function in linopy to work as expected with the next xarray release (alternatively, you can install xarray from the master branch directly). Then, the following operation should work:


expr.groupby("timestep").sum()
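In the meantime, the per-period aggregation Miguel asked about can be sketched on the pandas side (dummy data; `period`/`timestep` mirror the (period, timestep) MultiIndex of multi-year snapshots, and the grouping here is a plain pandas operation, not the Linopy API):

```python
import pandas as pd

# A toy multi-year snapshot index, as used in multi-period investment runs.
snapshots = pd.MultiIndex.from_product(
    [[2025, 2030], pd.date_range("2015-01-01", periods=4, freq="h")],
    names=["period", "timestep"],
)
dispatch = pd.Series(1.0, index=snapshots)

# Aggregating over each investment period via the "period" index level:
per_period = dispatch.groupby(level="period").sum()

print(per_period.loc[2025])  # 4 hourly entries per period
```

Once the upstream xarray fix lands, the equivalent grouping should work directly on the Linopy expression.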


Please let me know if you still encounter problems.


Best wishes

Fabian
