Energy models in Julia: overview and call for cooperation


Tom Brown

May 15, 2018, 12:31:16 PM
to openmod list
Hi all,

The programming language Julia has often been discussed in openmod
circles, particularly because of its attractive optimisation framework JuMP:

https://github.com/JuliaOpt/JuMP.jl

JuMP has all the performance, elegance and readability of GAMS without
the closed licence and price tag for non-academic users.
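
To give a flavour: below is a tiny, purely illustrative single-period dispatch model (not taken from any of the models listed further down), assuming the open-source Clp solver and the JuMP 0.18 syntax current at the time of writing; the generator data are made up.

    # Illustrative single-period economic dispatch in JuMP (0.18 syntax),
    # using the open-source Clp solver; all numbers are invented.
    using JuMP, Clp

    gens     = ["coal", "gas"]
    marginal = Dict("coal" => 30.0, "gas" => 60.0)    # EUR/MWh
    capacity = Dict("coal" => 300.0, "gas" => 400.0)  # MW
    load     = 500.0                                  # MW

    m = Model(solver = ClpSolver())
    @variable(m, 0 <= g[i in gens] <= capacity[i])    # dispatch per generator
    @constraint(m, sum(g[i] for i in gens) == load)   # meet demand
    @objective(m, Min, sum(marginal[i] * g[i] for i in gens))

    solve(m)
    println("dispatch: ", Dict(i => getvalue(g[i]) for i in gens))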

Many groups have started using Julia, particularly in the power systems
world, but it seems to be slowly creeping into energy system modelling too.

Let's try to avoid the proliferation of models we've seen in Python. To
this end:

i) Please send an email if you have or know of an existing Julia model
(those known to me are listed below); I'll report back to the list.

ii) Let's see if we can cooperate to avoid multiple models.

Known to me are:

a) LANL's PowerModels.jl

https://github.com/lanl-ansi/PowerModels.jl

and their ecosystem of related models (including gas models):

https://github.com/lanl-ansi

This is mostly focused on single periods as far as I know (?).

b) PSR (closed)

https://juliacomputing.com/case-studies/psr.html

c) Our own partial implementation of PyPSA in Julia

https://github.com/PyPSA/PSA.jl

(not to be used yet, since it will be rewritten (and also renamed) over
the next few weeks, but initial experiments are very promising as
regards memory and CPU performance).

Best,

Tom


--
Karlsruhe Institute of Technology (KIT)
Institute for Automation and Applied Informatics (IAI)

Dr. Tom Brown
Energy System Modelling

Phone: +49 721 608 25737
Fax: +49 721 608 22602
Website: https://www.iai.kit.edu/
Personal website: https://nworbmot.org/

Visitor Address:
Office 309
Campus North Building 445
Hermann-von-Helmholtz-Platz 1
76344 Eggenstein-Leopoldshafen

Robbie Morrison

May 19, 2018, 11:34:41 AM
to openmod-i...@googlegroups.com
Hello Tom, all

I recently spoke to three German modelers and each had failed to
recognize the underlying message in your original posting.

So could you perhaps repeat yourself, but this time with a bit less
English reserve?  :)

Also, do you want to propose a rebel-a-thon on a subset-of-the-community
Julia rewrite for the Zürich openmod meeting?

cheers, Robbie
Robbie Morrison
Address: Schillerstrasse 85, 10627 Berlin, Germany
Phone: +49.30.612-87617

Oleg Lugovoy

May 20, 2018, 6:21:01 AM
to openmod initiative
Hi Tom, all,

FYI:
Julia is promising and fast-developing (which has both pros and cons). This is not about energy system models, but about IAMs:
1. I tried to convert DICE (from GAMS) to Julia using Ipopt - it was straightforward to do (a minimal sketch of the JuMP/Ipopt pattern follows below): https://github.com/olugovoy/climatedice/blob/master/Julia/DICE2016inJulia.jl
2. There is another IAMs-in-Julia initiative: https://github.com/anthofflab/Mimi.jl
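
(For anyone curious what the translation pattern looks like, here is a minimal nonlinear JuMP/Ipopt model - not DICE itself, just the Rosenbrock test function, in JuMP 0.18 syntax:)

    # Minimal nonlinear JuMP model solved with Ipopt (JuMP 0.18 syntax).
    # Not DICE - just the Rosenbrock function, to show the pattern.
    using JuMP, Ipopt

    m = Model(solver = IpoptSolver(print_level = 0))
    @variable(m, x >= 0, start = 1.0)
    @variable(m, y >= 0, start = 1.0)
    @NLobjective(m, Min, (1 - x)^2 + 100 * (y - x^2)^2)
    solve(m)
    println("x = ", getvalue(x), ", y = ", getvalue(y))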

We are also going to add a Julia version of our bottom-up model generator (energyRt). I will be glad to learn about your progress and what others are doing.

Best,
Oleg

Bart Wiegmans

May 20, 2018, 7:56:34 AM
to openmod initiative
Hi all,

From a programming-language nerd's perspective, my 2c:

- Julia combines an interactive, 'dynamically-typed' language with an LLVM backend. In short: it feels like Python, but compiles like C (or maybe Fortran).
- For high-performance computing, Julia relies heavily on LLVM auto-vectorization rather than on explicit human-written vector code (as R / numpy do). Whether or not that pays off, I couldn't tell you without benchmarking.
- There are some extra fun features thrown in for parallel processing, but I don't know much about that.

In other words, compared with Python/R a 'for' loop will be much faster in Julia, but I'm not so sure about e.g. vector addition or matrix multiplication, which are typically a bit more important.
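
To make that concrete, here is the kind of (very rough) comparison I mean - not a proper benchmark; the timings only mean anything with BenchmarkTools and a warm-up call to exclude compilation:

    # Rough illustration only: a hand-written loop vs library/BLAS calls.
    function sum_loop(x::Vector{Float64})
        s = 0.0
        @inbounds for xi in x      # plain loop, compiled to native code by LLVM
            s += xi
        end
        return s
    end

    x = rand(10^7)
    A = rand(500, 500); B = rand(500, 500)

    @time sum_loop(x)   # the loop (first call includes compilation)
    @time sum(x)        # library reduction
    @time A * B         # matrix multiply dispatches to BLAS, as numpy does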

Regards,
Bart


Oleg Lugovoy

May 20, 2018, 8:33:40 AM
to openmod initiative
Here are some benchmarks: https://modelingguru.nasa.gov/docs/DOC-2625
Julia looks pretty good...

OL

Tom Brown

May 21, 2018, 7:55:19 AM
to openmod-i...@googlegroups.com
Hi all,

Thanks for all the tips on JuMP! And thanks Robbie for re-prodding the
list about it :-). I've now set up a do-a-thon for Julia/JuMP at the
Zurich workshop:

https://forum.openmod-initiative.org/t/do-a-thon-using-julias-jump-for-energy-system-optimisation/838

It's wiki-editable, so please add any other models you know of; I haven't
yet added any of the models I was told about in private.

If anyone can add any experience of the performance of GAMS versus JuMP,
please post it to the forum. JuMP is clearly streets ahead of Pyomo in
both speed and memory consumption.

Ideas on what cooperating on a common code base might look like (e.g.
defining common components, behaviour, etc.) would also be welcome.

Best,

Tom



Frank Hellmann

May 21, 2018, 12:43:48 PM
to openmod-i...@googlegroups.com
I can chip in that we are developing a Julia-based library for power grid
dynamics. We intend to couple this loosely to PSA.jl where
possible/appropriate.

Our experience with it has been absolutely stellar. We are building on
DifferentialEquations.jl, which already outclasses everything else
out there for dynamics.
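
To give an idea of the workflow (a minimal sketch, not our actual library - the classic swing equation for a single machine against an infinite bus, with made-up parameters):

    # Minimal DifferentialEquations.jl sketch: swing equation, invented parameters.
    using DifferentialEquations

    P, K, D = 1.0, 8.0, 0.1            # mechanical power, coupling, damping (p.u.)

    function swing!(du, u, p, t)
        θ, ω = u
        du[1] = ω
        du[2] = P - D*ω - K*sin(θ)
    end

    u0   = [0.1, 0.0]                  # initial angle and frequency deviation
    prob = ODEProblem(swing!, u0, (0.0, 20.0))
    sol  = solve(prob, Tsit5())        # explicit Runge-Kutta solver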

I think the comment on auto-vectorization versus vectorized code misses
the point a bit. I have worked extensively with Python numpy/numba code,
and it's all fine as long as your code does things that fit within the
numpy framework. The issue is that as soon as your code doesn't, you're
either writing plain C in Python syntax (numba) or you're back to being
very slow. So you end up constraining the behaviour of the models you
code to fit the framework. We have definitely had to make modelling
decisions based on the limitations of Python rather than on the subject
matter.

In order to achieve the required speed in Python, we ended up
metaprogramming strings together and then compiling the resulting
functions with numba. That led to slow start-up times, was brittle and
was just all-around annoying. We ended up not really publishing, or even
using, our library much.

Misc stuff:
+ Julia has a proper type system, making more complex development much nicer.
+ We can benefit from automatic differentiation techniques.
+ Our code is considerably simpler and cleaner than in Python.
+ Performance is great, even though we haven't invested anything in optimizing it yet.
+ The library ecosystem is rapidly developing best-in-class libraries [1].
- The tooling isn't quite there yet.
- Types + multiple dispatch confuse people who are used to objects + classes (see the tiny sketch below).
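
To illustrate that last point, a tiny hypothetical sketch (the type names are invented, not from our library):

    # Types + multiple dispatch: one generic call site, one method per concrete type.
    abstract type Component end

    struct Line  <: Component; x::Float64; end               # series reactance
    struct Trafo <: Component; x::Float64; tap::Float64; end # reactance + tap ratio

    admittance(c::Line)  = 1 / c.x
    admittance(c::Trafo) = 1 / (c.x * c.tap^2)

    # dispatch picks the right method based on the concrete type of each element
    total = sum(admittance, [Line(0.1), Trafo(0.2, 1.05)])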

Best,
Frank

[1] The DiffEq author's take on why that is:
http://www.stochasticlifestyle.com/like-julia-scales-productive-insights-julia-developer/