
Research paper "Energy Efficiency across Programming Languages: How does energy, time, and memory relate?"


bream...@gmail.com

Sep 16, 2017, 7:04:28 PM
I thought some might find this https://sites.google.com/view/energy-efficiency-languages/ interesting.

--
Kindest regards.

Mark Lawrence.

Steve D'Aprano

Sep 16, 2017, 10:01:41 PM
On Sun, 17 Sep 2017 09:04 am, bream...@gmail.com wrote:

> I thought some might find this
> https://sites.google.com/view/energy-efficiency-languages/ interesting.

"Made with the new Google Sites, an effortless way to create beautiful sites."

More like an effortless way to create a complete dog's breakfast. Once upon a
time, web sites would degrade gracefully. If something interrupted the page
loading, or something wasn't quite right, you'd still get something usable.
Now, if the tiniest thing goes wrong, you get junk.

I've tried to see the results, but I just get a bunch of broken images :-(


On the linked page, starting from the top and scrolling down, I see:

- about two screens' worth of blank white space;

- followed by three giant black horizontal bars, each one about an inch high;

- more white space;

- what looks like something that was intended to be a side-bar, containing:

SLE'17
Home
Results
Setup
More

- a giant down-pointing arrowhead, about three inches tall, which turns
grey when you mouse-over it but doesn't do anything when clicked;

- three more links:

Home
Results
Setup

which disappear when you mouse-over them;

- finally some content!

The tools and graphical data pointed by this page are included in the
research paper "Energy Efficiency across Programming Languages: How does
Energy, Time and Memory Relate?", accepted at the International Conference
on Software Language Engineering (SLE)

[1] Measuring Framework & Benchmarks
[2] Complete Set of Results
[3] Setup
[4] Paper


where the last four bits are links;

- the smug, self-congratulatory comment quoted above about "beautiful sites";

- a button "Create a site"

- What was presumably intended to be a link, but is actually just a piece of
plain text: "Report abuse";

- more whitespace;

- and finally a giant blue "i", pointed at the bottom, and slanted at 45
degrees. Presumably a logo for somebody or something.


And yes, I am allowing scripts from Google and Gstatic to run, and the page is
still broken.


Including the hyperlinks, that's about 700 bytes of actual content. Let's double
it for the overhead of HTML over plain text, so somewhat less than 1.5 KiB of
content.

The actual page is 27285 bytes or over 26 KiB. That gives us something with a
useful content to bloat factor of 1:17, and *the page still doesn't work.*
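Spelled out, the arithmetic is easy to check (the byte counts are the ones
quoted above; 1.5 KiB is the generous estimate for the useful content):

```python
content = 1536   # bytes: ~1.5 KiB, the generous estimate of useful content
page = 27285     # bytes: the HTML actually served for the page

print(page / 1024)      # about 26.6 -- "over 26 KiB"
print(page // content)  # 17 -- a content-to-bloat ratio of roughly 1:17
```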

And that's not even counting any additional files the page requires, like CSS,
external javascript files, images, ads, web-bugs, etc. You want to know why
browsing the web today on full ADSL or faster speeds is *slower* than using a
dial up modem in the 1990s? This is why.

www.antipope.org/charlie/blog-static/2008/05/why_your_internet_experience_i.html

Nine years later, and the problem is worse, not better.




--
Steve
“Cheer up,” they said, “things could be worse.” So I cheered up, and sure
enough, things got worse.

Ian Kelly

Sep 17, 2017, 12:23:33 AM
It looks fine to me.


> Including the hyperlinks, that's about 700 bytes of actual content. Let's double
> it for the overhead of HTML over plain text, so somewhat less than 1.5 KiB of
> content.
>
> The actual page is 27285 bytes or over 26 KiB. That gives us something with a
> useful content to bloat factor of 1:17, and *the page still doesn't work.*
>
> And that's not even counting any additional files the page requires, like CSS,
> external javascript files, images, ads, web-bugs, etc. You want to know why
> browsing the web today on full ADSL or faster speeds is *slower* than using a
> dial up modem in the 1990s? This is why.
>
> www.antipope.org/charlie/blog-static/2008/05/why_your_internet_experience_i.html
>
> Nine years later, and the problem is worse, not better.

If you're using a cell phone over 2G, then I tentatively agree. But on
my laptop over WiFi, this page that you're complaining about loaded in
783 ms when I tried it.

Terry Reedy

Sep 17, 2017, 2:01:03 AM
On 9/16/2017 7:04 PM, bream...@gmail.com wrote:
> I thought some might find this https://sites.google.com/view/energy-efficiency-languages/ interesting.

By 'energy', they only mean electricity, not food calories. This is the
email I sent to the authors.
-----------

As a two-decade user of Python, I was interested to read your paper.
Unfortunately, it is deeply flawed with respect to Python in the sense
that your conclusions are inapplicable to real-world usage of Python.

The problem is your use of the Computer Language Benchmark Game. As the
name says, it is a *game*. As a game, it has artificial rules dictated
by the game masters. It uses toy problems, and for Python, the rules
dictate unrealistic toy solutions. In particular, it appears to
disallow use of 'import' with 3rd-party modules, whereas real-world
Python is expected to use them, and nearly always does.

The particular crippler for CLBG problems is the non-use of numpy in
numerical calculations, such as the n-body problem. Numerical python
extensions are over two decades old and give Python code access to
optimized, compiled BLAS, LinPack, FFTPack, and so on. The current one,
numpy, is the third of the series. It is both a historical accident and
a continuing administrative convenience that numpy is not part of the
Python stdlib. However, it is easily installed from the PSF-maintained
repository (python -m pip install numpy), and is included with most
third-party distributions of Python.

The numerical extensions have been quasi-official in the sense that at
least 3 language enhancements have been made for their use. Nearly all
real-world scientific, financial, and neural-network Python programs are
built on top of numpy. When a Python program spends 95% of its time in
the optimized compiled C routines, it is nearly as fast as a 100% C
solution. The reason people use Python instead of C for the other 5% is
to save human time.
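numpy itself aside, the effect described here -- time spent in compiled C
loops versus the interpreter -- can be sketched with nothing but the stdlib,
using the C-implemented builtin sum() as a stand-in for an optimized routine:

```python
import timeit

data = list(range(100_000))

def interpreted_sum(xs):
    # every iteration executes Python bytecode in the interpreter
    total = 0
    for x in xs:
        total += x
    return total

# builtin sum() runs its loop in C, much as a numpy routine would
assert interpreted_sum(data) == sum(data) == 4_999_950_000

t_py = timeit.timeit(lambda: interpreted_sum(data), number=20)
t_c = timeit.timeit(lambda: sum(data), number=20)
print(f"interpreter: {t_py:.3f}s  C loop: {t_c:.3f}s")
```

On a typical CPython build the C loop wins by roughly an order of magnitude,
which is the gap the benchmark rules bake in by forbidding such libraries.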

Even when it comes to executing the pure Python solutions, the CLBG rules
apparently require the slowest execution possible. Execution would be at
least 2 to 5 times faster if compilation to machine code were allowed,
either before or during execution. But the point of the game is to
provide a 'level' playing field for competition between Python
programmers, even if the cost is to cripple comparison with other
language solutions.

Terry Jan Reedy




--
Terry Jan Reedy

Chris Angelico

Sep 17, 2017, 2:04:51 AM
On Sun, Sep 17, 2017 at 4:00 PM, Terry Reedy <tjr...@udel.edu> wrote:
> The numerical extensions have been quasi-official in the sense that at least
> 3 language enhancements have been made for their use.

I know about the matrix multiplication operator. What are the other
two (or more)?

ChrisA

Terry Reedy

Sep 17, 2017, 2:17:08 AM
Stride slicing, which later became valid in regular code, and Ellipsis.
(I could be wrong on the latter.)

--
Terry Jan Reedy

Steve D'Aprano

Sep 17, 2017, 4:15:01 AM
Nope, both are correct.

Slice strides were first supported in Python 1.4, but used exclusively by
Numerical Python (numpy's ancient predecessor), and didn't finally get
supported by Python builtins until as late as version 2.3!

https://docs.python.org/2.3/whatsnew/section-slices.html

Ellipsis was used for multi-dimensional slicing, and was added for Numerical
Python. It wasn't until recently (Python 3.4 perhaps?) that it finally became
legal to write '...' instead of 'Ellipsis' outside of slice notation.

Here's Peter Otten talking about it way back in 2004:

http://grokbase.com/t/python/python-list/042jd5y60n/ellipsis-usage
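Both features are easy to demonstrate in current Python; an object's
__getitem__ simply receives a slice object (with a step) or a tuple
containing Ellipsis:

```python
# Stride slicing: seq[start:stop:step], a builtin feature since Python 2.3
letters = list("abcdefgh")
assert letters[::2] == ['a', 'c', 'e', 'g']
assert letters[::-1] == list("hgfedcba")

# Ellipsis is a singleton object; the literal `...` is now legal anywhere
assert ... is Ellipsis

class Echo:
    """Echoes the subscript, to show what multi-dimensional slicing produces."""
    def __getitem__(self, key):
        return key

e = Echo()
assert e[1:10:2] == slice(1, 10, 2)
assert e[..., 0] == (Ellipsis, 0)   # numpy-style arr[..., 0]
```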

John Ladasky

Sep 18, 2017, 5:21:55 PM
On Saturday, September 16, 2017 at 11:01:03 PM UTC-7, Terry Reedy wrote:
> On 9/16/2017 7:04 PM, b...@g...com wrote:
<snip>
> The particular crippler for CLBG problems is the non-use of numpy in
> numerical calculations, such as the n-body problem. Numerical python
> extensions are over two decades old and give Python code access to
> optimized, compiled BLAS, LinPack, FFTPack, and so on. The current one,
> numpy, is the third of the series. It is both a historical accident and
> a continuing administrative convenience that numpy is not part of the
> Python stdlib.

OK, I found this statement intriguing. Honestly, I can't function without Numpy, but I have always assumed that many Python programmers get by without it. Meanwhile: most of the time, I have no use for urllib, but that module is in the standard library.

I noticed the adoption of the @ operator for matrix multiplication. I have yet to use it myself.
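For anyone who hasn't tried it yet: @ dispatches to __matmul__, so any class
can opt in. A toy dot product (illustrative only, not numpy's API):

```python
class Vec:
    def __init__(self, *xs):
        self.xs = xs

    def __matmul__(self, other):
        # a @ b -> dot product, purely for illustration
        return sum(a * b for a, b in zip(self.xs, other.xs))

assert Vec(1, 2, 3) @ Vec(4, 5, 6) == 1*4 + 2*5 + 3*6 == 32
```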

So is there a fraction of the Python community that thinks that Numpy should in fact become part of the Python stdlib? What is the "administrative convenience" to which you refer?

bream...@gmail.com

Sep 18, 2017, 6:08:48 PM
My very opinionated personal view is that many third-party libraries are much better off outside of the stdlib, numpy particularly so, as it's one of the most used, if not the most used, of such libraries.

My rationale is simple: the authors of the libraries are not tied into the (c)Python release cycle, the PEP process or anything else; they can just get on with it.

Consider my approach many blue moons ago when I was asking when the "new" regex module was going to be incorporated into Python, and getting a bit miffed in my normal XXXL size hat autistic way when it didn't happen. I am now convinced that back then I was very firmly wrong, and that staying out of the stdlib has been the best thing that could have happened to regex. No doubt MRAB will disagree :)

Terry Reedy

Sep 18, 2017, 6:13:20 PM
On 9/18/2017 5:21 PM, John Ladasky wrote:
> On Saturday, September 16, 2017 at 11:01:03 PM UTC-7, Terry Reedy wrote:
>> On 9/16/2017 7:04 PM, b...@g...com wrote:
> <snip>
>> The particular crippler for CLBG problems is the non-use of numpy in
>> numerical calculations, such as the n-body problem. Numerical python
>> extensions are over two decades old and give Python code access to
>> optimized, compiled BLAS, LinPack, FFTPack, and so on. The current one,
>> numpy, is the third of the series. It is both a historical accident and
>> a continuing administrative convenience that numpy is not part of the
>> Python stdlib.
>
> OK, I found this statement intriguing. Honestly, I can't function without Numpy, but I have always assumed that many Python programmers do so.

True. Very few websites need numpy. Ditto for some other categories.
> Meanwhile: most of the time, I have no use for urllib, but that
> module is in the standard library.

Urllib is comparable in scope to, say, BLAS, or even to the math +
statistics modules. And it is less dependent on particular systems.
Django and associated modules, which are comparable, in their own way, to
numpy + scipy, are also not in the stdlib, and should not be.

> I noticed the adoption of the @ operation for matrix multiplication. I have yet to use it myself.
>
> So is there a fraction of the Python community that thinks that Numpy should in fact become part of the Python stdlib?

Only naive beginners, I should think.

> What is the "administrative convenience" to which you refer?

Numerical analysis is quite different from compiler construction and
communication protocols. So numpy needs its own set of developers,
policies, decision process, release schedule, etc. It took an expert in
the field to persuade the Numeric and numarray people to join together
to produce numpy. PSF provides the infrastructure that makes 'pip
install numpy' or 'pip install django' possible. Separate groups
provide the content.
Does this make sense?

--
Terry Jan Reedy

Ethan Furman

Sep 18, 2017, 6:26:37 PM
On 09/17/2017 01:14 AM, Steve D'Aprano wrote:
> On Sun, 17 Sep 2017 04:16 pm, Terry Reedy wrote:
>
>> On 9/17/2017 2:04 AM, Chris Angelico wrote:
>>> On Sun, Sep 17, 2017 at 4:00 PM, Terry Reedy <tjr...@udel.edu> wrote:
>>>> The numerical extensions have been quasi-official in the sense that at least
>>>> 3 language enhancements have been made for their use.
>>>
>>> I know about the matrix multiplication operator. What are the other
>>> two (or more)?
>>
>> Stride slicing, which later became valid in regular code, and Ellipsis.
>> (I could be wrong on the latter.)
>
>
> Nope, both are correct.

Weren't the rich comparison operators* also motivated by the numerical/scientific community?

--
~Ethan~

* __le__, __gt__, etc.
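For context, the pre-rich-comparison protocol (__cmp__) had to return a
single -1/0/1, which cannot express an elementwise answer; rich comparison
methods may return any object, which is exactly what numpy arrays exploit.
A toy sketch of the idea:

```python
class Elementwise:
    """Rich comparisons may return any object -- numpy returns arrays."""
    def __init__(self, data):
        self.data = data

    def __lt__(self, other):
        # elementwise comparison against a scalar, returning a list
        return [x < other for x in self.data]

e = Elementwise([1, 5, 3])
assert (e < 4) == [True, False, True]
```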

MRAB

Sep 18, 2017, 8:13:58 PM
I was, at one time, in favour of including it in the stdlib, but then I
changed my mind. Being outside the stdlib _does_ give me more
flexibility. I can, as you said, just get on with it.
I even have it on a Raspberry Pi. "pip install regex" is all it took. No
need for it to be in the stdlib. :-)

Rick Johnson

Sep 18, 2017, 8:23:29 PM
On Saturday, September 16, 2017 at 9:01:41 PM UTC-5, Steve D'Aprano wrote:
> [...]
> - a giant down-pointing arrowhead, about three inches tall,
> which turns grey when you mouse-over it but doesn't do
> anything when clicked;

Oh, it does something, just not an _obvious_ something. LOL.
I tried to warn you about the dangers of implicitness, but
alas, you had to learn the hard way! Steven, you should know
better than to click on giant arrow buttons. At this moment,
some Russian hacker is sifting through your browser history.
Might be a good idea to cancel your credit cards...

>[...]
> And that's not even counting any additional files the page
> requires, like CSS, external javascript files, images, ads,
> web-bugs, etc. You want to know why browsing the web today
> on full ADSL or faster speeds is *slower* than using a dial
> up modem in the 1990s? This is why.
>
> www.antipope.org/charlie/blog-static/2008/05/why_your_internet_experience_i.html
>
> Nine years later, and the problem is worse, not better.

I was wondering what could compel you to go off on such a
ridiculous rant over a simple slow-loading page, but now I'm
starting to think this is all a giant charade so you could
direct traffic to some random blog.

John Ladasky

Sep 19, 2017, 2:55:20 AM
On Monday, September 18, 2017 at 5:13:58 PM UTC-7, MRAB wrote:
> On 2017-09-18 23:08, b...@g...com wrote:

> > My rationale is simple, the authors of the libraries are not tied into the (c)Python release cycle, the PEP process or anything else, they can just get on with it.
> >
> > Consider my approach many blue moons ago when I was asking when the "new" regex module was going to be incorporated into Python, and getting a bit miffed in my normal XXXL size hat autistic way when it didn't happen. I am now convinced that back then I was very firmly wrong, and that staying out of the stdlib has been the best thing that could have happened to regex. No doubt MRAB will disagree :)
> >
> I was, at one time, in favour of including it in the stdlib, but then I
> changed my mind. Being outside the stdlib _does_ give me more
> flexibility. I can, as you said, just get on with it.
> I even have it on a Raspberry Pi. "pip install regex" is all it took. No
> need for it to be in the stdlib. :-)

Inadvertently, you have just pointed out a weakness of not including something important and great in the stdlib. There's an alternative to the re module, which at least a few members of the community consider to be superior, and which might therefore be widely used. But... until now, I'd never heard of it.

I have come to understand from your other posts that adding something to the stdlib imposes significant constraints on the release schedules of those modules. I can appreciate the hassle that might cause. Still, now I wonder what I might be missing.

Steven D'Aprano

Sep 19, 2017, 3:11:58 AM
On Tue, 19 Sep 2017 01:13:23 +0100, MRAB wrote:

> I even have it on a Raspberry Pi. "pip install regex" is all it took. No
> need for it to be in the stdlib. :-)

That's fine for those of us who can run pip and install software from the
web without being immediately fired, and for those who have installation
rights on the computers they use. And those with easy, cheap and fast
access to the internet.

Not everyone is so lucky.

There is a significant chunk of the Python community for whom "just pip
install it" is not easy, legal or even possible. For them, if it's not in
the standard library, it might as well not even exist.



--
Steven D'Aprano
“You are deluded if you think software engineers who can't write
operating systems or applications without security holes, can write
virtualization layers without security holes.” —Theo de Raadt

Stefan Behnel

Sep 19, 2017, 4:05:51 AM
John Ladasky wrote on 19.09.2017 at 08:54:
> I have come to understand from your other posts that adding something to
> the stdlib imposes significant constraints on the release schedules of
> those modules. I can appreciate the hassle that might cause. Still,
> now I wonder what I might be missing.

There are many packages on PyPI that reimplement functionality of the
stdlib in some "better" way, by their own definition of "better". Some are
faster, some are more feature-rich, some have a better API, some focus on
making specific special cases faster/easier/whatever.

The stdlib is there to provide a base level of functionality. That base
level tends to be much higher up than in most other programming languages,
but from the point of view of Python, it's still just a base level, however
comfortable it might be.

If you need specific features, more speed, can't live with a certain API or
feel that you are wasting too much developer time by doing something the
way you always did it, search PyPI for something "better" by your own
definition at a given time.

If you can live with what the stdlib provides, stick to it. Keeping foreign
dependencies low is also "better" in some cases.

Stefan

Tim Golden

Sep 19, 2017, 4:12:00 AM
On 19/09/2017 09:05, Stefan Behnel wrote:
> The stdlib is there to provide a base level of functionality. That base
> level tends to be much higher up than in most other programming languages,
> but from the point of view of Python, it's still just a base level, however
> comfortable it might be.
>
> If you need specific features, more speed, can't live with a certain API or
> feel that you are wasting too much developer time by doing something the
> way you always did it, search PyPI for something "better" by your own
> definition at a given time.
>
> If you can live with what the stdlib provides, stick to it. Keeping foreign
> dependencies low is also "better" in some cases.

Nice summary.

TJG

Stephan Houben

Sep 19, 2017, 2:30:57 PM
On 2017-09-19, Steven D'Aprano <steve+comp....@pearwood.info> wrote:

> There is a significant chunk of the Python community for whom "just pip
> install it" is not easy, legal or even possible. For them, if it's not in
> the standard library, it might as well not even exist.

But numpy *is* in the standard library, provided you download the
correct version of Python, namely the one from:

https://python-xy.github.io/

Stephan

leam hall

Sep 19, 2017, 2:40:32 PM
On Tue, Sep 19, 2017 at 2:37 PM, Stephan Houben <steph...@gmail.com.invalid> wrote:

> On 2017-09-19, Steven D'Aprano <steve+comp.lang.python@pearwood.info> wrote:
>
> > There is a significant chunk of the Python community for whom "just pip
> > install it" is not easy, legal or even possible. For them, if it's not in
> > the standard library, it might as well not even exist.
>
> But numpy *is* in the standard library, provided you download the
> correct version of Python, namely the one from:
>
> https://python-xy.github.io/
>
> Stephan
>
>
Many of us can't pip install; either it's in the OS-supplied vendor repo or
it doesn't go on the machines.

Leam

John Ladasky

Sep 19, 2017, 3:01:01 PM
On Tuesday, September 19, 2017 at 1:05:51 AM UTC-7, Stefan Behnel wrote:
> John Ladasky wrote on 19.09.2017 at 08:54:
> > I have come to understand from your other posts that adding something to
> > the stdlib imposes significant constraints on the release schedules of
> > those modules. I can appreciate the hassle that might cause. Still,
> > now I wonder what I might be missing.
>
> There are many packages on PyPI that reimplement functionality of the
> stdlib in some "better" way, by their own definition of "better". Some are
> faster, some are more feature-rich, some have a better API, some focus on
> making specific special cases faster/easier/whatever.

And of course I have found some other third-party packages: scipy, pandas, matplotlib, and PyQt5 are important for my work. I helped a student of mine get selenium running. In the case of PyQt, I found Tkinter unsatisfactory many years ago, and went looking for better choices. I used wxPython first, when I was working in Py2. When wxPython was slow to migrate to Py3, I went searching again.

> The stdlib is there to provide a base level of functionality. That base
> level tends to be much higher up than in most other programming languages,

Very much agreed.

> but from the point of view of Python, it's still just a base level, however
> comfortable it might be.
>
> If you need specific features, more speed, can't live with a certain API or
> feel that you are wasting too much developer time by doing something the
> way you always did it,

And that's the tricky part. Maybe there's a better way out there that is helping other programmers, but it isn't well-publicized. Say what you want about the hassle it imposes on package developers, but the best advertisement is having your package in the stdlib where everyone will see it.

> If you can live with what the stdlib provides, stick to it. Keeping foreign
> dependencies low is also "better" in some cases.

Also agreed. Thankfully, my code doesn't run on any machines more than two doors down from my office. And although it has to run on Windows, Linux, and Mac, I can get my hands on every machine and install the packages my users need.

alister

Sep 20, 2017, 8:58:57 AM
dnf install <package>
or
apt-get install <package>

most of the mainstream modules seem to be there (certainly numpy)



--
Kliban's First Law of Dining:
Never eat anything bigger than your head.

Paul Moore

Sep 20, 2017, 9:14:57 AM
On 20 September 2017 at 13:58, alister via Python-list wrote:
You're missing the point. A significant number of Python users work on
systems where:

1. They have no admin rights
2. Their corporate or other policies prohibit installing 3rd party
software without approval that is typically difficult or impossible to
get
3. Quite possibly the system has no network access outside of the local intranet
4. The system admins may not be able or willing to upgrade or
otherwise modify the system Python

Writing code that works only with stdlib modules is basically the only
option in such environments.
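One common coping pattern in such environments (a sketch only; the numpy
fallback here is illustrative, not a recommendation) is to use a third-party
package when it happens to be present and fall back to the stdlib otherwise:

```python
try:
    import numpy as _np  # may simply not exist on a locked-down box

    def mean(xs):
        return float(_np.mean(xs))
except ImportError:
    import statistics

    def mean(xs):
        return statistics.mean(xs)

# works either way, just faster on large inputs when numpy is available
assert mean([1, 2, 3, 4]) == 2.5
```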

Having said that, I don't advocate that everything be in the stdlib
because of this. A lot of things (such as numpy) belong as 3rd party
packages. But that doesn't mean that "get XYZ off PyPI" (or "install
XYZ alternative Python distribution/version") is a viable solution to
every problem.

Paul

alister

Sep 20, 2017, 12:11:38 PM
Not missing the point; you said previously "it's in the OS supplied vendor
repo or it doesn't go on the machines."

dnf/yum or apt-get install from the "vendor supplied repo"

I fully understand that even this may require various hoops to be jumped
through before it can happen



--
Minnie Mouse is a slow maze learner.

Chris Warrick

Sep 20, 2017, 12:25:21 PM
On 20 September 2017 at 17:16, Dennis Lee Bieber <wlf...@ix.netcom.com> wrote:
> On Tue, 19 Sep 2017 11:58:47 -0700 (PDT), John Ladasky
> <john_l...@sbcglobal.net> declaimed the following:
>
>>
>>And of course I have found some other third-party packages: scipy, pandas, matplotlib, and PyQt5 are important for my work. I helped a student of mine get selenium running. In the case of PyQt, I found TKinter unsatisfactory many years ago, and went looking for better choices. I used wxPython first, when I was working in Py2. When wxPython was slow to migrate to Py3, I went searching again.
>>
>
> And if wxPython had been part of the stdlib, it would have meant Python
> 3 would have been delayed years until wxPython had been ported -- or
> wxPython would have been pulled from the stdlib and something else put in
> its place...
>
> So no help to those migrating.

If wxPython had been part of the stdlib, there would be much more
manpower to port it to 3. Also, the project underwent a complete
rewrite, which dooms many projects to failure. Perhaps they wouldn’t
try the rewrite, or they would port the older codebase to Python 3 so
that it could be shipped. (They’re currently at Beta 2 of the
post-rewrite 4.0.0 version.)

--
Chris Warrick <https://chriswarrick.com/>
PGP: 5EAAEA16

Ethan Furman

Sep 20, 2017, 12:36:24 PM
On 09/20/2017 09:24 AM, Chris Warrick wrote:
> On 20 September 2017 at 17:16, Dennis Lee Bieber wrote:

>> And if wxPython had been part of the stdlib, it would have meant Python
>> 3 would have been delayed years until wxPython had been ported -- or
>> wxPython would have been pulled from the stdlib and something else put in
>> its place...
>>
>> So no help to those migrating.
>
> If wxPython had been part of the stdlib, there would be much more
> manpower to port it to 3.

How do you figure? The available manpower is what it took to get Python 3 itself out when it came out; adding another
project as large as wxPython would not magically make it so the same target dates were hit -- it's not like we have
core-devs sitting idly by waiting for something to do.

--
~Ethan~