Tracking memory usage and time


J

Aug 24, 2019, 5:56:13 AM
to sage-s...@googlegroups.com, J
Hi,

to get an overview of a rather diverse set of `SAGE` methods, I would
like to track not only the time it takes to run a command, but also its
memory usage.

Is there a recommended way to do this?

Context:
I want to run the `nearestneighbordecoder`, the `syndromedecoder` and
the `informationsetdecoder` for errors from 1 to `X` (where X is
probably << d/2, as the syndromedecoder crashes for bigger values / is
killed by the OOM killer, as far as I know) and for a set of linear codes.

Greets J

Simon King

Sep 2, 2019, 6:35:28 AM
to sage-s...@googlegroups.com
Hi J,

On 2019-08-24, J <mailinglists...@927589452.de> wrote:
> to get an overview of a rather diverse set of `SAGE` methods, I would
> like to track not only the time it takes to run a command, but also its
> memory usage.
>
> Is there a recommended way to do this?

I am a bit surprised that nobody answered this question yet. Sorry.

There is the get_memory_usage command, which might provide what you asked
for. But I am not sure whether I understand correctly what you want to
achieve: Do you have a lengthy program and want to understand how
many resources each individual command in your program takes? In that
case, it might make sense to use a profiler (e.g., %prun or %crun). Or do
you only want to know how much time and memory the program takes in total?
In that case, %time and get_memory_usage would probably give you the
answer.
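For the "in total" case, a plain-Python sketch (not Sage-specific; the workload below is a stand-in for whatever decoder call you want to benchmark) can combine time.perf_counter with the standard library's tracemalloc:

```python
# Hedged sketch: measure total wall time and peak traced allocation for
# one function call. `measure` and the lambda workload are illustrative
# names, not part of Sage's API.
import time
import tracemalloc

def measure(f, *args):
    """Return (result, seconds, peak_bytes) for a single call of f."""
    tracemalloc.start()
    t0 = time.perf_counter()
    result = f(*args)
    elapsed = time.perf_counter() - t0
    _, peak = tracemalloc.get_traced_memory()  # (current, peak) in bytes
    tracemalloc.stop()
    return result, elapsed, peak

if __name__ == "__main__":
    # stand-in workload: sum a largish range
    result, seconds, peak = measure(lambda n: sum(range(n)), 10**6)
    print(f"took {seconds:.3f}s, peak {peak / 1024:.0f} KiB")
```

Note that tracemalloc only sees Python-level allocations; memory claimed by C extensions (much of Sage) will not show up, which is one reason get_memory_usage, which looks at the whole process, can be the more honest number.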

Best regards,
Simon

J

Sep 2, 2019, 7:24:46 AM
to sage-s...@googlegroups.com, J
Thanks, get_memory_usage sounds good; I want to run several decoders from
the coding theory module to better show their ups and downs.

It is expected that the syndrome decoder is quicker but more memory
hungry, while the nearest neighbor decoder is more time expensive.

Jori Mäntysalo (TAU)

Sep 3, 2019, 5:43:27 AM
to sage-s...@googlegroups.com
On Mon, 2 Sep 2019, J wrote:

> Thanks, get_memory_usage sounds good; I want to run several decoders from
> the coding theory module to better show their ups and downs.

There is also at least %mprun magic. Googling that will give you some
examples.

--
Jori Mäntysalo

Tampereen yliopisto - Ihminen ratkaisee

J

Sep 3, 2019, 7:15:07 AM
to sage-s...@googlegroups.com, J
On 19-09-03 09:43:23, Jori Mäntysalo (TAU) wrote:
> On Mon, 2 Sep 2019, J wrote:
>
> > Thanks, get_memory_usage sounds good; I want to run several decoders from
> > the coding theory module to better show their ups and downs.
>
> There is also at least %mprun magic. Googling that will give you some
> examples.
>
Thank you, I will research this too

Simon King

Sep 3, 2019, 1:05:06 PM
to sage-s...@googlegroups.com
On 2019-09-03, Jori Mäntysalo <jori.ma...@tuni.fi> wrote:
> On Mon, 2 Sep 2019, J wrote:
>
>> Thanks, get_memory_usage sounds good; I want to run several decoders from
>> the coding theory module to better show their ups and downs.
>
> There is also at least %mprun magic. Googling that will give you some
> examples.

Cool! I didn't know about %mprun before.

Thanks!
Simon

Nils Bruin

Sep 3, 2019, 2:31:32 PM
to sage-support


On Tuesday, September 3, 2019 at 2:43:27 AM UTC-7, Jori Mäntysalo (TAU) wrote:
There is also at least %mprun magic. Googling that will give you some
examples.

Looking at the memory footprint of the entire process (as a function of time) gives some indication of memory use of a certain implementation of an algorithm, but there are many factors that influence it. CPython probably has a slight preference for reusing (freed/reclaimed) memory over requesting new memory from the OS, but there is not an actual guarantee. And CPython might be seriously lax in reclaiming memory, or it might be prevented by a memory leak in sage that is not due to the implementation of the algorithm. So you can take results like that only as an indication and not as authoritative. Determining memory usage of an algorithm in the mathematical sense probably needs code analysis.

(that said, memory claimed from the OS definitely gives an UPPER BOUND on the memory usage of a certain algorithm; for obvious reasons)

J

Sep 3, 2019, 3:55:16 PM
to sage-s...@googlegroups.com, J
Well, one of the algorithms stores

\sum_{e} l^{e} \binom{n}{e}

entries, with l being the number of elements of the field,
which gets big quite fast.
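How fast that sum grows is easy to check numerically. A small sketch (the parameter values l = 2, n = 64 are illustrative, not taken from the thread):

```python
# Size of the syndrome table alluded to above: the sum over error
# weights e of l^e * binom(n, e), with l the field size and n the code
# length. Parameters below are hypothetical examples.
from math import comb

def table_size(l, n, t):
    """Number of stored entries for error weights 0 through t."""
    return sum(l**e * comb(n, e) for e in range(t + 1))

for t in range(1, 6):
    print(t, table_size(2, 64, t))
```

Already for modest t the count runs into the billions, which is consistent with the OOM-killer behaviour reported earlier in the thread.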


J

Sep 18, 2019, 4:12:29 AM
to sage-s...@googlegroups.com, J
On 19-09-03 09:43:23, Jori Mäntysalo (TAU) wrote:
> On Mon, 2 Sep 2019, J wrote:
>
> > Thanks, get_memory_usage sounds good; I want to run several decoders from
> > the coding theory module to better show their ups and downs.
>
> There is also at least %mprun magic. Googling that will give you some
> examples.
>

TBH I can't get it to work,
and searching for
"sagemath %mprun"
gives two pages of search results, including this thread ^^


Jori Mäntysalo (TAU)

Sep 18, 2019, 6:20:17 AM
to sage-s...@googlegroups.com, J
On Tue, 17 Sep 2019, J wrote:

> TBH I can't get it to work
> and
> "sagemath %mprun"
> gives to pages of search results including this thread ^^

Duh. Somebody should add a page on this to the doc.

First I ran

./sage -pip install memory_profiler

and then started the notebook normally with

./sage --notebook=jupyter

In a notebook I did

def power2(x):
    L = range(x)
    s = 0
    for i in L:
        s += 2*i + 1
    return s

and then loaded the extension

%load_ext memory_profiler

and last ran

%memit power2(5)

It works. However,

%mprun -f power2 power2(5)

does not. Is there an easy way to profile memory usage on a line-by-line
basis?

J

Sep 18, 2019, 4:41:20 PM
to sage-s...@googlegroups.com, J
The most problematic part for me is:

I would like to script it

Jori Mäntysalo (TAU)

Sep 19, 2019, 4:22:34 AM
to sage-s...@googlegroups.com, J
On Wed, 18 Sep 2019, J wrote:

> The most problematic part for me is:
>
> I would like to script it

I'm not sure what you mean. You have some list L of objects, and want to
know how much memory it takes to run f(x) for each x in L?

J

Sep 19, 2019, 6:53:37 AM
to sage-s...@googlegroups.com, J
More or less this, yes;

but it seems I will stick to putting it in a Sage IPython session wrapped
in a screen session.

In detail, I found that the syndrome decoder crashes during initialization
with big codes, as the system runs out of memory,

and that prompted me to plot RAM vs. max_error for the syndrome decoder.

Jori Mäntysalo (TAU)

Sep 19, 2019, 7:25:50 AM
to sage-s...@googlegroups.com, J
On Thu, 19 Sep 2019, J wrote:

> In detail, I found that the syndrome decoder crashes during initialization
> with big codes, as the system runs out of memory

Hmm... An idea:

$ fgrep VmPeak /proc/6649/status
VmPeak: 31204 kB

(6649 is the PID of my bash)

If you add a delay at the end of your Sage script, you could check what
the maximum memory usage was.
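The same trick can be scripted from inside the process, so no delay or external fgrep is needed. A sketch (Linux-only, since it reads /proc; the helper name vmpeak_kb is made up here, and the parser is split out so it can be tested on a sample string):

```python
# Scripted version of the VmPeak trick: read the process's own
# /proc/self/status and parse the "VmPeak:" line. Linux-specific.
import os

def vmpeak_kb(status_text):
    """Extract the VmPeak value (in kB) from /proc/<pid>/status content."""
    for line in status_text.splitlines():
        if line.startswith("VmPeak:"):
            return int(line.split()[1])
    return None

if __name__ == "__main__" and os.path.exists("/proc/self/status"):
    with open("/proc/self/status") as fh:
        print("peak:", vmpeak_kb(fh.read()), "kB")
```

Calling this at the very end of a Sage script prints the peak memory the whole run claimed from the OS, which (as Nils noted) is an upper bound on what the algorithm itself needed.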