memory comparison between x32 and 64-bit applications


Nathalie

Oct 16, 2012, 5:41:55 AM10/16/12
to x32...@googlegroups.com
Dear all,

I am currently testing the x32 ABI, comparing time (user, CPU, real) and memory between some applications installed as x32 and as 64-bit. The x32 and 64-bit builds differ only in the gcc flags -mx32 and -m64. Now I am having trouble interpreting the results: in one benchmark I saved more than half of the memory, which cannot be explained by the halved pointer size alone. Does the x32 ABI somehow optimize memory handling, so that it is possible to save more than half of the memory?

Thanks in advance for your support!

Best wishes,
Nathalie

H. Peter Anvin

Oct 16, 2012, 12:06:56 PM10/16/12
to x32...@googlegroups.com, Nathalie
How do you measure memory consumption?

-hpa

Nathalie

Oct 16, 2012, 3:20:06 PM10/16/12
to x32...@googlegroups.com, Nathalie, h...@zytor.com
I measured the consumption via the memory maps (smaps) of the process: I take several snapshots, sum the resident set size per snapshot, and from that produce a memory profile over time.
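For anyone reproducing this, summing the per-mapping Rss fields from smaps is a one-line awk pass (a sketch; the current shell's own PID, $$, stands in for the PID of the application under test):

```shell
# sum the resident set size (in kB) over all mappings of a process;
# each mapping in /proc/<pid>/smaps carries its own "Rss:" line
pid=$$
awk '/^Rss:/ { total += $2 } END { print total " kB" }' "/proc/$pid/smaps"
```

Note that summing Rss counts shared pages (e.g. glibc itself) at full weight in every process; the Pss ("proportional set size") field in the same file divides shared pages among their users and is often the fairer metric for cross-ABI comparisons.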

I checked the binaries of my test applications again. The only significant difference I could see was via "readelf -a": my 64-bit application somehow references several different glibc versions (2.3.4, 2.2.5, 2.14), while the x32 application references only glibc 2.16. Might that be the problem? If so, how can it happen that the compiler picks up different versions? As an environment I used the x32 Gentoo release, in which I did not modify any gcc/glibc-related libraries.
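The glibc versions readelf reports come from the binary's symbol-version requirements, which can be listed directly (a sketch, using /bin/ls as a stand-in for the test binary):

```shell
# list the distinct glibc symbol versions a dynamically linked binary requires
readelf --version-info /bin/ls | grep -o 'GLIBC_[0-9.]*' | sort -u
```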

Best wishes,
Nathalie

Mike Frysinger

Oct 16, 2012, 4:58:33 PM10/16/12
to x32...@googlegroups.com, Nathalie, h...@zytor.com
On Tue, Oct 16, 2012 at 3:20 PM, Nathalie wrote:
> I checked the binaries of my test applications again. The only significant
> difference I could see was via "readelf -a": my 64-bit application somehow
> references several different glibc versions (2.3.4, 2.2.5, 2.14), while the
> x32 application references only glibc 2.16. Might that be the problem?

that's not what you're seeing. as new versions of glibc come out and
new symbols are added, versioning information is applied to them.
that way, if you run a binary with glibc-2.13 but you compiled it
against glibc-2.15, the system can tell you that the binary really
needs glibc-2.15 because it uses symbols from the newer version.
further, if a symbol changes behavior in glibc-2.17, older programs
keep using the old symbol and continue to work, while newer programs
pick up the new one.

since x32 was first released with glibc-2.16, there is no older
versioning information present in the binary and no need to include
old symbols that were deprecated in the past. this is by design.
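As an illustration of this (a sketch; assumes a glibc-based x86-64 system and uses memcpy as the example symbol), libc itself exports the same symbol at several versions:

```shell
# find the shared C library and list the versions at which memcpy is exported;
# on x86-64 glibc this typically shows both GLIBC_2.2.5 and GLIBC_2.14,
# because memcpy changed behavior in glibc 2.14
libc=$(ldconfig -p | awk '/libc\.so\.6 \(/ { print $NF; exit }')
objdump -T "$libc" | awk '$NF == "memcpy"'
```

An x32 libc, having started at glibc 2.16, carries only the newest version of each such symbol, which is why the x32 binary references a single glibc version.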
-mike