to networkx...@googlegroups.com
I have an adjacency list file with 3.65 million nodes that make up a huge network, and I ran the following simple script:
import networkx as nx
import sys

G = nx.read_adjlist(sys.argv[1] + '.adj')
print(nx.number_of_nodes(G))
I noticed the script occupies up to 5 GB of memory, i.e. around 1.4 KB per node. Why does networkx require so much memory? Or, put another way, is networkx memory-efficient on large networks?
Aric Hagberg
May 13, 2014, 10:55:50 AM
to networkx...@googlegroups.com
Don't forget to count the edges. I'm guessing you have at least as
many edges as nodes.
So, yes, it may take 5 GB of memory. As a rough estimate, figure
about 100 bytes per node or edge; the exact size depends on what you
use as a node and what data you store with each edge.
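
A rough sketch of this kind of back-of-the-envelope check, assuming a Linux system where the resource module reports ru_maxrss in kilobytes (on macOS it is in bytes) and an adjacency-list file read the same way as above:

import resource
import sys

import networkx as nx

G = nx.read_adjlist(sys.argv[1] + '.adj')

# Count both nodes and edges, since each one costs memory.
n_items = G.number_of_nodes() + G.number_of_edges()

# Peak resident set size of this process; ru_maxrss is in KB on Linux.
rss_bytes = resource.getrusage(resource.RUSAGE_SELF).ru_maxrss * 1024

print("nodes + edges: %d" % n_items)
print("approx. bytes per node/edge: %.0f" % (rss_bytes / float(n_items)))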
to networkx...@googlegroups.com
1. I don't understand the sentence "the exact size depends on what you
use as a node or store with an edge." What do you mean?
2. Yes, it has around 7.8 million edges, so altogether 3.65 + 7.8 million
nodes and edges, which works out to around 440 bytes per node or edge.
I'm wondering why each node or edge needs 440 bytes of memory.
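
For reference, the arithmetic behind that figure as a quick sketch, taking 1 GB = 10**9 bytes:

# 5 GB spread over roughly 11.45 million nodes and edges.
nodes = 3.65e6
edges = 7.8e6
memory = 5e9                        # 5 GB, with 1 GB = 10**9 bytes
print(memory / (nodes + edges))     # ~437 bytes per node or edge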
Aric Hagberg
May 13, 2014, 11:27:25 AM
to networkx...@googlegroups.com
You can store arbitrary (Python hashable) objects as nodes. Integers
will use less memory than long strings or other complicated objects.
Also you can store arbitrary data with edges so the size of that data
matters too. The default data for an edge is an empty dictionary
which is 136 bytes on my machine.
Aric
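
A small sketch illustrating both points with sys.getsizeof; the exact sizes vary with the Python version and build (hence the 136-byte figure quoted above), and convert_node_labels_to_integers is one way to trade long string labels for cheaper integer ones:

import sys
import networkx as nx

# Size of the default (empty) per-edge attribute dictionary.
print(sys.getsizeof({}))

# Integer node labels take less memory than long string labels.
print(sys.getsizeof(12345))
print(sys.getsizeof("some/rather/long/node/identifier"))

# Relabeling string nodes to integers can shrink a graph in memory.
G = nx.path_graph(["node-%d" % i for i in range(5)])
H = nx.convert_node_labels_to_integers(G)
print(list(H.nodes()))              # [0, 1, 2, 3, 4]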
Moritz Beber
May 13, 2014, 11:33:41 AM
to networkx...@googlegroups.com
Hey,
> 2. Yes, it has around 7.8 million edges, so altogether 3.65 + 7.8 million
> nodes and edges, which works out to around 440 bytes per node or edge.
> I'm wondering why each node or edge needs 440 bytes of memory.