Re: Entropy of a Graph


CE

Apr 29, 2013, 3:49:36 PM4/29/13
to networkx...@googlegroups.com
On Sunday, April 28, 2013 10:35:43 PM UTC-5, Rajat Khanna wrote:
Hi, does anyone have any idea whether it is possible to measure the entropy of a graph using networkx? I need to do it for a project, so if anyone knows, please let me know.


You'd have to clarify what it is, exactly, that you are trying to calculate. Given a probability distribution, it is easy to calculate the entropy; so the real work is in constructing that distribution...but there are many ways you might want to do so.
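For instance, given any probability distribution (a sequence of non-negative values summing to 1), the Shannon entropy in bits is easy to compute. A minimal sketch with numpy (the function name is just for illustration):

```python
import numpy as np

def shannon_entropy(probs):
    """Shannon entropy, in bits, of a probability distribution."""
    p = np.asarray(probs, dtype=float)
    p = p[p > 0]  # drop zero-probability outcomes; 0 * log2(0) counts as 0
    return -np.sum(p * np.log2(p))

print(shannon_entropy([0.5, 0.5]))   # fair coin -> 1.0 bit
print(shannon_entropy([0.25] * 4))   # uniform over 4 outcomes -> 2.0 bits
```

The hard part, again, is deciding which distribution on the graph you feed into it.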

Rajat Khanna

Apr 29, 2013, 6:41:51 PM4/29/13
to networkx...@googlegroups.com
I am planning to estimate the degree centrality for each node and use that to construct a probability distribution... I can try to do it myself but was wondering if someone has implemented it already... it would be helpful to see some examples/applications...

CE

Apr 29, 2013, 9:54:23 PM4/29/13
to networkx...@googlegroups.com

On Monday, April 29, 2013 5:41:51 PM UTC-5, Rajat Khanna wrote:
I am planning to estimate the degree centrality for each node and use that to construct a probability distribution... I can try to do it myself but was wondering if someone has implemented it already... it would be helpful to see some examples/applications...



There are many ways to define a distribution from degree centralities, and it also depends on the type of graph you have. Let's assume you have a simple graph (no self-loops or multiple edges). The centralities returned by nx.degree_centrality() are already normalized (each degree is divided by n - 1), but they do not sum to 1, so they are not yet a probability distribution. We can fix that by dividing each by their sum. Here is demo code:



import networkx as nx
import numpy as np

def entropy(dist):
    """
    Returns the entropy of `dist` in bits (base-2).

    """
    dist = np.asarray(dist, dtype=float)
    # Zero probabilities produce 0 * log2(inf) = nan terms;
    # nansum treats them as 0, which is the right convention here.
    with np.errstate(divide='ignore', invalid='ignore'):
        ent = np.nansum(dist * np.log2(1 / dist))
    return ent

def centrality_distribution(G):
    """
    Returns a centrality distribution.

    Each normalized centrality is divided by the sum of the normalized
    centralities so the values sum to 1. Note, this assumes the graph
    is simple.

    """
    centrality = np.asarray(list(nx.degree_centrality(G).values()))
    centrality /= centrality.sum()
    return centrality

if __name__ == '__main__':
    G = nx.barabasi_albert_graph(10, 2)
    d = centrality_distribution(G)
    print(d)
    print(entropy(d))

Rajat Khanna

Apr 29, 2013, 11:52:05 PM4/29/13
to networkx...@googlegroups.com
That's great, this should work just fine with a few small adjustments...

Thanks a lot...

Paulo Freitas Gomes

Dec 6, 2017, 9:09:44 AM12/6/17
to networkx-discuss
So there can be many different entropies for one specific graph? We choose one variable from the graph, calculate its probability distribution, and then calculate the entropy? Last question: is the formula for the entropy E = sum_x p(x) * log[1/p(x)], where x can be degree centrality or some other variable from the graph?
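For what it's worth, that formula is the standard Shannon entropy, and sum_x p(x) * log[1/p(x)] is algebraically the same as -sum_x p(x) * log p(x). A quick numerical sanity check (the distribution values are arbitrary, chosen just for illustration):

```python
import numpy as np

# Shannon entropy written both ways: sum p*log2(1/p) == -sum p*log2(p)
p = np.array([0.5, 0.25, 0.25])
e1 = np.sum(p * np.log2(1 / p))
e2 = -np.sum(p * np.log2(p))
print(e1, e2)  # both 1.5 bits
```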