I've run average_neighbor_degree with every possible combination of the source and target parameters, which take the basic degree attributes of a directed graph ("in" and "out"). Looking at the graph below, I don't understand how the result is calculated or where those numbers come from. Could someone explain it, please?

Environment: Python 3.8.8, NetworkX 2.5

Here's my code and the results:
>>> import networkx as nx
>>> G = nx.DiGraph([(0, 3), (1, 3), (2, 4), (3, 5), (3, 6), (4, 6), (6, 5), (4, 3)])
>>> print(f'in-in: {nx.average_neighbor_degree(G, source="in", target="in")}')
>>> print(f'in-out: {nx.average_neighbor_degree(G, source="in", target="out")}')
>>> print(f'out-in: {nx.average_neighbor_degree(G, source="out", target="in")}')
>>> print(f'out-out: {nx.average_neighbor_degree(G, source="out", target="out")}')
in-in: {0: 0.0, 3: 1.3333333333333333, 1: 0.0, 2: 0.0, 4: 5.0, 5: 0.0, 6: 1.0}
in-out: {0: 0.0, 3: 0.3333333333333333, 1: 0.0, 2: 0.0, 4: 3.0, 5: 0.0, 6: 0.0}
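
For reference, here is a minimal sketch of how I expected the values to be computed, under my (possibly wrong) assumption that source selects which neighbors to look at (predecessors for "in", successors for "out") and target selects which of their degrees to average. The helper expected_avg_nbr_deg is my own, not part of NetworkX:

import networkx as nx

G = nx.DiGraph([(0, 3), (1, 3), (2, 4), (3, 5), (3, 6), (4, 6), (6, 5), (4, 3)])

def expected_avg_nbr_deg(G, source, target):
    # My assumed definition, NOT necessarily what NetworkX does:
    # average the "target" degree over each node's "source" neighbors.
    nbrs = {"in": G.predecessors, "out": G.successors}[source]
    deg = {"in": G.in_degree, "out": G.out_degree}[target]
    result = {}
    for n in G:
        neighbors = list(nbrs(n))
        if neighbors:
            result[n] = sum(deg(v) for v in neighbors) / len(neighbors)
        else:
            result[n] = 0.0  # no neighbors on that side
    return result

print(expected_avg_nbr_deg(G, "in", "in"))

For source="in", target="in" this hand version gives node 3 the value (0 + 0 + 1) / 3 ≈ 0.33 (its in-neighbors 0, 1, and 4 have in-degrees 0, 0, and 1), not the 1.33 that average_neighbor_degree prints above, so my assumption about the definition is clearly off.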