Hi Humair,
I presume you're looking at transfer entropies from the notation.
What is it that you are trying to conclude? I would have thought you'd want to compare the effects of two different sources on the same target, where the motivation is clearer: identifying the stronger source. But it looks like you're comparing the effects of two variables on each other in both directions, so I think you're trying to conclude in which direction the effect runs (why not in both directions?). Is this right?
Perhaps have a look through some short video lectures I've created to introduce the topics -- see those on Information transfer towards the end of the list at
https://www.youtube.com/playlist?list=PLOfPLLxr5gsVLSlmzcMnsFANb-uWkArby (I'll send a proper announcement of their availability to the list soon ...). In Information transfer part 8 in particular, I talk through my thinking on how pairwise and conditional measures relate to each other.
Basically, these give you two different model perspectives on how strong the relationship is: one with and one without a third variable taken into account.
Here, then, it would appear that the predictive effect of 1->3 is not really explained by 2 (it only drops a little when 2 is conditioned on).
In contrast, it would appear that the effect of 3->1 is in part mediated through 2, because we get a stronger prediction when taking 2 into account. This is also known as a synergistic interaction, with 3 and 2 jointly creating an outcome on 1.
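To make the pairwise-vs-conditional distinction concrete, here is a toy sketch (plain NumPy with a simple plug-in counting estimator over binary data, history length 1 -- not your actual variables or any particular toolkit's API; all names are made up for illustration). It builds a redundant/mediated case, where conditioning on the third variable removes the apparent pairwise effect, and a synergistic XOR case, where the effect only appears once the third variable is conditioned on:

```python
import numpy as np
from collections import Counter

def entropy(samples):
    """Shannon entropy in bits of a list of hashable symbols."""
    counts = Counter(samples)
    n = len(samples)
    return -sum((c / n) * np.log2(c / n) for c in counts.values())

def transfer_entropy(source, target, cond=None):
    """Plug-in estimate of TE(source -> target) with history length 1:
    I(target_t ; source_{t-1} | target_{t-1} [, cond_{t-1}])."""
    t_now, s_past, t_past = target[1:], source[:-1], target[:-1]
    if cond is None:
        past = list(zip(t_past))             # context: target's own past only
    else:
        past = list(zip(t_past, cond[:-1]))  # also condition on third variable's past
    # Conditional mutual information as a sum of four joint entropies
    h_tp  = entropy([(tn,) + p for tn, p in zip(t_now, past)])
    h_sp  = entropy([(sp,) + p for sp, p in zip(s_past, past)])
    h_all = entropy([(tn, sp) + p for tn, sp, p in zip(t_now, s_past, past)])
    h_p   = entropy(past)
    return h_tp + h_sp - h_all - h_p

rng = np.random.default_rng(0)
n = 10000

# Redundant / mediated case: z copies x, and y is driven by x's past,
# so conditioning on z removes the apparent x -> y effect.
x = rng.integers(0, 2, n)
z = x.copy()
y = np.concatenate([[0], x[:-1]])
te_pair_red = transfer_entropy(x, y)               # ~1 bit
te_cond_red = transfer_entropy(x, y, cond=z)       # ~0 bits

# Synergistic case: w is the XOR of the pasts of two independent sources,
# so x2 -> w only shows up once z2 is taken into account.
x2 = rng.integers(0, 2, n)
z2 = rng.integers(0, 2, n)
w = np.concatenate([[0], x2[:-1] ^ z2[:-1]])
te_pair_syn = transfer_entropy(x2, w)              # ~0 bits
te_cond_syn = transfer_entropy(x2, w, cond=z2)     # ~1 bit
```

In practice you'd of course want a proper estimator and surrogate-based significance testing rather than raw plug-in counts, as discussed in the lectures.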
These observations depend, of course, on the measurements being statistically significant. (You can find more details on that earlier in the playlist.)