Oswald Berthold
Feb 6, 2024, 3:47:54 PM
to Java Information Dynamics Toolkit (JIDT) discussion
Dear group,
does anyone know anything about the behaviour of the continuous
MutualInfo estimators (kraskov1 / kraskov2) on under-determined data,
that is, (many) fewer examples than dimensions per example?
my observation is that the source and destination entropies (each
computed as its self-information, i.e. the MI of the variable with
itself) are tightly bounded by the number of examples: the bound is
the same for source and destination, and it increases with the number
of examples. the "real" MI between source and destination does seem
plausible, though.
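for concreteness, the effect can be reproduced outside JIDT with a small brute-force sketch of KSG algorithm 1 (a toy implementation of my own, not JIDT's code; the series-based digamma and all names below are mine). if I read the KSG-1 formula correctly, with destination == source every marginal neighbour count collapses to k - 1, which would make the self-information estimate come out as psi(N) - psi(k), roughly ln N, independent of the dimension d:

```python
import math
import random

def digamma(x):
    """Digamma via recurrence plus an asymptotic series (adequate accuracy here)."""
    r = 0.0
    while x < 6.0:
        r -= 1.0 / x
        x += 1.0
    return r + math.log(x) - 1.0 / (2.0 * x) - 1.0 / (12.0 * x * x) \
             + 1.0 / (120.0 * x ** 4)

def cheb(a, b):
    """Chebyshev (max-norm) distance, as used by the KSG estimators."""
    return max(abs(u - v) for u, v in zip(a, b))

def ksg1_mi(xs, ys, k=4):
    """Brute-force KSG algorithm-1 MI estimate (O(n^2) distances; small n only)."""
    n = len(xs)
    mi = digamma(k) + digamma(n)
    for i in range(n):
        # distance to the k-th nearest neighbour in the joint (x, y) space
        joint = sorted(max(cheb(xs[i], xs[j]), cheb(ys[i], ys[j]))
                       for j in range(n) if j != i)
        eps = joint[k - 1]
        # neighbour counts strictly within eps in each marginal space
        nx = sum(1 for j in range(n) if j != i and cheb(xs[i], xs[j]) < eps)
        ny = sum(1 for j in range(n) if j != i and cheb(ys[i], ys[j]) < eps)
        mi -= (digamma(nx + 1) + digamma(ny + 1)) / n
    return mi

random.seed(0)
n, d, k = 50, 200, 4  # far fewer examples than dimensions
xs = [[random.gauss(0.0, 1.0) for _ in range(d)] for _ in range(n)]

# self-information: MI of the source with itself
self_info = ksg1_mi(xs, xs, k=k)
print(self_info, digamma(n) - digamma(k), math.log(n))
```

in this toy run the self-information estimate matches psi(n) - psi(k) and stays below ln(n) no matter how large d is, which is at least consistent with the saturation I am seeing.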
thanks for any pointers.
bst, oswald