Hi Ahmet,
To answer your question, here is the definition of (normalized) entropy:
(normalized) entropy = -sum(prob_values * np.log(prob_values) / np.log(slot_types))
where "prob_values" are the probabilities of each word in a slot and "slot_types" are the number of types in a slot
Entropy is a measure of uncertainty. An entropy of 0 means the system has only one possible state, whereas a (normalized) entropy of 1 means all possible states are equally likely. In other words, a value of 0 means the p-frame is completely predictable because there is only one possible slot filler; for a value of 1, all we can say is that every slot filler is equally likely.
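In case it is useful, here is a small self-contained NumPy sketch of that formula (the function name and the example counts are just my own illustration, not part of any tool's API):

import numpy as np

def normalized_entropy(counts):
    """Normalized entropy of slot-filler counts.

    Returns 0.0 for a slot with a single type (fully predictable)
    and 1.0 when all types are equally likely.
    """
    counts = np.asarray(counts, dtype=float)
    prob_values = counts / counts.sum()  # probability of each filler type
    slot_types = len(counts)             # number of distinct types in the slot
    if slot_types == 1:
        return 0.0                       # only one possible state
    return float(-np.sum(prob_values * np.log(prob_values) / np.log(slot_types)))

# e.g. a slot filled 8 times by one word and once each by two others:
print(normalized_entropy([8, 1, 1]))  # ~0.58
print(normalized_entropy([1, 1, 1]))  # 1.0 (maximally uncertain)
print(normalized_entropy([5]))        # 0.0 (completely predictable)

(The counts are assumed to be positive, i.e. only observed filler types are included, so no zero probabilities reach the log.)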
I hope that helps!
Laurence.
###############################################################
Laurence ANTHONY, Ph.D.
Professor of Applied Linguistics
Faculty of Science and Engineering
Waseda University
3-4-1 Okubo, Shinjuku-ku, Tokyo 169-8555, Japan
E-mail: antho...@gmail.com
WWW: http://www.laurenceanthony.net/
###############################################################