Q-values etc.


Loet Leydesdorff
Mar 25, 2022, 7:23:41 AM
to Krippendorff, Klaus, cogi...@googlegroups.com
Dear Klaus,

As promised, here below are some further remarks in reaction to your notes.
I cc the Google group.

Best, 
Loet

My reaction to Klaus's comments on Chapter 4 requires a bit of personal history. Our discussion follows up on a series of articles in the International Journal of General Systems in 2009: Krippendorff (2009a and b) and Leydesdorff (2009).

We studied information theory from very different perspectives.
I turned to information theory in the late 1980s because of the (at the time) unsolved issues in the relations between the reduction of complexity in, for example, factor or principal-component analysis, on the one hand, and time-series analysis (e.g., ARIMA), on the other. Information theory provides static measures for decomposing complexity and dynamic measures of information flow (e.g., the Kullback-Leibler divergence). The econometrician Henri Theil (1972) wrote a textbook addressing these issues and introducing information theory. It became my guide at the time, leading, for example, to Leydesdorff (1991, 1995, and 2005).
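To make the two kinds of measures concrete for readers of the list, here is a minimal sketch in Python (my illustration, not taken from Theil's text): the Shannon entropy as the static measure of complexity, and the Kullback-Leibler divergence as the dynamic measure, in Theil's (1972) terms the expected information of the message that transforms a prior distribution into a posterior one.

import numpy as np

def entropy(p):
    # Static measure: Shannon entropy H = -sum(p * log2(p)), in bits.
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def kl_divergence(posterior, prior):
    # Dynamic measure: Kullback-Leibler divergence I = sum(p * log2(p/q)),
    # the expected information of the message transforming the prior q
    # into the posterior p (Theil, 1972).
    p = np.asarray(posterior, dtype=float)
    q = np.asarray(prior, dtype=float)
    mask = p > 0
    return np.sum(p[mask] * np.log2(p[mask] / q[mask]))

# Toy example: shares of three categories in two consecutive years.
print(entropy([0.4, 0.4, 0.2]))                         # ~1.52 bits
print(kl_divergence([0.4, 0.4, 0.2], [0.5, 0.3, 0.2]))  # ~0.04 bits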
 
In this context, I read Krippendorff (1986). It is a fascinating book that takes the discussion much further than Theil (1972). However, the book is not easy. Krippendorff (1980) furthermore showed (on the basis of earlier work) that in the case of circular relations, the so-called Q-values were no longer based on proper probabilities (adding up to one), but were artifacts of adding and subtracting entropies. Krippendorff formulated: "They have no observable reality."
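For three variables, the Q-measure at issue (McGill's interaction information; I spell it out here for readers of the list) is such an alternating sum of Shannon entropies:

    Q(x,y,z) = H(x) + H(y) + H(z) - H(x,y) - H(x,z) - H(y,z) + H(x,y,z)

Each term is a proper entropy, but the alternating sum is not itself the entropy of any probability distribution, and it can therefore become negative.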
 
This was elaborated in Krippendorff (2009b). Along a different path (the Triple Helix; cf. Ulanowicz, 1986) and in collaboration with Inga Ivanova (e.g., Leydesdorff & Ivanova, 2014), I had followed the interpretation of the negative values in terms of redundancy. At the time (2009), I reproduced with Klaus's help his values as a calculus different from the one I had used hitherto (Leydesdorff, 2009). However, the results of a large empirical study using citation distributions remained difficult to interpret (Leydesdorff, 2002).
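A minimal numerical illustration of such a negative value (a toy example in Python that I add here; it is not taken from Klaus's texts): take two fair coins x and y and let z = x XOR y, so that the three variables form a loop.

import numpy as np
from itertools import product

def H(joint, axes):
    # Shannon entropy (in bits) of the marginal over the given axes.
    drop = tuple(i for i in range(joint.ndim) if i not in axes)
    p = joint.sum(axis=drop) if drop else joint
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# Joint distribution of (x, y, z) with x, y fair coins and z = x XOR y.
joint = np.zeros((2, 2, 2))
for x, y in product([0, 1], repeat=2):
    joint[x, y, x ^ y] = 0.25

Q = (H(joint, {0}) + H(joint, {1}) + H(joint, {2})
     - H(joint, {0, 1}) - H(joint, {0, 2}) - H(joint, {1, 2})
     + H(joint, {0, 1, 2}))
print(Q)  # -1.0 bit: a negative "Q-value", which I read as redundancy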
 
Krippendorff (2009b) formulated it as follows:
 
Q-measures are not perfect, they are good approximations to maximum entropies and Leydesdorff (2009) sought other explanations to justify their continued use. I consider salvaging this measure to be an exercise in futility and am extending Watanabe's judgment to all quantities that include these products. Q-quantities may have other uses, and I will mention one below, but multivariate information measures they are not.
 
Note that interactions with loops entail positive or negative redundancies, those without loops do not. Loops can be complex, especially in systems with many variables. (p. 676)
 
It seems to me that Klaus insists that these measures are not valid because they are not based on proper probabilities. I have used them since Leydesdorff (2003) because there is no alternative measure; one would need another calculus. They may be second-best, but they provide an opportunity to proceed. I accept that they are not probabilities and search for another "calculus" of redundancy. Do they provide access to the domain of meaning-processing?
 
In sum, I use these measures not as information-theoretical concepts, but as information-theoretically informed methodologies. This differs from Klaus's approach, in which information theory itself is under discussion in relation to cybernetics. I use information theory in science, technology, and innovation studies.
 
References 
  • Krippendorff, K. (1980). Q; an interpretation of the information theoretical Q-measures. In R. Trappl, G. J. Klir, & F. Pichler (Eds.), Progress in Cybernetics and Systems Research (Vol. VIII, pp. 63-67). New York: Hemisphere.
  • Krippendorff, K. (1986). Information Theory: Structural Models for Qualitative Data. Beverly Hills, CA: Sage.
  • Krippendorff, K. (2009a). W. Ross Ashby's information theory: a bit of history, some solutions to problems, and what we face today. International Journal of General Systems, 38(2), 189-212.
  • Krippendorff, K. (2009b). Information of interactions in complex systems. International Journal of General Systems, 38(6), 669-680.
  • Leydesdorff, L. (1991). The static and dynamic analysis of network data using information theory. Social Networks, 13(4), 301-345.
  • Leydesdorff, L. (1995). The Challenge of Scientometrics: The Development, Measurement, and Self-Organization of Scientific Communications. Leiden: DSWO Press, Leiden University; at http://www.universal-publishers.com/book.php?method=ISBN&book=1581126816.
  • Leydesdorff, L. (2002). Indicators of structural change in the dynamics of science: Entropy statistics of the SCI Journal Citation Reports. Scientometrics, 53(1), 131-159.
  • Leydesdorff, L. (2005). Similarity measures, author cocitation analysis, and information theory. Journal of the American Society for Information Science and Technology, 56(7), 769-772.
  • Leydesdorff, L. (2009). Interaction information: linear and nonlinear interpretations. International Journal of General Systems, 38(6), 681-685.
  • Leydesdorff, L., & Ivanova, I. A. (2014). Mutual redundancies in interhuman communication systems: Steps toward a calculus of processing meaning. Journal of the Association for Information Science and Technology, 65(2), 386-399. doi: 10.1002/asi.22973
  • Theil, H. (1972). Statistical Decomposition Analysis. Amsterdam/London: North-Holland.
  • Ulanowicz, R. E. (1986). Growth and Development: Ecosystems Phenomenology. San Jose, CA: toExcel.
 
 
 

_______________

Loet Leydesdorff


Professor emeritus, University of Amsterdam 

Amsterdam School of Communication Research (ASCoR)

lo...@leydesdorff.net; http://www.leydesdorff.net/

http://scholar.google.com/citations?user=ych9gNYAAAAJ&hl=en

