References

AP12

S. A. Abdallah and M. D. Plumbley. A measure of statistical complexity based on predictive information with application to finite spin systems. Physics Letters A, 376(4):275–281, 2012.

ASBY14

B. Allen, B. C. Stacey, and Y. Bar-Yam. An information-theoretic formalism for multiscale structure in complex systems. arXiv preprint arXiv:1409.4708, 2014.

Ama01

Shun-ichi Amari. Information geometry on hierarchy of probability distributions. IEEE Transactions on Information Theory, 47(5):1701–1711, 2001.

BG15

Pradeep Kr Banerjee and Virgil Griffith. Synergy, redundancy and common information. arXiv preprint arXiv:1509.03706, 2015.

BRMontufar17

Pradeep Kr. Banerjee, Johannes Rauh, and Guido Montúfar. Computing the unique information. arXiv preprint arXiv:1709.07487, 2017.

BY04

Y. Bar-Yam. Multiscale complexity/entropy. Advances in Complex Systems, 7(01):47–63, 2004.

BG16

Salman Beigi and Amin Gohari. Phi-entropic measures of correlation. arXiv preprint arXiv:1611.01335, 2016.

Bel03

A. J. Bell. The co-information lattice. In S. Amari, A. Cichocki, S. Makino, and N. Murata, editors, Proc. Fifth Intl. Workshop on Independent Component Analysis and Blind Signal Separation (ICA 2003), pages 921–926. Springer, New York, 2003.

BROJ13

Nils Bertschinger, Johannes Rauh, Eckehard Olbrich, and Jürgen Jost. Shared information—new insights and problems in decomposing information in complex systems. In Proceedings of the European Conference on Complex Systems 2012, pages 251–269. Springer, 2013.

BRO+14

Nils Bertschinger, Johannes Rauh, Eckehard Olbrich, Jürgen Jost, and Nihat Ay. Quantifying unique information. Entropy, 16(4):2161–2183, 2014.

Cal02

Cristian Calude. Information and Randomness: An Algorithmic Perspective. Springer-Verlag New York, Inc., Secaucus, NJ, USA, 2nd edition, 2002. ISBN 3540434666.

CABE+15

Chung Chan, Ali Al-Bashabsheh, Javad B Ebrahimi, Tarik Kaced, and Tie Liu. Multivariate mutual information inspired by secret-key agreement. Proceedings of the IEEE, 103(10):1883–1913, 2015.

CP16

Daniel Chicharro and Stefano Panzeri. Redundancy and synergy in dual decompositions of mutual information gain and information loss. arXiv preprint arXiv:1612.09522, 2016.

CRW03

Matthias Christandl, Renato Renner, and Stefan Wolf. A property of the intrinsic mutual information. In Proc. IEEE International Symposium on Information Theory, page 258, 2003.

CT06

Thomas M. Cover and Joy A. Thomas. Elements of Information Theory. Wiley-Interscience, New York, second edition, 2006. ISBN 0471241954.

CPC10

Paul Warner Cuff, Haim H Permuter, and Thomas M Cover. Coordination capacity. IEEE Transactions on Information Theory, 56(9):4181–4206, 2010.

FL18

C. Finn and J. T. Lizier. Pointwise partial information decomposition using the specificity and ambiguity lattices. Entropy, 20(4):297, 2018.

FWA16

Seth Frey, Paul L Williams, and Dominic K Albino. Information encryption in the expert management of strategic uncertainty. arXiv preprint arXiv:1605.04233, 2016.

GA17

Amin Gohari and Venkat Anantharam. Comments on “information-theoretic key agreement of multiple terminals—part i”. IEEE Transactions on Information Theory, 63(8):5440–5442, 2017.

GGunluK17

Amin Gohari, Onur Günlü, and Gerhard Kramer. On achieving a positive rate in the source model key agreement problem. arXiv preprint arXiv:1709.05174, 2017.

GK17

Allison E Goodwell and Praveen Kumar. Temporal information partitioning: characterizing synergy, uniqueness, and redundancy in interacting environmental variables. Water Resources Research, 53(7):5920–5942, 2017.

GCJ+14

Virgil Griffith, Edwin KP Chong, Ryan G James, Christopher J Ellison, and James P Crutchfield. Intersection information based on common randomness. Entropy, 16(4):1985–2000, 2014.

GK14

Virgil Griffith and Christof Koch. Quantifying synergistic mutual information. In Guided Self-Organization: Inception, pages 159–190. Springer, 2014.

GacsKorner73

Peter Gács and János Körner. Common information is far less than mutual information. Problems of Control and Information Theory, 2(2):149–162, 1973.

Han75

T. S. Han. Linear dependence structure of the entropy space. Information and Control, 29:337–368, 1975.

HSP13

Malte Harder, Christoph Salge, and Daniel Polani. Bivariate measure of redundant information. Physical Review E, 87(1):012130, 2013.

Inc17a

Robin A. A. Ince. Measuring multivariate redundant information with pointwise common change in surprisal. Entropy, 19(7):318, 2017.

Inc17b

Robin A. A. Ince. The Partial Entropy Decomposition: decomposing multivariate entropy and mutual information via pointwise common surprisal. arXiv preprint arXiv:1702.01591, 2017.

JEC17

Ryan G. James, Jeffrey Emenheiser, and James P. Crutchfield. Unique information via dependency constraints. arXiv preprint arXiv:1709.06653, 2017.

KCM19

Artemy Kolchinsky and Bernat Corominas-Murtra. Decomposing information into copying versus transformation. arXiv preprint arXiv:1903.10693, 2019.

Kri09

Klaus Krippendorff. Ross Ashby's information theory: a bit of history, some solutions to problems, and what we face today. International Journal of General Systems, 38(2):189–212, 2009.

KLEG14

G. R. Kumar, C. T. Li, and A. El Gamal. Exact common information. In 2014 IEEE International Symposium on Information Theory (ISIT), pages 161–165. IEEE, 2014.

LSAgro11

Frank Lad, Giuseppe Sanfilippo, and Gianna Agrò. Extropy: a complementary dual of entropy. arXiv preprint arXiv:1109.6440, 2011.

LMPR04

P. W. Lamberti, M. T. Martin, A. Plastino, and O. A. Rosso. Intensive entropic non-triviality measure. Physica A: Statistical Mechanics and its Applications, 334(1):119–131, 2004.

LXC10

Wei Liu, Ge Xu, and Biao Chen. The common information of n dependent random variables. In 48th Annual Allerton Conference on Communication, Control, and Computing (Allerton), pages 836–843. IEEE, 2010.

LFW13

Joseph T. Lizier, Benjamin Flecker, and Paul L. Williams. Towards a synergy-based approach to measuring information modification. In 2013 IEEE Symposium on Artificial Life (ALIFE), pages 43–51. IEEE, 2013.

Mac03

David J. C. MacKay. Information Theory, Inference and Learning Algorithms. Cambridge University Press, 2003.

MW97

Ueli Maurer and Stefan Wolf. The intrinsic conditional mutual information and perfect secrecy. In Proc. IEEE International Symposium on Information Theory, page 88, 1997.

McG54

W. J. McGill. Multivariate information transmission. Psychometrika, 19(2):97–116, 1954.

OBR15

Eckehard Olbrich, Nils Bertschinger, and Johannes Rauh. Information decomposition and synergy. Entropy, 17(5):3501–3517, 2015.

PVerdu08

Daniel P. Palomar and Sergio Verdú. Lautum information. IEEE Transactions on Information Theory, 54(3):964–975, 2008.

PPCP17

Giuseppe Pica, Eugenio Piasini, Daniel Chicharro, and Stefano Panzeri. Invariant components of synergy, redundancy, and unique information among three variables. arXiv preprint arXiv:1706.08921, 2017.

RCVW04

Murali Rao, Yunmei Chen, Baba C. Vemuri, and Fei Wang. Cumulative residual entropy: a new measure of information. IEEE Transactions on Information Theory, 50(6):1220–1228, 2004.

Rau17

Johannes Rauh. Secret sharing and shared information. arXiv preprint arXiv:1706.06998, 2017.

RBO+17

Johannes Rauh, Pradeep Kr Banerjee, Eckehard Olbrich, Jürgen Jost, and Nils Bertschinger. On extractable shared information. arXiv preprint arXiv:1701.07805, 2017.

RBOJ14

Johannes Rauh, Nils Bertschinger, Eckehard Olbrich, and Jürgen Jost. Reconsidering unique information: towards a multivariate information decomposition. In 2014 IEEE International Symposium on Information Theory (ISIT), pages 2232–2236. IEEE, 2014.

RSW03

Renato Renner, Juraj Skripsky, and Stefan Wolf. A new measure for conditional mutual information and its properties. In Proc. IEEE International Symposium on Information Theory, page 259, 2003.

RNE+16

Fernando Rosas, Vasilis Ntranos, Christopher J Ellison, Sofie Pollin, and Marian Verhelst. Understanding interdependency through complex information sharing. Entropy, 18(2):38, 2016.

SSB+03

E. Schneidman, S. Still, M. J. Berry, W. Bialek, et al. Network information and connected correlations. Phys. Rev. Lett., 91(23):238701, 2003.

TS80

T. S. Han. Multiple mutual informations and multiple interactions in frequency data. Information and Control, 46:26–45, 1980.

TPB00

Naftali Tishby, Fernando C Pereira, and William Bialek. The information bottleneck method. arXiv preprint physics/0004057, 2000.

TSE94

Giulio Tononi, Olaf Sporns, and Gerald M Edelman. A measure for brain complexity: relating functional segregation and integration in the nervous system. Proceedings of the National Academy of Sciences, 91(11):5033–5037, 1994.

TNG11

H. Tyagi, P. Narayan, and P. Gupta. When is a function securely computable? IEEE Transactions on Information Theory, 57(10):6337–6350, 2011.

VAPelaezM16

Francisco José Valverde-Albacete and Carmen Peláez-Moreno. The multivariate entropy triangle and applications. In Hybrid Artificial Intelligent Systems, pages 647–658. Springer, 2016.

VW08

Sergio Verdú and Tsachy Weissman. The information lost in erasures. IEEE Transactions on Information Theory, 54(11):5030–5058, 2008.

VerduW06

S. Verdú and T. Weissman. Erasure entropy. In 2006 IEEE International Symposium on Information Theory, pages 98–102. IEEE, 2006.

Wat60

S. Watanabe. Information theoretical analysis of multivariate correlation. IBM Journal of Research and Development, 4(1):66–82, 1960.

WB10

Paul L Williams and Randall D Beer. Nonnegative decomposition of multivariate information. arXiv preprint arXiv:1004.2515, 2010.

WB11

Paul L Williams and Randall D Beer. Generalized measures of information transfer. arXiv preprint arXiv:1102.1507, 2011.

Wyn75

A. D. Wyner. The common information of two dependent random variables. IEEE Transactions on Information Theory, 21(2):163–179, 1975.

Yeu08

Raymond W. Yeung. Information Theory and Network Coding. Springer, 2008.

Zwi04

M. Zwick. An overview of reconstructability analysis. Kybernetes, 33(5/6):877–905, 2004.