We further examine some properties of the conditional Rényi and Tsallis–Havrda–Charvát (THC) entropies. Such properties are interesting for applications in studying protocols of quantum information science and the foundations of quantum mechanics. In particular, we consider how the conditional Rényi and THC entropies behave under conditioning on more, i.e., whether adding a further conditioning variable can only reduce the entropy. We also exemplify that this desired property can be violated by the conditional min-entropy. Applications of such results to the THC entropy rate are considered. Connections between generalized conditional entropies and the error probability are examined. Several relations between various conditional entropies are obtained; such relations can be used for bounding the conditional Rényi and THC entropies.
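For orientation, the unconditional forms of the two entropy families named above can be sketched numerically. This is a minimal illustration, not the paper's method: the function names are ours, the definitions are the standard Rényi and THC entropies (the paper works with their conditional versions), and the order-2 check shows the link to the index of coincidence mentioned in the keywords.

```python
import math

def renyi_entropy(p, alpha):
    """Rényi entropy of order alpha: ln(sum p_i^alpha) / (1 - alpha), natural log."""
    if alpha == 1:  # Shannon limit as alpha -> 1
        return -sum(x * math.log(x) for x in p if x > 0)
    return math.log(sum(x ** alpha for x in p)) / (1 - alpha)

def thc_entropy(p, alpha):
    """Tsallis-Havrda-Charvát entropy of order alpha: (sum p_i^alpha - 1) / (1 - alpha)."""
    if alpha == 1:  # same Shannon limit as the Rényi family
        return -sum(x * math.log(x) for x in p if x > 0)
    return (sum(x ** alpha for x in p) - 1) / (1 - alpha)

p = [0.5, 0.25, 0.125, 0.125]
# At order 2 both entropies are simple functions of the index of
# coincidence I(p) = sum p_i^2: Rényi gives -ln I(p), THC gives 1 - I(p).
ioc = sum(x * x for x in p)
print(abs(renyi_entropy(p, 2) + math.log(ioc)) < 1e-12)  # True
print(abs(thc_entropy(p, 2) - (1 - ioc)) < 1e-12)        # True
```

Both families reduce to the Shannon entropy as the order tends to 1, which is why results on one often translate to the other.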
DOI : 10.1051/ita/2014029
Keywords: Rényi entropy, Tsallis–Havrda–Charvát entropy, entropy rate, index of coincidence, error probability, Fano inequality
Rastegin, Alexey E.
@article{ITA_2015__49_1_67_0,
author = {Rastegin, Alexey E.},
title = {Further results on generalized conditional entropies},
journal = {RAIRO - Theoretical Informatics and Applications - Informatique Th\'eorique et Applications},
pages = {67--92},
year = {2015},
publisher = {EDP Sciences},
volume = {49},
number = {1},
doi = {10.1051/ita/2014029},
mrnumber = {3342174},
zbl = {1395.94219},
language = {en},
url = {https://www.numdam.org/articles/10.1051/ita/2014029/}
}
TY - JOUR
AU - Rastegin, Alexey E.
TI - Further results on generalized conditional entropies
JO - RAIRO - Theoretical Informatics and Applications - Informatique Théorique et Applications
PY - 2015
SP - 67
EP - 92
VL - 49
IS - 1
PB - EDP Sciences
UR - https://www.numdam.org/articles/10.1051/ita/2014029/
DO - 10.1051/ita/2014029
LA - en
ID - ITA_2015__49_1_67_0
ER -
%0 Journal Article
%A Rastegin, Alexey E.
%T Further results on generalized conditional entropies
%J RAIRO - Theoretical Informatics and Applications - Informatique Théorique et Applications
%D 2015
%P 67-92
%V 49
%N 1
%I EDP Sciences
%U https://www.numdam.org/articles/10.1051/ita/2014029/
%R 10.1051/ita/2014029
%G en
%F ITA_2015__49_1_67_0
Rastegin, Alexey E. Further results on generalized conditional entropies. RAIRO - Theoretical Informatics and Applications - Informatique Théorique et Applications, Tome 49 (2015) no. 1, pp. 67-92. doi: 10.1051/ita/2014029
N. Alon and J.H. Spencer, The Probabilistic Method, 3rd edition. John Wiley & Sons, Hoboken (2008). | MR | Zbl
D. Bacon and W. van Dam, Recent progress in quantum algorithms. Commun. ACM 53 (2010) 84–93. | DOI
M. Ben-Bassat and J. Raviv, Rényi’s entropy and error probability. IEEE Trans. Inf. Theory 24 (1978) 324–331. | Zbl | DOI
I. Bengtsson and K. Życzkowski, Geometry of Quantum States: An Introduction to Quantum Entanglement. Cambridge University Press, Cambridge (2006). | MR | Zbl
S.L. Braunstein and C.M. Caves, Information-theoretic Bell inequalities. Phys. Rev. Lett. 61 (1988) 662–665. | MR | DOI
F. Buscemi, M.J.W. Hall, M. Ozawa and M.M. Wilde, Noise and disturbance in quantum measurements: An information-theoretic approach. Phys. Rev. Lett. 112 (2014) 050401. | DOI
C. Cachin, Entropy measures and unconditional security in cryptography. Ph.D. thesis, Swiss Federal Institute of Technology, Zürich (1997).
R. Chaves and T. Fritz, Entropic approach to local realism and noncontextuality. Phys. Rev. A 85 (2012) 032113. | DOI
T. Fritz and R. Chaves, Entropic inequalities and marginal problems. IEEE Trans. Inf. Theory 59 (2013) 803–817. | MR | Zbl | DOI
I. Csiszár, Information-type measures of difference of probability distributions and indirect observations. Studia Sci. Math. Hungar. 2 (1967) 299–318. | MR | Zbl
I. Csiszár, Axiomatic characterizations of information measures. Entropy 10 (2008) 261–273. | Zbl | DOI
T.M. Cover and J.A. Thomas, Elements of Information Theory. John Wiley & Sons, New York (1991). | MR | Zbl
Z. Daróczy, Generalized information functions. Inform. Control 16 (1970) 36–51. | MR | Zbl | DOI
, Inequalities for some infinite series. Acta Math. Hungar. 75 (1997) 5–8. | MR | Zbl | DOI
D. de Falco and D. Tamascelli, An introduction to quantum annealing. RAIRO: ITA 45 (2011) 99–116. | MR | Zbl | Numdam
D. Erdogmus and J.C. Principe, Lower and upper bounds for misclassification probability based on Rényi’s information. J. VLSI Signal Process. 37 (2004) 305–317. | Zbl | DOI
R.M. Fano, Transmission of Information: A Statistical Theory of Communications. MIT Press and John Wiley & Sons, New York (1961). | MR
M. Feder and N. Merhav, Relations between entropy and error probability. IEEE Trans. Inf. Theory 40 (1994) 259–266. | Zbl | DOI
S. Furuichi, Information-theoretical properties of Tsallis entropies. J. Math. Phys. 47 (2006) 023302. | MR | Zbl | DOI
M. Gell-Mann and C. Tsallis, Nonextensive Entropy – Interdisciplinary Applications. Oxford University Press, Oxford (2004). | Zbl
I.M. Georgescu, S. Ashhab and F. Nori, Quantum simulation. Rev. Mod. Phys. 86 (2014) 153–185. | DOI
L. Golshani, E. Pasha and G. Yari, Some properties of Rényi entropy and Rényi entropy rate. Inf. Sci. 179 (2009) 2426–2433. | MR | Zbl | DOI
P. Harremoës and F. Topsøe, Inequalities between entropy and index of coincidence derived from information diagrams. IEEE Trans. Inf. Theory 47 (2001) 2944–2960. | MR | Zbl | DOI
J. Havrda and F. Charvát, Quantification methods of classification processes: concept of structural a-entropy. Kybernetika 3 (1967) 30–35. | MR | Zbl
F. Hiai, M. Mosonyi, D. Petz and C. Bény, Quantum f-divergences and error correction. Rev. Math. Phys. 23 (2011) 691–747. | MR | Zbl | DOI
S.-W. Ho and S. Verdú, On the interplay between conditional entropy and error probability. IEEE Trans. Inf. Theory 56 (2011) 5930–5942. | MR | Zbl | DOI
S.-W. Ho and R.W. Yeung, On the discontinuity of the Shannon information measures. IEEE Trans. Inf. Theory 55 (2009) 5362–5374. | MR | Zbl | DOI
P. Jizba and T. Arimitsu, The world according to Rényi: thermodynamics of multifractal systems. Ann. Phys. 312 (2004) 17–59. | MR | Zbl | DOI
R. Kamimura, Minimizing α-information for generalization and interpretation. Algorithmica 22 (1998) 173–197. | MR | Zbl | DOI
S. Kullback and R.A. Leibler, On information and sufficiency. Ann. Math. Stat. 22 (1951) 79–86. | MR | Zbl | DOI
H. Maassen and J.B.M. Uffink, Generalized entropic uncertainty relations. Phys. Rev. Lett. 60 (1988) 1103–1106. | MR | DOI
A.W. Marshall, I. Olkin and B.C. Arnold, Inequalities: Theory of Majorization and Its Applications, 2nd edition. Springer-Verlag, New York (2011). | MR | Zbl
, and , On an inequality for the entropy of a probability distribution. Acta Math. Hungar. 85 (1999) 345–349. | MR | Zbl | DOI
J. Radhakrishnan, An entropy proof of Brégman’s theorem. J. Comb. Theory Ser. A 77 (1997) 161–164. | MR | Zbl | DOI
A.E. Rastegin, Convexity inequalities for estimating generalized conditional entropies from below. Kybernetika 48 (2012) 242–253. | MR | Zbl
A.E. Rastegin, Bounds of the Pinsker and Fannes types on the Tsallis relative entropy. Math. Phys. Anal. Geom. 16 (2013) 213–228. | MR | Zbl | DOI
A.E. Rastegin, Uncertainty relations for MUBs and SIC-POVMs in terms of generalized entropies. Eur. Phys. J. D 67 (2013) 269. | DOI
A.E. Rastegin, Tests for quantum contextuality in terms of q-entropies. Quantum Inf. Comput. 14 (2014) 0996–1013. | MR
A. Rényi, On measures of entropy and information, in Proc. of the 4th Berkeley Symposium on Mathematical Statistics and Probability, edited by J. Neyman. University of California Press, Berkeley CA (1961) 547–561. | MR | Zbl
A. Rényi, Statistics and information theory. Studia Sci. Math. Hungar. 2 (1967) 249–256. | MR | Zbl
and , Note on the equivalence relationship between Rényi-entropy based and Tsallis-entropy based image thresholding. Pattern Recognit. Lett. 26 (2005) 2309–2312. | DOI
A. Teixeira, A. Matos and L. Antunes, Conditional Rényi entropies. IEEE Trans. Inf. Theory 58 (2012) 4273–4277. | MR | Zbl | DOI
C. Tsallis, Possible generalization of Boltzmann–Gibbs statistics. J. Stat. Phys. 52 (1988) 479–487. | MR | Zbl | DOI
I. Vajda, Bounds of the minimal error probability on checking a finite or countable number of hypotheses. Problemy Peredači Informacii 4 (1968) 9–19, (in Russian); translated as Probl. Inf. Transm. 4 (1968) 6–14. | MR
K. Życzkowski, Rényi extrapolation of Shannon entropy. Open Sys. Inf. Dyn. 10 (2003) 297–310, corrigendum in arXiv:quant-ph/0305062v2. | MR | Zbl | DOI