2003, Vol. 1, No. 3, pp. 393–420

Article title

Uncertainty relations expressed by Shannon-like entropies



Besides the well-known Shannon entropy, there is a family of Shannon-like entropies with applications in statistical and quantum physics. These entropies depend on certain parameters and converge to the Shannon entropy as those parameters approach 1. We briefly describe the most important Shannon-like entropies and present their graphical representations. Although their graphs look almost identical, superimposing them reveals that each Shannon-like entropy has its own distinct, characteristic shape. We formulate alternative entropic uncertainty relations in terms of the Shannon-like entropies and show that all of them express the uncertainty principle of quantum physics equally well.
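As a numerical illustration (this sketch is not taken from the article itself), the limiting behaviour described above can be checked for two common Shannon-like entropies, the Rényi and the Tsallis (Havrda–Charvát) entropies, whose standard definitions reduce to the Shannon entropy as their parameter tends to 1. The example distribution `p` below is an arbitrary choice:

```python
import math

def shannon(p):
    """Shannon entropy H = -sum_i p_i ln p_i (natural logarithm)."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def renyi(p, alpha):
    """Renyi entropy H_alpha = ln(sum_i p_i^alpha) / (1 - alpha), alpha != 1."""
    return math.log(sum(pi ** alpha for pi in p)) / (1 - alpha)

def tsallis(p, q):
    """Tsallis (Havrda-Charvat) entropy S_q = (1 - sum_i p_i^q) / (q - 1), q != 1."""
    return (1 - sum(pi ** q for pi in p)) / (q - 1)

# Arbitrary example distribution (any finite probability vector works).
p = [0.5, 0.3, 0.2]
h = shannon(p)
for a in (2.0, 1.5, 1.1, 1.01, 1.001):
    print(f"alpha={a}: Renyi={renyi(p, a):.6f}, Tsallis={tsallis(p, a):.6f}")
print(f"Shannon limit: {h:.6f}")
# As the parameter approaches 1, both generalized entropies approach shannon(p).
```

The same limiting behaviour holds for any finite probability distribution; only the rate of convergence depends on the distribution.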










Physical description


1 - 9 - 2003


  • Institute of Physics, Slovak Academy of Sciences, Dúbravská cesta, SK-84 228, Bratislava, Slovak Republic


