On Friday, 9 September 2011, at 15:00, Professor Constantino Tsallis will give a lecture at the seminar of the Department of Theoretical Methods, in the meeting room of the Institute of Measurement Science, Slovak Academy of Sciences. Professor Tsallis is a renowned Brazilian physicist who introduced the concepts known today as Tsallis entropy and Tsallis statistics. He works at CBPF (Centro Brasileiro de Pesquisas Físicas) in Rio de Janeiro, Brazil.
Abstract
Exploring the region where the hypotheses validating Boltzmann-Gibbs statistical mechanics are not satisfied
Constantino Tsallis
Centro Brasileiro de Pesquisas Físicas
Coordenação de Matéria Condensada e Física Estatística
Rua Dr. Xavier Sigaud 150
Urca - Rio de Janeiro - RJ - Brazil - 22290-180
The celebrated Boltzmann-Gibbs entropy and its associated statistical mechanics have been producing, for 140 years, countless contributions to our knowledge of systems whose collective dynamics satisfy simplifying hypotheses such as ergodicity.
Nevertheless, a fascinating world exists outside these (and related) hypotheses. It is the purpose of the so-called nonadditive entropy Sq and its associated nonextensive statistical mechanics, introduced in 1988, to approach this world theoretically.
We shall briefly illustrate some of its main grounding concepts, and exhibit typical, as well as recent, predictions and verifications in natural, artificial and social systems.
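For reference, the nonadditive entropy Sq mentioned above can be stated in its standard textbook form (with W the number of microstates and k Boltzmann's constant):

S_q = k \, \frac{1 - \sum_{i=1}^{W} p_i^{\,q}}{q - 1} , \qquad \lim_{q \to 1} S_q = -k \sum_{i=1}^{W} p_i \ln p_i = S_{BG} ,

so the celebrated Boltzmann-Gibbs entropy is recovered in the limit q -> 1.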
Bibliography:
- C. Tsallis, Introduction to Nonextensive Statistical Mechanics - Approaching a Complex World (Springer, New York, 2009);
- C. Tsallis, Entropy, in Encyclopedia of Complexity and Systems Science, ed. R.A. Meyers (Springer, Berlin, 2009);
- S. Umarov, C. Tsallis, M. Gell-Mann and S. Steinberg, J. Math. Phys. 51, 033502 (2010);
- J. S. Andrade Jr., G.F.T. da Silva, A.A. Moreira, F.D. Nobre and E.M.F. Curado, Phys. Rev. Lett. 105, 260601 (2010);
- F.D. Nobre, M.A. Rego-Monteiro and C. Tsallis, Phys. Rev. Lett. (2011), in press;
- http://tsallis.cat.cbpf.br/biblio.htm.
Professor Constantino Tsallis
From Wikipedia, the free encyclopedia (see http://en.wikipedia.org/wiki/Constantino_Tsallis)
Constantino Tsallis (born 1943) is a naturalized Brazilian physicist working in Rio de Janeiro at CBPF, Brazil. He was born in Greece and grew up in Argentina, where he studied physics at Instituto Balseiro in Bariloche. In 1974 he received a Doctorat d'État ès Sciences Physiques degree from the University of Paris-Orsay. He moved to Brazil in 1975 with his family (his wife and daughter).
Tsallis is credited with introducing the notion of what is known as Tsallis entropy and Tsallis statistics in his seminal 1988 paper "Possible generalization of Boltzmann-Gibbs statistics", published in the Journal of Statistical Physics, vol. 52, pp. 479-487. The generalization is considered one of the most viable and applicable candidates for formulating a theory of non-extensive thermodynamics. The resulting theory is not intended to replace Boltzmann-Gibbs statistics, but rather to supplement it, as in the case of anomalous systems characterised by non-ergodicity or metastable states.

One of the most impressive experimental verifications of the predictions of q-statistics concerns cold atoms in dissipative optical lattices. Eric Lutz made an analytical prediction in 2003 which was verified in 2006 by a London team.

Tsallis conjectured in 1999 (Brazilian Journal of Physics 29):
- That a long-standing quasi-stationary state (QSS) was to be expected in long-range interacting Hamiltonian systems (one of the core problems of statistical mechanics). This was quickly verified by many groups around the world.
- That this QSS should be described by q-statistics instead of Boltzmann-Gibbs statistics. This was verified in June 2007 by Pluchino, Rapisarda and Tsallis: instead of the celebrated Maxwellian (Gaussian) distribution of velocities (valid for short-range interactions), one observes a q-Gaussian (see the formula below).
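For orientation, the q-Gaussian mentioned in the second conjecture is the velocity distribution obtained by maximizing Sq under the usual constraints; in its standard textbook form (not a formula quoted from the paper above) it reads

P(v) \propto \left[ 1 - (1-q)\,\beta v^2 \right]^{1/(1-q)} ,

which reduces to the Maxwellian (Gaussian) form e^{-\beta v^2} in the limit q -> 1.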
These results establish that the q-entropy provides verifiable predictions from first principles as a generalization of Boltzmann-Gibbs entropy for certain classes of phenomena.
Remarks from Rex Graham, Senior Editor of Astronomy magazine (see http://www.mlahanas.de/Greeks/new/Tsallis.htm)
For nearly 120 years, physicists have relied on a particular formula to describe entropy. This formula, often simply called Boltzmann-Gibbs entropy, appears in virtually all modern physics textbooks. But few, if any, textbooks discuss a new expression of entropy that many physicists now describe as a major advance in theoretical physics. Put forth by Constantino Tsallis, a professor at Centro Brasileiro de Pesquisas Físicas in Rio de Janeiro, Brazil, the generalization helps explain many physical phenomena, from fractal behavior to time-dependent behavior of DNA and other macromolecules.
"Energy is an extremely rich concept, but it is simpler than entropy," says Tsallis. "Energy has to do with possibilities," he explains. "Entropy has to do with the probabilities of those possibilities happening. It takes energy and performs a further epistemological step." Tsallis has suspected for years that Boltzmann's and Gibbs' formulas had limitations. They failed, for example, to describe the observed "time evolution" of entropy in critical environments where a system is poised on a razor's edge between order and chaos. So-called Boltzmann-Gibbs entropy also failed to describe self-organized critical systems whose properties evolve in time in a particular way.
Physicists around the world are applying Tsallis entropy in many systems—from solid-state physics to information theory. Tsallis entropy can adapt to suit the physical characteristics of many systems while preserving the fundamental property of entropy in the Second Law of Thermodynamics, namely that the entropy of the universe increases with time in all processes. Although Tsallis's definition of entropy includes Boltzmann's expression in one case—when the entropy of a system is merely the sum of the entropies of its subsystems—Tsallis's definition of entropy is much broader. It describes unusual phenomena that, while sometimes rare, are vitally important. "Many physicists will tell you that this is very strange because there is no choice—there is only one entropy," says Tsallis. "I think the concept is larger than that."
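The "one case" alluded to above can be stated precisely. For two probabilistically independent systems A and B, the definition of Sq yields the standard pseudo-additivity identity

S_q(A+B) = S_q(A) + S_q(B) + \frac{1-q}{k}\, S_q(A)\, S_q(B) ,

so the cross term vanishes, and entropies simply add, exactly when q = 1, the Boltzmann-Gibbs case.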
Tsallis was born in 1943 in Athens, Greece. His father, a natural linguist and textile merchant, left Greece with his family in 1947 to escape the country's civil war. The Tsallis family settled in Argentina. Constantino flourished in the Spanish culture, but he has always felt at home with the scientific insights of the ancient Greek philosophers. It was the ancient Greeks, after all, who derived the concept of atoms based on philosophical reasoning rather than evidence. Tsallis argues passionately that truth and beauty are equivalent, a concept that also dates back 2,500 years to the birthplace of theory, democracy, and classical literature.
Even though atoms and molecules had not been unequivocally discovered by the late 1800s, there was enough evidence for their existence for Boltzmann to derive his formula based on the probabilities of what he called "elementary complexions" in the system. Distraught that some of his valued colleagues accepted neither the atomic theory nor his expression for entropy, Boltzmann killed himself in 1906. In 1994, Tsallis stood at Boltzmann's grave in the Zentralfriedhof, or Central Cemetery, in Vienna, Austria, and gazed at the tragic scientist's mathematical epitaph, S = k log W, carved in granite. Many of the world's most distinguished scientists make the pilgrimage, but even as a young scientist, Tsallis had sensed a weakness in the formula. During the past three decades, Tsallis's theoretical publications have ranged from genetics to galaxies. He was particularly intrigued by fractals, self-similar constructs independent of scale that describe clouds, mountains and coastlines. Earthquakes and the flocking behavior of birds are self-organizing systems that exhibit fractal behavior. Tsallis was intrigued by the ubiquity of fractal behaviors in nature and how Boltzmann-Gibbs entropy essentially doesn't apply to them.
It was during a coffee break at a workshop in Mexico City almost a decade earlier, in 1985, that the idea of the generalization of entropy and Boltzmann-Gibbs statistical mechanics came to Tsallis. It took him three years to decide to publish his idea. "Entropy is a very subtle, controversial topic," says Tsallis. "I was trying to penetrate into the physical meaning and validity of my generalization." After that fateful coffee break, Tsallis was able to use mathematical analogies he derived from fractals to conceive his expression. Some physicists are calling it a brilliant generalization of the Boltzmann-Gibbs microscopic expressions for Clausius entropy. Over the years, engineers, cybernetics experts, and other theoreticians proposed a variety of new possibilities for entropy, but none were within the scheme set by the great master Gibbs. While those attempts were made with no particular physical goal in mind, Tsallis wanted to generalize both statistical mechanics and thermodynamics. His generalization met all of Gibbs's criteria except one: it did not meet Gibbs's requirement of additivity, which is sometimes referred to as extensivity. In usual thermodynamics, energy and entropy are extensive quantities. That means that the total energy or entropy of two systems that are independent or uncorrelated equals the sum. Tsallis's expression for entropy, published in a 1988 paper in the Journal of Statistical Physics, is nonextensive. The paper uses statistical mechanics in the anomalous cases in which a non-Boltzmann entropy seems to reign. It was a crisp break with convention.
A follow-up paper in 1991, co-authored by Tsallis and E. M. F. Curado, published in the Journal of Physics A and titled "Generalized Statistical Mechanics: Connections with Thermodynamics," extended the revolution. "The entropy we have always learned is good for a mass of molecules in a room, for a heat engine, for a million things," Tsallis says, carefully enunciating each word for effect and gazing at a novice with the eyes of an evangelist. "But there are a million other processes in which a different entropy appears to be needed. . . . Many physicists will tell you this is absolute nonsense. But an increasing number will also say it is not nonsense." The Institute for Scientific Information cited the 1991 Tsallis and Curado paper as the most-cited Brazilian physics paper worldwide in the 1990s. Three international workshops in 1999 and 2000—two in Japan and one in Texas—were dedicated to exploring the ramifications of Tsallis's ideas. In 2001, a conference on physical applications of Tsallis entropy is scheduled in Italy, and another, to be held at the Santa Fe Institute, will focus on nonphysics applications. It will be co-chaired by SFI professor Murray Gell-Mann.
Explaining his ideas to a reporter in July 2000, Tsallis, 56, writes equations slowly on a sheet of paper. It's a warm, humid afternoon in Cambridge, MA, and MIT's air conditioning can't quite keep up. Tsallis begins with probability, derives Boltzmann's and Gibbs's formulas, and then draws a solid horizontal line, under which a fractal term with an exponent q appears. He combines q with Boltzmann-Gibbs entropy so that the probability, p, is raised to the power q. Suddenly, the power of this seemingly simple approach is apparent.
"If q equals 1, you get back Boltzmann-Gibbs entropy," he says. "But with some rare event in which the probability is very small, and if you raise it to a power q, which is smaller than 1, its weight grows up." What he means is any small number raised to a power less than 1 becomes larger. (For example, 0.5 to the 0.3 power equals 0.8.) Tsallis's forehead glistens with tiny beads of sweat.
Tsallis uses the example of a tornado to demonstrate how low-probability events "grow up." Normally, the air molecules above a farm or city move about independently and fairly randomly. In such cases, the entropy of two different volumes of air can simply be added. This is his key point: quantities of two systems that can be summed to yield the total are called extensive quantities. Standard statistical mechanics and thermodynamics are extensive: they assume that the atoms, molecules or particles in a system are independent of each other or that they interact only with nearby particles. A fast-moving air molecule zips past a motionless one with neither greatly affecting the other. However, nature is not always extensive. Tornadoes, systems in which the movements of air molecules are highly correlated (a nonextensive case), happen frequently enough to draw the attention of lots of Midwesterners on stormy summer days.
"A tornado is a very rare event," says Tsallis. "Why? Because trillions and trillions of molecules are turning orderly around. So a vortex is a very low-probability event, but when it is there, it controls everything." Human vision also behaves in very unlikely, nonextensive ways. For example, if a large smooth wall is painted white except for a small red spot, the human eye very quickly finds the dot. "Why?" asks Tsallis. "Because it's not supposed to be there. The phenomenon of visual perception also is controlled by rare events. In fact, we are the offspring of those who quickly saw a tiger nearby, because it should not be there, and ran away."
For most of the systems that people deal with, the assumption of extensivity is very well obeyed. "What Tsallis defined was a simple generalization of Boltzmann entropy that does not add up from system to system and has a parameter q that measures the degree to which the nonextensivity holds," says Seth Lloyd, an External Faculty member at SFI and an associate professor of mechanical engineering at MIT. "Tsallis's is the most simple generalization that you can imagine. And for a variety of systems with long-range interactions—solid-state physics, chaotic dynamics, chemical systems, the list goes on and on—Tsallis entropy is maximized for some value of q. It is mathematically handy."
In nonextensive situations, correlations between individual constituents in the system do not die off exponentially with distance as they do in extensive cases. Instead, the correlations die off as the distance is raised to some empirically derived or theoretically deduced power, which is called a power law.
If a plot of the logarithm of the number of times a certain property value is found against the logarithm of the value itself results in a straight line, the relationship is a power law. The Richter scale is a power law: the logarithm of the strength of earthquakes plotted against the logarithm of the number of quakes yields a straight line. Tsallis entropy is applicable to hundreds of nonextensive systems with such power-law scaling. Power laws are helpful in describing not only fractal behavior but many other physical phenomena as well. Unfortunately, Tsallis has no proof from first principles that his expression of entropic nonextensivity is the best one possible. Michel Baranger, an emeritus professor of physics at MIT, agrees that the lack of such a proof has led many physicists to be skeptical. "As far as I'm concerned, his formula is excellent," says Baranger. "But I would still like to see justification of this formula from first principles. It will probably come because all indications are that it is good."
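A minimal sketch of the log-log test just described, using synthetic data with an assumed exponent (none of the numbers come from the systems discussed in the text):

import numpy as np

# Synthetic power-law data y = c * x**(-alpha), with alpha = 1.5 chosen arbitrarily
x = np.logspace(0, 3, 50)
y = 2.0 * x ** (-1.5)

# On log-log axes a power law is a straight line whose slope is -alpha:
slope, intercept = np.polyfit(np.log10(x), np.log10(y), 1)
print(slope)  # ~ -1.5, recovering the exponent

# An exponential decay, by contrast, curves on a log-log plot, which is
# how power-law correlations are distinguished from exponential ones.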
"Tsallis did pull this out of thin air," says A. K. Rajagopal, an expert on condensed-matter physics and quantum information theory at the Naval Research Laboratory in Washington, D.C. "What he did, intuitively, was really remarkable. It takes you from ordinary exponential probabilities to power-law probabilities. And it is important because so many physical phenomena—such as fractal behavior, or anomalous diffusion in condensed-matter materials, time-dependent behavior of DNA and other macromolecules, and many, many, many other phenomena—are explained by these power-law probabilities. There is a formula for one class of phenomena, another formula for another class, but there may be basic tenants in common."
Many scientists refer to Tsallis's q parameter as the entropic index or the nonextensive index. He argues that his expression of entropic nonextensivity "appears as a simple and efficient manner to characterize what is currently referred to as complexity—or at least some types of complexity." Not every complexity theorist would go that far, but most of them are willing to entertain the possibility, however unlikely.