In chemistry, physics and thermodynamics, thermodynamic entropy, symbolized by S, is defined by the differential dS = dQ/T, where dQ is the amount of heat absorbed reversibly by a thermodynamic system at a temperature T. German physicist Rudolf Clausius introduced the mathematical concept of entropy in the early 1850s to account for the dissipation of energy in thermodynamic systems that produce work. He coined the term based on the Greek τροπή, meaning "transformation". Although the concept of entropy was originally a thermodynamic construct, it has since been adapted and applied to many disparate fields of study, including statistical mechanics, thermal physics, statistics, information theory, psychodynamics, economics, and evolution.
"Ice melting" - a classic example of entropy increasingContents [hide]
1 Ice melting example
2 Overview
3 History
4 Thermodynamic definition
4.1 Units and symbols
5 Statistical interpretation
6 Information theory
7 The second law
8 The arrow of time
9 Entropy and cosmology
10 Generalized Entropy
11 Entropy in fiction
12 See also
13 References
14 Further reading
15 External links
Ice melting example
The illustration for this article is a classic example in which entropy increases in a small 'universe': a thermodynamic system consisting of the 'surroundings' (the warm room) and the 'system' (glass, ice, cold water). In this universe, some heat energy dQ from the warmer room surroundings (at 77 °F (298 K)) will spread out to the cooler system of ice and water at its constant temperature T of 32 °F (273 K), the melting temperature of ice. Thus, the entropy of the system, which is dQ/T, increases by dQ/273 K. (The heat dQ for this process is the energy required to change water from the solid state to the liquid state, and is called the enthalpy of fusion, i.e. the ΔH for ice fusion.)
It is important to realize that the entropy of the surrounding room decreases by less than the entropy of the ice and water increases: the room temperature of 298 K is larger than 273 K, and therefore the entropy change dQ/298 K for the surroundings is smaller in magnitude than the entropy change dQ/273 K for the ice and water system. This is always true of spontaneous events in a thermodynamic system, and it shows the predictive importance of entropy: the final net entropy after such an event is always greater than the initial entropy.
As the temperature of the cool water rises to that of the room and the room cools imperceptibly, the sum of dQ/T over the continuous range of temperatures, taken in many small increments as the initially cool water warms, can be found by calculus. The entire miniature 'universe', i.e. this thermodynamic system, has increased in entropy. Energy has spontaneously become more dispersed and spread out in that 'universe' than when the glass of ice and water was introduced and became a 'system' within it.
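To make the arithmetic of this example concrete, here is a minimal Python sketch. The enthalpy of fusion and the choice of one mole of ice are illustrative assumptions, not values given in the text:

```python
# Entropy bookkeeping for the melting-ice 'universe' described above.
# Values are illustrative: 1 mol of ice, standard enthalpy of fusion.
H_FUSION = 6010.0   # J/mol, approximate enthalpy of fusion of ice
T_SYSTEM = 273.0    # K, melting ice (the 'system')
T_ROOM   = 298.0    # K, warm room (the 'surroundings')

dQ = H_FUSION                      # heat absorbed by the melting ice
dS_system       =  dQ / T_SYSTEM   # entropy gained by the ice + water
dS_surroundings = -dQ / T_ROOM     # entropy lost by the room
dS_universe     = dS_system + dS_surroundings

print(f"dS(system)       = {dS_system:+.2f} J/K")        # +22.01 J/K
print(f"dS(surroundings) = {dS_surroundings:+.2f} J/K")  # -20.17 J/K
print(f"dS(universe)     = {dS_universe:+.2f} J/K")      # positive, as argued
```

The net change is positive because the same quantity of heat is divided by the lower temperature for the system and the higher temperature for the surroundings.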
Overview
In a thermodynamic system, a 'universe' consisting of 'surroundings' and 'system' and made up of quantities of matter, pressure differences, density differences, and temperature differences all tend to equalize over time. As shown in the preceding discussion of the illustration involving a warm room (surroundings) and a cold glass of ice and water (system), the difference in temperature begins to be equalized as portions of the heat energy from the warm surroundings spread out to the cooler system of ice and water. Over time the temperature of the glass and its contents becomes equal to that of the room. The entropy of the room has decreased because some of its energy has been dispersed to the ice and water. However, as calculated in the discussion above, the entropy of the system of ice and water has increased more than the entropy of the surrounding room has decreased. This is always true: the dispersal of energy from warmer to cooler regions always results in an increase in entropy. Thus, when the 'universe' of the room surroundings and the ice and water system has reached an equilibrium of equal temperature, the entropy change from the initial state is at a maximum. The entropy of the thermodynamic system is a measure of how far the equalization has progressed.
The entropy of a thermodynamic system can be interpreted in two distinct, but compatible, ways:
From a macroscopic perspective, in classical thermodynamics the entropy is interpreted simply as a state function of a thermodynamic system: that is, a property depending only on the current state of the system, independent of how that state came to be achieved. The state function has the important property that, when multiplied by a reference temperature, it can be understood as a measure of the amount of energy in a physical system that cannot be used to do thermodynamic work; i.e., work mediated by thermal energy. More precisely, in any process where the system gives up energy ΔE, and its entropy falls by ΔS, a quantity at least TR ΔS of that energy must be given up to the system's surroundings as unusable heat (TR is the temperature of the system's external surroundings). Otherwise the process will not go forward.
From a microscopic perspective, in statistical thermodynamics the entropy is a measure of the number of microscopic configurations that are capable of yielding the observed macroscopic description of the thermodynamic system:

$S = k_B \ln \Omega$

where Ω is the number of microscopic configurations, and $k_B$ is Boltzmann's constant.
It can be shown that this definition of entropy, sometimes referred to as Boltzmann's postulate, reproduces all of the properties of the entropy of classical thermodynamics.
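As a toy illustration of Boltzmann's postulate, the sketch below counts the microstates of N two-state 'coins' (a stand-in for a simple spin system, an assumption made for illustration) and applies $S = k_B \ln \Omega$:

```python
import math

k_B = 1.380649e-23  # J/K, Boltzmann's constant

def boltzmann_entropy(omega):
    """Entropy of a macrostate comprising omega microstates."""
    return k_B * math.log(omega)

N = 100
# Microstates compatible with the macrostate 'exactly N/2 heads'
omega = math.comb(N, N // 2)
print(f"Omega = {omega:.3e} microstates")
print(f"S     = {boltzmann_entropy(omega):.3e} J/K")
```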
An important law of physics, the second law of thermodynamics, states that the total entropy of any isolated thermodynamic system tends to increase over time, approaching a maximum value. Unlike almost all other laws of physics, this associates thermodynamics with a definite arrow of time. However, for a universe of infinite size, which cannot be regarded as an isolated system, the second law does not apply.
History
Main article: History of entropy
The short history of entropy begins with the work of mathematician Lazare Carnot, who in his 1803 work Fundamental Principles of Equilibrium and Movement postulated that in any machine the accelerations and shocks of the moving parts all represent losses of moment of activity. In other words, in any natural process there exists an inherent tendency towards the dissipation of useful energy. Building on this work, in 1824 Lazare's son Sadi Carnot published Reflections on the Motive Power of Fire, in which he set forth the view that in all heat-engines "caloric", or what is now known as heat, moves from hot to cold and that "some caloric is always lost". This lost caloric was a precursor to the modern notion of entropy increase. Though formulated in terms of caloric rather than entropy, this was an early insight into the second law of thermodynamics. In the 1850s, Rudolf Clausius began to give this "lost caloric" a mathematical interpretation by questioning the nature of the inherent loss of heat when work is done, e.g. heat produced by friction.[1] In 1865, Clausius gave this heat loss a name:[2]
I propose to name the quantity S the entropy of the system, after the Greek word τροπή (tropē), the transformation. I have deliberately chosen the word entropy to be as similar as possible to the word energy: the two quantities to be named by these words are so closely related in physical significance that a certain similarity in their names appears to be appropriate.
Later, scientists such as Ludwig Boltzmann, Willard Gibbs, and James Clerk Maxwell gave entropy a statistical basis. Carathéodory linked entropy with a mathematical definition of irreversibility, in terms of trajectories and integrability.
Thermodynamic definition
Main article: Entropy (thermodynamic views)
In the early 1850s, Rudolf Clausius began to put the concept of "energy turned to waste" on a differential footing. Essentially, he set forth the concept of the thermodynamic system and advanced the argument that in any irreversible process a small amount of heat energy dQ is incrementally dissipated across the system boundary.
Specifically, in 1850 Clausius published his first memoir, in which he presented a verbal argument as to why Carnot's theorem, proposing the equivalence of heat and work, i.e. Q = W, was not perfectly correct and as such would need amendment. In 1854, Clausius stated: "In my memoir 'On the Moving Force of Heat, &c.', I have shown that the theorem of the equivalence of heat and work, and Carnot's theorem, are not mutually exclusive, but that, by a small modification of the latter, which does not affect its principle, they can be brought into accordance." This small modification of the latter is what developed into the second law of thermodynamics.
In his 1854 memoir, Clausius first develops the concepts of interior work, i.e. "those which the atoms of the body exert upon each other", and exterior work, i.e. "those which arise from foreign influences to which the body may be exposed", which may act on a working body of fluid or gas, typically functioning to work a piston. He then discusses the three types of heat by which Q may be divided:
heat employed in increasing the heat actually existing in the body
heat employed in producing the interior work
heat employed in producing the exterior work
Building on this logic, and following a mathematical presentation of the first fundamental theorem, Clausius then presents us with the first-ever mathematical formulation of entropy, although at this point in the development of his theories he calls it "equivalence-value". He states, "the second fundamental theorem in the mechanical theory of heat may thus be enunciated:"[3]
If two transformations which, without necessitating any other permanent change, can mutually replace one another, be called equivalent, then the generation of the quantity of heat Q from work at the temperature t, has the equivalence-value:

$\frac{Q}{T}$

and the passage of the quantity of heat Q from the temperature $t_1$ to the temperature $t_2$, has the equivalence-value:

$Q \left( \frac{1}{T_2} - \frac{1}{T_1} \right)$
wherein T is a function of the temperature, independent of the nature of the process by which the transformation is effected.
This is the first-ever mathematical formulation of entropy; at this point, however, Clausius had not yet affixed the concept with the label entropy as we currently know it; this would come in the following two years.
In 1876, the American mathematical physicist Willard Gibbs, building on the work of Clausius and Hermann von Helmholtz, advanced the view that the measurement of "available energy" ΔG in a thermodynamic system could be mathematically accounted for by subtracting the "energy loss" TΔS from the total energy change of the system ΔH. These concepts were further developed by James Clerk Maxwell [1871] and Max Planck [1903].
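A minimal numeric sketch of this bookkeeping, reusing the illustrative melting-ice figures from the earlier example (the values are assumptions, not from the text):

```python
def gibbs_free_energy(dH, T, dS):
    """Available energy change: dG = dH - T*dS."""
    return dH - T * dS

dH = 6010.0   # J/mol, enthalpy of fusion of ice
dS = 22.0     # J/(mol*K), entropy of fusion

for T in (263.0, 273.15, 283.0):   # below, at, above the melting point
    dG = gibbs_free_energy(dH, T, dS)
    verdict = "spontaneous" if dG < 0 else "non-spontaneous"
    print(f"T = {T:6.2f} K -> dG = {dG:+7.1f} J/mol ({verdict})")
```

The sign of ΔG flips at the melting point, matching the everyday observation that ice melts spontaneously only above 0 °C.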
Units and symbols
Conjugate variables of thermodynamics:
Pressure / Volume
Temperature / Entropy
Chemical potential / Particle number
Entropy is a key physical variable in describing a thermodynamic system. The SI unit of entropy is 'joule per kelvin' (J·K⁻¹), which is the same as the unit of heat capacity, and entropy is said to be thermodynamically conjugate to temperature. The entropy depends only on the current state of the system, not its detailed previous history, and so it is a state function of the parameters like pressure, temperature, etc., which describe the observable macroscopic properties of the system. Entropy is usually symbolized by the letter S.
There is an important connection between entropy and the amount of internal energy in the system which is not available to perform work. In any process where the system gives up an energy ΔE, and its entropy falls by ΔS, a quantity at least TR ΔS of that energy must be given up to the system's surroundings as unusable heat. Otherwise the process will not go forward. (TR is the temperature of the system's external surroundings, which may not be the same as the system's current temperature T.)
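A quick numeric illustration of this bound, with made-up but representative numbers:

```python
# Of the energy dE a system gives up, at least T_R * dS must be
# discarded to the surroundings as heat. All values are illustrative.
dE  = 1000.0   # J, energy given up by the system
dS  = 2.0      # J/K, entropy decrease of the system
T_R = 298.0    # K, temperature of the surroundings

min_waste_heat = T_R * dS
max_work       = dE - min_waste_heat
print(f"at least {min_waste_heat:.0f} J must be shed as heat")
print(f"at most  {max_work:.0f} J remains available as work")
```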
Statistical interpretation
Main article: Entropy (statistical views)
In 1877, thermodynamicist Ludwig Boltzmann visualized a probabilistic way to measure the entropy of an ensemble of ideal gas particles, in which he defined entropy to be proportional to the logarithm of the number of microstates such a gas could occupy. The essential problem in statistical thermodynamics since then, according to Erwin Schrödinger, has been to determine the distribution of a given amount of energy E over N identical systems.
Statistical mechanics explains entropy as the amount of uncertainty (or "mixedupness" in the phrase of Gibbs) which remains about a system after its observable macroscopic properties have been taken into account. For a given set of macroscopic quantities, like temperature and volume, the entropy measures the degree to which the probability of the system is spread out over different possible quantum states. The more states available to the system with appreciable probability, the greater the entropy.
On the molecular scale, the two definitions match up because adding heat to a system, which increases its classical thermodynamic entropy, also increases the system's thermal fluctuations, so giving an increased lack of information about the exact microscopic state of the system, i.e. an increased statistical mechanical entropy.
Qualitatively, entropy is often associated with the amount of disorder in the system. For example, solids (which are typically ordered on the molecular scale) usually have smaller entropy than liquids, and liquids smaller entropy than gases. This happens because the number of different microscopic states available to an ordered system is usually much smaller than the number of states available to a disordered system.
Information theory
Main article: Information entropy
Main article: Entropy in thermodynamics and information theory
The concept of entropy in information theory describes how much randomness (or, alternatively, 'uncertainty') there is in a signal or random event. An alternative way to look at this is to talk about how much information is carried by the signal.
The entropy in statistical mechanics can be considered to be a specific application of Shannon entropy, according to a viewpoint known as MaxEnt thermodynamics. Roughly speaking, Shannon entropy is proportional to the minimum number of yes/no questions you have to ask to get the answer to some question. The statistical mechanical entropy is then proportional to the minimum number of yes/no questions you have to ask in order to determine the microstate, given that you know the macrostate.
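The yes/no-question reading is easy to check numerically. In the sketch below the distributions are illustrative assumptions; note that a fair coin costs exactly one question and a fair eight-sided die exactly three (2³ = 8):

```python
import math

def shannon_entropy(probs):
    """Entropy in bits: the average number of yes/no questions needed."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(shannon_entropy([0.5, 0.5]))   # 1.0 bit (fair coin)
print(shannon_entropy([0.9, 0.1]))   # ~0.47 bits (biased coin: less uncertain)
print(shannon_entropy([1/8] * 8))    # 3.0 bits (fair eight-sided die)
```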
The second law
Main article: Second law of thermodynamics
An important law of physics, the second law of thermodynamics, states that the total entropy of any isolated thermodynamic system tends to increase over time, approaching a maximum value; and so, by implication, the entropy of the universe (i.e. the system and its surroundings), assumed to be an isolated system, tends to increase. We will consider the meaning of the "second law" further in a subsequent section. Two important consequences follow. First, heat cannot of itself pass from a colder to a hotter body: it is impossible to transfer heat from a cold to a hot reservoir without at the same time converting a certain amount of work to heat. Second, it is impossible for any device that operates on a cycle to receive heat from a single reservoir and produce a net amount of work; it can only produce useful work if heat is at the same time transferred from a hot to a cold reservoir. This rules out an isolated perpetual motion machine. It also follows that a smaller increase in entropy in a specified process, such as a chemical reaction, means that the process is energetically more efficient.
The arrow of time
Main article: Entropy (arrow of time)
Entropy is the only quantity in the physical sciences that "picks" a particular direction for time, sometimes called an arrow of time. As we go "forward" in time, the Second Law of Thermodynamics tells us that the entropy of an isolated system can only increase or remain the same; it cannot decrease. Hence, from one perspective, entropy measurement is thought of as a kind of clock.
Entropy and cosmology
We have previously mentioned that a finite universe may be considered an isolated system. As such, it may be subject to the Second Law of Thermodynamics, so that its total entropy is constantly increasing. It has been speculated that the universe is fated to a heat death in which all the energy ends up as a homogeneous distribution of thermal energy, so that no more work can be extracted from any source.
If the universe can be considered to have generally increasing entropy, then - as Roger Penrose has pointed out - an important role in the increase is played by gravity, which causes dispersed matter to accumulate into stars, which collapse eventually into black holes. Jacob Bekenstein and Stephen Hawking have shown that black holes have the maximum possible entropy of any object of equal size. This makes them likely end points of all entropy-increasing processes, if they are totally effective matter and energy traps. Hawking has, however, recently changed his stance on this aspect.
The role of entropy in cosmology remains a controversial subject. Recent work has cast extensive doubt on the heat death hypothesis and the applicability of any simple thermodynamic model to the universe in general. Although entropy does increase in the model of an expanding universe, the maximum possible entropy rises much more rapidly and leads to an "entropy gap," thus pushing the system further away from equilibrium with each time increment. Complicating factors, such as the energy density of the vacuum and macroscopic quantum effects, are difficult to reconcile with thermodynamical models, making any predictions of large-scale thermodynamics extremely difficult.
Generalized Entropy
Many generalizations of entropy have been studied, two of which, Tsallis and Rényi entropies, are widely used and the focus of active research.
The Rényi entropy is an information measure for fractal systems:

$H_\alpha(X) = \frac{1}{1-\alpha} \log \sum_{i=1}^{n} p_i^{\alpha}$

where α > 0 is the 'order' of the entropy and $p_i$ are the probabilities of $\{x_1, x_2, \ldots, x_n\}$. In the limit α → 1 we recover the standard entropy form.
The Tsallis entropy is employed in Tsallis statistics to study nonextensive thermodynamics:

$S_q(p) = \frac{1}{q-1} \left( 1 - \sum_i p_i^{q} \right)$

where p denotes the probability distribution of interest, and q is a real parameter that measures the non-extensivity of the system of interest. In the limit as q → 1, we again recover the standard entropy.
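Both generalizations are easy to check numerically. The sketch below, using an arbitrary example distribution, verifies that each reduces to the standard (Shannon/Gibbs) form as its parameter approaches 1:

```python
import math

def shannon(p):
    return -sum(x * math.log(x) for x in p if x > 0)

def renyi(p, alpha):
    """Renyi entropy of order alpha (alpha > 0, alpha != 1)."""
    return math.log(sum(x ** alpha for x in p)) / (1 - alpha)

def tsallis(p, q):
    """Tsallis entropy with nonextensivity parameter q (q != 1)."""
    return (1 - sum(x ** q for x in p)) / (q - 1)

p = [0.5, 0.25, 0.25]
print(shannon(p))          # 1.0397...
print(renyi(p, 1.0001))    # approaches the Shannon value as alpha -> 1
print(tsallis(p, 1.0001))  # approaches the Shannon value as q -> 1
```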
Entropy in fiction
Isaac Asimov's The Last Question, a short science fiction story about entropy
Thomas Pynchon, an American author who deals with entropy in many of his novels
Diane Duane's Young Wizards series, in which the protagonists' ultimate goal is to slow down entropy and delay heat death.
Gravity Dreams by L.E. Modesitt Jr.
The Planescape setting for Dungeons & Dragons includes the Doomguard faction, who worship entropy.
Arcadia, a play by Tom Stoppard, explores entropy, the arrow of time, and heat death.
Stargate SG-1 and Atlantis, science-fiction television shows where a ZPM (Zero Point Module) is depleted when it reaches maximum entropy
In DC Comics's series Zero Hour, entropy plays a central role in the continuity of the universe.
A post-Crisis issue of Superman, in which Doomsday is brought to the end of time and entropy is discussed.
H.G. Wells' story "The Time Machine" had a theme that was based upon entropy and the idea of humans evolving into two species, each of which is degenerate in its own way. Such a process is often incorrectly termed devolution.
"Logopolis," an episode of Doctor Who
Asemic Magazine, an Australian publication exploring entropy in literature.
Philip K. Dick's "Ubik", a science fiction novel with entropy as an underlying theme.
Peter F. Hamilton's "The Night's Dawn Trilogy", with entropy as an underlying theme.
In Savage: The Battle for Newerth, the Beast Horde class can build Entropy Shrines and Spires.
See also
Arrow of time
Black hole entropy
Chaos Theory
Enthalpy
Entropic force
Entropy of mixing
Information entropy
Kolmogorov-Sinai entropy (in dynamical systems)
Logarithmic units
Maxwell's demon
Negentropy
Residual entropy
Statistical Mechanics
Syntropy
Thermodynamic potential
References
^ Clausius, Rudolf (1850). On the Motive Power of Heat, and on the Laws which can be deduced from it for the Theory of Heat. Poggendorff's Annalen der Physik, LXXIX (Dover Reprint). ISBN 0486590658.
^ Laidler, Keith J. (1995). The Physical World of Chemistry. Oxford University Press. ISBN 0198559194.
^ Published in Poggendorff's Annalen, Dec. 1854, vol. xciii, p. 481; translated in the Journal de Mathematiques, vol. xx, Paris, 1855, and in the Philosophical Magazine, August 1856, s. 4, vol. xii, p. 81.
Further reading
Fermi, Enrico (1937). Thermodynamics. Prentice Hall. ISBN 048660361X.
Kroemer, Herbert; Charles Kittel (1980). Thermal Physics, 2nd Ed., W. H. Freeman Company. ISBN 0716710889.
Penrose, Roger (2005). The Road to Reality : A Complete Guide to the Laws of the Universe. ISBN 0679454438.
Reif, F. (1965). Fundamentals of statistical and thermal physics. McGraw-Hill. ISBN 0070518009.
Goldstein, Martin; Inge, F (1993). The Refrigerator and the Universe. Harvard University Press. ISBN 0674753259.
External links
Mechanical Theory of Heat – Nine Memoirs on the development of concept of "Entropy" by Rudolf Clausius [1850-1865]
Max Jammer (1973). Dictionary of the History of Ideas: Entropy
Frank L. Lambert, Disorder - A Cracked Crutch For Supporting Entropy Discussions, Journal of Chemical Education 79 187-192 (2002).
Frank L. Lambert, A Student’s Approach to the Second Law and Entropy — Thorough presentation aimed at chemistry students of entropy from the viewpoint of dispersal of energy, part of an extensive series of websites by the same author
Quantum mechanics is a fundamental branch of theoretical physics that supersedes classical mechanics at the atomic and subatomic levels. It provides the underlying mathematical framework for many fields of physics and chemistry, including condensed matter physics, atomic physics, molecular physics, computational chemistry, quantum chemistry, particle physics, and nuclear physics. Along with general relativity, quantum mechanics is one of the pillars of modern physics.
Contents
1 Introduction
2 Description of the theory
2.1 Quantum mechanical effects
2.2 Mathematical formulation
2.3 Interactions with other scientific theories
3 Applications of quantum theory
4 Philosophical consequences
5 History
5.1 Founding experiments
6 See also
7 References
8 Notes
9 External links
Introduction
The term quantum (Latin, "how much") refers to discrete units that the theory assigns to certain physical quantities, such as the energy of an atom at rest (see Figure 1, at right). The discovery that waves could be measured in particle-like small packets of energy called quanta led to the branch of physics that deals with atomic and subatomic systems which we today call Quantum Mechanics. The foundations of quantum mechanics were established during the first half of the 20th century by Max Planck, Albert Einstein, Niels Bohr, Werner Heisenberg, Erwin Schrödinger, Max Born, John von Neumann, Paul Dirac, Wolfgang Pauli and others. Some fundamental aspects of the theory are still actively studied.
Quantum mechanics is a more fundamental theory than Newtonian mechanics and classical electromagnetism, in the sense that it provides accurate and precise descriptions for many phenomena that these "classical" theories simply cannot explain on the atomic and subatomic level. It is necessary to use quantum mechanics to understand the behavior of systems at atomic length scales and smaller. For example, if Newtonian mechanics governed the workings of an atom, electrons would rapidly travel towards and collide with the nucleus. However, in the natural world the electron normally remains in a stable orbit around a nucleus -- seemingly defying classical electromagnetism.
Quantum mechanics was initially developed to explain the atom, especially the spectra of light emitted by different atomic species. The quantum theory of the atom developed as an explanation for the electron's staying in its orbital, which could not be explained by Newton's laws of motion and by classical electromagnetism.
In the formalism of quantum mechanics, the state of a system at a given time is described by a complex wave function (sometimes referred to as orbitals in the case of atomic electrons), and more generally, by an element of a complex vector space. This abstract mathematical object allows for the calculation of probabilities of outcomes of concrete experiments. For example, it allows one to compute the probability of finding an electron in a particular region around the nucleus at a particular time. Contrary to classical mechanics, one cannot in general make predictions of arbitrary accuracy. For instance, electrons cannot in general be pictured as localized particles in space but rather should be thought of as "clouds" of negative charge spread out over the entire orbit. These clouds represent the regions around the nucleus where the probability of "finding" an electron is the largest. Heisenberg's uncertainty principle quantifies the inability to precisely locate the particle.
The other exemplar that led to quantum mechanics was the study of electromagnetic waves such as light. When it was found in 1900 by Max Planck that the energy of waves could be described as consisting of small packets or quanta, Albert Einstein exploited this idea to show that an electromagnetic wave such as light could be described by a particle called the photon with a discrete energy dependent on its frequency. This led to a theory of unity between subatomic particles and electromagnetic waves called wave-particle duality in which particles and waves were neither one nor the other, but had certain properties of both. While quantum mechanics describes the world of the very small, it also is needed to explain certain "macroscopic quantum systems" such as superconductors and superfluids.
Broadly speaking, quantum mechanics incorporates four classes of phenomena that classical physics cannot account for: (i) the quantization (discretization) of certain physical quantities, (ii) wave-particle duality, (iii) the uncertainty principle, and (iv) quantum entanglement. Each of these phenomena will be described in greater detail in subsequent sections.
Since the early days of quantum theory, physicists have made many attempts to combine it with the other highly successful theory of the twentieth century, Albert Einstein's General Theory of Relativity. While quantum mechanics is entirely consistent with special relativity, serious problems emerge when one tries to join the quantum laws with general relativity, a more elaborate description of spacetime which incorporates gravity. Resolving these inconsistencies has been a major goal of twentieth- and twenty-first-century physics. Despite the proposal of many novel ideas, the unification of quantum mechanics—which reigns in the domain of the very small—and general relativity—a superb description of the very large—remains a tantalizing future possibility. (See quantum gravity, string theory.)
Because everything is composed of quantum-mechanical particles, the laws of classical physics must approximate the laws of quantum mechanics in the appropriate limit. This is often expressed by saying that in the case of large quantum numbers quantum mechanics "reduces" to classical mechanics and classical electromagnetism. This requirement is called the correspondence, or classical, limit.
Quantum mechanics can be formulated in either a relativistic or non-relativistic manner. Relativistic quantum mechanics (quantum field theory) provides the framework for some of the most accurate physical theories known. Still, non-relativistic quantum mechanics is also used due to its simplicity and when relativistic effects are negligible. We will use the terms quantum mechanics, quantum physics, and quantum theory synonymously, to refer to both relativistic and non-relativistic quantum mechanics. It should be noted, however, that certain authors refer to "quantum mechanics" in the more restricted sense of non-relativistic quantum mechanics. Also, in quantum mechanics, the use of the term particle typically refers to an elementary or subatomic particle.
Description of the theory
There are a number of mathematically equivalent formulations of quantum mechanics. One of the oldest and most commonly used formulations is the transformation theory invented by Cambridge theoretical physicist Paul Dirac, which unifies and generalizes the two earliest formulations of quantum mechanics, matrix mechanics (invented by Werner Heisenberg) and wave mechanics (invented by Erwin Schrödinger).
In this formulation, the instantaneous state of a quantum system encodes the probabilities of its measurable properties, or "observables". Examples of observables include energy, position, momentum, and angular momentum. Observables can be either continuous (e.g., the position of a particle) or discrete (e.g., the energy of an electron bound to a hydrogen atom).
Generally, quantum mechanics does not assign definite values to observables. Instead, it makes predictions about probability distributions; that is, the probability of obtaining each of the possible outcomes from measuring an observable. Naturally, these probabilities will depend on the quantum state at the instant of the measurement. There are, however, certain states that are associated with a definite value of a particular observable. These are known as "eigenstates" of the observable ("eigen" meaning "own" in German). In the everyday world, it is natural and intuitive to think of everything being in an eigenstate of every observable. Everything appears to have a definite position, a definite momentum, and a definite time of occurrence. However, quantum mechanics does not pinpoint the exact values for the position or momentum of a certain particle in a given space in a finite time; rather, it only provides a range of probabilities of where that particle might be. Therefore, it became necessary to use different words for (a) the state of something having an uncertainty relation and (b) a state that has a definite value. The latter is called the "eigenstate" of the property being measured.
A concrete example will be useful here. Let us consider a free particle. In quantum mechanics, there is wave-particle duality, so the properties of the particle can be described as a wave. Therefore, its quantum state can be represented as a wave, of arbitrary shape and extending over all of space, called a wavefunction. The position and momentum of the particle are observables. The uncertainty principle of quantum mechanics states that the position and the momentum cannot simultaneously be known with infinite precision. However, we can measure the position alone, creating an eigenstate of position: a wavefunction that is very large at a particular position x and zero everywhere else. If we perform a position measurement on such a wavefunction, we will obtain the result x with 100% probability. In other words, we will know the position of the free particle. This is called an eigenstate of position. If the particle is in an eigenstate of position then its momentum is completely unknown. An eigenstate of momentum, on the other hand, has the form of a plane wave. It can be shown that the wavelength is equal to h/p, where h is Planck's constant and p is the momentum of the eigenstate. If the particle is in an eigenstate of momentum then its position is completely blurred out.
Usually, a system will not be in an eigenstate of whatever observable we are interested in. However, if we measure the observable, the wavefunction will immediately become an eigenstate of that observable. This process is known as wavefunction collapse. If we know the wavefunction at the instant before the measurement, we will be able to compute the probability of collapsing into each of the possible eigenstates. For example, the free particle in our previous example will usually have a wavefunction that is a wave packet centered around some mean position x0, neither an eigenstate of position nor of momentum. When we measure the position of the particle, it is impossible for us to predict with certainty the result that we will obtain. It is probable, but not certain, that it will be near x0, where the amplitude of the wavefunction is large. After we perform the measurement, obtaining some result x, the wavefunction collapses into a position eigenstate centered at x.
Wave functions can change as time progresses. An equation known as the Schrödinger equation describes how wave functions change in time, a role similar to Newton's second law in classical mechanics. The Schrödinger equation, applied to our free particle, predicts that the center of a wave packet will move through space at a constant velocity, like a classical particle with no forces acting on it. However, the wave packet will also spread out as time progresses, which means that the position becomes more uncertain. This also has the effect of turning position eigenstates (which can be thought of as infinitely sharp wave packets) into broadened wave packets that are no longer position eigenstates.
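The free-particle story above can be made quantitative with a short numerical sketch (not from the text; the grid and packet parameters are arbitrary choices, in units where ħ = 1). A Gaussian wave packet's position spread and momentum spread, the latter obtained from the Fourier transform, multiply to 1/2, the minimum the uncertainty principle allows:

```python
import numpy as np

N, L = 2048, 200.0
x  = np.linspace(-L / 2, L / 2, N, endpoint=False)
dx = x[1] - x[0]

sigma, p0 = 2.0, 1.0                         # packet width and mean momentum
psi = np.exp(-x**2 / (4 * sigma**2)) * np.exp(1j * p0 * x)
psi /= np.sqrt(np.sum(np.abs(psi)**2) * dx)  # normalize

# Momentum-space amplitude via FFT (hbar = 1, so p = 2*pi*f)
phi = np.fft.fftshift(np.fft.fft(psi)) * dx / np.sqrt(2 * np.pi)
p   = np.fft.fftshift(np.fft.fftfreq(N, d=dx)) * 2 * np.pi

def spread(grid, amp, step):
    """Standard deviation of the probability distribution |amp|^2."""
    prob = np.abs(amp)**2 * step
    mean = np.sum(grid * prob)
    return np.sqrt(np.sum((grid - mean)**2 * prob))

print(spread(x, psi, dx) * spread(p, phi, p[1] - p[0]))   # ~0.5
```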
Some wave functions produce probability distributions that are constant in time. Many systems that are treated dynamically in classical mechanics are described by such "static" wave functions. For example, a single electron in an unexcited atom is pictured classically as a particle moving in a circular trajectory around the atomic nucleus, whereas in quantum mechanics it is described by a static, spherically symmetric wavefunction surrounding the nucleus (Fig. 1). (Note that only the lowest angular momentum states, labeled s, are spherically symmetric).
The time evolution of wave functions is deterministic in the sense that, given a wavefunction at an initial time, it makes a definite prediction of what the wavefunction will be at any later time. During a measurement, the change of the wavefunction into another one is not deterministic, but rather unpredictable, i.e., random.
The probabilistic nature of quantum mechanics thus stems from the act of measurement. This is one of the most difficult aspects of quantum systems to understand. It was the central topic in the famous Bohr-Einstein debates, in which the two scientists attempted to clarify these fundamental principles by way of thought experiments. In the decades after the formulation of quantum mechanics, the question of what constitutes a "measurement" has been extensively studied. Interpretations of quantum mechanics have been formulated to do away with the concept of "wavefunction collapse"; see, for example, the relative state interpretation. The basic idea is that when a quantum system interacts with a measuring apparatus, their respective wavefunctions become entangled, so that the original quantum system ceases to exist as an independent entity. For details, see the article on measurement in quantum mechanics.
Quantum mechanical effects
As mentioned in the introduction, there are several classes of phenomena that appear under quantum mechanics which have no analogue in classical physics. These are sometimes referred to as "quantum effects".
The first type of quantum effect is the quantization of certain physical quantities. Quantization first arose in the mathematical formulae of Max Planck in 1900, as discussed in the introduction. Planck was analyzing how the radiation emitted from a body was related to its temperature; that is, he was analyzing the energy of a wave. He found that the energy could be described correctly only if it was emitted in discrete portions proportional to the frequency of the wave. The constant of proportionality, referred to by the letter h in mathematical formulae, is a cornerstone of physics: multiplying the frequency of any wave by h gives the energy of one quantum of that wave. Measured in such discrete, non-continuous portions, the wave takes on the appearance of chunks or packets of energy, and these chunks resemble particles. Energy is thus said to be quantized because it only comes in discrete chunks instead of a continuous range of energies.
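As a quick worked example of Planck's relation E = hf (the frequency chosen here, roughly that of green light, is an illustrative assumption):

```python
h = 6.62607015e-34   # J*s, Planck's constant
f = 5.6e14           # Hz, roughly green visible light
E = h * f            # energy of a single quantum (photon)
print(f"E = {E:.2e} J  (~{E / 1.602176634e-19:.2f} eV)")   # ~2.32 eV
```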
In the example we have given, of a free particle in empty space, both the position and the momentum are continuous observables. However, if we restrict the particle to a region of space (the so-called "particle in a box" problem), the momentum observable becomes discrete; it will only take on the values $p_n = \frac{nh}{2L}$, where L is the length of the box, h is Planck's constant, and n is a positive integer. Such observables are said to be quantized, and they play an important role in many physical systems. Examples of quantized observables include angular momentum, the total energy of a bound system, and the energy contained in an electromagnetic wave of a given frequency.
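A small sketch evaluating these quantized values for an electron confined to a 1 nm box (the particle and box size are illustrative assumptions):

```python
h = 6.62607015e-34    # J*s, Planck's constant
m = 9.1093837015e-31  # kg, electron mass
L = 1e-9              # m, box length (1 nm)

for n in range(1, 4):
    p = n * h / (2 * L)   # allowed momentum magnitudes p_n = n*h/(2L)
    E = p**2 / (2 * m)    # corresponding kinetic energies
    print(f"n={n}: p = {p:.2e} kg*m/s, E = {E / 1.602e-19:.2f} eV")
```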
Another quantum effect is the uncertainty principle, which is the phenomenon that consecutive measurements of two or more observables may possess a fundamental limitation on accuracy. In our free particle example, it turns out that it is impossible to find a wavefunction that is an eigenstate of both position and momentum. This implies that position and momentum can never be simultaneously measured with arbitrary precision, even in principle: as the precision of the position measurement improves, the maximum precision of the momentum measurement decreases, and vice versa. Those variables for which it holds (e.g., momentum and position, or energy and time) are canonically conjugate variables in classical physics.
Another quantum effect is the wave-particle duality. It has been shown that, under certain experimental conditions, microscopic objects like atoms or electrons exhibit particle-like behavior, such as scattering. ("Particle-like" in the sense of an object that can be localized to a particular region of space.) Under other conditions, the same type of objects exhibit wave-like behavior, such as interference. We can observe only one type of property at a time, never both at the same time.
Another quantum effect is quantum entanglement. In some cases, the wave function of a system composed of many particles cannot be separated into independent wave functions, one for each particle. In that case, the particles are said to be "entangled". If quantum mechanics is correct, entangled particles can display remarkable and counter-intuitive properties. For example, a measurement made on one particle can produce, through the collapse of the total wavefunction, an instantaneous effect on other particles with which it is entangled, even if they are far apart. (This does not conflict with special relativity because information cannot be transmitted in this way.)
Mathematical formulation
Main article: Mathematical formulation of quantum mechanics. See also the discussion in Quantum logic.
In the mathematically rigorous formulation of quantum mechanics, developed by Paul Dirac and John von Neumann, the possible states of a quantum mechanical system are represented by unit vectors (called "state vectors") residing in a complex separable Hilbert space (variously called the "state space" or the "associated Hilbert space" of the system), well defined up to a complex number of norm 1 (the phase factor). In other words, the possible states are points in the projectivization of a Hilbert space. The exact nature of this Hilbert space is dependent on the system; for example, the state space for position and momentum states is the space of square-integrable functions, while the state space for the spin of a single proton is just the product of two complex planes. Each observable is represented by a densely defined Hermitian (or self-adjoint) linear operator acting on the state space. Each eigenstate of an observable corresponds to an eigenvector of the operator, and the associated eigenvalue corresponds to the value of the observable in that eigenstate. If the operator's spectrum is discrete, the observable can only attain those discrete eigenvalues.
The time evolution of a quantum state is described by the Schrödinger equation, in which the Hamiltonian, the operator corresponding to the total energy of the system, generates time evolution.
The inner product between two state vectors is a complex number known as a probability amplitude. During a measurement, the probability that a system collapses from a given initial state to a particular eigenstate is given by the square of the absolute value of the probability amplitudes between the initial and final states. The possible results of a measurement are the eigenvalues of the operator - which explains the choice of Hermitian operators, for which all the eigenvalues are real. We can find the probability distribution of an observable in a given state by computing the spectral decomposition of the corresponding operator. Heisenberg's uncertainty principle is represented by the statement that the operators corresponding to certain observables do not commute.
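The sketch below walks through this machinery for the smallest nontrivial case, a two-level system; the particular observable and state are assumptions made for illustration. A Hermitian operator is diagonalized, and the Born-rule probabilities come out as squared absolute values of inner products:

```python
import numpy as np

sigma_x = np.array([[0, 1],
                    [1, 0]], dtype=complex)   # a Hermitian observable

state = np.array([1, 0], dtype=complex)       # a normalized state vector

eigvals, eigvecs = np.linalg.eigh(sigma_x)    # real eigenvalues, eigenstates
for val, vec in zip(eigvals, eigvecs.T):
    amplitude = np.vdot(vec, state)           # inner product <eigenstate|state>
    prob = abs(amplitude)**2                  # Born rule
    print(f"outcome {val:+.0f}: probability {prob:.2f}")   # 0.50 each
```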
The Schrödinger equation acts on the entire probability amplitude, not merely its absolute value. Whereas the absolute value of the probability amplitude encodes information about probabilities, its phase encodes information about the interference between quantum states. This gives rise to the wave-like behavior of quantum states.
It turns out that analytic solutions of Schrödinger's equation are only available for a small number of model Hamiltonians, of which the quantum harmonic oscillator, the particle in a box, the hydrogen molecular ion and the hydrogen atom are the most important representatives. Even the helium atom, which contains just one more electron than hydrogen, defies all attempts at a fully analytic treatment. There exist several techniques for generating approximate solutions. For instance, in the method known as perturbation theory one uses the analytic results for a simple quantum mechanical model to generate results for a more complicated model related to the simple model by, for example, the addition of a weak potential energy. Another method is the "semi-classical equation of motion" approach, which applies to systems for which quantum mechanics produces weak deviations from classical behavior. The deviations can be calculated based on the classical motion. This approach is important for the field of quantum chaos.
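Beyond perturbation theory, a common numerical route (sketched below under the assumption of natural units, ħ = m = ω = 1) is to discretize the Hamiltonian on a grid and diagonalize it. For the harmonic oscillator the lowest eigenvalues should approach the exact levels n + 1/2:

```python
import numpy as np

N, L = 1000, 20.0
x  = np.linspace(-L / 2, L / 2, N)
dx = x[1] - x[0]

# Kinetic energy: -1/2 d^2/dx^2 via a finite-difference matrix
diag = np.full(N, -2.0)
off  = np.full(N - 1, 1.0)
T = -(np.diag(diag) + np.diag(off, 1) + np.diag(off, -1)) / (2 * dx**2)
V = np.diag(0.5 * x**2)              # harmonic potential

energies = np.linalg.eigvalsh(T + V)[:4]
print(energies)                      # ~ [0.5, 1.5, 2.5, 3.5]
```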
An alternative formulation of quantum mechanics is Feynman's path integral formulation, in which a quantum-mechanical amplitude is considered as a sum over histories between initial and final states; this is the quantum-mechanical counterpart of action principles in classical mechanics.
Interactions with other scientific theories
The fundamental rules of quantum mechanics are very broad. They state that the state space of a system is a Hilbert space and the observables are Hermitian operators acting on that space, but do not tell us which Hilbert space or which operators. These must be chosen appropriately in order to obtain a quantitative description of a quantum system. An important guide for making these choices is the correspondence principle, which states that the predictions of quantum mechanics reduce to those of classical physics when a system moves to higher energies or, equivalently, larger quantum numbers. This "high energy" limit is known as the classical or correspondence limit. One can therefore start from an established classical model of a particular system, and attempt to guess the underlying quantum model that gives rise to the classical model in the correspondence limit.
Unsolved problems in physics: In the correspondence limit of quantum mechanics: Is there a preferred interpretation of quantum mechanics? How does the quantum description of reality, which includes elements such as the superposition of states and wavefunction collapse, give rise to the reality we perceive?

When quantum mechanics was originally formulated, it was applied to models whose correspondence limit was non-relativistic classical mechanics. For instance, the well-known model of the quantum harmonic oscillator uses an explicitly non-relativistic expression for the kinetic energy of the oscillator, and is thus a quantum version of the classical harmonic oscillator.
Early attempts to merge quantum mechanics with special relativity involved the replacement of the Schrödinger equation with a covariant equation such as the Klein-Gordon equation or the Dirac equation. While these theories were successful in explaining many experimental results, they had certain unsatisfactory qualities stemming from their neglect of the relativistic creation and annihilation of particles. A fully relativistic quantum theory required the development of quantum field theory, which applies quantization to a field rather than a fixed set of particles. The first complete quantum field theory, quantum electrodynamics, provides a fully quantum description of the electromagnetic interaction.
The full apparatus of quantum field theory is often unnecessary for describing electrodynamic systems. A simpler approach, one employed since the inception of quantum mechanics, is to treat charged particles as quantum mechanical objects being acted on by a classical electromagnetic field. For example, the elementary quantum model of the hydrogen atom describes the electric field of the hydrogen atom using a classical Coulomb potential. This "semi-classical" approach fails if quantum fluctuations in the electromagnetic field play an important role, such as in the emission of photons by charged particles.
Quantum field theories for the strong nuclear force and the weak nuclear force have been developed. The quantum field theory of the strong nuclear force is called quantum chromodynamics, and describes the interactions of the subnuclear particles: quarks and gluons. The weak nuclear force and the electromagnetic force were unified, in their quantized forms, into a single quantum field theory known as electroweak theory.
It has proven difficult to construct quantum models of gravity, the remaining fundamental force. Semi-classical approximations are workable, and have led to predictions such as Hawking radiation. However, the formulation of a complete theory of quantum gravity is hindered by apparent incompatibilities between general relativity, the most accurate theory of gravity currently known, and some of the fundamental assumptions of quantum theory. The resolution of these incompatibilities is an area of active research, and theories such as string theory are among the possible candidates for a future theory of quantum gravity.
Applications of quantum theory
Quantum mechanics has had enormous success in explaining many of the features of our world. The individual behaviour of the subatomic particles that make up all forms of matter - electrons, protons, neutrons, and so forth - can often only be satisfactorily described using quantum mechanics. Quantum mechanics has strongly influenced string theory, a candidate for a theory of everything (see Reductionism). It is also related to statistical mechanics.
Quantum mechanics is important for understanding how individual atoms combine covalently to form chemicals or molecules. The application of quantum mechanics to chemistry is known as quantum chemistry. (Relativistic) quantum mechanics can in principle mathematically describe most of chemistry. Quantum mechanics can provide quantitative insight into ionic and covalent bonding processes by explicitly showing which molecules are energetically favorable to which others, and by approximately how much. Most of the calculations performed in computational chemistry rely on quantum mechanics.
Much of modern technology operates at a scale where quantum effects are significant. Examples include the laser, the transistor, the electron microscope, and magnetic resonance imaging. The study of semiconductors led to the invention of the diode and the transistor, which are indispensable for modern electronics.
Researchers are currently seeking robust methods of directly manipulating quantum states. Efforts are being made to develop quantum cryptography, which will allow guaranteed secure transmission of information. A more distant goal is the development of quantum computers, which are expected to perform certain computational tasks exponentially faster than classical computers. Another active research topic is quantum teleportation, which deals with techniques to transmit quantum states over arbitrary distances.
Philosophical consequences
Main article: Interpretations of quantum mechanics
Since its inception, the many counter-intuitive results of quantum mechanics have provoked strong philosophical debate and many interpretations. Even fundamental issues such as Max Born's basic rules concerning probability amplitudes and probability distributions took decades to be appreciated.
The Copenhagen interpretation, due largely to the Danish theoretical physicist Niels Bohr, is the interpretation of quantum mechanics most widely accepted amongst physicists. According to it, the probabilistic nature of quantum mechanics predictions cannot be explained in terms of some other deterministic theory, and does not simply reflect our limited knowledge. Quantum mechanics provides probabilistic results because the physical universe is itself probabilistic rather than deterministic.
Albert Einstein, himself one of the founders of quantum theory, disliked this loss of determinism in measurement. He held that there should be a local hidden variable theory underlying quantum mechanics and consequently the present theory was incomplete. He produced a series of objections to the theory, the most famous of which has become known as the EPR paradox. John Bell showed that the EPR paradox led to experimentally testable differences between quantum mechanics and local hidden variable theories. Experiments have been taken as confirming that quantum mechanics is correct and the real world cannot be described in terms of such hidden variables. "Loopholes" in the experiments, however, mean that the question is still not quite settled.
See the Bohr-Einstein debates
The Everett many-worlds interpretation, formulated in 1956, holds that all the possibilities described by quantum theory simultaneously occur in a "multiverse" composed of mostly independent parallel universes. This is not accomplished by introducing some new axiom to quantum mechanics, but on the contrary by removing the axiom of the collapse of the wave packet: all the possible consistent states of the measured system and the measuring apparatus (including the observer) are present in a real physical (not just formally mathematical, as in other interpretations) quantum superposition. (Such a superposition of consistent state combinations of different systems is called an entangled state.) While the multiverse is deterministic, we perceive non-deterministic behavior governed by probabilities, because we can observe only the universe, i.e. the consistent state contribution to the mentioned superposition, that we inhabit. Everett's interpretation is perfectly consistent with John Bell's experiments and makes them intuitively understandable. However, according to the theory of quantum decoherence, the parallel universes will never be accessible to us, making them physically meaningless. This inaccessibility can be understood as follows: once a measurement is done, the measured system becomes entangled with both the physicist who measured it and a huge number of other particles, some of which are photons flying away towards the other end of the universe. In order to prove that the wave function did not collapse, one would have to bring all these particles back and measure them again, together with the system that was measured originally. This is completely impractical, but even if one could theoretically do this, it would destroy any evidence that the original measurement took place (including the physicist's memory).
History
In 1900, the German physicist Max Planck introduced the idea that energy is quantized, in order to derive a formula for the observed frequency dependence of the energy emitted by a black body. In 1905, Einstein explained the photoelectric effect by postulating that light energy comes in quanta called photons. The idea that each photon had to consist of energy in terms of quanta was a remarkable achievement as it effectively removed the possibility of black body radiation attaining infinite energy if it were to be explained in terms of wave forms only. In 1913, Bohr explained the spectral lines of the hydrogen atom, again by using quantization, in his paper of July 1913 On the Constitution of Atoms and Molecules. In 1924, the French physicist Louis de Broglie put forward his theory of matter waves by stating that particles can exhibit wave characteristics and vice versa.
These theories, though successful, were strictly phenomenological: there was no rigorous justification for quantization (aside, perhaps, from Henri Poincaré's discussion of Planck's theory in his 1912 paper Sur la théorie des quanta). They are collectively known as the old quantum theory.
The phrase "quantum physics" was first used in Johnston's Planck's Universe in Light of Modern Physics.
Modern quantum mechanics was born in 1925, when the German physicist Heisenberg developed matrix mechanics and the Austrian physicist Schrödinger invented wave mechanics and the non-relativistic Schrödinger equation. Schrödinger subsequently showed that the two approaches were equivalent.
Heisenberg formulated his uncertainty principle in 1927, and the Copenhagen interpretation took shape at about the same time. Starting around 1927, Paul Dirac began the process of unifying quantum mechanics with special relativity by discovering the Dirac equation for the electron. He also pioneered the use of operator theory, including the influential bra-ket notation, as described in his famous 1930 textbook. During the same period, Hungarian polymath John von Neumann formulated the rigorous mathematical basis for quantum mechanics as the theory of linear operators on Hilbert spaces, as described in his likewise famous 1932 textbook. These, like many other works from the founding period, still stand and remain widely used.
The field of quantum chemistry was pioneered by physicists Walter Heitler and Fritz London, who published a study of the covalent bond of the hydrogen molecule in 1927. Quantum chemistry was subsequently developed by a large number of workers, including the American theoretical chemist Linus Pauling at Caltech and John Slater, into various theories such as molecular orbital theory and valence bond theory.
Beginning in 1927, attempts were made to apply quantum mechanics to fields rather than single particles, resulting in what are known as quantum field theories. Early workers in this area included Dirac, Pauli, Weisskopf, and Jordan. This area of research culminated in the formulation of quantum electrodynamics by Feynman, Dyson, Schwinger, and Tomonaga during the 1940s. Quantum electrodynamics is a quantum theory of electrons, positrons, and the electromagnetic field, and served as a role model for subsequent quantum field theories.
The theory of quantum chromodynamics was formulated beginning in the early 1960s. The theory as we know it today was formulated by Politzer, Gross and Wilczek in 1975. Building on pioneering work by Schwinger, Higgs and Goldstone, the physicists Glashow, Weinberg and Salam independently showed how the weak nuclear force and quantum electrodynamics could be merged into a single electroweak force.
Founding experiments
Thomas Young's double-slit experiment demonstrating the wave nature of light (c1805)
Henri Becquerel discovers radioactivity (1896)
Joseph John Thomson's cathode ray tube experiments (discovers the electron and its negative charge) (1897)
The study of black body radiation between 1850 and 1900, which could not be explained without quantum concepts.
The photoelectric effect: Einstein explained this in 1905 (and later received a Nobel prize for it) using the concept of photons, particles of light with quantized energy
Robert Millikan's oil-drop experiment, which showed that electric charge occurs as quanta (whole units), (1909)
Ernest Rutherford's gold foil experiment disproved the plum pudding model of the atom which suggested that the mass and positive charge of the atom are almost uniformly distributed. (1911)
Otto Stern and Walther Gerlach conduct the Stern-Gerlach experiment, which demonstrates the quantized nature of particle spin (1922)
Clinton Davisson and Lester Germer demonstrate the wave nature of the electron [see Note 1] in the electron diffraction experiment (1927)
Clyde L. Cowan and Frederick Reines confirm the existence of the neutrino in the neutrino experiment (1955)
Claus Jönsson's double-slit experiment with electrons (1961)
See also
Basics of quantum mechanics
Measurement in quantum mechanics
Quantum electrochemistry
Quantum chemistry
Quantum computers
Quantum information
Quantum field theory
Quantum thermodynamics
Theoretical chemistry
References
P. A. M. Dirac, The Principles of Quantum Mechanics (1930) -- the beginning chapters provide a very clear and comprehensible introduction
David Griffiths, Introduction to Quantum Mechanics, Prentice Hall, 1995. ISBN 0-13-111892-7 -- A standard undergraduate level text written in an accessible style.
Richard P. Feynman, Robert B. Leighton and Matthew Sands (1965). The Feynman Lectures on Physics, Addison-Wesley. Richard Feynman's original lectures (given at Caltech in early 1962) can also be downloaded as an MP3 file from www.audible.com[1]
Hugh Everett, Relative State Formulation of Quantum Mechanics, Reviews of Modern Physics vol 29, (1957) pp 454-462.
Bryce DeWitt, R. Neill Graham, eds, The Many-Worlds Interpretation of Quantum Mechanics, Princeton Series in Physics, Princeton University Press (1973), ISBN 069108131X
Albert Messiah, Quantum Mechanics, English translation by G. M. Temmer of Mécanique Quantique, 1966, John Wiley and Sons, vol. I, chapter IV, section III.
Richard P. Feynman, QED: The Strange Theory of Light and Matter -- a popular science book about quantum mechanics and quantum field theory that contains many enlightening insights that are interesting for the expert as well
Marvin Chester, Primer of Quantum Mechanics, 1987, John Wiley, N.Y. ISBN 0486428788
Hagen Kleinert, Path Integrals in Quantum Mechanics, Statistics, Polymer Physics, and Financial Markets, 3rd edition, World Scientific (Singapore, 2004) (also available online)
George Mackey (2004). The mathematical foundations of quantum mechanics. Dover Publications. ISBN 0486435172.
Griffiths, David J. (2004). Introduction to Quantum Mechanics (2nd ed.). Prentice Hall. ISBN 013805326X.
Omnes, Roland (1999). Understanding Quantum Mechanics. Princeton University Press. ISBN 0691004358.
J. von Neumann, Mathematical Foundations of Quantum Mechanics, Princeton University Press, 1955.
H. Weyl, The Theory of Groups and Quantum Mechanics, Dover Publications 1950.
Notes
Note 1: The Davisson-Germer experiment, which demonstrates the wave nature of the electron
External links
General:
A history of quantum mechanics
A Lazy Layman's Guide to Quantum Physics
Introduction to Quantum Theory at Quantiki
Quantum Physics Made Relatively Simple: three video lectures by Hans Bethe
Decoherence by Erich Joos
Course material:
MIT OpenCourseWare: Chemistry. See 5.61, 5.73, and 5.74
MIT OpenCourseWare: Physics. See 8.04, 8.05, and 8.06.
Imperial College Quantum Mechanics Course to Download
A set of downloadable tutorials on Quantum Mechanics, Imperial College
Spark Notes - Quantum Physics
FAQs:
Many-worlds or relative-state interpretation
Measurement in Quantum mechanics
A short FAQ on quantum resonances
Media:
Everything you wanted to know about the quantum world — archive of articles from New Scientist magazine.
"Quantum Trickery: Testing Einstein's Strangest Theory", The New York Times, December 27, 2005.
Philosophy:
Quantum Mechanics (Stanford Encyclopedia of Philosophy)
David Mermin on the future directions of physics
"Quantum Physics Quackery" by Victor Stenger, Skeptical Inquirer (January/February 1997).
Crank Dot Net's quantum physics page — "cranks, crackpots, kooks & loons on the net"
Hinduism & Quantum Physics