Classical physics
“Yes! Physics has given up. We do not know how to
predict what would happen in a given circumstance, and we believe now that it
is impossible, that the only thing that can be predicted is the probability of
different events.”
Richard Feynman {1}
Explaining the world through certain principles and mathematics is the essence of physics. We observe events in the world and try to understand their mechanisms through experiment; the data these experiments produce are then modelled by mathematics. A physicist, with great physical insight, then begins to formulate general principles which underpin not just a specific physical phenomenon but are deemed mechanisms of the whole natural world.
The principles of classical physics, made intelligible by Sir Isaac Newton, were long thought to be the principles which the whole natural world obeyed. The most significant of these is the principle of motion:
Eq{1}:   $\mathbf{F} = m\,\dfrac{d^{2}\mathbf{r}}{dt^{2}}$
Here the force acting on a mass m is the product of m and the second derivative of its displacement with respect to time t. The mass is viewed as a point particle in space, and bodies in general interact through mutual forces. Matter can be described in terms of its energy and momentum (E, p).
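For a free particle of mass m moving with velocity v, for instance, these classical quantities take their familiar forms:

$$ p = mv, \qquad E = \tfrac{1}{2}mv^{2} = \frac{p^{2}}{2m}. $$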
Classical physics also describes electric and magnetic phenomena, conveniently expressed in terms of E and B, which represent the electric and magnetic fields respectively. These two fields obey the following equation:
Eq{2}
It is Newtonian point-particle mechanics and classical electromagnetic wave theory which have enabled humans to describe a great number of physical phenomena. With time this has ingrained in us the idea that the world should be intrinsically deterministic, i.e. we should be able to predict the quantities in our equations with very definite values[1].
This assumption that determinism is intrinsic to the world will be shown to be false through the failure of classical physics to describe certain phenomena. As scientists, then, we must re-formulate the theory or arrive at a new theory built upon different foundations. It was the efforts of Planck, De Broglie, Einstein, Bohr, Born, Schrödinger, Heisenberg, Pauli and Dirac that enabled physics to overcome the limitations of classical mechanics and gradually arrive at a new theory of the world encompassing its small scales.
Breakdown of the Classical theory and origins of quantization
Classical theory struggles to describe the complicated phenomenon of black body radiation, which concerns the thermodynamics of the exchange of energy between radiation and matter. Classical theory supposes that this exchange of energy is continuous and that light of frequency ν can give up any amount of energy on absorption. Because there are far more high-frequency modes of radiation than low-frequency ones, and every frequency is assumed to carry the same average energy, classical theory also predicts that most of the energy should be emitted at high frequencies. Nature says this is not the case: a black body radiates predominantly at lower frequencies, and the emission falls away at high frequencies.
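To see the difficulty quantitatively: the classical Rayleigh-Jeans law assigns an average energy kT to every mode, so the spectral energy density grows as ν² and the total radiated energy diverges, the so-called ultraviolet catastrophe:

$$ u_{\mathrm{RJ}}(\nu, T) = \frac{8\pi\nu^{2}}{c^{3}}\,kT, \qquad \int_{0}^{\infty} u_{\mathrm{RJ}}(\nu, T)\,d\nu = \infty. $$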
In 1901 Planck formulated his law for the radiation rate of a black body by rejecting the assumption that the exchange of energy is continuous: the exchange of energy is discrete, which means radiation is quantized.
Eq{3}:   $u(\nu, T) = \dfrac{8\pi\nu^{2}}{c^{3}}\,kT \times \dfrac{h\nu/kT}{e^{h\nu/kT}-1}$
Eq{3} expresses the radiation rate as the classical Rayleigh-Jeans rate multiplied by a factor which is deduced from the quantization of radiation.
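A quick check of the limits of this quantum factor shows why it succeeds where classical theory fails: at low frequencies it tends to 1, recovering the Rayleigh-Jeans result, while at high frequencies it suppresses the emission exponentially:

$$ \frac{h\nu/kT}{e^{h\nu/kT}-1} \to 1 \quad (h\nu \ll kT), \qquad \frac{h\nu/kT}{e^{h\nu/kT}-1} \approx \frac{h\nu}{kT}\,e^{-h\nu/kT} \to 0 \quad (h\nu \gg kT). $$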
Eq{4}:   $E = h\nu = \hbar\omega$        Eq{5}:   $p = \hbar k = \dfrac{h}{\lambda}$
Here E represents the energy of a discrete packet of radiation, p is its momentum and $\hbar = h/2\pi$, where h is Planck's constant. The quantities ν and k are the frequency and wavenumber respectively and are wave quantities. Planck had to change an assumption of classical theory to fit experimental data; this was the first historical crack shown within the marble of classical physics. The reader may also notice the first indication of a wave-particle duality in a theory, i.e. the quantitative relationship between particle properties (E, p) and wave properties (ν, k).
In 1905 Albert Einstein produced a paper[2] on the photoelectric effect which, with its empirical verification, confirmed Planck's quantization. The photoelectric effect describes how a light beam incident on a metal surface causes electron emission from that surface. The kinetic energy, E, of an emitted electron is related to the energy of the incident quantized radiation by:
Eq{6}:   $E = h\nu - \phi$
Here φ represents the threshold work needed to detach an electron and hν is the energy of the photon which collided with the electron. A modern application of the same physics is the solar panel, which uses photons from the Sun (hence solar) to generate electrical energy. The photoelectric effect simply confirms Planck's hypothesis.
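As a rough illustrative calculation (the work function chosen here is merely a typical value, close to that of sodium), a photon of wavelength 400 nm carries about 3.1 eV, so from a metal with φ ≈ 2.3 eV the ejected electron leaves with roughly 0.8 eV of kinetic energy:

$$ h\nu = \frac{hc}{\lambda} \approx \frac{1240\ \text{eV nm}}{400\ \text{nm}} \approx 3.1\ \text{eV}, \qquad E = h\nu - \phi \approx 3.1\ \text{eV} - 2.3\ \text{eV} = 0.8\ \text{eV}. $$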
Bohr in later years used quantization to describe how electrons exist within the hydrogen atom; he showed that electrons occupy discrete energy levels. If an electron moves to a lower energy state it loses energy, and it does so by emitting a particle. This particle was shown to be a photon with frequency ν.
Eq{7}:   $h\nu = E_{i} - E_{f}$
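Using the Bohr energy levels of hydrogen, $E_{n} = -13.6\ \text{eV}/n^{2}$, the transition from n = 2 to n = 1, for example, releases an ultraviolet photon:

$$ h\nu = E_{2} - E_{1} = -3.4\ \text{eV} - (-13.6\ \text{eV}) = 10.2\ \text{eV}, \qquad \lambda = \frac{hc}{h\nu} \approx \frac{1240\ \text{eV nm}}{10.2\ \text{eV}} \approx 122\ \text{nm}. $$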
We have now established the particle aspect of radiation and the need for it. This exposes a falsehood within classical theory and restricts its validity to the motion of macroscopic objects (including gases).
Electron diffraction, Wave-particle duality, Probability and Bohr’s
Complementarity
Davisson and Germer[3] (and also Sir George Paget Thomson) revealed the wave nature of electrons through the diffraction patterns of a beam of electrons reflected from the surface of a nickel crystal. The diffraction pattern persisted even when the electrons went through one at a time. This showed that small-scale particles can exhibit wave phenomena such as diffraction. De Broglie suggested that the electron could be described by the same equations which relate the particle and wave aspects of radiation. Davisson, De Broglie and Thomson received the Nobel prize in physics (De Broglie in 1929, Davisson and Thomson in 1937).
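As an illustrative figure, an electron accelerated through about 54 V (the energy used in the Davisson-Germer experiment) has a de Broglie wavelength comparable to the atomic spacing in a nickel crystal, which is why diffraction can be observed at all:

$$ \lambda = \frac{h}{\sqrt{2m_{e}E}} = \frac{6.63\times10^{-34}\ \text{J s}}{\sqrt{2\,(9.11\times10^{-31}\ \text{kg})(54\times1.60\times10^{-19}\ \text{J})}} \approx 1.7\times10^{-10}\ \text{m}. $$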
The reader may be slightly puzzled by the fact that there are two views of matter on the small scale, both empirically correct in certain situations. A paradox has arisen, but a solution exists; it requires us to change our fundamental viewpoint of matter yet again and to change our conceptual thinking about the wave.
Consider Young's interference experiment (see figure 2), which consists of a source of light, S, which passes through a lens, L. The beam then goes through two slits, A and B, and produces an interference pattern of bright and dark fringes on a sheet. Let us change the sheet to a photoelectric emitter, P, such that electrons will be emitted if light of the threshold frequency is incident on it. We witness wave-particle duality in such an experiment: in order to explain the bright and dark fringes we must describe light as a wave, but to understand the electron emissions we must describe light as a particle.
Figure 2
The locations of the bright and dark fringes on P depend upon the separation between A and B. A photon localized enough to eject a single electron from P could not go through both A and B. If one places photon detectors at A and B, one only ever finds whole photons or the absence of a photon, never partial photons. The question of how a photon passing through A ‘could know’ about the presence of B is perplexing.
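The dependence on the slit separation can be made explicit with the standard two-slit result: writing d for the separation of A and B and D for the distance from the slits to P (labels introduced here only for illustration), bright fringes appear where the path difference is a whole number of wavelengths, giving a fringe spacing

$$ d\sin\theta = n\lambda, \qquad \Delta y \approx \frac{\lambda D}{d} \quad (\text{small angles}). $$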
One possibility is that photons through A and photons through B act on each other in such a way as to produce the interference pattern at P. However, this is incorrect: if we decrease the intensity of the beam so as to allow only one photon at a time to pass through either A or B, the pattern still exists!
This experiment highlights the unpredictability of the destiny of any given photon. It will appear on a bright fringe at P, but one cannot know which fringe. Furthermore, the intensity distribution over the fringes (see figure 2) serves as a probability distribution for a photon arriving there. This distribution does not allow for an exact prediction of where the emitted electron from P will show up.
Here we find a resolution to the wave-particle duality paradox. Instead of thinking of a wave of radiation which propagates through real space, we interpret the wave function as a probability-type function: the wave intensity is the probability density of finding a photon at a particular location.
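In symbols, this is Born's statistical interpretation: if ψ(x) is the wave function, the probability of finding the particle between x and x + dx is

$$ P(x)\,dx = |\psi(x)|^{2}\,dx, \qquad \int_{-\infty}^{\infty} |\psi(x)|^{2}\,dx = 1. $$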
It is also possible to formulate quantum mechanics by beginning with the classical wave theory and quantizing its equations. This approach is known as quantum-mechanical field theory and its formulation owes much to Paul Dirac.
In the experiment in figure 2 the wave that falls on P is not a pure plane wave. It can, however, be separated into plane waves. It is important to consider the state of a particle whose wave function is a superposition of two or more plane waves. Such a wave function of a particle is:

Eq{8}:   $\psi(x,t) = A_{1}e^{i(k_{1}x - \omega_{1}t)} + A_{2}e^{i(k_{2}x - \omega_{2}t)}$

This is in agreement with De Broglie's equation. The probability density is therefore the square of the absolute value of ψ:
Eq{9}:   $|\psi|^{2} = |A_{1}|^{2} + |A_{2}|^{2} + 2A_{1}A_{2}\cos[(k_{1}-k_{2})x - (\omega_{1}-\omega_{2})t]$   (for real amplitudes $A_{1}, A_{2}$)
Notice that at the end of the equation we have the interference term, which is localized (not yet averaged over all space). However, if Eq{9} is averaged over all space the interference term averages to zero, and the square of the absolute value of ψ is then a measure of the probability of finding the particle somewhere, without regard to location. It must also be said that the position and momentum of a particle are uncertain when it is described by Eq{8}.
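To see the averaging explicitly: at a fixed time the cross term in Eq{9} oscillates in x, so its average over a region much larger than its wavelength vanishes (for $k_{1} \neq k_{2}$ and any constant phase φ):

$$ \lim_{L\to\infty}\frac{1}{2L}\int_{-L}^{L}\cos[(k_{1}-k_{2})x - \varphi]\,dx = 0. $$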
Note also that the equations we have just written down are wave functions prior to observation. If the particle is observed in some restricted region of space, then this observation has disturbed the quantum-mechanical system and hence its wave function. This is called the collapse of the wave function: after the observation, the wave function corresponds to the observed value.
If the characteristic wavelength of a particle related to the particle's momentum is given by:
Eq{10}:   $\lambda = \dfrac{h}{p}$
Then a particle which is localised in a defined region of
space must have a spread of momenta. It can be deduced that the smaller the
region of space the greater is the spread of wavelengths and hence momenta.
This is an example of the idea of complementary observables in quantum
mechanics in which an exact specification of the value of one quantity can be
obtained only at the expense of uncertainty of the other complementary
quantity.
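A Gaussian wave packet makes this trade-off concrete: a packet whose position spread is Δx is built from a spread of wavenumbers Δk = 1/(2Δx), so by the relation p = ħk the momentum spread grows as the packet is narrowed:

$$ \psi(x) \propto e^{-x^{2}/4(\Delta x)^{2}} \;\Longrightarrow\; \Delta k = \frac{1}{2\,\Delta x}, \qquad \Delta p = \hbar\,\Delta k = \frac{\hbar}{2\,\Delta x}. $$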
This is the basis of the complementarity principle formulated by Bohr, which states that a physical system can only be described in terms of an imprecise specification of a pair of complementary quantities. This is expressed as a commutation relation:
Eq{11}:   $[\hat{x}, \hat{p}] = \hat{x}\hat{p} - \hat{p}\hat{x} = i\hbar$
where $\hat{x}$ and $\hat{p}$ are the position and momentum observables respectively.
The uncertainty Δx in the position and the uncertainty Δp in the momentum of a particle are related. It can be shown that this uncertainty relation is a direct consequence of Bohr's complementarity principle as expressed in Eq{11}. The general form of this uncertainty relation, for any wave function which specifies the position and momentum of a particle, is:
Eq{12}:   $\Delta x\,\Delta p \geq \dfrac{\hbar}{2}$
This means that if the position of a particle is known very precisely, then as a consequence the momentum of the particle will be uncertain by a degree determined by Δx and $\hbar = h/2\pi$.
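As a rough numerical illustration (the confinement length here is simply chosen to be atomic in size), an electron confined to Δx ≈ 1 Å must have a momentum spread of at least

$$ \Delta p \geq \frac{\hbar}{2\,\Delta x} = \frac{1.05\times10^{-34}\ \text{J s}}{2\times10^{-10}\ \text{m}} \approx 5\times10^{-25}\ \text{kg m s}^{-1}, $$

which corresponds to a velocity spread of roughly $6\times10^{5}$ m/s.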
The atomic and subatomic world implied by these principles and equations goes against all our intuitions about nature. Why should it adhere to our intuitions, however? We as biological beings have evolved to function in the macroscopic, everyday world; why should we assume nature conforms to this evolution? To me, the strangeness and the probabilistic nature of quantum mechanics are exciting. They show the complexity and ingenuity of nature, as if we were trying to find the rules to a grand game, one in which, as quantum mechanics shows, we play a part.
“If I have seen further than others, it is by
standing upon the shoulders of giants.”
Isaac Newton {2}
The Solvay Conference 1927, which shows the likes of Einstein (centre), Dirac, Bohr, Heisenberg, Pauli, Schrödinger, Lorentz and many more great minds.
Bibliography
Feynman{1} – Feynman, Leighton, Sands: The Feynman Lectures on Physics, Volume 1, Addison-Wesley Publishing Company, Inc., first printed 1965.
Eqs{1-12} – P. T. Matthews, F.R.S.: Introduction to Quantum Mechanics (Chapters I-III), McGraw-Hill Publishing Company Limited, 1974 / R. H. Dicke, J. P. Wittke: Introduction to Quantum Mechanics, Addison-Wesley Publishing Company, Inc. / P. W. Atkins: Molecular Quantum Mechanics, second edition (1: Historical Introduction), Oxford University Press.
Figure 2 – R. H. Dicke, J. P. Wittke: Introduction to Quantum Mechanics, Addison-Wesley Publishing Company, Inc. The actual drawing was done by Luke Kristopher Davis in Microsoft Paint.
Isaac Newton{2} – James Gleick: Sir Isaac Newton (biography).
General references (which aided my general understanding and knowledge) – Jagdish Mehra, Helmut Rechenberg: The Historical Development of Quantum Theory, Springer, Volume 6, Part 1 / P. T. Matthews, F.R.S.: Introduction to Quantum Mechanics (Chapters I-III), McGraw-Hill Publishing Company Limited, 1974 / R. H. Dicke, J. P. Wittke: Introduction to Quantum Mechanics, Addison-Wesley Publishing Company, Inc. / P. W. Atkins: Molecular Quantum Mechanics, second edition (1: Historical Introduction), Oxford University Press / Stehle: Quantum Mechanics / http://cvitae.org/images/stories151/history_of_quantum.pdf (thank you, author) / http://nobelprize.org/nobel_prizes/physics/ / Max Born: The Statistical Interpretation of Quantum Mechanics / Murray Gell-Mann: The Quark and the Jaguar (popular science).
[1] Of course this strict determinism fails within molecular dynamics (thermodynamics), as we have to take averages of certain temperatures and densities. However, this statistical mechanics is due to our inability to specify all the positions and momenta of small particles; the statistics are not assumed to be intrinsic to the molecular system.
[2] This paper won Albert Einstein the Nobel Prize in Physics in 1921.
[3] C. Davisson and L. Germer, Nature 119, 558 (1927).