Thursday, March 22, 2007

The Pulse of the Universe


Robert Millikan was born on this day in 1868 in Morrison, Illinois.

Robert Millikan was at the forefront of the awakening of U.S. physics around the turn of the 20th century, both as a researcher and educator. While a student at Oberlin College, with no previous experience he was pressed into service by his Greek professor to teach physics; when he protested that he didn't know anything about physics, his Greek professor simply insisted that anyone who could master Greek could master physics -- such was the American view of physics as a static, classical discipline at the end of the 19th century.

Nevertheless, the advice propelled Millikan into his life's work. He later studied at Columbia, graduating in 1895 as the school's first physics Ph.D., and went to Germany to study with Max Planck. He returned to the U.S. in 1896 to teach physics at the University of Chicago, where he devoted half of his time to research. In 1909, Millikan succeeded in measuring electric charge by suspending tiny charged oil drops in an electric field, balancing the electric force against gravity. He discovered that electric charge can only be found in integer multiples of a fundamental "piece" of charge (the elementary charge, e), thus demonstrating the atomic nature of electricity. For this work, in 1923 Millikan became only the second American to win the Nobel Prize for Physics (the first being the chair of his department at Chicago, Albert A. Michelson).
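The quantization Millikan observed can be illustrated with a short sketch -- not his actual procedure, and the droplet charges below are hypothetical values, in units of 10^-19 coulombs. Candidate values of e are scanned, and the one that best explains every measured charge as an integer multiple wins:

```python
def quantization_error(charges, e):
    """Mean squared distance from each charge to its nearest multiple of e."""
    return sum((q - round(q / e) * e) ** 2 for q in charges) / len(charges)

def estimate_elementary_charge(charges, lo=1.0, hi=2.0, steps=10000):
    """Scan candidate values of e in [lo, hi] and keep the best-fitting one."""
    candidates = (lo + i * (hi - lo) / steps for i in range(steps + 1))
    return min(candidates, key=lambda e: quantization_error(charges, e))

# Hypothetical droplet charges, each an integer multiple of 1.6 (x 10^-19 C)
drops = [3.2, 4.8, 1.6, 6.4, 8.0, 4.8]
e_hat = estimate_elementary_charge(drops)
print(round(e_hat, 2))  # → 1.6
```

With noisy real measurements the same idea works, though Millikan's own analysis was done drop by drop, by hand.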

Between 1912 and 1915, Millikan experimentally verified Albert Einstein's predictions about the photoelectric effect, providing a photoelectric determination of the value of Planck's constant (h) in the process. For the first time in the 20th century, through Millikan's work, American physics was having an impact on the leading-edge crosscurrents of European physics. In the 1920s, he began to study the region of the spectrum between the ultraviolet and X-radiation, extending knowledge of the ultraviolet spectrum to far shorter wavelengths than had previously been observed. He also discovered a law of motion governing a particle falling toward the Earth after entering the atmosphere, which led to his study of radiation in the atmosphere, previously thought to result from radioactive elements on Earth.

In 1927, Millikan appeared on the cover of Time as the man who had "detected the pulse of the universe" after he coined the phrase "cosmic rays" (to the delight of sci-fi writers everywhere) to describe radiation in the atmosphere which could only come from outer space, as his experiments demonstrated. Initially, Millikan believed that these rays were "birth cries" (i.e. energy) from infant atoms being formed in outer space, perhaps by the Creator who was "continually on his job," but he later retreated from the position as the scientific community failed to rally to the hypothesis. Nevertheless, Millikan was passionate about harmonizing science and religion, and was a frequent speaker on the subject; some critics joked that they couldn't tell the difference between the two by the time Millikan was through with them.

America's most popular scientist, he was particularly admired by the U.S. industrial community, much of which applauded the anti-New Deal, Herbert Spencer-style determinism which laced his speeches, a circumstance which served him well in raising funds for the California Institute of Technology as head of the physics department from 1921 to 1946. He died on December 19, 1953 in San Marino, California.


Monday, February 19, 2007

With the Sun at its Center


For 14 centuries, the prevailing view of the workings of the Earth, the Sun, the planets and the stars was the one articulated by the Greek astronomer Ptolemy: that the Earth was a stationary object, and that all other heavenly bodies revolved around it in uniform circular motion. The Ptolemaic model was happily embraced by the Roman Catholic Church as it sought to portray the Creation of Man on Earth as God's masterpiece, the center of God's universe. By the 15th century, European scientists and mathematicians -- who were generally not atheists, but aesthetes who yearned for glimpses of beauty at the nexus of time and space -- grew curious about the troublesome phenomena that the Ptolemaic model seemed to gloss over without easy explanation, such as the fact that a planet like Mars at certain times looks brighter and closer to the Earth than at others, and occasionally even appears to reverse its course across the sky. Over the years, scientists added a plethora of minor amendments to the Ptolemaic model to attempt to explain the observable eccentricities, but Ptolemy's simple, elegant model was beginning to look like a Rube Goldberg invention.

Mikolaj Kopernik (later known by his latinized name, Copernicus) grew up during this time of unresolved skepticism, the son of a merchant. Born on this day in 1473 in Torun, Poland, Copernicus lost his father at the age of 10 and was sent to be raised by his maternal uncle, Lucas Watzenrode, the bishop of Ermeland. The bishop set young Mikolaj on a course to become a church canon. He studied liberal arts (including astronomy and astrology) at Cracow before setting off to his uncle's alma mater, the University of Bologna, at 23. There he lived for a time in the household of Bologna's foremost astronomer, Domenico Maria de Novara, who introduced Copernicus to the critical and skeptical literature, including Regiomontanus' Epitome of Ptolemy's Almagest (1496) and Pico della Mirandola's scathing Disputations Against Divinatory Astrology (1496), which argued that one of the flaws of astrology was that no one could agree on the order of the planets circling the Earth.

With dissent already in the air, the exploration of the new hemisphere of the Earth incidentally discovered by Columbus began to call into question all sorts of fundamental assumptions about the order and primacy of things, and encouraged scientists to consider anew the incompleteness of their knowledge. Meanwhile, Copernicus kept busy: he received a doctorate in canon law at Ferrara in 1503; studied medicine at Padua; served as a scholar in absentia at Wroclaw while assuming the duties of canon at Frauenberg (which involved general administration and occasionally practicing medicine); prepared a Latin translation of the letters of the Byzantine writer Theophylactus Simocattes (published in 1509); and still managed to pursue his astronomical observations in his spare time, building a small tower at Frauenberg from which to observe the sky.

His reputation as an amateur astronomer was great enough, however, for Copernicus to be invited in 1514 to the Fifth Lateran Council to make recommendations on the reform of the calendar. By this time he had begun to articulate a theoretical critique of the Ptolemaic model, quietly showing a summary of his views to a few friends. As a good canon lawyer, he must have predicted that the Church would not be happy with his theories; but after his 25-year-old admirer Georg Rheticus published a summary of Copernicus' ideas (the Narratio prima, 1540) without provoking the Church's anger (in all likelihood they were a bit busy with Martin Luther at the time), Copernicus turned to completing a full dissertation on the heliocentric ("Sun-centered") model.

In what would come to be known as On the Revolutions of the Heavenly Spheres, Copernicus argued that the Ptolemaic model, as amended, was like a Mr. Potato Head in which the arms, legs, nose, eyes, ears and mouth were all placed in the wrong holes (to paraphrase his paraphrase of Horace's Ars poetica). By contrast, Copernicus wrote, if one assumes that the Sun is a stationary midpoint and the Earth is in motion, by a relatively simple set of calculations the remaining planets fall into orderly orbits around the Sun -- orbits whose periods increase with a planet's distance from the Sun. Thus, Mercury would circumnavigate the Sun in 88 days; Venus, in 225 days; Earth, in one year; Mars, in 1.9 years; Jupiter, in 12 years; and Saturn, in 30 years.
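Those periods can be checked against the regularity Kepler would later extract from Copernican astronomy: T² = a³, with the period T in years and the orbital radius a in astronomical units. The semi-major axes below are modern values, not Copernicus' own figures:

```python
planets = {            # semi-major axis in AU (modern values)
    "Mercury": 0.387,  # ~88 days
    "Venus":   0.723,  # ~225 days
    "Earth":   1.000,  # 1 year
    "Mars":    1.524,  # ~1.9 years
    "Jupiter": 5.203,  # ~12 years
    "Saturn":  9.537,  # ~30 years
}

for name, a in planets.items():
    period_years = a ** 1.5   # Kepler's third law: T = a^(3/2)
    print(f"{name}: {period_years:.2f} yr")
```

The computed periods match the figures in the text: Mercury comes out near 88 days, Mars near 1.9 years, Saturn near 30 years.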

While he admitted that he could not rule out other alternatives to the Ptolemaic model -- it would take later mathematicians to solve the problems of falling bodies, acceleration and force, and ultimately to prove that Copernicus was correct -- at least he had constructed one handsome Potato Head, in which all of the pieces fit together into a balanced, harmonious whole.

Copernicus avoided the potential wrath of the Church by waiting 36 years to publish Revolutions; legend has it that on his deathbed he was able to hold the freshly minted first edition in his hands before passing away on May 24, 1543 in Frauenberg (now Frombork, Poland). The cause and its implications would be taken up, in turn, by Galileo, Kepler, Descartes and Newton, among others. To Galileo goes the prize for making Copernicus a posthumously dangerous thinker: in the wake of Galileo's public advocacy of heliocentrism, the Church banned Copernicus' Revolutions in 1616. The Church would finally lift its ban in 1835.


Monday, November 13, 2006

Maxwell's Equations


"The most significant event of the 19th century will be judged as Maxwell's discovery of the laws of electrodynamics." - Richard Feynman.

A shy, somewhat dull child who earned the nickname "Dafty" while at Edinburgh Academy, James Clerk Maxwell -- born on this day in 1831 in Edinburgh -- had an intense curiosity about the mechanics of everyday objects and, later, much ahead of his schoolmates, developed an appreciation for the power of mathematical models. At 14 he wrote a paper on a method of drawing ellipses using pins and thread which was published by the Royal Society of Edinburgh (not an entirely new idea -- the mathematical basis for it had been proposed by Descartes -- but a remarkable adaptation for a 14-year-old).

He read Newton at the University of Edinburgh and in 1850 went to Cambridge, where he met the cream of Britain's young scientists; they found him eccentric and a bit difficult to follow as he jumped excitedly from topic to topic, but nonetheless they seemed to recognize his intellectual gifts. After graduation in 1854, he went to teach at Marischal College in Aberdeen, where he studied the rings of Saturn and described them as being composed of numerous small solid particles (that being the best mathematical explanation for their stability), a description which was verified by NASA's Voyager probe in 1980. He married the daughter of the principal at Marischal in 1859, but that did not save his job as a junior professor when Marischal merged with King's College Aberdeen the following year. He managed to obtain the chair of natural philosophy at King's College London shortly thereafter, where he did his most important work.

Before Maxwell's work on electromagnetism, predecessors such as Michael Faraday had developed a sophisticated understanding of the circumstances in which electrical induction could occur and how to produce it, but they did not have a mature or very clear picture of the shape of electricity and its movements. Maxwell replaced the old "machine-like" models of electricity (put it in here and it comes out there) with a mathematical model which could predict electrical phenomena. Before a mystified Royal Society meeting in 1864, Maxwell read his "Dynamical Theory of the Electromagnetic Field," unveiling the equations which comprise the basic laws of electromagnetism and showing that an oscillating electric charge sends electromagnetic waves through space.

From his work, Maxwell could predict the existence of the whole invisible spectrum of electromagnetic frequencies, including radio waves, microwaves, infrared and ultraviolet waves, X-rays and gamma rays, as well as pinpointing the speed of electromagnetic waves at about 300,000 kilometers per second -- so close to the speed of light as to suggest that light itself was an electromagnetic disturbance. Maxwell's equations, which finally appeared in their most developed form in A Treatise on Electricity and Magnetism (1873), were puzzling to scientists of the time; but after Maxwell's death, Heinrich Hertz (at the urging of Hermann von Helmholtz) confirmed Maxwell's theory of electromagnetism in a series of experiments which measured electromagnetic waves and showed how they behaved like light.
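That "pinpointing" can be reproduced in a few lines: Maxwell's equations imply waves travelling at 1/√(μ₀ε₀), and plugging in the electric and magnetic constants (modern values below, not Maxwell's 19th-century measurements) yields the speed of light:

```python
import math

mu0  = 4 * math.pi * 1e-7   # vacuum permeability, N/A^2 (classical value)
eps0 = 8.8541878128e-12     # vacuum permittivity, F/m

# Wave speed predicted by Maxwell's equations: c = 1 / sqrt(mu0 * eps0)
c = 1 / math.sqrt(mu0 * eps0)
print(f"{c:.0f} m/s")       # ≈ 299,792,458 m/s -- the speed of light
```

The agreement between a purely electromagnetic calculation and the measured speed of light was Maxwell's clue that light is an electromagnetic wave.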

The verification of Maxwell's equations, then, became an important first step in the development of 20th century atomic physics; on the practical side of things, Maxwell's equations are used today in the design of everything from integrated circuits to cellular phones to predict and reduce levels of electromagnetic interference. Applying a similar theoretical basis to the study of gases, Maxwell also shares credit for the "Maxwell-Boltzmann" kinetic theory of gases: working independently of Ludwig Boltzmann, Maxwell showed that temperatures and heat were manifestations of molecular movement, and Maxwell attempted to describe such movement mathematically.
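The link between temperature and molecular motion can be made concrete with the root-mean-square speed that follows from the Maxwell-Boltzmann distribution, v_rms = √(3kT/m). The example below (nitrogen at room temperature) is an illustration with assumed values, not a calculation from Maxwell's own papers:

```python
import math

k = 1.380649e-23           # Boltzmann constant, J/K
T = 293.0                  # room temperature, K
m = 28.0 * 1.66054e-27     # mass of an N2 molecule (28 atomic mass units), kg

# Root-mean-square molecular speed: v_rms = sqrt(3kT/m)
v_rms = math.sqrt(3 * k * T / m)
print(f"{v_rms:.0f} m/s")  # ≈ 511 m/s
```

Air molecules at room temperature move at roughly the speed of a rifle bullet -- heat, as Maxwell showed, is just this motion.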

At the time of his death on November 5, 1879 of abdominal cancer (at age 48, the same age at which his mother had died of the same disease), most scientists were fairly sure Maxwell was onto something, but only time would show the true extent of his brilliance and influence.


Thursday, September 21, 2006

The Gentleman of Absolute Zero


Physicist Heike Kamerlingh Onnes, known as the "Gentleman of Absolute Zero," was born on this day in 1853 in Groningen, Netherlands.

After studying at Groningen and later with Gustav Kirchhoff, Kamerlingh Onnes began teaching at Leiden in 1882 and took up his intense preoccupation with the behavior of matter at extremely low temperatures. At Leiden he established the Cryogenic Laboratory, which became the worldwide center of low-temperature research using liquid-helium facilities of his own design; he even set up a school for glass-blowers to train them to make the special flasks he needed. In fact, in 1908 Kamerlingh Onnes became the first person to liquefy helium, bringing it to a temperature of 4.2 degrees above absolute zero, the complete absence of heat.

In 1911, Kamerlingh Onnes discovered the phenomenon of superconductivity while studying how the electrical resistance of metals varied at very low temperatures. Contemporary theories disagreed about whether resistance would dwindle gradually or climb toward a maximum near absolute zero; Kamerlingh Onnes was surprised to find that the resistance of certain metals, such as mercury, instead drops abruptly to zero below a critical temperature near absolute zero. Although he could not explain the phenomenon using classical physics, he was aware of its significance: if superconductivity could be achieved in materials at higher temperatures, the result could be a powerful means of transmitting and storing energy. In 1957, John Bardeen, Leon Cooper and Robert Schrieffer proposed a theoretical explanation for superconductivity (the BCS theory) grounded in quantum mechanics, and the scientific community has since spent considerable resources attempting to harness superconductivity for levitating trains and for medical and nuclear applications.

For liquefying helium, Kamerlingh Onnes won the Nobel Prize for Physics in 1913. He also demonstrated the effect of paramagnetic saturation -- the parallel positioning of elementary atomic magnets in a very high magnetic field at low temperatures -- proving the magnetic material theory of Paul Langevin. In addition to his theoretical work, Kamerlingh Onnes also worked on designs for the practical use of refrigeration for food storage and transport. He died on February 21, 1926 in Leiden, South Holland.

His near-obsession with low temperature work led him to have extraordinarily high expectations for his assistants, and many of them viewed him as a tyrant around the lab. At his funeral, his lab assistants were walking behind his hearse when the hearse driver began to speed up, causing several of them to break into a run to keep up with it. "The old devil," observed one of them, "even after he’s gone he makes us run."


Thursday, June 08, 2006

The Secret of Life


Physicist and molecular biologist Francis Crick, co-discoverer of the double helix structure of DNA with James Dewey Watson (1953) and co-winner of the 1962 Nobel Prize for Physiology or Medicine, was born on this day in 1916 near Northampton, England.

The son of a shoe salesman, Francis Crick studied physics at University College London, and during World War II designed non-contact "magnetic" mines for the British navy. After the war, influenced by his own long-held atheism and by Schrodinger's What is Life?, Crick decided to immerse himself in the "living-nonliving borderline" of biology to study the basis of life, taking up the call of Linus Pauling, who just after the war had boasted about the prospective role of structural chemistry in understanding the biological microworld.

Crick went to Cambridge and eventually joined Max Perutz's X-ray crystallography lab which was then investigating the structure of proteins. Although it was thought that the genetic material in a cell was a protein and that a polymer, deoxyribonucleic acid (DNA), might be a factor in the manufacture of proteins, little insight had been gained into why this polymer was particularly suited to replication.

Meanwhile, in 1951, young Jim Watson, an American biologist, arrived at Perutz's lab, and the two became fast friends and office-mates; apart from both having read Schrodinger, they shared a ruthless, somewhat arrogant sense of the same mission -- figuring out the relationship between the structure of DNA and its function in molecular genetics. Relying upon the experimental work of others (including the X-ray photographs of DNA taken by Rosalind Franklin in Maurice Wilkins' lab at King's College London), Crick and Watson began to build models of the DNA molecule out of cardboard, wire and beads. In February 1953, Watson noticed the complementary shapes of adenine-thymine and guanine-cytosine, the base pairs at the heart of DNA, and together Crick and Watson worked out a picture of DNA as a ladder-like double helix, founded upon what was known through experiments, showing two chains of molecules linked by hydrogen bonds.

Watson says Crick went to the Eagle Pub at the end of the day and announced at the top of his lungs that he had discovered the secret of life; Crick's wife said he was always saying things like that, so she didn't pay him any particular attention. Crick and Watson published a short paper on their hypothesis in Nature (Apr. 1953), and in it noted, "It has not escaped our notice that the specific pairing we have postulated immediately suggests a possible copying mechanism for the genetic material."
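The "possible copying mechanism" follows directly from the pairing rules Watson spotted: each base determines its partner, so either strand of the helix specifies the other. A minimal sketch:

```python
# Watson-Crick base pairing: adenine with thymine, guanine with cytosine
PAIRS = {"A": "T", "T": "A", "G": "C", "C": "G"}

def complement(strand):
    """Return the complementary DNA strand for a sequence of bases."""
    return "".join(PAIRS[base] for base in strand)

print(complement("ATGC"))  # → TACG
```

Complementing twice recovers the original strand -- which is exactly why splitting the helix and rebuilding each half copies the molecule.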

Watson left Cambridge shortly thereafter, but Crick stayed on to become a dominant force in molecular biology, predicting the discovery of transfer-RNA as a mechanism for the manufacture of proteins within a cell and postulating the "central dogma" of molecular genetics (since complicated by the behavior of certain viruses, but still largely revered as a key principle): that genetic information flows only one way -- from DNA to RNA to protein. In 1976, Crick moved to the Salk Institute for Biological Studies in La Jolla, California to study consciousness and the brain. In The Astonishing Hypothesis (1994), his materialist habits of mind revealed themselves again as he theorized an electrophysical basis of human consciousness. He died July 29, 2004 in San Diego, California.


Tuesday, March 14, 2006

Einstein


"The most beautiful thing we can experience is the mysterious. It is the source of all true art and all science. He to whom this emotion is a stranger, who can no longer pause to wonder and stand rapt in awe, is as good as dead: his eyes are closed." -- A. Einstein.

Among 20th century scientists, Albert Einstein has probably spawned the most virulent worldwide cult following, and he has left a rich folklore of life legends -- but unlike the heroes of some cults, this result somehow seems appropriate, given the way in which Einstein has forever altered our notions of space and time.

Born on this day in 1879 in Ulm, Germany, Einstein as a child was fascinated by his father’s demonstration of a compass, with its needle that seemed to have a mind of its own, and by a little book on Euclidean plane geometry -- both of which gave him a sense of the deeply hidden forces defining physical reality. Einstein’s father had a none-too-successful electrical business, first in Munich and later in Milan. While in Milan at the age of 16, free from the restrictive environment of formal schooling, Einstein taught himself calculus and higher mathematics, and later entered the Swiss Federal Polytechnic in Zurich.

He settled in Switzerland after school, taught mathematics and physics, and then became a patent examiner at the Swiss Patent Office, during which time he wrote several theoretical physics articles and obtained his Ph.D. from the University of Zurich. By 1909, as a young University of Zurich professor, he was regarded as a leading scientific thinker.

As early as 1905, however, while still isolated from the physics community in his job as a patent examiner, he had written three papers which made astounding creative advances in physics. The first, setting out the Special Theory of Relativity, was an overhaul of the classical principle of relativity, which held that the laws of physics must take the same form in any frame of reference. Einstein postulated that the parameters of motion, time and space could not be absolute because they are measured relative to the observer (alluding to the work on transformation equations done by H.A. Lorentz), yet he retained the assumption that the speed of light itself remains constant in all frames of reference, as James Clerk Maxwell had argued. Through the imaginative combination of disparate ideas proposed by other physicists, Einstein began his critique of generally accepted concepts of space and time, and created for the first time a unified theory which could explain the apparent contradiction between Maxwell’s electromagnetic theory and classical relativity theory.

In addition, these exercises led Einstein to posit that mass is a direct measurement for the energy contained in bodies (crystallized in his famous equation E=mc²), in that mass decreases proportionately if a body emits a certain energy. The Special Theory of Relativity turned out to be essential to our understanding of the interactions of atomic and subatomic particles.
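The scale of the equivalence is easy to appreciate with a one-gram example -- an illustration of the formula, not a calculation from Einstein's paper:

```python
c = 2.99792458e8   # speed of light, m/s
m = 0.001          # one gram of mass, in kg

# Mass-energy equivalence: E = m * c^2
E = m * c ** 2     # energy in joules
print(f"{E:.3e} J")  # ≈ 8.988e13 J, roughly the yield of 21 kilotons of TNT
```

The enormous factor c² is why tiny mass defects in nuclear reactions release such large energies.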

His second paper, on the quantum law of the emission and absorption of light, built on Max Planck’s quantum theory, which held that energy is not a continuous variable but rather a flow of distinguishable energy elements, or "quanta." Einstein applied quantum theory to the electromagnetic radiation of light and used it to explain anomalies in fluorescence and photo-ionization, taking the bold step of representing light as a stream of particles with a quantifiable energy value. (He would nonetheless remain skeptical of the probabilistic quantum mechanics that later grew out of such work, summing up his objections with the declaration that "God does not play dice.") The work is considered to be among Einstein’s most important, and in 1921 he won the Nobel Prize for it; the prize money went to his first wife, Mileva, under the terms of their divorce settlement.

In the third paper, Theory of Brownian Movement, Einstein provided a mathematical formula to predict the path and speed of minute particles suspended in liquid, confirming atomic motion as the cause for the movement of such particles by providing a "proof" of the existence of molecules.

He served in professorships in Prague, Zurich and Leiden before being appointed director of the Kaiser Wilhelm Institute for Physics in Berlin in 1914. In 1915 he published his General Theory of Relativity. An extension of the Special Theory, it postulated that gravity and the inertial force of a system in acceleration (the heavy feeling of your own weight pushing you against your seat when you accelerate quickly in a car, for instance) are indistinguishable. Leading from this premise, Einstein concluded that gravity was not simply a force in nature as Isaac Newton had argued, but a warping of space and time by physical mass. Because mass exists, space must be curved -- thus, Einstein predicted, light from a far-off star passing close by the Sun should be bent by the Sun’s great mass.
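The prediction is quantitative: for a ray grazing the Sun, general relativity gives a deflection angle of 4GM/(c²R), about 1.75 arcseconds. The solar values below are modern figures, used here only to reproduce the number:

```python
import math

G = 6.674e-11        # gravitational constant, N·m²/kg²
M = 1.989e30         # mass of the Sun, kg
R = 6.957e8          # radius of the Sun, m
c = 2.99792458e8     # speed of light, m/s

# General-relativistic deflection of light grazing the Sun: 4GM / (c^2 R)
deflection_rad = 4 * G * M / (c ** 2 * R)
arcsec = math.degrees(deflection_rad) * 3600
print(f"{arcsec:.2f} arcseconds")  # ≈ 1.75
```

This is twice the value a naive Newtonian calculation gives, which is why the 1919 eclipse measurements could decide between the two theories.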

Before the scientific community could react to the theory, World War I broke out, causing Einstein to quip, "If my theory is proved correct Germany will hail me as a great German, and the French will call me a citizen of the world. If it is proved false, the French will call me a German and the Germans will call me a Jew." In 1919, astronomers verified Einstein’s predictions regarding the gravitational effect on light, and thereupon Einstein became an instant celebrity; the headlines in the London Times shouted in large print "Revolution in Science -- New Theory of the Universe -- Newtonian Ideas Overthrown," and Einstein received the Copley Medal of the Royal Society and the Gold Medal of the Royal Astronomical Society.

The far-reaching implications of Einstein’s theory launched modern cosmology, providing an explanation for the expansion of the universe and foreshadowing the notion of black holes among other things. Following a heart attack prompted by overwork, in 1929 Einstein published two papers explaining his Unified Field Theory which attempted to unite the theories of gravitation and electromagnetism, but the experimental work which it spawned was inconclusive. He continued to hold out hope that a true model of reality, not merely of a probability of reality as provided by quantum theory (as advanced by Niels Bohr), could be formed; but as Bohr and quantum physics dominated the field, Einstein’s scientific influence declined somewhat.

With the rise of the Nazis in Germany, Einstein came under increasing fire. Lending an overripe vitality to Einstein’s predictions about German attitudes regarding his work, Adolf Hitler denounced Einstein as a Jew and insinuated that Einstein must have stolen his theories from a German Army officer who died in World War I. Finding his property confiscated and his books burned by the Nazis, Einstein emigrated to the U.S., settling at the Institute for Advanced Study in Princeton. In 1939, at the urging of Leo Szilard and others, Einstein wrote a letter to President Franklin Roosevelt advising him that the U.S. had the potential to build an atomic bomb, a project about which he had moral reservations but which he felt was necessary to defeat Hitler -- before the Germans could produce atomic weapons of their own. The letter resulted in the launching of the successful Manhattan Project under the direction of J. Robert Oppenheimer.

In later years, Einstein wrote about pacifism, political and moral topics, and in 1952, he was offered the presidency of Israel, but declined. Following the rupture of an aortic aneurysm, Einstein refused surgery, declaring "I want to go when I want. It is tasteless to prolong life artificially." He died on April 18, 1955 in Princeton, New Jersey.

With his wild shocks of gray hair, droopy mustache and benign countenance, in some ways Einstein’s persona has come to define the 20th century identity of the creative genius, the personification of all "Einsteins" who meditate on the Universe and articulate their visions. Einstein the 20th century folk hero can be felt in such diverse creations as Nicolas Roeg's 1985 film Insignificance (in which Theresa Russell, as a Marilyn Monroe-like character, demonstrates the theory of relativity to an Einstein-like character); Philip Glass' 1976 opera Einstein on the Beach; Yahoo Serious' 1988 slapstick farce Young Einstein (in which a stick-figure Einstein is made to fall in love with a fictionalized Marie Curie); an appearance in Bill & Ted's Bogus Journey (1991); Steve Martin's 1993 play about a fictional meeting between Picasso, Einstein and Elvis, Picasso at the Lapin Agile; and Walter Matthau's impersonation of Einstein-as-cupid in the lukewarm 1994 romantic comedy I.Q.


Wednesday, January 04, 2006

God said, Let Newton Be!


"Nature and Nature's Laws lay hid in Night/ God said, Let Newton be! and all was Light." -- Alexander Pope.

For centuries he was the most influential scientific thinker in the Western world. Nevertheless, no person's right to intellectual significance has been more trivialized than Isaac Newton's, his memory unfairly relegated to the fable of a slightly dim-witted man sitting under an apple tree (someone who could be portrayed in Hollywood by Harpo Marx of all people).

Before Newton started his compulsive calculating, the physical universe was poorly understood, thought to be largely the province of chance, an undivinable mystery -- but afterwards, scientists were to view the universe and its changes as being shaped by laws which could be described mathematically.

Born to a modest household in Woolsthorpe, Lincolnshire, England on this date in 1643 (his father had died before he was born), Newton managed to enter Trinity College, Cambridge in 1661 to pursue a law degree. There he quickly grew tired of Aristotelian philosophy and pursued the writings of Francis Bacon and Rene Descartes on his own. In 1664 he was selected to be a scholar at Trinity, which would have permitted him to work on his scientific ideas with a modest stipend, but soon thereafter Cambridge University closed down due to the Great Plague.

A disheartened Newton returned home to his mother, where he toiled away in obscurity, there inventing the foundations of differential and integral calculus, simple analytical methods for finding the values of areas, tangents, the lengths of curves and the limits of functions -- methods which have become indispensable to later physicists for solving problems. He also derived a universal law of gravitation and began to investigate the nature of light.

Upon his return to Cambridge in 1667, he was elected a fellow. In 1672 he impressed the scientific world with the first reflecting telescope. Later that year, he published his first scientific paper, "A New Theory About Light and Colours," in which he claimed that light consisted of small particles rather than waves, but his ideas were attacked by Robert Hooke, then the supreme scientist in England; always fearful and sensitive, Newton was personally distraught by Hooke's criticism, and did not publish a full explanation of his optical theory, which did employ waves to supplement his particle explanation, until after Hooke's death.

He spent the next years working alone on his theories until 1684, when, encouraged by Edmond Halley (who was tired of Hooke's braggadocio), Newton scooped Hooke by explaining the apparently irregular motion of the planets, proving with his calculus that they moved in elliptical orbits. In his De motu corporum, he went on to set out three laws of motion (that a body in motion moves with constant velocity unless acted upon by some force, and that a body at rest remains at rest unless acted upon by some force; that force equals mass multiplied by acceleration; and that every action evokes an equal and opposite reaction) and to articulate the law of gravity he had previously derived: that between any two bodies, the gravitational force is proportional to the product of their masses and inversely proportional to the square of the distance between them.

Try getting that from a falling apple.
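The inverse-square law itself takes only a few lines to apply -- here to the Earth and Moon, using modern measured values Newton did not have:

```python
G = 6.674e-11        # gravitational constant, N·m²/kg²
m_earth = 5.972e24   # mass of the Earth, kg
m_moon  = 7.348e22   # mass of the Moon, kg
r = 3.844e8          # mean Earth-Moon distance, m

# Newton's law of gravitation: F = G * m1 * m2 / r^2
F = G * m_earth * m_moon / r ** 2
print(f"{F:.3e} N")  # ≈ 2.0e20 newtons
```

The same formula, with the same constant G, governs the apple and the Moon alike -- which was precisely Newton's point.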

Halley persuaded Newton to write a full treatment of these principles, which became the Philosophiae naturalis principia mathematica (1687), universally recognized as the greatest scientific treatise ever written.

In 1693, the unmarried, skittish Newton suffered a nervous breakdown after a brief career in Parliament, and in 1696 gave up research to become an official with the Royal Mint. Meanwhile, in private he had compiled copious writings and investigations of alchemy, the mystical study of turning base metals into gold. It has perplexed scholars (including John Maynard Keynes, who purchased the papers some years later) that a man with as much insight into the physical universe as Newton would fritter away so much of his life on a diligent and enthusiastic study of an illusion. While some apologists argue that Newton was attracted by the religious significance of alchemy and was striving for a synthetic understanding of the universe, it might also be observed that Newton's lack of self-esteem never let him take pride in his achievements in science, and that he perhaps sought to assuage his guilt over having chosen a seemingly flimsy scientific career by producing something with a practical effect -- something which would actually change someone's life. His tortured innards would not let him see that he had already done just that.

Newton passed away on March 31, 1727 in London.

[For extra credit, here is a great site devoted to Newton's explorations in alchemy, with scans of his notebook pages. Via BoingBoing.]

Labels:

Wednesday, December 07, 2005

Craters, Cosmos and Chronicles


Herbert Shaw, another bard in the long tradition of literature emanating from the U.S. Geological Survey (having served there as a geologist from 1959 to 1995), was born on this day in 1930 in San Mateo County, California.

In his book Craters, Cosmos and Chronicles: A New Theory of Earth (1994), Shaw applied non-linear dynamical systems analysis to the study of meteorites and attempted to identify a relationship between the interior dynamics of the Earth and the entire record of each meteorite hitting the Earth since pre-Cambrian times. In the process, Shaw proposed the possibility of synchronicity between physical occurrences as diverse as volcanic eruptions, meteoroids, biochemical genetic changes, mass extinctions and intergalactic dynamics, positing that instead of viewing such phenomena as randomly disassociated from each other, that they are instead a pattern of interactions occurring within what Shaw described as the "Celestial Reference Frame," a "limitless chain of 'resonances' linking terrestrial microcosms to galactic macrocosms," in the words of writer Mike Davis.

Shaw was an incorrigible scientific eclectic, known to cut a swath in such diverse specialties as magma rheology, thermal modeling, experimental geochemistry and even fractal geometry and linguistics (not to mention poetry, sculpting and painting) -- which corresponds with his desire in Craters to recast the many disciplines of earth sciences as essential parts of one whole, a truly interdisciplinary geo-cosmology.

In the world of science, it may be true that the 20th century was the century of the physicist, but at the end of the 21st century, it is quite possible that we will all be in awe at the breathtaking prescience found in the ideas of our 20th century geologists. Or, what the heck -- maybe Shaw was just crazy.

Shaw passed away on August 26, 2002 in Menlo Park, California.


Monday, December 05, 2005

There is Uncertainty about Heisenberg


Werner Heisenberg was born on this day in 1901 in Wurzburg, Germany.

As a youth, Werner Heisenberg earned a reputation as an academically ambitious character, with a cool exterior containing a fervent drive to succeed. Apart from his participation in the German youth movement during World War I and his support for the suppression of the Bavarian workers' revolt in 1919, Heisenberg's mind and activities were almost completely focused on physics. He studied at the University of Munich for 2 years, then moved to Gottingen in 1922 to study with Max Born. There he also met the 37-year-old Niels Bohr, who remarked after taking a stroll with Heisenberg over the Hain mountain that Heisenberg "understands everything."

After Heisenberg received his doctorate, he followed Bohr to Copenhagen to work under him at the Institute of Physics. By this time, Bohr's reputation had been established through his improvements on Rutherford's description of the orbit of electrons around the nucleus of the atom (known as "the first quantum revolution," for which Bohr won the Nobel Prize), but even Bohr's improvements failed to explain certain observable phenomena.

Rather than continue to approach the issue through metaphor or purely visual descriptions of the orbit of electrons, in 1925 Heisenberg turned to mathematics, devising a set of matrices for what could be observed -- the effects of electron orbits on the absorption and emission of light -- and in the process developed a new version of quantum theory known as matrix mechanics. Soon afterward, Erwin Schrodinger proposed another mathematical model, wave mechanics, for the activity of electrons, and showed that although the two approaches described electrons in different ways, they were mathematically equivalent to each other.

In a 1927 paper called "On the Intuitive Content of Quantum Kinematics and Mechanics," Heisenberg proposed a radical yet elegant answer to Schrodinger's paradox. He proposed that the simultaneous measurement of connected variables (the position and momentum of an electron) is a losing proposition -- the more precisely you measure position, the less precise your measurement of momentum, and vice versa; and that this relationship of uncertainty, or at least the limits it poses with respect to the problem of describing electron orbits, is a precisely formulable relationship.
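That "precisely formulable relationship" is today written Δx·Δp ≥ ħ/2. A minimal numerical sketch of the trade-off -- the Gaussian-wavepacket assumption, which exactly saturates the bound, is mine, not Heisenberg's 1927 formulation:

```python
# Heisenberg's uncertainty relation: delta_x * delta_p >= hbar / 2.
# A Gaussian wavepacket saturates the bound, so delta_p = hbar / (2 * delta_x):
# squeezing the position spread necessarily widens the momentum spread.

HBAR = 1.054571817e-34  # reduced Planck constant, J s

def min_momentum_spread(delta_x):
    """Smallest momentum uncertainty (kg m/s) allowed for position spread delta_x (m)."""
    return HBAR / (2 * delta_x)

# Confine an electron to 1 angstrom (roughly atomic scale)...
dp_atom = min_momentum_spread(1e-10)
# ...then squeeze it ten times tighter: the momentum spread grows tenfold.
dp_tight = min_momentum_spread(1e-11)
assert abs(dp_tight / dp_atom - 10.0) < 1e-9

# The product delta_x * delta_p stays pinned at hbar / 2 in both cases.
assert abs(1e-10 * dp_atom - HBAR / 2) < 1e-45
```

The point of the sketch is the see-saw: precision in one variable is purchased, exactly and unavoidably, at the expense of the other.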

All statements about the atom, he continued, are governed by the "uncertainty principle," so that in effect, ordinary words, based on the limitations of human perception, cannot describe the atom. Although commentators, within the context of 20th century science crushing the holy certainty of the past, gorged themselves on Heisenberg's "uncertainty principle" and used it as a signpost in other areas, Heisenberg only applied it to the atom. Together with his mentor Bohr, whose deep-seated desire for consensus led to his proposal of "complementarity" (the notion that diametrically different characterizations must be employed for a full understanding of the atom), Heisenberg became one of the leading exponents of the "Copenhagen interpretation" of quantum mechanics, and he was awarded the 1932 Nobel Prize for Physics (presented in 1933, alongside the 1933 prize shared by Schrodinger and Paul Dirac).

Although Heisenberg had the opportunity to leave Germany during the rise of Hitler, he chose to stay, suffering briefly at the hands of the Nazis for refusing to compromise his support for Einstein's physics; but Himmler exonerated him, and he was eventually appointed director of the Kaiser Wilhelm Institute of Physics and of the German program to develop an atomic bomb. It appears that he accepted the task but believed that physicists on both sides of the conflict in World War II should willfully avoid inventing the bomb, a position he hinted at during a 1941 meeting in occupied Denmark with Bohr, who subsequently fled to the U.S. (hypothetically dramatized in Michael Frayn's 1998 play, Copenhagen).

Recently unearthed letters, addressed to him by Bohr but never sent, indicate that Heisenberg expressed to Bohr that if anyone would have the bomb, it would be the Germans. When Heisenberg learned of the Allied atomic bomb being dropped on Hiroshima, he initially refused to believe it. His later statements indicate that he did not think it was possible to construct such a bomb in time for it to have any effect on the War, so he did not press for the resources necessary to do so; it is suspected, too, that he arrogantly thought that the Allied scientists could do no better than he, and therefore, there was no question in his mind of needing to deliberately sabotage the German effort. Heisenberg was imprisoned briefly after the War, and then went on to resume his direction of the Kaiser Wilhelm (later Max Planck) Institute until 1970; yet there remained a certain ambiguity (if not "uncertainty") about his actions and motives during the War. He was a scientist, first and foremost, who is more fairly accused of having pride in his abilities as a scientist and a loyalty to the Germany that had trained him; a blinkered enabler rather than a patriot or a Nazi. He passed away on February 1, 1976 in Munich.


Wednesday, November 23, 2005

'A Matter of Great Regret'


Physicist Henry Moseley was born on this day in 1887 in Weymouth, Dorset, England.

Oxford-educated, Moseley taught physics at Manchester under the famous Ernest Rutherford from 1910 to 1913 before returning to Oxford. In 1913, he began to study X-rays emitted by metals when bombarded by electrons. Charles Barkla had shown that the characteristic X-rays emitted by metals varied in a regular manner from element to element, and van den Broek had hypothesized that an element's place in the periodic table corresponds to its nuclear charge; "Moseley's law" showed the precise form of the relationship, allowing for the determination of an "atomic number" for each element on the periodic table based on its nuclear charge. Moseley's discovery made it possible for the first time to create a meaningful ordering of the elements on the periodic table, and provided the tools necessary for physicists to look for previously undiscovered elements. New elements predicted by the gaps in Moseley's ordering were later discovered, among them hafnium, rhenium and francium.
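Moseley's law says the square root of the emitted X-ray frequency rises linearly with atomic number: sqrt(nu) = k(Z - sigma). For the K-alpha line the Bohr-model form is nu = (3/4)·R·c·(Z - 1)², with screening constant sigma = 1. A sketch under those textbook assumptions (the constants and the copper example are mine, not from Moseley's papers):

```python
# Moseley's law for K-alpha X-rays: nu = (3/4) * (R * c) * (Z - 1)**2,
# i.e. sqrt(nu) is linear in the atomic number Z with screening constant 1.

RYDBERG_FREQ = 3.289841960e15   # R * c, the Rydberg frequency in Hz
C = 2.99792458e8                # speed of light, m/s

def k_alpha_wavelength(Z):
    """Predicted K-alpha X-ray wavelength (m) for an element of atomic number Z."""
    nu = 0.75 * RYDBERG_FREQ * (Z - 1) ** 2
    return C / nu

# Copper, Z = 29: the prediction lands near the measured 1.54 angstroms.
lam = k_alpha_wavelength(29)
print(f"{lam * 1e10:.2f} angstroms")
assert 1.4e-10 < lam < 1.7e-10
```

Run in reverse -- measure the wavelength, solve for Z -- and you get the unambiguous ordering of the elements that made the gaps in the periodic table visible.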

Within a year after his discovery, Moseley joined the British Army and was sent to the Dardanelles, dying in the badly mishandled Battle of Gallipoli on August 10, 1915 at the age of 27.

Rutherford expressed his outrage. "To use such a man as a subaltern," he said, "is economically the equivalent to using the Lusitania to carry a pound of butter from Ramsgate to Margate." Even the Germans called Moseley's death "a matter of great regret."


Monday, November 07, 2005

Marie Curie

Marie Curie's triumph -- of initiating the field of radiochemistry and a greater understanding of the structure of the atom -- is one of intellect and an obsessive-compulsive personality over adverse circumstances. To begin with, she was born a woman, and in 19th century Europe, the only time a bearded scientist wanted a woman in his laboratory was when it was time to serve him tea and cakes, or time to clean up after tea and cakes. Secondly, she was born in Poland -- not as unlikely as, say, Madagascar for its time, but still merely a province of Russia, and hardly the world's center of scientific learning.

Born on this day in 1867 in Warsaw, she was the daughter of a physics teacher-father and a schoolmistress-mother, so education was a matter of focus in her childhood home; unfortunately, the education of women was not a matter of focus in 19th century Poland, and despite her high level of academic achievement, Marie was mistreated during her early schooling and denied admission to a university.

One thing that Poland has long been good at is underground movements, and Marie continued her science education in clandestine meetings while working as a governess and saving enough money to send her sister Bronia to medical school in Paris. After 6 years of sacrifice, Marie was finally invited to Paris in 1891. She received a physics degree from the Sorbonne, graduating magna cum laude in 1893, and took an additional degree in mathematics a year later.

Although she had suitors (one poor lad swallowed laudanum to prove his love, prompting Marie to observe that his priorities were simply out of order), she ignored Paris social life in favor of her plan to return to Poland. However, during a brief trip home in 1894 she realized that Poland wasn't ready for female physicists, and she resolved to stay in Paris. This change of mind perhaps prepared her for the development of a more intimate friendship with a well-regarded though humble chemist named Pierre Curie. By the following year, they were married in a secular ceremony, and she moved into his lab at the Sorbonne to begin her independent research.

Working from Henri Becquerel's discovery of the unusual rays emitted by uranium and Wilhelm Roentgen's discovery of X-rays, Marie began to measure the properties of uranium and investigate whether other minerals emitted similar rays using an electrometer designed by Pierre. First she discovered that thorium gave off rays, but even more curious was the fact that pitchblende and chalcolite threw off more measurable activity than uranium, and that such activity could not be explained by the presence of traces of uranium and thorium.

She and Pierre worked laboriously in an abandoned, leaky shed on the campus to isolate the unknown elements by fractional crystallization, and in 1898, the Curies announced the discovery of a new element, named after Marie's homeland, called polonium. Deducing that the emission of Becquerel's rays was possibly a more general natural phenomenon rather than the property of a few substances, the Curies named this phenomenon "radioactivity," and Marie worked to isolate from pitchblende another element, to be known as radium, that proved able to emit heat and light for many years -- at a yield of roughly one gram of salts per 8 tons of pitchblende.

In 1900, the Curies presented their findings at the International Congress of Physics and, based on their experiments, they also announced that radiation spontaneously emitted from uranium, even when tested in a vacuum. Thus, Marie surmised, the rays were not a result of chemical reactions, but they emanated from activity within the atoms themselves.

This last pronouncement would have a most far-reaching influence on the study of the structure of the atom, and the scientific community chose to honor Marie and Pierre Curie, along with Henri Becquerel, with the 1903 Nobel Prize for Physics for their studies of radioactivity. At first, the Nobel judges had not intended to make the award to a woman scientist, but Pierre went out of his way to make sure that they understood that the insights behind the work were Marie's.

Marie and Pierre became world famous overnight, in part for the curiosity about this brilliant young woman, a totally unfamiliar kind of icon. They were still enjoying the limelight when Pierre was tragically killed in a street accident 3 years later. Curie took her husband's teaching post at the Sorbonne (becoming the first woman professor there) and devoted herself to research.

She became a close friend of physicist Paul Langevin, with whom she shared research interests as well as anti-fascist politics; the Paris newspapers whispered that the relationship was an affair, and just as Curie's reputation began to suffer for it, the Nobel committee awarded her the 1911 Prize for Chemistry for her discovery of radium and polonium, thereby making her the first human being to win 2 Nobel Prizes. She took the opportunity to reassert the significance of her research -- that radioactivity is an atomic property of matter and could be used for finding new elements.

During World War I, Curie organized medical X-ray clinics and afterwards founded the Radium Institute of Paris. Curie was not aware of the dangers of radiation when she began her research and was careless about the exposure (she would regularly leave radioactive substances glowing at her bedside, and to this day her laboratory notebooks are hot), and later developed aplastic anemia, but she would live to see her daughter Irene Joliot-Curie and Irene's husband Frederic take over her work at the Radium Institute. She died on July 4, 1934 in Savoy, France.

Here's an interesting fact: Greer Garson was nominated for an Oscar for portraying Curie in the 1943 biopic Madame Curie. The previous year, Garson won an Oscar for her performance in Mrs. Miniver and gave an acceptance speech so long -- legend stretches it to an hour, though it ran closer to six minutes -- that it is still reputed to be the longest in Academy history. When Curie won the Nobel Prize in 1911, her banquet speech was approximately 2 minutes long.


Wednesday, October 05, 2005

M. King Hubbert


Geophysicist and mathematician M. King Hubbert was born on this day in 1903 in San Saba, Texas.

In 1956, M. King Hubbert, the head of Shell Oil's research lab and a brilliant but cantankerous University of Chicago-trained scientist, made a bold prediction that raised the ire of his powerful employer. Starting from the assumption that there is only so much oil in the ground, Hubbert designed a mathematical formula that could be used to chart the rate of consumption against the remaining reserves of any finite resource. The resulting curve, known as "Hubbert's peak," would roughly show the point in time at which the production of oil in the U.S. would peak and thereafter decline.
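The shape behind the prediction: cumulative production follows a logistic (S-shaped) curve toward the ultimate recoverable total, so the production *rate* traces a bell that peaks when half the resource is gone. A minimal sketch -- the parameters below are illustrative placeholders, not Hubbert's fitted values:

```python
import math

# Hubbert curve sketch: cumulative production Q(t) is logistic, so the
# annual production rate dQ/dt = k * Q * (1 - Q / q_max) is a bell curve
# that peaks when Q reaches q_max / 2, at year t_peak.

def production_rate(t, q_max=200.0, k=0.07, t_peak=1970.0):
    """Annual production (in units of q_max per year) in year t."""
    q = q_max / (1.0 + math.exp(-k * (t - t_peak)))   # cumulative production
    return k * q * (1.0 - q / q_max)                  # logistic derivative

years = range(1900, 2041)
peak_year = max(years, key=production_rate)
assert peak_year == 1970          # output tops out at t_peak, then declines

# The curve is symmetric: output a decade before the peak matches output
# a decade after it -- the decline mirrors the boom.
assert abs(production_rate(1960) - production_rate(1980)) < 1e-12
```

Fit q_max, k and t_peak to historical production data and the formula yields both the peak date and the sobering back half of the curve.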

Hubbert's prediction? That the production of crude oil in the U.S. would peak some time between 1966 and 1971, then fall off rapidly to nearly zero. Shell Oil was aghast; Hubbert's estimate of future reserves in the U.S. was far below what Shell and its competitors had predicted, and Hubbert was failing to take into account the uncovering of new reserves and the improvement of exploration technology.

Hubbert stuck to his guns, however. In 1958, Hubbert published a report recommending that the U.S. increase its importation and storage of foreign oil -- in response to which, ignoring Hubbert's estimates, the U.S. government tapped Francis Turner to expand the interstate highway system. In 1962, he wrote the energy section for a National Academy of Sciences report commissioned by President Kennedy, but his conclusions about peak oil production were toned down for the executive summary. Hubbert reached the mandatory retirement age at Shell at age 60 and assumed joint appointments with the U.S. Geological Survey and Stanford University in 1964.

In February 1975, when the U.S. was experiencing a "surprise" oil crisis, the National Academy of Sciences finally accepted Hubbert's calculations, admitting that the U.S. peak had occurred in 1970 -- leaving Hubbert to conduct "I-told-you-so" interviews with the press. "A child born in the middle [19]30s will have seen the consumption of 80% of all American oil and gas in his lifetime," he declared, and "a child born about 1970 will see most of the world's [reserves] consumed."

With better statistical information, Hubbert's successors have concluded that worldwide peak oil production will occur no later than 2020, although industry diehards continue to dispute Hubbert's methodology, awaiting the new technology panacea. With consumption rates in China and India growing by leaps and bounds at the beginning of the 21st century, the Hubbertists claim that the peak could occur even sooner.

Some experts now believe that Hubbert's 1956 prediction should have been a Bill W. moment of self-awareness in America concerning its addiction to petroleum -- if you suddenly became aware of the fact that you would have to have your gin shipped to you from half-way across the world in specially-built tankers just to keep up with your glorious future thirst, wouldn't that be an opportune time to try to kick the habit? Although the macroeconomic significance of "Hubbert's peak" should have led public policy to favor the development of alternative energy sources in 1956, its macroeconomic significance 50 years hence is simply that buyers and sellers, in a world gravely dependent on oil, are now aware that the end is in sight, and prices will continue to rise accordingly.

While "Hubbert's peak" propelled Hubbert to visionary status, his earlier work in geophysics was also highly influential. In 1937, while teaching at Columbia, Hubbert employed a mathematical analysis to explain how the hardest of rocks forming the crust of the Earth could show signs of plastic flow under extreme geophysical pressures. By the 1950s, his work regarding the flow and entrapment of underground fluids caused the oil industry to rethink its basic exploration methods; and much later, Hubbert authored a compelling argument that overthrust faults, whose origins had long puzzled geologists, originated as a consequence of underground fluid pressures. Hubbert was also influential, at Stanford and at Berkeley (1973-6), in advocating that earth and environmental science programs place a greater emphasis on mathematics and physics. He passed away in 1989.

"Growth, growth, growth -- that's all we've known . . . World automobile production is doubling every 10 years; human population growth is like nothing that has happened in all of geologic history. The world will only tolerate so many doublings of anything -- whether it's power plants or grasshoppers." -- M. King Hubbert, 1975.
