"A Short History of Nearly Everything" - Bill Bryson

8 EINSTEIN’S UNIVERSE

AS THE NINETEENTH century drew to a close, scientists could reflect with satisfaction that they had pinned down most of the mysteries of the physical world: electricity, magnetism, gases, optics, acoustics, kinetics, and statistical mechanics, to name just a few, all had fallen into order before them. They had discovered the X ray, the cathode ray, the electron, and radioactivity, invented the ohm, the watt, the kelvin, the joule, the amp, and the little erg.

If a thing could be oscillated, accelerated, perturbed, distilled, combined, weighed, or made gaseous they had done it, and in the process produced a body of universal laws so weighty and majestic that we still tend to write them out in capitals: the Electromagnetic Field Theory of Light, Richter’s Law of Reciprocal Proportions, Charles’s Law of Gases, the Law of Combining Volumes, the Zeroth Law, the Valence Concept, the Law of Mass Action, and others beyond counting. The whole world clanged and chuffed with the machinery and instruments that their ingenuity had produced. Many wise people believed that there was nothing much left for science to do.

In 1875, when a young German in Kiel named Max Planck was deciding whether to devote his life to mathematics or to physics, he was urged most heartily not to choose physics because the breakthroughs had all been made there. The coming century, he was assured, would be one of consolidation and refinement, not revolution. Planck didn’t listen. He studied theoretical physics and threw himself body and soul into work on entropy, a quantity at the heart of thermodynamics, which seemed to hold much promise for an ambitious young man.[15] In 1891 he produced his results and learned to his dismay that the important work on entropy had in fact been done already, in this instance by a retiring scholar at Yale University named J. Willard Gibbs.

Gibbs is perhaps the most brilliant person that most people have never heard of. Modest to the point of near invisibility, he passed virtually the whole of his life, apart from three years spent studying in Europe, within a three-block area bounded by his house and the Yale campus in New Haven, Connecticut. For his first ten years at Yale he didn’t even bother to draw a salary. (He had independent means.) From 1871, when he joined the university as a professor, to his death in 1903, his courses attracted an average of slightly over one student a semester. His written work was difficult to follow and employed a private form of notation that many found incomprehensible. But buried among his arcane formulations were insights of the loftiest brilliance.

In 1875-78, Gibbs produced a series of papers, collectively titled On the Equilibrium of Heterogeneous Substances, that dazzlingly elucidated the thermodynamic principles of, well, nearly everything-“gases, mixtures, surfaces, solids, phase changes . . . chemical reactions, electrochemical cells, sedimentation, and osmosis,” to quote William H. Cropper. In essence what Gibbs did was show that thermodynamics didn’t apply simply to heat and energy at the sort of large and noisy scale of the steam engine, but was also present and influential at the atomic level of chemical reactions. Gibbs’s Equilibrium has been called “the Principia of thermodynamics,” but for reasons that defy speculation Gibbs chose to publish these landmark observations in the Transactions of the Connecticut Academy of Arts and Sciences, a journal that managed to be obscure even in Connecticut, which is why Planck did not hear of him until too late.

Undaunted-well, perhaps mildly daunted-Planck turned to other matters.[16] We shall turn to these ourselves in a moment, but first we must make a slight (but relevant!) detour to Cleveland, Ohio, and an institution then known as the Case School of Applied Science. There, in the 1880s, a physicist of early middle years named Albert Michelson, assisted by his friend the chemist Edward Morley, embarked on a series of experiments that produced curious and disturbing results that would have great ramifications for much of what followed.

What Michelson and Morley did, without actually intending to, was undermine a longstanding belief in something called the luminiferous ether, a stable, invisible, weightless, frictionless, and unfortunately wholly imaginary medium that was thought to permeate the universe. Conceived by Descartes, embraced by Newton, and venerated by nearly everyone ever since, the ether held a position of absolute centrality in nineteenth-century physics as a way of explaining how light traveled across the emptiness of space. It was especially needed in the 1800s because light and electromagnetism were now seen as waves, which is to say types of vibrations. Vibrations must occur in something; hence the need for, and lasting devotion to, an ether. As late as 1909, the great British physicist J. J. Thomson was insisting: “The ether is not a fantastic creation of the speculative philosopher; it is as essential to us as the air we breathe”-this more than four years after it was pretty incontestably established that it didn’t exist. People, in short, were really attached to the ether.

If you needed to illustrate the idea of nineteenth-century America as a land of opportunity, you could hardly improve on the life of Albert Michelson. Born in 1852 on the German-Polish border to a family of poor Jewish merchants, he came to the United States with his family as an infant and grew up in a mining camp in California’s gold rush country, where his father ran a dry goods business. Too poor to pay for college, he traveled to Washington, D.C., and took to loitering by the front door of the White House so that he could fall in beside President Ulysses S. Grant when the President emerged for his daily constitutional. (It was clearly a more innocent age.) In the course of these walks, Michelson so ingratiated himself to the President that Grant agreed to secure for him a free place at the U.S. Naval Academy. It was there that Michelson learned his physics.

Ten years later, by now a professor at the Case School in Cleveland, Michelson became interested in trying to measure something called the ether drift-a kind of head wind produced by moving objects as they plowed through space. One of the predictions of Newtonian physics was that the speed of light as it pushed through the ether should vary with respect to an observer depending on whether the observer was moving toward the source of light or away from it, but no one had figured out a way to measure this. It occurred to Michelson that for half the year the Earth is traveling toward the Sun and for half the year it is moving away from it, and he reasoned that if you took careful enough measurements at opposite seasons and compared light’s travel time between the two, you would have your answer.

Michelson talked Alexander Graham Bell, newly enriched inventor of the telephone, into providing the funds to build an ingenious and sensitive instrument of Michelson’s own devising called an interferometer, which could measure the velocity of light with great precision. Then, assisted by the genial but shadowy Morley, Michelson embarked on years of fastidious measurements. The work was delicate and exhausting, and had to be suspended for a time to permit Michelson a brief but comprehensive nervous breakdown, but by 1887 they had their results. They were not at all what the two scientists had expected to find.

As Caltech astrophysicist Kip S. Thorne has written: “The speed of light turned out to be the same in all directions and at all seasons.” It was the first hint in two hundred years-in exactly two hundred years, in fact-that Newton’s laws might not apply all the time everywhere. The Michelson-Morley outcome became, in the words of William H. Cropper, “probably the most famous negative result in the history of physics.” Michelson was awarded a Nobel Prize in physics for the work-the first American so honored-but not for twenty years. Meanwhile, the Michelson-Morley experiments would hover unpleasantly, like a musty smell, in the background of scientific thought.

Remarkably, and despite his findings, when the twentieth century dawned Michelson counted himself among those who believed that the work of science was nearly at an end, with “only a few turrets and pinnacles to be added, a few roof bosses to be carved,” in the words of a writer in Nature.

In fact, of course, the world was about to enter a century of science where many people wouldn’t understand anything and none would understand everything. Scientists would soon find themselves adrift in a bewildering realm of particles and antiparticles, where things pop in and out of existence in spans of time that make nanoseconds look plodding and uneventful, where everything is strange. Science was moving from a world of macrophysics, where objects could be seen and held and measured, to one of microphysics, where events transpire with unimaginable swiftness on scales far below the limits of imagining. We were about to enter the quantum age, and the first person to push on the door was the so-far unfortunate Max Planck.

In 1900, now a theoretical physicist at the University of Berlin and at the somewhat advanced age of forty-two, Planck unveiled a new “quantum theory,” which posited that energy is not a continuous thing like flowing water but comes in individualized packets, which he called quanta. This was a novel concept, and a good one. In the short term it would help to provide a solution to the puzzle of the Michelson-Morley experiments in that it demonstrated that light needn’t be a wave after all. In the longer term it would lay the foundation for the whole of modern physics. It was, at all events, the first clue that the world was about to change.

But the landmark event-the dawn of a new age-came in 1905, when there appeared in the German physics journal Annalen der Physik a series of papers by a young Swiss bureaucrat who had no university affiliation, no access to a laboratory, and the regular use of no library greater than that of the national patent office in Bern, where he was employed as a technical examiner third class. (An application to be promoted to technical examiner second class had recently been rejected.)

His name was Albert Einstein, and in that one eventful year he submitted to Annalen der Physik five papers, of which three, according to C. P. Snow, “were among the greatest in the history of physics”-one examining the photoelectric effect by means of Planck’s new quantum theory, one on the behavior of small particles in suspension (what is known as Brownian motion), and one outlining a special theory of relativity.

The first won its author a Nobel Prize and explained the nature of light (and also helped to make television possible, among other things).[17] The second provided proof that atoms do indeed exist-a fact that had, surprisingly, been in some dispute. The third merely changed the world.

Einstein was born in Ulm, in southern Germany, in 1879, but grew up in Munich. Little in his early life suggested the greatness to come. Famously he didn’t learn to speak until he was three. In the 1890s, his father’s electrical business failing, the family moved to Milan, but Albert, by now a teenager, went to Switzerland to continue his education-though he failed his college entrance exams on the first try. In 1896 he gave up his German citizenship to avoid military conscription and entered the Zurich Polytechnic Institute on a four-year course designed to churn out high school science teachers. He was a bright but not outstanding student.

In 1900 he graduated and within a few months was beginning to contribute papers to Annalen der Physik. His very first paper, on the physics of fluids in drinking straws (of all things), appeared in the same issue as Planck’s quantum theory. From 1902 to 1904 he produced a series of papers on statistical mechanics only to discover that the quietly productive J. Willard Gibbs in Connecticut had done that work as well, in his Elementary Principles of Statistical Mechanics of 1901.

At the same time he had fallen in love with a fellow student, a Hungarian named Mileva Maric. In 1901 they had a child out of wedlock, a daughter, who was discreetly put up for adoption. Einstein never saw his child. Two years later, he and Maric were married. In between these events, in 1902, Einstein took a job with the Swiss patent office, where he stayed for the next seven years. He enjoyed the work: it was challenging enough to engage his mind, but not so challenging as to distract him from his physics. This was the background against which he produced the special theory of relativity in 1905.

Called “On the Electrodynamics of Moving Bodies,” it is one of the most extraordinary scientific papers ever published, as much for how it was presented as for what it said. It had no footnotes or citations, contained almost no mathematics, made no mention of any work that had influenced or preceded it, and acknowledged the help of just one individual, a colleague at the patent office named Michele Besso. It was, wrote C. P. Snow, as if Einstein “had reached the conclusions by pure thought, unaided, without listening to the opinions of others. To a surprisingly large extent, that is precisely what he had done.”

His famous equation, E=mc², did not appear with the paper, but came in a brief supplement that followed a few months later. As you will recall from school days, E in the equation stands for energy, m for mass, and c² for the speed of light squared.

In simplest terms, what the equation says is that mass and energy have an equivalence. They are two forms of the same thing: energy is liberated matter; matter is energy waiting to happen. Since c² (the speed of light times itself) is a truly enormous number, what the equation is saying is that there is a huge amount-a really huge amount-of energy bound up in every material thing.[18]

You may not feel outstandingly robust, but if you are an average-sized adult you will contain within your modest frame no less than 7 × 10¹⁸ joules of potential energy-enough to explode with the force of thirty very large hydrogen bombs, assuming you knew how to liberate it and really wished to make a point. Everything has this kind of energy trapped within it. We’re just not very good at getting it out. Even a uranium bomb-the most energetic thing we have produced yet-releases less than 1 percent of the energy it could release if only we were more cunning.
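The figure is easy to check. Here is a back-of-the-envelope sketch in Python; the 80-kilogram mass is an assumption, since the text says only "average-sized":

```python
# Rest energy of an average-sized adult via E = m * c**2.
c = 2.998e8    # speed of light, in m/s
m = 80.0       # assumed adult mass in kg (the text says only "average-sized")

E = m * c**2   # rest energy, in joules
print(f"E = {E:.1e} joules")   # about 7.2e18 J, matching the ~7 × 10^18 figure

# For scale: one megaton of TNT is roughly 4.184e15 J, so this body of energy
# is equivalent to well over a thousand megatons.
print(f"{E / 4.184e15:.0f} megatons of TNT equivalent")
```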

Among much else, Einstein’s theory explained how radiation worked: how a lump of uranium could throw out constant streams of high-level energy without melting away like an ice cube. (It could do it by converting mass to energy extremely efficiently à la E=mc².) It explained how stars could burn for billions of years without racing through their fuel. (Ditto.) At a stroke, in a simple formula, Einstein endowed geologists and astronomers with the luxury of billions of years. Above all, the special theory showed that the speed of light was constant and supreme. Nothing could overtake it. It brought light (no pun intended, exactly) to the very heart of our understanding of the nature of the universe. Not incidentally, it also solved the problem of the luminiferous ether by making it clear that it didn’t exist. Einstein gave us a universe that didn’t need it.

Physicists as a rule are not overattentive to the pronouncements of Swiss patent office clerks, and so, despite the abundance of useful tidings, Einstein’s papers attracted little notice. Having just solved several of the deepest mysteries of the universe, Einstein applied for a job as a university lecturer and was rejected, and then as a high school teacher and was rejected there as well. So he went back to his job as an examiner third class, but of course he kept thinking. He hadn’t even come close to finishing yet.

When the poet Paul Valéry once asked Einstein if he kept a notebook to record his ideas, Einstein looked at him with mild but genuine surprise. “Oh, that’s not necessary,” he replied. “It’s so seldom I have one.” I need hardly point out that when he did get one it tended to be good. Einstein’s next idea was one of the greatest that anyone has ever had-indeed, the very greatest, according to Boorse, Motz, and Weaver in their thoughtful history of atomic science. “As the creation of a single mind,” they write, “it is undoubtedly the highest intellectual achievement of humanity,” which is of course as good as a compliment can get.

In 1907, or so it has sometimes been written, Albert Einstein saw a workman fall off a roof and began to think about gravity. Alas, like many good stories this one appears to be apocryphal. According to Einstein himself, he was simply sitting in a chair when the problem of gravity occurred to him.

Actually, what occurred to Einstein was something more like the beginning of a solution to the problem of gravity, since it had been evident to him from the outset that one thing missing from the special theory was gravity. What was “special” about the special theory was that it dealt with things moving in an essentially unimpeded state. But what happened when a thing in motion-light, above all-encountered an obstacle such as gravity? It was a question that would occupy his thoughts for most of the next decade and lead to the general theory of relativity, published in 1916, and then, in early 1917, to a paper entitled “Cosmological Considerations on the General Theory of Relativity.” The special theory of relativity of 1905 was a profound and important piece of work, of course, but as C. P. Snow once observed, if Einstein hadn’t thought of it when he did someone else would have, probably within five years; it was an idea waiting to happen. But the general theory was something else altogether. “Without it,” wrote Snow in 1979, “it is likely that we should still be waiting for the theory today.”

With his pipe, genially self-effacing manner, and electrified hair, Einstein was too splendid a figure to remain permanently obscure, and in 1919, the war over, the world suddenly discovered him. Almost at once his theories of relativity developed a reputation for being impossible for an ordinary person to grasp. Matters were not helped, as David Bodanis points out in his superb book E=mc², when the New York Times decided to do a story, and-for reasons that can never fail to excite wonder-sent the paper’s golfing correspondent, one Henry Crouch, to conduct the interview.

Crouch was hopelessly out of his depth, and got nearly everything wrong. Among the more lasting errors in his report was the assertion that Einstein had found a publisher daring enough to publish a book that only twelve men “in all the world could comprehend.” There was no such book, no such publisher, no such circle of learned men, but the notion stuck anyway. Soon the number of people who could grasp relativity had been reduced even further in the popular imagination-and the scientific establishment, it must be said, did little to disturb the myth.

When a journalist asked the British astronomer Sir Arthur Eddington if it was true that he was one of only three people in the world who could understand Einstein’s relativity theories, Eddington considered deeply for a moment and replied: “I am trying to think who the third person is.” In fact, the problem with relativity wasn’t that it involved a lot of differential equations, Lorentz transformations, and other complicated mathematics (though it did-even Einstein needed help with some of it), but that it was just so thoroughly nonintuitive.

In essence what relativity says is that space and time are not absolute, but relative to both the observer and to the thing being observed, and the faster one moves the more pronounced these effects become. We can never accelerate ourselves to the speed of light, and the harder we try (and faster we go) the more distorted we will become, relative to an outside observer.

Almost at once popularizers of science tried to come up with ways to make these concepts accessible to a general audience. One of the more successful attempts-commercially at least-was The ABC of Relativity by the mathematician and philosopher Bertrand Russell. In it, Russell employed an image that has been used many times since. He asked the reader to envision a train one hundred yards long moving at 60 percent of the speed of light. To someone standing on a platform watching it pass, the train would appear to be only eighty yards long and everything on it would be similarly compressed. If we could hear the passengers on the train speak, their voices would sound slurred and sluggish, like a record played at too slow a speed, and their movements would appear similarly ponderous. Even the clocks on the train would seem to be running at only four-fifths of their normal speed.

However-and here’s the thing-people on the train would have no sense of these distortions. To them, everything on the train would seem quite normal. It would be we on the platform who looked weirdly compressed and slowed down. It is all to do, you see, with your position relative to the moving object.
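All of Russell’s numbers follow from a single quantity, the Lorentz factor γ = 1/√(1 − v²/c²). A minimal sketch in Python, using the train example’s speed:

```python
import math

v_over_c = 0.60                          # the train moves at 60 percent of light speed
gamma = 1 / math.sqrt(1 - v_over_c**2)   # Lorentz factor; here exactly 1.25

# The platform observer sees lengths contracted and clocks slowed by 1/gamma.
print(f"{100 / gamma:.1f} yards")   # the 100-yard train appears 80.0 yards long
print(f"{1 / gamma:.2f}")           # on-board clocks seem to run at 0.80 (four-fifths) speed
```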

This effect actually happens every time you move. Fly across the United States, and you will step from the plane a quinzillionth of a second, or something, younger than those you left behind. Even in walking across the room you will very slightly alter your own experience of time and space. It has been calculated that a baseball thrown at a hundred miles an hour will pick up 0.000000000002 grams of mass on its way to home plate. So the effects of relativity are real and have been measured. The problem is that such changes are much too small to make the tiniest detectable difference to us. But for other things in the universe-light, gravity, the universe itself-these are matters of consequence.
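The baseball figure can be checked the same way. A sketch assuming a regulation ball of about 0.145 kg (the mass is not stated in the text):

```python
import math

c = 2.998e8            # speed of light, m/s
m0 = 0.145             # assumed rest mass of a regulation baseball, kg
v = 100 * 0.44704      # 100 miles per hour converted to m/s

gamma = 1 / math.sqrt(1 - (v / c) ** 2)
dm_grams = m0 * (gamma - 1) * 1000     # relativistic mass increase, in grams
print(f"{dm_grams:.1e} grams")         # on the order of the book's 0.000000000002 g
```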

So if the ideas of relativity seem weird, it is only because we don’t experience these sorts of interactions in normal life. However, to turn to Bodanis again, we all commonly encounter other kinds of relativity-for instance with regard to sound. If you are in a park and someone is playing annoying music, you know that if you move to a more distant spot the music will seem quieter. That’s not because the music is quieter, of course, but simply that your position relative to it has changed. To something too small or sluggish to duplicate this experience-a snail, say-the idea that a boom box could seem to two observers to produce two different volumes of music simultaneously might seem incredible.

The most challenging and nonintuitive of all the concepts in the general theory of relativity is the idea that time is part of space. Our instinct is to regard time as eternal, absolute, immutable-nothing can disturb its steady tick. In fact, according to Einstein, time is variable and ever changing. It even has shape. It is bound up-“inextricably interconnected,” in Stephen Hawking’s expression-with the three dimensions of space in a curious four-dimensional whole known as spacetime.

Spacetime is usually explained by asking you to imagine something flat but pliant-a mattress, say, or a sheet of stretched rubber-on which is resting a heavy round object, such as an iron ball. The weight of the iron ball causes the material on which it is sitting to stretch and sag slightly. This is roughly analogous to the effect that a massive object such as the Sun (the iron ball) has on spacetime (the material): it stretches and curves and warps it. Now if you roll a smaller ball across the sheet, it tries to go in a straight line as required by Newton’s laws of motion, but as it nears the massive object and the slope of the sagging fabric, it rolls downward, ineluctably drawn to the more massive object. This is gravity-a product of the bending of spacetime.

Every object that has mass creates a little depression in the fabric of the cosmos. Thus the universe, as Dennis Overbye has put it, is “the ultimate sagging mattress.” Gravity on this view is no longer so much a thing as an outcome-“not a ‘force’ but a byproduct of the warping of spacetime,” in the words of the physicist Michio Kaku, who goes on: “In some sense, gravity does not exist; what moves the planets and stars is the distortion of space and time.”

Of course the sagging mattress analogy can take us only so far because it doesn’t incorporate the effect of time. But then our brains can take us only so far because it is so nearly impossible to envision a dimension comprising three parts space to one part time, all interwoven like the threads in a plaid fabric. At all events, I think we can agree that this was an awfully big thought for a young man staring out the window of a patent office in the capital of Switzerland.

Among much else, Einstein’s general theory of relativity suggested that the universe must be either expanding or contracting. But Einstein was not a cosmologist, and he accepted the prevailing wisdom that the universe was fixed and eternal. More or less reflexively, he dropped into his equations something called the cosmological constant, which arbitrarily counterbalanced the effects of gravity, serving as a kind of mathematical pause button. Books on the history of science always forgive Einstein this lapse, but it was actually a fairly appalling piece of science and he knew it. He called it “the biggest blunder of my life.”

Coincidentally, at about the time that Einstein was affixing a cosmological constant to his theory, at the Lowell Observatory in Arizona, an astronomer with the cheerily intergalactic name of Vesto Slipher (who was in fact from Indiana) was taking spectrographic readings of distant spiral nebulae and discovering that most of them appeared to be moving away from us. The universe wasn’t static. The nebulae Slipher looked at showed unmistakable signs of a Doppler shift[20]-the same mechanism behind that distinctive stretched-out yee-yummm sound cars make as they flash past on a racetrack. The phenomenon also applies to light, and in the case of receding galaxies it is known as a red shift (because light moving away from us shifts toward the red end of the spectrum; approaching light shifts to blue).
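For speeds well below that of light, a red shift translates into a recession velocity very simply: v ≈ cz, where z is the fractional stretch in wavelength. A sketch with purely illustrative numbers, not Slipher’s actual measurements:

```python
# Recession velocity from a red shift, valid for speeds well below light speed.
c = 2.998e5        # speed of light, km/s
rest = 656.3       # hydrogen-alpha spectral line at rest, in nanometers
observed = 656.9   # hypothetical observed (red-shifted) wavelength, nm

z = (observed - rest) / rest   # fractional stretch in wavelength
v = c * z                      # recession velocity, km/s
print(f"z = {z:.5f}, v = {v:.0f} km/s")   # a few hundred km/s for this shift
```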

Slipher was the first to notice this effect with light and to realize its potential importance for understanding the motions of the cosmos. Unfortunately no one much noticed him. The Lowell Observatory, as you will recall, was a bit of an oddity thanks to Percival Lowell’s obsession with Martian canals, which in the 1910s made it, in every sense, an outpost of astronomical endeavor. Slipher was unaware of Einstein’s theory of relativity, and the world was equally unaware of Slipher. So his finding had no impact.

Glory instead would pass to a large mass of ego named Edwin Hubble. Hubble was born in 1889, ten years after Einstein, in a small Missouri town on the edge of the Ozarks and grew up there and in Wheaton, Illinois, a suburb of Chicago. His father was a successful insurance executive, so life was always comfortable, and Edwin enjoyed a wealth of physical endowments, too. He was a strong and gifted athlete, charming, smart, and immensely good-looking-“handsome almost to a fault,” in the description of William H. Cropper, “an Adonis” in the words of another admirer. According to his own accounts, he also managed to fit into his life more or less constant acts of valor-rescuing drowning swimmers, leading frightened men to safety across the battlefields of France, embarrassing world-champion boxers with knockdown punches in exhibition bouts. It all seemed too good to be true. It was. For all his gifts, Hubble was also an inveterate liar.

This was more than a little odd, for Hubble’s life was filled from an early age with a level of distinction that was at times almost ludicrously golden. At a single high school track meet in 1906, he won the pole vault, shot put, discus, hammer throw, standing high jump, and running high jump, and was on the winning mile-relay team-that is seven first places in one meet-and came in third in the broad jump. In the same year, he set a state record for the high jump in Illinois.

As a scholar he was equally proficient, and had no trouble gaining admission to study physics and astronomy at the University of Chicago (where, coincidentally, the head of the department was now Albert Michelson). There he was selected to be one of the first Rhodes scholars at Oxford. Three years of English life evidently turned his head, for he returned to Wheaton in 1913 wearing an Inverness cape, smoking a pipe, and talking with a peculiarly orotund accent-not quite British but not quite not-that would remain with him for life. Though he later claimed to have passed most of the second decade of the century practicing law in Kentucky, in fact he worked as a high school teacher and basketball coach in New Albany, Indiana, before belatedly attaining his doctorate and passing briefly through the Army. (He arrived in France one month before the Armistice and almost certainly never heard a shot fired in anger.)

In 1919, now aged thirty, he moved to California and took up a position at the Mount Wilson Observatory near Los Angeles. Swiftly, and more than a little unexpectedly, he became the most outstanding astronomer of the twentieth century.

It is worth pausing for a moment to consider just how little was known of the cosmos at this time. Astronomers today believe there are perhaps 140 billion galaxies in the visible universe. That’s a huge number, much bigger than merely saying it would lead you to suppose. If galaxies were frozen peas, it would be enough to fill a large auditorium-the old Boston Garden, say, or the Royal Albert Hall. (An astrophysicist named Bruce Gregory has actually computed this.) In 1919, when Hubble first put his head to the eyepiece, the number of these galaxies that were known to us was exactly one: the Milky Way. Everything else was thought to be either part of the Milky Way itself or one of many distant, peripheral puffs of gas. Hubble quickly demonstrated how wrong that belief was.

Over the next decade, Hubble tackled two of the most fundamental questions of the universe: how old is it, and how big? To answer both it is necessary to know two things-how far away certain galaxies are and how fast they are flying away from us (what is known as their recessional velocity). The red shift gives the speed at which galaxies are retiring, but doesn’t tell us how far away they are to begin with. For that you need what are known as “standard candles”-stars whose brightness can be reliably calculated and used as benchmarks to measure the brightness (and hence relative distance) of other stars.

Hubble’s luck was to come along soon after an ingenious woman named Henrietta Swan Leavitt had figured out a way to do so. Leavitt worked at the Harvard College Observatory as a computer, as they were known. Computers spent their lives studying photographic plates of stars and making computations-hence the name. It was little more than drudgery by another name, but it was as close as women could get to real astronomy at Harvard-or indeed pretty much anywhere-in those days. The system, however unfair, did have certain unexpected benefits: it meant that half the finest minds available were directed to work that would otherwise have attracted little reflective attention, and it ensured that women ended up with an appreciation of the fine structure of the cosmos that often eluded their male counterparts.

One Harvard computer, Annie Jump Cannon, used her repetitive acquaintance with the stars to devise a system of stellar classifications so practical that it is still in use today. Leavitt’s contribution was even more profound. She noticed that a type of star known as a Cepheid variable (after the constellation Cepheus, where it first was identified) pulsated with a regular rhythm-a kind of stellar heartbeat. Cepheids are quite rare, but at least one of them is well known to most of us. Polaris, the Pole Star, is a Cepheid.

We now know that Cepheids throb as they do because they are elderly stars that have moved past their “main sequence phase,” in the parlance of astronomers, and become red giants. The chemistry of red giants is a little weighty for our purposes here (it requires an appreciation for the properties of singly ionized helium atoms, among quite a lot else), but put simply it means that they burn their remaining fuel in a way that produces a very rhythmic, very reliable brightening and dimming. Leavitt’s genius was to realize that by comparing the relative magnitudes of Cepheids at different points in the sky you could work out where they were in relation to each other. They could be used as “standard candles” - a term she coined that is still in universal use. The method provided only relative distances, not absolute distances, but even so it was the first time that anyone had come up with a usable way to measure the large-scale universe.
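Leavitt’s reasoning can be made concrete. Two Cepheids pulsing with the same period have the same intrinsic brightness, so any difference in their apparent brightness must come from distance alone, by the inverse-square law. In the astronomers’ magnitude scale (logarithmic, with five magnitudes equal to a factor of 100 in brightness), the ratio of distances works out to 10 raised to one-fifth of the magnitude difference. A sketch with made-up magnitudes:

```python
# Leavitt's "standard candle" logic: equal-period Cepheids are equally
# luminous, so apparent brightness alone fixes their relative distance
# (inverse-square law). The magnitudes below are invented for illustration.

def relative_distance(m_far: float, m_near: float) -> float:
    """Ratio d_far / d_near from the apparent magnitudes of two Cepheids
    with the same pulsation period. A difference of 5 magnitudes is a
    factor of 100 in brightness, hence a factor of 10 in distance."""
    return 10 ** ((m_far - m_near) / 5)

# A Cepheid 5 magnitudes fainter than its equal-period twin is
# 10 times farther away:
print(relative_distance(15.0, 10.0))  # → 10.0
```

As the paragraph above says, this yields only relative distances; turning them into absolute ones requires at least one Cepheid whose true distance is known by some other means.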

(Just to put these insights into perspective, it is perhaps worth noting that at the time Leavitt and Cannon were inferring fundamental properties of the cosmos from dim smudges on photographic plates, the Harvard astronomer William H. Pickering, who could of course peer into a first-class telescope as often as he wanted, was developing his seminal theory that dark patches on the Moon were caused by swarms of seasonally migrating insects.)

Combining Leavitt’s cosmic yardstick with Vesto Slipher’s handy red shifts, Edwin Hubble now began to measure selected points in space with a fresh eye. In 1923 he showed that a puff of distant gossamer in the Andromeda constellation known as M31 wasn’t a gas cloud at all but a blaze of stars, a galaxy in its own right, a hundred thousand light-years across and at least nine hundred thousand light-years away. The universe was vaster - vastly vaster - than anyone had ever supposed. In 1924 he produced a landmark paper, “Cepheids in Spiral Nebulae” (nebulae, from the Latin for “clouds,” was his word for galaxies), showing that the universe consisted not just of the Milky Way but of lots of independent galaxies - “island universes” - many of them bigger than the Milky Way and much more distant.

This finding alone would have ensured Hubble’s reputation, but he now turned to the question of working out just how much vaster the universe was, and made an even more striking discovery. Hubble began to measure the spectra of distant galaxies - the business that Slipher had begun in Arizona. Using Mount Wilson’s new hundred-inch Hooker telescope and some clever inferences, he worked out that all the galaxies in the sky (except for our own local cluster) are moving away from us. Moreover, their speed and distance are neatly proportional: the farther away the galaxy, the faster it is moving.
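The proportionality Hubble found is now written v = H₀·d, with the slope H₀ known as the Hubble constant. A sketch using the rough modern value of 70 km/s per megaparsec, which is an assumption of this example rather than a figure from the book:

```python
# Hubble's proportionality: recessional velocity grows linearly with
# distance. H0 here is a rough modern value, not a number from the text.
H0 = 70.0  # km/s per megaparsec (assumed)

def recession_speed(distance_mpc: float) -> float:
    """Recessional velocity in km/s for a galaxy at the given distance."""
    return H0 * distance_mpc

print(recession_speed(10))  # → 700.0   (a galaxy 10 Mpc away)
print(recession_speed(20))  # → 1400.0  (twice as far, twice as fast)
```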

This was truly startling. The universe was expanding, swiftly and evenly in all directions. It didn’t take a huge amount of imagination to read backwards from this and realize that it must therefore have started from some central point. Far from being the stable, fixed, eternal void that everyone had always assumed, this was a universe that had a beginning. It might therefore also have an end.
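Reading backwards also gives a back-of-the-envelope age: if every galaxy’s speed is proportional to its distance, then each one left the starting point roughly t ≈ d/v = 1/H₀ ago. With the assumed modern Hubble constant of about 70 km/s per megaparsec (again an illustration, not a figure from this book):

```python
# Rough age of the universe from the Hubble constant: t ≈ 1 / H0.
# All constants are assumed modern values, not figures from the text.
KM_PER_MPC = 3.0857e19      # kilometres in one megaparsec
SECONDS_PER_YEAR = 3.156e7  # seconds in one year
H0 = 70.0                   # km/s per megaparsec (assumed)

hubble_time_years = KM_PER_MPC / H0 / SECONDS_PER_YEAR
print(f"{hubble_time_years:.1e}")  # roughly 1.4e10, i.e. ~14 billion years
```

The agreement of this crude estimate with the modern figure is partly luck, since the expansion rate has not been constant, but it shows how directly a beginning follows from Hubble’s proportionality.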

The wonder, as Stephen Hawking has noted, is that no one had hit on the idea of the expanding universe before. A static universe, as should have been obvious to Newton and every thinking astronomer since, would collapse in upon itself. There was also the problem that if stars had been burning indefinitely in a static universe they’d have made the whole intolerably hot - certainly much too hot for the likes of us. An expanding universe resolved much of this at a stroke.

Hubble was a much better observer than a thinker and didn’t immediately appreciate the full implications of what he had found. Partly this was because he was woefully ignorant of Einstein’s General Theory of Relativity. This was quite remarkable because, for one thing, Einstein and his theory were world famous by now. Moreover, in 1929 Albert Michelson - now in his twilight years but still one of the world’s most alert and esteemed scientists - accepted a position at Mount Wilson to measure the velocity of light with his trusty interferometer, and must surely have at least mentioned to Hubble the applicability of Einstein’s theory to his own findings.

At all events, Hubble failed to make theoretical hay when the chance was there. Instead, it was left to a Belgian priest-scholar (with a Ph.D. from MIT) named Georges Lemaître to bring together the two strands in his own “fireworks theory,” which suggested that the universe began as a geometrical point, a “primeval atom,” which burst into glory and had been moving apart ever since. It was an idea that very neatly anticipated the modern conception of the Big Bang but was so far ahead of its time that Lemaître seldom gets more than the sentence or two that we have given him here. The world would need additional decades, and the inadvertent discovery of cosmic background radiation by Penzias and Wilson at their hissing antenna in New Jersey, before the Big Bang would begin to move from interesting idea to established theory.

Neither Hubble nor Einstein would be much of a part of that big story. Though no one would have guessed it at the time, both men had done about as much as they were ever going to do.

In 1936 Hubble produced a popular book called The Realm of the Nebulae, which explained in flattering style his own considerable achievements. Here at last he showed that he had acquainted himself with Einstein’s theory - up to a point anyway: he gave it four pages out of about two hundred.

Hubble died of a heart attack in 1953. One last small oddity awaited him. For reasons cloaked in mystery, his wife declined to have a funeral and never revealed what she did with his body. Half a century later the whereabouts of the century’s greatest astronomer remain unknown. For a memorial you must look to the sky and the Hubble Space Telescope, launched in 1990 and named in his honor.