Stories of scientific discovery often turn on moments of imagination, dreams and the unreal. This is no coincidence. Discovery and invention use the power of the unreal to usher new things into existence. Scientists stand outside the boundaries of the real when they push against it, writes Jimena Canales.
Eureka!—the expression is frequently used to designate the moment of discovery in science, when a genius idea suddenly enters the mind of the researcher. What prompts such moments, and what can we do so that they come to us more frequently? The utterance of the word in the context of science is often attributed to the Greek polymath Archimedes, who, as he was getting into a bathtub, had a brilliant idea that led to the principle that now goes by his name. Two centuries after it allegedly took place, Vitruvius, the Roman architect and engineer, turned the episode into the story we know today. Archimedes, he recounted, had “leapt out of the vessel in joy, and, returning home naked, cried out with a loud voice that he had found that of which he was in search, for he continued exclaiming, in Greek, εὑρηκα, (I have found it out).”
Newton was fully dressed when he arrived at his breakthrough, yet the idea also caught him by surprise. Newton described his eureka experience to his biographer William Stukeley, who wrote it down soon after their conversation took place and passed it on to posterity. “After dinner, the weather being warm,” he wrote, “we [Newton and I] went into the garden and drank thea, under the shade of some apple trees…he told me, he was just in the same situation, as when formerly, the notion of gravitation came into his mind. It was occasion’d by the fall of an apple, as he sat in contemplative mood.”
When the French microbiologist Louis Pasteur wrote about moments of discovery, he was less colorful, noting that in science chance favored the prepared mind. For him, a good education in science that included laboratory practice was essential for churning out more and more eureka moments that could lead to great discoveries.
Einstein’s own account of the idea that led to the discovery of general relativity shared some elements with Newton’s and Archimedes’. He too was in a contemplative mood and caught by surprise. Einstein described the incident in a conversation with a journalist from The New York Times who visited him at home in Berlin shortly after the experimental verification of his theory had made him world famous: “It was from his lofty library,” wrote the reporter, “that he observed years ago a man dropping from a neighboring roof - luckily on a pile of soft rubbish - and escaping almost without injury.” The headline for the story ran: “Not Inspired As Newton Was But by the Fall of a Man from a Roof Instead of the Fall of an Apple.” Scholars and historians have since dismissed Einstein’s account as apocryphal. Towards the end of his life Einstein admitted that there was really not much truth in how scientists, including himself, described how they came up with their discoveries. “The worst person to document any ideas about how discoveries are made is the discoverer,” he told an interviewer.
Prominent scientists are routinely asked to account for the arrival of their genius ideas. A common response is that breakthroughs appear while sleeping. It is thanks to curious dreams that we now have the arrangement of the elements in the periodic table (Dmitri Mendeleev), the theory of evolution by natural selection (Alfred Russel Wallace), the chemistry of nerve signals (Otto Loewi), the structure of DNA (James Watson), and many others. In these accounts, dreams appear as catalysts that bring imaginary otherworlds into this one.
Scientists’ testimony is almost always given retrospectively, long after the scientific community and the general public have been convinced of the importance of the work in question. The social aspect of discovery is neither punctual nor sudden, but much the opposite: drawn-out, contested and frequently marked by bitter priority disputes. Simultaneous (or near-simultaneous) discoveries are the rule rather than the exception in science. For this reason, scientific biographies most often revolve around a main character who competes against a rival who is closing in on the prize. But in contrast to sports, there are no fixed goalposts in the bitter wars of discovery. Sometimes victory is attributed to the scientist who first observed a new aspect of nature, other times to the one who first had the idea that led to an observation performed by somebody else, other times it rewards the first to offer a valid theoretical explanation of what had been poorly understood, while on yet other occasions the deciding criterion is who published first. Different standards apply to different cases. Sharp, agate-point-like moments of discovery are only found after priority has been firmly established, and that usually takes a long while.
Since most of the accounts about discovery—including the paradigmatic ones of Archimedes, Newton and Einstein—describe moments which are sudden, personal and unintelligible, most scientists and philosophers have simply given up studying the process further, arguing that discovery is a “private art” too unruly, inchoate, and slippery to merit serious consideration. In fact, setting discovery aside became a programmatic recommendation of much twentieth-century philosophy of science. Hans Reichenbach, one of its most prominent representatives and a close colleague of Einstein, noted that since “discoveries need a kind of mythology” it was better to set the process aside and dedicate all hands on deck to studying only those aspects of science that could be tested experimentally.
Since most of the accounts about discovery describe moments which are sudden, personal and unintelligible, most scientists and philosophers have simply given up studying the process further.
Yet we continue to be fascinated by the topic, partly because discovery changes what we include in the list of what exists. Usually it adds new stuff, but other times it involves eliminating things that we once fervently believed in (such as phlogiston, hysteria, the ether, and absolute time and space). When the process includes the discovery of something new, that something is retrospectively determined to have always already been there, and our entire past needs to be reevaluated. So, we now know that the Big Bang set our universe in motion, that dinosaurs once roamed the earth, and that King Tut likely suffered (and may have died) from malaria. In the case where discovery leads to an elimination, we retrospectively attribute the belief in the existence of that something to faulty thinking and false assumptions. Hysteria, animal magnetism and the caloric, among others, have all been discovered to have been manifestations of something else which does exist in nature, such as mass suggestion, hypnotic influences and heat.
Things that come into our world via discovery are different from those which arrive via invention. Ever since Newton used the phrase hypotheses non fingo, scholars have debated the relation between discovery and invention. The two are notoriously hard to parse, yet discovery is generally characterized by the belief that the entity in question already existed, whereas invention is generally understood as the creation of something new. Gravity existed before Newton; relativity before Einstein; and most of the many other entities uncovered by science that are hard or impossible to observe, from the electron to the Higgs boson, already existed in our universe. A few holdouts disagree: anti-realists, instrumentalists and social constructivists all claim that there simply is no absolute reality underpinning our entire universe. Yet nobody would claim that the steam engine, the telephone, the airplane or the automobile existed before they were invented. Certainly not in the same way that gravity, relativity, the electron, or the Higgs boson did.
Einstein understood the difference between scientific discovery and invention by saying that if Newton and Leibniz had not lived, somebody else would have arrived at their discoveries. In contrast, if Beethoven had not lived, Einstein believed that the Eroica symphony would not have come to be. Yet inventiveness and creativity are as essential to discovery as they are to invention.
If we stuck only with the astronomer Carl Sagan’s view of science, one which he described in his bestselling The Demon-Haunted World (1995), it would be tempting to consider science merely as an effective process for eliminating false beliefs. But science is much more than that. Its most powerful characteristic is based on how it is used to change the world by bringing new technologies and innovations into being. The elementary lesson that elves and giants are mythical creatures who do not exist in the real world pales in comparison with how motors, computers, nuclear bombs and virtual reality technologies have changed it. Yet Sagan’s highly popular vision of science still dominates the public’s understanding of it.
Science has indeed eliminated many superstitious beliefs and corrected countless misconceptions, but it has also given life to many new imaginings, some of which will likely change our world in the future. Scientific discovery brings new things into the purview of existence in a roundabout way—by way of fictions. The anteroom of experimental science is populated by wonderful imaginary creatures featured in thought experiments which have been essential for the development of thermodynamics, relativity and quantum mechanics (among other branches of science).
The anteroom of experimental science is populated by wonderful imaginary creatures featured in thought experiments
Open almost any dictionary and turn to the term “demon,” and you will find the subheading “in science,” a pairing that does not trip so easily off the secular tongue. In the hands of the British inventor Charles Babbage, Laplace’s demon, a creature named after the mathematician Pierre-Simon Laplace, became a blueprint for some of the first computers. Maxwell’s demon, a creature named after James Clerk Maxwell, motivated researchers to develop more efficient machines and better ways of interfacing with them. Descartes’s demon, a creature thought up by René Descartes in the seventeenth century, whose main ability was to take over your senses and install an alternative reality, is still used by scientists to understand how our minds work and to develop better virtual reality technologies. The search for these creatures is ongoing. Certain demons, pace Sagan, continue to thrive in the age of reason.
Spending time concocting imaginary beings might seem like a useless, even infantile, exercise. It is difficult enough to live and work with things whose existence is not up for debate, such as what impinges on our safety, health and income. But if we look closely at the latter, we soon notice that actually existing objects play slight roles in our everyday discourse. Legal and juridical concepts govern our lives, yet they cannot be pinned down to any particular thing or referent. Some neuroscientists might try to find the place where each and every one of our memories is stored, yet most of us think of them as something which does not exist in this world the way that, say, a table does. We tend to associate rational thought with belief in only certain forms of existence, while neglecting the many others that play key roles in our lives.
As heirs to the Enlightenment, most of us would like to live in a world with clear-cut standards for distinguishing the imaginary from the real. Our literary tradition, starting with Don Quijote and Hamlet, is riddled with characters who struggle with precisely this issue. Throughout history we have attempted to create systems and institutions to separate the two, including removing from polite society those individuals who cannot properly tell them apart. Perhaps no one had higher hopes for achieving this goal than the eighteenth-century jurist Jeremy Bentham, often celebrated for his revolutionary ideas designed to inculcate reasonable behavioral and thought habits into those who most resisted them. His most well-known book, Panopticon: Or, The Inspection-House (1791), a manual for “punishing the incorrigible, guarding the insane, reforming the vicious, confining the suspected, employing the idle, maintaining the helpless, curing the sick, instructing the willing in any branch of industry, or training the rising race in the path of education,” advocated a new approach to managing prisons and asylums based on observation towers that could be used to survey their unruly inmates.
Bentham’s lesser-known work, the Theory of Fictions, sought to manage unruly minds rather than bodies. It tried to bring order to all sorts of “non-entities” that played important roles in society, including representations of the devil that appeared as “having a head, body and limbs, like man’s, horns like goat’s, wings like bat’s, and a tail like a monkey’s.” Bentham’s work on fictions remained unfinished, yet the goal of clearing our universe of all that is crazy and unreal still motivates us.
Few thinkers would dare to think otherwise. One of the few to strike such a discordant note was the early twentieth-century aristocratic Austrian philosopher Alexius Meinong. He became convinced that the intellectuals around him were in thrall to a “prejudice in favor of the real.” To combat such a prejudice, Meinong attempted to create an entirely new area of knowledge, called the Theory of Objects, that went against the general trend dominating most intellectual inquiries of his era. His philosophical project consisted in trying to find a place for “homeless” [heimatlos] things that did not yet fit within the realm of the “real” [wirklich]. For Meinong, even the square circle exists in some strange capacity; otherwise, we could not even talk about it.
Unsurprisingly, the philosophical community rallied against Meinong and united under the fervent cry of “I am not a Meinongian!” The mathematician Bertrand Russell, who had at first been curious about the project, changed tack: he became one of Meinong’s most vicious attackers, accusing the Austrian of introducing into philosophy an “unduly populous realm of being,” as primitives were wont to do, even as he became one of Einstein’s most vocal defenders. Russell and his acolytes embarked instead on the opposite project: trying to cut such nonexistent objects out of philosophy by considering them as cockamamie as listening to the sound of silence. Under their influence, most scholars still concur that the generally agreed-upon list of the “existing” should not be up for modification. Meinong’s philosophy, mostly forgotten, is usually only remembered as the “bête noire of analytical philosophy,” in reference to the school of thought that continues to effectively police the boundary around the real.
Yet science is much more than a mere procedure for testing hypotheses: it is also a process that permits new things (think of anesthesia, antibiotics, vaccines or nuclear energy) to enter into our world. That aspect of science is an integral part of our history, arguably its most important side, because the changes it brings about are what makes history, leading us to reevaluate the future and the past. Scientists stand outside the boundaries of the real when they push against it.
Imaginings with the power to become real alter our world in unsuspected ways, catching even the researchers themselves by surprise. But fear not the world of the scientific imagination: it is not the free-for-all universe of unreason feared by Enlightenment reformers, but one with a long tradition composed of main characters, lesser players, and tried and trusted plots. As such, it is not equivalent to everything that does not exist. The rabbit hole is not as deep as it may at first appear. Its contents offer us a preview of the world of science long before the curtains are drawn and the show begins, permitting us to enjoy the rehearsal of our own technological future.