Entry updated 14 February 2017. Tagged: Theme.

In its strict meaning, "entropy" is a thermodynamics term, coined by the German physicist Rudolf Clausius (1822-1888) in 1865 to quantify the heat, per unit temperature, that must be put into a closed system to bring it to a given state. The Second Law of Thermodynamics – often stated in terms of work as "it is impossible to produce work by transferring heat from a cold body to a hot body in any self-sustaining process" – can alternatively be rendered: "Entropy always increases in any closed system not in equilibrium, and remains constant for a system that is in equilibrium."
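In modern notation (a standard textbook formulation, not part of the original entry), Clausius's definition and the Second Law read:

```latex
% Clausius's definition: entropy change along a reversible path from state A to B
\[
  \Delta S = \int_{A}^{B} \frac{\delta Q_{\mathrm{rev}}}{T}
\]
% The Second Law for a closed (isolated) system:
\[
  \Delta S \geq 0,
\]
% with equality holding only for a system already in equilibrium.
```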

To put it less technically: whenever there is a flow of energy some is always lost as low-grade heat. For example, in a steam engine, the friction of the piston is manifested in non-useful heat, and hence some of the energy put into it is not turned into work. There is no such thing as a friction-free system, and for that reason no such thing as a perfect machine. Entropy is a measure of this loss. In a broader sense we can refer to entropy as a measure of the order of a system: the higher the entropy, the lower the order. There is more energy, for example, tied up in complex molecules than in simple ones (they are more "ordered"); the Second Law can therefore be loosely rephrased as "systems tend to become less complex". Heat flows, so ultimately everything will tend to stabilize at the same temperature. When this happens to literally everything – in what is often called the heat-death of the Universe – entropy will have reached its maximum, with no order left, total randomness, no life, the end. (There is, however, an argument about whether the concept of entropy can properly be related to the Universe as a whole.)

Of course, the amount of usable energy in the Universe, primarily supplied by the stars, is unimaginably huge, and the heat-death of the Universe is billions of years away. Isaac Asimov's amusing "The Last Question" (November 1956 Science Fiction Quarterly) has a supercomputer, which for aeons has been worrying about the heat-death, reversing entropy at the last possible moment. The scientist Freeman Dyson, in "Time Without End: Physics and Biology in an Open Universe" (July 1979 Reviews of Modern Physics), confronts the same question with a similar optimism and, one must assume, rather better mathematics.
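The claim that heat flow always increases total entropy can be checked with a back-of-envelope calculation (an illustrative sketch, not part of the original entry; the function name and figures are invented for the example). For heat Q passing between two large reservoirs, each reservoir's entropy changes by Q/T, so the net change is Q(1/T_cold − 1/T_hot), which is positive whenever the hot body is genuinely hotter:

```python
def entropy_change(q_joules: float, t_hot: float, t_cold: float) -> float:
    """Net entropy change (J/K) when heat q_joules flows from a reservoir
    at t_hot to one at t_cold (both in kelvin). Positive iff t_hot > t_cold."""
    return q_joules * (1.0 / t_cold - 1.0 / t_hot)

# 100 J passing from boiling water (373 K) to ice water (273 K):
delta_s = entropy_change(100.0, 373.0, 273.0)
print(round(delta_s, 4))  # a positive number: disorder has increased
```

When the two temperatures are equal the result is zero – the "remains constant in equilibrium" clause of the Second Law.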
Maxwell's Demon, a hypothetical entity which could apparently reduce the entropy of a closed system, is a famous Thought Experiment (which see) in thermodynamics, and is invoked in Joseph Samachson's "A Feast of Demons" (March 1958 Galaxy) as by William Morrison. Local images of entropy, like the huge red Sun at the end of H G Wells's The Time Machine (1895), long antedate the general use of the word; indeed, dying-Earth stories generally (see End of the World) can be seen as entropy stories, both literally and metaphorically.

Although "entropy" has been a technical term for a long time, it is only since the early 1960s that it has, in its extended meaning, become a fashionable concept (although the word sometimes popped up in sf earlier, as in House of Entropy [1953] by H J Campbell as Roy Sheldon). Since the 1960s, to the annoyance of some scientifically minded people, the extended concept of increasing entropy includes holes wearing in socks, refrigerators breaking down, coalminers going on strike, and death. These are indeed all examples of increasing disorder in a technical though not necessarily a moral sense. Life itself is a highly ordered state, and its very existence is an example of negative entropy (negentropy). It is as if, though the Universe is running down, there are whirlpools of local activity where things are winding up. All forms of information, whether in the form of the DNA code or the contents of this encyclopedia, can be seen as examples of negentropy. It is natural, then, that a popular variant on the entropy story is the Devolution story.

Entropy has become a potent metaphor. It is uncertain who first introduced the term into sf, but it is likely that Philip K Dick, who makes much of the concept in nearly all his work, was the first to popularize it. He spells it out in Do Androids Dream of Electric Sheep? (1968), where entropy, or increasing disorder, is imaged as Kipple: "Kipple is useless objects, like junk mail or match folders after you use the last match or gum wrappers or yesterday's homeopape. When nobody's around, kipple reproduces itself ... the entire universe is moving towards a final state of total, absolute kippleization."

It was, however, in New-Wave writing, especially that associated with the magazine New Worlds, that the concept of entropy made its greatest inroads into sf. J G Ballard has used it a great deal, and did so as early as "The Voices of Time" (October 1960 New Worlds), in which a count-down to the end of the Universe is accompanied by more localized entropic happenings, including the increasing sleepiness of the protagonist. Pamela Zoline's "The Heat Death of the Universe" (July 1967 New Worlds), about the life of a housewife, is often quoted as an example of the metaphoric use of entropy. Another example is "Running Down" (in New Worlds 8: The Science Fiction Quarterly, anth 1975, ed Hilary Bailey) by M John Harrison, whose protagonist, a shabby man who perishes in earthquake and storm, "carried his own entropy around with him". The concept appears in the work of Thomas M Disch, Barry N Malzberg, Robert Silverberg, Norman Spinrad and James Tiptree Jr as a leitmotiv, and also in nearly all the work of Brian W Aldiss, which typically displays a tension between entropy and negentropy, between fecundity and life on the one hand, stasis, decay and death on the other. Outside Genre SF, Thomas Pynchon has used images of entropy many times, especially in Gravity's Rainbow (1973) and in the early story "Entropy" (Spring 1960 Kenyon Review; 1977 chap). George Alec Effinger's What Entropy Means to Me (1972) is not in fact a hardcore entropy story at all (apart from a tendency for things to go wrong), but Robert Silverberg's "In Entropy's Jaws" (in Infinity 2, anth 1971, ed Robert Hoskins) is a real entropy story and a fine one, exploring the metaphysics of the subject with care.

Colin Greenland wrote a critical book called The Entropy Exhibition: Michael Moorcock and the UK "New Wave" (1983), and it is indeed Moorcock who has perhaps made more complex use of entropy and negentropy than any other sf writer, and not just in The Entropy Tango (fixup 1981); the two concepts run right through his Dancers at the End of Time and Jerry Cornelius sequences. Jerry Cornelius seems for a long time proof against entropy, and keeps slipping into alternate realities as if in hope of finding one whose vitality outlives its decay, but like a Typhoid Mary he carries the plague of entropy with him, and ultimately, especially after the death of his formidably vital and vulgar mother, succumbs to it himself, becoming touchingly more human, though diminished.

Although it was in the 1960s and 1970s that the entropy-story peaked, the image continued to appear, as in Dan Simmons's Entropy's Bed at Midnight (1990 chap). Ted Chiang's "Exhalation" (in Eclipse 2, anth 2008, ed Jonathan Strahan) is a limpid presentation of the entropy principle as discovered by a literally introspective member of a community of sentient Robots. These beings' mechanisms – including thought – depend on the pressure difference between their Pocket Universe and an unknown exterior. The usual heat transfer here becomes a flow of life-giving argon gas, which inexorably lessens with time as the pressures equalize.

In all of these works, entropy is a symbol or metaphor through which the fate of the macrocosm, the Universe, can be linked to the fate of societies and of the individual – a very proper subject for sf. Negentropy versus entropy is usually seen as an unequal battle, David against Goliath, but sickness, sorrow, rusting, cooling and death can be held at bay, locally and occasionally, by passion and movement and love. Looked at from this perspective, entropy is one of the oldest themes in literature, the central concern, for example, of Shakespeare, Donne, Milton and – especially – Charles Dickens. [PN/DRL]
