absolute classifications” since “there is always a vacillation” between them (4). Of course, this
studied vagueness, characteristic of Barthes’ later period, has not stopped anyone from co-opting
the terms for more rigid polemical purposes.
5. Some interpretations of information, following Wiener, prefer to consider information
the negative of entropy. The difference, so far as I understand it, mostly derives from whether
one considers information a property of the message or the receiver’s knowledge, but for our
purposes, the distinction is unimportant: in both cases the two quantities increase in exact
proportion in absolute value, differing only in sign.
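The sign relationship can be made explicit. As a sketch, in my own notation (the symbols are not the essay's):

```latex
% Shannon's entropy of a source with symbol probabilities p_1, ..., p_n,
% which on his convention measures the information of the message:
H = -\sum_{i=1}^{n} p_i \log_2 p_i
% On Wiener's convention, information is instead the negative of entropy:
I = -H
% Hence |I| = |H|: the two magnitudes increase in exact proportion,
% differing only in sign.
```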
6. The otherwise astute Steven Weisenburger provides a case study. Discussing Gaddis’s
first novel, he claims that “Even in The Recognitions, though, Gaddis is well aware that
cybernetic theorists had recently discovered a telling homology between the mathematical
equation for computing entropy and the equation for calculating the information per message,
that is, the message’s susceptibility to semiotic entropy or ‘noise’” (Fables 212). This is entirely
wrong. As may be seen from my description, Shannon’s definition says nothing about a
tendency toward noise: in fact, because it evaluates the text separately from any channel, it is
exempt from the Second Law of Thermodynamics, which Weisenburger seems to have conflated
with entropy proper. (Only when the message is transmitted through a channel does the Second Law
apply.) Furthermore, Shannon’s definition is explicitly non-semiotic, because it separates
information (i.e., the actual characters) from meaning (i.e., what they semiotically represent)
(“Mathematical” 1).
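The homology Weisenburger invokes is real but purely formal, which is worth seeing on the page. As a sketch (these are the standard formulas, not the essay's notation):

```latex
% Gibbs's statistical-mechanical entropy over microstate probabilities p_i
% (k_B is Boltzmann's constant):
S = -k_B \sum_i p_i \ln p_i
% Shannon's entropy over symbol probabilities p_i:
H = -\sum_i p_i \log_2 p_i
```

The two expressions are identical up to a constant factor and the base of the logarithm; nothing in Shannon's formula itself implies a tendency toward noise, which enters only when a channel is considered.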
7. How a focus on meaning can confuse literary treatments of information may be seen in
Thomas Pynchon’s early story “Entropy.” In his introduction to the retrospective collection of
his early short stories, Slow Learner, Pynchon warns that, though his early work had given
readers the impression that he possessed a deep knowledge of entropy, they should “not
underestimate the shallowness of [his] understanding” (12-13). In “Entropy,” in fact, Pynchon
has his character Saul refer to five dramatically distinct information theory terms—noise,
redundancy, ambiguity, irrelevance, and leakage—as essentially synonymous (90-91). All five
impede meaning, of course, but the ways they act on information to do so differ sharply. As a
result, Saul’s application of information theory to relationships becomes incoherent—which
might be an appropriate, if unintended, result for the story.
8. This paradox may not apply in quite the same way to literature as biology: after all,
biological mutation is unguided and genetic, while literary mutation is shaped by critical
intelligence and may be drawn from anywhere. Yet this difference is likely why, despite The Gold-Bug
Variations’ interest in genetic mutation, there are so few errors (and so much conventionality,
not only in language but plot and character) over its 600 pages, excepting the one
“diseasterously” in the passage from which I quote (and, of course, in the book’s title).
9. We should take a moment, incidentally, to reject the commonplace view that modern
art is characterized by an original defiance of convention, while postmodern art is based upon
repeating conventions and undoing totalized orders. After all, contrary to Eco, it is the
arch-postmodern novel J R that Steiner criticized as too unconventional to be read, while it is the
modernist masterpiece The Making of Americans that Wilson found too repetitive to finish. This
distinction looks particularly inadequate given the equally common, and diametrically opposed,
identification of modernist literature as affirming totalized order and postmodern literature as
subverting metanarratives. Not only does that distinction work no better on its own—for
instance, Leo Bersani, in The Culture of Redemption, complains about supposed totalized
aesthetic order in modernist work such as Joyce’s Ulysses but ignores how the contemporary