The Ends of Information: Searching for Truth in the Digital Age
"T he universe,” wrote the Argentine author Jorge Luis Borges, “(which others call the library), is composed of an indefinite and perhaps infinite number of hexagonal galleries, with vast air shafts between, surrounded by very low railings.”
The library in question is the imaginary Library of Babel, described by Borges in a short story of the same name. Occupied only by wandering librarians, it is eternal, an endless collection of unique books, all composed of the same twenty-five orthographical symbols, repeated in infinite permutations. And in the infinitude of this library and its symbols, realize the librarians, everything is to be found.
In Borges’ universes, though, “everything” is never so simple as it sounds. “Everything,” he writes of the infinite contents of the library, includes not merely the innumerable variations of letters on a page, but
the minutely detailed history of the future, the archangels’ autobiographies, the faithful catalogue of the Library, thousands and thousands of false catalogues, the demonstration of the fallacy of those catalogues, the demonstration of the fallacy of the true catalogue, the Gnostic gospel of Basilides, the commentary on that gospel, the true story of your death, the translation of every book in all languages, the interpolations of every book in all books.
Everything, in other words, includes everything and its converse, truth and its repudiation alike.
Reading Borges today, one thinks it merciful that he did not live to see the internet age. His oeuvre, renowned for its dealings with memory, signs, infinity, and the like, anticipated today’s cultural atmosphere to a stunning degree. It is almost impossible to read about information—digital or otherwise—without stumbling across a reference to the Argentine, and for good reason. To enter his world is to be treated to an intricate, encyclopedic, and labyrinthine system of references—the internet avant la lettre. The mercy, of course, is that Borges’ work is coherent; his stories are short (he sometimes wrote reviews of his own non-existent books, declaring it not worth the trouble of writing the books themselves), and intentional to the point of being didactic. If anything, his concision suggests that his relationship to infinity was as much a fear as a fascination.
We who survive Borges may envy him in his tomb. As even the most articulate describers of the information zeitgeist will tell you, there is no metaphor sufficient to capture what we have lived through since the birth of the internet at the end of the last millennium. First, there is the issue of speed. As computer scientist Jaron Lanier has described it, the totality of available information has grown at a scarcely fathomable pace: “It’s as if you kneel to plant the seed of a tree and it grows so fast that it swallows your whole town before you can even rise to your feet.” Yet speed is the easy half of the puzzle; far more bewildering is the question of scale.
In 1949, when information theory was in its infancy, the Library of Congress was estimated to contain one hundred trillion (10^14) bits. It was then thought to be the largest information repository in the world. Since the birth of the internet, though, the numbers have exceeded anything remotely intelligible to us. We now speak of zettabytes (10^21) and yottabytes (10^24), the latter of which is roughly ten billion times the Library of Congress of only seventy years ago. The creation, collection, and dissemination of information in the twenty-first century have, by virtue of their scale, surpassed our capacity to wrap our minds around them. For all intents and purposes, our library is like Borges’: effectively infinite.
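For those who like to see the arithmetic spelled out, the comparison runs as follows—treating both figures as bare powers of ten, and glossing over the difference between bits and bytes as the popular shorthand does:

\[
\frac{10^{24}\ \text{(one yottabyte, in round figures)}}{10^{14}\ \text{(the 1949 Library of Congress estimate)}} = 10^{10} = \text{ten billion}
\]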
It is not odd, therefore, that we as a society seem to be experiencing a collective sense of disorientation. It floats latent in the air, the wifi, the socialsphere, and crystallizes in headlines and articles like “Are We Consuming Too Much Information?” and “Death by Information Overload.” The term “information overload” itself was popularized by Alvin Toffler’s 1970 book, Future Shock. Fifty years later, a whole cottage industry of books, classes, talks, and the like is devoted to helping people sort through the surfeit of available information (doing so, of course, by providing them with more information). Clearly, we think, there is too much to know.
What makes this feeling of overload so freighted, I believe, is its moral dimension. Old is the notion that more and better information improves ethical judgement. Socrates, the authors of the Gospels, and Shakespeare, for instance, have all suggested that moral actors are best off when making informed choices, and, inversely, that uninformed actors may be forgiven their moral errors. Certainly it’s not always so simple; John Rawls, for one, proposed with his “veil of ignorance” thought experiment that some information can prejudice our actions and lead us to make arguably less moral choices. And what if the mere presence of information itself proves an obstacle to moral action?
In his introduction to The Best American Essays 2007, the author David Foster Wallace coined the phrase “Total Noise,” which he described as “the seething static of every particular thing and experience, and one’s total freedom of infinite choice about what to choose to attend to and represent and connect, and how, and why, etc.” Total Noise, according to Wallace, is more than just too much information; it is the incapacity to stay conversant with the ever-expanding volume of information necessary to remain a moral citizen of the world. Wallace’s feeling of being overwhelmed therefore isn’t only a sense of epistemic inadequacy—it is a distinct awareness of failing on levels ethical and existential.
Lest these failings seem like the abstractions of a particularly torturous graduate school philosophy paper, let’s take an aerial survey of American information culture right now, in the fall of 2021. The immediately obvious issue is COVID-19: the propagation of both mis- and dis-information (the key difference being intentionality) has been declared an “infodemic” by the WHO. On top of that, we might point to the QAnon conspiracy theory or the “Big Lie” about election fraud. The actions of adherents of both of these credos are patently inexcusable—but one must admit the possibility that, if someone genuinely believed that, say, Democratic politicians were running a satanic child sex-trafficking ring, she would be morally obliged to do something about it. Lastly, and more broadly, the atomization of American culture, fueled largely by social media, has led to what the literary critic Sven Birkerts has called the “balkanization of interests,” as well as its more explicitly nefarious counterpart, the ideological echo chambers of contemporary politics. Information overload, in other words, can credibly be viewed as the source of our most pressing sociopolitical issues in America today.
So where does that leave us? We feel overwhelmed by a perceived excess of information, which in turn makes us morally agitated. Our social, political, and cultural institutions all suffer for it. This raises, then, a question: if all this information is the source of so many problems, why are we producing and collecting it? What is information for, and what do we hope to find at the bottom of it all?
“Infoglut” is, at present, the term of choice for describing this flood of information, but the feeling of inundation itself is far older than the neologism. More than two thousand years ago, the author of Kohelet (Ecclesiastes) decried the excess of books being written in his own time. The first-century Roman philosopher Seneca derided his peers for mindlessly accumulating books, on the grounds that their pages contained more information than a person could possibly absorb in a lifetime. The Muslim historian Ibn Khaldun made similar complaints in the fourteenth century, and no lesser figures than Gottfried Wilhelm Leibniz and Alexander Pope despaired at the “deluge” of books that followed Gutenberg’s invention of the movable-type printing press.
Curiously, though, what spurred these experiences of infoglut was not technological innovation, but cultural change. To be sure, the invention of the printing press galvanized the spread of information; historian Elizabeth Eisenstein has famously argued that the printing press effectively caused modernity in Europe. But China and Korea developed printing—including movable type—centuries before Europe without ever succumbing to the same sense of overload. On the other hand, medieval Islamic societies, despite lacking mechanical printing technology, were sufficiently inundated by information to prompt regular complaints from leading scholars. The difference, according to Ann Blair’s Too Much to Know, was cultural: “the invention of printing in Europe coincided with a renewed enthusiasm, visible in earlier centuries but revitalized by the humanists, for the accumulation of information.” Whether in medieval Baghdad, seventeenth-century Holland, or present-day America, an overwhelming amount of information was always available; the question was simply whether or not it was accompanied by a cultural belief in the importance of its comprehensive accumulation.
In “The Library of Babel,” Borges describes the moment when a “librarian of genius” discovers the fundamental law of the Library, that all things are found therein:
When it was proclaimed that the Library contained all books, the first impression was one of extravagant happiness. All men felt themselves to be masters of an intact and secret treasure. There was no personal or world problem whose eloquent solution did not exist in some hexagon.
In a rush, the librarians set off to plumb the endless halls of the library in search of the book that might contain the answers to their own lives, or to the origin and fate of the library itself. For centuries, relates the narrator, they have kept up the search.
Quixotic as Borges’ librarians may seem, they are not as alien to us as we would like to think. Our problem with infoglut isn’t the glut, precisely—it’s our unstated conviction that, in the ever-growing haystack, there is a needle to be found. Infoglut is only a problem because we happen to believe that there is some finite quantity of important knowledge that can only be found by parsing an ever-growing accumulation of information. Although infoglut refers literally to the haystack, its implicit quarrel is with the needle.
One of the primary impacts of the Enlightenment in the seventeenth and eighteenth centuries was to undermine the nearly two-thousand-year-old belief in divine truth. Thinkers like Spinoza illuminated the too-human origins of the Bible, while early scientists like Newton showed that the world’s purportedly supernatural order could actually be explained through pure, mechanical laws. In particular, the discovery of deep geologic time—that the Earth was far, far older than the Bible claimed—shook the fundaments of Western knowledge, incontrovertibly repudiating the book that had until then been thought the supreme arbiter of truth.
With the Bible’s credibility reduced to rubble, the West began its long, slow abandonment of Christian explanations of the world, opening itself instead to the novelties of science. Yet, while this entailed rejecting the divine coherence of the world, those who put their faith in natural science did not lose their conviction that explanations of the world must ultimately accord with one another. Instead, they came to believe that the world was governed by a collection of physical laws: all in harmony, and all eventually discoverable by humanity’s rational, observational faculties.
The process of this progressive discovery—the assembly of the great edifice of Science—has been so successful and become so much a part of the fabric of our society that we rarely notice it today. Everyone who passed through an American public school is familiar with the scientific method: advancing iteratively through trial and error, buttressed by hypothesis, evidence, and reevaluation, and leading, no matter how slowly, to an increasingly accurate conception of the world. Along the way, we discard our less accurate ideas—out the window with the flat Earth, miasma theory, lobotomies. In his book Delete: The Virtue of Forgetting in the Digital Age, Viktor Mayer-Schönberger offers a clarifying anecdote:
[My old friend] limited his personal library to exactly two hundred books. Once he had read a new book, he would decide whether it was among the two hundred best books he’d ever read. If so, he would add it to his collection, and discard the lesser one. Over time, he thought this process of constant filtering and deliberate forgetting would continuously improve his library’s quality, so that he would retain in his external memory only the really important and valuable thoughts.
This is the teleology at the heart of our civilization. Essentially, we see the world not only as an open book, but as a comprehensive one, composed of knowable and ultimately coherent truths. There is no personal or world problem whose eloquent solution does not exist in our hexagon, if only we are willing and able to sort through 10^24 bits worth of information.
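The friend’s method is, at bottom, a filtering algorithm: keep the best two hundred items, discard whatever falls below the cut. A minimal, purely hypothetical sketch in code—assuming, generously, that a numeric rating can stand in for the friend’s judgment:

```python
import heapq

class CuratedLibrary:
    """Keep only the best-rated books ever read (a hypothetical sketch;
    a numeric rating stands in for the reader's judgment)."""

    def __init__(self, capacity=200):
        self.capacity = capacity
        self._shelf = []  # min-heap of (rating, title): the least-valued book sits on top

    def read(self, title, rating):
        if len(self._shelf) < self.capacity:
            heapq.heappush(self._shelf, (rating, title))
        elif rating > self._shelf[0][0]:
            # The new book displaces the least valued one: deliberate forgetting.
            heapq.heapreplace(self._shelf, (rating, title))

    def titles(self):
        # Best-rated first.
        return [title for _, title in sorted(self._shelf, reverse=True)]
```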
We denizens of the post-Enlightenment also typically take for granted that the truth lies in the future. Whether you’re a liberal, a Marxist, or a Christian, you have inherited this teleological reasoning—just over the next hill, you think, lies the End of History/a classless society/the Kingdom of God. Yet these beliefs would strike many of the ancients as bizarre. For much of the pre-Enlightenment world, in the West and beyond—as for many societies today—humanity moves ever further in time from an original truth, to which it must return.
Around the twelfth century, the intellectual elite of medieval Jewry were engaged in a heated debate. One side maintained that God’s Law had been given in utter clarity and completeness at Sinai and since distorted, while the other held that the Law needed to be progressively uncovered as humans came to a better understanding of the world around them. Traditional Islamic theology holds that Muhammad’s revelation was not new, but in fact a corrective to Jewish and Christian perversions of an original divine revelation (paralleled by Jesus’ oft-ignored claim in the Gospel of Matthew: “Do not think I have come to abolish the Law or the Prophets. I have not come to abolish them, but to fulfill them.”). Even today, the veracity of religious claims in the Jewish and Islamic worlds depends on chronological lists of who learned what from whom, somewhat akin to footnotes. The closer one gets to the original, oldest source (Moses or Muhammad, respectively), the more authority a claim has.
Nor is this idea somehow superstitious or irrational; up until the time of mechanized printing, older versions of a text were less likely to have been subjected to multiple rounds of copying, and therefore less likely to contain grave errors. New copies of texts were considered farther from the truth, which is why the Library of Alexandria, for instance, always sought out the earliest possible copy of a text for its collection, paying handsomely for its purchase (or taking it by force when necessary). Even today, children playing “whisper down the lane” understand that more recent information can serve to obscure what is true.
Be that as it may, our present society is all-in on the notion that a final truth is out there—and that, if we simply gather enough information, we will find it. There is perhaps no better exemplar of this paradigm than a now-notorious 2008 essay in Wired by Chris Anderson, titled “The End of Theory: The Data Deluge Makes the Scientific Method Obsolete.” Anderson's basic thesis was that, prior to the emerging data age, the scientific method functioned by proposing inherently inaccurate models of the world and then continuously improving those models based on small data sets. With our increasing ability to make use of Big Data, however, Anderson argued that we would be able to simply analyze incomprehensibly large quantities of data to find correlations. We no longer needed to understand why things correlated in order to know that they did so in a statistically significant way:
Petabytes allow us to say: “Correlation is enough.” We can stop looking for models. We can analyze the data without hypotheses about what it might show. We can throw the numbers into the biggest computing clusters the world has ever seen and let statistical algorithms find patterns where science cannot.
The crux of Anderson's argument was that data would lead us to the truth. Collect enough data and, with or without the obdurate theorizing of the model-blinded scientists, we would arrive at understanding. Such faith in the power of fact and reason alone is admirable, if a little dogmatic, but my point here is not to criticize Anderson so much as to highlight how his argument is emblematic of a larger social faith—in many ways unique to our civilization—in inexorable epistemic progress through the sheer force of information.
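What Anderson's correlation-only program looks like in practice is easy enough to sketch: take whatever columns of data happen to be at hand, rank every pair by the strength of its correlation, and propose no model at all. A minimal, hypothetical illustration (the variables and figures below are invented):

```python
import numpy as np
import pandas as pd
from itertools import combinations

# Hypothetical dataset: rows are people, columns are whatever we happened to measure.
rng = np.random.default_rng(0)
df = pd.DataFrame({
    "hours_online": rng.normal(4, 1.5, 10_000),
    "steps_per_day": rng.normal(6000, 2000, 10_000),
    "coffee_cups": rng.poisson(2, 10_000),
})
df["screen_brightness"] = df["hours_online"] * 0.8 + rng.normal(0, 0.5, 10_000)

# "Correlation is enough": rank every pair by strength, with no hypothesis in sight.
pairs = []
for a, b in combinations(df.columns, 2):
    r = df[a].corr(df[b])  # Pearson correlation coefficient
    pairs.append((abs(r), a, b, r))

for _, a, b, r in sorted(pairs, reverse=True):
    print(f"{a:>18} ~ {b:<18} r = {r:+.3f}")
```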
It might be helpful to pause here and digress for a moment on what we mean when we say “information.” Our present understanding of the term is, much like our expectations for its uses, rather novel. It dates back to the revolutionary work of Claude Shannon in the first half of the twentieth century. Shannon’s “A Mathematical Theory of Communication” (the foundation of what is known today as information theory) was built on the bit: the most basic unit of information, a logical state with exactly two possible values (true/false, yes/no, 1/0), and it lies today at the heart of much of our world. But “information” in Shannon's theory is a unit generalized, the epistemological atom, and, like the atom, it can form many different things.
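What the bit actually measures can be shown in a few lines: Shannon's entropy, the average number of yes/no distinctions a stream of symbols carries. The sketch below is a minimal illustration of the idea, not anything drawn from Shannon's paper itself:

```python
import math
from collections import Counter

def entropy_bits(message: str) -> float:
    """Shannon entropy of a message's symbol frequencies, in bits per symbol."""
    counts = Counter(message)
    total = len(message)
    return -sum((n / total) * math.log2(n / total) for n in counts.values())

# A fair coin flip carries exactly one bit; repetition carries none.
print(entropy_bits("HT"))            # 1.0
print(entropy_bits("AAAAAAAA"))      # 0.0 -- no uncertainty, no information
print(entropy_bits("abcdabcdabcd"))  # 2.0 -- four equally likely symbols
```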
When we laypeople think of information, we tend to associate it with quantitative data: life expectancy, likes on Instagram, probability of winning an election, free throws made. With the rise of outlets like FiveThirtyEight and the omnipresent spectre of The Algorithm, this is certainly one of the most popularly visible manifestations of information. But it is neither the extent nor the limit. Information, in contemporary parlance, is much wider in scope.
Consider a field outside the sciences, such as history. Sixty years ago, at the outset of the data revolution, professional historians felt themselves stifled. They were reaping the informational surpluses of the dawning digital age but, in contrast to their quantitative counterparts in the sciences, were at a loss for how to proceed. Carl Bridenbaugh, president of the American Historical Association, warned in 1962 of a “historical amnesia” resulting from new technologies like Kodak cameras and radios, which he feared were rapidly eroding the skills and communicative abilities of historians in particular and of society in general. Yet not everyone agreed that the problem was one of loss. As Elizabeth Eisenstein herself explained at the time, “It is not the onset of amnesia that accounts for present difficulties but a more complete recall than any prior generation has ever experienced. Steady recovery, not obliteration, accumulation, rather than loss, have led to the present impasse.”
For historians, an abundance of information was an obstacle to actually using that information effectively. This is because gathering and analyzing historical information, most of which is inevitably qualitative, demands a painstaking and largely subjective application of parameters. At a minimum, competence as a historian depends on an ability not just to find patterns in data, but to put those patterns in a broader context. Equally problematic, the abundance of information gives a false appearance of comprehensiveness, as if a lot of information were effectively the same as all the information. But, as historians-in-training are often admonished, “the absence of evidence is not evidence of absence.” Big Data today, for instance, largely leaves out the billions of people around the world who are not online. The hard work of history, then—contextualization—is often undermined by the accumulation of information, which gives the illusion of completeness while becoming ever harder to understand en masse.
To further complicate our definition of information in the context of history, various historians, philosophers, and literary scholars—Jacques Derrida, Edward Said, and Michel-Rolph Trouillot among them—have pointed out over the last half-century that the parameters of our questions—i.e., the assumptions we make about the information we are analyzing—are often themselves sources of profound bias. “The winners write the history books” isn’t only an indictment of propagandist historical writing; it is an acknowledgement that all historical work is an act of interpretation. Whether victory is military or epistemic, history is the construction of competing narratives. Historical judgement therefore requires a meta-understanding of why and how we make those judgements in the first place. Yet ultimately, we cannot but admit that our own biases sneak into the parameters we set for historical study, making objectivity or truth impossible to attain no matter how much information we have at hand.
This cuts to the heart of the study of history itself: all historical evidence, being ephemeral, is incomplete. Like a limit in calculus, we might approach historical truth, but our presumptions will always be conjecture, shaped arguably as much by our own contemporary experiences as by the actual conditions of the past. As Mayer-Schönberger points out,
Even in the digital age, not everything we communicate is captured in digital format—and certainly not what thoughts we ponder, and how we assess and weigh the pros and cons before making a specific decision. ... Put simply, even if perfect contextualization may re-create the information context, it cannot take us back in time.
All this accumulation elides a plain reality: the past is a foreign country. No amount of information will allow us to revive subjects who were once flesh and blood, to know what jokes Nefertiti laughed at, or how the market smelled in the morning at Caral-Supe, or whose breasts inspired the author of the Song of Songs to think of two fawns browsing among the lilies. We may have the facts of history, but we can never wholly grasp the meaning as it was for those who lived it.
In New Jersey, at Bell Laboratories in 1943, a portentous meeting took place between Alan Turing and Claude Shannon. Turing, brought recently to popular awareness by Benedict Cumberbatch in The Imitation Game, is considered the progenitor of theoretical computer science and artificial intelligence. He was, like Shannon, a visionary, capable of understanding long before others the transformative potential of the technology at their fingertips. But Turing, for all his foresight, left his meeting with Shannon astounded by the scope of Shannon’s notion of information. “Shannon wants to feed not just data to a [computer],” said Turing, “but cultural things! He wants to play music to it!”
Turing’s astonishment might now sound naive. The world’s 365 million Spotify listeners seem to prove pretty conclusively that you can, after all, play music to a computer. We have managed not only to play music to a computer, but to show it art and movies, to read it books, to play sports with it, and so on. We’ve even managed to quantify (or so we think, anyway) culture’s reception: cultural phenomena today are measured in likes, views, clicks. If anything, we now live—in the Global North, at least—at a moment when culture unfolds predominantly as digital information and our digital reaction to it.
To Turing, Shannon’s idea of handling culture mechanically must have sounded like philistinism. Both inside the West and out, civilizations have long thought of culture’s genesis as ineffable. The Bible speaks often of divine inspiration, the Greeks of the muses, Arabic poetry of the influence of jinn. More than one cultural moment in recent memory has depended on a heady stock of psychoactive drugs as a means to (in its participants’ view, at least) transcend rationality. Simply put, it is hard (both conceptually difficult and, for certain people like myself, somewhat galling) to imagine the paintings of El Greco, the music of John Coltrane, or the novels of James Joyce reduced to a string of bits to be apprehended by a machine.
At present, though, the bits appear to be winning. Computers at least comprehend culture in the literal sense that they contain it—and increasingly they attempt to lay claim to its creation, too. There is a growing interest in using AI to generate visual art and music. Google has introduced Verse-by-Verse, an AI tool that generates poetry in the style of famous American poets (though the poetry itself leaves much to be desired). And beginning in 2010, a multidisciplinary team of researchers created the Culturomics project, which analyzed the corpus of literature digitized by Google Books to identify and analyze cultural trends over time; a follow-on effort applied similar methods to global news coverage in the leadup to the Arab Spring. Taken together, these efforts suggest authority, tangibility—a belief that there is some truth in culture to be pinned down and known, some definite direction in which we are headed.
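The basic move of that kind of corpus analysis is simple to sketch: count how often a term appears in dated texts and read the resulting curve as a trend. A minimal, hypothetical version (the three-line “corpus” here is invented; the real project ran over millions of scanned books):

```python
from collections import Counter

# Hypothetical corpus: (year, text) pairs standing in for millions of scanned books.
corpus = [
    (1995, "the library was quiet and the catalogue was on paper"),
    (2005, "the library added an internet terminal beside the catalogue"),
    (2015, "the internet is the library now and the internet never closes"),
]

def term_frequency_by_year(corpus, term):
    """Relative frequency of a term per year -- the basic move of corpus trend analysis."""
    freqs = {}
    for year, text in corpus:
        words = text.lower().split()
        freqs[year] = Counter(words)[term] / len(words)
    return freqs

for year, f in term_frequency_by_year(corpus, "internet").items():
    print(year, f"{f:.3f}")
```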
Many civilizations in history have believed that culture moves in a fixed direction. Sometimes this has been forward: the sense of indomitable progress in nineteenth century Europe, for example, inspired the poet Tennyson to write: “Not in vain the distance beacons. Forward, forward let us range. / Let the great world spin for ever down the ringing grooves of change.” Others have revered a semi-mythic Golden Age, and thought all subsequent culture a fall from grace. The eleventh-century Chinese literary critic Huang Tingjian, having declared that all the best poetry had long since been written, dismissed innovators, asserting that “the quest for new expressions is itself a literary disease.” (The specific target of Huang’s ire was Li Bai, considered today among the greatest Chinese poets.) Too often, the notion that culture has a necessary direction has even been the precursor to violent chauvinism, such as under the Nazi regime. Unfortunately for all of these views, the clarity of theory and expectation is inevitably muddled by, well, life.
Only the benefit of hindsight makes cultural change appear inevitable. It’s easy to say, for instance, that our high estimation of Shakespeare was a matter of course. But Shakespeare was for two hundred years discounted by the English-speaking world in favor of his contemporary, Ben Jonson. The Bard was only rescued from relative obscurity because the German philosopher Johann Gottfried Herder extolled his virtues in Shakespeare, a 1773 essay that helped launch Romanticism across Europe. Likewise, the now-canonical Dutch painter Johannes Vermeer died impoverished in 1675. His work languished, virtually unknown, until its rediscovery two centuries after his passing. So even if we can agree, say, that Michelangelo and van Gogh represent peaks in the Western canon of visual art, it would be hard to claim that Michelangelo led to van Gogh, and absurd to think that van Gogh’s art would inevitably supplant Michelangelo’s by virtue of a greater claim to truth or beauty.
Chris Anderson's 2008 Wired article explained that, from Google's perspective, cultural explanations were irrelevant—only predictability mattered: “Who knows why people do what they do? The point is they do it, and we can track and measure it with unprecedented fidelity. With enough data, the numbers speak for themselves.” Thirteen years later, we know that this predictability has been willfully harnessed by Big Tech to drive us into more and more predictable consumer niches, knowingly exacerbating ideological extremism along the way. While we may pride ourselves today on a self-conscious degree of cultural tolerance and pluralism, our present idea that there is a truth to be found in the analysis of culture remains deeply problematic because, fundamentally, what all this information does is narrow things. It takes an immense corpus of work and gives us a vector, a probable direction. This may be useful when examining something like the laws of physics, but it is antithetical to the function of culture.
This is because the value of culture lies not in information to be sorted and analyzed in search of a truth, but precisely in the fact that it defies monolithic interpretation. Metastudies of culture, such as cultural history, might benefit from the accumulation of information, insofar as we “better” understand the culture being studied—but culture itself does not. Cultural items rise, decline, and reappear as needed; their meaning and value are contingent upon the needs of the present generation. The point of cultural information is not Truth, but truths—to call our attention to the staggering, ecstatic plurality of human experience. As the American literary critic Lionel Trilling wrote of literature in his preface to The Liberal Imagination,
To the carrying out of the job of criticizing the liberal imagination, literature has a unique relevance, not merely because so much of modern literature has explicitly directed itself upon politics, but more importantly because literature is the human activity that takes the fullest and most precise account of variousness, possibility, complexity, and difficulty.
The determinism inherent in not caring why culture is one way or another reduces cultural actors (read: us) to pieces in a fixed game of chess, endlessly traversing a shrinking board. Rather than succumb to this homogenization, we might insist that the vagaries of culture are in fact the best defense against the increasingly rigid factionalism of our algorithm society.
Doubtless, the mass of available information will continue to swell. We will need new numbers, new metaphors of scale that make our present befuddlement seem quaint. It is often taken for granted that we will eventually emerge from this cultural moment, that we will learn to cope with infoglut as our ancestors learned to deal with the deluge of books that followed the invention of printing. I am fairly confident that (if we survive the climate crisis) this is true. My question, though, is not whether but how we survive this moment—what kind of society will we be when it has passed?
By the end of Borges’ story, most of the librarians are despondent. Centuries of searching have proved fruitless. If the truth is out there, they realize, it will never be found, not in all the vastness of the library's infinite halls. In despair, they turn to mysticism, to nihilism, to violence, to suicide. But the narrator, curiously, is possessed of a strange optimism. Though he is tired, and not a little unsure of the meaning of his circumstances, he views the library's boundless shelves as a sign not of death, but of life. “The certitude,” he insists, “that everything has been written negates us or turns us into phantoms.” In their innumerable possible permutations, the library’s pages are a bulwark against erasure.
Contrary to what our present culture insists, there is no final truth out there for us to find. No amount of accumulated information—nothing short of the entire universe itself—will explain everything, wrapped neatly with a bow. But there are smaller, personal truths out there, and beauties, and freedoms. In embracing them, we might find that they, far more than any singular vision of Truth, accommodate the illimitable possible futures that lie before us. The world is not a finite thing to be known once and for all. It expands, growing ever larger in what it might encompass, and information, if it seeks to describe our world, finds its truth not in uniformity, but in constant flux and evolution. Without the burden of teleology, then, the abundance of information is not a moral encumbrance, but a hope for change—not a final end, but fodder for meaning, which varies infinitely according to the ceaselessly changing constellations of human experience.