And now for something completely different… Our good pal Nick Pisano approached me recently with a very different sort of article than what we normally publish at AITS. Its theme extends far beyond basic knowledge management and big data, challenging our very conception of information preservation, and it just so happens to be the most thought-provoking piece I have come across in a long time. I had absolutely no choice but to publish it, and I hope it will encourage vigorous debate among readers about whether we are on the cusp of a “digital dark age.” -Editor’s Note
“He who controls the past controls the future. He who controls the present controls the past.” ― George Orwell, 1984
Google Vice President Vint Cerf recently turned some heads at the annual meeting of the American Association for the Advancement of Science in San Jose, warning the attending scientists that the digitization of the artifacts of civilization may create a digital dark age. “If we’re thinking 1,000 years, 3,000 years ahead in the future, we have to ask ourselves, how do we preserve all the bits that we need in order to correctly interpret the digital objects we create?” Cerf’s concerns are that today’s technology will become obsolete at some future time, with the information of our own times locked in a technological prison.
Cerf’s observation is certainly chilling, but trends in the digital environment point to altogether different issues in terms of the new media’s effects on society and the individual. Frankly, information loss is nothing new in human history. Thus, given new technology and its greater capacity to store data, it seems worthwhile to ask whether the real issue is the loss of data, or the collection of too much of it.
A History of Lost History
Upon reading Cerf’s article, my initial impression was that he is being too optimistic. Let’s take older, but relatively recent, technologies as a baseline. According to The Film Foundation, about half of all American films made before 1950, and over 90% of films made before 1929, have been lost forever. According to the Library of Congress, at best 14% of the significant music and sound recordings made between 1890 and 1964 are still available, and even that figure does not fully account for recordings already destroyed and lost forever to history.
Photography, the technology cited in the wake of Mr. Cerf’s comments as being most in danger, is a fairly new technology. We can only guess at the proportion of images that have been discarded and lost forever. I would guess that it is extensive.
Both film and aural recording technology took hold in the 1890s—photography a bit earlier—so the pace of information loss in the space of just 125 years is problematic. With an increasing amount of our day-to-day communications being rendered via electronic means—emails, tweets, texts, and other social media, which are viewed as ephemeral—it is very easy to see why Cerf is concerned. Civilization is losing the artifacts and detritus necessary to construct a history of the times in which we live.
More problematic, of course, is that the artifacts of these media can be changed, manipulated, destroyed, selectively chosen, or concealed at will. For example, under Europe’s “right to be forgotten” ruling, Google has received over 500,000 requests to remove links from its search engine.
Furthermore, even natural processes of decay that are characteristic of the digital environment will degrade this information. Hardware fails, backups are incomplete, data is corrupted, bits degrade, technologies quickly advance rendering earlier ones obsolete, and viruses destroy data. Only through the evidence of actual practice or events can the facts be fully constructed.
Same Issue, Different Media
“Facts do not cease to exist because they are ignored.” ― Aldous Huxley, Complete Essays 2, 1926-29
For the historian, fragmented information is a familiar problem. Autobiographies and biographies, where they exist, are notoriously self-serving and cannot be taken at face value. Facts provided by participants often conflict: memories fade; perspectives and filters alter the recounting of actual events. There are numerous studies on the unreliability of eyewitness accounts.
But even tainted or unreliable evidence, when placed against the sequence of events, and especially when it consists of forensic artifacts rather than mere accounts of the events, can at least support a plausible rendering and interpretation of the history being recounted. Without such evidence, however, reconstruction is impossible.
In my own experience as a historian, I specialized in 16th-century Spain: its Reconquista, its age of exploration, and how it interacted with the world through its Millenarian vision. As part of my research, I had to delve into languages and manners of expression that have not been used in at least 400 years. The information contained in these written artifacts required that I also immerse myself in obsolete ways of thinking and of daily perception in order to derive the proper meaning and interpretation of my subject matter. In some cases there was no direct translation or analogue in our modern framework.
But, more importantly, I came to realize that what was available to me was but a very small portion of the world in which those people lived. Our impressions of them, what their lives may have been like, the day-to-day reality of their existence, came down to us through the oftentimes serendipitous accident of historical events. The written record, where it does exist, was created by a very small, educated, and literate elite. Thus, the historian is always dealing with the Rashomon Effect.
This hobbles the historian in fully understanding the context of the time that is being studied. For the overwhelming majority of the millions of people who lived, worked, toiled, and died during those times did so in obscurity.
Huxley’s observation, which introduces this section, is correct in terms of the physical world and to varying degrees elsewhere. Mathematics, physics, the biological sciences, and other disciplines of learning can establish facts. Placing those facts into a contextual framework requires the application of intelligence.
What differentiates the era of modern media at least since the mid- to late 19th century is not the paucity of data, but the wealth of data. We mourn the loss of our recorded visual and aural cultural heritage, ignoring the fact that before recorded technologies virtually all of these forms of human expression—drama, comedy, music, spoken word—were lost to history, except those that were specifically committed to paper.
For example, “classical” music has survived because it was financed by the aristocracy, and, suffice it to say, they deemed the artistic expressions of the peasants vulgar. The transcriptions that survive were the price of that patronage. Mozart borrowed liberally from the folk forms around him. How many peasant Mozarts have lived and died? We will never know. This is not to diminish Mozart’s achievement, but it does call into question many of the assumptions we make about history and information.
Modern Media is Ahistorical
“The best books… are those that tell you what you know already.” ― George Orwell, 1984
Thus, it is not that digitization represents a new threat so much as it provides us with both a panacea and a problem. On the panacea side it provides us with the technology to preserve the cultural heritage that we value. If not for digitization, so much more music, film, spoken word, and photographs would have been lost.
Since the opening of the net to commercial and consumer access, there has arisen a plethora of voices that have overcome official and traditional narratives. Commercial products and services can be had at the click of a button, providing comparative pricing, product, and consumer rating information hindered only by Internet access and speed. Assuming that the net will remain neutral, what we have been given is a great First Amendment machine.
On the problem side, however, a number of issues arise from its attributes. The attributes are not necessarily negative on their own. Instead, it is the manner in which people interact with them that is of interest and that poses challenges. Those of us who utilize and advance these technologies have a particular responsibility to address them.
First, modern media that utilizes digitization tends to view all information as the same, without context. The trivial stands equally next to information of importance. Under Marshall McLuhan’s definition, it is a “cool” medium. This places a great deal of importance on the consumer’s ability to make qualitative distinctions. Most often, these distinctions cannot be made readily without expending effort toward that determination, and without the requisite critical skill sets, educational background (formal or otherwise), experience, and maturity.
Second, modern media encourages atomization and alienation. The ability to atomize breaks down barriers—what is often called disruption. Since it is pervasive and cannot be entirely controlled, the new media can insinuate itself into oppressive regimes and coercive private sector environments. While I view this—and believe most people would view this—as a net positive within this context, atomization and alienation also have their negatives.
By definition, cool media require participation by the user, which has a stimulating effect on key neural receptors. The medium itself, absent any content, thus draws the consumer to it.
People operating within normative measures are motivated by love and acceptance. Viewing themselves as apart from their surrounding environments due to atomization, people exhibit addictive behavior in interacting with media, oftentimes viewing these connections as extensions of themselves. The image of the individual out and about among friends, concentrating on their smartphone, is the iconic image of our times.
One of the largest effects has been that individuals, when faced with cognitive dissonance, are drawn to information that reinforces their preconceived beliefs. This has become—rather than a symptom of ignorance that can be ameliorated with proper socialization and education—a type of societal psychosis that has contributed to the weakening of the processes essential to learning and, thus, to the healthy functioning of the world’s democratic republics.
Hardened beliefs that contribute to political polarization, ideological extremism, anti-intellectualism, severe nationalism, and nativism can and do appeal to a broader constituency more rapidly and in more disparate places than previously imagined.
Third, modern media has a modernist bias. The amount of data and information since the dawning of the digital age has expanded exponentially. As noted, not all of this information is equally important. More importantly, though, the sheer volume of recent data subconsciously suggests that contemporaneous events and information, most of which are trivial or of transient importance, matter more than actual threats arising from longer-term natural or geopolitical trends that predate the new media. The previously cited positive characteristic of immediacy only compounds this problem.
A general resistance to scientific knowledge is an example of the first case. Of the second: the conflicts in the modern Middle East, North Africa, and much of Eastern Europe find their roots in the First World War.
Fourth, modern media has broken down the boundaries between the public and private, well beyond McLuhan’s concept of the Global Village. Virtually every aspect of the individual is now susceptible to a level of intrusion never before contemplated or desired. Since individuals are powerless to prevent this intrusion without electing to drop out of society—a practically impossible task in the modern world if one is to make a living and engage with their fellows—counterproductive attempts at coping mechanisms contribute to the impacts of alienation, distrust, anti-social behavior, and psychosis already noted.
Furthermore, as the Internet of Things (IoT) expands to create a world where even our everyday devices can record our movements and our preferences, no human activity will be completely immune from intrusion, tailored advertising and marketing, and hacking. This issue is so important that the Central Intelligence Agency views it as a serious cyber-warfare and geo-security threat. Hacking and cyber-attacks have become pervasive, targeting financial institutions, government agencies, media companies, and satellites.
Checking the Checker
“Those who cultivate competence in the use of a new technology become an elite group that are granted undeserved authority and prestige by those who have no such competence.” – Neil Postman, Technopoly
Beginning with Julius Caesar’s invasion of Egypt in 48 BC, and continuing through several subsequent invasions, pillaging, and censorship, the libraries of Alexandria were lost to history. These libraries had contained all of the important literature of the ancient world. The majority of works by the great Hellenic and Hellenistic Greek philosophers, scientists, playwrights, and geographers are now known to exist only through reference and citation.
What survived would not be reintroduced to Western Civilization until the 13th century through the first universities of Al-Andalus and the Mediterranean trading city-states of Italy. Jewish scholars, who were a protected class under the Caliphates of North Africa and the Middle East, translated the early Greek texts into Latin and, as a consequence, seeded what would become the European Renaissance.
The ancient Mediterranean world was one that was hierarchical and which centralized knowledge. Thus, its organization into centers of learning made its knowledge vulnerable to near-total annihilation. The inability of one generation to pass knowledge on to the next—and the resistance to learn and to expand on the knowledge that did exist—undid European civilization for 1,000 years.
As media consolidate into large corporate oligopolies, the vulnerability of information to manipulation, suppression, and destruction increases. Along with it, the vulnerability of the average individual to coercion, manipulation, and loss of privacy also increases. The infamous Facebook emotional-manipulation experiment is only the first of what is to come.
My colleague Dave Gordon, who also writes for the AITS Blogging Alliance, recently turned me on to a Newsweek article entitled “Plan to Quit? Big Data Might Tell Your Boss Before You Do.” I have no doubt that the entrepreneurs who created this solution are very skilled people with proven track records of tech business success. But what if what they provide to their customers as an indicator is wrong? Even if they are correct in their assessments a majority of the time, what about the cases where an adverse action is taken?
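The base-rate problem lurking behind that question can be made concrete. Even a model that is “90% accurate” will, when the behavior it predicts is rare, wrongly flag more people than it correctly identifies. A quick back-of-the-envelope calculation in Python illustrates this; the workforce size, base rate, and accuracy figures are purely illustrative assumptions, not numbers from the product in question:

```python
def flagged_breakdown(n_employees: int, base_rate: float,
                      sensitivity: float, false_positive_rate: float):
    """Split a model's 'will quit' flags into true and false positives."""
    quitters = n_employees * base_rate          # employees who really plan to leave
    stayers = n_employees - quitters            # employees who do not
    true_pos = quitters * sensitivity           # quitters correctly flagged
    false_pos = stayers * false_positive_rate   # stayers wrongly flagged
    return true_pos, false_pos

# Illustrative assumption: a model right 90% of the time in both directions,
# applied to a workforce where only 5% actually intend to leave.
tp, fp = flagged_breakdown(1000, 0.05, 0.90, 0.10)
# Roughly 45 true positives versus 95 false positives: most of the
# employees the model flags are not, in fact, planning to quit.
```

If an adverse action follows each flag, the majority of those actions in this scenario land on employees who were never going to leave—which is precisely the asymmetry the article worries about.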
Since we are dealing with social and economic systems, our experiments, by definition, must be in real time using proven indicators. With the loss of employee empowerment through the systematic destruction of collective bargaining and union representational rights, Galbraith’s system of countervailing power no longer applies. Thus, in this and many other cases, who is checking the checker?
The concept of checking the checker is implicit in our system of government (checks and balances), and in those institutions formed to regulate large business and financial firms that affect the public interest. I don’t mean to unfairly pick on this or any other cutting-edge company in our field. They are certainly not responsible for the current asymmetrical condition of the workplace. But the tech and new media firms that deploy products reinforcing this condition have been left largely to their own devices. It is one thing to apply proven microeconomic and business metrics and modeling to big data. It is quite another to apply predictive analytics to the unsettled science of human behavior.
Those who use the term self-organization to justify this condition do so from an ideological position that has little evidence in either mathematics or experience. Glen Alleman, a former physicist, quickly and effectively dispatched the neo-mystical use of this term recently in his own blog by listing the conditions that define it, skewering it as a post hoc justification of chaotic results. As such, he demystifies a term often conflated with the expansive use of the word “freedom.”
In the movie Nightcrawler, the main character Lou Bloom is a cipher, absorbing the knowledge of new media to overtake and co-opt the older medium of television. He is amoral, manipulative, and ruthless: an android mind inside a human body. His use of new media spreads like a virus, infecting everyone around him. In the end, his influence is absorbed by Nina, the evening news producer who thought she could use and control Bloom but is instead now controlled by him. She, in turn, manipulates the information gathered to create a narrative in diametric opposition to what is happening in the real world, thereby manipulating the public.
Once again, our historical experience with the early, politically biased newspapers and yellow journalism demonstrates that such manipulation is not new. But earlier generations, where such abuses were largely confined geographically and demographically, realized that such use of power was a form of corruption that undermined the democratic ideal. Until recently, reforms to address these abuses limited the cross-ownership and power of media outlets. But these constraints no longer exist.
New media, with its ability to infiltrate every aspect of human life and behavior, and with its mass appeal, represents a powerful new force, combining as it does both the fourth and fifth estates. The IoT and the introduction of artificial general intelligence (AGI) may represent a new, sixth estate. As such, it is incumbent upon new media to begin to reform and regulate itself.
So rather than a digital dark age of the kind Mr. Cerf envisions, created by the absence of data, I think the dark age, if it is to come, will arrive through the misuse, destruction, and manipulation of too much data disconnected from its context. The individual, alienated and atomized, and lacking the resources to protect his or her privacy, identity, and property in the highly secure repositories available only to the most affluent, will be most vulnerable in this new world.
For more brilliant insights, check out Nick’s blog: Life, Project Management, and Everything