
Maxwell’s Demon: Planning for Obsolescence in Acquisitions


Imagine a chamber divided into two parts by a removable partition. On one side is a hot sample of gas and on the other side a cold sample of the same gas. The chamber is a closed system with a certain amount of order, because the statistically faster moving molecules of the hot gas on one side of the partition are segregated from statistically slower moving molecules of the cold gas on the other side. Maxwell’s demon guards a trap door in the partition, which is still assumed not to conduct heat. It spots molecules coming from either side and judges their speeds…The perverse demon manipulates the trap door so as to allow passage only to the very slowest molecules of the hot gas and the very fastest molecules of the cold gas. Thus the cold gas receives extremely slow molecules, cooling it further, and the hot gas receives extremely fast molecules, making it even hotter. In apparent defiance of the second law of thermodynamics, the demon has caused heat to flow from the cold gas to the hot one. What is going on?

Because the law applies only to a closed system, we must include the demon in our calculations. Its increase of entropy must be at least as great as the decrease of entropy in the gas-filled halves of the chamber. What is it like for the demon to increase its entropy? –Murray Gell-Mann, The Quark and the Jaguar: Adventures in the Simple and the Complex, W. H. Freeman and Company, New York, 1994, pp. 222-223

“Entropy is a figure of speech, then,” sighed Nefastis, “a metaphor. It connects the world of thermodynamics to the world of information flow. The Machine uses both. The Demon makes the metaphor not only verbally graceful, but also objectively true.” –Thomas Pynchon, The Crying of Lot 49, J.B. Lippincott, Philadelphia, 1965

Technology Acquisition: The Basics

I’ve recently been involved in discussions regarding software development and acquisition that cut across several disciplines. These discussions should be of interest to anyone engaged in project management in general, and in IT project management and acquisition in particular.

The specific statement that prompted this line of thought was the concern that our acquisition systems are not flexible enough to allow new technological capabilities to be exploited. A corollary concern is that when an influential organization, especially a public agency, commits to a specific solution in a highly organized industry, the effect can be to make that solution the de facto standard, which may limit technological flexibility and restrict the number of suppliers in the market.

There are obvious objections and qualifications to the deterministic conclusions of this line of reasoning. The first is that it depends: if your acquisition strategy is segmented and focused on addressing specific needs, as opposed to a complex multi-year development, then such concerns can be addressed by applying the basic economic distinction between sunk and opportunity costs. The second is the public good basis of economic theory: a society that wants to incentivize and reward innovation must accept that some solutions will become standards, so the objection largely falls away. The third is a practical test of the underlying socioeconomic concern: if there is more than adequate competition in the general market, even though specific alternatives may be inferior in a niche market, then such concerns are unfounded. One supplier may have a competitive advantage in meeting the organization's needs because of some set of features or functionality, but it would be counterfactual to conclude that this condition fails to serve a social good; otherwise public institutions would always be excluded from acquiring new technology.

But the question of agility in acquisition is still a concern, especially in extremely complex project management environments involving multi-year research and development. Here the economic and socioeconomic considerations are more complex. Furthermore, these two aspects of social science do not provide all of the solutions to the issue.

I believe that we find the answers, instead, in the basic physics of information theory and in the practical application of that theory to computing and digitization. These answers then provide us with an outline for practical solutions in project management. On a more prosaic level, this outline strongly suggests the advisable approach to acquisition methods, acquisition planning, service life management, and anticipated obsolescence.

What Lies Beneath Our Assumptions

At the heart of information theory is the proposition that there are universal laws tied to the nature of the universe that can be understood and leveraged to our advantage. This proposition was first anticipated by physicist James Clerk Maxwell in 1871, who imagined that a being armed with information about individual molecules could seemingly contradict the Second Law of Thermodynamics, a law that holds only as a statistical certainty.

The Second Law, formulated by Lord Kelvin, states that the universe dissipates its usable energy, tending toward what has been coined a “heat death,” with time’s arrow always pointing forward toward equilibrium. In order to achieve order of some sort over the course of time’s arrow, energy or heat must be expended; you don’t get anything for free in this universe. Building or organizing something requires that disorder, or entropy, be increased somewhere else. But disorder is not uniform, and there is a tradeoff between order and disorder.

It wasn’t until 1929 that physicist Leó Szilárd was able to come up with a mathematical proof that reconciled the Second Law with the concept of “Maxwell’s Demon” as described above. Szilárd did this by identifying the “demon” as intelligent enough to measure the speed of the molecules to distinguish whether they are hot or cold. Entropy—disorder and decay—is transferred not to the system being measured but to the “demon.”

Szilárd’s unit of measurement later became known as a “bit,” a term most widely associated with the work of Bell Labs’ Claude Shannon, the recognized father of information theory. For those of you who are mathematically minded, the equation that describes the minimum entropy cost of measuring one bit of information is:

S = k ln 2

where k is Boltzmann’s constant and the factor ln 2 reflects the binary, two-state decision.
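For concreteness, here is a minimal numeric sketch of my own (the 300 K room temperature is an assumption, not a figure from the article) that evaluates this entropy bound and the corresponding minimum energy per bit:

# A minimal sketch, not from the article: evaluate the Szilard bound S = k*ln(2)
# and the corresponding minimum energy per bit at an assumed room temperature.
import math

BOLTZMANN_K = 1.380649e-23   # Boltzmann's constant, joules per kelvin
ROOM_TEMP_K = 300.0          # assumed room temperature, kelvin

entropy_per_bit = BOLTZMANN_K * math.log(2)      # J/K per binary decision
energy_per_bit = ROOM_TEMP_K * entropy_per_bit   # joules per bit (kT ln 2)

print(f"Minimum entropy per bit: {entropy_per_bit:.3e} J/K")
print(f"Minimum energy per bit:  {energy_per_bit:.3e} J at {ROOM_TEMP_K:.0f} K")

That works out to roughly 3 x 10^-21 joules per bit, the same figure Landauer’s Principle arrives at from the direction of erasure, as discussed below.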

The amount of entropy generated by measurement may exceed this amount, but can never be less than it, so as not to violate the Second Law. Entropy can, however, be diverted to or absorbed by some other object. There can be natural “demons” (such as the agency of natural selection and adaptation), or there can be imposed “demons.” In our modern worldview, it is increasingly apparent that the universe is made up of information of one kind or another. The practical application of this insight to computing first extended to cryptography, and is seen today in the widespread use and availability of such things as VPNs, CAPTCHAs, encrypted storage, and zip-type compressed files.
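Shannon’s informational entropy is what connects that last point to everyday compression. The following is a rough sketch of my own (the sample messages are invented for illustration) comparing the empirical entropy of two byte streams with what zlib, the compression library used in zip-style formats, actually achieves:

# A rough sketch, not from the article: order-0 Shannon entropy of a byte stream
# versus the size zlib actually achieves. High-entropy random bytes barely
# compress; low-entropy repetitive text compresses far below its raw size
# (zlib also exploits repetition, so it can beat the simple byte-frequency bound).
import math
import os
import zlib
from collections import Counter

def entropy_bits_per_byte(data: bytes) -> float:
    """Empirical (order-0) Shannon entropy of the byte distribution."""
    n = len(data)
    counts = Counter(data)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

samples = {
    "repetitive text": b"the quick brown fox jumps over the lazy dog " * 200,
    "random bytes": os.urandom(9000),
}

for name, data in samples.items():
    h = entropy_bits_per_byte(data)
    compressed = len(zlib.compress(data, level=9))
    print(f"{name:15s}: {h:4.2f} bits/byte, {len(data)} bytes -> {compressed} bytes compressed")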

The purpose of all of this effort, of course, is to find the lower bound on the energy expended in extracting information from any bit, minimizing or deflecting entropy while maximizing the information content that can be compacted or communicated. The connection Shannon made between the thermodynamic concept of entropy and the informational concept of entropy was further confirmed in 1961 by the work of Rolf Landauer of IBM, who approached it from a different direction; the result has come to be called Landauer’s Principle.

This principle gives the lowest theoretical energy required by a computation to erase one bit of information. Once again, it matches Szilárd’s equation pertaining to Maxwell’s Demon, though it arrives there from the direction of information deletion. Just as Pynchon used the concept of Maxwell’s Demon in his novel to connect thermodynamics and real-world information, so too do we find that the underlying mathematics demonstrates a common theory of limits for both thermodynamics and the information contained in every bit of the universe.

Putting It All Together

Computing and software, like all human artifacts that attempt to extend our capabilities, are limited by physics and by human mortality. We live in a universe in which our limits are predetermined, but those limits accord us a great deal of flexibility within our sphere: vast uncertainty, governed by probability, within the bounds of determinism.

According to Moore’s Law, introduced by Intel cofounder Gordon Moore in 1965, processing power roughly doubles every eighteen months to two years. At the same time, we know that our digitization processes, while never able to break the Second Law, are rapidly approaching the lower bound set by information theory. I think this process is perpetual; that is, it will continue to approach the lower bound without ever actually touching it.
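As a back-of-the-envelope illustration (the doubling periods are the ones cited above; the ten-year horizon is my own assumption), the cumulative effect of that cadence over a multi-year program looks like this:

# A back-of-the-envelope sketch, not from the article: relative processing power
# over a ten-year horizon, assuming doubling every 18 months versus every 24 months.
def relative_power(years: float, doubling_period_years: float) -> float:
    """Processing power relative to today, given a fixed doubling period."""
    return 2 ** (years / doubling_period_years)

for years in (2, 4, 6, 8, 10):
    p18 = relative_power(years, 1.5)
    p24 = relative_power(years, 2.0)
    print(f"Year {years:2d}: x{p18:6.1f} (18-month doubling), x{p24:5.1f} (24-month doubling)")

On that arithmetic, hardware available at the end of a ten-year program is roughly 30 to 100 times more capable than what was available at contract award, which is the heart of the obsolescence problem discussed below.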

By definition, then, all of our software applications and attempts at artificial intelligence have a built-in obsolescence of roughly two years, unless those tools are updated to take advantage of the machine-level and operating-system-level capabilities that can harness the new processing power. This is not so much planned obsolescence as unplanned obsolescence.

Furthermore, as we apply this power to larger datasets, the entropy that must be expended also increases, bounded below by Landauer’s Principle. In Landauer’s time this entropy was expelled as heat. But in our own time, given the nature of some of the information processed in the realm of what we now call big data, entropy can also be expelled as uncertainty.
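To put a number on the heat side of that tradeoff, here is a hypothetical scaling example of my own (the one-petabyte dataset size and 300 K temperature are assumptions, not figures from the article):

# A hypothetical scaling sketch, not from the article: the Landauer-limit heat for
# erasing one bit versus one petabyte of data. The point is only that the
# theoretical floor scales linearly with data volume; real systems dissipate
# energy many orders of magnitude above this floor.
import math

BOLTZMANN_K = 1.380649e-23       # joules per kelvin
ROOM_TEMP_K = 300.0              # assumed operating temperature, kelvin
ENERGY_PER_BIT = BOLTZMANN_K * ROOM_TEMP_K * math.log(2)   # joules per erased bit

BITS_PER_PETABYTE = 8 * 10**15
print(f"Landauer minimum, one bit:      {ENERGY_PER_BIT:.3e} J")
print(f"Landauer minimum, one petabyte: {ENERGY_PER_BIT * BITS_PER_PETABYTE:.3e} J")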

This application of Landauer’s Principle is why, in previous writing about big data, I have distinguished between data that requires a high final state of validity, and therefore normalization and rationalization, and big data that can withstand some uncertainty, its validity reduced to probability.

Information Economics Steps In

Thus, combining an understanding of the physical limitations of information theory with information economics will give us a better idea of how to approach the concerns that opened this article.

According to J. Bradford DeLong and A. Michael Froomkin in their seminal paper “The Next Economy?,” there are three principles of information economics that differentiate it from neoclassical economics. These are:

  1. Exclusion is not a natural property of information, meaning it is extremely hard to exclude others from enjoying the information.
  2. Information is non-rivalrous, which means that information systems involve technologies where the willingness to pay for a bit of information is greater than the marginal cost of producing another copy of that bit. Thus, price is determined by the perceived value applied to the information.
  3. The information market does not exhibit high degrees of transparency; to understand the value of a piece of information, you must already know it. The relationship between the supplier of the information and the consumer of the information is therefore asymmetrical. To understand the value and applicability of a piece of software, you have to learn to use it. Often, when piloting or evaluating the product, the information is simultaneously consumed or utilized, involving cost to both supplier and consumer.

The Interesting and the Mundane

DeLong and Froomkin wrote their paper during the heady days of the 1990s tech bubble, which was soon to burst due to the very economic characteristics of information that they identified. Both the non-excludable and non-rivalrous features of information essentially make it a public good, changing the locus of who owns it. The key distinction missing here is between data and information, a distinction that also goes back to theories of cognition and intelligence-gathering contemporaneous with information theory.

What this means is that while data has taken the form of a public good, with little marginal cost attached to making copies of it, the processing of that data still possesses the neoclassical characteristics of exclusion and rivalry, while also retaining the unique characteristic of being opaque. The marginal cost of implementation, upgrades, and maintenance is non-trivial.

Joseph Stiglitz, George Akerlof, and Michael Spence, who shared the 2001 Nobel Prize in Economics for their analyses of markets with asymmetric information, identified the effects of this asymmetry. The asymmetry compounds the opaqueness of digital products, increasing both risk and uncertainty in technology acquisition.

By understanding both the physical and economic characteristics of data and information, we now have a basis for establishing an acquisition strategy that exploits and influences the characteristics of the market. Strategies that recognize data as a public good vest ownership of the data in the consumer. The value, then, is in the processing: the features, salient characteristics, and functionality of the software or digital product.

Understanding that greater processing power, and with it essentially a new software generation, arrives every two years, our acquisition strategy must include a phase that budgets for the costs of evaluating solutions. This should be done through piloting or proof of concept, assessing whether the platforms can adapt to the introduction of a new generation of processing every two years and whether they behave in a way that supports open architectures.
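To make the budgeting implication concrete, the following is a purely illustrative sketch; the class, field names, and dollar figures are all hypothetical and not from the article. It does nothing more than budget a recurring pilot or proof-of-concept phase on the two-year generation cadence described above:

# A purely illustrative sketch; every figure and field name below is hypothetical.
# It budgets a recurring evaluation/pilot phase on a two-year technology cadence.
from dataclasses import dataclass

@dataclass
class AcquisitionPlan:
    program_years: int
    refresh_cycle_years: int = 2            # one new processing generation per cycle
    pilot_cost_per_cycle: float = 250_000   # hypothetical proof-of-concept budget
    annual_sustainment: float = 1_000_000   # hypothetical licensing and maintenance

    def evaluation_cycles(self) -> int:
        return self.program_years // self.refresh_cycle_years

    def total_cost(self) -> float:
        return (self.program_years * self.annual_sustainment
                + self.evaluation_cycles() * self.pilot_cost_per_cycle)

plan = AcquisitionPlan(program_years=10)
print(f"Evaluation cycles budgeted: {plan.evaluation_cycles()}")
print(f"Ten-year cost with recurring pilots: ${plan.total_cost():,.0f}")

The design point is simply that evaluation becomes a recurring line item tied to the technology cadence, not a one-time cost at contract award.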

In my next article, I’ll outline the specific characteristics of such an acquisition strategy from these general propositions.

 

For more brilliant insights, check out Nick’s blog: Life, Project Management, and Everything

About Nick Pisano

Nick Pisano has extensive experience in the software, project, business, and acquisition management fields, with over 30 years in both government and private industry. He is a retired “mustang” U.S. Navy Commander having served as a government contracting officer, contract negotiator, business manager, CIO, and program manager, aside from significant operational assignments aboard ship and overseas. He is internationally recognized as the developer and co-developer of several project management techniques and methodologies, including the integration of technical performance measurement and risk with earned value, and in the establishment of the concept of the integrated digital environment (IDE) to normalize proprietary data through the use of common data protocols. Since his Navy career, Pisano has held senior positions in various high tech project management companies. For the last several years, he has been President and CEO of SNA Software LLC, a project management software firm. Pisano holds a B.S. from the University of Maryland (Honors), an M.S. from Pepperdine University, an M.A. from the Combat Studies Institute of the Army Command and General Staff College (Honors), and is a graduate of the senior executive program of the Colgate-Darden School of Business of the University of Virginia. You can visit his blog, Life, Project Management, and Everything, by clicking the button below.
