In last week’s Thoughts from the Frontline I talked about the Age of Transformation, attempting to refute Robert Gordon’s rather stark and gloomy view of the future growth potential of the economy. That letter generated a rather significant amount of reader response, both pro and con, as not everyone agrees with my decidedly optimistic long-term view of the future. It might be fun and thought-provoking, in fact, to do a letter that deals with some of the issues you raised. I really do have some of the smartest readers of any economics and investing letter out there.
Inside what was a long letter even for me, I buried a few quotes from George Gilder’s latest book, Knowledge and Power. I am not simply reading this book, I am thinking through it, as some of what he writes is truly pivotal for my own thought process.
For this week’s Outside the Box, George has graciously allowed me to reproduce chapters 1 and 3 from his book. It helps that George is a gifted wordsmith and a raconteur of the highest order. He doesn't bury his insights in dry economics-speak that demands intense concentration if you are to stay focused on the topic. Rather, he draws you into and through the topic, until you find yourself on a delightful Slip 'n Slide of thought. I think you will get a lot out of reading these chapters, and I strongly suggest you consider reading the whole book. It is an important one.
I write this note as I am wrapping up three days of meetings with my partners at Altegris Investments and Mauldin Economics, focusing on how to deliver better products and services to you. And we've enjoyed a few late-night conversations about where the world is going and how to surf the inevitable and the profitable.
In a few hours I will be off to Dubai and Riyadh, with maybe even a side trip to Abu Dhabi, after a very long layover in London, where I will jump into town to have lunch with Simon Hunt and delve deep into the happenings in China and Europe. I'll also bounce a few ideas off him for this weekend’s letter.
I am really kind of excited about this trip, as all these places are new territory for me. You have a great week, and I will send you this weekend’s letter from Dubai.
Your wondering what I will find out there analyst,
John Mauldin, Editor
Outside the Box
Knowledge and Power
By George Gilder
Chapter One: The Need for a New Economics
Economic life is full of surprises. We cannot predict the value of our homes or prices on the stock market from day to day. We cannot anticipate illness or automobile accidents, the behavior of our children or the incomes of our parents. We cannot know the weather beyond a week or so. We cannot gauge what course of college study will yield the best lifetime earnings or career. We are constantly startled by newspaper headlines or the eruptions of TV events. We are almost entirely incapable of predicting the future.
Yet from Adam Smith’s day to our own the chief concern of the discipline has been to render economic events unsurprising. Given a supply x of corn and a demand y, the price will be z. Change x or y and hold all else equal and the price will instead be a predictable z′. The discernment of orderly rules governing the apparent chaos of life was a remarkable achievement and continues to amaze. Economists such as Steven Levitt of Freakonomics fame and Gary Becker of the University of Chicago became media stars for their uncanny ability to unveil what “we should have known.” Closer investigation, however, reveals that even these ingenious analysts are gifted chiefly with 20-20 hindsight. They prosper by explaining to us what has happened more than by anticipating the future with prescient investments.
The passion for finding the system in experience, replacing surprise with order, is a persistent part of human nature. In the late eighteenth century, when Smith wrote The Wealth of Nations, the passion for order found its fulfillment in the most astonishing intellectual event of the seventeenth century: the invention of the calculus. Powered by the calculus, the new physics of Isaac Newton and his followers wrought mathematical order from what was previously a muddle of alchemy and astronomy, projection and prayer. The new physics depicted a universe governed by tersely stated rules that could yield exquisitely accurate predictions. Science came to mean the elimination of surprise. It outlawed miracles, because miracles are above all unexpected.
The elimination of surprise in some fields is the condition for creativity in others. If the compass fails to track North, no one can discover America. The world shrinks to a mystery of weather and waves. The breakthroughs of determinism in physics provided a reliable compass for three centuries of human progress.
Ignored in all this luminous achievement, however, was the one unbridgeable gap between physics and any such science of human behavior: the surprises that arise from free will and human creativity. The miracles forbidden in deterministic physics are not only routine in economics, they constitute the most important economic events. For a miracle is simply an innovation, a sudden and bountiful addition of information to the system. Newtonian physics does not admit of new information of this kind—describe a system and you are pretty much done. Describe an economic system and you have described only the circumstances—favorable or unfavorable—for future innovation.
In Newton’s physics, the equations encompass and describe change, but there is no need to describe the agent of this change, the creator of new information. (Newton was a devout Christian but his system relieved God or his angels of the need to steer the spheres.) In an economy, however, everything useful or interesting depends on agents of change called entrepreneurs. An economics of systems only—an economics of markets but not of men—is fatally flawed.
As the eminent mathematician Gregory Chaitin has pointed out, for human and biological processes, the calculus does not suffice. He writes: “Life is plastic, creative! How can we build this out of static, eternal, perfect mathematics? We shall use post-modern math…open not closed math, the math of creativity…”
Flawed from its foundation, economics as a whole has failed to improve much with time. As it both ossified into an academic establishment and mutated into mathematics, the Newtonian scheme became a mirage of determinism in a tempestuous world of human actions. Economists became preoccupied with mechanical models of markets and uninterested in the willful people who inhabit them.
Some economists become obsessed with market efficiency and others with market failure. Generally held to be members of opposite schools—“freshwater” and “saltwater,” Chicago and Cambridge, liberal and conservative, Austrian and Keynesian—both sides share an essential economic vision. They see their discipline as successful insofar as it eliminates surprise—insofar, that is, as the inexorable workings of the machine override the initiatives of the human actors.
“Free market” economists believe in the triumph of the system and want to let it alone to find its equilibrium, the stasis of optimum allocation of resources. Socialists see the failures of the system and want to impose equilibrium from above. Neither spends much time thinking about the miracles that repeatedly save us from the equilibrium of starvation and death.
The late financial crisis was perhaps the first in history actually to be caused by economists. Entranced by statistical models, they ignored the larger dimensions of human creativity and freedom. To cite an obvious example, “structured finance”—the conglomerations of thousands of dubious mortgages diced and sliced and recombined and all trebly insured against failure—was supposed to eliminate the surprise of mortgage defaults. The mortgage defaults that came anyway and triggered the collapse came not from the aggregate inability of debtors to pay as calculated by the economists, but from the free acts of home buyers. Having bet on constantly rising home prices, they simply folded their hands and walked away when the value of their houses collapsed. The bankers had accounted for everything but free will.
The real error, however, was a divorce between the people on the ground who understood the situation and the people who made the decisions. John Allison is the former CEO of a North Carolina bank, BB&T, which profitably surmounted the crisis after growing from $4.5 billion of assets in 1989 when he took over to $152 billion in 2008. Allison ascribed his success to decentralization of power in the branches of his bank.
But decentralized power, he warned, has to be guarded from the well-meaning elites “who like to run their system and hate deviations.” So as CEO, Allison had to insist to his managers that with localized decision-making, “We get better information, we get faster decisions, we understand the market better.”
Allison was espousing a central insight of the new economics of information. At the heart of capitalism is the unification of knowledge and power. As Friedrich Hayek, leader of the Austrian school of economics, put it, “To assume all the knowledge to be given to a single mind … is to disregard everything that is important and significant in the real world.” Because knowledge is dispersed, so must be power. Leading classical thinkers such as Thomas Sowell and supply-siders such as Robert Mundell refined the theory. All saw that the crucial knowledge in economies originated in individual human minds and thus was intrinsically centrifugal, dispersed and distributed.
Enforced by genetics, sexual reproduction, perspective and experience, the most manifest characteristic of human beings is their diversity. The freer an economy is, the more this human diversity of knowledge will be manifested. By contrast, political power originates in top-down processes—governments, monopolies, regulators, elite institutions, all attempting to quell human diversity and impose order. Thus power always seeks centralization.
The war between the centrifuge of knowledge and the centripetal pull of power remains the prime conflict in all economies. Reconciling the two impulses is a new economics, an economics that puts free will and the innovating entrepreneur not on the periphery but at the center of the system. It is an economics of surprise that distributes power as it extends knowledge. It is an economics of disequilibrium and disruption that tests its inventions in the crucible of a competitive marketplace. It is an economics that accords with the constantly surprising fluctuations of our lives.
In a sense, I introduced such an economics more than 30 years ago in my book Wealth & Poverty and reintroduced it in 2012 in a new edition. It spoke of economics as “a largely spontaneous and mostly unpredictable flow of increasing diversity and differentiation and new products and modes of production…full of the mystery of all living and growing things (like ideas and businesses).” Heralding what was called “supply side economics” (for its disparagement of mere monetary demand), it celebrated the surprises of entrepreneurial creativity. The original work was widely popular around the globe, published in 15 languages and for six months reigning as the number one book in France. President Ronald Reagan made me his most quoted living author.
Explicitly focusing on knowledge and power allows us to transcend rancorous charges of socialism and fascism, greed and graft, “voodoo economics” and “trickle down” theory, callous austerity and wanton prodigality, conservative dogmatism and libertarian license.
We begin with the proposition that capitalism is not chiefly an incentive system but an information system. We continue with the recognition, explained by the most powerful science of the epoch, that information itself is best defined as surprise: by what we cannot predict rather than by what we can. The key to economic growth is not acquisition of things by the pursuit of monetary rewards but the expansion of wealth through learning and discovery. The economy grows not by manipulating greed and fear through bribes and punishments but by accumulating surprising knowledge through the conduct of the falsifiable experiments of free enterprises. Crucial to this learning process is the possibility of failure and bankruptcy. In this model, wealth is defined as knowledge, and growth is defined as learning.
Because the system is based more on ideas than on incentives, it is not a process changeable only over generations of Sisyphean effort. An economy is a noosphere (a mind-based system) and it can revive as fast as minds and policies can change.
That new economics—the information theory of capitalism—is already at work in disguise. Concealed behind an elaborate mathematical apparatus, sequestered by its creators in what is called information technology, the new theory drives the most powerful machines and networks of the era. Information theory treats human creations or communications as transmissions through a channel, whether a wire or the world, in the face of the power of noise, and gauges the outcomes by their news or surprise, defined as “entropy” and consummated as knowledge. Now it is ready to come out into the open and to transform economics as it has already transformed the world economy itself.
[Now, skipping some interesting work in chapter two, let’s jump to chapter 3.]
Chapter Three: The Science of Information
The current crisis of economic policy cannot be understood as simply the failure of either conservative or socialist economics to triumph over its rival. It cannot be understood as New York Times Nobelist Paul Krugman or Ron Paul and the libertarians might wish, as a revival of the debate between Keynesian and Austrian schools—John Maynard Keynes and Paul Samuelson against Friedrich Hayek and Ludwig von Mises. The hard science that is the key to the current crisis had not been invented when Keynes and Hayek were doing their seminal work.
This new science is the science of information. In its full flower, information theory is densely complex and mathematical. But its implications for economics can be expressed in a number of simple and intelligible propositions.
All information is surprise; only surprise qualifies as information. This is the fundamental axiom of information theory. Information is the change between what we knew before the transmission and what we know after it.
From Adam Smith’s day to ours, economics has focused on the nature of economic order. Much of the classical and neo-classical work was devoted to observing the mechanisms by which markets, confronted with change—especially change in prices—restored a new order, a new equilibrium. Smith and his successors followed in the metaphorical paths of Newton and Leibniz, mounting a science of systems.
What they lacked was a science of disorder and randomness, a mathematics of innovation, a rigorous measure and mandate for freedom of choice. In economics, the relevant science has arrived just in time. The great economic crisis of our day, a crisis of theory as well as practice, is a crisis of information. It can be grasped and resolved only by an economics of information. Pioneered by such titans as Kurt Gödel, John von Neumann, and Alan Turing, the mathematical structure for this new economics was consummated by one of the paramount minds of the 20th century, Claude Elwood Shannon.
In a long career at MIT and AT&T’s Bell Laboratories, Shannon was a man of toys, games, and surprises. They all tended to be underestimated at first and then become resonant themes of his time and technology—from computer science and Artificial Intelligence to investment strategy and Internet architecture. As a boy during the roaring twenties in snowy northern Michigan, young Claude—grandson of a tinkering farmer who held a patent for a washing machine—made a telegraph line using the barbed-wire fence between his house and a friend’s half a mile away. “Later,” he said, “we scrounged telephone equipment from the local exchange and connected up a telephone.” Thus he recapitulated the pivotal moment in the history of his later employer: from telegraph to telephone.
There is no record of what Shannon and the world would come to call the “channel capacity” of the fence. But later in the era Shannon’s followers at industry conferences would ascribe a “Shannon capacity” of gigabits per second to barbed wire, and joke about the “Shannon limit” of a long strand of linguini.
Shannon’s contributions in telephony would follow his contributions in computing, all of which in turn were subsumed by higher abstractions in a theory of information. His award-winning Master’s thesis from MIT kick-started the computer age by demonstrating that the existing “relay” switching circuits from telephone exchanges could express the nineteenth-century algebra of logic invented by George Boole, which became the prevailing logic of computing. A key insight came from an analogy with the game of Twenty Questions: paring down a complex problem to a chain of binary, yes-no choices, which Shannon may have been first to dub “bits”. Then this telephonic tinkerer went to work for Bell Labs at its creative height, when it was a place where a young genius could comfortably unicycle down the hallways juggling several balls over his head.
During the war, he worked on cryptography there and talked about thinking machines over tea with the great tragic figure Alan Turing. Conceiving a generic abstract computer architecture, Turing is arguably the progenitor of information theory broadly conceived. At Bletchley Park in Britain, his contributions to breaking German codes were critical to the Allied victory. Following the war, he committed suicide by eating a poisoned apple after having undergone court-mandated estrogen therapy to rein in his public homosexuality. (The Apple logo, with its missing bite, is seen by some as an homage to Turing, but Steve Jobs said he only wished he had been that smart).
The two computing-obsessed cryptographers, Shannon and Turing, also discussed during these wartime teas what Shannon described as his burgeoning “notions on Information Theory” (for which Turing provided “a fair amount of negative feedback”).
In 1948, Shannon published those notions on Information Theory in The Bell System Technical Journal as a 78-page monograph, “A Mathematical Theory of Communication.” (The next year—with an introduction by Warren Weaver, one of America’s leading wartime scientists—it reappeared as a book, The Mathematical Theory of Communication.) It became the central document of the dominant technology of the age and still resonates today as the theoretical underpinning for the Internet.
Shannon’s first wife described the arresting magnetism of his countenance as “Christ-like.” Like Leonardo and fellow computing pioneer Charles Babbage, he was said by one purported witness to have built floating shoes for walking on water. With his second wife, herself a “computer” when he met her at AT&T, he created a home full of pianos, unicycles, chess-playing machines, and his own surprising congeries of seriously playful gadgets. These included a mechanical white mouse named Theseus—built soon after the information theory monograph—which could learn its way through a maze; a calculator that worked in Roman numerals; a rocket-powered Frisbee; a chair lift to take his children down to the nearby lake; a diorama in which three tiny clowns juggled 11 rings, 10 balls, and 7 clubs; and an analog computer and radio apparatus, built with the help of blackjack card-counter and fellow MIT professor Edward Thorp, to beat the roulette wheels at Las Vegas (it worked in Shannon’s basement but suffered technical failure in the casino). Later an uncannily successful investor in technology stocks, Shannon insisted on the crucial differences between a casino and a stock exchange that eluded some of his followers.
When I wrote my book, Microcosm, on the rise of the microchip, I was entranced with physics and was sure that the invention of the transistor at Bell Labs in 1948 was the paramount event of the post-war decade. Today, I find that physicists are entranced with the theory of information. I believe, with his biographer James Gleick, that Shannon’s Information Theory was a breakthrough comparable to the transistor. While the transistor is today ubiquitous in information technology, Shannon’s theories are immanent in all the ascendant systems of the age. As universal principles, they grow ever more fruitful and fertile as time passes. Every few weeks, I encounter another company crucially rooted in Shannon’s theories, full of earnest young engineers conspiring to beat the Shannon limit. The technology of our age seems to be at once Shannon-limited and Shannon-enabled. So is the modern world.
Let us imagine the lineaments of an economics of disorder, disequilibrium, and surprise that could explain and measure the contributions of entrepreneurs. Such an economics would begin with the Smithian mold of order and equilibrium. Smith himself spoke of property rights, free trade, sound currency, and modest taxation as crucial elements of an environment for prosperity. Smith was right: An arena of disorder, disequilibrium, chaos, and noise would drown the feats of creation that engender growth. The ultimate physical entropy envisaged as the heat death of the universe, in its total disorder, affords no room for invention or surprise. But entrepreneurial disorder is not chaos or mere noise. Entrepreneurial disorder is some combination of order and upheaval that might be termed “informative disorder.”
Shannon defined information in terms of digital bits and measured it by the concept of information entropy: unexpected or surprising bits. The source of the name is reported to have been John von Neumann, inventor of computer architectures, game theory, quantum math, nuclear devices, military strategies, and cellular automata, among other ingenious phenomena. Encountering von Neumann in a corridor at MIT, Shannon allegedly told him about his new idea. Von Neumann suggested that he name it “entropy” after the thermodynamic concept (according to Shannon, von Neumann said it would be a great word to use because no one knows what it means).
Shannon’s entropy is governed by a logarithmic equation nearly identical to the thermodynamic equation of Rudolf Clausius that describes physical entropy. But the parallels between the two entropies conceal several pitfalls that have ensnared many. Physical entropy is maximized when all the molecules in a physical system are at an equal temperature (and thus cannot yield any more energy). Shannon entropy is maximized when all the bits in a message are equally improbable (and thus cannot be further compressed without loss of information). These two nearly identical equations point to a deeper affinity that MIT physicist Seth Lloyd identifies as the foundation of all material reality—at the beginning was the entropic bit.
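The maximization property described above can be illustrated with a few lines of Python. This is a sketch of my own, not from Gilder’s text: it computes Shannon entropy for a uniform distribution of symbols (maximally improbable, hence incompressible) and for a skewed one (partly predictable, hence compressible).

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p))."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Four equally probable symbols: entropy is at its maximum, log2(4) = 2 bits.
uniform = [0.25, 0.25, 0.25, 0.25]

# A skewed distribution over the same four symbols: partly predictable,
# so its entropy falls below the 2-bit maximum.
skewed = [0.7, 0.1, 0.1, 0.1]

print(shannon_entropy(uniform))  # 2.0 bits
print(shannon_entropy(skewed))   # about 1.36 bits
```

The uniform case cannot be compressed without loss; the skewed case can, which is exactly the sense in which predictable bits carry less information.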
For the purposes of economics, the key insight of information theory is that information is measured by the degree to which it is unexpected. Information is “news,” gauged by its surprisal, which is the entropy. A stream of predictable bits conveys no information at all. A stream of uncoded chaotic noise conveys no information, either.
In the Shannon scheme, a source selects a message from a portfolio of possible messages, encodes it through resort to a dictionary or lookup table using a specified alphabet, then transcribes the encoded message into a form that can be transmitted down a channel. Afflicting that channel is always some level of noise or interference. At the destination, the receiver decodes the message, translating it back into its original form. This is what is happening when a radio station modulates electromagnetic waves, and your car radio demodulates those waves, translating them back into the original sounds or voices at the radio station.
Part of the genius of information theory is its understanding that this ordinary concept of communication through space extends also through time. A compact disc, iPod memory, or TiVo personal video recorder also conducts a transmission from a source (the original song or other content) through a channel (the CD, DVD, microchip memory, or “hard drive”) to a receiver chiefly separated by time. In all these cases, the success of the transmission depends on the existence of a channel that does not change significantly during the course of the communication, either in space or in time.
Change in the channel is called noise and an ideal channel is perfectly linear. What comes out is identical to what goes in. A good channel, whether for telephony, television, or data storage, does not change in significant ways during the period between the transmission and receipt of the message. Because the channel is changeless, the message in the channel can communicate changes. The message of change can be distinguished from the unchanging parameters of the channel.
In that radio transmission, a voice or other acoustic signal is imposed on a band of electromagnetic waves through a modulation scheme. This set of rules allows a relatively high-frequency non-mechanical wave (measured in kilohertz to gigahertz and traveling at the speed of light) to carry a translated version of the desired sound, which the human ear can receive only in the form of a lower frequency mechanical wave (measured in acoustic hertz to low kilohertz and traveling close to a million times slower). The receiver can recover the modulation changes of amplitude or frequency or phase (timing) that encode the voice merely by subtracting the changeless radio waves. This process of recovery can occur years later if the modulated waves are sampled and stored on a disk or long term memory.
The accomplishment of Information Theory was to create a rigorous mathematical discipline for the definition and measurement of the information in the message sent down the channel. Shannon entropy or surprisal defines and quantifies the information in a message. In close similarity with physical entropy, information entropy is always a positive number measured by minus the base two logarithm of its probability.
Information in Shannon’s scheme is quantified in terms of a probability because Shannon interpreted the message as a selection or choice from a limited alphabet. Entropy is thus a measure of freedom of choice. In the simplest case of maximum entropy of equally probable elements, the uncertainty is merely the inverse of the number of elements or symbols. A coin toss offers two possibilities, heads or tails; the probability of either is one out of two; the logarithm of one half is minus one. With the minus canceled by Shannon’s minus, a coin toss can yield one bit of information or surprisal. A series of bits of probability one out of two does not provide a 50 percent correct transmission. If it did, the communicator could replace the source with a random transmitter and get half the information right. The probability alone does not tell the receiver which bits are correct. It is the entropy that measures the information.
For another familiar example, the likelihood that any particular facet of a die turns up in a throw of dice is one sixth, because there are six possibilities all equally improbable. The communication power, though, is gauged not by its likelihood of one in six, but by the uncertainty resolved or dispersed by the message. One out of six is two to the minus 2.58, yielding an entropy or surprisal of 2.58 bits per throw.
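The coin-toss and die-throw arithmetic in the two preceding paragraphs can be checked directly. This is a minimal sketch of my own (the function name is mine, not Shannon’s): surprisal is simply minus the base-two logarithm of an outcome’s probability.

```python
import math

def surprisal_bits(probability):
    """Surprisal (self-information) in bits: -log2(p)."""
    return -math.log2(probability)

print(surprisal_bits(1 / 2))  # 1.0 bit -- a fair coin toss
print(surprisal_bits(1 / 6))  # about 2.585 bits -- one face of a fair die
```

The die’s 2.585 bits per throw matches the text’s “two to the minus 2.58”: the rarer the outcome, the greater the surprisal, and hence the more information its occurrence conveys.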
Shannon’s entropy gauged the surprisal of any communication event taking place over space or time. By quantifying the amount of information, he also was able to define both the capacity of a given channel for carrying information and the impact of noise on that carrying capacity.
From Shannon’s Information Theory—his definition of the bit, his explanation and calculation of surprisal or entropy, his gauge of channel capacity, as well as his profound explorations of the impact and nature of noise or interference, his abstract theory of cryptography, his projections for multi-user channels, his rules of redundancy and error correction, and his elaborate understanding of codes—would stem most of the technology of this information age.
Working at Bell Labs, Shannon focused on the concerns of the world’s largest telephone company. But he offered cues for the application of his ideas in larger domains. His 1940 Ph.D. thesis had treated “An Algebra for Theoretical Genetics”. Armed with his later information theory insights, he included genetic transmissions as an example of communication over evolutionary time through the channel of the world. He estimated the total information complement in a human being’s chromosomes to be hundreds of thousands of bits. He vastly underestimated the size of the genome, missing the now estimated six billion bits by a factor of four thousand. But he was the first to assert that the human genetic inheritance consisted of encoded information measurable in bits.
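The six-billion-bit figure cited above follows from a back-of-the-envelope calculation, sketched here as a check (the variable names are mine): each base pair in DNA is one of four possibilities, so it carries log2(4) = 2 bits, and the diploid human genome holds roughly three billion base pairs.

```python
base_pairs = 3_000_000_000  # roughly 3 billion base pairs in the human genome
bits_per_base_pair = 2      # 4 possible bases (A, C, G, T) -> log2(4) = 2 bits

genome_bits = base_pairs * bits_per_base_pair
print(genome_bits)  # 6_000_000_000 -- the "six billion bits" in the text
```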
Thus Shannon boldly extended the sway of his theory to biological phenomena and perhaps implicitly authorized its extension into economics, though to the end of his life in 2001 he remained cautious about the larger social applications of his mathematical concept.
Ironically it was his caution, his disciplined reluctance to contaminate his pure theory with wider concepts of semantic meaning and creative content, that gives his formulations their huge generality and applicability. Shannon did not create a science of any specific kind of communication. It is not tied to telephone communication or television communication or physical transmission over radio waves or down wires, or transmission of English language messages or numerical messages, or measurement of the properties of music or genomes or poems or political speeches or business letters. He did not supply a theory for communicating any particular language or code, though he was fascinated by measures of the redundancy of English.
Shannon offered a theory of messengers and messages, without a theory of the ultimate source of the message in a particular human mind with specific purposes, meanings, projects, goals, and philosophies. Because Shannon was remorselessly rigorous and restrained, his theory became a carrier that could bear almost anything transmitted over time and space in the presence of noise or interference—including business ideas, entrepreneurial creations, economic profits, monetary currency values, private property protections, and innovative processes that impel economic growth.
An entrepreneur is the creator and manager of a business concept that he wishes to instantiate or reify—make real—over time and space. Let us envisage the canonical Steve Jobs and the iPod: when he conceives the idea in his head, he must then move to encode it in a particular physical form that can be transmitted into a marketplace. This requires design, engineering, manufacturing, marketing and distribution. It is an ineffably complex endeavor dense with information at every stage.
As an entrepreneur and CEO of Apple, Jobs controlled many of the stages. But the ultimate success of such a project depends on the existence of a channel that can enable the process to be consummated over nearly a decade, during which many other companies, outside his control, produce multifarious competitive or complementary creations. Vital to all Apple’s wireless advances are achievements in ceramic and plastic packaging, in digital signal processing, in radio communications, in miniaturization of hard disks, in non-volatile “flash” silicon memories, in digital compression codes, and in innumerable other technologies feeding an unfathomably long and roundabout chain of interdependent creations.
In biology itself, chemical and physical laws define many of the enabling regularities of the channel of the world. In the economic world in which Jobs operated, he needed the stable existence of a “channel” that could enable the idea he conceived at one point in time and space to arrive at another point years later. Essential to the channel is the existence of the Smithian order. Jobs had to be sure that the economic system in place at the beginning of the process would remain in its essential parameters at the end. Smith defined the essential parameters of the channel as free trade, reasonable regulations, sound currencies, modest taxation, and reliable protection of property rights. No one has much improved on this list.
In other words, the entrepreneur needs a channel that in these critical respects does not drastically change. The technologies that accomplish these goals can radically change. But the characteristics of the basic channel for free entrepreneurial creativity cannot change significantly. A radical rise in tax rates, or imposition of laws against ownership of rights to music, or regulations gravely inhibiting international trade would all have tended to close off the channel for the iPod.
One fundamental information-theory principle distills all these considerations of the state of the channel: transmitting a high-entropy, surprising product requires a low-entropy, unsurprising channel, largely free of interference. Interference can come from many sources. Acts of God, tsunamis, and Category 5 hurricanes have been known to do the job, though otherwise vigorous economies quickly recover from these. For a particular entrepreneurial idea, a more powerful competing technology, though a clear signal in itself and a boon to the world, can inflict overwhelming interference.
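Gilder is borrowing Shannon's own definitions here: the "surprisal" of an outcome is the negative log of its probability, and entropy is the expected surprisal. As a purely illustrative aside (not from the book), a few lines of Python make the contrast concrete, with two hypothetical probability distributions standing in for a predictable channel and a surprising product:

```python
import math

def surprisal(p):
    """Shannon's 'surprisal' of an outcome with probability p, in bits."""
    return -math.log2(p)

def entropy(dist):
    """Shannon entropy: the expected surprisal of a distribution, in bits."""
    return sum(p * surprisal(p) for p in dist if p > 0)

# A "low-entropy channel": one outcome is nearly certain, so little surprise.
stable_channel = [0.97, 0.01, 0.01, 0.01]

# A "high-entropy message": four equally likely outcomes, maximal surprise.
novel_product = [0.25, 0.25, 0.25, 0.25]

print(round(entropy(stable_channel), 3))  # ~0.242 bits: predictable
print(round(entropy(novel_product), 3))   # 2.0 bits: maximal for 4 outcomes
```

The numbers capture the asymmetry Gilder describes: the channel's value lies in its predictability (entropy near zero), while the product's value lies in its surprise (entropy near the maximum).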
The most common and destructive purveyor of noise, however, is precisely the institution on which we most depend to provide a clear and stable channel in the first place. When government either neglects its role as guardian of the channel, or still worse, tries to help by becoming a transmitter and raising the power on certain favored signals, the noise can be deafening.
A friendly government that excluded all Jobs’ rivals from the channel or mandated his iPod model alone as a way to distribute music might have benefited Jobs for one product. But a government that could ban competitive products would thwart all the necessary technological advances that could endow Jobs’ future products. By definition such a government would create a high-entropy, government-dominated channel, full of unpredictable political interference and noise, which would balk sacrificial long-term investment of capital. The horizons of the economy would shrink to the bounds of political expediency, and short-term arbitrage and trading would prevail over investment and innovation.
As the entrepreneur contemplates his invention, crucial to the prospects for success is an estimate of its potential profitability. Profit is the name that economics assigns to the yield of investments. Expressing the average yield across an entire economy, the level of interest rates and their time and risk structure will reflect the existing pattern of production and expected values of currencies. Interest rates will define the opportunity cost of investments in new products: what other opportunities are missed on average as a result of pursuing one in particular.
In information-theoretic terms, interest rates are a critical index of real economic conditions. If they are manipulated by government, they will issue false signals and create confusion that undermines entrepreneurial activity. If low interest rates, for example, are targeted to institutions that finance the government—as has been the case in the United States—they represent a serious distortion of the channel. They are noise rather than signal. Zero-interest-rate money enables a hypertrophy of finance as privileged borrowers reinvest government funds in government securities, only a minority of which finance even putatively useful “infrastructure” while the rest is burned off in consumption beyond our means.
An entrepreneur making large outlays to bring a major product to market over a process taking years will normally have to promise a profit, perhaps to venture capitalists or a board of directors, far exceeding the interest rate. This entrepreneurial profit is not expected by the economy at large. It is unanticipated by the large established companies that dominate the marketplace at the time. Profits differentiate between the normally predictable yield of commodities and the unexpected returns of creativity. Reflecting the surprisal in the new product or business, this payback will be surprising, disruptive, and disequilibrating to the existing order. If established companies can manipulate the channel to protect their own products and businesses and margins, a new product cannot pass through.
The unexpected financial profit is surprisal or entropy—what Peter Drucker termed an “upside surprise.” Drucker pointed out that most measured financial “profits” are not real in this sense. They merely cover the cost of capital—the return of interest. Innovation is the source of real profit, entropic profit, which derives from the upside surprises of entrepreneurial creativity.
In order for the entrepreneur to succeed, he must know that if his creation generates an upside surprise, the related profits will not be confiscated or taxed away. If they may be confiscated, his entire project will not be able to command the necessary resources to bring it to market in volume. Thwarted are the crucial processes of learning and knowledge-creation that constitute economic growth and progress.
Linking innovation, surprise, and profit, learning and growth, Shannon entropy stands at the heart of the economics of information theory. Signaling the arrival of an invention or disruptive innovation is first its surprisal, then its yield beyond the interest rate—its profit, a further form of Shannon entropy. As a new item is absorbed by the market, however, its entropy declines until its margins converge with prevailing risk-adjusted interest rates. The entrepreneur must move on to new surprises.
The economics of entropy depict the process by which the entrepreneur translates his idea into a practical form from the realms of imaginative creation. In those visionary realms, entropy is essentially infinite and unconstrained, and thus irrelevant to economic models. But to make the imagined practical, the entrepreneur must make specific choices among existing resources and strategic possibilities. Entropy here signifies his freedom of choice.
As Shannon understood, the creation process itself escapes every logical and mathematical system. It springs not from secure knowledge but from falsifiable tests of commercial hypotheses. It is not an expression of past knowledge but of the fertility of consciousness, will, discipline, imagination, and art.

Like all logical systems founded on mathematical reasoning, information theory is dependent on axioms that it cannot prove. These comprise the content flowing through the conduits of the economy, and they come from the minds of creators, endowed with freedom of choice. Once the entrepreneur reifies his plans in the world, projecting them into the channel of the economy as falsifiable experiments, they fall into the Shannon scheme. Measured by their entropy—their content and surprisal—new products face the test of the market that they create. They converge learning, knowledge, and power in an experimental economy of freedom.