It is hard not to think that the world has come to a critical juncture, a point of possibly catastrophic collapse. Multiple simultaneous crises—many of epic proportions—raise doubts that liberal democracies can govern their way through them. In fact, it is vanishingly rare to hear anyone say otherwise.
While thirty years ago, scholars, pundits, and political leaders were confidently proclaiming the end of history, few now deny that it has returned—if it ever ended. And it has done so at a time of not just geopolitical and economic dislocations but also historic technological dislocations. To say that this poses a challenge to liberal democratic governance is an understatement. As history shows, the threat of chaos, uncertainty, weakness, and indeed ungovernability always favors the authoritarian, the man on horseback who promises stability, order, clarity—and through them, strength and greatness.
How, then, did we come to this disruptive return? Explanations abound, from the collapse of industrial economies and the post–Cold War order to the racist, nativist, and ultranationalist backlash these have produced; from the accompanying widespread revolt against institutions, elites, and other sources of authority to the social media business models and algorithms that exploit and exacerbate anger and division; from sophisticated methods of information warfare intended specifically to undercut confidence in truth or facts to the rise of authoritarian personalities in virtually every major country, all skilled in exploiting these developments. These are all perfectly good explanations. Indeed, they are interconnected and collectively help to explain our current state. But as Occam’s razor tells us, the simplest explanation is often the best. And there is a far simpler explanation for why we find ourselves in this precarious state: The widespread breakdowns and failures of governance and authority we are experiencing are driven by, and largely explicable by, underlying changes in technology.
We are in fact living through technological change on the scale of the Agricultural or Industrial Revolution, but it is occurring in only a fraction of the time. What we are experiencing today—the breakdown of all existing authority, primarily but not exclusively governmental—is, if not a predictable result, at least an unsurprising one. All of these other features are just the localized spikes on the longer sine wave of history.
About ten years ago, I taught a course on the future of government at the University of Chicago. The broad hypothesis guiding it was that the rise of the Internet would lead to the breakup of governments and nation-states, and the emergence of new competitors, thanks to the democratization of consumer choice and the demise of territorial boundaries as a constraint, just as had already begun happening in most private-sector businesses and industries.1 Rebellion against supranational organizations, sovereign citizen movements, individuated politics and party fracturing, and most other features of politics in the last decade flowed logically from the consequences of the Internet’s emergence.2 As it happened, I had to keep revising the course syllabus when the most far-fetched developments I believed might occur decades in the future—for example, a country hiring foreign hackers to launch a cyberwar against a major corporation (North Korea and Sony Pictures) or another country selling its virtual government to consumers everywhere (Estonia)—indeed did occur, in case after case, in the middle of each semester.
The recent Russian invasion of Ukraine is the tragic exception that proves the point: Who invades another country anymore? Who (except backward-looking authoritarian regimes) obsesses about geopolitical “heartlands,” territorial buffer zones, or eight-hundred-year-old historical claims?
Instead, we can glimpse the future in the extensive damage inflicted through an unprecedented global economic, and largely virtual, counterattack, rather than a primarily territorial, kinetic response. Perhaps most surprising has been the discovery that, as The Economist put it, Ukrainians “are ready to die for the idea that they should choose their own destiny.”3 People actually want democracy. Who knew?
Until now, conventional wisdom has held authoritarianism to be winning its global struggle with democracy. Instead, it should have been clear before now that we are actually living in a world increasingly dominated by digital technologies, even in war, where territory is largely irrelevant and everyone in fact wants to determine his or her own destiny.
Democracy is indeed under attack from authoritarian forces. It always will be. But it can survive if those who want it fight back—though in a very different form from what we have come to expect. Like virtually everything else.
This, then, is a discussion of what the new technological realities mean for the future of power and its control through governance. It is not, however, an argument for a simple technological determinism—especially not of the type popular right now and most prominently advanced by the historian Yuval Noah Harari, among others, who asserts that contemporary digital technologies tend almost relentlessly to the authoritarian. That is not true in any of its particulars: Technologies themselves are not inherently authoritarian—or otherwise ideological—and current digital technologies certainly are not. These technologies are what we make of them. But to some degree, we also are what these technologies make of us—or, more precisely, once we make them, they then make the world that shapes our thoughts, actions, and lives. While the dominant technology of any era shapes not just what people do but also how they conceive of the world around them, it doesn’t dictate the paths they ultimately choose: Those paths are chosen by millions, in fact today billions, of individuals who, together, decide the direction the future will take.
Alexis de Tocqueville observed that historians “are always inclined to find general causes,” while “politicians who have concerned themselves with producing events…living in the midst of disconnected daily facts, are prone to imagine that everything is attributable to particular incidents.” Tocqueville concluded, “It is to be presumed that both are equally deceived.”4 The unique gifts of Donald Trump, say, or the sway of right-wing media, the cynical calculations of conservative politicians, the changing demographics of the United States and the developed world more broadly, the economic vicissitudes of the last decade—all of these are shaping today’s events. History is contingent, and the sand pile we live on could shift or collapse at any moment as each new event, like another grain, is added to it. Yet these shifting sands rest upon deeper tectonic plates that shape the overall unfolding of history.
Just as in previous technological transitions, all social constructs currently taken largely for granted are changing: the structure of the family and gender relations, the place in our lives of religious belief, the nature of the economy, the distribution and geography of political power, the definitions and interrelations of social classes, the bases of self-identity, and the sources and existence of truth and reality themselves. The Protestant Reformation, whose “true and enduring radicalism,” in the words of historian Alec Ryrie, lay in “its readiness to question every human authority and tradition,”5 was made possible in large measure by the information technology revolution of its day—the invention of the printing press—and unleashed conflicts and violence that eventually led to the emergence of the modern nation-state as the means for containing it. We are experiencing a similar upheaval today, with widespread challenges to authority and tradition posing direct threats to nations and governing systems. Nor should it surprise us that all this has induced violent backlash and extreme polarization.
This technological explanation at first may seem impersonal, deterministic, and therefore depressing. But it is none of those things: We are living through a historical earthquake. Countries, government, democracy—at least, as we now know them—are all collapsing. That, we cannot change. But we can determine what we build in response.
On Technology and Technological Determinism
Contrary to the arguments of technodeterminists like Harari that today’s inventions doom us to a world of tyranny because “there is nothing inevitable about democracy,”6 there is in fact nothing inevitable about technology. Take, for instance, the world-shaking technological breakthrough that most obviously led to the concentration of authority in centralized, all-powerful autocratic regimes: the stirrup.
The stirrup was invented in China, at a time roughly coincident with the rise of the Roman Republic, but it did not make its way into Western usage for nearly a millennium. By allowing horsemen to maintain their weight evenly across a horse’s back, the stirrup dramatically altered warfare, enabling riders to remain mounted in combat, to wield a sword more easily, and to employ more of their mount’s momentum against an opponent. This shifted the balance of power from infantry to cavalry. It also led—many historians believe—to the strengthening of centralized authority and the rise of feudalism: Virtually anyone could enlist as a footman, but becoming a knight required special equipment, first and foremost a horse, something much more expensive to acquire and maintain than a pike and shield. Power became more concentrated in a landed gentry able to command this new, and more limited, capital resource, and even more so in a unifying monarch able to amass the loyalty, horsemen, and tax revenues of significant numbers of vassals.
A horse is a horse, of course, and its effectiveness in warfare was eventually blunted by another technological advance that obviously leads to concentration of authority in centralized, all-powerful dictatorial regimes: the square.
The infantry square is a battlefield formation that was developed in China and Rome at roughly the same time and used effectively against mounted opponents. As the name implies, an infantry square consists simply of troops drawn up into four lines in the shape of a hollow square, all facing outward, leaving no unguarded side or rear for a more mobile (i.e., mounted) opponent to circle around and attack. This formation became particularly effective with the development of the bayonet in the early modern era, which allowed each side of a square to become a bristling wall of flashing metal knives, all projecting several feet outward at the end of a muzzle. As it turns out, horses—unlike their supposedly smarter human counterparts—cannot be made to charge into a wall of knives. The square thus achieved its true glory in the Napoleonic Wars, particularly in the decisive battle of Waterloo. The British squares at Waterloo broke the charges of Napoleon’s theretofore invincible cavalry, holding the ridge athwart the main road to Brussels and ending for good the threat posed by the Little Corporal to the other European powers.
The victory at Waterloo was celebrated across the Continent as a triumph of freedom—by the ruling, aristocratic regimes. Although a dictator, Napoleon was seen by those regimes as the spear’s tip of the revolutionary movement that had toppled the monarchy and massacred the aristocracy in France, and he appeared to pose a similar danger to monarchs and aristocrats everywhere. Waterloo thus looked at the time to be the salvation of the ancien régime across Europe.7
But it was not. The Napoleonic Wars unleashed the forces of democratization in unforeseen ways. Because the technologies of the time favored huge massed armies, all the countries of Europe had been required to develop both the incentives to raise previously unimagined numbers of recruits and the bureaucracies necessary to process, provision, transport, and direct them. The nobles dancing in Brussels to celebrate Napoleon’s defeat soon found that their mighty endeavor had raised the expectations of their respective peoples now keen to share in their nations’ fortunes, having been called to defend their various regimes from mortal peril. Such expectations could be met only by using the newly created machinery of a modern state to distribute (and redistribute) resources on a mass scale. Instead of averting the threat of further revolutions like those in France and America, the Napoleonic Wars gave rise to a century of upheaval that led eventually to the spread of democracy and the advent of the welfare state.
In short, it is often difficult to see where new technologies actually will lead. The same technological development can, in different settings, have different effects: The use of horses in warfare, which led seemingly inexorably in China and Europe to more centralized and autocratic states, had the effect on the other side of the world of enabling Hernán Cortés, with an army of roughly five hundred Spaniards, to defeat the massed infantries of the highly centralized, autocratic Aztec regime. Cortés’s example demonstrates that a particular technology generally employed by a concentrated power to centralize and dominate can also be used by a small insurgent force to disperse and disrupt (although in Cortés’s case this was on behalf of the eventual imposition of an even more despotic rule).
Regardless of the lack of inherent ideological content in any given technology, however, our technological realities consistently give metaphorical shape to our ideological constructs. In ancient Egypt, the regularity of the Nile’s flood cycle, which formed the society’s economic basis, gave rise to a belief in recurrent cycles of life and death; in contrast, the comparatively harsh and static agricultural patterns of the more-or-less contemporaneous Mesopotamian world produced a society that conceived of gods who simply tormented humans and then relegated them after death to sit forever in a place of dust and silence; meanwhile, the pastoral societies of the Fertile Crescent have handed down to us the vision of God as shepherd of his flock. (The Bible also gives us, in the story of Cain and Abel, a parable of the deadly conflict that technologically driven economic changes wreak: Abel was a traditional pastoralist—he tended sheep—while Cain, who planted seeds in the ground, represented the disruptive “New Economy” of settled agriculture. Tellingly, after killing off the pastoralist, the sedentarian Cain exits to found the first city.8)
As humans developed more advanced technologies, these in turn reshaped our conceptions of the world around us, including the proper social order. Those who possessed superior technological knowledge were invested with supernatural authority: The key to early Rome’s defense was the ability quickly to assemble and disassemble the bridges across the Tiber, so much so that the pontifex maximus—literally the “greatest bridge-builder”—became the high priest, from whose Latin title we derive the term pontiff. The most sophisticated—and arguably most crucial—technology in any town in medieval Europe was its public clock. The clock, in turn, became a metaphor for the mechanical working of the universe—God, in fact, was often conceived of as a clockmaker (a metaphor still frequently invoked to argue against evolution and for the necessity of an intelligent creator)—and for the proper form of social organization: All should know their place and move through time and space as predictably as the figurines making their regular appearances and performing their routinized interactions on the more elaborate and entertaining of these town-square timepieces.
In our own time, the leading technologies continue to provide the organizing concepts for our economic, political, and theological constructs. The factory became such a ubiquitous reflection of economic and social realities that, from the early nineteenth century onward, virtually every social and cultural institution—welfare (the poorhouse, or, as it was often called, the “workhouse”), public safety (the penitentiary), health care (the hospital), mental health (the insane asylum), “workforce” or public housing, even (as teachers often suggest to me) the education system—was consciously remodeled around it. Even when government finally tried to get ahead of the challenges posed by the Industrial Revolution by building the twentieth-century welfare state, it wound up constructing essentially a new capital of the Industrial Age in Washington, DC, with countless New Deal ministries along the Mall—resembling, as much as anything, the rows of factory buildings one can see in the steel and mill towns of the same era.
By the middle of the twentieth century, the atom and the computer came to dominate most intellectual constructs. First, the uncertainty of quantum mechanics upended mechanistic conceptions of social and economic relations, helping to foster conceptions of relativism in everything from moral philosophy to literary criticism. More recently, many scientists have come to the conclusion that the universe amounts to a massive information processor, and popular culture to the conviction that we all simply live inside a giant video game.
In sum, while technological developments are not deterministic—their outcomes being shaped, rather, by the uses we conceive for them—our conceptions are largely molded by these dominant technologies and the transformations they effect.9 To understand, then, the deeper movements of thought, economic arrangements, and political developments, both historical and contemporary, one must understand the nature of the technologies underlying and driving their unfolding.
The Nature of Today’s Technologies
Like the proverbial elephant examined by the blind men, any conversation about today’s digital technologies might appear to be about a wide range of very different, and very personal, impacts. When most people, particularly the young, who use them most, think about the current generation of technologies, they think mostly of social media. Many also think in terms of simulations or gaming. Or, perhaps, the ready availability of pornography.
With a modicum of ambition and tech savvy, users of these technologies can easily put out a publication read by millions, create their own music studio (or, for that matter, a hundred-piece orchestra without any knowledge of how to play an instrument), and produce movies or live “TV.” They can “print” complex manufactured objects—such as guns—in their own homes or turn parts of that home (or their car, or their excess computer capacity) into components of a globe-spanning enterprise (Airbnb, Uber, cryptocurrency “mining”) the constituents of which are in a constant state of assembly, disassembly, and reassembly. Another major everyday impact, accelerated by the COVID-19 pandemic, has been the ease of ordering goods and services of all kinds from just about anywhere, delivered promptly to one’s door. As many parents discovered during the last two years, mostly to their dismay, this can include K–12 education as well as almost all facets of office work.
These many applications, however varied, share certain core characteristics, most fundamentally, that today’s dominant and emerging technologies are disaggregative. The essence of digitization is the breaking down of everything into its information component. Information is massless and occupies no space. It reduces complex systems, constructs, and interrelationships to their simplest units—which we can think of as ones and zeroes, ons and offs, signal or silence, existence or nonexistence—which can be broken apart, transmitted, and recombined to create meaning. This characteristic compresses time, reduces cost, and renders place meaningless.
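To see how much work that single idea does, consider a minimal sketch, in Python, of meaning reduced to bits, broken apart, and recombined (my own illustration, with all names invented for the purpose):

```python
# Illustrative sketch: once digitized, any content is a stream of ones and
# zeroes that can be split, shipped anywhere, and recombined without loss.

message = "Democracy disrupted"

# Reduce the text to its simplest units: bits.
bits = "".join(format(byte, "08b") for byte in message.encode("utf-8"))

# Break the stream apart into arbitrary packets...
packets = [bits[i : i + 16] for i in range(0, len(bits), 16)]

# ...transmit them from anywhere to anywhere (here, simply reassemble)...
reassembled = "".join(packets)

# ...and recombine the units to recover the original meaning.
data = bytes(int(reassembled[i : i + 8], 2) for i in range(0, len(reassembled), 8))
assert data.decode("utf-8") == message
```

Nothing in the sketch depends on where sender or receiver sits, how far the packets travel, or what the message "weighs": The compression of time, the reduction of cost, and the irrelevance of place all follow.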
The most immediate effects of the digital revolution flow obviously from the foregoing: Information becomes cheap and ubiquitous. Meaning can be assembled and reassembled from data in infinite ways. The virtual is more valuable than the concrete. Location doesn’t matter. Anyone can create anything anywhere and provide it to someone somewhere else.
Moreover, anything can be broken apart into its components, which then can be manipulated separately. An extreme (because physical-world) example of this is 3-D printing, which allows the idea of an object—whether a complex machine, an entire building, or a living organism—to be translated into bits, transmitted to a “printer,” then reassembled monad by monad, as it were.
This disaggregation (and then reaggregation by the user) has broader applications: Services and products previously bundled together can be (and thus eventually must be) “unbundled” and acquired or sold separately. Low-cost providers can more easily market stripped-down products, and consumers can more easily obtain only, and exactly, those components they want, with the result being an economy of mass individuation. Capital goods previously obtainable only in fixed bulk amounts can be fractionated: Portions of homes or cars or factories can be used when or as needed, by whoever chooses to use them.
Present-day technology is making disaggregation increasingly an aspect of the analog economy, as well. As George Will observed, one of the more transformative technologies of recent decades is the shipping container, which allows products previously sold in bulk to be more easily disaggregated, transported, sold, and consumed in whatever units desired.10 This unbundling of component parts into individual subunits has further, and profound, implications for virtually every human activity.
The first such implication is virtuality itself. What digital technologies sever, in every human interaction, is the connection to a specific physical location. Almost any product or service can be delivered from anywhere to anywhere. This fact has reshaped almost every industry in the world, particularly those that are information based. It will have its most profound effect on the industry most defined by territoriality: government.
We have come to think of government almost definitionally as the exercise of authority over territory. Our fundamental unit of modern government, the nation-state, has been defined explicitly as territorial since its rise in the seventeenth century. But that definition is already losing its salience: An increasing number of countries—most notably Estonia, but many others in other settings as well11—have started selling their governmental services to the residents and citizens of other countries. Even some US states are doing this—for instance, Rhode Island with its college savings plan, Florida with its “virtual school” (online K–12 public education available to students worldwide), and, of course, Delaware with corporate governance.
All this unbundling and territorial decoupling greatly expands the options available to consumers, producers, and providers. You need no longer purchase the services of only the local video rental store, power generator, or—as Estonia has demonstrated—government. Anyone can order bespoke suits or running shoes, create a personal brand or produce music or writings consumed by millions, launch the next online business, or wreak economic damage through hacking or physical destruction through weapons of mass destruction (WMDs) on a scale previously reserved to large nation-states. This is a highly “democratic” world, but not in the way most people think: That sort of democracy is, along with nation-states of broad territorial reach, increasingly a relic of the past.
The Changing Nature of Governance
Conventional wisdom holds that, contrary to what I have argued above, today’s digital technologies are aggregative, centralizing, and authoritarian. Accordingly, they lead both to the concentration of entire fields of activity in the hands of a few actors such as Facebook or Amazon (which themselves exhibit dictatorial tendencies) and to the rise of authoritarian politics worldwide. Yet even as most commentators bemoan the increasingly authoritarian nature of politics and government around the world, democracy movements are alive and thriving—if under assault from authoritarians—from Hong Kong to Sudan to Venezuela to Ukraine to the United States. Even the worldwide populist movements perceived as authoritarian are inherently democratic, rejecting elite policies and demanding greater individual autonomy. They tend to be illiberal, but that is different from being undemocratic.
The tension between these two understandings of our age’s underlying technologies is inherent in them: These technologies are built on network economics, which tend to scale, and thus lead to monopolization and concentration. But at the same time, networks are inherently disaggregating, decentralizing, and democratizing.
Most prior modern technological advances, particularly those of the twentieth century (broadcast communications media such as television and radio, WMDs such as the atomic bomb, the information processing function of the mainframe computer), were built on one-to-many architectures (i.e., those that run from a single source to multiple recipients). They underlay the centralization of power in, and creation of, the modern nation-state and the modern corporation. By contrast, the architecture of today’s technologies is many-to-many. In the euphoric early days of the Internet, people commonly believed that the new “World Wide Web,” by allowing virtually every piece of information to be knitted together and made accessible to virtually every human being (and then shared back out to every other human being), would lead to the advance of human knowledge by making possible a widely shared consensus on all the facts. Note the implicit assumption that democratization and decentralization would lead to consensus.
A quarter century on, it is clear that this early optimism was exactly wrong: The availability of all data to all people does not create an authoritative version of reality. It destroys it. The days of only three network news broadcasts of varying shades of gray, with accompanying entertainment divisions producing largely similar mass offerings, were replaced first by the larger set of options that came with numerous cable TV news offerings and limitless TV channels for specialized tastes (individual networks for baseball or hockey, food, home decorating, travel, history, etc.), and then by media like Facebook Live that enable anyone to become a broadcaster. A comparably small number of major newsmagazines—Time, Newsweek, and US News & World Report—has been displaced by a blogosphere in which virtually anyone can reach a global audience. Contrary to the expectations of the Internet’s original theorists, the democratization of information has created nothing like a Jeffersonian marketplace of ideas in which truth ultimately wins out, but simply a marketplace where anyone can peddle or acquire whatever version of “truth” one wishes.
Today’s technologies, then, not only destroy any notion of “authoritativeness” but also have a profoundly destabilizing effect on all forms of “authority.” Of course, new technologies generally undermine incumbents and the existing distributions of power. But the distinctive feature of the Internet era has been the rapid disintegration of all forms of authoritativeness—especially the authority of objective reality. Alternative realities are the reality of the digital era: Second Life, forerunner of the metaverse, provided people with an alternative universe, alternative identity, and alternative economy. Augmented and virtual reality are the next logical and technological step that will make it harder and harder in the years ahead to distinguish fact from fantasy, truth from fiction, real from fake, “alternative facts”—in Kellyanne Conway’s memorable phrase—from factual facts.
Despots, would-be or actual, depend on the destruction of truth and objective reality to undermine or discredit all other sources of authority, so that they may rule by force. In contrast, the current technological regime, in undermining all authority, including all incumbent structures of power, is as much a state-destroying force as a state-fortifying one. Even Donald Trump, despite generally being viewed as authoritarian, encouraged vigilantism both before and even during his presidency in ways that actually promoted the undermining and dispersal of the state’s monopoly on “legitimate force.” It would be naive to see the privatization and democratization of force as something uniquely promoted by Trump, however. Indeed, this process has only gathered strength and speed since he left office. A major goal of what passes for “conservatism” today is the weakening of the state in virtually all of its forms and the return of its powers to private, and individual, hands: a sort of violent libertarianism.
The most prominent example of this so far is Texas’s antiabortion law, known as SB 8 (formally the Texas Heartbeat Act), which—constitutional issues aside—outsources a core government function and obliterates geographic boundaries as a key component of “statehood.” (The same is true of Texas’s acceptance of South Dakota National Guard units for border patrol duties, a scheme dreamed up and paid for by a private funder.12) Under SB 8, the enforcers, the new “state” power, comprise all those who have reason to believe that someone has had an abortion in the state, even private citizens, even nonresidents of Texas—imposing a regime that is actually opposed by a plurality (48 percent) of Texas residents.13
This is not simply an undemocratic form of government. It is a nonstate form of government, in the sense that “states” have been understood to be geographically defined for the last four centuries. That trashing of the modern state and its institutions of government is very much of a piece with the larger changes in the nature of government in the world resulting from the new techno-intellectual construct. Today’s radical inversion of conservatism has placed it, ironically, at the cutting edge of the future.
It hardly should be surprising that more and more people now believe that they are entitled to their own state. If you dislike federal policy and you have the means, you can buy your own army and send it to the border to conduct your own foreign policy. (Recall that Aaron Burr was indicted for treason for doing just this.) For the rest of us, there’s good old-fashioned secessionism—popular with voters everywhere, but particularly in eastern Oregon, where conservatives just want to escape the liberalism of the Portland metro area to join up with more like-minded souls in Idaho.14 It’s not just conservatives who want their way, though. Talk of secession moved from southern states under Barack Obama to Massachusetts, New York, and California under Trump. California, in particular, did its best to all but secede from the federal government’s constitutional supremacy during the Trump presidency, dissenting from Trump’s withdrawal from the Paris Accords and more-or-less claiming nation-state status to deal with foreign governments as a peer, at least on that issue. Again, this is not just an American issue: Secessionist sympathies are rising everywhere from Spain to Scotland to Saskatchewan. Such recent phenomena as Brexit and Trumpism have been interpreted as essentially “nationalist” movements, when they are more truly about breaking away and splitting apart.15
The endpoint of this impetus is—to borrow a phrase from American financial adviser James Dale Davidson and British Eurosceptic William Rees-Mogg—the “sovereign individual.” In the political realm, this has been emerging globally for some time at the party, rather than state, level, since—at least as long as states retain the monopoly on legitimate force—it is easier to form your own party than your own state. In Europe, where the electoral rules do not militate in favor of a two-party system as in the United States, practically any tendency develops its own party. Americans, essentially stuck in one party or the other, simply insist on their own agendas.
And why not? This is not just good old American individualism: The new reality is that we are increasingly able to create and live entirely within our own full-on reality. The much-hyped metaverse is the latest iteration of this concept. Within a few years, virtual reality, augmented reality, and their manipulation through “deep fakes” will take alternative facts to a new, all-encompassing level. In short, authoritativeness is dying, along with reality.
Last summer, writing in The Atlantic, David Brooks argued that working-class Americans’ resentment of educated elites is based on real grievances. “In revolt,” Brooks elaborated, “populist Trump voters sometimes create their own reality, inventing absurd conspiracy theories and alternative facts about pedophile rings among the elites who they believe disdain them.”16 Brooks has it backward: Today’s technologies provide those who want to turn “alternative facts” into reality—and to share that new reality with others—with the ability to do so far beyond anything possible in the past. That is the revolt.
The most significant manifestation of this ability to create one’s own reality—so far—has been the Capitol insurrection of January 6, 2021. The events of that day, which came as a bolt from the blue to so many Americans, were in fact the culmination of months of discussion and planning among so many other Americans on Internet channels all their own. Since then, it has become clear to everyone that half the country lives inside an alternative reality where one can believe whatever one wants, accept it as real, and act on it IRL—in real life. (Of course, which half that is depends on whom you ask.) It turns out that the late senator Daniel P. Moynihan was wrong: Not only are you entitled to your own opinion; you are now also entitled to your own facts.
In some ways, this is a good thing. The vast majority of people are now able to imagine the world as they want it to be, to work with others to attain that reality, and to believe that they have done so. In many cases, they actually have. We indeed are living in a time of tremendous freedoms and, in the democratization of virtually everything, a Golden Age of Democracy.
The vetocracy17 emerging from this hyperdemocratic state of affairs—in which we all expect to get everything we want, including blocking anyone else who wants something different—is, however, both illiberal and antisocial. As “reality” has become more and more personalized, the right and necessity to protect that reality against alternative realities has become more compelling. Trump’s encouragement of his supporters to armed insurrection was merely a reflection of the coming “democratization” of force. It is a forewarning of the future of government—of “legitimate force”—itself, when, as Benjamin Wittes and Gabriella Blum argue in their book, The Future of Violence, we all will have our own drones, WMDs, or worse.18 The current threat, in sum, is not really a centralized “authoritarianism.” It is a dispersed reality and a nearly universal intolerance of everyone else’s. All of this is incubating within a social framework shaped largely by communications channels constructed specifically to encourage anger, disagreement, and the spread of rumor and falsehood, because these propagate most readily and thus generate the most revenue for their owners.
Unfortunately, the companies that channel our use of these technologies in this way are just a part of the economic changes rending the fabric of day-to-day reality for millions. As the digital economy displaces the analog, the value of the weightless surpasses that of the tangible, and the virtual supersedes the terrestrial, the old economy is fading, its industries, jobs, and communities with it. People rooted in the older economy are enduring the destruction of everything they have known and cared about. As a result, they are angry.
These economic changes, as in all prior such “new economies,” are resulting in greater aggregate wealth but also rising inequality. Historically, as capital has become less rooted in land and more intangible, it has grown more concentrated; we are reaching the apogee of that tendency. The old hard-won, broadly distributed middle-class economy is collapsing, its breakdown exacerbating the festering resentment already described. Not only has the foundation collapsed underneath most people, so has what was left of the social safety net. As with all such economic transitions, the existing methods of “social insurance”—which over time emerge to mitigate the risks and unfairness of each “new economy”—are being destroyed along with the other social arrangements that spawned and underpinned them. In prior transformations, these were the extended family, the village, and civil society; today, it is the welfare state that originally evolved to counter the worst depredations of the Industrial Revolution. Of course, the constriction of the welfare state only further undercuts its popular support, weakening existing governments worldwide.
Unsurprisingly, thanks to all these digitally driven developments, reaction and division in every form imaginable are sweeping through our society like a tsunami. In their seminal recent book The Dawn of Everything, anthropologist David Graeber and archaeologist David Wengrow go to great lengths to reject the standard evolutionary argument of technodeterminists like Francis Fukuyama, Jared Diamond, and Yuval Noah Harari that human society has progressed more-or-less linearly from a Hobbesian “State of Nature” to the more advanced forms of governance we know today. It may very well be that the technodeterminists all actually have it backward: The current march of ever more advanced technologies is in fact driving an evolution in human societies to a “State of Nature.”
The World We Can Create
If technology isn’t destiny, then what can we do about all this?
We can tinker with the mechanics of the existing nation-state by, for example, abolishing the Electoral College or reining in gerrymandering. We can reform the tech industry or the broader economy within the current nation-state framework. These are all worth pursuing—in the same sense that, if you want to retain your legacy vinyl albums and eight-track tapes for nostalgia’s sake, you might as well make sure your turntable and tape player are functional.
It might be more worthwhile, though, to think about alternatives to the models of social organization, global governance, and economic structure emerging from our current technological tectonic shifts: The challenge during the next decade will be not to cling to some idealized vision of the world we have known, but, rather, to accept the coming changes and shape them for the better. Here is what that can mean.
Politics and Markets
Politics and economics are not separate. They are interchangeable mechanisms, like mass and energy. There is little meaningful difference between businesses and governments: Governments are subject to the same forces as businesses—demand for their services, ability to generate revenue, the rise of competitive alternatives. The difference is that governments can use guns and jails to retain their customers and resist change. Hence Max Weber’s famous definition of the state as an enterprise that “successfully upholds a claim to the monopoly of the legitimate use of physical force” (emphasis added). Keep both “monopoly” and “physical force” in mind.19
What scholars Daron Acemoglu and James A. Robinson describe as extractive forms of government—those with institutions structured to suck most of the resources and personal autonomy out of most individuals and concentrate them in a powerful few20—are essentially the same as extractive economies (generally conceived as centering on resource extraction such as agriculture or, more commonly, mineral and fossil fuel mining, although today these activities include extracting data, and its economic value, from the vast number of individuals for the benefit of the powerful few).21 Indeed, extractive governments and extractive economies tend to go hand in hand—even more so as the artificial distinctions between public and private, specific to the “modern” age, further erode. We therefore need to develop what Acemoglu and Robinson call “inclusive”—democratic—economic institutions in order to preserve democratic political institutions, and vice versa.
The line between the public and private sectors is, like all lines these days, vanishingly thin under the pressure of digital technologies. Governance increasingly is and will continue to be carried out by entities other than what we think of as “governments,” that is, territorially defined states. When I started arguing, over a decade ago, that Facebook was a form of government, my students laughed; now—with Mark Zuckerberg tracking your every online move, setting up his own court system, and dreaming of his own air force—not so much. Ultimately, politics and markets are simply two different forms of societal decision-making, the former centralized and communal, the latter decentralized and individual, but each capable of being more—or less—democratic.
People will be able to choose from a variety of “governance” providers, few of which will have anything to do with their physical “jurisdiction.” The meaningful distinction will be not public or private, but analog or digital: Territorially based governments will become more and more local in focus. Meanwhile, most of what modern governments do—providing schools, nursing homes, childcare, health and income insurance, retirement savings accounts, even package delivery—does not need to be done by governments at all, and, throughout most of history, was not. As technology untethers these services from geography and thus territorial states, those services will come from other providers. Governments as we currently know them—geographically delimited and compulsory—will supply localized, largely physical public goods such as water and sewer systems, parks, and fire departments. Other elements of the twentieth-century welfare state, human capital investments such as education and income supports, will be assumed by nonstate actors. As they already are. Modern nation-states will be destabilized and—like government functions and most other things today—unbundled. Consumers will have a broader choice of “governments.” In some instances—businesses that want to become e-Estonians, shippers on the high seas who wish to sail under the flag of a less regulatory state, wealthy individuals who purchase European Union passports and protections through Bulgaria—they already do.22 In the future, choosing among government providers of services will be as easy and widespread as switching from one wireless network operator to another.
In that fast-approaching future, anyone who chooses not to purchase various services will be able simply to press a button to “join” a different “government.” Compulsory support for public goods will erode. You will be able to live in a tax-free libertarian bubble, if you choose. But then you won’t be reaping the benefits of joining a community, such as the increased overall standard of living of a better-educated, higher-earning society, or even the privilege of driving on communal roadways. (In the near future you won’t be able to drive on a road you don’t pay to use, or park in a space where you’re not allowed, because your self-driving smart car won’t let you.)
Plenty of people dislike this prospect because of what it means for public goods, especially the reduction of inequity. They believe that we need government as we know it in order to overcome the unfairness of the market. One must ask, however: How good a job does government do at that now? As the historian Walter Scheidel has demonstrated, redistribution to reduce inequality has occurred in human history only as a byproduct of one of three destructive processes: war, revolution, or widespread disasters such as plagues.23 Otherwise—unsurprisingly—the rich and powerful find a way to dominate governments just as much as markets and use them to produce what economists call “rents”: excess benefits for themselves. Government policies in the United States—particularly tax cuts aimed primarily at high-income individuals and corporations, and cuts in welfare spending—have in fact been exacerbating inequality for the last fifty years.
However, the same digital technologies undermining governments today also make it cheaper and easier to aggregate large numbers of people; to digitize, metricize, and monetize—and thus capture the value of—virtually any activity; and to exclude free riders. These are all, basically, descriptions of what “governments” traditionally do. It thus is becoming increasingly possible to devise nongovernmental—in other words, nonterritorial and noncoercive—business models to fund public goods, particularly human capital investment (or what the government-obsessed might call “social programs”), something that occupies a good deal of my thinking.24 Others are at work on this, too.
Whichever form of society prevails in the long run (and I’m fairly confident it will be the high social investment model, because—whether you like this rationale or not—it produces higher returns), this will be decided through highly democratic means, though in the form of individual decisions in the marketplace rather than collective decision-making in the voting booth. We therefore need to develop the economic and technological models to realize both this revolution in the nature of democracy and the preservation of public goods and social investments—because the alternative is simply the demise of democracy as we know it, without any twenty-first-century replacement.
And that makes the business model surrounding digital technologies crucial to the future of not just the economy but also governance and democracy.
Democracy and Monopoly
The threat of authoritarianism is at least as strong from the new corporate governments of the virtual world as from state governments. Most of the fear of the New Digital Authoritarianism has focused on the rise of the surveillance economy and the threats posed to privacy today by everything from the electronic trail left by all of our online activities to the recording by the ubiquitous camera lens of what is left of our activities IRL. Of even greater concern should be the growing ability of researchers to identify ways to use these inputs first to predict and, soon, to channel our behavior. But even the most nefarious capability can be rendered relatively benign if people have the ability to avoid it. Which in turn raises the issue of choice, and its nemesis: monopoly.
Monopoly, government, and democracy are intimately related. The key issue defining “government” is not control over territory, per se, but the ability to enforce one’s edicts. Control over territory helps with that—if you cannot run away, you have to comply—but it is ultimately the ability to compel people, not the hold on mere territory, that makes a government. The lack of meaningful alternatives is what makes Facebook a form of government (and as hated, according to polls, as the IRS). Eventually, people will demand and find better alternatives to the feudal autocracies currently governing digital space, just as they did with territorial governments—obtaining either the right to a say or the ability to go elsewhere, what the economist Albert O. Hirschman identified as “voice” and “exit,” the two bases of any legitimate system.
New and better business models that provide these consumer options—what in the political context we call “rights”—will emerge naturally, if given the opportunity to do so. After all, better search engines than Google existed in the past. I used to love AltaVista, but it was bought out and eventually shut down (by Yahoo, as it happened). That is exactly why we need traditional states to act, while they still can, to lower barriers to entry and to enable new companies, technologies, and business models to compete with the existing monopolists and oligopolists, through measures such as interoperability requirements among networks.25 We need renewed traditional antitrust enforcement, including prohibitions on buying out competing technologies and models, cross-subsidizing products and services in order to price competitors out of business, and monopolizing market share. We need public and private sector initiatives to encourage, design, and fund alternative business models in which people, not their providers, can control their data, their privacy, and their identities—or not, as they choose, selling these “for their own account” rather than simply letting the tech platforms extract the value for themselves.
Critics of Elon Musk’s move to acquire Twitter contend that his hostility to content moderation will mean that, to quote the poet William Butler Yeats, “mere anarchy is loosed upon the world,” while Musk’s fans on the right are celebrating for much the same reason. The real issue, however, is that if, as Harvard law professor Lawrence Lessig famously proclaimed, “code is law,”26 then algorithms are government: Musk claims that he will free Twitter users by making Twitter’s “algorithm” public—but while being transparent makes for better government, if Musk were really such a libertarian and avatar of freedom, he (and other tech titans) would give users control over their feeds and allow them to choose their own algorithms for what they saw and read. But—like old-style state media—he, and they, don’t.
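The point is easy to make concrete. A feed “algorithm” is, mechanically, nothing more than an interchangeable rule for ordering posts; the hypothetical sketch below (invented names, not any actual platform’s API) shows how trivially a user could be handed that choice:

```python
# Hypothetical sketch: a feed whose ranking rule is chosen by the user
# rather than imposed by the platform. All names here are invented.
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class Post:
    author: str
    text: str
    likes: int
    timestamp: float  # seconds since epoch

# A "feed algorithm" is just a sorting rule over posts.
FeedAlgorithm = Callable[[List[Post]], List[Post]]

def chronological(posts: List[Post]) -> List[Post]:
    return sorted(posts, key=lambda p: p.timestamp, reverse=True)

def most_liked(posts: List[Post]) -> List[Post]:
    return sorted(posts, key=lambda p: p.likes, reverse=True)

def render_feed(posts: List[Post], algorithm: FeedAlgorithm) -> None:
    for post in algorithm(posts):
        print(f"{post.author}: {post.text}")

posts = [
    Post("a", "older but popular", likes=900, timestamp=100.0),
    Post("b", "newest", likes=3, timestamp=200.0),
]
render_feed(posts, chronological)  # under this rule, b appears first
render_feed(posts, most_liked)    # under a different rule, a appears first
```

That no major platform offers anything like this is a business decision, not a technical constraint.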
For years, Facebook wrestled unsuccessfully with the challenge of creating a profitable business model for social media before hitting on precisely the extraction and commodification of user data now emulated by the tech giants and virtually all platforms. So the fundamental political question of our age is whether there in fact can be any profitable model for networking people that is not extractive. The answer is: Yes, there is—and it is what we’ve traditionally thought of as “government.”
Governments naturally bring people together (or, to put it better, aggregate them) in order to provide public goods and then pay for them by collecting taxes—all of which naturally leads to communication systems. (Early writing, for example, from Mesopotamian cuneiform to Andean quipus, originally developed strictly for the maintenance of tax records.) Governments are able to cross-subsidize the provision of services that do not necessarily generate money (like welfare programs) with services that do. (Delaware has long funded the government from the sale of services desired, and paid for, by out-of-staters, such as stable corporate governance and a short stretch of the New York-Washington highway corridor.27) Conversely, through the inherent capability to layer further “apps” on top of highly trafficked platforms, government has the means to build profit-making enterprises (e.g., the ancillary services and products post offices traditionally have provided for purchase, such as gift-wrapping, photocopying, and wire transfers).28
Among the many public goods that governments traditionally have provided in this fashion, but that are now under assault by the atomizing nature of the new technologies, perhaps the biggest casualty is trust. Government regulation of untruth on the Internet is unlikely to remedy the problem, both because limiting free discourse rarely itself leads to truth and because tech companies with the current business model cannot be regulated into being truth producers any more than coal companies can be regulated into becoming clean air producers. Rather, what is needed is a business model that does not feed on anger and misinformation in order to generate profits.
What is that model? An entity that has already built an organic community—rather than Mark Zuckerberg’s ersatz vision of one—upon which to layer a large social media enterprise without caring how to extract money from it. Such a rights-respecting, inclusive model, built upon a true community, would provide the remedy for the disappearance of trust, privacy, and autonomy from today’s economy along with all other public goods. (The Roman Catholic Church, Sony—the world’s most-loved corporate brand—New Zealand, Manchester United: All undergird such communities and could form the “backbone” of a successful social network. So, too, could a political or economic entity built similarly on shared values.) A model embracing inclusion and respect for rights would break the hold of the extractive and destructive tech model built by Facebook, Amazon, and the others. It would aggregate people for their mutual benefit, rather than, as Amazon does, to monopolize both sides of every transaction for its own benefit. It would naturally lend itself to political communities that aggregated citizens to produce public goods and social investments, consumer co-ops that aggregated households, or labor “unions” that aggregated workers virtually.
That would be the first step toward a more democratic economy. Meanwhile, capital goods are becoming more widely accessible: One need not own a factory now to produce products with limited factory runs, or a hotel to offer commercial lodging, or a taxi medallion to offer ride services—thanks to the fractionation of practically all goods and services. Blockchain’s ability to make commitments self-executing will enable lenders to shut down and automatically retrieve a car if a payment is missed, thus making the car its own collateral and making lending less risky. Such a model is already being applied to make solar energy systems available and affordable to villagers in sub-Saharan Africa. Soon, this “unbundled” economy will be everywhere: Authentic, the 2020 Kentucky Derby winner and Horse of the Year, had more “owners” than any winner of the “Run for the Roses” in history, thanks to crowdfunding through a new online, fractionated thoroughbred ownership model.29
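To return to the self-executing car loan for a moment: the mechanism is simple enough to sketch. What follows is a hypothetical illustration of such a commitment, not any actual lender’s or blockchain’s system:

```python
# Hypothetical sketch: the car as its own collateral. A missed payment
# disables access automatically; no repossession agent is required.

class CollateralizedCarLoan:
    def __init__(self, balance: float, payment: float):
        self.balance = balance
        self.payment = payment
        self.enabled = True  # the car starts out drivable

    def pay(self) -> None:
        self.balance = max(0.0, self.balance - self.payment)
        self.enabled = True  # payment restores access, also automatically

    def miss_payment(self) -> None:
        self.enabled = False  # the commitment enforces itself

loan = CollateralizedCarLoan(balance=12_000.0, payment=400.0)
loan.miss_payment()
assert not loan.enabled  # the car will not start until the borrower pays
loan.pay()
assert loan.enabled
```

The pay-as-you-go solar systems mentioned above work on essentially this principle, with the enforcement logic bound to the hardware itself.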
At the same time, people may seek capital ownership less and less: Why buy a car when you can get one on demand? And while millennials are less likely than preceding generations to be able to afford to buy a house, they also—being more mobile and more focused on experiences than things—are less likely to find homeownership compelling.
This presents the possibility of a more democratized economy in which capital is in high supply and low demand. The alternative, a world in which capital grows even more concentrated, when combined with an automation-driven decline in the demand for labor, would leave the great mass of people with dwindling routes to resources of any type. Fostering new potential “business models”—both commercial and governmental—like those cited above would promote democratization.
Without technology-driven capital dispersal, pressures will only mount for some new form of redistribution, just as the modern welfare state arose to mediate the distributional inequities of industrial capitalism. While it will be different from anything we have seen in the past, we can be pretty confident that it will be a major component of whatever it is that we will call “government” in the future, and likely will be intimately related, as always, to the resolution of the other great constant in human affairs: the use of force and violence.
War and Peace
The fundamental challenge of human governance—the ultimate “public good”—is physical security. This will not change in the virtualized world under discussion here, at least as long as people remain physical beings. I have posited a system of “governments” some of which regulate spaces and others of which “follow the person.” But won’t that only set up more conflict? No more so than the overlapping jurisdictions and sovereignties of our current governmental systems, which have been mediated for centuries by treaties as well as complex regimes of both laws of conflict and conflict of laws. Overlapping auto and health insurers manage to sort out responsibility when your neighbor rams your car; overlapping “government businesses” will be able to do the same. The word international—meaning the very conception of a world stitched together of competing territorial governments—did not even exist until 1780 (a coinage of the Utilitarian philosopher Jeremy Bentham); we will be able to get by quite well with other conceptions of conflict and cooperation in the future.
Indeed, the great global conflict today already is not so much between different countries as between different cultures that cross, and coexist within, existing national borders—one culture fluid and amoebic like the technology on which it rests, the other reacting against the spread of the former. As Putin’s invasion of Ukraine clearly reflects, the extractive regimes currently ruling Russia and China (as well as reactionary actors everywhere) still view world domination in terms of territory and natural resources. Despite their superior appreciation of the value of cyberwarfare in destabilizing a country, they still conceive primarily of undermining competing governments’ control over territory.
Putin is in other ways, as well, very much the classic general fighting the last war—or, rather, the last century's war. Russia failed to anticipate the financial—meaning, essentially, virtual—warfare the West has trained on it. War in the virtual realm—not just the ones and zeroes of financial accounting, but also information warfare and cyberattacks on both virtual and physical infrastructure—is the future, and increasingly the present. State-on-state aggression has declined markedly in recent decades, which is what makes Putin's criminal territorial invasion seem so anachronistic. But if we are not at war with our adversaries in the real world right now, it is largely because nowadays we are at war with them, every instant, in the virtual world.
Cyberwar, like everything digital, blurs almost all lines: between what is and is not war, between international and domestic, between public and private, between virtual and physical. As Wittes and Blum argue, the same technology that democratizes threats also democratizes defense, “distributing” the nation-state’s activities across a wider range of actors, notably the private sector providers of the “pipes,” both traditional utilities and information technology, on which modern society now depends. Current and emerging technologies expand the battlefield not just to all actors in all places, as Wittes and Blum insist, but at all times.
How do we guard in the future against this form of constant warfare escaping the confines of virtuality, destroying physical infrastructure (dams, entire power grids), and wreaking untold economic destruction, not to mention engendering precisely the type of civil discord and societal dissolution we are currently experiencing? In the mid-twentieth century, traditional thinking on nuclear defense called for “hardening” targets—for instance, investing in “a nuclear-resistant buried cable network [that] would cost $2.4 billion,” writes Andrew Keen in The Internet Is Not the Answer. But a young RAND analyst, Paul Baran, had a different idea, a “user-to-user rather than…center-to-center operation,” a “distributed network” that “would be survivable in a nuclear attack because it wouldn’t have a heart…. It would have no heart, no hierarchy, no central dot.”30 The answer, the title of Keen’s book notwithstanding, was the Internet. Today’s blockchain technology essentially takes the modeling of “defense” as dispersion and redundancy a step further. Increasingly, dispersion—making potential targets “softer,” or more ephemeral and diffuse, rather than “harder”—could be the conceptual infrastructure of the future. In what I’ve called the “Holographic Society,” physical and virtual redundancy—distributing the defense against distributed threats—can deter attack by rendering it futile: No matter how successful your assault on any of its physical components, the system, state, society, or culture you attack will survive.31
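Baran's insight is easy to see in miniature. The sketch below is a toy comparison—the ten-node topologies are illustrative assumptions, not a model of any real network: knock out the hub of a centralized network and it dies, while a distributed mesh simply routes around the same damage.

```python
from collections import deque

def reachable(adj, start, removed):
    """Count nodes reachable from `start` after the nodes in `removed` fail."""
    seen, queue = {start}, deque([start])
    while queue:
        node = queue.popleft()
        for nbr in adj[node]:
            if nbr not in removed and nbr not in seen:
                seen.add(nbr)
                queue.append(nbr)
    return len(seen)

n = 10
# Centralized ("hardened heart"): every node talks only through hub 0.
star = {i: {0} for i in range(1, n)}
star[0] = set(range(1, n))
# Distributed (no "central dot"): each node links to four nearby neighbors.
mesh = {i: {(i - 2) % n, (i - 1) % n, (i + 1) % n, (i + 2) % n} for i in range(n)}

# Destroy node 0 in each topology and see what node 1 can still reach.
print(reachable(star, 1, removed={0}))  # 1 -> the star network collapses
print(reachable(mesh, 1, removed={0}))  # 9 -> the mesh survives intact
```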
Can you actually “virtualize” or “distribute” an entire country? Well, yes—if you think differently about what a country is. Estonia, for one, is certainly trying, essentially making its entire government and society “redundant” in case of attack. Although Ukraine is much less technologically advanced, if it prevails in the long run, it will do so because its people continue to win the global information war, refuse to acknowledge the legitimacy of Russian overlordship, and relocate abroad in large numbers. Putin will be left controlling territory, which is his main focus, but neither the people nor the future. But the leader in this is actually the United States, which already has become a virtual, distributed country: American culture and values, whatever the current vicissitudes of the American state, are ever more broadly dispersed, having essentially conquered the world. The more this is true, the more pointless a physical attack. That, more than our government, is what truly threatens our antagonists today.
So what will conflict look like in the future? In a world where survivability is more dispersed, threats will evolve as well. In the pre-9/11 world, stability could be achieved because weapons of mass destruction were concentrated in a few large states, whose rational actors were deterred by the likelihood of mutually assured destruction (MAD).
Game theory provided the intellectual construct for the Cold War strategy that resulted in MAD; it also can provide a helpful way (as well as helpful catchphrases) for thinking about a greatly dispersed world of force and power. There is a standard analytic "game" in which some players are "chickens" while others adopt "deadlock" behavior, willing to blow up themselves and everyone else if they do not get their way: This is essentially the end state of our emergent vetocracy, one in which widely available drones, chemicals, biological agents, and hacking will mean that anyone who wants to wreak state-level havoc will be able to do so. In game theory, this is called the "bully" gambit: The risk-averse "chickens" quickly give in to the player threatening "deadlock." The more reckless one is—or appears to be—the better one emerges from such negotiations.
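The arithmetic of the gambit is simple. In the toy payoff matrix below (the numbers are illustrative assumptions, not drawn from any cited model), a player who credibly commits to escalation forces a rational opponent to yield, because a small loss beats mutual ruin; the trouble begins when every player adopts the same commitment.

```python
# Toy game of chicken with a "bully" gambit. Payoffs are illustrative:
# (row player, column player), where each chooses YIELD or ESCALATE.
YIELD, ESCALATE = "yield", "escalate"
PAYOFFS = {
    (YIELD,    YIELD):    (0,   0),    # both back down: uneasy standoff
    (YIELD,    ESCALATE): (-1,  2),    # the committed escalator wins
    (ESCALATE, YIELD):    (2,  -1),
    (ESCALATE, ESCALATE): (-10, -10),  # "deadlock": mutual ruin
}

def best_response(opponent_move):
    """The row player's rational reply to an opponent's committed move."""
    return max((YIELD, ESCALATE),
               key=lambda my_move: PAYOFFS[(my_move, opponent_move)][0])

# A "bully" credibly commits to escalation; the risk-averse reply is to
# yield, because losing a little (-1) beats mutual destruction (-10).
print(best_response(ESCALATE))  # 'yield'
print(best_response(YIELD))     # 'escalate' -- recklessness pays, until
                                # everyone else becomes a bully too
```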
This was Richard Nixon's "madman theory" for dealing with the North Vietnamese; Trump claimed it as his modus operandi, as well. But the gambit works better in two-player (i.e., major state) contests than in ones involving countless WMD-armed actors—many of whom will not knuckle under to the bully. Such a world of widespread, distributed "deadlock"—where we increasingly find ourselves today—may, for all its unpleasantness and contentiousness, have a silver lining: More than anything, vetocracy ultimately represents a sort of hellish mirror image of the "heaven" envisioned over fifty years ago by the philosopher John Rawls.32
In Rawls's famous formulation, individuals in some prebirth "original position," not knowing their eventual social status, would negotiate a better world, one not necessarily of perfect equality but of at least a minimum level of "justice" for the worst off in society. In a world of widely distributed MADness, participants are likely in the long run (assuming, of course, that we aren't all dead by then) to negotiate precisely the same kind of cold peace, in which redistribution (rather than Rawls's pre-distribution) results in everyone's attaining what negotiation theorists call his or her "reservation price" (or what policy wonks today might call a guaranteed minimum income) as the price of societal or global peace: what Rawls, borrowing from game theory, called "maximin"—choosing the option whose worst outcome is least bad.33
In short, the democratization of force and violence provides just as viable a path, over time, to a greater democratization of power and economic resources—of mutually assured survival—as it does to Hobbes’s state of nature.
Culture and Identity
The struggle in which the world is now enmeshed transcends individual nation-states. It is not so much a “clash of civilizations,” in the sense that Samuel P. Huntington famously framed it, but rather one between cultures—an emerging digital culture different from anything we have known until now, and a culture reacting against all of this. The latter is not simply an ironically global antiglobalist movement. Pace Huntington, it encompasses a wide range of historical cultures—fundamentalist Islam, the Christian traditionalism of nationalist movements across the West, and the Eastern Orthodox authoritarianism of Vladimir Putin, as well as the nonreligious (though similarly sex-obsessed) ideologies of China and North Korea.
In short, 9/11, the Capitol insurrection, China’s suppression of Hong Kong and designs on Taiwan, and the Russian invasion of Ukraine are all expressions of the same global reaction. It is a reaction against liberalism, rationalism, and modernism, exacerbated by the extreme disaggregation, individualism, and relativism the digital revolution has thrust upon our analog world.
But that revolution is equally subversive of Enlightenment culture and everything it has represented: liberalism, individual autonomy, democracy, free markets, rationality, and the essential belief that all of these lead humanity toward not chaos but rather a collective and higher Truth, a Greater Good. This culture has not been without its fair share of criticism already, and not just from reactionaries: capitalism, for its relentless degradation of non-commercial, non-material, and non-individualist values; liberalism, for its undermining of community, consensus, and constraint; even modern science, for its reductionism and relativism. Today’s increasing alienation, hyper-democracy, intolerance and “bully” culture, abrogation of all authority, collapsing of all categories, digitization and commoditization of everything from human wellbeing to the entire natural world, and the negation of meaning—all raise the question: Are these simply the logical, indeed inevitable, outcomes of scientific rationalism, liberal democracy, and market economies?
What Is To Be Done?
Or is it possible, instead, to construct a capitalist economy that honors nonmaterial values, including a decent regard for humanity and the natural world? A rationalist science that provides a sense of greater meaning to human life? A universalist ethos that nurtures a satisfying sense of identity? A liberal society that undergirds a communal culture? A truly democratic and diverse polity that is actually still governable?
The virtual communities of choice that I believe will develop to provide previously “governmental” services flow from technological and economic forces—their ability both to aggregate individuals and to fence out free riders—but this ultimately will create true communities: People will aggregate in these virtual conurbations because of their desire for shared services, shared economic interests, and shared values.
To some degree, such "community" already exists on the Internet, though less on self-consciously "social" media than, for example, on eBay, where, although it was constituted solely for commercial purposes, you nonetheless can find social groupings that share your interests—in, say, collecting stamps, Ming vases, or medieval weaponry. The future virtual communities I imagine, however, will create, benefit from, and share with others the public goods and societal (not just social) structures that are increasingly being abandoned by territorial governments: Social services, human capital investment, open spaces, and public safety are all public goods that emerge from the formation of real communities. They are economic functions, but they are also social constructs, values expressions, and identity generators.
Despite the current crises of identity engendered by the undermining of categories tied to presumably immutable, physical, real-world distinctions—gender, race, nationality—some of the strongest identities and connections that people form (such as religious affiliation) already transcend the tangible. The digital world has intensified the formation of such virtual affinities: Today's many-to-many technologies allow an endless number of subcultures to form and to attract adherents who otherwise would have felt isolated and alone within prior technologies' homogenized mass culture. This increased accessibility can be both positive and negative, prosocial and antisocial, depending on one's perspective: It allows both lonely queer teenagers and lonely neo-Nazi teenagers to discover that—unlike in three-network Leave It to Beaver Land—they are not alone. It should hardly be surprising that, with the technological revolution's simultaneous assault on old identities and cultures and facilitation of new ones, identity and culture have become the defining issues of our moment. But the breaking apart of current identities is also providing the crucible for the formation of new categories, new combinations, and new identities that will take their place.
This process can include the generation of new politicoeconomic systems. Individuals who come together online to support the economic provision of public goods—such things as social services, human capital investment, and environmental preservation—will share not just an economic interest, even if that is the primary basis of their relationship, but also a set of values and, ultimately from that, a culture and identity, just as In Real Life. By allowing the construction of online communities and cultures where not only are values driven by economics, but economics are driven by values, new political economies will emerge. This is already happening to a large degree, on both the left and—perhaps more noticeably—on the right, with extensive cause-driven financial investment ecosystems.34
The unfortunate thing about capitalism is the extent to which capitalists have given it a bad name. As the Princeton historian Harold James has noted, capitalism, broadly defined as a market economy facilitating voluntary exchange, has existed everywhere, at all times, even under regimes that have purported to extirpate it, such as communism, prisons, and professional baseball.35 If "capitalism" is limited to mean merely the private ownership of the means of production, it became problematic only as the preeminent mode of production and source of wealth shifted from land to factories to machinery to finance, each successively easier to concentrate in fewer hands. This has led to the current crises of capitalist, liberal, rationalist society: its alienation from meaning, its extreme inequality, its undermining of democracy, and, ultimately, its total consumption of our planet itself.
The truly sad, and often overlooked, endpoint of Francis Fukuyama's "End of History" thesis was its ironic prediction that the ultimate triumph of liberalism, democracy, and free markets would resolve the world's struggles so completely that a renewal of old-fashioned nationalistic wars would be needed to manufacture some sense of greater meaning in people's lives. The opposite seems more nearly the case: It is the ultimate triumph of liberalism, democracy, and free markets in the form of Twitter, Facebook, and PornHub that has unleashed a global yearning to fill the moral void by sublimating individual fulfillment to that of the collective, promoting centralized authoritarianism as the authentic expression of the collective will, and reasserting traditional beliefs as necessary to social stability.
But there are other paths to meaning and other mountains still to climb—from biomedical advances to reducing exploitation to saving the planet. The new technologies are providing platforms that enable people to come together over these causes, just as they have over all the other problems discussed thus far. While the new technologies have led to speculation that a robot invasion will kill most jobs (an outlook I happen to share), the tremendous wealth and cost reduction that will result creates the possibility that most people will be able to meet their material needs by working less and putting their remaining "work" time into noneconomic pursuits. As the Harvard social scientist Yochai Benkler has demonstrated, the New Economy has already enabled some people to engage in such models of "commons-based peer production," in which large numbers of people work cooperatively and often without compensation.36 Of course, that presumes that all the surplus value isn't absorbed by the owners of capital and then hoarded, unfortunately a likely prospect unless the maximin scenario comes true.
One of the reasons the return on capital outpaces the growth of the economy as a whole—spurring greater inequality—as Thomas Piketty has observed, is that the holders of wealth are able to use it for "rent-seeking," the technical economic term for using government to extract noncompetitive advantages. To the extent that digital technologies lead to the decentralization and disaggregation of capital, however, making it cheaper and more widely available, they also will erode both the use of political power to distort the marketplace and the use of concentrated capital to distort democracy.
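A few lines of arithmetic illustrate the underlying dynamic. The figures below are illustrative assumptions (a 5 percent return on capital against 1.5 percent economic growth), not Piketty's data; the point is only that a small, persistent gap compounds into steep concentration.

```python
# Illustrative only: when the return on capital (r) exceeds economic
# growth (g), wealth steadily outpaces income.
r, g = 0.05, 0.015
wealth, income = 100.0, 100.0   # start them equal, in arbitrary units

for year in range(50):
    wealth *= 1 + r
    income *= 1 + g

print(f"wealth/income ratio after 50 years: {wealth / income:.1f}")
# Prints about 5.4: a modest gap, compounded, quintuples relative wealth.
```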
What, then, of democracy? I doubt it will survive—at least in the form we know: In this brave new, disaggregating, individuating world, where you can make your own realities and construct your own communities, you do not have to be part of anything with which you disagree. So why should you have to submit to the will of the majority? Soon, you will not: You will simply join a different community, culture, country, or .com (or different ones for different purposes) in which everyone shares your values and identity. This is not democracy or society as we have known them, but it would be highly democratic and social. It raises the possibility of creating a new liberal and democratic culture (or cultures) based on shared values and choice, in which everything need not be mercenary or commodified. This would be a culture producing societies and lives of purpose and meaning, perhaps so much so that most people could respect (or at least tolerate) the purposes, meanings, and pursuits of others. It would be a world in which we find ways to satisfy human needs without seizing territory or resources from one another, and thus without commodifying and partitioning the very planet and web of life on which we all ultimately depend. It would be a long-term future, in short, of greater prosperity, sustainability, security, freedom, community, and meaning—one, indeed, of greater good.
Whether we attain that greater good, however—or sink into the “catastrophic collapse” with which this essay opened—is not predetermined. Technologies present opportunities and challenges; it is we who determine the outcomes. The only certainty is that, whatever future we choose, it will be dramatically different from the past or present.
A few years ago, shortly after Trump’s election, I gave a talk on this general subject to an audience of computer science students and recent graduates interested in social change. When I finished, the moderator, who had been a contemporary of mine at college, summed up her reaction—the reaction I get from most of my contemporaries—by saying, somewhat warily, “Very interesting. But scary.” So I turned to the young audience and asked, “Do any of you find this scary?” There was a long silence. No one raised a hand. “Of course not,” I continued, “because all of you already know this is what’s coming.” The young people in the room all laughed.
Those who will inherit this future are not frightened by the prospect of today’s political, economic, and social structures all collapsing. Those structures, in their view, have already failed them. They want and need new models to overcome the political, economic, social, and—above all—environmental failures we are bequeathing to them. To help them find these models is the least we can do.