[Editor’s Note: This piece is written by an individual with a long background in the tech community who prefers to remain anonymous.]
Blogging, certainly “citizen blogging,” is dead—haven’t you noticed? So too a host of other enthusiasms and ideas that once seemed poised to transform culture. In 2006, Wired editor Chris Anderson predicted a coming new economy in niche markets, the so-called long tail of online consumerism. Turns out, the long tail of indie bookstores, out-of-the-mainstream music choices, and other products mostly ended up on Amazon, the behemoth sometimes accused of strong-arming smaller companies into contracts favorable to its bottom line. The grassroots, 2005-ish internet, full of opportunity—the sweet smell of revolution and change in the air—well, disappeared. Slowly at first, then seemingly all at once (like falling in love), we now have the Big Five: Google, Facebook, Amazon, Apple, Microsoft.
What happened? Remembering the prophecies for the web in the halcyon days of ten or (better) fifteen years ago is strangely painful and disorienting, like a hangover, largely because we so silently abandoned its ideals. Following endless feelgood clickstreams, we marched happily into big advertising and data monopolies. No one seemed to notice along the way—no bang, and nary a whimper. Further existential angst: Many Millennials have grown up with Big Tech and feel no such pangs of remembrance. Facebook, Twitter, and Instagram were always there, the way older generations remember Kodak moments and microfiche.
Clay Shirky, a writer and consultant who is now a professor at New York University’s Interactive Telecommunications Program (ITP), once penned web 2.0-era best-sellers like Here Comes Everybody: The Power of Organizing Without Organizations (2008) and Cognitive Surplus: Creativity and Generosity in a Connected Age (2010), foretelling the rise of an uber-informed, socially conscious citizen, a new persona of our times. A little jingoistic, his message was still clarion: web denizens were poised to re-write the rule books, ridding the world of stodgy “gatekeepers” like the mainstream press and broadcast media (the horrors!), who unfairly controlled the production and flow of news and knowledge.
“Power to the people” was the trope of the mid-2000s, a meme that spread endlessly in blogs and commentary and on bookshelves—which, fortunately, did not disappear due to a digital takeover by e-books. Yochai Benkler, Harvard professor of Entrepreneurial Legal Studies, proclaimed in his widely read The Wealth of Networks: How Social Production Transforms Markets and Freedom (2006) that a new era was upon us, a kind of revolution where large numbers of networked people would take on collaborative projects online, all for the public good, without pesky requirements like paychecks. Wikipedia seemed to buttress his point, a case of collaborative production without expectation of financial recompense. Wired editor Kevin Kelly (and others) later called Benkler’s paean to online collaboration a “hive mind,” a nod to the social intelligence of bees, without a whisper of irony or derision. Benkler himself prefaced his academically serious rallying cry to the web 2.0 world with a quote from John Stuart Mill:
Human nature is not a machine to be built after a model, and set to do exactly the work prescribed for it, but a tree, which requires to grow and develop itself on all sides, according to the tendency of the inward forces which make it a living thing.

John Stuart Mill, On Liberty (Henry Holt, New York: 1895), Chapter 3, pp. 106–107
Heady stuff. But it has an outlandish pie-in-the-sky feel today, as we bestride the narrow earth with cell phones in hand, obsessing over our “likes” on social media.
Shirky’s ideas have a whimsical and naive feel to them now too. His concept of cognitive surplus captures the insight that, when everyone goes online, they might quit or cut back on mind-numbing activities like watching sitcoms. There’s a surplus of cognitive—thinking—power in the age of the internet which we can turn to good use, like bringing about a social revolution in an Arab Spring, or inventing cures for cancer. His precursor book, Here Comes Everybody, bustled with anecdotes about ordinary people helping the police capture crooks using mobile technology. We can, of course, still pitch in like this with smartphones. But everyday usage is mostly filled with trivialities these days, selfies and so on. Our cognitive surplus has made Facebook, Google, and Apple executives extraordinarily rich while worries about privacy and legal issues with personal data mount. Depressing talk of dependence and addiction dominates discussions about the internet, replacing Shirky’s, Benkler’s, and others’ optimism for change. The tendency of our “inward forces” is toward tweets, it seems. We’re still pretty entrenched in mind-numbing stuff after all.
The feeling of large, untapped powers bubbling into culture reached a fever pitch with Wired editor Chris Anderson’s idea that “big data” (that is: all your personal data, owned by someone else) replaced theory in science. His “The End of Theory: The Data Deluge Makes the Scientific Method Obsolete” appeared on the glossy pages of Wired in 2008, continuing an exciting but ultimately dehumanizing trend from Marxian ideas about surplus time and creativity, liberated from the shackles of gatekeepers, to hive minds buzzing about, making encyclopedias, to Big Data and Artificial Intelligence ridding us of pesky theory in science. “Human nature is not a machine,” said Mill. Benkler’s optimism transmogrified, in the space of a few years, into a machine revolution, rapidly displacing human creativity. Like much of the internet these days, the transformation seems too quick, and foolishly motivated and conceived.
Before the ink dried (so to speak), scientists and other members of the intelligentsia pointed out (cough) that science without theory doesn’t make sense because theoretical “models” or frameworks precede big data analysis and give machine learning something specific to do, to analyze. But the zeitgeist of mid-2000s web 2.0 had turned abruptly away from “power to the people” by 2010. Two years later, in 2012, Facebook would go public, raising over sixteen billion dollars in its IPO. We had ceased to even notice Google, our ubiquitous search engine giant, like keys in our pocket. Less than a decade after James Surowiecki’s 2004 hit The Wisdom of Crowds, the idea that people online displayed wisdom through collective activities on social networks or Twitter seemed laughable.
Looking back to the mid-2000s is to peer through the looking glass. We remember the words and the ideas but they somehow describe an alternate world, a different history. It’s (dare we say) a Winston Smith moment, a kind of newspeak that proclaimed the bright future of the web, the coming of everyone for everything, and by degrees almost subconsciously deposited us here, to our perplexing present. Investors who gave us web 2.0 in Silicon Valley now hope blockchain will save us, by eliminating the need for trusted relationships—itself a worrisome admission that trust no longer exists online. New ideas and fortune tellers will no doubt emerge. Yet our task today is to fix so much that was broken.
That great shattered optimist, American novelist F. Scott Fitzgerald, left us with a bit of futurism about American promise in another odd reach-back into history, this one from the Jazz Age. “So we beat on,” he said, “boats against the current, borne back ceaselessly into the past.” Endlessly striving for better. We are, we do. Here comes everybody, indeed.
Recent Analysis features at Mind Matters News:
We built the power big social media have over us: Click by click, and the machines learned the patterns. Now we aren’t sure who is in charge.

Futurism doesn’t learn from past experience: Technological success stories cannot be extrapolated into an indefinite future.