Big Data Follows and Buries Us in Equal Measure

“Nobody is listening to your telephone calls… But by sifting through this so-called ‘metadata’, [the intelligence community] may identify potential leads with respect to folks who might engage in terrorism.”

— President Obama, 7 June 2013

One of the blind spots of the digitized form derives, paradoxically, from its ravenous, undiscerning and all-seeing eye. Raw data has no introspective mechanism and affixes no value-coefficient to its own informational content. This evaluative process is left to external entities: sentient, sifting human beings—or their algorithm-proxies—who realize that within oceans of data, trophy catches are few and sea garbage is the norm. Actionable or useful data thus swims against near-insurmountable odds of detection. The ‘promotion’ from data to information requires human agency and recognition. How can humans accomplish this crucial anointing when information must be dredged from a great dismal data swamp that retraces God’s infinitude a little more each day?

The statistics are suitably staggering: According to a recent CSC study, data production will be 44 times greater in 2020 than it was in 2009. A July 2012 BT survey reported, “…a quarter of the decision-makers surveyed predict that data volumes in their companies will rise by more than 60 percent by the end of 2014, with the average of all respondents anticipating a growth of no less than 42 percent.” This fantastic upsurge in digital effluence is commonly known as Big Data.
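
As a rough sanity check on the first of those figures—a back-of-the-envelope sketch of my own, not anything drawn from the CSC study—a 44-fold increase between 2009 and 2020 works out to compound growth of a little over 40 percent per year:

```python
# Back-of-the-envelope arithmetic: what annual growth rate turns one unit of
# data in 2009 into 44 units by 2020? (The 44x figure is the CSC study's;
# the calculation itself is illustrative only.)
years = 2020 - 2009                   # eleven compounding periods
annual_factor = 44 ** (1 / years)     # ~1.41
print(f"Implied compound annual growth: {annual_factor - 1:.0%}")  # ~41%
```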

Faced with towering silos of bric-a-brac, poets tend toward metaphysical swoons, certainly one technique for navigating the meta-morass. So consider this fair warning.

That said, I can’t help but think of Carl Jung’s Answer to Job. God is a stalking horse for today’s Big Data in the sense that His infinity induces a moral blindness that only Job, a human agent of particular discernment, can instruct Him through. Seeing everything is not unlike seeing nothing at all. Human insight, by contrast, is a narrowed gaze. Vision demands a focal point, a seeing-eye dog, a discrete POV. Poets are the woofers amidst the tweeters, our first-order data miners. They name things and in so doing give form to chaos.

Database administration is a degraded form of poetry, really a meta-poetry whose administrators play in a sandbox beside the legislators of the world. This latter poetic function Sven Birkerts, channeling Rainer Rilke, identifies as the human being’s seminal role—raising the world into consciousness, not just “collectively, into a noosphere, and not digitally, into a cloud of data, but subjectively, inwardly, into language.” (“The Room and the Elephant”, Los Angeles Review of Books, 7 June 2011)

All that has first been named can be data-tagged, but only after our fervency—Rilke’s word—has expended itself. Thus, those who prioritize the Cloud have it backwards. Technologists are the post facto manipulators, the illusionists in our midst, whereas poets keep it real. That’s why the latter can’t find jobs in a Big Data world. Birkerts quotes the following lines from Rilke’s “Ninth Elegy”:

Are we here perhaps just to say:

house, bridge, well, gate, jug, fruit tree, window —

at most, column, tower… but to say, understand this, to say it

as the Things themselves never fervently thought to be.

— (C. F. MacIntyre, Translator)

Birkerts and Rilke invite us back to Jung’s subjective self whose universe exists only because we have the eyes to (data) mine it. God recognizes Himself through our cognition. We suspect this pleases Him immensely. The Book of Job becomes a pre-Mosaic prototype for the Anthropic Principle. Data, by contrast, is a retrospective, a cataloguing of prior ‘authenticities’.

Far and away most data, if not much of life itself, is hardly worth our powers of recollection. Yet in the Digital Age, every traversal of Sisyphus’ hill becomes a discrete negotiation, an indexable transaction. By now Sisyphus’ travelogue would require a supercomputer. There is no human act or gesture so beneath our retrospective radar that it can anymore slip, blithely undetected, into the veils of time. The NSA and its commercial doppelganger, Facebook, are committed to the eternality of the less-than-mundane. Interestingly, Sisyphus’ punishment derived in part from chaining Thanatos, a ploy aimed literally at cheating death. Life seizes the moment. The life force doesn’t look back. Data storage makes its bones with the dead.

There’s existential philosophy; then there’s existential practicality. We compound Big Data’s overhang by adding to it daily. However, it’s in the here-and-now where the potential for comprehension is greatest. A host of nimble and proactive analytics tools looms on the horizon, promising to better prepare us for what Anukool Lakhina of Big Data company Guavus calls ‘knowing the now’ (“We Need to Prevent Insights from Dying in the Big Data Avalanche”, Gigaom, 6 October 2012). Humanity’s accumulated nows form Big Data’s past. The future must be seized knowingly. We can ill afford to dither and let it just happen.

The past will not be relinquished lightly, as the bankers have our coupon books to keep track of. Old Power and Money cements its power on the backs of our deeply regretted past transactions. Usura’s how they make their game in the present and promise the future to the image of the past. They are the celebrants of stasis. Under their rubric, we are going nowhere fast. All these data-dependent claims on our past help to encourage a retro-reptilian-hoarding reflex. IBM Global Business Services’ Teresa Pritchard-Schoch in a recent exchange called it Dino’s Albatross:

“…we now see the head looking behind at an enormous tail, a tail so heavy that the creature can hardly move forward. It is a tail comprised of hoarded information, kept without any measure of true value. A tail that now puts the entire being at risk because it requires attention that should be paying attention to the environment in which it must survive.”

Big Media casts its own Big Data footprint. We desecrate the up-close and sacred naming task with what novelist Don DeLillo calls white noise, that is, the make-work routine of papering over the hard work of consciousness-raising with a dust-layer of bytes and suspect media coordinates (or as DeLillo terms it, that “dull and unlocatable roar, as of some form of swarming life just outside the range of human apprehension.”— White Noise). Mediation is a diversionary campaign that traffics in the propagandistic terms of clarification and distillation; or, if you prefer the Fox News coordinates, fair and balanced.

Big Media’s Big Data diverts us from the task of Big Apprehension. We are kept to the realm of the observable. All that can be measured is pored over as though nothing else exists while Keats’ infamous Vale toils at the crucial work of soul-making at the subjective ‘unobserved’ level. Yes of course, the poets’ by-product, poems, are in evidence on the web. But the process of manufacturing soul through suffering evades the artifactual record. This flattening of poetry into bytes abets a shadow-project to equate poetry with food recipes and baseball scores. Suddenly at the time they are needed most, poets are marginalized further.

Fortunately, regular folks are more than taking up the slack. In the social media realm, we have the power to avert much of the Big Data landslide if only we could stop chattering amongst ourselves, continually giving up banal lives and journeyman repasts that surely drive our overlords to an ever-rising contempt. Frankly, who could blame the Illuminati for its machinations as, handed the mic, all we could think to tweet was what we had for breakfast? My point, Mr. Everyman, is that your grating ignorance and predilection for Eggos may have bought you a dystopia that’ll hang around well past the dinner hour. I told you to brush up on your Adorno and Marcuse. But nooo, you wouldn’t leggo.

The real little man disease is well-earned envy as the floodgates of Facebook fly open only to reveal oceans of drivel. How oceanic, you ask? “Just two days of the current global data production, from all sources — five quintillion bytes (a letter of text equals one byte) — is about equal to the amount of information created by all the world’s conversations, ever, according to research at the University of California, Berkeley.” (“Sizing Up Big Data, Broadening Beyond the Internet”, by Steve Lohr, The New York Times, 19 June 2013)

One could be forgiven for wanting to head some of this yadda-yadda off at the pass before the Word becomes flesh to make its dwelling among us. I mean, I gotta be me, you gotta be you. But must our Eggos leave behind minable contrails? The collapse in embedded processor pricing will soon allow for smart toasters. Every appliance will have a snappy retort. Every briefcase will carry an airtight alibi. The world is irrevocably data- and sensor-rich and there’s no going back.

Going forward, then, how can we vouchsafe an authentic human sphere within this sea of data? Or is ‘soul’ ripe for a digitized deconstruction? The trans-humanists suggest Job v.2 will be a robot sent to teach the machine the ineffable nature of soul. That’s provided the ineffable (that transcendent, ‘extra-data’ realm which literature purports to stalk) is indeed antithetical to data and not subsumable within a Big Data skein. Stephen Marche suggests as much: “Literature cannot meaningfully be treated as data. The problem is essential rather than superficial: literature is not data. Literature is the opposite of data.” (“Literature is not Data: Against Digital Humanities”, Los Angeles Review of Books, 28 October 2012)

Marche’s essay title hat-tips a vast field of endeavor known as Digital Humanities to which he (and I) probably give shamefully short shrift. Some of the mandates emanating from this new academic wing are tantalizingly terrifying. Here, Bruno Latour is discussing nothing less than Big Data’s potential for cataloging the ‘inner workings’ of the soul:

“The precise forces that mould our subjectivities and the precise characters that furnish our imaginations are all open to inquiries by the social sciences. It is as if the inner workings of private worlds have been pried open because their inputs and outputs have become thoroughly traceable.”

— Bruno Latour, “Beware, Your Imagination Leaves Digital Traces”, Times Higher Education Literary Supplement, 6 April 2007

WikiLeaks’ Julian Assange pointed out recently that the East German secret police employed ten percent of the population at one time or another as informants. That sort of high overhead will cripple any enterprise. No wonder the Soviet bloc collapsed. Fortunately, the fascio-corporatists have our backs. The genius of Facebook is that it is an emoticon-besotted surveillance apparatus through which friends rat out friends routinely, unwittingly and for free. Hey, if I’m sending my buds to the Gulag, I want beer money to help subsidize my tears.

Big Thinker Jaron Lanier proposes an even starker equivalence in his latest book Who Owns the Future?: “Information is people in disguise, and people ought to be paid for the value they contribute that can be steered or stored on a digital network.” Despite his defense of regular folks, Lanier seems oddly acquiescent to our object status as though we are in fact mere data warehouses, albeit with a propped-open backdoor. Nonetheless Lanier is onto something when he suggests the value-exchange is poorly understood by the average Facebook consumer-supplier.

In a nation of rip-offs, the thief is king, so it pays to study his M.O. Facebook aggressively runs all of its employees, regardless of formal function, through Big Data boot-camps in an effort to “promote a culture in which everyone uses data to test and ultimately roll out new products, design changes, and other improvements.” (“What I Learned at Facebook’s Big Data Boot-Camp”; CNN-Money-Fortune, by Michal Lev-Ram, 13 June 2013)

The Facebook micro-culture may augur the macro-culture, or is a nation of thieves unsustainable? Clearly, Facebook knows the trove over which it presides and the extractive capacity for all nearby hands to just dig in. Good for Facebook. Apply the distributed computing model over a massive pro bono user base, paint a solicitous happy face above the front door, and the cost of data collection suddenly vanishes into the ether. Where the East Germans insisted on payment, we give ourselves and our loved ones up without a fight, without a nickel.

Alright, so everyone gets a shovel and we’ll dig ourselves to a collective nirvana. On the other hand (said one equivocating economist to another), might what Big Data pioneer Jeffrey Hammerbacher calls the impending renaissance of the ‘numerical imagination’ yield up the metrics of what poets, since time immemorial, have insisted on calling soul? Perhaps there is no ghost in the machine. Perhaps it’s all machine. Perish the thought.

Intuition Will Not Be Indexed

Poets notwithstanding, all that glistens on human lips has never been gold, anyway. That our fingers excel at capturing every demiurge now with dispatch on one PDA or another does nothing to burnish the archival value of the utterance. Would Zeus have been less cruel, more circumspect in his meting out of punishment, had it also fallen within his purview to store the repetitions of his wrath?

Perhaps human data generation should be capped at a finite annual allotment of bytes per capita, much like a carbon tax. No doubt Al Gore can invent the apt paradigm. Perhaps we are discovering the darker side of near-universal literacy, you know, those same seven billion souls who can’t wait to share what they had for breakfast on Facebook.

Are we being incorrigible elitists even to suggest such things? The carbon analogy is not as facetious as it sounds. In some sense, data is an exhalation. Of course there’s money in the quotidian. Facebook makes a fortune monetizing our errant chatter. But is there transformative meaning? Surely we’re not here only to make money (an imaginal exercise itself) only to have them listen to us very closely so that they can take it all back again—echoes of Sisyphus in his green-eyeshade permutation?

No, the human race didn’t wait for the Digital Age to dawn so that it could suddenly exhale en masse. What has changed is that we are all now affixed with carbon dioxide monitoring devices, low-cost handheld appliances that record our every hiccup. Our heart beats. Our data emits. Barely audible, off-hand remarks—veritable verbal tics—that our own spouses have the good sense not to query for clarification are being cataloged by digital devices.

Nor am I deaf to the durable idealistic notion that all human musings (nothing less than the murmuring of souls) are inherently valuable, certainly of a higher order than, say, other excretions, e.g., perspiration, waste products and the like. Indeed the democratic impulse is offended by the notion that quotidian effusions do not merit attention. This was not always the case.

In his 1994 essay, “The Future of the Book”, Umberto Eco reminds us that broadly prevalent literacy is a relative blip on the human culture timeline. All the hand-wringing over our TV-besotted age (a phenomenon Eco refers to sardonically as “mass media criticism of mass media”) forgets the profound illiteracy that preceded it for many centuries: “We can complain that a lot of people spend their day watching TV and never read a book or a newspaper, and this is certainly a social and educational problem, but frequently we forget that the same people, a few centuries ago, were watching at most a few standard images and were totally illiterate.”

Eco delineates further between publishing and communicating. With the advent of handheld devices, many of us have migrated unwittingly from ephemeral (sound wave-dissipating) communication into the realm of publishing (fossilized entrails). Indeed the NSA has conscripted all of us into the publishing game without so much as a referendum. Police states are funny that way. Every hiccup has a shelf-life. Long live the permanent record.

For the moment, Moore’s Law is still finding cupboard space for our personal effusions. But even that venerable efficiency curve is flashing the fault-lines of fatigue. (Theoretical physicist Michio Kaku predicts its collapse in ten years.) Only Uncle Sam has the real estate and the mindless profligacy to even try and keep abreast of the tsunami. In a Russia Today interview (4 December 2012), whistleblower and former NSA crypto-mathematician William Binney suggested well before the Snowden revelations that the NSA is collecting everything from everybody (what a shrewd discerning beast, that Uncle Sam!), thus the need for the 1.5 million square foot, $2 billion data storage facility in Bluffdale, Utah: “I don’t think they are filtering [the totality of society’s data]. They are just storing it. I think it’s just a matter of selecting when they want it. So, if they want to target you, they would take your attributes, go into that database and pull out all your data.” (William Binney, “Everyone in U.S. Under Virtual Surveillance”)

On one level, the NSA’s strategic plan seems to have stepped out of a French existentialist novel. It collects the data because, well, it and the data are there. Of course, part of the plotline involves forgetting that the Constitution is there, too—or once was, anyway. For the moment, storage capability and analytics techniques are evolving briskly and the government is flush enough to afford them. The day has arrived when, if the government decides it doesn’t like you, it will simply data-mine you to backstop all the reasons why it doesn’t like you.

Alas, profit-making entities do not enjoy the same boundless access to acres of Utah desert and public largesse. Big Brother enjoys scalability whereas profit centers cannot forgo front-end data analytics techniques. Capitalists have to take out the trash because data warehousing is a huge and growing expense. In a perverse twist on the crowding-out effect, the private sector could ultimately contract under the onerous burden of data storage costs (even as the business value of the stored data is known to be de minimis), while the public sector sits smug atop your paramour’s pet name. You call that fair, Mr. Orwell?

Studies have shown three-quarters of all data has the retention value of an empty gum wrapper. This is not to say the legal profession has zero interest in its liability value (and don’t think for a minute that defense against potential lawsuits isn’t a big part of the anal retention bias). IBM’s Pritchard-Schoch, again:

“A large part of the inability to push a delete button is the result of legislation requiring businesses to maintain certain identifiable information to ensure transparency when ostensibly working on behalf of stockholders. In addition, a business of any size being sued, or suing to protect its rights, better be able to produce evidence to prove its case. Court sanctions have been swift and harsh in the evidentiary arena. Attorneys have responded to keep it all. Attorneys are focused on risk. They look in one direction, strictly adhering to the law, torpedoes be damned.”

Fortunately, there are countervailing forces within the enterprise. IT departments, threatened by the predations of data storage costs on their budgets (and the resultant brakes on innovation and development), are as eager to take out the trash as in-house general counsel are to let the refuse just pile up. Nor did enterprising CIOs climb the corporate ladder for the purpose of becoming graveyard caretakers. And yet a recent McKinsey & Company report (“Big Data: The Next Frontier for Innovation, Competition, and Productivity,” May 2011) projects 40 percent growth in global data generation per annum versus five percent growth in global IT spending. With fewer allocable dollars contending with explosive and unabated data generation, Big Data risks becoming the dumpster that ate The Next Big Thing.
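
To get a feel for how quickly that gap compounds, here is a small illustrative projection of my own, using nothing but the two McKinsey growth rates quoted above (the rates are the report’s; the extrapolation is mine):

```python
# Illustrative projection only: data volume growing ~40% a year versus
# IT budgets growing ~5% a year (the McKinsey rates cited above), both
# indexed to 1.0 at year zero.
data, budget = 1.0, 1.0
for year in range(1, 6):
    data *= 1.40
    budget *= 1.05
    print(f"Year {year}: data x{data:.2f}, IT spend x{budget:.2f}")

# After five years the data pile has grown roughly 5.4-fold while the budget
# meant to store and sift it has grown roughly 1.3-fold -- the 'dumpster that
# ate The Next Big Thing' scenario in miniature.
```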

How will innovation maintain a place at the IT table? Slowly, senior management is coming to realize that the security blanket is really an anvil in disguise. The fact is, Big Data threatens to be a major job and productivity killer. With more bytes and fewer people, the machine wins again. Frankly, how many more battles can We the People afford to lose?

Even today, only two percent of all existent human data is on the Internet. Oh good, only 98 percent more to plow through! Rilke would be struck by the frivolity of the task: indexing the totality of (ever-expanding) human data is tantamount, one suspects, to moving every grain of sand on every beach from the left side of the beach to the right side and vice versa.

Suppose Sisyphus managed just once to tip his boulder over the crest of the hill. Would it not just careen into a meta-valley on the other side? How are our wisdom and knowledge enhanced by the reptilian impulse to catalog everything under the sun, or by what Sven Birkerts characterizes as the replicative meaninglessness of the so-called ‘digital path’, the impulse to invent:

“…a parallel realm… [that] would move us away by building a new world, with new human rules, and placing it squarely atop the old.” (“The Room and the Elephant”, Sven Birkerts, Los Angeles Review of Books, 7 June 2011)

Should the day ever arrive (it would have to be at the end of history) when the universe becomes fully indexed on the Internet, does the Internet not become the universe? Or, at the least, a parallel meta-universe? What will we do then? Re-roll our boulders to their originating valleys? Admit the inevitable and collapse our souls into avatars? Who will conduct the first-order, up-close reconnoiter, what Emerson, anticipating Rilke, describes as, “…the poet nam[ing] the thing because he sees it, or comes one step nearer to it than any other”? Metadata names names, making it at least one step removed from the poet’s sacred project. Our transformative energies are wasted on filing chores, relegating us to glorified machine-language adjuncts.

Steve Lohr looks ahead to this very prospect:

“Decisions of all kinds, [Big Data experts] say, will increasingly be made on the basis of data and analysis rather than experience and intuition — more science and less gut feel…what psychologists call ‘anchoring bias.’” (“Sizing Up Big Data, Broadening Beyond the Internet”, The New York Times, 19 June 2013)

Anchoring bias sounds a lot like poetic voice, that woefully inadequate yet durable nemesis of analytics everywhere, the human soul. The impending Big Data train-wreck cries out for a deeper reckoning to which we must rally our poet-technologists, all five of them. If we would only self-listen with proper gnostic intensity our data footprints would collapse like the nervous babel they mostly are. Big Data is the shadow-form of all we could not bring ourselves to reflect upon. Intuition will not be indexed.

Therein lies its value. Intuitives risk being hunted to extinction by the NSA State. If you cannot tweet it, it will not exist, an assault on Rilkean consciousness Patriot Act IV will surely codify. The apotheosis of P. K. Dick’s black iron prison (and Bentham’s Panopticon) is the Internet in its late-stage authoritarian form. Even Hammerbacher asks rhetorically if belatedly, “What does it mean to live in an era where things and people are infinitely observed?” Thank you, Mr. Hammerbacher, for tossing circumspection on the pyre of scientific advance.

But then, scientists are famous for plunging ahead and leaving others to look like ridiculously out-of-step Luddites. Allow me to dig my heels in first: If the wonders of Hiroshima have taught us anything, it is that the huge potential of Big Data will be met with a mushroom cloud of compensatory magnitude. Thus, it is precisely the breathless claims of Big Data analytics that have me shaking in my boots. We must relight the early Christian catacombs somewhere off the grid as the soul is being driven underground, once again.

I’m also prompted to offer an updated definition of that cagey yet ineradicable word ‘soul’ as the human region which proves resistant to data collection and surveillance, not because we erect a killer (and thus someday, ‘with the right technology’, surmountable) firewall, but because there is something within the very fabric of soul that is antithetical to data collection and looms one step beyond Hammerbacher’s ‘infinite’ field of observation. The proof for soul? That Sisyphus’ punishment is so incomprehensible in magnitude and scale that no data silo can ever hope to contain it in the shuttered language of binaries. Capture is impossible. Only poetry can evoke it.

The stakes couldn’t be higher. If the soul proves to be but a billion points of convergent data, we will brush through the trans-human era on the way to machine hegemony and human extinction. There’s a whole human movement, called Singularitarianism, working earnestly towards this capitulation—how traitorous, how charming. Absent this forever vouchsafed realm, the poetic project collapses like a metaphysical hoax perpetrated against the centuries. As goes poetry, so goes the soul. Historic man cannot be so far behind.

In the meantime, we are high-tech beasts of burden dragging stones towards a Great Collective Pyramid of Cyber. Had we realized the Digital Revolution would enlist us in a massive water-carrying project instead of emancipating us to pursue a Greater Meaning (the manna-headstone of information), we might never have picked up those damned BlackBerrys in the first place. Now we’re hooked. But please, just hold that thought. Don’t type it.
