This week’s Economist has a special Technology Quarterly section, which is, as is typical of such packages, full of generally optimistic accounts of how various technological breakthroughs will inevitably make everything better: “Geoengineering” will solve global warming, trees will be bioengineered to supply a sustainable fuel source, solar power is within reach, computer processing power will help eliminate crime and terror, and the semantic Web will make our lives run on automatic pilot. The futurist bias means there’s not usually much column space given in these packages to the way technology can be leveraged against ordinary citizens (as Julian Sanchez describes in this recent Reason article about pinpoint searches) to intrude into our lives in unforeseen and undesired ways and force upon us choices that we’d rather not have.

I know that for some it’s an absurdity to assert the very possibility of an undesirable choice: all choices are good, they’d argue, because choices extend individuality. But to give an example of a choice I don’t want that may be coming, the section reports on a firm called Attention Trust that plans to allow you to sell your web-generated browsing data to advertisers interested in one demographic or another that you fit into. The idea here is that since other firms profit from selling your personal data, why shouldn’t you? (To this way of thinking, self-exploitation, like undesirable choice, is an oxymoron. The Economist refers to it unironically as “grassroots self-marketing.”) Seth Goldstein, Attention Trust’s founder, points out that “attention is a valuable resource,” and he wants to provide a means by which we can sell the traces of it to advertisers, who will process it to target us more effectively.
Then we’ll be seeing what we want without perhaps even realizing it’s been sponsored, so seamlessly will it accord with our gestating desires—just-in-time advertising, right there to steer us as it occurs to us (a truth latent in our sold data, when cross-checked with those like us) that we want something. If we sell enough of our preferences, we’ll let advertisers perfectly target our well-crafted niche of one.
In the context of the Internet, where distribution costs are close to zero, production (usually cultural production—uploading photos, tagging film clips, blogging, etc.) is often undertaken for nothing but attention, making attention the coin of the virtual realm. Goldstein wants us to think of ourselves less as producers in that sense, trying to earn attention, and more as brokers, selling it off rather than giving it away, as we do now, to things that merely interest us. We shouldn’t waste our attention this way; we should capitalize on it, instrumentalize it, so that we can earn money rather than merely be engaged and enthused by what the Internet (and life, for that matter) can offer us. Proponents of Attention Trust would likely argue that you can do both: you can both pay attention and sell it. But it seems to me that consciousness of the latter will begin to affect the choices that go into the former, circumscribing them. So the additional choice of self-marketing ultimately comes at the expense of other choices, which are now sullied and burdened with commercial considerations.
It seems that the very possibility of selling attention will inevitably make it seem incontestably right that one should do so, rendering irrelevant the question of whether anyone should be doing this at all—and whether we should instead be trying to develop technology to make our web lives more private, to make anonymity more reliable. Instead of developing our abilities to sell ourselves, couldn’t we be more concerned with undoing the custom of our being sold?
This pragmatic question occurred to me as well: What would happen if everyone stopped trading their attention for the entertainment products that ads are usually attached to and held out for a more direct deal? Who would fund those entertainments? Would user-generated content fill the gap? Or would the web denude itself of entertaining things, leaving us with no web histories to sell?