
The Rise of the Data Self

Wednesday, Jan 25, 2012

This Smithsonian post (via 3QD) offers some more support for my fledgling thesis from Monday’s post that “normal” identity is becoming explicitly data-based—that it’s natural to think about who we “really” are in terms of statistics-driven self-surveillance rather than depth psychology or self-actualization quests or anything like that. Freud is out, Facebook et al. is in. For example, we try things that seem self-expressive using media that can give us quantified feedback, and only when the results come back do we decide whether what was expressed was “true.” In the same way, we can convert ourselves into data that yields a statistical profile, which then returns to us what other people with similar profiles are doing, and hence what we ourselves should be doing.
  
Recommendation engines are perhaps the most explicit form of this: “Customers Who Bought Items in Your Recent History Also Bought…” Data-driven micro-marketing approaches are another form. The Smithsonian piece offers other examples derived from “quantified self” initiatives that have people monitoring their vital statistics and uploading them for analysis and aggregation.


Consider the possibilities in health care. In the past, anyone analyzing who gets ill and why had to rely on data skewed heavily toward sick people–statistics from hospitals, info from doctors. But now, with more and more healthy people collecting daily stats on everything from their blood pressure to their calorie consumption to how many hours of REM sleep they get a night, there’s potentially a trove of new health data that could reshape what experts analyze. As Shamus Husheer, CEO of the British firm Cambridge Temperature Concepts, told the Wall Street Journal, “You can compare sleep patterns from normal people with, say, pain sufferers. If you don’t know what normal sleep looks like, how do you tease out the data?”


What a dream come true! We can collect enough data to create the profile of the ultimate superbeing: the perfectly average human. And then we can use health-insurance protocols to force everyone to become this or else.


But my suspicion is that this runs deeper—that data collection is slowly becoming the ideological basis of the self, of what we regard as the real self. Data is the authorized way to pursue self-knowledge in the networked society; the other means are suspicious, deluded, or outmoded. This is not just a matter of the evergreen appeal of naive empiricism. (It has numbers; it can be graphed; ergo, it’s true! Numbers don’t lie! Who cares how they are contextualized?) Since interactions within social networks are now easily captured and standardized, the quantifiable data thereby produced have become far more constitutive of identity. Just read this article about the designers of Facebook’s Timeline function. As the post explains, the user interface is what is supposed to dictate the self as you navigate through your own data heap. With Facebook’s organizational help, you muck around in there looking to build the real you.


The assumption is that by letting Facebook capture and process everything, it will produce a more reliable version of the self than our own memory can give us. As the post’s title suggests, the UI has “soul”; you do not. Or, as a subheading in the piece claims, life should be seen as having a UI. There is no direct experience of life; it’s entirely a data network that we need mediated for us. In one of the more disturbing hubristic-techie quotes I’ve read in a while, one of the designers tells us what Facebook Timeline lets us do: “You gently consume time.” Rage, rage against the dying of the light, etc.


And though Facebook wants “the Timeline to be a place for self-expression: A way for users to reveal who they are and what their lives are about,” it has provided a tightly controlled and highly formatted medium for it that emphasizes standardization (echoing the old Facebook vs. MySpace distinction; Facebook was “clean” because it retained aesthetic control). It imposes the metaphor of life and memory as a stream, which, as Eric Harvey notes, is not some natural, neutral reflection of how we remember but a reshaping of life into narrative, which suits Facebook’s ends. The more work we put into making a coherent story out of the data Facebook collects, the more useful, marketable information we give them.


It makes little sense to look within for the true self when we have available immediate (and processable) reactions from and comparisons with masses of other people to help sketch out our contours, when we have an enormous data trail we have created incidentally to be reprocessed by outside parties as self-revelation. Our filters are who we are; the “social graph” is not something on which we are merely one point but a map of our identity (or at least what capital wants us to think of as our identity). We become more of a person the more we build out this graph and let the flows of information it facilitates constitute us.


As Nicholas Carr pointed out here, the “right to be forgotten” may be in danger from tech companies that want to own our memories and even the process by which we remember, despite the E.U.’s effort to institutionalize it. The degree to which the data self is naturalized for us will determine how much such a right will seem beside the point.


© 1999-2014 PopMatters.com. All rights reserved.
PopMatters.com™ and PopMatters™ are trademarks
of PopMatters Media, Inc.

PopMatters is wholly independently owned and operated.