Many people are rightfully suspicious of Web 2.0. Is it a bubble, in the style of the “new economy” of the last century? They remember how dotcom turned into notcom practically overnight and wonder if this is just the second wave of Internet hype, a similar rabbit hole where billions of dollars will potentially disappear without a trace. Or could it really be different this time? Might it take root deep in the Web, yet transcend its boundaries at the same time?
In the early days of the Internet, roles were clearly delineated. Companies used the Web to offer information; users called up this information. The medium was organized along the logistics of the transfer of goods: supply here, on the company's servers, and demand there, in the brain of the Web surfer. Not all that different from the brick-and-mortar economy.
Thus the old Web’s main feature was the portal, which, like a shopping center, bundled as many different kinds of offerings as possible. In the early years of Web commerce, the goal was to become an online mall of information, shopping, and services.
With Web 2.0, this paradigm is changing. Its applications give consumers a new role. They are now not-so-silent partners in a business relationship. While portal was the buzzword of the early Internet, platform is the keyword for the second act. Internet companies today want to profit by giving users a platform, a framework they can use and advertisers can then exploit. No longer is the producer on one side of the economic fence and the consumer on the other. In the Web 2.0 business models, customers are not only served; they are also integrated into the transactions, adding value by volunteering information that's useful and attractive to other consumers.
eBay, the online auction site, exemplifies this approach. It sells nothing itself; instead it offers its customers a platform for selling things. Although eBay offers a sort of safety net for transactions, in actuality most transactions regulate themselves. In the world of brick-and-mortar transactions, the customer can remain anonymous. But on eBay, the customer is evaluated publicly through the site's feedback system, whereby customers themselves supply the information that eBay needs to secure users' willingness to use its service. Thus, with little intervention from the company itself, customers do the work that makes it viable.
Amazon, too, has a similar strategy. Its customers review and evaluate books, films, gadgets, clothes, food items, beauty items—anything that’s for sale through the site. Potential shoppers are not coaxed or persuaded or suckered; they are encouraged to do the research and exercise their judgment.
Of course, neither eBay nor Amazon is new, nor are blogs and photo hosts like Flickr or Photobucket, other exemplars of Web 2.0. Still, the term is useful for labeling the trend of how more and more applications are fusing two seemingly antithetical elements—social networking and niche marketing—and reconciling them at the bottom line.
The point of Web 2.0 applications is to manufacture new marketing possibilities by cross-referencing the individual preferences of (inadvertently or not) networked people. Economic advantages go to those services designed to target individuals, even if that design comes by way of an algorithm. Amazon, for example, supplies personalized recommendations simply by calculating a customer's preferences based on the profiles of other consumers: “Customers Who Viewed This Item Also Viewed…” or “Customers viewing this page may be interested in these Sponsored Links…” or “Customers Who Bought Items Like This Also Bought…” Last.fm translates this principle to the music business. If you add a song to Last.fm's player, you will subsequently receive similar tracks. The catch phrase for this is collective intelligence, or the wisdom of crowds—the latter the title of New Yorker columnist James Surowiecki's book on the subject.
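The mechanics behind “Customers Who Bought This Item Also Bought…” can be surprisingly plain. The following sketch (with hypothetical data and function names, not Amazon's actual system) shows the basic co-occurrence idea: count how often other items appear in the same purchase histories as a given item, and recommend the most frequent.

```python
from collections import Counter

def also_bought(orders, item, top_n=3):
    """Rank the items most often co-purchased with `item`,
    in the spirit of "Customers Who Bought This Item Also Bought..."
    `orders` is a list of sets, one set of items per customer."""
    counts = Counter()
    for basket in orders:
        if item in basket:
            # Count every other item that shares a basket with `item`.
            counts.update(i for i in basket if i != item)
    return [i for i, _ in counts.most_common(top_n)]

# Hypothetical purchase histories, one set per customer.
orders = [
    {"wisdom_of_crowds", "long_tail", "wikinomics"},
    {"wisdom_of_crowds", "long_tail"},
    {"long_tail", "cluetrain"},
    {"wisdom_of_crowds", "wikinomics"},
]
print(also_bought(orders, "wisdom_of_crowds"))
```

The crucial point for the argument above is that the customers themselves produce the inputs: every purchase quietly becomes marketing data for the next shopper.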
Even as these applications point toward more interactive participation and networking, they also supply users with a greater sense of individuality. Everyone is allowed a frame, an area of influence, a platform to supply uniquely tailored content and derive personal recognition for it. And with content we reach the heart of the matter. Selling content has always been difficult on the Web. Users want free content, and are usually persistent enough to find it. Both Salon and The New York Times have abandoned their efforts to charge for content, realizing that the antiquated distribution monopolies were no longer working in an age when blogs often provide better and more insightful commentary than the somewhat anachronistic mainstream media.
Also, Really Simple Syndication (RSS) undermines the concept of destination sites. RSS allows people to subscribe to content and read it on their aggregator, without the extraneous design or advertising the site wants to couple with it. To cope with this, marketers and designers must now think beyond site design and figure out how to brand the content itself.
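This decoupling of content from its site is easy to see in code. An RSS feed is just XML, and an aggregator keeps only the parts it cares about. The sketch below (using a made-up feed, not any real site's) extracts headlines and links with Python's standard library, leaving the publisher's design and advertising behind:

```python
import xml.etree.ElementTree as ET

# A minimal, made-up RSS 2.0 feed, as an aggregator might fetch it.
FEED = """<rss version="2.0"><channel>
  <title>Example Blog</title>
  <item><title>First post</title><link>http://example.com/1</link></item>
  <item><title>Second post</title><link>http://example.com/2</link></item>
</channel></rss>"""

def headlines(feed_xml):
    """Pull only titles and links out of a feed -- the content,
    stripped of everything the destination site would wrap around it."""
    channel = ET.fromstring(feed_xml).find("channel")
    return [(item.findtext("title"), item.findtext("link"))
            for item in channel.findall("item")]

for title, link in headlines(FEED):
    print(title, "->", link)
```

Everything else in the feed—branding, layout, ads—is simply discarded, which is exactly the marketer's problem the passage describes.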
But the same forces that have changed the Web are also democratizing content creation, letting users make and distribute their own content (often based on the expensively produced content of the entertainment industry) with more and more ease and sophistication. This explosion of sharing and voluntaristic creativity makes it tempting to regard the social net as a socialist paradise, but before we get too comfortable, it's important to remember that blogs, wikis, RSS feeds, mashups, tagging, folksonomy, and the semantic web are merely underlying technologies. The notion that “information wants to be free” was a mantra for the original tech bubble, after all. But Web 2.0 is no more about sharing than the original boom was. The new Internet technologies are being sold as turning passive consumers into active ones, but those consumers are building for free the content that they will be made to buy back later—whether with time, money, or energy.
So the new business model is really the old business model, and the key to monetizing Web 2.0 lies in mobilizing content, not controlling it with gated sites and proprietary systems. Microsoft was the avatar of the old tech business model, making its fortune by manufacturing, licensing, marketing, and protecting software—selling content. But the software it thrived on selling is rapidly becoming open-source or integrated into the Web itself. The successor to the tech throne, Google, has already begun exploiting this evolution with Google Apps, which emulates Microsoft Office but is free and operates entirely on the Internet.
With all the participation and personalization and sophisticated marketing possibilities, what could possibly be wrong with Web 2.0? Think about what Web 2.0 aims to achieve: a global social net where business news, information, videos, and viral ads zoom through the online population and stick for a moment or two to a couple of eyeballs at every intersection before zooming on to the next knot in the matrix. But the problems with this vision revolve mainly around two issues: Who owns the content created by consumer/users and how will their personal information be kept private? And with the extension of targeted advertising that taps into social networking sites, your friends and contacts on the Web become a marketing platform.
Facebook exemplifies this: When users make an addition to their page, all their contacts are notified. So when a Facebook user, say, adds an application sponsored by Red Bull, everyone on that user's contact list knows about it. If one considers that profile pages can be mined for user-specific information to target advertising, one begins to see Facebook's immense commercial possibilities. No wonder tech entrepreneurs are calling for a global graph of all the various social networks: a system that would connect them all in one giant metanet.
One such place is Fuser.com. It advertises itself as “the coolest way to unify your mail from multiple accounts. View your mail and social networking messages in one convenient location. It’s easy and secure.” For marketing gurus, this completely connected platform is a dream come true: an Internet where each Amazon item bought, each song downloaded at iTunes, each plane ticket purchased at Orbitz is connected, connectable, and, best of all, available for immediate transmittal to all the contacts on the user’s list on every social-networking platform.
But is the convenience that derives from all this interconnectedness worth the privacy risks? To many, the danger seems remote and diffused: a few extra pieces of targeted junk mail, another telemarketer calling in the evening, another online form to fill out. No totalitarian Big Brother seems to be staring back from our televisions. And Web 2.0 applications are specially designed to make us feel at ease, coddled and catered to. But the web of surveillance is slowly closing around us, and we are weaving it ourselves.
For example, consider the growth of “I report” platforms at the mainstream-media outlets. CNN, Fox, CBS, NBC, ABC—they all want us to provide the pictures, the words, the moving images. If we have them, they want us to upload them. Consumer-generated content has been a growing trend among marketers, too. Converse and Sony have both aired user creations on TV. Frito-Lay held a contest for customers to make a Doritos ad; the winning spot ran during the 2008 Super Bowl. Thus encouraged, we all turn our cell phones and iPhones to interesting scenes, upload them to YouTube, and hope to become overnight celebrities, as with the infamous “Don't tase me, bro!” clip in 2007. So in addition to the thousands of surveillance cameras watching public spots on every possible square foot of earth (and don't forget Google Earth!), we are all busy filming one another and posting it online, seeking our own notoriety.
Photo (partial) found on BoingBoing.net
The increasing convergence of spectacle and surveillance makes it possible, and for the first time desirable, both to see and be seen simultaneously and continuously. Web 2.0 technologies make it possible for society to become truly panoptic: the Web can be seen as a medium through which everybody can watch everybody all the time. But what type of animal has to show its relevance to itself to believe in its own existence? Imagine the panic this animal must feel when the iPhone battery dies and the creature is rendered invisible.
Of course, the original panopticon was conceived by Jeremy Bentham as a model penitentiary. Today, we are not forced into such a prison; instead it enters our lives as electronic data gathering, online and in the real world via RFID tags, designed to make consumption more convenient. But in pursuing convenience, we are consenting to being incorporated into a finely tuned marketing machine, its gears ever more subtly adapted to meet our needs—manufactured and otherwise. But these gears may one day grind us into finely consumable bits and bytes. Happy meals for all.