Several weeks ago in my Classics of Science Fiction course, I referenced the 1983 miniseries V. After I explained the basic plot, I paused and added, “I heard somewhere that someone was remaking it.”
There was another pause, this one filled with the soft clicking of keys. In a moment, several students could tell me all about the new V miniseries. They could also tell me, should I forget, which book won the Hugo for Best Novel in 1978, who was the key grip for the film Children of Men, and what year the Joker premiered in the Batman comic series. All of this important information is just one Google search away.
Albert Einstein once said, “Never memorize what you can look up in books.” Of course, Einstein didn’t have the Internet, a BlackBerry, or an iPhone. Something tells me that if he were alive today, he would tell people, “Never memorize what you can Google.”
The question is: how long do people remember this Googled information? Because of the Internet and hand-held devices that make the Internet accessible from virtually anywhere, people have access to far more information than they did in the dark ages (aka the 1980s). But is this access making people better informed and more knowledgeable? Or is there simply so much information out there that it all becomes incomprehensible? I argue it is the latter.
It is a case of science fiction becoming fact. William Gibson’s vision of cyberspace put forth so beautifully in Neuromancer has, at least in part, come true. Gibson coined the term cyberspace in Neuromancer, stating: “The matrix has its roots in primitive arcade games … Cyberspace. A consensual hallucination experienced daily by billions of legitimate operators, in every nation, by children being taught mathematical concepts … A graphic representation of data abstracted from banks of every computer in the human system. Unthinkable complexity. Lines of light ranged in the nonspace of the mind, clusters and constellations of data. Like city lights, receding.”
“Constellations of data” sounds so pretty, except these constellations aren’t made of stars but of satellites. Thousands, perhaps millions, of satellites flickering in the sky, collecting information, and funneling it down to our computers, where we Google, mine for data, and start del.icio.us pages to bring order to the chaos. People talk about feeling the effects of information overload, yet they insist on continuing to contribute to it.
True knowledge (especially if it has a strong historical component) is power, as the saying goes, but factual knowledge is now cheap and readily available to anyone with a modicum of literacy skills. That makes facts, much like tuna fish and Ramen noodles, easy to overlook, particularly when facts merge into the factoids so readily accessed via hand-held devices. Factoids can be true but trivial facts, or they can be ideas generally believed to be true that in reality are not.
Factoids (or urban legends) are nothing new. The term factoid has been around since the ‘60s, and many modern urban legends started to become popular in the ‘60s as well. However, the Internet, and specifically email, allows factoids, urban legends, and net hoaxes to multiply as quickly as the proverbial rabbit. Most people have probably received emails about dying children who need their help, department stores charging thousands of dollars for cookie recipes, or Snowball the Monster Cat. Most people are probably also guilty of passing at least some of these emails along to friends, who then pass them along to their friends. And on and on it goes.
Factoids have become so common that we now have websites, such as Snopes.com, to debunk them. On this site, people can learn, for example, that Hostess Twinkies don’t last forever and that the Great Wall of China is not actually visible from the moon: assumed “facts” in modern culture that never were factual.
Information overload, assumed “facts,” and shortened attention spans that like to receive information via 140-character tweets have changed the ways people learn and process information. Simply handing students a list of all the important dates in World War II and telling them to memorize it doesn’t work anymore. Students might be able to regurgitate the information on a test, but don’t ask them to recite it a year later (unless you give them a chance to Google it first). In an age where Twitter and Google seem to be taking over the world, how do people communicate information in a meaningful and memorable manner? They tell a story.
For centuries, stories were the primary way of teaching and communicating important information, using historical fact, dramatic reinterpretation, and fictional example to convey the point of the story. Stories kept histories alive (oral storytelling) or explained how things came to be. The Christian Bible, Sufi parables, Native American folklore, and Zen parables all entertained, but they also explained. Even children’s stories taught moral lessons and illustrated cultural expectations.
Historically, stories have inspired, motivated, taught, and entertained. Consider an 11th-century Jewish teaching story Annette Simmons references in her book The Story Factor: Inspiration, Influence, and Persuasion through the Art of Storytelling:
Truth, naked and cold, had been turned away from every door in the village. Her nakedness frightened the people. When Parable found her she was huddled in a corner, shivering and hungry. Taking pity on her, Parable gathered her up and took her home. There, she dressed Truth in story, warmed her and sent her out again. Clothed in story, Truth knocked again at the doors and was readily welcomed into the villagers’ houses. They invited her to eat at their tables and warm herself by their fires.