In her essay “Cyberspace, Virtuality, and the Real,” Elizabeth Grosz states that “the concept of virtuality has been with us a remarkably long time. It is a coherent and functional idea already in Plato’s writings, where both ideas and simulacra exist in some state of virtuality…since there has been writing…there has been some idea of the virtual” (Grosz 111). She argues for an understanding of virtuality that is rooted in the real, as both an age-old concept and one “whose status as virtual requires a real relative to which its virtuality can be marked as such” (Grosz 109). Manuel Castells describes the same issue in terms of communication: “Cultures are made up of communication processes. And all forms of communication, as Roland Barthes and Jean Baudrillard taught us many years ago, are based on the production and consumption of signs. Thus there is no separation between ‘reality’ and symbolic representation” (Castells 404). All too often, however, the term “virtual” is conflated with digital space. New developments in technology are pushing towards a realization of “virtual reality,” a substitution of spatial and sensory experience similar enough to our own reality as to appear indistinguishable, if not superior. Virtual and digital are not synonymous, however, and though they share several key qualities, it is important that a line be drawn between the two.

Our mental substitution of digital for virtual is rooted in the incomprehensibly small size of digital data, especially when juxtaposed with the value that information provides. Consider, for example, the internet. Attempts to assign the internet an actual, measurable weight prove tricky. Stephen Cass of Discover Magazine published an article describing his attempt to do so by estimating just how much digital information comprises the internet, and then “weighing” that content. Cass’s calculation put the internet at a mere 0.2 millionths of an ounce, roughly the weight of the smallest possible grain of sand (Cass). Another approach yields a much larger, albeit still incredibly small, answer: by roughly calculating the energy needed to run all of the servers that make up the internet, Russell Seitz estimated that the combined total came to only about fifty grams (less than two ounces), approximately the weight of one strawberry (Seitz). Considering that this weight is divided among 75–100 million servers, such a number becomes relatively meaningless.
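Neither article reproduces its full arithmetic, but the electron-counting approach can be sketched in a few lines. Every input below is an illustrative assumption (the data size is the Guardian’s 2009 figure, and the electrons-per-bit count is a placeholder), not Cass’s or Seitz’s actual numbers:

```python
# Illustrative back-of-envelope "weight of the internet," in the spirit of
# Cass's electron-counting approach. All inputs here are assumptions.
ELECTRON_MASS_KG = 9.109e-31   # physical constant (CODATA value, rounded)

total_bytes = 500e18           # assume ~500 exabytes of data (Wray, 2009)
electrons_per_bit = 1e5        # assumed electrons involved in storing one bit

total_bits = total_bytes * 8
weight_kg = total_bits * electrons_per_bit * ELECTRON_MASS_KG
weight_g = weight_kg * 1000

print(f"Assumed data: {total_bytes:.0e} bytes")
print(f"Estimated weight: {weight_g:.3f} grams")
```

Even with these generous assumptions, the result is a fraction of a gram; whichever inputs one picks, the answer stays absurdly small relative to the value of the information it represents.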

Considering the physical volume of such information is perhaps even more difficult. While an electron’s mass can be reasonably estimated, its size is far less well defined. The issue that this consideration does raise, however, is that of the storage devices associated with digital information. Storing, enclosing, managing, and networking digital content elevates the amount of space involved to the scale of the building, or, in the case of data giants like Google, the mega-warehouse. Although this volume is much more graspable, digital information still seems to lack any intuitive sense of size. On the one hand, the storage devices themselves are shrinking: the 100-gigabyte hard drive that used to occupy a significant portion of my desk space can now be squeezed onto a flash drive and hidden inconspicuously on my keychain. At the same time, the data centers that are so instrumental to the functioning of the internet, despite being sizable, critical, and incredibly complex buildings, are rarely (if ever) discussed. The very fact that the term “cloud” is used to reference these centers should hint at their intentionally intangible nature. The volume of digital information means nothing to us because we cannot evaluate it at the micro level, and we cannot rely on any understanding of it at the macro level.

Another characteristic that obfuscates our prior conceptions of information is the speed at which digital content travels. Whether the medium is the copper wires running out the back of your computer or the optical fibers lying on the ocean floor, data moves from server to server at speeds approaching the speed of light. In fact, any latency experienced by the end user has less to do with how far away the information originates than with the multitude of physical connections and complications that exist at either end. In some communication programs, it is even possible to see when your partner is composing a message, or to receive (instant) confirmation that he or she has read the one you just sent. Information and communication, even across great distances, are essentially instantaneous, rendering hopeless any attempt to discern the physical origin of content from travel times. Recent music services have shifted toward streaming music from the internet, erasing the boundary between local content and internet content and occasionally even doing away with the local copy altogether. Just like fast food, access to digital information has become entirely relative: if it is not provided quickly enough, it becomes problematic, even though the delay is still only a fraction of the time it would take to procure the content by traditional means.
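The irrelevance of distance can be checked with simple physics. Light in optical fiber travels at roughly two-thirds of its vacuum speed (glass has a refractive index of about 1.5), so even a transatlantic hop contributes only tens of milliseconds; the route length below is an assumed round figure for a New York–London cable path:

```python
# Propagation delay over a transatlantic fiber route (all figures approximate).
SPEED_OF_LIGHT_M_S = 3.0e8
FIBER_SPEED_M_S = SPEED_OF_LIGHT_M_S / 1.5   # refractive index of glass ~1.5

route_km = 5600                              # assumed New York-London fiber path
one_way_ms = (route_km * 1000) / FIBER_SPEED_M_S * 1000
round_trip_ms = 2 * one_way_ms

print(f"One-way propagation: {one_way_ms:.1f} ms")
print(f"Round trip: {round_trip_ms:.1f} ms")
```

A round trip across the Atlantic costs under a tenth of a second in pure propagation; the delays we actually notice are accumulated in routers, servers, and last-mile connections at either end.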

No matter how minuscule the actual internet may be, however, its breadth and scope seem infinitely large rather than infinitely small. An article published by the Guardian in 2009 estimated that, at almost 500 billion gigabytes (500 exabytes), the printed equivalent of the internet “would form a stack [of books] that would stretch from Earth to Pluto 10 times.” Not only that, but the internet is expanding at an incredible rate: the same article cited the technology consultancy IDC’s prediction that this number would double within the next year and a half (Wray). With such wildly divergent considerations of size, it is no wonder that the digital content we create each and every day registers as entirely virtual, with no real dimensional quality. Grosz describes this shift in perception as “perhaps the most striking transformation effected by these technologies” (Grosz 109). With no way to quantify digital content according to our traditional senses, and no conception of where that content originates, we are left with no choice but to relegate it to the “virtual.”
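The Guardian’s stack-of-books image can be roughed out with a few assumed figures. The bytes-per-book and thickness values below are guesses for illustration, so the result only needs to land in the same ballpark as the article’s ten trips to Pluto:

```python
# Rough height of "the internet, printed out" (all inputs are assumptions).
total_bytes = 500e18          # ~500 exabytes (Wray, 2009)
bytes_per_book = 1e6          # assume ~1 MB of plain text per printed book
book_thickness_m = 0.03       # assume a 3 cm thick book

books = total_bytes / bytes_per_book
stack_height_m = books * book_thickness_m

EARTH_PLUTO_M = 5.9e12        # average Earth-Pluto distance, in metres
trips = stack_height_m / EARTH_PLUTO_M

print(f"Books: {books:.1e}")
print(f"Stack height: {stack_height_m:.1e} m ({trips:.1f} Earth-Pluto trips)")
```

Depending on how much text one assumes fits in a book, the stack makes somewhere between a few and ten-odd trips; either way, the same data that weighs less than a strawberry prints out to an interplanetary column of paper.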

map of world’s undersea cables (graphic courtesy of Nicolas Rapp)

Grosz goes on to say that “the virtual is the space of emergence of the new, the unthought, the unrealized” (Grosz 110). Recognizing the effect of technological developments on society as a whole, she hints that the digital-as-virtual confusion is also rooted in novelty (Grosz 109). She likewise paraphrases Marcos Novak: “‘Cyberspace stands to thought as flight stands to crawling.’ In short, cyberspace is a mode of transcendence, the next quantum leap in the development of mind, as flying is a mode of corporeal transcendence of the bodily activity of walking” (Grosz 114). The digital is the latest frontier, the newest form of experimentation and exploration, and with continuing developments pushing out new hardware and software every week, it has become remarkably good at maintaining its glamour.

map of the internet (graphic courtesy of the Opte project)

While thought and computer virtuality may be connected in their mutual synthesis of “the new” and representation of the intangible, it is critical to realize that the latter does not occupy exactly the same territory as the former. The first distinction, as outlined previously, is the finite nature of digital technology. In Grosz’s words, “the capacity for simulation clearly has sensory and corporeal limits that are rarely acknowledged, especially because the technology is commonly characterized as a mode of decorporealization and dematerialization” (Grosz 111). The second distinction lies in the fact that digital space is not inherently generative. Whereas thoughts are born from within our own heads, relative to our cumulative experiences and dependent upon more external and internal influences than we are able to measure, the output of computers is entirely reliant upon pre-programmed software. No matter how complex the algorithm or seemingly unpredictable the output, a computer’s generated results always originate in human input.

Jaron Lanier, one of the forefathers of virtual reality, acknowledges an inherent danger in its development: “When my friends and I built the first virtual reality machines, the whole point was to make this world more creative, expressive, empathetic, and interesting. It was not to escape it” (Lanier 33). Lanier’s manifesto argues that creativity is a quality that cannot (and should not) be approximated by computers. But with the computer holding “the promise of a perfect open-ended automatism, a nonstop variability untainted by false consciousness” (Jones 8), our misconception of digital as virtual is leading us to believe that the computer is a space of creative generation. While the computer may be a valuable tool in the creative process, it is critical that it be recognized as such, merely a tool, and never allowed to supplant our own ability to understand value and assign judgment. “The attribution of intelligence to machines, crowds of fragments, or other nerd deities obscures more than it illuminates…Treating computers as intelligent, autonomous entities ends up standing the process of engineering on its head. We can’t afford to respect our own designs so much” (Lanier 36).

Cass, Stephen. “How Much Does the Internet Weigh?” Discover Magazine. N.p., 29 May 2007. Web. 12 Nov. 2012.

Castells, Manuel. The Rise of the Network Society. Malden, MA: Blackwell, 1996. Print.

Grosz, Elizabeth. “Cyberspace, Virtuality, and the Real: Some Architectural Reflections.” Architecture from the Outside: Essays on Virtual and Real Space. Cambridge, MA: MIT, 2001. N. pag. Print.

Jones, Wes. “Big Forking Dilemma.” Harvard Design Magazine, 2010: n. pag. Web.

Lanier, Jaron. You Are Not a Gadget: A Manifesto. New York: Alfred A. Knopf, 2010. Print.

Seitz, Russell. “Weighing the Web.” Web log post. Adamant. N.p., 25 Oct. 2006. Web. 12 Nov. 2012.

Wray, Richard. “Internet Data Heads for 500bn Gigabytes.” The Guardian. Guardian News and Media, 18 May 2009. Web. 20 Nov. 2012.
