The Age of Optimization (part 1)
Man is no longer the measure of all things. The dimensions of human endeavor have expanded from bodily cubits to incomprehensibly tiny angstroms and incomprehensibly large light years. Architecture, comfortably situated in the middle of this spectrum, and rarely departing from human dimension by more than one or two orders of magnitude, has correspondingly lost authority.
— From “Digital Ground” by Malcolm McCullough
A few decades ago, speaking about the potential of Virtual Reality, futurist philosopher Timothy Leary observed that “the idea for you to trap yourself in a 300 horsepower vehicle, emitting toxic waste and fighting the freeways, or worse, fighting New York traffic to lumber and bring your body to a place where you’re going to do mind work, ranks down there with cannibalism.”
Since then, the number of miles Americans drive has grown at more than double the rate of population growth. And even though more than half the area of our cities is covered in roads and the US spends $30,024,236,000 annually on national highway improvements, people’s mobility is perpetually on the verge of grinding to a halt. Not to mention the fact that fossil fuels keep throwing the world into one crisis after another.
In a simultaneous development, we now routinely transport our simulated “bodies” to alternate online worlds where, besides socializing, we do most of our mind work in an interconnected space shared by 1.5 billion internet users. Yet as our activities keep migrating from the physical realm to this emerging digital infrastructure, we have yet to grasp its power to transform the world to a far greater extent than is commonly realized.
In this series of posts I will try to explore some possible consequences of the rapidly advancing virtualization of the world which can, at least partially, be measured in terms of its potential to free up space in the “real world.” Next I will try to demonstrate how the virtual and the actual worlds will continue to complement each other and eventually will become the integrated spaces of the future where the atomic and the digital will converge.
In the early days of computing, in pursuit of a good yarn, I stumbled upon the idea that motion pictures of the future might one day forego physical reality and be generated electronically. In the resulting screenplay that was based on this idea, this technological breakthrough caused an uprising in the ranks of the town’s filmmakers and stars, challenging their very notion of what it means to be human.
Needless to say, at the time the powers that be in Hollywood were at a complete loss about the concept. But a few decades later, computer-generated films proved to be among the most successful genres at the box office, and today even so-called “live-action” blockbusters are largely software-based. It won’t be long before the motion picture industry is virtualized to the point where film crews no longer have to scour the earth in search of cheap locations and tax incentives, from Romania to the Philippines, while lugging around truckloads of heavy equipment, star trailers, traveling kitchens, portable toilets and so on. Instead of relying on logistics more fitting for military operations than for the creation of cinematic illusions, movies of the future will primarily be conceived in the digital domain, allowing for much greater identification, heightened immersion and game-like interactivity.
Along the way I had the opportunity to produce the world’s first catalog of sampled sounds for Emulator's electronic keyboards, storing the musical instruments of an entire philharmonic orchestra on a pocket-sized floppy disk. It would become the seminal moment in my understanding of how computer technology would continue to transform the world. Within a year or so many television shows and low budget movies were no longer scored by live musicians but by individual composers in their home studios. Simulated symphonic film scores emanated from their keyboards, enhanced by an occasional violin overdub to give large string sections a more acoustic feeling, or infused with human breath blown into a plastic tube attached to the keyboard to add life to the sound of a sampled saxophone. As is usually the case with technology-driven progress, people’s fears that this breakthrough would render musicians obsolete did not come true. But it was yet another important step in the rapid virtualization of the culture, which was inevitably enriched by putting an otherwise inaccessible musical palette in the hands of numerous talented musicians at a minimum cost.
Meanwhile, the digital age is in full force on other fronts as well, as search engines promise to give everybody access to the aggregate knowledge brought forth by human culture. Recently, I had some firsthand experience with a number of institutions whose existence appears to be under siege due to the public’s changing relationship with all this information. For a project I’m currently working on, I visited the Rem Koolhaas-designed Seattle Public Library, a gorgeous architectural ode to the book that expresses great optimism about a culture worth saving. But in reality, the library holds only 780,000 books, all of which could fit on one external hard drive on sale for $240 at your local electronics store.
Although the librarians don’t like to talk about it, they suspect that at this juncture the future of the book is hanging in the balance. Their apprehension is based on technological developments that are beginning to turn their profession upside down by offering people unprecedented online access to the very information that was once their exclusive analog domain. They suspect that their own livelihood may be in danger once Google launches its advanced search engines which according to the company will function “like reference librarians with complete mastery of all human knowledge,” providing people with search results far beyond what’s possible today.
Today, large-scale book digitization projects are well under way. Copyright complications aside, this is not nearly as intimidating a proposition as it might seem. Just consider the following statistics, which take the Library of Congress as their starting point:
- The Library of Congress is the largest print library in the world, with a collection of 26 million published works making up the majority of all existing books, about half of which are in the English language
- This may seem like a lot of books, but in the digital age it doesn’t represent much data. By comparison, an amount of information equal to the text of all existing books is posted online every two months
- If we consider that it would take one person roughly a year to digitize 3000 books, this means that all 26 million titles can be scanned by the population of Detroit in the course of one long weekend
- In terms of computer storage, the average scanned book takes up one megabyte, adding up to twenty-six million megabytes in total. This means that the combined text of all published books amounts to just 26 terabytes of data, which can be stored on a server taking up less shelf space than the 32 volumes of the combined Encyclopedia Britannica.
- To put it differently, the 650 miles of books stored on the stacks of the Library of Congress (roughly covering the distance from Chicago to New York) can be collapsed into a few feet of digital storage space
- Obviously this does not mean the demise of the Library of Congress but the launch of a parallel digital archive which will make these books universally accessible and conducive to search
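The back-of-the-envelope arithmetic above is easy to verify. The sketch below uses the post’s own figures; the only added assumption is Detroit’s population, taken here as roughly 900,000, which the post does not state:

```python
# Sanity-check of the Library of Congress estimates above.
# Inputs come from the post itself, except DETROIT_POPULATION,
# which is an assumed rough figure (~900,000).

BOOKS = 26_000_000            # published works in the Library of Congress
MB_PER_BOOK = 1               # average size of one scanned book's text, in MB

total_mb = BOOKS * MB_PER_BOOK
total_tb = total_mb / 1_000_000          # 1 TB = 1,000,000 MB
print(f"Total storage: {total_tb:.0f} TB")   # 26 TB

BOOKS_PER_PERSON_YEAR = 3000  # one person's digitizing throughput per year
DETROIT_POPULATION = 900_000  # assumption

books_each = BOOKS / DETROIT_POPULATION
days_needed = books_each / (BOOKS_PER_PERSON_YEAR / 365)
print(f"Books per person: {books_each:.0f}")
print(f"Days needed: {days_needed:.1f}")     # about 3.5 days, a long weekend
```

At roughly 29 books per person and 8 books per person per day, the job indeed comes out to about three and a half days, so the “long weekend” figure holds up.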
Thus the Wisdom of the Ages, once passed on verbally from one campfire to the next, then copied in longhand, and finally published in print, will soon be liberated from its heft. Digitized and online, its contents will be saved from obscurity by being instantly available to all Internet users.
Also see Part II and Part III