Have traditional mass media models influenced the development of the Internet? This paper examines several ways in which the Internet and the World Wide Web have been described in terms of historical media, and how the Internet and the Web differ remarkably from these models. Thinking in the terms of traditional media has created misinterpretations of the Web, built from the conventions of past media.
Why reinvent the wheel?
Why reinvent the process?
Why reinvent the message?
Whenever a new medium has emerged, both society and industry have used the models of former media as measuring sticks, comparing the new entrant to the standards, successes, and failures of the old. It is natural to reach for the familiar - the comfortable - when making comparisons to new elements in life. In the past, mass media such as print, film, radio, and broadcast television developed, in part, out of both the inefficiencies and the gold-mine prosperity of pre-existing media. With the development of multimedia (CD-ROM content, interactive film and video, PC and console games, edutainment, movie-ride films, interactive kiosks, virtual reality devices, etc.) we saw all of the pre-existing media types come crashing, or converging, together into a collage of media elements. The Internet - more precisely, the World Wide Web - seemed the logical extension of this convergence theory when introduced in 1991: all of the power of the past media, which the multimedia industry was attempting to leverage, could now be accessed via a global network. The multimedia industry that promised the culmination of all media into an integrated, interactive experience now had the ideal media vehicle to deliver this experience into homes, schools, and businesses world-wide. The medium of the Web was now the message for a blossoming new-media industry. When writing on digital media convergence theory in 1994, I espoused the message whole-heartedly: "Media convergence is the inevitable by-product of the digital evolution". Many other writers (and much more notable ones, may I add) championed the same ideas. Negroponte observed this convergence in the late 1970s when designing the mission statement of the M.I.T. Media Lab:
"All communication technologies are suffering a joint metamorphosis, which can only be understood properly if treated as a single subject, and only advance properly if treated as a single craft".
There are numerous writings on media convergence and the Internet - so many that it is virtually a prerequisite to wax nostalgic about the "metamorphosis" when writing a text on new media. In many ways, convergence is both a true and sound theory: media have joined together in many ways, and it could be reasonably argued that the Web is the most "multi" of all media types. The problem is not within the theory itself, but with the by-products of the convergence theory. By continuing to use the models of past media as our measuring sticks, we continue to develop the new media of the Web within the standards and ideas of those past media. From the production of tools and content to the delivery and experience, we think of the Web in film, radio, television, and print terminology. The focus of this article is to open a discourse on why this may be, at the least, a myopic perspective and, at the worst, a roadblock for the further development of the Web as a mass media. In short, convergence ideology - thinking in the terms of traditional media - has lulled us into a mind-set of historic perspective, creating a black box for the Web, built from the conventions of past media.
Why reinvent the wheel?
My mother tells a wonderful story about her first experience watching a television and how it visualized many of the things she saw in her mind's eye when listening to her favorite weekly radio broadcasts. From a content, business and production perspective, television and radio shared many of the same elements: similar narrative structure, advertising models, format, and schedule. Radio worked. People enjoyed it by and large and radio advertising was, overall, a viable form of revenue. Of course, television brought the visual benefit missing from the radio broadcast - an undisputed and revolutionary value-add.
During the development of television and radio, the real access-point problems were two-fold: affordable receivers and space on the frequency range (i.e., available channel space). The modern equivalent to the television or radio receiver for us is the computer, and a case could be made that the Web shares the same access problems as television and radio. However, the computer offers much more than just an antenna for reception. Frequency range - as a concept - is currently not as significant an issue for the Web as it was for television and radio. In television and radio, channel space is similar to Manhattan real estate. Gaining broadcast space in these mediums is no small endeavor simply because there is a finite range of space to transmit in, i.e., there are only so many channels on the dial. However, the notion of the 'receiver' as a barrier to accessing the media is just as significant an issue for the Web as it was for TV and radio.
The development of the Internet shares a startling number of similarities with the development of radio. What is most important about examining the parallels in these continuums is not their collective milestones in content, but the likeness of the growth of the mediums themselves. To restate: the issue is not the content. In this context of media analysis, content, as McLuhan states so poetically, "is the numb stance of the technological idiot. For the 'content' of a medium is like the juicy piece of meat carried by the burglar to distract the watchdog of the mind". This is as true of the Web as it is of radio, television, and film - the medium is the real story. As Sun Microsystems espouses, "the network is the computer".
There are multiple parallels between the early adopters of radio and the Internet; essentially, these pioneers fall into two groups: the professionals and the hobbyists. The major players of early radio history, such as Westinghouse Electric and Manufacturing, General Electric, and the Radio Corporation of America (RCA - which would later dominate the industry), can be compared with present-day corporations such as Microsoft, AOL, Netscape, and CNET. The hobbyists/amateurs share many of the same traits as the early adopters of the Web, the hackers of the digital era. These users came to their respective media not out of profit margins and market share, but from a desire to explore the cutting edge of technology. For both radio and the Web, getting "online" was not an "if" but a "why not?" Domain names were easily had in the first years of the Web, much as attaining a license from the U. S. Department of Commerce to set up a transmitter was easy to do during the boom of radio - the main obstacle being a knowledge of Morse code. The radio hackers read Wireless Age with starry-eyed fascination, and the main hubs of development were garages and college campuses. This was a new frontier, and the excitement of communicating with someone "over the ether" was a major thrill - not much of a stretch if you recall the feeling of receiving your first e-mail. A notable parallel can also be found in the way the U. S. government responded to both radio and the Web. Both media quickly out-ran the U. S. legal system. At first, radio fell into the lap of the U. S. Commerce Department; then the Federal Radio Commission was created in 1927. Driven by greater regulatory needs, Congress passed the Communications Act of 1934, which created the Federal Communications Commission - an authoritative body that would serve as the basis for communications law. Out of a similar demand for regulation of the Web came the controversial U. S. Communications Decency Act (CDA) in 1996. Designed to prohibit distribution of indecent materials over the Internet, it received an injunction against its enforcement only three months after its inception. The Supreme Court ruled it unconstitutional in 1997. These events describe not only the growing pains of a new medium, but the meaning that these media began to hold in the public eye. In essence, each of these Congressional acts illuminates how radio and the Internet emerged from the garages and university labs; they had become big business and as such were required to operate by the rules of industry, no longer merely the playthings of hobbyists.
Why reinvent the process?
The film industry introduced three key elements to mass media: specialized and expensive production, distribution, and transmission. Although the influence and balance of this production process has changed a great deal since the birth of film in the late 19th century, it is still the de facto standard for the development of a film project. Like much of the new media terminology and dialect, this production process was adopted first by the CD-ROM and gaming industries and now by the commercial Web development community.
From staff titles to schedules to project philosophy, many commercial Web development "studios" follow the film-production model. There are producers, directors, writers, editors, designers, technical specialists, and personalities - be it a well-known columnist, chat moderator, or a fictional character. Projects move through a pre-production/planning stage to a production mode, where work is often displayed on a staging server before going "live". When production wraps, the project launches on a specified date at a specified time, not unlike a premiere for a feature film. As someone who has worked for multiple Internet start-ups, it's easy to see how the start-up in this context can be illustrated as the independent filmmaker: working on a minimal budget, under tight deadlines, and often against impossible odds. Following economies of scale, as the budget increases, so does the ability to increase resources, staff, and development time. Overall, the pattern remains fairly constant: throw more money at it and it will grow and get better. This is the lesson of Hollywood, and it has not been lost on the Web development community.
One model is not always better than the other, but again, it is through the media - each of these media - that we begin to understand the content and effect of the communication. If the new media industry is satisfied to follow the pattern of the past, following the production models of the established media, it must also be content to live within the same ceilings and boundaries of those former media types. This is the Pandora's box that frustrates so many. I often hear people disappointed by the Web's inability to reach the threshold, speed, and professionalism of film, radio, and television. At a recent Web community event in San Francisco, a member of the audience asked one of the panelists about the bandwidth problems currently existing on the Internet: "When will it be like TV?" This elicited a round of laughter from the audience, but it quickly faded as everyone there awaited the answer. The emotion in the audience was crystal clear; they all wanted to know: tell us, when will it be like TV?
Why reinvent the message?
The parallels between the Web and the print industry are many; however, if I had to choose the single greatest benefit the Web provides over traditional print media, it is this: immediacy. In short, there are no newspapers or magazines that can produce content 24 hours a day. Print is constrained by production and time. Frequency of distribution is limited to how much you can spend on production and publication. Again, the constraint is on distribution of the information: the range of distribution via print is directly scaleable with the amount of money you put into publishing the information. Along with immediacy, the Web provides the ability to deliver a tremendously larger amount of content than traditional print. This is often taken for granted, but it is extremely important.
In many ways, the Web is to print as cable television was, and is, to broadcast television. The message remains constant and the core element is the same, but the medium has been modified. Consider the following facts about cable and the Net as proof. First, one of the largest successes of the Internet to date is e-mail. Second, it is without question that the primary content on the Web is text, plain and simple. Third, the benefits cable television provides are not strictly premium content, such as HBO, but much more choice of content. Fourth, the cable industry emerged because of a need for better distribution, developing in the late 1940s and early 1950s in mountainous or geographically remote locations where traditional, over-the-air broadcast transmissions were very poor. These juxtapositions between cable and the Internet illustrate that they are not so much a convergence, or a new breed of content, as an extension of the existing machine, driven by an unsatisfied market segment.
Did cable television drive out the broadcast industry? Although cable is available to 97 percent of television households in the United States, only 58.5 percent of American households subscribe. This is a lesson for the print industry. The Web will not destroy print; quite the opposite: like the relationship between the broadcast and cable industries, print and the Web will support one another in many ways, most of which are still unseen.
There is another, older historical perspective on this: the printing press. Although we often credit the birth of print to Gutenberg, it was actually an aggregation of ideas and events that moved printing into a mass media. The printing press was an ensemble of inventions in one place - technologies known for centuries before Gutenberg. The ensemble included the wine or olive oil screw-type press, oil-based inks, block-print technology (known in Europe since the return of Marco Polo from Asia at the end of the 13th century), and Gutenberg's own development of a punch and mold system which allowed the mass production of the movable type used to reproduce a page of text. But even with Gutenberg's printing press, there was still very little change in the medium. This stemmed from the poor organization of book distribution. The market was available, as was the potential for utilization, but the transport, control, and "advertising" systems were not in place. In addition, there was still a very low literacy rate in Europe; most Europeans could not read at all. But many events helped the print medium. One was the availability of paper. Before the advent of the printing press, books were made of vellum (lamb or calf skin) because of its durability. The problem was that, for printed books, vellum was too costly to produce and use for mass distribution. However, at the time there was a large surplus of rag paper, made from the (literally) tons of clothing left over from the massive numbers of dead caused by the Plague in the mid-14th century. This surplus drove the price of rag paper down significantly (before the Plague, rag paper was an expensive commodity) and therefore provided an affordable, accessible medium on which Gutenberg could use his tool to print and distribute information.
Distribution was greatly aided by events such as the Frankfurt Book Fair. Frankfurt was an early hub for printing, so the city sponsored a book fair which drew publishers, booksellers, collectors, and scholars. The fair also produced a catalog of all the works shown - an early Books in Print. Other important events include texts brought from Byzantium during the Crusades, and the multitudes of tracts, bibles, and religious information published by the Church. In addition, people who could not read were still exposed to book culture via entertainers and street minstrels who read from books at plays, carnivals, and marketplaces. All of this helped to instill and align supply and demand.
But the greatest similarity was the "information overload" effect of the Gutenberg breakthrough. Approximately 50 years after Gutenberg's first printed Bible, printing presses were disseminating product throughout Europe, producing commercially in over 110 cities. Near the end of the 15th century, there were roughly ten million copies of books available, with a readership population of only a few hundred thousand. For the literate, the amount of information available was thoroughly overwhelming - a situation that confronts the Web today. Digital smog, data overload, information saturation levels: these are the terms we assign to the barrage of content pushed, pulled, and converging upon us as we browse the Web.
Further evidence of the union of media, tools, and product can be found in one man's attempt at a solution to Gutenberg's unstoppable content machine. Agostino Ramelli, an Italian engineer working for the King of France, designed a "reading wheel" in 1588. In his descriptions, Ramelli explains how "a man can see and turn through a large number of books without moving from one spot". It was, quite simply, a browser. It allowed books to sit open-face in a Ferris wheel-like machine that could be rotated by the reader as he browsed through the various texts. Again, we return to the medium. Like cable and like the Web, the reading wheel provided a way to access more content in a world saturated by information. In this light, Ramelli's reading wheel can be seen as the great-great-grandfather of the Web browser.
So what then has history taught us about convergence? Most certainly, it has shown that media travel through a cycle of invention, access limitations, and information overflow. If there is a convergence theory that has lasted over the history of media, it applies to the convergence of revenue models and profit motives - market tides that ebb and flow, changing and reinventing as the media matures. The problem is not within the theory itself, because in some ways, such as economic convergence, it has had tremendous success. The problem rests with the hybrid by-products of the convergence theory, and those by-products are the models used by other mass media to create, develop, and disseminate information. By continuing to use the models of past media as our measuring sticks, we continue to develop the new media of the Web within the standards and ideas of those past media.
The solution is simple: give the Web its own model. This does not mean clean the slate; it means give the Web its due. The key to defining this model is use. As architects of this media, we build, shape, and produce based on how people use the Web. Why has Yahoo! remained at the top of the Web traffic lists? They help people find information. Why is e-mail tremendously successful? It helps people communicate. These two pieces are, in my belief, the pillars of the Web media model: accessible information and efficient communication. Hardly revolutionary in concept, but difficult to successfully implement in practice. There are evolutionary threads to this proposed model. Information organization is an entire science with many forefathers. For years, libraries, card catalogues, indices, encyclopedias, almanacs, guides, and menus have provided users with a "map" to the information they may be searching for. From the late 1950s, using computers to transform the traditional methods of information organization was a major focus of J. C. R. Licklider, the first director of ARPA's Information Processing Techniques Office and one of the key figures in the conceptualizing of the Internet.
Similarly, the advances in communications in the last century helped to foster the development of e-mail. Landmarks in the telegraphy, telephony, and cable industries continually pushed the envelope in efficient and affordable commercial and personal intercommunication. Without these - most notably the massive telephone infrastructure established by AT&T and Bell - global distributed networking (and thus e-mail) would have been extremely difficult to realize. Ironically, the ARPANET (the computer network that gave rise to the global Internet) was never designed to be a messaging communications system; its creators had designed it for mainframe computer resource sharing between universities. This quickly changed with developments such as Ray Tomlinson's SENDMSG program, which demonstrated a method for sending message files across a networked system (he is also the person who came up with the "@" sign to separate the user's name from the machine). This was a revolutionary change that propelled the ARPANET into phenomenal growth and development.
But what then is still to be learned if we have such forthright and robust precedents? Quite a bit. Quite a few bits, actually. On the immediate horizon are great changes for Web information organization in the form of the DOM (Document Object Model) and XML (eXtensible Markup Language). The DOM is a specification from the World Wide Web Consortium (W3C) which details how browsers should interpret HTML, Cascading Style Sheets, and XML documents. In essence, the DOM provides a clear and well-defined interface to describe, program, reference, and build a Web page. One of the most anticipated applications of the proposed DOM is XML. According to the W3C, XML is "a common syntax for expressing structure in data". XML allows content providers to define, validate, and share each piece of content on a page in an organized structure that will allow people to use the information in countless ways. Moreover, it will allow developers to form site-specific meta-files that will provide a virtual card catalogue of their site. This is not a run-of-the-mill method for Web site mapping, but a way to give developers the functionality to allow their site's information to be easily, intuitively, and quickly utilized. Along these lines will be developments in data sets that are structured and relative. As the Web begins to form in an organizational media model, theoretically moving from the chaotic to the complex, users will no longer be satiated simply by the amount of data offered on the Net. Rather, they will demand that their data be both quantitative and qualitative. This will require data (research reports, news stories, financial information, images, sounds, etc.) that is intelligently relative to other significant data. This will be a true realization of the non-linear origins of the hyperlink: one piece of data may link to ten others below, ten to the side, and ten above, but all relative in some way (chronologically, thematically, financially, etc.) to the other data - think of a map of a distributed network where nodes connect web-like to other nodes within the network.
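The promise described above - self-describing content read through a well-defined programmatic interface - can be sketched with a few lines of Python and its standard DOM implementation. The document and its element names ("report", "headline", "related") are hypothetical, invented purely for illustration; the point is that any program, not just a browser, can traverse the structure.

```python
# A minimal sketch of XML plus the DOM: content marked up with
# self-describing tags, then read through a standard tree interface.
# The element names ("report", "headline", "related") are hypothetical.
from xml.dom.minidom import parseString

doc = parseString("""<report date="1998-04-06">
  <headline>Telco empires emerge</headline>
  <related theme="deregulation">The CDA: Case closed</related>
</report>""")

# The DOM exposes the document as a tree of typed nodes, so the same
# piece of content can be located by its structural role, not its layout.
root = doc.documentElement
headline = root.getElementsByTagName("headline")[0]

print(root.getAttribute("date"))   # prints "1998-04-06"
print(headline.firstChild.data)    # prints "Telco empires emerge"
```

Note how the "theme" attribute on the related item hints at the relative data sets discussed above: a link can carry the reason for the relation, not merely a destination.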
The last few years have provided evidence for this notion: debacles such as the browser wars and the Bells' lock on regional markets, and, on the opposite end, triumphs of user-supported, free products such as Apache and Linux.
The business power of the Web will skyrocket from deregulation and aggressive competition. Think TCI, Netscape, Bloomberg, and Qwest/LCI - companies that are simultaneously converging and decentralizing, building new revenue streams while continuing to expand in their traditional vertical markets.
Using the past - learning from experience and folding those ideas into the creative stream of original creation and development - will progress the Web into a new type of media model, not merely one emulating or juxtaposing old media elements. The Farnsworths, Marconis, Edisons, Bells, Nipkows, Turners, and Gutenbergs of history have paved a phenomenal road for us; it is now time to learn from these concepts and build our own.
About the Author
Bill Hilf lives in San Francisco where he works for C|NET: The Computer Network as Web site engineer for NEWS.COM (http://www.news.com). He holds an MA from Chapman University where he researched digital media, computer science, and information theory. You can find his thesis work and other information about Mr. Hilf at WebWonk (http://www.sirius.com/~webwonk).
1. Stewart Brand, 1987. Media Lab: Inventing the Future at MIT. N. Y.: Viking Press, p. 11.
2. A recent survey conducted by research company CustomerSat.com in conjunction with Inter@ctive Week supports this concept by showing a direct relation between site budgets and traffic to the site.
- With investment of USD10,000 or less, 60 percent reported 100 visits or less per day to the site.
- Investment of USD50,000 to USD100,000 saw traffic increase accordingly, 53 percent reporting traffic between 101 and 1,000 visits and 14 percent between 1,001 to 10,000 visits.
- 6 percent report 10,001 to 100,000 visits per day.
3. The caveat to this is software development. As Web development becomes more technically driven and technology-dependent, many companies build their Web "applications" within the model of traditional software development.
4. Ramelli never built the reading wheel; it existed only "by design" in his Le diverse et artificiose machine del Capitano Agostino Ramelli (The Various and Ingenious Machines of Captain Agostino Ramelli). The reading wheel was finally built in the mid-1970s for a film, and later recreated for an exhibition.
5. Katie Hafner and Matthew Lyon, 1996. Where Wizards Stay Up Late: The Origins of the Internet. N. Y.: Touchstone.
6. In brief, Complexity theory questions how and why large systems behave in ways unexplainable by the sum of their parts. See Complexity and Chaos Theory Resources below.
Early Internet Business - Stories from Early Radio
Is It Really Gutenberg All Over Again?
The History Of Cable Television
Surfing the Aether - Radio & Broadcasting Technology History
Todd Lappin, "Déjà Vu All Over Again", WIRED 3.05
Digital Entertainment: Art, Technology, and the New Forms of Storytelling in the Digital Era
The Evolution of ARPANET e-mail
History of Telecommunications
Telco empires emerge
The CDA: Case closed
Complexity and Chaos Theory Resources
Copyright © 1998, First Monday. Media Lullabies: The Reinvention of the World Wide Web by Bill Hilf.
First Monday, Volume 3, Number 4 - 6 April 1998