First Monday

Cyberinfrastructure and Innovation Policy by Brian Kahin

The National Science Foundation’s cyberinfrastructure initiative evokes the federal program and policy initiatives around the Internet 15–20 years ago. Although there are important differences between the leading edge platform infrastructure of the Internet and cyberinfrastructure’s focus on knowledge, community, and resource integration, both involve boundary–crossing that provokes interaction with the social and economic environment. This interaction has led to new forms of technology–enabled “private ordering” and “virtual organization.”

The Internet standards (and therefore the Internet as infrastructure) were the product of a legendary virtual organization, the IETF, which was supported by research funding for science–based data communications. This proved a powerful but unspoken standards policy that ultimately triumphed over the official standards setting process for data communications. Cyberinfrastructure remains very concerned with standards, but with adopting open international standards to ensure interoperability rather than pioneering them.

Public research, infrastructure development, and standards are closely allied as elements of innovation policy in IT. The other key element, the patent system, looks like it should be an aspect of knowledge infrastructure, but it operates very differently, especially in the IT sector where an overabundance of questionable patents has led to failure of public disclosure — a principal purpose of the patent system — and growing tension between patents and both standards and infrastructure development. Bridging this gap may be the greatest challenge facing the evolution of cyberinfrastructure.


Infrastructure development as innovation policy
Infrastructure for science
Telecommunications infrastructure
Infrastructure for commerce
A model for innovation policy
Standards as innovation policy
East Coast/West Coast
The political economy of knowledge and information infrastructure
Aligning patents and knowledge
The changing innovation policy landscape
Patent reform as innovation policy
Cyberinfrastructure as innovation policy
Patents and infrastructure
Reconciling the power to enable and the right to exclude



The National Science Foundation’s cyberinfrastructure initiative suggests that information technology will play a renewed transformative role, not just in research and education but in supporting collaboration and innovation throughout the economy. Cyberinfrastructure can provide support for aggregating, organizing, enhancing, validating, and legitimating expertise — wherever expertise is valued or needed. Like the earlier federal investment in Internet technology and infrastructure, cyberinfrastructure raises questions of sustainability, institutionalization, commercialization, and privatization. But cyberinfrastructure functions at a higher level of social and economic interaction that is more centered on knowledge and process than on information and content — and therefore more concerned with patents than copyright.

The NSF vision of cyberinfrastructure builds on the success of the Internet, including the historic role of NSF in supporting high–performance networking, supercomputing, databases, and applications. Yet the investment case for cyberinfrastructure must be made in the shadow of the commercial Internet’s revolutionary achievements and seemingly infinite potential. Beyond its advanced nature, large scale, and dedication to science, how does the vision of cyberinfrastructure differ qualitatively from the vast interoperating complex of hardware, services, and applications built around the Internet? How is cyberinfrastructure transformative — rather than merely helpful? What is the practical relationship between cyberinfrastructure and market–driven advances in the private sector — or the huge public investment in the federal government’s information infrastructure?

Innovation and competitiveness have gained renewed attention in the past few years. Spurred by reports such as Innovate America, the report of the National Innovation Initiative of the Council on Competitiveness [1], and Rising Above the Gathering Storm from the National Academies’ Committee on Science, Engineering, and Public Policy [2], the Bush Administration launched the American Competitiveness Initiative (ACI) in early 2006 [3]. Although ACI advocates increasing household uptake of broadband, neither ACI nor the reports advocate enriching and enhancing the functionality of information infrastructure, despite the conspicuous role played by the Internet as a platform for innovation and increased productivity [4].

But that is precisely what the National Science Foundation proposes to do — not as a policy initiative but as a program to develop and research advanced cyberinfrastructure. NSF’s cyberinfrastructure program has been re–launched in the office of the NSF director, and the final version of a long evolving vision paper has been released [5]. While the program is focused on research and education, the potential applications, and implications, of cyberinfrastructure are not confined to NSF’s core constituency [6].

The cyberinfrastructure initiative builds on a longstanding NSF commitment to computer–supported cooperative work, including a two–year program on “knowledge and distributed intelligence.” While the cyberinfrastructure program is not a policy initiative, it is supportive of innovation policy: a commitment not only to developing advanced infrastructure but to understanding how advanced infrastructure affects research and innovation. Significantly, NSF has also just launched a program on the Science of Science and Innovation Policy (SciSIP) [7].

As noted above, the report of the National Innovation Initiative does not address cyberinfrastructure per se, but it sets forth a uniquely broad and holistic vision of “innovation infrastructure” complementary to the cyberinfrastructure program’s emphasis on collaboration and virtual organizations. It defines infrastructure as “the physical and policy structures that support innovators, including networks for information, transportation, health care and energy; intellectual property protection; business regulation; and structures for collaboration among innovation stakeholders.” In particular, it recognizes the importance of standards, middleware, and a pragmatic view of intellectual property:

[Intellectual property] collaboration is becoming an increasingly critical tool for IT innovation as well. No single organization has the scale to build today’s complicated systems, but a single entity can inhibit or block access to IT networks through control of patent portfolios and prohibitive rents. More broadly, the need for interoperability — linking the patchwork–quilt arrays of legacy systems within most large enterprises and between systems of distinct firms — has resulted in a shift towards open standards, coupled with development of new middleware tools to enable this connectivity. Standards, like TCP/IP — the transmission protocol that makes the Internet work — have created an extraordinary platform for innovation of new technologies, markets, industries and business models.

The protection of and global respect for IP are now more critical than ever. But optimizing for innovation will likely require an evolutionary but deliberate shift in IP systems and standards — including patent pools, open access databases, open standards, flexible and affordable cross–licensing, multi–jurisdictional patents and harmonized patent systems — that can be tailored to rapidly evolving technology and knowledge networks. [p. 15]

By contrast, the consensus political commitment to innovation policy remains centered on publicly funded research and privately held patents [8]. Under the model institutionalized by the Bayh–Dole Act, government–funded research and patents secured by universities are viewed as an innovation pipeline in which the three sectors play straightforward sequential roles: Public funding supports research at universities, and any practical outcomes are then patented and licensed to industry, usually on an exclusive basis, to develop and market [9]. The Bayh–Dole model seems especially suited to science–based biotechnology, which requires substantial additional investment to create and test marketable products [10]. The fit with information technology, especially software, where individual patents are normally a small part of a successful product, is less apparent [11].



Infrastructure development as innovation policy

By contrast, there is a long history of government involvement in different forms of infrastructure that are recognized as economically important. Of these, the Internet has had an especially dramatic effect not just on economic activity but on innovation, an impact rivaled only by the electric grid.

The federal role in the development of the Internet extended from DARPA’s early support for the packet–switching technology to the NSFNET and the High Performance Computing and Communications Program, to the White House policy initiatives for “national information infrastructure” and “global information infrastructure” in 1993–1996, and finally the Framework for Global Electronic Commerce in the late 1990s. Ironically, a decades–long, federally funded research program was suddenly transformed into perhaps the most privatized and unregulated form of infrastructure yet.

The rapid transformation of the Internet came from the confluence and synergy of other forms of infrastructure — infrastructure for science, telecommunications infrastructure, and infrastructure for commerce. In turn, the Internet would transform each of these.



Infrastructure for science

Infrastructure for science was originally little more than shared assets within particular fields and disciplines, such things as costly instruments, databases, and printed journals. Although a product of computer science, research networking offered the prospect of a general–purpose infrastructure for all sciences. Early funding was community– or discipline–specific (ARPANET, CSNET, HEPnet), but the efficiency case for sharing networking capacity, especially for accessing costly general–purpose supercomputers, led to the creation of NSFNET as a common network for research and education in 1987. Compelling economies of scope and scale as well as political considerations urged constant broadening of the user base to include industrial research, public agencies, and K–12. NSF’s desire to see the NSFNET become institutionalized and self–sustaining, coupled with growing demand for extension in many directions, created a strong case for commercialization, which NSF sought to address by partial privatization [12].

The commercialization and privatization worked differently at different levels. The overall architecture of the NSFNET was opened up by liberalizing the acceptable use policy and by designing network access points that facilitated interconnection by commercial service providers [13]. TCP/IP technology was adopted in parallel by the private sector for corporate networks, because it enabled the interconnection of different kinds of local and wide–area data networks. The rapid uptake of nonproprietary TCP/IP led to commoditization of routers, which in turn fueled further growth of the Internet.

While the evolution of the NSFNET technology and infrastructure was too situation–specific to serve as a model for other forms of technology transfer, the default of open standards and free interconnection set precedent for the World Wide Web and for the commercial Internet that followed. The OECD would report: “Openness is an underlying technical and philosophical tenet of the expansion of electronic commerce. The widespread adoption of the Internet as a platform for business is due to its non–proprietary standards and open nature ... .” [14] The market’s embrace of openness was motivated by negative user experience with closed systems and expanded appreciation of user needs for choice and independence.



Telecommunications infrastructure

The Internet was a practical application of advancing computer science, propelled in part by the interest of computer scientists in making practical use of data networking. Yet the Internet was developed at a time when telecommunications providers, typically owned or closely regulated by the state, were also pursuing data communications — but following a different technological framework, Open Systems Interconnection (OSI), under development at the International Organization for Standardization (ISO).

Commercial e–mail and remote login services existed at the time the science–driven Internet was expanding, but these commercial services were not seen as offering a path to broadband. It was commonly expected that telephone or cable technology would evolve to support videoconferencing and movies on demand. While cable had brute capacity, the financial resources, inherent two–way capabilities, and regional scope of telephone networks made voice telecommunications the dominant model for advanced infrastructure. It was widely assumed that, as in the case of other forms of infrastructure, enormous top–down investments would be needed to lay fiber and upgrade the whole of the network, especially the “last hundred feet” to the home. Yet the Internet was able to work on top of the existing telecom infrastructures, using dial–up connections, leased lines, and eventually DSL.



Infrastructure for commerce

Although familiar forms of infrastructure were physical or hard–wired, established markets for goods and services also function as a form of infrastructure. Electronic commerce capitalized on well–developed markets for mail order and widespread acceptance of credit cards, but it made remote commerce more immediate, efficient, and global. It enabled consumers to do directly what was previously done through intermediaries, automatically creating order information and receipts. It was able to link advertising to sales and to radically expand markets for specialty and unique items, enabling almost anybody to open a store and sell to the world.



A model for innovation policy

The extraordinary success of the Internet shows the importance of advanced infrastructure for supporting innovation in the broadest sense, including remote collaboration, sharing of inputs, ordering of data, modeling and simulations, communication of ideas, and connecting research with its application. Its boundary–crossing capabilities offer a vehicle for rapid low–cost technology transfer that may be most appropriate when other barriers are low and there is little need for strong incentives. The rapid adoption and diffusion of the basic Internet technology took place because it did not require licensing, exclusive or nonexclusive. The computer scientists who developed its nonproprietary protocols as a large collaborative project were users themselves and wanted no barriers to inhibit future users who would add value by further enlarging the user and resource base. This free, software–defined, general–purpose technology created a market for off–the–shelf infrastructure that could be rapidly implemented by anyone. Instead of a Bayh–Dole pipeline based on exclusive licensing and the incentive for large investment, the Internet offered “massively parallel” transfer at a modest level of investment (and many opportunities for adding proprietary value) that did not need the inducement of exclusivity.

Cyberinfrastructure is not the coherent single–layer utility of the NSFNET but the aggregate of many interoperating functions and elements:

Cyberinfrastructure integrates hardware for computing, data and networks, digitally–enabled sensors, observatories and experimental facilities, and an interoperable suite of software and middleware services and tools. Investments in interdisciplinary teams and cyberinfrastructure professionals with expertise in algorithm development, system operations, and applications development are also essential to exploit the full power of cyberinfrastructure to create, disseminate, and preserve scientific data, information and knowledge. [15]

Cyberinfrastructure has to encompass data, communications, and content, but its focus is on meaningful integration, interpretation, and application of these and other resources. In short, cyberinfrastructure supports the creation, management, and use of knowledge — echoing the scope and mission of the research university, including its expanded role in continuing education, technology transfer, open education, and other public service. Unlike the federal agenda that accompanied the rapid expansion of the Internet in the early 1990s [16], the cyberinfrastructure vision has an explicit socio–technical dimension. As described by NSF Director Arden Bement:

At the heart of the cyberinfrastructure vision is the development of virtual communities that support peer–to–peer collaboration and networks of research and education. The sea change in the way science, engineering, and education are conducted involves more multidisciplinary work, greater collaboration, and a trend toward international connections.

These “boundary–crossing” experiences require more than technical knowledge and skills. They rest on competencies in collaborating and communicating across disciplines, distances, and cultures. [17]

Boundary crossing enables collaboration and the transfer of knowledge, but it is also the most conspicuously disruptive aspect of the Internet. It brings different regimes and practices into unaccustomed proximity, often with unpredictable, transformative results. Since cyberinfrastructure is constructed not just on technical specifications but within a legal, normative, contractual, and market framework, cyberinfrastructure–enabled boundary crossing allows a variety of forces — social, economic, political, cultural — to interact outside of their established confines and settings. In the business context, this disruption creates unexpected competition and drives “creative destruction” that paves the way for innovation. In a policy context, it may inspire innovation, but it is also likely to inspire defensive efforts to protect old business and policy models.

When the Clinton Administration launched its National Information Infrastructure initiative in 1993 (yet another NII, not to be confused with the 2004 National Innovation Initiative of the Council on Competitiveness), the implications of Internet–enabled boundary crossing were not well understood. There was already concern about maintaining boundaries around personal data and copyright–protected content, which had surfaced in other electronic contexts [18]. But there was little anticipation of the transformative effects on commerce, the range of pathologies and bad actors that the commercial Internet would attract, and the recurrent interaction with the broad spectrum of laws, regulations, and practices established for the physical world. The Internet and the Web had been designed as enabling infrastructure without regard for economic and social issues — and unconnected to any institutional framework for protecting or mediating among rights and interests, social, economic, or political.

Since most assumed that broadband would be driven by video on demand or two–way video communications, it was easy to think that any necessary boundaries or rules could be engineered into these more or less regulated networks, and that network owners might reasonably be held responsible for policing their use [19]. However, the end–to–end architecture of the Internet, its decentralized ownership, and its boundary–crossing nature made it difficult to impose controls under existing frameworks and institutions. Unlike cable and telephony, which were integrated, capital–intensive, and territorially defined, the Internet was an overlay that made very efficient use of leased lines, local area networks, dial–up telephony and other physical infrastructure already in place. Thanks in part to its academic roots, the Internet passed traffic across national borders with little regulatory inhibition [20]. It was not only insensitive to distance, it was global by default. Recognizing the boundary–crossing power of the Internet and the need for international leadership, the administration announced a Global Information Infrastructure initiative less than a year after the National Information Infrastructure initiative was announced [21].

With the embrace of globalization, the rapid expansion of the commercial Internet, and the boom in Internet services, a wide spectrum of policy issues came to the fore. Sensing the implications for world trade, the Clinton Administration launched yet another policy initiative in 1996, the Framework for Global Electronic Commerce [22]. In line with its view that the private sector would build the National Information Infrastructure, the administration advocated private sector leadership on policy combined with governmental restraint — on the grounds that the rapidly evolving digital environment made it difficult to develop informed public law. On the other hand, the World Wide Web facilitated private ordering, including proactive industry self–regulation, through labeling and software–enabled interaction. The World Wide Web Consortium (W3C) took the lead in developing “social protocols” such as the Platform for Privacy Preferences (P3P) and Platform for Internet Content Selection (PICS).

In many respects, the Administration’s model for enlightened private ordering was the Internet Engineering Task Force (IETF), the remarkably informal group of experts responsible for the rapid evolution of Internet protocols. A paradigmatic virtual organization enabled by the technology that it was advancing, the IETF operated outside the official standards development framework in the U.S. [Official standards development is done in committees formally accredited by the American National Standards Institute (ANSI), then passed to ANSI for review, and ultimately carried by ANSI to the International Organization for Standardization (ISO).] The IETF became known for its pragmatic culture of “rough consensus and running code,” its intensive use of the Internet to conduct its business, its free and open exchange of working drafts and adopted standards, and its requirement for two or more competing implementations before a standard became final. Those who built to the IETF standards could do so confident that the technology worked and would be freely available to any and all implementers and users. It was exemplary “private ordering” not in the classic sense of self–governance or self–regulation, but in collaborating at a distance to create a common enabling platform upon which different private implementations could compete [23].



Standards as innovation policy

Most importantly, the IETF succeeded in setting the de facto international standard for data communications. It was responsible for the unexpected triumph of the Internet protocol suite over the Open Systems Interconnection (OSI) approach developed and endorsed by major telecommunications providers, national governments, and the International Organization for Standardization (ISO) [24]. Its success represented a victory of computer science and open borderless collaboration over committee–driven international hierarchies.

This success was the result of an unarticulated, backhanded standards policy in which standards development was supported as a scientific enterprise through funding from public research agencies. Yet despite this success — or perhaps because of it — standards policy has not drawn increased attention. Standards as a whole are seen as technical and non–innovative and are therefore of little interest to either scientists or politicians [25]. Standards policy remains bound into an arcane institutional framework that serves for all technologies and all kinds of standards. At the same time, many IT standards seem to develop well enough on an ad hoc basis outside of this framework. The plethora of IT standards and their highly specific, market–oriented nature means that standards policy, such as it is, is mainly reactive, coming into play mostly in response to anti–competitive practices.

In line with the philosophy of restraint espoused in the Framework for Global Electronic Commerce, policymakers have been reluctant to enact legislation that might be perceived as regulating the Internet. The exceptions that prove the rule include several attempts to curb the availability of pornography, the Digital Millennium Copyright Act of 1998, and the CAN–SPAM Act of 2003. But although Congress may have been slow to legislate, judges confronted with contending litigants must make decisions and have repeatedly faced the peculiarities of cyberspace in interpreting laws designed for the physical world.



East Coast/West Coast

In Code and Other Laws of Cyberspace, Lawrence Lessig famously explained how the architecture of both physical space and cyberspace interacts with case law in regulating human behavior. In doing so, Lessig analogizes and contrasts law and software as two forms of “code,” describing them as “East Coast code” and “West Coast code” respectively. West Coast code is the “running code” of the IETF and the vision of enabling infrastructure embraced by the National Information Infrastructure initiative. The West Coast also represents the creative, entrepreneurial, and pragmatic frontier of Silicon Valley. East Coast code is the politically and institutionally grounded set of laws and regulations that has traditionally mediated and controlled human behavior, often under the political influence of established commercial interests.

Today, the interaction of technology with social and economic context is accepted, albeit not in terms of top–down government control. As Bement notes:

The challenge further encompasses an array of social, economic, and legal factors affecting, and affected by, cyber science. These include policies; norms of practice; and rules, incentives, and constraints that shape individual and collective action.

The key words here are “affecting, and affected by,” i.e., genuine interaction, not just impact in one direction or the other. Complex enough within the institutional context of basic science, the issues raised become far more challenging where commercial opportunity is present. They are also more challenging when the interaction involves fundamentally different professional perspectives, as in Lessig’s two kinds of code.

“Social protocols” such as P3P and PICS attempt to bridge the gap by wiring contractual terms into software. They allow users to define their preferences for reuse, so that future contracting is either automated or at least simplified.

Alternatively, the hard edge of legal code can be softened to accommodate the virtual environment. The “notice and takedown” provision of the Digital Millennium Copyright Act of 1998 limited copyright liability of infrastructure providers such as hosting services if they responded to requests from copyright owners to remove infringing content. Instead of the old alternatives of no liability (common carrier model) or strict liability (publisher model), the law provided for an intermediate process–based model.

Of course, everyone would like to extend the infrastructure they understand or control. From a vendor standpoint, it would be nice to extend market infrastructure into information infrastructure by linking identifiable machines to identifiable customers. Yet individuals may use multiple computers, and computers may be used by multiple individuals, and any “hardwiring” of individuals to machines raises privacy concerns that might keep people off the network or invite countervailing legislation. Instead, the connection between people and transactions has been managed through a variety of softer tools: cookies, registration, e–mail confirmation, IP addresses, and authentication.

Sometimes the gap is simply moved or reconfigured. The so–called “click–wrap” license is an attempt to standardize interaction on the merchant’s terms, but it can also be understood as institutionalizing a gap: The user agrees to terms without reading them because the likelihood that they will be enforced is outweighed by the burden of reading and making sense of them and the futility of negotiating alternatives. On the other hand, residual uncertainty on close issues led software vendors to push states to enact the Uniform Computer Information Transactions Act (UCITA) in the late 1990s to reinforce contract–based control over customers [26].



The political economy of knowledge and information infrastructure

In the sciences, the expansion of explicit knowledge is commonly understood as an unalloyed, positive good. Information infrastructure enhances the value of explicit knowledge by facilitating access, use, and re–use. The explosion of explicit knowledge, including data, papers, journals, discussion lists, Web sites, etc., in turn increases the value of information infrastructure. It helps when exchange takes place within a community, such as the academy, that values free and open exchange. The commonality of interest within the firm also makes a compelling case for sharing infrastructure and knowledge, as seen in the attention given to “knowledge management.” However, the firm encompasses a variety of professional and functional perspectives that sometimes compete for internal turf, and there is the perpetual concern that employees may leave the firm, taking with them knowledge acquired from within.

Today, the global aggregation of expertise and community made possible by the Internet has given rise to a variety of knowledge communities based on voluntary exchange. Furthermore, these communities may be very porous — in part because they recognize the futility of boundaries, but also because they recognize that sharing knowledge creates professional opportunity. This is most prominent in the world of open source software, where a powerful symbiosis has developed between the community and the business models of many companies.

Complementarities between different forms of knowledge abound, along with opportunities for hiding and manipulating knowledge for tactical and strategic advantage. The distance between explicit knowledge (including data and content) and tacit knowledge is greatly magnified by information infrastructure. Explicit knowledge flows more readily than ever, while tacit knowledge remains as “sticky” as ever. Yet the distance is misleading because the complementarity remains, along with the economic opportunities that it provides.

Markets for new knowledge, explicit or tacit, remain notoriously “thin,” beset by the classic paradox: You don’t know the real value of knowledge until you possess it, but then you no longer need it. Patents provide an answer to this paradox, by making it possible to share explicit knowledge while still controlling its use. Those that have patented knowledge can negotiate agreements with suppliers, customers, partners, and employees without having to rely on contracts to guard against leakage and misuse. This makes it easier for firms to look beyond their walls for functions historically performed within, including manufacturing and R&D. This form of “open innovation” allows firms to concentrate on what they do best and shed underperforming functions, by using the best available inputs wherever they may be found [27].

Unlike copyright, patents also protect against independent invention, so in principle, companies can exploit a technology in the open market without fear that they will face direct competition on the particular technology. From a policy perspective, this is a quid pro quo in which the patentee contributes knowledge to the public domain in exchange for a limited–term monopoly. This knowledge has to meet threshold requirements of novelty, utility, and nonobviousness as determined in a formal process of examination by the government.



Aligning patents and knowledge

This elegant trade–off appears to link the patent system to the growing public knowledge base of science and the infrastructure that supports it. But in practice, there are factors that work against the expectations of science–based knowledge infrastructure and that pose substantial challenges to strengthening, deepening, and broadening the connections between patents and scientific knowledge. The problems include:

Blindsiding: There is a blind period that extends from prior to application through the first 18 months that the application is in process. During this time, a very long period in certain industries, there is no way of knowing what a patent applicant is claiming. Although the blind period looks like the lag time in academic peer review, journal articles do not provide exclusivity, whereas a patent effectively vitiates work that may have been done independently. Moreover, formal academic publishing is preceded by shared drafts, conference presentations, and working papers, which establish priority while disseminating results expeditiously.

Low standards: While academic publishing adheres to rigorous standards of peer review, the nonobviousness standard in patents is keyed to the “person having ordinary skill in the art.” The patent examiner (who need only have a bachelor’s degree in science or engineering) is required to demonstrate that the invention would be obvious to this hypothetical journeyman; otherwise the applicant is entitled to a patent.

Constraint: While patents must provide enabling knowledge, they operate as a constraint against using the knowledge for 20 years from the date of filing — nearly an eternity in some sectors. By contrast, the information contained in a database or research paper is purely enabling.

Ambiguity: There are two distinct levels of knowledge within each patent: the disclosure, which is enabling technical information; and the claims, which are the legal parameters of the patent — the disabling information. In principle, the disclosure should be sufficient to support all of the claims, but this is a matter of legal interpretation that requires costly professional advice.

Uncertainty: Is the patent valid? There may be prior art that defeats the patent or at least some of the claims. Or it may be invalid for obviousness. A lawyer’s opinion on patent validity cost over $US13,000 in 2005 [28]. Other grounds for invalidity may come to light only in the course of litigation. There may also be other sources of uncertainty, such as who the current owner of the patent is — or whether the invention actually works as claimed, since working models or tests are not required.

Willfulness: Under patent law, patent owners may be entitled to treble damages if they can prove “willfulness” — i.e., that the infringer had knowledge of the patent. This enhanced liability leads attorneys to advise against reading patents.

In the IT sector where a product can contain thousands of patentable functions, the heavy costs and uncertainties of locating, evaluating, and interpreting patents overload the system and lead to failure of the public disclosure function. It makes strategic sense for patent holders not to assert patents until after producing firms have made investments that are put at risk by the patent [29]. Conversely, it also makes sense for firms that are threatened by patents to reserve knowledge (“prior art”) that might defeat the patents until the patents are asserted in litigation — for two reasons: first, stepping forward to defeat the patent may incur the wrath of the patent holder while benefiting competitors who are also threatened by the patent; second, the artificially high presumption of validity accorded to issued patents makes it tactically advantageous to show prior art for the first time in court [30].



The changing innovation policy landscape

Over the past fifteen years the innovation policy landscape around information technology has changed dramatically.

Research: The emphasis on science infrastructure is not a new direction so much as a coming of age of digital infrastructure, as reflected in moving cyberinfrastructure out of the NSF Directorate for Computer & Information Science & Engineering (CISE) and into the Office of the Director. Unlike other forms of science infrastructure, cyberinfrastructure is a general–purpose infrastructure whose broad scope raises special issues related to sustainability and exploiting common processes — as well as potential for commercial use.

Standards: The federal research agencies are no longer driving platform standards as they did for the Internet via the IETF. With the commercialization of the Internet and the Web, the push for new standards is coming from all sectors. Nonetheless, standards are critically important to cyberinfrastructure for achieving scale and scope, for reusing software and data, and for enabling interoperability and boundary crossing from researcher to researcher, virtual organization to virtual organization, discipline to discipline, jurisdiction to jurisdiction, firm to firm, and process to process. While the cyberinfrastructure initiative is not engaged with developing new standards, it is concerned with adopting the right standards [31].

Infrastructure: There is no agenda today for constructing a new form of quasi–public infrastructure as there was in 1993. Instead of creating a new utility, the cyberinfrastructure initiative is aimed at improving linkages, integration, and resource management within and across existing science infrastructure. Toward this end, it has a distinctive research agenda focused on support for discipline–specific data and knowledge, interoperation, workflow, middleware, the semantic Web, and virtual organizations [32]. Unlike the interagency initiatives of the early 1990s, cyberinfrastructure is manifestly an NSF initiative [33].

Patents: Today, patent reform is a high–profile issue before both Congress and the Supreme Court, with the IT and financial services sectors pitted against pharmaceuticals, biotech, universities, and the patent bar. There are major concerns about quality across the board, with IT especially vulnerable to hold–up and uncertainty [34].

These changes must be kept in full perspective. In the early 1990s, only the infrastructure initiatives were visible beyond the immediate communities of interest. Beginning in 1993, the Clinton Administration made information infrastructure a White House initiative. Today, only the debate over patent reform is widely reported, while cyberinfrastructure remains a single–agency initiative. Publicly funded IT research and IT standards remain low–visibility activities that attract little political interest.



Patent reform as innovation policy

Patent reform is not an initiative, but a public debate over how to reduce costs and risks within a mature innovation policy regime. As presented before Congress, it is cast predominantly in legal terms with little empirical grounding, but with strong anecdotal links back to standards, infrastructure, and other special circumstances and characteristics of IT. It highlights the contrast between the innovation ecosystem around IT, and the more patent–driven ecosystem around biotechnology, pharmaceuticals, and university licensing.

Two of the most contentious issues illuminate the differences between the IT and life science perspectives. One concerns the calculation of damages when the infringed component is a small part of the marketed product. The IT sector favors framing damages to show that an infringing function may be a very minor part of the product (whereas a patent on a drug may be close to the value of the marketed product).

The other is the timing of proposed opposition proceedings. The life science industries would like to limit oppositions to a nine–month window after the patent is granted. This may be reasonable in an environment where innovators pay attention to patents as they are published and granted. IT companies want to be able to oppose patents at any time, because, for reasons noted above, IT firms do not look at patents as they emerge.

A recent case that illustrates both problems is the $US1.5 billion verdict that Alcatel won against Microsoft for Microsoft’s use of MP3 technology in Windows. Microsoft had already paid a flat $US16 million to license MP3 technology from the Fraunhofer Institute in 1994, and like all others in the industry thought it was clear to use MP3. By waiting to litigate, Alcatel let Microsoft accumulate liability. The $US1.5 billion verdict looks grossly disproportionate to MP3’s value within the sprawling complexity of the Windows operating system — as well as to the licensing fee initially paid for MP3.
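The damages dispute lends itself to a back–of–the–envelope sketch. The figures below — product revenue, the component’s share of product value, and the royalty rate — are purely hypothetical assumptions chosen for illustration, not data from the case:

```python
def reasonable_royalty(revenue_base, royalty_rate):
    """Damages computed as a royalty applied to the chosen revenue base."""
    return revenue_base * royalty_rate

# Hypothetical assumptions: $30B of product revenue over the damages
# period, the infringed component accounting for 0.1% of product value,
# and a 5% royalty rate.
product_revenue = 30e9
component_value_share = 0.001
royalty_rate = 0.05

# Entire-market-value base: royalty on all product revenue.
entire_market = reasonable_royalty(product_revenue, royalty_rate)

# Apportioned base: royalty only on the component's share of value.
apportioned = reasonable_royalty(product_revenue * component_value_share,
                                 royalty_rate)

print(f"entire market base: ${entire_market / 1e9:.1f}B")  # → $1.5B
print(f"apportioned base:   ${apportioned / 1e6:.1f}M")    # → $1.5M
```

Under these assumptions the two bases differ by a factor of 1,000 — which is the crux of the IT sector’s position on damages reform, and of the life science sector’s resistance to it, since for a drug the component’s share of product value approaches 100 percent.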



Cyberinfrastructure as innovation policy

Cyberinfrastructure represents the institutionalization of IT–based knowledge infrastructure as an NSF–wide function. In retrospect, the Internet history may have been unique: a radically new technology developed by DARPA, evolved and standardized by university and agency researchers, institutionalized as research and education infrastructure by NSF, and rapidly commercialized as a new form of general–purpose public infrastructure that in turn raised a number of public policy challenges. Today, information infrastructure is no longer a leading–edge technology (as the Internet platform was) but a diffuse set of context–specific challenges dealing with the design and management of knowledge and human enterprise.

Cyberinfrastructure gives special emphasis to what is new and important for innovation: the process–oriented aggregation of creativity, expertise, and information resources at a distance and across boundaries. Expectations have been raised from content and communication to knowledge and innovation. At this level, infrastructure becomes too rarefied, nuanced, and complex for policy discourse but has been recaptured as an object of research.

Yet cyberinfrastructure cannot be segregated from knowledge infrastructure outside of academic research and education. The outward–looking policy towards standards makes this clear. Cyberinfrastructure operates in the shadow of all–purpose public knowledge infrastructure that includes not just standards but search engines, Web 2.0, institutional repositories, open source software, patents, and so on. Whatever else may be driving them, these are all enablers and objects of research.



Patents and infrastructure

Within today’s innovation environment, publicly funded research, infrastructure, and standards are closely linked. The patent system stands apart by virtue of its rights–based operation, its specialized institutions, long history, and unique professional requirements. The U.S. Patent and Trademark Office is fee–funded; it does not engage in research, either in support of its own operations or on the effects of patents on innovation, business practice, or the economy. Patent law, policy, and administration do not acknowledge standards or infrastructure, and there is considerable hostility towards software patents within the technical community, inside and outside of academia.

The expanded power and availability of patents have brought patents into conflict with standards, infrastructure, and IT products generally, as the Alcatel v. Microsoft verdict illustrates. The vulnerability of infrastructure was brought home by NTP’s successful litigation against Blackberry, which threatened to bring the entire Blackberry network down, alarming users, and forcing Blackberry to settle for $US612 million, even though the Patent and Trademark Office was in the process of invalidating NTP’s patents.

Fifteen years earlier, there was little to suggest that patents would be a threat to the highly abstracted technology of standards, and certainly not the Internet standards emerging from a largely academic environment. The 1995 White Paper issued by the National Information Infrastructure Working Group on Intellectual Property Rights (Intellectual Property and the National Information Infrastructure [35]) focused almost exclusively on copyright. There was no mention of database protection, on which U.S. and European policies would soon diverge [36]. There was a single paragraph on the issue of trademark and domain names, which would explode in the next few years as one of the truly novel policy challenges engendered by the Internet [37]. Fourteen heavily footnoted pages of the report are devoted to patents, half of which are simply a primer on patent processes and rights, while the remaining pages are illuminating mainly in terms of what they do not address [38].

One of the three issues briefly discussed was “patentability of software.” The working group report concluded that the issue was not relevant to the National Information Infrastructure because the scope of patentability for software would be decided by the courts. In fact, the patent–specialized Court of Appeals for the Federal Circuit would do away with all limitations on the patentability of software or business methods within the next four years, legitimating patents on everything from data formats and protocols to the most abstract elements of system design to tax avoidance strategies [39].

This shift in the scope of patentable subject matter is the most significant change in the innovation environment from the early Internet to today’s cyberinfrastructure initiative, and it happened during a period when patents were made stronger and easier to get [40]. Since software can be created without substantial investment in laboratories or manufacturing, this further “democratized” the patent system by lowering barriers to participation. But in doing so, it exacerbated the problem of complex products — i.e., the threat of fragmented ownership, congestion, information overload, strategic behavior, and failure of the disclosure function. After the dot–com bust, startups that had secured patents to attract financing failed, leaving patents to be bought by speculators who had no need for cross–licenses and no business reputation to preserve. They were free to extract value by asserting patents against widely marketed products with embedded standards, and against fully functional infrastructure that supported many applications and multitudes of users. By virtue of being sources of substantial value, standards and infrastructure have become sources of substantial liability.

The evolving Internet was relatively safe from patent claims by virtue of being on the cutting edge of computer science in an environment in which patents were only gradually becoming legitimated and available. By contrast, cyberinfrastructure is focused on developing high–functionality software, especially broadly useful middleware, within a heavily patented environment. This software may be valuable in data–driven technologies such as bioinformatics or in generic knowledge management applications [41], places where it is more likely to attract attention from patent holders.

By drawing on industry consensus standards, cyberinfrastructure may be able to hide behind deep–pocketed targets. But Microsoft’s loss to Alcatel shows that anybody using MP3 is vulnerable. And while Microsoft is frequently a target, it has the resources to litigate and to buy off or buy out patent holders when necessary, which is not the case for universities. The rhetoric of intellectual property and the potential for jackpot payoffs leads to extraordinary behavior, even by large companies sensitive to public relations [42]. Small companies and licensing firms have no compunctions against asking universities for money, and colleges and universities have recently been threatened by broad patents on common functions such as online testing and streaming video [43]. Educational infrastructure is vulnerable because it is relatively public and either similar or standardized across institutions.

Open source software, which in important respects inherits the open platform model of the Internet, offers some protection in that powerful proponents, including IBM, have developed their own strategies for guarding against patent claimants. A number of companies have contributed patents to the Open Invention Network, which operates a defensive patent portfolio that can be asserted against any company that uses patents to attack Linux [44]. Most industry standards and open source programs do not have a collective defense mechanism, but open source software and cyberinfrastructure may benefit from expert communities of interest motivated to identify patent–defeating prior art. This may help explain why patent attacks on open source software have so far been rare, even though the workings of open source software are publicly exposed and therefore more vulnerable to certain kinds of software patents. Nonetheless, fear of liability may discourage large corporate users from adopting software if there is no deep–pocketed guarantor to indemnify them. This may inhibit commercialization of cyberinfrastructure to the degree that it inhibits adoption of open source software [45].



Reconciling the power to enable and the right to exclude

Patents are often confused with the enabling technology that they protect. Patents are rights to exclude, not rights to exploit; they disable in order to induce investment.

The frequent disconnect between patent practice and the practice of technology needs far greater attention than it has received. Unfortunately, there are great differences in disciplinary and professional perspective that are not easily bridged, as well as institutional imbalances that reinforce insularity. This appears to be the kind of boundary–crossing problem that cyberinfrastructure can help overcome, but it is far more daunting than the sharing of resources across disciplines.

There are fundamentally two sides to the problem: First, the standards and procedures by which patents are granted, especially novelty and nonobviousness requirements; second, the role that patents play in a marketplace increasingly shaped by open and collaborative innovation.

The requirements problem could be addressed by peer review of patent applications. Peer review is the accepted gold standard not only at NSF, but wherever outside opinion is used to inform funding or regulatory decisions. Toward this end, an important experiment is underway in which a number of IT companies have agreed to submit patent applications for review by volunteer experts outside the patent office. This project, known as “Peer–to–Patent” or community review, uses software to support a collaborative effort to help locate and evaluate prior art [46]. Although the examiner will still make the final decision on the statutory criteria, this is a major step away from the conventional ex parte lawyer–driven process.

The second part concerns the strategic use of patents, especially with respect to standards. The National Innovation Initiative report, cited earlier, explains:

While IP ownership is an essential driver of innovation, technological advances in many cutting–edge areas are dependent on shared knowledge, standards and collaborative innovation....

Having seen the enormous benefits gained when proprietary technologies stand upon standards–based collaborative tools, one objective of the NII is to seek ways, respectful of intellectual property rights, to promote more effective integration of IP in the standards–setting process. Open standards, created through a transparent and accessible process (coupled with the rapid innovation occurring in middleware software) can accelerate the interoperability and expansion of the global infrastructure. Such standards are an important part of the collaborative innovation that will become increasingly important in the 21st century.

From an intellectual property perspective, open and proprietary IP models should not be seen as mutually exclusive; rather, the IP framework must enable both approaches. Because collaborative innovation is relatively new, however, the structure and processes to accommodate ownership, openness and access are evolving. New creative models are emerging across sectors. A mature, balanced understanding of the purpose and practice of standards, including the important role of open standards and global harmonization, is essential to further interoperability, spur technological innovation and expand market applications.

The interaction between IT standards and patents is being worked through as a policy matter at several levels. The Federal Trade Commission has sought to limit patent applicants from using information gained in standards development processes to modify patent applications so as to capture the standards that emerge [47]. A recent joint report by the FTC and the Department of Justice notes that “ex ante” licensing negotiations with patent holders may be a pro–competitive means of avoiding hold–up after standards are set [48]. In many standards development organizations, standards are determined by technologists with nothing more than a promise from patent holders that they will license on “reasonable and non–discriminatory” (RAND) terms, and there is no monitoring or enforcement of this promise after the standard is fixed. Ex ante licensing would allow costs and terms to be disclosed in advance, allowing standards to be chosen with confidence on a practical business basis that takes all relevant factors into account. Again, this means integrating different professional perspectives in a process that has traditionally been compartmentalized.

A somewhat more difficult problem is how to address ambush of standards by patent holders who are not part of the standards process. Proposals have been made to require patent holders to assert their rights promptly against well–known open standards or lose the right to damages and injunction against the practice of the standard (an application of the common law doctrine of laches) [49]. But it requires both acknowledging the failure of the disclosure function in patents and asserting the value inherent in open standards — value that is difficult to quantify or convey to those outside of IT.

The NII report speaks of “enormous benefits gained when proprietary technologies stand upon standards–based collaborative tools.” This appealing model has become increasingly blurred and uncertain. In conventional technologies, patents came early and standardization came as an industry matured. With the Internet, standards clearly preceded patents, a progression that makes intuitive economic sense. Yet the scope and number of patents has expanded greatly, moving patents upstream into basic research, blurring boundaries between proprietary and public, undermining meaningful disclosure, producing administrative delays that are increasingly at odds with accelerated innovation processes and product cycles, and creating unintended opportunities for strategic and tactical behavior.

These tensions are experienced most intensely within IT and related sectors. But as the enabling technology of data, information, knowledge, and cyberinfrastructure, IT carries a special burden. It is the boundary–crossing technology that is uniquely capable of aggregating and integrating knowledge wherever knowledge aspires to consistency, replicability, and certainty. It must not only work to keep its own house in order, but it must ensure that knowledge systems in all disciplines and professions operate, and interoperate, as openly and effectively as possible.


About the author

Brian Kahin is Senior Fellow at the Computer & Communications Industry Association in Washington, D.C. He is also Research Investigator and Adjunct Professor at the University of Michigan School of Information and a special advisor to the Provost’s Office.
E–mail: kahin [at] umich [dot] edu




2. (2007 version).


4. For an overview of the role IT and the Internet in promoting innovation, see Digital Prosperity, Information Technology and Innovation Foundation, 2007,


6. See, for example, An Operations Cyberinfrastructure: Using Cyberinfrastructure and Operations Research to Improve Productivity in the Enterprise, Report on a Workshop held in Washington, D.C., August 30–31, 2004, sponsored by the National Science Foundation,

7. Meanwhile, the Bush Administration has defunded the Technology Administration in the Department of Commerce, including its Office of Technology Policy.

8. Although these two models are transactional and central, they are augmented by long–term support for human capital and favorable tax treatment of R&D.

9. See David C. Mowery and Bhaven Sampat, “The Bayh–Dole Act of 1980 and University–Industry Technology Transfer: A Policy Model for Other Governments?” In: Brian Kahin and Dominique Foray (editors). Advancing Knowledge and the Knowledge Economy. Cambridge, Mass.: MIT Press, 2006.

10. While it is fair to say that Bayh–Dole is an accepted policy, it is still subject to criticism for various reasons. It has been criticized for compromising the integrity of academic research, for the lack of breakthrough medicines, and for creating unrealistic expectations for university licensing programs.

11. Arti Rai, John R. Allison, Bhaven Sampat, and Colin Crossman, “University Software Ownership: Technology Transfer or Business as Usual?” working paper, 2007.

12. See discussion in Brian Kahin, RFC 1192, “Commercialization of the Internet,”

13. The NSFNET funding program also used a sophisticated three–tiered model that differentiated in terms of subsidy and policies toward interconnection and use. The national backbone was fully subsidized under a single cooperative agreement with NSF–specified acceptable use policy; regional networks were partially subsidized; and universities were eligible for a small one–time grant to connect to a regional network.

14. Organisation for Economic Cooperation and Development, Economic and Social Impacts of Electronic Commerce, 1998, “Executive Summary,”

15. Cyberinfrastructure Vision for 21st Century Discovery, National Science Foundation, March 2007, p. 6, available at

16. E.g., the High Performance Computing and Communications Program, the National Research and Education Network, and the National Information Infrastructure Initiative. See Brian Kahin, “Information Technology and Information Infrastructure,” In: Lewis M. Branscomb (editor). Empowering Technology: Implementing a U.S. Policy. Cambridge, Mass.: MIT Press, 1993. For a contemporaneous perspective on IT and scientific research, see National Collaboratories: Applying Information Technology for Scientific Research, Computer Science and Technology Board, Washington, D.C.: National Academies Press, 1993.

17. Arden Bement, keynote address for Designing Cyberinfrastructure, this issue, at

18. See Privacy Working Group of the Information Infrastructure Task Force (IITF), “Principles for Providing and Using Personal Information,” draft, 4 May 1994; Working Group on Intellectual Property Rights [of the Information Infrastructure Task Force], Intellectual Property and the National Information Infrastructure, preliminary draft, July 1994 [“Green Paper”], available at Despite the title, the latter dealt almost exclusively with copyright; see discussion below.

19. This was less true under the common carrier model of telephony where carriers were expected not to control the content of transmissions. Under the “end–to–end” architecture of the Internet, the carrier does not even control the functionality of Internet services (e.g., e–mail, file transfer, the Web).

20. It was coordinated at an international level by the Internet Assigned Numbers Authority, a project under a DARPA contract to the Information Sciences Institute at the University of Southern California. The IANA functions were built into the newly formed ICANN in 1999.

21. “Global Information Infrastructure: Agenda for Cooperation,” 1994,

22. A public draft of the “Framework for Global Electronic Commerce” was released in December 1996. The final version was published in July 1997 at

23. The IETF remains very active, although within a more technologically mature, slower–moving commercial environment with many participants representing companies with billions of dollars at stake.

24. Andrew L. Russell, “‘Rough Consensus and Running Code’ and the Internet–OSI Standards War,” IEEE Annals of the History of Computing, volume 28, number 3 (July–September 2006), pp. 48–61,

25. There are of course important exceptions such as public safety standards, but these are very different from IT standards. In Europe, standards policy has been an important adjunct to market integration. China has come to understand the role of standards in supporting its economic advantage in low-cost manufacturing.

26. Courts often uphold the contracts if the user has to agree in some affirmative manner that is more than just hitting the return key — such as clicking a box. But that does not mean that the court will hold all the provisions enforceable, especially if they are unexpected, punitive, or unconscionable.

UCITA was opposed by consumer groups, libraries, state attorneys general, and others. Maryland and Virginia enacted UCITA before the battle was fully joined, while other states enacted anti–UCITA “bomb shelter” legislation to protect their citizens from the reach of UCITA. Patrick Thibodeau, “Anti–UCITA Measures Outnumber State Adoptions,” Computerworld (9 June 2003),

27. This use of the term “open innovation” has been advanced by Henry Chesbrough in his book of that title (Harvard Business School Press, 2003) and subsequent work.

28. American Intellectual Property Lawyers Association, Report of the Economic Survey 2005.

29.

So obtaining patents has become for many people and companies an end in itself ... . They try to patent things that other people or companies will unintentionally infringe and then they wait for those companies to successfully bring products to the marketplace. ... They gamble that people will infringe these patents without ever learning anything from the patentee, and without interfering with any effort by the patentee to commercially exploit their invention. The long delays in the patent office work to their benefit by keeping the eventual coverage of their patents indefinite while others produce products.
[Competition and Intellectual Property Law and Policy in the Knowledge–Based Economy: Hearings Before the Federal Trade Commission, 28 February 2002 (statement of Robert Barr, Vice President, Worldwide Patent Counsel, Cisco Systems, Inc.) available at].

30. If prior art is disclosed early, it may be used in a reexamination where the patentee can color the way it is cited. If it is cited in the patent, it loses its evidentiary value as something that the patent examiner clearly missed — and instead looks like something that the examiner considered ineffective in invalidating the patent.

31. As expressed in the vision statement:
Interoperable, open technology standards will be used as the primary mechanism to support the further development of interoperable, open, extensible networked resources and VOs. Ideally, standards for data representation and communication, connection of computing resources, authentication, authorization, and access should be open, internationally employed, and accepted by multiple research and education communities....
The use of standards creates economies of scale and scope for developing and deploying common resources, tools, software, and services that enhance the use of cyberinfrastructure in multiple science and engineering communities. This approach allows maximum interoperability and sharing of best practices. A standards–based approach will ensure that access to cyberinfrastructure will be independent of operating systems, ubiquitous, and open to large and small institutions. Together, Web services and service–oriented architectures are emerging as a standard framework for interoperability among various software applications on different platforms. They provide important characteristics such as standardized and well–defined interfaces and protocols, ease of development, and reuse of services and components, making them potential central facets of the cyberinfrastructure software ecosystem. [p. 35]

32. For the scope of the cyberinfrastructure research agenda, see

33. There is an interagency National Coordinating Office for Networking and Information Technology Research and Development (NITRD), but it has no coordinating group on cyberinfrastructure.

34. In contrast, the 1992 Advisory Commission on Patent Law Reform was set up not to address patent reform in the present sense but as political groundwork towards international harmonization. The university representative on the Advisory Commission resigned in protest of the recommendation that the U.S. allow prior user rights (as other countries did), permitting companies to continue using secret processes after someone else had secured a patent on the process. While the Commission did not intend to address software patents, it was forced to do so, and 60 percent of the comments it received addressed only the patentability of software.


36. Following passage of the European Database Directive in early 1996, PTO and USTR favored similar legislation, while NSF and other research agencies opposed it. Bills were introduced in 1996 and subsequent years but were never enacted.

37. Domain names were a point of intersection between trademark and network administration. They would become the main focal point for a number of issues around ICANN, the poster child for Internet governance, which today is still tethered to the U.S. Department of Commerce despite the objections of much of the world.

38. The first of three areas discussed is “determinations of patentability,” which points to the explosion of varieties of prior art in electronic form and raises questions about how they would be treated. The paper assumes that patents would remain canonically embodied in paper, and that digital conversion would remain a value–added service by the private sector. However, the White House soon demanded that the PTO put patents directly on the Internet.

The second area, “infringement determinations,” refers to a technical legal issue that once kept software on media from being patented directly. Running software on a computer could be an infringement, but this meant that only users were direct infringers, and vendors were at best contributory infringers. This issue was resolved the following year when the PTO reversed course and decided that the “printed matter” exception did not apply to software on a disk, so it could be patented, allowing the patentee to sue the producer of infringing software directly.

39. State Street Bank & Trust Co. v. Signature Financial Group, Inc., 149 F.3d 1368 (Fed. Cir. 1998); AT&T Corp. v. Excel Communications, Inc., 172 F.3d 1352 (Fed. Cir. 1999).

40. Adam B. Jaffe and Josh Lerner, Innovation and Its Discontents: How Our Broken Patent System Is Endangering Innovation and Progress, and What to Do About It. Princeton, N.J.: Princeton University Press, 2004.

41. Nancy Weil, “NSF middleware initiative goes beyond science,” InfoWorld (28 May 2004).

42. For example, BT, the British telecom giant, claimed to own hyperlinks and sued an Internet provider to enforce its claim.

43. Dan Carnevale, “Watchdog Group Says Patent For Online Testing Should Be Invalidated,” Chronicle of Higher Education (21 April 2006); “Colleges Plan to Oppose Company That Claims Patents on Streaming Media,” Chronicle of Higher Education (18 November 2005).


45. Mere use of patented technology is a direct infringement. IBM has avoided liability by working with partners, such as Red Hat, that are willing to distribute Linux to corporate customers.

46. See the Peer–to–Patent project and its operational site. For an in–depth analysis, see Beth Simone Noveck, “Peer to Patent: Collective Intelligence and Intellectual Property Reform,” Harvard Journal of Law and Technology, volume 20, number 1 (Fall 2006), p. 123.

47. See the FTC proceedings against Rambus.

48. Antitrust Enforcement and Intellectual Property Rights: Promoting Innovation and Competition: A Report Issued By the U.S. Department of Justice and the Federal Trade Commission (April 2007), pp. 53–56.

49. IBM, Issue Paper: Toward an Open Standards Based Innovation Economy, 2005; Consumer Project on Technology, “Proposed WIPO Protocol for the Development of Open Standards (PDOS) Version 1.0.”




Copyright ©2007, First Monday.

Copyright ©2007, Brian Kahin.

Cyberinfrastructure and Innovation Policy by Brian Kahin
First Monday, volume 12, number 6 (June 2007),