
The Technology and Culture of Web 2.0
By Barry Schaeffer and Bruce Nevin

Background

Of the many subjects percolating through today's information universe, few have received more attention than the mix of technology and culture known as Web 2.0.

Web 2.0 is perhaps not the most dramatic expression of technological breakthrough we have seen—indeed, as we shall see later, much of its technology foundation is evolutionary rather than new. Nonetheless, Web 2.0 is uniquely important because it represents perhaps the first time we have seen culture and technology impact one another in real time with little or no latency. This dramatically higher level of immediacy in Web 2.0's design and reality has engendered a number of forces that appear, and in fact may be, revolutionary in their nature and impact. Indeed, the blogosphere has crackled with statements about Web 2.0, at one extreme hailing it as the harbinger of a new "democratic information world"[1], while at the other condemning it as "false," a "disaster," and "societal suicide." While it is likely none of these, Web 2.0 raises a number of issues for which there is little precedent, and as such it is worthy of a close look from all sides.

Like few developments before it, Web 2.0 has grown from a series of intersections between cultural and technological evolution. It will likely be adopted along lines dictated by these two dimensions, impacting society as much in one as the other. In this article, we attempt to address a number of questions about Web 2.0, organized roughly into three areas:

Why is there a Web 2.0 and what trends in our culture have generated the perception that change is needed in the way the Web functions?

What is Web 2.0 and what is it not? How is Web 2.0 different from earlier incarnations of the Web, and what technology and cultural factors are integral to its architecture and operation?

What will Web 2.0 and its successors (Web 3.0 is already being discussed) mean for the culture that created it?

While no single article can hope to fully illuminate these areas, we hope to provide readers with a frame of reference by which the Web 2.0 phenomenon may be measured, evaluated and, if we are attentive, managed.

Technology and Culture: A Changing Balance

If society has displayed one enduring characteristic during the past several millennia, it is the interaction and mutual impact of culture and technology: how we live; how we imagine, create, and adopt new technology; and how we are, in turn, changed by it. The recent description of a growing "continuous partial attention"[2] phenomenon among frequent users of instantaneous communications provides at least anecdotal evidence that how we use technology is as important as the technology itself. While much has been written about culture and technology separately, the more subtle aspects of their interaction have often escaped our attention. Perhaps the inattention is because, until relatively recently, those mutual interactions have been offset by long periods of time, sometimes centuries, between (1) desire and perceived need, (2) development of technology responsive to the need, and, once it has been adopted, (3) its cultural impact. For instance, the printing revolution usually associated with Gutenberg in the mid-fifteenth century actually had its roots nearly 1,200 years earlier with the invention of the Codex, or edge-sewn book.[3] It then continued for another several centuries after Gutenberg as the potential for mass availability of reading material impacted society's educational and governance systems.

[1] http://www.paulgraham.com/web20.html
[2] https://continuouspartialattention.jot.com/WikiHome
[3] http://www.answers.com/topic/codex?cat=entertainment


Reading material changed from a focus on education for the religious and royal elite to education—and empowerment—for the masses. With centuries between cause and effect, these interactions, however concrete, were easy to miss.

With the passage of time, however, the pace of this cultural/technological interaction has quickened, making the perceived need, realization, and impact more contemporaneous and easily discernible—and arguably more important. A young United States' desire for respect in a world dominated by the navies and privateers of the European powers, for example, drove a Philadelphia naval architect's development of a radical improvement in warship design[4] in the 1790s, which had a material effect less than 15 years later on the outcome of the War of 1812 and the rise of the new nation as a global power. This compression of time between need/desire, development, and impact has continued apace; today the process is often measured in months, a few years at most.

Typically, the individuals and groups responsible for technological development have focused on what the technology could be made to do, leaving the ultimate uses and impact of that technology to others. In some cases, this neglect has created a dramatic level of myopia that, although not intentional, has had far-reaching effects on society. Ray Tomlinson, the developer of email in the early 1970s, failed to anticipate the rise and growth of spam,[5] admitting that his focus was on the roughly 1,000 users of the systems on which he was working, even though it is arguable that the signs and roots of spam were present even then in the broader society: packet-switched networking had been described several years before, and junk "mail" was already a problem for both postal and telephone media. As we live with the results of these early failures to read the writing on the wall, we must confront a question: with the same level of development now requiring scarcely three years, can we afford to ignore the impacts inherent in today's developments, including Web 2.0?

To do that, we must first grasp just what is, and is not, part of Web 2.0.

Web 2.0 Technologies

The term "Web 2.0" suggests a new release, a significant advance introducing new features and capabilities on the web all at once, but the analogy of a major release of a software or hardware product is misleading. It is more of a conceptual shift of emphasis, a recognition of changes that have been emerging over time in the uses of web capabilities which, to a surprising extent, are not new.

From a technological standpoint, it might better be called Web 1.2. Many of the technologies that support Web 2.0 have in fact been part of the web since its beginning, and salient usage features of Web 2.0 have long been common web practice. Beginning with XHTML and HTML markup that is semantically more informative, and CSS further separating content from presentation (and support for XML/XSL coming "real soon"), the evolutionary advances in technology that support Web 2.0 include improvements in:

- Server software
- Web syndication (a form of publication by making parts of a website available for use on other websites) with "web feeds" using RSS or Atom
- Protocols for instant messaging, such as XMPP, and software for social networking, such as FOAF and XFN
- SOAP to send messages and instructions for servers, and other protocols for web APIs to exchange XML or JSON payloads (see the sketch after this list)
- Network architecture principles such as REST to support transparent machine-to-machine interactions
- Plugins and extensions for standards-compliant (or standards-oriented) browsers
- Mechanisms for users to annotate content with comments or metadata tags, sometimes elaborating them into folksonomies, as with del.icio.us
- Systems for combining content from different sources into mashups
- Tools for creating and publishing content in blogs (weblogs), wikis, and forums
- Internet storage systems, as seen in the vast storage capacity provided by Google
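
To make the web API items concrete, here is a minimal sketch of a REST-style call carrying a JSON payload, in the spirit of the tagging services mentioned above. It is illustrative only: the endpoint (api.example.com), the resource path, and the field names are hypothetical, and a real service would add authentication and error handling.

    // Minimal sketch of a REST-style request with a JSON payload (hypothetical API).
    // Runs in any modern browser, or in Node.js 18+ where fetch() is built in.
    async function tagBookmark(bookmarkId, tag) {
      const response = await fetch(
        "https://api.example.com/bookmarks/" + bookmarkId + "/tags", // resource-oriented URL
        {
          method: "POST",                                   // create a new tag on the resource
          headers: { "Content-Type": "application/json" },
          body: JSON.stringify({ tag: tag })                // the JSON payload
        }
      );
      return response.json();                               // the server replies with JSON as well
    }

    // Example use: attach a folksonomy-style tag to bookmark 42.
    tagBookmark(42, "web2.0").then(result => console.log(result));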

[4] http://en.wikipedia.org/wiki/Six_original_United_States_frigates
[5] http://technology.timesonline.co.uk/tol/news/tech_and_web/article3525110.ece


Web 2.0 has much in common with the open source movement in software development, and indeed we might say that Web 2.0 is the extension of motivations and organizing concepts from the open source development world to a more general population, riding on open source tools, such as wikis. Any technology that fosters the creativity and collaboration of volunteer participants on the web—the keynote of both open source and Web 2.0—can be called Web 2.0 technology.

From a user perspective, content is king on the web, and always has been. Now the power of dynamic content is emerging, and the focus has shifted to participation, beyond simply publishing your content on the web. Web 2.0 technology establishes spaces on the web where users collaborate to share, modify, and recombine content, adding value by doing so. This is not to say that publishing one's content has ceased. One of the remarkable consequences is the emergence of the "blogosphere," in which previously unheralded people compete with, and in many cases become more influential than, professional journalists and pundits in reporting and evaluating news. Blogs thrive on readers' collaborative comment and cross-talk in ways that letters to the editor never could.

In these collaboration spaces created and enabled by Web technologies, communities of interest emerge, so that users with quirky minority and niche interests more easily find their fellow birds of a feather—the hitherto neglected 20 percent (or less) of the 80-20 rule, the "long tail"[6] that enables web-based business models exemplified by Amazon.com, eBay, and Craigslist. As the established success of these business models indicates, this trend is not really new; it is an acceleration of existing trends and an extension of them to broader populations. Satisfying the user has always been the problem, and one key to these models' success is making the collective intelligence of those populations part of the solution. Increasingly, technology supports these trends.

Tim O'Reilly did much to clarify the term Web 2.0 from a business model perspective at a 2004 conference. A brainstorming session there produced a preliminary "meme map"[7] (reproduced in O'Reilly's article) that conveniently provides a succinct summary of these ideas.

Is Web 2.0 only a change of perspective, an epiphany by builders of business models, with no significant technological developments to mark it? No, hardly. But of those technologies that enable collaboration, sharing, spontaneous creativity, and all the rest of it, no single innovation can stand up and declare itself to be the change that constitutes Web 2.0. The nearest contender for that title might be RSS, "the most significant advance in the fundamental architecture of the web since early hackers realized that CGI could be used to create database-backed websites," according to Tim O'Reilly.[8] "RSS allows someone [not only] to link … to a page, but to subscribe to it, with notification every time that page changes." But that's only one part of the picture.
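
To illustrate the subscription idea in O'Reilly's description, the following sketch polls a feed and reports entries it has not seen before. It assumes a browser environment (for DOMParser) and a hypothetical feed URL; real aggregators also honor caching headers and polling etiquette.

    // Sketch of "subscribe ... with notification every time that page changes":
    // poll a hypothetical RSS 2.0 feed and report items whose <guid> is new.
    const seenGuids = new Set();

    async function checkForNewItems(feedUrl) {
      const xmlText = await (await fetch(feedUrl)).text();
      const doc = new DOMParser().parseFromString(xmlText, "application/xml");
      for (const item of doc.querySelectorAll("channel > item")) {
        const guid = item.querySelector("guid");
        const title = item.querySelector("title");
        if (guid && !seenGuids.has(guid.textContent)) {
          seenGuids.add(guid.textContent);                  // remember this entry
          console.log("New item:", title ? title.textContent : "(untitled)");
        }
      }
    }

    // Example use: check the (hypothetical) feed every ten minutes.
    setInterval(() => checkForNewItems("https://example.com/feed.rss"), 10 * 60 * 1000);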

[6] http://en.wikipedia.org/wiki/The_Long_Tail
[7] http://www.oreillynet.com/pub/a/oreilly/tim/news/2005/09/30/what-is-web-20.html
[8] http://www.oreillynet.com/pub/a/oreilly/tim/news/2005/09/30/what-is-web-20.html?page=3#designpatterns


In Web 1.0, applications run either on the user's client desktop or on the server, but now Web 2.0 "network as platform" applications run in the user's browser. In Web 1.0, the browser is only a tool for the user to view someone else's content on a website, but now the user may own the data on a website and control it through the browser. The site's "architecture of participation" may enable the user to add value in the course of using it. A site's user interface may use rich-client techniques such as Ajax (Asynchronous JavaScript and XML) or Adobe Flex (based on Flash) to emulate a desktop application. These technologies improve interactive performance, a critical aspect of usability, by updating in the browser only what has changed on the site, instead of reloading an entire web page, so that the user sees his or her changes to the content with a minimum of process lag.
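
A minimal sketch of the Ajax pattern just described follows. The request URL and the element id are hypothetical placeholders; the point is that only one region of the page is refreshed, with no full page reload.

    // Ajax in its original form: an asynchronous XMLHttpRequest that updates
    // one element in place. "/fragments/status" and "status-panel" are placeholders.
    function refreshStatusPanel() {
      const xhr = new XMLHttpRequest();
      xhr.open("GET", "/fragments/status", true);           // true = asynchronous
      xhr.onload = function () {
        if (xhr.status === 200) {
          // Replace only the panel's contents; the rest of the page is untouched.
          document.getElementById("status-panel").innerHTML = xhr.responseText;
        }
      };
      xhr.send();
    }

    // Example use: keep the panel fresh without interrupting the user.
    setInterval(refreshStatusPanel, 30 * 1000);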

Web 2.0 applications make much heavier demands on database and workflow systems in the back end than in the past, and correspondingly the browser must be given more client-side muscle lest server access issues obtrude on the user interaction. Beyond traditional HTML forms, these demands require intensive use of JavaScript (and Ajax) scripting, and programming with Flash, Java applets, and other like technologies. This shift has some pretty profound business consequences, but to talk about those we must first review some of the background.

Earlier, we suggested that open source has some parallels to Web 2.0. Open source has deep roots. The protocols that enable the Internet were developed beginning in the late 1960s through a collaborative process called Request for Comments (RFC). The freely distributed GNU operating system[9] was the original motivation and focus of the Free Software Foundation, beginning in 1983. The concept of open vs. closed (proprietary) source code was thus well established when Netscape released the Navigator code and called it open source; that code base became Mozilla, the open source browser family. Tim O'Reilly organized the Open Source Summit in 1998. Since then, the open source movement has greatly expanded, the reliability and worth of open source software have been amply demonstrated, and it is now an integral part of the production and delivery of products in many enterprises. Open source tools like Mozilla, Linux, and Apache have achieved impressive market share, a stunning success for the movement. The EU has come out more and more strongly in favor of open source.[10]

At this point, the plot thickens. Actually, the plot has been thick and cloudy for years. The adversarial relationship alluded to above in the history of Netscape and Mozilla persists in the experience of Linux and of open source software in general, due to a profound mismatch of business philosophy and practices. As its response to the business imperatives of Web 2.0, and while still sustaining the public message that it is "making the PC the place where it all comes together" (Bill Gates in a 2007 interview on CNN about Vista[11]), Microsoft is nonetheless moving strongly into "cloud computing." What does that mean?

Cloud computing moves software applications and services from PCs to centralized data centers, where they are made available by Internet connectivity. The computing centers that are being built out by dozens of companies like Amazon, Google, and Salesforce will effectively outsource data processing as a commodity like electricity. Phenomena like Google applications directly challenge the traditional model of packaged desktop software in the enterprise and in the home.

On Tuesday, April 22, 2008, Microsoft announced its entry into cloud computing: a data storage and Web software system, called Live Mesh, which delivers services to an expanding range of electronic gadgets. In a "Services and Strategies" document, quoted in the New York Times the next day, CTO Ray Ozzie articulated three "guiding principles," the first of which is "The Web is the hub of our social mesh and our device mesh." But the foundation for this move was in place three years earlier. The familiar dozen Office tools (Word, Excel, PowerPoint, and so on) have been transformed to use an XML-based file format called Office Open XML (OOXML). OOXML reportedly was rushed through the standards process (stronger words have been used for this experience, such as "brutal and corrupt"[12]).

[9] This facetiously recursive acronym stands for "GNU's Not UNIX" because it contains no UNIX code, only free software.
[10] http://www.nytimes.com/2008/06/11/technology/11soft.html?_r=1&ref=technology&oref=slogin
[11] http://money.cnn.com/2007/01/30/technology/gates/index.htm
[12] http://www.tbray.org/ongoing/When/200x/2008/04/15/OOXML


The apparent aim was to compete with the already existing ISO OpenDocument standard from the open source community. Competing browsers then—competing standards now.

How does this relate to cloud computing? With Internet Information Services (IIS), SharePoint, and other cloud products, Microsoft has integrated these XML-based Web services so that Office applications can easily interchange data with back-end systems. Office is thus transformed into a development platform and a front-end portal for user interactions with CRM, ERP, and other back-end systems in the cloud—or the "mesh," to use the Microsoft term. (At the time of the product announcement in 2005, this role was laid out in an interview with Steven Sinofsky, SVP for the Windows and Windows Live Engineering Group at Microsoft. Sinofsky was previously in charge of development for Microsoft Office, a career path that mirrors the transformation of Office software from desktop to cloud.[13]) On this platform, Microsoft aims to support users on a continuum: from traditional users who prefer a desktop environment (only now it might be invisibly interacting with cloud resources), to those comfortable with dynamic web tools delivering remote services, to those who fully embrace the new conception of portals interacting with resources in the cloud. In any case, Microsoft wants the data and processes in the cloud to come to the user through a spigot with its brand—and its use fees. In "What is Web 2.0," Tim O'Reilly said, "Unless a vendor can control both ends of every interaction, the possibilities of user lock-in via software APIs are limited."[14] At the time, that seemed an optimistic statement; it may turn out to be prescient.

A populist spirit is pervasive in Web 2.0, in the open source movement, indeed in "Web 1.0," and (as the governments of China and Burma appear to agree) in the Internet itself, a spirit championed by organizations like the Free Software Foundation and the Electronic Frontier Foundation. In the Web 2.0 open source world, the cry is up to resist OOXML as a grab for control of crucial APIs for cloud computing. Paranoia, or justified concern? In true Web 2.0 form, we leave that to our readers to decide.

What will Web 2.0 Mean for Information Developers?

As Web 2.0 becomes more pervasive within the corporate intranet, how will it affect the development of technical information? For engineering organizations, Web 2.0 opens barriers in two directions:

- Outward facing, toward customers and partners
- Inward facing, toward information developers for technical documentation, training, support, sales support, and marketing

As a business deploys Web 2.0 technologies, its customers and prospective customers can have greater influence on product roadmaps and even current design and development processes. “Customer feedback” has always been a desideratum, but a feedback form at the back of the book seems a positive barrier to communication when compared to the interactivity of Web 2.0.

Web 2.0 spells the eventual demise of an over-the-wall philosophy of product development. That philosophy means getting customer requirements, then going away to develop a product with no further communication until you start getting customer feedback after the product has shipped. Web 2.0 suggests that prototypes and user interface mockups should be exposed to customers even during the design and specification process. It gives a new depth of significance to the notion of an iterative development process.

As engineering organizations take up Web 2.0 technologies for internal use, they open the barriers to other organizations in the company.

[13] http://xml.coverpages.org/ni2005-06-02-a.html#sinofsky
[14] http://www.salon.com/tech/feature/1999/11/16/microsoft_servers/print.html


For example, normal practice in the past has been to create specifications by filling in the blanks of a specification template or by modifying the specification documents from a previous product. Then, all too often, discussions lead to refinements and revisions that do not make it into the specs because no one has time, and there's little motivation as long as everyone relevant takes part in the discussions. But then along come the information developers, and all they have to go on is the specs, which are out of date. However, they don't discover the gaps until they can look at a product, which by then is close to shipping.

As developers begin to use wikis and other Web 2.0 tools to write and maintain specifications, the information is kept current in the very process of having the discussions and making decisions and revisions. As vendors develop systems to funnel relatively unstructured wiki content into XML schemas used by content developers to create customer content, writers can take an active role earlier in the development cycle. Technical content can be worked into customer-appropriate forms more quickly, more reliably, and with better quality assurance.
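
As a toy illustration of that funneling step, the sketch below converts a fragment of wiki-style markup into a simple XML topic. The markup rules (== heading ==, * bullet) and the <topic> element are hypothetical stand-ins; real pipelines target richer schemas such as DITA and use proper parsers.

    // Toy sketch: turn simple wiki-style lines into an XML topic fragment.
    // Headings ("== ... ==") become <title>, bullets ("* ...") become <li>,
    // anything else becomes a paragraph. Not a real wiki parser.
    function wikiToXml(wikiText) {
      const body = wikiText.split("\n").map(function (line) {
        if (/^==\s*.*\s*==$/.test(line)) {
          return "  <title>" + line.replace(/^==\s*|\s*==$/g, "") + "</title>";
        }
        if (line.startsWith("* ")) {
          return "  <li>" + line.slice(2) + "</li>";
        }
        return line.trim() ? "  <p>" + line.trim() + "</p>" : "";
      }).filter(Boolean).join("\n");
      return "<topic>\n" + body + "\n</topic>";
    }

    // Example use:
    console.log(wikiToXml("== Install ==\n* Download the package\n* Run setup"));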

Web 2.0 opens barriers and breaches silos throughout the enterprise. Do trainers copy and paste bits from customer documentation to create their own documentation and presentations? Does the support organization warehouse its responses to customer problems in its own repository that technical documenters and trainers never see? Sales support, marketing—all these silos of information, experience, and expertise can begin to communicate with one another, in the best of all possible worlds.

However, there are sources of resistance in the enterprise that Web 2.0 does not encounter in the general public.

First and most obviously, the enterprise has proprietary information that it rightly defends from the free interchange of Web 2.0. That is the primary justification for an estimated 80 percent of the content on the Internet being “dark,” never indexed by web crawlers and indexers. Of course, a great deal that is locked behind firewalls is not truly proprietary. Businesses err on the side of caution. In fact, one of the effects of Web 2.0 over time may be a relaxation of that caution, with better management of what may safely be disclosed.

Within an intranet, Web 2.0 pushes against cultural factors that are not found abroad on the Internet. For example, consider just some of the inward-facing and customer-facing concerns of a development organization. Engineers have traditionally looked upon documentation as a kind of clerical occupation, merely restating the specs and describing things after the hard work of design and development has been done. The self-enfranchisement of the blog finds a ready welcome, but elevating someone of lesser status is another matter. Conversely, for tech writers to make the transition to information developers calls for changes of attitude and aptitude that better fit them as teammates in the development process. Engineers rightly resist "requirements creep" during the course of development, so that an iterative development process is at best a kind of "punctuated equilibrium," to borrow a concept from biological evolution. In a traditional "over the wall" process, the engineers' need for a stable and well-understood development environment conflicts with customers' evolving knowledge of their desires and requirements.

Finally, within the enterprise one finds resistance from those who defend their knowledge silos as the source of their power and influence. But over time, resistance may be futile—not because Web 2.0 is the Borg, to borrow a Star Trek metaphor, but because human beings like to communicate and learn from one another when it is easy and fun. And that, Web 2.0 does well.


What will Web 2.0 Mean to the Culture that Created it?

We are left with the question, "If Web 2.0 is not a radical new technology and many of its individual components are in use today, why does it appear to be so noteworthy?" The answer may be more a matter of culture than of science. Indeed, with few exceptions, technology has followed a relatively stable path of discovery, development, introduction, adoption, and evolution, leading many observers, certainly most technologists, to conclude that technology in itself is non-threatening and benign.

But as we have seen, the interaction between technology and culture has at times been neither stable nor uniformly beneficial. Particularly confusing about this interaction is the fact that there often appears to be little predictable relationship between the nature of the technological change and its cultural impact. Obviously, something more than technological change itself is at work. If we can understand what that something is and how it behaves, we may be able to better predict how culture is likely to respond to evolving technology. As the pace of change quickens, especially in electronics, computers, and information, this understanding may come none too soon. The latency period available for society to absorb technological change is constantly shrinking, making its outcomes increasingly unpredictable and irreversible. How then might we view Web 2.0 to understand where it falls on that continuum between boon and bane?

If we accept that technology itself is neither good nor bad yet impacts society in good and bad ways, one approach with some promise is a focus on the degree to which technological development provides expanded avenues and outlets for expression of positive and negative impulses that already exist in individuals and society.

After the development and use of nuclear weapons, for example, the world did not, as many feared it would, spiral into oblivion through their proliferation and subsequent use in the many regional conflicts since 1945. From this example, one might surmise that while nuclear technology is a remarkable and very dangerous force, its introduction did not parallel a corresponding impulse within society to destroy itself. Conversely, while it didn't create them, the rise of the Internet and Web appears to have magnified a number of less-than-beneficial impulses in society, enabling rapid and dangerous proliferation of the fruits of these impulses: from spam, to porn, to identity theft, to malevolent viruses with no apparent purpose other than mischief.

In light of these phenomena, we must view with healthy skepticism any significant change in our electronic infrastructure and the way we use it, not because of what it is, but because of what cultural levers it may move. While those responsible for the changes are likely to emphasize (and perhaps only see) the positive aspects of their creation, society will ultimately live with both good and bad and must therefore think carefully about what forces among us will be brought to the fore by the contemplated changes. The example of email is instructive. Had Ray Tomlinson, its developer in 1971, not missed the potential for the blizzard of unwanted email messages we now call spam, he might have insisted, at the very least, on more visibility into who is sending what to whom. While nothing would have completely prevented the rise of spam, some simple changes to the mail and Internet protocols back then might have minimized it.

Given the prominent role of unintended consequences in technological evolution, perhaps we should look closely at Web 2.0 before we embrace it at face value. Out of the box, there are a couple of Web 2.0 elements that could be troubling if not approached carefully. Perhaps the most controversial of these are the concept of user participation as the foundation for how the Web will work and the concept of common ownership of its content.

The first of these features, described variously as "emergence" and "participation," envisions an egalitarian Web in which how users think things should work is how they will work. While the analogy of building the road wherever drivers want to go as they drive may be a stretch, one can't help but wonder whether the appetites, prejudices, and maturity levels of millions of anonymous Web users are an appropriate basis for how something as powerful as the World Wide Web should function. If the proposition advanced above—that technological change can magnify elements of our baser natures—holds, then letting the mass of users design the Web by their sheer participation may be akin to letting the devil design the details.

The second trend is embodied in the concept of common ownership of content. While the development of a more involved user base for web content may bring a richer dimension to information resources than is possible under traditional "authority-based" systems, it may also saddle us with a blizzard of content, comment, and rancor, much of it the product of individual users' naiveté, narcissism, or downright malevolence. Once a reality, this kind of content haystack could make finding the desired needle virtually impossible for all but the infinitely perseverant. Wikipedia, often cited as a conceptual foundation for Web 2.0, is described as a "radical experiment in trust." The question facing those who would understand Web 2.0, however, is this: "Can foundations like Wikipedia be extended indefinitely without breaking down into information anarchy?"

Given the growing cultural trends of immediacy, impatience, and the desire—often described as “the right”—to be heard, especially among the younger generations, one might wonder where a user-directed and user-owned Web will take us and what now-hidden impulses it will unleash. While perhaps not a reason for undue pessimism, recent experience with technological evolution appears to offer ample justification for detailed and thoughtful examination, from perspectives much wider than the technologies themselves or the assumptions made by their promoters.

The Internet, with its ability to rapidly impact potentially billions of participants, must be viewed as a cultural lever of unprecedented power and significance. If we are careful, it can deliver many of its heralded benefits with an acceptable level of negative side effects. If we are not, however, it can and may give us just the opposite. But do we have the means to exercise such deliberate care and control? With the enablement of Web 2.0 technologies, is the herd of cats already out of the bag?