
Book Reviews

At the Crossroads: Librarians on the Information Superhighway. Herbert S. White. Englewood, CO: Libraries Unlimited; 1995: 422 pp. Price: $55.00. (ISBN 1-56308-165-2.)

Librarians who want to avoid becoming road kill on the information superhighway may regret the title, but they probably will enjoy reading this book for reinforcement. Never mind the fact that much of the content appeared in Library Journal’s “White Papers” over the years, or that some of the material has already been published in collected form (Librarians and the Awakening from Innocence, G. K. Hall, 1989). The essays and reprints appear here in a different context, largely resisting the editors’ attempts to arrange them into three parts: (1) Librarians and Their Roles as Defined by Themselves and by Others; (2) Librarians, Their Self-Image, and the Perceptions that Define Their Preparation; and (3) Librarians in the Cruel World of Politics and Money. All three sections have original introductions by the author.

Each essay effectively reproduces one of those delightful conversations with the author, where every attempt at interjecting one’s own thoughts is overwhelmed by a new onslaught of White’s conjectures and reminiscences. Nor is this observation meant as a complaint; it is instead a compliment, because the conversational tone makes the book an easy read.

The sections dealing with publishing, education, and management are easily the most interesting, reflecting the author’s background and experiences. Whether readers agree with him or not, they will appreciate the no-nonsense approach and candor. White’s talks and writings anticipated the current reaction against politically correct positions. It comes as no surprise, for example, when he disagrees with long-time Library Journal editor John Berry, whose brittle thoughts on many issues crystallized in the 1960s. Specifically, White rejects the notion that schools of library and information science do not prepare graduates sufficiently for practice in libraries, and he has noted correctly and consistently over the years that no other professions expect their practitioners to function fully on the first day of work. We train bears, but we educate people.

There is a misleading assertion in the Introduction: namely, that White was hired as a dean at Indiana University (p. xi). In fact, he was hired as a professor by his predecessor, the late Bernard M. Fry, and his chief responsibility initially was to direct the Research Center that the School of Library and Information Science had at that time. White became dean later, when he and Fry essentially swapped positions. Other flaws can be traced to the publisher and the editors. Someone decided to omit White from the Index, although he is cited (by my count) 28 times throughout the book. This lapse of standard scholarship might be attributed to misguided modesty. However, the omission of these entries drags down two co-authors as well: Marion Paris, who is cited only three out of a possible five times, and Sarah Mart, whose name does not appear in the index at all, even though she is the co-author of an article that is reprinted virtually in its entirety.

© 1996 John Wiley & Sons, Inc.

I enjoy seeing people take on the thorny issue of ethics, and White does not disappoint. His almost Talmudic musings on page 250 include a reference to Wernher von Braun (initially a “bad” and subsequently a “good” German), speculation on whether (considering the Hippocratic oath) a Jewish surgeon should have treated Hitler, and the following passage, which typifies the author’s irrepressible sense of humor:

When Perry Mason shows that his client is innocent, as he always does, he also quite conveniently finds the guilty party. I am waiting, I am sure in vain, for one episode where Hamilton Burger says, “but if your client didn’t commit the murder, who did?” and Mason replies, “that’s not my problem.”

Popular culture has its place.

Given this book’s title, one might expect a fuller account of the impact of changing technology. However, as Allen Veaner puts it in a carefully crafted Foreword, the book “focuses on an abstract reality that changes very, very slowly, namely, human behavior.” Perhaps. But human nature also changed very, very slowly during the Renaissance, and one might expect a work with such a title to address the revolutionary changes we are experiencing as we approach not only a new century, but a new millennium.

In fairness, the author is not the most likely candidate for such an assignment. Consider the following paragraph, taken in its entirety and first published in the Fall of 1994:

I use computers, but I also continue to use typewriters, and I use the latter almost exclusively for creating papers such as this one. Why? Because the great virtue in writing on the computer is the ability to move paragraphs and phrases around, to rewrite and edit online. I don’t write that way. I write front to back, and I change very little. Given that, I find a typewriter keyboard much more forgiving of the heavy-handed way in which I hammer keys. In other words, for me, for certain applications, the typewriter is easier, faster, and more comfortable. That doesn’t make me a Luddite; it simply makes me selfish. I do what works best for me. I think it is important that all of us in the information field remember that our clients will always do what they think is best for them, regardless of what the so-called experts say. (p. 300)

Those who have received Herb’s memos know that he means what he says. However, to say that he is being “selfish” represents something of an understatement, and this tortured explanation sounds like the rationalization of a person who does not fully grasp the power of word processing, let alone the implications of more sophisticated computer applications.

In spite of the unfortunate title, At the Crossroads will serve quite nicely as a time capsule. Historians may use it as a handy guide to the issues that moved librarians during the last few decades, and future generations will be exposed to the considerable wit and wisdom of this 20th-century fox.

JOURNAL OF THE AMERICAN SOCIETY FOR INFORMATION SCIENCE. 47(10):789-798, 1996. CCC 0002-8231/96/100789-10

Charles H. Davis
School of Library and Information Science
Indiana University
Bloomington, IN 47405
E-mail: [email protected]

Fril - Fuzzy and Evidential Reasoning in Artificial Intelligence. J. F. Baldwin, T. P. Martin, and B. W. Pilsworth. Taunton, UK: Research Studies Press; 1995: 388 pp., 2 computer diskettes. Price: $79.95. (ISBN 0-86380-159-5.)

The book has the following aims: (1) To bring together probability theory and fuzzy logic theory into a coherent theory for representing and processing of uncertainties; (2) to implement this theory as an extension to the logic programming paradigm, thus creating a new language called Fril, which is an enhanced PROLOG-like language; and (3) to show how Fril can be used for solving AI problems where dealing with uncertainties is crucial for the quality of the solution.

Chapter 2 gives a good introduction to the programming-in-logic paradigm, as Fril has a logic programming language as a part of it. The language is goal-driven, so the program infers answers to a given query (goal). Concepts such as predicate representation, exact match of facts against rules, variable binding through unification, recursion, lists as major information structures, control of the reasoning process, etc., are explained and well illustrated in the Fril syntax. More advanced programming techniques are demonstrated in Chapter 5. Chapter 3 presents briefly the main characteristics of the three theories used in Fril to represent uncertainties, namely probability theory, fuzziness, and possibility theory, and suggests a unified representation called mass assignment. Mass assignments are used to establish a link between the three theories as follows: A normalized fuzzy set on X induces a possibility distribution for a particular object Y over X. This possibility distribution defines a family of probability distributions represented as a mass assignment. A mass assignment is viewed as a probability distribution over the power set of X.

Example: A fuzzy set defined as

f = a/1 + b/0.7 + c/0.5 + d/0.1

induces a possibility distribution of Y over X:

Pos(a) = 1, Pos(b) = 0.7, Pos(c) = 0.5, Pos(d) = 0.1.

Let us denote a probability distribution of Y over X as follows:

Pr(a) = p1, Pr(b) = p2, Pr(c) = p3, Pr(d) = p4.

If we assume that the probability of any event (any subset of X) is always less than or equal to its possibility, then the following constraints hold:

p1 + p2 + p3 + p4 = 1

p2 + p3 + p4 <= 0.7

p3 + p4 <= 0.5

p4 <= 0.1.

This represents a family of probability distribution functions which can be represented by a mass assignment:

m = {a}: 0.3, {a,b}: 0.2, {a,b,c}: 0.4, {a,b,c,d}: 0.1

which is equivalent to the following probability intervals (support pairs):

a: [0.3, 1], b: [0, 0.7], c: [0, 0.5], d: [0, 0.1].

We should note here that the family of probability distributions represented as a mass assignment is an infinite set, but when more a priori knowledge is available on the problem under consideration, some restrictions can be established.
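The conversion above is mechanical and can be sketched in a few lines of Python (an illustrative sketch, not the book's Fril code; the function names are ours):

```python
# Illustrative sketch: normalized fuzzy set -> consonant mass assignment
# -> support pairs, reproducing the review's worked example.

def mass_assignment(fuzzy_set):
    """Map {element: membership} (max membership = 1) to a mass
    assignment over nested (consonant) focal sets."""
    # Memberships act as possibility degrees; sort them descending.
    items = sorted(fuzzy_set.items(), key=lambda kv: -kv[1])
    masses = {}
    for i, (_, mu) in enumerate(items):
        next_mu = items[i + 1][1] if i + 1 < len(items) else 0.0
        if mu - next_mu > 0:
            # Focal set = all elements at least as possible as this one.
            focal = frozenset(e for e, _ in items[: i + 1])
            masses[focal] = mu - next_mu
    return masses

def support_pairs(masses, elements):
    """Probability interval [necessity, possibility] for each singleton."""
    pairs = {}
    for e in elements:
        nec = masses.get(frozenset([e]), 0.0)  # only {e} itself is a subset
        pos = sum(m for s, m in masses.items() if e in s)
        pairs[e] = (round(nec, 10), round(pos, 10))
    return pairs

f = {"a": 1.0, "b": 0.7, "c": 0.5, "d": 0.1}
m = mass_assignment(f)
# m assigns {a}: 0.3, {a,b}: 0.2, {a,b,c}: 0.4, {a,b,c,d}: 0.1
print(support_pairs(m, f))
# matches the review's support pairs: a:[0.3,1], b:[0,0.7], c:[0,0.5], d:[0,0.1]
```

The nested structure of the focal sets is exactly what the possibility constraints above encode: each mass is the drop between two consecutive membership levels.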

A general characteristic of a mass assignment is that the sum of its values over the power set of X is equal to 1. If the mass assignment of the empty set is equal to zero, then the mass assignment is said to be complete and corresponds to a family of probability distributions; otherwise it is not complete. Operations similar to the well-known operations over fuzzy sets are introduced over mass assignments, such as complement, meet (equivalent to union), and join (equivalent to intersection). Fril allows users to define and use the following ways of representing uncertainties in data: fuzzy sets, possibility distributions, and support pairs of probabilities equivalent to mass assignments.

Fril is a multi-paradigm reasoning tool. It combines logic programming deductive reasoning, fuzzy reasoning, and evidential reasoning in one tool. Fuzzy reasoning is implemented as an extension to deductive logic programming reasoning as well as fuzzy control reasoning. Facts and rules in Fril can have support pairs attached to them. The head of a Fril rule can contain a fuzzy set, for example:

((Height of X is short) (condition elements)): (x y).

There are several methods implemented in Fril for combining results inferred by multiple rules in the rule base. The classical fuzzy control inference method, also called the Zadeh-Mamdani method, is implemented in Fril, thus allowing fuzzy control applications to be developed in this language. This method can use fuzzy sets with support pairs attached to them, for example:

(Temperature is high): (0.3 0.7).

Six forms of implication are facilitated: Zadeh’s implication operator, and also the Kleene-Dienes, Reichenbach, Łukasiewicz, Gödel, and Goguen implications. Different operators for combining inference from several rules are implemented, among them an extended Dempster rule.
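As a rough illustration of what these operators compute, here are their standard textbook definitions in Python (not code from the book or from Fril itself); each maps the truth degrees of an antecedent a and a consequent b to a truth degree for the implication:

```python
# Standard definitions of the six fuzzy implication operators named above.

def zadeh(a, b):
    return max(1 - a, min(a, b))

def kleene_dienes(a, b):
    return max(1 - a, b)

def reichenbach(a, b):
    return 1 - a + a * b

def lukasiewicz(a, b):
    return min(1.0, 1 - a + b)

def goedel(a, b):
    return 1.0 if a <= b else b

def goguen(a, b):
    return 1.0 if a <= b else b / a

# Example: antecedent true to degree 0.8, consequent to degree 0.4.
for op in (zadeh, kleene_dienes, reichenbach, lukasiewicz, goedel, goguen):
    print(op.__name__, round(op(0.8, 0.4), 4))
```

All six agree that a fully true antecedent with a fully true consequent yields 1, but they spread the intermediate cases differently, which is why a system like Fril offers a choice.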

Evidential logic uses weighted rules, where every condition element is assigned a degree of importance. The inference is based on a weighted sum. This type of inference is similar to inference performed in a neural network architecture, which allows neural networks to be used to learn evidential logic rules from examples. This is a meeting point of three important and widely used paradigms, namely neural networks, fuzzy systems, and logic programming. Even though neural networks are not generic objects in Fril, rules extracted from them by using appropriate methods can eventually be used for reasoning in Fril.
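The weighted-sum evaluation described here can be sketched as follows. This is an assumed reading of the review's description, not Fril syntax; the rule, its weights, and the truth degrees are invented for illustration:

```python
# Sketch of evidential-logic inference by weighted sum.

def evidential_support(conditions):
    """conditions: list of (importance_weight, truth_degree) pairs.
    Importance weights are expected to sum to 1; the rule's overall
    support is the weighted sum of the condition truth degrees."""
    assert abs(sum(w for w, _ in conditions) - 1.0) < 1e-9
    return sum(w * t for w, t in conditions)

# A hypothetical rule "suitable_house(X)" with three weighted conditions:
support = evidential_support([
    (0.5, 0.9),  # location is good     (importance 0.5, truth 0.9)
    (0.3, 0.6),  # price is affordable  (importance 0.3, truth 0.6)
    (0.2, 1.0),  # has a garden         (importance 0.2, truth 1.0)
])
print(round(support, 4))  # 0.83
```

The analogy to a single neural-network unit is direct: the importance degrees play the role of connection weights, which is what makes such rules learnable from examples.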

Several small demo programs are developed in the book as illustrations of how Fril can be used for practical applications. These demo cases are presented in Chapters 6 and 7. The program realizations can be found on the demonstration diskettes accompanying the book, one diskette for Microsoft Windows 3.1 for the PC and another for Macintosh computers. The following case studies have been used as demo programs: (1) fuzzy control, (2) causal nets, (3) fuzzy database, (4) intelligent manual, (5) evidential support logic, and (6) PRUF (Possibilistic, Relational, Universal, Fuzzy), Zadeh’s fuzzy inference language. Even though the examples are rather simple and toy-like, and so do not indicate precisely the potential problems of applying Fril to real-size problems, this part of the book and the programs on the diskettes give a good idea of Fril’s functionality and its areas of application.

The book is a good tutorial on programming in a new language called Fril. The book also presents the theoretical background of Fril, which is a combination of fuzzy reasoning, possibility reasoning, probability reasoning, and reasoning in a symbolic AI logic programming system. Bringing all these paradigms together in one system is the main achievement of the book. The book is written in a very clear style with plenty of examples to illustrate the variety of methods implemented in Fril.

Nikola Kasabov
Department of Information Science
University of Otago
New Zealand
E-mail: [email protected]

Electric Words: Dictionaries, Computers, and Meanings. Yorick A. Wilks, Brian M. Slator, and Louise M. Guthrie. Cambridge, MA: The MIT Press; 1996: 289 pp. Price: $32.50. (ISBN 0-262-23182-4.)

The title alone of this work could be misleading: It is not an exploration of the epistemological issues raised by attempts to capture meaning in dictionaries or by manipulation of word forms for computer applications. Rather, it is a consideration of the relevance of dictionaries as resources for use in natural language processing (NLP). Epistemological issues, which are introduced, arise largely as a result of this practical endeavor.

The premise which informs the book and some of the work reported is that dictionaries, particularly, although not exclusively, monolingual dictionaries, constitute an extensive and real-world source of data about the relation between word forms and meanings. An analogy is drawn with courts and legal processes: Both the judiciary and lexicographers take real-world decisions, about ethics and about meanings, while philosophers debate principles, possibly inconclusively. An indirect inheritance, plausible in a United States context, might be detected from the tradition of attaching value to practical reasoning in legal decisions, associated, for instance, with Oliver Wendell Holmes. A warning as to the viability of using dictionaries as sources for relatively stable meanings could be derived from the historically variable interpretations assigned to verbally unaltered significant legal documents, such as the U.S. Constitution.

The NLP tasks to which meanings obtained from dictionaries are to be applied are not fully enumerated, although they clearly include machine translation, and may either be assumed to be known to the intended communities of readers, or, less charitably, could be regarded as poorly defined. The guiding presumption is of the desirability of extensive autonomous computer processing of data and not of frequent human interaction with computer-held data. The coverage of relevant empirical literature seems to be relatively full. The currency of the projects and literature reported is more questionable, with citations diminishing from the late 1980s, and reports of later (1992) projects can descend to rather uncritical itemization without supporting analysis. A slightly more recent (1993) project reported, which takes a user-centered empirical approach in support of machine-assisted translation, for which a database of aligned parallel texts is provided, does contrast with earlier projects in its emphasis on interactivity. The contrast between this and the less interactive approach of other NLP projects discussed should have received greater attention, and a full discussion of their relative value would have been helpful. There would seem to be an analogy with current debates in information retrieval, as a move is made from evaluative paradigms developed during an era of batch processing to an attempt to grapple with the current reality of operational systems which allow a high degree of interaction. The strongest contrast may lie in system intention, to support human exploration and judgment, whether with regard to translation or to document and information retrieval, not in procedures and techniques. The complexity and extent of computer transformations between human interactions may also be diminished.

The framework of understanding brought to the account of NLP projects is sophisticated and widely, although not fully, informed by relevant sources. It recognizes that dictionaries are compiled with particular intentions and that applying them to other tasks may be problematic. The messy inconsistency of the world and real-world languages, too diverse for any formal system fully to encompass, is acknowledged. The style of writing is literate and intelligently allusive, with a number of references to Samuel Johnson’s work as a lexicographer; the occasional misquotation could be regarded as a tribute to Johnson’s own practice.

How might the work have been more fully informed? An awareness that the word is a problematic concept in linguistics, not necessarily demarcated in unconstrained oral utterance, and that it can be regarded as an historically developed convention of written language, would have been helpful. Similarly, a fuller sense of the limitations of an undifferentiated concept or plane of meaning, and of the possibility of distinguishing between meaning and definition, with meaning always liable to outrun and evade definition, could have contributed. Johnson’s own remark that, “to circumscribe poetry by a definition will only show the narrowness of the definer” (Johnson, 1984, p. 752) does indicate an acute awareness of the limitations of lexicography and of definition. Equally relevant, if likely to be unfamiliar, are Volosinov’s reservations on attempts to construct a semantic unity or even strong coherence for a word: “The various contexts of usage for any one particular word are thought of as forming a series of circumscribed, self-contained utterances all pointed in the same direction. In actual fact, this is far from true. Contexts do not stand side by side in a row, as if unaware of one another, but are in a state of constant tension, or incessant interaction and conflict” (Volosinov, 1973, p. 80). Experience with full-text retrieval would seem to be lending empirical confirmation to the theme of the variety of unpredictable contexts and usages for words.¹ With regard to the understanding of formal languages, Wittgenstein’s account of the conditions for a logically perfect language in the Tractatus Logico-Philosophicus, particularly the insistence that primitive units of that language cannot be known (Wittgenstein, 1981), could have been more fully informative.

A fundamental weakness seems to lie in the conception of men as organic entities with symbolic activities (pp. 43-44). A contrasting perspective, associated for instance with the Italian philosopher Giambattista Vico, would be that to be fully human is to be engaged in social communication and signification. Symbolic activities can then only be reified and decontextualized, as they are for forms of NLP, at the expense of reducing or distorting their richness and complexity. Yet the labor invested in creating NLP applications can be adapted to purposes more sympathetic to this perspective. Vico, while rejecting Aristotelian categories as paths for discovery, valued them for the analytical comprehensiveness they could be made to yield, particularly at the beginning of an inquiry: “Aristotle’s Categories and Topics are completely useless if one wants to find something new in them. One turns out to be a Llull or Kircher and becomes like a man who knows the alphabet, but cannot arrange the letters to read the great book of nature. But if these tools were considered the indices and ABC’s of inquiries about our problem [of certain knowledge] so that we might have it fully surveyed, nothing would be more fertile for research” (Vico, 1988, pp. 100-101). Analogously, while rejecting a device such as WordNet as an account of the stable relations between different word forms and their meanings, it can still be regarded as valuable for the cognitive control it can be made to yield over an otherwise disordered and amorphous set of entities.

In conclusion, some reservations must be placed on the currency of the work and its lack of full awareness of the contrasts it reveals. For those sympathetic to NLP, it has value as a seemingly comprehensive survey. Those less sympathetic might regard it as partially trapped within an ahistorical notion of information processing whose correspondence to everyday interactive practice has been diminished. It may still be possible to retrieve and transform the labor invested in some NLP applications, from such a critical perspective.

Julian Warner
Information Management Division
School of Finance and Information
The Queen’s University of Belfast
Belfast BT7 1NN
Northern Ireland
E-mail: j.[email protected]

¹ I cite, but do not reproduce, the recall of an article discussing a new translation of the Puma SZQYCI from the full-text file of The Guardian (London, 1992) by the logical combination of “university AND library AND finance.”

References

Johnson, S. (1984). Life of Pope. In D. Greene (Ed.), Samuel Johnson (The Oxford Authors) (pp. 725-752). Oxford and New York: Oxford University Press.

Vico, G. (1988). On the most ancient wisdom of the Italians: Unearthed from the origins of the Latin language: Including the disputation with the Giornale de’ letterati d’Italia. Ithaca and London: Cornell University Press.

Volosinov, V. N. (1973). Marxism and the philosophy of language. New York and London: Seminar Press.

Wittgenstein, L. (1981). Tractatus Logico-Philosophicus. London and New York: Routledge and Kegan Paul.

Finding Government Information on the Internet. John Maxymuk, Ed. New York: Neal-Schuman; 1995: 264 pp. Price: $39.95. (ISBN 1-55570-228-7.)

This excellent reference book begins with an introduction entitled “What is this thing called Internet?” by Maxymuk that sets the stage for the following chapters. He succinctly covers all the aspects of the Internet, such as ftp, the Web, WAIS, and more. The text of the book itself is divided into three sections: Topics, Tools, and Treasures.

The Topics section begins with an overview by Aldrich of the establishment of the Federal Depository Library Program (FDLP), its history, and the effect the Internet is having on the storage and dissemination of federal government information. The Government Printing Office (GPO) has been the premier provider of federal government information. As more and more government agencies provide direct access via the Internet (including gopher, ftp, and the World Wide Web), the FDLP and GPO will cease to be the main resource, as many users will go directly to agency information and bypass depository libraries altogether. The adoption by the Clinton administration of OMB Circular A-130 as the basic guide for government information policy in the National Information Infrastructure almost guarantees that many, if not most, executive agency publications in electronic format will not be distributed by the FDLP. These latest developments in the FDLP bode ill for free public access to federal government documents. The unpoliced Internet offers no assurance of quality in the documents available. There will be no central agency responsible for assuring that all government agencies comply with their mandate to inform the public. There are also technological (i.e., hardware/software) challenges that may limit which libraries can provide access to resources available on the Internet. Aldrich concludes that librarians and concerned users must continue to voice their concerns about access to federal publications and gain the skills necessary to use them on the Internet.

The succeeding chapter by Love is devoted to the theme of free public access to federal government information. In 1980, a Democratic Congress and President passed the first Paperwork Reduction Act (PRA). In 1981, President Reagan came into office, and with a Republican Senate, the OMB created OIRA (the Office of Information and Regulatory Affairs) to coordinate federal information policies and oversee regulatory relief. OIRA spearheaded a privatization effort to disseminate federal government information. This culminated in 1985 with OMB Circular A-130, which told federal agencies to rely as much as possible on the private sector for disseminating public information. While large commercial data vendors lauded OIRA, right-to-know advocates, including the American Library Association, protested loudly. In the late 1980s, the PRA statutory authorization expired. In 1995, as part of the Republicans’ Contract with America, Congress introduced HR 830, a bill to reauthorize the PRA that would reduce public access to federal government information. One section of the bill seemed to benefit vendors, particularly West Publishing: It superseded all federal law and said that if any person “adds value” to public information, s/he basically has a copyright on it. Using the Internet, the Taxpayers Assets Project (TAP), a public right-to-know advocacy group, alerted its members and others to the bill. The public rallied, and its calls, faxes, and E-mail to members of Congress caused removal of the provision. (As this book went to press, HR 830 was still in the House-Senate Conference Committee.) Love concludes that we need a positive policy on public access to government information and that we need to be “vigilant of attempts to subvert free access to [government information].”

In the following chapter, Patwell’s perspective from New Jersey’s largest public library, Newark Public, provides an ad- ditional viewpoint on the National Information Infrastructure and the Government Printing Office. Patwell argues that changes in information formats and dissemination of govern- ment information pose a challenge for many depository librar- ies which are not equipped to handle and disseminate elec- tronic formats of government information. Newark Public re- ceived a grant from the National Telecommunications and Information Administration (NTIA) to fund a “demonstra- tion project to create an electronic information infrastructure that would serve to empower residents, students, and busi- nesses of Newark”-the Newark Electronic Information Infra- structure, and which was in line with the National Information Infrastructure. The Library established NEON (Newark Online) which provides increased access to information and “ensures the people’s right to know.”

Shane starts off the Tools section of the book by providing a good, basic overview of World Wide Web basics (URLs, HTML, design) and gopher basics (organization, structure, etc.). However, anyone seriously considering building Web or gopher resources will want a text dedicated to those topics, which would provide the additional information needed for more complex creations. In the next chapter, Dossett reviews the Online Interactive Service of GPO Access, which premiered in June 1994. GPO chose to use Wide Area Information Servers (WAIS) software, so the Service is also called GPO WAIS. Dossett explains what files are available on GPO WAIS and how to connect, search, display, and save or print retrieved documents. The author also provides a brief comparison of GPO WAIS to Thomas, the House of Representatives World Wide Web (WWW) site. In the following chapter, Etkin reviews another aspect of GPO Access, the Federal Bulletin Board Service (FBB), which began in 1991, and provides a history of this growing resource. Etkin supplements information on how to connect to and search the FBB, download files, and use its E-mail system with easy-to-read images of actual FBB screens. Etkin also describes the GPO Locator Service, another facet of GPO Access. The Locator Service provides an electronic directory of federal electronic information via the World Wide Web.

The final section of the book, Treasures, begins with author Keating identifying and describing "key legal information resources and guides in order to assist the reader in selecting among the myriad of legal resources available on the Internet."

She lists gopher sites where one can view and/or retrieve bibliographies of legal and legislative Internet resources. She then cites sources of both "Primary Law" (court decisions, the Constitution, and Congressional acts) and "Secondary Law" (treatises, periodicals, etc.). Additional resources found at law schools are described and shown as graphics. Commercially produced resources, such as the Congressional Quarterly gopher, receive a short description. Author Miller covers federal government trade and business information available via the Internet. She categorizes chapter sections into general, marketing, industry, international trade, economic, and company/investment. Under each section, she provides an excellent list of appropriate resources with an informative paragraph describing each. Some resources, such as FEDWORLD and EDGAR, are widely known. Others, however, provide less well-known information, such as U.S. Treasury Auction Results. Telnet, ftp, gopher, and Web addresses are listed for each site as applicable. The chapter is enhanced by graphics of some of the sites mentioned.

Next, there is an informative chapter by Rogers and Livingston on federal government social sciences and humanities resources on the Internet. While we often think of government resources for educational, demographic, and labor information, the government also provides information on the arts, anthropology, and other topics. This chapter covers the census, commerce and economics, crime and justice, labor and employment, sociology and anthropology, psychology and psychiatry, history and government, education and children, and the humanities and arts. Whether one seeks the latest census statistics or information on the Glass Ceiling Commission (created by Congress to monitor the movement of women and minorities up the corporate ladder to higher management), one will find information about it here. One can even view digital images from current or past exhibiting artists at the National Museum of Art site. Nice graphics illustrate the variety of information available.

Churchill provides an excellent collection of Web sites for health, medicine, and the environment. Expected sites, such as the National Library of Medicine and the Department of Health and Human Services, are covered, as are sites that one might not expect, including the Social Security Administration and the Health Care Financing Administration. Environmental Web sites include the Environmental Protection Agency and the Bureau of Land Management, as well as the Public Health Service's Agency for Toxic Substances and Disease Registry. URLs are given for all sites mentioned, and each site has numerous links to related sites. Striking graphics of some of the home pages mentioned are included.

Maxymuk illustrates that government science and technology resources are numerous and varied. This chapter is divided into seven subject areas: general sciences and technology, agriculture, biology and chemistry, computer science, earth sciences (including geology and geography), math and physics, and space science (including astronomy, astrophysics, and aeronautics). Maxymuk primarily lists Web sites, but also includes telnet sites and ftp archives. Whether one wants to find information on the weather, plant genetics, volcanoes, or physics, it is all here. Nice graphics tempt the reader to try out the sites reviewed.

Before the advent of the WWW, ftp, gopher, and telnet were the only means of transferring files. Images had to be downloaded and later viewed with PC-based viewing software. Web browsers, such as Netscape and Mosaic, now make it possible to view image and animation files and listen to audio files online. Anderson explains how to view graphical files and where to find image files. His explanations of image viewing and decompression software are clear and complete enough for even the novice to get up and running. Anderson gives both gopher and Web sites. There are numerous image sites available, and a few are mentioned in detail. From LC Marvel, one can go to Northwestern University's American Politics gopher. (Several graphics of sites are shown, but my favorite is a photograph of President Nixon shaking hands with Elvis Presley.) One can take a virtual tour of the Smithsonian or see images collected by the Hubble Space Telescope. Anderson's introduction to this visual area of the Internet is enough to entice the reader to drop everything and go take a look.

Kelly notes that, in the past, state and local government information has not been nearly as accessible as that of the federal government. Access to this largely untapped, valuable resource has been the goal of librarians and government agencies for some time. The Internet is changing all that. Several states are ahead of others in utilizing the Internet to make information available. Kelly points out that improving access to this information requires planning, staffing, and organization of existing and future resources. This chapter provides an excellent overview of state (including legislative) and local Internet resources. Eckman and Sheehy cover international and foreign government information in the book's final chapter. As might be expected, the United Nations and its agencies lead the way in providing international information via the Internet. Additionally, regional government organizations, such as the European Union and NATO, have gopher and Web sites containing information on many countries, including those of the former Soviet Union. The Southern Common Market and the Pan American Health Organization are representative of "New World" sites. Foreign government resources for North America (Canada), Asia and the Pacific Region, Africa and the Middle East, Russia and Eastern Europe, Central and Western Europe, and Scandinavia are covered. Other resources include information on treaties and conventions, nongovernmental organizations, and political parties. (Many of these sites are in English.) The book ends with appendices that list all the Web, gopher, telnet, and ftp sites/servers (arranged by subject) from the preceding chapters. This is a great ready-reference tool. A detailed index rounds out the book and makes agencies, subjects, etc., easy to locate in the text.

Maxymuk has brought together an impressive group of librarians and information professionals to create and compile this valuable resource. Although information on the Internet changes daily, this book will provide a sure foundation for both the novice and the experienced Internet user seeking to mine the government information treasures available via the Internet. This book should be at the reference desk in most libraries and in the personal library of information brokers, researchers, and businesses hoping to do business with government here or abroad.

Deborah Hunt
Information Edge
2014 Harvard Drive
Alameda, CA 94501-1632
E-mail: [email protected]

Measuring Information: An Information Services Perspective. Jean Tague-Sutcliffe. San Diego, CA: Academic Press; 1995: 206 pp. Price: $59.95. (ISBN 0-12-682660-9.)

Jean Tague-Sutcliffe is one of the leading scholars and teachers in information science, with a particular strength in bibliometrics, and has written an important monograph on measuring the performance of methods that provide information to users, with chapters describing how to measure user-record informativeness, aggregate these values, and evaluate information services. The book provides a sophisticated justification for tools and measures used to evaluate information-based systems, moving beyond simple statistical techniques to take into account the various problems and conditions that face information professionals managing information systems.

The book begins with an introduction to information and information services and their measurement in Chapter 1. The serious work of the book begins in Chapter 2, where several possible characteristics of a measure of information and effectiveness are examined, including additivity, the consideration of user needs, and document ordering and usefulness. In a work of this type, one can attempt to develop broad principles applicable to a broad range of disciplines and problems, as Weaver did with Shannon's work (Shannon & Weaver, 1949), or one can move inward, developing problem-specific measures and concepts. In Chapter 2, Tague-Sutcliffe begins to do the latter, following the lead of Roberts (1979). Chapter 3 reviews the variety of information measures that have been proposed and how they address the desirable properties of information measures provided in Chapter 2.

The first half of the book discusses information and information services, providing a framework from which one might develop measures of information systems. In Chapter 2, Tague-Sutcliffe defines "informativeness" as a "subjective, ordinal-level, cumulatively logarithmic, noncommutative, context-sensitive measure of the information produced by the interaction of users and records" (p. 55). Beginning with Chapter 4 and continuing through Chapter 6, Tague-Sutcliffe develops and argues for measures of information and system effectiveness. This development is mathematical in nature but clearly written. Given the amount of mathematical notation in this book, many will find it imposing, but carefully written works like this need to be published. The innumerate will pick up the book and quickly put it down, while the numerate will find progress slow but rewarding. It is a pleasure reading material that so clearly addresses the problems of the field.

Chapter 4 proposes a general measure of informativeness represented by a 3-tuple based on a set of database records, a set of potential uses, and an information function. An ordinal informativeness function is developed that measures the informativeness of one record in an optimal ordering of records. It is shown that Shannon's (1948) measure (-log2 pk) can be understood as a special case of Tague-Sutcliffe's measure, where pk represents the probability that, after examining "the first k records, we have already obtained the information remaining in the database" (p. 99). Chapter 5 examines aggregating this record-based measure, looking at a series of individual records taken together. Because examining one record may affect the utility of another record, and the utility of a record is determined in part by historical and maturational effects, the informativeness of a "retrieval trail" is often not simply the sum of the individual informativeness values. Of particular interest is Tague-Sutcliffe's analysis of other retrieval measures, showing that recall is "a special case of initial informativeness," and precision a special case of discount due to delay, following Marschak (1971). Chapter 6 continues to expand the applicability of the informativeness measure to groups of users, e.g., society. Unlike other evaluative measures of information services, the informativeness measure used by Tague-Sutcliffe directly considers how the information "satisfies the needs of its user community." Methods of data gathering and forms of statistical analysis (e.g., ANOVA) are considered.
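The special-case relationship noted above can be sketched as follows (the notation is mine, a simplified reading, not necessarily the book's):

```latex
% Shannon's (1948) self-information of an outcome with probability p_k:
I(k) = -\log_2 p_k
% In Tague-Sutcliffe's framework, this quantity arises as a special case
% of record informativeness when p_k is read as the probability that the
% first k records in an optimal ordering have already supplied the
% information contained in the database.
```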

The book is not meant as a handbook but will need to be read from beginning to end. Casual readers may find this a drawback and may find it hard to locate specific useful formulas or techniques by browsing. None of the chapter summaries contains a formula, and none refers back to specific formulas or models by equation number or page number.

I have one minor quibble with the typography, and this is the number of asterisks and double asterisks used. The string "C**(u)" appears many times, with the asterisks in the same font as the other characters, and it dominates the visual appearance of some pages, where the eye is drawn immediately to this notation. Except for this and the accidental inclusion of typographic instructions on page 170, the book is conservatively and attractively presented.

Robert Losee
University of North Carolina
Chapel Hill, NC 27599-3360
E-mail: [email protected]

References

Marschak, J. (1971). Economics of information systems. Journal of the American Statistical Association, 66(333), 192-219.

Roberts, F. S. (1979). Measurement theory. Reading, MA: Addison-Wesley.

Shannon, C., & Weaver, W. (1949). The mathematical theory of communication. Urbana: University of Illinois Press.

Information Management for the Intelligent Organization: The Art of Scanning the Environment. Chun Wei Choo. Medford, NJ: Information Today; 1995: 255 pp. Price: $39.50. (ISBN 1-57387-018-8.)

One volume in the "ASIS Monograph Series," Choo's Information Management for the Intelligent Organization: The Art of Scanning the Environment concerns techniques for acquiring the big information picture; corporate and academic libraries alike will find it a worthwhile addition to their business management collections. Organization executives, especially CEOs, must have long-range vision; the big picture is an important aspect of forming an accurate view of the future and of positioning the organization to take maximum advantage of emerging opportunities, both in the long and short terms. Environmental scanning reduces uncertainty in the decision-making process, thereby reducing risk. The sheer proliferation of information in this "age of information" makes sifting through what is available a much more difficult and time-consuming task. Questions such as "Which information is important?" and "Which information source is most accurate?" must be answered correctly to guide the organization safely and productively into the future.

In the Preface, Choo clarifies the purpose and value of environmental scanning: "Competition is the consequence of the unequal distribution of information among organizations and their differential abilities to acquire, absorb and actuate information. Competition has turned into an information race of discovery and learning" (Choo, p. xi). However, the objective of the book is somewhat broader in that it seeks to "develop an understanding of how an organization may manage its information processes more effectively in order to increase its capacity to learn and adapt" (Choo, p. xii).

Chapter I, “The Intelligent Organization,” is an overview of the learning organization and how it may adapt to, and use productively, the increasing plethora of information available. Choo discusses the types of organizational knowledge (tacit, rule-based, and background) and how these might be used in the intelligent organization.

In Chapter 2, “A Process Model of Information Manage- ment,” Choo defines the problem and offers some solutions in the form of an organizational information management model to incorporate the following requirements. First, the organiza- tion must define its information needs, then decide how the needed information is to be acquired, and finally decide how the information might be organized, stored, distributed, and used.

Chapter 3, “Managers as Information Users,” contains much research on how managers of typical large organizations use the information they are provided, includes a section on the politics of information sharing, and prescribes how this sharing might be more easily facilitated inter-departmentally.

In Chapter 4, "Environmental Scanning as Strategic Organizational Learning," Choo defines environmental scanning as ". . . the acquisition and use of information about events, trends, and relationships in an organization's external environment, the knowledge of which would assist management in planning the organization's future course of action" (Choo, p. 72). Figure 4.1 defines the perimeters of social intelligence and its subparts (issues management, environmental scanning, business intelligence, competitive intelligence, and competitor intelligence) and how each relates to the others in scope and time horizon (Choo, p. 76). The definitions and the exclusive domains of the multiple subparts of social intelligence are presented in an interesting and succinct fashion.

In Chapter 5, “Environmental Scanning in Action,” exam- ples of scanning are presented from the perspectives of Ameri- can, British, Swedish, and Japanese corporations, along with detailed information on the scanning practices of five Canadian CEOs. Although expounding on a portion of the human anat- omy (eye/brain connection) might be interesting to some, “Perspectives from Neurobiology,” (pp. 105-l lo), was out of step with the rest ofthe monograph and would have been better not detailed here. Basically, the section attempts to parallel, and apply to information management, the principles of the human visual system. “The human visual system is so rich in connections that it behaves both as a parallel and as a hierar- chical information processing system” (Choo, p. 109).

In Chapter 6, “Managing Information Sources,“Choo iden- tifies the types of information sources suitable for use in scan- ning, and rates the types for quality and accessibility. The traits of information (for example, “Quantitative Continuum” and eight others) are described, and a comparison of human vs. tex- tual information sources is presented.

Chapter 7, “The lnternet and Online Database: Scanning on the Information Highway,” is both the most current chapter in the book and also the most likely to become dated in the years (or even months) ahead. The chapter discusses Internet Resources and Services and tables include the URLs of numer- ous World Wide Web sites, some of which may already have changed.

Chapter 8, “Learning to Be Intelligent,” pulls the book to- gether and “looks at some new tools and methodologies that the smart organization can use to understand the forces and dynamics that are shaping the future” (Choo, p. xv).

Supplemented with current readings, case studies, and some actual practice in environmental scanning (possibly on the Internet), Choo's Information Management for the Intelligent Organization will make a good textbook for courses covering the topic in both LIS and business/management schools. Although the corporate library as a source of information is deemed fairly accessible to management, the perceived quality of that source is relatively low compared to other sources (Choo, p. 144). If corporate libraries could improve their reliability and value as sources of environmental scanning information, their futures might be better assured in this phase of corporate downsizing. In sum, even though many of the statistical figures and Internet URLs are, or might soon become, dated, Choo's book is highly recommended as a current contribution to the somewhat sparse monographic literature on the subject.

Kenneth G. Madden
University of North Texas
UNT Box 5924
Denton, TX 76203
E-mail: [email protected]
http://www.unt.edu/~kgm0001/

Contextual Media: Multimedia and Interpretation. Edward Barrett and Marie Redmond, Eds. Cambridge, MA: The MIT Press; 1995: 262 pp. Price: $35.00. (ISBN 0-262-02383-0.)

Once again, Edward Barrett has not disappointed his readers and followers with this edited volume. Like previous anthologies of interesting articles on the related theme of the social construction of knowledge, Barrett and Redmond have included some very absorbing writing that questions, ponders, and reflects upon multimedia applications in a virtual environment. Readers will certainly see a progression in idea development if they consider this volume the third segment of a trilogy, following The Society of Text: Hypertext, Hypermedia and the Social Construction of Information (MIT Press, 1989) and Sociomedia: Multimedia, Hypermedia and the Social Construction of Knowledge (MIT Press, 1992). The latter is more of a text than a group of articles or papers, but this collection retains interest and curiosity and offers balance in perspective and ideas. There is an evenness among the chapters, and one could use the volume as a discussion piece in a seminar, or just for ideas, as one tests hypotheses for multimedia applications.

The chapters in Contextual Media were part of what must have been a marvelous 1993 conference in Dublin, Ireland, on "Culture, Technology, Interpretation: The Challenge of Multimedia," following up on Barrett's earlier works. The thesis covered by the conference was "how artists and humanists could use multimedia technology . . . and how multimedia technology might be used to place the conflicts of our respective cultures in some sort of meaningful context for analysis and understanding" (p. xiii). The collection of papers attempts to respond to the challenge of the "human need to make sense out of the welter of experience, whether in riot or in treaty. What tools could the computer give us to analyze this process of sense-making, to add voices to it, to create new contexts for understanding a specific action, or image, or object-to archive voices, images, thoughts that could be lost, marginalized, defeated, or could assume the status of a core? How could we modify those tools to suit our human needs" (p. xiv).

With such an introduction, readers cannot help but anticipate the most interesting explorations of how "thought and language in a virtual environment seek a higher synthesis, a re-imagining of an idea in the context of its truth" (ibid.). Each of the 13 papers focuses on a different application of combining language and media in a virtual environment; some ideas still seem light-years ahead of their time for most of us, while others are currently in practice and in the mainstream of multimedia access and products.

Michael Roy’s contribution on “How to Do Things without Words: the Multicultural Promise of Multimedia” introduces concepts of diversity and speculates how Eliot Weinberger’s earlier seminal writing on ethnographic fi lmmaking can be ap- plied today with digital tools and database resources. One can- not help but think how relevant and important Janet Murray’s comments in describing developments in multimedia comput- ing as “the invention of a medium.” Examples cited, such as the network capabilities of sending high-resolution images to simultaneous users and linking them to highly indexed data- bases. multidisciplinary coverage of broad subject areas, and databases and homepages created for Internet access, all suggest that there is a connection between multimedia and multicul- turalism. whether intended or not, and that Roy’s efforts to sup- port how multimedia scholarship and teaching lend to merging the more standard academic products with the current techno- logical and fast paced environments of television and video en- tertainment.

The last chapter, “Wheel ofculture” by Ben Davis urges the reader to look at the existing language structures that allow for multilevel understanding, and it is not difficult to conclude that without information technology and computers there really is not much. It is suggested that multimedia is currently thought of as a process of packaging information (p. 25 I ), but he de- fines multimedia as “not a thing, computer technology. It is the current literacy condition of the environment. Film, video, audio, text and graphics are ubiquitous: they are being created and distributed 24 hours a day every day. The air is literally filled with images. These forms are the mechanisms for indus- trial design, communication and resource monitoring” (p. 249). One realizes how critical good design, technical skill, and good ideas make for creating a hyperculture.

The work of museums cannot be overlooked when one reviews how fast multimedia is developing. Exhibits, interactive displays, and taking museum holdings public via Internet sites are what Colin Beardon and Suzette Wroden explore in their chapter, "The Virtual Curator: Multimedia Technologies and the Roles of Museums." Like libraries, museums are very organization-oriented but try to eliminate biases or censorship. The exploration of the use of the "Virtual Curator" (p. 74) and responses by viewers offer new insights into how important the current developments in information technology and multimedia are for design history, and for those interested in how people receive information and create their own statements or exhibits (p. 81).

Janet Murray shares her experiences teaching a course on the Structure and Interpretation of Nonlinear and Interactive Narrative at MIT and what evolves in students' understanding of the pedagogy of cyberfiction. She creates writing assignments that not only emphasize the literary criticism of the specific works being studied but also encourage students to write in a HyperCard environment, exploring an appropriate form and following three distinct paths of a narrative. Some of the examples of student projects indicate that the students understand this very well and stretch themselves in verse and form not previously explored or played with. This is a very inspirational chapter for those of us instructors trying to create new assignments and wondering how people organize their thoughts using this method of programming logic.

Many neophytes of multimedia have been introduced to various desktop publishing packages, and this may be the only time they have explored any kind of authoring utilizing multimedia concepts. Edward Brown and Mark Chignell introduce a new style of application development, the free-form environment, in their chapter, "End User as Developer: Free-Form Multimedia." They articulate this best by saying, "Free-form multimedia environments blur distinctions between developers, authors and readers. Participation of end users in the development process is important since end users will typically have the best understanding of their own needs and will best be able to evaluate prototypes during interactive design" (p. 198).

Although it is impossible to comment on each chapter or idea in this book, the themes Ricki Goldman-Segall explores in her paper touch all chords in using multimedia in the classroom and in organizing, analyzing, and interpreting video, text, and sound chunks. In "Deconstructing the Humpty Dumpty Myth: Putting It Together to Create Cultural Meaning," she tackles the hard relationships among culture, interpretation, and technology and how the Humpty Dumpty myth can be overcome to establish collaborative virtual communities. She proposes that "conducting multimedia research means (1) breaking up linear streams into pieces-Deconstruction; (2) analyzing them in new juxtapositions-Deconstruction; and (3) working with others to build new forms of representation-CoConstructionism" (pp. 29-30). Her research with school children indicates that they need tools to organize and interpret multimedia, and examples such as the Significance Measure and the Learning Constellation can pioneer in that way. This chapter highlights much of her work in multimedia ethnography.

The book is logically arranged and will serve all students and professionals in information technology, information pedagogy, critical thinking, and multimedia design well. They will feel inspired by this reading and will have many concrete examples to illustrate how increasingly important contextual media have become in our technically sophisticated lives. The potential for multimedia development is nowhere near fully realized, and learning establishments ranging from primary schools to research universities, cultural institutions like museums, and business/corporate environments all benefit from the opportunities multimedia affords. The challenge remains how to design and develop future multimedia systems, a challenge Barrett and Redmond are clearly committed to sharing. Chapter references provide a helpful bibliography of related works. Definitions are clearly given in every chapter, so the novice reader is not out of his or her league and the more informed reader has a greater context in which to place the understanding. The chapters blend well together and provide a pleasurable reading experience. This volume joins the growing body of literature on this topic and brings together some of the more theoretical orientations to multimedia.

Julia Gelfand
Applied Sciences Librarian
University of California, Irvine
Irvine, CA 92713-9556
E-mail: [email protected]

Learning Networks: A Field Guide to Teaching and Learning Online. Linda Harasim, Starr Roxanne Hiltz, Lucia Teles, and Murray Turoff. Cambridge, MA: The MIT Press; 1995; 329 pp. Price: $35.00. (ISBN 0-262-08236-5.)

The authors define learning networks as "groups of people who use CMC [Computer Mediated Communication] networks to learn together, at the time, place and pace that best suits them and is appropriate to the task" (p. 3). "Learning" in this context is a cooperative enterprise, with the personal participation of both the teacher and the learner as the key. Thus, the book is more about networking learners than about learning networks. One of the book's themes is to facilitate the transition from traditional classroom teacher to online teacher. The authors summarize research done over the relatively short but active life of learning networks into a handbook promoting the theme that teaching and learning through networks, with the active involvement of the learner, has significant advantages over traditional forms of classroom instruction. Another theme of the book is to foster an approach to learning in which, in addition to teachers talking to learners and learners talking to teachers, learners are encouraged to talk to, participate with, and learn from other learners. The online learning environment effectively erases the time and distance constraints of the traditional classroom. It allows both student and teacher to put their best work forward over E-mail or bulletin boards, because responses do not necessarily have to be immediate.

Because this book treats the subject from the perspective of past experience, the software tools of computer-mediated communication are largely text-based tools: Electronic mail, bulletin boards, computer conferencing, file transfers, gophers, and various electronic resources such as exist on the Internet. Depending on the learning situation, these tools are supplemented by various traditional media such as textbooks, handouts, video and audio recordings, etc. The hardware recommended is basic. Throughout the book, the authors place less emphasis on the hardware and the software than on the effect their use has in stimulating interaction between teacher and learner. Computers with software appropriate for dial-up access through modem lines from home, or direct wire at schools, can be used effectively to promote networked learning. The book emphasizes exchange of text messages between teacher and learner and among students. Electronic networks need not be sophisticated to implement a learning network. Internet access need not be part of the online classroom, but obviously access to it and its contents can greatly enhance learning.

The authors divide the handbook into an introduction and three parts. The introduction (Chapter 1) provides the scope of the work, historical perspectives, educational underpinnings, and electronic tools useful to the learning network. Part I, "The Field," explores learning networks used at different educational levels. Chapter 2, "Networks for Schools," examines a number of elementary and secondary school experiments case by case. For each, it describes the situations, the framework, the logistics, the goals, and the accomplishments. It also discusses failures and the reasons why a class did not perform as it should have. Where available, the authors give statistical findings and references for further reading. Chapter 3, "Higher Education, Training and Informal Learning," turns the focus from elementary and secondary education to online experiences in college and university instruction, business training, and self-education. The authors differentiate between online learning in elementary/secondary education and higher education by noting the use of online networks in the former as a ". . . supplement or adjunct to regular instruction" (p. 77). In higher education, the "adjunct mode" is also the most common form of online learning; but, ". . . two additional modes of networking are also widely used in postsecondary courses: mixed mode, in which a significant portion of a face-to-face or distance education class is conducted by email or computer conferencing, and totally online mode, in which the network serves as the primary environment for course discussion, assignments and interactions" (p. 77).

JOURNAL OF THE AMERICAN SOCIETY FOR INFORMATION SCIENCE-October 1996 797

Part II, "The Guide," is a guide to establishing a course or a course component using a learning network. Chapter 4, "Designs for Learning Networks," reviews techniques and structures of online course design. While it does not attempt to teach topics such as "simulation" from the perspective of a beginner, it does point to specifically online programs such as "MUDs" (Multiple User Dialogues) and "MUSEs" (Multiple User Simulations) as structures peculiar to online simulation. The chapter reminds the teacher of the factors that make a successful course design, including careful planning of curriculum, the integration of other types of media and tools, and planning the specific tasks to be required of the students.

Chapter 5, "Getting Started: The Implementation Process," advises on teaching through a learning network with emphasis on preassessment and postassessment of student skills. The authors emphasize the necessity for simplicity in choosing the software associated with the class, including the teaching of computer skills prior to beginning. The chapter discusses techniques for teachers to encourage student participation and expression from the outset of the class. It emphasizes that the time commitment to instruction in computer use on the part of both teacher and student may exceed that of the traditional classroom. Chapter 6, "Teaching Online," focuses on the teachers' experiences and potential in online learning. Summarizing, the authors state that "Teachers, trainers, and professors with years of experience in classrooms report that computer networking encourages the high-quality interaction and sharing that is the heart of education. There can be close and daily contact between the student and the teacher and among all the students, regardless of their appearance, location, or assertiveness" (p. 173). Topics covered include the role of the teacher, how to set the stage for learning, how to monitor and encourage participation, facilitating group interaction, and grading student performance.

Chapter 7, "Learning Online," looks at online learning from the student viewpoint. The authors make the point, well taken, that online learning is new to most students and carries its own set of difficulties. They advise the encouragement of intensive communication with other students and the utilization of remote computer resources through the Internet, Archie, file transfers, electronic mail, and news services as prerequisites for success. Chapter 8, "Problems in Paradise: Expect the Best, Prepare for the Worst," highlights technical problems, communication anxiety, information overload, getting and keeping a conversation going, balancing cooperation and competition among students, and student failure to find enough time for the online classroom.

Part III, "The Future," attempts to look ahead and see directions learning through networks might take. The authors project the potential future importance of learning networks in Chapter 9, "New Directions." They see contemporary problems that learning networks could creatively address in elementary and secondary education. An example is school districts where a growing population would require expensive local school construction, only to have a population shift several generations later and leave expensive physical plants unused. The chapter underscores that learning networks are capable of delivering education with little regard to geographic proximity. Furthermore, learning networks allow for the recruitment of teachers irrespective of location. Indeed, the learning network would allow teachers who gave up their careers to raise their own children an opportunity to reenter the workplace as remote-location instructors. Similarly, colleges and universities would be able to attract students irrespective of distance. An important area of development is international learning. Indeed, the promise is that on an individual level, a school level, and a business and professional level, there are advantages to learning networks built across national borders. The future also holds the development of additional software for the delivery of graphic images, motion images in various formats, more sophisticated interactive software, hypertext, hypermedia, and class and discussion management software to enhance online learning. In Chapter 10, "Network Learning: A Paradigm for the Twenty-First Century," the authors see the future of education as a paradigm shift to increased use of the learning network, which will enable schools to overcome physical restrictions of time and place. It will be simultaneously a shift from competitive learning to a new kind of cooperative learning. Further, online learning is put forth as an approach to lifelong learning.

Finally, Chapter 11, "Epilogue: Email from the Future," gives a fictional newsletter from the year 2015. The news stories are intended to provoke thought about the economic and educational implications which have occurred or might occur if the network model begins to be implemented on a wide scale. The chapter suggests that the future implementation of learning networks holds considerable controversy yet to be resolved over certification issues, closing of physical facilities (even whole schools), and downsizing the physical plants used by conventional education. Appendices A through E give lists of online resources and organizations, both commercial and public. In the fast-changing world of online resources, these represent only a sampling and will be outdated quickly. Sample URLs (Uniform Resource Locators) of the kind of resources available are cited. Appendix F is a "Sample Course Description and Letter to Online Students" and Appendix G is a set of "Annotated Excerpts from an Online Course." A bibliography and an index complete the book.

Learning Networks is a handbook for someone actively interested in teaching online using the authors' participative model. The authors write with enthusiasm and give advice from the perspective of experience. The book is about learning online, and its point, that computers provide a powerful medium for a variety of learning situations, is an important one in an age when online learning is in its infancy. Learning Networks is, however, a handbook, and at times topics tend to be only briefly considered. Nevertheless, it is a groundbreaking work on the human preparation for teaching online, and it touches on many ideas for which additional research can and ought to be done.

Robert Wittorf
University Libraries of Notre Dame
221 Theodore M. Hesburgh Library
University of Notre Dame
Notre Dame, IN 46556
E-mail: [email protected]
