
O N T O L O G I E S

What Are Ontologies, and Why Do We Need Them?
B. Chandrasekaran and John R. Josephson, Ohio State University
V. Richard Benjamins, University of Amsterdam

THEORIES IN AI FALL INTO TWO broad categories: mechanism theories and content theories. Ontologies are content theories about the sorts of objects, properties of objects, and relations between objects that are possible in a specified domain of knowledge. They provide potential terms for describing our knowledge about the domain.

In this article, we survey the recent development of the field of ontologies in AI. We point to the somewhat different roles ontologies play in information systems, natural-language understanding, and knowledge-based systems. Most research on ontologies focuses on what one might characterize as domain factual knowledge, because knowledge of that type is particularly useful in natural-language understanding. There is another class of ontologies that are important in KBS—one that helps in sharing knowledge about reasoning strategies or problem-solving methods. In a follow-up article, we will focus on method ontologies.

Ontology as vocabulary

In philosophy, ontology is the study of the kinds of things that exist. It is often said that ontologies “carve the world at its joints.” In AI, the term ontology has largely come to mean one of two related things. First of all, ontology is a representation vocabulary, often specialized to some domain or subject matter. More precisely, it is not the vocabulary as such that qualifies as an ontology, but the conceptualizations that the terms in the vocabulary are intended to capture. Thus, translating the terms in an ontology from one language to another, for example from English to French, does not change the ontology conceptually. In engineering design, you might discuss the ontology of an electronic-devices domain, which might include vocabulary that describes conceptual elements—transistors, operational amplifiers, and voltages—and the relations between these elements—operational amplifiers are a type-of electronic device, and transistors are a component-of operational amplifiers. Identifying such vocabulary—and the underlying conceptualizations—generally requires careful analysis of the kinds of objects and relations that can exist in the domain.
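The type-of and component-of relations above can be sketched as small data structures. This is only a minimal illustration under our own assumptions; the dictionaries and the is_a helper are not part of any published ontology formalism.

```python
# Concepts linked by type-of (taxonomy) and component-of (partonomy).
# All names here are illustrative, following the article's example.
type_of = {
    "operational-amplifier": "electronic-device",
    "transistor": "electronic-device",
}
component_of = {
    "transistor": "operational-amplifier",
}

def is_a(concept, category):
    """Follow type-of links upward to test category membership."""
    while concept is not None:
        if concept == category:
            return True
        concept = type_of.get(concept)
    return False

print(is_a("operational-amplifier", "electronic-device"))  # True
```

The point is only that the conceptualization, not the particular encoding, carries the ontological content.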

In its second sense, the term ontology is sometimes used to refer to a body of knowledge describing some domain, typically a commonsense knowledge domain, using a representation vocabulary. For example, CYC1 often refers to its knowledge representation of some area of knowledge as its ontology.

In other words, the representation vocabulary provides a set of terms with which to describe the facts in some domain, while the body of knowledge using that vocabulary is a collection of facts about a domain. However, this distinction is not as clear as it might first appear. In the electronic-device example, that a transistor is a component-of an operational amplifier, or that the latter is a type-of electronic device, is just as much a fact about its domain as a CYC fact about some aspect of space, time, or numbers. The distinction is that the former emphasizes the use of ontology as a set of terms for representing specific facts in an instance of the domain, while the latter emphasizes the view of ontology as a general set of facts to be shared.

THIS SURVEY PROVIDES A CONCEPTUAL INTRODUCTION TO ONTOLOGIES AND THEIR ROLE IN INFORMATION SYSTEMS AND AI. THE AUTHORS ALSO DISCUSS HOW ONTOLOGIES CLARIFY THE DOMAIN’S STRUCTURE OF KNOWLEDGE AND ENABLE KNOWLEDGE SHARING.

20 1094-7167/99/$10.00 © 1999 IEEE IEEE INTELLIGENT SYSTEMS

There continue to be inconsistencies in the usage of the term ontology. At times, theorists use the singular term to refer to a specific set of terms meant to describe the entity and relation types in some domain. Thus, we might speak of an ontology for “liquids” or for “parts and wholes.” Here, the singular term stands for the entire set of concepts and terms needed to speak about phenomena involving liquids and parts and wholes.

When different theorists make different proposals for an ontology, or when we speak about ontology proposals for different domains of knowledge, we would then use the plural term ontologies to refer to them collectively. In AI and information-systems literature, however, there seems to be inconsistency: sometimes we see references to “ontology of domain” and other times to “ontologies of domain,” both referring to the set of conceptualizations for the domain. The former is more consistent with the original (and current) usage in philosophy.

Ontology as content theory

The current interest in ontologies is the latest version of AI’s alternation of focus between content theories and mechanism theories. Sometimes, the AI community gets excited by some mechanism such as rule systems, frame languages, neural nets, fuzzy logic, constraint propagation, or unification. The mechanisms are proposed as the secret of making intelligent machines. At other times, we realize that, however wonderful the mechanism, it cannot do much without a good content theory of the domain on which it is to work. Moreover, we often recognize that once a good content theory is available, many different mechanisms might be used equally well to implement effective systems, all using essentially the same content.2

AI researchers have made several attempts to characterize the essence of what it means to have a content theory. McCarthy and Hayes’ theory (epistemic versus heuristic distinction),3 Marr’s three-level theory (information-processing strategy level, algorithms and data structures level, and physical mechanisms level),4 and Newell’s theory (Knowledge Level versus Symbol Level)5 all grapple in their own ways with characterizing content. Ontologies are quintessentially content theories, because their main contribution is to identify specific classes of objects and relations that exist in some domain. Of course, content theories need a representation language. Thus far, predicate calculus-like formalisms, augmented with type-of relations (that can be used to induce class hierarchies), have been most often used to describe the ontologies themselves.

Why are ontologies important?

Ontological analysis clarifies the structure of knowledge. Given a domain, its ontology forms the heart of any system of knowledge representation for that domain. Without ontologies, or the conceptualizations that underlie knowledge, there cannot be a vocabulary for representing knowledge. Thus, the first step in devising an effective knowledge-representation system, and vocabulary, is to perform an effective ontological analysis of the field, or domain. Weak analyses lead to incoherent knowledge bases.

An example of why good analysis is necessary comes from the field of databases.6 Consider a domain having several classes of people (for example, students, professors, employees, females, and males). This study first examined the way such a database would commonly be organized: students, employees, professors, males, and females would be represented as types-of the class humans. However, this ontology is problematic: students can also be employees at times, and can also stop being students. Further analysis showed that the terms students and employees do not describe categories of humans but are roles that humans can play, while terms such as females and males more appropriately represent subcategories of humans. Clarifying the terminology in this way yields an ontology that supports coherent and cohesive reasoning.
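The role-versus-subcategory distinction can be sketched in a few lines. This is a minimal illustration (the class and attribute names are our own assumptions, not from the cited database study): modeling student and employee as roles lets the same person hold both, and later shed one, without incoherence.

```python
# Sex is modeled as a subcategory (a fixed classification), while
# student/employee are roles that can be acquired and shed over time.
class Human:
    def __init__(self, name, sex):
        self.name = name
        self.sex = sex          # subcategory: "female" or "male"
        self.roles = set()      # roles held at the moment

    def take_role(self, role):
        self.roles.add(role)

    def drop_role(self, role):
        self.roles.discard(role)

pat = Human("Pat", "female")
pat.take_role("student")
pat.take_role("employee")   # the same person can hold both roles
pat.drop_role("student")    # ...and later stop being a student
print(pat.roles)            # {'employee'}
```

Had student been a subclass of Human, Pat could not cleanly stop being one.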

Second, ontologies enable knowledge sharing. Suppose we perform an analysis and arrive at a satisfactory set of conceptualizations, and their representative terms, for some area of knowledge—for example, the electronic-devices domain. The resulting ontology would likely include domain-specific terms such as transistors and diodes; general terms such as functions, causal processes, and modes; and terms that describe behavior, such as voltage. The ontology captures the intrinsic conceptual structure of the domain. In order to build a knowledge representation language based on the analysis, we need to associate terms with the concepts and relations in the ontology and devise a syntax for encoding knowledge in terms of the concepts and relations. We can share this knowledge representation language with others who have similar needs for knowledge representation in that domain, thereby eliminating the need for replicating the knowledge-analysis process. Shared ontologies can thus form the basis for domain-specific knowledge-representation languages. In contrast to the previous generation of knowledge-representation languages (such as KL-One), these languages are content-rich; they have a large number of terms that embody a complex content theory of the domain.

Shared ontologies let us build specific knowledge bases that describe specific situations. For example, different electronic-devices manufacturers can use a common vocabulary and syntax to build catalogs that describe their products. Then the manufacturers could share the catalogs and use them in automated design systems. This kind of sharing vastly increases the potential for knowledge reuse.

Describing the world

JANUARY/FEBRUARY 1999 21

We can use the terms provided by the domain ontology to assert specific propositions about a domain or a situation in a domain. For example, in the electronic-device domain, we can represent a fact about a specific circuit: circuit 35 has transistor 22 as a component, where circuit 35 is an instance of the concept circuit and transistor 22 is an instance of the concept transistor. Once we have the basis for representing propositions, we can also represent knowledge involving propositional attitudes (such as hypothesize, believe, expect, hope, desire, and fear). Propositional attitude terms take propositions as arguments. Continuing with the electronic-device domain, we can assert, for example: the diagnostician hypothesizes or believes that part 2 is broken, or the designer expects or desires that the power plant has an output of 20 megawatts. Thus, an ontology can represent beliefs, goals, hypotheses, and predictions about a domain, in addition to simple facts. The ontology also plays a role in describing such things as plans and activities, because these also require specification of world objects and relations. Propositional attitude terms are also part of a larger ontology of the world, useful especially in describing the activities and properties of the special class of objects in the world called “intensional entities”—for example, agents such as humans who have mental states.
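As a rough sketch, propositions and propositional attitudes can be encoded as nested tuples. The encoding below is an illustrative assumption of ours, not the article’s notation or any standard formalism; its only point is that an attitude statement takes a whole proposition as an argument.

```python
# A proposition as a (predicate, *arguments) tuple over domain instances.
fact = ("component-of", "transistor-22", "circuit-35")

# Propositional-attitude statements wrap a proposition with an agent
# and an attitude term, mirroring the article's examples.
hypothesis = ("hypothesizes", "diagnostician", ("broken", "part-2"))
desire = ("desires", "designer", ("output-megawatts", "power-plant", 20))

def attitude_of(statement):
    """Split an attitude statement into (attitude, agent, proposition)."""
    attitude, agent, proposition = statement
    return attitude, agent, proposition

print(attitude_of(hypothesis)[0])  # hypothesizes
```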

Constructing ontologies is an ongoing research enterprise. Ontologies range in abstraction, from very general terms that form the foundation for knowledge representation in all domains, to terms that are restricted to specific knowledge domains. For example, space, time, parts, and subparts are terms that apply to all domains; malfunction applies to engineering or biological domains; and hepatitis applies only to medicine.

Even in cases where a task might seem to be quite domain-specific, knowledge representation might call for an ontology that describes knowledge at higher levels of generality. For example, solving problems in the domain of turbines might require knowledge expressed using domain-general terms such as flows and causality. Such general-level descriptive terms are called the upper ontology or top-level ontology. There are many open research issues about the correct ways to analyze knowledge at the upper level. To provide some idea of the issues involved, Figure 1 excerpts a quote from a recent call for papers.

Today, ontology has grown beyond philosophy and now has many connections to information technology. Thus, research on ontology in AI and information systems has had to produce pragmatically useful proposals for top-level ontology. The organization of a top-level ontology raises a number of problems, similar to the problems that surround ontology in philosophy. For example, many ontologies have thing or entity as their root class. However, Figure 2 illustrates that thing and entity start to diverge at the next level.

For example, CYC’s thing has the subcategories individual object, intangible, and represented thing; the Generalized Upper Model’s7 (GUM) um-thing has the subcategories configuration, element, and sequence; Wordnet’s8 thing has the subcategories living thing and nonliving thing; and Sowa’s root T has the subcategories concrete, process, object, and abstract. (Natalya Fridman Noy’s and Carol Hafner’s article discusses these differences more fully.9) Some of these differences arise because not all of these ontologies are intended to be general-purpose tools, or even explicitly to be ontologies. Another reason for the differences is that, in principle, there are many different taxonomies.

Although differences exist within ontologies, general agreement exists between ontologies on many issues:

• There are objects in the world.
• Objects have properties or attributes that can take values.
• Objects can exist in various relations with each other.
• Properties and relations can change over time.
• There are events that occur at different time instants.
• There are processes in which objects participate and that occur over time.
• The world and its objects can be in different states.
• Events can cause other events or states as effects.
• Objects can have parts.

The representational repertoire of objects, relations, states, events, and processes does not say anything about which classes of these entities exist. The modeler of the domain makes these commitments. As we move from an ontology’s top to lower taxonomic levels, commitments specific to domains and phenomena appear. For modeling objects on earth, we can make certain commitments. For example, animals, minerals, and plants are subcategories of objects; has-life(x) and contains-carbon(x) are object properties; and can-eat(x, y) is a possible relation between any two objects. These commitments are specific to objects and phenomena in this domain. Further, the commitments are not arbitrary. For them to be useful, they should reflect some underlying reality.
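A minimal sketch of such commitments, using the example predicates above (the Python encoding and the deliberately crude rules are assumptions for illustration only):

```python
# Subcategory commitments for modeling objects on earth.
subcategory_of = {
    "animal": "object",
    "mineral": "object",
    "plant": "object",
}

def has_life(x):
    # has-life(x): a property holding for animals and plants, not minerals.
    return subcategory_of.get(x) == "object" and x in ("animal", "plant")

def can_eat(x, y):
    # can-eat(x, y): a possible relation between any two objects; the
    # rule here is purely illustrative, not a biological claim.
    return has_life(x) and y in subcategory_of

print(has_life("animal"), has_life("mineral"))  # True False
```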

There is no sharp division between domain-independent and domain-specific ontologies for representing knowledge. For example, the terms object, physical object, device, engine, and diesel engine all describe objects, but in an order of increasing domain specificity. Similarly, terms for relations between objects can span a range of specificity, such as connected, electrically-connected, and soldered-to.

Subtypes of concepts. Ontologies generally appear as a taxonomic tree of conceptualizations, from very general and domain-independent at the top levels to increasingly domain-specific further down in the hierarchy. We mentioned earlier that different ontologies propose different subtypes of even very general concepts. This is because, as a rule, different sets of subcategories will result from different criteria for categorization. Two, among many, alternate subcategorizations of the general concept object are physical and abstract, and living and nonliving. In some cultures and languages, words for objects have gender, thus creating another top-level classification along the gender axis. We can easily think of additional subcategorizations based on other criteria. The existence of alternate categorizations only becomes more acute as we begin to model specific domains of knowledge. For example, we can subcategorize causal process into continuous and discrete causal processes along the dimension of how time is represented, and into mechanical, chemical, biological, cognitive, and social processes along the dimension of the kinds of objects and relations involved in the description.
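Cross-classification along independent dimensions can be sketched directly. The dimension names follow the causal-process example above; the encoding is an illustrative assumption, not a proposed representation.

```python
from itertools import product

# Two independent classification dimensions for "causal process".
dimensions = {
    "time-representation": ["continuous", "discrete"],
    "objects-involved": ["mechanical", "chemical", "biological",
                         "cognitive", "social"],
}

# Each combination of choices names one cross-classified subtype,
# so two dimensions of sizes 2 and 5 yield 10 subtypes.
subtypes = [f"{t} {k} causal process"
            for t, k in product(*dimensions.values())]

print(len(subtypes))  # 10
```

Adding a dimension multiplies the subtypes, which is why the number of distinct subtypes is unbounded in principle.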

In principle, the number of classification criteria and distinct subtypes is unlimited, because the number of possible dimensions along which to develop subcategories cannot be exhaustively specified. Often, this fact is not obvious in general-purpose ontologies, because the top levels of such ontologies commit to the most commonly useful subtypes. However, domain-specific ontologies can contain categorizations along dimensions that are usually outside the general ontology.

Figure 1. Call for papers for a special issue on temporal parts for The Monist, An International Quarterly Journal of General Philosophical Inquiry. This quote suggests that ontology has always been an issue of deep concern in philosophy and that the issues continue to occupy contemporary philosophers.

On the one hand there are entities, such as processes and events, which have temporal parts.… On the other hand there are entities, such as material objects, which are always present in their entirety at any time at which they exist at all. The categorical distinction between entities which do, and entities which do not, have temporal parts is grounded in common sense. Yet various philosophers have been inclined to oppose it. Some … have defended an ontology consisting exclusively of things with no temporal parts. Whiteheadians have favored ontologies including only temporally extended processes. Quine has endorsed a four-dimensional ontology in which the distinction between objects and processes vanishes and every entity comprises simply the content of some arbitrarily demarcated portion of space-time. One further option, embraced by philosophers such as David Lewis, accepts the opposition between objects and processes, while still finding a way to allow that all entities have both spatial and temporal parts.

Task dependence of ontologies. How task-dependent are ontologies? Presumably, the kinds of things that actually exist do not depend on our goals. In that sense, ontologies are not task-dependent. On the other hand, what aspects of reality are chosen for encoding in an ontology does depend on the task. For example, in the domain of fruits, we would focus on particular aspects of reality if we were developing the ontology for the selection of pesticides; we would focus on other aspects of reality if we were developing an ontology to help chefs select fruits for cooking. In ontologies for engineering applications, categorizing causal processes into those that do, and that do not, produce dangerous side effects might be useful. Design engineers and safety analysts might find this a very useful categorization, though it is unlikely to be part of a general-purpose ontology’s view of the causal process concept.

Practically speaking, an ontology is unlikely to cover all possible potential uses. In that sense, both an ontology for a domain and a knowledge base written using that ontology are likely to be more appropriate for certain uses than others and unlikely to be sharable across widely divergent tasks. This is, by now, a truism in KBS research and is the basic insight that led to the current focus on the relationship between tasks and knowledge types. Presuppositions or requirements can be associated with problem-solving methods for different tasks so that they capture explicitly the way in which ontologies are task-dependent. For example, a method might have a presupposition (or assumption10) stating that it works correctly only if the ontology allows modeling causal processes discretely. Therefore, assumptions are a key factor in practical sharing of ontologies.

Technology for ontology sharing

There have been several recent attempts to create engineering frameworks for constructing ontologies. Michael R. Genesereth and Richard E. Fikes describe KIF (Knowledge Interchange Format), an enabling technology that facilitates expressing domain factual knowledge using a formalism based on augmented predicate calculus.11 Robert Neches and his colleagues describe a knowledge-sharing initiative,12 while Thomas R. Gruber has proposed a language called Ontolingua to help construct portable ontologies.13 In Europe, the CommonKADS project has taken a similar approach to modeling domain knowledge.14

These languages use varieties of predicate calculus as the basic formalism. Predicate calculus facilitates the representation of objects, properties, and relations. Variations such as situational calculus introduce time so as to represent states, events, and processes. If we extend the idea of knowledge to include images and other sense modalities, we might need radically different kinds of representation. For now, predicate calculus provides a good starting point for ontology-sharing technologies.
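The time-argument idea can be sketched as follows. The relation names, instance names, and tuple encoding here are illustrative assumptions of ours, not KIF syntax or the situational calculus proper; the point is only that adding an explicit time argument lets the same predicate express states at, and events occurring at, particular times.

```python
# Ground facts carry an explicit time argument as the final element.
facts = {
    ("connected", "r1", "c3", "t0"),    # a state holding at time t0
    ("replace", "tech-1", "r1", "t1"),  # an event occurring at time t1
}

def holds(*atom):
    """True if the time-stamped atom is among the asserted facts."""
    return atom in facts

print(holds("connected", "r1", "c3", "t0"))  # True
```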

Using a logical notation for writing and sharing ontologies does not imply any commitment to implementing a related knowledge system or a related logic. We are simply taking a knowledge-level5 stance in describing the knowledge system, whatever the means of implementation. In this view, we can ask of any intelligent system, even one implemented as a neural network, “What does the system know?”

Use of ontologies

In AI, knowledge in computer systems is thought of as something that is explicitly represented and operated on by inference processes. However, that is an overly narrow view. All information systems traffic in knowledge. Any software that does anything useful cannot be written without a commitment to a model of the relevant world—to entities, properties, and relations in that world. Data structures and procedures implicitly or explicitly make commitments to a domain ontology. It is common to ask whether a payroll system “knows” about the new tax law, or whether a database system “knows” about employee salaries. Information-retrieval systems, digital libraries, integration of heterogeneous information sources, and Internet search engines need domain ontologies to organize information and direct the search processes. For example, a search engine has categories and subcategories that help organize the search. The search-engine community commonly refers to these categories and subcategories as ontologies.

Object-oriented design of software systems similarly depends on an appropriate domain ontology. Objects, their attributes, and their procedures more or less mirror aspects of the domain that are relevant to the application. Object systems representing a useful analysis of a domain can often be reused for a different application program. Object systems and ontologies emphasize different aspects, but we anticipate that over time convergence between these technologies will increase. As information systems model large knowledge domains, domain ontologies will become as important in general software systems as in many areas of AI.

In AI, while knowledge representation pervades the entire field, two application areas in particular have depended on a rich body of knowledge. One of them is natural-language understanding. Ontologies are useful in NLU in two ways. First, domain knowledge often plays a crucial role in disambiguation. A well-designed domain ontology provides the basis for domain knowledge representation. In addition, the ontology of a domain helps identify the semantic categories that are involved in understanding discourse in that domain. For this use, the ontology plays the role of a concept dictionary. In general, for NLU, we need both a general-purpose upper ontology and a domain-specific ontology that focuses on the domain of discourse (such as military communications or business stories). CYC, Wordnet,8 and Sensus15 are examples of sharable ontologies that have been used for language understanding.

Figure 2. Illustration of how ontologies differ in their analyses of the most general concepts: CYC’s Thing (Individual object, Intangible, Represented thing); GUM’s Um-Thing (Configuration, Element, Sequence); Wordnet’s Thing (Living, Nonliving); and Sowa’s root (Concrete, Process, Object, Abstract).

Knowledge-based problem solving is the second area in AI that is a big consumer of knowledge. KBPS systems solve a variety of problems—such as diagnosis, planning, and design—by using a rich body of knowledge. Currently, KBPS systems employ domain-specific knowledge, which is often sufficient for constructing knowledge systems that target specific application areas and tasks. However, even in specific application areas, knowledge systems can fail catastrophically when they are pushed to the edge of the capability of the domain-specific knowledge. In response to this particular shortcoming, researchers have proposed that problem-solving systems need commonsense knowledge in addition to domain-specific knowledge. The initial motivation for CYC was to provide such a body of sharable commonsense knowledge for knowledge-based systems. There is a similar need for developing domain-specific knowledge. Thus, ontology-based knowledge-base development provides a double advantage. The ontologies themselves are sharable. With these ontologies, we can build knowledge bases using the structure of conceptualizations to encode specific pieces of knowledge. The knowledge bases that we develop using these ontologies can be shared more reliably, because the formal ontology that underlies them can help clarify the representation’s semantics.

Information systems and NLU systems need factual knowledge about their domains of discourse. The inferences they make are usually simple. Problem-solving systems, in contrast, engage in complex sequences of inferences to achieve their goals. Such systems need to have reasoning strategies that enable them to choose among alternative reasoning paths. Ontology specification in knowledge systems has two dimensions:

• Domain factual knowledge provides knowledge about the objective realities in the domain of interest (objects, relations, events, states, causal relations, and so forth).
• Problem-solving knowledge provides knowledge about how to achieve various goals. A piece of this knowledge might be in the form of a problem-solving method specifying—in a domain-independent manner—how to accomplish a class of goals.

Most early research in KBPS mixed factual and problem-solving knowledge into highly domain-specific rules, called domain knowledge. As research progressed, it became clear that there were systematic commonalities in reasoning strategies between goals of similar types. These reasoning strategies were also characterized by their need for specific types of domain factual knowledge. It soon became clear that strategic knowledge could be abstracted and reused.

Related work

The field of ontology attracts an interdisciplinary mix of researchers, both from academia and industry. Here we give a selection of references that describe related ontology work. Because the literature is vast, a complete list is impossible. For an extensive collection of (alphabetically ordered) links to ontological work, including proceedings and events, see http://www.cs.utexas.edu/users/mfkb/related.html.

Special issues on ontology

• N. Guarino and R. Poli, “The Role of Ontology in the Information Technology,” Int’l J. Human-Computer Studies, Vol. 43, Nos. 5/6, Nov.-Dec. 1995, pp. 623–965.

• G. Van Heijst, A.T. Schreiber, and B.J. Wielinga, “Using Explicit Ontologies in KBS Development,” Int’l J. Human-Computer Studies, Vol. 46, Nos. 2/3, Feb.-Mar. 1997, pp. 183–292.

• M. Uschold and A. Tate, “Putting Ontologies to Use,” Knowledge Eng. Rev., Vol. 13, No. 1, Mar. 1998, pp. 1–3.

Ontology development

• J. Benjamin et al., “Ontology Construction for Technical Domains,” Proc. EKAW ’96: European Knowledge Acquisition Workshop, Lecture Notes in Artificial Intelligence No. 1076, Springer-Verlag, Berlin, 1996, pp. 98–114.

• W.N. Borst and J.M. Akkermans, “Engineering Ontologies,” Int’l J. Human-Computer Studies, Vol. 46, Nos. 2/3, Feb.-Mar. 1997, pp. 365–406.

• A. Farquhar, R. Fikes, and J. Rice, “The Ontolingua Server: A Tool for Collaborative Ontology Construction,” Int’l J. Human-Computer Studies, Vol. 46, No. 6, June 1997, pp. 707–728.

• A. Gomez-Perez, A. Fernandez, and M.D. Vicente, “Towards a Method to Conceptualize Domain Ontologies,” Working Notes 1996 European Conf. Artificial Intelligence (ECAI ’96) Workshop on Ontological Eng., ECCAI, Budapest, Hungary, 1996, pp. 41–52.

• T.R. Gruber, “Towards Principles for the Design of Ontologies Used for Knowledge Sharing,” Int’l J. Human-Computer Studies, Vol. 43, Nos. 5/6, Nov.-Dec. 1995, pp. 907–928.

• R. Studer, V.R. Benjamins, and D. Fensel, “Knowledge Engineering, Principles, and Methods,” Data and Knowledge Eng., Vol. 25, Mar. 1998, pp. 161–197.

• M. Uschold and M. Gruninger, “Ontologies: Principles, Methods, and Applications,” Knowledge Eng. Rev., Vol. 11, No. 2, Mar. 1996, pp. 93–155.

Natural-language ontology

• J.A. Bateman, B. Magini, and F. Rinaldi, “The Generalized Upper Model,” Working Papers 1994 European Conf. Artificial Intelligence (ECAI ’94) Workshop on Implemented Ontologies, 1994, pp. 34–45; http://www.darmstadt.gmd.de/publish/komet/papers/ecai94.ps.

• K. Knight and S. Luk, “Building a Large-Scale Knowledge Base for Machine Translation,” Proc. AAAI ’94, AAAI Press, Menlo Park, Calif., 1994.

• G.A. Miller, “Wordnet: An Online Lexical Database,” Int’l J. Lexicography, Vol. 3, No. 4, 1990, pp. 235–312.

• P.E. Van de Vet, P.H. Speel, and N.J.I. Mars, “The Plinius Ontology of Ceramic Materials,” Working Papers 1994 European Conf. Artificial Intelligence (ECAI ’94) Workshop on Implemented Ontologies, ECCAI, Amsterdam, 1994, pp. 187–206.

Ontologies and information sources

• Y. Arens et al., “Retrieving and Integrating Data from Multiple Information Sources,” Int’l J. Intelligent and Cooperative Information Systems, Vol. 2, No. 2, 1993, pp. 127–158.

• S. Chawathe, H. Garcia-Molina, and J. Widom, “Flexible Constraint Management for Autonomous Distributed Databases,” IEEE Data Eng. Bulletin, Vol. 17, No. 2, 1994, pp. 23–27.

• S. Decker et al., “Ontobroker: Ontology-Based Access to Distributed and Semi-Structured Information,” Semantic Issues in Multimedia Systems, R. Meersman et al., eds., Kluwer Academic Publishers, Boston, 1999.

With few exceptions,16,17 the domain factual knowledge dimension drives the focus of most of the AI investigations on ontologies. This is because applications to language understanding motivate much of the work on ontologies. Even CYC, which was originally motivated by the need for knowledge systems to have world knowledge, has been tested more in natural-language than in knowledge-systems applications.

KBPS RESEARCHERS REALIZED that, in addition to factual knowledge, there is knowledge about how to achieve problem-solving goals. In fact, this emphasis on methods appropriate for different types of problems fueled second-generation research in knowledge systems.18 Most of the KBPS community’s work on knowledge representation is not well-known to the general knowledge-representation community. In the coming years, we expect an increased focus on method ontologies as a sharable knowledge resource.

Acknowledgments
This article is based on work supported by the Office of Naval Research under Grant N00014-96-1-0701. We gratefully acknowledge the support of ONR and the DARPA RaDEO program. Any opinions, findings, and conclusions or recommendations expressed in this publication are those of the authors and do not necessarily reflect the views of ONR. The Netherlands Computer Science Research Foundation supported Richard Benjamins, with financial support from the Netherlands Organization for Scientific Research (NWO).

References

1. D.B. Lenat and R.V. Guha, Building Large Knowledge-Based Systems: Representation and Inference in the CYC Project, Addison-Wesley, Reading, Mass., 1990.

2. B. Chandrasekaran, “AI, Knowledge, and the Quest for Smart Systems,” IEEE Expert, Vol. 9, No. 6, Dec. 1994, pp. 2–6.

3. J. McCarthy and P.J. Hayes, “Some Philosophical Problems from the Standpoint of Artificial Intelligence,” Machine Intelligence Vol. 4, B. Meltzer and D. Michie, eds., Edinburgh University Press, Edinburgh, 1969, pp. 463–502.

4. D. Marr, Vision: A Computational Investigation into the Human Representation and Processing of Visual Information, W.H. Freeman, San Francisco, 1982.

5. A. Newell, “The Knowledge Level,” Artificial Intelligence, Vol. 18, 1982, pp. 87–127.

JANUARY/FEBRUARY 1999 25

• S. Luke et al., “Ontology-Based Web Agents,” Proc. First Int’l Conf. Autonomous Agents, ACM Press, New York, 1997, pp. 59–66; http://www.cs.umd.edu/projects/plus/SHOE/, 1997.

• S.T. Polyak et al., “Applying the Process Interchange Format (PIF) to a Supply Chain Process Interoperability Scenario,” Proc. 1998 European Conf. Artificial Intelligence (ECAI ’98) Workshop on Applications of Ontologies and Problem-Solving Methods, ECCAI, Brighton, England, 1998, pp. 88–96.

• G. Wiederhold, “Intelligent Integration of Information,” J. Intelligent Information Systems, Vol. 6, Nos. 2/3, 1996.

• G. Wiederhold and M. Genesereth, “The Conceptual Basis for Mediation Services,” IEEE Intelligent Systems, Vol. 12, No. 5, Sept./Oct. 1997, pp. 38–47.

Ontologies and knowledge management

• A. Abecker et al., “Toward a Technology for Organizational Memories,” IEEE Intelligent Systems, Vol. 13, No. 3, May/June 1998, pp. 40–48.

• V.R. Benjamins and D. Fensel, “The Ontological Engineering Initiative (KA)2,” Formal Ontology in Information Systems, N. Guarino, ed., IOS Press, Amsterdam, 1998, pp. 287–301.

• M.S. Fox, J. Chionglo, and F. Fadel, “A Common-Sense Model of the Enterprise,” Proc. Industrial Eng. Research Conf., Inst. for Industrial Engineers, Norcross, Ga., 1993, pp. 425–429.

• Manual of the Toronto Virtual Enterprise, tech. report, Enterprise Integration Laboratory, Dept. of Industrial Eng., Univ. of Toronto, Toronto, 1995.

• M. Uschold et al., “The Enterprise Ontology,” Knowledge Eng. Rev., Vol. 13, No. 1, Mar. 1998.

Task and method ontologies

• D. Fensel et al., “Using Ontologies for Defining Tasks, Problem-Solving Methods, and Their Mappings,” Knowledge Acquisition, Modeling, and Management, E. Plaza and V.R. Benjamins, eds., Springer-Verlag, Berlin, 1997, pp. 113–128.

• J.H. Gennari et al., “Mapping Domains to Methods in Support of Reuse,” Int’l J. Human-Computer Studies, Vol. 41, No. 3, Sept. 1994, pp. 399–424.

• A. Tate, “Roots of SPAR—Shared Planning and Activity Representation,” Knowledge Eng. Rev., Vol. 13, No. 1, Mar. 1998, pp. 121–128.

• Y.A. Tijerino and R. Mizoguchi, “Multis II: Enabling End-Users to Design Problem-Solving Engines via Two-Level Task Ontologies,” Proc. EKAW ’93: Seventh European Workshop on Knowledge Acquisition for Knowledge-Based Systems, Lecture Notes in Artificial Intelligence No. 723, Springer-Verlag, 1993, pp. 340–359.

Ontology workshops

• Applications of Ontologies and Problem-Solving Methods, ECAI ’98 (European Conf. AI), http://delicias.dia.fi.upm.es/WORKSHOP/ECAI98/index.html

• Building, Maintaining, and Using Organizational Memories, ECAI ’98, http://www.aifb.uni-karlsruhe.de/WBS/ECAI98OM/

• Formal Ontologies in Information Systems (FOIS ’98), http://krr.irst.itc.it:1024/fois98/program.html

• Intelligent Information Integration, ECAI ’98, http://www.tzi.de/grp/i3/ws-ecai98/

• Sharable and Reusable Components for Knowledge Systems, KAW ’98 (Workshop on Knowledge Acquisition, Modeling, and Management), http://ksi.cpsc.ucalgary.ca/KAW/KAW98/KAW98Proc.html

• Ontological Engineering, AAAI Spring Symp. Series, Stanford, Calif., 1997, http://www.aaai.org/Symposia/Spring/1997/sss-97.html

• Problem-Solving Methods, IJCAI ’97 (Int’l Joint Conf. AI), http://www.aifb.uni-karlsruhe.de/WBS/dfe/PSM/main.html

• Ontological Engineering, ECAI ’96, http://wwwis.cs.utwente.nl:8080/kbs/EcaiWorkshop/homepage.html

• Practical Aspects of Ontology Development, AAAI ’96

• Sharable and Reusable Ontologies, KAW ’96, http://ksi.cpsc.ucalgary.ca/KAW/KAW96/KAW96Proc.html

• Sharable and Reusable Problem-Solving Methods, KAW ’96, http://ksi.cpsc.ucalgary.ca/KAW/KAW96/KAW96Proc.html

6. R. Wieringa and W. de Jonge, “Object Identifiers, Keys and Surrogates: Object Identifiers Revisited,” Theory and Practice of Object Systems (TAPOS), Vol. 1, No. 2, 1995, pp. 101–114.

7. J.A. Bateman, B. Magnini, and F. Rinaldi, “The Generalized Upper Model,” Working Papers 1994 European Conf. Artificial Intelligence (ECAI ’94) Workshop on Implemented Ontologies, 1994, pp. 34–45; http://www.darmstadt.gmd.de/publish/komet/papers/ecai94.ps.

8. G.A. Miller, “Wordnet: An Online Lexical Database,” Int’l J. Lexicography, Vol. 3, No. 4, 1990, pp. 235–312.

9. N. Fridman Noy and C.D. Hafner, “The State of the Art in Ontology Design,” AI Magazine, Vol. 18, No. 3, 1997, pp. 53–74.

10. D. Fensel and V.R. Benjamins, “The Role of Assumptions in Knowledge Engineering,” Int’l J. Intelligent Systems, Vol. 13, No. 7, 1998, pp. 715–747.

11. M.R. Genesereth and R.E. Fikes, Knowledge Interchange Format, Version 0.3, Knowledge Systems Lab., Stanford Univ., Stanford, Calif., 1992.

12. R. Neches et al., “Enabling Technology for Knowledge Sharing,” AI Magazine, Vol. 12, No. 3, 1991, pp. 36–56.

13. T.R. Gruber, “A Translation Approach to Portable Ontology Specifications,” Knowledge Acquisition, Vol. 5, 1993, pp. 199–220.

14. G. Schreiber et al., “CommonKADS: A Comprehensive Methodology for KBS Development,” IEEE Expert, Vol. 9, No. 6, Dec. 1994, pp. 28–37.

15. K. Knight and S. Luk, “Building a Large-Scale Knowledge Base for Machine Translation,” Proc. Am. Assoc. Artificial Intelligence, AAAI Press, Menlo Park, Calif., 1994.

16. D. Fensel et al., “Using Ontologies for Defining Tasks, Problem-Solving Methods and Their Mappings,” Knowledge Acquisition, Modeling and Management, E. Plaza and V.R. Benjamins, eds., Springer-Verlag, New York, 1997, pp. 113–128.

17. R. Mizoguchi, J. Van Welkenhuysen, and M. Ikeda, “Task Ontology for Reuse of Problem Solving Knowledge,” Towards Very Large Knowledge Bases, N.J.I. Mars, ed., IOS Press, Amsterdam, 1995.

18. J.M. David, J.P. Krivine, and R. Simmons, Second Generation Expert Systems, Springer-Verlag, 1993.

B. Chandrasekaran is professor emeritus, a senior research scientist, and the director of the Laboratory for AI Research (LAIR) in the Department of Computer and Information Science at Ohio State University. His research focuses on knowledge-based systems, causal understanding, diagrammatic reasoning, and cognitive architectures. He received his BE from Madras University and his PhD from the University of Pennsylvania, both in electrical engineering. He was Editor-in-Chief of IEEE Expert from 1990 to 1994, and he serves on the editorial boards of numerous international journals. He is a fellow of the IEEE, AAAI, and ACM. Contact him at the Laboratory for AI Research, Ohio State Univ., Columbus, OH 43210; [email protected]; http://www.cis.ohio-state.edu/lair/.

John R. Josephson is a research scientist and the associate director of the Laboratory for AI Research in the Department of Computer and Information Science at Ohio State University. His primary research interests are knowledge-based systems, abductive inference, causal reasoning, theory formation, speech recognition, perception, diagnosis, the logic of investigation, and the foundations of science. He received his BS and MS in mathematics and his PhD in philosophy, all from Ohio State University. He has worked in several application domains, including medical diagnosis, medical test interpretation, diagnosis of engineered systems, logistics planning, speech recognition, molecular biology, design of electromechanical systems, and interpretation of aerial photographs. He is the coeditor with Susan Josephson of Abductive Inference: Computation, Philosophy, Technology, Cambridge Univ. Press, 1994. Contact him at the Laboratory for AI Research, Ohio State Univ., Columbus, OH 43210; [email protected]; http://www.cis.ohio-state.edu/lair/.

Richard Benjamins is a senior researcher and lecturer at the Department of Social Science Informatics at the University of Amsterdam. His research interests include knowledge engineering, problem-solving methods and ontologies, diagnosis and planning, and AI and the Web. He obtained his BS in cognitive psychology and his PhD in artificial intelligence from the University of Amsterdam. Contact him at the Dept. of Social Science Informatics, Univ. of Amsterdam, Roetersstraat 15, 1018 WB Amsterdam, The Netherlands; [email protected]; http://www.swi.psy.uva.nl/usr/richard/home.html.


March/April Issue

Intelligent Agents

Guest Editor Jim Hendler of DARPA will present articles discussing development techniques for and the practical application of intelligent agents, which might be the solution to handling the data and information explosion brought about by the Internet. Scheduled topics include

• Agents for the masses: Is it possible to develop sophisticated agents simple enough to be practical?

• Extempo Systems’ interactive characters, who engage, assist, and entertain people on the Web

• AgentSoft’s efforts at commercializing intelligent agent technology

Intelligent Systems will also continue its coverage of the ontologies track, started in this issue, and the track on vision-based vehicle guidance, which began in the November/December 1998 issue.

IEEE Intelligent Systems covers the full range of intelligent system developments for the AI practitioner, researcher, educator, and user.

IEEE Intelligent Systems

