
Decision Support Systems 30 (2000) 51–72
www.elsevier.com/locate/dsw

An empirical investigation of ODSS impact on individuals and organizations

Radhika Santhanam a,*, Tor Guimaraes b,1, Joey F. George c,2

a DSIS Area, Gatton School of Business and Economics, University of Kentucky, Lexington, KY 40506-0034, USA
b J.E. Owen Chair of Excellence in MIS, Tennessee Technological University, Cookeville, TN 38505, USA

c Department of Information and Management Sciences, College of Business, Florida State University, Tallahassee, FL 32306, USA

Accepted 12 June 2000

Abstract

Organizational Decision Support Systems (ODSS) are large decision-aiding systems that provide organization-wide support for business processes. An ODSS shares some characteristics with other management support systems, but it has distinctly different objectives, scope and components. Its goal to support both individual and organizational level decision processes may require unique development and management approaches. Several case studies have been conducted to address this issue. However, no systematic investigation has been conducted to determine factors that influence the successful development and use of ODSS. We designed this study to investigate ODSS impact at both the individual and organizational level, based on several ODSS currently in use. Our findings indicate that in order to have a successful ODSS, management must pay attention to individual user needs and also have several organizational level coordinating mechanisms in place. User participation, support of management, and DSS system characteristics were found to be important determinants of ODSS success. Several factors at the organizational level, such as the use of steering committees and the extent of institutionalization of the system, were found to be correlated with ODSS success. The implications of these results for the management of ODSS and other organization-wide systems are discussed. © 2000 Elsevier Science B.V. All rights reserved.

Keywords: Organizational DSS; Implementation; Impact; Success

1. Introduction

New forms of telecommunication technology have enabled work teams within organizations to interact

* Corresponding author. Tel.: +1-606-257-4397. E-mail addresses: [email protected] (R. Santhanam), [email protected] (T. Guimaraes), [email protected] (J.F. George).

1 Tel.: +1-615-372-3385. 2 Tel.: +1-904-644-5508.

and enhance business decisions. One specific type of system that utilizes communication, data and problem-solving technologies to support organizational decision processes has been termed Organizational

Decision Support Systems (ODSS) [29]. ODSS are information systems that provide an organization-wide platform to enhance, facilitate and enable the

work of organizational members [26]. Recent organizational changes resulting from business reengineering and growth in international businesses have necessitated a greater reliance on distributed decision-making. In these situations, related decisions are made among a set of organizational decision-makers, who share full or partial responsibility for a certain sphere of decisions. ODSS are designed with the necessary network and functional capabilities to share information, coordinate multiple decisions and support users in these distributed decision-making environments. According to George [28], the key features of an ODSS are: (1) it focuses on an organizational task or decision that affects several organizational units, (2) it cuts across organizational functions and hierarchical layers, (3) it involves computer-based technologies and communication technologies, and (4) it provides multiple users easy access to data and models.

0167-9236/00/$ - see front matter © 2000 Elsevier Science B.V. All rights reserved. PII: S0167-9236(00)00089-0

For example, the Enlisted Force Management System (EFMS) is an ODSS developed by the US Air Force and Rand to assist the Air Staff in the task of managing

enlisted personnel [70]. This ODSS is used by five major organizational units to make interrelated but independent decisions regarding the recruitment, training, assignment and compensation of the enlisted force, their promotion policies and their overall structure. These five organizational units are in three geographically dispersed locations, and a communication network connects all users. Users can access a variety of models that provide short- and long-term forecasts of the aggregate force, effects of alternative bonus plans, and manpower targets. The database consists of inputs from other models, external data such as general unemployment trends, internal data about personnel, training costs, program costs and budget constraints, etc. Prior to the development of the EFMS, various organizational units in the Air Force used their own independent models and developed projections of future requirements for manpower. These projections varied widely, and it was very difficult to identify whether this occurred due to differences in models, data or policies. The use of the ODSS has standardized procedures for developing projections, and various units now share common data and models. The use of the ODSS has promoted greater consistency in decisions among different units. If projections made by various units differ, the problem is much more easily traceable.

Several other similar case studies on the use of ODSS in organizations have been reported [11,14,22,55]. These cases study the development of one

ODSS in the specific context in which it is used and provide rich descriptions of one specific system. These case studies are useful and provide a good start to the study of ODSS. They do not, however, identify common factors across systems and organizations that can foster the successful development and use of ODSS. Other research on ODSS has primarily clarified the technical aspects and the role of this system

in enhancing organizational performance [29,41,42,64,72]. Research to this point has thus described the

technical components needed to build an ODSS and provided some clues on how to manage an ODSS. It has produced no empirical evidence on the impact of ODSS on decision-making activities or factors influencing its successful use.

Organizations are increasingly deploying information technologies to coordinate decisions among functional and geographically dispersed units. The availability and rapid growth of software such as Lotus Notes, SAP R/3 and data warehousing tools attest to this trend of centralized repositories designed to support multiple decision-makers. This has even given rise to a new field of inquiry, termed Organizational Computing, that seeks to identify factors influencing the successful design and use of

these technologies, such as an ODSS [36]. Therefore, it is important to obtain a systematic evaluation of factors that influence the success of an ODSS. Such studies can inform organizations on how to manage and implement large organization-wide systems.

While an ODSS shares some functional characteristics with an individual decision support system (DSS), Executive Information Systems (EIS) and

Group Decision Support Systems (GDSS), it differs widely in its goal and scope. An ODSS has to support interrelated but autonomous decisions. In terms of technology, it differs from other systems in the components, tools and languages used to develop the system. The development of the system requires considerable effort, particularly in coordinating the requirements among various units. Due to the large-scale nature of the project, individual users may find it more impersonal and less relevant than an individually designed DSS. Hence, a factor such as user participation, found important to the success of DSS, may not be as important in the management of ODSS. The objective of this study is to examine ODSS use in different organizations and measure its


impact both at the individual and organizational level to understand factors that influence its successful use. Section 2 reviews the literature relevant to ODSS and presents a model of ODSS impact. Section 3 describes the research methodology used, followed by the results. Section 5 provides a discussion of the results, followed by some concluding remarks.

2. Model of ODSS impact

ODSS are designed to support the decision-making activities of the user and thus share some of the functions of a traditional DSS. However, ODSS are

not simply “big traditional DSS” (p. 20, [14]) that are designed to support many decisions of one individual decision maker. Instead, ODSS are designed to support interdependent decisions made by many individuals with multiple interests. An ODSS should be viewed as supporting the “organization’s division

of labor in decision making” (p. 138, [65]). Similarly, an ODSS shares some characteristics with a GDSS because it enhances the performance of work groups. However, while GDSS focus on single work teams, an ODSS facilitates the interaction of multiple work teams. Organizational-level decision processes involve issues of greater consequence than group-level processes and are not influenced by the same type of social factors that influence group behavior. Hence, an ODSS should not be viewed as a simple extension of GDSS, just as group support systems cannot be viewed as simple extensions of individual

DSS [42].

An ODSS provides some of the functionality of

an EIS because it provides critical information to many managers [8]. The objectives and scope of

these systems are, however, very different, and they cannot be viewed as being the same. The purpose of an EIS is primarily to meet the “information needs”

of managers [46], while that of an ODSS is to support organizational decision processes. An ODSS has to provide many coordinating functions to facilitate interdependent task execution, while an EIS may not provide these functions. Furthermore, an EIS need not support the interaction of work teams. It is possible that users of an ODSS make decisions that can be considered good at the individual level but

are organizationally suboptimal. This has serious consequences for the designer of the system, who has to provide the coordination and resolution mechanisms to ensure that organizational decision processes are optimized [42].

It is clear that an ODSS shares some characteristics with other management support systems such as DSS, GDSS and EIS, but it has distinctly different objectives and a much broader scope. It has a strong organizational component not present in a traditional DSS or a GDSS, and a coordination component not present in an EIS. Hence, an ODSS has different functions and components, and requires different design and development approaches compared to

DSS and other management support systems [14]. Factors found important for the successful use of DSS, GDSS and EIS may be relevant to the management of ODSS, but without empirical evidence, it is not clear if this is the case. To identify those factors that are relevant to ODSS impact, we start with models of ODSS provided in the literature.

Applegate and Henderson [3] describe the characteristics of an ODSS on two dimensions, namely, the technological and the functional. The technological dimension primarily describes the technologies used to build an ODSS, or what it should be. It is therefore useful for identifying ODSS systems. Applegate and

Henderson [3] describe three layers of technologies: communications, data and process. The communication technology is the enabling technology that provides the physical connectivity for different work units to interact. The data layer of the technology enables the access, storage and retrieval of data that promotes a degree of common semantic understanding among work units. The process layer provides the analytic or problem-solving tools. Each layer builds on the other, so that the ODSS provides common models and standardized data to support business processes across work units. We used this description to ensure that every system we studied met the technical specifications for an ODSS.
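As a purely illustrative sketch (not from the paper), the layered dependency described above — process tools building on shared data, which in turn rides on the communication substrate — can be expressed as a minimal class structure. All class and method names here are hypothetical:

```python
class CommunicationLayer:
    """Physical connectivity: registers the work units that can interact."""
    def __init__(self):
        self.units = set()

    def connect(self, unit):
        self.units.add(unit)


class DataLayer:
    """Shared storage and retrieval, giving units a common semantic base."""
    def __init__(self):
        self.records = {}

    def put(self, key, value):
        self.records[key] = value

    def get(self, key):
        return self.records.get(key)


class ProcessLayer:
    """Analytic tools that operate on the shared data layer."""
    def __init__(self, data: DataLayer):
        self.data = data

    def forecast(self, key, growth_rate):
        # A toy projection: scale a stored base figure by a growth rate.
        base = self.data.get(key)
        return None if base is None else base * (1 + growth_rate)


class ODSS:
    """Each layer builds on the one below it, as in Applegate and Henderson's description."""
    def __init__(self):
        self.comm = CommunicationLayer()
        self.data = DataLayer()
        self.process = ProcessLayer(self.data)
```

Because every unit's analyses go through the same `DataLayer`, projections made by different units start from common data — the property the EFMS example below attributes to a real ODSS.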

The functional dimension helps us understand the impact of an ODSS and the factors that affect its success, and we describe how we used it to develop our research model. The functional dimension defines what an ODSS should do in terms of production, coordination and policy functions. These functions illustrate how an ODSS can be studied on at


least two different levels of analysis, the individual and the organization. This sets an ODSS apart from an individual DSS. The production function operates at the level of the individual user to increase the efficiency and effectiveness of specific task execution. The ODSS must provide all the benefits for an individual user and imitate the role of a traditional

DSS [3]. The other functions of an ODSS identified by Applegate and Henderson, coordination and policy, pertain to the organizational dimension. Coordination functions improve the effectiveness of interdependent task execution among two or more decision-makers. In this role, an ODSS has to facilitate information sharing among decision-makers and support distributed decision-making activities. The policy function serves to standardize guidelines and procedures across the organization. Since it is enforced at the organizational level, this falls under the organizational dimension of ODSS.

It is clear that ODSS impact can be examined at two levels, namely the individual and the organizational. We posit that certain factors, such as the characteristics of the system and the characteristics of the implementation process from an individual user perspective, will influence ODSS impacts at the individual level of analysis. We further posit that certain organizational level factors will determine the impact of ODSS in several specific areas, such as business performance, the speed of decision making, and so on, from an organizational perspective. We also posit that there is a relationship between individual level impacts and organizational level impacts: since the organization is in many ways a collection of individuals, individual impacts should have an influence on organizational outcomes. If individuals do not benefit from the system, then they may not use it and there will be no organizational benefits. Based on this reasoning, we develop the model shown in Fig. 1, which makes a clear demarcation between individual and organizational level impacts and corresponding influencing factors. We now describe how the influencing factors at each level, shown in Fig. 1, were identified.
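Hypothesized factor–impact relationships of the kind posited above are typically assessed by correlating survey scores. As a hedged illustration of that analysis step — the data and variable names below are invented for illustration, not taken from the study — a Pearson correlation between a factor score and an impact score can be computed as:

```python
import math

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length score lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical 7-point survey scores for five users:
participation = [2, 3, 4, 5, 6]   # an individual-level factor
satisfaction = [3, 3, 5, 6, 7]    # an individual-level impact measure

r = pearson(participation, satisfaction)  # close to +1 supports a positive association
```

A correlation near +1 across respondents would be read as support for the corresponding positive-association hypothesis; the actual study of course uses its own instruments and statistics.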

2.1. Individual level impacts

One goal of an ODSS is to ensure that the individual user gets timely and accurate information

necessary to make informed decisions. In this respect, ODSS can be considered a type of Management Information System (MIS), and certain design principles used in MIS development, such as the systems development life cycle, also apply to an

ODSS [14]. One of the most tested and widely used measures to determine whether users have received relevant information is User Information Satisfaction [19]. Developed by Bailey and Pearson [4], and

modified by Ives et al. [40], User Information Satisfaction measures the extent to which users feel the system meets their information requirements. We use this as one measure of the individual level impact of an ODSS.

In addition to providing information, an ODSS has to make available many decision models so that a user can identify problems faster, examine various alternatives and make a choice. It performs all the functions of a DSS. The impact of a DSS in improving the decision-making performance of the individual user has been evaluated in two different ways. In tightly focused laboratory studies, the actual outcome of a decision has been evaluated using measures such as decision quality and simulated business profits [15,61]. In field studies, such objective measures of actual DSS outcomes are difficult to capture. Hence, a perceptual measure such as perceived DSS benefits has been used to determine the extent to which the user believes that the system has improved

their decisions [38,59]. Since ours was a field study, we used this as one measure of ODSS impact.

The purpose of an ODSS is to improve an organizational process. Case studies indicate that the use of the ODSS promotes greater communication and coordination among users [11,13,14]. Due to this function, individual users are likely to feel that the ODSS has changed their role in the organization and made their task more important. Impact on end-user jobs is a measure that captures users' perceptions of how their job roles and responsibilities in the organization have changed as a result of using a computer-based

system [73,74]. We use this as one measure of ODSS impact.

Thus, as shown in Fig. 1, at the individual level, we used three different measures to study ODSS impact: User Satisfaction, Perceived DSS Benefits and Impact on End-User's Job. This, we believe, captures the multiple functions of an ODSS as a system providing information, a system functioning as a decision-aid, and a system supporting an organizational process.

Fig. 1. Research model.

2.2. Individual level factors affecting ODSS impact

The impact of an information system on the user will obviously depend upon the nature of his/her interaction with the system, i.e., how much the system is used and how well the system supports task completion. We refer to these factors as system characteristics. Research findings from information

system (IS) implementation studies indicate that user perceptions of management practices used in the development and installation of the system also affect the impact of a system on the user [1,27]. Therefore, factors such as whether users were allowed to participate in the development of the system and whether they were adequately trained to use the system may also affect the eventual outcome of a system. We refer to these factors as implementation characteristics. In Sections 2.2.1 and 2.2.2, we discuss the roles of system and implementation characteristics in influencing ODSS impact.

2.2.1. System characteristics

A major goal of an ODSS is to provide support

for decisions. It should be highly interactive, have the flexibility to show the results of various decision choices, and provide control to the user. These are collectively referred to as DSS characteristics. Studies of DSS impact have found these DSS characteristics to be particularly important in affecting user perceptions

of the system as a decision-aiding tool [32,38,67]. While the presence of DSS characteristics improves


the functionality of the ODSS, the system must have features that make it easy to learn and operate. This is referred to as the user-friendliness of the system

and directly impacts the productivity of the user [30].

We therefore propose the following hypotheses

concerning the factors that influence a user's direct interaction with the system and its consequent impact.

H1. The extent of DSS characteristics in the ODSS is positively associated with its impact on the user.

H2. The extent of user-friendliness of the ODSS is positively associated with its impact on the user.

2.2.2. Implementation characteristics

User participation should be an important part of

any new IS implementation strategy [39,49]. When users participate in the development and implementation process, they are more likely to develop a better understanding of how the system can assist them in

performing their jobs effectively [5,20,27,49]. Several empirical studies of DSS implementation identify user participation as an important factor influencing its success [1], while Carter et al. [14] conclude from their case study that ODSS can be successful only if users participate in the implementation process and “develop a sense of ownership”.

The implementation of a new large system such as an ODSS typically results in changes in business processes and individual tasks. Users will be more likely to use such a system if they perceive it as having strong support from management. Furthermore, ODSS tend to be large, expensive systems, and without adequate backing from management, the system costs will not be funded and the implementation

program could end in failure [14]. Several researchers have reiterated the value of obtaining management support, and empirical studies find this to be

true [37,50,73,74]. In addition to supporting the implementation process, management must also provide adequate training so that users can operate the new

system effectively [14,52]. The extent of training given to users is generally recognized as influencing the productive use of any IS and has been found to have a large effect on implementation success [1,17,57,60,66].

Based on the above, we proposed the following hypotheses reflecting the implementation process and its impact on the user.

H3. The extent of user participation in the development and implementation of the ODSS is positively associated with its impact on the user.

H4. The extent of management support in the development and implementation of the ODSS is positively associated with its impact on the user.

H5. The extent of training provided for the use of the ODSS is positively associated with its impact on the user.

Note that there are a few other characteristics of the implementation process that have been found to influence the outcome of an IS at the individual level. These include communication patterns among analysts and the level of conflict among developers [6,12,58]. Our goal was to study ODSS impact, and it was important to study systems that had been in use for a period of time. Because we excluded ODSS that were still being developed, we felt we would not have been able to adequately capture some of these interpersonal behaviors that occur during system development.

2.3. Organizational level impacts

An assessment of the organizational impact of an IS is extremely important, but very few field studies

attempt this task [19]. This seems to be primarily due to the difficulty of isolating and measuring the organizational outcome of an IS. We measure the organizational impact of an ODSS in two ways, using a functional perspective and a performance perspective. The central function of an ODSS is to support and coordinate decision-making processes among

various units [72]. Hence, the collective use of ODSS has to lead to improvements in decision-making processes throughout the organization. With an ODSS, users have better access to information, and are able to share information, test decision choices and make faster decisions. Using two measures, namely improvements in decision-making speed and decision-making analysis [46], we try to capture the functional impact of an ODSS on organizational decision processes. The use of advanced technologies has been found to have an impact on overall corporate performance [10,18,19]. ODSS are designed to streamline organizational business processes, and we can expect them to improve business profits, increase market share and return on investments, etc. We use impact on business as a measure of the organizational performance impact of an ODSS [33]. Thus, as shown in Fig. 1, we used three different measures to assess the organizational impact of an ODSS.

2.4. Organizational factors affecting organizational impact

At the organization level, several factors have been found to influence the successful implementation of large IS [14,63,71]. Using an EIS implementation framework [71], we group these factors as

being internal and external to the organization.

2.4.1. Internal factors

2.4.1.1. Project champion and project committee. The successful introduction of large computer-based systems typically requires a committed and influential person within the organization. These are referred to as executive sponsors or project champions [7,71]. An ODSS is a very large system representing a significant undertaking by the organization. A project champion within the organization with sufficient power is needed to sustain budgetary and political support over the long life cycle of the system. It also requires the presence of a strong project committee

with members from the affected departments [14]. The use of similar project mechanisms, such as steering committees, has been tested in various IS implementation processes. The use of these committees has been found effective in coordinating organizational units and in fostering the use of large systems

in organizations [48,68]. We thus hypothesized that:

H6. The extent of involvement of a project champion in the development and implementation of an ODSS is positively associated with its impact on the organization.

H7. The extent of involvement of a project committee in the development and implementation of an ODSS is positively associated with its impact on the organization.

2.4.1.2. Institutionalization. An ODSS has to support basic business processes. The ODSS will have a greater impact on business if it is designed to be closely related to improving basic business processes. This is part of the policy function of an ODSS and is the distinguishing feature of an ODSS,

similar to the modeling component in a DSS [3]. We refer to this as ODSS institutionalization and proposed:

H8. The extent of institutionalization of an ODSS is positively associated with its impact on the organization.

2.4.2. External factors

2.4.2.1. Competitive environment. Environmental factors tend to influence the successful use of advanced information systems [31,43,71]. Among these, the nature of the perceived competitive environment has been found to be one of the key environmental

factors influencing the impact of systems [45]. Thus, we proposed:

H9. The extent of external competition is positively associated with the impact of an ODSS on the organization.

2.5. Relationship between individual level impacts and organizational impact

By definition, an ODSS supports basic business processes and is closely intertwined with organizational business procedures. ODSS are designed to improve organizational decision processes along with supporting individual decisions. Therefore, it is expected that an ODSS perceived as having a strong impact at the individual level will also have an impact at the organizational level. Individual level


impact of any IS has to lead to organizational impact, even though this relationship has seldom been examined [19]. Thus, a final hypothesis is proposed:

H10. The extent of impact of the ODSS on the users is positively associated with its impact on the organization.

3. Research method

3.1. Questionnaire design

In developing the questionnaire for this survey, we adopted many of the measures validated in prior studies. The development of these measures is provided in Appendix A. In addition to these measures, open-ended questions dealing with the demographics of the respondent, such as his/her educational degree, his/her experience with computers, organizational position, etc., were added. Similarly, some questions about the organization's business, the type of application for which the ODSS was developed, the number of years it had been in use, etc., were added.

The questionnaire was designed to be answered by three sets of respondents. For each ODSS, the project manager/IS manager was one respondent. She had to answer questions relating to the characteristics of the ODSS, such as its name, the applications for which it was used, the number of years in use, and the number of users in each department. She also had to answer questions relating to the development and management of the system, such as the composition of the project committee, the nature of the project champion, etc. The internal auditor of the company answered the second set of questions. He had to answer questions regarding the organizational impact of the system, the organization's business performance, the extent of external competition, the extent of institutionalization of the system, etc. We felt that the internal auditor would not be biased towards the system and yet would be well informed about the operations of the business and the system. Therefore, he would be in the best position to answer questions about certain organizational aspects of the

system. The third set of respondents were the users

of the system. Users had to answer questions that measured their perceptions of the friendliness of the system, the extent of their participation in the ODSS development, their perceptions of management support, the extent of perceived benefits from the system, etc. Thus, the questionnaire was split into three parts. The first two parts, answered by the project/IS manager and the internal auditor, measured organizational level factors and impacts, respectively. The third part, answered by system users, measured individual level factors and impacts. Each of these parts had an introduction explaining the purpose of the survey, and each respondent received only the relevant part of the questionnaire.

3.2. Pilot study

Before widespread data collection, eight IS directors were asked to review the questionnaire with the purpose of checking its content, readability and respondents' interpretation of the questions. Based on the results, we altered the wording of some questions, but no changes to its content were necessary.

3.3. Sampling procedure

A sample of 28 IS Directors known through prior consulting engagements, and whose organizations had implemented ODSS, were invited to participate in this study. Six IS Directors declined for a variety of reasons, such as “already involved in too many other research projects”, “too busy to participate at this time”, or “not particularly interested in this research project”. The nature of the system was once again verified by talking to the IS manager, and then the survey instruments were distributed. The questionnaires from five of the companies were either too late or too incomplete to be included in the analysis. Seventeen companies in the US Midwest and Southeast participated in the study with a full set of questionnaires. The project manager/IS manager had to answer the first part of the questionnaire. She also wrote the name of the system on the third, i.e., user, part of the questionnaire. The user part of the questionnaire was given to 8–10 users of each system. Care was taken to see that the number of users of each system was fairly uniform across systems. This was done to ensure that no one system had a disproportionately larger number of respondents that might have biased the results. The internal auditor was given the part of the questionnaire that referred to the measures relating to the organizational impact of the system.

3.4. Checking ODSS nature

We took great care to ensure that the systems we studied clearly fit the definition of an ODSS, using both descriptive and quantitative information.

Two factors were examined. (1) The system's functionality was evaluated. We had asked the project managers to fully describe the system before the survey was mailed. We made certain that the system had analytical tools, that it was developed to improve organizational decision processes, and that it had a centralized modelbase and database and a communication component. It therefore fit the three necessary technology layers, process, communication and data, as defined by Applegate and Henderson [3]. (2) An ODSS must be used by several organizational units and across several hierarchical levels. To determine this aspect, we provided a table in the questionnaire and asked the project/IS manager to state the approximate number of users from each department and the number of users at each organizational level. There are no standards defining how many users an ODSS should have or how many departments it should span. We used a rough standard that all the systems we studied had to be used by at least three functional units and by at least two levels in the organization hierarchy. We thus used descriptive information provided by the managers as well as quantitative information obtained from the questionnaire responses on the number of users, number of units, etc., to make certain that all the systems could be called an ODSS.

For example, one of the systems in our sample was developed by an organization that manufactured filters for cars and trucks. The ODSS was developed as part of a new Total Quality Management (TQM) initiative to improve the process of providing customer service. It was primarily used in the four functional units of accounting, marketing, inventory control and planning. Supervisory personnel in the marketing department used it to answer customer queries. Managers in the marketing department used it to make decisions regarding customer deliveries, with the objective of minimizing costs and delivering on time. Accounting personnel used it to evaluate credit ratings and make decisions regarding customers. Inventory control managers used it to evaluate suppliers and make decisions regarding orders to be placed. Managers in the planning department used it to examine aggregate customer service performance and total costs, and to plan for changes in customer related operations. Other systems included in the survey were very similar, and every system spanned several functional units to support decision-makers positioned at various organizational levels in each unit.

4. Results

Seventeen companies returned the completed survey. There were 147 users of these 17 systems. Information about the nature of these companies and of the systems analyzed is provided in Fig. 2. As seen from Fig. 2A, six companies were in manufacturing, four in financial services such as banking, and the others in retailing, wholesale dealership, merchandising and utility businesses. The sales value of these companies is shown in Fig. 2B. The companies in this sample are medium to large size, with no small company representation. As seen from Fig. 2C, the systems were used for several applications such as planning, marketing, manufacturing and engineering. Further, they cut across functional areas and were used in more than one application. From Fig. 2D, it is seen that the ODSS had not been in use for long, with usage ranging from 1 to 6 years. Given that ODSS technology is relatively new, this is not unusual.

Fig. 2. Organization and system characteristics.

The number of departments in the organization that used the ODSS ranged from 4 to 10, with an average of six. The various departments included production, accounting, finance, marketing, corporate planning, purchasing, customer services, and personnel, among others. The number of vertical levels spanned by every ODSS in this sample was four. This information was collected to ensure that systems other than ODSS were not included in the sample. All the systems passed the check that they were used by more than three functional units and by more than two organizational levels. We therefore proceeded to test our research model using SPSSX software [54].

4.1. Individual level analysis

4.1.1. Data analysis and assessing validity of measures

Note that at the individual level, all of our analysis was based on responses from individual users, giving a sample size of 147. Since all the measures were obtained from prior studies and had been tested, we ran a confirmatory factor analysis with varimax rotation on the independent and dependent variables separately [9]. Among the independent variables, one item in DSS characteristics had a loading of less than 0.40 and did not load appropriately; hence, it was not included. It was also found that three training items loaded on one factor and two items on another factor. The three items pertained to special courses, while the two items pertained to training on the system through self-study of manuals and tutorials. Hence, these items were split into two factors, one called course related training and the other called system related training. The factor analysis for the dependent and independent variables was redone after the item in DSS characteristics was removed. All the items loaded appropriately. The Cronbach alpha coefficient for scale internal reliability was calculated. Table 1(a) shows the results of the confirmatory factor analysis for the dependent variables; Table 1(b) shows the results for the six independent variables.
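The internal-reliability check above can be made concrete. The sketch below computes Cronbach's alpha from an item-score matrix; the formula is standard, but the scores shown are hypothetical, not the study's data (which were analyzed in SPSSX):

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents x n_items) score matrix.

    alpha = k/(k-1) * (1 - sum of item variances / variance of total score)
    """
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)       # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)   # variance of the summed scale
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical 5-point Likert responses from six users on three scale items.
scores = np.array([
    [5, 4, 5],
    [4, 4, 4],
    [2, 3, 2],
    [5, 5, 4],
    [3, 3, 3],
    [1, 2, 2],
])
print(round(cronbach_alpha(scores), 2))
```

Items that move together across respondents, as in this toy matrix, push alpha toward 1; the study's scales all exceed the conventional 0.5-0.7 thresholds.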

Since we had several dependent variables, we had to check whether multivariate data analysis was appropriate. A Bartlett test of sphericity, i.e., a test of correlation among the dependent variables, was done. This was significant (p < 0.01), and therefore we decided to use multivariate techniques instead of simple multiple linear regression. Whenever there are multiple output or dependent measures, canonical correlation analysis is useful, particularly when the predictor (independent) variables are not categorical [35]. It provides a measure of the strength of the overall relationship between linear composites of the predictor (independent) and criterion (dependent) sets and has been used in other MIS studies [47]. The number of canonical functions is equal to the number of variables in the smaller set. In this case, there were three criterion variables, namely user satisfaction, perceived DSS benefits, and impact on end-users' jobs, so three functions were developed. The first function had the highest canonical correlation. In relative terms, it explained 80% of the variability in the sets of linear functions; the other two functions explained about 18% and 2% of the variability. The eigenvalue of the first function was 0.88, while the second and third were 0.19 and 0.01, respectively. The first function captured most of the relationship between the dependent and independent variables, and it is the one analyzed.
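The Bartlett sphericity test used above can be computed directly from the correlation matrix of the dependent measures. This is a generic sketch with synthetic data, not the study's responses:

```python
import numpy as np
from scipy.stats import chi2 as chi2_dist

def bartlett_sphericity(data: np.ndarray):
    """Bartlett's test of sphericity.

    H0: the variables are uncorrelated (correlation matrix = identity).
    Test statistic: -(n - 1 - (2p + 5)/6) * ln(det(R)), chi-square with
    p(p-1)/2 degrees of freedom.
    """
    n, p = data.shape
    corr = np.corrcoef(data, rowvar=False)
    stat = -(n - 1 - (2 * p + 5) / 6) * np.log(np.linalg.det(corr))
    df = p * (p - 1) / 2
    p_value = chi2_dist.sf(stat, df)
    return stat, p_value

# Hypothetical correlated criterion measures for 147 respondents.
rng = np.random.default_rng(0)
base = rng.normal(size=(147, 1))
y = base + 0.5 * rng.normal(size=(147, 3))   # three correlated columns
stat, p = bartlett_sphericity(y)
print(f"chi2 = {stat:.1f}, p = {p:.4g}")     # small p: proceed multivariate
```

A significant result, as in the study, says the dependent measures are intercorrelated, so analyzing them one regression at a time would ignore shared variance.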

4.2. Individual level impacts

Table 2 shows the results of the canonical correlation analysis. The canonical correlation coefficient of 0.69 (Pillai's multivariate F = 6.37, p < 0.001) shows a significant relationship between the predictor and criterion variables. The redundancy index (similar to the R² statistic in multiple regression) indicates that 27% of the variation in the criterion set is explained by variability in the predictor set. Similarly, 28% of the variation in the predictor set is explained by variability in the criterion set. Canonical loadings can be interpreted like factor loadings. In the composite canonical variate of the criterion set, all three measures, namely user satisfaction, perceived DSS benefits, and impact on end-users' jobs, have high loadings. In the predictor set canonical variate, user participation has a relatively high loading, followed by DSS characteristics and then by management support.
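The canonical correlations and eigenvalues reported above can be reproduced with standard linear algebra. The sketch below (synthetic data; the study itself used SPSSX) computes the correlations as singular values of the product of the QR-orthogonalized, centered variable sets:

```python
import numpy as np

def canonical_correlations(X: np.ndarray, Y: np.ndarray) -> np.ndarray:
    """Canonical correlations between variable sets X (n x p) and Y (n x q).

    The correlations are the singular values of Qx.T @ Qy, where Qx and Qy
    come from QR decompositions of the centered data matrices.
    """
    Xc = X - X.mean(axis=0)
    Yc = Y - Y.mean(axis=0)
    Qx, _ = np.linalg.qr(Xc)
    Qy, _ = np.linalg.qr(Yc)
    s = np.linalg.svd(Qx.T @ Qy, compute_uv=False)
    return np.clip(s, 0, 1)   # one correlation per function, largest first

# Synthetic example: six predictors, three criteria, n = 147.
rng = np.random.default_rng(1)
X = rng.normal(size=(147, 6))
Y = X[:, :3] + rng.normal(size=(147, 3))   # criteria partly driven by X
r = canonical_correlations(X, Y)
eigenvalues = r**2 / (1 - r**2)            # e.g. r = 0.69 gives roughly 0.9
print(r.round(2), eigenvalues.round(2))
```

The eigenvalue relation shown in the last step is consistent with the reported figures: a first canonical correlation of 0.69 corresponds to an eigenvalue near 0.9.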

As in any multivariate analysis, after a significant overall test, univariate analyses can be conducted as a follow-up. Univariate regression tests on each of the criterion variables were done. Tests for multicollinearity among the independent variables were conducted. The tolerance for each variable was much greater than 0.1 [53,54], suggesting no problems due to multicollinearity. The results are shown in Table 3. The model for user satisfaction is significant (R² = 0.27, F = 8.8, p < 0.001), with DSS characteristics (beta = 0.23, p < 0.01) and user participation (beta = 0.16, p < 0.04) being significant predictors. Similarly, the model for perceived DSS benefits is significant (R² = 0.28, F = 8.5, p < 0.001), with user participation (beta = 0.36, p < 0.001) and DSS characteristics (beta = 0.27, p < 0.005) being significant. The model for impact on end-users' jobs is significant (R² = 0.39, F = 14.8, p < 0.001); the significant variable is user participation (beta = 0.58, p < 0.001).

4.2.1. Results regarding individual level hypotheses

While the proposed individual level model was significant, a Pearson product moment correlation statistic was calculated to test each hypothesis. The results are shown in Table 4. DSS characteristics are significantly related to user satisfaction (r = 0.21, p < 0.01), perceived DSS benefits (r = 0.39, p < 0.01) and impact on end-users' jobs (r = 0.43, p < 0.01). Thus, Hypothesis H1 is supported. There is mixed support for Hypothesis H2. User-friendliness of the system is correlated with user satisfaction (r = 0.28, p < 0.01) and perceived DSS benefits (r = 0.21, p < 0.01), but not with impact on end-user jobs. In terms of implementation characteristics, Hypothesis H3, that user participation is positively associated with ODSS impact on the user, was supported. User participation is significantly correlated with all measures of ODSS impact at the individual level, namely user satisfaction (r = 0.32, p < 0.01), perceived DSS benefits (r = 0.44, p < 0.01), and impact on end-user jobs (r = 0.61, p < 0.01). Similarly, management support is significantly correlated with user satisfaction (r = 0.27, p < 0.01), perceived DSS benefits (r = 0.21, p < 0.01) and impact on end-user jobs (r = 0.28, p < 0.01), providing support for Hypothesis H4. There is mixed support for Hypothesis H5. System related training is significantly correlated with user satisfaction (r = 0.31, p < 0.01), perceived DSS benefits (r = 0.21, p < 0.01) and impact on end-user jobs (r = 0.17, p < 0.04). However, while course related training is significantly correlated with user satisfaction (r = 0.16, p < 0.04), it is not significantly correlated with impact on end-user jobs or perceived DSS benefits.
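Pairwise tests of this kind are a one-liner per hypothesis with scipy's pearsonr, which returns both the correlation and its p-value. The vectors below are synthetic stand-ins for factor scores, not the study's data:

```python
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(3)
participation = rng.normal(size=147)                      # hypothetical scores
satisfaction = 0.4 * participation + rng.normal(size=147)

r, p = pearsonr(participation, satisfaction)
print(f"r = {r:.2f}, p = {p:.4g}")   # positive, significant association
```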

4.3. Organizational level impacts

Due to the limited number of responses at the organizational level (17), and the nature of the scale


Table 1(a). Individual level analysis: factor analysis of dependent variables (n = 147)

                                                          Factor loading
User satisfaction (Cronbach alpha = 0.93)
  Dependence on the ODSS                                        0.73
  Become more valuable in the organization                      0.64
  Benefited from ODSS in this organization                      0.70
  Reliance on ODSS in performing the job                        0.84
  ODSS is an important system                                   0.75
  ODSS is extremely useful                                      0.69
  ODSS has helped make better decisions                         0.74
  ODSS facilitates priority setting in decision making          0.60
  ODSS has helped make more convincing arguments                0.69
  ODSS has improved the quality of decisions                    0.74
  ODSS has increased the speed of decisions                     0.67
  ODSS provides more relevant information                       0.67
  ODSS allows greater analysis in decision making               0.66

Perceived DSS benefits (Cronbach alpha = 0.90)
  Clerical benefit                                              0.55
  Management time utilization benefit                           0.69
  Decision-making benefit                                       0.81
  Problem appreciation benefit                                  0.65
  Data utilization benefit                                      0.60
  Planning and control benefit                                  0.58
  Decision search benefit                                       0.75
  Communication benefit                                         0.56
  Unplanned analysis benefit                                    0.72

Impact on end-users' job (Cronbach alpha = 0.84)
  Importance of job                                             0.46
  Amount of work required on job                                0.58
  Accuracy demanded from the job                                0.62
  Skill needed to do the job                                    0.62
  Job appeal                                                    0.68
  Feedback on job performance                                   0.69
  Freedom in how to do the job                                  0.57
  Opportunity for advancement                                   0.51
  Job security                                                  0.61
  Job relationships                                             0.61
  Job satisfaction                                              0.77

Table 1(b). Individual level analysis: factor analysis of independent variables (n = 147)

                                                          Factor loading
DSS characteristics (Cronbach alpha = 0.76)
  Flexibility of the system to change and adjust to new
    conditions, demands, or circumstances                       0.54
  The nature of interaction with the ODSS                       0.59
  Time for response from the system                             0.66
  Advice and help from the system                               0.52
  Ease of use of documentation                                  0.60
  The level of control over the system                          0.68

User-friendliness (Cronbach alpha = 0.71)
  Learning to use the ODSS                                      0.69
  Operators required to use the ODSS                            0.63
  Remembering to perform tasks using the ODSS                   0.69
  Become skillful at using the ODSS                             0.78

User participation (Cronbach alpha = 0.90)
  Initiating the project                                        0.73
  Determining system objectives                                 0.80
  Determining the user's information needs                      0.78
  Determining ways to meet information needs                    0.75
  Identifying sources of information                            0.68
  Outlining information flows                                   0.76
  Developing input forms/screens                                0.67
  Developing output formats                                     0.75
  Determining availability/access to the system                 0.69

Management support (Cronbach alpha = 0.83)
  Management understands the benefits from use of the ODSS      0.55
  Designated person or group in the organization to help
    in using the ODSS                                           0.59
  Training facilities are readily available to help in
    the use of the ODSS                                         0.63
  Support and encouragement from my boss to use the ODSS        0.82
  Management has provided most of the necessary help and
    resources for using the ODSS effectively                    0.82
  Management is really keen to see that we are happy with
    using the ODSS                                              0.76

System related training (Cronbach alpha = 0.69)
  Training through self-study using system tutorials            0.76
  Training through self-study using system manuals and
    printed documents                                           0.73

Course related training (Cronbach alpha = 0.54)
  Training from general courses at a community college
    or university                                               0.61
  Training provided by vendors or outside consultants           0.81
  In-house company sponsored courses                            0.57

(Likert), we felt it was inappropriate to conduct multivariate analyses such as factor analysis and canonical correlation. We felt it was more appropriate to conduct non-parametric tests [16]. Kendall's τ statistic was used to test the organizational level hypotheses. This is a less powerful but more conservative test of correlation than Pearson's product moment correlation. The results are shown in Table 5.

Table 2. Individual level analysis: canonical correlation (n = 147)

Criterion set (redundancy index = 26.51)
                            Canonical weights   Canonical loadings
User satisfaction                0.228               0.621
Perceived DSS benefits           0.372               0.739
Impact on end-users' jobs        0.664               0.879

Predictor set (redundancy index = 27.80)
                            Canonical weights   Canonical loadings
DSS characteristics              0.277               0.567
User-friendliness                0.043               0.256
User participation               0.812               0.937
Management support               0.134               0.482
System related training          0.046               0.372
Course related training          0.062               0.179

Canonical correlation coefficient = 0.683. Pillai's F-value = 6.37, p < 0.001.

Table 3. Individual level analysis: multiple regression (n = 147)

Individual          User            Perceived       Impact on
level factors       satisfaction    DSS benefits    end-users' jobs
DSS characteristics
  Beta                 0.23**          0.27**          0.05
  Significance         p < 0.01        p < 0.005       p < 0.50
  Tolerance            0.62            0.62            0.63
User-friendliness
  Beta                 0.12            0.02            0.09
  Significance         p < 0.14        p < 0.78        p < 0.19
  Tolerance            0.80            0.80            0.80
User participation
  Beta                 0.16**          0.36***         0.58***
  Significance         p < 0.04        p < 0.001       p < 0.001
  Tolerance            0.85            0.85            0.85
Management support
  Beta                 0.09            0.01            0.09
  Significance         p < 0.23        p < 0.88        p < 0.18
  Tolerance            0.82            0.82            0.82
System related training
  Beta                 0.14*           0.01            0.01
  Significance         p < 0.07        p < 0.84        p < 0.89
  Tolerance            0.82            0.82            0.82
Course related training
  Beta                 0.07            0.03            0.02
  Significance         p < 0.31        p < 0.71        p < 0.74
  Tolerance            0.94            0.94            0.94

R²                     0.27            0.28            0.39
F-value                8.8             8.5             14.8
Significance           p < 0.001       p < 0.001       p < 0.001

*Significant at the 0.1 level. **Significant at the 0.05 level. ***Significant at the 0.001 level.
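With only 17 organization-level observations, a rank-based statistic such as Kendall's τ avoids the distributional assumptions behind Pearson's r. A sketch using hypothetical organization-level ratings (not the study's data):

```python
from scipy.stats import kendalltau

# Hypothetical 1-5 Likert ratings for 17 organizations:
# extent of institutionalization vs. impact on business performance.
institutionalization = [4, 5, 3, 2, 5, 4, 3, 1, 4, 5, 2, 3, 4, 5, 1, 2, 3]
business_impact      = [4, 5, 3, 2, 4, 4, 2, 1, 3, 5, 2, 3, 5, 4, 1, 2, 2]

tau, p = kendalltau(institutionalization, business_impact)
print(f"tau = {tau:.2f}, p = {p:.4f}")   # strong monotone association
```

scipy's kendalltau computes the tie-corrected tau-b, which suits tied Likert ratings like these.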

The existence of a project champion was significantly correlated with all the dependent variables. Thus, there is strong support for Hypothesis H6. We found support for Hypothesis H7: direction by a project committee was significantly correlated with impact on business (τ = 0.42, p < 0.02) and decision making analysis (τ = 0.43, p < 0.03), and at a lower significance level with improvements in decision making speed (τ = 0.36, p < 0.06). There was strong support for Hypothesis H8: the institutionalization of ODSS was significantly correlated with all of the dependent variables. There was mixed support for Hypothesis H9. The extent of external competition was significantly correlated to the impact on business (τ = 0.40, p < 0.03) but did not have a significant relation to decision making analysis (τ = 0.24, p < 0.21) or decision making speed (τ = 0.28, p < 0.14).

Table 4. Individual level impact: tests of hypotheses (n = 147)

Individual          User            Perceived       Impact on
level factors       satisfaction    DSS benefits    end-users' jobs
DSS                 r = 0.21**      r = 0.39**      r = 0.43**
characteristics     p < 0.01        p < 0.01        p < 0.01
User-               r = 0.28**      r = 0.21**      r = 0.05
friendliness        p < 0.01        p < 0.01        p < 0.57
User                r = 0.32**      r = 0.44**      r = 0.61**
participation       p < 0.01        p < 0.01        p < 0.01
Management          r = 0.27**      r = 0.21**      r = 0.28**
support             p < 0.01        p < 0.01        p < 0.01
System related      r = 0.31**      r = 0.21**      r = 0.17*
training            p < 0.01        p < 0.01        p < 0.04
Course related      r = 0.16*       r = 0.11        r = 0.06
training            p < 0.04        p < 0.19        p < 0.47

*Significant at the 0.05 level. **Significant at the 0.01 level.

Table 5. Test of organizational level impact (n = 17)

Organizational       Impact on       Decision        Decision
factors              business        making          making
                     performance     analysis        speed
Project              τ = 0.43**      τ = 0.42**      τ = 0.47***
champion             p < 0.02        p < 0.03        p < 0.01
Project              τ = 0.46**      τ = 0.42**      τ = 0.36*
committee            p < 0.02        p < 0.02        p < 0.06
Extent of            τ = 0.67***     τ = 0.42**      τ = 0.59***
institutionalization p < 0.01        p < 0.03        p < 0.01
External             τ = 0.40**      τ = 0.24        τ = 0.28
competition          p < 0.03        p < 0.22        p < 0.14

*Significant at the 0.1 level. **Significant at the 0.05 level. ***Significant at the 0.01 level.

Table 6. Correlation between individual level impact and organizational level impact

Individual           Impact on       Decision        Decision
level impact         business        making          making
                                     analysis        speed
User                 τ = 0.52***     τ = 0.39**      τ = 0.41**
satisfaction         p < 0.01        p < 0.04        p < 0.03
Perceived            τ = 0.43**      τ = 0.25        τ = 0.34*
DSS benefits         p < 0.02        p < 0.17        p < 0.06
Impact on            τ = 0.26        τ = 0.18        τ = 0.30
end-users' jobs      p < 0.14        p < 0.32        p < 0.11

*Significant at the 0.1 level. **Significant at the 0.05 level. ***Significant at the 0.01 level.

4.4. Relationship between organizational impact and individual impact

Aggregate values of the individual level dependent variables were calculated for each system. Once again, a conservative non-parametric test was done. Table 6 shows the Kendall's τ correlation between organizational level dependent variables and individual level dependent variables. There is a significant correlation between user satisfaction and the organizational impact measures. User satisfaction was significantly related to impact on business (τ = 0.52, p < 0.01), decision making analysis (τ = 0.39, p < 0.04), and decision making speed (τ = 0.41, p < 0.03). Perceived DSS benefits were correlated significantly with impact on business (τ = 0.43, p < 0.02) and decision making speed (τ = 0.46, p < 0.06) but not with decision making analysis (τ = 0.30, p = 0.23). Individual measures of impact on end-user jobs were not correlated with any of the organizational impact measures. Thus, there is considerable support for Hypothesis H10.
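The cross-level analysis above reduces to averaging the user responses within each system and then correlating the 17 system means with the organization-level scores. A pandas sketch with hypothetical data (system identifiers, sample sizes and scores are all made up):

```python
import numpy as np
import pandas as pd
from scipy.stats import kendalltau

rng = np.random.default_rng(4)

# Hypothetical user-level data: 17 systems, nine users each.
users = pd.DataFrame({
    "system": np.repeat(np.arange(17), 9),
    "satisfaction": rng.normal(3.5, 0.8, 17 * 9),
})

# Aggregate individual satisfaction to one value per system.
system_satisfaction = users.groupby("system")["satisfaction"].mean()

# Hypothetical organization-level impact scores, one per system,
# partly driven by aggregate satisfaction.
org_impact = system_satisfaction + rng.normal(0, 0.1, 17)

tau, p = kendalltau(system_satisfaction, org_impact)
print(f"tau = {tau:.2f}, p = {p:.4f}")
```

Aggregating first, as here, is what limits the cross-level test to n = 17 and motivates the conservative rank-based statistic.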

5. Discussion

ODSSs are designed to improve the business activities of many organizational members and therefore share some of the characteristics of other classes of information systems. Like DSS, an ODSS supports the decision-making activities of the individual user. Like GDSS, an ODSS enhances the performance of work groups. Like EIS, an ODSS provides critical information to managers at many different levels in the organization. Despite its similarity to other classes of information systems, however, an ODSS is not simply a big DSS, a big distributed GSS, or a complex EIS [14,28,42,65]. In addition to several functional differences, its objective to support both individual and organizational decision processes sets it apart from other systems. This dual objective also has serious implications for the design, implementation and impact of an ODSS. The purpose of this study was to identify the factors that are critical for successful ODSS implementation.

Keeping in line with the objectives of an ODSS, we organized our analysis into individual and organizational components and examined relationships between the two. At the individual level, we postulated that several system-specific factors and implementation factors affect the impact of ODSS, and our canonical correlation analysis found this to be true. Follow-up univariate analysis and tests of hypotheses showed that three implementation factors (user participation, management support, and system related training) and one system-specific factor (DSS characteristics) were significantly correlated with all three measures of individual level ODSS impact. Our regression analysis identified one system-specific factor, DSS characteristics, and one implementation factor, user participation, as the most critical factors affecting the successful impact of ODSS. Our findings thus indicate that for an ODSS to be successful from an individual user perspective, the system must first be designed to provide all the DSS functions of interactive dialog, flexibility, and tools for examining alternatives. Users also have to be allowed to participate actively in the development and implementation process.

Despite differences in system objectives and scope, our findings indicate that the determinants of ODSS success at the user level are consistent with success factors of other systems. User participation has been an important predictor of IS success, particularly when the system is complex [49]. In empirical studies of individual DSS, user participation, management support, and DSS characteristics were significantly correlated with individual DSS success [32,59]. Similarly, studies on the impacts of expert systems find user participation, management support, and system characteristics to be important determinants of success [69,73,74]. Our investigation was premised on the fact that ODSS are very large organizational systems developed with an organizational process objective. Users might not relate to such a system or see it as very relevant to their tasks. Hence, they might not value their participation in the development and implementation process or expect it to have all the DSS features they see in an individual DSS designed for them. Our findings suggest the contrary. The individual user of an ODSS seems to view it as a decision making tool and wants to participate and be trained just as he or she would for any individual DSS. At the individual user level, then, several factors found important to the successful impacts of DSS and expert systems are also important to ODSS.

Our observations at the organizational level of analysis highlight the fact that ODSS success also depends on other organizational level factors such as strong project governance mechanisms. The involvement of a project team comprised of members from various user departments, and of a project champion, was correlated with organizational impact. Given that an ODSS supports many tasks spanning many organizational levels and departments, the more institutionalized it is, the greater an impact it will have, and this expectation was supported by the data. The level of external competition was not found to be significantly correlated with all the measures of organizational impact. These results are similar to those found in the case of EIS [44]. Studies show that the competitive environment is the main external pressure leading to the development of an EIS [71]. Perhaps the level of external competition is a catalyst for developing an ODSS but does not affect the impact of ODSS on the organization. At the organizational level of analysis, then, factors important to EIS play a role in the successful impact of ODSS.

Importantly, there was also an association between individual level dependent variables and organizational level variables. Measures of individual user satisfaction with the ODSS and of users' perceptions of DSS type benefits were correlated significantly with organizational impacts of ODSS. Impact on end-user jobs at the individual level was not correlated with measures of organizational impact. Impact on end-user jobs measured the effects of ODSS on the user's role in the organization, while the organizational impact measures related to performance improvements. Impact on end-user jobs gauged the extent to which ODSS had improved users' opportunities to advance in their jobs, increased their job appeal, their freedom to do their jobs, the importance of their jobs, etc. Organizational impacts measured improvements in decision processes and business performance. Hence, the lack of correlation between impact on end-user jobs and organizational impacts could be attributed to differences in what these items measured, namely job role benefits vs. performance benefits. It should be recalled that individual level impacts were measured by asking ODSS users about their own perceptions of benefits obtained from the system, while organizational level impacts were measured from the perspective of IS managers and internal auditors. The significant relationship between decision making benefits at both levels demonstrates a consistency in perceptions of the ODSS, as impacts at one level should also appear at other levels of analysis. It also highlights the need to manage and support both user level decision processes and organizational level processes to obtain positive benefits from an ODSS.

5.1. Implications for practice

From a practical perspective, our results indicate that even though an ODSS is a very large and complex system, it needs to be designed so that individual users have "what-if" capabilities, flexibility and interactive dialogue facilities available to them. Despite the system being developed to serve the whole organization or a large part of it, to the ODSS user the system performs and looks just like an individual DSS. Any enterprise that plans to use an ODSS should let users participate in the development and pay attention to system features just as it would in any end-user system project. At the organizational level, however, the system needs to be managed like any other large system, with a project champion, a project committee, and so on. An ODSS tends to have a greater impact if it is designed to integrate tightly with existing systems, processes, and procedures. Perhaps the most daunting task in the management of an ODSS is the simultaneous attention that must be paid to meeting individual user needs while also ensuring the functioning of organizational level coordinating mechanisms. Since positive impacts at the individual level are correlated with organizational level benefits, both must be attended to for a successful ODSS.

5.2. Implications for research

We designed our study specifically to relate our research findings to other management support systems so as to foster a cumulative research tradition. Our findings indicate that many management support systems share common functions and success factors, and therefore it may be possible to develop a framework that identifies a core set of factors influencing the success of all management support systems. The success of each individual system can then be viewed as being influenced by the core success factors plus other factors specific to the system being studied. For example, some core success factors for all management support systems would probably be user participation and management support. For an individual DSS, these core factors along with DSS characteristics would influence the impact of the system. For an ODSS, it could be user participation, management support, and DSS characteristics, plus project management mechanisms at the organizational level. This type of framework could help practitioners understand the factors that have to be managed for successful implementation of a specific system or class of systems. It would also help researchers conduct studies for groups of systems rather than each individual system, a sentiment echoed by others [23].

The study reported here provided support for the findings from past case studies of ODSS, and that support is stronger than it might otherwise be because it was found through research methods other than case studies. Future research on ODSS should combine multiple methods in a single study, using multiple methods to study the same ODSS, in order to strengthen the reliability of common findings even more. It would be useful to develop and validate better organizational level measures for ODSS. This is particularly true for organizational decision processes, which are difficult to measure holistically and separately from the aggregated processes of individual decision-makers. Assessing the organizational impact of IS remains an area where most work is needed [19]. Therefore, the development of these measures would help not only ODSS research but also other research related to supporting organizational decision making.

5.3. Contributions of this research

This study makes several contributions to the information systems implementation literature. It was the first field study of ODSS to identify factors influencing ODSS success. Systems that operate at both an organizational and a local level are likely to become more widespread in the future, as they complement business trends such as downsizing and cross-functional teamwork and technological trends in networking and distributed systems. The growing use of software such as Lotus Notes and SAP R/3 supports this premise. The findings from this study will be useful to those considering the development of such organization-wide systems that coordinate the work of multiple decision-makers.

Second, the study was specifically designed to examine the impacts of ODSS at both the individual and organizational levels. This has not been attempted in past research. The linkage between individual and organizational level impact has been strongly reiterated by many researchers, but the relationship had not been examined. Third, using multivariate tests, the impacts of ODSS were tested on several dependent variables. Most studies of the impact of management support systems have focused on a single dependent variable. Success of an IS is a multidimensional construct, and too often this is ignored [19]. By collectively examining the impacts on several dependent variables, a more comprehensive understanding of the success of an IS and its interdependencies can be obtained.

5.4. Limitations of the study

This study had several limitations. First, we were compelled to use a convenience sample. We had to ensure that the respondents were referring to ODSS and not to any other type of computer-based system. We had to contact several organizations and identify those that had installed systems that fit the definition of an ODSS. We also had to monitor the filling out of the questionnaire, since it went to both managers and system users in the organization. All of these factors necessitated the use of a convenience sample. Furthermore, attempting to get a random sample on a relatively new system might have increased the chance of non-response. It would have been difficult to decide whether a non-response was due to the fact that the respondent did not have a system, or whether they had a system and refused to participate. The second limitation was the small sample size for conducting the organizational level analysis. The measures of organizational impact could not be tested for construct reliability. However, these measures have been used in prior studies, and therefore their validity has been tested.
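The construct reliability check that the small organizational sample precluded is commonly done with Cronbach's alpha. The paper does not report its reliability procedure or data; the sketch below is only an illustration of the standard formula, using a hypothetical set of responses, with a conventional (but not universal) cutoff of 0.70.

```python
from statistics import variance

def cronbach_alpha(responses):
    """Cronbach's alpha for a multi-item scale.

    responses: one list of item scores per respondent (all the same length).
    alpha = (k / (k - 1)) * (1 - sum of item variances / variance of totals)
    """
    k = len(responses[0])                       # number of items in the scale
    items = list(zip(*responses))               # transpose: one tuple per item
    item_vars = sum(variance(col) for col in items)
    total_var = variance([sum(row) for row in responses])
    return (k / (k - 1)) * (1 - item_vars / total_var)

# Hypothetical data: 4 respondents answering 3 five-point items.
sample = [[5, 4, 5], [4, 4, 4], [2, 3, 2], [3, 3, 3]]
alpha = cronbach_alpha(sample)                  # about 0.93 for this sample
```

With only 17 organizational-level responses, an estimate like this would be unstable, which is one reason the limitation matters.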

6. Conclusions

The concept of organizational decision support goes back to 1981, when Hackathorn and Keen [34] first discussed it. They envisioned the use of information technologies primarily to communicate and coordinate business activities in the organization. Later commentators embellished and expanded the original vision (cf. [28]). However, it is only in the 1990s that networking and decision support tools have made it practically feasible to develop operational ODSS. Success with individual DSS, GDSS and networks has provided organizations the necessary expertise to build ODSS. Even then, our survey identified only about two dozen systems in the midwest and southeast parts of the country. The costs of development and the coordination mechanisms necessary to build an ODSS make it a risky project. While the progress in technology provides many capabilities for building ODSS, the success of these systems depends on other factors as well. Hence, practitioners will benefit from knowing the factors that could influence the successful impact of an ODSS.

Our empirical study of ODSS follows trends in IS research where new systems are proposed, tested,


deployed, and then evaluated. For example, early research on DSS defined the functional and technical aspects of a DSS [2,62]. Prototype systems were tested for various applications. Vendors developed DSS generators, and such systems found widespread use in organizations [24,25]. After DSS were adopted, empirical studies examined their impacts across various organizations [32,59]. The same pattern is true of GDSS, EIS and ES research. The study reported here is the first that we know of to survey ODSS across several organizations and across several users of each system. The findings, though limited to 17 systems, provide insight into the impacts of ODSS at both the individual and organizational levels.

Appendix A. Variables measurement

A.1. Individual level impacts

A.1.1. User satisfaction
This was measured by a 13-item five-point scale, anchored on strongly disagree to strongly agree, developed for DSS by Sanders and Courtney [59] and used by others [49,73,74]. These questions gauged whether the user felt that the system had improved their decision-making activities.
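The paper does not publish its scoring procedure for these multi-item scales. A common convention, shown below purely as an illustration, is to average a respondent's item responses into one composite score, after flipping any negatively worded items; the function name and the reverse-coded item are hypothetical.

```python
def scale_score(item_responses, reverse=(), points=5):
    """Average a respondent's Likert-item responses into one composite score.

    reverse: zero-based indices of negatively worded items, whose responses
    are flipped (on a 5-point scale, 1 <-> 5 and 2 <-> 4).
    """
    adjusted = [
        (points + 1 - v) if i in reverse else v
        for i, v in enumerate(item_responses)
    ]
    return sum(adjusted) / len(adjusted)

# Hypothetical respondent answering 5, 4, 2, with the third item reverse-worded:
score = scale_score([5, 4, 2], reverse={2})   # averages 5, 4 and 4
```

Averaging (rather than summing) keeps the composite on the original 1–5 metric, which makes scores comparable across scales with different numbers of items.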

A.1.2. Perceived DSS benefits
Perceived benefits was measured with a nine-item five-point scale from Money et al. [51] and Guimaraes et al. [32]. The questions asked the user to quantify several DSS benefits, such as better planning, better communication, better search for alternatives, better utilization of data, etc.

A.1.3. Impact on end-user jobs
Job impact was measured by eleven questions on a five-point scale anchored on greatly decreased to greatly increased. This was adopted from Yoon et al. [73,74] and asked users to state the effect of the ODSS on the importance of their jobs, the skills needed to complete their tasks, the freedom to do their jobs, etc.

A.2. Independent variables measuring system characteristics affecting individual impact

A.2.1. DSS characteristics
This measure of DSS characteristics, adopted from Igbaria and Guimaraes [38], consisted of seven questions anchored on "Extremely dissatisfied" to "Extremely satisfied". The questions asked users to rate whether the system provided them with flexibility to examine various decisions, whether the user felt in control, whether the system responded promptly, etc.

A.2.2. User-friendliness
This measure, adopted from Igbaria and Guimaraes [38], had three questions that asked users to rate how long it took to learn to use the system and whether it was easy to become skillful in its use.

A.3. Independent variables measuring implementation characteristics affecting individual impact

A.3.1. User participation
User participation was assessed by nine different items based on the work of several authors [20,21,74]. Using a five-point scale anchored on "not at all" to "a great deal", respondents were asked to state the extent to which they had participated in initiating the project, determining objectives of the system, developing input requirements, etc.

A.3.2. Management support
This measure, adopted from Yoon et al. [73,74], consisted of six questions on a five-point scale anchored between "strongly disagree" and "strongly agree". These questions asked users whether they were encouraged to use the system, whether management provided the resources and help to use the system, whether there are help centers, etc.

A.3.3. User training
Adopted from Yoon et al. [74], training was measured by five questions asking users to state the extent to which they received training pertinent to


the use of the ODSS. All the responses were obtained on a five-point scale anchored between "not at all" and "to a great extent".

A.4. Organizational level impacts

A.4.1. Impact on business
The measure of business performance, adopted from Gupta and Govindarajan [33], had a set of 12 questions anchored on "not at all" to "a great impact". The questions asked the respondent to rate the business impact of ODSS over the last 2 years on corporate performance items such as sales growth rate, market share, operating profits, rate of profit to sales, cash flow from operations, return on investment, cost reduction activities, etc.

A.4.2. Decision making improvements
Two measures of decision making improvements were used, namely, improvements in decision-making speed and decision-making analysis, adopted from Leidner and Elam [45,46]. Decision-making analysis was measured by three questions asking the respondents to rate whether the ODSS helped the organization evaluate more alternatives, increase the number of information sources, test assumptions and spend more time before making a decision. The responses were obtained on a five-point scale ranging from "not at all" to "to a great extent". Decision-making speed was measured on the same scale with two questions that asked users if the ODSS helped the organization to implement decisions faster, and if it shortened the time frame to make decisions.

A.5. Independent variables affecting organizational impact

A.5.1. Project champion
This was measured by questions addressing project champion characteristics, developed from Beath [7], Grover and Goslar [31], and Premkumar and Ramamurthy [56]. The respondents rated the extent to which the project champion possessed characteristics such as being politically powerful, having control over organizational resources, and having sound knowledge of the business, the industry, etc.

A.5.2. Project committee
This was measured by two questions on a five-point scale. One question addressed whether the project was directed by a steering committee consisting of top and middle level managers, while the other asked whether the committee had members from all the affected departments [14].

A.5.3. Extent of institutionalization
This was measured by two questions [14] on a five-point scale anchored between "not at all" and "to a great extent". One question was "How intertwined is the ODSS with your organization's policies and procedures?" The other question asked, "How much support for your business processes does this ODSS provide?"

A.5.4. Competitive environment
Using a five-point scale, this measure had three questions that asked the respondent to rate the extent to which other organizations capture their customers, whether the firm operates in a competitive environment, and the extent to which the organization seeks to improve its market share [44].

References

[1] M. Alavi, E.A. Joachimsthaler, Revisiting DSS implementation research: a meta-analysis of the literature and suggestions for researchers, MIS Quarterly 16 (1) (1992) 95–116.
[2] S. Alter, A taxonomy of decision support systems, Sloan Management Review 19 (1) (1977) 39–56.
[3] L.M. Applegate, J.C. Henderson, Organizational decision support: an interacting team perspective, in: E.A. Stohr, B.R. Konsynski (Eds.), Information Systems and Decision Processes, IEEE Computer Society Press, Los Alamitos, CA, 1992, pp. 154–158.
[4] J.E. Bailey, S.W. Pearson, Development of a tool for measuring and analyzing computer user satisfaction, Management Science 29 (5) (1983) 530–544.
[5] H. Barki, J. Hartwick, Measuring user participation, user involvement and user attitude, MIS Quarterly 18 (1) (1994) 59–79.
[6] H. Barki, J. Hartwick, User participation, conflict and conflict resolution: the mediating roles of influence, Information Systems Research 5 (4) (1994) 422–438.
[7] C.M. Beath, Supporting the information technology champion, MIS Quarterly 15 (3) (1991) 355–374.
[8] L.W. Belcher, H.T. Watson, Assessing the value of Conoco's EIS, MIS Quarterly 17 (3) (1993) 239–253.
[9] I.H. Bernstein, C.P. Garbin, G.K. Teng, Applied Multivariate Analysis, Springer-Verlag, New York, 1988.
[10] D. Bender, Financial impact of information processing, Journal of MIS 3 (2) (1986) 232–238.
[11] H. Bidgoli, Decision Support Systems: Principles and Practice, West Publishing, St. Paul, MN, 1989.
[12] R.P. Bostrom, Successful application of communication techniques to improve the system development process, Information and Management 16 (2) (1989) 279–295.
[13] P.W.G. Bots, Shaping organizational information systems through coordination support, in: R.M. Lee, A.M. McCosh, P. Migliarese (Eds.), Organizational Decision Support Systems, North-Holland, New York, 1988, Chap. 11.
[14] J.R. Carter, M.P. Murray, R.G. Walker, W.E. Walker, Building Organizational Decision Support Systems, Academic Press, San Diego, CA, 1992.
[15] W.L. Cats-Baril, G.P. Huber, Decision support systems for ill-structured problems: an empirical study, Decision Sciences 18 (3) (1987) 350–372.
[16] J. Cohen, P. Cohen, Applied Multiple Regression/Correlation Analysis for the Behavioral Sciences, Lawrence Erlbaum Associates, New Jersey, 1983.
[17] D.R. Compeau, C.A. Higgins, Application of social cognitive theory to training for computer skills, Information Systems Research 6 (2) (1995) 119–143.
[18] W. Cron, M. Sobol, The relationship between computerization and performance: a strategy for maximizing economic benefits of computerization, Information and Management 6 (2) (1983) 171–181.
[19] W. DeLone, E. McLean, Information systems success: the quest for the dependent variable, Information Systems Research 3 (1) (1992) 60–95.
[20] W.J. Doll, G. Torkzadeh, Discrepancy model of end-user computing involvement, Management Science 35 (10) (1989) 1151–1171.
[21] W.J. Doll, G. Torkzadeh, The measurement of end-user software involvement, Omega 18 (4) (1989) 399–406.
[22] C. Dondi, P. Migliarese, G. Moia, G. Salamone, An organizational decision support system for Italetel projects and internal resource planning, in: R.M. Lee, A.M. McCosh, P. Migliarese (Eds.), Organizational Decision Support Systems, Proceedings of the 8.3 Working Conference on Organizational DSS, Lake Como, Italy, Elsevier, 1988.
[23] P. Eindor, E. Segev, A classification of information systems: analysis and interpretation, Information Systems Research 4 (2) (1993) 166–204.
[24] J. Elam, G. Huber, M. Hurt, An examination of the DSS literature (1975–1985), in: E.R. McLean, H.G. Sol (Eds.), Decision Support Systems — A Decade in Perspective, Proceedings of the IFIP Conference, Elsevier, 1986.
[25] H. Eom, J.P. Shim, A survey of DSS applications, Interfaces 20 (3) (1990) 25–32.
[26] J. Fedrowicz, B.R. Konsynski, ODSS technology and decision processes, in: E.A. Stohr, B.R. Konsynski (Eds.), Information Systems and Decision Processes, IEEE Computer Society Press, Los Alamitos, CA, 1992, pp. 143–145.
[27] C.R. Franz, D. Robey, Organizational context, user involvement, and the usefulness of information systems, Decision Sciences 17 (2) (1986) 329–356.
[28] J.F. George, The conceptualization and development of organizational decision support systems, Journal of MIS 8 (3) (1991–1992) 109–125.
[29] J.F. George, J.F. Nunamaker, J.S. Valacich, ODSS: information technology for organizational change, Decision Support Systems 8 (1992) 307–315.
[30] J.H. Gerlach, F. Kuo, Understanding human–computer interaction for information systems design, MIS Quarterly 15 (4) (1991) 527–549.
[31] V. Grover, M.D. Goslar, The initiation, adoption and implementation of telecommunications technologies in U.S. organizations, Journal of MIS 10 (1) (1993) 141–163.
[32] T. Guimaraes, M. Igbaria, M. Lu, The determinants of DSS success: an integrated model, Decision Sciences 23 (2) (1992) 409–430.
[33] A.K. Gupta, V. Govindarajan, Business unit strategy, managerial characteristics, and business unit effectiveness at strategy implementation, Academy of Management Journal 27 (11) (1984) 25–41.
[34] R.D. Hackathorn, P.G.W. Keen, Organizational strategies for personal computing in decision support systems, MIS Quarterly 5 (3) (1981) 21–27.
[35] J.H. Hair, R.E. Anderson, R.L. Tatham, W.C. Black, Multivariate Data Analysis with Readings, 3rd edn., Macmillan, New York, 1987.
[36] C. Holsapple, A.B. Whinston, Organizational computing for decision support, in: C. Holsapple, A.B. Whinston (Eds.), Decision Support Systems — A Knowledge-Based Approach, West Publishing, MN, 1996, Chap. 16.
[37] M. Igbaria, End-user computing effectiveness: a structural equation model, Omega 18 (6) (1990) 637–652.
[38] M. Igbaria, T. Guimaraes, Empirically testing the outcomes of user involvement in DSS development, Omega 22 (2) (1994) 157–172.
[39] B. Ives, M.H. Olson, User involvement and MIS success: a review of research, Management Science 30 (5) (1984) 503–586.
[40] B. Ives, M.H. Olson, J.J. Baroudi, The measurement of user information satisfaction, Communications of the ACM 26 (10) (1983) 785–793.
[41] V.S. Jacob, H. Pirkul, Organizational decision support systems, International Journal of Man–Machine Studies 36 (3) (1992) 817–832.
[42] J.L. King, S.L. Star, Conceptual foundations for the development of organizational decision support systems, Proceedings of the Twenty-Fourth Annual Hawaii Conference on System Sciences, 1990, pp. 143–150.
[43] A. Lederer, A.L. Mendelow, The impact of the environment on the management of information systems, Information Systems Research 1 (2) (1990) 205–222.
[44] D.E. Leidner, The transition to open markets and modern management: the success of EIS in Mexican organizations, in: J. DeGross, S. Jarvenpaa, A. Srinivasan (Eds.), Proceedings of the Seventeenth International Conference on Information Systems, Cleveland, Ohio, 1996, pp. 290–306.
[45] D.E. Leidner, J. Elam, Executive information systems: their impact on executive decision making, Journal of MIS 10 (3) (1993–1994) 139–155.
[46] D.E. Leidner, J. Elam, The impact of executive information systems on organizational design, intelligence, and decision making, Organization Science 6 (6) (1995) 645–664.
[47] M.O. Mahmood, G.J. Mann, Measuring the organizational impact of information technology investment: an exploratory study, Journal of MIS 10 (1), 97–122.
[48] J.D. McKeen, T. Guimaraes, Selecting MIS projects by steering committee, Communications of the ACM 28 (12) (1985) 1345–1352.
[49] J.D. McKeen, T. Guimaraes, J.C. Wetherbe, The relationship between user participation and user satisfaction: an investigation of four contingency factors, MIS Quarterly 18 (4) (1994) 427–451.
[50] L. Mohan, W.K. Holstein, R.B. Adams, EIS: it can work in the public sector, MIS Quarterly 14 (4) (1990) 435–448.
[51] A. Money, D. Tromp, T. Wegner, The quantification of decision support benefits within the context of value analysis, MIS Quarterly 2 (2) (1988) 223–236.
[52] P.P. Mykytyn, End-user perceptions of DSS training and DSS usage, Journal of System Management 39 (6) (1988) 32–35.
[53] J. Neter, M.H. Kutner, C.J. Nachtsheim, W. Wasserman, Applied Linear Statistical Models, 4th edn., Irwin, Chicago, 1996.
[54] M.N. Norusis, SPSS for Windows Base System Users Guide Release 6.0, SPSS, North Michigan, 1993.
[55] M. Pagani, A. Belluci, An organizational decision support system for Tellettra's top management, in: R.M. Lee, A.M. McCosh, P. Migliarese (Eds.), Organizational Decision Support Systems, Proceedings of the 8.3 Working Conference on Organizational DSS, Lake Como, Italy, Elsevier, 1988.
[56] G. Premkumar, K. Ramamurthy, The role of interorganizational and organizational factors on the decision mode for adoption of interorganizational systems, Decision Sciences 26 (3) (1995) 303–336.
[57] S. Rivard, S. Huff, Factors of success for end-user computing, Communications of the ACM 31 (5) (1988) 552–561.
[58] D. Robey, D. Farrow, C.R. Franz, Group process and conflict in system development, Management Science 35 (10) (1989) 1172–1189.
[59] G.L. Sanders, J.F. Courtney, A field study of organizational factors influencing DSS success, MIS Quarterly 9 (1) (1985) 77–93.
[60] R. Santhanam, M. Sein, Improving end-user proficiency: effects of conceptual training and nature of interaction, Information Systems Research 5 (4) (1994) 378–399.
[61] R. Sharda, S.H. Barr, J.C. McDonnell, Decision support system effectiveness: a review and an empirical test, Management Science 34 (2) (1988) 139–159.
[62] R.H. Sprague, A framework for the development of decision support systems, MIS Quarterly 4 (4) (1980) 1–26.
[63] J. Sviokla, The examination of the impact of expert systems on the firm: the case of XCON, MIS Quarterly 14 (2) (1990) 126–140.
[64] E.B. Swanson, Distributed decision support systems: a perspective, Proceedings of the Twenty-Fourth Annual Hawaii Conference on System Sciences, IEEE Press, Los Alamitos, 1990, pp. 129–136.
[65] E.B. Swanson, R.W. Zmud, ODSS concepts and architecture, in: E.A. Stohr, B.R. Konsynski (Eds.), Information Systems and Decision Processes, IEEE Computer Society Press, Los Alamitos, CA, 1992, pp. 138–141.
[66] R.L. Thompson, C.A. Higgins, J.M. Howell, Personal computing: toward a conceptual model of utilization, MIS Quarterly 15 (1) (1991) 125–143.
[67] P. Todd, I. Benbasat, An experimental investigation of the impact of computer based decision aids on decision making strategies, Information Systems Research 2 (2) (1991) 87–115.
[68] G. Torkzadeh, W. Zia, Managing telecommunications by steering committee, MIS Quarterly 16 (2) (1992) 187–199.
[69] C.K. Tyran, J.F. George, The implementation of expert systems: a survey of successful implementations, Database 24 (1) (1993) 5–15.
[70] W.E. Walker, Differences between building a traditional DSS and an ODSS: lessons from the Air Force's enlisted force management system, Proceedings of the Twenty-Fourth Annual Hawaii Conference on System Sciences, IEEE Press, Los Alamitos, 1990, pp. 120–128.
[71] H.J. Watson, R.K. Rainer, C.E. Koh, Executive information systems: a framework for development and a survey of current practices, MIS Quarterly 15 (1) (1991) 13–30.
[72] R.T. Watson, A design for an infrastructure to support organizational decision making, Proceedings of the Twenty-Fourth Annual Hawaii Conference on System Sciences, IEEE Press, Los Alamitos, 1990, pp. 111–118.
[73] Y. Yoon, T. Guimaraes, Assessing expert systems impact on users' jobs, Journal of MIS 12 (1) (1995) 225–249.
[74] Y. Yoon, T. Guimaraes, Q. O'Neal, Exploring the factors associated with expert systems success, MIS Quarterly 19 (1) (1995) 83–106.

Radhika Santhanam is an Associate Professor in the School of Management at the University of Kentucky. She obtained a PhD from the University of Nebraska, Lincoln and an MS from Texas A&M University. Her research interests focus on issues relating to Human-Computer Interaction, Management Support Systems and Information System Project Selection. She has published her research on these topics in various journals including Information Systems Research, International Journal of Human-Computer Studies, Information and Management, Omega and Computers and Operations Research. She serves on the editorial board of Computers and Operations Research.


Tor Guimaraes holds the Jesse E. Owen Chair of Excellence at Tennessee Technological University. He has a PhD in MIS from the University of Minnesota and an MBA from California State University, Los Angeles. Tor was a Professor and Department Chairman at St. Cloud State University. Before that, he was Assistant Professor and Director of the MIS Certificate Program at Case Western Reserve University. He has been the keynote speaker at numerous national and international meetings sponsored by organizations such as the Information Processing Society of Japan, Institute of Industrial Engineers, American Society for Quality Control, IEEE, ASM, and Sales and Marketing Executives. Tor has consulted with many leading organizations including TRW, American Greetings, AT&T, IBM and the Department of Defense. Working with partners throughout the world, Tor has published close to 150 articles about the effective use and management of Information Systems and other technologies. He is also the Editor-in-Chief of COMPUTER PERSONNEL, an Association for Computing Machinery journal.

Joey F. George (PhD, University of California at Irvine, 1986; AB, Stanford University, 1979) is Professor of Information Systems and the Thomas L. Williams Jr. Eminent Scholar in IS at Florida State University. His research interests focus on the use of information systems in the workplace, including computer-based monitoring, group support systems, and deception in computer-mediated communication.