
An Empirical Assessment of the Information Resource Management Construct

BRUCE R. LEWIS, CHARLES A. SNYDER, AND R. KELLY RAINER, JR.

BRUCE R. LEWIS is the Executive Director of the Division of University Computing at Auburn University, Auburn, Alabama. He received his Ph.D. in management information systems from Auburn University. He has worked in the field of information systems for over twenty years. Among his research interests are applied statistics, information resource management, and organizational aspects of information systems management. He has published in the Journal of Computer Information Systems.

CHARLES A. SNYDER is Professor of MIS in the Department of Management at Auburn University. He has broad management, research, and consulting experience. His more than fifty publications have appeared in leading journals such as the Journal of Management Information Systems, Information & Management, The Academy of Management Review, The Academy of Management Executive, Data Management, The International Journal of Man-Machine Studies, and Decision Support Systems. His research interests include information resource management, systems analysis and design, executive information systems, and telecommunications. He is a member of SIM, AIS, DSI, ACM, IEEE, IRMA, APICS, and other professional societies.

R. KELLY RAINER, JR. is Associate Professor of MIS in the Department of Management at Auburn University. He received his Ph.D. in management information systems from the University of Georgia. He has published articles in such journals as the Journal of Management Information Systems, MIS Quarterly, Decision Sciences, and Educational and Psychological Measurement, as well as others. His research interests include executive information systems, business process reengineering, and structural equation modeling. He is co-editor, with Hugh Watson and George Houdeshel, of Executive Information Systems (New York: Wiley, 1992).

ABSTRACT: The concept of information resource management (IRM) has been surrounded by confusion for almost two decades. This study first defines the IRM construct as a comprehensive approach to planning, organizing, budgeting, directing, monitoring, and controlling the people, funding, technologies, and activities associated with acquiring, storing, processing, and distributing data to meet a business need for the benefit of the entire enterprise.

The study then operationalizes the IRM construct by developing a measurement instrument.

Acknowledgment: The authors wish to acknowledge the helpful comments of the Editor-in-Chief and of the two anonymous reviewers, which helped strengthen the paper.

Journal of Management Information Systems / Summer 1995, Vol. 12, No. 1, pp. 199-223


The instrument demonstrates acceptable content validity as well as construct validity and reliability. Eight dimensions underlying the IRM construct were found via exploratory factor analysis: chief information officer, planning, security, technology integration, advisory committees, enterprise model, information integration, and data administration. The instrument serves two functions: (1) to create a coherent, theoretical foundation for further research on the IRM construct, and (2) to provide reference norms for practicing managers to use to assess the extent of IRM implementation in their organizations.

KEY WORDS AND PHRASES: information resource management, management of information technology.

THE CENTRAL TENET OF THE INFORMATION AGE HAS BEEN the crucial significance of information and the critical importance of its management to the enterprise. The management of information is the only area of business in which investment has consistently increased faster than economic growth [31]. These investments may equal nearly one-half of large firms' annual capital expenditures [31]. Investments in IT increased from $55 billion to $190 billion during the 1980s, an annual growth rate of 15 percent [31]. As these expenditures have grown, information has become recognized as a vital resource in the enterprise that must be effectively managed. Information resource management (IRM) has been widely acclaimed as the vehicle for the management of information and the associated technologies. The issue of IRM remains important, as evidenced by current research (see, e.g., [3, 15, 44]).

As early as 1979, Diebold [16, 17] addressed the information resource management concept when he stated that "It is clear that the organizations which excel . . . will be those that recognize information as a major resource and structure it as efficiently as they do other assets." Although the IRM concept has received widespread attention from information systems (IS) academicians and practicing managers since that time, uncertainty still surrounds the concept [19, 21, 35, 37, 47, 50, 55, 60]. In spite of the ambiguity, the premise underlying the IRM concept acknowledges that information is a valuable resource to the enterprise, comparable to other organizational assets such as people, plant, and capital, and should be managed accordingly [3, 7, 15, 23, 35, 37, 43, 44, 52, 63, 66].

Guimaraes extensively reviewed three predominant views of IRM from the literature: "IRM as the management of information as a resource" [22, p. 12]; "IRM as the management of information systems development" [22, p. 13]; and "IRM as the management of computing resources" [22, p. 14]. In addition, other views approach IRM from a technology perspective [32], an organizational perspective [34], and an educational perspective [36]. These views suggest that the IRM concept is multifaceted [50, 66]. Several authors have noted the lack of a commonly accepted definition [38, 46, 51, 59]. This uncertainty of meaning and purpose leads to difficulty in establishing a coherent, theoretical foundation for research on the topic of IRM.

The information systems (IS) field has long been criticized for a lack of development as a cohesive academic discipline.


Problem areas have included inadequate construct development and a scarcity of valid, reliable measurement instruments [64]. The development of constructs and instruments to measure them empirically provides the theoretical basis for research in a field. The complex, multifaceted nature of the IRM concept and the lack of a universal definition for IRM suggest that further development of the IRM construct would be useful.

Further, a means of assessing the extent to which the IRM concept is implemented in an organizational setting does not exist [22]. Given that IRM has received so much exposure, the question arises as to how IRM is empirically related to other organizational factors. In order to address this issue, and others, a metric is needed for the IRM concept [18].

The body of research on IRM lacks a common, widely accepted IRM construct that has been defined, operationalized, tested, and validated. Accordingly, this research is directed at defining and operationalizing the IRM construct through development of a valid and reliable measure.

Background for the Information Resource Management Construct

THE CONCEPT OF INFORMATION RESOURCE MANAGEMENT was introduced in the mid-1970s by the U.S. federal government as part of its attempt to diminish the paperwork burden on the general public [42, 51]. IRM was presented as a planning and control mechanism in the context of records management, although the term IRM referred to information in a broader context than just hardcopy documents.

Existing Definitions of the IRM Concept

The literature provides numerous definitions of IRM. Many of these previous definitions attempt to define IRM by elaborating on the three words (information, resource, management) in the term and by reusing the words of the term in the body of the definition (see, e.g., [4, 23, 26, 29, 37, 47, 53, 57, 60, 66, 68]). Notably, Guimaraes [22, p. 17] provided a definition of IRM that did not exhibit these limitations: "IRM is a collection of subfunctions whose objectives are to perform and manage the acquisition, storage, manipulation, retrieval, and communication (distribution) of data." However, no one definition is accepted as the standard for IRM [38, 59].

Previous Attempts at Operationalizing the IRM Concept

A number of authors have proposed features of the IRM construct. Holmes [26] suggested the notions of the convergence of information technologies and top information executives to manage them. Poppel [54] proposed IRM as a mechanism to convert business goals into strategic objectives. Horton [27] further elaborated on the idea that IRM included multiple information-handling technologies and functions. Diebold [17] and Horton [27] offered the proposition that IRM meant viewing information as a valuable corporate resource comparable to capital and labor.


Synnott and Gruber [65] discussed IRM as a means of tying management of the information resource to the overall goals of the organization. They also introduced the term "chief information officer" (CIO) for the senior executive responsible for organization-wide information and technology policy. Horton [28] concluded that IRM encompassed the management of both information and information technologies.

Three authors have presented extensive reviews of the IRM literature [41, 42, 66]. Levitan [41] found that IRM went beyond MIS by addressing the entire scope of information resources in an organization. She postulated that the objectives of a corporate-wide IRM program encompass the relevancy of information, the appropriateness of technology, the cost-benefit balance of managing information, the proper accountability mechanisms, and the capability to initiate a change in the corporate attitude toward information. Lytle [42] noted that IRM included the data administration function and emphasized the convergence of data processing, data communications, and office automation technologies, and the management of these phenomena. He further cited the relationship between information and strategic planning and concluded that information and information technologies were critical to the enterprise mission. Trauth [66] observed that two phenomena, the concept of knowledge work and the development of advanced computational and communications technologies, led to the advent of IRM. She traced the development of IRM to three management disciplines covering database, records, and data processing. She noted that the three goals of IRM were to advance a global view of corporate data, to position the information management function at a high level in the organization, and to integrate both the technology and data of the organization.

Several authors have proposed models for IRM [5, 10, 22, 50, 51, 60]. Guimaraes [22] presented a model that depicted the subfunctions of IRM: operations management, data resources management, systems development, quality assurance, project management, user support, communications management, and planning. Smith and Medley's [60] model adopts a technology perspective and depicts IRM as composed of applications, databases, and technology infrastructure with a desktop computer as the user interface. Bryce and Bryce's [5] model portrays IRM from the systems development perspective and includes project management, database engineering, enterprise modeling, and development methodology as the principal components. Corbin [10] postulated a model that enumerated the technology and process aspects of IRM. Owen's [51] model characterized IRM as a function of information technology, upon which information systems are built, which in turn produce information. O'Brien and Morgan [50] presented a multidimensional model of IRM that portrayed three basic management activities: resource (systems and data) management, technology (computers, telecommunications, and office automation) management, and functional (planning, operations, systems development, and user support) management. Although there are commonalities among these models, they do not achieve a consensus. Thus, IRM model development has contributed to the difficulties in achieving a common framework for the conduct of research on the IRM concept.

The research on IRM, as described in the literature, is not particularly rigorous [56]. Most articles on the topic of IRM are based on either opinion or anecdote.


Lytle [42] observed that reports on IRM programs in private business and industry are uncommon. There are, however, a few case study descriptions [10, 13, 14, 20, 23, 25, 45, 48, 53]. These case studies are primarily from public sector organizations, both institutions of higher education and government; however, they clearly support the IRM premise and definition determined in this study.

Three descriptive investigations of IRM were based on surveys. Guimaraes [21] appraised IRM implementation in private industry. He concluded that there was substantial confusion about IRM and consequently that most companies had not implemented the concept. Guimaraes [21] provided a checklist of IRM features that made the point that there are multiple indicators of the deployment of IRM. O'Brien and Morgan [50] conducted a survey of IS executives to assess the viability of their multidimensional model of IRM. They found that managing information as a resource received the largest acknowledgment as a primary theme of IRM. Overall, they concluded that their multidimensional framework had merit in practice. Laribee [38] conducted a nationwide survey of IS professionals and educators to appraise the importance of IRM topics in educational curricula.

Although these three studies did not report on the validity and reliability of their instruments, each made a substantial contribution to the current study. First, Guimaraes [21] noted the confusion over the IRM concept. Second, Guimaraes [21] and O'Brien and Morgan [50] provided initial evidence on the multidimensional nature of the IRM concept. Third, all three studies [21, 38, 50] produced and discussed multiple activities related to the IRM concept.

The Need for the IRM Construct

The need for the IRM construct arises from the growing importance of, and expenditures for, IT in organizations. Managers must accomplish two tasks: first, they must ensure that IT pays its way, and second, they must ensure that IT investments are targeted appropriately [3, 31, 44]. A clear definition and operationalization of the IRM construct will help practicing managers understand the variety of activities that fall under the umbrella of IT. This understanding will help these managers perform the tasks above. Consequently, managers need a consistent, inclusive definition and operationalization that encompasses the multiple facets of the IRM construct.

Developing this definition and operationalization leads to methodological issues that motivate the need for the IRM construct. Sato and Horiuchi [56, p. 93] noted that "little research is reported on the function of IRM." Likewise, Lytle [43] observed the lack of empirical models in the IRM literature and called for a better research base. The body of knowledge on the IRM construct has not provided a consistent, widely accepted definition of the construct. Previous studies have proposed a variety of facets for the construct, but without rigorous empirical testing. In particular, a complete set of dimensions for the IRM construct has not been tested. As a result, the body of knowledge on the IRM construct has led to inconsistent interpretations, resulting in an inability to test, replicate, and extend previous research. Finally, the nomological network of the IRM construct is difficult to establish given the current state of research in the area.


Development of the IRM Construct

A CONSTRUCT IS AN ABSTRACT REPRESENTATION OF A PHENOMENON of interest to researchers.

Information resource management is the construct of interest in this study. The methodology for the present study followed the paradigm for construct measurement enumerated by Churchill [8]. First, the domain of the IRM construct was specified. Next, a sample of items was generated, and a measurement instrument designed and refined through several iterations. Then, data were collected and the reliability and validity of the instrument were assessed. Last, data from an industry sample were summarized to provide a profile of IRM implementation, which may serve as a norm for future use of the instrument.

Domain of the IRM Construct (Definition)

The domain of a construct is essentially a definition of the concept. The premise underlying the IRM concept calls for information to be managed as a valuable asset; hence, the definition of IRM should address IRM as a management activity. The definitions of IRM determined through a review of the literature were synthesized into the following inclusive definition:

IRM is a comprehensive approach to planning, organizing, budgeting, directing, monitoring and controlling the people, funding, technologies and activities associated with acquiring, storing, processing and distributing data to meet a business need for the benefit of the entire enterprise.

Sample of Items and Measurement Instrument (Content Validity)

The objective of item creation is to ensure content validity. Content validity is the representativeness or sampling adequacy of the construct domain [6, 33]. To generate a representative sample of items and achieve content validity, a variety of procedures were employed in this study.

The first procedure was a content analysis of the literature. The selected literature spanned both academic and professional journals and books in MIS, as well as other disciplines. The literature was prescreened to determine the pieces that directly addressed the topic of IRM. Articles and books were chosen for review if the terms "information resource(s) management" or "IRM" were in the title or key word list. The rationale for this search standard was based on the fact that, because IRM is ill-defined, authors who label their publications with the term are contributing to the explication of the IRM construct. The time period covered by the literature was 1975 to the present.

The content analysis of the literature elicited forty-four activities associated with the IRM construct (see Table 1). These activities depict the primary means mentioned in the literature by which the IRM construct may be realized in practice. As such, they represent indicators of the extent to which IRM may be implemented within an organization.

Accordingly, an instrument was designed to measure the extent of IRM implementation


Table 1 IRM Activities from a Content Analysis of the Literature

• Integrated computer based information systems [10, 13, 16, 18, 23, 26, 29, 32, 35, 36, 37, 41, 47, 50, 51, 57, 60, 65]
• Integrated communications [10, 13, 16, 18, 23, 26, 29, 32, 35, 36, 37, 41, 47, 50, 51, 57, 60, 65]
• Integrated office automation [10, 13, 16, 18, 23, 26, 29, 32, 35, 36, 37, 41, 47, 50, 51, 57, 60, 65]
• Data integration across applications [4, 16, 21, 22, 42, 47, 50, 51, 60, 65, 66]
• Applications systems integration [16, 21, 42, 50, 51, 60, 65]
• Local IT facilities (microcomputers, workstations, minicomputers, LANs and servers) [14, 23, 32, 37, 43, 65]
• IT architecture: computers and communications [14, 23, 32, 37, 43, 65]
• Assess potential of new technology [53, 63, 68]
• CIO establishes organization-wide IS/IT policies [4, 21, 26, 29, 32, 36, 37, 47, 51, 53, 60, 65, 66]
• CIO involved in organization-wide strategic planning [4, 21, 26, 29, 32, 36, 37, 47, 51, 53, 60, 65, 66]
• CIO responsible for central and distributed IS/IT support [4, 21, 26, 29, 32, 36, 37, 47, 51, 53, 60, 65, 66]
• CIO authorizes corporate-wide IT acquisitions [4, 21, 26, 29, 32, 36, 37, 47, 51, 53, 60, 65, 66]
• Program for quality assurance of information systems [4, 21, 22, 26, 29, 32, 36, 37, 47, 51, 53, 60, 65, 66]
• Data administration function [4, 16, 21, 22, 42, 47, 50, 51, 60, 65, 66]
• Data architecture [4, 16, 21, 22, 42, 47, 50, 51, 60, 65, 66]
• Data ownership policies [4, 16, 21, 22, 42, 47, 50, 51, 60, 65, 66]
• Data dictionary [4, 16, 21, 22, 42, 47, 50, 51, 60, 65, 66]
• Data shared between users [4, 16, 21, 22, 42, 47, 50, 51, 60, 65, 66]
• Data security [4, 16, 21, 22, 42, 47, 50, 51, 60, 65, 66]
• Access control security [4, 16, 21, 22, 42, 47, 50, 51, 60, 65, 66]
• Security awareness program [13, 18, 21, 22, 23, 32, 42, 50, 51, 60, 65]
• Corporate-wide IS/IT plan [13, 18, 21, 22, 23, 32, 42, 50, 51, 60, 65]
• IS/IT plan encompasses MIS and EUC [13, 18, 21, 22, 23, 32, 42, 50, 51, 60, 65]
• IS/IT plan reflects business strategies [13, 18, 21, 22, 23, 32, 42, 50, 51, 60, 65]
• Support for EUC [13, 18, 21, 22, 23, 32, 42, 50, 51, 60, 65]
• Training programs for end users [13, 18, 21, 22, 23, 32, 42, 50, 51, 60, 65]
• Information center support [13, 18, 21, 22, 23, 32, 42, 50, 51, 60, 65]
• Control of technology resides with users [21, 22, 23, 43, 47, 50, 60, 65]
• Users involved in planning [21, 22, 23, 43, 47, 50, 60, 65]
• Support provided for user management decision making (DSS, EIS) [21, 23, 43, 47, 50, 60, 65]
• Management and support of information resources is the responsibility of users [21, 23, 43, 47, 50, 60, 65]
• Distributed technology standards [23, 27, 43, 47, 53, 63, 65]
• Adherence to distributed technology standards [23, 27, 43, 47, 53, 63, 65]
• Cooperative processing and client/server applications [14, 23, 32, 38, 43, 65]
• Telecommunications between and within distributed and central facilities [14, 22, 23, 32, 37, 43, 65]
• Formal guidelines for systems analysis, design, development and implementation [5, 14, 22, 32, 43, 47, 60]
• Automated development tools [5, 14, 32, 43, 47, 60]
• Business/enterprise model [5, 14, 21, 43, 51]
• Documentation of corporate data flow [16, 21, 42, 47, 50, 51, 60, 65]
• Data/information inventory [5, 14, 21, 43, 51]
• Inventory of IT facilities [14, 21, 43, 51]
• Policy review/advice oversight committee [37, 47, 51, 53, 65]
• User participation oversight committee [37, 47, 51, 53, 65]
• Executive-level participation in oversight committee [37, 47, 51, 53, 65]

in an organization. The instrument employed each IRM activity as a response item. Respondents were asked to indicate the extent of implementation of each IRM activity on monotonically increasing six-point scales. The six-point scales ranged from "1," meaning "not at all," to "6," meaning "very great extent."

The questionnaire was then pretested using seven MIS professionals (one CIO, five data center managers, and an applications development director). Respondents remarked on the descriptions of the activities purported to comprise the IRM construct. They were asked to note any activities that should be added, deleted, or modified. They also commented on each item's meaningfulness and readability. Refinements to the instrument were made based on their suggestions, and two IRM activity items were added: disaster recovery plan and network integration.

A pilot test of the instrument was then undertaken with eleven members of a state chapter of the Society for Information Management (SIM). These participants were asked to evaluate the questionnaire in similar fashion to the pretest respondents.


In addition, these eleven respondents were asked to give an overall impression of the instrument's ability to capture the multidimensional nature of the IRM construct. The SIM members were given a copy of the definition developed in this study and asked to provide comments on it as well. They suggested no changes in the definition. Wording and format refinements to the instrument were made based on comments from the eleven respondents. Their responses suggested that the items adequately covered the content domain, as they made no suggestions for additional items.

The pretest and pilot test respondents were asked to comment on the meaningfulness of the six-point scales. The consensus of the respondents was that a "not at all" point was necessary because the scale measured amount of implementation. Further, the respondents noted that "very great extent" was sufficient to capture the opposite end of the spectrum because they felt that it was unlikely that IRM activities would be totally implemented.

The content validity of the measurement instrument was then investigated by executing a variation on the procedure developed by Lawshe [39] for quantitatively assessing content validity. This procedure employed a content evaluation panel of individuals knowledgeable about the concept being measured. The panel consisted of thirteen MIS professionals from both industry and academia. Ten industry panelists were selected from the Computerworld Premier 100 [9] top information executives, one from each of ten industries. The ten industry professionals encompassed seven CIOs and three data center directors. The average information systems/technology experience for the industry professionals was twenty-two years. Three faculty members from different universities, who had published articles on IRM, were also included on the panel. All three of the academicians had published at least two pieces on IRM.

The panelists responded to each activity item's relation to IRM on a three-point scale: "1 = not relevant"; "2 = important (but not essential)"; "3 = essential." From these data, a content validity ratio (CVR) was computed for each item from the following formula:

CVR = (n - N/2) / (N/2),

where n is the frequency count of the number of panelists rating the item as either "3 = essential" or "2 = important (but not essential)," and N is the total number of respondents.

Lawshe [39] only utilized the "essential" response category in the computation of the CVR. In the present study, a less stringent criterion was employed. Responses of both "important (but not essential)" and "essential" were utilized because they were positive indicators of the items' relevance to IRM. Table 2 presents the means of the items from the Lawshe procedure. All but three items exhibit means of two or greater, indicating that the panel considered them to be important to the IRM concept.
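Expressed as code, the computation is straightforward. The Python sketch below is illustrative only: the function name and the sample panel ratings are hypothetical, not the study's actual data, and it applies the relaxed endorsement criterion just described.

```python
def content_validity_ratio(ratings, endorsing=(2, 3)):
    """Lawshe's CVR = (n - N/2) / (N/2), where n counts endorsements.

    Per the relaxed criterion used in this study, both '2 = important
    (but not essential)' and '3 = essential' count as endorsements.
    """
    N = len(ratings)
    n = sum(1 for r in ratings if r in endorsing)
    return (n - N / 2) / (N / 2)

# Hypothetical ratings for one activity item from a 13-member panel.
panel = [3, 3, 2, 3, 2, 2, 3, 1, 3, 2, 3, 3, 2]
print(round(content_validity_ratio(panel), 2))  # 0.85 (12 of 13 endorse)
```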

The CVR for each item was evaluated for statistical significance at the 0.05 level, using the table published by Lawshe [39]. Statistical significance meant that more than 50 percent of the panelists rated the item as either "essential" or "important." According to Lawshe [39], this majority vote indicated some content validity for the item.


Table 2 Means of IRM Activity Items from Lawshe Procedure

Planning process for IS/IT incorporates end users  3.00
Data administration  2.85
Network integration  2.79
Corporate data architecture  2.78
Data security  2.78
Information technology (IT) integration  2.77
IS/IT plan incorporates central, distributed, and desktop domains  2.76
IS/IT plan reflects business goals  2.76
Standards for distributed information systems and technology  2.74
Plan for corporate-wide information systems and technology  2.72
Business continuity/disaster recovery plan  2.70
Access control security  2.69
Information technology architecture  2.69
Corporate policy on data ownership  2.69
Senior management participates in advisory committees  2.64
Data communications between central and distributed facilities  2.63
Data shared between users and departments  2.63
Data integration between applications  2.62
CIO involved in the corporate business planning process  2.62
Quality assurance program for information systems and facilities  2.55
CIO responsible for corporate-wide IS/IT policy  2.54
Application systems integration  2.54
Documentation for corporate-wide information flow  2.48
Corporate-wide adherence to IS/IT standards  2.47
Distributed facilities  2.46
Security awareness program  2.40
CIO approves corporate-wide IS/IT acquisitions  2.39
Formal support for end-user computing  2.38
CIO responsible for distributed information systems and technology  2.38
Inventory of company IT facilities  2.37
Inventory of corporate data and information  2.35
Formal methodology for systems development  2.35
Training programs for end users  2.32
Data dictionary  2.31
Users participate in advisory committees  2.30
Office automation capabilities  2.24
Communications integration  2.23
Systems and support for management decision making (DSS, EIS)  2.23
Assessment of potential for new technologies  2.22
Users support distributed IT facilities  2.15
IS/IT advisory/oversight committee(s)  2.09
Cooperative processing and client/server applications  2.08
Use of automated development tools  2.03
Strategic & functional business (enterprise) model  1.93
Control of IS/IT rests with end users  1.92
Information Center group  1.62


The three IRM activity items with means less than two and not significant at the 0.05 level were dropped from the final version of the questionnaire: information center group; control of information systems and technology rests with end users; and strategic and functional business (enterprise) model.

Data Collection

The final version of the instrument (see appendix) was sent to a target sample of senior computing managers in 470 different Fortune 1,000 companies. Respondents were asked to indicate the extent to which their firms had implemented each IRM activity on six-point scales ranging from "1," meaning "not at all," to "6," meaning "very great extent."

The target group represented a systematic sample, stratified by industry. The top computer executive was defined as the most senior employee in the company who oversees management information systems functions. Typical titles for these jobs included VP of Information Systems, Director of MIS, Director of Data Processing, and Chief Information Officer. Industries present in this population encompassed manufacturing, banking, finance, insurance, retail, transportation, utilities, and health services. A total of 102 questionnaires were returned from the first mailing, representing 21.7 percent of the target sample. A second mailing was conducted approximately four weeks later to the same sample. Known respondents to the first mailing were excluded from the second mailing. An additional 48 questionnaires (10.2 percent) were returned from the second mailing. The total number of questionnaires received from both mailings, 150, represented a 32 percent response rate. A total of 91 (61 percent) of the respondents requested that results of the study be sent to them.

Fifty-one percent of the respondents were the chief information officer (CIO) in their organizations. Sixty-five percent of the respondents reported that they had over ten years' experience in information systems. The mean MIS experience in the sample was 21.6 years. The companies in the sample encompassed a representative variety of industries. The most common industry was manufacturing, which was twice as predominant as any of the other industries, a finding typical of the Fortune 1,000 register. Companies in the return sample were relatively large, both in number of employees and in company revenue, again reflective of the Fortune 1,000 composition. Fifty-two percent of the firms reported more than 5,000 employees and 50 percent reported revenues over one billion dollars.

Nonresponse bias in the sample data was investigated by comparing the industry distribution in the returned questionnaires to the population industry distribution using a chi-square one-sample test. The computed chi-square statistic, testing the sample industry distribution against the population distribution, was not statistically significant at the 0.05 level. Thus, the industry distribution present in the sample was not significantly different from the industry distribution in the population. This finding suggested a lack of nonresponse bias in the returned questionnaires relative to industry affiliation.
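For concreteness, a one-sample chi-square test of this kind can be run with scipy as sketched below. The per-industry counts and population shares here are hypothetical stand-ins, since the paper does not report them.

```python
from scipy.stats import chisquare

# Hypothetical industry counts for the 150 returned questionnaires.
observed = [52, 20, 16, 14, 13, 13, 12, 10]
# Hypothetical industry mix of the 470-firm target population.
population_share = [0.32, 0.14, 0.11, 0.10, 0.09, 0.09, 0.08, 0.07]
expected = [p * sum(observed) for p in population_share]

stat, p_value = chisquare(f_obs=observed, f_exp=expected)
# A p-value above 0.05 would mirror the paper's conclusion of no
# detectable nonresponse bias with respect to industry affiliation.
print(stat, p_value)
```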


Instrument Assessment (Reliability and Validity)

Data collected from the survey were used to evaluate the validity and reliability of the measurement instrument. Construct validity is concerned with whether the measure reflects true dimensions of the concept or is influenced by the methodology [11]. In order to demonstrate construct validity, the instrument should reflect a reasonable operational definition of the concept it purports to measure [33]. Stone [62] noted that examining the components that make up the overall measure is a legitimate method for assessing construct validity. Likewise, Allen and Yen [1] suggested that the appearance of logical factors is one indication of construct validity for a measure. Straub [64] asserted that factorial validity was confirmation that the measure exhibited latent constructs. Spector [61] observed that factor analysis is a useful approach in the process of scale validation. Factor analysis has been utilized in several IS studies to examine this aspect of construct validity [40, 58, 64, 67].

Factor Analysis

An exploratory principal components factor analysis was conducted to assess the construct validity of the instrument in this study. As a first step, the factor analysis was run without specifying the number of factors to be extracted. The result was a solution with ten factors that exhibited eigenvalues greater than 1.0. Based on a graphical plot of the eigenvalues, it was estimated that eight factors should be in the final solution.

A second iteration of the factor analysis was run with eight factors stipulated and rotated using the varimax method. This solution explained 67 percent of the systematic covariance among the items. Items were dropped from further analysis if their factor loadings were less than 0.45 or if they loaded on two factors with loadings greater than 0.45. Four items were dropped based on these criteria: information technology architecture, corporate policy on data ownership, systems support for management decision making (DSS, EIS), and cooperative processing and client/server applications. The final thirty-nine-item, eight-factor model is reported in Table 3.
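A minimal numpy sketch of this procedure appears below: principal components extraction from the interitem correlation matrix, a scree-based choice of eight factors, varimax rotation, and the 0.45 retention rule. The response matrix X is random stand-in data rather than the study's survey responses, so the printed count is meaningless; only the mechanics are illustrated.

```python
import numpy as np

def varimax(loadings, max_iter=100, tol=1e-6):
    """Orthogonal varimax rotation (Kaiser) of an items-by-factors matrix."""
    k = loadings.shape[1]
    rotation, criterion = np.eye(k), 0.0
    for _ in range(max_iter):
        rotated = loadings @ rotation
        u, s, vt = np.linalg.svd(
            loadings.T @ (rotated**3 - rotated * (rotated**2).mean(axis=0))
        )
        rotation = u @ vt
        if s.sum() < criterion * (1 + tol):
            break
        criterion = s.sum()
    return loadings @ rotation

# Stand-in data: 150 respondents rating 43 items on the six-point scale.
rng = np.random.default_rng(0)
X = rng.integers(1, 7, size=(150, 43)).astype(float)

R = np.corrcoef(X, rowvar=False)           # interitem correlation matrix
eigvals, eigvecs = np.linalg.eigh(R)
order = np.argsort(eigvals)[::-1]          # sort eigenvalues, descending
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

n_factors = 8                              # chosen from the scree plot
loadings = eigvecs[:, :n_factors] * np.sqrt(eigvals[:n_factors])
rotated = varimax(loadings)

# Retention rule: keep items that load above 0.45 on exactly one factor.
strong = np.abs(rotated) > 0.45
print((strong.sum(axis=1) == 1).sum(), "items retained")
```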

The rotated factor solution met the following three criteria: simplicity [24, 58], interpretability [30, 40], and the percent of variance explained [2, 64]. The first factor (eigenvalue = 15.83) was labeled chief information officer and accounted for 34.4 percent of the covariance. The four items defining this factor, with loadings ranging from 0.80 to 0.88, represent a chief information officer who is the executive responsible for corporate-wide information technology policy, planning, management, and acquisitions. The internal consistency reliability coefficient for the chief information officer factor was 0.94.

The second factor (eigenvalue = 3.99) was labeled planning and accounted for 8.7 percent of the covariance. The eight items describing this factor, with loadings ranging from 0.46 to 0.67, relate to an inclusive information systems/technology planning process that reflects business goals, encompasses both central and distributed technologies, involves end users, and features a mechanism for assessing the potential of new technologies.


Table 3 Underlying Dimensions of the IRM Construct

(Varimax loadings are shown in parentheses after each item.)

Chief information officer (covariance explained = 34.4 percent / alpha = 0.94)
C1: CIO approves corporate-wide information systems and technology acquisitions (0.88)
C2: CIO is responsible for distributed information systems and technology (0.86)
C3: CIO is responsible for corporate-wide information systems and technology policy (0.85)
C4: CIO is involved in the corporate business planning process (0.80)

Planning (covariance explained = 8.7 percent / alpha = 0.87)
P1: Information systems/technology plan incorporates central, distributed and desktop domains (0.67)
P2: Planning process for information systems and technology incorporates end users (0.64)
P3: Users support distributed IT facilities (0.62)
P4: Plan for corporate-wide information systems and technology (0.57)
P5: Formal support for end-user computing (0.54)
P6: Training programs for end users (0.49)
P7: Information systems/technology plan reflects business goals (0.47)
P8: Assessment of potential for new technologies (0.46)

Security (covariance explained = 5.1 percent / alpha = 0.84)
S1: Access control security (0.82)
S2: Data security (0.82)
S3: Security awareness program (0.70)
S4: Business continuity/disaster recovery plan (0.60)

Technology integration (covariance explained = 4.6 percent / alpha = 0.75)
T1: Distributed facilities (0.67)
T2: Office automation capabilities (0.60)
T3: Communication integration (0.52)
T4: Network integration (0.49)
T5: Information technology integration (0.48)

Advisory committees (covariance explained = 4.1 percent / alpha = 0.83)
A1: Information systems and technology advisory/oversight committee(s) (0.76)
A2: Senior management participates in advisory committees (0.73)
A3: Users participate in advisory committees (0.70)

Enterprise model (covariance explained = 3.8 percent / alpha = 0.88)
E1: Data communications between central and distributed facilities (0.67)
E2: Inventory of company IT facilities (0.59)
E3: Formal methodology for systems development (0.52)
E4: Inventory of corporate data and information (0.51)
E5: Standards for distributed information systems and technology (0.50)
E6: Documentation for corporate-wide information flow (0.49)
E7: Use of automated development tools (0.46)
E8: Corporate-wide adherence to information systems and technology standards (0.46)

Information integration (covariance explained = 3.3 percent / alpha = 0.86)
I1: Application systems integration (0.87)
I2: Data integration between applications (0.84)
I3: Data shared between users and departments (0.67)

Data administration (covariance explained = 2.9 percent / alpha = 0.74)
D1: Data administration (0.69)
D2: Corporate data architecture (0.67)
D3: Quality assurance program for information systems and facilities (0.65)
D4: Data dictionary (0.46)

The internal consistency reliability coefficient for the planning factor was 0.87.

The third factor (eigenvalue = 2.36) was labeled security and accounted for 5.1 percent of the covariance. The four items comprising this factor, with loadings ranging from 0.60 to 0.82, depict a comprehensive security program that includes access control and data security, a security awareness effort, and a disaster recovery plan. The internal consistency reliability coefficient for the security factor was 0.84.

The fourth factor (eigenvalue = 2.11) was labeled technology integration and accounted for 4.6 percent of the covariance. The five items describing this factor, with loadings ranging from 0.48 to 0.67, delineate a comprehensive and integrated approach to information technologies, including computing, telecommunications, and office automation. The internal consistency reliability coefficient for the technology integration factor was 0.75.

The fifth factor (eigenvalue = 1.86) was labeled advisory committees and accounted for 4.1 percent of the covariance. The three items defining this factor, with loadings ranging from 0.70 to 0.76, refer to advisory committees that deal with systems and technology issues and include end-user and senior management participation. The internal consistency reliability coefficient for the advisory committees factor was 0.83.

The sixth factor (eigenvalue = 1.74) was labeled enterprise model and accounted for 3.8 percent of the covariance. The eight items defining this factor, with loadings ranging from 0.46 to 0.67, illustrate an enterprise model approach featuring documented business processes, a development methodology, inventories of facilities and information, corporate-wide technology standards, and the use of automated development tools. The internal consistency reliability coefficient for the enterprise model factor was 0.88.

The seventh factor (eigenvalue = 1.51) was labeled information integration and accounted for 3.3 percent of the covariance. The three items comprising this factor, with loadings ranging from 0.67 to 0.87, relate to integrated applications and data, with data shared among users. The internal consistency reliability coefficient for the information integration factor was 0.86.

The eighth factor (eigenvalue = 1.35) was labeled data administration and accounted for 2.9 percent of the covariance. The four items describing this factor, with loadings ranging from 0.46 to 0.69, include a data administration function, headed by a database administrator, based on a corporate data architecture, and utilizing a data dictionary,


with policies on data ownership. The internal consistency reliability coefficient for the data administration factor was 0.74.

These eight factors specify an operational definition of IRM and provide evidence supporting the construct validity of the instrument. The interitem correlation matrix for the questionnaire items is presented in Table 4.

Reliability

Reliability refers to the lack of measurement error in the items on a scale [33]. The reliability of the instrument in this study was determined by computing the internal consistency coefficient, Cronbach's alpha, for each of the dimensions determined from the factor analysis [12]. Nunnally [49] advised that a magnitude of 0.5 to 0.6 for the Cronbach alpha statistic is sufficient in the early stages of basic research, but that an alpha of 0.8 is more desirable. These coefficients for each of the factors discovered in this study are presented in Table 3. The alpha coefficients for all eight of the factors were above 0.6, with six above 0.8.
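Cronbach's alpha for a factor is computed from the variances of its items and of their sum. A minimal sketch follows, where the response matrix X and the column indices are assumptions for illustration rather than values taken from the study:

```python
import numpy as np

def cronbach_alpha(items):
    """Alpha = k/(k-1) * (1 - sum of item variances / variance of totals)."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_var / total_var)

# e.g., alpha for the four security items, where security_cols holds
# their (hypothetical) column indices in the response matrix X:
# alpha_security = cronbach_alpha(X[:, security_cols])
```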

Profile of IRM Implementation (Measurement Norms)

The mean and standard deviation were computed for each of the remaining thirty-nine activity items on the questionnaire, as shown in Table 5. The statistics in Table 5 provide a detailed perspective on the current state of IRM implementation for the firms in the sample. This information is presented as a benchmark for assessing the extent of IRM implementation in future studies, and as normative data for the future use of the measurement instrument developed in this study.

The larger the mean in Table 5, the more the activity was implemented. The activity that was the most implemented was "formal support for end-user computing," with a mean of 4.82. The item that was implemented the least was "documentation for corporate-wide information flow," with a mean of 2.87.

The degree of implementation for each of the eight factors may be calculated from the mean implementation of their underlying items. The larger the mean, the more the factor was implemented. The security factor was most implemented (4.53), followed in order by planning (4.17), information integration (4.03), chief information officer (3.97), technology integration (3.96), advisory committees (3.82), data administration (3.65), and enterprise model (3.64).
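A factor-level implementation score is simply the mean response over that factor's items. A small sketch of this calculation, with hypothetical column groupings and random stand-in responses in place of the Table 3 item assignments and the survey data:

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.integers(1, 7, size=(150, 39)).astype(float)  # stand-in responses

# Illustrative column groupings; the real indices come from Table 3.
factor_items = {
    "chief information officer": [0, 1, 2, 3],
    "security": [12, 13, 14, 15],
    "information integration": [25, 26, 27],
}

for name, cols in factor_items.items():
    # Mean over the factor's items, averaged across all respondents.
    print(f"{name}: {X[:, cols].mean():.2f}")
```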

Summary and Conclusions

PRACTICING MANAGERS OFTEN FEEL THAT THEY MUST INVEST in critically important information technologies to remain competitive, but they have no strong evidence of these investments' impacts on organizational bottom lines. A clear definition and operationalization of the IRM construct should help their understanding of the issues and areas to which their IT investments are targeted and the effects that these investments have on corporate performance.

Table 4 Interitem Correlation Matrix for the Questionnaire Items

[Table 4, a three-page matrix of interitem correlations among the thirty-nine questionnaire items, is not legibly recoverable from this copy; the individual coefficients are omitted.]

Table 5 Mean and Standard Deviation of IRM Questionnaire Items

Questionnaire Item  Mean (Std. dev.)

Formal support for end-user computing  4.82 (1.20)
Access control security  4.81 (1.00)
Data security  4.67 (1.04)
Data communications between central and distributed facilities  4.66 (1.24)
Business continuity/disaster recovery plan  4.40 (1.29)
Inventory of company IT facilities  4.36 (1.26)
CIO approves corporate-wide information systems and technology acquisitions  4.32 (1.81)
Assessment of potential for new technologies  4.29 (1.12)
Network integration (local area networks, corporate-wide networks, wide area networks)  4.24 (1.22)
Planning process for information systems and technology incorporates end users  4.21 (1.17)
Security awareness program  4.21 (1.23)
Information systems/technology plan reflects business goals  4.20 (1.25)
Information technology (IT) integration (data processing, computing, office automation, telecommunications)  4.19 (1.08)
Information systems/technology plan incorporates central, distributed, and desktop domains  4.17 (1.32)
Data shared between users and departments  4.14 (1.09)
Distributed facilities (minicomputers, network servers, workstations, microcomputers)  4.11 (1.08)
CIO is responsible for corporate-wide information systems and technology policy  4.10 (1.93)
Training program for end-users  4.07 (1.34)
CIO is responsible for distributed information systems and technology  4.01 (1.79)
Application systems integration  4.00 (1.01)
Users participate in advisory committees  4.00 (1.45)
Data integration between applications  3.95 (1.08)
Senior management participates in advisory committees  3.88 (1.58)
Formal methodology for systems development  3.87 (1.35)
Data administration (policies, standards, corporate oversight)  3.83 (1.19)
Office automation capabilities (text processing/desktop publishing/e-mail/scheduling/bulletin boards)  3.81 (1.17)
Plan for corporate-wide information systems and technology  3.80 (1.36)
Users support distributed IT facilities  3.78 (1.32)
Corporate data architecture  3.68 (1.26)
Corporate-wide adherence to information systems and technology standards  3.64 (1.29)
Quality assurance program for information systems and facilities  3.62 (1.17)
Information systems and technology advisory/oversight committee(s)  3.59 (1.50)
Standards for distributed information systems and technology  3.50 (1.34)
Communications integration (voice, data, text, image, video)  3.46 (1.40)
Data dictionary  3.45 (1.38)
CIO is involved in the corporate business planning process  3.45 (1.82)
Inventory of corporate data and information  3.20 (1.32)
Use of automated development tools (CASE, code generators)  3.00 (1.44)
Documentation for corporate-wide information flow  2.87 (1.31)


This study was undertaken in an effort to mitigate the confusion surrounding the IRM construct, which has led to multiple definitions and models, none of which is accepted as standard. In addition, the study extended previous work to refine the IRM construct and establish a coherent foundation upon which further research could be based.

The study contributes to the body of knowledge on the IRM construct in several ways. First, the study proposes a definition of the IRM construct. This definition will help to establish a coherent base of knowledge for the construct and to provide a theoretical basis for further research on the topic of IRM. Second, the research elicited an extensive list of IRM activities. Third, the study developed and tested a measurement instrument to operationalize the IRM construct. Fourth, the validity and reliability of the instrument were assessed and found to be acceptable. This assessment represents a rigorous attempt at measurement of the IRM construct. Finally, the instrument was also used to create reference norms that practicing managers may use to measure the extent of IRM implementation in their organizations.

The overall conclusion of this research is that IRM was found to be a viable construct: the domain of the IRM construct can be delineated, and the extent of IRM implementation can be reliably measured. Based on the IRM items identified through this research, a comprehensive theoretical model of the IRM construct was developed. This empirically derived theoretical model enhances the body of knowledge concerning IRM, as well as creating a basis for future research. The results of this research clearly indicate that the IRM construct is multidimensional. Specifically, there are eight dimensions that constitute the IRM construct, as determined through a content analysis of the IRM literature and refined based on a factor analysis of survey data. The eight dimensions are:

• A chief information officer, who is responsible for corporate-wide information technology: policy, planning, management, and acquisitions (chief information officer);

• An inclusive information systems/technology planning process that reflects business goals, encompasses both central and distributed technologies, involves end users, and features a mechanism for assessing the potential of new technologies (planning);

• A comprehensive security program that includes access control and data security, a security awareness effort, and a disaster recovery plan (security);

• A comprehensive and integrated approach to information technologies, including computing, telecommunications, and office automation (technology integration);

• Advisory committees that deal with systems and technology issues and include both senior management and users (advisory committees);

• An enterprise model approach featuring documented business processes, a development methodology, inventories of facilities and information, corporate-wide technology standards, and the use of automated development tools (enterprise model);

• Integrated data and applications systems, with data shared between users (information integration);

• A data administration function headed by a database administrator and based on a corporate data architecture, which utilizes a data dictionary and features policies on data ownership (data administration).

These eight dimensions reinforce previous literature on the topic of IRM. In addition, they demonstrate the complex, multifaceted nature of the concept and illustrate the difficulty of managing information and its associated technologies in modern organizations.
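Because the construct is multidimensional, an organization's responses are most naturally summarized per dimension rather than as a single number. Here is a minimal sketch, assuming hypothetical item indices for three of the eight dimensions (the actual item-to-dimension assignments come from the paper's factor analysis and are not reproduced here):

```python
import numpy as np

# Hypothetical item-to-dimension mapping: 0-based indices into the
# 43-item questionnaire in the Appendix; only three of the eight
# dimensions are shown, and the groupings are illustrative.
DIMENSIONS = {
    "chief information officer": [8, 9, 10, 11],
    "security": [19, 20, 21, 22],
    "planning": [23, 24, 25, 28],
}

def dimension_scores(ratings):
    """Score each dimension as the mean of its items' 1-6 ratings."""
    r = np.asarray(ratings, dtype=float)
    return {name: float(r[idx].mean()) for name, idx in DIMENSIONS.items()}

# Example: one organization's 43 item ratings (randomly generated here).
rng = np.random.default_rng(1)
org = rng.integers(1, 7, size=43)
print(dimension_scores(org))
```

A profile of eight such scores shows at a glance where an organization's IRM implementation is strong (e.g., security) and where it lags (e.g., data administration).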

Directions for Future Research

The measurement instrument developed in this study should undergo further testing. Multiple studies, especially aimed at further assuring the reliability and validity of the instrument, would lead to refinements and increased generalizability. In this process, specific industries could be targeted for in-depth analysis of the instrument.

The measurement instrument could be used in a longitudinal study to determine how the extent of IRM implementation changes over time. In addition, the differences in IRM implementation could be compared among industries, and between private sector and public sector organizations.

The most significant area for further research on IRM deals with appraising the value of the concept. Using the measurement instrument developed in this study, studies should be undertaken to relate IRM implementation to other organizational factors. It would be especially relevant to determine how IRM relates to, or determines, the effectiveness of an organization (e.g., financial performance measures such as profitability and net income). The IRM literature is replete with claims about the potential of IRM; however, almost all of these pieces are based on personal opinion and anecdote. No research has been conducted that empirically addresses the linkage between the IRM concept and organizational performance factors. Hard evidence should be produced that either verifies or refutes the value of the IRM concept. The measurement instrument developed in the present study provides a potential vehicle to accomplish such research.
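In its simplest form, such a study might relate an organization-level IRM implementation score to a financial performance measure. The sketch below uses entirely hypothetical data (the scores, the return-on-assets figures, and the sample size are invented for illustration) to show the basic correlation and regression step:

```python
import numpy as np

# Hypothetical sample: overall IRM implementation score (mean of the
# 1-6 item ratings) and return on assets (%) for eight organizations.
irm_score = np.array([3.1, 4.2, 2.8, 4.8, 3.9, 4.5, 3.3, 4.0])
roa = np.array([4.0, 6.5, 3.1, 8.2, 5.0, 7.4, 4.4, 5.9])

# Ordinary least squares fit of roa = b0 + b1 * irm_score,
# plus the Pearson correlation between the two variables.
b1, b0 = np.polyfit(irm_score, roa, 1)
r = np.corrcoef(irm_score, roa)[0, 1]
print(f"roa = {b0:.2f} + {b1:.2f} * irm_score (OLS fit), r = {r:.2f}")
```

A real study would of course require an adequate sample, control variables, and validated performance measures; the point of the sketch is only that the instrument yields a quantitative predictor suitable for such analysis.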

This research project extended previous seminal work to provide a foundation for further empirical research on IRM, with several research directions proposed. In addition, the results of this research can serve as a planning and/or diagnostic tool for IRM practitioners who wish to assess the extent to which IRM is implemented within their organizations, and as a guide for course and curriculum development by educators. The large investment in information technology and the degree of organizational dependence on information resources reinforce the value of this study.


APPENDIX: IRM Implementation Questionnaire

Please rate the extent to which the following activities and functions are CURRENTLY implemented in your company by circling one of the numbers in the scale to the right of each item. The numbers in the scale mean:

1 = not at all

2 = very little extent

3 = little extent

4 = some extent

5 = great extent

6 = very great extent

Activity/Function    Extent currently implemented

Information technology (IT) integration (data processing, computing, office automation, telecommunications)  1 2 3 4 5 6

Communications integration (voice, data, text, image, video)  1 2 3 4 5 6

Network integration (local area networks, corporate-wide networks, wide area networks)  1 2 3 4 5 6

Data integration between applications  1 2 3 4 5 6

Application systems integration  1 2 3 4 5 6

Office automation capabilities (text processing, desktop publishing, e-mail, directories, calendaring/scheduling, bulletin boards)  1 2 3 4 5 6

Distributed facilities (minicomputers, network servers, workstations, microcomputers)  1 2 3 4 5 6

Information technology architecture  1 2 3 4 5 6

CIO is responsible for corporate-wide information systems/technology policy  1 2 3 4 5 6

CIO is involved in the corporate business planning process  1 2 3 4 5 6

CIO is responsible for distributed information systems and technology  1 2 3 4 5 6

CIO approves corporate-wide information systems and technology acquisitions  1 2 3 4 5 6

Assessment of potential for new technologies  1 2 3 4 5 6

Quality assurance program for information systems and facilities  1 2 3 4 5 6

Data administration (policies, standards, corporate oversight)  1 2 3 4 5 6

Corporate data architecture (structure, framework, philosophy)  1 2 3 4 5 6

Corporate policy on data ownership  1 2 3 4 5 6

Data dictionary  1 2 3 4 5 6

Data shared between users and departments  1 2 3 4 5 6

Data security  1 2 3 4 5 6

Access control security  1 2 3 4 5 6

Security awareness program  1 2 3 4 5 6

Business continuity/disaster recovery plan  1 2 3 4 5 6

Plan for corporate-wide information systems and technology  1 2 3 4 5 6

Information systems/technology plan incorporates central, distributed, and desktop domains  1 2 3 4 5 6

Information systems/technology plan reflects business goals  1 2 3 4 5 6

Formal support for end-user computing  1 2 3 4 5 6

Training programs for end users  1 2 3 4 5 6

Planning process for information systems and technology incorporates end users  1 2 3 4 5 6

Users support distributed IT facilities  1 2 3 4 5 6

Systems and support for management decision making (DSS, EIS)  1 2 3 4 5 6

Cooperative processing and client/server applications  1 2 3 4 5 6

Data communications between central and distributed facilities  1 2 3 4 5 6

Standards for distributed information systems and technology  1 2 3 4 5 6

Corporate-wide adherence to information systems and technology standards  1 2 3 4 5 6

Formal methodology for systems development  1 2 3 4 5 6

Use of automated development tools (CASE, code generators)  1 2 3 4 5 6

Documentation for corporate-wide information flow  1 2 3 4 5 6

Inventory of corporate data and information  1 2 3 4 5 6

Inventory of company IT facilities  1 2 3 4 5 6

Information systems and technology advisory/oversight committee(s)  1 2 3 4 5 6

Users participate in advisory committees  1 2 3 4 5 6

Senior management participates in advisory committees  1 2 3 4 5 6
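To tabulate a completed questionnaire, the circled ratings can simply be averaged and compared against published reference norms. A minimal sketch (the helper names are hypothetical; the 43-item count matches the questionnaire above, and the example norm values are invented):

```python
from statistics import mean

def overall_irm_score(ratings):
    """Overall extent of IRM implementation: the mean of the 1-6 item ratings."""
    if len(ratings) != 43:
        raise ValueError("expected one rating per questionnaire item")
    if any(not 1 <= r <= 6 for r in ratings):
        raise ValueError("ratings must be on the 1-6 scale")
    return mean(ratings)

def z_vs_norm(score, norm_mean, norm_sd):
    """How many standard deviations an organization sits from a reference norm."""
    return (score - norm_mean) / norm_sd

# Example usage with invented norm values:
#   z_vs_norm(overall_irm_score(my_ratings), norm_mean=3.9, norm_sd=0.6)
```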
