8/3/2019 Qual Met and the Computer 3
http://slidepdf.com/reader/full/qual-met-and-the-computer-3 1/54
Corporate Web Sites 1
CONTENT ANALYSIS OF CORPORATE WEB SITES
Content Analysis of Corporate Web Sites: Methodological Foundations, Software Solutions, and Implementation Issues

Vincent J. Duriau
R.H. Smith School of Business
University of Maryland
College Park, MD 20742
Tel: 1-301-405-2162
Fax: 1-301-314-8787

Rhonda K. Reger
R.H. Smith School of Business
University of Maryland
College Park, MD 20742
Tel: [email protected]

Hermann Ndofor
R.H. Smith School of Business
University of Maryland
College Park, MD 20742
Tel: [email protected]
Under review at Organizational Research Methods, May 2000
The authors wish to thank Dante DiGregorio, Jon Eckardt, Anne Huff, and Don Knight for their comments and help on this paper. The feedback from three anonymous Academy of Management Meetings reviewers is gratefully acknowledged.
Abstract
This paper addresses the issues confronting management scholars in content analyzing
corporate web sites. First, we briefly review the methodological literature on content analysis.
Then, we examine the burgeoning field of computer-aided text analysis (CATA). To illustrate
the opportunities and challenges of computer-aided text analysis of corporate web sites, we apply
three of the best software packages among the 57 we studied to analyze a representative web site.
This comparison allows us to investigate some of the unique challenges posed when performing
content analysis on real data. We offer some suggested solutions. The goal of this paper is to
provide management researchers with an up-to-date and comprehensive introduction to the potential of, and issues associated with, this methodology and data source.
Corporate web sites provide management scholars an intriguing new data source. Firms
use web sites to communicate with stakeholders including customers, investors, employees,
suppliers, and the general public. They contain both qualitative and quantitative data on firms’
products, markets, and strategy. Many run to hundreds and even thousands of pages of text, graphics, audio, and video data, containing information on virtually every aspect of the corporation, its officers, history, and identity. Many sites also provide previously difficult-to-obtain executive speeches and press releases intended for investors and other publics.
To date, this rich data source has remained largely untapped in management research. In
preparation for a major study exploring issues of organizational identity, corporate reputation,
strategic action, and performance, we examine several fundamental questions. These questions
pertain to accessing corporate web sites and analyzing this vast store of data in a way that meets
the requirements of rigorous social science.
Content analysis, a class of methods at the intersection of qualitative and quantitative
traditions, is promising for this purpose. The past decade has seen an increasing scholarly
interest in qualitative methodologies to study complex business phenomena, borrowing and
adapting from more established disciplines (Miles & Huberman, 1994; Tesch, 1990). Therefore,
the first section summarizes fundamental concepts of content analysis.
The computer revolution has contributed to the proliferation of qualitative methodologies
(Weitzman & Miles, 1995; Kelle, 1995; Roberts, 1997; Tesch, 1991). New programs with
enhanced capabilities have made subtle analysis of large amounts of quantitative and qualitative
data possible (Gephart, 1993; Lissack, 1998). We review 57 of the leading computer-aided text
analysis (CATA) packages available today and use multidimensional scaling (MDS) to make
sense of this vast array. With the breathtaking pace of new feature introductions, we use the
MDS results to develop criteria that will help researchers evaluate future software releases.
Finally, we choose three CATA packages, one of the best from each of three major
categories. We pilot test each to determine the opportunities and challenges of using them with
real data to address management research questions. Thus, the third section of the paper
provides a test of dtSearch, TATOE, and NVivo using the web site of Sprint, the
telecommunications giant, as an illustration of the principles discussed in the first two sections.
We conclude with issues and opportunities of using corporate web sites for management
research.
Principles of Content Analysis
Basic Content Analysis
Definition of content analysis. A wide range of theoretical frameworks, methods, and
analytic techniques have been labeled content analysis. Shapiro and Markoff (1997) review six
major definitions from various sources in the social sciences. They then propose a minimal and
encompassing definition that we also adopt: “any methodological measurement applied to text
(or other symbolic materials) for social science purposes” (Shapiro & Markoff, 1997). We
believe that the Shapiro-Markoff definition provides an acceptable conceptual background for
our attempt to evaluate alternative approaches for analyzing corporate web sites.
Central to the value of content analysis as a research methodology is the recognition of
the importance of language in human cognition (Sapir, 1956; Whorf, 1944). The key assumption
is that analysis of text lets the researcher understand other people’s cognitive schemas (Huff,
1990; Gephart, 1993; Woodrum, 1984). At its most basic, word frequency has been considered
to be an indicator of cognitive centrality (Huff, 1990) or importance (Abrahamson & Hambrick,
1997). Scholars have also assumed that the change in the use of words reflects at least a change
in attention, if not in cognitive schema (Barr, Stimpert, & Huff, 1992). In addition, content
analysis assumes that words can be grouped to reveal underlying themes (Sussman, Ricchio, &
Belohlav, 1983), and co-occurrences of key words can also be interpreted as reflecting
association between the underlying concepts (Huff, 1990; Kabanoff, Waldersee, & Cohen, 1995).
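As a minimal illustration of these assumptions, word frequencies and within-sentence co-occurrences can be computed in a few lines. The sketch below is hypothetical: the sample text and the key-word set are invented, and a real study would use a validated coding dictionary.

```python
# Hypothetical sketch: word frequencies as a crude proxy for cognitive
# centrality, and sentence-level co-occurrence counts as a crude proxy
# for association between concepts. Sample text and key words invented.
import re
from collections import Counter
from itertools import combinations

text = ("Our mission is growth through alliances. "
        "Alliances and partnerships drive our growth strategy.")

sentences = [s for s in re.split(r"[.!?]\s*", text.lower()) if s]
words = re.findall(r"[a-z]+", text.lower())

frequency = Counter(words)              # centrality proxy
keys = {"growth", "alliances", "mission"}

cooccurrence = Counter()                # association proxy
for sentence in sentences:
    present = sorted(keys & set(re.findall(r"[a-z]+", sentence)))
    for pair in combinations(present, 2):
        cooccurrence[pair] += 1

print(frequency["growth"])                    # 2
print(cooccurrence[("alliances", "growth")])  # 2
```

Changes in these counts over time, per the assumption above, would then be read as shifts in attention.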
Challenges to content analysis. Scholars using content analysis have sought to draw the method out of its "methodological ghetto" and into the research mainstream (Roberts, 1997;
Woodrum, 1984). Despite its merits, which are discussed below, the use of content analysis
remains subsidiary in social science research and has often yielded studies of marginal quality
(Woodrum, 1984). The reasons for these concerns vary, but the majority arises from poorly
implemented studies, not from limitations inherent in the method. One concern is that research
procedures often lack documentation, making replication difficult and rendering results suspect.
Another source of concern is the under-utilization of reliability and validity tests, which creates
the serious risk of coder bias. Surprisingly, being at the frontier between qualitative and
quantitative research may also hinder rather than foster the development of content analysis due
to factionalism. Finally, in many studies, the content of the messages studied is disconnected
from the characteristics of the informants and the source materials, which introduces additional
ambiguity associated with the inferences drawn.
Benefits of content analysis. Content analysis offers significant advantages for social science research that, in our opinion, outweigh its potential limitations. Most of the limitations can
be minimized through carefully implemented research. Foremost, content analysis provides a
replicable methodology to access cognitive processes (Huff, 1990; Carley, 1997). Another key
strength is its ability to combine rich meaning associated with text with powerful quantitative
analysis. In that, it differs from other purely qualitative procedures such as hermeneutics and
literary interpretation (Kabanoff et al., 1995; Tesch, 1990). Third, longitudinal research designs
can be implemented due to the availability of documents over time (Fiol, 1995; Kabanoff, 1996;
Weber, 1990). Fourth, content analysis is applicable to a broad range of phenomena.
Applications in management have included corporate social responsibility, industrial accidents,
corporate values, managerial cognition, and organizational effectiveness (Huff, 1990; Wolfe,
Gephart, & Johnson, 1993). Finally, content analysis can be non-intrusive and therefore does
not suffer from researcher demand bias (Woodrum, 1984). (Note: this advantage only applies to
content analysis of existing texts or other source materials. It does not apply to content analysis
of interviews or open-ended responses to surveys). Being non-intrusive is particularly relevant
to management research and the study of senior executives where access to informants is a
serious research issue (Morris, 1994).
Several additional benefits have been noted in implementing content analysis (Kabanoff
et al., 1995; Woodrum, 1984). First, costs can be kept low and content analysis can easily be
used for small-scale studies with minimal requirements (Erdener & Dunn, 1990; Woodrum,
1984). In addition, the advent of CATA has greatly increased the cost-effective scalability of
content analysis to include quite ambitious projects, as discussed in the next section. Second,
content analysis is a “safe” methodology since the coding scheme can be corrected if flaws are
detected as the research proceeds (Tallerico, 1991; Woodrum, 1984). Third, when content
analysis is done correctly, it entails the specification of category criteria for reliability and
validity checks to foster the creation of a replicable database (Lissack, 1998; Woodrum, 1984).
Finally, content analysis can be used with other methods for the purpose of triangulation
(Erdener & Dunn, 1990; Kabanoff, 1996; Mossholder, Settoon, Harris, & Armenakis, 1995).
Reliability and validity. As with all methodologies, reliability and validity are the most
fundamental issues associated with content analysis (Huff, 1990; Morris, 1994; Weber, 1990).
To complicate matters, content analysis can be conducted at two levels (Erdener & Dunn, 1990).
At one level, the manifest content of the text is captured and revealed in a number of text
statistics. At a second level, the researcher is interested in the latent content and deeper meaning
embodied in the text. In general, reliability is easier to achieve at the manifest level, but validity
is likely to be higher at the latent level. To date, as the appendix to this paper documents, most
content analysis in management research has focused on manifest content.
While the implementation of content analysis varies considerably, there are
commonalities in the methodology that cut across the various approaches (Fielding & Lee, 1995;
Gephart, 1993; Carley, 1993; Kelle, 1995; Weber, 1990; Wolfe et al., 1993). The basic phases
of data collection, coding, analysis of content, and interpretation of results each introduce unique
reliability and validity concerns (Weber, 1990).
Regarding data collection, Weber (1990) identifies three critical sampling decisions.
When it is not possible to work with the entire population, researchers have to define their
information sources, select the type of documents for the project, and choose specific text within
these documents. In exemplar content analysis research, Gephart (1993, 1997) discusses in
great detail the theoretical rationale for the sampling decisions followed to assemble a
comprehensive and valid database from multiple sources. Lissack (1998, 484) introduced
“concept sampling, a new twist for content analysis” enabled by parsing software. This approach
will be further discussed in the CATA section.
Regarding coding, Weber (1990) suggests eight steps for creating and testing a coding
scheme to overcome concerns about researcher bias at this key phase in content analysis. The
Weber protocol is widely referenced in the literature and in CATA software developers’
documentation, sometimes with minor modifications (see, e.g., Wolfe, 1991). We therefore list
these steps in Table 1.
-------------------------------
Table 1 about here
-------------------------------
The development, refinement, and testing of the coding scheme is central to the quality of
textual analysis, especially for latent content analysis (Gephart, 1993). Weber (1990) provides
a comprehensive discussion and suggests numerous ways to address reliability and validity
concerns. Kabanoff (1996) observes that no consensus has emerged yet regarding a standard
procedure to establish coding reliability and validity. Morris (1994) particularly examines the
issues associated with computerized content analysis. These items are discussed in greater detail in the next section.
The third phase, analysis of content, shows considerable variation depending upon the
purposes of the research. For example, focusing on manifest content analysis, Weber (1990)
suggests four possible outputs of content analysis: key words in context lists, word frequency
lists, retrieval from the coded text, and category counts (see, also, Fielding & Lee, 1995; Morris,
1994).
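The first of these outputs, a key-word-in-context (KWIC) list, can be sketched in a few lines. This is a hypothetical illustration: the sample sentence, the function name, and the three-word window are our own choices, not a published specification.

```python
# Hypothetical sketch of a key-word-in-context (KWIC) list, one of the
# four manifest-content outputs Weber (1990) describes. Sample invented.
import re

def kwic(text, key, window=3):
    """Return each occurrence of `key` with `window` words of context."""
    tokens = re.findall(r"\w+", text.lower())
    hits = []
    for i, tok in enumerate(tokens):
        if tok == key:
            left = " ".join(tokens[max(0, i - window):i])
            right = " ".join(tokens[i + 1:i + 1 + window])
            hits.append(f"{left} [{key}] {right}")
    return hits

sample = "Sprint pursues growth. Our growth depends on new alliances."
for line in kwic(sample, "growth"):
    print(line)
```

Word frequency lists and category counts can be produced from the same token stream, which is why these outputs are usually bundled in a single package.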
The fourth phase of any content analytic research project is the interpretation of results
within the theoretical framework guiding the research endeavor. The intellectual product may be
measurement, description, or inference, depending on the research purpose (Tesch, 1990). For
example, researchers interested in manifest content can apply multivariate data analysis
techniques to numeric output culled from phase three (Weber, 1990). Delving into latent content
requires greater interpretation by the researcher. For example, researchers may choose to
qualitatively interpret results of textual analysis (Gephart, 1993; Gioia & Chitipeddi, 1991).
Weber (1990) identified two additional measurement problems that researchers interested
in analyzing corporate web sites must consider. First, he notes that it is necessary to control for
length when comparing across documents. This is always a concern, because a document that is
significantly longer than a comparison piece is likely to include a greater number of key words
based on length alone. Determining length is straightforward with discrete documents such as
transcription of a speech or a letter to shareholders. However, in corporate web sites, hundreds
of pages, often stored on different servers, are linked in such a way that it is often a judgment
call to define and compare documents. A second concern is that most content analysis gives
equal weight to each occurrence of a coding unit (e.g., key word or phrase). Again, this
assumption is suspect in all content analysis, but may be especially problematic when analyzing
web sites, because some are organized with significant redundancy to maximize "hits" from key word searches by search engines.
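One common way to control for length is to convert raw counts into rates, such as occurrences per 1,000 words. The sketch below is hypothetical: the two documents and the choice of 1,000 as the normalization constant are invented for illustration.

```python
# Hypothetical sketch: normalizing key-word counts by document length so
# that longer documents do not dominate cross-document comparisons.
import re
from collections import Counter

def rate_per_1000(text, key):
    """Occurrences of `key` per 1,000 words of `text`."""
    tokens = re.findall(r"\w+", text.lower())
    return 1000 * Counter(tokens)[key] / len(tokens)

short_doc = "Quality drives value. Quality wins."          # 5 words, 2 hits
long_doc = " ".join(["Quality matters."] + ["Filler text here."] * 66)

print(rate_per_1000(short_doc, "quality"))   # 400.0
print(rate_per_1000(long_doc, "quality"))    # 5.0
```

On a web site, the harder judgment call noted above remains: deciding which linked pages constitute one "document" before any such rate can be computed.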
Computer-aided Text Analysis (CATA)
Definition of computer-aided text analysis. Several definitions of computer-aided text
analysis have been proposed (see, e.g., Kabanoff, 1997; Mossholder et al., 1995; Wolfe et al.,
1993). Since multiple methodologies and technologies have been included under the CATA
rubric, we have adopted Wolfe et al.’s inclusive definition: CATA is constituted by software
programs that “facilitate the analysis of textual data” (Wolfe et al., 1993, 638). Most authors use
the expression computer-aided textual analysis (CATA) because its short form is preferred to the
unfortunate acronym for computer-aided content analysis. Nonetheless, the terms are
interchangeable.
Strengths of CATA. One way to understand the contribution of computers to content
analysis is to contrast computer-aided and manual approaches (Kelle, 1995). In addition to
easier data manipulation, the use of software affords several analytical advantages that greatly
enhance the methodology. First, computerization allows the manipulation of large data sets
(Gephart, 1991; Lissack, 1998; Morris, 1994; Wolfe et al., 1993). The complexity and potential
interrelationships of concepts increase exponentially with the quantity of data. Software
programs offer features for organizing, searching, retrieving, and linking data that render the
process of handling a large project much more manageable. For instance, Lissack (1998)
describes how parsing software can be used to sample concepts from a large corpus. This
sampling approach allows the researcher to content analyze a reasonable amount of data
representative of the initial corpus. Second, computers reduce the time and cost of undertaking
content analysis projects (Mossholder et al., 1995). Time savings using computers stem from the
minimization of the coding task, the reduction in coder training, the elimination of inter-rater
checks, and the ease of running multiple analyses (Carley, 1997). Third, the analytical flexibility
afforded by CATA is a recurring theme in the literature (Gephart, 1997; Tesch, 1991).
Importantly, the use of computers addresses many of the reliability concerns associated with
manual coding (Morris, 1994; Gephart & Wolfe, 1989; Wolfe et al., 1993). Coding rules are
made explicit, which ensures perfect reliability and comparability of results across texts.
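The determinism of explicit coding rules can be illustrated with a simple dictionary-based coder: given the same category dictionary, every run over the same text yields identical category counts. The categories, terms, and sample document below are invented for illustration.

```python
# Hypothetical sketch of explicit, deterministic coding rules: a category
# dictionary applied identically to every text, so re-running the coder
# always yields the same counts. Dictionary and sample text invented.
import re
from collections import Counter

CATEGORIES = {
    "cooperation": {"alliance", "alliances", "partnership", "partnerships"},
    "growth": {"growth", "expansion", "acquisition", "acquisitions"},
}

def code(text):
    """Count tokens of `text` falling into each category."""
    tokens = re.findall(r"[a-z]+", text.lower())
    counts = Counter()
    for cat, terms in CATEGORIES.items():
        counts[cat] = sum(1 for t in tokens if t in terms)
    return counts

doc = "Our growth strategy relies on alliances and one major acquisition."
print(code(doc)["growth"])        # 2
print(code(doc)["cooperation"])   # 1
```

What the computer cannot do, of course, is decide whether the dictionary itself is valid; that remains a human judgment.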
There are encouraging results that the semantic validity possible with manual coding
using multiple coders can be achieved at lower overall cost with CATA (Kabanoff, 1996; Kelle,
1995; Morris, 1994). Morris (1994) tested the validity and reliability of manual and
computerized approaches. Using mission statement data from Pearce and David (1987), she
compared the outcome of computerized coding in ZyIndex, a text management software, with
that of a panel of six graduate business students. She found that results from ZyIndex and the
human coders agreed at an acceptable level and that computerized coding yielded an acceptable
level of semantic validity (Morris, 1994).
The use of network concepts has been one of the most exciting developments of the past
few years in CATA research (Kelle, 1995). New linkage features between text, memos, and
codes such as hyperlinks and graphical tools apply to the areas of theory building, hypothesis
testing, and integration of qualitative and quantitative analysis. These developments of CATA
seem particularly apt to quell concerns about the decontextualization of results that is inherent to
a methodology based on coding and retrieval (Dey, 1995; Prein & Kelle, 1995). Gephart (1993)
also observed the methodological fit of computer-aided text analysis with grounded theorizing
because information can be retrieved in meaningful ways that allow for the emergence of
grounded hypotheses (see, also, Wolfe et al., 1993).
Limitations of CATA. There are still a number of questions associated with the use of
CATA (Carley, 1997; Gephart, 1991; Morris, 1994; Tallerico, 1991). The debate regarding
manifest versus latent content comes first and foremost (Woodrum, 1984). Although CATA
software is increasing in sophistication, measuring content with computers will miss some latent
aspects within texts such as tone or irony of expression (Morris, 1994). However, human coders
also exhibit low reliability for latent content. Further, we believe that the significance of these
problems may be over-estimated for business texts appearing on corporate web sites. These sites
usually are written for clarity because they are read by people from around the globe. Still,
validity may be a particularly critical issue when dealing with metaphors, homonyms,
colloquialisms (Carley, 1997), and other aspects of natural language (Morris, 1994).
There are three additional issues in implementing CATA. First, retrieval capabilities may
be insufficient in certain software categories (Gephart & Wolfe, 1989). Second, researchers
should avoid the pitfalls of the false sense of security afforded by a computer and the
justification fallacy of using a computer program (Tallerico, 1991). Computerization will never
replace human judgment in all cases. Finally, the fragmentation of the field of CATA software
makes the choice of the appropriate package difficult (Kabanoff, 1996).
Categorization of CATA software. Several interesting CATA typologies have been
proposed. Tesch (1991, 1990) introduced a typology based on the two dimensions of
methodology and technology that has been adapted by others (see, e.g., Wolfe et al., 1993;
Roberts, 1997). She made the now-blurring distinction between commercial and academic
software (Tesch, 1991; Tesch, 1990). Then, she categorized the academic packages. More
recently, Weitzman and Miles (1995) established a practical list of types of CATA packages,
ranging from simple to more complex programs. Their typologies are summarized in Table 2.
-------------------------------
Table 2 about here
-------------------------------
In his introduction to the Journal of Organizational Behavior ’s special issue, Kabanoff
(1997) suggested a typology of CATA-based management research using two dimensions: data
sources and analytic methods. Data sources can be evoked – collected by the researcher in
questionnaires or surveys, or natural – documents such as annual reports or newspaper articles.
The second dimension ranges from quantitative to qualitative (Kabanoff, 1997, 1996).
We used dimensions from these three typologies and additional practical considerations
to compare 57 software packages referenced on the Georgia State University web site on content
analysis (http://www.gsu.edu/~wwwcom/content/csoftware/software_menu.html). For that
purpose, we conducted a multidimensional scaling (MDS) analysis. While it is difficult to list all
CATA packages, the GSU site is the most exhaustive listing that we could find. Altogether, we
uncovered 70 software packages in the content analysis literature, some of which are unavailable today.
Among the 57 CATA packages, 24 were excluded from the MDS analysis because they did not
operate on Windows 95 or more recent platforms, were no longer supported, had a broader
functional scope, or did not have an English version.
We then established criteria on which to evaluate these packages. The choice of features
was based on extant typologies (Kabanoff, 1997; Tesch, 1991; Weitzman & Miles, 1995), the
purpose of our research project, as well as a review of the documentation downloaded from
CATA software suppliers’ web sites. A comprehensive database of CATA packages was
compiled using web sites’ information, software demo versions, users’ manuals, and contacts by
e-mail with software suppliers when needed. Due to its size, the complete database cannot be
rendered in this paper, but an evaluation of dtSearch, TATOE, and NVivo according to 30 major features is shown in Table 3.
-------------------------------
Table 3 about here
-------------------------------
To prepare the data for statistical processing, we indicated the presence of each feature in
a software package by a binary count. To clarify the MDS analysis, features were then grouped
in seven categories: price, technical support, data input, text management, content analytical
functionalities, statistical analysis, and data output. Consistent with extant typologies, we
maintained the distinction between the functionalities of searching, basic text statistics, content
analysis, and qualitative data analysis.
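Our scaling procedure can be sketched with classical (Torgerson) MDS applied to a small binary package-by-feature matrix. The three toy "packages" and four "features" below are invented stand-ins; our actual analysis used the 33 remaining packages and the feature categories described above.

```python
# Hypothetical sketch of classical (Torgerson) multidimensional scaling
# on a binary package-by-feature matrix (1 = feature present).
# The toy packages and features are invented for illustration.
import numpy as np

X = np.array([[1, 1, 0, 0],   # search-oriented package
              [1, 0, 1, 1],   # quantitative package
              [0, 0, 1, 1]])  # qualitative package

# squared Euclidean distances between feature profiles
sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1).astype(float)

# double-centering: B = -1/2 * J * D^2 * J
n = sq.shape[0]
J = np.eye(n) - np.ones((n, n)) / n
B = -0.5 * J @ sq @ J

# two-dimensional coordinates from the top eigenpairs
vals, vecs = np.linalg.eigh(B)
order = np.argsort(vals)[::-1][:2]
coords = vecs[:, order] * np.sqrt(np.clip(vals[order], 0, None))

print(coords.shape)   # (3, 2)
```

With more packages, the stress and R² statistics reported below summarize how faithfully such low-dimensional coordinates reproduce the original inter-package distances.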
The results of the MDS analysis confirm and enrich the typologies previously discussed.
First, the two-dimension analysis reveals several categorization dimensions: quantitative-
qualitative, standard-advanced, unbundled-integrated, and economic-expensive (see Figure 1).
Second, three clusters differentiating qualitative, quantitative, and search CATA packages also appear, consistent with Tesch's (1991) and Weitzman and Miles' (1995) typologies.
These results counter our expectations that the market for CATA packages was converging and
that the segmentation valid ten years ago was blurring. In addition, CATPAC, SphinxSurvey,
and TextSmart cluster in a new group of integrated content analysis packages. While the use of
three dimensions improves the quality of the MDS analysis (stress = 0.158 and R² = 0.901 versus stress = 0.254 and R² = 0.814), we show the results for the two-dimension model, which provides a
simpler taxonomy of CATA software packages for practical purposes.
-------------------------------
Figure 1 about here
-------------------------------
Content Analysis Software Solutions
The management researcher will be confronted with the same complexity when
implementing CATA as encountered in manual content analysis (Tesch, 1991). We chose to
focus on three packages representative of the two major schools of content analysis (manifest
content) and qualitative data analysis (latent content), and also added the third category of search
(knowledge management) uncovered in our MDS analysis of CATA packages. We selected
dtSearch, based on previous scholarly evaluation of knowledge management software (Knight,
1999). TATOE represents the content analysis category. This software has long enjoyed a strong reputation among researchers (Weitzman & Miles, 1995). The recent development of a version for
Windows 95 (and higher) and the HTML page import feature make it particularly appealing.
NVivo, one of the qualitative data analysis packages, is ideal for latent content analysis and
theory development. This package is the most recent offering of the Australian firm QSR and
represents a step forward from their leading NUD*IST software in terms of modeling and
hypothesis testing.
Testing of CATA Packages
Sprint web site. We chose to test the packages in the wireless telecommunications industry.
Wireless telecommunications is a high velocity industry characterized by fast technology change,
sweeping deregulation, intense competition, and massive globalization (Economist, 1999). Our
preliminary investigation showed significant sophistication, but also variance, of corporate web
sites.
First, we used Alexa (http://www.Alexa.com) to obtain basic statistics on Sprint’s web
site (http://www.sprint.com). Alexa is a freeware software package that operates in conjunction
with web browsers to provide information on the web sites visited by the user. According to
Alexa, the Sprint web site had 208 pages and links into 14,236 other web sites on December 30,
1999. Top related links included US West, Southwestern Bell, MCI, Concert Global
Communications, AT&T, Ameritech, Sprintbiz, France Telecom, Deutsche Telekom, and
BellSouth. Alexa also provides ratings on a five-point scale for speed, freshness, and overall
quality. On these dimensions, the Sprint web site scored 3, 5, and 3 respectively, with 5 being
the top rating.
We downloaded relevant files from the Sprint web site as pilot data to input into our three
CATA packages. The first challenge was to identify a specific date on which to download files.
Some documents on company web sites are rarely updated (e.g., organizational history,
statement of values), but others are updated as often as daily (e.g., press releases, “what’s new”).
Especially in high velocity environments like the wireless telecommunications industry, data
comparability across firms can best be achieved by downloading all files of interest in close
temporal proximity.
We searched the site following a pre-defined protocol in order to locate documents
containing concepts of interest. For example, we used the key words ‘mission’, ‘vision’,
‘values’, ‘purpose’, and ‘culture’ to identify pages relating to ‘corporate culture’. Each
document was downloaded as a separate MS-Word file. By saving each occurrence as a separate
document, each can be analyzed independently and the richness of the original data is
maintained. This procedure also makes it easier to re-verify the validity of each result as
research progresses, especially if additional key words are identified. In total, 65 documents
were downloaded.
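The search protocol above can be sketched as a key-word filter over already-downloaded pages. In this hypothetical example the page names and contents are invented, HTML tags are stripped naively, and each matching page would then be saved as its own document; in practice the pages would first be fetched from the live site on the chosen date.

```python
# Hypothetical sketch of the pre-defined search protocol: scan a set of
# downloaded pages for key words tied to a concept ('corporate culture')
# and keep each matching page as a separate document. Pages invented.
import re

CULTURE_KEYS = {"mission", "vision", "values", "purpose", "culture"}

pages = {
    "about.html": "<h1>Our Mission</h1><p>Connecting people everywhere.</p>",
    "jobs.html": "<p>Open positions in engineering.</p>",
    "values.html": "<p>Integrity is one of our core values.</p>",
}

def strip_tags(html):
    """Naive tag stripper; real sites need a proper HTML parser."""
    return re.sub(r"<[^>]+>", " ", html)

matches = {
    url: strip_tags(html)
    for url, html in pages.items()
    if CULTURE_KEYS & set(re.findall(r"[a-z]+", strip_tags(html).lower()))
}

print(sorted(matches))   # ['about.html', 'values.html']
```

Saving each match as a separate file, as described above, preserves the ability to re-verify results when new key words are added to the protocol.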
Evaluation of dtSearch. DtSearch is a knowledge management search tool that sells for
$199/copy. The software has a broad base of users worldwide and support resources are
available through the company web site, telephone, and e-mail contact at the firm’s headquarters
in Arlington, Virginia (http://www.dtsearch.com). DtSearch is designed for high-performance
search in large databases. It is an excellent program for structured content analysis of manifest
content, especially when searching very large text data sets. In addition to lightning speed, it
offers a wide array of search features.
DtSearch uses a Microsoft window-based menu system that immediately provides a
familiar interface to the user. This is an advantage over TATOE, which features a custom interface. To use dtSearch, files must first be loaded into the program. DtSearch refers to this
operation as indexing. Each time a file is loaded into dtSearch, the software builds an index of
the location of all the words in the file. There is no limit to the number of files that dtSearch can
process in a single index or the number of indexes that can be built. Indexing tremendously
reduces searching time.
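The speed advantage of indexing can be illustrated with a minimal inverted index: each word is mapped once to the files containing it, so a subsequent query becomes a dictionary lookup rather than a scan of every file. The file names and contents below are invented; dtSearch's own index format is, of course, far more sophisticated.

```python
# Hypothetical sketch of an inverted index: one pass over the files at
# indexing time, then each search is a constant-time lookup. Files invented.
import re
from collections import defaultdict

files = {
    "identity.txt": "Sprint identity and brand values",
    "merger.txt": "The merger created new brand opportunities",
}

index = defaultdict(set)
for name, text in files.items():
    for word in set(re.findall(r"[a-z]+", text.lower())):
        index[word].add(name)

def search(word):
    """Return the sorted list of files containing `word`."""
    return sorted(index.get(word.lower(), set()))

print(search("brand"))    # ['identity.txt', 'merger.txt']
print(search("merger"))   # ['merger.txt']
```

The one-time indexing cost is repaid across every subsequent query, which is why indexing "tremendously reduces searching time" in large collections.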
DtSearch prompted us to create our first index folder corresponding to all 65 files, which
we called Sprint. We could load files either individually or grouped in folders. This latter option was time-effective in indexing our Sprint files, which were archived in folders named 'identity',
‘acquisitions’, ‘merger’, ‘alliances’, ‘partnerships’, ‘joint-ventures’, and ‘consortia’. In addition,
dtSearch offers a file type option that allowed us to discriminate between duplicate files saved in
different formats – DOC and TXT in this instance.
Once all the necessary files were indexed, we tested the main function – searching. The
search screen is divided into three parts. One window lists all the existing indexes, while
another window shows a word list for the selected index. For example, we had 65 occurrences
of ‘acquisition’, 19 of ‘acquisitions’, 128 of ‘merger’, and 18 of ‘mergers’ in the Sprint index.
The final window is an easy and intuitive interface to the search functions.
In addition to the basic search functions such as phrase (natural language) search,
Boolean operators, proximity searching, wildcards and placeholders, dtSearch offers more
advanced and user-friendly features that we found invaluable in analyzing the Sprint web site.
These features include:
1. Stemming lets the researcher find words with identical roots; e.g., a search for ‘alliance’ would
also find ‘alliances’.
2. Fuzzy searching finds terms even if they are misspelled. The 'fuzziness' of a word
can be adjusted from 0 to 10 to specify the level of dissimilarity tolerated. In using this
feature, we found that higher levels of fuzziness led to unintended search results. For
example, at a fuzziness of 3, a search for ‘vision’ found occurrences of ‘mission’, ‘lotion’, and
‘potion’ in addition to ‘vission’ and ‘vition’, two misspellings we inserted to test this feature.
3. Thesaurus/concept searching offers automatic or customizable synonym expansion. The
provided thesaurus can be browsed to selectively choose and add words to a search query.
For example, we customized a search for ‘alliances’ to include ‘partnerships’. This feature
made it much easier to search simultaneously for multiple words representing one concept.
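To make these search modes concrete, the following sketch (our own Python illustration, not dtSearch code) mimics stemming and fuzzy matching on a small word list; the similarity cutoff here plays the role of dtSearch's 0-10 fuzziness setting:

```python
import difflib

WORDS = ["alliance", "alliances", "vision", "mission", "vission"]

def stem_matches(query, words):
    # Crude stemming: a word matches if its root starts with the query's root.
    root = query.rstrip("s")
    return [w for w in words if w.rstrip("s").startswith(root)]

def fuzzy_matches(query, words, cutoff=0.8):
    # Fuzzy search: accept words whose similarity ratio exceeds the cutoff.
    return difflib.get_close_matches(query, words, n=10, cutoff=cutoff)

print(stem_matches("alliance", WORDS))      # singular and plural forms
print(fuzzy_matches("vision", WORDS, 0.7))  # also catches the misspelling 'vission'
```

Lowering the cutoff reproduces the behavior we observed in dtSearch: at 0.7 the search for 'vision' also returns 'mission', just as fuzziness 3 did in our test.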
DtSearch offers a variety of options to handle search results. The documents and search
results can be viewed on screen in either a horizontal split window, a vertical split window, or
two separate windows. One window shows the location of the files containing the key words;
another displays the hits within a selected file. In addition, search results can also be
exported to DBF, CSV (for spreadsheet or database use), and RTF (for word processing use) by
using the 'save as' function. These features facilitate moving from qualitative data analysis to a
statistical analysis package such as SPSS.
DtSearch's forte is its speed and ability to search large databases containing gigabytes of
text. However, unless a researcher is interested in data mining massive databases, the speed
advantage is a barely perceptible split second. As such, we think dtSearch's appeal for
management researchers is in its user friendliness and search functions. In addition, it supports
most popular file formats including word processing, HTML, PDF, ZIP, and Eudora messages.
This saved up front time that would otherwise have been used to convert files into compatible
formats. For example, since our Sprint files were already in MS-Word, they were directly
indexed in dtSearch without having to be resaved in an intermediary format. At the functional
level, dtSearch's usefulness to a management researcher is limited to data search functions. As
such, it would be most useful (if not the only one useful) for structured content analysis where
concepts, models, and relationships have already been identified and designed. We could
envision a research project that develops grounded theory using a sample of data in a more
advanced product such as NVivo or even TATOE, and subsequently tests the model in larger
data sets using a program such as dtSearch.
Evaluation of TATOE. The second package we pilot tested was TATOE, or Text
Analysis Tool with Object Encoding for computer-assisted text analysis. This package was
selected to represent the content analysis category, useful for quantitative measurement and
description of manifest content. It was developed in VisualWorks Smalltalk, which renders it
quite user-friendly, although it provides a different look-and-feel from standard Microsoft
windowing. TATOE is the result of a collaboration of Melina Alexa of ZUMA and Lothar
Rostek of GMD/IPSI, both based in Germany. The Windows 95 version, which is still in Beta
testing, is free for researchers. It can be downloaded from the developers’ web site at
http://www.darmstadt.gmd.de/~rostek/tatoe.htm.
Because TATOE is launched by opening a .bat file in the directory in which the
rest of the TATOE files are located, it is impossible to create shortcuts or use a Windows 'Start'
menu item to open the program. TATOE opens with the Notebook screen, which provides
access to seven major functions: ‘Textbase’, ‘Scheme’, ‘Styles’, ‘Pattern’, ‘List’, ‘Import’, and
‘Export’. The other major TATOE screen displays the content analytical features.
Related files or documents in a TATOE project are held within a given Textbase. This is
equivalent to an index in dtSearch or a document set in NVivo. The Textbase screen serves two
functions. First, it allows the creation of Textbases and second, it provides summary statistics
for a selected Textbase. Once the Textbase is created, the ‘Import’ option can be used to load
documents. After we had created a Textbase called Sprint and imported all the documents, the
screen correctly indicated the following statistics: 65 texts, 635 paragraphs, and 1,225kb of
memory used.
TATOE provides the option to import texts, schemes, categories and styles. We did not
test this feature, but it is useful for research utilizing complex coding schemes. The ‘Import’
option recognizes ASCII text, XML files, or HTML pages. In particular, the software recognizes
markers in a saved HTML document which, if present, will help extract portions of text and
display other elements. This feature allowed us to work directly with HTML pages saved from
the Sprint web site, rather than use MS-Word files.
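For packages that accept only plain text, saved web pages must first be stripped of their markup. A minimal sketch of this preprocessing step, using only the Python standard library (our own illustration, not part of TATOE):

```python
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collect the text content of an HTML page, ignoring tags and scripts."""
    def __init__(self):
        super().__init__()
        self.chunks = []
        self.skip = False
    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self.skip = True
    def handle_endtag(self, tag):
        if tag in ("script", "style"):
            self.skip = False
    def handle_data(self, data):
        if not self.skip and data.strip():
            self.chunks.append(data.strip())

# A hypothetical fragment of a saved corporate web page.
page = "<html><body><h1>Sprint</h1><p>Our mission and vision.</p></body></html>"
parser = TextExtractor()
parser.feed(page)
print(" ".join(parser.chunks))  # "Sprint Our mission and vision."
```

A researcher can run such a script over a directory of saved pages to produce the plain-text corpus that text-only packages require.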
Once the documents have been imported into the Textbase, the next step is to set up the
coding categories in the ‘Scheme’ screen. In the case of the Sprint Textbase, we created the
categories of ‘mission’, ‘vision’, ‘alliances’, and ‘identity’ as one scheme. Categories can be
configured with various styles (fonts, colors), which enhance the visual presentation of the coded
sections of the text on TATOE’s screen, a helpful feature for hybrid human/automated coding.
After completing the categories, we began coding and analyzing the files from TATOE’s
screen. The text is loaded into the central window and previously defined categorization
schemes are used to code the text. TATOE provides standard search options including Boolean,
sequence, proximity, and wildcard searches. Several options are available such as sorting by
name, frequency, number of paragraphs, and number of texts. Counts are possible for coded text
segments as well as individual words, and can be disaggregated by category. Concordances can
be displayed for a selected word, a whole category, a coded text segment, a user-defined search,
and an occurrence list. Co-occurrences (words or phrases) on the left and the right of a word or
string of words may be created. Finally, going back to the Notebook screen, the results of the
word search and coding can be exported using the ‘Export’ function as either an XML file (for
text) or an SPSS file (for statistics).
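The concordance and co-occurrence displays described above can be approximated in a few lines; the sketch below (our own illustration with hypothetical text, not TATOE's implementation) builds keyword-in-context lines and counts the words adjacent to each occurrence:

```python
from collections import Counter

def kwic(text, keyword, window=3):
    # Keyword-in-context: the keyword with `window` words on either side,
    # plus a count of the neighboring words (a simple co-occurrence tally).
    words = text.lower().split()
    lines, neighbours = [], Counter()
    for i, w in enumerate(words):
        if w.strip(".,") == keyword:
            left = words[max(0, i - window):i]
            right = words[i + 1:i + 1 + window]
            lines.append(" ".join(left + [w] + right))
            neighbours.update(x.strip(".,") for x in left + right)
    return lines, neighbours

text = "Sprint formed an alliance in Europe. A second alliance followed in Asia."
lines, neighbours = kwic(text, "alliance", window=2)
print(lines)
print(neighbours.most_common(3))
```

The concordance lines support qualitative inspection of context, while the neighbour counts feed quantitative co-occurrence analysis.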
Based on our test, we conclude that this version of TATOE provides a comprehensive
package of content analytic features of noteworthy user-friendliness. Although it does not use
Microsoft’s windowing, its object-oriented design makes it easy to manipulate. However,
researchers developing specialized search protocols would probably prefer dtSearch or would
favor the more sophisticated coding capabilities of NVivo if engaged in complex modeling
activities. In addition, we were concerned about using a software package still in beta testing.
While we encountered a few minor difficulties, such as persistent errors in importing files, we did
not encounter any major bugs.
Evaluation of NVivo. NVivo was introduced by QSR in 1999. NVivo was chosen to
represent sophisticated content analytic packages for uncovering latent content and sophisticated
coding of complex causal and argument mapping. Although positioned as “different” from
QSR’s leading NUD*IST program, it appeared to be an extension of NUD*IST. The standard
version sells in the United States for $701 (single copy educational price = $425). The software
benefits from significant technical support and training resources and is distributed worldwide by
Scolari/Sage Publications Software.
On opening NVivo, the researcher notices the sensitized (hyperlinked) buttons in the
welcome screen that are used to navigate the program. It became obvious as we familiarized ourselves with
the program that this approach (as opposed to menu bars that are also present) provides a more
interactive interface for researchers as they progress in the analysis. The welcome screen has
four buttoned options to create a new project, open a project, open a tutorial project or exit
NVivo. Creating a new project activates a wizard with the options for a typical setup or a
custom setup. A typical setup prompts for the name and description of a new project. Selecting
a custom setup provides additional options for project administration, which are valuable on
large projects involving participants with different degrees of research authority.
After a new project is created, or an existing project is opened, the user is taken to the
project pad screen, which is NVivo’s main screen. It contains two page markers at the top that
lead to the two main systems of NVivo. The document system is the area where the project's
documents reside. The node system houses the concepts, ideas, and themes developed by the
researchers. For example, in this project, the node system contained such nodes as ‘vision’,
‘history’, ‘mission’, ‘alliances’ and ‘image’. In both the document and the node systems, NVivo
provides a modeling approach to guide the researcher through the project. It identifies the main
functions (sensitized buttons) to be performed and uses directional arrows to link functions in a
sequential order. For example, within the document system, the first function is to create
documents. This is linked to the next function, explore documents, which in turn is linked to the
‘Browse’, ‘Change’, ‘Link’ and ‘Code’ functions. The same sequence is followed with the node
system.
We first imported all the Sprint files into the document system. As with TATOE,
MS-Word files could not be imported directly; we had to convert them to a text format, since
NVivo accepts only plain text (TXT) or rich text (RTF) formats. Once created, all documentation
in NVivo is handled as editable rich text.
This allows an 'edit-while-you-code' feature that enables users to edit documents without
upsetting or invalidating existing codes or links. Thus, documents can be updated in terms of
codes, text, content, and links even for already coded text. This feature is very helpful for
iterative coding of multiple concepts where each round of coding builds on the previous one.
For example, in our testing, we edited a document several times, each time coding new material
or unmarking previously coded material. The software easily handled this coding test, which
was more demanding than anything we foresee in an actual study. The editable rich text format
also enables visual coding whereby documents can be processed on screen simply by
highlighting or changing fonts. This feature greatly facilitates the coding process.
NVivo documents support hyperlinks to any computer files. For example, in analyzing
Sprint documents, we embedded hyperlinks in some documents to their original web site
locations. These links are useful for validity checks and cross-referencing. NVivo can also
support any font or language written from left to right, top to bottom. We did not test this
feature, but it may prove useful as we expand our sample to include international sites.
NVivo's node system is similar to the ‘Scheme’ window of TATOE. The node system
provides structured and unstructured ways to represent information. Both the node and the
document systems supplement each other. For example, through the exploration of documents in
the document system, we identified the important and prevalent concepts to be coded. These
concepts were used as the nodes in the node system. We then coded the documents in the
document system using the nodes created in the node system.
NVivo also allows nodes and documents to be collected into any number of non-
exclusive sets. For example, we collected the nodes of 'mission', 'vision', and 'history' into a set
we called 'identity' and the associated documents into another set of all Sprint documents. Nodes
and documents can also be cross-referenced and linked together into networks by hyperlinks.
NVivo supports three types of links. The DocLink feature is useful in linking concepts back to
their document of origin. For example, by clicking on a DocLink in a concept, the researcher is
taken to the segment of the original document from which the concept is derived. NodeLinks are
helpful for creating super-ordinate categories by linking nodes. For example by clicking on the
NodeLink of the set ‘identity’, we were taken to the various nodes of 'mission', 'vision' and
'history' that made up this set. Finally, a DataBite can open any file system supported by the
user's computer and provide links to external databases or files. Although we did not use this
feature, it is easy to conceive of a scenario in which transcribed text is used and a DataBite
serves as a link to the specific tape excerpt.
NVivo provides another useful feature, a ‘box-and-line’ or ‘directed graphs’ facility for
drawing models. This option makes it possible to create representations of models and
hypotheses at different levels of detail. The ability to ‘Clone’ a model made it relatively easy to
develop and manage models, especially layered ones, and saved time in reconstructing models.
Before using NVivo, we held the common misconception that advanced QDA packages would
automatically diagram models from the nodes we specified. This is not the case. NVivo
provides the tools such as advanced drawing and linking capabilities for modeling, but the
researcher must construct the models. A researcher who is familiar with MS-PowerPoint and
MS-Word would find the modeling features of NVivo easy to use.
NVivo not only provides powerful qualitative data analysis tools, but also has basic
content analysis functionalities. Its search tools are part of a fully integrated project analysis
process for coding attribute values, word patterns, phrases, and frequencies. Search results can
be reported in a variety of ways such as editable rich text documents and profiles (on-screen
tables) which are exportable to statistical programs for quantitative analysis.
Overall, although NVivo has some content analysis capabilities, a researcher who is
mostly interested in manifest content analysis would find NVivo cumbersome, like using a
jackhammer to nail a picture to a wall. It is too complex, expensive, difficult to learn, and time
consuming in terms of start up costs compared to TATOE and other packages designed
specifically for manifest content analysis. However, for researchers who wish to explore latent
content and develop grounded theory, NVivo is a powerful and useful research tool.
Conclusion
The goal of this research was to provide an up-to-date and comprehensive introduction to
the potential and issues associated with content analysis of corporate web sites. We first focused
on the strengths and challenges of the content analysis methodology in general. Content analysis
implemented with care should be of particular interest to management researchers due to several
factors including access to managerial cognitive processes, non-intrusiveness, combination of
quantitative and qualitative approaches, and the ability to implement longitudinal designs.
We then turned our attention to the burgeoning field of computer-aided text analysis
(CATA). The advent of CATA software has led to significant benefits in terms of cost, flexibility,
and reliability, while maintaining satisfactory levels of validity (Morris, 1994). This paper
highlights several empirical challenges facing researchers and suggests theoretically sound, yet
practical, solutions. In our opinion, the highest quality and most interesting content analytic
research has used a combination of computer automation and human intervention to increase
efficiency and perform more subtle and powerful analyses (see, e.g., Abrahamson & Park, 1994).
We believe that this is the optimal approach to conduct sophisticated content analysis with the
necessary level of scientific rigor. But, as with all research, the choice of methodology and
research design must be driven by the research purpose.
Next, we provided practical insights for the evaluation and selection of a CATA solution.
We identified and empirically tested the typologies proposed in the literature. The results of a
multidimensional scaling (MDS) analysis largely confirm the utility of previously proposed
typologies, but also suggest refinements. Additionally, the placement in multidimensional space
of over 30 current and widely available CATA software packages should benefit researchers in
selecting the package that is optimal for their research.
Finally, we tested three popular CATA packages, one from each of the three major
groups that emerged from the MDS analysis. It would be difficult to design a comprehensive
test of even one CATA package that anticipates every managerially relevant research use of
content analysis. Before designing our tests of these three packages, we searched and evaluated
the myriad uses of content analysis represented in the management literature. Appendix 1
provides a table of the results of our search. We hope this table is helpful to future researchers
interested in the state of the art of content analysis applications in management research.
The examination of the management literature shows that content analysis has been a
valuable methodology to study such important organizational phenomena as managerial
cognition (e.g., Carley, 1997), competitive dynamics (e.g., Smith et al., 1992), corporate social
responsibility (e.g., Wolfe, 1991), industrial accidents (e.g., Gephart, 1993), and organizational
change (e.g., Kabanoff et al., 1995). While multiple data sources have been leveraged in this
research, a striking observation is that corporate web sites have not been used in any major
published study to date.
We tested the three CATA packages using real data from the Sprint web site. The test of
dtSearch, TATOE, and NVivo shows the range of capabilities and the challenges of using these
CATA packages. Our test suggests that content analysis will be a powerful methodology to
apply to corporate web sites. Like all methodology, it requires careful implementation. First, all
of the issues inherent in designing any rigorous content analytic study must be carefully taken
into account. The rationale for each decision should be documented and evaluated by objective
observers. In addition, the paper highlights how using corporate web sites as data sources poses
a number of unique challenges. For instance, researchers need to control for site size and site
design, as well as clarify site boundaries.
In our opinion, the richness, complexity, and impact of the Internet cannot be ignored.
CATA provides an excellent and reliable way for researchers to begin untangling the
implications of this medium on a wide range of organizational and managerial issues. Future
research should further investigate and report on the difficult empirical issues involved in
studying corporate web sites. We hope many researchers will take on the challenges of
pioneering research using this vast yet largely untapped data source.
We realize the breadth of material covered in this paper can be daunting. We sought to
strike a balance between comprehensiveness and richness of detail. Our aim has been to show
the promise of the method while also alerting potential adopters to key research issues. We urge
researchers to delve more deeply into these issues when designing a study. The
review of the literature, and especially Appendix 1, provides extensive references to the state of
the art for the reader interested in developing a deeper understanding of the issues addressed in
this paper.
References
Abrahamson, E., & Hambrick, D.C. (1997). Attentional Homogeneity in Industries: the Effect of Discretion. Journal of Organizational Behavior, 18, 513-532.

Abrahamson, E., & Park, C. (1994). Concealment of Negative Organizational Outcomes: an Agency Theory Perspective. Academy of Management Journal, 37, 1302-1334.

Barr, P.S., Stimpert, J.L., & Huff, A.S. (1992). Cognitive change, strategic action, and organizational renewal. Strategic Management Journal, 13, 15-36.

Carley, K. (1993). Coding Choices for Textual Analysis: A Comparison of Content Analysis and Map Analysis. Sociological Methodology, 23, 75-126.

Dey, I. (1995). Reducing Fragmentation in Qualitative Research. In Kelle, U. (ed.), Computer-Aided Qualitative Data Analysis: Theory, Methods, and Practice (pp. 69-79). Sage Publications, London.

Erdener, C.B., & Dunn, C.P. (1990). Content Analysis. In Huff, A.S. (ed.), Mapping Strategic Thought (pp. 291-300). John Wiley and Sons, Chichester.

Economist (1999). A Survey of Telecommunications. October 9.

Fielding, N.G., & Lee, R.M. (1998). Computer Analysis and Qualitative Research. Sage Publications, London, United Kingdom.

Fiol, M. (1995). Corporate Communications: Comparing Executives’ Private and Public Statements. Academy of Management Journal, 38, 522-536.

Gephart, R.P. (1997). Hazardous Measures: an Interpretive Textual Analysis of Quantitative Sense-making During Crises. Journal of Organizational Behavior, 18, 559-582.

Gephart, R.P. (1993). The Textual Approach: Risk and Blame in Disaster Sensemaking. Academy of Management Journal, 36, 1465-1514.

Gephart, R.P. (1991). Multiple approaches for tracking corporate social performance: insights from a study of major industrial accidents. Research in Corporate Social Performance and Policy, 12, 359-383.

Gephart, R.P., & Wolfe, R.A. (1989). Qualitative Data Analysis: Three Micro-Supported Approaches. Academy of Management Proceedings, 382-386.

Gioia, D.A., & Chittipeddi, K. (1991). Sense-making and sense-giving in strategic change initiation. Strategic Management Journal, 12, 433-448.

Grimm, C.M., & Smith, K.G. (1997). Strategy as Action: Industry Rivalry and Coordination. South-Western Publishing, Cincinnati, Ohio.

Huff, A.S. (1990). Mapping Strategic Thought. John Wiley and Sons, Chichester.

Jauch, L.R., Osborn, R.N., & Martin, T.N. (1980). Structured Content Analysis of Cases: A Complementary Method for Organizational Research. Academy of Management Review, 5, 517-525.

Kabanoff, B. (1997). Introduction. Computers can read as well as count: Computer-aided text analysis in organizational research. Journal of Organizational Behavior, 18, 507-511.

Kabanoff, B. (1996). Computers can Read as well as Count: How Computer-Aided Text Analysis can Benefit Organizational Research. Trends in Organizational Behavior, 3, 1-21.

Kabanoff, B., Waldersee, R., & Cohen, M. (1995). Espoused Values and Organizational Change Themes. Academy of Management Journal, 38, 1075-1104.

Kelle, U. (1995). Computer-Aided Qualitative Data Analysis: Theory, Methods, and Practice. Sage Publications, London, United Kingdom.

Knight, D. (1999). Mental models and firm actions. Unpublished dissertation, University of Maryland.

Lissack, M.R. (1998). Concept Sampling: A New Twist for Content Analysis. Organizational Research Methods, 1, 484-504.

Miles, M.B., & Huberman, A.M. (1994). Qualitative Data Analysis (2nd edition). Sage Publications, Thousand Oaks, CA.

Morris, R. (1994). Computerized Content Analysis in Management Research: a Demonstration of Advantages and Limitations. Journal of Management, 20, 903-931.

Mossholder, K.W., Settoon, R.P., Harris, S.G., & Armenakis, A.A. (1995). Measuring Emotion in Open-ended Survey Responses: an Application of Textual Data Analysis. Journal of Management, 21, 335-355.

Pearce, J.A., & David, F. (1987). Corporate Mission Statements: the Bottom Line. Academy of Management Executive, 1, 109-116.

Porter, M.E. (1980). Competitive Strategy: Techniques for Analyzing Industries and Competitors. The Free Press, New York.

Prein, G., & Kelle, U. (1995). Using Linkages and Networks for Theory Building. In Kelle, U. (ed.), Computer-Aided Qualitative Data Analysis: Theory, Methods, and Practice (pp. 69-79). Sage Publications, London.

Roberts, C.W. (1997). Text Analysis for the Social Sciences: Methods for Drawing Statistical Inferences from Text and Transcripts. Lawrence Erlbaum Associates, Mahwah, New Jersey.

Sapir, E. (1944). Grading: a study in semantics. Philosophy of Science, 11, 93-116.

Shapiro, & Markoff (1997). In Roberts, C.W. (ed.), Text Analysis for the Social Sciences: Methods for Drawing Statistical Inferences from Text and Transcripts. Lawrence Erlbaum Associates, Mahwah, New Jersey.

Smith, K.G., Grimm, C.M., & Gannon, M.J. (1992). Dynamics of Competitive Strategy. Sage Publications, Newbury Park.

Sussman, L., Ricchio, P., & Belohlav, J. (1983). Corporate Speeches as a Source of Corporate Values: an Analysis across Years, Themes, and Industries. Strategic Management Journal, 187-196.

Tallerico, M. (1991). Applications of Qualitative Analysis Software: A View from the Field. Qualitative Sociology, 14, 275-285.

Tesch, R. (1991). Introduction. Qualitative Sociology, 14, 225-243.

Tesch, R. (1990). Qualitative Research: Analysis Types and Software Tools. The Falmer Press, New York, NY.

Weber, R.P. (1990). Basic Content Analysis (2nd edition). Sage Publications, Thousand Oaks, CA.

Weitzman, E.A., & Miles, M.B. (1995). Computer Programs for Qualitative Data Analysis. Sage Publications, Thousand Oaks, CA.

Whorf, B.L. (1956). Science and Linguistics. In Carroll, J.B. (ed.), Language, Thought, and Reality: Selected Writings of Benjamin Lee Whorf. MIT Press, Cambridge, MA.

Wolfe, R.A. (1991). The use of content analysis to assess corporate social responsibility. In Post, J. (ed.), Research in Corporate Social Performance and Policy, 13 (pp. 281-307). JAI, Greenwich, CT.

Wolfe, R.A., Gephart, R.P., & Johnson, T.E. (1993). Computer-facilitated Qualitative Data Analysis: Potential Contributions to Management Research. Journal of Management, 19, 637-660.

Woodrum, E. (1984). “Mainstreaming” Content Analysis in the Social Science: Methodological Advantages, Obstacles, and Solutions. Social Science Research, 13, 1-19.
Table 1: Steps in Coding Text
The Weber Protocol
(Weber, 1990)
1) Definition of the recording units (e.g., word, phrase, sentence, paragraph).
2) Definition of the coding categories.
3) Test of coding on a sample of text.
4) Assessment of the accuracy and reliability of the sample coding.
5) Revision of the coding rules.
6) Return to step 3 until sufficient reliability is achieved.
7) Coding of all the text.
8) Assessment of the achieved reliability or accuracy.
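Steps 4 and 8 of the protocol call for a reliability assessment. A common chance-corrected measure of agreement between two coders is Cohen's kappa; the sketch below (our own illustration, with hypothetical category assignments) implements the standard formula:

```python
from collections import Counter

def cohen_kappa(coder_a, coder_b):
    """Chance-corrected agreement between two coders' category labels."""
    assert len(coder_a) == len(coder_b)
    n = len(coder_a)
    # Observed agreement: proportion of units coded identically.
    observed = sum(a == b for a, b in zip(coder_a, coder_b)) / n
    # Expected agreement by chance, from each coder's marginal frequencies.
    freq_a, freq_b = Counter(coder_a), Counter(coder_b)
    expected = sum(freq_a[c] * freq_b[c] for c in freq_a) / (n * n)
    return (observed - expected) / (1 - expected)

# Hypothetical codings of five text units by two coders.
a = ["vision", "mission", "vision", "identity", "vision"]
b = ["vision", "mission", "identity", "identity", "vision"]
print(cohen_kappa(a, b))  # kappa ≈ 0.69
```

Researchers typically iterate steps 3-6 of the protocol until kappa (or a comparable index) reaches a conventionally acceptable level before coding the full corpus.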
Table 2: CATA Typologies
Tesch Typology
(Tesch, 1991)
Type of software, main features, and examples*

1. Descriptive/interpretive analysis. The main goal is to discover the meaning of the phenomena (patterns, types). Data is kept unstructured and is flexibly manipulated. Main functions are coding and retrieval of text. Enhancements may include frequency count, chain, co-occurrence, advanced search, memoing, and quantitative options.
Examples: Ethnograph, Qualpro, Textbase Alpha, TAP, MARTIN, LTT Ethnoscript

2. Theory-building research. The objective is to elicit concepts/linkages in the data. Basic functions of text coding and retrieval are included. Key functions are concept identification and co-occurrence.
Examples: Kwalitan, HyperRESEARCH, NUDIST, AQUAD, ATLAS.Ti

3. Traditional content analysis or cultural analysis. Expansion of the traditional search function of word processors.
Examples: General Inquirer, TextPack, WordCruncher, FlexText

4. Linguistic programs. Linguistic aspects of data.
Examples: CODEF, PLCA

* In 1991, these programs were running on MS-DOS or Macintosh personal computers. Examples are given for the MS-DOS platform only.
Table 2 (cont.)
Weitzman and Miles Typology
(Weitzman and Miles, 1995)
1. Word processors, such as Word and WordPerfect, can be used for a variety of support tasks
including handling field notes, transcribing interviews, preparing files, taking project notes
(memoing), and writing reports.
2. Text retrievers specialize in finding and retrieving words, phrases, and character strings.
Some, such as ZyIndex, have content analytical features.
3. Text base managers are focused on the organization of textual data, including features such
as sorting, searching, and retrieving. Asksam and WinMax are examples.
4. Code and retrieve programs are geared toward the coding and manipulation of text.
Ethnograph is an example.
5. Code-based theory builders, such as NUD*IST, provide additional features including theory
building tools, such as connections between codes, coding hierarchies, and formulation and
testing of hypotheses.
6. Conceptual network builders focus on the network aspects of theory development and
provide graphical features to support theoretical development.
Table 3: Comparison of dtSearch, TATOE, and NVivo
Software package           dtSearch              TATOE                                     NVivo

Price                      $199                  Free                                      $425
Orders                     1-800-483-4637        N/A                                       Online/phone
Technical support          (703) 413-3670        N/A                                       805-499-1325
Website                    www.dtsearch.com      www.darmstadt.gmd.de/~rostek/tatoe.htm    www.qsr-software.com
Fax                        (703) 413-3473        N/A                                       805 499 0871
E-mail                     [email protected]      [email protected];                         N/A
                                                 [email protected]
User group                 N/A                   N/A                                       YES
Demo version               YES                   YES                                       YES

Data input
  ASCII                    YES                   YES                                       YES
  Productivity software    YES                   N/A                                       N/A
  HTML                     YES                   YES                                       N/A
  Other                    N/A                   XML                                       RTF

Text manipulation
  Preparation              YES                   YES                                       YES
  Memoing                  N/A                   YES                                       YES
  Coding                   N/A                   YES                                       YES
  Dictionaries             YES                   YES                                       YES
  Project management       N/A                   YES                                       YES

Functionalities
  Searching                YES                   YES                                       YES
  Basic text statistics    YES                   YES                                       YES
  Content analysis         N/A                   YES                                       N/A
  Qualitative data analysis  N/A                 N/A                                       YES

Statistical analysis
  Multiple regression      N/A                   N/A                                       N/A
  Cluster analysis         N/A                   N/A                                       N/A
  Multidimensional scaling N/A                   N/A                                       N/A
  Other                    N/A                   N/A                                       N/A

Data output
  ASCII                    N/A                   N/A                                       YES
  Productivity software    N/A                   N/A                                       YES
  HTML                     N/A                   YES                                       N/A
  Statistical packages     N/A                   YES                                       YES
  Other                    DBF, CSV, RTF         XML                                       RTF
Appendix 1: Content Analysis in Management Research
To comprehensively review empirical articles using content analysis in the management
literature, we employed a two-stage search strategy. First, we conducted a keyword search of
the top journals in management on the ABI/INFORM database. Journals searched included
Academy of Management Journal, Administrative Science Quarterly, Journal of Management,
Journal of Organizational Behavior, Organization Science, Organizational Research Methods,
and Strategic Management Journal. Second, we checked the reference lists of the papers
obtained through the ABI/INFORM search. References from articles in the 1997 Special Issue
of the Journal of Organizational Behavior were especially valuable for uncovering additional
work using CATA. In total, this search yielded 41 papers from 1986 to 1999; these articles are
summarized in Table 4. Unfortunately, and surprisingly, none of these studies used corporate
web sites as a data source.
-------------------------------
Table 4 about here
-------------------------------
Next, we discuss sources of data, themes investigated, and lessons that we believe can be applied
to management research using corporate web sites.
Sources of Data
Our review clearly reflects the contribution of content analysis to the increased use of annual
reports in management research. Among the 41 papers, 14 list annual reports or
proxy statements as a data source. Other sources of textual data include: trade magazines (10),
survey questionnaires (6), other company documents (6), databases (4), government documents
(3), field data (3), mission statements (2), transcription of videos (1), and scholarly journals (1).
Among the 41 articles, 12 adopted a longitudinal design.
Research Themes
Content analysis has been associated with several emerging areas of research such as corporate
social responsibility, industrial accidents, corporate values, and organizational effectiveness
(Wolfe, 1991; Wolfe et al., 1993). Here, we highlight recent research in managerial cognition
and competitive dynamics because the use of content analysis in these two areas has yielded
significant results in the past decade.
Managerial cognition. Recent literature using content analysis has particularly
contributed in the areas of collective cognition and sensemaking. One example is Abrahamson
and Hambrick’s (1997) study of the effect of discretion on the attentional homogeneity of
executives across industries. Their argument is that variation of managerial discretion across
industries will result in different levels of attentional homogeneity. Their paper is valuable
methodologically for its careful development of lexical commonality and lexical density of
letters to shareholders as measures of managerial attentional homogeneity.
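To make these measures concrete, the sketch below computes a lexical commonality score (mean pairwise vocabulary overlap across letters) and a lexical density score (a type-token ratio); both operationalizations are plausible stand-ins chosen for exposition, not Abrahamson and Hambrick's exact formulas, and the sample "letters" are invented.

```python
# Illustrative stand-ins for "lexical commonality" and "lexical density".
# Assumptions: Jaccard overlap approximates commonality; a type-token ratio
# approximates density. Real measures would require the original formulas.
from itertools import combinations

def tokenize(text):
    return [w.strip(".,").lower() for w in text.split()]

def lexical_commonality(letters):
    """Mean pairwise vocabulary overlap (Jaccard) across letters."""
    vocabs = [set(tokenize(t)) for t in letters]
    pairs = list(combinations(vocabs, 2))
    return sum(len(a & b) / len(a | b) for a, b in pairs) / len(pairs)

def lexical_density(letter):
    """Type-token ratio: distinct words over total words."""
    tokens = tokenize(letter)
    return len(set(tokens)) / len(tokens)

letters = [
    "Demand grew and margins improved.",
    "Demand grew while costs improved.",
    "Regulation and fuel costs hurt margins.",
]
print(round(lexical_commonality(letters), 2))
print(round(lexical_density(letters[0]), 2))
```

In this spirit, an industry whose executives' letters share more vocabulary would score higher on commonality, consistent with greater attentional homogeneity.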
One of the criticisms of this line of research has been that documents used as data sources
for inferring managerial cognitive maps, such as speeches, industry publications, and annual
reports, are intentionally biased for specific audiences (Huff, 1990; Morris, 1994). Huff (1990)
also notes that measures such as word centrality may not reflect the hidden intent of the strategist
and that the themes elicited through content analysis may not capture real-time dimensions of
strategic decision-making. She therefore recommends the use of finer-grained approaches
such as causal and argument mapping to capture the intricacies of strategic thinking (Huff,
1990). In other words, Huff (1990) recommends moving beyond manifest content to also
examine latent content.
At the group level of analysis, Carley (1997) presents the latest results of her work on
team mental models. This paper is interesting for the introduction of new refinements in her
methodology of cognitive map analysis, including automation through the use of the Automap
software. Team cognitive maps are constructed by examining the intersections between
individual maps. Her work spans manifest and latent content (Carley, 1997).
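The intersection logic can be sketched as follows, assuming each individual map has already been reduced to a set of directed concept pairs; this illustrates the general idea only and is not Carley's Automap procedure, and the team members and concept links are invented.

```python
# Hedged sketch: derive a team mental map by intersecting individual maps,
# where each map is represented as a set of directed (concept, concept) links.

def team_map(individual_maps):
    """Keep only the concept links present in every member's map."""
    shared = set(individual_maps[0])
    for m in individual_maps[1:]:
        shared &= set(m)
    return shared

alice = {("client", "needs"), ("needs", "design"), ("budget", "design")}
bob   = {("client", "needs"), ("needs", "design"), ("schedule", "design")}
carol = {("client", "needs"), ("budget", "design"), ("needs", "design")}

print(sorted(team_map([alice, bob, carol])))
# shared links: client->needs and needs->design
```

A less restrictive variant might retain links held by a majority of members rather than by all of them; the coding choice itself shapes the resulting collective map, which is one of Carley's key points.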
Gephart (1993, 1997) provides an exemplar of content analytic work focusing on latent content.
He explores the social construction aspects of sensemaking. One of his contributions to latent
content analysis has been to carefully delineate the steps of a textual approach. His structured
approach allows researchers to retain the richness of the original data while developing grounded
theory.
Competitive Dynamics. One of the challenges of management research in recent years
has been to bridge the gap between “macro” and “micro” perspectives (Smith, Grimm, &
Gannon, 1992). Macro models such as Porter’s (1980) five forces do not capture the dynamic
nature of competitive interactions, and predictions on firm-level behavior from game theory are
difficult to study empirically. Content analysis has provided one answer to these challenges.
The most representative and relevant competitive dynamics papers applying content
analysis appear in Table 4. Structured content analysis (SCA) has been used with significant
success in this domain (Jauch, Osborn, & Martin, 1980; Smith et al., 1992). SCA, as the name
suggests, is a structured method for examining manifest content, in this case competitive actions. It has
allowed for the investigation of many interesting hypotheses including the impact on firm
performance of firm reputation, action characteristics, and responders’ timing, imitation, order,
and likelihood of responses (Chen, Smith, & Grimm, 1992; Smith et al., 1991, 1992).
Schomburg, Grimm, and Smith (1994) studied the influence of industry characteristics on new
product introductions. Chen and Hambrick (1995) showed that small firms tend to be more
active, speedier, lower key, and more secretive in their actions than large firms, and that their
responses are less likely and slower. Young, Smith, and Grimm (1996) tested how the S/C/P and
Schumpeterian economic paradigms explained differences in competitive moves. Lee, Smith,
Grimm, and Schomburg (2000) showed that lack of imitation and speed of product introduction
favorably impacted market returns. Recent work (Miller & Chen, 1996; Ferrier, Smith, &
Grimm, 1999) has begun to use SCA data to examine patterns of competitive interactions. To
date, none of this research has moved to the level of studying latent content of competitive
moves, a potentially fertile research ground.
Lessons from the Management Literature
Reliability and validity of content analysis. Much of the empirical literature in
management extensively illustrates and justifies the reliability and validity of content analysis.
Reliability has been addressed primarily through multiple raters. One key validity concern is
whether content analyses of documents such as letters to shareholders are indicative of executive
cognition. Researchers have thoroughly addressed this issue (see, e.g., Bowman, 1978, 1984;
Huff, 1990; Abrahamson & Hambrick, 1997), but skeptics will continue to raise the concern.
Content analysis process. The development of categorization (or coding) schemes is a
critical issue in implementing content analysis (Doucet & Jehn, 1997; Weber, 1990). In their
study of conflict across cultures, Doucet and Jehn (1997) provide an illustration of the impact of
inductive and deductive categorization techniques on analytical results. They first used an
inductive approach by performing frequency counts and context ratings on their data.
Conflict-related words were identified by three judges and then categorized. In a separate analysis,
Doucet and Jehn (1997) applied a deductive approach to examine the degree of hostility implied
by the texts. Following Weber (1990) they developed custom dictionaries of terms using various
construction techniques. Methodologies to create dictionaries are also discussed in other
management work (see, e.g., Smith et al., 1991; Gephart, 1993; Gioia & Chittipeddi, 1991). The
different results from inductive and deductive approaches reflect an important difference
between exploratory and confirmatory research purposes (Gephart, 1993, 1997; Smith et al.,
1991; Weber, 1990). The key lesson learned is that the analytic approach to coding texts should
be driven by the research purpose.
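The contrast between the inductive and deductive passes described above can be sketched as follows; the responses, the "hostility" dictionary, and the resulting counts are invented for illustration and are not Doucet and Jehn's data or categories.

```python
# Sketch contrasting an inductive first pass (raw frequency counts suggest
# candidate categories) with a deductive pass (a custom dictionary, built in
# advance, maps words to a category of interest).
from collections import Counter

responses = [
    "the argument turned hostile and the shouting continued",
    "a quiet disagreement over the budget",
    "open conflict and hostile remarks in the meeting",
]

# Inductive: let frequent words suggest candidate categories to the judges.
tokens = [w for r in responses for w in r.split()]
print(Counter(tokens).most_common(3))

# Deductive: count hits against a predefined "hostility" dictionary.
hostility_dict = {"hostile", "shouting", "conflict"}
hits = sum(1 for w in tokens if w in hostility_dict)
print(hits)  # 4 dictionary matches
```

The two passes answer different questions: the inductive counts let categories emerge from the texts, while the dictionary confirms a construct specified before the texts are read, mirroring the exploratory/confirmatory distinction discussed above.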
Many types of analyses have been used in conjunction with content analysis. Due to the
qualitative nature of much of this work, a number of studies report only descriptive statistics to
convey their results (see, e.g., D'Aveni & MacMillan, 1990; David, 1989; Pearce & David, 1987;
Salancik & Meindl, 1984). Recently, more sophisticated quantitative analyses have come into
vogue. For example, factor and cluster analyses are natural fits with content analysis (see, e.g.,
Doucet & Jehn, 1997; Kabanoff & Holt, 1996; Kabanoff et al., 1995; Mossholder et al., 1995).
Variables measured through content analysis have also been incorporated into multiple
regression analysis (see, e.g., Abrahamson & Hambrick, 1997; Fombrun & Shanley, 1990).
Value of CATA. Examination of the management literature confirms the multiple
advantages associated with the use of computers when conducting content analysis also found in
other social science research, as discussed earlier. Here, we briefly highlight a few exemplars
that illustrate the value of CATA in terms of quantity of data, economy, flexibility of
manipulation, and analytical sophistication.
In the area of competitive dynamics, computers and computerized databases have
allowed researchers to access and code hundreds of trade magazines longitudinally using
Boolean algorithmic searches (Grimm & Smith, 1997), saving countless hours of graduate
assistant time. For instance, Salancik and Meindl (1984) employed 23 raters to code 324 letters
to shareholders. Today, this work could be done via CATA with higher reliability, lower cost,
and greater speed. As a general characterization, most recent exemplary work uses a
combination of manual and computerized approaches to coding.
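A Boolean screen of this kind can be sketched as follows; the airline headlines and the keyword list are invented and far simpler than the search strings actually used with sources such as the Predicast F&S Index.

```python
# Hedged sketch of a Boolean (OR) screen over trade-press headlines of the
# kind used to locate competitive moves for structured content analysis.

headlines = [
    "Acme Air cuts fares on coast-to-coast routes",
    "Acme Air names new chief financial officer",
    "Bravo Airlines launches daily Denver service",
    "Bravo Airlines reports quarterly earnings",
]

action_terms = {"cuts", "launches", "introduces", "expands"}

def is_competitive_move(headline):
    """Keep headlines that mention any action term (a Boolean OR screen)."""
    words = headline.lower().split()
    return any(term in words for term in action_terms)

moves = [h for h in headlines if is_competitive_move(h)]
print(moves)  # the fare cut and the route launch
```

In practice such a screen only narrows the corpus; human coders still classify each retained item by move type, which is why the manual/computerized combination remains the norm.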
Mossholder and colleagues (1995) provide a detailed illustration of the flexibility
afforded by CATA in the analysis of open-ended responses to a survey. Prior to the advent of
CATA, it is safe to say that much survey research focused on developing easy-to-analyze closed-
ended questions, avoided open-ended questions, and often failed to analyze open-ended
responses systematically. Finally, as content analysis has become easier to use,
more complex treatments have been introduced, such as linguistic indicators (see, e.g.,
Abrahamson & Hambrick, 1997).
Table 4: Content Analysis in Management Research

Bergh, Donald D., and Gordon F. Holbein (1997)**
  Title: Assessment and redirection of longitudinal analysis: demonstration with a study of the diversification and divestiture relationship.
  Source: Strategic Management Journal, Vol. 18:7, 557-571.
  Data: 203 longitudinal strategic management studies as reported in SMJ between 1980 and 1993; diversification and divestiture decisions of 180 Fortune 500 companies over 1985-1988; Compustat database.
  Method and key findings: Most researchers 1) did not test and control for violations in data assumptions underlying longitudinal analysis, and 2) did not test the stability and the form of the relationship over time. Illustration of the impact of such shortfalls with a study of divestiture and diversification at 180 companies between 1985 and 1988.

Bowman, Edward H. (1984)**
  Title: Content Analysis of Annual Reports for Corporate Strategy and Risk.
  Source: Interfaces, Vol. 14, January-February, 61-71.
  Data: Annual reports of 82 firms in the food processing industry (test of several issues and the validity of content analysis); 46 firms in the computer peripherals and PC industries (1974) and 26 firms in the container industry (1976) (growth and risk).
  Method and key findings: Counts and quotes. Reveals the negative correlation between risk and return of companies within industries and the tendency of troubled companies to seek risk.

Bowman, Edward H. (1978)**
  Title: Strategy, Annual Report, and Alchemy.
  Source: California Management Review, Vol. 20, No. 3, 64-71.
  Data: 46 1974 annual reports from firms in the computer industry (SIC 3573).
  Method and key findings: Count of key words indicative of the independent variables. Environmental coping, customer orientation, internationalization, control, growth, and vertical integration and value added explain firm performance.

Chen, C.C., and J.R. Meindl (1991)**
  Title: The Construction of Leadership Images in the Popular Press: the Case of Donald Burr and People Express.
  Source: Administrative Science Quarterly, 36, 521-551.
  Data: 22 articles over a six-year period with Donald Burr's name included in the title, used as a stimulus; image of Burr rendered by 75 undergraduate business students based on their reading of the 22 articles.
  Method and key findings: Two types of frequency: 1) propositions in each period (salience of theme), 2) percentage of respondents (consensus). Evidence of construction of Burr's image as the People Express drama unfolded.

Chen, Ming-Jer, and Donald Hambrick (1995)**
  Title: Speed, Stealth and Selective Attack: How Small Firms Differ from Large Firms in Competitive Behavior.
  Source: Academy of Management Journal, Vol. 38, No. 2, 452-482.
  Data: 28 major airlines between 1985-86; 396 moves in 17 types; SCA of Aviation Daily.
  Method and key findings: Manual SCA of competitive actions and responses. Small firms differ from large firms in their competitive strategy.

Chen, Ming-Jer, Ken G. Smith, and Curtis M. Grimm (1992)**
  Title: Action Characteristics as Predictors of Competitive Responses.
  Source: Management Science, Vol. 38, No. 3, 439-455.
  Data: Competitive moves by 32 major US domestic airlines, 1/79 to 12/86.
  Method and key findings: Manual SCA. Competitive action characteristics determine the nature of the response.

Cochran, Daniel S., and Fred R. David (1986)**
  Title: Communication effectiveness of organizational mission statements.
  Source: Journal of Applied Communication Research, 14, 2, 108-118.
  Data: 135 usable mission statements out of the Fortune 500 firms and AACSB universities contacted.
  Method and key findings: Measure of inspiration and readability of the mission statements. Both firms and universities have potential to improve the readability and inspiration of their mission statements.

D'Aveni, Richard A., and Ian MacMillan (1990)**
  Title: Crisis and the Content of Managerial Communications: a Study of the Focus of Attention of Top Managers in Surviving and Failing Firms.
  Source: Administrative Science Quarterly, 35, 634-657.
  Data: Letters to shareholders of 57 failing firms and 57 successful firms between 1972 and 1982; five-year period leading to the bankruptcy.
  Method and key findings: Manually counted the number of sentences in each letter referring to seven elements of the environment. Study reveals different patterns of attention of managers of failing and surviving firms.

David, Fred R. (1989)**
  Title: How companies define their mission.
  Source: Long Range Planning, 22, 1, 90-97.
  Data: 75 exploitable mission statements obtained from CEOs of large companies (Business Week 1000).
  Method and key findings: Two raters using categories. Guidelines to write and use mission statements.

Dirsmith, Mark W., and Mark A. Covaleski (1983)**
  Title: Strategy, external communication, and environmental context.
  Source: Strategic Management Journal, 137-151.
  Data: 1,613 survey instruments mailed to study participants; 116 usable analyst and investor responses.
  Method and key findings: Environmental contexts influence the emergence of organizational strategies; external stakeholders use sources reflecting context in their evaluation of the firm.

Fiol, Marlene (1995)**
  Title: Corporate Communications: Comparing Executives' Private and Public Statements.
  Source: Academy of Management Journal, Vol. 38, No. 2, 522-536.
  Data: Letters to shareholders and internal planning documents of 10 firms in the forest products industry between 1979 and 1988.
  Method and key findings: Documents processed by two coders: threat/opportunity categorization with five types of attribution coding (Table 1). No significant correlation between evaluations of events and situation; positive and significant for perception of control. Executive public statements may not be believable.

Fombrun, Charles, and Mark Shanley (1990)**
  Title: What's in a name? Reputation building and corporate strategy.
  Source: Academy of Management Journal, Vol. 33, No. 2, 233-258.
  Data: 292 firms included in Fortune's 1985 study of corporate reputation.
  Method and key findings: Content analysis of the titles of 15,400 articles to measure media exposure (positive or negative news about a firm); one principal rater and three additional ones for reliability. Antecedents of reputation: structural position, conformance to norms, and strategic postures.

Gioia, Dennis A., and Kumar Chittipeddi (1991)**
  Title: Sensemaking and sensegiving in strategic change initiation.
  Source: Strategic Management Journal, Vol. 12, 433-448.
  Data: Multiple tape-recorded interviews with the senior executives of a large public university; diary of events; internal planning documents.
  Method and key findings: Methodology for qualitative content analysis as part of the ethnographic study. Identifies sensemaking and sensegiving as the CEO's primary role.

Marcus, Alfred A., and Robert S. Goodman (1991)**
  Title: Victims and shareholders: the dilemmas of presenting corporate policy during a crisis.
  Source: Academy of Management Journal, Vol. 34, 281-305.
  Data: Fifteen crises documented in case studies and monographs (accidents, scandals, safety and health); firms' reactions to the crises as reported in the Wall Street Journal.
  Method and key findings: Coded reactions based on the newness of the information provided and their accommodative or defensive character. Identified conditions for the emergence of corporate crises.

Miller, Danny, and Ming-Jer Chen (1996)**
  Title: The Simplicity of Competitive Repertoires: an Empirical Analysis.
  Source: Strategic Management Journal, Vol. 17, 419-439.
  Data: 18 largest airlines between 1979-86; SCA of Aviation Daily.
  Method and key findings: Manual SCA. Environmental and organizational factors lead to repertoire simplicity; repertoire simplicity is negatively related to performance.

Osborn, Richard N., Lawrence R. Jauch, Thomas N. Martin, and William F. Glueck (1981)**
  Title: The event of CEO succession, performance, and environmental conditions.
  Source: Academy of Management Journal, Vol. 24, No. 1, 183-191.
  Data: Nonrandom sample of 313 large industrial corporations from 1930 to 1974; executive succession events as rendered in Fortune Magazine.
  Method and key findings: Structured content analysis (SCA); content analysis schedules used by three raters. Six significant predictors of succession: prior profits, financial strategies, merger strategy, socioeconomic volatility, supplier volatility, and owner volatility.

Pearce, John A., and Fred David (1987)**
  Title: Corporate Mission Statements: the Bottom Line.
  Source: Academy of Management Executive, Vol. 1, No. 2, 109-116.
  Data: Mailed questionnaire to the CEOs of all Fortune 500 companies; 44% response rate, resulting in 61 usable mission statements.
  Method and key findings: Identification of the eight possible themes by three raters. Statistical differences were found for philosophy, self-concept, and public image between low and high performers.

Salancik, Gerald R., and James R. Meindl (1984)**
  Title: Corporate attributions as strategic illusions of management control.
  Source: Administrative Science Quarterly, 29, 238-254.
  Data: Letters to shareholders of US corporations over 18 years; sample of management-controlled firms.
  Method and key findings: Letters to shareholders coded by two students. Unstable firms manipulate causal attributions to manage the impression of their control.

Simons, Tony (1993)**
  Title: Speech Patterns and the Concept of Utility in Cognitive Maps: the Case of Integrative Bargaining.
  Source: Academy of Management Journal, Vol. 36, No. 1, 139-156.
  Data: Study 1: transcribed videotapes of 24 dyads of undergraduate students; Study 2: 55 transcribed videotapes.
  Method and key findings: Showed that dyads in negotiations were more likely to reach agreement when utility was viewed as a subjective preference.

Smith, Ken G., Curt Grimm, Marty Gannon, and Ming-Jer Chen (1991)**
  Title: Organizational Information Processing: Competitive Response and Performance in the US Domestic Airline Industry.
  Source: Academy of Management Journal, 34: 60-85.
  Data: Competitive moves by 32 major US domestic airlines, 1/79 to 12/86.
  Method and key findings: Manual SCA. The nature of the competitive response is affected by organizational and managerial characteristics.

Staw, Barry M., Pamela McKechnie, and Sheila M. Puffer (1983)**
  Title: The justification of organizational performance.
  Source: Administrative Science Quarterly, 28, 582-600.
  Data: 49 high-performing and 32 low-performing organizations; explanations of the company's performance in the annual letters to shareholders.
  Method and key findings: Manual study. Found that unsuccessful firms relied more heavily on external explanatory factors to explain poor performance.

Sussman, Lyle, Penny Ricchio, and James Belohlav (1983)**
  Title: Corporate Speeches as a Source of Corporate Values: an Analysis across Years, Themes, and Industries.
  Source: Strategic Management Journal, 187-196.
  Data: 882 speeches in 265 companies selected from 1950-75 Vital Speeches based on three criteria.
  Method and key findings: Identification of the themes of corporate speeches by reviewers. Found that 28 industries addressed a total of 8 themes, demonstrating the value of corporate speeches as a source of corporate values.

Abrahamson, Eric, and Donald Hambrick (1997)
  Title: Attentional Homogeneity in Industries: the Effect of Discretion.
  Source: Journal of Organizational Behavior, Vol. 18, Special Issue, 513-532.
  Data: Sample of firms in the Compact Disclosure Database (1985-89); 14 industries with more than 20 non-diversified companies; a little more than 2 letters per firm, for a total of 1,920.
  Method and key findings: Variables: lexical commonality and lexical density to measure attentional homogeneity. Used CRUNCH software. Industry-level discretion has a strong and significant effect on attentional homogeneity. Validity of lexical homogeneity and density as measures of attentional homogeneity.

Abrahamson, Eric, and Choelsoon Park (1994)
  Title: Concealment of Negative Organizational Outcomes: an Agency Theory Perspective.
  Source: Academy of Management Journal, Vol. 37, No. 5, 1302-1334.
  Data: 1,118 firms listed on the NYSE and AMEX from the Compact Disclosure computerized database, March 1989; presidents' letters.
  Method and key findings: Proceeded in three steps: 1) definition of negative words, 2) selection of negative outcomes, 3) coding of negative outcomes. Used custom software programs. Outside directors, large institutional investors, and accountants limit concealment of negative outcomes, but small institutional investors and outside directors who are shareholders prompt it.

Birnbaum-More, Philip H., and A. R. Weiss (1990)
  Title: Discovering the basis of competition in 12 industries: computerized content analysis of interview data from the US and Europe.
  Source: In Huff (ed.), Mapping Strategic Thought, John Wiley and Sons, Chichester, 291-300.
  Data: 158 firms in 12 industrial sectors between 1974 and 1984; interviews with 72 individuals involved in the 12 industries.
  Method and key findings: Coded interviews to identify competitive actions. Used TextPack. 24 competitive actions identified. "Concentration, compounded growth, and batch size explained significant portions of variance in every mentioned group of competitive actions."

Carley, Kathleen M. (1997)
  Title: Extracting Team Mental Models through Textual Analysis.
  Source: Journal of Organizational Behavior, Vol. 18, SI, 533-558.
  Data: 41 students at a private university in an information systems course, organized in 8 teams dealing with the (fictive) information system needs of a client; two open-ended questions of a questionnaire administered three times during a semester.
  Method and key findings: Procedural mapping analysis. The program used is AUTOMAP, designed by K. Carley, which features an automated procedure for coding. Key findings pertain to the impact of coding choices on results regarding collective mental maps.

Doucet, Lorna, and Karen A. Jehn (1997)
  Title: Analyzing Harsh Words in a Sensitive Setting: American Expatriates in Communist China.
  Source: Journal of Organizational Behavior, Vol. 18, SI, 533-558.
  Data: 76 American managers currently or recently working in Sino-American joint ventures; each manager described an intra- and an inter-cultural conflict based on 5 questions.
  Method and key findings: Comparison of different methodologies (factor analysis of context ratings, multidimensional scaling, dictionary-based content analysis) in predicting the categorization of intra- and inter-cultural conflicts. Unspecified software.

Ferrier, Walter J., Ken G. Smith, and Curtis M. Grimm (1999)
  Title: The Role of Competitive Action in Market Share Erosion and Dethronement: A Study of Industry Leaders and Challengers.
  Source: Academy of Management Journal, Vol. 42, No. 4, 372-388.
  Data: 41 industry leaders and their respective challengers between 1987-93; SCA of the Predicast F&S Index by SIC codes.
  Method and key findings: Computerized SCA. By taking certain competitive actions, leaders can limit market share erosion and dethronement by challengers.

Gephart, Robert P., Jr. (1997)
  Title: Hazardous Measures: an Interpretive Textual Analysis of Quantitative Sensemaking During Crises.
  Source: Journal of Organizational Behavior, Vol. 18, SI, 559-582.
  Data: 1977-78 Lodgepole "sour gas well" blow-out and inquiry; official proceedings (217 pp.) and final report (16 pp.) of the Energy Resources Conservation Board.
  Method and key findings: Used TACT software. Quantitative practices and terms were found to play an important role for the government (ERCB) and the operator company (AMOCO) in the interpretation of the crisis.

Gephart, Robert P. (1993)
  Title: The Textual Approach: Risk and Blame in Disaster Sensemaking.
  Source: Academy of Management Journal, Vol. 36, No. 6, 1465-1514.
  Data: Official proceedings of the inquiry, company documents, field notes, energy board final report, newspaper articles, other documents.
  Method and key findings: Used TACT version 1.2 to create a database (textual exhibits and tables, key words, etc.) in support of the implementation of the textual approach. Validation of ethnomethodology and the textual approach for management research. Findings include evidence of social construction in industrial crisis sensemaking, contradictory interpretations, sensemaking as mythmaking, and role performance of organizational members.

Gephart, Robert P. (1991)
  Title: Multiple approaches for tracking corporate social ...
  Source: Research in Corporate Social Performance and Policy.
  Data: Report on the variety of data collected as part of the ...
  Method and key findings: Methodologies used: ethnography, documentary ...

... organization-wide documents.
... participation, commitment, performance, reward, affiliation, normative. Found four clusters of elite leadership, meritocratic, or collegial firms.

Lee, Hun, Ken G. Smith, Curtis M. Grimm, and August Schomburg (2000)
  Title: Timing, order and durability of new product advantages with imitation.
  Source: Strategic Management Journal, forthcoming.
  Data: 45 first movers, 22 second movers, and 38 late movers in the brewing, telecommunications, and PC industries; SCA of the Predicast F&S Index United States.
  Method and key findings: Computerized SCA. First movers suffer from product imitation by their rivals, more so from early than from late movers.

Mossholder, Kevin W., Randall P. Settoon, Stanley G. Harris, and Achilles A. Armenakis (1995)
  Title: Measuring Emotion in Open-ended Survey Responses: an Application of Textual Data Analysis.
  Source: Journal of Management, Vol. 21, No. 2, 335-355.
  Data: Fortune 100 firm based in the Midwest undertaking an organizational transformation; 173 responses from the top echelons to the open-ended questions of a survey.
  Method and key findings: Used the Dictionary of Affect in Language (DAL). Qualitative and quantitative measurement of emotions. Showed that content-analytic measures of emotion corresponded to more traditional measures.

Palmer, Ian, Boris Kabanoff, and Richard Dunford (1997)
  Title: Managerial Accounts of Downsizing.
  Source: Journal of Organizational Behavior, Vol. 18, SI, 623-640.
  Data: Content analysis of 502 annual reports of Australian organizations over the 1986-92 time period; 87 firms involved.
  Method and key findings: Both manual and computerized content analysis; ISYS software package. Nine downsizing themes grouped in three major dimensions: strategic language, process language, and cost versus consideration.

Schomburg, August, Ken G. Smith, and Curtis M. Grimm (1994)
  Title: Avoiding New Product Warfare: The Role of Industry Structure.
  Source: Advances in Strategic Management, Vol. 10B, 145-173.
  Data: 83 new product introductions and 632 responses in the brewing, LD communications, and PC industries between 1975-90; SCA of the Predicast F&S Index by SIC codes.
  Method and key findings: Computerized SCA. Importance of industry variables in explaining the intensity of product rivalry at the product level.

Wade, James B., Joseph F. Porac, and Timothy G. Pollock (1997)
  Title: Worth, Words and the Justification of Executive Pay.
  Source: Journal of Organizational Behavior, Vol. 18, SI, 641-664.
  Data: 266 companies drawn from the S&P 500 at the end of 1992; proxy statements and S&P Compustat.
  Method and key findings: Used VBPro software. Tested relationships between type of justification, executive pay, and performance measures.

Wolfe, Richard A. (1991)
  Title: The use of content analysis to assess corporate social responsibility.
  Source: In Jim Post (ed.), Research in Corporate Social Performance and Policy, Vol. 13, Greenwich, CT: JAI, 281-307.
  Data: Letters to shareholders in electronic format accessed from the Disclosure Database.
  Method and key findings: Illustration of a seven-step methodology to conduct content analysis.

Young, Greg, Ken G. Smith, and Curtis M. Grimm (1996)
  Title: "Austrian" and Industrial Organization Perspectives on Firm-level Competitive Activity and Performance.
  Source: Organization Science, Vol. 7, No. 3, 243-254.
  Data: 1,903 competitive moves in the software industry (SIC 7371, 7372, 7373) from 1983 to 1991; citations in the Predicast F&S Index.
  Method and key findings: Computerized SCA. Test of Austrian versus S/C/P models: competitive activity increases performance, industry competition decreases performance, and cooperation increases firm competitive activity.

** Indicates manual content analysis.
Figure 1: Map of 33 CATA packages

[Figure: a two-dimensional map (Dimension 1 by Dimension 2) positioning 33 CATA packages in three labeled clusters: search tools (AskSam, Concordance, Dataflight, CrossReader, dtSearch, ISYS, Metamorph, MonoConc, PL, SonarPro, Prospector, ZyIndex); qualitative data analysis tools (Aquad5, ATLASti, EZText, CodeAText, Ethnograph, HyperResearch, Storyspace, NUD*IST, NVivo, WinMax); and content analysis tools (CatPac, Diction, DIMAP, Childes, SphinxSurvey, TATOE, TextAnalyst, TextPack, TextSmart, WinAta, WordStat). Axis annotations contrast standard versus advanced functionality, quantitative versus qualitative orientation, economical versus expensive pricing, and unbundled versus integrated design. TATOE, NVivo, and dtSearch, the three packages compared in Table 3, are highlighted.]