Cite As: Cheang, B, Chu, S.K.W., Li, C. & Lim, A. (in press) "A Multidimensional Approach to Evaluating Management Journals: Refining PageRank via the Differentiation of Citation Types and Identifying the Roles that Management Journals Play." Journal of the American Society for Information Science and Technology.
A Multidimensional Approach to Evaluating Management Journals: Refining PageRank via the Differentiation of Citation Types and Identifying the Roles that
Management Journals Play
Brenda Cheang
Division of Information and Technology Studies, Faculty of Education, University of Hong Kong, Runme Shaw Building, Pokfulam Road, Hong Kong
Telephone: +852 6423 3389, Fax: +852 2517 7194, E-mail: [email protected]

Samuel Kai Wah Chu
Division of Information and Technology Studies, Faculty of Education, University of Hong Kong, Runme Shaw Building, Pokfulam Road, Hong Kong
Telephone: +852 2241 5894, Fax: +852 2517 7194, E-mail: [email protected]

Chongshou Li*
Department of Management Sciences, College of Business, City University of Hong Kong, Tat Chee Ave, Kowloon, Hong Kong
Telephone: +852 6350 5292, Fax: +852 3442 0189, E-mail: [email protected]

Andrew Lim
Department of Management Sciences, College of Business, City University of Hong Kong, Tat Chee Ave, Kowloon, Hong Kong
Telephone: +852 3442 8248, Fax: +852 3442 0189, E-mail: [email protected]
Abstract
In this paper, we introduce two citation-based approaches to facilitate a multidimensional evaluation of 39 selected management journals. The first is a refined application of PageRank via the differentiation of citation types. The second is a form of mathematical manipulation to identify the roles that the selected management journals play. Our findings reveal that Academy of Management Journal, Academy of Management Review, and Administrative Science Quarterly are the clear top three management journals, respectively. We also discover that these three journals play the role of knowledge hub in the domain. Finally, when compared to ISI’s JCR, our results more closely matched expert opinions.
Keywords: PageRank; Journal Evaluation; Citation Analysis; Journal Influence; Journal Quality; Journal Impact; Impact Factor
_________________________________________________
*Corresponding Author.
Introduction
Since Coe and Weinstock’s (1969) study, evaluations of management journals have
periodically been reported in print. Such efforts, although discipline-focused, are not unique
to the management academic community1. In fact, domain-specific journal evaluations are
increasingly being reported in scientific communication outlets because of the immense
practical and intellectual utility they provide for today’s academic stakeholders (DuBois &
Reeb, 2000; Leydesdorff, 2008; Xu, Cheang, Lim & Wen, 2011).
But at the same time, the concept of journal evaluation has become a subject of contentious debate (Seglen, 1997; Glanzel & Moed, 2002). In reality, a journal’s quality or impact may be perceived differently from stakeholder to stakeholder (Robey & Markus, 1998); while some may be planted unflinchingly on the high-citation-count bandwagon, others may sit on the publishing-in-a-perceived-quality-journal one, and still others may consider academic research to be of impact only if the undertaking culminates in industrial applicability (Rousseau, 2002; Weingart, 2005).
Consequently, in a recently published manuscript, Moed, Colledge, Reedijk, Moya-Anegon, Guerrero-Bote, Plume and Amin (2012) assert that there is mounting consensus among bibliometricians that the concept of journal evaluation is so multifaceted, and therefore complex, that it “cannot be captured in one single metric” (p. 368). It is with this caveat in mind that we introduce two citation-based approaches to facilitate a multidimensional evaluation of 39 selected management journals. The first is the application of an emerging technique known as PageRank; recent studies have shown not only that journal evaluations produced via this technique more closely match expert opinions, but also that the technique can be further refined for the assessment of management journals via the differentiation of citation types. The second involves a form of mathematical manipulation to establish the roles that the 39 selected management journals play.
The remainder of the paper is organized as follows. In Section 2, a review of the
literature pertaining to PageRank and the mathematical mapping of journal roles is
provided. Section 3 describes the PageRank methodology and the extension of the
technique via the differentiation of citation types. This is followed by a description of the
methodology in identifying the roles that journals play. Section 4 establishes the
delimitations of the study. In Section 5, we present the evaluation results and highlight
insights of our analysis. This is followed by a comparison of our results to those of ISI’s.
Finally, Section 6 concludes the paper.
Literature Review
PageRank
Origins
Although it is not well known, the creators of the PageRank algorithm were inspired by Pinski and Narin’s (1976) work in citation analysis when researching the development of web search engines (Page, Brin, Motwani & Winograd, 1998). In particular, Pinski and Narin
(1976) postulated that citations are not equally prized; that the value of citations by an
article published in a reputable scientific outlet (i.e. journal) should outweigh citations by
an article published in an inferior one (Xu et al., 2011). Drawing from this concept, Brin and
Page (1998) founded Google upon the development and implementation of the PageRank
algorithm to ascertain the level of importance of a given webpage in a given webgraph
network. Summarily, it is owing to the popularity and applicability of Google’s PageRank method, which is based on Pinski and Narin’s (1976) conjecture that citations are not equally valued, that Pinski and Narin’s (1976) work is enjoying a renaissance in recent bibliometrics research2 (Butler, 2008).
Transposability of PageRank in Webgraph Networks and Citation Networks
With the exception of permanence in publication citations, the behavior of visiting/linking
webpages is observed to be similar to the behavior of reading/citing scientific publications.
In the context of webgraph networks, PageRank is an algorithm implemented on a given webgraph network consisting of vertices and edges, where vertices represent pages on the World Wide Web (WWW) and edges represent hyperlinks (Page, Brin, Motwani & Winograd, 1998). Based on the algorithm and web activity, the rank value of a given webpage is
computed to indicate the level of importance of that particular webpage. Thus, a hyperlink
boosts the level of influence of a given linked webpage (Langville & Meyer, 2004). Odom
(2013) simplifies this concept by explaining that webpage P’s PageRank value is ascertained
from the following three elements:
1. The number of hyperlinks to webpage P
2. The PageRank values of the webpages that link to webpage P
3. The outgoing hyperlinks from the webpages that link to webpage P
In essence, webpage P receives a sizable PageRank boost from webpages that have high PageRank values themselves and that make only a few links to other webpages. Alternatively, being linked by webpages with low PageRank values but numerous outbound links would also, in a small way, increase webpage P’s PageRank value, assuming webpage P does not link to those webpages (Langville & Meyer, 2004). In other words, the PageRank algorithm is built upon the principle of transitive relationships in a given webgraph network (Cheang, 2013).
In the context of citation networks, PageRank is thus implemented on a given citation network consisting of nodes and edges, where nodes represent journals and edges represent the citation relationship between any two journals (Cheang, 2013). Based on the algorithm
and citation activity, the derived value of a given journal is indicative of the level of prestige
of that particular journal. As it is with web activity, PageRank factors the transitive
relationships among citations, mapping citations to their originating sources. This is
equivalent to apportioning “credit where credit is due” (Xu et al., 2011).
Refining PageRank via the Differentiation of Citation Types
Although PageRank enables the determination of a journal’s impact or quality, the
procedure to analyze the transitivity of citations does not actually establish what kind(s) of
influence or impact a journal has (Cheang, 2013). For instance, a journal may receive a
significant volume of self-citations (where articles in a journal cite the same journal)
because it is perhaps a highly specialized journal. Or, a journal may be influential because it
is not only well cited within its core discipline (known as internal citations), it is also highly
cited outside of its core discipline (known as external citations). Consequently, while we are
able to order journals based on the transitivity (and therefore, influence) of citations, we are
unable to pinpoint exactly why a journal is influential. Therefore, with Moed et al.’s (2012)
assertion in mind, we surmise that by differentiating between the various citation types (i.e.
self-citations, internal citations, and external citations), we may then pinpoint the nature of
a journal’s impact or quality; that through this differentiation process, we enable the
development of a metric flexible enough to accommodate multidimensional views (Cheang,
2013).
Identifying Journal Roles
Origins
In a recent PageRank-based study, Lim et al. (2009) proposed the manipulation of a given
journal’s PageRank value and its citation percentage to establish the role that the particular
journal plays. In their study, the authors assert that journals may be knowledge sources,
knowledge hubs, or knowledge stores. Cheang (2013) describes the characteristics of each
role in the following:
i. Knowledge Source
A knowledge source is a journal that receives significantly more citations from other journals in a core journals list (CJ) than it makes to them. Moreover, a knowledge source cites very few other journals despite exerting tremendous influence in the CJ. Hence, a knowledge source is an extremely influential journal.
ii. Knowledge Hub
As its name suggests, a knowledge hub is a journal that not only exchanges knowledge within a particular domain, it also exchanges knowledge with other disciplines. Thus, because a knowledge hub is highly cited and also cites journals in and out of the CJ, it exerts tremendous influence in a CJ.
iii. Knowledge Store
A knowledge store is a journal that cites many other journals but is itself cited by relatively few. As such, review journals tend to be knowledge stores.
Identifying Roles of Management Journals
This method is an extension of PageRank in the sense that the PageRank value of a given journal, along with its citation percentage, is mathematically manipulated to establish the role that the given journal plays. We include this method in our paper as it produces a
different kind of evaluation that adds to our understanding, and therefore, enriches our
perception of a given journal (Cheang, 2013). Additionally, because the results of this
method are graphically captured, we would be able to visualize and grasp the results more
easily.
Methodology
PageRank for Journal Evaluation
PageRank is a model designed to harness the various citation data available. However, the construction of a citations network database is first required. Figure 1 is a pictorial representation of a citations network consisting of three journals.
“Insert Figure 1 here”
Construction of the Citations Graph Network
Let us provide a simple illustration of constructing a citations graph network. Figure 1 is a citations graph network containing three journals. The nodes represent journals whilst the edges represent the citations and their directionality. Let Ci,j represent the number of times journal i cites journal j. Thus, edge (i,j) reflects the direct influence or impact of journal j on journal i; in the case of C1,2 in Figure 1, it simply refers to the number of times journal 1 cites journal 2.
Formulation
To solve the problem, we can use the random walk method or matrix multiplication
iteratively. Both methods share a similar philosophy in that they are recursive in nature but
one is based on randomly selecting an initial starting point while the other is based on
matrix multiplication. Thus, for each journal i, the recursive formulation can be expressed
by the following:
PR_i^(x+1) = Σ_{j=1}^{n} p_{j,i} PR_j^(x)

In this equation, the impact of journal i at iteration x+1 (denoted PR_i^(x+1)) is the sum, over every journal j, of journal j’s impact PR_j^(x) multiplied by the proportion p_{j,i} of journal j’s citations that are made to journal i.
We now further illustrate the inner-workings of this equation. In Figure 1, we have
three journals (J1, J2, and J3) in the citations network, where the citations and their
directionality are as follows: C1,1 = 5, C1,2 = 2, C1,3 = 3; C2,1 = 4, C2,2 = 2, C2,3 = 4; C3,1 = 1, C3,2 = 4,
C3,3 = 5.
Plugging the citation data into the equation, we get the following:
For Journal 1, we have PR_1^(x+1) = (p_{1,1} PR_1^(x)) + (p_{2,1} PR_2^(x)) + (p_{3,1} PR_3^(x)), which means that PR_1^(x+1) = (5/10)PR_1^(x) + (4/10)PR_2^(x) + (1/10)PR_3^(x)

For Journal 2, we have PR_2^(x+1) = (p_{1,2} PR_1^(x)) + (p_{2,2} PR_2^(x)) + (p_{3,2} PR_3^(x)), which means that PR_2^(x+1) = (2/10)PR_1^(x) + (2/10)PR_2^(x) + (4/10)PR_3^(x)

For Journal 3, we have PR_3^(x+1) = (p_{1,3} PR_1^(x)) + (p_{2,3} PR_2^(x)) + (p_{3,3} PR_3^(x)), which means that PR_3^(x+1) = (3/10)PR_1^(x) + (4/10)PR_2^(x) + (5/10)PR_3^(x)

Here, each proportion is the citation count divided by the citing journal’s total outgoing citations; for example, p_{2,1} = C_{2,1}/(C_{2,1}+C_{2,2}+C_{2,3}) = 4/10.

Finally, when PR_i^(x) stabilizes/converges at the x-th iteration,

PR_i^(x) = Journal i’s influence (JI_i)
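The recursion above can be sketched in a few lines of Python. This is a minimal, undamped power iteration on the three-journal example from Figure 1; production PageRank implementations typically add a damping factor, which is omitted here for clarity.

```python
# Minimal power-iteration sketch of the journal PageRank recursion, using
# the three-journal citation counts from Figure 1.
C = [[5, 2, 3],   # citations made by J1 to J1, J2, J3
     [4, 2, 4],   # citations made by J2
     [1, 4, 5]]   # citations made by J3

n = len(C)
totals = [sum(row) for row in C]
# p[j][i]: proportion of journal j's outgoing citations made to journal i
p = [[C[j][i] / totals[j] for i in range(n)] for j in range(n)]

pr = [1.0 / n] * n                      # uniform starting vector
for _ in range(1000):                   # PR_i <- sum_j p[j][i] * PR_j
    new = [sum(p[j][i] * pr[j] for j in range(n)) for i in range(n)]
    if max(abs(a - b) for a, b in zip(new, pr)) < 1e-12:
        break
    pr = new

print([round(v, 4) for v in pr])        # → [0.3077, 0.2821, 0.4103]
```

On this toy network the values converge to roughly 0.31, 0.28 and 0.41 for J1, J2 and J3, respectively.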
Upon determining JI_i, we calculate the Article PageRank Score (or APRS) of journal i by dividing JI_i by the number of articles journal i publishes per year, for the period under consideration. The APRS score for journal i, in turn, determines the rank order in which journal i is placed in a given journal set list.
Refining PageRank via the Differentiation of Citation Types
We use the following example to illustrate the various citation types: Consider the citation
network shown in Figure 1. Let (J1) be Academy of Management Journal, (J2) be Academy of
Management Review and (J3) be Operations Research. Let us also define Academy of
Management Journal and Academy of Management Review as core journals to represent the
domain of management. The citations between these two journals are internal citations.
Let us also define that J3 is not a core journal to management. Therefore, the citations
between J1 and J3, or, J2 and J3 are external citations. With these definitions, the citation
relationships may be summarized as shown in Table 1.
“Insert Table 1 here”
Identifying Knowledge Roles that Journals Play
Formulation
We define the percentage of citations made to core journals by journal i as PTC_i. PTC_i is thus formulated as follows:

PTC_i = ( Σ_{j∈CJ} C_{i,j} / Σ_{j∈UJ} C_{i,j} ) × 100%.
Based on the illustration provided in Table 1, the percentage of citations made to the
core for the three journals may be computed as follows:
For J1: PTC_1 = (C_{1,1} + C_{1,2}) / (C_{1,1} + C_{1,2} + C_{1,3}) × 100% = 70%

For J2: PTC_2 = (C_{2,1} + C_{2,2}) / (C_{2,1} + C_{2,2} + C_{2,3}) × 100% = 60%

For J3: PTC_3 = (C_{3,1} + C_{3,2}) / (C_{3,1} + C_{3,2} + C_{3,3}) × 100% = 50%
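The PTC computation is straightforward to express in code. In this sketch the three-journal network of Figure 1 stands in for the universal set, and the core set CJ contains J1 and J2 (indices 0 and 1), as defined in the text.

```python
# Citation-to-core percentage PTC_i for the Figure 1 example.
C = [[5, 2, 3],   # citations made by J1 to J1, J2, J3
     [4, 2, 4],   # citations made by J2
     [1, 4, 5]]   # citations made by J3
core = {0, 1}     # J1 and J2 are the core journals (CJ)

def ptc(i):
    to_core = sum(C[i][j] for j in core)   # citations journal i makes to CJ
    total = sum(C[i])                      # citations journal i makes in total
    return 100.0 * to_core / total

print([ptc(i) for i in range(3)])          # → [70.0, 60.0, 50.0]
```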
Therefore, based on its APRS value and citation percentages, a journal is positioned in
one of four quadrants in a graph. In general, journals positioned within the top left quadrant
are considered knowledge stores, while journals positioned within the top right quadrant
are considered knowledge hubs. Journals positioned within the bottom left quadrant are
indistinguishable journals while journals positioned within the bottom right quadrant are
considered knowledge sources.
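The quadrant rule above can be made concrete with a small classifier. The cutoff values here are purely illustrative assumptions; the paper reads the quadrants off Figure 2 graphically rather than from fixed numeric thresholds.

```python
# Hypothetical quadrant classifier: the x-axis is the APRS value and the
# y-axis is the percentage of citations made to the core (PTC).
def role(aprs, ptc, aprs_cut=0.05, ptc_cut=50.0):
    influential = aprs >= aprs_cut       # right half of the graph
    core_oriented = ptc >= ptc_cut       # top half of the graph
    if core_oriented and influential:
        return "knowledge hub"           # top right
    if core_oriented:
        return "knowledge store"         # top left
    if influential:
        return "knowledge source"        # bottom right
    return "indistinguishable"           # bottom left

print(role(0.10, 70.0))                  # → knowledge hub
```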
Delimitations of Study
In applying our proposed approaches, we consider three important components: (1) data
sources and datasets; (2) period and parameter settings; and (3) computing PageRank
scores. These components set the boundaries for our study and are described below:
Data Sources and Datasets
Presently, data sources are primarily available from Google Scholar, SCOPUS, and the
Institute of Scientific Information (ISI). Of these, we have elected to use citation data from
the ISI’s Journal Citation Report (JCR) for two main reasons: Firstly, in order to differentiate
between citation types, we require incoming and outgoing citations at the journal level. At
present, ISI’s database is the easiest of the three to extract data from. Secondly, ISI is not only the oldest and most established indexing agency in the world; its indexing criteria are probably the most stringent, and its journal database is immense (at over 12,000 journals to date).
As for the datasets that are required for this study, we require two journal sets, namely
a universal set denoted as UJ and a core set denoted as CJ. The UJ is composed of all journals
indexed by the ISI’s JCR while the CJ is composed of 39 selected management journals based
on the number of times they appeared in previous related studies3 as well as informal
interviews with a number of management academics.
Period and Parameter Settings
Besides the UJ and CJ, we establish the various relevant parameters for our study. First, we
consider the time windows relevant to our study. In order to obtain a timely evaluation of
management journals, we focus our study on the citations from articles published in 2010
as they are quite recent. Additionally, a study on citation phenomenon by Amin and Mabe
(2000) revealed that “Citations to articles published in a given year rise sharply to a peak
between two and six years after publication” (p. 2). Based on this finding, we concentrate our
analysis on the citations that articles received in the four years prior to 2010 as well as the
citations received in year 2010. Therefore, we consider citations from citing papers
published in 2010 to cited papers published in the time period from 2006 to publication
time of citing papers, which is 2010. The following examples clarify the period settings:
We disregard a paper published in 2010 that cites paper B published in 2005, because the cited time does not satisfy our criteria.

We disregard a paper published in 2009 that cites paper B published in 2008, because the citing time does not satisfy our criteria.

We take into account a paper published in 2010 that cites paper B published in 2007, as the citing and cited times both satisfy our criteria.

We take into account a paper published in 2010 that cites paper B published in 2010, as the citing and cited times both satisfy our criteria.
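The four examples above reduce to a simple predicate, which the sketch below encodes directly.

```python
# A citation is kept only if the citing paper was published in 2010 and the
# cited paper was published between 2006 and 2010 inclusive.
def keep(citing_year, cited_year):
    return citing_year == 2010 and 2006 <= cited_year <= 2010

# The four cases from the text:
print(keep(2010, 2005), keep(2009, 2008), keep(2010, 2007), keep(2010, 2010))
# → False False True True
```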
Next, we establish the citation parameter settings. In particular, we denote the self-
citation parameter as S and the external citation parameter as E. We are interested in four
parameter settings, where (S = E = 0), (S = E = 1), (S = 1, E = 0), and (S = 0, E = 1). Having said that, internal citations are, by default, set at 1. This is because all citations are evaluated based on the core journals set list. Therefore, setting (S = E = 0) indicates that only internal citations are considered, while self-citations and external citations are excluded; setting (S = E = 1) implies that all citations are considered; setting (S = 0, E = 1) indicates that external citations, relative to internal citations, are considered; and setting (S = 1, E = 0) indicates that self-citations, relative to internal citations, are considered.
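One plausible way to realize these parameter settings is to scale the raw citation counts by citation type before running PageRank. The function below is a sketch under that assumption (the paper does not spell out its exact implementation); the journal indices and core set are illustrative.

```python
# Hypothetical weighting of citation counts by type. S scales self-citations,
# E scales external citations; internal citations are fixed at weight 1.
def weight_citations(C, core, S, E):
    n = len(C)
    W = [[0] * n for _ in range(n)]
    for i in range(n):
        for j in range(n):
            if i == j:
                W[i][j] = S * C[i][j]            # self-citation
            elif i in core and j in core:
                W[i][j] = C[i][j]                # internal citation
            else:
                W[i][j] = E * C[i][j]            # external citation
    return W

C = [[5, 2, 3], [4, 2, 4], [1, 4, 5]]            # Figure 1 counts
print(weight_citations(C, {0, 1}, S=0, E=0))     # internal citations only
# → [[0, 2, 0], [4, 0, 0], [0, 0, 0]]
```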
Results and Analyses
Results of 39 Selected Management Journals Based on the Differentiation of Citation Types
Table 2 reports the ratings and APRS scores of the 39 selected management journals, generated using the parameters described in Section 4.
“Insert Table 2 here”
Analysis of the Results via the Differentiation by Citation Types
Top Five Management Journals
First, we examine the top five management journals based on the results as shown in Table
2. Tagged with the ^ symbol, the top three journals, namely AMJ, AMR, ASQ, post identical
ratings on each of the four (parameter) settings. The fourth and fifth ranked journals are
not as clearly determined. If the ratings are based on the consideration of all citation types (S = E = 1), JAP is ranked fourth while SMJ is ranked fifth. Alternatively, if only internal
citations (S = E = 0) are considered, JM and OS would then be ranked fourth and fifth,
respectively.
From these findings, we note that the APRS values across all four (parameter) settings of the top three journals are in the vicinity of 0.1 on average, while JAP and SMJ have APRS values in the vicinity of 0.05 on average, and JM and OS post APRS values slightly lower than 0.05 on average. Because the APRS values of the top three journals are significantly higher than those of the other journals in the CJ, they exert tremendous influence within the CJ, receiving many citations directly and indirectly from the CJ. Figure 2 provides a
visual representation of the relative placements of the selected journals in the CJ.
Summarily, our findings correlate with those by Podsakoff et al. (2005), where “…by 1999,
AMJ and AMR had joined ASQ at the top, and this group was followed by another group
comprised of SMJ, JAP...”(p. 487).
Insights Related to External Citations
Next, we discuss the various journals tagged by the * symbol and the # symbol in Table 2.
First, we discuss the ones tagged with *. We note that the ratings of CMR, HBR, IEETEM, JAP, JBR, JVB, OBHDP, ORM, and RP are around three to five placements higher when both internal and external citations (S = 0, E = 1) are considered than when the ratings are based on internal citations only (S = E = 0). The findings suggest that these journals are more influential
outside of the CJ. Indeed, whereas OBHDP is ranked seventh and JAP is ranked fourth overall
(S = E = 1) in this CJ, the overall placement of OBHDP is second, while JAP is third in the field
of applied psychology for the year 2010 (Journal-ranking.com, 2013). And in another
example, RP is ranked 18th overall in the CJ but is ranked sixth overall in the field of planning
and development for the year 2010 (Journal-ranking.com, 2013). Accordingly, for this CJ,
because of the better ratings of these journals when both internal and external citations are
considered, their overall placements are improved when compared to their ratings for when
only internal citations are considered.
As for the journals tagged by the # symbol (namely, BJIR, DS, IR, and MS), their various placements noted in Table 2 suggest that they share similar rating trends with the journals tagged by the * symbol. The primary difference is that the rating differences of these journals
are significantly higher (between 9 and 13 placements). Take DS, for example. In 2010, DS
was ranked 26th overall in the field of information systems, and ranked 10th overall in the
field of operations research and management science (Journal-ranking.com, 2013). Its
overall ratings are significantly higher in other disciplines when compared solely to its
internal rating (which is 37th) in this CJ. Consequently, DS scores an overall rating of 27th in
the CJ. Similarly, we note the journal MS, which is rated 18th when only internal citations are
considered. In 2010, MS ranked sixth overall in the field of information systems, and ranked
first overall in the field of operations research and management science (Journal-
ranking.com, 2013). As a result, MS is ranked 9th overall in this CJ even though it ranks only 18th when its impact is considered solely on the basis of internal citations. These examples, as well as the findings concerning all the journals tagged by the # symbol, indicate that
these journals are well regarded enough to be considered for inclusion into this CJ even
though the research published by these journals may not have as much influence compared
to the other selected journals in the CJ. They may very well be far more influential in other
disciplines, making them (pure) core journals in some other disciplines. Conversely, these
findings may also suggest that there are some journals that are actually core journals in the
domain but they are simply not influential at all. These findings are also graphically
illustrated in Figure 2, which depicts their indistinguishable or borderline indistinguishable
influence in the CJ.
Insights Related to Self-citations
In this segment, we briefly discuss the following five journals, namely JBR, LQ, LRP, MS, and
RP. Tagged with the ** symbol in the (S = 1, E = 0) column, these journals are rated between
four and five placements higher under the self-citation setting (S = 1, E = 0) than when only internal citations (S = E = 0) are considered. These findings suggest
that the proportion of self-citations to internal citations is large. On the one hand, such a
finding is not alarming if the journal is an influential journal (determined by their APRS
values). But if a journal is not influential and still receives high self-citations, a plausible
explanation is that the journal is highly specialized. Alternatively, it may expose a journal as playing the citation manipulation game, as has been observed in numerous studies (Weingart, 2005; Smith, 2006; Public Library of Science Medicine Editors, 2006; Lynch, 2010).
Results of the Roles that the 39 Selected Management Journals Play
Based on their APRS scores (on the x-axis) and their citation-to-core percentages (on the y-axis), Figure 2 shows the roles that the 39 selected management journals play.
“Insert Figure 2 here”
Analysis of the Roles that the 39 Selected Management Journals Play
In general, the higher the percentage of citations to the CJ and the higher the APRS value of a particular journal, the more that journal plays the role of a knowledge hub. In this case, we note from Figure 2 that AMJ, AMR and ASQ are not only positioned in the ‘knowledge hub’ quadrant, they also visually correspond to their no. 1, 2 and 3 rankings, respectively.
Next we note the bottom left quadrant, or the ‘indistinguishable’ quadrant. As
mentioned in our insights related to external citations, journals in this quadrant are well
regarded enough to be selected for inclusion into this CJ even though the research published
by these journals may not have as much influence compared to the other selected journals
in the CJ. And as shown with our various examples, these journals may very well be
significantly more influential in other core disciplines. Conversely, these findings may also
suggest that there are some journals that are actually core journals in the domain but they
are simply not influential at all. Reading into which category an indistinguishable journal
falls into could be based on the reader’s past experiences with a particular journal and/or
based on rankings of the journal in other fields that it is also featured in. With that being
said, we note that the position of HBR is indeterminable because it traditionally does not
publish references for its featured articles (Lim et al., 2009).
Finally, we note the knowledge store quadrant where the remaining journals in the CJ
are full-fledged knowledge stores (i.e. LRP, GOM, JMS, ORM, JOB, PP), approaching
knowledge hubs (i.e. SMJ, OM, JM), or borderline knowledge stores (i.e. IJHRM, HRM, JVB,
OBHDP, CMR, IEETEM). We also note that there are no knowledge sources in the CJ.
A Comparison Between PageRank and ISI’s JCR Rank Orders
In this subsection, we compare our results with that of ISI’s. Table 3 in Annex 1 shows the
rankings and ranking disparities of the 39 selected journals obtained via our approach as
well as ISI’s JCR for 2010. Because the ISI’s IF method does not differentiate between
citation types, we use the results based on the parameter setting of (S = E = 1), as this
parameter setting considers all citation types, for a comparable analysis.
First, we discuss some of the broad findings in Table 3. Overall, the journals tagged by
the * symbol indicate that PageRank’s ratings of those journals are between (at least) 5 and
(at most) 17 placements higher than those produced by ISI, while the journals tagged by the
** symbol indicate that PageRank’s ratings of those journals are between (at least) 5 and (at
most) 10 placements lower than those produced by ISI.
Next, we used Spearman’s Rank Correlation Coefficient (Spearman, 1904), Kendall’s Tau
Rank Correlation Coefficient (Kendall, 1938), Normalized Spearman’s Footrule Distance
(Dwork, Kumar, Naor & Sivakumar, 2001) and Normalized Weighted Footrule Distance
(Langville & Meyer, 2012) to analyze the ranking disparities of our results with that of ISI’s.
Through the Spearman’s Rank Correlation Coefficient and Kendall’s Tau Rank Correlation
Coefficient, the two rankings are highly correlated. Based on the Normalized Spearman’s
Footrule Distance, which considers not only the deviation but also the position where the
deviation occurs, the ranking gaps appear to be small. However, because the value of the
Normalized Weighted Footrule Distance is larger than the Normalized Spearman’s Footrule
Distance, it suggests that the deviation between these two rankings occur towards at the
upper tier of these rankings. Table 4 is a summary of the results4.
“Insert Table 4 here”
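For readers who wish to reproduce this kind of comparison, two of the four measures (Spearman’s rank correlation for complete rankings without ties, and the Normalized Spearman’s Footrule Distance) can be computed in a few lines. The rankings below are toy values for illustration, not the paper’s data.

```python
# Spearman's rho for two complete rankings (positions 1..n, no ties) and the
# normalized Spearman footrule distance (0 = identical, 1 = maximally far).
def spearman_rho(r1, r2):
    n = len(r1)
    d2 = sum((a - b) ** 2 for a, b in zip(r1, r2))
    return 1 - 6 * d2 / (n * (n * n - 1))

def normalized_footrule(r1, r2):
    n = len(r1)
    dist = sum(abs(a - b) for a, b in zip(r1, r2))
    return dist / (n * n // 2)           # divide by the maximum possible distance

ours = [1, 2, 3, 4, 5]                   # illustrative rankings only
isi  = [1, 2, 5, 3, 4]
print(spearman_rho(ours, isi))           # → 0.7
```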
With that being said, we note that ASQ, which has consistently been viewed as one of the top three journals in management5, is similarly ranked third via our method but manages to rank only 9th via ISI’s JCR. Indeed, according to our results, other top-rated journals such as SMJ, OBHDP, and MS suffer similar ranking disparities as ASQ under the ISI’s ranking methodology. The disparity in rankings is said to arise because the ISI’s method only takes citation frequency into account, completely ignoring citation impact (Seglen, 1997; Pendlebury, 2009; Xu et al., 2011).
Conclusions
In this study, we introduced two citation-based approaches to enable a multidimensional
evaluation of 39 selected management journals. In particular, we refined the application of
PageRank via the differentiation of citation types and identified the roles that the selected
management journals play. Our findings concur especially with those produced by
Podsakoff et al. (2005) that AMJ, AMR, ASQ have clearly remained the top three management
journals, respectively. We also discovered that these three journals play the role of
knowledge hub in the domain. This means that they not only exchange knowledge in a
particular domain, they also extensively exchange knowledge with other disciplines. In
addition, when compared to ISI’s JCR, our results more closely matched those from past
studies.
Notwithstanding, we note a limitation in terms of data sources. Presently, we are
confined to evaluating only journals indexed by ISI, since their data organization allows us to
extract citing and cited information efficiently; journals not indexed by ISI therefore
cannot be evaluated.
Finally, the proposed multidimensional approach to journal evaluation is a new
concept. In this light, we aim to do three things next: promote the concept to other
disciplines, encourage academic stakeholders to adopt and provide input on it, and
improve upon or introduce new approaches that strengthen it.
Endnotes
1 See Olson (2005) for operations management journals; Rainer & Miller (2005) for management information systems journals; Eliades & Athanasiou (2001) for biomedical journals; Bollen, Rodriguez, & Van de Sompel (2006) for dermatology journals; DuBois and Reeb (2000) for international business journals; and Brown (2003) for accounting and finance journals.
2 For PageRank-based journal evaluations, see Bollen et al. (2006); Chen, Xie, Maslov, and Redner (2007); Ma, Guan, and Zhao (2008); Lim, Ma, Wen, Xu, and Cheang (2009); and Xu et al. (2011). For the assessment of author impact via PageRank, see Ding, Yan, Frazho, and Caverlee (2009); Ding (2011a, 2011b); and Yan and Ding (2011).
3 Namely, Coe & Weinstock, 1984; Sharplin & Mabry, 1985; Macmillan & Stern, 1987; Extejt & Smith, 1990; Johnson & Podsakoff, 1994; Tahai & Meyer, 1999; Podsakoff et al., 2005; Singh et al., 2007; and Harris, 2008.
4 Detailed results can be found in Annex 2.
5 Expert surveys include Coe and Weinstock's (1984) study, which found ASQ to be a top-two
management journal, and Extejt and Smith's (1990) study, which found ASQ to be a top-three
behavioral-based management journal. Citation-based studies include Johnson and
Podsakoff (1994), which found ASQ to consistently be one of the two most influential journals
in management and showed that SMJ had significantly improved its ranking; Tahai and Meyer
(1999), which found that top-rated SMJ accounted for 11% of all journal citations under study;
and Podsakoff, Mackenzie, Bachrach, and Podsakoff (2005), in which ASQ, AMR, AMJ, JAP,
OBHDP, and SMJ were consistently rated as top-tier journals over the past two decades.
References
Amin, M., & Mabe, M. (2000). Impact factors: Use and abuse. Perspectives in Publishing,
1(October), 1-6. Elsevier Science, Oxford, UK.
Bollen, J., Rodriguez, M., & Van de Sompel, H. (2006). Journal Status. Scientometrics,
69(3), pp. 669–687.
Brin, S., & Page, L. (1998). The anatomy of a large-scale hypertextual Web search engine.
Computer Networks and ISDN Systems. 30(1-7), 107–117.
Brown, L. (2003). Ranking Journals Using Social Science Research Network Downloads.
Review of Quantitative Finance and Accounting, 20, 291-307.
Butler, D. (2008). Free journal-ranking tool enters citation market. Nature, 451(7174), 6.
Cheang, B. (2013). A Proposed Multidimensional Information System Framework for
Journal Evaluations: A Case Study in the Field of Education and Educational Research. (PhD
Thesis [in Progress], The University of Hong Kong, Hong Kong).
Chen, P., Xie, H., Maslov, S., & Redner, S. (2007). Finding Scientific Gems with Google’s
PageRank Algorithm. Journal of Informetrics, 1, 8-15.
Coe, Robert & Irwin Weinstock. (1969). Evaluating journal publication: Perceptions
versus reality. AACSB Bulletin 27(3) pp. 23–37.
Coe, Robert & Irwin Weinstock. (1984). Evaluating the management journals: A second
look. The Academy of Management Journal 27(3) pp. 660–666.
Diaconis, P., & Graham, R. L. (1977). Spearman's footrule as a measure of
disarray. Journal of the Royal Statistical Society. Series B (Methodological), 262-268.
Ding, Y. (2011a). Applying weighted PageRank to author citation networks. Journal of
the American Society for Information Science and Technology, 62(2), 236-245.
Ding, Y. (2011b). Topic-based PageRank on author cocitation networks. Journal of the
American Society for Information Science and Technology, 62(3), 449-466.
Ding, Y., Yan, E., Frazho, A., & Caverlee, J. (2009). PageRank for ranking authors in co-
citation networks. Journal of the American Society for Information Science and
Technology, 60(11), 2229-2243.
DuBois, F. & Reeb, D. (2000). Ranking the international business journals. Journal of
International Business Studies, 31(4), 689-704.
Dwork, C., Kumar, R., Naor, M., & Sivakumar, D. (2001, April). Rank aggregation methods
for the web. In Proceedings of the 10th International Conference on World Wide Web (pp.
613-622). ACM.
Eliades, T. & Athanasiou, A. (2001). Impact Factor: A Review with Specific Reference to
Orthodontic Journals. Journal of Orofacial Orthopedics, 62(1), 76.
Extejt, Marian M. & Jonathan E. Smith. (1990). The behavioral sciences and
management: An evaluation of relevant journals. Journal of Management, 16(3), 539-551.
Glanzel, W. & Moed, H. (2002). Journal impact measures in bibliometric research.
Scientometrics, 53(2), 171-193.
Harris, C. (2008). Ranking the management journals. Journal of Scholarly Publishing,
39(4), 1710-1116.
Johnson, J.L. & Podsakoff, P.M. (1994). Journal influence in the field of management: An
analysis using Salancik's index in a dependency network. The Academy of Management
Journal, 37(5), 1392-1407.
Journal-ranking.com. (2013). Results of various journals were taken from their online
calculations. Retrieved May 10, 2013, from
http://www.journal-ranking.com/ranking/searchCommonRanking.html
Kendall, M. G. (1938). A new measure of rank correlation. Biometrika, 30(1/2), 81-93.
Langville, A. & Meyer, C. (2004). Deeper inside PageRank. Internet Mathematics, 1(3),
335-380.
Langville, A. N., & Meyer, C. D. (2012). Who's #1?: The Science of Rating and
Ranking (Chapter 16, pp. 201-214). Princeton University Press.
Leydesdorff, L. (2008). Caveats for the use of citation indicators in research and journal
evaluations. Journal of the American Society for Information Science and Technology, 59(2),
278-287.
Lim, A., Ma, H., Wen, Q., Xu, Z., & Cheang, B. (2009). Distinguishing citation quality for
journal impact assessment. Communications of the ACM, 52(8), 111-116.
Lynch, J.G. (2010). Frivolous journal self-citation. Retrieved from http://ama-
academics.communityzero.com/elmar?go=2371115
Ma, N., Guan, J. & Zhao, Y. (2008). Bringing PageRank to the citation analysis.
Information Processing and Management, 44, 800–810.
Macmillan, I.C. & Stern, I. (1987). Delineating a forum for business policy scholars.
Strategic Management Journal, 8(2) 183–186.
Moed, H. F., Colledge, L., Reedijk, J., Moya-Anegon, F., Guerrero-Bote, V., Plume, A., &
Amin, M. (2012). Citation-based metrics are appropriate tools in journal assessment provided
that they are accurate and used in an informed way. Scientometrics, 92(2), 367-376.
Odom, S. (2013) SEO Consulting. Retrieved August 29, 2013 from
http://www.seopt.com/articles/pagerank.html
Olson, J. (2005). Top-25-Business-School Professors Rate Journals in Operations
Management and Related Fields. Interfaces, 35(4), 323–338.
Page, L., Brin, S., Motwani, R., & Winograd, T. (1998). The PageRank citation ranking:
Bringing order to the web (Tech. Rep.). Stanford, CA: Stanford Digital Library Technologies
Project.
Pendlebury, D. (2009). The use and misuse of journal metrics and other citation
indicators. Archivum Immunologiae et Therapiae Experimentalis, 57, 1–11.
Pinski, G., & Narin, F. (1976). Citation influence for journal aggregates of scientific
publications: Theory, with application to the literature of physics. Information Processing &
Management, 12(5), 297-312.
Podsakoff, Philip M., Scott B. Mackenzie, Daniel G. Bachrach, Nathan P. Podsakoff.
(2005). The influence of management journals in the 1980s and 1990s. Strategic
Management Journal. 26(5) pp. 473–488.
Public Library of Science Medicine Editors. (2006). The impact factor game. Public
Library of Science Medicine, 3(6), 291
Rainer, R. & Miller, M. (2005). Examining Differences Across Journal Rankings.
Communications of the ACM (CACM), 48(2), 91-94.
Robey, D. & Markus, M. (1998). Beyond Rigor and Relevance: Producing Consumable
Research about Information Systems. Information Resources Management Journal, 11(1), 7-
15.
Rousseau, R. (2002). Journal Evaluation: Technical and Practical Issues. Library Trends,
50(3), 418-439.
Seglen, P. (1997). Why the impact factor of journals should not be used for evaluating
research. BMJ: British Medical Journal, 314(7079), 499.
Sharplin, Arthur D., Rodney H. Mabry. (1985). The relative importance of journals used
in management research: An alternative ranking. Human Relations, 38(2) 139–149.
Singh, Gangaram, Kamal M. Haddad, & Chee W. Chow. (2007). Are articles in top
management journals necessarily of higher quality? Journal of Management Inquiry, 16(4)
319–331.
Smith, R. (2006). Commentary: the power of the unrelenting impact factor – is it a force
for good or harm? International Journal of Epidemiology, 35(5), 1129-30.
Spearman, C. (1904). The proof and measurement of association between two
things. The American journal of psychology, 15(1), 72-101.
Tahai, A. & Meyer M.J. (1999). A revealed preference study of management journals’
direct influences. Strategic Management Journal. 20(3) pp. 279–296.
Weingart, P. (2005). Impact of bibliometrics upon the science system: Inadvertent
consequences? Scientometrics, 62(1), 117-131.
Xu, Z., Cheang, B., Lim, A., & Wen, Q. (2011). Evaluating OR/MS Journals via
PageRank. Interfaces, 41(4), 375-388.
Yan, E., & Ding, Y. (2011). Discovering author impact: A PageRank
perspective. Information Processing & Management, 47(1), 125-134.
Annex 1
“Insert Table 3 here”
Annex 2
Four Different Distance Measures for Two Rankings
We formulate Spearman's Rank Correlation Coefficient, Kendall's Tau Rank
Correlation Coefficient, the Normalized Spearman Footrule Distance, and the Normalized
Weighted Footrule Distance as follows.
Spearman's Rank Correlation Coefficient (Spearman, 1904):

$$\rho = \frac{\sum_{i=1}^{n}\bigl(x(i)-\bar{x}\bigr)\bigl(y(i)-\bar{y}\bigr)}{\sqrt{\sum_{i=1}^{n}\bigl(x(i)-\bar{x}\bigr)^{2}\,\sum_{i=1}^{n}\bigl(y(i)-\bar{y}\bigr)^{2}}}$$

where $x$ and $y$ are two full ranking lists of $n$ items; $x(i)$ is the rank position of item $i$ in
ranking $x$; $y(i)$ is the rank position of item $i$ in ranking $y$; and $\bar{x}=\frac{1}{n}\sum_{i}x(i)$, $\bar{y}=\frac{1}{n}\sum_{i}y(i)$.
Clearly, $-1\le\rho\le 1$. If $\rho=1$, the two lists totally agree with each other; if $\rho=-1$, the
two lists are completely reversed.
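For two full rankings without ties, as is the case here, Spearman's coefficient has an equivalent closed form in the rank differences that is often easier to compute:

$$\rho = 1 - \frac{6\sum_{i=1}^{n} d_i^{2}}{n(n^{2}-1)}, \qquad d_i = x(i)-y(i).$$

As a worked check, the disparity column of Table 3 gives $\sum_i d_i^{2} = 1696$ for $n = 39$, so $\rho = 1 - \frac{6\times 1696}{39\times 1520} \approx 0.828$, matching the value reported in Table 4.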
Kendall's Tau Rank Correlation Coefficient (Kendall, 1938):

$$\tau = \frac{n_c - n_d}{n(n-1)/2}$$

where $n_c$ is the number of concordant pairs, $n_d$ is the number of discordant pairs, and
$n(n-1)/2$ is the total number of pairs of $n$ items in the ranking. Clearly, $-1\le\tau\le 1$. If
$\tau=1$, the two lists totally agree with each other; if $\tau=-1$, the two lists are completely
reversed.
Spearman Footrule Distance (Diaconis & Graham, 1977):

$$F(l,k) = \sum_{i=1}^{n}\bigl|l(i)-k(i)\bigr|$$

where $l$ and $k$ are two full ranking lists of $n$ items; $l(i)$ is the rank position of item $i$ in
ranking $l$; and $k(i)$ is the rank position of item $i$ in ranking $k$.
Normalized Spearman Footrule Distance (Dwork, et al., 2001):

$$\tilde{F}(l,k) = \frac{\sum_{i=1}^{n}\bigl|l(i)-k(i)\bigr|}{n^{2}/2}$$

This value is obtained by dividing the footrule distance by its maximum possible value, so
the normalized Spearman Footrule distance always lies between 0 and 1. 0 means that two
rankings are in perfect agreement, while 1 means two rankings completely disagree with
each other.
Normalized Weighted Footrule Distance (Langville & Meyer, 2012):

The Weighted Footrule Distance of two full ranking lists $l$ and $k$ is

$$\phi(l,k) = \sum_{i=1}^{n}\frac{\bigl|l(i)-k(i)\bigr|}{\min\{l(i),k(i)\}}$$

and the Normalized Weighted Footrule Distance is

$$\tilde{\phi}(l,k) = \frac{\phi(l,k)}{2(n+1)\sum_{i=1}^{\lfloor n/2\rfloor}\frac{1}{i} \;-\; 4\left\lfloor\frac{n}{2}\right\rfloor}$$

where the denominator is the maximum weighted footrule distance of two full rankings.
The normalized weighted footrule distance is always between 0 and 1. 0 indicates that two
rankings perfectly agree with each other, while 1 indicates that two rankings are completely
reversed.
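As a concrete check, the first three measures can be computed directly from the two rank orders listed in Table 3 (the weighted footrule follows analogously, using the Langville & Meyer normalization above as the denominator). The plain-Python sketch below transcribes the PageRank (S = E = 1) and ISI JCR 2010 rankings in Table 3's row order and reproduces the corresponding values in Table 4:

```python
import math

# Rank positions of the 39 journals, in the row order of Table 3, under the
# PageRank (S = E = 1) ranking and ISI's JCR 2010 ranking.
pagerank = [1, 2, 3, 22, 35, 26, 27, 21, 23, 28, 25, 36, 39, 37, 17, 4, 34,
            16, 14, 10, 24, 15, 12, 30, 29, 20, 31, 13, 33, 38, 9, 7, 6, 8,
            19, 11, 18, 5, 32]
isi_jcr = [2, 1, 9, 37, 31, 28, 20, 19, 16, 25, 29, 33, 32, 39, 34, 5, 26,
           22, 4, 8, 35, 6, 17, 38, 23, 13, 24, 12, 27, 36, 21, 15, 3, 7,
           18, 11, 14, 10, 30]

def spearman_rho(x, y):
    """Spearman's rank correlation coefficient (Spearman, 1904)."""
    n = len(x)
    xb, yb = sum(x) / n, sum(y) / n
    num = sum((xi - xb) * (yi - yb) for xi, yi in zip(x, y))
    den = math.sqrt(sum((xi - xb) ** 2 for xi in x) *
                    sum((yi - yb) ** 2 for yi in y))
    return num / den

def kendall_tau(x, y):
    """Kendall's tau (Kendall, 1938): concordant minus discordant pairs."""
    n = len(x)
    nc = nd = 0
    for i in range(n):
        for j in range(i + 1, n):
            s = (x[i] - x[j]) * (y[i] - y[j])
            nc += s > 0
            nd += s < 0
    return (nc - nd) / (n * (n - 1) / 2)

def normalized_footrule(l, k):
    """Normalized Spearman footrule distance (Dwork et al., 2001)."""
    n = len(l)
    return sum(abs(li - ki) for li, ki in zip(l, k)) / (n * n / 2)

print(round(spearman_rho(pagerank, isi_jcr), 3))         # 0.828, as in Table 4
print(round(kendall_tau(pagerank, isi_jcr), 3))          # compare Table 4: 0.644
print(round(normalized_footrule(pagerank, isi_jcr), 3))  # 0.271, as in Table 4
```

Note that the footrule distance here is the raw disparity sum (206) divided by $n^2/2 = 760.5$.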
List of Tables
TABLE 1. Summary of Quantifying the Three Citation Types in FIG.1.
TABLE 2. This table displays the ratings of 39 selected management journals based
on four parameter settings. Columns 1 and 2 show the acronym and full name of the
selected management journals. Columns 3, 5, 7 and 9 show the Article PageRank
Scores (APRS) while columns 4, 6, 8, and 10 show the ratings of each journal based
on one of four parameter settings, where (S = E = 0), (S = 0, E = 1), (S = 1, E = 0), and
(S = E = 1), respectively.
TABLE 3. Comparisons between PageRank and ISI's JCR Rank Orders for 2010.
TABLE 4. Qualitative Deviation between Two Ranking Lists in Table 3.
List of Figures
FIG. 1. Illustration of Citations Graph Network.
FIG. 2. Roles that the 39 Selected Management Journals Play.
TABLE 1. Summary of Quantifying the Three Citation Types in Figure 1.
| Journals | Internal Citations | External Citations | Self-Citations |
|---|---|---|---|
| Core Journals | | | |
| J1, Academy of Management Journal | C2,1 | C3,1 | C1,1 |
| J2, Academy of Management Review | C1,2 | C3,2 | C2,2 |
| Non-core Journals | | | |
| J3, Operations Research | 0 | C1,3 + C2,3 | C3,3 |
TABLE 2. This table displays the ratings of 39 selected management journals based on four parameter settings. Columns 1 and 2 show the acronym and full name of the selected management journals. Columns 3, 5, 7 and 9 show the Article PageRank Scores (APRS) while columns 4, 6, 8, and 10 show the ratings of each journal based on one of four parameter settings, where (S = E = 0), (S = 0, E = 1), (S = 1, E = 0), and (S = E = 1), respectively.
Parameter settings: (S = 0, E = 0) internal-citation only; (S = 0, E = 1) internal-citation + external-citation; (S = 1, E = 0) internal-citation + self-citation; (S = 1, E = 1) internal-citation + self-citation + external-citation.

| Acronym | Name | APRS (S=0, E=0) | Rank | APRS (S=0, E=1) | Rank | APRS (S=1, E=0) | Rank | APRS (S=1, E=1) | Rank |
|---|---|---|---|---|---|---|---|---|---|
| AMJ ^ | Academy of Management Journal | 0.1469 | 1 | 0.1115 | 1 | 0.1472 | 1 | 0.1157 | 1 |
| AMR ^ | Academy of Management Review | 0.1207 | 2 | 0.0991 | 2 | 0.0984 | 2 | 0.0898 | 2 |
| ASQ ^ | Administrative Science Quarterly | 0.1041 | 3 | 0.0833 | 3 | 0.0848 | 3 | 0.0746 | 3 |
| BJIR # | British Journal of Industrial Relations | 0.0035 | 36 | 0.0113 | 25 | 0.0041 | 37 | 0.0109 | 22 |
| BJM | British Journal of Management | 0.0059 | 30 | 0.0068 | 33 | 0.0048 | 34 | 0.0060 | 35 |
| CMR* | California Management Review | 0.0085 | 26 | 0.0120 | 23 | 0.0070 | 28 | 0.0105 | 26 |
| DS # | Decision Sciences | 0.0026 | 37 | 0.0113 | 24 | 0.0030 | 38 | 0.0104 | 27 |
| ETP | Entrepreneurship Theory and Practice | 0.0148 | 19 | 0.0154 | 21 | 0.0152 | 21 | 0.0161 | 21 |
| GOM | Group and Organization Management | 0.0100 | 23 | 0.0097 | 29 | 0.0102 | 22 | 0.0108 | 23 |
| HBR* | Harvard Business Review | 0.0050 | 32 | 0.0102 | 28 | 0.0057 | 31 | 0.0103 | 28 |
| HR | Human Relations | 0.0119 | 21 | 0.0123 | 22 | 0.0095 | 23 | 0.0107 | 25 |
| HRM | Human Resource Management | 0.0072 | 27 | 0.0057 | 35 | 0.0056 | 32 | 0.0049 | 36 |
| IEETEM* | IEEE Transaction on Engineering Management | 0.0015 | 39 | 0.0048 | 36 | 0.0014 | 39 | 0.0044 | 39 |
| IJHRM | International Journal of Human Resource Management | 0.0048 | 34 | 0.0047 | 37 | 0.0048 | 35 | 0.0046 | 37 |
| IR # | Industrial Relations | 0.0048 | 33 | 0.0217 | 14 | 0.0069 | 29 | 0.0221 | 17 |
| JAP* | Journal of Applied Psychology | 0.0450 | 7 | 0.0539 | 4 | 0.0522 | 6 | 0.0563 | 4 |
| JBR* | Journal of Business Research | 0.0025 | 38 | 0.0058 | 34 | 0.0049 | 33** | 0.0065 | 34 |
| JBV | Journal of Business Venturing | 0.0183 | 14 | 0.0212 | 15 | 0.0205 | 16 | 0.0229 | 16 |
| JIBS | Journal of International Business Studies | 0.0181 | 16 | 0.0194 | 16 | 0.0274 | 14 | 0.0236 | 14 |
| JM | Journal of Management | 0.0542 | 4 | 0.0436 | 7 | 0.0405 | 7 | 0.0372 | 10 |
| JMI | Journal of Management Inquiry | 0.0154 | 17 | 0.0103 | 27 | 0.0158 | 19 | 0.0108 | 24 |
| JMS | Journal of Management Studies | 0.0286 | 12 | 0.0226 | 13 | 0.0284 | 12 | 0.0234 | 15 |
| JOB | Journal of Organizational Behavior | 0.0298 | 11 | 0.0303 | 12 | 0.0240 | 15 | 0.0275 | 12 |
| JOOP | Journal of Occupational and Organizational Psychology | 0.0112 | 22 | 0.0108 | 26 | 0.0083 | 25 | 0.0091 | 30 |
| JPIM | Journal of Production Innovation Management | 0.0055 | 31 | 0.0086 | 31 | 0.0063 | 30 | 0.0092 | 29 |
| JVB* | Journal of Vocational Behavior | 0.0133 | 20 | 0.0181 | 17 | 0.0158 | 20 | 0.0183 | 20 |
| JWB | Journal of World Business | 0.0085 | 25 | 0.0088 | 30 | 0.0074 | 26 | 0.0079 | 31 |
| LQ | Leadership | 0.0182 | 15 | 0.0180 | 18 | 0.0311 | 10** | 0.0259 | 13 |
| LRP | Long Range Planning | 0.0061 | 28 | 0.0046 | 38 | 0.0084 | 24** | 0.0067 | 33 |
| ML | Management Learning | 0.0039 | 35 | 0.0038 | 39 | 0.0047 | 36 | 0.0045 | 38 |
| MS # | Management Science | 0.0153 | 18 | 0.0413 | 8 | 0.0280 | 13** | 0.0388 | 9 |
| OBHDP* | Organizational Behavior and Human Decision Process | 0.0342 | 10 | 0.0486 | 6 | 0.0305 | 11 | 0.0423 | 7 |
| ORM* | Organizational Research Methods | 0.0391 | 9 | 0.0504 | 5 | 0.0369 | 8 | 0.0517 | 6 |
| OS | Organization Science | 0.0541 | 5 | 0.0388 | 11 | 0.0533 | 5 | 0.0404 | 8 |
| OST | Organization Studies | 0.0184 | 13 | 0.0170 | 20 | 0.0204 | 17 | 0.0184 | 19 |
| PP | Personnel Psychology | 0.0397 | 8 | 0.0389 | 10 | 0.0316 | 9 | 0.0348 | 11 |
| RP* | Research Policy | 0.0095 | 24 | 0.0176 | 19 | 0.0163 | 18** | 0.0190 | 18 |
| SMJ | Strategic Management Journal | 0.0529 | 6 | 0.0411 | 9 | 0.0716 | 4 | 0.0555 | 5 |
| SMR | Sloan Management Review | 0.0059 | 29 | 0.0068 | 32 | 0.0071 | 27 | 0.0075 | 32 |
TABLE 3. Comparisons between PageRank and ISI's JCR Rank Orders for 2010.

| Acronym | Name | PageRank (S = E = 1) Ranking | ISI JCR IF 2010 Ranking | Disparity (PageRank − IF) |
|---|---|---|---|---|
| AMJ | Academy of Management Journal | 1 | 2 | -1 |
| AMR | Academy of Management Review | 2 | 1 | 1 |
| ASQ* | Administrative Science Quarterly | 3 | 9 | -6 |
| BJIR* | British Journal of Industrial Relations | 22 | 37 | -15 |
| BJM | British Journal of Management | 35 | 31 | 4 |
| CMR | California Management Review | 26 | 28 | -2 |
| DS** | Decision Sciences | 27 | 20 | 7 |
| ETP | Entrepreneurship Theory and Practice | 21 | 19 | 2 |
| GOM** | Group and Organization Management | 23 | 16 | 7 |
| HBR | Harvard Business Review | 28 | 25 | 3 |
| HR | Human Relations | 25 | 29 | -4 |
| HRM | Human Resource Management | 36 | 33 | 3 |
| IEETEM** | IEEE Transaction on Engineering Management | 39 | 32 | 7 |
| IJHRM | International Journal of Human Resource Management | 37 | 39 | -2 |
| IR* | Industrial Relations | 17 | 34 | -17 |
| JAP | Journal of Applied Psychology | 4 | 5 | -1 |
| JBR** | Journal of Business Research | 34 | 26 | 8 |
| JBV* | Journal of Business Venturing | 16 | 22 | -6 |
| JIBS** | Journal of International Business Studies | 14 | 4 | 10 |
| JM | Journal of Management | 10 | 8 | 2 |
| JMI* | Journal of Management Inquiry | 24 | 35 | -11 |
| JMS** | Journal of Management Studies | 15 | 6 | 9 |
| JOB* | Journal of Organizational Behavior | 12 | 17 | -5 |
| JOOP* | Journal of Occupational and Organizational Psychology | 30 | 38 | -8 |
| JPIM** | Journal of Production Innovation Management | 29 | 23 | 6 |
| JVB** | Journal of Vocational Behavior | 20 | 13 | 7 |
| JWB** | Journal of World Business | 31 | 24 | 7 |
| LQ | Leadership | 13 | 12 | 1 |
| LRP** | Long Range Planning | 33 | 27 | 6 |
| ML | Management Learning | 38 | 36 | 2 |
| MS* | Management Science | 9 | 21 | -12 |
| OBHDP* | Organizational Behavior and Human Decision Process | 7 | 15 | -8 |
| ORM | Organizational Research Methods | 6 | 3 | 3 |
| OS | Organization Science | 8 | 7 | 1 |
| OST | Organization Studies | 19 | 18 | 1 |
| PP | Personnel Psychology | 11 | 11 | 0 |
| RP | Research Policy | 18 | 14 | 4 |
| SMJ* | Strategic Management Journal | 5 | 10 | -5 |
| SMR | Sloan Management Review | 32 | 30 | 2 |
TABLE 4. Qualitative Deviation between Two Ranking Lists in Table 3.
| Measures | Value |
|---|---|
| Spearman's Rank Correlation Coefficient | 0.828 |
| Kendall's Tau Rank Correlation Coefficient | 0.644 |
| Normalized Spearman's Footrule Distance | 0.271 |
| Normalized Weighted Footrule Distance | 0.431 |
FIG. 1. Illustration of the Citation Graph Network.
FIG. 2. Roles that the 39 Selected Management Journals Play.