Where Should I Publish?


Elaine M. Lasda Bergman
Bibliographer for Social Welfare, Gerontology and Dewey Reference
elasdabergman@albany.edu

March 29, 2012

Introduction

• Many factors go into choosing a journal:

– Relevance

– Prestige

– Who will accept my article?

– Visibility

Identifying Key Journals in the field

• Two basic approaches:

– Reputation approach

– Bibliometric approach

REPUTATION APPROACH

Reputation approach

• Perceived quality of journals by scholars within field

• Evaluative

• Editorial board composition

• Where journal is indexed (which databases)

Where to find this type of ranking

• Journal articles: surveys, polls, etc. assessing expert opinion, recognition, and perceived value of journals in the field

• Subject database searches

• Anecdotal info from mentors, colleagues, peers

• Journal website (for editorial board)

• Ulrich’s (for database indexing)

• ASK YOUR SUBJECT BIBLIOGRAPHER

Why use Reputation approach?

• “Name Brand”

– Publishing in a well-known journal implies visibility

• Interdisciplinary fields or narrow subfields

– Bibliometric measurements may not adequately reflect the influence or prestige of journals that fall outside traditional disciplinary lines

• Can be a mark of “quality,” as opposed to the influence, prestige, and other attributes captured by bibliometric indicators

• Can be more important than bibliometrics

Disadvantages to reputation approaches

• Difficult to quantify if no published studies, or the studies are dated

• In a small subspecialty, the broader group of academics in your discipline may not know your journal

• Emerging fields may not have journals with any sort of reputation

• Subjective nature of expert opinion

BIBLIOMETRIC APPROACH

What is bibliometrics?

• Scholarly communication: tracing the history and evolution of ideas from one scholar to another

• Measures the scholarly influence of articles, journals, scholars

The birth of citation analysis

• Eugene Garfield, the “father of citation analysis,” developed the first bibliometric index tools

• Citation indexes and Journal Citation Reports

– “ISI Indexes”: Science Citation Index, Social Sciences Citation Index, Arts & Humanities Citation Index

• Coverage is better for the hard sciences than for the social sciences, and weaker still for the humanities

Garfield’s metrics

• Citation count

• Impact Factor

• Immediacy Index

• Citation Half-Life

Citation count

• Number of times cited within a given time period

– Author

– Journal

• Does not take into account

– Materials not included in citation database

– Self citations
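
Not from the slides, but as a rough illustration of what a raw citation count does (and does not) capture, the sketch below counts citations to a target author within a time window and can optionally drop self-citations. All field names and data are hypothetical.

    # Hypothetical sketch: count citations to a target author within a time window,
    # optionally dropping self-citations. Field names and data are invented.
    def citation_count(citing_records, target_author, start_year, end_year,
                       exclude_self=False):
        count = 0
        for rec in citing_records:                      # each record cites the target's work
            if not (start_year <= rec["year"] <= end_year):
                continue                                # outside the time period
            if exclude_self and target_author in rec["authors"]:
                continue                                # skip self-citations
            count += 1
        return count

    records = [
        {"year": 2010, "authors": ["Smith"]},           # a self-citation for Smith
        {"year": 2011, "authors": ["Jones"]},
        {"year": 2009, "authors": ["Lee"]},             # outside the 2010-2012 window
    ]
    print(citation_count(records, "Smith", 2010, 2012))                     # 2
    print(citation_count(records, "Smith", 2010, 2012, exclude_self=True))  # 1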

Impact factor

• Measures “impact” of a journal (not an article) within a given subject

• Formula is a ratio:

– Number of citations in a given year to articles the journal published in the previous 2 years, divided by the number of scholarly articles the journal published in those 2 years
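
A worked example of that ratio, with invented numbers (not from the slides):

    # Illustrative 2-year Journal Impact Factor calculation; numbers are invented.
    citations_2012_to_2010_2011 = 450   # citations received in 2012 to items published in 2010-2011
    articles_published_2010_2011 = 300  # scholarly articles the journal published in 2010-2011

    impact_factor_2012 = citations_2012_to_2010_2011 / articles_published_2010_2011
    print(impact_factor_2012)  # 1.5 -> the "average" recent article was cited 1.5 times in 2012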

Concerns with Impact Factor

• Cannot be used for cross-disciplinary comparisons (per Garfield himself), due to different rates of publication and citation

• Two year time frame not adequate for non-scientific disciplines

• Coverage of some disciplines not sufficient in the ISI databases

• Is a measure of “impact” a measure of “quality”?

Immediacy index

• What it’s supposed to measure: how quickly articles in a given journal have an impact on the discipline

• Formula: the average number of times an article in a journal in a given year was cited in that same year
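
Again with invented numbers, the arithmetic looks like this:

    # Illustrative Immediacy Index: same-year citations divided by same-year articles.
    # Numbers are invented.
    citations_in_2012_to_2012_articles = 60
    articles_published_in_2012 = 240

    immediacy_index_2012 = citations_in_2012_to_2012_articles / articles_published_in_2012
    print(immediacy_index_2012)  # 0.25 -> articles pick up about a quarter of a citation in year one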

Citation Half-Life

• What it’s supposed to measure: duration of relevance of articles in a given journal

• Formula: median age of articles cited for a particular journal in a given year
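
A small sketch of the median-age idea, using invented ages:

    # Illustrative cited half-life: the median age (in years) of the articles a
    # journal's citations point to in a given year. Ages are invented sample data.
    from statistics import median

    ages_of_cited_articles = [1, 2, 2, 3, 4, 5, 7, 9, 12, 15]
    print(median(ages_of_cited_articles))  # 4.5 -> half of the cited material is 4.5 years old or newer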

TWENTY FIRST CENTURY TOOLS

Influence of Google Page Rank

• Eigenvector analysis:

– “The probability that a researcher, in documenting his or her research, goes from a journal to another selecting a random reference in a research article of the first journal. Values obtained after the whole process represent a ‘random research walk’ that starts from a random journal to end in another after following an infinite process of selecting random references in research articles. A random jump factor is added to represent the probability that the researcher chooses a journal by means other than following the references of research articles.” (González-Pereira et al., 2010)
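
As a minimal sketch of that “random research walk” (the general PageRank-style idea only, not the exact Eigenfactor or SJR algorithm), the code below runs a power iteration over a small, invented journal citation matrix with a random-jump factor:

    # Minimal PageRank-style "random research walk" over a tiny, invented journal
    # citation matrix. This shows the general idea, not the actual Eigenfactor or
    # SJR computation.
    import numpy as np

    # cites[i][j] = citations from journal i to journal j (invented numbers)
    cites = np.array([[0, 4, 1],
                      [2, 0, 3],
                      [5, 1, 0]], dtype=float)

    transition = cites / cites.sum(axis=1, keepdims=True)  # where a reader following references goes next
    jump = 0.15                                            # chance of choosing a journal by other means
    n = len(cites)

    rank = np.full(n, 1.0 / n)                             # start the walk from a uniform distribution
    for _ in range(100):                                   # power iteration toward the steady state
        rank = jump / n + (1 - jump) * (rank @ transition)

    print(rank)  # steady-state share of the "random research walk" spent at each journal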

Sources Using ISI Data

Eigenfactor.org

http://libguides.library.albany.edu/content.php?pid=60086&sid=441804

• Uses ISI data

• Similar to PageRank

• Listed in JCR as of 2009

• Eigenfactor Score :

– Influence of the citing journal divided by the total number of citations appearing in that journal

• Example: Neurology (2006): score of .204 = an estimated 0.2% of all citation traffic of journals in JCR (Bergstrom & West, 2008).

• Larger journals will have more citations and therefore will have larger eigenfactors

Article Influence Score

• From Eigenfactor: measure of prestige of a journal

• Average influence per article of the papers in a journal

• Comparable to the Impact Factor

• Corrects for the issues of journal size in the raw Eigenfactor score

• Neurology’s 2006 Article Influence Score = 2.01, i.e., an average article in Neurology is roughly 2X as influential as the average article in all of JCR (worked through in the sketch below)
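
A rough back-of-the-envelope check of the size correction, under the assumed relation Article Influence ≈ 0.01 × Eigenfactor ÷ (journal’s share of JCR articles); only the two Neurology figures come from the slides, the relation and the derived share are assumptions.

    # Back-of-the-envelope check using the Neurology figures above, under the
    # ASSUMED relation: article_influence = 0.01 * eigenfactor_score / article_share.
    eigenfactor_score = 0.204   # ~0.2% of all JCR citation traffic (Bergstrom & West, 2008)
    article_influence = 2.01    # average Neurology article ~2x the JCR average

    article_share = 0.01 * eigenfactor_score / article_influence
    print(article_share)  # ~0.001 -> Neurology would account for roughly 0.1% of JCR's articles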

Journal Citation Reports (JCRWeb)

• Library website -> Databases -> Search by Name -> J

• http://library.albany.edu

NEW SOURCES FOR CITATION INFORMATION

Scopus

Google Scholar

Scopus: alternate database of citation data

• Review panel, i.e., quality control

• Bigger pool than ISI: covers all the journals in WoS and more

• Strongest in the “hard” sciences; social science coverage has ostensibly improved; arts and humanities are “getting there”

• Algorithmically determined with human editing

Scopus analytics

• SJR/SCIMago

• SNIP

• Citation Count

• Document count

• % Not Cited

• % Review Articles (not original research)

SNIP (Source Normalized Impact Per Paper)

• Journal ranking based on citation analysis, adjusted for how frequently the other journals in the field cite (the field being all journals that cite this particular journal)

• SNIP is defined as the ratio of the journal’s citation count per paper and the citation potential in its subject field. (Moed, 2009)
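
A toy illustration of that ratio (invented numbers; the “citation potential” value here is just a placeholder for the field-level figure Moed defines):

    # Illustrative SNIP-style ratio: the journal's citations per paper divided by the
    # citation potential of its subject field. Numbers are invented.
    journal_citations_per_paper = 3.0   # raw impact per paper for this journal
    field_citation_potential = 2.0      # how heavily papers in this field tend to cite
                                        # covered journals (the field-level correction)

    snip_like_value = journal_citations_per_paper / field_citation_potential
    print(snip_like_value)  # 1.5 -> above-average impact once field citation habits are factored out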

SJR:SCImago Journal Rank

• What it’s supposed to measure: the current “average prestige per paper”

• The SCImago website uses journal/citation data from Scopus, and the metric is also available within Scopus

• Formula: the citation time window is 3 years, instead of the 2 years used by the JIF

• Corrections for self citations

• Strong correlation to JIF

SCImago Journal Rank

• Prestige factors include the number of journals in the database, the number of papers from the journal in the database, and the citations and “importance” received from other journals; the raw value is size dependent: larger journals have greater prestige values (see the sketch after this list)

• Normalized by the number of significant works published by the journal: helps correct for size variations

• Corrections made for journal self citations
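
The sketch below ties these pieces together with invented data: prestige flows through citations weighted by the citing journal’s prestige, journal self-citations are zeroed out, and the result is divided by each journal’s paper count. It is a simplified illustration, not the actual SCImago computation.

    # Simplified SJR-style sketch: prestige circulates through citations, journal
    # self-citations are removed, and the final value is "prestige per paper".
    # Data are invented; this is not the real SCImago algorithm.
    import numpy as np

    cites = np.array([[6, 4, 2],      # cites[i][j] = citations from journal i to journal j
                      [1, 9, 3],      # (diagonal entries are journal self-citations)
                      [5, 2, 7]], dtype=float)
    papers = np.array([120.0, 40.0, 80.0])             # papers each journal published in the window

    np.fill_diagonal(cites, 0.0)                       # correction for journal self-citations
    share = cites / cites.sum(axis=1, keepdims=True)   # how each journal distributes its citations

    prestige = np.full(3, 1.0 / 3)
    for _ in range(100):                               # iterate until prestige values stabilize
        prestige = prestige @ share
        prestige /= prestige.sum()

    print(prestige / papers)  # size-normalized "prestige per paper" for each journal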

Scopus

• Library website -> Databases -> Search by Name -> S

• http://library.albany.edu

Google Scholar: alternate database of citation data

• No rhyme or reason to what is included

• Biggest source of citation data

• Foreign language sources

• Sources other than scholarly journals

• Entirely algorithmically determined, no human editing

• AVAILABLE METRICS NOT GOOD FOR JOURNAL RANKING

Google Scholar

• Publish or Perish

• CIDS

Publish or Perish

• Provides a variety of metrics for measuring scholarly impact and output.

• More useful for metrics on authors than journals or institutions

• Uses Google Scholar citation information

• Useful for interdisciplinary topics, fields relying heavily on conference papers or reports, non-English language sources, new journals, etc.

• Continuously updated since 2006

Publish or Perish Metrics

• Basic metrics:

– # of papers, # of citations, active years, years since first publication, average # of citations per paper, average # of citations per year, average # of citations per author, etc.

• Complex metrics

– h-index and its many variations: m-quotient, g-index (corrects the h-index for variations in citation patterns), AR-index, AW-index, etc. (the h- and g-index are sketched after this list)

• Does not have any corrections for SELF CITATIONS
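
The two most widely used of these, the h-index and g-index, can be computed directly from a list of per-paper citation counts; the sketch below uses invented numbers.

    # Sketch of two of the "complex metrics": h-index and g-index, computed from an
    # invented list of per-paper citation counts.
    def h_index(citations):
        # largest h such that h papers each have at least h citations
        cites = sorted(citations, reverse=True)
        return sum(1 for i, c in enumerate(cites, start=1) if c >= i)

    def g_index(citations):
        # largest g such that the top g papers together have at least g^2 citations
        cites = sorted(citations, reverse=True)
        total, g = 0, 0
        for i, c in enumerate(cites, start=1):
            total += c
            if total >= i * i:
                g = i
        return g

    papers = [25, 8, 5, 3, 3, 1, 0]
    print(h_index(papers))  # 3 -> three papers with at least 3 citations each
    print(g_index(papers))  # 6 -> top 6 papers have 45 citations, which is >= 36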

CIDS

• Measures output of authors for prestige and influence

• Similar to PoP

• Corrects for Self-Citations

CIDS metrics

• Citations per year, h-index, g-index, total citations, average cites per paper, self citations included and excluded, etc.

Why use Bibliometric approach

• Considered empirical evidence of journal use

• Means of tracing the evolution of scholarship in a topic/discipline

Disadvantages to Bibliometric approach

• Prestigious but small journals in a subspecialty may not rank as highly in JCR and other metrics as general publications

• “impact” vs. “quality”

• Editors tend to publish articles that cite their own journal, increasing self-citations and their own ranking

• There are many reasons to cite a work, not all of them good!

OTHER METRICS

Other Metrics

• Journal Acceptance Rates

– Cabell’s Directory of Publishing Opportunities

• Various disciplines

– Journal website (sometimes)

– Information from professional associations

– ASK YOUR SUBJECT BIBLIOGRAPHER

Other Metrics

• Ulrichsweb

– Comprehensive directory of published journals and periodical literature

– Circulation stats, referee status, publisher, frequency of publication, etc.

Library website -> Databases -> Search by Name -> U

http://library.albany.edu

Other Metrics: The Future?

• Online “Clicks” or downloads

• MESUR

What does it all mean for Tenure and Promotion?

• Choosing the right journal is a balancing act

• No one bibliometric indicator is the final word on journal “quality”

• Reputation and bibliometrics in tandem can paint a positive picture of your journal choice

• Bibliometrics can also be applied to an individual scholar’s work

Final thought:

• The ranking or prestige of a journal is *not necessarily* an indicator of the quality of an individual article published in it.

• To judge the quality of an individual article, READ THE ARTICLE

QUESTIONS?

Elaine M. Lasda Bergman
Social Welfare, Gerontology and Dewey Reference Bibliographer
Dewey Graduate Library
elasdabergman@albany.edu
442-3695
