MEASURING ACADEMIC RESEARCH IN CANADA
Alex Usher
Higher Education Strategy Associates
IREG-7
Warsaw, Poland – May 17, 2013
The Problem
When making institutional comparisons, biases can arise both from institutional size and from the distribution of fields of study.
Can we compare institutional research output in a way that controls for both size and field of study?
YES
Basic methodology
Simple 2-indicator system: publication (H-index) and research income (granting councils)
Data gathered at the level of the individual researcher, not the institution.
Every researcher is given a score for his/her performance relative to the average of his/her discipline; scores are then summed and averaged per institution.
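The deck doesn't spell out the normalization formula; a minimal Python sketch of one plausible reading (each researcher's raw value divided by the national average for his/her discipline, then averaged per institution; all names are illustrative):

```python
from collections import defaultdict
from statistics import mean

def institution_scores(researchers):
    """researchers: dicts with 'institution', 'discipline', and a raw
    'value' (an H-index or grant income). Returns each institution's
    mean field-normalized score."""
    # National average of the raw value within each discipline
    by_discipline = defaultdict(list)
    for r in researchers:
        by_discipline[r["discipline"]].append(r["value"])
    avg = {d: mean(vals) for d, vals in by_discipline.items()}

    # Score each researcher relative to his/her discipline average,
    # then average the scores within each institution
    by_institution = defaultdict(list)
    for r in researchers:
        by_institution[r["institution"]].append(r["value"] / avg[r["discipline"]])
    return {inst: mean(s) for inst, s in by_institution.items()}
```

On this reading, a score of 1.0 means an institution performs exactly at the national average for its mix of fields, which matches the scale of the tables below.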
Publication Metric: H-Index
“A scientist has index h if h of his/her Np papers have at least h citations each, and the other (Np − h) papers have no more than h citations each” (i.e., the largest possible number N such that the scientist has N papers with N or more citations).

Ex. 1
Publication 1: 5 citations
Publication 2: 4 citations
Publication 3: 3 citations
Publication 4: 2 citations
H-Index: 3

Ex. 2
Publication 1: 10 citations
Publication 2: 2 citations
Publication 3: 2 citations
Publication 4: 2 citations
H-Index: 2
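A minimal Python sketch of the computation, checked against both examples:

```python
def h_index(citations):
    """Largest h such that h papers have at least h citations each."""
    h = 0
    for i, c in enumerate(sorted(citations, reverse=True), start=1):
        if c >= i:
            h = i
        else:
            break
    return h

assert h_index([5, 4, 3, 2]) == 3   # Ex. 1
assert h_index([10, 2, 2, 2]) == 2  # Ex. 2
```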
H-Index (pros and cons)
Pros
- Discounts publications with little or no impact
- Discounts sole publications with very high impact
Cons
- Requires a large, accurate, cross-referenced database (labour-intensive)
- Age bias (less of a concern on aggregates)
- Differences in publication cultures (can be fixed)
- Not very useful in disciplines with low publication cultures
The HiBar Database
Pipeline: faculty lists → standardized discipline names → automated collection & calculation → manual correction → analysis
Example: Dr. Joshua Barker
Barker, Joshua D.
Associate Professor
University of Toronto
Sociocultural anthropology, violence & power, crime & policing, theories of modernity, anthropology of technology, nationalism, urban studies; Indonesia, South East Asia
• Simple automated search → 129 (1000+ pubs)
• Add advanced filtering and Boolean logic → 43 (800+ pubs)
• Manual elimination of false positives, excluded publication types, etc. → 2 (5 pubs)
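The actual queries aren't shown in the deck; a hypothetical Python sketch of the Boolean-filtering step (record fields and filter terms are invented for illustration):

```python
def boolean_filter(records):
    """Step 2: narrow a broad name-match result set with affiliation
    and discipline filters (Boolean AND). 'records' would come from
    the simple automated name search in step 1."""
    keywords = {"anthropology", "indonesia", "urban"}  # illustrative terms
    return [r for r in records
            if r["affiliation"] == "University of Toronto"
            and keywords & set(r["subject_terms"])]

# Step 3 (manual elimination of false positives and excluded
# publication types) is done by hand on whatever survives.
```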
The Canadian Prestige Hierarchy
Institution: ARWU/THE tier
- Toronto: 1
- British Columbia: 2
- McGill: 3
- Alberta, McMaster, Montreal, Waterloo: 2nd tier
- Dalhousie, Laval, Queen’s, Simon Fraser, Calgary, Western, Guelph, Manitoba, Ottawa, Saskatchewan, Victoria: 3rd tier
- Laval, Carleton, Quebec, UQAM, Concordia: other major institutions
Science-Engineering H-Index
Rank Institution Score   Rank Institution Score
1 UBC 1.509 11 McMaster 1.197
2 Toronto – St. G 1.504 12 Trent 1.160
3 Montreal 1.500 13 Scarborough 1.153
4 McGill 1.327 14 Manitoba 1.057
5 Simon Fraser 1.306 15 Trois-Rivieres 1.054
6 Waterloo 1.257 16 Alberta 1.026
7 Ottawa 1.254 17 Western 0.996
8 York 1.208 18 Concordia 0.992
9 Queen’s 1.200 19 Laval 0.989
10 Rimouski 1.200 20 UQAM 0.967
Arts H-Index
Rank Institution Score   Rank Institution Score
1 UBC 1.927 11 Concordia 1.244
2 Toronto – St. G 1.647 12 Trent 1.238
3 McGill 1.629 13 Mississauga 1.219
4 Queen’s 1.533 14 Scarborough 1.192
5 Alberta 1.370 15 Carleton 1.162
6 McMaster 1.364 16 Manitoba 1.130
7 York 1.331 17 Montreal 1.096
8 Guelph 1.320 18 Calgary 1.070
9 Simon Fraser 1.312 19 Saskatchewan 1.054
10 Waterloo 1.289 20 Western 1.016
Medicine
We did not cover medical fields: the way certain institutions list staff at associated teaching hospitals made it impossible to generate comparable staff lists.
Research Income
Collected data on peer-evaluated individual grants (i.e., excluding major institutional allocations for equipment, etc.) made by the two main granting councils (SSHRC and NSERC) over a period of three years.
Data were then field-normalized using the same process as for the H-Index.
Research Income (pros and cons)
Pros
- Publicly available, third-party data with personal identifiers
- Based on a peer-review system designed to reward excellence
Cons
- Issues with respect to cross-institutional awards
- Ignores income from private sources, which may be substantial
Science-Engineering Income
Rank Institution Score   Rank Institution Score
1 UBC 1.640 11 Guelph 1.250
2 Ottawa 1.623 12 McMaster 1.230
3 Montreal 1.572 13 Waterloo 1.229
4 Alberta 1.465 14 Queen’s 1.216
5 Toronto- St. G 1.447 15 Simon Fraser 1.206
6 Calgary 1.359 16 Scarborough 1.187
7 Rimouski 1.295 17 Carleton 1.139
8 Saskatchewan 1.292 18 Western 1.093
9 McGill 1.281 19 Sherbrooke 1.011
10 Laval 1.272 20 Chicoutimi 0.969
Arts Income
Rank Institution Score   Rank Institution Score
1 McGill 2.258 11 Calgary 1.305
2 UBC 2.206 12 Dalhousie 1.263
3 Montreal 1.944 13 Laval 1.263
4 Guelph 1.901 14 Queen’s 1.105
5 Alberta 1.895 15 Ottawa 1.090
6 McMaster 1.799 16 Waterloo 1.065
7 Toronto – St. G 1.733 17 Carleton 0.991
8 York 1.615 18 Rimouski 0.971
9 Concordia 1.582 19 Scarborough 0.953
10 Simon Fraser 1.372 20 Western 0.951
Science-Engineering Total
Rank Institution Score   Rank Institution Score
1 UBC 100 11 Queen’s 76.85
2 Montreal 97.63 12 Scarborough 74.40
3 Toronto – St. G 93.97 13 Calgary 73.26
4 Ottawa 91.05 14 Laval 71.55
5 McGill 83.05 15 Saskatchewan 70.15
6 SFU 80.04 16 Guelph 66.88
7 Rimouski 79.24 17 Western 66.34
8 Waterloo 79.14 18 York 65.97
9 Alberta 78.67 19 Carleton 62.01
10 McMaster 77.18 20 Concordia 59.67
Arts Total
Rank Institution Score   Rank Institution Score
1 UBC 98.84 11 Queen’s 64.25
2 McGill 92.26 12 Waterloo 57.03
3 Toronto – St. G 81.83 13 Calgary 56.65
4 Alberta 77.52 14 Dalhousie 54.09
5 Guelph 76.35 15 Carleton 51.27
6 Montreal 75.32 16 Scarborough 51.26
7 McMaster 75.22 17 Trent 48.36
8 York 70.29 18 Western 47.42
9 Concordia 67.15 19 Mississauga 47.15
10 Simon Fraser 64.44 20 Ottawa 46.06
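The deck doesn't say how the H-Index and income scores were combined into these 0-100 totals. Purely as an assumption, a common recipe is an equal-weighted average of the two field-normalized scores, rescaled against the top performer:

```python
def total_scores(h_scores, income_scores, weight=0.5):
    """h_scores, income_scores: institution -> field-normalized score.
    Equal weighting and top-performer rescaling are assumptions here,
    not the deck's documented method."""
    combined = {inst: weight * h_scores[inst] + (1 - weight) * income_scores[inst]
                for inst in h_scores}
    top = max(combined.values())
    return {inst: 100 * s / top for inst, s in combined.items()}
```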
Controversies (1)
The double-count issue. In an initial draft, we used a record count of staff rather than a head count (the former is higher because of cross-appointments), which led to questions; a miniature illustration follows below.
The part-time professor issue. Many objected to our inclusion of part-time staff in the totals, so we re-did the numbers without them…
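The record-count vs. head-count gap in miniature (names are hypothetical):

```python
# Appointment records: a cross-appointed professor appears twice
records = [("Smith", "Sociology"), ("Smith", "Law"), ("Tremblay", "History")]

record_count = len(records)                      # 3: inflated by the cross-appointment
head_count = len({name for name, _ in records})  # 2: one entry per person
```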
NSERC Scores (revised)
New Rank Institution Old Rank   New Rank Institution Old Rank
1 UBC 1 11 Rimouski 7
2 Toronto-St. G 3 12 McMaster 10
3 Montreal 2 13 Queen’s 11
4 SFU 6 14 York 18
5 McGill 5 15 Guelph 16
6 Ottawa 4 16 Saskatchewan 15
7 Alberta 9 17 Manitoba 27
8 Waterloo 8 18 Trent 21
9 Laval 14 19 Western 17
10 Calgary 13 20 Concordia 20
SSHRC Scores (revised)
New Rank Institution Old Rank   New Rank Institution Old Rank
1 McGill 2 11 Concordia 9
2 UBC 1 12 Calgary 13
3 Toronto-St.G 3 13 Waterloo 12
4 Guelph 5 14 Laval 21
5 Alberta 4 15 Ottawa 20
6 McMaster 7 16 Dalhousie 14
7 Montreal 6 17 UQAM 43
8 Queen’s 11 18 Trent 17
9 Simon Fraser 10 19 Carleton 15
10 York 8 20 Western 18
The Philosophical Part
Who is a university?
Whose performance gets included in a ranking says something about who one believes embodies a university. Should it include:
FT faculty only? PT faculty? Emeritus faculty? Graduate students?
At the moment, most ranking systems’ decisions on this are driven by data-collection methodology.
Do all subjects matter equally?
Field-normalization implies that they do. But is this correct? Are some fields more central to the creation of knowledge than others? Should some fields be privileged when making inter-institutional comparisons?
Does Size Matter?
Does aggregation of talent bring benefits of its own, independent of the quality of the people being aggregated?
Where Does Greatness Lie?
On whose work should institutional reputation be based? Its best scholars, or all of its scholars?
Norming for size implicitly rewards schools with good average professors; failure to norm is more likely to reward a few “top” professors.