THE CHANGING SOCIAL CONTRACT FOR SCIENCE AND THE EVOLUTION OF THE UNIVERSITY
Ben R. Martin, SPRU (Science and Technology Policy Research), University of Sussex, Brighton BN1 9RF, United Kingdom
Draft chapter prepared for a book edited by A. Geuna, A. Salter and W. E. Steinmueller on Science and Innovation: Rethinking the Rationales for Funding and Governance
1. Introduction
According to some (e.g. Ziman, 1991 and 1994; Pelikan, 1992), science and the university are under threat. As we move towards a knowledge-intensive society, academics face pressure to link their work more closely to the needs of the economy and society, with (it is feared) potentially adverse long-term consequences for scientific research and the university. This has been characterised (e.g. by Guston and Keniston, 1994a) as a fundamental change in the social contract between science and the university, on the one hand, and the state, on the other, with the latter now having much more specific expectations regarding the outputs sought from the former in return for public funding. Others (Gibbons et al., 1994) have described it as a transition from Mode 1 to Mode 2 knowledge production.

This paper argues that, if one adopts a longer-term historical perspective, what we are witnessing appears to be not so much the appearance of a new (and hence potentially worrying) phenomenon, but rather a shift back towards a social contract closer to the one in effect for much of the period before the second half of the 20th Century. In what follows, we first consider previous versions of the social contract, in particular those embodied in the Humboldt university and in the contract set out by Vannevar Bush in 1945. After analysing the global driving forces subjecting the social contract to change, we examine the revised contract that has emerged over recent years. We identify some key questions that have been raised about these changes and their possible implications for science and universities. To answer these, we consider the process of historical evolution of universities, including the emergence of different species of university reflecting their differing functions, ethos and relations with the surrounding environment.1 As we shall see, for much of the history of modern science, funds have been provided with a clear expectation that the work would result in specific benefits.

Only for a period of a few decades after the Second World War was this former social contract relaxed, when governments were prepared to invest in science with much less precise and immediate expectations as to the eventual benefits. That period is now apparently ending.
This section draws substantially on Martin and Etzkowitz (2001). The author is very grateful to Henry Etzkowitz for many of the historical examples discussed here. The chapter has also benefited from comments of those who attended the International Workshop on New Policy Rationales for the Support of Public Research in the EU held in Paris on 3-4 May 2001. However, any errors are those of the author.
2. History of the Social Contract

2.1 The Humboldt social contract
While it is conventional to begin by focusing on the Vannevar Bush social contract, which held sway for most of the second half of the 20th Century, it is worth looking first at the earlier social contract embodied in the Humboldt university model. In this, government assumed responsibility for funding the university (in contrast with earlier ad hoc funding arrangements). However, the key characteristic of the Humboldt model was the unity of teaching and research: the assumption that both functions had to be conducted within the same institution.

The Humboldt model subsequently spread from Germany to other countries in the 19th and 20th Centuries, but not to all. In France, for example, universities and particularly the grandes écoles continued to concentrate on teaching, while much of the academic research was carried out elsewhere in the laboratories of organisations such as CNRS and INSERM (cf. Schmoch and Winnes, 2000). The separation of teaching and research was even more pronounced in Eastern Europe. Nevertheless, by the second half of the 20th Century, there was a widespread belief among academics and others that the unity of teaching and research was essential to the university and to scientific knowledge production.

The Humboldt social contract, despite the reliance on the state for funding, was characterised by a high level of autonomy for both individuals and institutions. Academics were free to engage in research (typically spending 30-50% of their time on this) and to choose their research topics. At the institutional level, in European countries (and some others, but not the United States), governments provided general institutional funding for both teaching and research, leaving the university free to determine the allocation of resources across disciplines.

2.2 The Vannevar Bush social contract
The social contract that ran from 1945 to approximately the end of the 1980s is generally linked to Vannevar Bush and his 1945 report, Science: The Endless Frontier. A succession of scientific discoveries in the first half of the 20th Century, together with several prominent applications of science during the Second World War, gave rise to a belief in a simple linear science-push model of innovation, beginning with basic research and leading on to applied research, then technological development and finally innovation. This model was set out in the Bush report.2 The clear implication of the model was that, if government put money into the basic research end of the chain, benefits in terms of wealth, health and national security would eventually emerge from the other end (at some indeterminate time), although exactly what form those benefits would take was also unpredictable.
This was perhaps the first time that the linear model had been set down formally, although it had appeared in discussions at the end of 19th Century (Godin, 2000, p.5).
The linear model had the great merit of simplicity (even politicians could understand it!) as well as obvious financial convenience: it furnished a ready case for getting money out of governments.3 It also implied that few strings should be attached to the public funds provided to basic researchers, leaving them with considerable autonomy. The social contract for the post-war period can be described as follows:

Government promises to fund the basic science that peer reviewers find most worthy of support, and scientists promise that the research will be performed well and honestly and will provide a steady stream of discoveries that can be translated into new products, medicines, or weapons. (Guston and Keniston, 1994a, p. 2)

There were thus several essential characteristics of the Bush social contract. First, it implied a high level of autonomy for science. Secondly, decisions on which areas of science should be funded were to be left to scientists. It therefore brought about the institutionalisation of the peer-review system to allocate resources, a system used before the War by private foundations which supported research. Thirdly, it was premised on the belief that basic research was best done in universities (rather than in government or company laboratories). The Bush social contract proved very successful in the decades after 1945, especially in the United States. It contributed to large increases in government funding for science,4 in the number of trained scientists and in research outputs (e.g. publications in scientific journals).
3. Global Driving Forces for Change
Some time around the late 1980s (but perhaps slightly earlier in the UK and US), we began to see the emergence of a revised social contract replacing that set in place by Vannevar Bush (Guston, 2000). In this section, we consider the main forces bringing about that change. One that was particularly important in the US was the end of the Cold War, resulting in a greatly reduced need for research in the physical sciences and engineering. A related factor with similar consequences was dwindling enthusiasm for nuclear energy.5 However, three factors which have been more global in their impact are increasing competition, constraints on public expenditure, and the growing importance of scientific competencies.
A proper economic justification for the Bush social contract came later with the work of Nelson (1959) and Arrow (1962).
There were other important contributing factors in the US, such as the arms and space race with the USSR and decisions by Congress to wage war on diseases like cancer.
Both these factors are obviously linked to another trend, namely the decline in importance of the physical sciences compared with the biomedical sciences.
We live in an ever more competitive world.6 Over the last 10-15 years, many more market-economy players have emerged in Asia, Eastern Europe, Latin America and elsewhere, greatly increasing the level of economic competition. Moreover, there are huge variations in labour costs (e.g. by a factor of 100 or more between Germany and China) at a time when the process of globalisation means that firms can much more easily shift resources and production between countries to benefit from lower costs or other local resources. For industrialised countries, the key to success lies in continuous innovation to improve productivity and competitiveness. Consequently, new technologies such as information and communication technologies and biotechnology are becoming more important. These are heavily dependent on basic research for their development and exploitation,