
Vol 464|25 March 2010

OPINION
Let’s make science metrics more scientific
To capture the essence of good science, stakeholders must combine forces to create an open, sound and
consistent system for measuring all the activities that make up academic productivity, says Julia Lane.

Summary
● Existing metrics have known flaws
● A reliable, open, joined-up data infrastructure is needed
● Data should be collected on the full range of scientists' work
● Social scientists and economists should be involved

Measuring and assessing academic performance is now a fact of scientific life. Decisions ranging from tenure to the ranking and funding of universities depend on metrics. Yet current systems of measurement are inadequate. Widely used metrics, from the newly fashionable Hirsch index to the 50-year-old citation index, are of limited use [1]. Their well-known flaws include favouring older researchers, capturing few aspects of scientists' jobs and lumping together verified and discredited science. Many funding agencies use these metrics to evaluate institutional performance, compounding the problems [2]. Existing metrics do not capture the full range of activities that support and transmit scientific ideas, which can be as varied as mentoring, blogging or creating industrial prototypes.

The dangers of poor metrics are well known, and science should learn lessons from the experiences of other fields, such as business. The management literature is rich in sad examples of rewards tied to ill-conceived measures, resulting in perverse outcomes. When the Heinz food company rewarded employees for divisional earnings increases, for instance, managers played the system by manipulating the timing of shipments and pre-payments [3]. Similarly, narrow or biased measures of scientific achievement can lead to narrow and biased science.

There is enormous potential to do better: to build a science of science measurement. Global demand for, and interest in, metrics should galvanize stakeholders (national funding agencies, scientific organizations and publishing houses) to combine forces. They can set an agenda and foster research that establishes sound scientific metrics: grounded in theory, built with high-quality data and developed by a community with strong incentives to use them.

Scientists are often reticent to see themselves or their institutions labelled, categorized or ranked. Although happy to tag specimens as one species or another, many researchers do not like to see themselves as specimens under a microscope; they feel that their work is too complex to be evaluated in such simplistic terms. Some argue that science is unpredictable, and that any metric used to prioritize research money risks missing out on an important discovery from left field. It is true that good metrics are difficult to develop, but this is not a reason to abandon them. Rather, it should be a spur to basing their development in sound science. If we do not press harder for better metrics, we risk making poor funding decisions or sidelining good scientists.

Clean data
Metrics are data driven, so developing a reliable, joined-up infrastructure is a necessary first step. Today, important but fragmented efforts such as the Thomson Reuters Web of Knowledge and the US National Bureau of Economic Research Patent Database have been created to track scientific outcomes such as publications, citations and patents. These efforts are all useful, but they are labour intensive and rely on transient funding; some are proprietary and non-transparent, and many cannot talk to each other through compatible software. We need a concerted international effort to combine, augment and institutionalize these databases within a cohesive infrastructure.

The Brazilian experience with the Lattes Database (http://lattes.cnpq.br/english) is a powerful example of good practice. It provides high-quality data on about 1.6 million researchers and about 4,000 institutions. Brazil's national funding agency recognized in the late 1990s that it needed a new approach to assessing the credentials of researchers. First, it developed a 'virtual community' of federal research agencies and researchers to design and develop the Lattes infrastructure. Second, it created appropriate incentives for researchers and academic institutions to use the database: the data are referred to by the federal agency when making funding decisions, and by universities in deciding tenure and promotion. Third, it established a unique researcher identification system to ensure that people with similar names are credited correctly. The result is one of the cleanest researcher databases in existence.

On an international level, the issue of a unique researcher identification system is one that needs urgent attention. There are various efforts under way in the open-source and publishing communities to create unique researcher identifiers using the same principles as the Digital Object Identifier (DOI) protocol, which has become the international standard for identifying unique documents. The ORCID (Open Researcher and Contributor ID) project, for example, was launched in December 2009 by parties including Thomson Reuters and Nature Publishing Group. The engagement of international funding agencies would help to push this movement towards an international standard.

Similarly, if all funding agencies used a universal template for reporting scientific achievements, it could improve data quality and reduce the burden on investigators. In January 2010, the Research Business Models Subcommittee of the US National Science and Technology Council recommended the Research Performance Progress Report (RPPR) to standardize the reporting of research progress. Before this, each US science agency required different reports, which burdened principal investigators and rendered a national overview of science investments impossible. The RPPR guidance helps by clearly defining what agencies see as research achievements, asking researchers to list everything from publications produced to websites created and workshops delivered. The standardized approach greatly simplifies such data collection in the United States. An international template may be the logical next step.

Importantly, data collected for use in metrics must be open to the scientific community, so that metric calculations can be reproduced. This also allows the data to be efficiently repurposed. One example is the STAR METRICS (Science and Technology in America's Reinvestment: Measuring the Effects of Research on Innovation, Competitiveness and Science) project, led by the National Institutes of Health and the National Science Foundation.
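The openness point is easy to make concrete: given access to per-paper citation counts, anyone can recompute a metric such as the Hirsch index mentioned at the outset and check a published figure. A minimal sketch in Python; the citation counts are invented purely for illustration:

```python
def h_index(citations):
    """Return the Hirsch index: the largest h such that at least
    h papers have h or more citations each."""
    ranked = sorted(citations, reverse=True)
    h = 0
    for rank, count in enumerate(ranked, start=1):
        if count >= rank:
            h = rank  # the top `rank` papers all have >= rank citations
        else:
            break
    return h

# Invented citation record: four papers have at least 4 citations each.
print(h_index([10, 9, 5, 4, 3, 1]))  # prints 4
```

The same few lines also show how fragile such indicators are: the value is determined entirely by which publications and citations the underlying database happens to record, which is exactly why an open, joined-up data infrastructure matters.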
© 2010 Macmillan Publishers Limited. All rights reserved

Illustration by David Parkins


Under the auspices of the White House Office of Science and Technology Policy, this project aims to match data from institutional administrative records with those on outcomes such as patents, publications and citations, to compile accomplishments achieved by federally funded investigators. A pilot project completed at six universities last year showed that this automation could substantially cut investigators' time on such tasks.

Funding agencies currently invest in fragmented bibliometrics projects that often duplicate the work of proprietary data sets. A concerted international strategy is needed to develop business models that both facilitate broader researcher access to the data produced by publishing houses, and compensate those publishers for the costs associated with collecting and documenting citation data.

Getting creative
As well as building an open and consistent data infrastructure, there is the added challenge of deciding what data to collect and how to use them. This is not trivial. Knowledge creation is a complex process, so perhaps alternative measures of creativity and productivity should be included in scientific metrics, such as the filing of patents, the creation of prototypes [4] and even the production of YouTube videos. Many of these are more up-to-date measures of activity than citations. Knowledge transmission differs from field to field: physicists more commonly use preprint servers; computer scientists rely on working papers; others favour conference talks or books. Perhaps publications in these different media should be weighted differently in different fields.

People are starting to think about collecting alternative kinds of data. Systems such as MESUR (Metrics from Scholarly Usage of Resources, www.mesur.org), a project funded by the Andrew W. Mellon Foundation and the National Science Foundation, record details such as how often articles are searched and queried, and how long readers spend on them. New tools are available to capture and analyse 'messy' data on human interactions; for example, visual analytics intended to discover patterns, trends and relationships between terrorist groups are now being applied to scientific groups (http://nvac.pnl.gov/agenda.stm).

There needs to be a greater focus on what these data mean, and how they can best be interpreted. This requires the input of social scientists, rather than just those more traditionally involved in data capture, such as computer scientists. Basic research is also needed into how measurement can change behaviour, to avoid the problems that Heinz and others have experienced with well-intended metrics that lead to undesirable outcomes. If metrics are to be used to best effect in funding and promotion decisions, economic theory is needed to examine how changes to incentives alter the way research is performed [5].

How can we best bring all this theory and practice together? An international data platform supported by funding agencies could include a virtual 'collaboratory', in which ideas and potential solutions can be posited and discussed. This would bring social scientists together with working natural scientists to develop metrics and test their validity through wikis, blogs and discussion groups, thus building a community of practice. Such a discussion should be open to all ideas and theories and not restricted to traditional bibliometric approaches.

Some fifty years after the first quantitative attempts at citation indexing, it should be feasible to create more reliable, more transparent and more flexible metrics of scientific performance. The foundations have been laid. Most national funding agencies are supporting research in science measurement, vast amounts of new data are available on scientific interactions thanks to the Internet, and a community of people invested in the scientific development of metrics is emerging. Far-sighted action can ensure that metrics go beyond identifying 'star' researchers, nations or ideas, to capturing the essence of what it means to be a good scientist. ■

Julia Lane is the director of the Science of Science & Innovation Policy programme, National Science Foundation, 4201 Wilson Boulevard, Arlington, Virginia 22230, USA.
e-mail: jlane@nsf.gov

1. Campbell, P. Ethics Sci. Environ. Polit. 8, 5–7 (2008).
2. Curtis, B. Globalis. Soc. Edu. 6, 179–194 (2008).
3. Kerr, S. Acad. Manage. J. 18, 769–783 (1975).
4. Thrash, T. M., Maruskin, L. A., Cassidy, S. E., Fryer, J. W. & Ryan, R. M. J. Pers. Soc. Psychol. (in the press).
5. Gibbons, R. J. Econ. Perspect. 12, 115–132 (1998).

The opinions expressed are those of the author and may not reflect the policies of the National Science Foundation.
Comment on this subject and view further reading online at go.nature.com/nByVmy.

