Measuring the value of scientific research

The true value of scientific research is often misunderstood by the general public. This is especially true in the Philippines, where researchers face pressure to justify their work in terms of “societal needs” or “national interests.” Such expectations of course put the basic sciences at a disadvantage and, unfortunately, often tend to encourage work that lacks scientific depth and quality. One needs to take a close look at the history of science and technology to understand the subtle dynamics underlying true progress. While it is tempting to imagine scientific revolutions emerging from dramatic discoveries motivated by the need to solve existing problems, history tells us that, in many cases, the so-called “solutions” came before the “problems.” Good examples include Watson and Crick’s discovery of the structure of DNA, which came decades before the current biotech age, and Nash’s contributions to game theory, which later informed the design of treaties during the Cold War to avert a global nuclear holocaust. In both these examples, the researchers were more concerned with getting the research right; applications to real problems came later. The lesson from these cases is that the value of scientific work is best judged in hindsight. A scientist seeks to get his or her ideas published (i.e., made public), primarily through scientific journals that are read by his or her peers all over the world. The publication itself only comes (if at all) after an idea passes muster in the peer review process; and after a scientific paper is published, it becomes subject to scrutiny by the global scientific community. Obviously, any low-quality work will eventually be exposed as such, while high-quality and important work will be recognized as other researchers use it as a basis for their own investigations. Such recognition leaves a paper trail of “citations” that provides a measure of the value of scientific work, as judged by other researchers. Highly cited work can then become part of important scientific breakthroughs that yield real and measurable benefits for humanity.

In 2005, Hirsch first proposed a numerical index that has come to be known as the h-index, in a paper entitled “An index to quantify an individual’s scientific research output,” which appeared in the Proceedings of the National Academy of Sciences (the entire article can be accessed via the URL http://www.pnas.org/content/102/46/16569.full). In numerical terms, this article has proven to be the most influential of Hirsch’s career, having been cited more than a thousand times in the past seven years. Use of the h-index for measuring the research productivity of individuals or institutions has become commonplace, and it has even been extended as an alternative means of measuring the importance of scientific journals. The concept itself has been discussed in this column by past contributors, notably Dr. Giselle Concepcion, Dr. Ed Padlan and Dr. Caesar Saloma of UP Diliman, so it will suffice to briefly revisit its definition here. Hirsch himself provides a very concise definition, stating that it is “the index h, defined as the number of papers with citation number ≥ h.” In other words, a researcher finds his or her h-index by listing his or her publications in descending order of citation count, and then finding the largest rank h such that the paper in position h has at least h citations. Note that the index provides a simultaneous measure of both research productivity (the h-index provides a lower bound for the number of publications in a researcher’s career) and research quality (as measured in terms of the importance implied by citations). Furthermore, Hirsch also proposed a quotient m, which is the h-index divided by the length of a research career in years (i.e., the number of years since one’s first published work). Hirsch suggested m = 1 as the benchmark for a moderately successful scientific career, presumably based on the norms for US-based physicists. An interested young researcher can easily check his or her h-index, for example using the author preview feature of Scopus (www.scopus.com).
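To make the computation concrete, here is a minimal sketch in Python (my own illustration, not drawn from Hirsch’s paper or the Scopus tool; the citation counts are hypothetical) that computes the h-index and the m quotient from a list of per-paper citation counts:

    def h_index(citations):
        # The h-index is the largest h such that h papers have at least h citations each.
        ranked = sorted(citations, reverse=True)  # papers in descending order of citations
        h = 0
        for rank, cites in enumerate(ranked, start=1):
            if cites >= rank:
                h = rank
            else:
                break
        return h

    def m_quotient(citations, career_years):
        # Hirsch's m: the h-index divided by the length of the research career in years.
        return h_index(citations) / career_years

    # Hypothetical example: ten papers, the first published eight years ago.
    cites = [42, 18, 12, 9, 7, 5, 3, 2, 1, 0]
    print(h_index(cites))        # 5 (five papers with at least five citations each)
    print(m_quotient(cites, 8))  # 0.625

In this made-up case the researcher’s h-index is 5, since the fifth most-cited paper has seven citations while the sixth has only five; dividing by an eight-year career gives m = 0.625, somewhat below Hirsch’s benchmark of 1.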

As a developing country, the Philippines has lagged behind the rest of the world in both the quantity and quality of its scientific output. While the country has never had a strong scientific tradition, it is alarming that, even within the ASEAN region, we are slowly slipping behind other nations in terms of research output; for example, a recent article (http://www.scidev.net/en/science-communication/science-publishing/news/south-east-asian-nations-publish-more-science.html) reports that the Philippines had the lowest rate of growth in the number of publications in ASEAN over the past two decades. This is a worrisome trend, as the examples of Asia’s most successful economies clearly demonstrate that the transition to a knowledge-based economy is fundamental to long-term development. As a final note, I think that these grim prospects should be taken as a challenge by any serious-minded Filipino researcher, whether a seasoned veteran or a graduate student, to make truly significant scientific contributions in his or her field.

* * *

Prof. Raymond R. Tan is a university fellow and full professor of Chemical Engineering at De La Salle University. He is also the current director of that institution’s Center for Engineering and Sustainable Development Research (CESDR). He is the author of more than 70 process systems engineering (PSE) articles that have been published in chemical, environmental and energy engineering journals. He has over 80 publications listed by Scopus, with an h-index of 18. He is a member of the editorial boards of the journals Clean Technologies and Environmental Policy, Philippine Science Letters and Sustainable Technologies, Systems & Policies, and is co-editor of the forthcoming book Recent Advances in Sustainable Process Design and Optimization. He is also the recipient of multiple awards from the National Academy of Science and Technology (NAST) and the National Research Council of the Philippines (NRCP). He may be contacted via e-mail (raymond.tan@dlsu.edu.ph).
