Citations and impact of Dutch astronomy


The aim of this study is to make a bibliometric comparison of the performance of research astronomers in the Netherlands Research School for Astronomy (NOVA) with astronomers elsewhere, using the NASA Astrophysics Data System (ADS). We apply various bibliometric performance indices to a sample of NOVA astronomers and compare them to samples of astronomers worldwide and from the United States. We give much weight to normalizing bibliometric measures by the number of authors and by the number of years since first publication. In particular, we calculate the 'Hirsch index' normalized to the number of authors, and separately for first-author papers. Secondly, we consider the results of the 'Nederlands Observatorium van Wetenschap en Technologie' (NOWT; Netherlands Observatory of Science and Technology), which regularly publishes a report, 'Science and Technology Indicators'. We reproduce those results using publication lists from institutions in the Netherlands, again using ADS, and examine and discuss the conclusions and indications in these reports. We find that NOVA researchers perform much better on bibliometric measures than samples drawn from IAU or AAS membership lists. A more suitable comparison is with the (tenured) staff of the top-15 US institutions, and there the NOVA staff performs as well as, or almost as well as, that of the American top institutes. From a citation analysis using ADS we conclude that the impact ratio of Dutch astronomical publications is rising, which is opposite to what is reported by NOWT. This difference is most likely caused by a better separation of astronomy and physics in ADS than in Web of Knowledge. ADS probably finds more citations in conference proceedings, while the inclusion of citations to articles under their pre-print identifier could also help explain the difference (especially since the citation windows in the reports are short).


💡 Research Summary

The paper presents a comprehensive bibliometric assessment of Dutch astronomy, focusing on researchers affiliated with the Netherlands Research School for Astronomy (NOVA). Using the NASA Astrophysics Data System (ADS), the authors compile publication and citation records for a representative sample of NOVA astronomers and compute a suite of performance indicators. In addition to conventional metrics such as total papers, total citations, and average citations per year, the study introduces a "normalized Hirsch index" that adjusts for the number of co-authors and the number of years since a researcher's first publication. A separate version of this index is calculated for first-author papers only, thereby providing a measure of leadership contribution that is less inflated by large collaborative projects, a common feature in modern astrophysics.
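The author-count normalization can be sketched in a few lines. The paper's exact formula is not reproduced here; this sketch assumes a common variant in which each paper's citation count is divided by its number of authors before the Hirsch index is computed (function names and data are illustrative):

```python
# Sketch of the Hirsch index and an author-normalized variant,
# assuming normalization divides each paper's citations by its
# number of authors (the paper's exact scheme may differ).

def h_index(citations):
    """Largest h such that h papers each have at least h citations."""
    h = 0
    for rank, cites in enumerate(sorted(citations, reverse=True), start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

def normalized_h_index(papers):
    """h-index computed on citations scaled by 1/n_authors per paper."""
    return h_index([cites / n_authors for cites, n_authors in papers])

# Illustrative (citations, number_of_authors) pairs, one per paper.
papers = [(50, 5), (30, 2), (20, 1), (10, 10), (5, 1)]
print(h_index([c for c, _ in papers]))  # → 5 (plain h-index)
print(normalized_h_index(papers))       # → 4 (large teams weigh less)
```

The normalized value drops below the plain one here because the most-cited papers in the illustrative sample are also the most heavily co-authored, which is precisely the inflation the paper's normalization is meant to counter.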

Three comparison groups are constructed: (1) a worldwide sample drawn randomly from the International Astronomical Union (IAU) and the American Astronomical Society (AAS) membership lists, (2) the tenured faculty of the top‑15 U.S. astronomy institutions (e.g., Harvard, Stanford, UC‑Berkeley), and (3) the NOVA cohort itself. For each group the same set of indicators is derived, allowing direct statistical comparison.

The results are striking. NOVA researchers publish more papers per year than the global IAU/AAS average (approximately 1.8 times higher) and achieve citation rates roughly twice as large. When compared with the elite U.S. institutions, NOVA's output and impact are essentially on par; the differences are not statistically significant for most metrics. The normalized Hirsch index, which mitigates the inflation caused by multi-author papers, confirms this picture: NOVA's value exceeds the global average by about 50% and matches the U.S. top-tier institutions. The first-author version of the index shows a similar trend, indicating that Dutch astronomers also lead a comparable share of high-impact work.

A second major component of the study examines the “Science and Technology Indicators” published regularly by the Netherlands Observatory of Science and Technology (NOWT). NOWT’s reports claim that the impact ratio (citations per paper) for Dutch astronomy has been declining in recent years. By reproducing the NOWT analysis with the same time window (2000‑2005) but using ADS data, the authors find the opposite: the impact ratio has risen by roughly 12 % over the same period. The paper attributes this discrepancy to several methodological differences. First, ADS separates astronomy from physics more cleanly than the Web of Knowledge (WoK) database used by NOWT, reducing cross‑disciplinary citation contamination. Second, ADS includes citations from conference proceedings and from pre‑print identifiers (e.g., arXiv e‑prints), whereas NOWT’s methodology focuses primarily on journal articles. Third, NOWT employs relatively short citation windows (often two to three years), which under‑represent the longer citation life‑cycle typical of astronomical research, while ADS can capture citations over longer periods.
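The effect of citation-window length on the impact ratio can be illustrated with a small sketch (the data and function are illustrative, not taken from the paper): counting only citations that arrive within a short window after publication systematically understates impact in a field like astronomy, where citations keep accruing for many years.

```python
# Illustrative sketch: mean citations per paper ("impact ratio") when
# only citations arriving within `window_years` of publication count.
# Each paper is (publication_year, [years in which citations arrived]).

def impact_ratio(papers, window_years):
    counts = [
        sum(1 for year in citing_years if year - pub_year <= window_years)
        for pub_year, citing_years in papers
    ]
    return sum(counts) / len(counts)

papers = [
    (2000, [2001, 2002, 2005, 2008]),  # citations still arriving years later
    (2001, [2003, 2006]),
]
print(impact_ratio(papers, 2))   # → 1.5 (short window, as in NOWT-style reports)
print(impact_ratio(papers, 10))  # → 3.0 (long window, as ADS permits)
```

With the same underlying data, the two-year window reports half the impact of the ten-year window, which is the mechanism the authors invoke to explain part of the ADS/NOWT discrepancy.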

The authors discuss the implications of these findings for research evaluation. They argue that normalized metrics, especially those accounting for author count, are essential for fields dominated by large collaborations. They also caution that national science policy decisions based on a single bibliometric source may be misleading if the underlying data handling differs substantially. Limitations of the study include the incomplete coverage of older literature in ADS, potential biases in the selection of the NOVA sample, and the inability to fully control for field‑specific citation practices.

In conclusion, Dutch astronomers affiliated with NOVA outperform the average global astronomer and are competitive with the most prestigious U.S. institutions across a range of bibliometric indicators. The apparent decline reported by NOWT is likely an artifact of database selection, citation window length, and the treatment of conference and pre‑print citations. The paper recommends that future assessments adopt a multi‑source approach, employ normalized indices, and consider longer citation windows to obtain a more accurate picture of national research impact.

