An FWCI decomposition of Science Foundation Ireland funding
In response to the 2008 global financial crisis, Science Foundation Ireland (SFI), now Research Ireland, pivoted to research with potential socioeconomic impact. Given that the latter can encompass higher technology readiness levels, which typically correlate with lower academic impact, it is interesting to understand how academic impact holds up in SFI-funded research. Here we decompose SFI \textit{Investigator Awards} - arguably the most academic funding call - into $3,243$ constituent publications and field-weighted citation impact (FWCI) values searchable in the SCOPUS database. Given that citation counts are skewed, we highlight the limitation of FWCI as a paper metric, which naively restricts one to comparisons of average FWCI ($\overline{\mathrm{FWCI}}$) in large samples. Neglecting publications with $\mathrm{FWCI} < 0.1$ ($8.8\%$), SFI-funded publications are well approximated by a lognormal distribution with $\mu = -0.0761^{+0.017}_{-0.0039}$ and $\sigma = 0.933^{+0.011}_{-0.012}$ at the $95\%$ confidence level. This equates to $\overline{\mathrm{FWCI}} = 1.433^{+0.029}_{-0.015}$, well above the international $\overline{\mathrm{FWCI}} = 1$. Broken down by award, we correct $\overline{\mathrm{FWCI}}$ for small samples using simulations and find that $\sim 67\%$ exceed \textit{median} international academic interest, thus exhibiting a positive correlation between the potential for socioeconomic impact and academic interest.
💡 Research Summary
This paper investigates the academic impact of research funded by Science Foundation Ireland (SFI), now known as Research Ireland, during the period 2012‑2016. In the wake of the 2008 global financial crisis, SFI shifted its funding strategy toward projects with clear potential for socioeconomic impact, a move that raised concerns that such a focus might diminish traditional scholarly influence, especially for work at higher technology readiness levels (TRLs). To assess whether this policy shift compromised academic performance, the authors performed a detailed bibliometric analysis of the SFI Investigator Awards (IA), which are the most academically oriented funding stream within SFI.
The dataset comprises 148 IA grants, collectively yielding 3,243 original research publications indexed in Elsevier’s SCOPUS database. For each paper, the Field‑Weighted Citation Impact (FWCI) – a metric that normalises citation counts by year, subject category, and document type – was extracted. Recognising that citation counts are heavily right‑skewed and typically follow a log‑normal distribution, the authors first examined the distribution of FWCI values. After removing low‑impact papers (FWCI < 0.1, representing 8.8 % of the sample), the natural logarithm of FWCI was found to be approximately normally distributed. Maximum‑likelihood estimation yielded parameters μ = −0.0761 (95 % CI: −0.0800 to −0.0591) and σ = 0.933 (95 % CI: 0.921 to 0.944), confirming that the FWCI data conform well to a log‑normal model.
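The fitting step above can be sketched in a few lines: for a log-normal distribution, the maximum-likelihood estimates of μ and σ are simply the mean and standard deviation of the log-transformed values. The snippet below is illustrative only — it draws synthetic data from the paper's fitted model rather than using the actual SFI publication records, and it ignores the small bias introduced by truncating at FWCI < 0.1.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for the 3,243 FWCI values (NOT the real SFI data):
# drawn from the paper's fitted log-normal purely to illustrate the procedure.
fwci = rng.lognormal(mean=-0.0761, sigma=0.933, size=3243)

# Exclude very-low-impact papers, as in the paper (FWCI < 0.1).
fwci = fwci[fwci >= 0.1]

# For a log-normal, the MLE of (mu, sigma) is the sample mean and
# standard deviation of the log-transformed data.
log_fwci = np.log(fwci)
mu_hat = log_fwci.mean()
sigma_hat = log_fwci.std()

print(f"mu_hat = {mu_hat:.3f}, sigma_hat = {sigma_hat:.3f}")
```

In practice the confidence intervals quoted in the paper would come from the likelihood surface or bootstrapping, not from a single point estimate.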
From the log‑normal parameters, the mean FWCI was calculated as e^{μ+σ²/2} = 1.433 (95 % CI: 1.418 to 1.462), substantially above the international benchmark of 1.0. This indicates that, on average, SFI‑funded papers receive 43 % more citations than the world average for comparable works, suggesting that the socioeconomic focus did not erode scholarly visibility.
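The mean-FWCI calculation follows directly from the standard formula for the mean of a log-normal distribution, e^{μ+σ²/2}. A quick check with the point estimates quoted above:

```python
import math

mu, sigma = -0.0761, 0.933

# Mean of a log-normal distribution: exp(mu + sigma^2 / 2).
mean_fwci = math.exp(mu + sigma**2 / 2)

print(f"{mean_fwci:.3f}")  # ~1.43, matching the paper's 1.433 to rounding
```

Note that the median of the same distribution is e^μ ≈ 0.93, below 1 — a reminder of how strongly the right tail pulls the mean upward in skewed citation data.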
However, the authors caution that FWCI is a ratio based on averages; for small samples the mean can be a poor proxy for the median or mode, leading to misleading interpretations. To address this, they performed Monte‑Carlo simulations of log‑normal distributions with variance σ² ranging from 1 to 1.8 (the range observed in prior literature). For each award, the observed number of papers (n) and the award‑level mean FWCI were used to generate a simulated distribution of possible median FWCI values. By comparing the observed mean to the simulated median, they derived a “relative performance” metric. Approximately 67 % of the awards exceeded the simulated median, meaning that two‑thirds of the IA grants produced papers that are at least as impactful as the median international paper in their respective fields. The remaining ~33 % fell below this threshold, often due to very small publication counts or a reliance on the anticipated socioeconomic outcomes rather than scholarly citations.
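One hedged reading of this simulation procedure can be sketched as follows. The helper below is hypothetical (not the authors' code): it estimates how often a size-n sample drawn from a log-normal whose median equals the world median FWCI of 1 (i.e. μ = 0) would produce a sample mean at least as large as an award's observed mean, for an assumed σ² in the 1–1.8 range. A small probability suggests the award genuinely exceeds median international interest rather than benefiting from a small-sample fluke.

```python
import numpy as np

def exceeds_median_prob(mean_obs, n, sigma2=1.4, trials=20_000, seed=1):
    """Hypothetical illustration of the Monte-Carlo benchmark:
    probability that n papers drawn from a log-normal with median 1
    (mu = 0) and variance-of-logs sigma2 yield a sample mean >= mean_obs."""
    rng = np.random.default_rng(seed)
    sims = rng.lognormal(mean=0.0, sigma=np.sqrt(sigma2), size=(trials, n))
    return float((sims.mean(axis=1) >= mean_obs).mean())

# Example: an award with 20 papers and an award-level mean FWCI of 2.5.
p = exceeds_median_prob(2.5, 20)
print(f"P(sample mean >= 2.5 | median = 1) = {p:.3f}")
```

The design choice to benchmark against the median rather than the mean matters: under μ = 0 and σ² = 1.4 the population mean is already e^{0.7} ≈ 2.0, so a naive comparison of award means to 1 would flatter almost every award.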
Data collection faced several practical challenges. SCOPUS only tracks FWCI for a maximum of four years after publication, so newer papers (post‑2021) lack a stable FWCI value. Moreover, inconsistencies in how funding acknowledgments are recorded (misspelled grant numbers, missing metadata) caused some papers to be omitted. The authors mitigated these issues by cross‑checking grant identifiers in Google Scholar and by excluding document types unlikely to reflect original research (e.g., reviews, editorials, book chapters). Despite these limitations, the average of about 20 papers per award provided a robust sample for statistical analysis.
The discussion emphasizes that FWCI measures “academic interest” rather than intrinsic research quality, though prior studies (e.g., the UK REF) have shown a positive correlation between FWCI and quality assessments, especially in STEM fields. The authors note that FWCI does not correct for self‑citations, large collaborative networks, or the “sleeping beauty” phenomenon where papers gain citations long after publication; however, such outliers are estimated to be extremely rare (<0.01 %).
In conclusion, the study provides empirical evidence that SFI’s policy of prioritising projects with socioeconomic potential has not compromised, and may even have enhanced, the academic impact of its funded research. The log‑normal modeling of FWCI, combined with simulation‑based benchmarking for small samples, offers a methodological template for other funding agencies seeking to balance societal relevance with scholarly excellence. The authors advocate for broader adoption of such quantitative assessments while acknowledging the need for complementary qualitative evaluations to capture the full spectrum of research impact.