National Institutes of Health Institute and Center Award Rates and Funding Disparities

doi: https://doi.org/10.1101/2020.12.27.424490
Michael Lauer
National Institutes of Health, Office of the Director, Office of Extramural Research, 1 Center Drive, Room 144, Bethesda, MD 20892
Correspondence: michael.lauer@nih.gov

Abstract

A previous report found an association of topic choice with race-based funding disparities among R01 applications submitted to the National Institutes of Health (“NIH”) between 2011 and 2015. The report noted that applications submitted by African American or Black (“AAB”) Principal Investigators (“PIs”) skewed toward a small number of topics that were less likely to be funded (or “awarded”). It was suggested that the lower award rates may be related to biases of peer reviewers against topics preferred by AAB PIs. However, the previous report did not account for differential funding ecologies among NIH Institutes and Centers (“ICs”). In a re-analysis, I find that 10% of 148 algorithmically-designated topics account for 50% of applications submitted by AAB PIs. These applications on “AAB Preferred” topics are indeed funded at lower rates than applications on other topics, but their peer review outcomes are similar. The lower rate of funding for applications focused on AAB Preferred topics is likely primarily due to their assignment to ICs with lower award rates. In probit regression analyses, I find that topic choice does partially explain race-based funding disparities, but IC-specific award rates explain the disparities to an even greater degree.

Introduction

Data recently reported by Hoppe et al [1] from the National Institutes of Health (“NIH”) suggest that part of the well-documented funding disparity [2] affecting African American or Black (“AAB”) principal investigators (“PIs”) may be related to the topic of their applications. The authors of that report (including this writer) found that topic choice accounted for over 20% of the disparity and wondered whether biases on the part of peer reviewers might explain why some application topics fare less well when submitted to the NIH for consideration of funding.

However, peer review outcomes are not the only determinant of funding. Applications submitted to the NIH are assigned to one of 24 grant-issuing institutes or centers (“ICs”) that in turn make decisions about which proposals to fund. The proportion of applications funded (or “award rate”) varies across ICs; therefore, we can think of the NIH process not as one competition hinging entirely on peer review but rather as 24 separate competitions. The variability of award rates relates to differences in the number of applications each IC receives, available funds, and IC priorities.

Hoppe et al [1] did not account for IC assignment or variation in IC-specific award rates. It is possible that the apparent link between topic choice and funding disparities may reflect differences in IC assignment, since ICs receive applications according to alignment with their stated mission. For example, applications focusing on cancer epidemiology are more likely to be assigned to the National Cancer Institute while those focusing on basic human biology are more likely to be assigned to the National Institute of General Medical Sciences. If award rates at the National Institute of General Medical Sciences are higher than at the National Cancer Institute, it might appear that NIH “prefers” basic human biology over cancer epidemiology. While the former topic does fare better with a higher likelihood of funding, this may be largely because of different IC award rates as opposed to differences in how the topics are received by peer reviewers.

I therefore re-analyzed the data from Hoppe et al [1], focusing on the possible role of IC assignment in application outcomes. To minimize biases related to repeated looks by the peer review system at individual proposals (from resubmissions [3] or competing renewals [4]), I focus on de novo applications submitted to the NIH for the first time.

Materials and Methods

These analyses are based on R01 applications submitted to NIH between 2011 and 2015. Hoppe et al [1] described in detail NIH peer review processes and the “Word2vec” algorithm [5] used to designate a topic for each application. Briefly, each application is assigned to a peer review group. After a preliminary pre-meeting review, approximately half are deemed to be potentially meritorious and are therefore discussed during a formally convened meeting. After the meeting, each discussed application receives a priority score ranging from 10 (best) to 90 (worst); many, but not all, applications also receive a “percentile ranking” to account for differences in how individual peer review groups calibrate their scores.

Applications are not only assigned to peer review groups; they are also assigned to ICs. ICs ultimately make decisions about which applications to fund, with funding decisions based on peer review scores, strategic priorities, and availability of funds.

To eliminate biases due to prior reviews, I focused on applications coming to the NIH for the very first time; in other words, I excluded resubmissions [3] and competing renewals [4]. For each IC, I calculated award rates as number of applications funded divided by number of applications assigned. I also noted what proportion of applications had a principal investigator (“PI”) who self-identified as AAB. For multi-PI applications, I considered the self-identified demographic of the contact PI. I designate those ICs in the top quartile of AAB application proportions as “Higher AAB” ICs.
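
A minimal sketch of how this bookkeeping could look in R (using the dplyr package from the tidyverse cited below); the data frame apps and its columns ic, funded, and aab_pi are hypothetical placeholders for illustration, not actual NIH variable names.

    library(dplyr)

    ic_summary <- apps %>%
      group_by(ic) %>%
      summarise(
        n_apps     = n(),
        n_funded   = sum(funded),
        award_rate = n_funded / n_apps,   # applications funded / applications assigned
        pct_aab    = mean(aab_pi)         # proportion of applications with an AAB contact PI
      ) %>%
      ungroup() %>%
      mutate(higher_aab = pct_aab >= quantile(pct_aab, 0.75))  # top quartile = "Higher AAB" ICs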

There were 148 topics identified by the Word2vec algorithm [5]. For each topic, I counted the number of applications submitted by AAB PIs. Consistent with the findings of Hoppe et al [1], topics were not randomly distributed by PI race; 15 topics accounted for 50% of applications submitted by AAB PIs. I designate these applications as having “AAB Preferred” topics.
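
As an illustration, a designation of this kind could be derived along the following lines, assuming apps also carries an illustrative topic column holding the algorithmically assigned topic labels.

    library(dplyr)

    aab_topic_counts <- apps %>%
      filter(aab_pi == 1) %>%
      count(topic, sort = TRUE) %>%             # AAB applications per topic, descending
      mutate(cum_share = cumsum(n) / sum(n))    # cumulative share of all AAB applications

    # Topics that together account for the first ~50% of AAB applications
    # (15 of the 148 topics in this analysis)
    aab_preferred_topics <- aab_topic_counts %>%
      filter(lag(cum_share, default = 0) < 0.5) %>%
      pull(topic)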

To assess the associations of topic choice, IC assignment, and peer review with application success, I compared peer review and funding outcomes according to whether applications were assigned to Higher AAB ICs and, separately, whether application topics were AAB Preferred or Other. I performed a series of probit regression analyses with funding as the dependent variable and AAB PI as an explanatory variable. I added topic choice (AAB Preferred or Other), IC assignment (Higher or Lower AAB), and IC award rate in separate models and examined whether the regression coefficient relating AAB PI to funding decreased, and if so, by how much. Akaike Information Criterion, Bayesian Information Criterion, and log-likelihood values informed model strength.
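
In R, the nested probit models could be fit roughly as sketched below; the variable names (funded, aab_pi, aab_preferred_topic, higher_aab_ic, ic_award_rate) are illustrative, and texreg's side-by-side output supplies the AIC, BIC, and log-likelihood comparisons described above.

    # Probit models: funding as the outcome, AAB PI status as the key predictor
    m1 <- glm(funded ~ aab_pi,                       family = binomial(link = "probit"), data = apps)
    m2 <- glm(funded ~ aab_pi + aab_preferred_topic, family = binomial(link = "probit"), data = apps)
    m3 <- glm(funded ~ aab_pi + higher_aab_ic,       family = binomial(link = "probit"), data = apps)
    m4 <- glm(funded ~ aab_pi + ic_award_rate,       family = binomial(link = "probit"), data = apps)

    texreg::screenreg(list(m1, m2, m3, m4))  # coefficients plus AIC, BIC, and log likelihood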

To assess the association of topic choice with peer review outcomes, I focused on applications that were discussed and therefore received a priority score. I constructed a plot of topic-specific mean peer review scores according to the number of applications in each topic. I expected to find a greater variance of mean scores for topics receiving fewer applications (“regression to the mean”). I generated a linear regression model to estimate a predicted mean score for each topic based on topic size, and calculated a residual for each topic by subtracting the model-based predicted mean score from the topic-specific mean score.
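
A minimal sketch of this calculation, assuming a hypothetical data frame discussed of discussed applications with illustrative columns topic and priority_score; the specific weighting shown at the end (residual scaled by topic size relative to the mean topic size) is one plausible choice for the size weighting described in the Results, not necessarily the exact weighting used.

    library(dplyr)

    topic_scores <- discussed %>%
      group_by(topic) %>%
      summarise(topic_size = n(), mean_score = mean(priority_score))

    fit <- lm(mean_score ~ topic_size, data = topic_scores)  # predicted mean score by topic size

    topic_scores <- topic_scores %>%
      mutate(
        resid_raw      = mean_score - predict(fit, newdata = topic_scores),  # observed minus predicted
        resid_weighted = resid_raw * topic_size / mean(topic_size)           # larger topics weigh more
      )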

All analyses used R [6] packages, including tidyverse [7], ggplot2 [8], finalfit [9], and texreg [10].

Results

Of 157,405 applications received, there were, after exclusion of resubmissions and competing renewals, 99,195 applications considered by NIH for the first time. Of these, 8,422 were funded, for an overall award rate of 8%. There were 1,685 applications, or 2%, submitted by AAB PIs. Table 1 shows IC-specific values for applications received, applications funded, award rates, and the percentage of applications coming from AAB PIs. Of note, award rates varied from 6% to 15%, while the proportion of applications with AAB PIs ranged from <1% to nearly 15%.

Table 1.

Application characteristics according to Institute or Center (IC). AAB = African American or Black; PI = Principal Investigator; EY = National Eye Institute; DC = National Institute on Deafness and Other Communication Disorders; GM = National Institute of General Medical Sciences; DE = National Institute of Dental and Craniofacial Research; MH = National Institute of Mental Health; DA = National Institute on Drug Abuse; NS = National Institute of Neurological Disorders and Stroke; NINR = National Institute of Nursing Research; HL = National Heart, Lung, and Blood Institute; AI = National Institute of Allergy and Infectious Diseases; ES = National Institute of Environmental Health Sciences; DK = National Institute of Diabetes and Digestive and Kidney Diseases; AA = National Institute on Alcohol Abuse and Alcoholism; AG = National Institute on Aging; EB = National Institute of Biomedical Imaging and Bioengineering; CA = National Cancer Institute; HD = Eunice Kennedy Shriver National Institute of Child Health and Human Development; MD = National Institute on Minority Health and Health Disparities; AR = National Institute of Arthritis and Musculoskeletal and Skin Diseases. Data for ICs with cell sizes not exceeding 11 are not shown due to privacy concerns.

Review and Funding Outcomes According to IC and to Topic

Table 2 shows review and funding outcomes for applications according to whether the assignment was to an IC in the top quartile of AAB applications (“Higher AAB”). These ICs were the National Institute of Allergy and Infectious Diseases, the National Institute of Environmental Health Sciences, the Eunice Kennedy Shriver National Institute of Child Health and Human Development, the National Institute on Minority Health and Health Disparities, the National Institute of Nursing Research, and the Fogarty International Center. Applications submitted to Higher AAB ICs were 3 times more likely to come from AAB PIs. Review outcomes (the proportion discussed and, for those applications that were discussed at peer review meetings, priority scores and percentile rankings) were similar in both groups. Despite the similar review outcomes, applications assigned to Higher AAB ICs were 13% less likely to be funded.

Table 2.

Application review and funding outcomes according to whether Institute or Center received a higher or lower proportion of applications from AAB principal investigators. AAB = African American or Black; PI = Principal Investigator.

Table 3 shows corresponding values according to whether applications were focused on the 15 topics that made up 50% of all applications with AAB PIs (“AAB Preferred” topics). Again, peer review outcomes were similar in the two groups, but applications focusing on AAB Preferred topics were 8% less likely to be funded.

Table 3.

Application review and funding outcomes according to whether the topic was among those that accounted for half of all AAB applications. Abbreviations as in Table 2.

Why do applications on AAB Preferred topics have worse funding outcomes despite similar peer review assessments? Table 3 shows that applications on AAB Preferred topics are 41% more likely to be assigned to Higher AAB ICs. The scatter plot in Figure 1 shows each IC’s award rate according to the proportion of its assigned applications that focus on AAB Preferred topics. ICs receiving a higher percentage of applications on AAB Preferred topics have lower award rates.

Fig 1.

Scatter plot of IC-specific award rates according to the proportion of applications focusing on AAB Preferred topics. Abbreviations are the same as in Tables 1 and 2. ICs that receive relatively more applications on AAB Preferred topics have lower award rates. Data for ICs with cell sizes not exceeding 11 are not shown due to privacy concerns.

Probit Regression Models

Table 4 shows the association of having an AAB PI with the probability of funding. Consistent with Hoppe et al [1] and prior literature [2], applications with AAB PIs had a lower likelihood of funding (negative regression coefficient for AAB Principal Investigator). Adjusting for topic (AAB Preferred or Other) reduced the regression coefficient for race by 5%; similarly, adjusting for IC assignment (Higher or Lower AAB) reduced the regression coefficient by 6%. However, adjusting for the award rate of the assigned IC reduced the regression coefficient for race by 14%.
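
For concreteness, these attenuations can be read off the fitted models from the Methods sketch (model and variable names there are illustrative); comparing each adjusted model against the unadjusted one is what yields the 5%, 6%, and 14% reductions reported above.

    # Attenuation of the AAB coefficient relative to the unadjusted model (m1)
    b_unadj <- coef(m1)["aab_pi"]
    sapply(list(topic = m2, ic_assignment = m3, ic_award_rate = m4), function(m) {
      100 * (abs(b_unadj) - abs(coef(m)["aab_pi"])) / abs(b_unadj)
    })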

Table 4.

Probit Regression Models. Model 1 shows the univariable association of funding success according to whether the PI is AAB. Model 2 adjusts for topic, Model 3 adjusts for IC assignment, and Model 4 adjusts for IC award rate. Note that the absolute value for the regression coefficient linking AAB PI to funding outcome decreases with each of these adjustments, with the greatest reduction after adjusting for IC award rate. AIC = Akaike Information Criterion; BIC = Bayesian Information Criterion; Num. obs. = Number of Observations. Other abbreviations as in Tables 1 and 2.

Topics and Review Outcomes

To gain greater insights into possible peer review biases against topics preferred by AAB PIs, Figure 2, Panel A, shows the mean priority score by topic (of note, only discussed applications receive priority scores) according to topic size, namely the number of submitted applications that were discussed for each topic. As would be expected, topics of smaller size showed greater variability, a manifestation of regression to the mean.

Fig 2.

Topic peer review scores according to the number of applications received (“Topic Size”) and topic type (AAB Preferred or Other). Panel A: Scatter plot of topic-specific mean peer review scores according to topic size. Each dot refers to a topic, with orange dots representing AAB Preferred topics and green dots all others. The line is based on a linear regression of mean peer review scores on topic size. The slope of the line was not significant (P=0.68). Panel B: Distribution of weighted residuals of topic-specific mean review scores. Residuals are calculated as the distance between the dots in Panel A and the regression line, and are then weighted by topic size.

The line in Figure 2, Panel A, is based on a linear regression of predicted mean score according to topic size. Although the slope is slightly negative (coefficient −0.0002264), the association was not significant (p = 0.68). Among AAB Preferred topics (orange dots), 5 were more than 1 point above the line (meaning their mean scores were worse than predicted), while 3 were more than 1 point below the line (meaning their mean scores were better than predicted). The remaining 7 topics had mean scores within 1 point of the predicted value.

For each topic, I calculated a residual by subtracting the predicted mean priority score from the topic-specific mean priority score; I weighted the residuals by topic size, as larger topics contribute more information. Figure 2, Panel B, shows the distribution of the weighted residuals according to topic type. Residuals were more positive (i.e., worse) for AAB Preferred topics. However, the absolute differences are small, much less than one priority score point (over a possible range of 10-90, with topic-specific mean values ranging from 35-45).
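
A Panel-B-style comparison of these weighted residuals could be drawn roughly as follows, reusing the topic_scores sketch from the Methods and assuming a hypothetical added logical column aab_preferred marking AAB Preferred topics.

    library(ggplot2)

    ggplot(topic_scores, aes(x = resid_weighted, fill = aab_preferred)) +
      geom_density(alpha = 0.5) +
      labs(x = "Weighted residual of topic mean priority score",
           fill = "AAB Preferred topic")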

Discussion

Among over 99,000 R01 applications submitted to NIH between 2011 and 2015, 2% were submitted by AAB PIs. Their applications were skewed towards a relatively small group of “AAB Preferred” topics; 10% of 148 topics accounted for 50% of AAB applications. Applications on AAB Preferred topics had similar review outcomes as those on other topics (Table 3) but were less likely to be funded. The lower award rates for AAB Preferred applications were associated with assignment to ICs with lower overall award rates.

These observations reflect that there are two factors at play in determining whether an application submitted to NIH will be funded. The first, well known to all involved with the NIH system, is peer review; applications that receive better scores are more likely to be funded. But there is a second factor, namely the funding ecology of the IC to which the application is assigned. As shown in Table 2, applications with similar peer review outcomes are less likely to be funded if they are assigned to ICs with lower overall award rates. AAB PIs are more likely to submit applications to ICs with lower award rates, and applications (whether submitted by AAB or other PIs) that focus on AAB Preferred topics are more likely to be assigned to ICs with lower award rates (Figure 1).

Hoppe et al [1] found that topic choice partially accounted for funding disparities that adversely affect AAB PIs. I confirm this, but find that IC assignment (which, of course, is linked to topic) explains the disparities just as well, and that IC award rates explain the disparities even better (Table 4).

There is variability in how well different topics fare at peer review, but inspection of Figure 2 suggests that much of this variability reflects instability of estimates stemming from smaller sample sizes. Many topics that are not favored by AAB PIs receive better (lower) priority scores than the overall average, but many other such topics receive worse scores (Figure 2, Panel A). An inspection of weighted residuals suggests that AAB Preferred topics may fare a bit worse (Figure 2, Panel B), but to a much lower degree than the differences in award rates among assigned ICs (Table 1). Furthermore, it should be noted that applications on these topics were more likely to make it past the first hurdle of peer review, that is, reaching the point of formal discussion (Table 3, see line “Discussed”); thus, if anything, peer reviewers may be slightly biased in favor of AAB Preferred topics.

Conclusion

The lower rate of funding for applications focused on AAB Preferred topics is likely primarily due to their assignment to ICs with lower award rates. These applications have similar peer review outcomes as those focused on other topics. Topic choice does partially explain race-based funding disparities, but IC-specific award rates explain the disparities to an even greater degree.

References

  1. Hoppe TA, Litovitz A, Willis KA, Meseroll RA, Perkins MJ, Hutchins BI, et al. Topic choice contributes to the lower rate of NIH awards to African-American/Black scientists. Science Advances. 2019;5: eaaw7238. doi:10.1126/sciadv.aaw7238
  2. Ginther DK, Schaffer WT, Schnell J, Masimore B, Liu F, Haak LL, et al. Race, Ethnicity, and NIH Research Awards. Science. 2011;333: 1015–1019. doi:10.1126/science.1196783
  3. Lauer MS. Resubmissions Revisited: Funded Resubmission Applications and Their Initial Peer Review Scores [Internet]. NIH Extramural Nexus. 2017. Available: https://nexus.od.nih.gov/all/2017/02/17/resubmissions-revisited-funded-resubmission-applications-and-their-initialra-peej
  4. Lauer MS. Are Attempts at Renewal Successful? [Internet]. NIH Extramural Nexus. 2016. Available: https://nexus.od.nih.gov/all/2016/02/16/are-attempts-at-renewal-successful/
  5. Mikolov T, Chen K, Corrado G, Dean J. Efficient Estimation of Word Representations in Vector Space [Internet]. 2013. Available: https://arxiv.org/abs/1301.3781
  6. R: The R Project for Statistical Computing [Internet]. Available: https://www.r-project.org/
  7. Wickham H, Averick M, Bryan J, Chang W, McGowan LD, François R, et al. Welcome to the tidyverse. Journal of Open Source Software. 2019;4: 1686. doi:10.21105/joss.01686
  8. Wickham H. ggplot2: Elegant Graphics for Data Analysis [Internet]. Springer-Verlag New York; 2016. Available: https://ggplot2.tidyverse.org
  9. finalfit: Quickly Create Elegant Regression Results Tables and Plots when Modelling [Internet]. Available: https://finalfit.org/index.html
  10. Leifeld P. texreg: Conversion of statistical model output in R to LaTeX and HTML tables. Journal of Statistical Software. 2013;55: 1–24. doi:10.18637/jss.v055.i08
Posted December 29, 2020.