The Role of Computing Resources in Publishing Foundation Model Research
Exploring the relationship between computing resources and scientific advancement in foundation models (FM).
Research Overview
Abstract
Cutting-edge research in Artificial Intelligence (AI) requires considerable resources, including Graphics Processing Units (GPUs), data, and human resources. In this paper, we evaluate the relationship between these resources and the scientific advancement of foundation models (FM). We reviewed 6,517 FM papers published between 2022 and 2024 and surveyed 229 first authors to understand the impact of computing resources on scientific output. We find that increased computing is correlated with individual paper acceptance rates and national funding allocations, but not with research environment (academic or industrial), domain, or study methodology.
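To make the kind of correlation analysis described above concrete, here is a minimal sketch that relates reported GPU counts to acceptance outcomes; the file name and column names are hypothetical assumptions for illustration, not the authors' actual data or analysis code.

```python
# Hypothetical sketch: does reported GPU count relate to paper acceptance?
# The CSV file and columns ("num_gpus", "accepted") are illustrative assumptions.
import pandas as pd
import statsmodels.api as sm

papers = pd.read_csv("fm_papers.csv")       # one row per surveyed paper (assumed schema)
X = sm.add_constant(papers[["num_gpus"]])   # predictor: number of GPUs used
y = papers["accepted"]                      # outcome: 1 = accepted, 0 = not accepted

result = sm.Logit(y, X).fit()
print(result.summary())                     # a positive, significant coefficient suggests association
```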
Study Scope
Analysis of 34,828 accepted papers between 2022 and 2024, identifying 5,889 foundation model papers.
Resource Analysis
Examining GPU access, TFLOP measurements, and their correlation with research outcomes.
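As background on how TFLOP-scale measurements of training compute can be derived, the sketch below multiplies GPU count, peak throughput, utilization, and wall-clock time; the numbers are assumed for illustration and are not taken from the study.

```python
# Back-of-the-envelope compute estimate; all values below are assumptions, not study data.
def total_flops(num_gpus: int, peak_tflops: float, utilization: float, hours: float) -> float:
    """Total FLOPs = GPUs x (peak TFLOP/s x 1e12) x utilization x seconds."""
    return num_gpus * peak_tflops * 1e12 * utilization * hours * 3600

# Example: 8 GPUs at an assumed 312 peak TFLOP/s each, 40% utilization, 72 hours.
estimate = total_flops(num_gpus=8, peak_tflops=312.0, utilization=0.4, hours=72.0)
print(f"Estimated training compute: {estimate:.2e} FLOPs")
```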
Survey Results
Insights from 229 authors across 312 papers on resource usage and impact.
Key Findings
Computing Impact
Greater GPU access is associated with stronger pre-trained models and higher acceptance rates at top AI/ML venues.
Resource Distribution
Analysis of hardware resources and their relationship with publication success in selective AI/ML computer science conferences.
Future Implications
Projections for GPU availability and recommendations for individuals and institutions to ensure progress in AI research.
Research Team

Yuexing Hao
EECS, MIT & Cornell University

Yue Huang
CSE, University of Notre Dame

Haoran Zhang
EECS, MIT

Zhenwen Liang
CSE, University of Notre Dame

Paul Pu Liang
EECS, MIT

Yue Zhao
School of Advanced Computing, USC

Lichao Sun
CS, Lehigh University

Saleh Kalantari
Cornell University

Xiangliang Zhang
CSE, University of Notre Dame

Marzyeh Ghassemi
EECS, MIT