# Issues Regarding the Reliability of India’s HEI Ranking System: NIRF Facing Criticism
The unveiling of the eagerly awaited ninth edition of the National Institutional Ranking Framework (NIRF) in August has brought attention to escalating worries about its legitimacy. While rankings serve as an essential measure for students choosing their higher education institutions (HEIs), recent findings raise doubts about whether this government-supported framework genuinely provides a transparent representation of educational quality or whether it has become susceptible to manipulation by institutions seeking to improve their standings. Research policy analysts have sounded alarms, suggesting that certain institutions may be exploiting the system and thereby presenting a distorted picture of their educational quality to prospective students.
## The Development of Indian HEI Rankings
Launched in 2015, NIRF was introduced by the Indian government to address a significant gap: the absence of a ranking of Indian institutions at a national scale. Before this, internationally recognized ranking systems such as the QS World University Rankings and the Times Higher Education World University Rankings gave minimal attention to Indian HEIs. NIRF sought to fill this gap, aiming to create an accurate, data-driven ranking structure tailored specifically for India.
The NIRF framework categorizes institutions based on various parameters divided into five primary criteria:
1. **Teaching, Learning, and Resources (30%)**
2. **Research and Professional Practices (30%)**
3. **Graduation Outcomes (20%)**
4. **Outreach and Inclusivity (10%)**
5. **Perception (10%)**
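To make the weighting concrete, here is a minimal sketch of how a composite score could be combined from the five published criterion weights. The parameter names and sub-scores are illustrative assumptions; NIRF's actual methodology breaks each criterion into further sub-metrics.

```python
# Hypothetical sketch of a NIRF-style composite score using the five
# published criterion weights. Sub-scores are assumed to be on a 0-100
# scale; the dictionary keys are illustrative names, not official ones.

NIRF_WEIGHTS = {
    "teaching_learning_resources": 0.30,
    "research_professional_practice": 0.30,
    "graduation_outcomes": 0.20,
    "outreach_inclusivity": 0.10,
    "perception": 0.10,
}

def composite_score(scores: dict) -> float:
    """Weighted sum of criterion scores (each on a 0-100 scale)."""
    return sum(NIRF_WEIGHTS[k] * scores[k] for k in NIRF_WEIGHTS)

# A hypothetical institution's criterion scores:
example = {
    "teaching_learning_resources": 80.0,
    "research_professional_practice": 70.0,
    "graduation_outcomes": 90.0,
    "outreach_inclusivity": 60.0,
    "perception": 50.0,
}
print(composite_score(example))  # 0.3*80 + 0.3*70 + 0.2*90 + 0.1*60 + 0.1*50 = 74.0
```

Because Research and Professional Practices carries 30% of the weight, even modest inflation of research metrics can move an institution's composite score noticeably, which is what makes that criterion an attractive target for gaming.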
Although the purpose behind establishing NIRF was to foster a more accountable and standardized assessment process, ongoing discrepancies in its rankings have drawn significant backlash from both academic and policy-making circles.
## Discrepancies in Rankings and Manipulative Strategies
V. Ramgopal Rao and Abhishek Singh from the Birla Institute of Technology and Science (BITS) have emerged as some of the most prominent critics of NIRF’s methodological shortfalls, particularly regarding consistency. In a recent **[study](https://www.currentscience.ac.in/Volumes/126/11/1321.pdf)**, these researchers examined the rankings of the top 100 institutions across two consecutive years (2022–2023), observing that while the top 20 rankings remained fairly stable, institutions ranked below them experienced significant fluctuations.
This instability in ranking outcomes prompts critical inquiries. Are these institutions genuinely enhancing the quality of their education within such short time frames, or are they finding methods to manipulate the rankings for their own benefit? According to experts like Rao, the latter is more likely to be the case.
## Manipulating the System: The Research Metrics Scenario
“Gaming the system” involves deliberately skewing practices to artificially elevate ranks within NIRF. One of the most easily manipulated metrics is research output, which falls under the **Research and Professional Practices** criterion. As stated by **[Moumita Koley](https://dstcpriisc.org/moumita-koley/)**, a senior research analyst at the DST Centre for Policy Research at the Indian Institute of Science, Bangalore, institutions have been distorting research metrics not through data fabrication—an endeavor complicated by NIRF’s dependence on leading academic databases like Scopus and Web of Science—but by prioritizing quantity over quality.
The heavy emphasis on bibliometrics as a measure of research output has, perhaps unintentionally, fostered a “publish or perish” culture in which institutions pressure their faculty to publish prolifically, often at the expense of the work’s substantive value or societal impact. Raw publication and citation counts are treated as indicators of academic success, pushing institutions to prioritize the volume of output over its substance.
### A Prime Example: Saveetha Dental College
A striking illustration of manipulation via research metrics is the case of Saveetha Dental College in Chennai, which topped the NIRF rankings for dental institutes in 2023 and 2024. However, an **[investigation](https://www.science.org/content/article/did-nasty-publishing-scheme-help-indian-dental-school-win-high-rankings)** carried out by *Science* in collaboration with *Retraction Watch* uncovered allegations of academic publishing misconduct. The college was accused of engaging in mass self-citation, in which undergraduates and faculty cited each other’s works to artificially boost citation numbers. Researchers associated with the institution noted it was suspicious that their papers suddenly garnered hundreds of citations, highlighting the harmful repercussions of such practices on the integrity of the institution’s research output.
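One simple red flag for mass self-citation of the kind alleged above is the share of a paper’s incoming citations that originate from the same institution. The sketch below is an illustrative assumption of how such a share might be computed, not the methodology used by *Science* or *Retraction Watch*; the affiliation strings are hypothetical.

```python
# Illustrative sketch: the fraction of a paper's citations that come
# from authors at the cited paper's own institution. A persistently
# high share across many papers would warrant closer scrutiny.

def self_citation_share(citing_affiliations: list, institution: str) -> float:
    """Fraction of incoming citations whose authors share the institution."""
    if not citing_affiliations:
        return 0.0
    own = sum(1 for a in citing_affiliations if a == institution)
    return own / len(citing_affiliations)

# Hypothetical citation list for one paper from "Institution A":
citations = ["Institution A", "Institution A", "Institution B", "Institution A"]
print(self_citation_share(citations, "Institution A"))  # 0.75
```

Some self-citation is normal in any research group; it is the scale and coordination (e.g., undergraduates systematically citing faculty) that turns it into metric manipulation.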
## The Rise of Citation Cartels
Strategies aimed at manipulating research are not confined to internal self-citation schemes. **Achal Agrawal**, founder of **India Research Watchdog (IRW)**, pointed out another concerning trend: the formation of “citation cartels.” These cartels are networks of researchers or even paper mills that collaborate across different regions, citing one another’s