Let me summarize my most important objection to NIRF. Since many stakeholders use rankings to take important decisions, it is important for universities and colleges to participate in the ranking game; but the game has several inherent problems. The biggest is that a single linear rank gives a hugely distorted view of the academic landscape. And even to arrive at that rank, the data is simply not available in a reliable fashion, so a lot of fudging takes place.
Having lots of rankings in the private sector is still OK, since most stakeholders will take each of them with a pinch of salt, and hopefully do a bit more homework on the universities they are interested in. While stakeholders may not understand the exact nature of the problems with rankings, they are skeptical of any private ranking whose aim is typically to make money out of this information. Also, having multiple rankings means that, to some extent, stakeholders realize either that the rankings are inconsistent or that there are indeed multiple ways to look at universities. But when the government creates a ranking, stakeholders may use it as their primary information, and that would be disastrous.
The second major problem with a government ranking is that it leads to inconsistency in public policy. On the one hand, having more out-of-state students is considered positive for educational outcomes; on the other hand, we could have 50% reservation for in-state students in institutes managed by the same government. The same contradiction shows up with women's participation: it is said that having a 50-50 population is good for educational outcomes, but nothing will be done to attract a greater number of women students to IITs.
The third major problem with a government ranking is that the government is further reducing its already low credibility. There is no way anyone can verify this huge amount of data, so there will be huge errors in the data, leading to some strange rankings. Who should be held responsible if a potential student trusts these rankings, gets admission, and then finds out the truth? What if this happens at a government institute? Will some heads roll, or will we just say, "You were stupid to trust the government, and don't deserve anything better"?
Last year, when the NIRF ranking came out, there were lots of questions, and answers were rarely provided; really, the only answer was, "We did not have enough time, so some things may have been overlooked. We will do a better job next year." I personally was very upset at the lack of transparency. We had been told that data consistency would be maintained by way of complete transparency: we would all know what had been submitted by the various institutions, and that would keep each institution in check from lying. When I approached some of the folks at NIRF, I was told that due to the very short period within which they had to come up with the ranking, they had taken some ad hoc decisions which might be difficult to explain to the public; hence, only for that year, transparency was missing, and the 2017 ranking would be completely transparent.
Well, I don't see the submissions of the various institutions on the NIRF website this year either. I am told that this year it was compulsory for all participating institutes to keep a copy of their NIRF submissions on their own websites, and that any institute failing to do so would be removed from the ranking. I just checked several universities, and there is at least one in the top 10 which does not have NIRF data on its website (at least, I searched on the main page, used their site search, and used a Google search with "NIRF").
Also, transparency is not just about making the submitted data public. It is also about how NIRF has interpreted that data, and how the information has been converted into marks. Yes, they do give a score in each of the five categories, which is appreciated, but they should also give out the score for each sub-component of the major factors, and tell us the various parameters they have used in the ranking.
In the small bit of research that I did, reading several submitted reports, this is what I found:
One of the top 5 business schools is claiming that in the past 3 years, 100% of their graduates have got a job through campus placement. Frankly, that is unbelievable. Not a single student opting out of placement to start a company, for example? I see that happening in all IITs, NITs, etc., but not at the top business school of the country? Not even one graduate deciding that maybe a PhD is a good idea? Not even one graduate going back to the company s/he came from 2 years ago, without participating in campus placement? (I thought even the Government of India sponsors a few candidates for an MBA at such places, and they are expected to join back, not seek campus placement.) Also, every faculty member except one has 5 years or more of experience. So either they have not hired any faculty member in the last 5 years, or everyone hired in the last 5 years has left them, or they only hire faculty with 5 years of experience. All this could be true, but it is a bit unbelievable. And at least as far as student placement is concerned, if the data is correct, then it is quite sad.
In one university, the submitted data for the consultancy amount is X, while the summary sheet for that university put out by NIRF shows more than 10X. I could understand exactly 10X: someone typing in the information could have misplaced a decimal point. But an arbitrary number which is unbelievably large is rather strange.
In one university, the number of faculty members in the NIRF data sheet is unexpectedly large. I went to the university website; the NIRF submitted data is not there. Then I went to the website of each department and counted the faculty members shown. The total is less than half of what the NIRF data sheet claims.
The publications data has been outsourced by NIRF. I am told that for one university, the number shown in the data sheet is much lower than what the university is claiming. The university folks tell me that NIRF sent them an email several months ago stating the number of papers it had found; the university wrote back giving its own numbers, which were much higher. Now, if both of them are looking at the same database, the only reason for this difference could be that the search queries are different. In particular, faculty members use different names for the university (like IIT Kanpur, I.I.T.K., Indian Institute of Technology Kanpur, and so on), and it is possible that they searched for some names and not others. It would have been absolutely trivial for them to share their search query, or to ask for ours. But there was no communication from them, and when the rankings came out, they had simply used their own data. To me, this is the height of callousness and incompetence. And we are going to use such data to take important decisions on funding and autonomy.
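To see why the choice of search strings matters so much, here is a minimal sketch with made-up records; none of this is actual NIRF data or their actual Web of Science queries, but it shows how a narrow affiliation query undercounts papers published under other name variants.

```python
# Made-up paper records for illustration; not actual NIRF or WoS data.
papers = [
    {"title": "Paper A", "affiliation": "Indian Institute of Technology Kanpur"},
    {"title": "Paper B", "affiliation": "IIT Kanpur"},
    {"title": "Paper C", "affiliation": "I.I.T. Kanpur"},
    {"title": "Paper D", "affiliation": "Indian Inst. of Technology, Kanpur"},
]

def count_matches(variants):
    """Count papers whose affiliation contains any of the given name variants."""
    return sum(
        any(v.lower() in p["affiliation"].lower() for v in variants)
        for p in papers
    )

# A narrow query misses legitimate papers...
print(count_matches(["Indian Institute of Technology Kanpur"]))   # 1
# ...that a broader set of name variants picks up.
print(count_matches(["Indian Institute of Technology Kanpur",
                     "IIT Kanpur", "I.I.T. Kanpur",
                     "Indian Inst. of Technology"]))              # 4
```

Sharing the query set, as suggested above, would have made this discrepancy trivial to diagnose.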
I see in the data sheets that engineering colleges were asked for the median salary, while management schools were asked for the average salary. Why this difference? I thought the median salary captures placement performance much better, and should be asked of everyone. But anyway, I checked the numbers. While many institutes have given numbers which are believable as median salaries, there are many where the numbers are simply not believable. My guess is that they have given average salaries rather than median salaries. (The average is invariably much higher than the median for most colleges.) In fact, for about two institutes, I am sure this has happened. I guess in some places this may have happened inadvertently, but in others it may be a deliberate error.
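A tiny sketch, with invented salary figures, of why the average can be nearly double the median for a typically skewed placement distribution:

```python
# Invented placement figures (in lakhs per annum) for illustration only:
# most graduates get modest offers, a few outliers pull the average up.
from statistics import mean, median

salaries_lpa = [6, 6.5, 7, 7, 7.5, 8, 8, 9, 10, 80]

print(f"median: {median(salaries_lpa):.2f} LPA")  # 7.75 LPA
print(f"mean:   {mean(salaries_lpa):.2f} LPA")    # 14.90 LPA
# A single 80 LPA offer nearly doubles the average while leaving the
# median untouched, which is why an average reported as a "median"
# stands out as implausibly high.
```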
Then there is the data about capital expenses (not counting building construction). Will this number vary drastically from year to year? What if it is shown as a small fraction of last year's expense? Shouldn't that raise the suspicion that perhaps last year they had some construction going on and this year there is none? Shouldn't they seek some verification of the data, at least in cases of suspicion?
And note that I am only talking about universities ranked in the top few in some category or the other. I understand that it is not easy to verify data for all colleges and universities, but they can always have a 2-stage or 3-stage process, as sketched below. That is, get unverified data from all colleges. The only "stick" at this stage is that all the data will be posted on the NIRF website; if any questions are raised, they will be investigated, and if a serious error is found, the information will be given to the press and the institute will be barred from the ranking for some time. With this, you finalize the top 120 or so universities in each category, and for these, some level of checking can be done: maybe some proofs can be asked for, maybe someone can cross-check the information against the university website, and so on. If any glaring errors are found, they are out of the ranking. This way, there will be better trust in the top 100. And finally, for those whom you are going to declare in the top 20-25, there should be yet another level of data verification: maybe an agency can be hired to actually visit the university and verify everything. So the top 20 or so ranks would be based on very high quality data.
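Here is a rough sketch of what such a staged process could look like. The thresholds (120, 25) follow the numbers above; the check functions are placeholders for whatever document checks and site visits are actually adopted, not a description of any existing NIRF process.

```python
# A sketch of the staged verification proposed above. The check functions
# are placeholders; a real process would define them concretely.
def staged_ranking(submissions, score, passes_document_check, passes_site_visit):
    # Stage 1: rank everyone on unverified, self-reported data
    # (publishing the raw data is the deterrent against fudging).
    ranked = sorted(submissions, key=score, reverse=True)

    # Stage 2: document-level checks (proofs, website cross-checks)
    # for roughly the top 120; glaring errors mean removal.
    vetted = [s for s in ranked[:120] if passes_document_check(s)]

    # Stage 3: on-site verification for the would-be top 25.
    final_top = [s for s in vetted[:25] if passes_site_visit(s)]
    return final_top, vetted, ranked

# Demo with dummy data and checks that pass everyone.
demo = [{"name": f"U{i}", "score": i} for i in range(200)]
final, vetted, everyone = staged_ranking(
    demo,
    score=lambda s: s["score"],
    passes_document_check=lambda s: True,
    passes_site_visit=lambda s: True,
)
print(len(final), len(vetted), len(everyone))  # 25 120 200
```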
(Of course, the problem of a linear ranking not reflective of all diverse strengths of universities will always remain.)
Notice that I have not discussed the parameters per se, only the poor quality of the data collected against those parameters.
The parameters are also a problem. They are strongly biased in favor of bigger institutes, and I wonder why. I can see that a bigger class leads, to some extent, to peer-to-peer learning, but beyond a point it does not help. What helps after that is having a variety of disciplines and a variety of courses available on the campus; merely having a larger number of students does not guarantee that variety. There are many other serious problems with the parameters, but maybe that is for another blog. I do wonder if they did any research to establish a correlation between those factors and better teaching/learning or better research. I also wonder if they did any sensitivity analysis on their parameters: if the weights are changed slightly, does it result in major changes in the top order? (A sketch of such a check follows.)
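For what it's worth, such a sensitivity analysis is easy to run. A sketch, using random institute scores across the five categories; the scores and weights here are illustrative assumptions, not NIRF's actual data:

```python
# A weight-sensitivity check: jitter the category weights a little and
# see how much the top-10 changes. Scores are random, for illustration.
import random

random.seed(1)
N_INST, N_CAT = 50, 5
scores = [[random.uniform(0, 100) for _ in range(N_CAT)] for _ in range(N_INST)]
weights = [0.30, 0.30, 0.20, 0.10, 0.10]  # illustrative five-category split

def top_k(w, k=10):
    total = lambda i: sum(wc * sc for wc, sc in zip(w, scores[i]))
    return sorted(range(N_INST), key=total, reverse=True)[:k]

baseline = set(top_k(weights))
for trial in range(5):
    # Jitter each weight by up to +/-10%, then renormalize to sum to 1.
    jittered = [w * random.uniform(0.9, 1.1) for w in weights]
    s = sum(jittered)
    jittered = [w / s for w in jittered]
    overlap = len(baseline & set(top_k(jittered)))
    print(f"trial {trial}: {overlap}/10 of the baseline top-10 retained")
```

If small jitters regularly reshuffle the top of the list, the precise ranks are an artifact of the chosen weights rather than a robust signal.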
I did not write last year after the NIRF rankings were published, since I wanted to see what they would do once they had enough time to do things right. But unfortunately, they have not utilized the time effectively.
Of course, the good thing is that most students/parents who approach me with admission-related queries do not seem to bother about the NIRF ranking. I hope it stays that way.
Added on 9th April:
More interesting stuff.
Jawaharlal Nehru Centre for Advanced Scientific Research, which has no teaching program and is a tiny research center, is ranked the third best teaching place in the country, overall. Such is the stupidity of this ranking that a place without a teaching program has been declared the 3rd best place in India for teaching.
Homi Bhabha National Institute shows its annual budget as 473 crores and its faculty strength as 973: basically, the entire budget and scientific staff of the DAE institutions. Are they really educational institutions? The fact that they have been declared a deemed university, to encourage their scientists to get a PhD in-house, now means that they can declare the entire budget as the university budget and all scientists as faculty. And NIRF 2017 accepts that. Further, with 973 faculty members, they have 252 publications listed in Web of Science, easily the worst ratio among all research places in India. But guess what: they are in the overall list because of a very strange rule, under which the number of faculty members is deemed to be 10% of the number of students, irrespective of the actual number of faculty. So in the case of HBNI, it is assumed that these 252 papers have been written not by 973 faculty members, but by 310. Absolutely crazy stuff.
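To spell out the arithmetic: the paper and faculty counts below are the figures quoted above, while the student count is my own inference from the deemed-faculty figure (310 would be 10% of about 3,100 students), not a number from the NIRF data sheet.

```python
# Working through the HBNI numbers quoted above. The student count is an
# assumption inferred from the deemed-faculty figure (310 = 10% of 3100).
actual_faculty = 973
papers_wos = 252
students = 3100                       # assumed: 10% of 3100 = 310
deemed_faculty = 0.10 * students      # the rule: faculty deemed as 10% of students

print(f"papers per actual faculty: {papers_wos / actual_faculty:.2f}")   # ~0.26
print(f"papers per deemed faculty: {papers_wos / deemed_faculty:.2f}")   # ~0.81
# The deeming rule roughly triples HBNI's apparent per-faculty output.
```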
This article in Wired points out that there are drastic changes between the 2016 ranks and the 2017 ranks. Can the quality of universities change this drastically from year to year? It specifically mentions a university having jumped from #83 to #12. That means either the ranking last year was arbitrary or the ranking this year is arbitrary. How do we know that it is not arbitrary this year?
If you look at the teaching-learning component of the overall ranking, you would find that most of our agriculture universities and veterinary colleges apparently have far superior teaching programs than the IITs. Maybe we should hand over the IITs to the Ministry of Agriculture; it seems to be running far better academic institutions.
Very sensible advice from Prof. Ashish Nanda, Director, IIM Ahmedabad: "Rate, don't rank, academic institutions" in the Hindu Business Line.
And how does one explain the huge differences between the ratings by NAAC and the rankings by NIRF?
Added on 11th April, 2017:
A blog by Dhruv Baldawa, "Why we should not be ranking our educational institutions"