Why RUG policy is based on university rankings

University rankings have long been popular in the US, and they are now gaining popularity amongst European universities such as the RUG. Press releases are regularly published praising the RUG for being a top 100 university. However, upon closer inspection of these ‘prestigious’ rankings, we have concluded that they are by no means a guarantee of quality.

By Christine van der Veer

The RUG site names at least 10 different rankings that feature our university. One of these is the ‘Academic Ranking of World Universities’, also dubbed the Shanghai ranking: the most influential and stable ranking, published since 2003. Right now, the RUG is listed at number 72. The text accompanying these rankings states: ‘The UG is proud to be classified as a top university, with a ranking below 100 in rankings such as the “Shanghai” ARWU ranking and the World University Rankings.’ But is our inclusion in these rankings really something to be proud of? Of course, they do help the RUG to attract international students; these rankings can be decisive in a foreign student’s choice to study here. However, we wonder whether rankings should be decisive in those kinds of decisions. We think that a university in particular, as a critical institution, should not blindly follow this measuring and ranking ‘trend’. We support qualitative (over quantitative) values in education, and therefore the RUG should adopt a critical attitude towards rankings.

Rankings as a driving force

University policy is increasingly shaped by the pressure to score well in rankings. Admittedly, Sibrand Poppema, chairman of the executive board, agrees with us that the importance of rankings has been overstated. Sadly, this opinion is not reflected in Poppema’s administration. If we examine developments at the university, it is undeniable that rankings are an important driving force behind its decisions. Examples can be found in the focus on internationalization and the heightened emphasis on publishing. So while the university’s rhetoric may be that rankings are unimportant, practice proves otherwise. At times, this contradiction becomes ironic, such as when the ‘good’ rankings are used to legitimise the university’s reappointment of Poppema for another four-year term. The rankings are not important, unless they happen to be convenient.

At the expense of quality and plurality

What exactly is the problem with these rankings? Even though their focus varies, they all base their scores on numbers: the number of publications, citations, awards, cash flows, researchers, student satisfaction (in numbers!), and reputation. We see this emphasis on numbers as a dangerous development. First and foremost, the numbers and measurements used for these rankings ignore the value of plurality within the university. The RUG has many different faculties and disciplines, each with its own way of researching and educating. Training a student to become a philosopher requires completely different means – think of the manner of education, time, or number of students – than educating someone to be a dentist. When rankings measure all courses in the same way, these differences are neglected. This has serious consequences: the plurality of the university will diminish, at the expense of small, ‘non-profitable’ courses.

Another issue with this persistent focus on numbers is the pressure of publication counts. To achieve the highest possible ranking, the university strives to maximize output. Researchers are constantly pushed to publish more and more, and as a result they are treated as producers rather than scientists. They are allowed to stay as long as they create enough output. Because of this pressure, it becomes more attractive to publish a large quantity of similar articles than, say, one well-thought-out book. This leads to fraud, as well as inaccuracy and repetition. The same goes for citation ‘scores’, another important part of rankings. The number of citations is supposed to be a sign of quality. However, by measuring only the number of citations, rather than the reasons behind them, important motives that have nothing to do with quality are overlooked. Works are often cited to contest them, to debunk rather than to praise. Or, even worse, scholars can make citation deals with each other to make sure their numbers are high enough.

Recognition is thus given to values that should not matter to a university, namely rapidity (read: inaccuracy) and appearance. One example of where this can lead is found in the ‘Times Higher Education Ranking’ of 2010. In this ranking, the University of Alexandria reached 147th place, surpassing the universities of Delft, Rotterdam, Amsterdam, and Groningen. This may seem a praiseworthy achievement; however, an investigation revealed that Mohamed El Naschie was to thank for the large number of citations. This Egyptian engineer had published 320 articles for the university, in a journal of which he was the editor. More instances like this one can be found. Think, for example, of Diederik Stapel, who also caved to publication pressure. In cases like these, it is very easy to blame the researchers, but it is important that we also look for structural causes. When the demand is quantity, it is easy to forget the importance of content.

Education as number one!

We can conclude that, due to rankings, universities are becoming more alike. Additionally, the rankings shift the focus of universities away from education and knowledge. More money is being invested in marketing (helpful for citation and reputation scores). A good illustration of this is the new position of ‘Head of International Strategy and Relations’ (since 2016) at the RUG, which focuses on visibility, marketing, internationalization, and, most importantly, rankings. We think it is time we start prioritizing quality over quantity and content over marketing. We, the students of DAG, think the focus on rankings is a negative influence on our education. We are thrilled that Poppema seems to agree with us. However, policies should start reflecting this position. For now, the rankings remain an invisible but dominating force within the university’s policies. The critical attitude expressed by Poppema (and the university) shouldn’t remain mere rhetoric, but should be reflected in the board’s actions.

One thought on “Why RUG policy is based on university rankings”

  1. Ben Nijhoff, CEO says:

    Completely agree! Greetings from an earlier generation that witnessed the beginning of this horrible misery. A wonderful initiative, then; it is high time to change course and put education back in the service of country and society.
