By:

Quaid Najmi

4 January 2025 at 3:26:24 pm

YouTuber challenges FIR, LoC in HC
Mumbai: The Bombay High Court on Thursday issued notice to the state government on a petition filed by UK-based medico and YouTuber Dr. Sangram Patil, seeking to quash a Mumbai Police FIR and revoke a Look Out Circular (LoC) issued against him in a criminal case.

Justice Ashwin D. Bhobe, who heard preliminary submissions from both sides, sought a response from the state government and posted the matter for Feb. 4. Maharashtra Advocate-General Milind Sathe informed the court that the state would file its reply in the matter within a week.

Indian-origin Dr. Patil, who hails from Jalgaon, is facing a criminal case here for posting allegedly objectionable content involving Bharatiya Janata Party (BJP) leaders on social media.

After his posts on a Facebook page, ‘Shehar Vikas Aghadi’, a Mumbai BJP media cell functionary lodged a criminal complaint, following which the NM Joshi Marg Police registered an FIR (Dec. 18, 2025) and subsequently issued an LoC against Dr. Patil, restricting his travel.

The complainant, Nikhil Bhamre, filed the complaint in December 2025, contending that Dr. Patil had on Dec. 14 posted offensive content intended to spread ‘disinformation and falsehoods’ about the BJP and its leaders, including Prime Minister Narendra Modi.

Among other provisions, the police invoked BNSS Sec. 353(2), which attracts a three-year jail term for publishing or circulating statements or rumours through electronic media with intent to promote enmity or hatred between communities.

Based on the FIR, Dr. Patil was detained and questioned for 15 hours when he arrived with his wife from London at Chhatrapati Shivaji Maharaj International Airport (Jan. 10), and was again prevented from returning to Manchester, UK, on Jan. 19 in view of the ongoing investigations. On Wednesday (Jan. 21), Dr. Patil recorded his statement before the Mumbai Police; he has now moved the high court.

Besides the quashing of the FIR and the LoC, he has sought removal of his name from the database imposing restrictions on his international travel. Through his Senior Advocate Sudeep Pasbola, the medico has sought interim relief in the form of a stay on the further probe by Crime Branch-III and on coercive action, a restraint on filing any charge-sheet during the pendency of the petition, and permission to return to the UK. Pasbola submitted to the court that Dr. Patil had voluntarily travelled from the UK to India and was unaware of the FIR when he landed here. Sathe argued that Patil had appeared in connection with other posts and was not fully cooperating with the investigators.

Global Rankings, Local Blind Spots

The world’s most-watched university league tables reveal more about academic geopolitics than teaching quality, exposing the fallacy of the international university ranking system.

The QS World University Rankings were released a few days ago. As if on cue, the Indian media lapped them up, as they do each year. QS’s deftly written press release prompted many news outlets to report a brighter outcome this year for Indian universities. Most of them, however, missed the significant point: the highest-ranked Indian university this year (IIT-Delhi, at 123) sits five places lower than last year’s (IIT-Bombay, at 118). These two, along with IIT-Madras, were the only Indian institutions ranked between 100 and 200.


Such international educational rankings have become routine annual attention-grabbers. Their prominence has only grown since the turn of the millennium, when higher education became truly global. But are they genuinely useful or just another instrument for perpetuating the Western world’s hegemony over the global academic hierarchy?


In 2003, Nian Cai Liu, a professor at Shanghai Jiao Tong University, decided to rank the world’s universities using a number of relatively objective criteria. His modest idea gave birth to the so-called Shanghai List (officially, the Academic Ranking of World Universities, or ARWU). A year later, in London, the Times Higher Education Supplement began compiling its own ranking, initially in partnership with the consulting company Quacquarelli Symonds. In 2009, disagreement between the partners led to the emergence of two independent rankings: the THE and QS lists. Alongside ARWU, these have become the three most prestigious and widely referenced systems for evaluating the quality of tertiary education worldwide.


Although their methodologies vary, they tend to produce broadly similar results. The top 100 in each ranking almost invariably comprises around 50 American universities, followed by British, Canadian and Australian institutions with a few Swiss, French, Singaporean or Chinese universities thrown in for flavour or legitimacy. Russia, Central and Eastern Europe, and Japan are conspicuously absent.


While experts continue to debate the quality, transparency or impartiality of the ranking systems, it is worth considering their actual usefulness from a prospective student’s point of view. University rankings should tell students where they can expect the best teaching, learning opportunities, outcomes and cost-benefit balance. Do these rankings truly serve that purpose? The problem is that they often shift the focus away from good teaching and towards other factors.


Perhaps teaching quality and faculty training are difficult to quantify. But the ranking systems hardly try. Their main emphasis lies not in pedagogy but in research output. In the THE rankings, research environment, publications and citations account for a hefty 59 percent of the score. In QS, the figure is 50 percent, and in ARWU, even higher. By contrast, the teaching and learning environment counts for just 10 percent in THE and 29 percent in QS. Although QS recently increased its weighting for employment outcomes and sustainability to 20 percent, THE covers such outcomes only under broader rubrics. Other criteria include student-teacher ratios, the proportion of international students and faculty, and institutional reputation. This methodology ends up reinforcing existing perceptions of prestige and, with them, the dominance of large Western universities, particularly expensive private ones.


This emphasis is misplaced, or at least contestable, for several reasons. First, not all global higher education institutions follow the Anglo-Saxon or German model of research universities. In Russia, Eastern Europe and Japan, the majority of universities place greater (if not exclusive) emphasis on teaching quality. As a result, they stand little chance of being recognised in these rankings. Russia has just one university in ARWU’s top 100 and none in QS or THE. The Czech Republic has only one in the top 300. Institutions in the Balkans rarely make it onto the lists at all.


Motivations differ too. For high-fee private universities in the US and UK, it is crucial to top the rankings in order to attract wealthy students. By contrast, institutions in countries with free or low-cost education face no such pressure. This partly explains the relatively low position of German universities – a supreme irony, given that the modern research university was born there. Even the renowned Max Planck Society fails to appear in the top 100 of most rankings.


Universities can also game the system. Even seemingly objective indicators such as publication volume can be manipulated. Many American and British universities attract top graduate students from around the world and employ them for both teaching and research, fuelling massive research output. The world is now awash in a deluge of scientific papers, many indistinguishable and with little impact on the advancement of knowledge. Reviewers are often unable to keep up, and mediocre work slips through. Quantity, not quality, has become the coin of the realm, incentivising a publish-or-perish culture that prioritises citation counts over originality. Entire journals have sprung up to cater to this demand, some little more than vanity presses in academic garb.


Some wealthy Arab universities have been accused of hiring highly cited academics under part-time contracts purely to lift their publishing statistics and, in turn, their rankings. Others hire professional consultants to prepare documentation designed to appeal to the ranking agencies' criteria. Some even employ ‘ranking managers’ to advise on how best to position themselves in the global hierarchy.


Furthermore, countries differ significantly in their higher education systems, socio-economic conditions and developmental stages. Russia has demonstrated its scientific prowess across fields such as space technology and defence, often matching or surpassing the West. Its science and engineering education, particularly in physics and mathematics, is globally respected. Likewise, India, despite institutional shortcomings, has achieved significant success in high-tech fields using indigenous technology and talent. Many top scientists at ISRO or BARC studied at universities not featured in global rankings.


Deciding where to study is complicated by the wide variance in options and cost-benefit outcomes. Education today transcends national borders, with Indian students free to choose from universities across dozens of countries. Yet rankings biased towards research tip the scale away from teaching quality. Experience shows that well-screened students, taught by dedicated teachers, can excel at both ranked and unranked institutions. Many such graduates have gone on to succeed in industry and academia worldwide.


A short-term focus on research may benefit science, but in the long run it risks producing fewer well-trained specialists. Indian universities were among the first to raise concerns over these distortions. Since 2018, QS has published a separate India-specific ranking, drawing on the BRICS model and incorporating indicators such as staff with PhDs, employer reputation, sustainability, and internationalisation.


Hopefully, such improvements will continue so that rankings become a more reliable guide for students seeking quality education, wherever in the world they may find it.


(The author is a veteran journalist based in Navi Mumbai. Views personal.)
