
By Prasad Dixit

11 October 2024

The Human Advantage in an Artificial Age

As artificial intelligence grows smarter and more efficient, the real battle may not be about machines surpassing humanity but about whether humans squander the qualities that still set them apart. With the recent news of a Chinese robot beating the human record in a half-marathon, there is renewed debate on how AI could outsmart human beings. Many experts see it as yet more proof of impending disaster as AI takes over most jobs in the years to come.

This is not the first time that human civilization has faced a technological revolution with the potential to reshape society and the economy in a profound manner. There is, however, a crucial difference with the AI-driven revolution that is often missed. The first industrial revolution happened because the steam engine was invented, leading to the mechanization of production. It was followed by the discovery of electrical energy and the technologies to harness it for mass production. The next wave was led by computerization and automation in practically every field, covering both offices and industrial shop floors, through mainframes, personal computers, and programmable logic controllers.

While these leaps in technology are very different in terms of their specific underlying inventions, they all have one thing in common: they were invented to do things that were humanly impossible. One steam engine or electric motor could do the work that hundreds of humans would never accomplish even with their collective muscle power. Automation of the manufacturing assembly line delivers speed and accuracy that human beings could never achieve.

Beyond Human

Technological advances in telecommunication, for that matter, have simply expanded the range of 'hearing' and 'seeing' far beyond what human vocal cords, ears, and eyes could manage on their own.
Computers, at their core, essentially do mathematics and calculations at a speed and accuracy that the human brain can never achieve. To add to that, machines using all these innovations work tirelessly, without fatigue, for durations that human beings could never match. Although AI is yet another highly potent technological innovation, it is not as straightforward as the previous ones. It can absorb and synthesize amounts of data that the human brain perhaps cannot. The ability of AI to answer any question reasonably well using all the global knowledge made available to it, to summarize enormous amounts of data and text quickly, to draw a complex picture from verbal instructions, to predict a trend, to recognize and highlight a specific face in a fraction of a second from among millions of faces, or to write code from simple English instructions: these are all examples where the speed and accuracy of the underlying computation deliver what human beings cannot match.

However, there are several areas where human beings are trying to improve AI so that it can, some day, match or exceed capabilities that humans themselves already have. Examples include the ability of AI to completely replace a human driver safely in all situations, understand the full context or intent behind a statement, carry out complex and well-coordinated mechanical activity in response to unpredictable situations, react appropriately by correctly assessing the emotions at play, and integrate generated code appropriately into an existing larger systems landscape. In such cases, AI is not exhibiting any capability that is humanly impossible to match. On the contrary, AI is trying to catch up with what humans can do easily. In other words, in these areas, AI is trying to become what humans already are. This very aspect separates the AI-driven technology revolution from all the previous ones.
Direct Competition

It is often said that AI and humans will co-exist in the future, and that people will need to change their ways of working. It is equally obvious that AI is going to compete directly with humans in many sectors. Equipment with embedded chips on board competes with humans even today. A case in point is household equipment: 'intelligent' washing machines and dishwashers, along with robots that do vacuum cleaning and floor mopping, compete with humans offering these services. A human household help can perform these activities far better than a machine can. However, given an affordable choice, an increasing number of households prefer machines over human maid services for a reason. Human household help may not always be punctual, sincere, honest, and reliable. Machines are. Uncontrolled emotions, anger, frustration, laziness, indiscipline, and absenteeism affect humans, but not AI-driven machines (at least until AI itself acquires emotions of its own and becomes self-aware some day).

This comparison between AI and humans is likely to become far more prominent and consequential as AI-driven machines and robots grow more intelligent and thereby start competing far more effectively with human capability in many spheres. Competition is said to bring about improvement. Just as AI improves itself through continuous learning to mimic human behaviour and actions, the human workforce also needs to improve itself by avoiding the behavioural issues and inefficiencies referred to above. Otherwise, humans will lose the natural advantage that they still enjoy over AI, an advantage likely to continue even in the foreseeable future. Employers and consumers in the labour-intensive service sector will accept AI-driven machines and robots, with all their known limitations, if they turn out to be a better net deal in comparison to services offered by humans. This has tremendous significance for India.
Many countries in the developed world do not have a young population with reasonably good IQ in the required numbers. India, on the other hand, has it in abundance. One could compare this with the abundant availability of thorium or sunlight in India relative to the Western world. Consequently, unlike many countries that take a uranium-centric approach to nuclear energy, India's approach needs to be centred on thorium, and its strategy for renewable, non-conventional, green energy needs to be based on solar power.

Indian Context

Strategies for adopting AI need to be similarly tailored to the Indian context. India should adopt AI in the areas where it clearly has an advantage over humans in terms of speed, throughput, ease of use, accuracy, and efficiency. However, the use of AI needs to be judiciously controlled in areas where AI is still trying to catch up with the capabilities of the human mind and body. Labour-intensive services such as driving, caregiving for the elderly, parcel delivery, security, and the maintenance and repair of equipment are all examples in that category. Educational policies and the overall work culture in the country need to appreciate this reality. Just as AI experts are trying hard to 'teach' AI algorithms and improve them through supervised learning, another set of experts needs to sensitize humans to understand, appreciate, preserve, and further hone the significant natural advantage they already have over AI. Despite all the technological breakthroughs in AI, in many areas it is still a battle that humans will lose only if they choose to.

(The writer works in the Information Technology sector. Views personal.)

The Lost Art of Critical Thinking

In an age of instant outrage and echo chambers, the rapid erosion of independent thought is being driven by systems that reward conformity and emotional ease.

Imagine a world where few ask “why?” anymore. What happens to a society when the ability to reason, challenge, and reflect is replaced by quick reactions, shallow opinions, and echo chambers of agreement? The world has become so loud, so fast, and so saturated with information that the human mind, once a powerful tool for inquiry and growth, has dulled into passive acceptance.


This is not some distant possibility. It is already happening. Many philosophers and social scientists have warned of this trend, sometimes calling it collective shallowness. It does not necessarily mean that people are unintelligent. Rather, individuals increasingly accept ready-made ideas and give up their independence of thought without realizing it. When that occurs, societies become easier to control, divide and mislead.


Rote Learning

Part of the problem lies in systems designed more for efficiency than enlightenment. Schools and universities often reward memorisation over exploration. In India, for instance, students preparing for board exams such as Class 10 and Class 12 focus on practising most probable questions. They are evaluated mainly on standardised answers, which teachers find convenient to correct. Many take the path of least effort. Those who ask unusual questions or try different approaches are not encouraged; they may even score fewer marks. Yet many of them shine later in life, when original thought and problem-solving matter more than exam results.


Teachers themselves often prefer the safe and predictable. Researchers chase popular trends or easy numbers instead of grappling with difficult questions. Politicians resort to slogans rather than evidence. Decision-makers frequently adopt popular policies without carefully examining long-term consequences. In every case, the path of least resistance prevails over the harder work of reasoning.


Information Overload

The digital age promised a flood of knowledge but delivered instead a tsunami of noise. Neuroscientist Daniel Levitin notes that the average person today handles many times more information each day than just a few decades ago. Our brains, however, are not built to cope with this flood. Overwhelmed, we often rely on mental shortcuts. Instead of weighing evidence, we glance at what others believe. This instinct, known as ‘social proof,’ is natural but makes us vulnerable to shallow thinking.


Another worrying trend is the decline of deep reading. Research by scholar Maryanne Wolf suggests that digital media is reshaping the reading brain: people skim more, jump between screens and struggle to focus on long texts. Yet such focus is essential for real analysis. Without it, understanding remains superficial. Carl Sagan warned that we live in a society deeply dependent on science and technology, yet very few people understand these subjects. His concern was not ignorance alone. It was the loss of the ability to think critically about the systems upon which we rely.


Uncomfortable Truths

Critical thinking also requires confronting uncomfortable truths. It asks us to admit we might be wrong. To change our minds in the light of new evidence. To challenge our identities and cherished beliefs. The psychologist Erich Fromm argued that many people prefer to give up their independence of thought because freedom brings responsibility, and responsibility is difficult to handle.


Social media fuels the fire further. These platforms often reward outrage more than nuance, and they thrive on division rather than understanding. As Marshall McLuhan famously said, “The medium is the message.” The way we consume information pressures us toward speed instead of depth, certainty instead of humility, and popularity instead of truth.


How, then, do we reclaim critical thinking? The answer lies not in intelligence but in courage and practice. Courage to admit we may be wrong. Courage to listen before speaking. Courage to say “I don’t know” and search for a better answer. Socrates’ ancient wisdom remains apt: “The only true wisdom is in knowing that you know nothing.” That humility is the starting point.


Rebuilding a culture of inquiry must occur at all levels. Students can be encouraged to ask at least one deeper question in every class. Teachers can grade for reasoning steps, not only for correct answers. Researchers can preregister studies to focus on substance rather than fashionable results. Politicians often prefer slogans to evidence, and decision-makers may feel compelled to adopt popular policies quickly. Instead, they could require evidence briefs laying out pros, cons, and uncertainties before acting.


On the personal level, small habits matter. Read one long article or book chapter without glancing at your phone. Seek out at least one opposing view each week - not to argue against it, but to understand it. Before sharing a post or claim, trace it to its source and read at least the summary. Prefer primary sources over summaries wherever possible.


Psychologists call this metacognition: thinking about our thinking. Another powerful habit is dialectical reasoning - the ability to hold two opposing ideas in mind long enough to learn from both.


Today, critical thinking is not merely an academic skill but a moral choice. It is the choice of truth over comfort and that choice can be painful. Yet clarity is always preferable to ignorance, because reality, however demanding, is the only ground upon which freedom stands.


The disappearance of critical thinking is not inevitable. For the future will not be shaped by louder voices or sharper slogans, but by minds willing to sit with complexity and value understanding over easy victory.


While critical thinking is fading, it can be restored through awareness and reflection. It begins with one simple act: thinking for yourself with humility and courage. And if each of us pauses once a day to ask a better question, this culture of thinking will gradually return.


(The author is the former Director, Agharkar Research Institute, Pune and Visiting Professor, IIT Bombay. Views are personal.)


