AI’s Hidden Theft: Culture Without Consent
- Rik Amrit

The real ethical crisis is not AI in art but who owns cultural intelligence.

The current debate on artificial intelligence and art is filled with anxiety. Artists worry about machines replacing human creativity. Scholars fear the loss of authenticity. Audiences wonder whether tradition will survive automation. These concerns are understandable, especially in societies where art is closely tied to identity and livelihood.
However, these debates often miss the real ethical problem. The central crisis is not that artificial intelligence can generate images, music, choreography, or text. The deeper and more urgent issue is who owns the cultural intelligence on which these systems are built.
When AI systems engage with Indian art forms, aesthetics, and theories, the question is not whether machines can be creative. The question is whether civilisational knowledge is being quietly absorbed into private systems without consent, accountability, or justice.
Collective Traditions
Indian performing arts are not individual inventions. They are collective traditions shaped over centuries. Gesture systems, movement vocabularies, rhythmic structures, rāga theory, abhinaya principles, and dramaturgical frameworks have evolved through guru-śiṣya lineages, community practice, and shared memory.
These systems were never meant to be owned by a single author. They belong to a civilisation. Today, when performances are recorded, manuscripts digitised, commentaries translated, and archives uploaded, this knowledge enters a new ecosystem. Once digitised, it becomes data. Once it becomes data, it becomes available for training artificial intelligence systems.
At this point, something critical changes. Cultural intelligence moves from shared heritage into proprietary infrastructure. AI models trained on global datasets quietly absorb Indian aesthetic theory, movement logic, and symbolic systems. These models are then owned by private corporations, licensed commercially, and monetised globally.
The communities and traditions from which this knowledge originates are rarely acknowledged. They receive no control, no share, and often no visibility.
Ethical Imbalance
Modern copyright law is built around individual authorship, originality, and fixed duration. Traditional Indian arts do not fit easily into this structure.
Who owns a mudrā? Who owns a rāga? Who owns the theory of rasa?
These are not creations of single individuals. They are layered systems refined across generations. Copyright law struggles to recognise such collective authorship. As a result, traditional knowledge often falls into a grey zone. It is treated as public domain, free for extraction.
This creates a serious ethical imbalance. Corporations can claim ownership over AI models trained on traditional knowledge, while the source traditions are told that their knowledge belongs to everyone and therefore to no one.
The result is asymmetry. Capital can own the output, while culture is reduced to raw material.
Most AI ethics discussions focus on creativity and authorship. Can AI be considered an artist? Should AI-generated work receive copyright? Will human artists become irrelevant?
These are important questions, but they are incomplete. They focus on the surface level of output, not the deeper level of input.
AI systems do not create from nothing. They learn from existing material. When that material includes Indian philosophy, aesthetics, movement theory, and performance practice, the system is drawing from civilisational memory. If a company profits from this memory without recognition or responsibility, the issue is not creativity. It is exploitation.
Ethics cannot be limited to whether a machine can compose or choreograph. Ethics must address how knowledge is sourced, who benefits from it, and who bears the cost.
One of the most troubling aspects of AI development is its invisibility. Cultural extraction today does not look like colonial plunder. It happens quietly through datasets, repositories, and training pipelines.
A performance video becomes part of an online archive. A translation enters a digital library. A theoretical explanation is scraped from a website. Each step appears harmless. Yet collectively, they feed systems that generate economic value elsewhere.
The danger lies not in digitisation itself. Digitisation can be empowering. The danger lies in digitisation without governance. When cultural intelligence flows in one direction only, from tradition to corporation, without return, this is not innovation. It is extraction.
In Indian artistic traditions, knowledge is transmitted with responsibility. A guru does not merely teach technique. The guru ensures ethical grounding, contextual understanding, and discipline. AI systems lack this lineage accountability. They can replicate patterns without understanding purpose. More importantly, their owners are not bound by cultural responsibility.
When a company builds a proprietary model using traditional knowledge, there is no obligation to respect its philosophical context. There is no requirement to consult practitioners. There is no duty to return value to the source communities.
This absence of accountability creates an ethical vacuum. The term cultural intelligence is often used loosely. In this context, it refers to the deep logic that governs artistic systems. It includes not only visible forms, but also philosophy, pedagogy, symbolism, and ethics.
Treating this intelligence as a commodity reduces culture to content. It ignores the lived labour of generations who sustained these traditions under difficult historical conditions.
Indian arts survived colonial disruption, economic marginalisation, and social upheaval because they were sustained by communities, not markets. Allowing them to be absorbed into market-driven AI systems without safeguards risks repeating older patterns of dispossession in a new technological form.
Collective Stewardship
What is the alternative? The answer is not to reject technology. Nor is it to isolate tradition from digital spaces. The solution lies in reframing ethics as collective stewardship, not individual ownership.
Collective stewardship means recognising that traditional knowledge belongs to communities and lineages. It means involving practitioners, scholars, and cultural institutions in decisions about how data is used.
It also means questioning whether all knowledge should be freely extractable for profit. Open access should not automatically mean open exploitation. Frameworks such as community consent, shared benefit models, cultural attribution, and non-commercial licensing must become part of AI governance.
Institutions have a critical role to play. Cultural bodies, universities, and governments cannot remain passive while cultural intelligence is privatised. India has already recognised the dangers of biopiracy and has taken steps to protect traditional medical knowledge. A similar seriousness is required for artistic and aesthetic knowledge. Without policy intervention, cultural intelligence will continue to flow into global AI systems with little resistance. Once embedded, it becomes almost impossible to reclaim.
This issue is not limited to India. Indigenous and traditional cultures across the world face similar challenges. The Global South is particularly vulnerable because its knowledge systems are rich, under-protected, and digitally exposed.
Addressing cultural ownership in AI is therefore not a niche concern. It is central to global debates on equity, justice, and technological ethics.
Artificial intelligence is not the enemy of art. Technology has always interacted with performance, from architecture and acoustics to lighting and recording. The real danger lies elsewhere.
The ethical crisis of our time is not that machines are learning to create. It is that cultural intelligence is being absorbed into private systems without consent, accountability, or restitution.
When AI systems advance using Indian aesthetic theory, the question is not creativity. It is civilisational justice.
If we fail to address this now, we risk a future where tradition survives only as data, stripped of its communities, responsibilities, and voice. If we act with care, courage, and collective wisdom, technology can become an ally rather than an extractor. The choice before us is not between tradition and technology. It is between exploitation and stewardship.
(The author is a Natyashastra scholar, theatre director and producer whose work bridges traditional Indian performance theory with contemporary theatre economics. Views personal.)