<p>There has been a long-standing divide between “literature” and “language” education in academia. That divide, between interpretation and use, meaning and mechanics, has left us ill-prepared for the age of artificial intelligence. For decades, we treated literature as the domain of ambiguity, voice, ethics, and imagination, while relegating language to grammar, correctness, efficiency, and communication skills. Literature became elite and humanistic; language, remedial and functional. In a print-dominated world, language was largely fixed once written, and texts were produced by humans to be read and judged by other humans. The roles were clear: writers composed, readers interpreted, and teachers evaluated. In that context, it made sense to treat literature as a space for slow reading and moral reflection, and to treat language skills as mechanical support so that meaning could be accessed. Because production and interpretation were separate acts, how language worked could be taught apart from what language meant. An AI-mediated world unsettles this arrangement.</p>
<p>Today, AI tools grade student essays, chatbots compose emails and reports, and even poems arrive fully formed at the click of a button. English proficiency is increasingly reduced to “prompting skills”, while employers reward fluency and polish over judgment and depth. When a machine can produce a competent paragraph in seconds, the question becomes unavoidable: what exactly are we teaching when we teach English?</p>
<p>The problem is not that AI uses language, but that it uses it without responsibility, intention, or accountability. Students trained only in functional language skills risk becoming replaceable, competing with machines designed to optimise speed and surface coherence. Those trained only in literary interpretation, insulated from how language circulates in real institutional, technological, and professional contexts, risk becoming disconnected. AI can generate ethical, reflective, or socially aware language—but only when those values are explicitly supplied through prompting. Judgment has not vanished; it has shifted decisively onto the user. Yet our classrooms still treat language skills and interpretation as separate domains. The result is a generation fluent in expression but unsure of authorship, adept at producing text yet unprepared to think through its consequences.</p>
<p>This gap becomes particularly troubling when viewed against broader shifts in education. Across institutions, students (and, more importantly, the systems guiding them) are gravitating toward fields perceived as practical and job-linked. In India, even within liberal arts spaces, the shift is stark. At some universities, English majors fell from around 17% of graduates in 2019 to roughly 9% in 2024. Political Science dropped from about 15% to under 8%; History from around 9% to 4%; and Philosophy from 5% to barely 1%. These figures do not signal the end of the humanities, but they do point to a narrowing conception of educational value—one in which utility increasingly displaces reflection.</p>
<p>The danger is not merely institutional decline but civic and intellectual unpreparedness. If fewer students engage deeply with language, literature, and critical thinking, a society facing AI that generates texts, arguments, and even “knowledge” will have fewer people equipped to question, critique, or contextualise those outputs.
As AI systems produce essays, news summaries, and policy drafts at scale, democratic life will depend on citizens who can ask difficult questions of language itself: who is speaking, for whom, with what consequences, and through what technological mediation? Without such training, we risk mistaking coherence for truth, fluency for understanding, and confidence for credibility—an especially dangerous condition in an AI-mediated public sphere where persuasion can be automated.</p>
<p>What is needed, then, is not a nostalgic return to literature as an isolated sanctuary, nor a wholesale surrender to skill-based language training. The task ahead is integration. Language must be taught alongside ethics, history, power, and interpretation. Literature must be taught with renewed attention to form, rhetoric, circulation, and use. Students should learn not only what texts mean but also how they work, particularly how they persuade, exclude, amplify, and endure. In the age of AI, the real divide is no longer between literature and language but between those who understand how language shapes thought and action and those who merely deploy it. Those who can bridge the two will gain not just employability but agency. When language becomes effortless, judgment becomes harder. The task before education is not to keep pace with machines but to teach what machines cannot assume: responsibility for meaning.</p>
<p><em>(The writer is an assistant professor at Azim Premji University)</em></p>
<p>(Disclaimer: The views expressed above are the author's own. They do not necessarily reflect the views of DH.)</p>