<p>I fell into teaching naturally in 2019. I felt alive in the classroom even before I considered becoming a teacher. Even then, I realised that the classroom was a place of fun: conversation, jokes, and play. As a student, I especially enjoyed classes in which the teacher was accessible, and I could ask whatever questions came to mind.</p>

<p>As a teacher, I knew I wanted the same for my students. Every time I read a student’s assignment, it was with a sense of fullness and curiosity. If students wrote about the dosa place next to campus, I knew that the experience was real and honest, and that when we talked about the always-grumpy man there, the experience was shared. All of this was temporarily paused when I was forced to look at them differently: all my previously innocent and cute students seemed to have grown AI-shaped fangs.</p>

<p>For all of April 2024, I looked at student writing and pulled apart each submission line by line to figure out whether the words in front of me were written by a person or a machine. Reading their work stopped being about the students and became more about the strange mechanics of computing. It became clear to me, as it did to others, that the classroom would never be the same again.</p>

<p>After some amount of spy work, I had my list of defaulters in place. I called them to the department one by one and had long conversations with them. One of them, let us call him ‘C’, was a tall, gym-going boy who had already made it clear that he wasn’t interested in the subject I teach — journalism. But when he sat in front of me, his tallness curved into something smaller, almost folding in on itself. I asked him why he felt the need to take assistance from AI. He remained quiet. I lowered my voice, showed him that I meant no harm, and asked him the question again. He finally said that he just didn’t have time. C was a final-year student, at the very edge of graduating. I then asked him if it was worth the risk of failing.
This time, I didn’t get an answer. He hadn’t even considered the possibility of getting caught.</p>

<p>On campus, assignments that have been written or edited using AI are simply not accepted. Students are warned regularly. Most teachers use plagiarism software or AI itself to detect mischief. When an assignment is suspicious, it is most delicious to see QuillBot agree with you: yes, this was written by AI. Yet I felt somewhat guilty, the irony ringing in my head. I soon stopped, deciding instead to stay with the student’s work long enough to tell whether it was their own or AI-written.</p>

<p><strong>Getting addicted</strong></p>

<p>“I used to be addicted to AI,” Nancy, an MA English student, tells me. We are sitting in a corridor outside the English department as Nancy recalls a time when all aspects of her life led to ChatGPT. She was a second-year B.Ed student when AI took off, and she remembers the craze it triggered. The first time she used it, her teacher had dumped an impossible assignment on her: put together a literature review of 40 papers. In a fit of desperation, she turned to the new and shining OpenAI software and received a clean-looking list. When I ask her if she edited the assignment, she says, laughing, “There was no need to.” Three months after that, her college introduced strict rules about the use of AI assistance.</p>

<p>Generative Artificial Intelligence Addiction Disorder (GAID) is currently being floated as the newest behavioural disorder to succeed Internet Gaming Disorder. Nancy, like several other students, began using AI not simply for academic help but also for emotional support. She would ask it for advice and escape the judgement she would have received from her friends.</p>

<p>“It almost made me break up with my boyfriend!” Her face mimics the incredulousness in her voice.
She speaks of the same boyfriend now — an IT professional who reassures her that ChatGPT is just a bot, that there is no one behind the screen.</p>

<p>The stories of AI use are endless. On Monday, a colleague tells me about catching a student using ChatGPT in class. On Tuesday, I am handed a long list of notes put together by a burly, boyish student. I see the pride he carries, chest extra puffed up: the 3,000 words of notes are all AI. On Wednesday, another student hands me a ‘consumer report’ on students benefiting from the mid-day meal scheme on campus, a comprehensive report he managed to prepare after talking to a grand total of three people. I am much better at spotting these discrepancies now, a far leap from 2022, when ChatGPT was new and students danced with it with festive excitement. I have learned to read language more carefully, picking up on grammatical nuances and student experiences. At the same time, it has started to dawn on me that my classroom is in danger of becoming plastic, empty of students’ experiences and their contexts.</p>

<p>Christine Boomstrae, one of my colleagues, describes a first-year General English classroom. She asked her class of 60 BSc students how many felt confident about conducting research without the help of ChatGPT or Google. “Guess how many raised their hands?” she says. “Only three. In a class of 60.” She asked two classes if they used AI, and all the students raised their hands in the affirmative. One of them, she says, even showed her their conversation with ChatGPT. They shared intimate details about their day. Her face betrays some of the shock she is feeling. “You can’t even imagine,” she says slowly, “the kind of things she tells AI.”</p>

<p>Nancy’s earnest smile reveals little as she tells me something similar. “I was dealing with depression at the time,” she says. “I would turn to AI whenever I was struggling with something, but I realised soon that it was not good.
It was just agreeing with everything I said.” At one point, she felt upset by something ChatGPT said, so she asked it to be nicer to her. From then on, it simply agreed with her, and she grew tired of it. “AI can’t give us what human beings can,” she says. The discovery took her a long time, but the scale tipped when she went to one of her professors and explained what she was thinking. He deduced from the context that she wasn’t relying on the people around her, and gave her a simple piece of advice: go speak to your mother. When she did have that conversation with her mother, everything became simpler. How would AI have known to give her this answer?</p>

<p>If you look closely enough at AI, patterns begin to emerge. It agrees with you, very rarely uses any negative language, and is pumped with the biases of the data it is fed. There is endless research around the world looking into the effects of AI on therapy, brain function, and thought processes.</p>

<p>Karthik Nagendra, also an MA English student, points out a similar pattern. He is generous with his use of AI but also careful about the words it gives him. After an AI obsession not unlike Nancy’s, he decided to pull back because he realised that AI agreed with everything he said. “It is designed to make you keep coming back for more,” he says.</p>

<p><strong>AI-able education</strong></p>

<p>Those who have come to rely on AI for everyday activities seem to believe that the rest of us are ‘backward thinking’. At a teaching workshop I attended recently, the facilitator declared with enthusiasm, “AI should be used everywhere!” The blanket statement seems to imply that all teaching contexts are the same, but enough conversation with colleagues has shown me otherwise. A senior professor once discovered a carelessly submitted AI-written assignment.
He decided to speak to the student, only to find out that she had a part-time job at a cafe because of financial constraints, and did not have the time to study. Hearing this gave me pause. I went back to the drawing board: what is a teacher supposed to do?</p>

<p>It has taken me six years of teaching to learn that there is no clear answer to this; it changes from one classroom to another, from one set of students to another. The work of the teacher is to respond to the student, to facilitate spaces for conversation and learning. A one-size-fits-all approach reduces students to names, register numbers, and the fees they pay.</p>

<p>Mythili Sripad Bhat, a Communication Studies professor at another Bengaluru college, believes using AI in the humanities “just makes no sense”. It is impossible to respond to AI outside of discipline-specific pedagogy. Nagendra reminds me that people are still taught to count even though calculators exist. Similarly, why do people still attempt to read and write? The humanities classroom requires more from the student, but it demands even more from the teacher: time, energy, and most importantly, attention.</p>

<p>When I first discovered AI use in C’s class, it was in just one student’s work. I remember sitting up in horror, the realisation making me look at students’ work differently. What started with one assignment became 20, and I experienced a feeling of betrayal. I thought back to the discussions we had had in class, and the work we had been talking about. They were all supposed to write essays and feature stories, but most of them had simply submitted AI-regurgitated nonsense. When I called them to meet me one by one, they were taken aback. They looked at me with guilt, but they were mostly surprised at being caught.</p>

<p>I am reminded of a friend who teaches at a Delhi University college. We often exchange screenshots of ridiculous student assignments.
We ask ourselves, ‘Do these students think we don’t read their work?’ They seem to believe that all their work falls into a vacuum, that it is for nothing.</p>

<p><strong>AI proofing</strong></p>

<p>A teacher’s job is one of care: to remind the student that their insights are valuable. Mythili now ‘AI-proofs’ assignments. She gives tasks that require her students to really engage with whatever they are reading. She also does regular check-ins, and encourages them to maintain a daily journal, where they build habits and learn to express themselves regularly. Another senior colleague reminds me that most students do not really think of AI as ‘cheating’, though they would harbour some guilt if they cheated in an exam. They don’t seem to worry about the ethical or moral implications of using AI.</p>

<p>The risk of the student not reading is by no means a new one, but the newness rests in the stilted exchange between students and teachers. Ananya Mehta, a student at Jawaharlal Nehru University, describes getting assignments designed to be done by AI, with no inputs needed from the student — they ask for so much that it becomes impossible for the student to finish the task. “What is the point of doing it then?” she wonders.</p>

<p>She describes some discomfort with the idea of using AI regularly, but reminds me that several verticals of education are simply impractical without it.</p>

<p>As a student, I remember leaping out of exam halls to find my teachers and discuss the question paper with them. My classmates and I would argue about who we thought had set the paper, and make small wagers. We learned to read each other’s work and talk about it with such leisure that even a process as tedious as answering an exam became joyful.</p>

<p>Ananya tells me about a question that recently appeared in a Communicative English exam, where students were asked to cast their classmates as characters from Shakespeare’s ‘Macbeth’, with reasons for why each person would suit their role.
“Our entire class was buzzing after the paper,” she says. “It gave us so much to think about.”</p>

<p><strong>Find your voice</strong></p>

<p>In this hyper-charged tech universe, Nancy discovered the importance of her own voice after a year of AI obsession. “AI takes away my voice,” she says. “When I write, I am able to understand what I am thinking. It makes things clearer.”</p>

<p>The variety of AI use takes me by surprise. Some students use it simply for research, while others use it to enhance creativity. A majority use it simply to write for them. Mythili finds similar patterns. “They don’t want to think at all,” she says. “One part of it is insecurity, another is just laziness.”</p>

<p>Haripriya, an Industrial Psychology student who describes herself as an ‘old school person’, says, “AI helps me by giving me a starting point; I don’t need to start from scratch.” She found AI later than her classmates, being slightly uncomfortable with tech, but admits to a general complacency that has seeped into her research. Most often, she would use AI to read summaries of political happenings.</p>

<p>A poem by teacher-poet Joseph Fasano went viral in 2023, one line making its rounds on social media. Fasano ends the poem, addressed to a student using AI, with the line ‘love is for the ones who love the work’, a statement that collates learning and working with simply living. Nagendra describes a similar experience as we speak. He has found that the act of writing allows him to think through the concepts he is trying to understand, eventually allowing him to have original thoughts.</p>

<p><strong>What next?</strong></p>

<p>As the semester began this year, I gave my students one clear instruction: you must have a notebook for all your writing and note-taking. Classroom spaces are now fraught with suspicion, and the only logical way to deal with it is to return to more ‘archaic’ teaching methods.
When I looked at one student’s work after a class, I experienced the unexpected pleasure of underlining their writing and pointing out exactly what could be improved. That two-minute exchange did more for us than anything that has happened in the last three years.</p>