<p>A few years ago, the internet posed a challenge. Google became a synonym for searching. Students gleefully copied chunks of information and pasted them verbatim into their digital notebooks; a few more, as usual, copied from their friends.</p><p>While faculty members fumed and fretted over assignments that were meticulously reproduced, in most cases without acknowledgement, students turned them in on time. Then came Turnitin, which turned all this upside down, pinpointing exactly where the content had been plagiarised from. PhD guides and project supervisors used plagiarism software to ensure that their students did not violate rules by plagiarising content.</p><p>Now, it is not just the internet but also GPTs (Generative Pretrained Transformers), AI models capable of understanding human text and generating a wide range of content, that are under scrutiny. Faculty tore their hair in despair as neatly worded assignments landed on their desks well before the due date. This bubble, too, inevitably burst as tools for detecting GPT-generated content became popular. Most learning platforms now have a built-in checker for such material, and students are evaluated based on the reports it generates.</p><p><strong>To what extent?</strong></p><p>The latest debate is on the use of AI by faculty. When they warn their wards against AI usage, isn’t it natural to expect that they won’t use it themselves? There is no straightforward answer to this. Everyone is inevitably embracing AI. However, to what extent can educators utilise it?</p><p>When used thoughtfully and ethically, AI can enhance teaching practices, aid in creating personalised learning plans for students, and generate multiple question papers and assignments. 
There was a furore recently when a university student demanded a refund of tuition fees after discovering that a professor had used AI to grade papers and prepare notes. While it was embarrassing for the faculty member, it cannot be held up as a case against the use of AI by educators.</p><p>AI can be a valuable tool, but not a replacement for human teaching. Although GPTs can generate notes and content, most concepts, fundamental and advanced alike, require explanation by faculty members.</p><p>An argument in favour of AI is that it can analyse learning styles and adjust the delivery of content. Most adaptive learning platforms use AI to provide personalised feedback, challenges for advanced learners, and graded levels for slower learners. This is not easy to offer in a large classroom, where the student-teacher ratio is unfavourable. Grading multiple-choice questions is faster, report generation is faster, and preparing lesson plans, assignments, and datasets becomes quicker and more efficient. Thus, AI is like a teaching assistant, giving educators much-needed free time that can be used for better interactions with students or simply to recharge.</p><p>If the classroom has students with special challenges, AI-powered speech recognition, text-to-speech, and translation tools can help them integrate better with the rest. Language and grammar can be checked more quickly, and tasks such as maths and coding become easier, as AI can provide a “push” when one is stuck.</p><p>However, AI systems require access to student data, which raises significant privacy concerns. This can lead to numerous issues, and faculty members must be cautious. Additionally, usage should not lead to dependency, and faculty must consider the ethical, emotional, social, and other dimensions of students before drawing a general conclusion about a batch. 
AI cannot replicate human feelings, emotions, and the connection teachers have with their students, particularly when teaching or evaluating. The primary decision-maker is the faculty member; AI can be a supplement, not a substitute.</p><p>Not all students and campuses have access to AI tools, which require internet connectivity, updated software and hardware, and technical training for teachers. This might deepen the divide between the haves and have-nots in the education space.</p><p>Since AI systems are built on training data, they may inadvertently reinforce certain biases or stereotypes. Thus, clear guidelines on usage, along with AI literacy integrated into training programmes, teaching not just how to use it but also when and how much, are the need of the hour.</p><p>The goal should be a harmonious partnership between technology and teaching, keeping in mind that faculty shape the youngsters of tomorrow.</p><p>(The author is an academic)</p>