<p>On February 5, the Mumbai Police arrested two people in connection with cheating a septuagenarian of <a href="https://www.ndtv.com/india-news/retired-iit-medical-officer-loses-rs-4-62-crore-to-digital-arrest-scam-10954853">Rs 4.6 crore via a digital arrest</a>. In another case, according to news reports, the Ahmedabad Cyber Crime Department is investigating how an Australia-based NRI was <a href="https://www.indiatoday.in/india/story/digital-arrest-scam-nri-australia-rupees-48-lakh-gujarat-cyber-crime-investigation-2863525-2026-02-05">duped of Rs 48 lakh</a> in a similar digital arrest.</p><p>Until recently, criminals preferred to hide behind clumsy texts and suspicious links to scam unsuspecting people of their hard-earned money. That route is still in use, but the craft has moved on: criminals today copy the one thing we trust most, a familiar voice. This takes the threat to an altogether different level.</p><p>With a few seconds of audio lifted from a social post, a webinar clip, or an old voicemail, cheap tools can now clone a person’s tone, pace, and pet phrases. There is no dodgy URL to check and no spelling mistakes to spot. It feels real because it <em>sounds</em> real. <a href="https://timesofindia.indiatimes.com/city/mumbai/mumbai-businessman-falls-victim-to-ai-voice-cloning-loses-rs-80000/articleshow/109225530.cms">Not too long ago,</a> a Mumbai businessman lost ₹80,000 after a caller mimicked his son’s voice and claimed a detention abroad — police later called it an AI voice-clone ruse.</p><p><strong>How the fraud works</strong></p><p>Cloning tools are straightforward. A short sample is fed into a model that learns the unique features of a person’s speech. The scammer then types a script, and the system speaks it in the cloned voice. 
Some tools let the caller talk in real time with the clone layered over their speech, including basic emotion controls that make the voice sound calm or panicked on demand. From there, it is standard social engineering. The goal is to get you to share a one-time passcode, approve a money transfer, or reveal enough information to reset an account. <a href="https://www.ndtv.com/india-news/scamsters-use-ai-to-clone-cousins-voice-dupe-indore-teacher-of-rs-1-lakh-10604289">In January, </a>Indore police reported a case where a teacher transferred nearly ₹1 lakh after a caller cloned her cousin’s voice and spun an urgent emergency — one of the first such confirmed cases in Madhya Pradesh.</p><p>Everyone with a phone is in scope, but the patterns repeat. Families receive fake distress calls claiming a relative is in trouble. The tactics are tailored, but the aim is constant: create panic, compress time, and win consent before the person checks.</p><p>Police have repeatedly warned about ‘digital arrest’ calls — often on WhatsApp — where deepfaked or scripted voices pose as officials and keep victims online while demanding payments; multiple cases were flagged across cities in late 2025 and early 2026.</p><p>Last week, the Rajasthan High Court, while dismissing the bail plea of two accused in a digital arrest scam, termed it the <a href="https://indianexpress.com/article/legal-news/every-hour-of-every-day-rajasthan-high-court-issues-directives-to-centre-reserve-bank-of-india-for-insidious-digital-scams-10510305/">‘most insidious’</a> form of cybercrime.</p><p><strong>Simple habits that break the spell</strong></p><p>The best defence is not technical; it is behavioural. Treat any unusual phone request as unverified, no matter who it sounds like. Hang up politely, and call back on a saved number or a known company line. Never share OTPs, PINs, or reset links on a call. Banks do not ask for them. Families can set an easy code word for genuine emergencies. 
Ask for the request in writing on official channels, and follow the playbook every time.</p><p>If a call feels off, slow down. Scammers thrive on speed because speed skips scrutiny. Say you will call back, then use a number you already have — not one they give you. If the caller resists, that is your answer.</p><p><strong>What platforms and institutions must change</strong></p><p>People cannot carry this burden alone. Networks, apps, and employers need to raise the floor. Telecom providers can expand caller name presentation so verified business names appear on screen, and they can block routes known for spoofing. Messaging and social platforms should watermark or label synthetic audio where possible, and make reporting fast. Banks and fintechs can add a second confirmation inside the app for sensitive actions that follow a phone call, so a spoofed voice cannot move money by itself. Large employers should run short training sessions that demonstrate a real cloned call and rehearse the right response. The lesson should be memorable and repeatable: pause, verify, and switch the channel.</p><p>In this context, the Reserve Bank of India (RBI)’s February 6 announcement of compensation of up to Rs 25,000 for account holders who fall victim to small-value digital frauds is a welcome first step.</p><p><strong>Living with the risk without living in fear</strong></p><p>Voice cloning will not disappear. It will become a background risk, like spam once did. The goal is not zero exposure. The goal is resilience. Build family and workplace routines. Talk openly about near misses so others learn without shame. Praise caution. If a colleague pauses a payment to verify, that is not a delay. That is a discipline that protects everyone.</p><p>As generative audio improves, scams will try to be more convincing. Our answer is simple and practical. Slow the moment. Change the channel. Make proof part of every important decision. 
Verify a request on a saved number. Refuse to share codes. Ask for a written confirmation on official channels. These habits are not heavy lifts. They are small pauses that keep big problems away.</p><p><em><strong>Rajesh Dimania is CTO, CallerDesk.</strong></em></p><p><em>(Disclaimer: The views expressed above are the author's own. They do not necessarily reflect the views of DH)</em></p>