IISc’s I3D lab unveils AI-powered rover drone, VR spacecraft
Shantanu Hornad

Prof Pradipta Biswas (C) seen with his team members at Rhapsody 3.0 at IISc.

Credit: Special Arrangement

Bengaluru: The Indian Institute of Science's (IISc) I3D (Intelligent Inclusive Interaction Design) Lab, led by Prof Pradipta Biswas, showcased cutting-edge innovations at Rhapsody 3.0, the institute’s annual fest.

The highlight: an AI-powered mixed-reality rover-drone system and a bespoke VR spacecraft simulator developed with Isro's astronaut-designate.

The advanced rover-drone system, designed for applications ranging from surveillance and warehouse operations to disaster response and space exploration, features natural language processing via a Large Language Model (LLM).

Users can control the drone and rover through voice commands, while AI-driven image detection enables object identification — crucial for search-and-rescue missions and planetary exploration.
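The article does not detail the lab's implementation, but a voice-to-action pipeline of this general shape is easy to sketch. The snippet below is a minimal, hypothetical illustration: it assumes an OpenAI-style chat endpoint for parsing a transcribed voice command and a pretrained YOLO detector from the ultralytics package for object identification. The agent names, action vocabulary and file paths are invented and do not reflect the I3D lab's actual stack.

```python
# Illustrative sketch only: the I3D lab's actual pipeline is not public.
# Assumes an OpenAI-style chat endpoint for command parsing and a
# pretrained YOLO detector (ultralytics) for object identification.
# Agent names, the action vocabulary and file paths are hypothetical.
import json

from openai import OpenAI
from ultralytics import YOLO

client = OpenAI()              # reads OPENAI_API_KEY from the environment
detector = YOLO("yolov8n.pt")  # small pretrained detection model

SYSTEM = (
    "Convert the user's spoken command into JSON with keys 'agent' "
    "('drone' or 'rover') and 'action' (e.g. 'takeoff', 'move_forward', "
    "'scan_area'). Reply with JSON only."
)

def parse_command(utterance: str) -> dict:
    """Map a transcribed voice command to a structured agent action."""
    resp = client.chat.completions.create(
        model="gpt-4o-mini",
        response_format={"type": "json_object"},
        messages=[
            {"role": "system", "content": SYSTEM},
            {"role": "user", "content": utterance},
        ],
    )
    return json.loads(resp.choices[0].message.content)

def identify_objects(image_path: str) -> list[str]:
    """Run the detector on one camera frame and return class labels."""
    result = detector(image_path)[0]
    return [result.names[int(c)] for c in result.boxes.cls]

print(parse_command("rover, scan the area ahead for survivors"))
# e.g. {'agent': 'rover', 'action': 'scan_area'}
print(identify_objects("rover_camera.jpg"))
# e.g. ['person', 'backpack']
```

In a real deployment the structured action would be validated against the agent's actual command set before dispatch; the sketch stops at the parsing and detection steps the article describes.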

This technology is part of the lab’s broader research into Naturalistic Human-Robot Interaction (HRI) for Heterogeneous Agents. The team has developed a multimodal LLM-based system capable of managing at least three robotic agents, such as drones, rovers, and robotic arms. Their research — spanning 3D reconstruction, data generation using diffusion models, and LLM-based task distribution — has led to five papers accepted at prestigious conferences such as ICRA 2025 and IUI 2025.
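The papers are the authoritative source on how the lab's task distribution works; purely as a rough illustration of the idea, the sketch below assigns subtasks to whichever agent declares the required capability, using a hand-written capability table where the lab's system would consult an LLM. Every agent, task and capability name here is invented.

```python
# Hypothetical illustration of task distribution across heterogeneous
# agents. A hand-written capability table stands in for the multimodal
# LLM described in the article; every name below is invented.
from dataclasses import dataclass, field

@dataclass
class Agent:
    name: str
    capabilities: set[str]
    queue: list[str] = field(default_factory=list)

AGENTS = [
    Agent("drone", {"aerial_survey", "photograph"}),
    Agent("rover", {"ground_transit", "sample_soil"}),
    Agent("arm", {"pick", "place"}),
]

def distribute(subtasks: list[tuple[str, str]]) -> None:
    """Assign each (task, required capability) pair to the least-loaded
    agent that declares that capability; flag tasks nobody can handle."""
    for task, needed in subtasks:
        able = [a for a in AGENTS if needed in a.capabilities]
        if not able:
            print(f"unassigned: {task}")
            continue
        min(able, key=lambda a: len(a.queue)).queue.append(task)

distribute([
    ("map the debris field", "aerial_survey"),
    ("carry the kit to the site", "ground_transit"),
    ("pick up the sample", "pick"),
])
for a in AGENTS:
    print(a.name, "->", a.queue)
# drone -> ['map the debris field']
# rover -> ['carry the kit to the site']
# arm -> ['pick up the sample']
```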

The team also unveiled a state-of-the-art VR spacecraft simulator, designed by Group Captain Ajit Krishnan, Isro's astronaut-designate for the Gaganyaan Mission, under Biswas’ guidance.

Focused on manual deorbiting in emergencies, the simulator demonstrates that a bottom-view perspective — showing the entire Earth below — is more effective than a front-view with only the horizon visible.

Beyond space applications, the simulator serves as a modular testing platform for human-machine interfaces (HMI), integrating haptic gloves and custom-built hardware. The team is now expanding its capabilities for spacecraft docking in microgravity, flight simulation, and Virtual Pilot Assistance systems for next-gen aircraft.

(Published 28 March 2025, 01:56 IST)