IISc has big plans for Big Data initiative

Lecture series, scholarships on the cards

The Indian Institute of Science is planning to build teams of data analysts and scientists across all possible fields as part of its Big Data Initiative.

The initiative, launched by a 10-member team of IISc’s Computer Science and Automation (CSA) department, has been garnering attention from students, scholars and industry experts.

The plan is to have experts in the field deliver a series of lectures on different aspects of Big Data over a period of six months. The lectures will be free of cost, a step towards increasing awareness of the discipline among students and creating a big data ecosystem at the institute.

Apart from academicians, industry specialists will spell out the practical benefits through simple examples - even something as simple as why we need to drink filtered water. The data generated on water filtering is so voluminous that ordinary people would not have the time to understand it. Big data experts will break such data down and make it understandable.

A good feature of the initiative is that students can look forward to fellowships in big data. The IISc team plans to recruit and train the best talent in big data and related areas and provide them with scholarships to pursue relevant courses offered at the institute. Top companies are expected to sponsor the training campaign and fund the fellowships. The duration and extent of funding will be decided in due course.

Prof Jayant Haritsa, faculty at CSA and the Supercomputer Education and Research Centre (SERC), says big data systems have to be tested rigorously to confirm that they work well. He describes what he will present in his lecture:

“Big Data has become the buzzword of choice in recent times, especially in the software industry. The accompanying hoopla has spawned frenetic claims foretelling the development of great and wondrous solutions to Big Data challenges. However, very little is said about the testing of such systems, an essential pre-requisite for deployment. We will discuss the research challenges involved in the testing process, especially from the database perspective.”

Higgs Boson experiment

The massive experiment under way at the Large Hadron Collider (LHC) in Geneva, which seeks to understand the origins of the universe by studying the Higgs Boson particle, has a mind-boggling 150 million sensors delivering data 40 million times per second. The number of collisions per second is nearly 600 million.

After filtering, which discards more than 99.999 per cent of these streams, only about 100 collisions per second are of interest. Is it humanly possible, then, to analyse data generated on such a scale? The big data field is looking at building tools to break down this huge volume of data to understandable levels.
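The filtering ratio described above can be sketched with simple arithmetic. This is a back-of-envelope illustration using the article's approximate figures (600 million collisions per second, roughly 100 retained), not actual LHC trigger logic:

```python
# Back-of-envelope sketch of the LHC data-filtering funnel, using the
# article's approximate figures: ~600 million collisions per second,
# of which only ~100 per second are interesting enough to keep.

collisions_per_second = 600_000_000  # total collisions produced
kept_per_second = 100                # collisions retained after filtering

kept_fraction = kept_per_second / collisions_per_second
discarded_percent = (1 - kept_fraction) * 100

print(f"Fraction kept: {kept_fraction:.2e}")      # ~1.67e-07
print(f"Discarded: {discarded_percent:.5f}%")     # 99.99998%
```

Even at this crude level, the numbers show why automated filtering is unavoidable: well over 99.999 per cent of the raw stream must be discarded before humans or downstream tools ever see it.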

Bangalore has companies such as EMC, HP, SAP, IBM and Accenture, which are into big data mining.

Data analysts are paid handsomely and are sought after all over the world. American IT firms recently organised a big data convention in Las Vegas featuring the top 25 companies in big data.