Size of infinity


High school Physics students memorise various units of measurement, from centimetres to kilometres, which will be of use later in life.

As the world rapidly glides into the digital era, they may have to pay more attention to the units used to size up things made of bytes. But there is a problem here: unlike the traditional units, which are fairly limited in number, the digital units continue to proliferate to keep up with the explosive growth in data.

The computing world started measuring data in kilobytes (KB) and megabytes (MB). According to an urban legend, Bill Gates once said that 640 KB of memory “ought to be enough for anybody”. In the mid-1960s, TCS struggled to import a Burroughs mainframe with 12 KB of memory and 8 MB of disk space at a cost of $340,000.

Now we are comfortable using Terabyte (TB) drives, each of which can hold 1,000 copies of the Encyclopaedia Britannica. YouTube adds a TB of data every four minutes. Petabytes, Exabytes, Zettabytes and Yottabytes are gaining currency in popular parlance. These are astronomically huge units: by one estimate, it would take 250 million DVDs to fill an Exabyte drive. And if you wanted to download a Yottabyte file (equal to 250 trillion DVDs) over high-speed broadband, be prepared to wait some 11 trillion years.
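These figures are easy to sanity-check with back-of-envelope arithmetic. The sketch below assumes decimal (power-of-1000) units, a 4 GB single-layer DVD and a 100 Mbit/s broadband link; none of these assumptions come from the article itself, so the exact answers shift with the numbers you pick.

```python
# Back-of-envelope checks on the DVD and download-time figures.
# Assumptions (not from the article): decimal units, 4 GB per DVD,
# and a 100 Mbit/s broadband link.

DVD_BYTES = 4 * 10**9      # ~4 GB single-layer DVD
EXABYTE = 10**18
YOTTABYTE = 10**24

dvds_per_exabyte = EXABYTE // DVD_BYTES        # 250 million
dvds_per_yottabyte = YOTTABYTE // DVD_BYTES    # 250 trillion

# Time to download a yottabyte over the assumed 100 Mbit/s link:
link_bytes_per_sec = 100 * 10**6 / 8           # 12.5 MB/s
seconds = YOTTABYTE / link_bytes_per_sec
years = seconds / (365 * 24 * 3600)

print(f"{dvds_per_exabyte:,} DVDs per exabyte")
print(f"{dvds_per_yottabyte:,} DVDs per yottabyte")
print(f"~{years / 10**9:.1f} billion years to download a yottabyte")
```

At 100 Mbit/s the wait works out to roughly 2.5 billion years; a figure in the trillions of years implies a far slower link, so the assumed connection speed is doing most of the work in any such estimate.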

But the continuing outpouring of data makes even these units sound inadequate. By 2016, annual internet traffic will cross a Zettabyte, according to Cisco. GigaOm reports that Intel scientists are going beyond the Yottabyte and proposing new units, Brontobytes and Geopbytes, to measure things even bigger. A Brontobyte is 1 followed by 27 zeroes; and if you insist, a Geopbyte is 152676504600228322940124967031205376 bytes!
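The ladder of units mentioned above climbs by a factor of 1,000 at each rung. A minimal sketch, assuming the decimal (SI-style) convention; note that Brontobyte and anything beyond Yottabyte are informal proposals, not standardised prefixes:

```python
# Each decimal unit is 1000x the previous one.
# "BB" (Brontobyte, 10**27) is an informal proposal, not an SI prefix.
UNITS = ["KB", "MB", "GB", "TB", "PB", "EB", "ZB", "YB", "BB"]

for power, name in enumerate(UNITS, start=1):
    size = 1000 ** power
    print(f"1 {name} = 10^{power * 3} bytes = {size:,}")
```

Running this confirms the article's figure: a Brontobyte, at 1000 to the 9th power, is indeed 1 followed by 27 zeroes.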
