We live in the Data Age. It is very hard to measure the total volume of data stored electronically in the universe. The estimated size of the Digital Universe was 0.18 zettabytes in 2006, 1.8 zettabytes in 2011, and 4.4 zettabytes in 2013! According to newer estimates, the Digital Universe will grow by a factor of 10 by 2020, reaching 44 zettabytes.

Confused by zettabytes?

To put it simply, the byte is the basic measure of data in digital terminology. In the present era we are familiar with megabytes, gigabytes, and terabytes. Our hard disks nowadays are typically 500 GB or 1 TB, our pen drives are 8 GB, 16 GB, or more, and many other devices specify their storage capacity using these byte-based units.

  • 1000 bytes = 1 kilobyte
  • 1000 kilobytes = 1 megabyte
  • 1000 megabytes = 1 gigabyte
  • 1000 gigabytes = 1 terabyte
  • 1000 terabytes = 1 petabyte
  • 1000 petabytes = 1 exabyte
  • 1000 exabytes = 1 zettabyte
  • 1000 zettabytes = 1 yottabyte
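The ladder above can be expressed programmatically. This is a minimal sketch (the `bytes_in` helper is a hypothetical name, not from any library) that computes how many bytes one of each decimal unit holds, since each step in the list is a factor of 1000:

```python
# Decimal (SI) byte units: each step up the ladder multiplies by 1000.
UNITS = ["byte", "kilobyte", "megabyte", "gigabyte", "terabyte",
         "petabyte", "exabyte", "zettabyte", "yottabyte"]

def bytes_in(unit):
    """Return the number of bytes in one of the given unit."""
    return 1000 ** UNITS.index(unit)

print(bytes_in("kilobyte"))   # 1000
print(bytes_in("zettabyte"))  # 1000**7, i.e. 10**21
```

Note that these are the decimal (powers-of-1000) units; binary units such as the kibibyte (1024 bytes) follow a separate convention.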

Which data comes under the Digital Universe?

I guess by now you can calculate how much 44 zettabytes is! It is 44,000,000,000,000 gigabytes, or 44,000,000,000 terabytes, or 44,000 exabytes. Just imagine how much data the Digital Universe will hold by 2020.
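The conversion above is easy to check. A quick sketch in Python, using the decimal convention where one zettabyte is 10^21 bytes:

```python
# 1 zettabyte = 10**21 bytes under the decimal (SI) convention.
ZETTABYTE = 10 ** 21
digital_universe = 44 * ZETTABYTE  # estimated size by 2020, in bytes

print(digital_universe // 10 ** 9)   # in gigabytes: 44,000,000,000,000
print(digital_universe // 10 ** 12)  # in terabytes: 44,000,000,000
print(digital_universe // 10 ** 18)  # in exabytes:  44,000
```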

So how has this data been calculated or estimated? This data is everything we do in the world of the internet: think about your phone calls, emails, electronic documents, machine logs, RFID readers, sensor networks, vehicle GPS traces, retail transactions, social media, videos, and so on. Everything that is saved electronically counts toward the data of the Digital Universe.

Microsoft Research’s MyLifeBits project gives a glimpse of what archiving the personal information stored over a whole lifespan looks like. The common point is this: when data grows at such a rate, there must be a way to store it and analyse it for further use.

Let’s consider some of the incoming data sources

Data is flooding in nowadays. Some sources receive enormous amounts of information daily and must store and analyse it. Let’s look at a few of them.

  • Facebook users upload 350 million photos daily.
  • Twitter generates more than 8 TB of data daily.
  • Flickr users upload more than 3.5 million new images daily.
  • The New York Stock Exchange generates about 1 TB of new trade data every day.
  • Ancestry.com, the genealogy site, stores around 2.5 petabytes of data.
  • The Internet Archive stores around 2 petabytes of data and is growing at a rate of 20 terabytes per month.

These are just some statistics on how data is flooding the Digital Universe!

What does this huge data lead to?

So there’s a lot of data out there. Data at this scale cannot be maintained by any traditional DBMS or RDBMS. It needs a name of its own, and new systems for further investigation and analysis. That is where Big Data comes in, now a hot topic and a hot technology to learn.

Happy Learning! Happy Exploring!