Is There An Easy Way To Define Big Data?

If you Google “big data size,” a Wikipedia article comes up with a 2012 definition of big data as anywhere between a few dozen terabytes and multiple petabytes, but that description isn’t entirely accurate. Try to pin down the limits of “big data” and the definitions get a little gray: the size threshold is always moving.

Following The Tracks Of Technology

Think about it critically. The “big data” of the 1980s would look ridiculous next to today’s colossal storage and processing capabilities. In the mid-1990s, big data might have meant a hundred megabytes or more; today, petabytes are the more appropriate yardstick. And while terabyte-scale usage is still modest, terabytes are quickly becoming routine for individuals.

Storing a terabyte’s worth of information on a personal computer is now relatively easy. A feature film runs to several gigabytes, so a library of a few hundred movies will fill a hard drive quickly. Between legitimate downloads and online piracy of music and film, a private user can max out a terabyte drive within a year, creating the need for additional drives. Data volumes that would have been beyond the largest corporations in the 1990s are now handled routinely by ordinary individuals, because computer technology has advanced that far.
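As a rough illustration of how quickly a drive fills up, here is a minimal back-of-the-envelope sketch in Python; the four-gigabytes-per-film figure is an assumption chosen for illustration, not a measurement.

```python
# Rough estimate: how many feature films fit on a 1 TB drive?
TERABYTE_IN_GB = 1024     # one terabyte expressed in gigabytes
GB_PER_FILM = 4           # assumed average size of an HD feature film

films_per_drive = TERABYTE_IN_GB // GB_PER_FILM
print(f"A 1 TB drive holds roughly {films_per_drive} films")  # ~256 films
```

A few hundred films, plus a music library and everything else, and the drive is full.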

That’s why big data operations are usually defined in petabytes and exabytes. But given Moore’s Law and the observably exponential pace of technological development, tomorrow’s big data will just as likely make today’s petabytes and exabytes look as inconsequential as a kilobyte does now. The most reliable way to size big data, then, is to look at what big data applications are handling today.

Processing Exceptionally Large Quantities Of Data In Near Real-Time

Today, big data workloads run to multiple terabytes processed at near real-time speeds. Google’s cloud platform can process terabytes of data in seconds. Applications like Facebook have made big data compression, processing, and analytics absolutely integral: with millions of users logged on simultaneously, substantial computing power has to be available at all times. Handling data at that scale really can’t be done without some kind of big data platform, because such platforms bring the processing power of many computers to the table.

The size of a big data set matters less when the processing can be spread across many computers, as with Apache Hadoop, a framework for distributed storage and processing of data across clusters of commodity machines, and a common foundation for data lakes. When a data set covers a whole demographic region, as it does for corporations like Wal-Mart, processing it quickly and getting useful, accurate results can save millions of dollars.
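To make the “many computers” idea concrete, here is a minimal sketch of the word-count MapReduce job that Hadoop popularized, written as a Hadoop Streaming script in Python. The file name, input and output paths, and the word-count task itself are illustrative assumptions rather than a recipe for any particular deployment.

```python
#!/usr/bin/env python3
"""Minimal word-count mapper/reducer for Hadoop Streaming (illustrative sketch).

Typically submitted with something like (jar name and paths are assumptions):
  hadoop jar hadoop-streaming.jar -input /data/text -output /data/counts \
      -mapper "wordcount.py map" -reducer "wordcount.py reduce"
"""
import sys

def mapper():
    # Emit one "word<TAB>1" line per word; Hadoop shuffles and sorts
    # these by key before the reducers run.
    for line in sys.stdin:
        for word in line.strip().lower().split():
            print(f"{word}\t1")

def reducer():
    # Reducer input arrives sorted by key, so all counts for a word are contiguous.
    current, total = None, 0
    for line in sys.stdin:
        word, count = line.rstrip("\n").split("\t")
        if word != current:
            if current is not None:
                print(f"{current}\t{total}")
            current, total = word, 0
        total += int(count)
    if current is not None:
        print(f"{current}\t{total}")

if __name__ == "__main__":
    mapper() if len(sys.argv) > 1 and sys.argv[1] == "map" else reducer()
```

Each mapper and reducer is just a small program that reads standard input and writes standard output; Hadoop’s contribution is running thousands of them in parallel across a cluster and moving the intermediate data between them.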

A typical big data deployment will be smaller than the data sets a corporation like Wal-Mart works with. That juggernaut’s data runs to dozens or hundreds of petabytes, while most mid-sized businesses, with one hundred to five hundred employees, will probably fit all the information they need in less than a single petabyte. Big data volume grows as technology and the business do, so expansion, at least, isn’t an issue with this technology.

Also, as big data applications become more mainstream, Extract, Transform, and Load (ETL) tools will naturally become more dependable. ETL is the process of pulling data out of source systems, reshaping it, and loading it into a target store for analysis, and it can be carried out far more quickly and securely in the cloud with big data tooling than by traditional means. That comes without increased infrastructure costs, meaning more usable information and less stress on your budget.
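As a minimal sketch of the ETL pattern itself (not of any particular product or cloud service), the example below extracts rows from a hypothetical sales.csv, transforms them by normalizing and validating fields, and loads them into a SQLite table; the file name, column names, and cleanup rules are all assumptions made for illustration.

```python
import csv
import sqlite3

def extract(path):
    # Extract: stream raw rows out of the source file.
    with open(path, newline="") as f:
        yield from csv.DictReader(f)

def transform(rows):
    # Transform: normalize fields and drop rows that fail basic validation.
    for row in rows:
        try:
            yield (row["region"].strip().upper(), float(row["amount"]))
        except (KeyError, ValueError):
            continue  # skip malformed rows rather than loading bad data

def load(records, db_path="warehouse.db"):
    # Load: write the cleaned records into a target table ready for analysis.
    con = sqlite3.connect(db_path)
    con.execute("CREATE TABLE IF NOT EXISTS sales (region TEXT, amount REAL)")
    con.executemany("INSERT INTO sales VALUES (?, ?)", records)
    con.commit()
    con.close()

if __name__ == "__main__":
    load(transform(extract("sales.csv")))
```

The same three-stage shape scales from a single laptop script like this one up to cloud pipelines that move terabytes; what changes is the infrastructure underneath, not the pattern.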

A Way To Understand Big Data

The entire Lord of the Rings trilogy runs to 481,103 words. If one byte represents one letter, and an average Tolkien word is 8 letters, that works out to 128 words per kilobyte, or roughly 131,000 words per megabyte. The whole trilogy, then, is about 3.7 megabytes. With 1,024 megabytes in a gigabyte, a gigabyte holds roughly 280 copies of the trilogy. A single terabyte holds about 285,000 copies; a petabyte, about 292 million; and an exabyte, about 300 billion. Very few businesses generate that much data. Most will work with somewhere between a few hundred thousand and a few hundred million copies’ worth, roughly a terabyte to a petabyte, which contemporary big data applications can process very fast, in near real time.
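The numbers above are easy to check (or re-run with different assumptions) with a few lines of Python; the eight-letters-per-word figure is the same assumption used in the comparison itself.

```python
# Back-of-the-envelope check of the Lord of the Rings comparison above.
WORDS_IN_TRILOGY = 481_103
BYTES_PER_WORD = 8  # assumed average Tolkien word: 8 letters at 1 byte per letter

trilogy_bytes = WORDS_IN_TRILOGY * BYTES_PER_WORD  # ~3.85 million bytes (~3.7 MB)

for unit, power in [("gigabyte", 3), ("terabyte", 4), ("petabyte", 5), ("exabyte", 6)]:
    copies = 1024 ** power / trilogy_bytes
    print(f"One {unit} holds about {copies:,.0f} copies of the trilogy")
```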

With the ability to process data at that scale, competitive information gathering becomes a vital part of modern operations. Big data size will continue to expand as technology does, so to get the maximum benefit from these solutions, get in early and stay ahead of the curve for as long as possible.

Need to design a Big Data application? See how hTrunk makes it easy.