What is Big Data, and what does it have to do with hardware?

It's no surprise that conspiracy theorists have attached all sorts of ideas to this term, but rest assured: there is no link between Big Data and global governance. So what is this Big Data that everyone in computing is talking about? Let's find out.

What is Big Data?

Literally, it means "huge amount of data". But if we hinted earlier that this idea is incomplete, it is because Big Data is not just the data itself: it also includes the study of that data to look for patterns in it. It is an efficient, sophisticated way of processing information to try to extract something useful from it.

To give you an example, consider a supercomputer that runs tests to diagnose a disease and generates millions upon millions of data points. Big Data covers not only that data, but also the way it is managed, classified and analyzed to try to find the desired answers.
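As a toy illustration of "looking for patterns" in a mass of records, the sketch below counts how often hypothetical markers co-occur across diagnostic records (the marker names and data are invented for the example; a real system would stream this from distributed storage, not a Python list):

```python
from collections import Counter

# Hypothetical diagnostic records: each record is the set of markers observed.
records = [
    {"marker_a", "marker_b"},
    {"marker_a", "marker_c"},
    {"marker_a", "marker_b"},
    {"marker_c"},
]

# Count how often each marker appears across all records to surface patterns.
counts = Counter(m for record in records for m in record)
print(counts.most_common(1))  # → [('marker_a', 3)]
```

This is the analysis half of the concept in miniature: the data alone says little, but counting across the whole collection reveals which marker dominates.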

So, Big Data has five attributes that explain its use and philosophy:

  1. Volume – Yes, we are talking about large volumes of data; if the dataset is not large, it is not considered Big Data. Volume is the defining trait of the concept.
  2. Variety – This attribute refers to the type and nature of the data to be analyzed.
  3. Velocity – The data must be analyzed in real time, which means that even while big data is being analyzed, everything must remain available at the same time. This is where hardware comes in, providing the input capacity and the processing power to handle it.
  4. Variability – The consistency of the data sets determines how well they fit the concept.
  5. Veracity – The quality of the data used for the analysis. Only quality data can yield patterns; otherwise it is a waste of time. In other words, when analyzing data from a medical study, you cannot mix in data from a Formula 1 driver analysis, because it would be inconsistent.

How much data is generated and stored?

In total, there are 2.7 Zettabytes of data in the digital universe. What does that mean? Let's look at the scale…

  • A Terabyte is 1024 Gigabytes
  • A Petabyte is 1024 Terabytes
  • An Exabyte is 1024 Petabytes
  • A Zettabyte is 1024 Exabytes

So 2.7 Zettabytes is about 2,968,681,394,995 Gigabytes. If we wanted to store it on 4 TB hard drives, we would need about 725 million of them; something to think about, right? And it keeps growing: every minute, more than 150,000 emails are sent, 3.3 million Facebook posts are published and 3.8 million searches are made.
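The arithmetic behind those figures can be checked in a few lines, using the binary (1024-based) units from the table above:

```python
# Convert 2.7 Zettabytes to Gigabytes: ZB -> EB -> PB -> TB -> GB.
GB_PER_ZB = 1024 ** 4
total_gb = 2.7 * GB_PER_ZB       # total data in Gigabytes

# Capacity of a single 4 TB hard drive, in Gigabytes.
drive_gb = 4 * 1024

# How many such drives would be needed?
drives = total_gb / drive_gb

print(round(total_gb))           # ≈ 2,968,681,394,995 GB
print(round(drives / 1e6))       # ≈ 725 million drives
```

Both results match the figures quoted above.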

In addition, these figures grow day by day, with more data being generated all the time. To put it simply, by 2020 far more data will be produced than in 2010, and the figures above are expected to double within five years.

Big Data management and how hardware influences it

Actually, Big Data management is not too complicated to understand. We will try to explain it simply (the reality is more complex, but we will simplify as much as possible):

  1. Data is captured.
  2. The captured data is organized and subdivided into smaller units by an algorithm to make analysis easier.
  3. An index is built, because otherwise the time needed to retrieve any piece of data would grow.
  4. The data is stored.
  5. The data is analyzed with a large number of algorithms that search for what we are looking for, as explained before.
  6. The results are presented.
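The steps above can be sketched as a toy pipeline. The function and record names here are invented for illustration; real systems run these stages on distributed frameworks, but the flow is the same:

```python
def capture():
    # Step 1: data is captured (here, a hypothetical inline sample).
    return [("sensor_1", 42), ("sensor_2", 7), ("sensor_1", 13)]

def partition(records):
    # Step 2: subdivide into smaller units (one bucket per source).
    buckets = {}
    for source, value in records:
        buckets.setdefault(source, []).append(value)
    return buckets

def build_index(buckets):
    # Step 3: build an index so lookups do not scan everything.
    return {source: position for position, source in enumerate(buckets)}

def analyze(buckets):
    # Step 5: run an algorithm over the stored data (here, averages).
    return {source: sum(values) / len(values) for source, values in buckets.items()}

records = capture()              # step 1
buckets = partition(records)     # step 2
index = build_index(buckets)     # step 3 (step 4, storage, is implicit here)
results = analyze(buckets)       # step 5
print(results)                   # step 6: results are presented
```

Even in this miniature version, the index (step 3) matters: without it, finding one source's data would mean scanning every record again.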

Take again the example of a supercomputer used to diagnose a disease and search for a cure: this high-end machine generates a huge amount of data, with multiple inputs and calculations every second, so it takes a great deal of storage space to save it all and feed it into further analysis.

This is where hardware comes in. You need plenty of storage space, but it must also be very fast, to manage all this data in as little time as possible. You also need a lot of RAM and plenty of computing power to run the algorithms that analyze the data.

In short, Big Data management is only possible because the hardware industry keeps improving: if processors, hard drives and RAM did not improve at the same rate as the data we generate, analyzing it would not be possible.
