Why management is key to leveraging the power of big data


As big data continues to expand, companies face both new opportunities and new challenges. Businesses can uncover new insights and strategies with big data, but they have to take care not to be overwhelmed by the mountains of information. As any data expert can tell you, a smaller amount of organized, smart data is far more useful than an ocean of unstructured data, especially given the rising cost of data storage.

Big data thus requires proper management, which ensures that a company can easily access its information resources while protecting them at the same time. Here are some important things to note about the science of data management as well as what managers can do.

Storage and the ‘data lake’ challenge

Business owners have to remember that what we call “big data” could be more accurately defined as “decentralized data.” The distinguishing factor of big data is that there is so much of it that traditional centralized databases simply cannot effectively store or process data whose size can reach hundreds of terabytes, if not petabytes.

Organizations are turning to other solutions like cloud computing, but one concept that comes up often in big data management is the data lake. A data lake is essentially a repository, often built on Apache Hadoop, where data can be dumped and identified with metadata tags. If a group within a business needs certain kinds of data, it can use the metadata tags to pull up smaller data chunks. Furthermore, having one data lake repository means that the various departments within a business can more easily access data from other departments, enabling a more holistic approach to data.

But while a data lake can fix big data storage issues, a poorly managed lake makes all of that data virtually useless, since users cannot reliably find out what the lake actually contains. Metadata tags are absolutely essential to show what the data is and where it came from. These tags have to be continually updated and monitored so that when new questions arise, the tags can surface nearly all of the data relevant to those questions.

Cataloging data and creating metadata tags requires new software, such as Microsoft Azure, as well as a dedicated data team. If your business simply creates a data lake to dump information in without a plan, you are wasting your time.
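The tagging-and-lookup idea behind a data lake catalog can be sketched in a few lines of Python. This is a minimal illustration, not a real catalog product; the lake paths, sources, departments, and tag names below are all invented for the example.

```python
# Minimal sketch of a metadata catalog for a data lake.
# All dataset paths, sources, and tags here are hypothetical.

catalog = {}  # maps a dataset's lake path to its metadata

def register(path, source, department, tags):
    """Record what a dataset is, where it came from, and how to find it later."""
    catalog[path] = {"source": source, "department": department, "tags": set(tags)}

def find(tag):
    """Return the lake paths of every dataset carrying the given tag."""
    return [path for path, meta in catalog.items() if tag in meta["tags"]]

register("lake/sales/2023.csv", "crm-export", "sales", ["revenue", "customers"])
register("lake/web/clicks.json", "web-logs", "marketing", ["customers", "traffic"])

print(find("customers"))  # both datasets carry the 'customers' tag
```

The point of the sketch is the discipline, not the code: every dataset dumped into the lake gets registered with tags at the same time, so a later question like “what customer data do we hold?” is one lookup rather than a manual search.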

Replicating data and virtualization

Storing big data can be challenging even with an effectively governed lake, and combining data sets presents a challenge of its own. A data lake is a place where information from different departments is combined, but each department may then copy a particularly useful data set and use it for its own ends. While every department is working with the same data set, the constant copying means that the data set could take up 10 or even 20 times more space than before without providing any new insights.

Fortunately, this replication problem can be fixed with virtualization software. Virtualization creates a virtual computer system using only software, letting multiple operating systems run on a single server and eliminating the inefficiency of constantly replicating the same pieces of data. Through virtualization, different departments can work from the exact same data footprint.
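The storage cost of repeated copies is easy to demonstrate. The Python sketch below is only an analogy for the department-level problem: ten hypothetical departments either copy one data set or share a single reference to it, and we compare the footprints.

```python
import sys

# Hypothetical illustration: ten departments copying one data set
# versus all ten sharing a single reference to the same data.
dataset = list(range(100_000))  # stand-in for a shared data set

copies = [dataset.copy() for _ in range(10)]  # each department keeps its own copy
shared = [dataset for _ in range(10)]         # each department points at the same data

copied_bytes = sum(sys.getsizeof(c) for c in copies)
shared_bytes = sys.getsizeof(dataset)  # the underlying data exists only once

print(copied_bytes // shared_bytes)  # roughly 10: ten copies, ten times the space
```

The ratio grows linearly with the number of copies while the insight gained stays at zero, which is exactly the waste that virtualized, shared access to one data footprint avoids.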

Privacy and security

Businesses must value big data, but they also need to understand that it poses unique privacy and security risks beyond the familiar threat of criminals and hackers. Remember that much of big data is personal data, which is thus subject to federal regulation. Big data can be used to uncover not just new business strategies but also to infer personal information about the people whose data is in the lake. And while your business may understand the importance of safeguarding that information, less scrupulous companies with whom you share big data may not. A big data breach can cost companies millions of dollars in direct costs and damages, never mind the loss of reputation.

All of this means that big data must be protected end to end. Limit physical access to servers, monitor big data accounts to keep them secure from hackers and from compromise by malicious individuals, and make sure your software is secure. You may also consider allowing customers to find out what personal information you hold on them and to delete it at their request. Your business will still have plenty of data, and the practice fosters customer goodwill.

Protecting customer security and storing big data so that users can find relevant information are just a few aspects of strong big data management, but they are among the most important. Above all else, management must understand that big data in and of itself is meaningless. Only by maintaining a flexible approach that turns unorganized big data into structured smart data can businesses glean the most valuable insights.

This article was originally published on and can be viewed in full