
How to handle the Big Data dilemma


As more things and people connect, vast amounts of data will be generated, creating opportunities to learn about customers, streamline business processes, flatten organizational structures, and transform industries.

To stay competitive, organizations need to be able to “know” everything: to make sense of this deluge of data and put it into the context of the business. That requires a clear vision of the strategic goals of Big Data and Analytics (BDA), especially as BDA technology investments across Asia/Pacific are projected to increase at 34% year on year over the next few years. This rapid growth in investment is creating a divide between the organizations that “know” and those that do not.

In an interview with Networks Asia, Jun Shi, Vice President, Sales Engineering and Chief Technology Officer (CTO) for Juniper Networks across APAC, discusses why Big Data and analytics are becoming increasingly important and how businesses can successfully derive insights from their data.

“To handle the big data dilemma, companies must implement networks that are highly agile, flexible and scalable,” said Jun.

The following is an excerpt of the interview.

1. We’ve seen businesses collect data from multiple internal sources, but this is nothing new and has been going on for a while. Companies like Splunk have been trying to help us make sense of network logs for years.

So what is causing this change? What is driving the need for external sources of information?

A major catalyst for this change has been the impact of digital disruption. The likes of Airbnb, Uber and Bitcoin have radically altered traditional business models, and have gained market share faster than ever before by successfully eliminating customer pain points. Specifically, we now see four new forces that are accelerating the need for external sources of information – user expectations, competitive pressure, economics and technology innovation.

Together, these forces have widened the gap between the ever-shorter time-to-adoption of new business models and rapidly rising user expectations of value in the form of cost and time savings. In response, companies prioritise the user experience, and this fuels competition to provide the same value at a better price, or more value at the same price. Companies will increasingly seek to derive knowledge and insight to accelerate decision making, driving the need to manipulate external sources of information in near real-time.
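As a rough illustration of what manipulating an external source “in near real-time” can look like, the sketch below polls a hypothetical external pricing feed and maintains a rolling average for decision support. The URL, JSON field names and polling interval are assumptions for the sake of the example, not anything described in the interview.

```python
# Minimal sketch: folding a hypothetical external feed into a near-real-time metric.
# The endpoint, JSON fields and polling interval are illustrative assumptions.
import time
from collections import deque

import requests

FEED_URL = "https://example.com/api/market-prices"  # hypothetical external source
WINDOW = deque(maxlen=60)                           # last 60 samples (~5 min at 5 s)

def poll_once():
    """Fetch one sample from the external feed and return its price field."""
    resp = requests.get(FEED_URL, timeout=5)
    resp.raise_for_status()
    return float(resp.json()["price"])              # assumed field name

def run(interval_seconds=5):
    """Continuously poll the feed and print a rolling average."""
    while True:
        try:
            WINDOW.append(poll_once())
            rolling_avg = sum(WINDOW) / len(WINDOW)
            print(f"samples={len(WINDOW)} rolling_avg={rolling_avg:.2f}")
        except requests.RequestException as exc:
            print(f"feed unavailable, retrying: {exc}")
        time.sleep(interval_seconds)

if __name__ == "__main__":
    run()
```

In practice the polling loop would usually be replaced by a streaming pipeline, but the point is the same: external data is blended into internal decision making within seconds rather than in batch.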

2. Why are current or legacy analytics tools insufficient for the vast amounts of data we are expecting? Are the current tools sufficient to combine divergent information sources? How does the IT department decide what more it needs? How will big data tools integrate with existing analytics tools?

System administrators are continuously looking for ways to process and present data usably and reliably – and often the time taken and the resources consumed are not factored in. As the volume of both structured and unstructured data grows exponentially, effective delivery suffers: users expect data to be available quickly, whenever and wherever it’s needed, on networks that weren’t built to handle that much traffic.

One way to manage usage is through applications that deliver information tailored to an individual’s interests. Administrators can provide employees with the big data equivalent of Uber or Waze – apps that are easily downloadable and serve specific needs. While this removes the IT middleman to some extent, it accomplishes the overall mission of providing end users with easy access to information, delivered in bite-sized chunks.
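A hedged sketch of such a “bite-sized” self-service app, assuming a small Flask service and an entirely hypothetical, pre-computed metrics store, might look like this:

```python
# Minimal sketch of a "bite-sized" self-service data app (assumed stack: Flask).
# Team names and metric values are placeholders, not a real data source.
from flask import Flask, jsonify, abort

app = Flask(__name__)

# Hypothetical pre-computed, per-team metrics an administrator might publish.
METRICS = {
    "sales": {"open_opportunities": 128, "win_rate": 0.31},
    "support": {"open_tickets": 42, "avg_resolution_hours": 6.5},
}

@app.route("/metrics/<team>")
def team_metrics(team):
    """Return just the slice of data a given team cares about."""
    if team not in METRICS:
        abort(404, description="unknown team")
    return jsonify(METRICS[team])

if __name__ == "__main__":
    app.run(port=8080)
```

The design choice being illustrated is narrow scope: each endpoint answers one question for one audience, rather than exposing the whole data estate through a monolithic BI tool.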

However, even that doesn’t change the fact that companies will need networks equipped to handle the delivery of those applications and the data they present. This is a challenge just as big as the data itself. To handle the big data dilemma, companies must implement networks that are highly agile, flexible and scalable. Networks must become the critical pieces that allow intelligent data to be pushed out on-demand, at any time, to any place. Software-defined networking (SDN) is the ideal conduit for this type of service because it creates a network that is elastic, resilient and built for delivering applications and data on-demand.
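As a rough illustration of the programmability SDN brings, the sketch below pushes a flow rule to a hypothetical controller’s REST API so that traffic for a data-delivery application can be reprioritised on demand. The endpoint, payload schema and credentials are assumptions for the example, not any specific vendor’s interface.

```python
# Minimal sketch: reprioritising application traffic via a hypothetical
# SDN controller REST API. Endpoint, schema and credentials are illustrative.
import requests

CONTROLLER = "https://sdn-controller.example.com/api/flows"  # hypothetical endpoint

def prioritise_app_traffic(app_subnet, priority=700):
    """Ask the controller to install a flow rule favouring the analytics app."""
    rule = {
        "match": {"ipv4_dst": app_subnet, "tcp_dst": 443},
        "actions": [{"type": "set_queue", "queue_id": 1}],
        "priority": priority,
    }
    resp = requests.post(CONTROLLER, json=rule, auth=("admin", "admin"), timeout=5)
    resp.raise_for_status()
    return resp.json()

if __name__ == "__main__":
    print(prioritise_app_traffic("10.20.30.0/24"))
```

The detail that matters is not the specific API but that the change is made in software, in seconds, rather than by reconfiguring devices one by one.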

This article was originally published on www.networksasia.net, where it can be viewed in full.
