
Dell Technologies Senior Director for Data Centre & Compute Discusses the Building Blocks for Modern IT Environments
October 25, 2019
At the Dell Technologies Forum (DTF) held earlier this month, Dell Technologies announced a portfolio of its latest Dell EMC PowerEdge servers and new Ready Solutions for High-Performance Computing (HPC). To get a view of how these new offerings can benefit ASEAN businesses and drive value through innovation in a modern IT environment, we interviewed Mak Chin Wah, Dell Technologies’ Senior Director for Data Centre and Compute, South Asia.

One of the key goals of the new releases was to address the demands of the modern data centre. However, based on his interactions with customers in the ASEAN region, Mak said that many are not greenfield customers that are starting from scratch in building their modern data centres. The whole digital transformation journey which organisations are now embarking on means that they also have legacy applications they need to contend with.

So Mak said the question then becomes, “How do I move from here today, with all of the traditional infrastructure, to where I want to go, which is into a new, digital era? That is one of the main challenges faced by a lot of our customers.”

From an infrastructure perspective, that means being able to run both traditional workloads and modern applications, which comes down to scalability without having to do a major forklift upgrade.

Automation is also very important, he said. “How do you scale out to, let’s say, a thousand servers but keep the total cost of ownership (TCO) low? Because you also need to have a lot of manpower to maintain your environment. Keeping the TCO low means you have to automate a lot of processes,” Mak further explained.

Besides continuously upgrading its servers and HPC systems to provide customers with the hardware they need, Dell Technologies has also been adding new innovations on the software front to help customers reduce their TCO and automate more easily. He specifically mentioned Dell’s OpenManage systems management offering as one solution that can help customers overcome many of the complexities and challenges of deploying and managing a modern data centre.

Another major challenge facing today’s businesses comes in the form of security and making sure that their infrastructure is protected. Last but not least, Mak said that choosing how to embark on the cloud journey is another concern for many organisations, given the many options and factors to consider.

Public or Private Cloud – Which Do Companies Prefer?

The multi-cloud era is here and it’s here to stay. Mak pointed to ESG research which found that over 90% of Dell’s customers are using more than two clouds. But as organisations increasingly move to the cloud, does hardware still play an important role in enterprise IT? After all, many of their problems with scalability, automation, security and TCO could seemingly be solved simply by going to the cloud – at least, that’s what the public cloud providers advocate.

But according to Mak, the answer is not that straightforward. He again pointed to the ESG research, which found that 93% of respondents prefer on-premises (or private) cloud as their main cloud platform, due to data sovereignty and other factors.

The key thing to note, he said, is that every customer is different, so they need a methodology to calculate what works best for them. He added that doing ROI studies can help businesses understand that there is a certain “break-even period” where it makes sense for them to either go on-prem or on the public cloud, based on their own value propositions as well as the pros and cons.
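The “break-even period” Mak describes can be illustrated with a simple cost model: on-prem typically carries a large upfront capital cost but lower monthly operating costs, while public cloud has no upfront cost but a higher recurring bill. The break-even month is where the cumulative curves cross. All figures and function names below are illustrative assumptions, not Dell guidance.

```python
# Illustrative break-even sketch: on-prem (high upfront, low monthly)
# vs public cloud (no upfront, higher monthly). All numbers are
# hypothetical and chosen purely for illustration.

def break_even_month(onprem_upfront, onprem_monthly, cloud_monthly):
    """Return the first month at which cumulative on-prem cost
    falls below cumulative public-cloud cost (None if it never does)."""
    if onprem_monthly >= cloud_monthly:
        return None  # on-prem never catches up on running costs
    month = 0
    onprem_total, cloud_total = float(onprem_upfront), 0.0
    while onprem_total >= cloud_total:
        month += 1
        onprem_total += onprem_monthly
        cloud_total += cloud_monthly
    return month

# Example: a $120k server purchase plus $2k/month operations,
# versus a $7k/month cloud bill for the same workload.
print(break_even_month(120_000, 2_000, 7_000))  # → 25
```

Beyond month 25 in this toy scenario, the on-prem option is cumulatively cheaper; before it, the cloud is. A real ROI study would of course factor in staffing, refresh cycles, egress and other costs.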

In the end, he said, the most important thing is for organisations to decide what best suits their business needs.

From Dell’s perspective, Mak said since the announcement of the Dell Technologies Cloud earlier this year, Dell’s strategy has been to give customers the flexibility and transparency to easily move their workloads between on-premises and public cloud. In that sense, they no longer have to worry about being “here or there”.

“Our strategy talks about providing consistent infrastructure and operations to our customers and channel partners. What it means is that from an end customer perspective, they don’t have to worry about whether their application or workload is on cloud or on-prem. They can automate it so that if it’s cheaper to go on public cloud, they can shift it there. If for performance reasons they want it to be on-premises, they can bring it back,” Mak explained.
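The “shift it there, bring it back” flexibility Mak describes boils down to a placement decision per workload: on-prem when performance demands it, whichever is cheaper otherwise. The sketch below is a toy policy to make that logic concrete; the `Workload` fields, names and figures are invented for illustration and are not part of any Dell product.

```python
# Toy workload-placement sketch of the cost-vs-performance decision
# Mak describes. All attributes and thresholds are hypothetical.
from dataclasses import dataclass

@dataclass
class Workload:
    name: str
    latency_sensitive: bool  # needs on-prem performance?
    cloud_cost: float        # projected monthly cost on public cloud
    onprem_cost: float       # projected monthly cost on-premises

def place(w: Workload) -> str:
    """Decide where a workload should run under this toy policy."""
    if w.latency_sensitive:
        return "on-prem"       # performance reasons: bring it back
    if w.cloud_cost < w.onprem_cost:
        return "public cloud"  # cheaper to run in the cloud
    return "on-prem"

jobs = [
    Workload("batch-analytics", False, 3_000, 5_000),
    Workload("trading-engine", True, 2_000, 6_000),
]
for w in jobs:
    print(f"{w.name}: {place(w)}")
# batch-analytics → public cloud, trading-engine → on-prem
```

With consistent infrastructure and operations on both sides, such a decision can be automated and revisited as costs change, which is the point Mak is making.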

Workload Optimised Systems

Among the highlights of the recent DTF announcements were the expansion of the Ready Solutions for HPC to include Digital Manufacturing, Research and Life Sciences, as well as enhancements in liquid cooling and GPU enablement that can further improve performance for compute-intensive workloads such as artificial intelligence, machine learning and deep learning.

When asked to further comment on Dell’s Ready Solutions offerings, Mak said for businesses today to thrive in a dynamic, hyper-competitive environment, there is a tremendous need for agility and efficiency. And this requires matching the right workloads to the right IT environment.

“Different applications require different infrastructure,” he said before adding, “So what’s important is that we need to understand what kind of workload and application that is being used, and then we will provide the ready solutions reference architecture so that we can help the customer to optimise their environment. That’s what we mean by optimised workload.”

For example, Mak said customers running general enterprise applications such as SAP, Oracle and MySQL would need a hypervisor layer to virtualise their machines and maximise utilisation of the physical CPUs, in order to get the most out of their IT infrastructure.

On the other hand, when high-performance compute is required, for instance for very targeted, niche applications involving AI, machine learning and deep learning, latency sensitivity and dependence on hardware for performance are very high. Typically, there is no overlay or hypervisor; instead, a very bare-bones infrastructure is required to fully harness the performance of the hardware.

Therefore, he said, the infrastructure needs are very different between High-Performance Computing (HPC) and typical enterprise applications.

Dell’s ongoing partnership with AMD means that the new PowerEdge servers are optimised to offer the performance, manageability and integrated security required for multi-cloud environments and a wide range of HPC and enterprise workloads.

To that, Mak commented, “In general, we feel that the AMD servers have a technical differentiation and we include them as part of our PowerEdge portfolio and then we wrap around with some of those automation software to help customers automate the deployment and easily manage their environment – so that comes together as a total solution for the customer.”

While most customers or partners may look at a server as just a server, Mak highlighted that there are many technological differences and innovations integrated into these building blocks, Dell’s PowerEdge servers, that customers should consider.
