Machine Learning Fragmentation Is Slowing Us Down: There Is a Solution

Machine learning is advancing at a rapid pace, both in the smart devices we interact with daily and in the commercial and industrial sectors of the economy.

Machine learning capabilities are being added to everything from social media platforms, internet of things (IoT) devices and cameras to robots and cars. But the pace of innovation is leading to fragmentation, and one potential consequence of that fragmentation is a risk of stalling.

Lack of Standards Is Plaguing All Tech

Fragmentation is a common problem affecting many industries that either lack standards or are inundated by many competing standards. It can especially plague emerging technologies — ABI Research reports that fragmentation has affected the virtual reality (VR) industry — and it can be tricky to judge when and how to standardize without stifling innovation.

When it comes to machine learning, format fragmentation affects developers, data scientists and researchers by greatly increasing the time it takes to build, train and deploy neural networks. This is because neural networks are trained in any one of several frameworks and then deployed to any one of several inference engines, each with its own proprietary format.

Fragmentation is also present at the inferencing stage, where converters are required before a trained neural network can be deployed. Each of the frameworks mentioned above needs a separate exporter for each inference engine, again demanding more of developers' time while adding no benefit to creation or deployment. Because every framework-engine pairing needs its own converter, the amount of redundant work grows with the product of the two, as the sketch below illustrates.
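To make the scale of the problem concrete, here is a minimal sketch; the framework and engine names are illustrative placeholders rather than a definitive list. It compares the number of converters needed when every framework exports directly to every engine with the number needed when everything passes through one shared exchange format.

```python
# Illustrative only: counting converters with and without a shared format.
frameworks = ["TensorFlow", "PyTorch", "Caffe2", "MXNet", "CNTK"]      # training side
engines = ["mobile GPU", "desktop GPU", "FPGA", "DSP", "NPU", "CPU"]   # inference side

# Without a standard: one dedicated exporter per (framework, engine) pair.
direct_converters = len(frameworks) * len(engines)

# With a standard exchange format: each framework needs one exporter to the
# format, and each engine needs one importer from it.
standard_converters = len(frameworks) + len(engines)

print(f"Direct exporters needed:       {direct_converters}")   # 30
print(f"With a shared exchange format: {standard_converters}") # 11
```

Adding a sixth framework to the first setup means writing six new exporters; with a shared format, it means writing one.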

A Standard Would Solve the Problem

The problem posed by format fragmentation can be solved with a fairly simple solution: a single transfer standard that is compatible across platforms.

Standards are present in our day-to-day life to facilitate activities like driving, making payments and typing: Everyone knows which side the gas pedal is on, the devices stores use to accept credit card payments all basically work the same way, and keyboards are the same no matter where you go.

Standardization reduces the thought required, and any potential friction, for almost any process or task. The same is true for standards in technological fields, where standards in rules, coding languages and even file formats like PDF or JPEG help to drive innovation by allowing developers and creators to focus on the task at hand, rather than wasting time on translation, reformatting or even completely rewriting their work to suit each individual use case or application.

For machine learning specifically, a standardized file format could act as a “PDF for neural networks,” allowing researchers, data scientists and developers to transfer their networks between any of a number of frameworks and to multiple inference engines without spending extra time on translation or writing additional exporters. That would let these professionals focus on delivering their end results and attaining the benefits of machine learning, which would drive the advancement and real-world deployment of machine learning technology.

Machine Learning Standards in the Works

A universal transfer standard for neural networks will cut down time wasted on transfer and translation and provide a comprehensive, extensible and well-supported solution that all parts of the ecosystem can depend on. The need for a standard is clear within the industry, and there are currently two potential standards in the works. Facebook and Microsoft created an open-source format called ONNX (open neural network exchange) for artificial intelligence (AI) frameworks, and The Khronos Group, a global consortium, is leading the development of a standard called NNEF (neural network exchange format). I am involved in the NNEF development effort. This competition is positive: In addition to confirming the need for a standard, it helps ensure the best possible result for the industry.

The two formats are based on the same principles but have a few differences in terms of approach.

ONNX, announced in September 2017, is an open-source project and has therefore benefited from speedy and flexible development. However, that same openness could also create problems. For example, the format could undergo too many changes too fast, potentially leading to a logistical nightmare for systems that are already in operation, because even minor software changes could require hardware upgrades.
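As a concrete illustration of what such an exchange looks like in practice, the sketch below uses PyTorch's built-in ONNX exporter to write a trained network to a single .onnx file that any ONNX-compatible inference engine could then import; the model and file name here are arbitrary examples, not part of either standard.

```python
import torch
import torch.nn as nn

# A small example network standing in for a trained model.
model = nn.Sequential(nn.Linear(784, 128), nn.ReLU(), nn.Linear(128, 10))
model.eval()

# A dummy input with the shape the model expects; the exporter traces the
# network with it to record the computation graph.
dummy_input = torch.randn(1, 784)

# Export once to the ONNX exchange format; any ONNX-compatible engine can
# import the resulting file without a framework-specific converter.
torch.onnx.export(model, dummy_input, "classifier.onnx")
```

The exported file can then be loaded and validated with the onnx package (onnx.load and onnx.checker.check_model) before being handed to whichever inference engine the deployment target uses.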

The provisional release of Khronos Group’s NNEF came out in 2017, with a final version slated for 2018. It is an open standard, and therefore changes occur more slowly because many manufacturers contribute to its development and exercise extreme caution when it comes to making changes.

Help AI Make the Next Leap

It remains to be seen which standard or method of creation is optimal for the machine learning industry, but a standard is undoubtedly needed to solve the fragmentation problem.

In other industries, standards have helped to reduce or prevent fragmentation. At this point for the machine learning industry, it’s unclear whether a single standard format can address all needs, but it is clear that there is demand for neural network standards, and the two groups are working to allow the AI industry to make its next leap.
