What you missed in Big Data: Predicting the weather with machine learning
June 21, 2016 News

Thanks to modern analytics technology, weather forecasts are becoming more useful than ever before. Much of the credit goes to IBM Corp. and the researchers at its budding meteorology business. The vendor last week unveiled a service called Deep Thunder that uses machine learning to help companies optimize operations based on short-term changes in temperature and rainfall.

The potential applications are numerous. A power company could use the system to identify which parts of its infrastructure are most susceptible to water damage and spread out field technicians accordingly. Meanwhile, insurance providers will be able to harness IBM’s weather data to determine the validity of accident claims. They’ll have a lot of information to work with: Deep Thunder processes more than 100 terabytes’ worth of weather measurements on a daily basis to fuel its forecasts.
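The power-company scenario above boils down to ranking assets by forecast exposure. A minimal sketch of that idea, with invented segment names, rainfall figures, and threshold (nothing here comes from Deep Thunder itself):

```python
# Hypothetical sketch: rank grid segments by forecast rainfall so field
# technicians can be dispatched to the most at-risk sites first.
# All names and numbers below are invented for illustration.

forecast_rain_mm = {"substation-A": 42.0, "substation-B": 8.5, "substation-C": 61.3}
flood_threshold_mm = 30.0  # assumed susceptibility threshold

# Keep only segments whose forecast exceeds the threshold, worst first.
at_risk = sorted(
    (name for name, rain in forecast_rain_mm.items() if rain > flood_threshold_mm),
    key=lambda name: forecast_rain_mm[name],
    reverse=True,
)
print(at_risk)  # most exposed segments first
```

A real deployment would pull the forecast figures from a weather API and join them against an asset inventory, but the prioritization step looks much the same.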

Analyzing such a large quantity of data is no small feat, especially considering that some companies are still struggling to handle their own internally produced records. The problem is especially pronounced in regulated industries like banking, where a lot of information is kept on mainframes. As a result, data often needs to be moved to an external system for processing, a task that Syncsort Inc. promises to ease with its Ironstream export tool. The vendor added a filtering option to the application last week that lets users selectively transfer files to Splunk Inc.’s popular log-crunching platform. Big iron users can thus speed up migrations and reduce software licensing costs in the process, according to the company.
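The value of selective transfer is that irrelevant records never leave the mainframe, which is what cuts both transfer time and Splunk ingest-based licensing costs. The sketch below illustrates the general idea only; Ironstream's actual filtering is configured within the product, and the record fields here are invented:

```python
# Illustrative only -- not Ironstream's API. Shows the general pattern of
# filtering log records by source and severity before forwarding, so that
# only relevant events are shipped to the downstream analytics platform.

records = [
    {"facility": "SYSLOG", "severity": "INFO", "msg": "job started"},
    {"facility": "SMF", "severity": "ERROR", "msg": "dataset allocation failed"},
    {"facility": "SMF", "severity": "INFO", "msg": "record written"},
]

def should_forward(rec, facilities=frozenset({"SMF"}), severities=frozenset({"ERROR", "WARN"})):
    """Keep only records from selected facilities at selected severities."""
    return rec["facility"] in facilities and rec["severity"] in severities

to_splunk = [r for r in records if should_forward(r)]
print(len(to_splunk))  # only the SMF error survives the filter
```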

Cisco Systems Inc. also tried its hand at log processing last week by introducing a 39-node monitoring appliance designed to give organizations better visibility into their data centers. The system analyzes operational data from servers and networking equipment in real time to create a picture of day-to-day activities. It’s then able to highlight anomalies like performance issues that might require the attention of the IT department.
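The core technique the appliance applies, flagging readings that deviate from normal day-to-day behavior, can be sketched in a few lines. This is a generic statistical outlier test on invented latency data, not Cisco's implementation:

```python
# Generic anomaly-detection sketch (not Cisco's method): flag server latency
# readings that deviate from the mean by more than two standard deviations.
import statistics

latencies_ms = [12.1, 11.8, 12.4, 12.0, 11.9, 48.7, 12.2, 12.3]  # invented data

mean = statistics.mean(latencies_ms)
stdev = statistics.stdev(latencies_ms)

# A reading is anomalous when its z-score exceeds 2.0.
anomalies = [x for x in latencies_ms if abs(x - mean) / stdev > 2.0]
print(anomalies)  # the 48.7 ms spike stands out
```

Production systems typically use streaming or seasonal baselines rather than a single static mean, but the underlying comparison against expected behavior is the same.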
