Covering Disruptive Technology Powering Business in The Digital Age

Machine Intelligence
September 6, 2016 News

Machines that learn and act, or intelligent machines, will be of great benefit in the coming decades. In contrast to computers that are programmed and do exactly as they are told, these machines will continuously and automatically learn patterns in their environment without explicit programming, enabling them to tackle problems on their own in new and different ways.

Big Community decided to delve deeper into the world of machine learning and to examine how this technology will impact big data and more conventional forms of machine intelligence and machine learning.

We approached Numenta, a company at the forefront of machine intelligence that has been making waves in the industry through its breakthroughs and innovations. We were privileged to interview its Director of Marketing, Christy Maver, on the ins and outs of machine intelligence and where it will take us in the near future.

How does your Hierarchical Temporal Memory approach differ from other machine learning approaches?

“Machine learning is heavily based on mathematical and statistical techniques, and relies on massive amounts of labelled data. It is typically a batch process, where algorithms are tuned to a specific problem, and is used for image classification, language translation, or recognizing which emails are spam.

Hierarchical Temporal Memory, or HTM, is a biological approach modelled on the human brain that learns, infers, and recalls patterns in data streams. It does not require massive amounts of data to train, and it continuously learns from unlabelled data streams. Just like the brain, it is high capacity (able to create many models simultaneously), robust to noise, and able to run on very different types of sensors. HTM is currently used for prediction, anomaly detection, and classification across many application areas. It is well suited to applications with temporal data streams, from monitoring IoT sensors to preventive maintenance to detecting unusual traffic patterns. It can also be combined with other machine learning techniques; many use cases involve spatial, big data (required for machine learning) as well as temporal data (required for HTM). Our co-founders wrote a blog post earlier this year about the differences between the various techniques in this space that might be of interest.”

You talk about being well suited to streaming data over time rather than static data – but isn’t this the case for all machine learning algorithms?

“No, it’s not. Although some machine learning techniques are starting to incorporate temporal data, most machine learning algorithms are spatial in nature. The types of patterns they find tend to be co-occurrences, whereas HTM finds sequences. Think of learning a song, for example. HTM algorithms would learn the sequence of notes in a melody and be able to predict what note comes next. Most machine learning algorithms would learn co-occurrences, i.e. when two notes happen simultaneously. They would learn chords, where HTM would learn melodies. Learning sequences requires streaming data; learning co-occurrences does not.”
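The chords-versus-melodies distinction can be made concrete with a toy sketch. This is not Numenta's HTM algorithm, just a minimal illustration of the difference between counting which notes appear together (co-occurrence) and learning which note follows which (sequence), using a made-up melody:

```python
from collections import Counter, defaultdict

melody = ["C", "E", "G", "E", "C", "E", "G", "C"]

# Co-occurrence view (what most spatial ML captures): which notes
# appear near each other, with no regard for order.
window = 2
cooccurrences = Counter()
for i in range(len(melody) - window + 1):
    cooccurrences[frozenset(melody[i:i + window])] += 1

# Sequence view (what temporal learning like HTM captures): which
# note tends to FOLLOW which, so the next note can be predicted.
transitions = defaultdict(Counter)
for prev, nxt in zip(melody, melody[1:]):
    transitions[prev][nxt] += 1

def predict_next(note):
    """Predict the most likely next note from learned transitions."""
    return transitions[note].most_common(1)[0][0]

print(predict_next("E"))  # prints "G": order matters, E is usually followed by G
```

The co-occurrence counter only knows that C and E often appear together (a "chord"); the transition table additionally knows their order, which is what makes prediction over a stream possible.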

Your use cases sound interesting – can you tell us more about the rogue behaviour use case and what industries could benefit from it?

“Yes, the rogue behaviour use case is one that generates a lot of interest. This use case is about finding anomalies in human-generated data. It monitors human behaviour in the form of things like keystrokes, which files someone is opening, CPU usage, and app access – creating a model for each and revealing when a user is behaving differently than expected. This application would allow companies to detect when an employee is trying to access something they shouldn’t, identify abnormal financial trading activity by individual traders, or know when someone is installing unapproved software. It can even detect when a computer or device is being used by an unauthorized person.

We see potential benefits for many industries with this application. Anomaly detection of human behaviour is useful in IT security, regulatory compliance, financial risk assessment, and device access control. As with all of our example applications, this is available in open source.”
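The "one model per metric" idea described above can be sketched in a few lines. The detector below is a stand-in, not HTM: it uses a simple rolling-baseline z-score where the real application would use an HTM model, but it shows the shape of the approach, with one streaming model per user per metric flagging readings that depart from that user's learned norm:

```python
from collections import deque
import math

class StreamAnomalyDetector:
    """Toy per-metric anomaly detector (a stand-in for an HTM model):
    flags values that deviate sharply from a rolling baseline."""

    def __init__(self, window=50, threshold=3.0):
        self.history = deque(maxlen=window)  # recent readings for this metric
        self.threshold = threshold           # how many std-devs counts as anomalous

    def observe(self, value):
        """Feed one reading; return True if it looks anomalous."""
        anomalous = False
        if len(self.history) >= 10:  # wait for a minimal baseline
            mean = sum(self.history) / len(self.history)
            var = sum((x - mean) ** 2 for x in self.history) / len(self.history)
            std = math.sqrt(var) or 1e-9
            anomalous = abs(value - mean) / std > self.threshold
        self.history.append(value)
        return anomalous

# One detector per user per metric, e.g. keystrokes/minute, CPU usage,
# file accesses per hour (metric names here are illustrative).
detector = StreamAnomalyDetector()
readings = [50, 52, 48, 51, 49, 50, 53, 47, 50, 51, 49, 300]
flags = [detector.observe(v) for v in readings]
print(flags[-1])  # prints True: the wildly different final reading is flagged
```

In a real deployment each user-metric pair would get its own continuously learning model, which is what lets the system say "this behaviour is unusual *for this person*" rather than unusual in general.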

How easy is it for a typical business, perhaps with only a standard IT department, to work with your technology?

“Numenta is transparent about its technology. All of our algorithms and sample application code are available via our open source project, NuPIC, which stands for Numenta Platform for Intelligent Computing. Under the terms of our open source software license, anyone can experiment with our codebase and build non-commercial prototypes.
Because our technology is based on neuroscience concepts that are unfamiliar to most people, there is a steeper learning curve than one might experience in other machine learning projects. It also helps to have a developer well versed in Python and C++. For those new to Numenta and NuPIC, a great place to get started is here, which has some helpful overview and training material. We also have an active community forum, which is a great place to ask technical questions and interact with Numenta engineers and researchers.

Because of the steep learning curve, we created a free tool that people can use to test our technology before diving into a full deployment. It’s a desktop tool called HTM Studio that allows you to run HTM on your own local data files, and it’s available for download. All you need is a CSV file with at least 400 rows of numeric data listed in chronological order, a column for the timestamps, and a header row. For more on how to ensure your data is in the right format for HTM Studio, you can watch this short tutorial video.
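A file meeting the stated requirements (a header row, a timestamp column in chronological order, at least 400 numeric rows) can be generated with a short script. The column names, filename, and values below are illustrative, not mandated by HTM Studio:

```python
import csv
from datetime import datetime, timedelta

# Write a minimal CSV in the shape HTM Studio expects, per the
# requirements above: header row, chronological timestamps, >= 400
# rows of numeric data. (Names and values here are made up.)
start = datetime(2016, 9, 1)
with open("metric.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["timestamp", "value"])       # header row
    for i in range(400):                          # at least 400 data rows
        ts = start + timedelta(minutes=5 * i)     # strictly chronological
        writer.writerow([ts.strftime("%Y-%m-%d %H:%M:%S"), 50 + (i % 7)])
```

Any existing time-series export (server metrics, sensor logs) can be reshaped the same way, as long as the timestamps stay in order and the data column is numeric.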

Because today’s computers are programmed, they can only do exactly as they are told. In stark contrast, intelligent machines continuously and automatically learn patterns in their environment without being programmed, enabling them to tackle problems in entirely new ways. Intelligent machines that learn and act will have an enormous beneficial impact in the coming decades.”

Numenta’s technology has already been implemented in software following best practices and is suitable for deployment in commercial applications. The core learning algorithms are available in the NuPIC open source project, where full documentation is available.