Machine Learning Will Be 2017’s Top Trend (and Will Help Solve IoT’s Big Data Challenge)
It’s a brand new year, and a good time to look into the future to see what the next 12 months will bring. I listened to my friend Bridget Karlin make her predictions on the radio program Coffee Break with Game-Changers, which compiled what 80 thought leaders in technology, business, and academia foresee for companies and industries in the coming year. Karlin, who is Intel’s managing director of Internet of Things (IoT) Strategy and Technology, predicted that in 2017, artificial intelligence in all its various forms will go mainstream.
I’ve decided to add my voice and make my own predictions for the coming year. 2017 is shaping up to be the year when machine learning comes of age, moving from research labs and proof-of-concept implementations to cutting-edge business solutions. Along the way, it will help power innovations such as autonomous vehicles, precision farming, therapeutic drug discovery, and advanced fraud detection for financial institutions.
Machine learning is a field of study at the intersection of statistics, computer science, and artificial intelligence that focuses on the development of fast, efficient algorithms for real-time processing of data. These algorithms are written to make predictions whose accuracy and performance improve with ongoing exposure to data. In other words, machine learning systems can learn from iterative data computations. Rather than just following explicitly programmed instructions, machine learning algorithms get “smarter” with use and experience, making them a key component of artificial intelligence platforms.
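The idea of getting “smarter” with experience can be sketched in a few lines of code. The following is a minimal illustration, not any particular product’s algorithm: an online linear model (plain Python, with invented data and a hand-picked learning rate) nudges its weights on every new observation, so its predictions grow more accurate as more data flows past.

```python
# Minimal sketch of online (incremental) learning: a one-feature linear model
# y ≈ w*x + b, updated by stochastic gradient descent on each new sample.
# The underlying relationship (y = 3x + 1) and learning rate are illustrative.

def make_model():
    return {"w": 0.0, "b": 0.0}

def predict(model, x):
    return model["w"] * x + model["b"]

def learn_one(model, x, y, lr=0.1):
    """Nudge the weights to reduce the error on this single sample."""
    error = predict(model, x) - y
    model["w"] -= lr * error * x
    model["b"] -= lr * error

model = make_model()
# Simulated data stream following y = 3x + 1.
stream = [(i * 0.01, 3 * (i * 0.01) + 1) for i in range(100)]

for x, y in stream * 20:   # repeated exposure stands in for ongoing traffic
    learn_one(model, x, y)

print(round(model["w"], 2), round(model["b"], 2))  # converges toward w=3, b=1
```

The point of the sketch is the `learn_one` step: there is no batch “training phase” separate from use; every data interaction refines the model a little, which is exactly the property that makes this class of algorithm a fit for always-on data streams.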
Machine Learning Helps Tackle IoT Data Flows
It turns out that machine learning may also be the answer to a challenge posed by one of last year’s most buzzed-about technology developments: the Internet of Things. Vin Sharma, the director of machine learning solutions in Intel’s Data Center Group, has explored this topic in the article Machine Learning is the Solution to the Big Data Problem Caused by the IoT. Vin notes that the first generation of big data analytics grew up around the flow of information generated by social media, online shopping, online videos, web surfing and other user-generated online behaviors.
Analyzing these massive datasets required new technologies: flexible cloud computing and virtualization, software such as Apache Hadoop and Apache Spark, and more powerful, high-performance processors that provided the tools to uncover the insights in big data.
However, the data volume in this first era of big data is dwarfed by the information flows created by the new generation of connected networks known as the IoT: IDC predicts 50 billion IoT sensors will be in place by 2020, with more than 200 billion networked devices by 2030. 
As devices and sensors proliferate, so does the volume of data they create. Sensor-laden autonomous vehicles will generate 4,000 GB of data per day, while the new Airbus A380-1000 is equipped with 10,000 sensors in each wing. Connected appliances and monitoring systems in the smart home, and traffic sensors and surveillance cameras in smart cities, produce an unending volume of data, as do sensors and robotic systems in manufacturing and industry.
The technologies that grew up to help analyze the first generation of big data aren’t fast or robust enough to handle the constant flood of machine and sensor data generated by the IoT or bots. We need a new generation of analytics and heuristics to help solve the challenge of IoT data flows.
I believe that machine learning is key to analyzing the enormous, repetitious volumes of data flowing from vast, always-on IoT networks. While machine learning may seem like science fiction to many, it is already in use and familiar to users of social media and online shopping. Machine learning algorithms power Facebook’s news feed, and Amazon’s recommendation engine uses machine learning to suggest what book or movie you should enjoy next.
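A recommendation engine of the kind just mentioned can be reduced to a toy sketch to show how it “learns” from behavior. This is an invented illustration, not Amazon’s actual system: it simply tallies which items are purchased together and recommends the most frequent companion of whatever the user is looking at.

```python
# Toy sketch of behavior-driven recommendation: count item co-purchases,
# then suggest the item most often bought alongside the one being viewed.
# The purchase histories below are invented for the example.

from collections import Counter
from itertools import combinations

purchases = [
    {"book_a", "book_b"},
    {"book_a", "book_b", "movie_x"},
    {"book_a", "movie_x"},
    {"book_a", "book_b", "movie_y"},
]

# "Train": tally co-occurrences across all shopping baskets.
co_counts = Counter()
for basket in purchases:
    for pair in combinations(sorted(basket), 2):
        co_counts[pair] += 1

def recommend(item):
    """Return the item most frequently bought alongside `item`."""
    scores = Counter()
    for (a, b), n in co_counts.items():
        if a == item:
            scores[b] += n
        elif b == item:
            scores[a] += n
    return scores.most_common(1)[0][0] if scores else None

print(recommend("book_a"))  # → book_b (bought together in 3 of 4 baskets)
```

Every new basket updates the counts, so the recommendations improve automatically as shopping data accumulates, with no reprogramming.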
Because machine learning algorithms get smarter as they are exposed to more data, these platforms are key to finding insights in the constant, real-time data flows generated by IoT networks. Machine learning systems can “learn” the normal flow patterns of data present on IoT networks and focus on the anomalies or patterns outside the norm. From billions of data points, machine learning can separate the “signal from the noise” in vast data flows, and help organizations focus on what’s meaningful.
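The “learn the normal, flag the anomaly” pattern above can be shown with a minimal streaming detector. The sketch below, with an invented temperature feed and an assumed 4-sigma threshold, maintains a running mean and variance (Welford’s online algorithm) and flags any reading far outside the learned range.

```python
# Sketch of learning "normal" on a sensor stream: running mean and variance
# (Welford's online algorithm) model the usual readings, and anything far
# outside that learned range is flagged. The data and the 4-sigma threshold
# are illustrative assumptions.

import math

class StreamingAnomalyDetector:
    def __init__(self, threshold_sigmas=4.0, warmup=30):
        self.n = 0
        self.mean = 0.0
        self.m2 = 0.0            # running sum of squared deviations (Welford)
        self.threshold = threshold_sigmas
        self.warmup = warmup     # learn a baseline before flagging anything

    def observe(self, x):
        """Update the model with reading x; return True if x looks anomalous."""
        anomalous = False
        if self.n >= self.warmup:
            std = math.sqrt(self.m2 / (self.n - 1))
            if std > 0 and abs(x - self.mean) > self.threshold * std:
                anomalous = True
        # Fold every reading into the running statistics.
        self.n += 1
        delta = x - self.mean
        self.mean += delta / self.n
        self.m2 += delta * (x - self.mean)
        return anomalous

# Simulated sensor feed: steady readings around 20.0 with one injected fault.
readings = [20.0 + 0.1 * ((i * 7) % 5 - 2) for i in range(100)]
readings[60] = 35.0  # the fault

detector = StreamingAnomalyDetector()
alerts = [i for i, x in enumerate(readings) if detector.observe(x)]
print(alerts)  # → [60]
```

Because the detector holds only a handful of numbers per sensor and does constant work per reading, the same pattern scales to millions of IoT data points: the normal flow is summarized cheaply, and only the anomalies demand attention.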
However, machine learning algorithms are computationally intensive, and to be useful and effective for businesses, they must run computations at enormous scale in a matter of milliseconds, on an ongoing basis. These increasingly complex computations put pressure on traditional data center processors and computing platforms.
To operate at scale and in real time, machine learning systems require processors with multiple integrated cores, faster memory subsystems, and architectures that can parallelize processing for next generation analytical intelligence. SAP HANA® 2 is an ideal platform for machine learning and artificial intelligence solutions, with built-in analytical processing engines for text, spatial, graph and streaming data inputs. SAP HANA 2 has the capacity to run complex algorithms in-memory for real-time results and immediate application of insights to operations, processes and customer engagement activities. Optimized for performance on the Intel® Xeon® processor E7 v4, SAP HANA 2 includes new algorithms for classification, association, time series and regression in its predictive analytics library to empower data scientists to discover new patterns and incorporate machine learning into custom applications.
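To make the algorithm families named above concrete, here is one of the simplest members of the time-series class, sketched in plain Python. This stands in for a predictive-library call rather than reproducing any SAP HANA API; the request counts and smoothing factor are invented for the example.

```python
# Illustrative sketch of one algorithm family named above -- time-series
# forecasting -- via simple exponential smoothing. Recent observations are
# weighted more heavily than old ones, yielding a one-step-ahead forecast.

def exponential_smoothing(series, alpha=0.3):
    """Return the one-step-ahead forecast after smoothing the series.

    alpha controls how quickly the model forgets old data:
    closer to 1 tracks recent values, closer to 0 averages the past.
    """
    level = series[0]
    for value in series[1:]:
        level = alpha * value + (1 - alpha) * level
    return level

hourly_requests = [120, 132, 101, 134, 90, 110, 128, 121]
forecast = exponential_smoothing(hourly_requests)
print(round(forecast, 1))  # → 117.9
```

Production libraries implement far richer variants (seasonality, trend, regression against external factors), but the fit-on-history, score-in-real-time shape of the computation is the same.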
Intel® Xeon Phi™ processors are built for high performance computing, traditionally the province of research laboratories and supercomputing challenges such as modeling weather patterns and genome sequencing. As IoT networks become larger and more pervasive, Intel Xeon Phi-based machine learning platforms will become more and more necessary as businesses increasingly base their success on the insights found in machine-to-machine communication. These processors deliver the performance required for the most demanding workloads, including machine learning and artificial intelligence algorithms.
My final prediction for 2017? As machine learning and artificial intelligence begin connecting the dots between IoT data flows and customer engagement for improved sales and outreach, Intel Xeon Phi processors will begin to leave the rarefied environments of supercomputing in research centers and universities and increasingly become a requirement for cutting-edge businesses.
Sources:
IDC FutureScape: Worldwide Internet of Things 2015 Predictions
Just one autonomous car will use 4,000 GB of data/day
That’s Data Science: Airbus Puts 10,000 Sensors in Every Single Wing!