Recently, NTT Data Corporation and NTT Data Mathematical Systems, Inc. partnered to conduct a benchmark of IQ's in-database analytics capability applied to a "big data" scenario. Compared with other analytics technologies such as R, IQ showed phenomenal scalability and performance at large data volumes.
IQ in-database analytics is a framework that allows users to build their own custom SQL functions – User Defined Functions, or UDFs – in C/C++ or Java, and run them within the database processing space with performance comparable to built-in, native SQL functions.
NTT Data implemented a K-means algorithm as a UDF, and tested its performance on both a single IQ server and a clustered set of servers, to take advantage of IQ's massively parallel and distributed query processing abilities. Test results showed that performance improved linearly as the number of IQ nodes was increased from 5 to 20. Processing speed also scaled linearly with the volume of data. IQ was able to process 10 billion data points in a few hours, while R fizzled and died at 100 million. This means that you can conduct category analysis for data on the entire worldwide population (around 7 billion) as an overnight job.
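NTT Data's actual UDF code is not public, but for readers unfamiliar with K-means, here is a minimal single-node sketch of the underlying Lloyd's iteration in Java, on one-dimensional data. The class and method names are my own, purely for illustration; a real IQ UDF would wrap logic like this in IQ's external-function interface and distribute the work across nodes.

```java
import java.util.Arrays;

// Illustrative single-node K-means (Lloyd's algorithm) on 1-D data.
// Not NTT Data's IQ UDF; class and method names are hypothetical.
public class KMeans1D {

    // Returns the final centroids, sorted ascending.
    // Assumes points.length >= k; centroids are seeded from the first k points.
    public static double[] cluster(double[] points, int k, int maxIter) {
        double[] centroids = Arrays.copyOf(points, k);
        int[] assign = new int[points.length];

        for (int iter = 0; iter < maxIter; iter++) {
            boolean changed = false;

            // Assignment step: each point joins its nearest centroid.
            for (int p = 0; p < points.length; p++) {
                int best = 0;
                for (int c = 1; c < k; c++) {
                    if (Math.abs(points[p] - centroids[c])
                            < Math.abs(points[p] - centroids[best])) {
                        best = c;
                    }
                }
                if (assign[p] != best) {
                    assign[p] = best;
                    changed = true;
                }
            }

            // Update step: each centroid moves to the mean of its members.
            double[] sum = new double[k];
            int[] count = new int[k];
            for (int p = 0; p < points.length; p++) {
                sum[assign[p]] += points[p];
                count[assign[p]]++;
            }
            for (int c = 0; c < k; c++) {
                if (count[c] > 0) {
                    centroids[c] = sum[c] / count[c];
                }
            }

            if (!changed) {
                break; // assignments stable: converged
            }
        }
        Arrays.sort(centroids);
        return centroids;
    }

    public static void main(String[] args) {
        // Two obvious clusters around 1.0 and 10.0.
        double[] data = {1.0, 1.2, 0.8, 10.0, 10.3, 9.7};
        System.out.println(Arrays.toString(cluster(data, 2, 50)));
    }
}
```

Both the assignment and update steps are data-parallel, which is what lets an engine like IQ scale the algorithm across nodes: each node can assign its local rows and compute partial sums, with only the small per-centroid sums and counts exchanged between iterations.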
With the increasing presence of IoT in the marketplace, technologies like IQ that can handle huge data volumes with excellent performance on commodity hardware have never been more relevant.
You can read more about this impressive benchmark here: