The growth of the Internet of Things (IoT) has accelerated over the last few years, with no slowdown in sight as more and more things and machines come online. IoT also builds closely on the existing industrial Machine-to-Machine (M2M) field. The key distinction is that M2M provides the foundation for thing-to-thing interaction, while IoT orchestrates that interaction across many M2M nodes. In other words, for a thing to exist in a connected network, its connectivity, purpose, and user personalization must be well defined so it knows its role and how it should interact with the world around it. This can only be achieved through a common set of standards, combined with the extensibility to include native and proprietary interfaces as well. Those proprietary interfaces let communities and corporations further extend the ability to “plug and play” into the orchestration framework we know from the M2M foundation. There will always be exceptions that hinder the IoT platform goal of “Connected Everything,” and if you don’t account for them up front in the design, you will constantly struggle to keep pace with thing technology. Although the IoT and M2M definitions are similar in nature, one system built on the other, the lessons learned from M2M remain fundamental to IoT orchestration: discovery, management, and standardized interfaces. Just think about the revolution in computing standards that something as basic as the USB port brought about. You see a device with that rectangular plug and know instantly it will work with your PC, your Mac, even your TV; proof that extensibility drives adoption and furthers standards.
Another key driver of adoption is the ability to provide personalized data to people in a non-intrusive and desirable way. Dave Rose, in his book “Enchanted Objects,” returns to this point repeatedly and emphasizes the need for “pairs or series of objects, each one inspired by the ones that came before (sometimes long before), connected to the same fundamental human desire.” In short, the same thing needs to provide different information and meaning to each person or thing involved, and this is the orchestration component that IoT provides. That orchestration requires artifacts and context sensed from the environment: whether a person is at work, at home, traveling, shopping, or engaged in any other task determines how the “enchanted objects” around them influence their lives. This is clearly a very broad scope to address, but one can only imagine the possibilities of a fully realized, interconnected IoT world. Each device would know not only the role its user demanded of it, but also its role in the greater scheme of devices, regardless of what the person or the applications using the data were doing.
One can easily imagine how a platform, combined with simple applications, could assign probabilities to various outcomes based on past experience and available data from predefined models. Fully predicting the future is probably not a practical aspiration, but there is real value in having the important variables clearly outlined and available to reference one another. This kind of information certainly will not hurt you, and it can provide key insights into possible outcomes based on current conditions and on matching scenarios involving other people. It is similar to how weather is forecast today: data from various radar sensors is combined over time and used to predict the path of a storm, with varying degrees of uncertainty. There is also a scenario in which end users should be able to define their own connections and rules in an intuitive way (not unlike www.ifttt.com, or a personalized home robot such as Jibo), along with supplying their own flavor of feed data, filtered to their personal liking, carrying a personalized weighting factor or one suggested to them based on profile interests.
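To make the idea of user-defined connections and rules concrete, here is a minimal sketch of an IFTTT-style rule engine with a personalized weighting factor. All names, rules, and context fields here are hypothetical illustrations, not an actual platform API.

```python
# Hypothetical sketch: user-defined orchestration rules with personal weights.
from dataclasses import dataclass
from typing import Callable

@dataclass
class Rule:
    name: str
    condition: Callable[[dict], bool]  # evaluated against sensed context
    action: Callable[[dict], str]      # what the thing should do when it fires
    weight: float = 1.0                # personalized weighting factor

def evaluate(rules: list[Rule], context: dict) -> list[str]:
    """Fire all matching rules, highest personal weight first."""
    matched = [r for r in rules if r.condition(context)]
    matched.sort(key=lambda r: r.weight, reverse=True)
    return [r.action(context) for r in matched]

# Example rules a user might define (purely illustrative):
rules = [
    Rule("arrive-home", lambda c: c["location"] == "home",
         lambda c: "set thermostat to evening mode", weight=0.9),
    Rule("traveling", lambda c: c["location"] == "airport",
         lambda c: "push flight status to watch", weight=0.6),
]

print(evaluate(rules, {"location": "home"}))
# -> ['set thermostat to evening mode']
```

The weight field is where a platform could substitute a value suggested from the user's profile interests rather than one entered by hand.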
The caveat here is thinking that the data in front of you tells the entire story. There is a lot of data involved, but the complete story is always bigger than what can be measured. Therein lies the problem with making predictions: a degree of uncertainty is always introduced, and nothing is ever absolute. As a result, using probabilistic analysis to identify possible outcomes, based on relationships identified through mathematical simulation models, seems to be the right way to go: focus on statistical learning and predictive analysis as the higher-level functions. However, without the collection, management, and control of a multitude of lower-level M2M processes, those higher-level functions would have a hard time manifesting consistently enough for scalable, reliable orchestration to occur, let alone personalized flavors.
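The simplest form of the probabilistic analysis described above is a frequency estimate over past matching scenarios. The sketch below uses made-up data and plain relative frequencies; a real platform would layer statistical-learning models on top, but the principle of assigning probabilities to outcomes, never certainties, is the same.

```python
# Hedged sketch: estimate outcome probabilities from past matching scenarios
# by relative frequency. Data and outcome labels are purely illustrative.
from collections import Counter

def outcome_probabilities(past_outcomes: list[str]) -> dict[str, float]:
    """Relative frequency of each observed outcome; an estimate, not a certainty."""
    counts = Counter(past_outcomes)
    total = sum(counts.values())
    return {outcome: n / total for outcome, n in counts.items()}

# Outcomes previously recorded for scenarios that match the current context:
history = ["left-early", "left-early", "stayed", "left-early"]
print(outcome_probabilities(history))
# -> {'left-early': 0.75, 'stayed': 0.25}
```

Even this trivial estimator makes the uncertainty explicit: the output is a distribution over outcomes, not a single predicted future.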
Like what you’ve read so far? Keep going with Part #2: