Hannover Messe day 1 – Will Industry 4.0 enable zero defects? How are business models impacted by Industry 4.0?
The first day of the Hannover Messe is over, and my head is spinning with robots and Industry 4.0 talk. I will dream of robots tonight – some chrome-polished and shiny, others cosy, upholstered work companions, some playing pinball, others shaking your hand, and huge ones manipulating entire cars.
Will Industry 4.0 enable “zero defects”?
I spent much of my day at the Industry 4.0 forum, starting with Warwick’s Dan Somers busting some myths around Industry 4.0 – some controversial messages, but also some really good insights.
The first myth was spot on and highly relevant for mill products: “Poor quality will be an issue of the past.” Unfortunately, it is a myth.
While Dan made clear that we will never be able to reach zero defects, he highlighted a number of Industry 4.0 measures to improve quality:
- Proactive early warning of out-of-tolerance issues. Fairly straightforward: this is “a little bit Industry 4.0”, as we need a near-real-time reflection of the actual plant-floor quality parameters to fire such an early warning. Call it in-line quality control or alerting – SAP MII and SAP Plant Connectivity are great tools to achieve this.
- Much more interesting was the second recommendation: predictive analytics of “in-tolerance” issues. This contains “a lot of Industry 4.0” and requires explanation.
First – what is an “in-tolerance” issue? Imagine we analysed historic production and quality data, looking for early warnings – indications of later quality defects (so we are in a predictive-quality scenario). Let’s say that through data mining we identify two parameters (among hundreds of others) that correlate with the occurrence of a quality defect later on: whenever both parameters deteriorated, quality also deteriorated a little later.
Quite often in data mining it is not possible to predict an issue based on a single parameter alone, so data scientists look for rules that combine parameters: if vibration is above x AND temperature is above y, something is not OK. Interestingly, each parameter may still be “in tolerance” individually, but if both are moving close to the limits of their allowed ranges, data science predicts a quality problem.
A good practice is therefore to look for such patterns in our data and to fire an early warning as soon as a rule is violated – even if each individual parameter still looks fine. This requires quite a bit of data science and Industry 4.0, and we had better make sure that such deviations are statistically relevant before triggering a maintenance notification.
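To make this concrete, here is a minimal sketch of such a combined check in Python. The parameter names, spec limits and the mined rule are all hypothetical; in practice the rule would come out of the data-mining step described above, and the check would sit close to the shop-floor data feed (e.g. via SAP Plant Connectivity) rather than in a script.

```python
# Hypothetical spec limits and a mined rule; all names and numbers
# are illustrative, not taken from a real plant.
SPEC = {"vibration_mm_s": (0.0, 8.0), "temperature_c": (20.0, 90.0)}
RULE = {"vibration_mm_s": 6.5, "temperature_c": 80.0}  # joint warning zone

def check_reading(reading):
    alerts = []
    # Classic out-of-tolerance check: a single parameter leaves its spec window.
    for name, (lo, hi) in SPEC.items():
        if not lo <= reading[name] <= hi:
            alerts.append(f"out of tolerance: {name}={reading[name]}")
    # Mined "in-tolerance" rule: every parameter is still inside spec,
    # but the combination sits in the region that historically preceded defects.
    if not alerts and all(reading[n] > limit for n, limit in RULE.items()):
        alerts.append("in-tolerance pattern: vibration AND temperature near limits")
    return alerts

# Both values are within spec, yet the combination triggers a warning.
print(check_reading({"vibration_mm_s": 7.0, "temperature_c": 85.0}))
```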
Solution-wise, this is the playing field of SAP Predictive Analysis and SAP Predictive Maintenance.
Even with both out-of-tolerance and in-tolerance issue detection we cannot eliminate quality problems entirely, but we get the chance to react much earlier – and avoid wasting money and resources. This works even if our operations people and engineers do not yet understand the underlying reason. Adjusting our statistical process control so that it also catches the “in-tolerance” quality issues should leave us much better off.
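One established way to extend statistical process control in this direction is a multivariate control chart such as Hotelling’s T², which raises an alarm on joint deviations even while every individual chart still looks fine. A minimal illustration with invented numbers, assuming numpy and scipy are available:

```python
import numpy as np
from scipy import stats

def t2_upper_limit(n, p, alpha=0.0027):
    # Phase-II control limit for new observations, via the F-distribution.
    return p * (n + 1) * (n - 1) / (n * (n - p)) * stats.f.ppf(1 - alpha, p, n - p)

# Phase I: estimate mean and covariance from in-control history
# (invented data standing in for historic plant-floor readings).
rng = np.random.default_rng(42)
history = rng.multivariate_normal([5.0, 60.0], [[0.5, 0.4], [0.4, 1.0]], size=500)
mu = history.mean(axis=0)
S_inv = np.linalg.inv(np.cov(history, rowvar=False))
ucl = t2_upper_limit(n=len(history), p=history.shape[1])

# Phase II: score a new reading. Each coordinate may be individually
# unremarkable while the combination still exceeds the joint limit.
x = np.array([6.5, 58.0])  # high vibration paired with unexpectedly low temperature
t2 = (x - mu) @ S_inv @ (x - mu)
print(f"T2={t2:.1f}, UCL={ucl:.1f}, alarm={t2 > ucl}")
```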
The second myth I would like to put up for discussion, as I need some validation on this one: “The more data we analyse, the better.”
Dan Somers gave some very interesting comments. He highlighted that you do not necessarily need BIG data to come to conclusions, but rather good data that gives statistically relevant answers. This matches what I have read and learned so far. (Warning – if you deal with rare events, be even more careful.)
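To illustrate what “statistically relevant” could look like in practice: before promoting a mined rule to a live alert, one can compare defect rates with and without the rule firing. Fisher’s exact test stays usable even when defects are rare. All counts below are invented:

```python
from scipy import stats

# Invented counts: rows = rule fired / rule quiet,
# columns = defect / no defect.
table = [[18, 82],    # rule fired on 100 runs, 18 ended in a defect
         [30, 870]]   # rule quiet on 900 runs, 30 ended in a defect

# Fisher's exact test copes with small or rare-event counts better
# than a chi-square approximation.
odds_ratio, p_value = stats.fisher_exact(table, alternative="greater")
print(f"odds ratio={odds_ratio:.1f}, p={p_value:.2g}")
# Promote the rule to a live alert only if p is small AND the lift
# matters operationally; significance alone does not justify alarms.
```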
Dan Somers also recommended NOT to cleanse data, and to avoid assumptions. As an example, he highlighted that data points we consider outliers may actually have an inherent logic or correlation that we simply do not understand yet; omitting or deleting them would prevent us from learning and understanding later. I see his point, but I also recall examples where it was absolutely necessary to cleanse data to correct false-positive alerts and classifications: sometimes the sensor data was simply bad, and often manually entered maintenance information is wrongly classified or incomplete.
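A possible middle ground is to flag suspicious data points instead of deleting them, so the decision stays reversible and the raw signal remains available. A minimal pandas sketch with invented readings:

```python
import pandas as pd

# Invented sensor readings; 250.0 and -40.0 look like sensor faults,
# but we cannot be sure, so we keep them.
df = pd.DataFrame({"temperature_c": [71.2, 70.8, 250.0, 71.5, -40.0, 70.9]})

# Flag suspected outliers with a robust (median/MAD-based) score
# instead of dropping rows, so the raw values stay available if the
# "outlier" later turns out to carry meaning.
med = df["temperature_c"].median()
mad = (df["temperature_c"] - med).abs().median()
df["suspect"] = 0.6745 * (df["temperature_c"] - med).abs() / mad > 3.5

# Downstream models can exclude flagged rows explicitly and reversibly.
print(df)
print("mean without suspects:", df.loc[~df["suspect"], "temperature_c"].mean())
```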
Whether you agree with the myths or not, I recommend looking for a good business case, getting the right people and skills on board, and starting with a pilot to learn, explore and move much faster. Let’s discuss such projects and bring your operations and IT teams together with our data scientists and domain experts.
How are business models impacted by Industry 4.0?
The University of Erlangen-Nürnberg presented results from a survey on exactly this question. They used Alexander Osterwalder’s Business Model Canvas to structure the affected aspects of the business models. (If you do not know this approach, please check it out and fill it out for your own company to get an impression of how powerful this simple structure is.)
In which area would you expect the biggest impact of Industry 4.0 for your company? What would you expect overall for mill products?
Value proposition, customer relationships, revenue streams?
The majority of survey participants were unfortunately from the automotive, machinery and high-tech industries. For these industries, the survey indicated the biggest impact of Industry 4.0 on the value proposition, closely followed by the required resources. For high tech, the impact of Industry 4.0 on the cost structure was also a big topic.
So how would I answer this for our mill industry?
I would bet my money on the category “cost structure” – assuming that predictive quality, predictive maintenance and logistics are our focus areas for Industry 4.0. For building products, there may actually be a bigger effect on customer relationships.
Agree? Disagree? Interested in starting a POC? Let us know – we’d love to embark on this together.
Are you interested in how the journey continues?