The art of data science
The art of data science is to create actionable value from a vast array of data points, which relate to one another in many different ways. This is quite different from historical approaches, in which data was exchanged through a dedicated interface between a single producer and a single consumer.
The art of modern digitization is to understand the many roles each piece of data can play within an interoperable ecosystem of multiple data producers and consumers. Players with expertise in multiple domains, together with creators of advanced temporal data-processing and AI algorithms, must come together to deliver software automation of real-time data analysis that creates actionable insights.
IIoT message data, such as that defined by the IPC Connected Factory Exchange (CFX) standard for manufacturing, can contribute to creating value from a number of different perspectives, including operational performance, product quality, materials and supply chain, compliance, and traceability. Each perspective is created by combining the incremental information from each new data point with that from other data points, within the context of the live “digital twin” model of the production operation, including product details, the engineering methodology behind the design, the configurations of production stations, and so on.
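As a minimal sketch of this pattern, consider ingesting one station event into a tiny digital-twin context. The message shape, topic name, and identifiers below are invented for illustration; the real CFX (IPC-2591) message set defines much richer, standardized JSON payloads.

```python
import json
from dataclasses import dataclass, field

# Hypothetical, simplified stand-in for a CFX-style station event; the real
# IPC-2591 CFX standard defines richer, standardized JSON message payloads.
RAW_MESSAGE = json.dumps({
    "topic": "UnitsDeparted",           # simplified topic name (assumption)
    "station_id": "SMT-Line1-Placer2",  # hypothetical station identifier
    "unit_id": "PCB-000417",
    "timestamp": "2024-05-01T09:17:42",
})

@dataclass
class StationTwin:
    """A tiny slice of a digital-twin model: one station's recent events."""
    station_id: str
    events: list = field(default_factory=list)

    def ingest(self, message: dict) -> None:
        # Each new data point is kept in the context of the station it came
        # from, so later perspectives (performance, quality, materials, ...)
        # can combine it with product and configuration details.
        self.events.append(message)

twin = StationTwin("SMT-Line1-Placer2")
twin.ingest(json.loads(RAW_MESSAGE))
print(len(twin.events), "event(s) in the twin context")
```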
Micro-facts are calculations based on single aspects of data, grouped and derived from multiple data points, taken from a specific point of view. As a simple example, a time is reported when a production unit leaves a particular production station. A derived micro-fact would be the time it took for the production unit to complete processing on a production-line configuration, based on its arrival time at the first station and its departure time from the final station.
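In code, that micro-fact is a one-line derivation from two reported events. The timestamps below are invented purely to illustrate the calculation:

```python
from datetime import datetime

# Two reported events for one production unit (illustrative values):
arrival_first_station = datetime.fromisoformat("2024-05-01T09:00:05")
departure_final_station = datetime.fromisoformat("2024-05-01T09:14:47")

# The derived micro-fact: total time to complete processing on the line.
line_throughput_time = departure_final_station - arrival_first_station
print(f"Line throughput time: {line_throughput_time}")  # -> 0:14:42
```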

The difference between the two times is a simple calculation, but it plays an important role when considered from different angles: the performance of the line against its target; OEE (Overall Equipment Effectiveness); the risk to quality indicated by the spread of variation in times across different production units; the effect on material-replenishment schedules; the extrapolation of expected line performance in the short term; and, taking into account the dependencies and interdependent routings of different products, the performance of the whole plant.
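The sketch below shows how the same collection of throughput-time micro-facts can feed several of these angles at once. The cycle times and target are invented, and the performance ratio is a deliberate simplification of the full OEE calculation:

```python
import statistics

# Illustrative micro-facts: throughput times (seconds) for recent units.
cycle_times = [882, 875, 890, 901, 870, 1043, 879]
TARGET_SECONDS = 900  # hypothetical line target

mean_time = statistics.mean(cycle_times)
spread = statistics.stdev(cycle_times)

# Simplified performance factor: target time over actual mean, capped at 1.
# (Full OEE also multiplies in availability and quality factors.)
performance = min(1.0, TARGET_SECONDS / mean_time)

print(f"Mean cycle time: {mean_time:.0f}s (target {TARGET_SECONDS}s)")
print(f"Spread (std dev): {spread:.0f}s -> quality-risk signal if large")
print(f"Simplified performance factor: {performance:.2%}")
```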
The necessary insights drive the contextualization process and arise from real-world issues, such as the occurrence of defects, disruptions in production flow, or abnormal fluctuations in conditions in key areas. In a single production batch of a thousand products, one product could be defective. Since every production operation went as planned, in the same way, for every product, finding the root cause of such a “one-off” defect is best done by understanding where the most variance within the process has been felt, from the point of view of the defective product.
Such variance is likely to be the product of a combination of two or more factors, each of which is within its individual control limits, but which together have created a defect. Multiple perspectives must be considered, each fueled by the many associated micro-facts, allowing the analysis to find the unique set of conditions most likely to have contributed to creating the defect.
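One simple way to express “where the most variance has been felt” is to rank each process parameter by how far the defective unit sits from the population, in standard deviations. The parameter names and measurements below are hypothetical:

```python
import statistics

# Illustrative per-unit process measurements (all within control limits).
# Keys are hypothetical parameters; lists hold values for good units.
population = {
    "reflow_peak_c":   [243.1, 242.8, 243.4, 242.9, 243.0],
    "paste_height_um": [148.0, 151.2, 149.5, 150.1, 149.8],
}
defective_unit = {"reflow_peak_c": 244.6, "paste_height_um": 144.9}

# Score each parameter by the defective unit's distance from the
# population mean: the variance "felt" from the defect's point of view.
scores = {}
for name, values in population.items():
    mu = statistics.mean(values)
    sigma = statistics.stdev(values)
    scores[name] = abs(defective_unit[name] - mu) / sigma

for name, z in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(f"{name}: {z:.1f} sigma from the population mean")
```

No single parameter is out of control here; it is the combination of unusually placed values that flags the candidate root cause.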
As a follow-up, an analysis could then be performed to find the production units that experienced nearly the same conditions as the faulty unit but appear to be good products. Understanding the importance of each of these perspectives can lead to the identification of units that may lie in the “grey area” of quality: products likely to develop reliability issues in the market. Steps can then be taken to eliminate the possibility of such a combination of factors causing defects of this type.
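That follow-up search can be sketched as a nearest-neighbour comparison of process signatures. Here the signatures are the per-parameter sigma scores from the previous step, and all unit IDs and values are invented:

```python
import math

# Hypothetical normalized process signatures (per-parameter sigma scores).
defective = {"reflow_peak_c": 6.8, "paste_height_um": -4.0}
passed_units = {
    "PCB-000221": {"reflow_peak_c": 6.1, "paste_height_um": -3.6},
    "PCB-000305": {"reflow_peak_c": 0.4, "paste_height_um": 0.2},
    "PCB-000388": {"reflow_peak_c": 5.9, "paste_height_um": -3.9},
}

def distance(a: dict, b: dict) -> float:
    """Euclidean distance between two process signatures."""
    return math.sqrt(sum((a[k] - b[k]) ** 2 for k in a))

# Units that passed test but sit closest to the defect's conditions are
# candidates for the quality "grey area" (possible field-reliability risk).
for unit_id, sig in sorted(passed_units.items(),
                           key=lambda kv: distance(defective, kv[1])):
    print(f"{unit_id}: distance {distance(defective, sig):.2f}")
```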
Similar analyses can be performed from perspectives that affect the performance of a manufacturing line, often referred to as machine learning or closed-loop analysis, but in the ideal case taking in much broader contexts, including past variations associated with the arrival of production units, the materials used, and the operations carried out.
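As a deliberately tiny stand-in for such closed-loop analysis, a trend fitted from one piece of past context can be used to extrapolate short-term line performance. The context variable, values, and forecast point are all invented, and real deployments would use far richer models and contexts:

```python
import statistics

# Illustrative history: ambient temperature (context) vs. unit cycle time.
ambient_c    = [21.0, 22.5, 24.0, 25.5, 27.0, 28.5]
cycle_time_s = [880, 884, 893, 899, 911, 918]

# Fit a simple linear trend (statistics.linear_regression needs Python 3.10+)
# and extrapolate expected short-term performance from the context.
slope, intercept = statistics.linear_regression(ambient_c, cycle_time_s)
forecast_at_30c = slope * 30.0 + intercept
print(f"Expected cycle time at 30 degC: {forecast_at_30c:.0f}s")
```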
Human intelligence and experience, based on knowledge of the physical world of assembly manufacturing, are used to formulate the insights needed for improvement. Automation, through software algorithms, then manipulates combinations of micro-facts to build the big picture of events and trends, each powered by data. The true Industry 4.0 digital ecosystem contains combinations of many “big data” sources, which share their data interoperably with many solutions, each of which in turn exchanges its results with other solutions, building on one another to create the insight that delivers direct benefit. The process is not unlike the workings of the human brain, where significant distributed processing of shared sensory data is contextualized into a single consciousness. Most of the time, anyway.