Thermal imaging, the door to Zero Defects
Glass container manufacturing is an extremely cost-conscious process in which a significant share of products is defective due to limitations in the materials involved, energy usage, and process steps driven more by simplicity and cost-effectiveness than by performance. Thermal imaging of the finished bottle provides a large amount of raw data that, when well used, can give insights into both process parameters (temperature) and the finished product (thickness). OpenZDM will use this information to improve the process both upstream (gob forming) and downstream (glass distribution).
Vidrala manufactures glass containers through a continuous process that starts with melting glass. The molten glass flows through forehearths to a spout, where glass gobs are cut. Each gob is then formed into a bottle or jar, without any addition or subtraction of mass. The conditions under which the gobs are formed depend on the temperature and homogeneity of the glass and affect the regularity of the thickness of the bottle’s walls.
This relationship is not obvious. Some experience is held inside the factories, and some gob shapes are regarded as better than others, but this knowledge is qualitative and very difficult to optimize. In Vidrala’s first use case, modern data science tools should define the actual limits for the gob shape and the impact of those limits.
The second use case focuses downstream, where the bottle is already cold. The thickness of a bottle is measured between 45 and 60 minutes after it has been manufactured. This late measurement means that defective product keeps being made until the result is available. The objective is to use data science to predict thickness problems from the information available at the hot end, saving time and having a big impact on the performance of the factories.
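The hot-end-to-cold-end prediction described above can be sketched as a simple regression: fit cold-end wall thickness against hot-end thermal features, then use the fitted model to flag likely thin bottles before the cold-end gauge measures them. This is a minimal illustration on synthetic data; the feature names, units, and the linear model itself are assumptions, not Vidrala's actual setup.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical hot-end features per bottle: gob temperature (deg C),
# gob length (mm), and a thermal-camera wall-temperature reading (deg C).
n = 500
X = np.column_stack([
    rng.normal(1150, 10, n),   # gob temperature
    rng.normal(180, 5, n),     # gob length
    rng.normal(520, 15, n),    # hot-end wall temperature
])

# Synthetic "cold-end" minimum wall thickness (mm), loosely tied to the
# features plus noise -- a stand-in for the measurement made 45-60 min later.
w_true = np.array([0.004, -0.002, 0.003])
y = 1.2 + X @ w_true + rng.normal(0, 0.05, n)

# Ordinary least squares fit: predict thickness from hot-end data so that
# a likely out-of-spec bottle can be flagged well before it is measured.
A = np.column_stack([np.ones(n), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
pred = A @ coef

rmse = float(np.sqrt(np.mean((pred - y) ** 2)))
print(f"in-sample RMSE: {rmse:.3f} mm")
```

In practice the model family, the feature set extracted from the thermal images, and a proper train/test split would all need to be chosen against real production data; the point here is only the shape of the pipeline.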
The key problem is the uncertainty of the glass process, where “repeating” process conditions does not deliver the same result. This points to a limited understanding of the relevant process conditions, or a limited capacity to measure them. Part of this can be addressed by applying sound statistical methods to large amounts of process data. This is no different from how medical studies assert the impact of given factors on human health: results vary wildly due to individual human variability and the difficulty of creating identical case studies, yet some conclusions are nevertheless possible with the right tools.
- The first challenge is the correct collection of data. The number of suppliers and types of equipment involved requires a deep understanding and development of data tools.
- The second is drawing the fine line between statistical significance on one side and measurement error, instrument limits, and random variation on the other.
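The second challenge, separating a real process effect from random variation, can be illustrated with a permutation test: compare thickness samples from two "repeated" process settings and ask how often random relabelling of the pooled data produces a difference as large as the one observed. The data below is synthetic and the 0.03 mm shift is an assumption chosen purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical thickness samples (mm) from two nominally identical settings;
# setting B carries a small real shift buried in measurement noise.
a = rng.normal(1.20, 0.05, 200)
b = rng.normal(1.23, 0.05, 200)

observed = b.mean() - a.mean()

# Permutation test: shuffle the pooled measurements many times and count how
# often a random split yields a difference at least as large as observed.
pooled = np.concatenate([a, b])
n_perm = 5000
count = 0
for _ in range(n_perm):
    rng.shuffle(pooled)
    diff = pooled[200:].mean() - pooled[:200].mean()
    if abs(diff) >= abs(observed):
        count += 1

p_value = count / n_perm
print(f"observed shift: {observed:.3f} mm, p = {p_value:.4f}")
```

A low p-value says the shift is unlikely to be random variation alone; whether it also exceeds the gauge's measurement error is a separate check against the instrument's specification, which is exactly the balance the second challenge describes.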