
How to reduce Process Variability

November 6 · 5 min. read

Process variability is one of the routine concerns of manufacturing industries. How can we improve this quality management indicator to achieve even better results?

Introduction to the concept

Within the reality of companies, the topic of quality management is quite common.

The practice dates back to the mid-1920s and has since evolved to protect the integrity not only of the product, but also of the production process and everyone involved in it. One of the methods credited with keeping industries running properly is Statistical Process Control (SPC), which also originated in the 1920s. SPC works by collecting and analyzing data from periodic tests carried out during production. In short, statistical data helps industries manage time and resources.

What is process variability?

Within SPC, the concept of process variability describes the changes and disturbances that can occur during operations. To predict and understand their influence on the process, the causes of variability are divided into two categories:

  • Common causes of variation: variations tied to cumulative effects, such as machine wear over its useful life or seasonal climate changes. Because they are predictable, they become part of the process design: their existence is expected, and they can be handled routinely by the employees involved, without further intervention.
  • Special causes of variation: interferences that affect the final product abruptly, such as low-quality materials or machine set-up errors. They must be studied so they can either be integrated into the process or removed completely, so that they do not compromise the overall consistency of the operation.

Classifying causes as common or special helps identify where the variability comes from. In addition, there are limits used to gauge the severity of a deviation, known as acceptable limits. They are defined by the guidelines of regulatory bodies and by the industries themselves:

  • Specification limits (SL): market quality guidelines that bound the variation in two directions: the upper specification limit (USL) and the lower specification limit (LSL);
  • Control limits (CL): guidelines set by the industries themselves. They tend to be stricter than specification limits, but follow the same logic, with an upper control limit (UCL) and a lower control limit (LCL).
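As an illustration of how control limits work in practice, the sketch below computes them from a baseline of in-control measurements using the common ±3-sigma rule and flags new points outside those limits as possible special-cause variation. The measurement values are invented for demonstration.

```python
# Minimal SPC sketch (illustrative data): compute control limits from a
# baseline of in-control measurements using the common +/- 3-sigma rule,
# then flag new points outside those limits as possible special causes.

def control_limits(baseline):
    """Return (LCL, UCL) as the mean +/- 3 sample standard deviations."""
    n = len(baseline)
    mean = sum(baseline) / n
    sigma = (sum((x - mean) ** 2 for x in baseline) / (n - 1)) ** 0.5
    return mean - 3 * sigma, mean + 3 * sigma

baseline = [10.1, 9.8, 10.0, 10.2, 9.9, 10.1, 10.0, 9.9]
lcl, ucl = control_limits(baseline)

for x in [10.0, 10.3, 13.5, 9.9]:
    status = "within control limits" if lcl <= x <= ucl else "special-cause signal"
    print(f"{x:5.1f} -> {status}")
```

Note that the limits are estimated from a baseline known to be in control, then applied to new measurements; estimating them from data that already contains a special cause would inflate sigma and hide the problem.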

Why reduce variability?

A process whose variables are not controlled tends not only to fall short of the required quality standards, but also to cost considerably more than necessary. High process variability means greater spending on machine assistance and maintenance, as well as more inspections. It can also waste raw materials and finished product, generating environmental impacts. All of this affects quality, creating a domino effect across related areas.

These concerns have been known for decades and addressed by methodologies such as those mentioned above. The technological advancement of machinery has brought with it new control needs, able to keep pace with faster lines and any other adversities.

Transforming statistical control to decrease process variability

The search for reliability and consistency within processes led to the creation of methods such as SPC and its treatment of process variability. The same purpose drives the advance of Industry 4.0 and its applications in factories and their production lines.

New technological opportunities must be allies in the search for better results, feeding and transforming already well-structured planning, now based on accurate, abundant, real-time data. With the implementation of data collection, analysis, and processing technologies, it becomes possible to understand and act on the factors that contribute to process variability. The sections below cover some areas where data collection contributes to production improvements:

  • Materials

To ensure that the product delivered at the end of production is at its best, quality control must start early. Investing in quality raw materials is the first step toward avoiding variability problems and the added costs of repairs and product returns. To support this, it is essential to integrate technologies that streamline the inspection and analysis of raw-material condition before it is transformed. The more data collected about the material's condition before and during the process, the greater the control over how production will unfold, reducing the chances of variation.

  • Machinery and Equipment

In addition to investing in technology, focusing on more agile strategies also improves the process. Alternatives such as predictive maintenance allow problems to be identified and corrected before they can damage the process, directly impacting variability.

Here, data from machine sensors, performance metrics, and tools help pinpoint more quickly where a problem may arise, allowing faster and more accurate action and saving resources in maintenance, materials, and time.
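As a minimal sketch of how sensor data can feed predictive maintenance, the example below tracks a rolling average of a machine reading and raises an alert when gradual drift crosses a threshold, before it becomes a failure. The readings, window size, and threshold are all invented for illustration.

```python
from collections import deque

# Illustrative sketch: flag gradual drift in a machine sensor reading
# before it becomes a failure. Window size and threshold are assumptions.

def drift_alerts(readings, window=5, threshold=1.2):
    """Return (index, rolling_mean) pairs where the rolling mean exceeds threshold."""
    buf = deque(maxlen=window)
    alerts = []
    for i, r in enumerate(readings):
        buf.append(r)
        if len(buf) == window and sum(buf) / window > threshold:
            alerts.append((i, round(sum(buf) / window, 3)))
    return alerts

vibration = [1.0, 1.01, 0.99, 1.02, 1.0, 1.05, 1.1, 1.18, 1.25, 1.33, 1.4]
print(drift_alerts(vibration))
```

Averaging over a window smooths out single noisy readings, so the alert fires on a sustained trend rather than a one-off spike.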

  • Quality data

The cost of repairing a low-quality product increases the later it is identified in the process. In extreme cases, when the product is already with the end customer, it can cause irreparable damage to the company's brand and image. Therefore, real-time monitoring of quality elements throughout the production chain allows problems and their causes to be corrected, reducing rework costs and contributing to the stability of the process as a whole.
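To make the cost escalation concrete, the toy calculation below (all figures invented) compares the relative cost of correcting one defective unit at successive stages, assuming the common "rule of ten" heuristic that rework cost grows roughly tenfold at each later stage.

```python
# Toy illustration (all figures invented): relative cost of correcting one
# defective unit, assuming the "rule of ten" heuristic that rework cost
# grows roughly tenfold at each later stage where the defect is caught.
stages = ["incoming material", "in-process", "final inspection", "at the customer"]
costs = [10 ** i for i in range(len(stages))]
for stage, cost in zip(stages, costs):
    print(f"{stage:>18}: {cost:>5}x")
```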

Conclusion

Control over process variability rises and falls with the effectiveness of data analysis. Statistical control based on occasional samples no longer meets the needs of industries whose production goals far exceed those of the last century. Remote, real-time, more reliable diagnostics of process variability are key to more accurate, easier, and more efficient execution, and investment in data collection is the first step toward this reality.


ST-One Ltda © 2024
