Manufacturing optimization through operational excellence tools

Manufacturing optimization through operational excellence tools is a multi-dimensional process that entails integrating and streamlining all operations within the manufacturing value chain. It involves the effective use of technology, processes, people, resources, systems and data to maximize production volume, minimize lead time from start to finish and reduce costs.

Operational excellence tools also enable manufacturers to identify opportunities for improvement in terms of their capacity planning, logistics coordination and decision-making processes. This helps them stay competitive in today’s ever-changing global marketplace.

The following operational excellence tools provide these optimization opportunities:

  1. Overall Equipment Effectiveness (OEE) is a measurement of efficiency which encapsulates the performance of production machinery in terms of availability, performance and quality. OEE can be used to identify areas for improvement across production machinery, enabling manufacturers to identify issues such as low inventory levels, sub-optimal machine processing speeds or excessive downtime. OEE benchmarking provides manufacturers with a continuous measure of operational excellence that allows them to consistently improve results over time.
  2. Poka Yoke is a Japanese tool used in the Lean Manufacturing process to improve quality and reduce errors. It works by providing fool-proofing mechanisms that prevent mistakes from happening in the first place. This helps increase process efficiency, reduce costs, and improve customer satisfaction. Some examples of Poka Yoke are using specific patterns for parts or product assemblies, sensing devices to detect incorrect positioning of parts or assemblies, color coding products when selecting components or entering data into a computer system, and using shapes instead of words when labelling items.
  3. Takt time is a tool used in lean manufacturing to optimize production flow and ensure that the rate of output meets customer demand. It is defined as the amount of time available per unit of product or service, usually measured in minutes or seconds. Takt time calculation additionally takes into consideration the total processing time, downtime, and cycle efficiency to determine a target rate for each operation.
  4. Project Charter: a living document that serves as an agreement between the Sponsor and the team, outlining the expectations of the project.
  5. Critical to Quality (CTQ): a product or service characteristic that meets customer or process requirements.
  6. Voice of Customer (VOC) (Kano Model): capturing customer feedback through surveys, personal visits, questionnaires, interviews, and phone calls.
  7. SIPOC: Supplier-Inputs-Process-Outputs-Customers; a tool to identify all relevant elements of a process at a macro level, helping to better understand complex processes.
  8. Detailed Process Map Micro Level: an expanded version of SIPOC, identifying key process input (x) and output variables (y) relevant to the project.
  9. Prioritization Matrix: ranking the importance of the outputs (CTQs) with clear weightings, and initiating the funnelling process.
  10. SMED, or Single-Minute Exchange of Die, is an efficient Lean manufacturing technique designed to reduce the amount of time it takes to complete changeovers between different production runs. This method seeks to reduce setup times as much as possible with the aim of enabling more frequent and shorter batches, which has a host of business benefits such as reducing inventory levels and lead times. In order to achieve this, SMED implements countermeasures that break down and separate processes into the internal activities that can be completed during machine stoppage (e.g., unloading, routing drawings), and external activities that are performed away from the machine (e.g., die preparation).
  11. Swim lane Maps: allowing for Value Added (VA) and Non-Value Added (NVA) analysis. Swim lanes can show queues and inventory, and can be divided into task areas.
  12. Value Stream Mapping (VSM): helping to understand the current situation from a bigger perspective, showing the ratio of non-value added to value added time, exposing sources of waste, and demonstrating linkages between the various flows.
  13. 8 Types of Waste Elimination: acronym “DOWNTIME” (Defects, Overproduction, Waiting, Non-Utilized Talent, Transportation, Inventory, Motion, and Extra Processing).
  14. Fishbone Analysis: a graphical tool that helps to identify all possible causes for a particular effect. It establishes the relationship between an effect and its influencing causes under the 6M's (Man, Machine, Method, Material, Measurement, and Mother Nature).
  15. Failure Mode Effect Analysis (FMEA) is a process used to identify potential causes of failure and their associated effects. Building on the causes identified through Fishbone Analysis, each cause is analysed to determine root causes and prevention actions. The Risk Priority Number (RPN), the product of severity, occurrence, and detection ratings, is used together with an Effort vs Impact Matrix to prioritize the highest-RPN causes.
  16. The 5S methodology, originating from the Toyota Production System (TPS), is a five-step process for creating a more organized and productive workspace. The steps are Sort, Straighten, Shine, Standardize, and Sustain. 5S serves as a foundation for deploying more advanced Lean Six Sigma tools and processes.
  17. Kaizen is a Japanese methodology that encourages employees to make small changes in the organization to deliver big results.
  18. Control Plan is a written summary (Revised SOP) of the process that outlines in detail the steps to be taken to maintain a process or device at its current level of performance. It is an essential tool for ensuring that processes remain consistent and efficient.
  19. Histograms are a powerful tool used to analyze the shape and spread of sample data. By dividing the sample values into intervals called bins, the histogram provides a visual representation of the frequency of observations within each bin.
  20. Graphical summaries, such as histograms, boxplots, and mean and median values, provide a comprehensive overview of the data.
  21. Normality tests assess whether data follow a Gaussian (normal) distribution, a bell-shaped curve that is symmetric about its mean. Population normality is typically assessed with a normal probability plot.
  22. Run charts, or line graphs, are used to display process performance over time, allowing for the identification of trends, cycles, and large aberrations.
  23. Control charts, also known as process behavior charts, are used to capture both common and special causes.
  24. Pareto charts are used to determine which of the defects are the most significant, following the 80-20 rule.
  25. Box plots, or box-and-whisker plots, are used to assess and compare sample distributions.
  26. Process capability for continuous data captures the voice of the customer and the voice of the process, while process capability for discrete data is calculated by first finding defects per opportunities (DPO) and then converting it to DPMO by multiplying by a million.
  27. Hypothesis testing (2 sample t test) is used to determine whether two population means are significantly different, while analysis of variance (ANOVA) is used to determine whether the means of two or more groups differ.
  28. Scatter plots are used to visualize the association between two variables and to suggest whether a statistically significant relationship may exist.
  29. Matrix Plots provide a comprehensive overview of the relationships between multiple variables by creating an array of scatter plots. This type of visualization is especially useful when there are multiple variables influencing a single output, or when you need to understand the relationships between variables.
  30. Regression analysis is a powerful tool for uncovering the statistical relationship between one or more predictor variables and a response variable. It can be used to identify trends, make predictions, and draw conclusions.
  31. Design of Experiments (DoE) is a powerful technique that combines multiple experiments to gain insight into the interactions between variables. It is commonly used to optimize products and processes, and to identify the most effective combination of factors.
  32. Variance component analysis is a powerful statistical method used to identify sources of variation in complex datasets. It is oftentimes used in the field of genetics, where multiple factors are at play and there may be potentially hundreds of sources of variance.
  33. Chi Square Test is a statistical tool commonly used to analyse, compare, and assess categorical data. It measures the differences between expected results and empirical findings, determining the significance of each result in terms of likelihood.

As a non-parametric test, it draws conclusions based solely on the sample provided rather than on assumptions about larger populations or norms. Its versatility in measuring independence among categorical variables, and its straightforward interpretation of results, make it a widely used tool.
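Several of the tools above reduce to simple calculations. As a minimal sketch of the OEE measure from tool 1 (all shift and part figures below are hypothetical), OEE is the product of availability, performance, and quality:

```python
# Illustrative OEE calculation: OEE = Availability x Performance x Quality.
# All input figures are hypothetical examples.

def oee(planned_time_min, run_time_min, ideal_cycle_time_min, total_count, good_count):
    """Return (availability, performance, quality, oee) as fractions."""
    availability = run_time_min / planned_time_min
    performance = (ideal_cycle_time_min * total_count) / run_time_min
    quality = good_count / total_count
    return availability, performance, quality, availability * performance * quality

a, p, q, score = oee(planned_time_min=480,      # one 8-hour shift
                     run_time_min=420,          # 60 min of downtime
                     ideal_cycle_time_min=0.5,  # 30 s per part at rated speed
                     total_count=700,
                     good_count=665)
print(f"Availability {a:.1%}, Performance {p:.1%}, Quality {q:.1%}, OEE {score:.1%}")
```

With these figures the three factors come out to 87.5%, 83.3%, and 95.0%, giving an OEE of roughly 69%, well below the commonly cited world-class benchmark of 85%.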
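The takt time calculation from tool 3 follows directly from its definition: net available production time divided by customer demand. The shift, break, and demand numbers here are invented for illustration:

```python
# Takt time = net available production time / customer demand.
# Hypothetical figures: one 8-hour shift with two 15-minute breaks,
# and customer demand of 450 units per day.

def takt_time_seconds(shift_minutes, break_minutes, daily_demand):
    available_seconds = (shift_minutes - break_minutes) * 60
    return available_seconds / daily_demand

takt = takt_time_seconds(shift_minutes=480, break_minutes=30, daily_demand=450)
print(f"Takt time: {takt:.0f} seconds per unit")
```

A takt time of 60 seconds per unit means every operation must complete its work on each unit within one minute, on average, to keep pace with demand.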
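The discrete process capability measure from tool 26 can be sketched the same way: defects per opportunity (DPO) scaled to defects per million opportunities (DPMO). The defect counts are hypothetical:

```python
# DPMO = (defects / (units * opportunities per unit)) * 1,000,000.
# Hypothetical inspection data: 34 defects found across 1,000 units,
# each unit having 5 opportunities for a defect.

def dpmo(defects, units, opportunities_per_unit):
    dpo = defects / (units * opportunities_per_unit)  # defects per opportunity
    return dpo * 1_000_000

result = dpmo(defects=34, units=1000, opportunities_per_unit=5)
print(f"DPMO: {result:.0f}")
```

The resulting DPMO can then be looked up in a standard sigma-level conversion table to express the process capability as a sigma score.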
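The chi-square test of independence from tool 33 can be computed by hand from a contingency table: compare each observed count with the count expected under independence. The shift/defect data below are invented, and 3.841 is the standard critical value for one degree of freedom at a 5% significance level:

```python
# Chi-square test of independence on a hypothetical 2x2 contingency table:
# rows = shift (day, night), columns = part status (good, defective).
observed = [[90, 10],
            [75, 25]]

row_totals = [sum(row) for row in observed]
col_totals = [sum(col) for col in zip(*observed)]
grand_total = sum(row_totals)

chi2 = 0.0
for i, row in enumerate(observed):
    for j, obs in enumerate(row):
        expected = row_totals[i] * col_totals[j] / grand_total
        chi2 += (obs - expected) ** 2 / expected

# Critical value for df = 1 at alpha = 0.05 is 3.841.
print(f"chi-square = {chi2:.2f}; shift and quality dependent: {chi2 > 3.841}")
```

Libraries such as SciPy provide the same test (with p-values) via `scipy.stats.chi2_contingency`, but the arithmetic above shows everything the test actually does.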
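Finally, the Pareto principle from tool 24 amounts to sorting defect categories by count and finding the "vital few" that account for roughly 80% of the total. The defect categories and counts below are hypothetical:

```python
# Pareto analysis: sort defect categories by count (descending) and collect
# the smallest set of categories covering at least 80% of all defects.
# Category names and counts are hypothetical.
defects = {"scratch": 120, "dent": 45, "misalignment": 30,
           "discoloration": 15, "crack": 10}

total = sum(defects.values())
cumulative = 0
vital_few = []
for category, count in sorted(defects.items(), key=lambda kv: kv[1], reverse=True):
    cumulative += count
    vital_few.append(category)
    if cumulative / total >= 0.8:
        break

print(f"Vital few ({cumulative / total:.0%} of defects): {vital_few}")
```

In a Pareto chart these categories would be drawn as the leftmost bars, with a cumulative-percentage line crossing the 80% mark above the last of them.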
