Concept of Quality — We Must Understand this before Learning 6sigma!

picture61

Before we try to understand the 6sigma concept, we need to define the term “quality”.

What is Quality?

The term "quality" has many interpretations, but by the ISO definition, quality is: "The totality of features and characteristics of a product or service that bear on its ability to satisfy stated or implied needs."

If we read between the lines, the definition varies with the reference frame we use to define "quality". The two reference frames we are using here are the manufacturer (who supplies the product) and the customer (who uses the product). Hence, the definition of quality with respect to these two reference frames can be stated as

picture2

This "goal post" approach to quality is presented graphically below, where a product is simply deemed pass or fail. It does not matter even if the quality is on the borderline (the football almost missed the goalpost, but luckily a goal was scored).

picture3

This definition was applicable as long as manufacturers enjoyed a monopoly or faced only limited competition in the market. Manufacturers were not worried about failures, as they could easily pass the cost on to the customer. Having no choice, the customer had to bear the cost. This follows from the traditional definition of profit shown below.

picture15

Coming to the current business scenario, manufacturers no longer have the luxury of defining the selling price. The market is now very competitive and the prices of goods and services are dictated by the market; hence it is called the market price instead of the selling price. This led to a change in the perception of quality: quality was now defined as producing goods and services meeting the customer's specification at the right price. Manufacturers are now forced to sell their goods and services at the market rate. As a result, profit is now defined as the difference between the market price and the cost of goods sold (COGS).
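Put as simple equations (the second of which reappears later in this post): under cost-plus pricing, Selling Price = COGS + Profit, with the manufacturer setting the price; under market pricing, Profit = Market Price - COGS, with the market setting the price.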

In the current scenario, if a manufacturer wants to make a profit, the only option is to reduce COGS. In order to do so, one has to understand the components that make up COGS, shown below. COGS consists of the genuine COGS and the cost of quality. The genuine COGS will be nearly the same for all manufacturers; the real differentiator is the cost of quality. The manufacturer with the lowest cost of quality enjoys the highest profit and can influence the market price to keep the competition at bay. But in order to keep the cost of quality at its lowest possible level, the manufacturer has to hit the football right at the center of the goalpost every time!

picture13

The cost of quality comprises the cost incurred to monitor and ensure quality (the cost of conformance) and the cost of non-conformance, also called the cost of poor quality (COPQ). The cost of conformance is a necessary evil, whereas the COPQ is pure waste, an opportunity lost.

picture14

Coming to the present scenario: with the increasing demand for goods and services, manufacturers are required to fulfil their delivery commitments on time, otherwise their customers would lose market share to competitors. Manufacturers have realized that their business depends on the business prospects of their customers; hence, timely supply of products and services is very important. This can be understood much better using the pharmaceutical industry as an example.

The responsibility of any regulator (say, the FDA) towards its country is to ensure not only acceptable (quality, safety and efficacy) and affordable medicines, but also their availability (no shortage) in the country at all times. Even that is not enough; those medicines must be easily accessible to patients at their local pharmacies. These may be called the 4A's (acceptable, affordable, available, accessible) and are the KRAs of any regulatory body. If the regulator misses any one of these 4A's, it will be held accountable by its government for endangering the lives of patients. The point to be emphasized here is the importance of TIMELY SUPPLY of medicines, besides other parameters like quality and price.

Hence, the definition of quality was again modified to "producing goods and services in the desired quantity, delivered on time, meeting all the customer's specifications of quality and price." The operational-excellence term OTIF is an acronym for "on time, in full", meaning delivering goods and services meeting the customer's specification on time and in full quantity.

Coming once again to the definition of profit in the present-day scenario:

Profit = Market Price (MP) - COGS

We have seen that the selling price is driven by the market, so the manufacturer can't control it beyond an extent. So what can he do to increase his margin or profit? The only option he has is to reduce his COGS. We have seen that COGS has two components: the genuine COGS and the COPQ. Manufacturers have little scope to reduce the genuine COGS, as it is a necessary cost of producing goods and services. We will see later, in LEAN manufacturing, how even this genuine COGS can be reduced to some extent (wait till then!). For example, if the throughput or the yield of the process is improved, there is less scrap, and the raw-material cost per unit of goods produced comes down.
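As an illustration (with made-up numbers): if raw material worth 100 yields 80 units of product, the raw-material cost is 100/80 = 1.25 per unit; improve the yield to 90 units and it drops to 100/90 ≈ 1.11 per unit.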

But the real culprit for the high COGS is the unwarranted high COPQ.


The main reasons for high COPQ are

  1. Low throughput or yield
  2. More out-of-specification (OOS) products, which have to be either
    1. reprocessed,
    2. reworked, or
    3. scrapped
  3. Inconsistent quality, leading to higher after-sales service and warranty costs
  4. Biggest of all losses: the customer's confidence in you, which is intangible.

If we look at the outcomes of COPQ discussed above, we can conclude one thing: "the process is not robust enough to meet the customer's specifications", and because of this, manufacturers face the problem of COPQ. All these wastages are called "muda" in Lean terminology and will be dealt with in detail later. But the important question remains:

What causes COPQ?

Before we can answer this important question, we need to understand the concept of variance. Let's take a simple example: say you leave home for the office at exactly the same time every day. Do you reach the office at exactly the same time every day? The answer will be a big no, or a better answer would be "it takes anywhere between 40 and 45 minutes to reach the office if I start exactly at 7:30 AM". This variation in arrival time can be attributed to many causes, like variation in the starting time itself (I just can't start at exactly 7:30 every day), variation in traffic conditions, etc. There will always be variation in any process, and we need to control that variation. Even in a manufacturing environment there are sources of variation, like wear and tear of machines, change of operators, etc. Because of this, there will always be variation in the output (the goods and services produced by the process). Hence, we will not get a product with a fixed quality attribute; the quality attribute will have a range (called the process control limits) which needs to be compared with the customer's specification limits (the goal post).
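To make this concrete, here is a minimal sketch in Python (the commute times are simulated, standing in for real observations) that estimates the process mean, its standard deviation, and the resulting natural process limits:

```python
import numpy as np

rng = np.random.default_rng(7)
# 30 simulated office-arrival times in minutes (illustrative data)
commute = rng.normal(loc=42.5, scale=1.5, size=30)

mean = commute.mean()
sigma = commute.std(ddof=1)          # sample standard deviation

# Natural process limits (mean +/- 3 sigma): the "process control limits"
lcl, ucl = mean - 3 * sigma, mean + 3 * sigma
print(f"mean = {mean:.1f} min, sigma = {sigma:.2f} min")
print(f"process control limits: {lcl:.1f} to {ucl:.1f} min")
```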

If my process control limits lie close to the goal post (the boundaries of the customer's specification limits), then my failure rate will be quite high, resulting in more failures, scrap, rework and warranty cost. This is nothing but COPQ.

Alternatively, if my aim (the process limits) is well within the goal posts (case 2), my success rate is much higher and I will have less scrap and rework, thereby decreasing my COPQ.

picture10

picture11

Taguchi Loss Function

A paradigm shift in the definition of quality was given by Taguchi, who introduced the concept of producing products with quality targeted at the center of the customer's specifications (a mutually agreed target). He stated that as we move away from the center of the specification, we incur cost either at the producer's end or at the consumer's end in the form of rework and reprocessing; holistically, it is a loss to society. The concept also states that producing goods and services beyond the customer's specification is a loss to society, as the customer will not be willing to pay for it: there is a sharp increase in COGS as we try to improve the quality of goods and services beyond the specification.

Picture23

For example:

If the purity specification of the medicine I am producing is > 99.5% and I try to improve it to 99.8%, the throughput decreases, because one extra purification is needed, resulting in yield loss and increased COGS.

When buying a ready-made suit, it is very difficult to find one that perfectly matches your body's contours, so you end up paying for alterations. Whereas if you get a suit stitched by a tailor to fit your body's contours (the specification), there is no extra cost of rework.
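The classical form of Taguchi's loss function is quadratic: L(y) = k(y - T)^2, where T is the mutually agreed target and k is a cost constant. A minimal sketch (the purity numbers are illustrative, not from any real process):

```python
# Quadratic Taguchi loss: cost grows with the square of the distance from target.
def taguchi_loss(y: float, target: float, k: float = 1.0) -> float:
    """Loss to society when the quality characteristic y deviates from target."""
    return k * (y - target) ** 2

target = 99.75                      # illustrative mid-point of a purity spec
for purity in (99.75, 99.6, 99.5):  # on target, drifting, at the spec edge
    print(purity, taguchi_loss(purity, target, k=100.0))
```

Note that the loss is zero only at the target and is already positive at the specification edge, which is exactly the break from the pass/fail "goal post" view.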

Six Sigma and COPQ

It is apparent from the above discussion that variability in the process is the single biggest culprit behind failures, resulting in a high cost of goods produced. This variability is the single most important concept in six sigma and must be comprehended very well. We will encounter this monster (variability) everywhere while dealing with six sigma tools like the histogram, the normal distribution, the sampling distribution of the mean, ANOVA, DoE, regression analysis and, most importantly, statistical process control (SPC).

Hence, industry required a tool to study variability and to find ways to reduce it. The six sigma methodology was developed to fulfil this requirement. We will look in detail at why it is called six sigma, and not five or seven sigma, later on.

Before we go any further, we must understand and always remember one very important thing: "any goods and services produced are the outcome of a process", and "there are many inputs that go into a process, like raw materials, technical procedures, men, etc."

Hence, any variation in the inputs (x) to a given process will cause a variation in the quality of the output (y).

Picture23

Another important aspect is that variance has an additive property, i.e. the variances from all the inputs add up to give the variance of the output.

Picture32
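A minimal simulation (with made-up input variances) to verify the additive property, Var(y) = Var(x1) + Var(x2) + Var(x3) for independent inputs:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
# Three independent input sources of variation (illustrative sigmas)
x1 = rng.normal(0, 1.0, n)   # e.g., raw material: variance 1.00
x2 = rng.normal(0, 0.5, n)   # e.g., operator:     variance 0.25
x3 = rng.normal(0, 2.0, n)   # e.g., machine:      variance 4.00

y = x1 + x2 + x3             # output = combined effect of the inputs
print(y.var())               # ~5.25 = 1.00 + 0.25 + 4.00
```

The practical consequence: attacking the largest input variance (here the machine term, 4.00 out of 5.25) gives the biggest reduction in output variance, which is how six sigma projects are prioritized.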

How Does Six Sigma Work?

Six sigma works by decreasing the variation coming from the different sources, thereby reducing the overall variance in the system, as shown below. It is a continuous improvement journey.

picture12

Summary:
  1. The definition of quality has changed drastically over time; it is no longer just "fit for purpose" but also includes on time and in full (OTIF).
  2. In this world of globalization, the market determines the selling price; manufacturers either have to reduce their COPQ or perish.
  3. There is a customer's specification and a process capability. The aim is to bring the process capability well within the customer's specification.
  4. The main culprit behind out-of-specification product is an unstable process, which in turn is caused by process variability coming from different sources.
  5. Variance has an additive property.
  6. Lean is a tool to eliminate waste in the system; six sigma is a tool to reduce defects in the process.

References

  1. To understand the consequences of a bad process, see the red bead experiment designed by Deming on YouTube: https://www.youtube.com/watch?v=JeWTD-0BRS4
  2. For different definitions of quality, see http://www.qualitydigest.com/magazine/2001/nov/article/definition-quality.html#

 

7QC Tools: Basis of Western Electric Rules of Control Charts


We are all aware of these famous rules; for beginners, let's understand their basis. Each rule is applied to one half of the control chart, and each is designed so that the probability of it being triggered by a stable, in-control process is very small (of the order of 0.001 to 0.005, as computed below).

picture106

picture108

  1. A single point outside the 3σ control limits, i.e. beyond zone A.
    • Probability of finding a point in this region = 0.00135 for a normal, in-control process. Any point in this region indicates an assignable cause.
  2. Two out of three consecutive points in zone A or beyond (beyond 2σ).
    • Probability of getting 2 consecutive points beyond 2σ = 0.0227 × 0.0227 = 0.00052
    • Probability of 2 out of 3 points beyond 2σ = 3 × 0.0227 × 0.0227 × 0.9773 = 0.0015
  3. Four out of five consecutive points in zone B or beyond (beyond 1σ).
    • Probability of getting one point beyond 1σ = 0.1587
    • Probability of 4 points beyond 1σ and 1 point elsewhere = 5 × 0.1587 × 0.1587 × 0.1587 × 0.1587 × 0.8413 = 0.0027
  4. Eight consecutive points on one side of the central line.
    • Probability of one point falling on a given side of the central line = 0.5
    • Probability of 8 points in succession on one side of the central line = 0.5^8 = 0.0039
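These probabilities follow directly from the normal tail areas; a short sketch (assuming scipy is available) that reproduces the numbers above:

```python
from math import comb
from scipy.stats import norm

p3 = norm.sf(3)   # area beyond 3 sigma, one side: ~0.00135
p2 = norm.sf(2)   # area beyond 2 sigma, one side: ~0.0227
p1 = norm.sf(1)   # area beyond 1 sigma, one side: ~0.1587

print(p3)                             # rule 1: ~0.00135
print(comb(3, 2) * p2**2 * (1 - p2))  # rule 2: ~0.0015
print(comb(5, 4) * p1**4 * (1 - p1))  # rule 3: ~0.0027
print(0.5**8)                         # rule 4: ~0.0039
```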

7QC Tools: Case Study on Interpreting the Control Charts


A process was running in a chemical plant. The final stage of the process was crystallization, which gave the pure product. There were two crystallizers used for the purpose, each operated by a different individual. The SOP says that the crystallizer has to be maintained between 30 and 40 °C for 110 to 140 minutes. The data for a month is captured below.

picture109

In order to understand the process, an I-MR control chart was plotted (for simplicity, the moving-range chart is not shown).

picture110

As we learned in the earlier post, points alternating above and below the central line represent some sort of stratification (see the short connecting arms and the concentration of data points in zones B and C).

We plotted the histogram of the above data set and kept increasing the number of classes. What we saw was the emergence of a bimodal distribution as the number of classes grew.

picture112
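A minimal sketch of the same diagnostic (with simulated yields standing in for the plant data): as the number of histogram classes increases, two clusters emerge in the bin counts.

```python
import numpy as np

rng = np.random.default_rng(1)
# Two hidden processes, e.g. the two crystallizers (illustrative yields)
data = np.concatenate([rng.normal(78, 1.0, 15),   # crystallizer-1
                       rng.normal(83, 1.0, 15)])  # crystallizer-2

for bins in (4, 8, 16):
    counts, _ = np.histogram(data, bins=bins)
    print(bins, counts)   # with more classes, two separate modes appear
```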

So one thing was sure: there were two processes running in the plant. The question now to be answered was: what was causing this stratification?

We started with the crystallizers. As soon as we plotted a simple run chart of the process with groups using Minitab®, we could see the difference: crystallizer-2 was always giving a better yield. This should not happen, because the two crystallizers were identical and connected to the same utilities. We then suspected that the different operators might be the reason for this behavior, as this was the only factor that differed between the two crystallizers.

picture114

We then plotted the same run chart with grouping, but this time the operator was used for grouping. We got the same result as with the crystallizers: operator-2, working on crystallizer-2, was producing a greater quantity of the product. This run chart is not shown here.

We then drilled down into the operating procedures adopted by the two operators. We studied the temperature and the maintenance time using scatter plots. The results are shown below.

picture115

Finally, it was found that operator-2 was maintaining crystallizer-2 at the lower end of the prescribed temperature range and for a longer duration. Hence, the specifications for temperature and maintenance time were revised.

7QC Tools: Interpretation of Control Charts Made Easy


picture106

Visual Inspection of the Control Charts for Unnatural Patterns

Besides the famous rules above, there are patterns on control charts that need to be understood by every quality professional. Let's understand these patterns using the following examples. It is easier to interpret them if we can imagine the distribution of the data displayed on the control chart.

Case-1: Non-overlapping distribution

As a production in-charge, I am using two different grades of raw material with different quality attributes (non-overlapping, each near one edge of the specification limits), and I am assuming that the quality attributes of the final product will be normally distributed, i.e. that most of the final product will hit the center of the process control limits.

If the quality of the raw material strongly determines the quality of the final product, then my assumption about the output is wrong, because the distribution of the final product's quality will take a bimodal shape, with only a few data points at the junction of the two distributions. The same information is reflected on the control chart as a high concentration of data points near the control limits and few or no points near the center. Here is the control chart of the final product.

picture96

In a completely non-overlapping mixture like this, there are unusually long connecting arms on the control chart and an absence of points near the central line.

If we plot the histogram of this data set and keep increasing the number of classes, the two distributions separate.

picture97

So, whenever we see a control chart with the data points concentrated towards the control limits and no points at the center, we should immediately suspect a mixture of two non-overlapping distributions. Remember: long connecting arms and few data points at the center of the control chart.

Case-2: Partially overlapping distribution

Assume this scenario: a product is being produced in my facility in two shifts by two different operators. Each day I have two batches, one in each shift. There is a well-written batch manufacturing record indicating that the temperature of the reactor should be between 50 and 60 °C. A quality attribute of the product is represented by the following control chart.

picture98

picture99

We can see that the data points on the control chart alternate around the central line: the first batch (from the 1st shift) is below the central line and the next batch (from the 2nd shift) is above it. This control chart shows that even though we are following the same manufacturing process, there is a slight difference in the process. It was found that the 1st shift in-charge was operating towards 50 °C and the 2nd shift in-charge towards 60 °C. This type of alternating arrangement is an indication of stratification (due to operators, machines, etc.) and is characterized by short connecting arms.

Then there are cases of partially overlapping distributions resulting in a bimodal distribution, which means there will be a few points in the central region of the control chart, but the majority of the data points will be distributed in zones C and B. In such cases it is appropriate to plot the histogram with groups (like operator, shift, etc.).

Case-3: Significant Overlapping distribution

If there is significant overlap between the two input distributions, then it is difficult to differentiate them in the final product, and the combined distribution gives the picture of a single normal distribution. Suppose the operators in case 2 above were performing the activity at 55 °C and 60 °C respectively; this would result in an overlapping distribution, as shown below.

picture100

Case-4: Mixture of unequal proportion

As a shift in-charge, I am running short of my production target. What I did to meet the target was to mix the current batch with some material produced earlier for another customer to a slightly different specification. I hoped it wouldn't be caught by QA! The control chart of the final process looked like this:

picture101

We can see from the control chart that if two distributions are mixed in unequal proportions, the combined distribution is unsymmetrical. In this case, one half of the control chart (here the lower half) holds most of the data points while the other half has far fewer.

Case-5: Cyclic trends

If one observes a repeating pattern on the control chart, then there is a cyclic effect, like sales per month of the year: sales in some specific months are higher than in others.

picture104

Case-6: Gradual shift in the trend

A sustained change in the process is indicated by a change in the location of the data points on the control chart. This chart is most commonly encountered during continuous improvement programs, when we compare the process performance before and after the improvement.

picture105

If this shift on the control chart is gradual, then there must be a reason for it, like wear and tear of a machine, a problem with the calibration of the gauges, etc.

Case-7: Trend

If one observes that the data points on the control chart are gradually moving up or down, it is a case of trend. This is usually caused by a gradual shift in the operating conditions due to wear and tear of machines, gauges going out of calibration, etc.

picture103

Summary of unnatural patterns on the control charts

  1. Large shift (strays, freaks): a sudden, large change. Symptom: points near or beyond the control limits.
  2. Smaller sustained shift: a sustained, smaller change. Symptom: a series of points on the same side of the central line.
  3. Trend: a continuous change in one direction. Symptom: a steadily increasing or decreasing run of points.
  4. Stratification: small differences between values over a long run, with an absence of points near the control limits. Symptom: a long run of points near the central line, on both of its sides.
  5. Mixture: a saw-tooth effect, with an absence of points near the central line. Symptom: a run of consecutive points on both sides of the central line, all far from it.
  6. Systematic variation (stratification): regular alternation of high and low values. Symptom: a long run of consecutive points alternating up and down.
  7. Cycle: recurring periodic movement. Symptom: cyclic recurring patterns of points.

For the case study, see the next blog post.

7QC Tools: My Bitter Experience with Statistical Process Control (SPC)!


I just want to share my experience with SPC.

In general, I have seen that people plot the control chart of the final critical quality attribute of a product (a CQA). But the information displayed by these control charts is historical in nature, i.e. the entire process has already taken place. Hence, even if the control chart shows an out-of-control point, I can't do anything about it except reprocessing and rework. We often forget that these CQAs are driven by critical process parameters (CPPs), and I can't go back in time to correct those CPPs. The only thing we can do is start an investigation.

picture21

HENCE PLOTTING CONTROL CHARTS IS LIKE DOING A POSTMORTEM OF A DEAD (FAILED) BATCH.

Instead, if we plot the control charts of the CPPs, and one of those charts shows an out-of-control point, WE CAN FORECAST THAT THE BATCH IS GOING TO FAIL, or WE CAN TAKE CORRECTIVE ACTION THEN AND THERE. This is because the CPPs and the CQA are highly correlated; if a CPP shows an out-of-control point on its control chart, we have an early warning that the batch is likely to fail.

picture92

Hence, the control charts of the CPPs help us forecast the output quality (the CQA) of the batch, because a CPP will fail before the batch fails. This also saves the time that goes into investigations, which is very important for the pharmaceutical industry: everyone in the industry knows how much time and resource goes into an investigation!

picture95

I feel we need to plot the control charts of the CPPs alongside the control chart of the CQA, with more focus on the CPP charts. This helps us take timely corrective action (if available), or scrap the batch and save downstream time and resources (if no corrective action is available).

Another advantage of charting the CPPs is spotting early evidence that a CPP is trending and will cross a control limit in the near future, as shown below; this warrants timely corrective action on the process or the machine.

picture93
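A minimal sketch of such a trend check (the CPP series and the run-length rule are illustrative; production systems typically use the full Western Electric trend tests):

```python
import numpy as np

def rising_run(x, run_length=6):
    """True if x contains `run_length` consecutive strictly rising points."""
    run = 1
    for a, b in zip(x, x[1:]):
        run = run + 1 if b > a else 1
        if run >= run_length:
            return True
    return False

# Hypothetical reactor-temperature CPP drifting upward batch after batch
cpp = np.array([54.9, 55.1, 55.0, 55.3, 55.6, 55.8, 56.1, 56.5, 56.8])
print(rising_run(cpp))   # True -> act now, before the CQA of a batch fails
```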


CQA: Critical Quality Attribute

CPP: Critical Process Parameter

OOS: out of specification


7QC Tools — The Control Charts

picture61

The Control Charts

This is the most important topic among the 7QC tools. In order to understand it, just remember the following points for the moment, as we can't go into all the details right now.

  1. Two things that we must understand beyond doubt are
    1. There is the customer's specification, the LSL & USL (lower and upper specification limits)
    2. Similarly, there is the process capability, the LCL & UCL (lower and upper control limits)
    3. The process capability and the customer's specifications are two independent things; however, it is desired that UCL − LCL < USL − LSL. The only way we can achieve this relationship is by decreasing the variation in the process, as we can't do anything about the customer's specifications (they are sacrosanct). A minimal capability sketch follows this list.
    4. Picture13
  2. If a process is stable, it will follow the bell-shaped curve called the normal curve. It means that if we plot all the historical data obtained from a stable process, it will give a symmetrical curve as shown below. σ represents the standard deviation (a measure of variation).
    • picture88
  3. The main characteristics of the above curve are shown below. For example, the area under ±2σ contains 95% of the total data.
    • picture19
  4. Any process is affected by two types of input variables or factors. Variation from input variables that can be identified and controlled is attributed to assignable or special causes (e.g., person, material, unit operation, machine), while variation from factors that are uncontrollable is attributed to noise factors or common causes (e.g., fluctuations in environmental factors such as temperature and humidity over the year).
  5. From point number 2 we can conclude that, as long as the data is within ±3σ, the process is considered stable and whatever variation is there is because of common causes of variation. Any data point beyond ±3σ is an outlier, indicating that the process has deviated, i.e. there is an assignable or special cause of variation which needs immediate attention.
    • picture89
  6. The measures of the mean (μ) and σ used for calculating the control limits depend on the type and the distribution of the data used for preparing the control chart.
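The relationship in point 1.3 is usually quantified by the process capability index Cp = (USL − LSL)/(6σ); Cp > 1 means the ±3σ control band fits inside the specification. A minimal sketch with illustrative numbers:

```python
def cp(usl: float, lsl: float, sigma: float) -> float:
    """Process capability index: spec width over the 6-sigma process width."""
    return (usl - lsl) / (6 * sigma)

usl, lsl = 60.0, 50.0          # customer's specification (sacrosanct)
print(cp(usl, lsl, sigma=2.0)) # 0.83 -> control band wider than the spec
print(cp(usl, lsl, sigma=1.0)) # 1.67 -> process well inside the spec
```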

Having gone through the above points, let's go back to point number 2. In that graph, the entire data set is plotted after all the data has been collected. But these data were collected over time! Now, if we add a time axis to this graph and plot the data against time, we get a run chart, as shown below.

picture90

The run chart thus obtained is known as the control chart: it represents the data with respect to time, with ±3σ as the upper and lower control limits of the process. We can also plot the customer's specification limits (USL & LSL) on this graph if desired. Now we can apply points 3 and 4 above to interpret the control chart, or use the Western Electric rules if we want to interpret it in more detail.
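A minimal sketch (on simulated data) of turning a time-ordered series into an individuals control chart: compute the centre line and the ±3σ limits, then flag any point outside them.

```python
import numpy as np

rng = np.random.default_rng(42)
x = rng.normal(100, 2, 50)   # data from a stable process, in time order
x[30] = 110.0                # inject one assignable-cause outlier

centre = x.mean()
sigma = x.std(ddof=1)
ucl, lcl = centre + 3 * sigma, centre - 3 * sigma

for t, value in enumerate(x):
    if not lcl <= value <= ucl:
        print(f"point {t}: {value:.1f} outside ({lcl:.1f}, {ucl:.1f})")
```

A production I-chart would normally estimate σ from the average moving range (MR-bar/1.128) rather than the overall standard deviation, since that estimate is less distorted by the very outliers the chart is meant to catch.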

The Control Charts and the Continuous Improvement

A given process can only be improved if tools are available for the timely detection of an abnormality due to an assignable cause. This timely, online signal of an abnormality (an outlier) in the process is achieved by plotting the process data points on an appropriate statistical control chart. But a control chart can only tell us that there is a problem in the process; it cannot tell us its cause. Investigating and identifying the assignable cause associated with the abnormal signal allows timely corrective and preventive actions, which ultimately reduce the variability in the process and gradually take the process to the next level of improvement. This is an iterative process of continuous improvement, continued until abnormalities are no longer observed in the process and whatever variation remains is due to common causes only.

It is not necessarily true that all deviations on a control chart are bad (e.g. an impurity trending towards the LCL, or reduced patient waiting time, are good for the process). Regardless of whether a deviation is good or bad for the process, the outlier points must be investigated. Reasons for good deviations should then be incorporated into the process, and reasons for bad deviations eliminated from it. This iteration continues until the process comes under statistical control. Gradually, it will be observed that the natural control limits become much tighter than the customer's specification, which is the ultimate aim of any process improvement program like 6sigma.

The significance of control charts is evident from the fact that since their invention in the 1920s by Walter A. Shewhart, they have been used extensively across the manufacturing industry and have become an intrinsic part of the 6σ process.

picture12

To conclude, statistical control charts not only help in estimating the process control limits but also raise an alert when the process goes out of control. These alerts trigger investigation through root cause analysis, leading to process improvements, which decrease the variability in the process, leading in turn to a statistically controlled process.


Why Do Continuous Improvement Programs like Lean & 6Sigma Fail?


Most of the time, continuous improvement programs in an organization gradually cease to exist after the consultants leave. This really disappoints me, because they fail despite the fact that everyone in the organization knows their benefits. The importance of these initiatives is well known across industry, as vetted by the number of vacancies for lean and 6sigma professionals on any job portal (check LinkedIn and the other job portals).


The main reasons that I have experienced are the following:

  1. In order to drive a lean or a 6sigma program, you either need to be an external consultant or you need to hold a position of authority within the organization (this ensures that you get the job done). The main purpose is to have the backing of higher management.
      1. An external consultant is in direct touch with management; hence, people cooperate.
      2. A higher position ensures that your message percolates down the line.
      3. If you are in middle management, it is going to be difficult to implement these changes even with the backing of higher management (unless they are fully involved).

     

  2. The above scenario can be understood by drawing an analogy with the stretching of a spring: as long as the consultants are there, the spring (the employees) remains stretched, and as soon as they leave, the spring returns to its original position. Hence, these initiatives should focus on changing the mind-set of the employees and securing their buy-in before the start of any initiative. The focus of these initiatives should be cultural change rather than short-term financial gain.

    "The quality of an organization can never exceed the quality of the minds that make it up." (Harold McAlindon)

    It took Toyota 30 years to implement what is now called TPS!

  3. Usually these initiatives are not part of the business strategy but are initiated during a crisis; once the crisis is over and the consultants leave, it's over! The spring regains its original state!
  4. Another reason is the lack of trained manpower in the areas of lean and 6sigma. I remember when we were searching for a 6sigma black belt, the HR team gave us a list of ~65 candidates claiming 6sigma/lean expertise. Believe me, we could find only two persons (the requirement was ~10-15) out of 65 with the required skill set.
    1. Out of curiosity, we kept asking people where they had got their certification. Most answered that they had undergone 3-5 days of classroom training followed by an examination to get their black belt! That's true in most cases, but I wonder how a five-day course can qualify a person as a black belt unless you really sweat on the shop floor with your team.
    2. There is also a lack of trained people within the organization who can really interview such candidates. Imagine that I want a black belt for my company to drive the initiative: either I have to believe that a candidate knows the concepts, or I have to hire someone who can really interview these people. The latter option is much better! These days QbD has become a buzzword in the pharmaceutical industry; just include it in your CV and you will get an immediate raise.
  5. But the main reason I experienced was the compartmentalized view of an organization, where the right hand doesn't know what the left hand is doing.

picture9

Let's assume that the whole company is excited about the initiative; even then it fails! The major reason is the presence of many compartments/departments within the system that are habituated to working in silos! They remain committed to their own KRAs and work-flow, and don't know much about the processes of the department from which they receive their inputs, or how their own processes affect the processes of the next department (their internal customer). These silos are becoming the vertical coffins of the organization. Before we go any further, let's understand: what is the business? How is business carried out to generate revenue?

The central planning team, based on the monthly forecast, gives targets to all the vertical coffins for that month. All the vertical coffins then perform their duties in silos to complete their targets.

picture6

Now, if we really look at the business, it is not the departments that make the product and generate the revenue; rather, it is the culmination of a process flow encompassing the entire organization. To make this clear, let's look at the following example.

 picture1

It is the flow of the process across the departments that adds value to the raw material for the customer. The most important point is that these processes are performed by the shop-floor people, not by management. What I mean to say is that the material flow happens horizontally, at the bottom of the pyramid, but the processes are managed vertically and in silos. As a result, there is an information gap between the decision point and the execution point, and the shop-floor people are no better than robots busy meeting their targets. In this scenario we simply cannot implement continuous improvement unless these vertical coffins are dismantled and the gap between the information flow and the material flow diminishes. This can only be made possible through delegation and by empowering the shop-floor people.

Wait a minute! What are you talking about? If we delegate our duties, then what are we going to do? What will our role be? These are the thoughts that may pop up in the minds of higher management.

 picture7

My dear friend, just leave the daily operations to the middle management: do something new, read something new, think something new, or make some new strategy for the company. Give the company a new direction with your vast experience. If you get involved in the day-to-day operations, then there is no difference between a shift in-charge and you! If you act like this, ideally your CTC should be added to the overhead of the product, isn't it?

Get the right person into middle management and just take daily updates from him, intervening only when needed. I read somewhere (I can't recall where) that as you grow higher in the management, you should distance yourself from the day-to-day operations and focus more on mentoring and on drawing the future roadmap for the company.

picture8

Once this conducive environment is established, i.e. delegation and the empowerment of the shop-floor people, it becomes easier to implement any continuous improvement initiative in the organization, because the real action (the process, the value addition) happens on the shop floor. If you look at most of the lean and 6sigma tools, you will find that they are implemented successfully on the shop floor, by the shop-floor people!