## Understanding the Difference Between Long and Short Term Sigma

We have seen that the main difference between Cpk and Ppk lies in the way the value of sigma (the standard deviation) is calculated.

In Cpk, the value of sigma comes from the control chart and is usually given by the formula

sigma-short = R̄ / d2

where R̄ is the average of the absolute value of the moving range (obtained as the difference of two consecutive points when the data are arranged in time order). The term d2 is a statistical constant that depends on the sample size; for a moving range of two consecutive points, d2 = 1.128.

This sigma-short is affected by the time order of the data, i.e. every time you change the time order, sigma-short changes.

In Ppk, by contrast, sigma is calculated using the traditional standard deviation formula and is also called the overall sigma or sigma-long.

In this case, sigma-long is not affected by the time order of the data points; hence it is called the overall standard deviation.

Usually, sigma-short is less than sigma-long.

Let’s do a simulation in R to check whether sigma-short is really affected by the time order.

```r
# setting the seed for reproducibility
set.seed(2307)

# load library qcc
library(qcc)

# generate a normal sample of 50 data points
d <- rnorm(50, 100, 1.1)

# containers for the control-chart output, sigma-short and sigma-long
IMR <- list()
sigma_short <- c()
sigma_long <- c()

# blank matrix of 10 rows and 50 columns to store 10 random
# re-orderings, each containing the same 50 data points
sam <- matrix(nrow = 10, ncol = 50, byrow = TRUE)

# generate 10 random re-orderings of the sample d generated above
for (i in 1:10) {
  sam[i, ] <- sample(d, 50, replace = FALSE)            # ith re-ordering
  IMR <- qcc(sam[i, ], type = "xbar.one", plot = FALSE)  # I-MR chart of the ith re-ordering
  sigma_short[i] <- IMR$std.dev                          # sigma-short of the ith re-ordering
  sigma_long[i] <- sd(sam[i, ])                          # sigma-long of the ith re-ordering
}

# print a data frame containing sigma-short and sigma-long of all 10 re-orderings
(data_table <- cbind(sigma_short, sigma_long))
```

Table-1: Short and long sigma generated from the same simulated data but with different time order.

| sigma_short | sigma_long |
|-------------|------------|
| 1.1168596   | 1.09059    |
| 1.1462365   | 1.09059    |
| 1.1023853   | 1.09059    |
| 0.9902320   | 1.09059    |
| 1.1419678   | 1.09059    |
| 1.2173854   | 1.09059    |
| 0.9941954   | 1.09059    |
| 1.0408088   | 1.09059    |
| 1.1038588   | 1.09059    |
| 1.2275286   | 1.09059    |

It is evident from the simulation that sigma-short does get affected by the time order of the data, whereas sigma-long does not. Therefore, the standard deviation calculated from the control chart (sigma-short) and the overall sigma are different.
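The same check can be sketched in plain Python, without the qcc package. Here d2 = 1.128 (the constant for a moving range of two consecutive points) and a shuffle stands in for a change of time order; the seed and sample size mirror the R simulation but the exact numbers will differ:

```python
import random
import statistics

random.seed(2307)
data = [random.gauss(100, 1.1) for _ in range(50)]

# sigma-long: ordinary sample standard deviation; time order is irrelevant
sigma_long = statistics.stdev(data)

def sigma_short_of(xs):
    # average absolute moving range divided by d2 (d2 = 1.128 for n = 2)
    moving_ranges = [abs(b - a) for a, b in zip(xs, xs[1:])]
    return statistics.mean(moving_ranges) / 1.128

before = sigma_short_of(data)

# shuffling changes only the time order: sigma-short changes, sigma-long does not
shuffled = data[:]
random.shuffle(shuffled)
after = sigma_short_of(shuffled)
```

Re-running the last three lines with different shuffles reproduces the behaviour in Table-1: a new sigma-short every time, the same sigma-long throughout.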

For more on Cpk and Ppk, see the links below:

Car Parking & Six-Sigma

What Taguchi Loss Function has to do with Cpm?

What do we mean by garage’s width = 12σ and car’s width = 6σ?

## Why Does the Pharmaceutical Industry Require Quality by Design (QbD)?

(This article is a part of PhD thesis of Mr. Abdul Qayum Mohammed, who is my PhD student)

Authors:

Abdul Qayum Mohammed, Phani Kiran Sunkari, Amrendra Kumar Roy*

*Corresponding Author, email: Amrendra@6sigma-concepts.com

KEYWORDS: QbD, 4A’s, DoE, FMEA, Design space, control strategy

ABSTRACT: QbD is of paramount importance for patient safety, but there is another side of the coin: QbD is also required for the timely and uninterrupted supply of medicines to the market. This timely, uninterrupted supply is needed to fulfill the 4A's requirement of any Regulatory body, as that is their main KRA. Manufacturers, however, are given the impression that patients are their main customer, which is not true, and partly because of this, QbD implementation by generic API manufacturers has not picked up. This article argues that the real customer is not the patient but the Regulatory bodies, who deal with the manufacturer on behalf of patients. Hence Regulators need to tell manufacturers that QbD is required not only for patient safety but also for meeting the 4A's requirement, which is equally important. The article correlates the effect of an inconsistent manufacturing process with the KRA of the Regulatory bodies and makes a business case out of it. This will help in developing a strong customer-supplier relationship between the two parties and can trigger the smooth acceptance of QbD by generic players. The article also presents the detailed sequence of steps involved in QbD using a process flow diagram.

Introduction:

Nowadays, Quality by Design (QbD) is an essential and interesting topic in pharmaceutical development, be it for drug substance or drug product. Various guidelines have been published by different Regulatory agencies.[i] There is a plethora of literature available on the QbD approach to process development[ii],[iii] of drug substances and drug products, and to analytical method development[iv]. Most of the available literature focuses mainly on patient safety (QTPP), but if QbD is to sail through, then the generic manufacturer must know why and for whom it is required (apart from patients), and what is in it for them. They should not take regulators as an obstacle to their business but as a part of the business itself. There has to be a business perspective behind QbD, as everything in this world is driven by economics. It has to be a win-win situation for Regulators and manufacturers, which means their expectations must be synchronized. This synchronization will be most effective if API manufacturers (i.e. suppliers) consider Regulators as their customer and try to understand their requirements. In this context it is very important to understand the Regulators' expectations and their responsibility towards their fellow citizens.

Regulators' Expectations:

The sole responsibility of any Regulator towards its country is to ensure not only acceptable (quality, safety and efficacy) and affordable medicines, but also their availability (no shortage) in the country at all times. Even that is not enough: those medicines must be easily accessible to patients at their local pharmacies. These may be called the 4A's and are the KRA of any Regulatory body. If the Regulator misses any one of the 4A's, it will be held accountable by its Government for endangering the lives of patients.

In earlier days, when health services had not yet penetrated large sections of society, the main focus of Regulators was on the quality and price of medicines. Margins were quite high in those days, and the effect of reprocessing and reworks on manufacturers' margins was small, so Regulators were happy: they were getting good quality at the best price for their citizens. Gradually, health services reached larger sections of society in developed countries, which as a result needed more and more medicine at affordable prices. The KRA of Regulators changed from "high quality and low price" to "quality medicine at an affordable price, available all the time at the doorstep of patients". Another event that led to further cost erosion was the arrival of medical insurance and tender-based purchasing systems in hospitals. Increased demand made manufacturers increase their batch sizes, but because of insurance and tender-based purchasing they no longer had the advantage of high margins and could not afford batch failures and reprocessing anymore. These wastages led to erratic production and an irregular supply of medicines in the market, thereby creating shortages. This affected the KRA (4A's) of the Regulatory bodies; hence they were forced to intervene in the suppliers' systems. They realized that in order to ensure the 4A's, there has to be a robust process at the manufacturer's site; if that is achieved, medicines will automatically be available in the country (no shortages) and accessible to all patients at an affordable price. Such process robustness is possible with the use of proven statistical tools like six sigma and QbD during the manufacturing of an API.
This path to a robust process was shown by the Regulators in the form of the Q8/Q9/Q10/Q11 guidelines[i], where QbD was made mandatory for formulators and is strongly recommended for API manufacturers; soon it may be made mandatory for them as well. While making QbD mandatory, Regulators emphasize how QbD is related to patient safety and how it will make the process robust for manufacturers, which in turn would eliminate the fear of audits. Regulators are right, but somewhere they missed communicating the business perspective behind QbD implementation: manufacturers had little clue about the Regulators' KRA, and as a result a customer-supplier relationship never developed.

Figure 1: Regulator’s unsaid expectations

Manufacturer’s point of view

While Regulators were insisting on QbD, manufacturers had their own constraints in the plant due to the inconsistency of the process (Figure 2). As the Regulators' emphasis was on patient safety rather than the 4A's, manufacturers took patients as their customer instead of Regulators, and made sure there was no compromise with the quality of the medicines in order to delight that customer, i.e. the patients. It did not matter to the manufacturer if the quality was achieved by reprocessing or rework, as long as the material was of acceptable quality to the customer. Due to this misconception about who the real customer is, the 4A's got neglected by the manufacturer.

Another problem is the definition of quality as perceived by the two parties. The quality of an API from the customer's perspective has always been defined with respect to patient safety (i.e. the QTPPs, which are indeed very important), but for the manufacturer, quality meant only the purity of the product, as he enjoyed handsome margins.

Profit = MP − COGS                (Eq-1)

MP = market price

COGS = genuine manufacturing cost + waste cost (COPQ)

COPQ = variation / batch failure / reprocessing & rework / product recall = increase in drug product/drug substance cost = losing customer faith (intangible cost)
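A toy calculation makes Eq-1 concrete. All the numbers below are assumed purely for illustration (they are not from the article); the point is that at a fixed market price, every unit of COPQ comes straight out of profit:

```python
market_price = 100.0        # MP: fixed by the market, not by the manufacturer
genuine_cost = 70.0         # genuine manufacturing cost (assumed)

# COPQ: expected extra cost per unit from failures and rework (assumed rates)
failure_rate = 0.15         # 15% of batches need rework
rework_cost = 40.0          # extra cost per unit when a batch is reworked
copq = failure_rate * rework_cost

cogs = genuine_cost + copq     # COGS = genuine cost + COPQ
profit = market_price - cogs   # Eq-1: Profit = MP - COGS
```

With these assumed numbers, COPQ adds 6 to the COGS, and profit drops from 30 to 24: a 20% hit to margin from a 15% rework rate.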

Coming to the prevailing market scenario, manufacturers do not have the luxury of defining the selling price; the market is very competitive, and the prices of goods and services are dictated by the market, hence the term market price (MP) instead of selling price (SP). This led to a change in the perception of quality: quality is now defined as producing goods and services that meet the customer's specification at the right price. Manufacturers are forced to sell their goods and services at the market rate, so profit is now defined as the difference between the market price and the cost of goods sold (COGS). If the manufacturing process is not robust, COPQ will be high, resulting in a high COGS, and one of the parties (patient or manufacturer) has to bear the cost. According to Taguchi, it is a loss to society as a whole, as neither party benefits. If these failures are frequent, they lead to production loss; the product is then not available in the market on time, and the manufacturer cannot fulfill the 4A's criteria of the customer. This leads not only to loss of market share but also to loss of the customer's confidence, and the customer in turn will look for other suppliers who can fulfill their requirements. This is an intangible loss to the manufacturer.

The COPQ has a direct relationship with the way the process has been developed. There are two ways in which a process can be optimized (Figure 3). It is clear from Figure 3 that focusing on process optimization leads to less COPQ and a process that is more robust in terms of quality, quantity and timelines, thereby reducing COGS by eliminating COPQ. This raises another question: how is process optimization different from product optimization, and how is it going to solve all the problems related to inconsistency? This can be understood through the relationship between QTPPs/CQAs and CPPs/CMAs. As a manufacturer we must realize that any CQA (y) is a function of the CPPs and CMAs (x), i.e. the value of a CQA is dictated by the CPPs/CMAs and not vice versa (Figures 4 & 7). It means that by controlling the CPPs/CMAs we can control the CQAs, but in order to do this we need to study and understand the process very well. This will help in quantifying the effect of the CPPs/CMAs on the CQAs; once that is done, it is possible to control the CQAs at a desired level just by controlling the CPPs/CMAs. This way of process development is called process optimization, and QbD insists on it. Another important concept associated with process optimization is the way in-process monitoring of the reaction is done. Traditionally, a desired CQA is monitored for any abnormality during the reaction, whereas in the process optimization methodology it is the CPP/CMA responsible for that CQA that is monitored (Figure 4). Hence, if the focus is on process optimization, a paradigm shift is required in the way the process is developed and the control strategy is formulated by the manufacturer.

Figure 3: Two ways of optimization

From the above discussion, it is clear that the real customer for a generic manufacturer is not the patient but the Regulator. This is because patients cannot decide, and they do not have the capability to test the quality of medicines; for them, all brands are the same. Hence the Regulators come into the picture, dealing with manufacturers on behalf of patients, because they have all the means and capability to do so. Going by Figure 5, patients are the real customer for the Regulators, who in turn are the customer for the manufacturer. In a business sense, patients are just the end users of the manufacturer's product once it is approved by the Regulators for use.

Figure 4: Relationship between CQAs and CPPs/CMAs

It is now clear that the Regulators are the real customers of the manufacturer, and that with the current inefficient process, the manufacturer is not helping this customer meet their goal (the 4A's). Manufacturers can now understand the relationship between their inefficient manufacturing processes and the customer's KRA (Figure 6). In addition, they can clearly visualize the advantage of process optimization over product optimization, and how QbD can act as an enabler in developing a robust process, thereby fulfilling the 4A's requirement. This will encourage manufacturers to adopt QbD, because it now makes a strong business case for retaining the existing market and also serves as a strategy for entering new markets. It is a win-win situation for both parties. Therefore, QbD should be pursued by manufacturers not out of regulatory fear but as a tool for fulfilling the customer's KRA, which in turn benefits the manufacturer by minimizing COPQ. In addition, it helps in building the customer's trust, an intangible asset for any manufacturer. This will enable manufacturers to accept Regulators as their customer rather than as an obstacle, and will result in better commitment from manufacturers to implementing QbD, because the definition of a customer given by Mahatma Gandhi is very relevant even today.

 “A customer is the most important visitor on our premises. He is not dependent on us. We are dependent on him. He is not an interruption in our work. He is the purpose of it. He is not an outsider in our business. He is part of it. We are not doing him a favor by serving him. He is doing us a favor by giving us an opportunity to do so.” ― Mahatma Gandhi

Figure 5: Dynamic Customer-Suppliers relationship throughout the supply chain

Figure 6: Manufacturer perception after understanding customer-supplier relationship

Manufacturer in the customer’s shoes:

Another reason given by manufacturers for inconsistency is the quality of the KSM supplied by their vendors: any quality issue with the KSM will affect the quality of the API, as shown by Figure 7 and equation 2. Until now the manufacturer was acting as a supplier to the Regulators, but here the manufacturer is in the shoes of a customer and can understand the problems he faces because of the inconsistent quality of KSM from his own supplier (Figure 5, Table 1). Now the manufacturer can empathize with the Regulatory bodies and is in a position to understand the effect of his process on his customer's KRA (Figure 6). Table 1 is equally applicable to the relationship between the manufacturer and the Regulatory bodies.

Table 1: Effect of process inconsistency from supplier/manufacturer on API quality

Consider Case-1 (Table 1), which represents the ideal condition, where the process is robust on both sides. Case-2 and Case-3 represent an inconsistent process at one of the two parties; this inconsistency is reflected as inconsistency in the quality of the API at the manufacturer's site, resulting in an unsatisfied customer (the Regulator) and loss of the market to someone else. Lastly, an inconsistent process on both sides (Case-4) results in a disastrous situation where it is difficult for the manufacturer to control the quality of the API, because the variances from both sides simply add up (equation 2). In this case the customer cannot even think of getting material from the manufacturer, as it would pose a threat to patients' lives, and no regulatory body would allow that.

One could argue that if consistency is an issue at the supplier's end (Case 3), the manufacturer could negotiate to cherry-pick the good batches; but no supplier will do the cherry-picking without extra cost, which in turn increases the cost of the API. Another consequence of this handpicking is interruption in the timely supply of KSM, which delays production at the manufacturer's site. This increases the idle time of resources and hence the overheads, which ultimately reflects in an increased API cost. Apart from the increased cost, it also results in sporadic supply to the customer. Another option for circumventing inconsistency at the supplier's end is to reprocess the KSM at the manufacturer's site; obviously this is not a viable solution either, as it escalates the COGS. Hence there is no choice but to take the supplier into confidence and make him understand the implications of his product quality for your business, and how his own business would in turn be affected. The best solution is to discuss with the supplier and ask him to improve his process (if the supplier has the capability), or to help him improve his KSM process (if the manufacturer has the capability).

Note: Apart from a robust process, Regulators also audit the manufacturer's site for safety and the ETP facility. This is done for the same reason: ensuring a continuous supply of medicines to their country.

How does inconsistency of the process affect quality? And how will QbD help in getting rid of this inconsistency?

Realizing that we need consistent quality and uninterrupted production is not enough; as manufacturers we must understand the various sources of inconsistency and how they can affect the quality of the API.

Any chemical reaction happening in a reactor is a black box (Figure 7) for us, and there are three kinds of inputs that go into the reactor. The first, known as MAs, are the chemical entities that go into the reactor (KSM, reagents and other chemicals). The second, known as PPs, are the reaction/process parameters that can be controlled by the manufacturer. The third are environmental/external factors, like room temperature, age of the equipment, operators, etc., which cannot be controlled. As variance (σ²) has an additive property, inconsistency from all three types of factors amplifies the inconsistency of the product quality. The variation caused by the third type, i.e. by external factors, is called inherent variation, and we have to live with it; at most, the effect of these nuisance factors can be nullified by blocking and randomization during DoE studies. Because of this inherent variation, yield and other quality parameters are reported as a range instead of a single number. The variation due to the other two types of factors (MAs and PPs), however, can be controlled by studying their effect on the product attributes (QAs) using a combination of risk-analysis tools and statistical tools for optimization. This combination of risk-based assessment of MAs and PPs, and the use of statistical tools such as DoE/MVA to optimize their effect on the QAs, is called QbD. Hence QbD is the tool manufacturers are looking for to eliminate the inconsistency in their product and thereby fulfill the customer's expectations.
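The additive property of variance can be checked with a short simulation. The source names and sigmas below are assumed purely for illustration: three independent sources of variation are summed, and the observed variance of the total matches the sum of the individual variances.

```python
import random
import statistics

random.seed(1)
n = 100_000

# three independent sources of variation (sigmas assumed for illustration)
ma  = [random.gauss(0, 0.5) for _ in range(n)]   # material attributes (MAs)
pp  = [random.gauss(0, 0.8) for _ in range(n)]   # process parameters (PPs)
env = [random.gauss(0, 0.3) for _ in range(n)]   # environmental factors

# each observed output carries the sum of all three disturbances
total = [a + b + c for a, b, c in zip(ma, pp, env)]

observed = statistics.pvariance(total)
predicted = 0.5**2 + 0.8**2 + 0.3**2   # variances add for independent inputs: 0.98
```

Note that the sigmas do not add (0.5 + 0.8 + 0.3 = 1.6), but the variances do: the overall sigma is √0.98 ≈ 0.99, which is why reducing any one source's variance directly shrinks the spread of the product quality.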

The variance shown in Figure 7 represents the variation at a single stage only. In a multi-step synthesis (the most common scenario), the total variance at the API stage is the culmination of the variances from all the stages, resulting in a totally out-of-control process, as shown by equation 3.

Figure 7: Cumulative effect of variance from various sources on the variance of API quality

At what stage of product development is QbD required to be applied?

The traditional approach to process development of any API focuses on filing the DMF at the earliest. As a result of this improper process development, there are failures at commercial scale, and the process comes back to R&D for fine-tuning. If instead the process is developed with the QbD approach at the R&D stage itself, it certainly takes more time initially, but the time is worth investing, as there will be few or no failures at commercial scale and the process can be scaled up much faster. This reduces reprocessing and rework at commercial scale, thereby minimizing the COPQ: a win-win situation for all, as depicted in Figure 8.

Figure 8: Risk and reward associated with QbD and traditional approach

[i].  (a) ICH Q8 Pharmaceutical Development, (R2); U.S. Department of Health and Human Services, Food and Drug Administration, Center for Drug Evaluation and Research (CDER): Rockville, MD, Aug 2009. (b) ICH Q9 Quality Risk Management; U.S. Department of Health and Human Services, Food and Drug Administration, Center for Drug Evaluation and Research (CDER): Rockville, MD, June 2006. (c) ICH Q10 Pharmaceutical Quality System; U.S. Department of Health and Human Services, Food and Drug Administration, Center for Drug Evaluation and Research (CDER): Rockville, MD, April 2009. Understanding Challenges to Quality by Design, Final deliverable for FDA Understanding Challenges to QbD Project, December 18, 2009.

[ii]. (a) Jacky Musters, Leendert van den Bos, Edwin Kellenbach, Org. Process Res. Dev., 2013, 17, 87. (b) Zadeo Cimarosti, Fernando Bravo, Damiano Castoldi, Francesco Tinazzi, Stefano Provera, Alcide Perboni, Damiano Papini, Pieter Westerduin, Org. Process Res. Dev., 2010, 14, 805. (c) Fernando Bravo, Zadeo Cimarosti, Francesco Tinazzi, Gillian E. Smith, Damiano Castoldi, Stefano Provera, Pieter Westerduin, Org. Process Res. Dev., 2010, 14, 1162.

[iii]. (a) Sandeep Mohanty, Amrendra Kumar Roy, Vinay K. P. Kumar, Sandeep G. Reddy, Arun Chandra Karmakar, Tetrahedron Letters, 2014, 55, 4585. (b) Sandeep Mohanty, Amrendra Kumar Roy, S. Phani Kiran, G. Eduardo Rafael, K. P. Vinay Kumar, A. Chandra Karmakar, Org. Process Res. Dev., 2014, 18, 875.

[iv]. Girish R. Deshpande, Amrendra K. Roy, N. Someswara Rao, B. Mallikarjuna Rao, J. Rudraprasad Reddy, Chromatographia, 2011, 73, 639.

## Concept of Quality — We Must Understand this before Learning 6sigma!

Before we try to understand the 6sigma concept, we need to define the term “quality”.

##### What is Quality?

The term “quality” has many interpretations, but by the ISO definition, quality is: “The totality of features and characteristics of a product or service that bear on its ability to satisfy stated or implied needs”.

If we read between the lines, the definition varies with the reference frame we use to define “quality”. The reference frames here are the manufacturer (who supplies the product) and the customer (who uses the product). Hence the definition of quality with respect to these two reference frames can be stated as follows.

This “goal post” approach to quality is graphically presented below, where a product is deemed simply pass or fail. It does not matter even if the quality is on the borderline (the football just missed the goalpost and luckily a goal was scored).

This definition was applicable as long as manufacturers had a monopoly or faced limited competition in the market. Manufacturers were not worried about failures, as they could easily pass the cost on to the customer; having no choice, the customer had to bear it. This follows from the traditional definition of profit shown below.

Coming to the current business scenario, manufacturers do not have the luxury of defining the selling price; the market is very competitive, and the prices of goods and services are dictated by the market, hence the term market price instead of selling price. This led to a change in the perception of quality: quality is now defined as producing goods and services that meet the customer's specification at the right price. Manufacturers are forced to sell their goods and services at the market rate, so profit is now defined as the difference between the market price and the cost of goods sold (COGS).

In the current scenario, if a manufacturer wants to make a profit, the only option is to reduce COGS. To do so, one has to understand the components that make up COGS, shown below. COGS consists of the genuine cost of production and the cost of quality. The genuine cost will be nearly the same for all manufacturers; the real differentiator is the cost of quality. The manufacturer with the lowest cost of quality enjoys the highest profit and can influence the market price to keep the competition at bay. But to keep the cost of quality at its lowest possible level, the manufacturer has to hit the football right at the center of the goalpost every time!

The cost of quality comprises the cost incurred to monitor and ensure quality (the cost of conformance) and the cost of non-conformance, or cost of poor quality (COPQ). The cost of conformance is a necessary evil, whereas the COPQ is a waste, an opportunity lost.

Coming to the present scenario, with the increasing demand for goods and services, manufacturers are required to fulfill their delivery commitments on time; otherwise their customers lose market share to competitors. Manufacturers have realized that their business depends on the business prospects of their customers; hence timely supply of products and services is very important. This can be understood much better using the pharmaceutical industry as an example.

The sole responsibility of any Regulator (say the FDA) towards its country is to ensure not only acceptable (quality, safety and efficacy) and affordable medicines, but also their availability (no shortage) in the country at all times. Even that is not enough: those medicines must be easily accessible to patients at their local pharmacies. These may be called the 4A's and are the KRA of any Regulatory body. If the Regulator misses any one of the 4A's, it will be held accountable by its Government for endangering the lives of patients. The point that needs to be emphasized here is the importance of TIMELY SUPPLY of medicines, besides other parameters like quality and price.

Hence, the definition of quality got modified again to "producing goods and services in the desired quantity, delivered on time, meeting all the customer's specifications of quality and price." A term used in operational excellence, OTIF, is an acronym for "on time, in full", meaning delivering goods and services that meet the customer's specifications, on time and in full quantity.

Coming once again to the definition of profit in the present-day scenario:

Profit=MP-COGS

We have seen that the selling price is driven by the market, and hence the manufacturer cannot control it beyond an extent. So what can he do to increase his margin or profit? The only option is to reduce COGS. We have seen that COGS has two components: genuine COGS and COPQ. Manufacturers have little scope to reduce the genuine COGS, as it is a necessary evil of producing goods and services. We will see later, in LEAN manufacturing, how this genuine COGS can be reduced to some extent (wait till then!): for example, if throughput or the yield of the process is improved, there is less scrap, which decreases the raw-material cost per unit of goods produced.

But the real culprit for the high COGS is the unwarranted high COPQ.

The main reasons for a high COPQ are:

1. Low throughput or yield
2. More out-of-specification (OOS) product, which has to be either
    1. reprocessed,
    2. reworked, or
    3. scrapped
3. Inconsistent quality, leading to more after-sales service and warranty costs
4. Biggest of all, loss of the customer's confidence in you, which is intangible.

If we look at the outcomes of COPQ discussed above, we can conclude one thing: "the process is not robust enough to meet the customer's specifications", and because of this, manufacturers face the problem of COPQ. All these wastages are called "mudas" in Lean terminology and will be dealt with in detail later.

What causes COPQ?

Before we can answer this important question, we need to understand the concept of variance. Let's take a simple example: say you leave home for the office at exactly the same time every day; do you reach the office at exactly the same time daily? The answer will be a big no, or a better answer would be "it will take anywhere between 40 and 45 minutes to reach the office if I start exactly at 7:30 AM". This variation in arrival time can be attributed to many causes, such as variation in the starting time itself (I just can't start at exactly 7:30 every day), variation in traffic conditions, etc. There will always be variation in any process, and we need to control that variation. In a manufacturing environment too there are sources of variation, like wear and tear of machines, change of operators, etc. Because of this, there will always be variation in the output (the goods and services produced by the process). Hence, we will not get a product with fixed quality attributes; each quality attribute will have a range (called the process control limits), which needs to be compared with the customer's specification limits (the goal post).

If my process control limits reach out to the goal post (the boundaries of the customer's specification limits), then my failure rate will be quite high, resulting in more failures, scrap, rework and warranty cost. This is nothing but COPQ.

Alternatively, if my aim (process limits) is well within the goal posts (case-2), my success rate is much higher and I will have less scrap and rework, thereby decreasing my COPQ.
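For a normally distributed process, the two cases can be quantified directly. The specification limits and process sigmas below are hypothetical, chosen so that the wide process just touches the goal posts at ±3 sigma and the tight process sits at ±6 sigma:

```python
from statistics import NormalDist

LSL, USL = 94.0, 106.0   # hypothetical customer specification limits (goal posts)

def out_of_spec(mean, sigma):
    """Fraction of output falling outside the goal posts."""
    nd = NormalDist(mean, sigma)
    return nd.cdf(LSL) + (1.0 - nd.cdf(USL))

# case 1: process spread just touching the goal posts (limits at +/- 3 sigma)
wide = out_of_spec(100.0, 2.0)     # about 0.27% out of spec
# case 2: same centre, half the spread (limits at +/- 6 sigma)
tight = out_of_spec(100.0, 1.0)    # essentially zero
```

Halving the process sigma takes the defect rate from roughly 2,700 per million to the parts-per-billion range, which is exactly the COPQ difference between the two cases described above.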

###### Taguchi Loss Function

A paradigm shift in the definition of quality was given by Taguchi, who introduced the concept of producing products with quality targeted at the center of the customer's specifications (a mutually agreed target). He stated that as we move away from the center of the specification, we incur cost, either at the producer's end or at the consumer's end, in the form of rework and reprocessing. Holistically, it is a loss to society. It follows that even producing goods and services beyond the customer's specification is a loss to society, as the customer will not be willing to pay for it: there is a sharp increase in COGS as we try to improve quality beyond the specification.

For example;

The purity of the medicine I am producing must be > 99.5% (say, the specification); if I try to improve it to 99.8%, my throughput drops, as one extra purification is needed, resulting in yield loss and an increased COGS.

When buying a readymade suit, it is very difficult to find one that perfectly matches your body's contour, so you end up going for alterations, which incurs cost. Whereas if you get a suit stitched by a tailor to fit your body contour (the specification), it incurs no extra rework cost.
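Taguchi's idea is usually written as a quadratic loss, L(y) = k(y − T)², which is zero only at the target T and grows on both sides of it. A minimal sketch, with k and the sample values chosen purely for illustration:

```python
def taguchi_loss(y, target, k=1.0):
    """Quadratic loss: cost grows with the squared distance from the target."""
    return k * (y - target) ** 2

# goal-post thinking treats every value inside the spec as equally good;
# Taguchi's loss penalizes any departure from the target, inside or outside
losses = [taguchi_loss(y, target=100.0) for y in (98.0, 99.0, 100.0, 101.0)]
```

A point one unit from the target costs 1·k, but a point two units away costs 4·k: the loss accelerates as the goal posts are approached, which is why "just inside the spec" is not the same as "on target".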

###### Six Sigma and COPQ

It is apparent from the above discussion that "variability in the process" is the single biggest culprit behind failures, resulting in a high cost of goods produced. This variability is the single most important concept in six sigma and needs to be comprehended very well. We will encounter this monster (variability) everywhere when dealing with six sigma tools like the histogram, the normal distribution, the sampling distribution of the mean, ANOVA, DoE, regression analysis and, most importantly, statistical process control (SPC).

Hence, the industry required a tool to study variability and to find ways to reduce it. The six sigma methodology was developed to fulfill this requirement. We will look in detail at why it is called six sigma, and not five or seven sigma, later on.

Before we go any further, we must understand and always remember one very important thing: "any goods or services produced are the outcome of a process", and "there are many inputs that go into the process, like raw materials, technical procedures, men, etc."

Hence, any variation in the input (x) to a given process will cause a variation in the output (y) quality.

Another important aspect is that variance has an additive property, i.e. the variances from all the inputs add up to give the variance of the output.

###### How does Six Sigma work?

Six sigma works by decreasing the variation coming from the different sources, thereby reducing the overall variance in the system, as shown below. It is a continuous improvement journey.
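Because variances add, attacking the largest source first gives the biggest reduction in overall sigma. A small sketch, with the source names and sigmas assumed purely for illustration:

```python
import math

# sigma contributed by each independent source of variation (numbers assumed)
sources = {"raw material": 0.9, "machine": 0.6, "operator": 0.4}

def overall_sigma(srcs):
    # variances add, so the overall sigma is the root of the summed squares
    return math.sqrt(sum(s ** 2 for s in srcs.values()))

before = overall_sigma(sources)        # about 1.15

# a six sigma project halves the variation from the largest contributor
sources["raw material"] = 0.45
after = overall_sigma(sources)         # 0.85
```

Halving just the biggest contributor cuts the overall sigma by roughly a quarter; halving the smallest ("operator") instead would barely move it, which is why six sigma projects are prioritized by the size of each variance contribution.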

###### Summary:
1. The definition of quality has changed drastically over time; it is no longer just "fit for purpose" but also includes on time and in full (OTIF).
2. In this world of globalization, the market determines the selling price, and manufacturers either reduce their COPQ or perish.
3. There is a customer specification and a process capability. The aim is to bring the process capability well within the customer's specifications.
4. The main culprit behind out-of-specification product is an unstable process, which in turn is caused by variability coming from different sources.
5. Variance has an additive property.
6. Lean is a tool to eliminate wastages in the system; six sigma is a tool to reduce defects in the process.

References

1. To understand the consequences of a bad process, see the red bead experiment designed by Deming on YouTube: https://www.youtube.com/watch?v=JeWTD-0BRS4
2. For different definitions of quality, see http://www.qualitydigest.com/magazine/2001/nov/article/definition-quality.html#