Archive for the ‘Production Management’ Category

Applying Nobel-winning Physics Techniques to Management

October 10, 2012

The 2012 Nobel Prize in Physics went to Serge Haroche of France and David Wineland of the United States. In the 1990s they showed how to observe individual particles while preserving their bizarre quantum properties, something scientists had long struggled to do. While this contribution may at first seem far-fetched and remote from the daily management challenges of a business executive, I am going to argue otherwise.

The Principle of Uncertainty

Let me first touch on the significance of this discovery. At the beginning of the last century, when quantum physics was born, physicists discovered that the classical laws of physics break down at the sub-atomic level. The everyday objects we are used to have deterministic states. For example, given the starting location and the velocity of a car, we can easily determine its location at any time. Tiny particles, on the other hand, behave differently. The foundation of quantum mechanics was first built on Heisenberg’s uncertainty principle, which describes how physical objects can exist in multiple states. Hence, given the initial location and velocity of a particle, only a probability distribution over its possible locations can be given. This is what makes quantum mechanics such a bizarre subject for most people. Making things worse, it was long thought impossible to observe this behavior directly: observing a photon, for example, requires light to be absorbed by our eyes or by an image sensor, thereby altering the state of the photon itself. This observer effect and the uncertainty relation have been echoed in philosophy, from Karl Popper’s work to the concept of reflexivity, which George Soros has cited as the principle behind his investment strategy. Working around these monumental theoretical and philosophical hurdles is what the two Nobel laureates achieved.

What Can Managers Learn From Quantum Physicists?

While the bizarre world of quantum mechanics may seem distant, the principle of uncertainty for tiny objects applies surprisingly well to business management. For example, many companies have installed some type of ERP system to get a real-time view of the state of their business. There is a strong belief in a single version of the truth for financial data at the company or division level, and day-to-day decisions are made based on this information. This is, so to speak, management by classical physics. However, when it comes to highly granular information such as events on critical machines, individual operator performance, inventory by SKU and bin location, or even OEE by machine, business executives tend to treat it like the world of tiny objects in quantum mechanics. It is not uncommon to find multiple truths in such manufacturing operations. The OEE reported by different plants for the same type of machine can be based on very different measurement methods and be subject to different degrees of human error. Different departments on the manufacturing shop floor have different perceptions of the true state of their operations. The variable cost by product line and by shift can be far from the aggregate cost captured in ERP. Inventory accuracy by SKU quantity can be far below the ERP inventory accuracy based on aggregated financial totals. Far too many business executives have, in effect, conceded to the principle of uncertainty and allowed their manufacturing operations to run on multiple uncertain states.
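To make the “multiple truths” point concrete, here is a minimal sketch (my own illustrative numbers, not from any particular plant) of how the same machine day can yield two different OEE figures simply because two plants define planned time differently:

```python
# OEE = Availability x Performance x Quality (standard definition).
# Illustrative numbers only: plant A counts the full shift as planned time,
# plant B excludes planned breaks, and both report "the" OEE of the machine.
shift_minutes = 480
planned_breaks = 60
downtime = 50                 # unplanned stops, minutes
ideal_cycle = 0.5             # minutes per piece
total_count, good_count = 730, 700

def oee(planned_time):
    run_time = planned_time - downtime
    availability = run_time / planned_time
    performance = (ideal_cycle * total_count) / run_time
    quality = good_count / total_count
    return availability * performance * quality

print(round(oee(shift_minutes), 2))                   # plant A: ~0.73
print(round(oee(shift_minutes - planned_breaks), 2))  # plant B: ~0.83
```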

Mastering the Quantum Bits of Your Business

It does not have to be that way. Just as the Nobel Prize winners discovered, the technology to observe and measure the quantum bits of manufacturing information exists. Some companies have already tapped into the power of this technology and achieved significant improvements in profit margins and working capital. In an increasingly complex and turbulent world, a tiny quantum bit of information can explode into a perfect storm in a very short time. The capability of a business to leverage these quantum bits is already distinguishing the winners from the losers in the marketplace.

While the technology to observe the quantum bits of manufacturing information may be a far cry from getting its own Nobel Prize, the application of such technology should not be left as a subject of uncertainty anymore.


The Future of Lean Manufacturing through the World of Warcraft

October 5, 2011

Any seasoned Lean manufacturing expert will tell you that implementing lean is not about JIT, heijunka or any other tool. It is about implementing a lean culture of continuous improvement. In fact, Toyota considers its ultimate competitive advantage to be the “intoxication of improvement” felt by every employee from the shop floor to the top floor. Thousands of improvement ideas are created every day, even for the smallest, most mundane tasks. This is in stark contrast to the “don’t fix what isn’t broken” mindset that prevails in most other organizations. Well, what they believe is one thing. Has any of this been scientifically proven? Can we simulate this kind of organizational behavior and measure its output? And if we can, what can we learn about managing thousands of ideas and distilling them into action every day?

In this video, Dr. John Seely Brown, one of my favorite business writers, talks about the innovation dynamics within the World of Warcraft (WoW), which also happens to be my favorite on-line video game. Near the end, Brown says, “This may be for the first time that we are able to prove exponential learning … and figure out how you can radically accelerate on what you’re learning”. Indeed, I have found that this game casts an interesting light on the social dynamics of lean culture and how it may evolve in the future.
[Embedded video: Stanford eCorner player, http://ecorner.stanford.edu/swf/player-ec.swf (mid=2432)]

Guild structure and QC circles

“There is too much information changing too fast… The only way to get anything done seriously is to join a guild,” said Brown. These guilds in WoW are groups of 20-200 people helping each other process ideas. This greatly resembles the Quality Circle movement, in which employees are not just hired to perform a task but also form small groups that constantly seek ways to self-improve. The main difference between QC circles and these guilds may be the technology they use, as discussed below.

Everything is measured; everyone is critiqued by everyone else

In WoW, it is easy to record every action and measure performance. There are after-action reviews on every high-end raid, and everyone is critiqued by everyone else. This resembles the typical PDCA (Plan-Do-Check-Act) process used by QC circles. The challenge in the manufacturing world, however, is that too much information is still recorded on paper or, if recorded electronically, in multiple segregated systems. This inhibits the sharing, retrieval and analysis of information that enables the rapid group self-improvement dynamics of WoW.

Personal dashboards are not pre-made, they are mashups

Another key lesson from WoW is that you need to craft your own dashboard to measure your own performance. Brown even said that the Obama administration is stealing the idea from WoW and trying to do the same. So much for the software companies trying to sell pre-packaged KPIs to measure corporate performance. Imagine a new manufacturing world in which every operator and supervisor gets real-time feedback on his or her own performance, seeing minute by minute how idle time or over-production affects the bottom line and return on capital. The future of performance measurement technology is detailed, real-time and personalized.

Exponential learning

The last slide in the video shows that learning speed increases exponentially as one levels up in WoW. The high-performance guilds need to distill what they have learnt within their own guild and share it with other guilds throughout the network. Those who can do that effectively tend to level up faster. In the manufacturing world, many companies try to share best practices within and across organizations. However, manufacturing executives may not realize that effective continuous improvement and best-practice sharing can lead to a state of exponential learning that constitutes an ultimate competitive advantage.

In a sense, the computer world of WoW is able to simulate the social dynamics of how individuals form groups to process and create ideas, how groups measure and improve themselves, and how groups interact with each other to accelerate the learning that results in high performance. These social dynamics also resemble those of the lean culture long promoted within companies like Toyota. Looking forward, the promise of manufacturing 2.0 lies in technologies that enable almost everything to be measured, allow information from individuals to flow freely within groups, and empower groups to share best practices effectively. Such multi-tier collaboration from shop floor to top floor will bring about a new form of highly competitive organization that harnesses the power of exponential learning. On that note, the future evolution of lean culture may not be that different from the present World of Warcraft.

Golf Lessons from Lean, Six-Sigma and TOC

November 14, 2010

After taking lessons from several coaches, I noticed some fundamental differences between their approaches. My current coach is very good at giving one-point advice based on my swing. Although one day I would like to swing like Ernie Els, for now I have settled for my ugly swing and am happy to see a notable score improvement after every lesson. That is quite different from the lessons my friend took. His coach basically asked him to forget everything he had learnt and tried to revolutionize his swing in order to take him to the next level. He is now scared to go out on the course because he is stuck with a setback before he can get any better. He believes, however, that he is taking the necessary steps towards his goal of turning professional someday.

What are your long- and short-term goals, and which approach is more suitable for you?

Lean:

You should focus on eliminating muda in your swing. Do not try to “push” the club head towards the ball; rather, let a synchronized body turn naturally “pull” the club head, so that your swing flows smoothly. The game of golf is a process of relentless continuous improvement. We do not generally recommend investing too much energy in your equipment, because dependence on tools frequently undermines the development of the correct mindset. If you focus on improving every little piece, your efforts will eventually show up in your score and hence your handicap, which should not be your end but a means on the way of golf.

Six-sigma:

Golf is a game of consistency. You should therefore focus on reducing the variability of your swing. We have a set of statistical tools to measure the defects in your swing, as well as scientific instruments to monitor and track your progress. You can certify your skills from green belt to black belt. By leveraging the right tools with scientific measurement and objective feedback, you will ultimately drive your swing variability down towards a 6-sigma level.

TOC (Theory of Constraints):

You can maximize the return on your practice time by focusing on identifying and improving the bottleneck. At every stage of your skill development, there is a constraint that determines the throughput of your entire game. At one point it may be the grip, the address, the swing plane, the approach shot or the putt; the point is that the bottleneck moves. By identifying the bottleneck and concentrating on it, you will get a notable handicap reduction within the shortest time. While Lean and Six Sigma can get you closer to the “perfect” swing, TOC focuses on optimizing what you already have in order to quickly improve your score.

Whatever approach you pick to improve your golf game or to help transform your manufacturing operations, you can benefit from technology that automatically records your current swing (or process) and then gives you instant feedback on what to improve. In my opinion, there is no better example than golf to illustrate how far actual execution can stray from the best-intended plan.

A Lot-Sizing Case Study of a Socket Manufacturer

November 10, 2010

One of my customers is facing the following problem: how should production batch sizes be optimized during an upcoming national promotion that will greatly increase demand?

The end products are socket sets; each socket in a set has a different diameter.

An important challenge is that demand comes in “sets” that include many different diameters, while actual production is grouped by “sockets of the same diameter”.

This is because significant change-over time is required to set up the bottleneck forming machine for the production of a given diameter. The change-over time varies from 2 to 4 hours, and in some cases it can take an entire shift. Until now, they have therefore grouped production into batches of sockets of the same diameter. Such “lot-for-lot” production is simple to execute and minimizes the number of change-overs required.

However, their business model has recently changed. Marketing is planning a national promotion on certain sets, so demand for a number of sets will greatly increase. This creates a new problem for the “lot-for-lot” rule in production. If you focus on making one diameter before moving to the next, you will accumulate a lot of inventory before you can ship complete sets. The cost of holding inventory and the responsiveness to demand fluctuations become a much more significant problem than when demand comes in smaller batches. Given that, should you divide the demand quantity into smaller batches, and if so, how should you determine their size?

Assuming that each set has 5 pieces and each piece costs $10, we can calculate the following two cases, with the setup cost estimated at $100 per change-over and the inventory holding cost estimated at $0.04 per piece per day.

Case 1 (Divide 10,000 pieces of demand into 2 batches of 5,000 pieces each)

| | Day 1 | Day 2 | Day 3 | Day 4 | Day 5 | Day 6 | Day 7 | Day 8 | Day 9 | Day 10 | Total |
|---|---|---|---|---|---|---|---|---|---|---|---|
| Demand | 0 | 0 | 0 | 0 | 0 | 5000 | 0 | 0 | 0 | 0 | 5000 |
| Production Qty | 1000 | 1000 | 1000 | 1000 | 1000 | 1000 | 1000 | 1000 | 1000 | 1000 | |
| Inventory | 1000 | 2000 | 3000 | 4000 | 5000 | 1000 | 2000 | 3000 | 4000 | 5000 | |
| Setup cost | 100 | 100 | 100 | 100 | 100 | 100 | 100 | 100 | 100 | 100 | 1000 |
| Holding cost | 40 | 80 | 120 | 160 | 200 | 40 | 80 | 120 | 160 | 200 | 1200 |
| Total cost | 140 | 180 | 220 | 260 | 300 | 140 | 180 | 220 | 260 | 300 | 2200 |

Case 2 (Produce 10,000 pieces as 1 batch)

| | Day 1 | Day 2 | Day 3 | Day 4 | Day 5 | Day 6 | Day 7 | Day 8 | Day 9 | Day 10 | Total |
|---|---|---|---|---|---|---|---|---|---|---|---|
| Demand | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 10000 |
| Production Qty | 1000 | 1000 | 1000 | 1000 | 1000 | 1000 | 1000 | 1000 | 1000 | 1000 | |
| Inventory | 1000 | 2000 | 3000 | 4000 | 5000 | 6000 | 7000 | 8000 | 9000 | 10000 | |
| Setup cost | | 100 | | 100 | | 100 | | 100 | | 100 | 500 |
| Holding cost | 40 | 80 | 120 | 160 | 200 | 240 | 280 | 320 | 360 | 400 | 2200 |
| Total cost | 40 | 180 | 120 | 260 | 200 | 340 | 280 | 420 | 360 | 500 | 2700 |

Notice that when both setup cost and inventory holding cost are taken into consideration, it is actually cheaper to divide the demand into 2 batches (Case 1).

Note that 5 change-overs are required to gather the 5 socket diameters that go into each set. Therefore, the smaller the demand batches, the higher the setup cost, but at the same time the inventory holding cost is reduced.

What happens if we divide the demand into even smaller batches?

Case 3 (Divide 10,000 pieces of demand into 5 batches of 2,000 pieces each)

| | Day 1 | Day 2 | Day 3 | Day 4 | Day 5 | Day 6 | Day 7 | Day 8 | Day 9 | Day 10 | Total |
|---|---|---|---|---|---|---|---|---|---|---|---|
| Demand | 0 | 0 | 2000 | 0 | 2000 | 0 | 2000 | 0 | 2000 | 0 | 2000 |
| Production Qty | 1000 | 1000 | 1000 | 1000 | 1000 | 1000 | 1000 | 1000 | 1000 | 1000 | |
| Inventory | 1000 | 2000 | 1000 | 2000 | 1000 | 2000 | 1000 | 2000 | 1000 | 2000 | |
| Setup cost | 200 | 300 | 200 | 300 | 200 | 300 | 200 | 300 | 200 | 300 | 2500 |
| Holding cost | 40 | 80 | 40 | 80 | 40 | 80 | 40 | 80 | 40 | 80 | 600 |
| Total cost | 240 | 380 | 240 | 380 | 240 | 380 | 240 | 380 | 240 | 380 | 3100 |

Our analysis shows that reducing the batch size further, from Case 1 to Case 3, increases the total cost.

Therefore, under the given conditions, the optimal demand batch size exists somewhere between 2,000 and 10,000.

How do we determine the optimal batch size? At first glance, this may seem like a case that requires a dynamic lot-sizing procedure such as Wagner-Whitin, because demand is not constant. However, this problem can actually be simplified and solved in a way similar to the EOQ calculation.

Without going into the details of the derivation, I will just give the result.
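As a sketch of how the result can be obtained (the symbols are my own shorthand for the numbers given above: total demand D = 10,000 pieces, production rate r = 1,000 pieces per day, holding cost h = $0.04 per piece per day, and setup cost per demand batch S = 5 × $100 = $500), the total cost of splitting demand into N equal batches is

$$TC(N) = N\,S + \frac{h\,D^2}{2\,N\,r} + \frac{h\,D}{2},$$

which reproduces the three cases above (TC(1) = 2,700, TC(2) = 2,200, TC(5) = 3,100). Minimizing over N gives

$$N^{*} = \sqrt{\frac{h\,D^2}{2\,r\,S}} = \sqrt{\frac{0.04 \times 10{,}000^2}{2 \times 1{,}000 \times 500}} = 2.$$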

Therefore the optimal demand batch size is 10,000/2 = 5,000, which is Case 1, and its total cost is about 18% lower than the “lot-for-lot” production of Case 2.

On the other hand, it is important to keep reducing setup time in order to reduce overall cost and improve demand responsiveness. For example, if S is reduced from 500 to 100, N* works out to 4.47 ≈ 5, i.e., a demand batch size of 2,000, with a total cost of roughly a third of Case 3’s original total cost.
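For readers who want to check the arithmetic, here is a minimal sketch (the parameter names and the helper function are my own, not part of any standard library) that reproduces the three cases and the optimal number of batches:

```python
import math

# Parameters from the case study above
D = 10_000   # total demand, pieces
r = 1_000    # production rate, pieces per day
h = 0.04     # holding cost, $ per piece per day
S = 500      # setup cost per demand batch (5 change-overs x $100)

def total_cost(n_batches, setup_cost=S):
    """Setup plus holding cost when demand is split into n equal batches.

    Each batch of D/n pieces builds up at r pieces per day, so the holding
    cost per batch is h * r * k * (k + 1) / 2, where k = (D/n)/r whole days.
    """
    k = D / (n_batches * r)
    holding = n_batches * h * r * k * (k + 1) / 2
    return n_batches * setup_cost + holding

for n in (1, 2, 5):
    print(n, total_cost(n))                 # 2700.0, 2200.0, 3100.0

print(math.sqrt(h * D**2 / (2 * r * S)))    # optimal N with S = 500 -> 2.0
print(math.sqrt(h * D**2 / (2 * r * 100)))  # optimal N with S = 100 -> ~4.47
print(total_cost(5, setup_cost=100))        # cost of 5 batches after SMED -> 1100.0
```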

From this calculation, we learn that:

  1. “Lot-for-lot” production is not necessarily the cheapest option when demand comes in sets; setup cost and inventory holding cost have to be weighed together.
  2. Under the given conditions there is an optimal demand batch size, and it can be found with a simple EOQ-like calculation rather than a full dynamic lot-sizing procedure.
  3. Reducing setup time (and hence setup cost) lowers the optimal batch size, which reduces total cost and improves responsiveness to demand.

Why Shortage vs Stock is NOT a Tradeoff?

October 16, 2010

I frequently come across factories that keep a lot of stock on hand while, at the same time, material shortage is among the top reasons for unexpected production downtime. The same goes for retail operations that keep safety stock levels arbitrarily high. More often than not, shortage is not reduced by keeping more stock.

At first glance, this seems counter-intuitive. After all, textbook stock management models indicate that the safety stock level and the probability of shortage, which directly affects service level, have an inverse relationship. Reduce stock and you increase the chance of shortage, which leads directly to lost sales opportunities. On the other hand, more stock means more working capital, more product obsolescence, more warehouse cost and so on; serving customers thus becomes a financial tradeoff between shortage and stock levels.
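As a reminder of that textbook relationship, here is a minimal sketch (the normal-demand assumption and the parameter values are illustrative, not taken from the examples in this post):

```python
from math import sqrt
from statistics import NormalDist

# Illustrative parameters only
sigma_d = 30        # standard deviation of daily demand, pieces
lead_time = 4       # replenishment lead time, days

def service_level(safety_stock):
    """Probability of covering demand variability over the lead time,
    assuming normally distributed demand."""
    sigma_lt = sigma_d * sqrt(lead_time)
    return NormalDist(0, sigma_lt).cdf(safety_stock)

for ss in (0, 50, 100, 150):
    print(ss, round(service_level(ss), 3))
# More safety stock -> higher service level: the textbook tradeoff.
```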

In practice, this is not necessarily true. In fact, I argue that in order to reduce shortage, you need to reduce your stock level first.

The reason is that excess stock has many side effects that are not accounted for by the textbook model. The most notable are the following:

Inefficient use of procurement budget

Typically, a fixed budget is allocated for procurement. A high stock level reduces the flexibility to allocate that budget to purchase what is really needed. This is commonly seen not only in retail but also in large manufacturers’ raw material procurement and in sales company operations.

Loose inventory management practice

An excess stock level tends to create a false peace of mind for managers. When less attention is given to keeping stock levels low, turnover varies more widely across SKUs, and replenishment of the SKUs that operations really need is more likely to be forgotten.

Lack of continuous improvement incentives

This is the classical lean wisdom: when the stock level is excessively high, problems are hidden behind the stock. There is no pressure to improve supply chain responsiveness by reducing manufacturing lead time or improving overall material flow synchronization. Eventually, demand and competition will catch up with the limits of supply chain responsiveness. For example, studies have shown that US automobile manufacturers tend to keep higher dealer stock than their Japanese counterparts, and this has been one of the major differences in competitiveness between US and Japanese automotive companies.

The classical inventory model assumes that inventory decouples the supply and demand functions. In practice, supply chains are complex, and you cannot simply decouple supply and demand with stock. The key to reducing shortage by reducing stock is to actively manage this complex relationship through better synchronization of manufacturing with customer demand.

Better synchronization can be achieved in two ways: more accurate forecasting and greater flexibility of execution. In today’s market of increasing demand volatility, there is a limit to how accurately you can predict the future by improving forecasting. Lean methodology, however, has taught us that there is almost no practical limit to improving execution.

Take the example of a manufacturing plant that I visited recently. They make products that are distributed across the US through a franchise network. Five years ago, their plant inventory alone stood at 90 days and they met only 70% of customer orders. Today, their combined plant and downstream distribution center inventory is less than 80 days, and they are meeting more than 96% of orders. The key to this change is that the manufacturing plant is now accountable not only for plant inventory but also for downstream DC and, soon, warehouse inventory. The overall stock reduction target has put the manufacturing operation under pressure to reduce lead time and improve flexibility, and any such improvement affects downstream supply chain stock levels as well as customer satisfaction. This cannot be achieved when manufacturing and supply chain stock are managed separately, as silos. To reach the next level of operational performance, they are evaluating a unified IT platform across manufacturing and the supply chain.

By the same token, manufacturers are now using the latest information technology to synchronize better with their suppliers. An industrial equipment manufacturing plant that I visited has implemented an IT platform that lets them see, in real time, the progress of WIP on their suppliers’ production lines. This visibility allows them to synchronize the flow of key supplied parts with their in-house final assembly operations, reducing both overall inventory and shortage.

Are you wondering why your operation keeps high stock levels but still cannot reduce shortage? Do you see your manufacturing operation driving supply chain stock management? What is keeping your manufacturing operation from better synchronization with supplier operations as well as market demand?

The Neglected Law of 6-Sigma

September 18, 2010

I have just come back from a consulting engagement at a manufacturing plant. This plant is the most successful in a global enterprise. For the past six years it has exceeded its improvement targets in productivity and order fulfillment, and even in this economy it has been turning out record profits. The marketing department loves to promote its products because of their high profit margins. Success has brought the plant more pressure, because any further improvement has a visible impact on overall company performance. I couldn’t help but ask, “Why would such a successful operation need any sort of consulting, rather than simply continuing to do what has made it successful?”

This turns out to have something to do with the natural law of business process.

Take the example of order fulfillment, their most important metric. They were at 70% six years ago. A yearly improvement target of 5% has taken them to around 96%. But now they are hitting a wall. Why should the last few percentage points be so much more difficult than the rest? The basics of 6-sigma cast light on this problem.

Going from 70% to 96% is the journey from 1 sigma to 2 sigma. The natural law of business processes says that each additional sigma level requires the same or a greater level of effort than the last. Frequently it is a totally different ball game, requiring significant resource investment, skill acquisition, technology advancement and cultural transformation to ramp up each sigma level.
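For reference, here is a quick sketch of the plain (unshifted) normal coverage behind those sigma levels; the rough mapping of 70% to 1 sigma and 96% to 2 sigma follows from it:

```python
from statistics import NormalDist

std_normal = NormalDist()
for sigma in (1, 2, 3):
    within_spec = std_normal.cdf(sigma) - std_normal.cdf(-sigma)
    print(f"{sigma} sigma ~ {within_spec:.1%}")
# 1 sigma ~ 68.3%, 2 sigma ~ 95.4%, 3 sigma ~ 99.7% of outcomes within spec
```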

They have relied on automation and Lean methodology for the past six years and have succeeded in the journey from 1 to 2 sigma. To get to 3 sigma, I suggested that they start applying scientific and statistical tools to their business processes. Without more detailed measurements and the appropriate quantitative methods alongside Lean, there is a limit to how much further they can improve. It is indeed a different ball game, and they need enabling information technology more than ever to get the data and visibility required for scientific management. Years of neglected IT investment in manufacturing may now be limiting the further growth of the operation.

Does your operation set its key improvement targets, and the associated resource and infrastructure investments, based on a target sigma level? How far can you go on the 6-sigma journey without implementing enabling information technologies?

Golf and Factory Dynamics

May 10, 2010

It can be interesting to take a brief moment on the golf course and think about how queues and production lines should work. Some food for thought:

  1. Why is there always a longer wait before, but not after, a short Par 3 hole, even though a Par 3 is supposed to be finished faster than the other holes?
  2. How should a golf course manager decide the release timing of player groups onto the course in order to maximize revenue while minimizing wait time?
  3. What would be the impact on wait time and throughput of changing from foursomes to, say, twosomes or fivesomes?

 

These questions are related to the following questions about a factory (a small queueing sketch follows the list):

  1. What are the causes of inventory before a workstation? What is the nature of the bottleneck?
  2. How should the release times of orders be decided in order to maximize throughput and minimize cycle time?
  3. What is the impact of changing batch size on cycle time and throughput?
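As a toy illustration of the second pair of questions (my own sketch with made-up numbers, not from the post), a single bottleneck hole, or a bottleneck workstation, can be treated as a one-server queue fed at a fixed release interval:

```python
def average_wait(release_interval, bottleneck_time, n_groups=100):
    """Average queueing delay at a single bottleneck (hole or workstation)
    when groups or orders are released at a fixed interval."""
    free_at = 0.0
    total_wait = 0.0
    for i in range(n_groups):
        arrival = i * release_interval
        start = max(arrival, free_at)
        total_wait += start - arrival
        free_at = start + bottleneck_time
    return total_wait / n_groups

for interval in (8, 10, 12):
    print(interval, round(average_wait(interval, bottleneck_time=10), 1))
# Releasing faster than the bottleneck can absorb (interval < bottleneck_time)
# makes the queue in front of it, and hence the wait, grow without bound.
```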