Wednesday, February 27, 2013

Focus and Leverage Part 190

In my last posting I told you I would take a look at some other performance metrics and see how they impact our improvement efforts.  I’m going to delay that posting because I want to share an experience I had with one of my healthcare client’s teams.  Although I won’t go into the details of the experience, I will tell you that they had proposed a change in the way a specific process is being run.

It has been said many times that the natural tendency of people is to resist change, and in many ways I believe this premise.  Assuming this resistance is real, why is it that people resist change?  If you ask most people this question, you’ll probably get a response like, “it’s outside the comfort zone of the people being asked to change.”  I know from my experience that this is one of the most often heard responses to this question.  Getting people to change is almost an art, but I’m here to tell you that it doesn’t have to be as difficult as some people make it.

When confronted with an opportunity to implement an improvement, many times we take the easy way out when we face this resistance by developing a compromise.  A compromise means letting go of part of what we want and giving more of what the “changers” want.  If we have learned but one thing from the late Eli Goldratt, it is that we should never compromise!  A compromise is essentially a win-lose scenario, when in fact we should only want to come away with a win-win one.

This team I mentioned earlier had a great idea about how to reduce the financial impact of missed billings.  They had studied lost billings due to immunizations, but quickly realized that their solution would apply to other areas such as various medical tests, especially the more expensive tests like EKG’s, Point of Care Testing, etc.  In fact, the amount of money lost due to immunization billing errors paled in comparison to these other tests.

So knowing that we have an excellent solution, the question becomes how do we present it without a compromise?  From experience I know that as long as we think the only way to handle a conflict, such as trying to change a process, is by compromising, we won’t be successful in making the change.  What needs to happen is that we must surface the assumptions about why we believe there will be resistance to the process change we are going to propose.  If we never think about the underlying assumptions and how to remove at least one of them, we’ll never find the way to eliminate the conflict and “sell” our breakthrough solution.  In fact, we’ll simply lower our expectations and continue with business as usual.

The first and most profound obstacle to change is that people believe that reality is complex and sophisticated.  And because we believe this, we have a tendency to believe that complex problems require complicated solutions.  Goldratt introduced us to the concept of Inherent Simplicity which clearly states that complex problems require simple solutions.  In other words, the more complicated the situation seems to be, the simpler the solution must be.

Earlier I mentioned that we need to develop a win-win solution, so how do we do this?  The first place to start is by constructing the solution around the other party’s win, but not the win that is in conflict.  If we want our win to be bigger, we have to ensure that the other side’s win will be bigger.  In other words, we must demonstrate that by applying our solution, the side we are asking to change will see immediately that there is a win in it for them.

A good solution deals with the core conflict in that it changes an underlying assumption and therefore significantly changes the situation for the better.  When you present your solution effectively, you immediately face a reality that is very different from the reality you’re currently in.  We must first transfer ourselves into the future to realize the situation that will exist after the solution is implemented and then communicate that reality effectively.  So back to our GB project.

The figure below is a simplified current reality tree that summarizes the most prominent Undesirable Effects (UDE’s) encountered by the team.  In order to solve the billing error problem, the team had to identify a core problem that, if eliminated, would reduce the impact of many of these UDE’s.

The team concluded that because the MD’s were not entering their immunization orders and were instead giving verbal orders, the Medical Assistants (MA’s) made errors trying to translate what the MD had said.  And if there were translation errors, then the charges would be incorrect.  And when the front desk scanned the incorrect documents to the billing company, then the revenue from billing would be missing.  The team then concluded that if the MD’s would simply enter their own orders (bottom entry on the simplified CRT), then most of the UDE’s would disappear.  The other problem was that the billing documents were sometimes unreadable, so the team recommended that the billing document be redesigned to correct this problem.

So how could this simple solution (i.e. MD’s entering their own orders into the database) be a win-win?  Quite simply, because of this simple change, there were other forms of paperwork that the MD would no longer have to fill out, as they would now be completed by the MA’s.  The result was that the MD could now see more patients.  The MA’s liked this solution because they would know exactly what the MD’s orders were, and they could prepare the immunizations, paperwork, etc. while the MD was still seeing the patient.  The patients would like this because their wait time would be reduced significantly.  As soon as the MD opened the exam room door, the MA, having all that was needed to give the immunization, would simply walk in and administer the vaccine, and the patient would leave.  The organization would win by significantly reducing the lost revenue.  So the team created a win-win-win solution which will be very simple to sell.

Bob Sproull


Sunday, February 24, 2013

Focus and Leverage Part 189

In my last posting I demonstrated how efficiency could have a negative impact on your operation, but before moving on to a solution or a replacement for this metric, I want to look at the negative impact of using efficiency as a metric a bit further.  What I demonstrated in my last posting was what the process would look like after 100 minutes.  Let’s look at the impact of running this process for a full shift and then 3 full shifts.  And by the way, for you purists out there, these examples assume 100% uptime (i.e. no breakdowns, no breaks or lunches).

In the figure below we see the impact on the process of running at top speed (100% efficiency) for a full 8 hours.  Since Step 1 only takes 10 minutes to produce a part, in a full 8-hour shift it produces 48 parts and passes them on to Step 2.  Because Step 2’s capacity is only one part every 20 minutes, it can only process half of Step 1’s parts, or 24 of them.  Steps 3 and 4 have the capacity to process all 24 of the parts produced by Step 2.  So after 1 shift we see that the accumulated WIP in front of Step 2 is 24 units while the finished goods inventory is also 24.
The efficiencies at both Steps 1 and 2 are 100% and, according to cost accounting beliefs, both are doing their jobs.  But because the Operators at Steps 3 and 4 are well below 100%, they are not doing their jobs and should probably be disciplined.  So as before, the overall system efficiency is hovering around 69%.  So, based upon what we’ve seen here, does it make any sense at all to continue running Step 1 at capacity?  Need more proof?  Let’s look at the results after 3 shifts of operating this way.
After running this process for three full shifts (i.e. 1,440 minutes), look what’s happened to the WIP levels!!  Step 1 produced 144 units during the 3 shifts, but Step 2 could only process half of them (72), so the WIP has exploded to 72 units.  We were able to ship 72 units, but believe it or not, WIP costs a company cash.  So again I ask you….does it make any sense at all to continue running Step 1 at its full capacity?
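The shift arithmetic above can be checked with a few lines of code.  This is a minimal sketch of my own (not a formal model), assuming the cycle times from the example: Step 1 produces a part every 10 minutes, Step 2 every 20 minutes, and Steps 3 and 4 keep up with whatever Step 2 releases.

```python
# Minimal sketch of the 4-step line: Step 1 runs flat out while
# Step 2 (the constraint) paces what actually ships.
def run_line(minutes, step1_ct=10, step2_ct=20):
    produced = minutes // step1_ct   # parts Step 1 pushes downstream
    shipped = minutes // step2_ct    # parts the constraint can pass
    wip = produced - shipped         # pile-up in front of Step 2
    return produced, shipped, wip

for shifts in (1, 3):
    minutes = shifts * 480           # 8-hour shift, 100% uptime
    produced, shipped, wip = run_line(minutes)
    print(f"{shifts} shift(s): produced={produced}, shipped={shipped}, WIP={wip}")
```

Running it reproduces the numbers in the text: 48 produced, 24 shipped, and 24 WIP after one shift; 144 produced, 72 shipped, and 72 WIP after three.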

OK, so if efficiency isn’t such a great metric, then how should we appraise how well our overall process is performing?  Let’s review just the first three of Goldratt’s 5 Focusing Steps and see if we might be able to move closer to the answer:
1.  Identify the system constraint, or which step is limiting achievement of the goal of the process.
2.  Decide how to exploit the system constraint, or how do we get more out of our constraint.
3.  Subordinate everything else to the above decision, or for our 4-step process, don’t outrun the constraint.

Let’s return again to our simple 4-step process.  In our process, Step 2 has been identified as the system constraint simply because it has the least amount of capacity to produce units.  If we wanted to improve the output of our process, we must improve the capacity of Step 2 by reducing its cycle time.  And if we wanted to reduce the overall WIP in our process, then we must run all four steps at the same rate as the constraint, even though the other three steps can all produce units at a higher rate.  In other words, we must subordinate them to the constraint.
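The effect of subordination can be seen with the same simple arithmetic.  This is a sketch under assumed cycle times (work released every 10 minutes at full speed, versus every 20 minutes at the constraint’s pace), not the author’s actual model:

```python
def line_metrics(minutes, release_ct, constraint_ct=20):
    """Release work every `release_ct` minutes; the constraint
    (cycle time `constraint_ct`) sets what actually ships."""
    released = minutes // release_ct
    shipped = min(released, minutes // constraint_ct)
    return shipped, released - shipped   # (throughput, WIP)

# Step 1 at full speed vs. subordinated to the constraint's pace:
print(line_metrics(1440, release_ct=10))  # (72, 72)
print(line_metrics(1440, release_ct=20))  # (72, 0)
```

Slowing Step 1 down to the drumbeat of Step 2 leaves shipments unchanged at 72 units while the WIP pile disappears, which is exactly the point of subordination.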
Efficiency in the cost accounting world is considered a measure of productivity, so we need an equivalent measure to use.  Many times in this blog I have discussed the benefits of using Throughput Accounting (TA).  The basis for TA comes from three individual measurements:

1.  Throughput is defined as the rate at which a company generates money through sales.  This doesn’t mean that we produce product to sit in a warehouse; it means product that is actually sold.

2. Operating Expenses includes all expenses except totally variable costs, which in most cases are direct material costs.  Another way of saying this is all expenses incurred when turning inventory into throughput.  This includes all labor costs (i.e. both salaried and hourly labor costs).

3. Inventory includes all purchases that a company makes for things it intends to sell.

The key is that throughput isn’t considered throughput until money is received from the sale of products.  In looking at it this way, we are forced to consider the entire system rather than localized parts of the system.  So how do we measure productivity in the TOC world?  The way we do it in the TOC world is to divide throughput by operating expenses or T/OE.  For example, if each product sold for $100 and our OE was $25, then our productivity measure would be 4.0.  If we improved our throughput to $200 and our OE remained the same, our new level of productivity would be 8.0.
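The T/OE calculation above is simple enough to express directly; the dollar figures below are just the ones from the paragraph:

```python
def productivity(throughput, operating_expense):
    """TOC productivity measure: Throughput divided by Operating Expense."""
    return throughput / operating_expense

print(productivity(100, 25))  # 4.0
print(productivity(200, 25))  # 8.0 -- throughput doubled, OE unchanged
```

Notice that the measure improves only when T grows faster than OE, which is why TOC focuses improvement effort on the constraint (which raises T) rather than on across-the-board cost cutting.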

The natural managerial tendency for many companies that utilize traditional Cost Accounting (CA) is that when excess labor capacity is exposed, there should be a labor reduction.  So when operators in non-constraints have no product to work on, as in our example process, in the CA world this is considered a bad thing and lay-offs typically result.  However, in my way of thinking, if a company is truly interested in moving forward into a world-class competitive arena, lay-offs should be avoided at all costs.  I do realize that sometimes, due to economic conditions, a company can’t avoid lay-offs, but these times must not become a routine way of doing business.  Lay-offs can clearly send the wrong message to the remaining employees about the effects of improvement programs, and the improvement effort will come to a virtual standstill.

So what happens in companies that embrace TOC?  What happens when excess capacity is realized?  These companies do one of two things.  One thing they do is focus on developing new products and entering into new markets.  The problem with traditional cost accounting is that companies that make decisions based on cost accounting metrics view labor as a variable cost rather than a fixed cost.  If, for example, you keep the same labor in place, but you create new markets, the new revenue, minus totally variable costs (e.g. raw material costs, sales commissions, shipping costs), all flows directly to the bottom line!  Wouldn’t this be better than laying people off?  To me it clearly is!!

The other possibility is to increase market share of existing products by differentiating your company through things like superior on-time delivery, superior quality, and sometimes even price breaks into other market segments as long as these reductions don’t create a price war.  So how would you be able to increase your capacity to support additional shipments?  By focusing your improvement efforts on the constraint to drive throughput levels higher.

In my next posting, we’ll take a look at some other performance metrics and see how they impact our improvement efforts.

Bob Sproull


Saturday, February 23, 2013

Focus and Leverage Part 188

In the next few postings I want to talk about some of the basics of Continuous Improvement (CI).  Not the tools of CI, but rather more of a strategic viewpoint.  In other words, some of the principles and guidelines I use in my work, which have provided me with a strategy that I can honestly say has never failed to deliver excellent results.  But in order for everyone to clearly understand my approach, I need to back up and review some of the basic principles I believe in and aspire to.
Let’s start this discussion with the end product…..performance metrics.  Why start with the end in mind, you may be thinking?  In my way of thinking, the key purpose of performance metrics is that they drive behaviors within an organization.  For example, if operator efficiency is one of your metrics, what behaviors does it drive?  Efficiency is calculated against a standard time to complete a task or process step.  If you complete the task in exactly the time the work standard says it should take, then your efficiency is 100%.  On the surface, that seems like something we all want, but let’s look at this more closely.
The figure above is a simple four-step process used to manufacture a product.  Using the definition of efficiency, in order for Step 1 to achieve and sustain 100% efficiency, it would have to produce one product every ten minutes.  Simple….right?  But, in reality, what would happen after 100 minutes to this process if Step 1 continued producing one part every 10 minutes?

If Step 1 produced one part every 10 minutes and Step 2 could only process 1 part every 20 minutes, then Step 2 would only process 5 of the parts in 100 minutes.  Step 3 would process all 5 of these parts as would Step 4.  In other words, there would be 5 parts sitting directly in front of Step 2 waiting to be processed.  This isn’t exactly what the results would look like, but for demonstration purposes, it will suffice.  The point is, if Step 1 continued producing at its maximum capacity, the inventory would continue to build up in front of Step 2.  Steps 1 and 2 would be operating at 100% efficiency, but what about Steps 3 and 4?  Would it be possible for these two steps to reach 100% efficiency?

The problem with using efficiency as a performance metric is that it is controlled by the step with the least amount of capacity…..the system constraint.  The total system efficiency will always be less than 100% simply because of the existence of the system constraint.  The system efficiency for our process would be 100% (Step 1) plus 100% (Step 2) plus 50% (Step 3) plus 25% (Step 4), divided by 4, or 68.75%.  In fact, the only place where efficiency makes any sense at all is in the system constraint, which in our process is Step 2.
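The system-efficiency figure is just the mean of the four step efficiencies.  A quick sketch, deriving each step’s efficiency from its busy time over the 100-minute window (the cycle times for Steps 3 and 4 are the ones implied by the 50% and 25% figures):

```python
# Each step's efficiency = time spent working / time available.
# Cycle times (min): Step 1 = 10, Step 2 = 20, Step 3 = 10, Step 4 = 5;
# Step 2 (the constraint) paces the line at 5 parts per 100 minutes.
window = 100
parts_through_constraint = window // 20        # 5 parts
busy = {
    "Step 1": (window // 10) * 10,             # runs flat out: 100 min
    "Step 2": parts_through_constraint * 20,   # 100 min
    "Step 3": parts_through_constraint * 10,   # 50 min
    "Step 4": parts_through_constraint * 5,    # 25 min
}
efficiencies = {step: b / window for step, b in busy.items()}
system_eff = sum(efficiencies.values()) / len(efficiencies)
print(f"system efficiency = {system_eff:.2%}")  # 68.75%
```

The average can never reach 100% as long as any non-constraint step has more capacity than the constraint, which is exactly why chasing the metric drives the wrong behavior.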

The origins of operator efficiency lie in traditional cost accounting, where the belief exists that everyone should be busy 100% of the time.  In our process it is clear that the operator in Step 3 would only be busy half of the time while the operator in Step 4 would only be busy one quarter of the time!  The Cost Accountants would never stand for this and would be looking for manpower reductions!

So if operator efficiency isn’t a good performance metric (except in the constraint), then what is?  In other words, how should we measure the performance of our 4-step process?  In my next posting we’ll try to answer this question.

Bob Sproull


Friday, February 22, 2013

Focus and Leverage Part 187

I’m working on a very interesting project in a healthcare environment this week.  It isn’t my usual type of project, where we are interested in improving the throughput of patients through a process like an Emergency Department, but it’s clear that improved throughput will happen as a result of this team’s efforts.  This project involves the billing for immunizations at a healthcare organization’s 6 different satellite operations.  It seems as though there are numerous billing errors that translate into lost revenue for the hospital, so they called me in to help them reduce these errors.  The team make-up was important because it included Medical Assistants (MA’s), front desk personnel, and an expert in Clinical Informatics, plus two Green Belts.

The first thing I had the team do was create a Pareto Chart of the six satellite operations to determine the distribution of billing errors between the operations, using data collected over a three-month period (Oct-Dec, 2012).  As you can see, there is a wide disparity between the offices in terms of number of billing errors.  On the surface one might conclude from this analysis that we should review the process in Location # 2 and have all sites use its process.  This would be a good approach if all things were equal, but digging deeper into the data indicated that there is a wide disparity between locations in terms of the number of immunizations delivered (i.e. number of patients immunized).
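To illustrate why raw error counts can mislead, here is a sketch with made-up numbers (the client’s actual data isn’t shown here): a location with the fewest raw errors can still have the worst error rate once you normalize by immunization volume.

```python
# Hypothetical data -- illustrative only, not the client's numbers.
errors = {"Loc 1": 40, "Loc 2": 5, "Loc 3": 25, "Loc 4": 18, "Loc 5": 30, "Loc 6": 12}
volume = {"Loc 1": 2000, "Loc 2": 50, "Loc 3": 1000, "Loc 4": 900, "Loc 5": 600, "Loc 6": 400}

by_count = sorted(errors, key=errors.get, reverse=True)
by_rate = sorted(errors, key=lambda k: errors[k] / volume[k], reverse=True)
print("worst by raw count:", by_count[0])   # Loc 1
print("worst by error rate:", by_rate[0])   # Loc 2 (5/50 = 10%)
```

In this fabricated example, the site that looks best on the Pareto of raw counts is actually the worst performer per immunization delivered, which is exactly the trap the team avoided by digging into volumes.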

The next thing I had the team do was to develop a causal chain to determine potential root causes for billing errors across the different satellite operations.  This effort proved to be very valuable from a problem solving perspective, but even more so it was very enlightening for the Medical Assistants who comprised the majority of the team.  It helped them crystallize their thoughts about what was happening in their own site location.


For those of you who may not have used a causal chain before, we started by developing a very succinct problem statement, which in our project was Billing Revenue Lost.  The object (Billing Revenue) is listed on top of the line and the state that it’s in is listed directly beneath it (Lost).  We then asked the question “why?” to determine potential reasons why revenue was being lost.
The answer to this first “why?” was actually five potential reasons, as follows:

1.    Order (Object) – Not documented in TW (State).  TW is the name of the database being used.

2.    Order (Object) – Not Bubbled (State)

3.    Billing Company (Object) – Commits Errors (State)

4.    Charges (Object) – Incorrect (State)

5.    Standardized Work (Object) – Not followed (State)

We continued asking why until we arrived at an actionable item we could work on for each of the five potential root causes.  The team’s most enlightening potential root cause listed on the chain had to do with the standardized work which had been developed for each of the sites.  When the team asked, “why is the standardized work not being followed?” there were three reasons:

1.    The standardized work was not communicated, meaning that several of the MA’s didn’t even know it existed.

2.    The standardized work was not “owned” by the MA’s.

3.    The standardized work was not enforced by the leadership.

For reasons 1 and 2, the team asked “why?” again and concluded that the MA’s weren’t involved in the development of the standardized work so they didn’t own it and didn’t know it was even available to follow.  As I’ve written about many times in this blog, if the subject matter experts (in this case the MA’s) aren’t involved in creating the process, it won’t be credible to them.  And because it isn’t credible, it won’t be followed and used.  It really matters not what the industry is, in every case what I typically see is leadership creating the process with very little, if any, input from the SME’s so it’s not owned, believed or used.

The team then reviewed each potential root cause to determine one of three different “ratings.”  If they believed the entity was “solvable” it was color-coded as green; if they believed the entity could be influenced, it was color coded as yellow; if they did not believe they could solve the entity, it was color-coded as red.  In all cases it was determined that all entities were either “solvable” or “influenceable.”

The team then decided to develop process maps for each of the different locations and then perform a value analysis on each one.  What the team found was very interesting in that, not surprisingly, each of the locations had a completely different process.  Although I’m not at liberty to visually display each of the different process maps here, suffice it to say that some of the differences were quite significant, as were the number of billing errors.

In addition to the five current state maps, the team then developed two separate future state maps, one for scheduled patients and one for walk-in patients as demonstrated in the figures above.  This was an important step because in order for the team to improve the process (i.e. build their future state), it had to first understand their current reality.

In the table below we see the results of this team’s actions in terms of the before and after for total number of steps, value-added, non-value-added, and non-value-added but necessary and as can be seen, the differences were quite dramatic.  Of particular interest was the reduction in the number of yellow and red steps.

The team is very confident that if the two new future states are implemented, the number of billing errors will be significantly reduced (i.e. >75%), but there is another benefit that is much larger.  The team was able to identify a key policy constraint that is delaying the administration of immunizations.  The team reasoned that if the physicians will simply enter the order into the database, instead of waiting to verbally explain to the MA’s what the order is when they exit the examining room, the MA’s can prepare all of the necessary forms, supplies, medication, etc. needed to administer the vaccine in advance, thus significantly reducing the wait time for the patients.  The team believes that breaking this policy constraint will have several very important effects:

1.    By decreasing the patient wait times, patient satisfaction scores will increase which could have a positive impact on reimbursement rates.

2.    Since the overall immunization cycle time will decrease, the number of patients that can be seen each day by each location should increase, thus increasing the overall revenue base.

3.    Because the MA will no longer be getting verbal orders from the physicians, the number of errors due to interpretation of the verbal orders will decrease significantly.

These effects are clearly very important, but the team believes that a significantly larger financial impact will be seen if their new process is expanded to other processes such as EKG’s, Point of Care Testing, etc.  These items are much higher revenue generators than the revenue gained through immunizations, so the impact on the organization’s bottom line could be substantial.

This team learned several important things during this event, not the least of which was the value of bringing together the true subject matter experts, the people actually doing the work, to develop the “best” way to perform different functions.  Comments like, “I finally have a voice!” were not uncommon.

One of the things I always do in an event like this is to measure the team’s pulse before and after the event.  The purpose of this action is to get a feel for the team’s belief in two distinct areas:

1.    Does the team believe that what they came together to do would be difficult or easy?

2.    Does the team feel that it will be empowered or authorized to make their recommended changes?

The figure below is the tool I use to measure the pulse of the team by having them place red and green dots in one of four quadrants:

·         Quadrant 1:  Hard to do and not authorized to make changes

·         Quadrant 2: Easy to do, but not authorized to make changes

·         Quadrant 3: Hard to do, but empowered to make changes

·         Quadrant 4:  Easy to do and empowered to make changes


At the beginning of the event the team is instructed to place a red dot in one of the four quadrants, and then a green dot at the conclusion of the event.  At the beginning of the event, for the most part this team felt empowered, but believed that it would be difficult to make changes.  At the end of the event, the team believed that they were indeed empowered and that the changes would be easy to make.

Bob Sproull

Sunday, February 17, 2013

Focus and Leverage Part 186

I was having a conversation at the Pittsburgh airport with a man who was carrying a copy of Bruce’s and my book, Epiphanized.  He had so many questions for me that I was worried I would miss my flight back to Georgia.  I must admit that most of the questions were quite good, and my answers settled a lot of issues in his mind.  I thought in this posting that I would share one of the questions he had and how I answered it.

The first question he asked me had to do with Throughput Accounting.  You see, he was an Accountant and didn’t understand why we needed a different accounting system when we had traditional Cost Accounting available.  I just smiled and thought to myself, where should I start?  I followed his question with a question of my own.  I asked him if he thought manpower efficiency was a good metric and he immediately replied, “Yes, of course I do!”  I asked him why he thought it was a good metric and, even though he had read our book, he told me that it was a great way to check on manpower requirements.  He further said that if efficiency was low, then the workers weren’t doing their jobs.  I then took out a piece of paper and drew my infamous piping diagram.
I asked him my usual question, “If you wanted to increase the amount of water flowing through these pipes, what would you have to do?”  He responded by saying, “That’s simple, you would have to increase the diameter of Section E.”  I asked him why not just open up Section G’s diameter?  He told me that would be stupid since no additional water would flow.  So I then drew a simple Emergency Department process diagram and asked him the same question about what he would have to do to increase the number of patients passing through this process.
After giving it some thought, he said that the time for consult (55 minutes) would have to be decreased.  I said, “You mean like the diameter having to be increased in the piping diagram?”  He said yes.  He then asked me what all this had to do with efficiencies.  I asked him if he thought it would be a good idea to drive this ED process’s efficiency higher and he told me it would be a great idea.  I then asked him how he would do this and his simple reply was, “Have everyone run their part of the process as fast as they could.”  I then asked him what would happen if he ran the first two steps in this process as fast as they could.  He thought about it for a bit and simply said, “I get it!  If you run these steps as fast as you can, you’ll just stack up people waiting to consult with the physician.”  We then talked about the steps after Consult and his conclusion was that they are at the mercy of the consult step.  He had a much better idea of why I dislike efficiency so very much, but when I asked him what he thought about this metric now, he looked me square in the eye and simply said, “I need to go catch my flight.”
Bob Sproull

This week.....

For everyone interested, I will begin posting on my blog this week.  I will be in Chicago working on a VSA and will have time in the evenings to resume postings.  Thanks to everyone for their patience.

Bob Sproull

Saturday, February 9, 2013

Thanks for all of your kind words.....

I want to thank everyone for all of the kind words I have received over the past several days.  Your emails really meant a lot to me and helped me through this difficult time.  Today we buried my sister and I have to say the service today was the best I have ever been a part of.  Deana, my Down syndrome sister, was indeed a very special woman and based upon the turn-out these past three days, she had many friends who loved her.  Deana was an avid basketball fan and went to every game while her health was good.  Last night 10 former high school basketball players all showed up to pay their final respects to Deana and told us that since Deana was always there for them, they had to be there for her.  The volunteer fire department, all 20 of them, also showed up and paid a tribute to her.  It was all so touching.  The funeral director delivered the final sermon for Deana with tears running down his face.  Deana will definitely be missed...........

Bob Sproull

Thursday, February 7, 2013

Posting Delay

For everyone waiting for my next posting.....I had a death in my family and haven't been able to focus on anything else.  My sister was a true inspiration for everyone around her and she will be missed very much.  She was a Down's Syndrome woman who outlived all expectations.  She never had a bad day and if you were around her, you would simply smile at her.  I'm on my way to her funeral to say goodbye......

Bob Sproull