Problem Solving Pitfalls

Just for the reader’s information, I’m going to start a run here for a couple of weeks about problem solving.  Some of these points I have touched on in other venues, but they seem to fit together as their own mini-series.  This isn’t a “How To” so much as ‘Some of the Stuff They Don’t Tell You’.  Pretty much any structured problem solving method leans heavily on data.  The good news is that most companies have a pile of numbers that can be used to identify problems.  The bad news is that these numbers often lack some key characteristics that would make them truly useful.  There are some serious pitfalls to be aware of as you dig through the data.

One of the usual suspects to look for when using data is the context it comes from.  Does the data have a time or sequential relevance?  How can you tell what has changed in the process or the product that may have driven the numbers?  Put another way, what are the known special causes that can be filtered out so that only the unknown special-cause and common-cause variation remains?  Data can almost never be taken at face value as a reflection of a stable reality.
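To make that concrete, here is a minimal sketch of the idea in Python.  The readings, days, and "known change" log are all invented for illustration: tag the data points explained by documented process changes, strip them out, and only then judge the remaining variation, here with simple XmR-style control limits.

```python
import statistics

# Hypothetical daily defect rates, with days 4 and 5 covered by a
# documented process change (e.g., a tooling swap) in the change log.
readings = [
    (1, 4.1), (2, 3.9), (3, 4.3), (4, 7.8),
    (5, 7.5), (6, 4.0), (7, 4.2), (8, 4.4),
]
known_special_causes = {4, 5}  # days explained by known changes

# Filter out the windows explained by known special causes so only
# unknown special-cause and common-cause variation remains.
baseline = [value for day, value in readings if day not in known_special_causes]

# Simple XmR-style limits on what is left: mean +/- 2.66 * average
# moving range (the standard individuals-chart constant).
moving_ranges = [abs(b - a) for a, b in zip(baseline, baseline[1:])]
center = statistics.mean(baseline)
avg_mr = statistics.mean(moving_ranges)
ucl, lcl = center + 2.66 * avg_mr, center - 2.66 * avg_mr
print(f"center={center:.2f}, limits=({lcl:.2f}, {ucl:.2f})")
```

Anything outside those limits in the remaining data is a candidate unknown special cause worth chasing; everything inside is just the process being the process.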

A second area to dig into is how well the data reflects what you are trying to measure about the process.  How direct and traceable are the measures to the actual process?  Do the numbers have to be combined and factored into something, or are they transparent?  Is the data a leading or a lagging indicator?  Is it timely or delayed?  How much do the financial reports reflect actual dollars versus some sort of calculated dollar figure?  All of these are important to understand in order to determine where you should be spending your time and how you need to leverage resources.

Once you can harness the data, gather the context it comes from, and understand exactly what it tells you, there is another key step: verifying your measurement system.  Whether this step takes the form of a Gage R&R, a human verification, or whatever your MSA needs to be, it has to be done.  You have to know that the data you are getting is a reflection of what you are attempting to measure.
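For readers who haven’t run one, here is a deliberately simplified sketch of the Gage R&R idea, with made-up measurements.  This is not the full AIAG averages-and-ranges or ANOVA method; it just shows the core question: how much of the total variation comes from the measurement system (equipment plus appraisers) rather than the parts?

```python
import statistics
from collections import defaultdict

# Hypothetical study: 3 operators each measure 5 parts twice.
# measurements[(operator, part)] = [trial1, trial2]
measurements = {("op_a", p): [10.0 + p + d for d in (0.02, -0.01)] for p in range(5)}
measurements.update({("op_b", p): [10.0 + p + d for d in (0.05, 0.03)] for p in range(5)})
measurements.update({("op_c", p): [10.0 + p + d for d in (-0.04, 0.00)] for p in range(5)})

# Repeatability (equipment variation): pooled variance of repeat trials.
ev_var = statistics.mean(
    statistics.variance(trials) for trials in measurements.values()
)

# Reproducibility (appraiser variation): variance of the operator averages.
op_readings = defaultdict(list)
for (op, part), trials in measurements.items():
    op_readings[op].extend(trials)
av_var = statistics.variance(
    [statistics.mean(vals) for vals in op_readings.values()]
)

# Total variation across every reading in the study.
all_readings = [x for trials in measurements.values() for x in trials]
total_var = statistics.variance(all_readings)

grr_pct = 100 * ((ev_var + av_var) / total_var) ** 0.5
print(f"%GRR ~= {grr_pct:.1f}% of total variation")
```

If that percentage is large, the “signal” in your charts may be mostly your gage and your people, not your process, and any problem solving built on top of it is on sand.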

More times than I’d like to recall, I’ve been a part of activities where one or more of these steps were skipped.  While you hate to say that any activity where you learn something is a waste of time, a lot of time has been wasted chasing problems that weren’t really there or trying to improve performance on a less important process.  That ‘waste’ could very easily have been avoided by investing the upfront time to study what was really there.  Maybe my errors can help save you the effort of going down the wrong path in the future.

Regressing to the Mean

With the popularity of the movie and book Moneyball, among other things, the principles of ‘advanced statistics’ seem to be everywhere you look in sports.  As I read about these different methods of analysis, I keep seeing authors refer to players and teams “regressing to the mean”.  To my eyes, it is mostly used as a blanket way of explaining the unexplainable.  If a player goes on a hot streak, a drop in performance is regressing to the mean.  If a team outperformed the historical trends last year, it should regress to the mean this year and do worse.  It all seems like stopping two or three whys short in a 5-Why, but this isn’t the forum to argue the specifics of advanced metrics in sports.  However, as I have started to see the phrase “regressing to the mean” show up outside of those arenas, I think it makes for an interesting topic.

In the interest of time and space, I’ll keep the strawman simple here.  The sports guys and gals have the concept mostly right at a big-picture level.  Over time, a process will reveal what its performance really is.  The short-term swings high and low are just normal variation that levels out over time, assuming no other significant factors intervene.  From a human performance standpoint, what does that mean, and how can we impact it?
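Before getting to the human side, a quick toy simulation (every number here is invented) shows the purely statistical version of the effect: take a pool of performers with identical underlying skill, pick out whoever ran “hot” in one stretch of noisy results, and watch that same group land right back near the average in the next stretch, with no change in skill at all.

```python
import random

random.seed(42)

# 1000 "players" share the same true skill; each observed result is
# skill plus random noise, measured over two independent periods.
true_skill, noise, players = 0.500, 0.050, 1000
period1 = [random.gauss(true_skill, noise) for _ in range(players)]
period2 = [random.gauss(true_skill, noise) for _ in range(players)]

# The "hot streak" group: top 10% of performers in period 1.
hot = sorted(range(players), key=lambda i: period1[i], reverse=True)[: players // 10]

avg_hot_p1 = sum(period1[i] for i in hot) / len(hot)
avg_hot_p2 = sum(period2[i] for i in hot) / len(hot)
print(f"hot group, period 1: {avg_hot_p1:.3f}")
print(f"hot group, period 2: {avg_hot_p2:.3f}  (back near {true_skill})")
```

The hot group’s second-period average sits close to 0.500 even though nothing about them changed, which is exactly why “regressing to the mean” explains the drop without explaining the streak.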

People tend to have their own expectations of their performance.  If they exceed those expectations at something, most are likely to perform worse at the same task in the future.  There are a lot of factors at play here, such as differences in focus, preparation, expectations, complacency, and so on.  The opposite is also true: when expectations aren’t met, performance tends to increase.  From a cultural standpoint, I tend to think this is one of the factors that helps propel companies like Toyota and others that are successful in Lean.  There is constant reinforcement of thought patterns like “ ‘no problem’ is a problem” and chasing a True North state.  This helps create an ongoing mindset that performance is never quite at a high enough level.  For the people who can function and even thrive in this environment, it provides a constant carrot to chase and keeps the organization as a whole from regressing.

I fully recognize that not everybody can function in that type of culture.  That isn’t an indictment of them, just a recognition that all people are different and come to work with their own needs.  I have also witnessed the far end of the spectrum, where the carrot of raised expectations turns into the stick of failure.  This doesn’t mean that we should stop identifying successes.  It just brings an acknowledgment that reaching a certain level doesn’t mean the climb is over.  Individually, it can give us an opportunity to ask ourselves how comfortable we have gotten with our skills and performance.  In turn, it’s an opportunity to look at those around us and try to understand how they view their successes and the raising of the bar of expectations.  If we believe the old adage that you are either getting better or getting worse, regressing to a mean isn’t really a viable option.