Long ago, a wise and profound person coined the phrase “you can’t manage what you DON’T measure.” Although I suspect I heard it even before I studied work measurement at Georgia Tech, I think about this simple principle now more than ever before.

For many operations managers, the challenge once involved balancing the competing forces of growth against a lean supply of labor. As economic conditions took a dive during the past four years, however, the challenge quickly turned to one of managing costs downward against shrinking volumes. Too many companies paid a painful price for not being prepared to meet this challenge.

My partner, Bruce Strahan, contends that as we look closely at why an operation is being run with too many people and too much inventory, we will find two common causes:

  • Supervisors and middle managers lack proper training in the basic techniques of quantitative management, and
  • The right operational performance data, at the right levels of detail, are not readily accessible.

If this describes any part of your organization, where should you begin?

Developing Managers of People

Without properly trained managers, having good data is irrelevant: great data in the hands of someone who doesn’t understand or appreciate it is either dangerous or a waste of effort. Conversely, a manager enlightened in the basics of quantitative management will begin working to overcome gaps in data availability.

Training your front-line supervisors and managers in the basics of quantitative techniques doesn’t have to be a huge undertaking. But it should include fundamental training, and ongoing exposure, covering how to measure and interpret results and, just as importantly, how to coach improvement.

Unless you have a plan to address this deficiency, you probably shouldn’t spend any more money and time gathering data. At best, that investment will have a short-lived benefit. At worst, it will frustrate higher-level managers who see the data but no signs of improvement.

Defining the Right Performance Data

This is one of the toughest parts of the problem and is, of course, unique to each environment. Bruce offers these broad guidelines as a starting point:

  • Get to a level of detail where the process attributes are reasonably uniform. For example, a “lines picked per hour” measurement will be more useful if separated by pallet, case, and each picking.
  • In any time-based measurement, make sure the time denominator is accurate. Operations frequently move people from their assigned jobs without moving the hours, which biases the output-over-time statistics.
  • Create the ability to isolate output measures over short intervals. In our work we often observe brief periods when performance is considerably higher than the daily average. A great opportunity may exist if you can replicate those ideal conditions more often.
  • Low performance is more often related to process design flaws and uneven workflow than to inadequate effort and pace.
  • DON’T let a lack of systems support keep you from measuring. Manually collected data entered on spreadsheets can work just fine, as the sketch below illustrates.
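
To make these guidelines concrete, here is a minimal sketch in Python of the kind of analysis even a manually maintained spreadsheet can support. The file layout, column names, interval length, and the 1.25x review threshold are all hypothetical, chosen only to illustrate three of the ideas above: separating rates by pick type, dividing by the hours actually worked on the job, and isolating short intervals that run well ahead of the average.

```python
import csv
import io
from collections import defaultdict

# A hypothetical log of the kind a supervisor might keep in a spreadsheet:
# one row per crew per interval, with the labor hours actually worked on
# that job in that interval (the accurate-denominator point above).
SAMPLE_LOG = """\
date,interval,pick_type,lines_picked,labor_hours
2012-03-05,08:00-09:00,each,142,2.0
2012-03-05,08:00-09:00,case,96,1.5
2012-03-05,09:00-10:00,each,221,2.0
2012-03-05,09:00-10:00,case,88,1.5
2012-03-05,10:00-11:00,each,150,2.5
2012-03-05,10:00-11:00,pallet,34,1.0
"""

def rates_by_type_and_interval(rows):
    """Sum lines and hours, then compute lines per labor hour
    separately for each pick type (pallet / case / each) and interval."""
    lines = defaultdict(float)
    hours = defaultdict(float)
    for r in rows:
        key = (r["pick_type"], r["date"], r["interval"])
        lines[key] += float(r["lines_picked"])
        hours[key] += float(r["labor_hours"])
    return {k: lines[k] / hours[k] for k in lines if hours[k] > 0}

def flag_best_intervals(rates, threshold=1.25):
    """Flag intervals running well above that pick type's average --
    candidates for the 'replicate the ideal conditions' review."""
    by_type = defaultdict(list)
    for (ptype, _, _), rate in rates.items():
        by_type[ptype].append(rate)
    averages = {t: sum(v) / len(v) for t, v in by_type.items()}
    return {k: r for k, r in rates.items()
            if r > threshold * averages[k[0]]}

if __name__ == "__main__":
    rows = list(csv.DictReader(io.StringIO(SAMPLE_LOG)))
    rates = rates_by_type_and_interval(rows)
    for (ptype, date, interval), rate in sorted(rates.items()):
        print(f"{date} {interval} {ptype:>6}: {rate:6.1f} lines/hour")
    for (ptype, date, interval), rate in flag_best_intervals(rates).items():
        print(f"Review {ptype} picking on {date} {interval}: "
              f"{rate:.1f} lines/hour is well above average")
```

Run against a real log, the flagged intervals become a supervisor’s starting list of questions: what was different about the staffing, order mix, or workflow during those hours?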

It’s true that you can’t manage what you DON’T measure. But it goes deeper than that. An effective performance management program involves systematic means for measurement, target setting, and reporting of results. And it will be effective over the long term only if you have developed the measurements around your process and with your people: associates, supervisors, and managers.
