
Non-Utilitarian Metrics: When Your Knowledge is Not Actionable


The Centers for Disease Control and Prevention recently published a peer-reviewed paper and infographic by Dr. Francis P. Boscoe and Eva Pradhan, MPH, on the most distinctive causes of death in each state. Here, “distinctive” means “more frequent than we should expect,” based on the national rate for each cause of death from 2001 through 2010, with adjustments. That basis produces some interesting contrasts: Florida’s claim to HIV as its most distinctive cause of death rested on 15,000 funerals, whereas Louisiana only had to bury 22 folks with syphilis.
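To make the arithmetic concrete, here’s a minimal sketch of how a “most distinctive” cause gets selected: for each state, take the cause whose observed rate most exceeds the national rate. The rates below are invented for illustration; the actual paper used age-adjusted rates with further adjustments.

```python
# Hypothetical death rates per 100,000; the real study used age-adjusted
# rates for 2001-2010 with additional corrections.
national_rate = {"HIV": 3.3, "Syphilis": 0.02, "Tuberculosis": 0.2}

state_rates = {
    "Florida":   {"HIV": 9.8, "Syphilis": 0.02, "Tuberculosis": 0.2},
    "Louisiana": {"HIV": 5.1, "Syphilis": 0.09, "Tuberculosis": 0.3},
}

for state, rates in state_rates.items():
    # "Distinctive" means the highest ratio of state rate to national rate,
    # not the highest body count, which is how 22 syphilis deaths can
    # out-rank thousands of deaths from a far more common cause.
    cause = max(rates, key=lambda c: rates[c] / national_rate[c])
    print(state, "->", cause)
```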

This infographic is an example of a non-utilitarian metric, in that it consumes time to review but delivers no actionable information (other than, possibly, “Don’t seek treatment for tuberculosis in Texas”). Television journalistas love to serve up this sort of thing, because it fits in with their daily delivery of crime, sports, and weather. It’s titillating, slightly scary news about things that happened to Other People, which you can review from the safety of your home. It leaves you feeling slightly more threatened, although you can’t explain why, and gives you one more credibility reference for the next urban myth that comes your way. Especially if it pertains to people having sex in Louisiana or Florida.

Right Data, Wrong Chart

I bring this up because I’ve lately seen the rise of other non-utilitarian metrics in the business world. For example, a daily status report for system integration testing included a pie chart of defects by severity: Critical, High, Moderate, and Low. I pointed out that, while the pie chart showed what fraction of the defects was most worrisome, fresh defects were being reported each day and existing defects were being closed as they were worked, so we were looking at snapshots, not trends. I suggested we switch to a stacked bar chart, and we immediately realized that new defects were being opened at several times the rate they were being closed. This was the same data, but now actionable.
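As a minimal sketch of that switch (with invented daily counts; the report’s actual numbers aren’t in the post), a stacked bar chart of open defects by severity, one bar per day, makes the trend visible at a glance:

```python
import matplotlib.pyplot as plt

days = ["Mon", "Tue", "Wed", "Thu", "Fri"]
open_defects = {  # open count at end of each day, by severity (made up)
    "Critical": [2, 3, 5, 6, 8],
    "High":     [5, 8, 11, 15, 18],
    "Moderate": [9, 12, 14, 19, 24],
    "Low":      [4, 6, 9, 11, 14],
}

bottom = [0] * len(days)
for severity, counts in open_defects.items():
    plt.bar(days, counts, bottom=bottom, label=severity)
    bottom = [b + c for b, c in zip(bottom, counts)]

plt.ylabel("Open defects")
plt.title("Open defects by severity, per day")
plt.legend()
plt.show()  # rising bars expose the open-vs-close gap a single pie hides
```

A pie chart of any single day would show the same severity mix; only the day-over-day bars reveal that the backlog is growing.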

You Can Say That Again

Another recent tracking oddity was an attempt to aggregate qualitative risk across a program. Someone was clever enough to estimate a dollar amount for each significant project risk exposure (good) and match it with the estimated cost of the risk response (even better). However, several of the risks had an impact across multiple component projects, and were independently reported with each project. After a while, someone realized we were double-counting, and the spreadsheet was corrected. This led a business analyst to look into fixed costs, and sure enough… Of course, the program had credibility problems for a while after revealing that things weren’t as bad as previously reported, but no one complained that they weren’t worse.
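A minimal sketch of the fix, with hypothetical risk IDs and dollar figures: aggregate exposure per unique risk ID instead of per (project, risk) row, so a program-wide risk counts once.

```python
rows = [  # one row per project that reported the risk (figures invented)
    {"project": "Payroll",  "risk_id": "R-12", "exposure": 250_000},
    {"project": "Benefits", "risk_id": "R-12", "exposure": 250_000},  # same risk, re-reported
    {"project": "Payroll",  "risk_id": "R-07", "exposure": 80_000},
]

naive_total = sum(r["exposure"] for r in rows)        # counts R-12 twice
unique = {r["risk_id"]: r["exposure"] for r in rows}  # one entry per risk ID
corrected_total = sum(unique.values())

print(naive_total, corrected_total)  # 580000 330000
```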

These Go to Eleven

I like RAG status icons, because they let us have quick conversations with people who don’t need to understand the details as much as they need to understand the recommendations. Most of my status reports over the years have had a two-tier structure, with various components reported as Red/Amber/Green and an overall color. Occasionally, I’ve been challenged on the criteria for each color, as well as the algorithm for the summary level, but I’m always prepared to explain. A few times, I’ve had to fend off requests to add colors, representing some pet parameter, but I generally find that the other stakeholders will intervene to avoid complexity.

Then one day, someone asked me, “Is that a bright yellow?”

“I’m sorry?”

“I mean, is it trending red, trending green, or is it persistent?” We managed to avoid adding intensity to the color palette, but it was close.
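The post doesn’t spell out the summary algorithm, but a common two-tier roll-up is “worst status wins.” A minimal sketch, assuming that rule and a plain three-color palette with no intensity dimension:

```python
# Assumed roll-up rule: the overall color is the worst component color.
# Ordering: Green < Amber < Red. No "bright yellow" intensity axis.
SEVERITY = {"Green": 0, "Amber": 1, "Red": 2}

def overall_status(component_colors):
    """Return the overall RAG color: the worst of the component colors."""
    return max(component_colors, key=SEVERITY.__getitem__)

components = {"Scope": "Green", "Schedule": "Amber", "Budget": "Green"}
print(overall_status(components.values()))  # Amber
```

Because the criteria for each color and the roll-up rule are explicit, the summary can always be explained when challenged.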

There are a lot of reasons to tailor the metrics you collect to the project, especially when you are managing a multi-year program. There are also excellent reasons to present visual indicators, graphs, and charts that summarize tabular data. But it’s important to keep the metrics and the information they communicate actionable, rigorous, and unambiguous. Resist the temptation to measure things you can’t act on, to report at a level of detail that creates confusion and errors, or to use an unfamiliar metaphor. Choose utility, and avoid futility.


For more brilliant insights, check out Dave’s blog: The Practicing IT Project Manager

About Dave Gordon

Dave Gordon is a project manager with over twenty years of experience in implementing human capital management and payroll systems, including premises-based ERP solutions, like PeopleSoft and ADP Enterprise, and SaaS solutions, like Workday. He has an MS in IT with a concentration in project management and a BS in Business. He also holds the Project Management Professional (PMP) designation, as well as professional designations in human resources (GPHR and SPHR) and in benefits administration (CEBS). In addition to his articles and blog posts, he curates a weekly roundup of articles on project management, and he has authored or contributed to several books on project management. You can view his blog at The Practicing IT Project Manager.

