Metrics: How to Improve Key Business Results Part 8

Why not? No matter how good your graphical representation is, you can't afford to risk a misunderstanding. You rooted out the question and you designed the metric so that you could provide the right answer to the right question. You cannot allow the viewer of the metric to misinterpret the story that you've worked so hard to tell.

The narrative is your chance to ensure the viewer sees what you see, the way you see it. They will hopefully hear what you are trying to tell them. Any part of the plan can be updated on a regular basis, but the narrative requires frequent documentation. Since the narrative explains what the metric is telling the viewer, the explanation has to change to match the story as it changes. The narration which accompanies the picture and documents what the metric means is critical to how the metric will be used.

Figure 3-6 shows when the narration is documented.

Figure 3-6. Narrative.

Documentation: Making the Metric Development Plan More Than a Plan.

In the end, the metric development plan should document the why (purpose statement), what (metric), when (schedule), who (customers), and how (analysis, how it will and how it won't be used).

Document it as thoroughly as possible-putting all of the details into one place. This will help you in the following three ways: It will help you think out the metric in a comprehensive manner.

It will help you if you need to improve your processes.

It will help you if you need to replicate the steps.

Figure 3-7 shows them all together in coordination with the process for developing the metric.

Figure 3-7. The Development Plan.

Rather than prescribe a length or specific format, I want to stress the readability of the final plan. You will want it for reference and at times for evidence of agreements made. I find it extremely useful when the metrics are reported infrequently. The more infrequently the metrics are reported, the more likely I'll forget the steps I followed. The collection can be very complicated, cleaning the data can be complex, and the analysis can require even more detailed steps. The more complex and the more infrequent the process, the more likely I'll need the plan documented.

Of course, even if I perform the process weekly, the responsible thing to do is to document the plan so others can carry it out in my absence.

The plan has use throughout the life of the metric. It starts out helping me to fully think out the design and creation of the metric. It then helps in capturing the agreements made around the metric's use and schedule. It helps in defending actions (it becomes a contract between the metric analyst, the data providers, and the end customer) and meeting expectations. Finally, it is critical to long-term success. It provides a historical view as well as a "how to" guide.

Without repeatability, you can't improve.

The components of the development plan need to be documented in a manner that allows easy and accurate access. Ensuring accuracy is more difficult than making it easy to access. We discussed different means of collecting data and ways to make it more accurate. The really good news is that many times, the way we make it more accurate also makes it easy to access. Less human interaction moves us toward more confidence in the accuracy of the data, and automation makes it easier to collect.

When you document the components, don't be afraid to be verbose. This isn't a time for brevity. We need to build confidence in the metric and the components. We need to document as much information around each component as necessary to build trust in the following: Accuracy of the raw data. You will be challenged on this, and rightfully so. People have their own expectations of what the answer to your root question should be. They will also have expectations regarding what the data should say about that question. Regardless of the answer, someone will think you have it wrong and check your data. Thus, you have to be accurate when you share the data. This requires that you perform quality checks of the data. It doesn't matter if the errors are due to your sources, your formulas, or a software glitch. If your data is proven to be wrong, your metrics won't be trusted or used. Most inaccuracies in raw data are the result of human error, but even automated tools are prone to errors, particularly in the interpretation of the data. Errors can come from anything: logging data incorrectly, mistakes in calculations, or assigning the wrong categories to information. If your categories are not defined properly, an automated system may report the data correctly, but it might be reporting the wrong data. If you are tracking time to resolve trouble calls, is the time equal to the time between the start and stop dates? Or is it the time from the call to the resolution (which may occur well before the day/time the trouble ticket is closed)? Are you using calendar days to track time or workdays? If you are reporting on availability, what is considered an outage? How do you determine that an outage occurred, when, and for how long? The simple rule of thumb is to double (and triple) check your data. I find the best way to check my data is to have someone else look at it. I'm too close to the work to see the errors others see immediately.

When you're starting, there is nothing more important than accuracy of your raw data.
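To make those definitional questions concrete, here is a minimal sketch in Python (the ticket fields, dates, and the numpy dependency are assumptions for illustration, not from any specific tool) showing how a single trouble ticket yields three different "resolution times" depending on which definition you document.

```python
from datetime import date
import numpy as np

# Hypothetical ticket export: opened, resolved, and closed dates.
ticket = {
    "opened": date(2012, 3, 1),    # Thursday
    "resolved": date(2012, 3, 6),  # fix delivered
    "closed": date(2012, 3, 12),   # ticket administratively closed
}

# Definition 1: calendar days from open to close.
open_to_close = (ticket["closed"] - ticket["opened"]).days        # 11

# Definition 2: calendar days from open to resolution.
open_to_resolve = (ticket["resolved"] - ticket["opened"]).days    # 5

# Definition 3: workdays (Mon-Fri) from open to resolution.
workdays = np.busday_count(ticket["opened"].isoformat(),
                           ticket["resolved"].isoformat())        # 3

print(open_to_close, open_to_resolve, workdays)  # one ticket, three "facts"
```

Whichever definition you choose, document it; the quality check is then a matter of confirming that the data matches the documented definition.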

Accuracy of your analysis. We'll get into methods of analysis later, but for now, it is important for you to document the processes and steps you take to analyze the data. This will enable you to repeat the process-a necessity for consistency. A simple example is the use of formulas in spreadsheet programs. I do a lot of my metric work in spreadsheet programs because they are easy, user-friendly, and powerful. I use formulas to calculate everything from the number of elapsed days to the percentage of change over time. Whatever methods you use, they must be documented. Anyone looking over your work should be able to replicate it by hand (using pen, paper, and a calculator). This documentation is tedious but necessary. Your process must be repeatable. Your process must produce zero defects in the data, analysis, and results. Your process and the resulting information must be error free.
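As a sketch of what that documentation can look like in practice (the function names and the figures in the checks are hypothetical), each calculation is a small, named routine whose docstring spells out the arithmetic, so a reviewer could replicate the result with pen, paper, and a calculator.

```python
from datetime import date

def elapsed_days(start: date, end: date) -> int:
    """Elapsed calendar days, counted as (end - start) in whole days."""
    return (end - start).days

def percent_change(previous: float, current: float) -> float:
    """Percentage change over time: (current - previous) / previous * 100."""
    return (current - previous) / previous * 100.0

# Checks a reviewer could redo by hand:
assert elapsed_days(date(2012, 1, 1), date(2012, 1, 31)) == 30
assert round(percent_change(previous=80, current=92), 1) == 15.0
```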

Repeatability of your process. Yes, I already mentioned this in the accuracy of your analysis. But it is worth emphasizing and clarifying that repeatability is critical not only for the analysis, but throughout the process of designing the metric. The collection of the data must be repeatable-in a strict sense. The analysis of the data must be repeatable. The graphical representation must also be a repeatable step in the process. Each time, you should collect, analyze, and report the data in the same way. If you don't document the process and ensure it is repeatable, you will lose the all-important trust of your audience. This repeatability is necessary throughout the process. It's why we develop schedules. We want to do it the same way, at the same time intervals, and using the same tools. Consistency is critical.

Without repeatability you don't really have a process.
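One way to enforce that same-way-every-time discipline is to encode the documented steps as a single, ordered routine that is run on every reporting cycle. This is only a sketch; the collect, clean, analyze, and report steps are hypothetical placeholders for whatever your documented process actually does.

```python
import logging
from datetime import datetime

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("metric_cycle")

def run_metric_cycle(collect, clean, analyze, report):
    """Run the documented steps in the same order on every schedule."""
    log.info("cycle started at %s", datetime.now().isoformat(timespec="seconds"))
    raw = collect()          # step 1: pull data from the agreed sources
    data = clean(raw)        # step 2: apply the documented cleaning rules
    results = analyze(data)  # step 3: apply the documented formulas
    report(results)          # step 4: produce the report in the agreed format
    log.info("cycle finished")

# Hypothetical usage with placeholder steps:
run_metric_cycle(
    collect=lambda: [5, 7, None, 6],
    clean=lambda raw: [x for x in raw if x is not None],
    analyze=lambda data: {"average": sum(data) / len(data)},
    report=print,
)
```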

To adhere to the tenets of good documentation, you'll need to use a method for controlling versions of your data, analysis, and reports. You'll need to store your information with backups. All of the documentation listed must be safeguarded against loss or tampering. You have to ensure the accuracy of the components, and you can't do this if you don't control access to the information.

First, you'll have to ensure the sources of your data are producing accurate information. When you have checked the data for quality and have attained a high level of confidence in the accuracy, you'll have to repeat the processes you used to gather that information. Once you have accurate data, you have to ensure it stays that way.

People are the greatest risk to having accurate data.

You will need a safe and secure location to store your data, your analysis, and your reports. You will need to safeguard it from others who may innocently tamper with it. You will also need to keep it safe from yourself. When was the last time you worked on something well past your bedtime? When was the last time you made errors keying data? When was the last time you lost over an hour's worth of work because you forgot to save regularly?

You will make mistakes-it's inevitable. The key is to mitigate this reality as much as possible.

How should you mitigate the mistakes you will inevitably make? Save early, save often, and save your work in more than one place. It won't hurt to have a hard copy of your work as a final safeguard. Along with backing up your data, it's important to have the processes documented.
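Here is a minimal sketch of the "save often, and in more than one place" habit (the file names and folder are hypothetical): each snapshot is a timestamped copy of the working file plus a recorded SHA-256 hash, so a later change, innocent or otherwise, can be detected.

```python
import hashlib
import shutil
from datetime import datetime
from pathlib import Path

def snapshot(workbook: Path, backup_dir: Path) -> Path:
    """Copy the file to a timestamped backup and record its hash alongside it."""
    backup_dir.mkdir(parents=True, exist_ok=True)
    stamp = datetime.now().strftime("%Y%m%d-%H%M%S")
    copy = backup_dir / f"{workbook.stem}-{stamp}{workbook.suffix}"
    shutil.copy2(workbook, copy)  # preserves timestamps as well as content
    digest = hashlib.sha256(copy.read_bytes()).hexdigest()
    Path(str(copy) + ".sha256").write_text(digest + "\n")
    return copy

# Hypothetical usage after each working session:
# snapshot(Path("metrics/trouble_tickets.xlsx"), Path("backups"))
```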

Another tool for mitigating mistakes is to use variables in all of your formulas. If you're using software to perform calculations, avoid putting raw data directly in the formulas. Put any values that you will reuse in a separate location (worksheet, table, or file). Not only does this help you avoid mistakes, it makes modifying the formulas easier.

Reference all values and keep raw data out of the equation.
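For readers who like to see the rule in code rather than in a spreadsheet, here is a small sketch (all names and figures are illustrative): the reusable values live in one named location, the equivalent of a separate worksheet or named range, and every formula references them instead of embedding the raw numbers.

```python
# All reusable values live in one place -- the code equivalent of a
# separate worksheet, table, or file. Figures are illustrative only.
PARAMETERS = {
    "workdays_per_month": 21,
    "target_resolution_days": 5,
    "staff_count": 8,
}

def monthly_capacity(avg_days_per_case: float) -> float:
    """Cases the team can close per month, derived from named parameters."""
    total_workdays = PARAMETERS["workdays_per_month"] * PARAMETERS["staff_count"]
    return total_workdays / avg_days_per_case

def meets_target(actual_days: float) -> bool:
    """Compare against the named target, never a hard-coded number."""
    return actual_days <= PARAMETERS["target_resolution_days"]

print(monthly_capacity(avg_days_per_case=3.5))  # 48.0
print(meets_target(actual_days=6))              # False
```

If the target or the staffing changes, you update one value in one place, and every formula that references it follows along.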

The following are a few other pointers to help you as you doc.u.ment your plan: Don't work when tired. Seriously. You should know yourself well enough to know when you're tired. Put the work down and come back to it when you are refreshed. There are some things you can do when tired-metrics is not one of them.

Stick to your process. Don't allow short deadlines to force you to deviate from your process. You may be tempted to take shortcuts just to get the metric updated fast enough to meet an unexpected deadline. Resist this. Resist the person requesting the data before the agreed-upon schedule. Whenever you deviate from your process you run the risk of making mistakes. Start with "no." Refuse to rush.

Use version control. It doesn't have to be extensive-just effective. You need to be able to track the work you've done and any changes you've made, so you can "undo" those changes and return to an earlier version you have faith in.

Create and use templates whenever possible. Templates allow you to make your process more repeatable and to ensure you collect the same data, the same way. I use templates for surveys, interviews, and questionnaires. I use them for analysis and creating graphs and charts. One caution-double-check the template for accuracy. (A small example of a reusable chart template follows these pointers.)

Reuse is great! Why re-create the wheel? Just double-check that the wheel isn't riddled with broken spokes.
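As an example of the kind of template mentioned in the pointer above, here is a sketch of a charting helper (it assumes matplotlib is available; the function name, labels, and numbers are illustrative) that produces each reporting cycle's graph with the same layout, titles, and target line, so the picture is built the same way every time.

```python
import matplotlib.pyplot as plt

def trend_chart(periods, values, title, ylabel, target=None):
    """Standard trend chart template used for every reporting cycle."""
    fig, ax = plt.subplots(figsize=(8, 4))
    ax.plot(periods, values, marker="o")
    if target is not None:
        ax.axhline(target, linestyle="--", label="Target")
        ax.legend()
    ax.set_title(title)
    ax.set_xlabel("Reporting period")
    ax.set_ylabel(ylabel)
    fig.tight_layout()
    return fig

# Hypothetical usage with illustrative numbers:
fig = trend_chart(
    periods=["Jan", "Feb", "Mar", "Apr"],
    values=[6.2, 5.8, 5.1, 4.9],
    title="Average days to resolve trouble tickets",
    ylabel="Calendar days",
    target=5,
)
fig.savefig("resolution_trend.png")
```

And, per the caution above, double-check the template itself before trusting every chart it produces.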

A Note on Process Byproducts.

When you worked on the root question, you identified byproducts like goals, objectives, tasks, and measures of success, which were not essential to the metric's design. As you worked on the abstract picture of your metric, other thoughts came to mind. When you captured possible measures you'd need to fill out the picture, you identified more than what was required. You parked or stored the excess items in a to-do list. All of these byproducts have the potential to help you improve your organization and could be very valuable. They should be captured and shared. Don't waste them.

Don't waste anything! Your intellectual property is valuable-treat it with respect.

Recap.

I have introduced a taxonomy so that we can communicate clearly around the subject of metrics. In the second chapter, I covered the theory and concept of designing a metric and the high-level process for collecting, a.n.a.lyzing, and reporting the data, measures, and information that go into making up that metric. In this chapter, I covered the basics of how and where to begin. I have purposefully kept the information at a high level so that you can feel comfortable with the concept before I start getting into the weeds. In this chapter we covered: A metric development plan is not a luxury. It's a necessity.

The plan not only helps in the creation of the metric, but it also provides guidance for the maintenance and final disposition.

The metric development plan is made up of the following components: a purpose statement, an explanation of how it will be used, an explanation of how it won't be used, a list of the customers of the metrics, schedules, analysis, visuals (or "a picture for the rest of us"), and a narrative.

Accuracy is critical. I stressed the importance of accuracy in your data (source dependent), your collection (process dependent), and your analysis (process and tool dependent). I also offered the benefits of making your processes repeatable.

Conclusion.

We have set some of the foundation for designing and using metrics responsibly. We provided some tools for the practical implementation of a metrics program. The next three chapters will cover the dangers inherent in a metrics program; I will identify threats and provide warnings and mitigations to help you avoid the headaches many fall victim to. I believe this is a logical progression-because before you use a powerful and, therefore, potentially dangerous tool, it is important that you understand what it is, how it should be used, and how to avoid injury.

In almost every serious effort, you are told to document your work. It is stressed in everything from software engineering to grant writing. The problem is, it's tedious. I don't know of anyone who is passionate about documenting their work. If you fail to document everything else, I'll forgive you-as long as you document your metric.

Using Metrics as Indicators.

To keep things simple, thus far I've focused only on the following basic concepts: Metrics are made up of basic components: data, measures, information, and other metrics.

Metrics should be built from a root question.

It's more important to share how you won't use a metric than how you will.

This chapter introduces another basic concept about creating and using metrics-metrics are nothing more than indicators. That may seem to be a way of saying they aren't powerful, but we know that's not the case. Metrics can be extremely powerful. Rather, the concept of metrics as indicators warns us not to elevate metrics to the status of truth.

Metrics' considerable power is proven by how much damage they can do. Metrics' worth is rooted in their inherent ability to ignite conversations. Metrics should lead to discussions between customers and service providers, between management and staff. Conversations should blossom around improvement opportunities and anomalies in the data. The basis for these conversations should be the investigation, a.n.a.lysis, and resolution of indicators provided through metrics.

Metrics should be a catalyst to investigation, discussion, and only then, action. The only proper response to metrics is to investigate. Not the type of general investigation discussed in Chapter 15 on research, but instead a directed and focused investigation into the truth behind the indicator.

Facts Aren't Always True.

If you search the internet for things we know to be true (supported, of course, by data), you'll eventually find more than one site that offers evidence "debunking" past and present-day myths. What was thought to be a fact is proven to be an incorrect application of theory or the misinterpretation of data.

Our earlier examples of health information are a ripe area, full of things people once believed to be true but now believe the opposite. Think about foods that were considered good for you ten years ago but today are not. Or foods that were considered not to be good for you, which now are considered healthy fare. Are eggs good for you or not? The answer not only depends on who you ask, but when.

The US Government's "food pyramid" changes periodically.

Who doesn't remember the scenes of Rocky downing raw eggs?

It seems like each year we get a new "diet" to follow-high protein, high cholesterol, low fat, no red meat, or fish...the arguments change regularly.

One good argument on the topic of old facts not being in line with new truths is that facts don't change, just our interpretation of them.

Let's take a quick look at a fact that perhaps isn't truth.

It concerns the Amazon book sales rankings. Michael Langthorne, one of my former coauthors, enjoys watching the sales ranking of our book on Amazon.com. He has a high level of confidence in the data, and his level of excitement grows or wanes based on the numbers. The problem is that the rankings are daily and depend not only on the number of sales of our book, but also on the number of sales of all the other books on the site. The sales are also only counted for individual buyers. If you were to buy a thousand copies of our book, Amazon would only count that as one sale toward the rankings. If you were to instead make a thousand separate orders, it would be counted as one thousand sales, boosting the rankings. Another issue is that Amazon doesn't care who purchases the book. If I were to buy those books myself, it would improve our rankings, even though, as one of the authors, I'm hardly an unbiased buyer.

The point is simple. While the data is "accurate" (or at least you can have a high confidence level in it being accurate), the interpretation of that data can be problematic. Should Mike buy a new television in anticipation of rising sales due to the increased popularity of the book? Or should he be depressed over the lack of sales if the rankings fall drastically?

It's fairly obvious that the answer to both questions is no.

This misrepresentation of metrics as fact can be seen in instances where only a portion of the metric is relayed to the viewer.

A business example is one a friend of mine loves to tell about a service desk analyst who was, by all accounts, taking three to five times as long to close cases as the other analysts. The "fact" was clear-he was less efficient. He was closing less than half as many cases as his peers and taking much longer to close each one. His "numbers" were abysmal.

The manager of the service desk took this "fact" and made a decision. It may not have helped his thought process that this "slow" worker was also the oldest and had been on the service desk longer than any of the other analysts. The manager at the time made the mistake of believing the data he was looking at was a "fact" rather than an indicator. And rather than investigate the matter, he took immediate action.

He called the weak performer into his office and began chewing him out. When he finally finished his critique, he gave the worker a chance to speak, if only to answer this question (a veiled threat): "So, what are you going to do about this? How are you going to improve your time to resolve cases? I want to see you closing more cases, faster."

Showing a great deal more patience than he felt at the moment, the worker replied, "My first question is, how is the quality of my work?"

"Lousy! I just told you. You're the slowest analyst on the floor!"

"That's only how fast I work, not how good the quality is. Are you getting any complaints?"

"Well, no."

"Any complaints from customers?"

"No."

"How about my coworkers? Any complaints from them?"

"No," said the manager. "But the data doesn't lie."

"You're right, it doesn't lie. It's just not telling the whole story and therefore it isn't the truth."

"What? Are you trying to tell me you aren't the slowest? You are the one who closes the cases. Are you just incompetent?" The manager was implying that the worker wasn't closing cases when they were actually done.

"No, I am the slowest," admitted the worker. "And no, I'm not incompetent, just the opposite. Have you asked anyone on the floor why I'm slow?"

"No-I'm asking you."

"Actually, you never asked me why. You started out by showing me data that says I'm 'slow,' 'inefficient,' and now 'incompetent.'"

The manager wasn't happy with the turn this had taken. The employee continued, "Did you check the types of cases I'm closing? I'm actually faster than most of my coworkers. If you looked at how fast I close simple cases, you'd see that I'm one of the fastest."

"The data doesn't break out that way," said the manager. "How am I supposed to know the types of cases each of you close?"

The employee replied, "Ask?" He was silent a moment. "If you had asked me or anyone else on the floor why I take longer to close cases and why I close fewer cases, you'd find out a few things. I close fewer cases because I take longer to close my cases. The other analysts give me any cases that they can't resolve. I get the hardest cases to close because I have the most experience. I am not slow, inefficient, or incompetent. Just the opposite. I'm the best analyst you have on the floor."
