Saturday, February 24, 2007

Cool Change Management predictive metric

The October 2005 issue of HBR has an article called "The Hard Side of Change Management". The authors introduce a new metric called DICE, which stands for "Duration", "Integrity of performance", "Commitment", and "Effort". In their study they found that by applying simple subjective scores to each of these areas, and then a simple formula to roll up a total score, they get a number which can be used to predict success or failure. They studied over 1,000 projects, found a strong correlation between these four factors and project outcomes, and were unable to find additional factors that produced a better predictor.
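The roll-up is simple enough to sketch in a few lines of Python. If I'm recalling the article's arithmetic correctly, each factor is scored from 1 (best) to 4 (worst), the integrity and senior-management-commitment scores are double-weighted (the article actually splits Commitment into senior-management and local-team scores), and the total lands between 7 and 28. Treat both the weights and the zone thresholds below as my paraphrase of the article, not gospel:

```python
def dice_score(duration, integrity, senior_commitment, local_commitment, effort):
    """DICE = D + 2*I + 2*C1 + C2 + E; each factor scored 1 (best) to 4 (worst)."""
    factors = (duration, integrity, senior_commitment, local_commitment, effort)
    if not all(1 <= f <= 4 for f in factors):
        raise ValueError("each factor is scored from 1 (best) to 4 (worst)")
    return duration + 2 * integrity + 2 * senior_commitment + local_commitment + effort

def dice_zone(score):
    """Rough zones from the article: low totals predict success."""
    if score <= 14:
        return "win"
    if score <= 17:
        return "worry"
    return "woe"
```

A project scoring well on everything totals 7 and sits squarely in the "win" zone; a project scoring worst on everything totals 28 and is headed for "woe". The value isn't the arithmetic, it's that the score forces the conversation.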

Amgen used the tool in 2001 when they made a series of changes. At one point they used this metric on 300 initiatives (yikes!) and reconfigured resources on 200 of them based on the DICE framework.

The authors suggest that it be used in one of three ways: track projects, manage portfolios of projects, and force conversations. All three are really helping the organization decide where to focus time and energy.

The article gives specific guidance on how to evaluate a score for each factor. You can pay for and download the full article from HBR.

Friday, February 23, 2007

Root Cause (or, punishing the innocent)

An article in the Wall Street Journal yesterday reminded me of how easy it is to fire people up to solve the wrong problem. For those of you with an online WSJ account, you can view the story here. There is also this article in Information Week, which anyone can read. The articles discuss a bill sponsored by Massachusetts state representative Michael A. Costello, House Bill 213, which would punish retailers for leaking personal data. Now, don't get me wrong: we do want to provide incentives for protecting personal data. But the issue here is stolen credit cards. The root of the problem isn't security practices at retailers; it is not possible to completely secure a complex system from hackers. There will always be ways to get such data. Holding the retailer accountable is just wrong. Similarly, holding the software, system, and database vendors accountable is also fundamentally flawed.

What is the core problem here? The main problem is that credit cards are embarrassingly insecure. The banks don't deal with the problem because it has been less expensive for them to pay the price of failure than to fix the problem. What this bill would do, if passed, is completely let the credit card companies off the hook. That is so wrong it is offensive, and I hope people see this for what it really is: an attempt to shift the increasing risk of credit card theft away from the group that is really responsible, by taking advantage of the current emotional response to the recent TJ Maxx theft headlines.

The credit card companies can solve this. If you've used a SecurID token from RSA, you know that a credit-card-sized object can provide a unique temporary number based on a secret PIN you provide. You must have the card and supply the secret to get a one-time authentication. Instead of the inane little three-digit code credit card companies have us copy from the back of the card, you could use a dynamically generated code based on at least two-factor authentication.
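To make that concrete, here is a minimal sketch of how such a one-time code can be generated. RSA's actual SecurID algorithm is proprietary, so this stands in the openly specified HOTP scheme (RFC 4226): a shared secret plus a moving counter yields a short code that is useless a moment later. Using the current time window as the counter is effectively what time-synchronized tokens do:

```python
import hashlib
import hmac
import struct
import time

def hotp(secret: bytes, counter: int, digits: int = 6) -> str:
    """One-time code per RFC 4226: HMAC-SHA1 over a counter, dynamically truncated."""
    digest = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F  # low nibble of the last byte picks a 4-byte window
    number = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(number % 10 ** digits).zfill(digits)

def time_based_code(secret: bytes, period_seconds: int = 30) -> str:
    """Time-synchronized variant: the counter is the current 30-second window."""
    return hotp(secret, int(time.time()) // period_seconds)
```

The point is that the six digits are worthless to a thief who sniffed them yesterday: without the secret (which never leaves the card and the bank) there is no way to produce the next code.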

Using such a system, it does a thief no good to have the card in their possession unless they have also obtained the magic number. If that did happen, the person who lost the card simply reports it stolen and it is disabled. No break-in to any system would divulge the personal secret, and the TJ Maxx problem would simply go away. Not only does this virtually eradicate the most prevalent forms of theft, it provides the bank with non-repudiation and replay protection. In other words, customers can't claim a charge was made without their authorization, and a thief can't later reuse the numbers to create another charge. This means the consumer and the retailer win, at some cost to the credit card companies relative to the current system; but how much more expensive is it, really, if most of the theft goes away?

There are other ways to do this as well, probably equally effective, and some which are easier to use (hosted on a cell phone for example). But the banks don't want to invest in the infrastructure, and now they are trying to legislate passing the buck to the retailer, which ultimately will end up being paid by either the consumer or the systems providers, while the credit card companies continue to make record profits with inferior technology.

So let's quit punishing the innocent and solve the right problem. The retailer and the consumer pay the 3% transaction fee - shouldn't we demand that the credit card companies earn that money and fix this abomination?

Do the same in your business. When someone is pushing costs around, take a look at what the real problem could be. It's probably the one whose solution saves the most cost across all stakeholders, and in particular respects your customers' costs.

Sunday, February 18, 2007

A change tool: ADKAR

I was considering a potential assignment and rummaging through my toolbox, when I recalled an acronym: ADKAR. I don't recall where I originally learned it. It is a diagnostic tool to help you figure out how far along in a change an organization is, and what remains to be done.

(A)wareness is present if people are aware that a change needs to be made.

(D)esire is present if people want to change the status quo.

(K)nowledge is present if the people know how to change it and get to the new state.

(A)bility is present if the people are actually able to work on making the change -- including but not limited to (for example) being given enough time to do so.

(R)einforcement is present if there are positively reinforcing reasons to keep the new state.

It's compellingly simple on the surface. The hard part is really deciding who the "people" are. An organization generally will have some people with each of these attributes at any given time. A stakeholder analysis will help here, but I think there is a momentum factor as well. As you progress down ADKAR, you have to line the influencers up and gain momentum with everyone involved.
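If you read the acronym as a strictly sequential diagnostic - which is how I use it - then the stage to work on is simply the first one that isn't yet present for the group in question. A trivial sketch, with the boolean-per-stage assessment being my own simplification:

```python
# The five stages, in the order ADKAR says they must be achieved.
ADKAR_STAGES = ("Awareness", "Desire", "Knowledge", "Ability", "Reinforcement")

def adkar_barrier(assessment):
    """Return the first stage not yet present for the group being assessed,
    or None if all five are in place. `assessment` maps stage name -> bool."""
    for stage in ADKAR_STAGES:
        if not assessment.get(stage, False):
            return stage
    return None
```

So a group that is aware of the need and wants the change, but doesn't know how to get there, comes back as "Knowledge" - and that's where to spend the next round of effort.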

Sunday, February 11, 2007

The requirements factory

I was speaking to some offshore development managers last week, and we were discussing what a challenge it is to get good requirements. This is not a new phenomenon; it has just gotten so much harder with increasing levels of technical complexity and the cultural and distance diversity of our current development landscape.

The May 1995 issue of Communications of the ACM included several articles about requirements for software products. One of them describes how Digital Equipment Corp redesigned their requirements process. Their challenges and learnings are still relevant today.

They first described a fairly traditional nine-step process that one might find at any major software development organization. It isn't "waterfall", since many of the steps overlap. Nonetheless they mention that many product development personnel expect creative, knowledge-driven processes to execute like deterministic, manufacturing-style processes, and that this view correlated with dissatisfaction with the requirements management process.
[they] expected the process to do the creative work of contextualized data collection, interpretive data analysis and responsive design
The teams that were more successful allowed themselves more flexibility on the process. In other words they adjusted the process to the problem. This supports the idea that a process should not be over-prescribed; the team needs the ability to be creative not only in how to implement the solution, but in how to figure out what the solution is.

Then they did something very innovative and interesting. They incorporated creation of marketing deliverables into the requirements management process: specifically, the marketing and advertising messages, user information, pricing, competitive positioning, and sales and support information. I haven't worked or consulted anywhere where that was done up front, but it strikes me as a brilliant way to engage the cross-functional teams to work through the entire problem set and ensure that they understand all aspects of what the proposed product is and needs to do. It provides the right framework to help answer the questions which will inevitably need to be answered, and gets them resolved earlier. The result is lower requirements churn.

The challenge for most organizations is going to be the change in behavior required to get the marketing department to work in a fundamentally different way. That means you'll need a compelling vision or a painful execution failure, and support from the top.

Tuesday, February 06, 2007

A well written product quality article

I was part of an effort where we had a tremendous impact on quality in just a couple of years. We principally did it by turning the organization around from its earlier behaviors, and simply making it clear that quality is the developer's job, not the QA department's. QA's job is to measure and prove that the developers did what they were supposed to (or that they didn't). Development's job is to design and build a high-quality product. A lot of development groups don't run that way; we got fantastic results by making the change. I found this writeup, by Karl Wiegers, which talks about how to implement systems that specifically support that goal. It is fantastically well written. Karl is also the author of a book I have used to set up review systems, called Peer Reviews in Software.

Monday, February 05, 2007

Effective Communication

I found this on Ronny DeWinter's Software Quality blog:

Communication levels:
Not everything that is said is heard.
Not everything that is heard is understood.
Not everything that is understood is agreed.
Not everything that is agreed is applied.
Not everything that is applied is retained.

That is a fantastic short synopsis of why communication is so difficult. It shows that just getting someone to understand you is not enough. Of course, getting further down this list is not always required. But if it is, you should test every step so that if a breakdown occurs, you know where it happened and how to fix it.

You may not need agreement in every case, but it is nice to know when you don't have it, and if you have trust, agreement is testable simply by asking. If you don't have trust you may have to test downstream.

At one of the places I worked, we would appear to have agreement but often failed to actually practice what was agreed. Application, or practice, should be easy to test; you can build the test into the original communication. For example: "When you complete this, send me a copy."

Whether something is retained may or may not matter. If it is important to you that it is retained, put a follow-up action in your favorite planning tool to test future retention.