Wednesday, April 25, 2012

Implementing Minimum Viable Changes As Part of a Lean Startup For Change Approach

In my last post I described how we measured Minimum Viable Changes (MVC) for our Lean Startup Enabled Kanban change initiative. In this post I will describe the exact process we used to define, implement and measure those MVCs.


In order for our approach to provide learning at the pace we required, we needed to be able to define Minimum Viable Changes that we could complete in a matter of weeks or days. We did this by breaking up our strategies into one or more transformation adoption "campaigns". Each campaign was differentiated by the underlying adoption approach used, the strategy it supported, or the content being adopted.


MVCs were then designed to validate assumptions contained within a particular campaign. We needed to quickly validate if the approach behind a particular campaign could change people's behavior, and help them acquire new and useful skills at the rate required for success.

Each MVC was given a descriptive name along with one specific assumption that it was designed to validate. Each MVC contained an explicit hypothesis, along with details on how to measure the accuracy of that hypothesis.
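As a minimal sketch, an MVC as described above could be captured as a small record. The field names and the sample hypothesis/metric text are my assumptions; only the MVC name comes from an example later in this post.

```python
from dataclasses import dataclass

# Hypothetical record shape for an MVC; field names are assumptions,
# not the author's actual artifact.
@dataclass
class MinimumViableChange:
    name: str        # descriptive label
    assumption: str  # the single assumption it is designed to validate
    hypothesis: str  # explicit, falsifiable statement
    metric: str      # how the accuracy of the hypothesis is measured

# Illustrative content only (except the name, taken from the post).
mvc = MinimumViableChange(
    name="Functional Department Kanban Boards",
    assumption="Teams will visualize as-is work without other agile practices",
    hypothesis="Departments will operate their boards independently within weeks",
    metric="Number of individuals demonstrating board-operation behaviors",
)
```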


Every MVC followed a similar measurement approach. Each MVC targeted a specific set of skills taken from the Transformation Participation Engine described in the previous post. The value of each MVC was determined in terms of the increased number of individuals acquiring new skills. Assumptions could then be validated by measuring actual changes in behavior, and determining whether the adoption approach underlying the campaign was sound.

An example of this approach is the Kanban strategy, which was divided into a number of campaigns. The Visualize As Is Work campaign was dedicated to visualizing current state processes, and gradually implementing WIP limits, policies and other components. No other agile practices were included. An example of an MVC implemented for the Visualize As Is Work campaign was the setup of Functional Department Kanban Boards.

On the other hand, the Kanban/Agile Pilots campaign used a much more aggressive approach, mixing Kanban and Agile practices such as story based development, cross functional teams, planning poker, etc. An example of an MVC implemented for this campaign was dedicated coaching and training on story mapping.

Other campaigns included a Kanban self-starter program allowing everyday staff to run their own Kanban initiatives, a hiring campaign to recruit dedicated Lean/Agile coaches, and a gamification framework that would render individuals' progress in skills and behavior in a role-playing-game-style character sheet and leaderboard.


Using Kanban to track validated learning, while supporting a Kanban transformation

Of course, to track the progress of our organizational transformation, the change team used a Kanban system. During the first 3 months of this engagement our Lean Startup Kanban system changed 5 times. Our end product was much simpler than previous incarnations, and has provided us with excellent support for a validated learning change approach.

The backlog consisted of numerous campaigns, each associated with a set of MVCs that could validate the assumptions contained within it. The priority of a particular MVC was represented by its position from left to right in the backlog.

MVCs were sized so that lead time would be between 1 and 3 weeks. During the preparation phase, the metrics used to measure a specific hypothesis were defined, and the exact impact on, and commitment from, the targeted set of clients was also specified and communicated.

MVCs were then introduced to a subset of the organization known as a cohort. The introduction state involved initial coaching and training, hands-on workshop facilitation and other activities. Once the client was deemed to be operating somewhat independently with the newly introduced skills, we moved the MVC to the watch state.

Watching consisted of observing the behavior of our customers, and measuring specific behavior according to the Transformation Participation Engine. Once our customers had been observed for a suitable amount of time, we then measured the MVC, and determined if our outcomes matched our hypothesis. 
At first it was difficult to determine when to move an MVC from watch to measure. We soon came up with a simple rule: an MVC could be moved as soon as someone on our team felt that a campaign required a change in tactics. This called for immediate action to measure the MVC that was currently in flight, and to introduce a new MVC to validate the modified approach. Often these observations preceded our measurements; we were measuring people's behavior, behavior that we had to observe manually. As a result our observation and measurement operated in tandem with each other.

Once a week we held team retrospectives, at which point we reviewed all measured MVCs, discussed the outcomes and moved each MVC to the appropriate pivot or pursue lane depending on the results of the discussion. The decision to pivot or pursue was also typically made at these retrospectives, once we had an opportunity to review a batch of MVCs.
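The board lifecycle described above (backlog, preparation, introduction, watch, measure, then pivot or pursue) can be sketched as a simple state machine. The column names are taken from the post; the code structure itself is an assumption, a sketch rather than the team's actual tooling.

```python
from enum import Enum

# Column names from the post; the Enum itself is a hypothetical model.
class State(Enum):
    BACKLOG = 1
    PREPARE = 2
    INTRODUCE = 3
    WATCH = 4
    MEASURE = 5
    PIVOT = 6
    PURSUE = 7

# The left-to-right flow of an in-flight MVC.
FLOW = [State.BACKLOG, State.PREPARE, State.INTRODUCE, State.WATCH, State.MEASURE]

def advance(state):
    """Move an MVC one column to the right, stopping at MEASURE."""
    i = FLOW.index(state)
    return FLOW[min(i + 1, len(FLOW) - 1)]

def decide(outcome_matched_hypothesis):
    """At the weekly retrospective, a measured MVC moves to pursue or pivot."""
    return State.PURSUE if outcome_matched_hypothesis else State.PIVOT
```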

Tuesday, April 24, 2012

Introducing the Transformation Participation Engine

Update: This part of the method is no longer being practiced by our team. We still feel that having a clear learning path for individuals is important, but it must be voluntary, transparent, and based on self-assessment. It must also be pull based, where participants actually ask to be part of the system. Stay tuned for future updates.

I am currently knee-deep in another large-scale IT organizational transformation. Again Kanban is a critical enabler, as are a mixture of agile methods. What makes this transformation different is our team's decision to manage the change initiative using a modified form of lean startup methods.
The following definition of a startup from Eric Ries's Lean Startup book particularly inspired us:
a human institution designed to deliver a new product or service under conditions of extreme uncertainty
By this definition, an enterprise change initiative could be deemed a startup, one that could take advantage of Lean Startup techniques.


We quickly came up with an approach to guide our change initiative based on the Lean Startup method. We called it the Lean Startup Change Approach :-). What quickly became obvious to us was that it was not at all apparent what we were supposed to be measuring.

Measuring the things that matter for a change initiative

This turned out to be more challenging than expected. At first we tried to be overly clever, and come up with experiments that would validate the performance benefits of Agile and Kanban practices. This exercise would realistically take many months, if not years, to complete. Our team could not dedicate that much time.

After a significant amount of thinking, we focused our efforts on figuring out how to measure behavioral change and the capability of the organization to adopt different methods, as opposed to measuring the methods themselves.

Any validated learning effort should focus on assessing the areas of highest risk first. In our case we needed to be able to effectively change the working habits and thinking culture of our clients before our involvement in the change effort came to an end. Our job was to make sure that our clients were positioned for success once we left.

The Transformation Participation Engine

With this in mind we created a "Transformation Participation Engine" framework. The objective was to track and visualize the progress of adoption for individual staff on the journey towards lean thinking. Minimum viable changes (MVC) could then be developed specifically to target measurable changes in a subset of the organization.

We defined such a system by deconstructing the objectives of our change initiative into a set of fine-grained target behaviors. We then grouped those behaviors into specific skills, grouped those skills into tracks, and finally grouped those tracks into strategies. Below is a simple diagram showing the components of the Transformation Participation Engine, along with a sample of each component in brackets.
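The hierarchy described above (strategies contain tracks, tracks contain skills, skills contain behaviors) can also be sketched as a small data structure. The names Kanban, Invent, Design and the "build a Kanban system from scratch" behavior come from the example later in this post; everything else here is hypothetical filler.

```python
# Hypothetical data model for the engine's hierarchy:
# strategy -> track -> skill -> list of target behaviors.
engine = {
    "Kanban": {                       # strategy
        "Invent": {                   # track
            "Design": [               # skill
                "Build a Kanban system from scratch",
                "Define explicit policies for the board",  # invented sample
            ],
        },
    },
}

def behaviors_for(engine, strategy, track, skill):
    """Look up the target behaviors attached to a given skill."""
    return engine[strategy][track][skill]
```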


Once we had a robust repository of behaviors and skills, we associated each skill with an achievement rating. The achievement ratings for skills were then used to calculate an individual's overall progress in terms of participation in the transformation. In our first iteration of the transformation participation framework we followed a very simple calculation algorithm: achieving a rating from a single skill would be enough to promote an individual's overall progress to that rating. We anticipate using a more complex algorithm as we continue to use this framework.


Example: the Kanban category is divided into various tracks, including Operate, Invent, Manage and Own. The Invent track contains a number of skills, including the Design skill. In order for someone to successfully achieve this skill, he or she would need to demonstrate evidence of a specific set of behaviors, one of which is building a Kanban system from scratch. If Bob completes the Design skill, which has an achievement level of Prowess, his overall progress is therefore Prowess.
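The simple first-iteration rule (any single skill at a given rating promotes overall progress to that rating) amounts to taking the highest rating achieved. A sketch, assuming a hypothetical ordering of rating names; only "Prowess" is mentioned in the post.

```python
# Hypothetical rating ladder, lowest to highest; only "Prowess" comes from the post.
RATINGS = ["None", "Novice", "Practitioner", "Prowess"]

def overall_progress(completed_skill_ratings):
    """First-iteration rule: overall progress equals the highest
    rating achieved on any single completed skill."""
    if not completed_skill_ratings:
        return "None"
    return max(completed_skill_ratings, key=RATINGS.index)
```

So if Bob has completed one Novice-rated skill and the Prowess-rated Design skill, his overall progress is Prowess.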

Using Kanban to visualize transformation participation

We defined a Kanban system to visualize and measure the learning/participation progress of individual staff, managers and executives using this framework. Each individual was represented as a set of work tickets within a Transformation Participation Kanban system.
A separate swim lane was used to track each FTE's overall progress. Each employee within the organization had exactly one ticket in the “overall progress” swim lane.


A separate area of the Kanban system was used to visualize an individual’s progress in various skills. An employee ticket was cloned for each skill that he/she was trying to complete. These "skill" tickets would progress through the skills track according to skills completed. As skills were completed, they would provide the employee with an "achievement rating".

The employee work ticket within the “overall progress” swim lane would move to the appropriate state according to the achievement rating received by completing particular skills.
With this system in place we were able to both track and project the rate that the organization would be able to adopt new methods. This became our primary method of communicating status throughout the transformation.


Once we had this measurement system in place, our work turned towards determining as quickly as possible whether any of our transformation methods would support the projected velocity of change. We then needed to design MVCs to specifically evaluate these assumptions. In essence, we elected to focus exclusively on "growth assumptions" for the immediate time being.

I'll talk about how we designed our various MVCs, and provide examples, in future posts.

Saturday, April 14, 2012

The Futility of Tracking Individuals' Time

An inordinate amount of economic resources is spent on managing people's time. Professional services firms, IT departments, and design agencies are typical examples of organizations that ask their staff to track every task they do, the type of task, the client the task was for, and so on, all down to the hour, or minute, depending on where you work.

The rationale for doing this seems perfectly reasonable, even if the outcomes do not.

Chiefly, business owners want to know if they are charging the right price for services rendered to clients. Tracking effort is seen as essential to answering that question. Supposedly this also helps managers optimize efficiency.

Tracking time is also cited as a way to prevent abuse at work. Time management is supposed to protect workers from being forced to work unreasonable hours. It is also touted as a way to keep workers honest, preventing them from spending too much time googling, facebooking, or otherwise goofing off.

While the objectives are rational, the ability of management to ignore the obvious dysfunctions of time management is not.

Whenever time tracking is used to prevent abuse, falsification of data is the result.

Project Managers will prevent workers from tracking after hours work if it affects their budget, regardless of exhortations from senior leadership. This is especially true in professional service and consulting firms.

Workers will also enter time according to the expectations of their management, implicit or otherwise. No one wants to be flagged as a bad performer.

Even when discounting abuse, time entry is notoriously inaccurate; most workers have trouble remembering exactly what they did down to the hour for an entire week. Again, time gets entered according to management expectation, rather than reality.

The really insidious aspect of time management is that it measures the wrong thing. It emphasizes an inward perspective where cost is king.

Success in a customer experience economy requires an external perspective, one focused on getting a handle on the creation of customer value.

On first look, this is a more involved exercise than tracking individuals' time.

Measuring customer value requires a deeper understanding of the types of goods and services you offer to your customer. This allows you to look at metrics like throughput, the time it takes to create customer value, and how often you deliver value without incurring customer complaints.

These are the metrics that matter to your customer, not the exact time spent by every FTE. Cost can still be measured, but by approximation, which is more than good enough for most situations.

Simply take the total burn rate required to produce a product or service and divide it by the throughput. Effort spent across multiple services gets amortized across the portfolio.
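The arithmetic above is a one-liner. The numbers below are illustrative only, not from the post:

```python
def unit_cost(burn_rate_per_week, throughput_per_week):
    """Approximate per-item cost: total burn rate divided by throughput."""
    return burn_rate_per_week / throughput_per_week

# A team burning $50,000/week while delivering 10 work items/week
# costs roughly $5,000 per item, with no per-person timesheets required.
```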
