Monday, February 15, 2010

My Interview with Scott Ambler

A couple of weeks back I had the opportunity to interview Scott Ambler. Of course Scott doesn't really need an introduction, but for those who don't know him, Scott has been a tremendous help to the agile community for years. He's the author of the agile modeling method, one of the authors of the enterprise unified process, and has played a key role in helping IBM and IBM clients become more agile. Of course this is just a brief overview of some of Scott's accomplishments.



The interview was comprehensive, and we were able to discuss a number of issues important to both the agile community and the developer community at large. I recommend taking a look. Scott has some great thoughts on the role that agile is playing, its limitations, and how to scale agile beyond the core use cases that many agile practitioners discuss. In this interview he shares his experiences applying agile and agile governance internally at IBM as well as with his customers.



What’s keeping you busy these days?


I'm spending most of my time helping customers adopt agile and lean, particularly at scale, and use agile to address more complex, nonstandard problems. I've also just been working on a two-day disciplined agile delivery workshop, which will be offered to our clients starting in February. The focus of the workshop will be on how to do agile delivery end to end, targeted at intermediate development teams, i.e. developers and their immediate line managers.



What kinds of practices will be covered in the training? Will the focus be on technical or management practices?


The training will be comprehensive, offering a combination of leadership-style material, modeling, how to do agile documentation, test-driven development, and also how to incorporate independent testing into agile. The goal is to explain how to do agile delivery end to end and cover the full spectrum: not just the cool "sexy" stuff that everybody likes to talk about, but the complete set of artifacts necessary to make agile delivery successful in the real world.



Let's circle back to that in a second, but first of all, tell me about a typical day in the life of Scott Ambler. How do you go about spreading the word and evangelizing agile, both within IBM and with clients?


I sort of wish I had a typical day, but I don't. A lot of my focus is on helping customers. I might spend an hour or two a day on conference calls with customers; I get to work out of my house for the most part. A lot of IBMers work from home, although I am also frequently on site with customers, helping them work through whatever their complicated problem of the day is. I occasionally sit in on sales calls as a technical expert, and I have also been playing a role in the adoption of agile within IBM itself. In terms of adopting agile within IBM, I give advice to specific internal IBM product teams on how to become more agile.



What are some of the things you do specifically to help IBM adopt agile development internally? How much of this is actually working with IBM teams developing products in an agile manner, as opposed to helping consultants and professional services be more agile in the way they help clients build applications?


I would say it's approximately two-thirds customer facing, and one-third working directly with internal IBM teams.



Taking into consideration all your duties at IBM, both internally and with customers, what are some of the big issues that keep you up at night? What would you cite as your biggest challenges?


It's usually around helping people figure out how to scale up agile and address some of their more complex problems. The other issue revolves around how to replicate myself. I have been doing a fair bit of enablement and mentoring so that I have a group of agile adoption experts who are able to further my work. I'm probably spending a good 15 to 20% of my time directly mentoring these kinds of people.



What kinds of attributes are you looking for in the people that you want to mentor? And more generally, what attributes do you think it takes for a person to successfully drive the adoption of agile?


The standard customer-facing attributes are a must, as is flexibility, and of course the person should be reasonably senior, with a minimum of 10 years' experience. I will work with people fresh out of school, just not with the intention of putting them into senior adoption roles.



You have worked on some awfully big projects in the past and written some very well-received books: the agile modeling method, your participation in the enterprise unified process, and so on. What's the next big project that's going to come out from Scott Ambler?


Right now it's finishing the agile scaling material. I've been working on a series of white papers on scaling agile. There are four papers in total. The first one was the agile scaling model, which came out in December. The second, which came out in January, is an executive paper summarizing the remaining material, even though those papers haven't come out yet. The third white paper applies the agile scaling model to show how to scale several common agile practices, for example how do you scale daily standup meetings on larger projects.



Can you go into more detail about the agile scaling model?


The agile scaling model addresses the various issues that you will run into when trying to apply agile practices at scale. There are two primary ways to apply the model. The first is to take a practice, for example daily standups, and then apply one or more scaling factors, which will change the way you use the practice. As an example, the way you want to manage daily standups will be very different if you have a large, geographically distributed team than if you have a small co-located one. Of course, not all factors affect all practices.



The agile process maturity model was about ... "how do you take a mature approach to agile in a disciplined way" ... phenomenal pushback ... the agile community literally went rabid ... of course developers aren't always good at seeing the big picture ... maybe there are reasons organizational leadership is selecting these metrics to measure ... such as keeping people in the company out of jail



Would you also agree that some of the practices don't even apply without the presence of a scaling factor?


Yes, some practices start to kick in when a scaling factor becomes apparent. So for example if you're in a regulatory environment, you will be doing a lot of things that you wouldn't even consider in a non-regulatory environment. You'll be doing more documentation, and formal reviews will start making more sense.



What was your primary motivation for creating the agile scaling model? When did you stop and say, we need this to be able to take agile to the next level? You seem to have a bit of a unique brand compared to some of the other agile "gurus" out there; you have chosen to deal directly with some of the "nonsexy" agile issues. When did you start down this path?


That's actually a hard question. It started several years ago, working with customers, where I started to see several common patterns: situations where mainstream agile was actually breaking down, for example the many situations where small, co-located teams just don't work. What we were seeing were common factors that were causing us to modify the way we did agile in a very repeatable way. I kept getting asked, "in this situation what do you do?", and of course it wasn't just me. We were also being pushed by a lot of our customers to look at agile and CMMI together, so we started talking about a maturity model for agile. In fact a lot of the agile scaling model has its roots in our early work on something we called the agile process maturity model, which was about the question "how do you take a mature approach to agile in a disciplined way?" What was interesting is that we got phenomenal pushback from the agile community; they literally went rabid. You mention the word maturity and a large part of the agile community goes nuts.



Why do you think the agile community has, for the most part, such a bias against maturity models?


Well, I think there are a lot of questionable implementations of CMMI. The big challenge with CMMI is that it is really attractive to bureaucrats, so what you end up with are some very document-heavy processes, because that's what these kinds of people believe in. Of course this isn't always the case, but it's often true in general. And a lot of developers, agile or otherwise, have had some pretty bad experiences and have not appreciated it. Of course, developers aren't always good at seeing the big picture. Maybe there's a reason that organizational leadership is selecting these metrics and trying to measure efficiency, maybe there are reasons why the integration control processes are more complicated than simply dropping some code into a source control system, and maybe there are reasons why people outside of the development team actually need to do reviews, such as keeping people in the company out of jail. A favorite example of mine is time reporting. Everybody whines about doing this, and rightly so; I don't like doing it myself. But the reality is, and this is a hard observation for developers, that the most valuable thing you might do during the week is actually entering your time, and that just blows their minds. A lot of development can be written off against research and development tax credits, so properly entering time against different activities can result in significant savings for the organization.


At the same time, bureaucrats can be out of control and mess things up.



... but nothing really in it (CMMI) that says you have to be documentation heavy, and bureaucratic... if you let bureaucrats define your process, you will have a bureaucratic process. If you have pragmatic people define the processes you’ll have something pragmatic.



What are some alternate ways you could use CMMI-like approaches to balance the things you need to do to stay out of jail, for instance, with the desire to be agile?


The first thing you need to know about CMMI is that there's nothing really in it that says you have to be documentation heavy and bureaucratic. A few of the goals might motivate you in that direction, but really it's a choice in terms of how you implement your CMMI process. You can actually be very agile and very pragmatic. What I like to tell people is that if you let the bureaucrats define your process, you will have a very bureaucratic process. If you have pragmatic people define your process, you'll end up with something that's pragmatic. So developers need to stop complaining about how tough they've got it, and they need to step up and participate in defining these processes. If they don't step up, the bureaucrats will.



So what you're saying is that practitioners need to accept that the need for process compliance will always exist, and that it's their job to ensure that these processes and compliance methods are acceptable to practitioners?


Exactly. Many companies are only used by their clients because they have these CMMI maturity levels, and many developers wouldn't be working for their employers if these companies weren't CMMI certified. Hopefully those developers would have jobs at different places, but maybe they wouldn't. So developers need to appreciate these bigger issues sometimes. And again, CMMI implementers need to do a lot better at streamlining things and making them more pragmatic.



What would your recommended first steps be for a CMMI level 4 or 5 organization that is trying to become more agile?


My favorite first step is almost always to recommend that the organization start trying to work in short iterations, with the goal of creating potentially shippable software. That sort of brings reality to the conversation. An interesting observation is that pretty much every CMMI level 4 or 5 organization I've been to is trying to become more agile. They've actually got metrics that tell them where they are effective and where they are not.


One of the biggest challenges for services firms is when their customers are maturing to level 2 or level 3, are not agile themselves, and mandate that their service providers cannot be agile.



... so-called precision, and "accurate estimates"... is false... it (an accurate estimate) comes at a pretty huge cost, which the customer is oblivious to... (estimation) creates very dysfunctional behavior on behalf of the IT department... I always ask them what they're smoking... (management and the business should) start actively managing the project.



So for service firms, getting customers to give up "long-term planning precision" in exchange for agile approaches is a real challenge?


I think what customers need to realize is that the so-called precision, the "accurate estimate" that they are getting, is false. They aren't getting that at all, and when they do get it, it comes at a pretty huge cost, which the customer is oblivious to because the service provider is a black box to them. And this is true even if the customer is not using a service provider; an IT department is pretty much a black box to most of its business customers. They have no idea what the actual cost of this demand for a supposedly accurate estimate is on the project. This need for a completely fixed upfront estimate creates some very dysfunctional behavior on the part of the IT department. Getting an accurate estimate is actually a very costly thing to do, and nobody is very good at it. That's one of the great myths of the IT world; the fact is that we are still not very good at estimates. Most of the time our stakeholders aren't even very good at telling us what they want, so how can we possibly give them an accurate estimate? And even if they could tell us what they want, they are most likely going to change their minds anyway. So what you want to do is move away from this false sense of security and start actively managing the project. Instead of putting all this effort into getting an accurate estimate upfront, actively manage the ongoing costs of the project, keep an eye on things, and be actively involved. That's critical.



So what would your response be to people who come up with complex estimation models like COCOMO II and function point analysis, and other approaches that try to put some science into estimating?


I always ask them what they're smoking. Even the estimating community is pretty transparent about the fact that they're not very good at estimating. The best estimates are done by people who know what they're talking about and are more than likely going to actually be involved in the project. Using them to come up with a ballpark is faster, cheaper, and in the long term more effective than using function points or whatever kinds of points you want. The challenge is that we need to step back from this wishful thinking. What I typically ask people is, "how well has this approach worked for you in the past?" Sometimes people will lie to me, but most of the time it becomes very obvious that this just doesn't work. Sometimes the only way to get these accurate estimates is for service providers to basically lie: they pad the estimates and costs. If you're going to pad the budget by 30 to 40%, your "scientific" estimate is anything but.



... you can come up with a fairly good guess (for estimates) fairly quickly... …just as accurate as function point counting... you can always refine... check to see if you’re on track... if not make changes you need or shut down the project.



So how do we reconcile this reality of estimation with the need for the business to have an understanding of cost? The business will have some concept of value; it may be intangible, but in some cases it is reasonably well known, and they want to contrast this value with what delivery will cost them as a business. How can you communicate to the business that they can't get an exact cost?


Well, I walk them through what's happened in the past, because the business is smart; they know that these estimates aren't any good. They keep hoping for the best, so I try to get them to see reality. That's not to say there isn't some benefit to getting an initial cost within plus or minus 30%, but we can do that pretty quickly; we don't need to create complex models and function points, which is a phenomenally expensive thing to do just to get an initial estimate, and it doesn't give us very good accuracy. If you can get a few smart people into a room and talk through what the scope is, what your architecture approach is going to be, and what the requirements are, then you can come up with a fairly good guess fairly quickly, and that's going to be just as accurate as function point counting. You can always refine your estimate as you go. For instance, if you are 3 months into an 18-month project and you have actually done some development work, you can check to see if you're on track and if the team is producing the value it's supposed to be producing; if not, you can make the changes you need or shut down the project.
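The "check at three months" idea boils down to simple arithmetic. As an illustrative sketch only (the function name and all the numbers below are made up for this example, not from the interview), a naive burn-rate projection looks like this:

```python
def projected_cost(months_total, months_elapsed, spent_so_far):
    """Naive burn-rate projection: assume spending continues at the
    current monthly rate for the remainder of the project."""
    monthly_burn = spent_so_far / months_elapsed
    return monthly_burn * months_total

# Hypothetical scenario: 3 months into an 18-month project with a
# $1.8M budget, the team has spent $400K so far.
budget = 1_800_000
forecast = projected_cost(18, 3, 400_000)
overrun = forecast / budget - 1

print(f"Projected total: ${forecast:,.0f}")  # Projected total: $2,400,000
print(f"Overrun: {overrun:.0%}")             # Overrun: 33%
```

A check this crude is obviously not a precise forecast, but it is exactly the kind of early, cheap signal the interview is describing: a third of the way in, the numbers already say "adjust scope, add budget, or shut it down."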



So what you’re saying is essentially set up a framework where you fail fast and fail cheap?


Exactly. That way there is the opportunity to spawn a couple of projects that might be worthwhile; you can continue with the things that are successful and shut down the things that aren't. I run surveys for Dr. Dobb's, and part of one was about project success: I asked participants how many of them typically canceled projects that were failing, and only 40% said that they actually cancel projects that are in serious trouble. That is obviously problematic; if a large number of projects fail, then you should be trying to cancel them as soon as possible. Effective project management should be focused on getting projects out of trouble or killing them right away, but a lot of organizations have this dysfunctional behavior where they are not allowed to do that.



Go to conferences, user groups and talk to other people,... others are having the same problems that you do. Nobody is pulling this crap off.



It's interesting that on one end of the spectrum there are companies like Google, Yahoo and even Wal-Mart that are fail fast, fail cheap, experimental; they try a lot of different things. On the other hand you have traditional enterprise organizations. How do you move some of these companies in the other direction, where it is actually okay to fail?


That's a problem because it's a huge cultural challenge; culture is one of the scaling factors in the agile scaling model. You need to show people how things are or aren't working; however, you can take a horse to water but you can't make him drink. I'm basically an evidence-based guy, so I try to get them to observe that the current approach is not working out very well for them. A lot of people know inherently that the traditional approach isn't working out very well for them, but they don't want to admit it or they don't know that there are better options. Then I start sharing evidence that it's not really working out for anybody else either. One thing I tell people is to go to conferences, go to user groups, and talk to other people so that you have a better understanding of their experiences in the industry; you will find out that others are having the same problems that you do. Nobody is pulling this crap off. It's really pretty shocking when you talk to other people and find out that everybody is frustrated for the same reasons and nobody has it right. So let's start doing the things we know work, and start abandoning the things that aren't working for anybody.



Do you think today's economic conditions have done anything to spur interest in agile, forcing IT organizations to be more effective at delivering software for less?


I would actually say it's neutral. The problem is that becoming more agile or more lean actually requires investment from these organizations, i.e. you have to spend money to save money. These organizations are tight for money, so they want to improve things, but to improve things you actually need to spend. Some organizations are figuring that out, but some are waiting until they have some budget.



(I am most proud of) my work on agile modeling,... a tough topic before it’s time,... simple solutions like writing stories on index cards are nice,... people are doing a lot more modeling than that. …agile modeling appears to be more popular than TDD... 80% of agile teams use some form of upfront modeling,... it’s not part of what the agile community is "allowed" to talk about... there’s some significant dysfunction in the way that the agile community communicates...



With all the publications, books, and projects that you've worked on, what strikes you as the one thing that you are most proud of?


I would say my work on agile modeling. I took on a tough topic that was probably before its time, and what's interesting is that in the last couple of years I have been seeing more and more references to the work, and more and more people starting to get it. I get a phenomenal number of hits on that site; as more and more people start to scale agile and move out of small co-located situations, modeling becomes more important. Simple solutions like writing stories on index cards are nice, but when you actually step back and observe what people are doing, they are doing a heck of a lot more modeling than that. In the surveys that I run, agile modeling appears to be more popular than TDD, even within the TDD community, which I thought was strange. Forrester ran a survey a while ago and agile modeling was rated as more popular than extreme programming. To be honest I question the way they worded the survey, but I would not have predicted anything like that at all.



Personally I'm a big fan of the agile modeling method stuff from years ago. I think it's because AMM is really a set of processes that are pragmatic and make sense, so people are just doing it without even realizing it's AMM, unlike some of the other more strongly branded practices like Scrum or XP?


Yes, branding is a good word, because practices like XP and Scrum, and particularly Scrum, have become brands. Modeling is just not seen as sexy by the developer crowd, so something like XP, which has almost no modeling, took off with very little marketing. It became popular really quickly because it sounded cool and it really appealed to developers.



So developers like it for the same reason that the business hated it (XP)? Visions of programmers on snowboards?


Exactly. But seriously, the challenge with XP is that it requires so much discipline that even though everybody wanted to do it, very few people were actually able to pull it off. Agile modeling didn't have any marketing; we didn't put a bunch of certifications around it, and there is no scam around it. And it was competing against simpler messages like user stories, and while user stories are great, they're not enough to get the job done. But that's just not what people want to hear. So one thing I do is run surveys to find out what people are actually doing in practice, even though they may not be talking about it, and I've been criticized for that. These surveys tell me 85% of agile teams do some form of upfront requirements and upfront modeling. It's rare to hear agile teams talk about this; it's not sexy, it's not part of what the agile community is "allowed" to talk about, and it's not part of the agile culture. But I think there's some significant dysfunction in the way the agile community communicates. There are these taboo subjects, things that agile people aren't supposed to talk about.



(The agile community) over focus on hard-core developer stuff, ...they don’t talk about the not so sexy stuff, something about this engineering mindset that wants black-and-white answers. Most of the world doesn’t work that way... they are fundamentally not getting the job done.



It's my opinion that there's a fair amount of rhetoric coming out of the agile community that's almost self-defeating. How would you characterize this? What are some of the big limitations of the agile community?


There is an over-focus on hard-core developer stuff, which is good; you should take pride in your work. Agile provides some good processes for developers to rally around, but there's this huge focus on development, and they don't talk about the not-so-sexy stuff, like documentation, requirements analysis, and testing, even though they are usually doing it. Developers need to get away from some of the agile marketing; I don't know what it is about developers that they seem to fall for this blatant marketing stuff.



So they are prone to purism?


Yes, there's something about this engineering mindset that wants black-and-white answers. Most of the world doesn't work that way; software is mostly an art, not a science. Some of it is science, but very little. Developers tend to focus overly on technical problems, like what the next version of the JDK is going to be, which is interesting but not crucial; in a couple of months another version will come out and make those facts obsolete. So developers will spend a lot of time learning the guts of the technology, but not spend time learning more business-oriented skills, such as how to develop a good user interface, which is what their actual business users want. So often they are fundamentally not getting the job done because they're focusing on technology over what's actually needed.



... there's a lot of good stuff in Scrum... but it's very limited... it's overly simplistic... this concept that you can take a two-day course and become a Scrum Certified Master; people fall for this marketing stuff. It completely cheapens it. (the agile brand)



You raised an interesting point about some agile practices employing "blatant marketing"; I take it you're talking about Scrum? You've been a vocal critic of some of the practices of Scrum. Where do you think Scrum has gone wrong? What are some of the things you feel they are doing that are harmful to the community?


Well, first of all there's a lot of good stuff in Scrum; the ideas are great, but it's very limited. The focus of Scrum is project leadership, requirements management, and some stakeholder interaction for the most part, which is important stuff. But it's overly simplistic, which is okay because when you're in simple, straightforward situations, simple approaches work. Also, the Scrum certification approach has helped to popularize agile, and that's a good thing. But this concept that you can take a two-day course and become a Scrum Certified Master, I mean, come on. But people fall for this marketing stuff.



Do you think this certification marketing cheapens the agile brand?


It completely cheapens it. A friend of mine is a respected member of the agile community, and he's got his doctorate. On his business card he has two designations, his PhD and his CSM, one right above the other, as if these things were even remotely equal to each other. As if spending five years doing a PhD were equal to spending two days in a classroom trying to stay awake. And he does this because his customers want people with the CSM designation, not realizing that it's just a two-day course. I mean, so what? Also the "scrum but" phenomenon is utter nonsense. About a year ago I was on a panel with a Scrum expert, so basically me, the "radical", and a bunch of other experts...



the typical scrum but rant... you had to do everything (in Scrum). Instead of listening to the marketing rhetoric... ask them (the audience) what actually works in the real world. ...scrum but is basically marketing... in the hopes that you'll take the (Scrum) course and become a Scrum Certified Master.



The "unnamed" radical?


Yes, exactly. Somebody in the audience asked if it was possible to adopt some of the practices of Scrum but not all of them, and the Scrum leader jumped in and said that that was a bad idea, the typical scrum but rant, saying that you had to do everything. So I was shaking my head, and when I got a chance to speak I basically said, okay, fine, instead of listening to the marketing rhetoric let's go to the audience and ask them what actually works in the real world. Does anybody in the audience actually benefit from having short iterations? What about a leader who focuses on mentoring and coaching and not just traditional project management? What about standup meetings? In every instance hands went up. So it is possible to benefit from adopting just a couple of these practices; sure, there is some synergy in adopting more, but you can still get benefit from adopting these things on a one-off basis. So the scrum but stuff is basically marketing, trying to get you to adopt all of the practices, in the hope that you'll take the course and become a Scrum Certified Master. What you're seeing here is basically marketing taken to the extreme.



So on one hand Scrum has been criticized as too simple; on the other hand RUP has been criticized as too complex. Do you see a middle ground between what Scrum is doing and what RUP is doing?


Yes. The challenge with Scrum is that you have to figure out what to tailor in, and the challenge with RUP is that you have to figure out what to tailor out. Either approach puts a lot of onus on the person using the framework. But there is a middle ground, which is something the OpenUP community has tried to provide. They've done a really good job, although of course there is some bias against them from the agile community because of the bias against RUP.



So would you recommend something like OpenUP for customers who are trying to get to a disciplined agile approach?


Yes. It's not perfect, but it's got a lot of good things in it. What I've found is that many times I'll go to clients who are trying to be agile and have had to reinvent many of the things that are already in OpenUP, and they've spent an awful lot of money doing it.



One of the charges against the agile approach is that a lot of the tooling doesn't work easily with mainstream vendor-supplied solutions. For instance, it's challenging to do test-driven development using an end-to-end SOA suite provided by TIBCO, and it's next to impossible (at least out of the box) to do this type of development for solutions using packages like SAP or Siebel. Is this a real challenge, and do you think it's being addressed?


This is a big challenge for the agile community. A lot of the tools are open source and very good at doing what they are supposed to do. However, it's one thing to build a tool in isolation for a specific purpose; it's much harder to build an integrated toolset that covers an end-to-end lifecycle and a number of different situations.


For instance, the Jazz platform from IBM has been in development for several years by some very smart people, and it's an integrated toolset that covers the entire application lifecycle.



... some people would say that you need to do a fair amount of upfront requirements (for package/COTS)... that's a bit of a myth; typically there aren't that many options to choose from... you can get to an answer fairly quickly. ...the package (community) needs to be encouraged (to try agile)... this is a cultural thing



Would you recommend that people doing ERP-like development, such as PeopleSoft or Siebel, try some of the agile approaches even though the tooling might not be there currently?


Yes, I wrote a column on this for Dr. Dobb's about a year ago. Some people would say that you need to do a fair amount of upfront requirements work so that you can choose the right package, but that's a bit of a myth; typically there aren't that many options to choose from in a particular problem space, and you can get to an answer fairly quickly. It's also very possible to release package-based changes using an iterative approach. I think the package community needs to be encouraged to try out some of these techniques, as they would certainly benefit.



So it’s really a cultural thing that so many package implementations don’t use agile?


Yes, this is a cultural thing. One of the things I said in my article is that the package community could learn a lot from agile. For instance, if somebody is buying a package from a vendor, one of the first things I would ask is where the automated regression test suite is. If it doesn’t exist, I would ask why not.



Certainly most packages don’t have this kind of functionality?


Exactly, and a lot of package implementations run into trouble. It’s not just because they don’t have test suites, but the lack of them certainly isn’t helping. We need to raise the bar on our vendors: if they don’t have a test suite, how can they really be sure that all of their functionality works? And how, as a client, am I going to test the package when I need to integrate it with my own systems?



Can you think of some pragmatic ways that customers of package software can start moving towards this approach?


Well, certainly customers of COTS solutions should be sure to put automated tests around any extensions or customizations they implement on top of the package. But the need for an automated test harness should also be in the actual package RFP; it should be a fundamental requirement of any package.
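As an illustration of what those automated tests around customizations might look like, here is a minimal sketch. The `apply_volume_discount` function and its discount rule are entirely hypothetical stand-ins for a real customization layered on a package; the point is simply that the extension’s behavior gets pinned down by a regression suite:

```python
import unittest

# Hypothetical customization layered on top of a COTS package. In a real
# project this would call into the vendor's API; here it is a self-contained
# stub so the example runs on its own.
def apply_volume_discount(list_price, quantity):
    """Custom pricing rule added on top of the package's base pricing."""
    if quantity >= 100:
        return round(list_price * quantity * 0.90, 2)  # 10% volume discount
    return round(list_price * quantity, 2)

class VolumeDiscountRegressionTest(unittest.TestCase):
    """Pins down the customization's behavior so that a package upgrade
    that silently changes it is caught immediately."""

    def test_no_discount_below_threshold(self):
        self.assertEqual(apply_volume_discount(10.0, 99), 990.0)

    def test_discount_at_threshold(self):
        self.assertEqual(apply_volume_discount(10.0, 100), 900.0)

if __name__ == "__main__":
    unittest.main(exit=False, verbosity=2)
```

Run against every vendor upgrade, a suite like this tells you in minutes whether your customizations still behave as before.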



...certification scams are an embarrassment, ... (the agile community) have a tendency to reinvent the wheel, ... agile doesn’t typically talk about the end-to-end lifecycle.... these activities need to be talked about in a more mature fashion than they typically are... ... idea that most projects can start after two weeks just isn’t realistic.



What do you think are some of the biggest failings of the agile community? The agile community has done a lot of great things but where have they really missed the mark?


Again, the certification scams are an embarrassment, and they need to stop. We also have a tendency to reinvent the wheel. It’s interesting to watch the agile community come across some new, unique technique, only to find, when you look, that it has been in RUP or even CMMI for years, and in other methods before that.



Could you give an example of something where the agile community invented something that was not really new?


A couple of years back the XP community came across the idea of building a "steel frame", basically a working reference or skeleton of a system. This is an idea that’s known as the elaboration phase and has been in RUP for years; it’s really nothing new. RUP has always had the notion that you need to focus on validating your architecture before you go into construction. It’s still working software, and it still has value, but you’re just reordering your work to make sure that the architecture is valid.



Another failing is that agile doesn’t typically talk about the end-to-end lifecycle. I was at a conference where a Scrum proponent on the panel was saying that on their project they were coding from week one; then somebody from the audience who was actually on the project stood up and called bullshit, because apparently there had been six months of prototyping and requirements work done before Scrum had even started. These activities need to be talked about in a more mature fashion than they typically are by the agile community. This idea that most projects can start after two weeks just isn’t realistic. When I did my survey I found that many agile projects took several months or more to get going. The initial stack of user stories needs to come from somewhere, someone has to decide how to fund the project or whether the project gets funding at all, and even if it’s just a small co-located project, somebody has to find the room where everybody’s going to sit. These things all take time.



...lean explains why agile works. It also provides a philosophical foundation for scaling agile...



Lean has now emerged as the new popular buzzword in the agile community, and it has a much more end-to-end lifecycle connotation. Do you think the focus on lean is a good thing, or is this just another buzzword?


I actually think there’s a lot of good stuff in lean; in many ways lean explains why agile works. Things like deferring commitment and eliminating waste really hammer home why agile works. Lean also provides a philosophical foundation for scaling agile. Frequently when I come across organizations that are doing a good job of scaling agile, they have a good mixture of organizational principles, some of which come from lean and some of which are their own, but the fact that they’ve got some grounding principles remains constant.



Would you say that any process framework needs to be grounded in these principles first?


Absolutely. Take XP: it has foundational value statements. Agile modeling has the same thing, as does RUP with its six principles. A process can never tell you in detail what to do, and even if it did, it wouldn’t matter; nobody is going to follow it anyway. People are smarter than that. When you run into something that the process doesn’t cover, you should always be able to fall back on the principles, and this is what lean really brings to the table: a set of principles that make a lot of sense.



So lean also brings a complete view, one not just focused on developers?


Yes it brings a complete picture, all the work necessary to create a product, not just development.



...a lot of inertia in these (QA) communities, inertia with the (QA tooling) vendor, ...the QA group has been so underfunded and downtrodden, that they haven’t had a real chance to come up for air. ... some members of the agile community might not have the most welcoming attitude to people from the QA group.... should be getting these QA people into their teams for their testing skills, ...agile rhetoric can get in the way...



Who are the biggest resisters to agile adoption? What kinds of people typically block agile?


Quite often it’s the typical 9-to-5ers, but in a lot of places these kinds of people don’t exist anymore, at least in IT. Sometimes when working with clients I will come across people in the project management group or the database administration group who are in complete denial, especially the database group. There isn’t a lot of leadership there, especially in the process area; in many ways this group is stuck in a rut from the 70s. In other organizations it’s the quality assurance group, who still insist on big, detailed specs extremely early in the project.



You raise an interesting point: the QA group seems to be on a really different track than the agile one. If you go on their forums or user groups, the talk is very much about testing separately and testing at large using big vendor tooling, with not a lot of talk about integrated test-driven development approaches. Do you have any insight as to why that is?


There is always going to be some need for external testing, such as penetration testing. But there is a lot of inertia in these communities: inertia with the vendors, and inertia with the people themselves. A lot of it has to do with the fact that the QA group has been so underfunded and downtrodden that they haven’t had a real chance to come up for air.



So would you attribute this lack of "fresh" thinking to a general lack of funding and a lack of focus on quality in general?


That’s a big part of it. Also, some members of the agile community might not have the most welcoming attitude toward people from the QA group. The whole idea that because developers do TDD they don’t need testers is a good example. Agile developers should be bringing these QA people onto their teams for their testing skills, so some of the agile rhetoric can get in the way sometimes.



...most IT governance is dysfunctional. Good governance is about motivation and enablement.... you don’t tell knowledge workers what to do, you motivate them, and make it as easy as possible for them to do the right thing. ...many (developers) have had to play the role of governance blocker, ...all these instances of people pretending to follow governance when they’re really not, ... (developers) successfully blocking them, who else is also successfully blocking? How many people are basically pulling the wool over their (governance) eyes?



...developers need to be educated that this (governance) is about getting the job done in the most pragmatic way possible... ...a client had a multi-month architecture review process... if I was their boss I would fire them, instantly... ...governing developers is like herding cats, which is actually phenomenally easy... ...wave a fish in front of the cats, suddenly they’re really interested...



Agile governance has been discussed as an alternative to the typical command-and-control approach. What’s your elevator pitch for those who aren’t in the know about agile governance?


First of all, most IT governance is dysfunctional. For all their talk of measurements and metrics, these governance groups never have very good numbers to show for their own success rates. This is usually the first sign that something is going seriously wrong. Good governance is about motivation and enablement. The first thing governance people need to realize is that IT workers are knowledge workers, and you don’t tell knowledge workers what to do; you have to motivate them, and make it as easy as possible for them to do the right thing. The problem with the command-and-control approach is that it adds another layer of bureaucracy and makes it harder to get things done. So what happens is that knowledge workers do just the minimum work necessary to conform to whatever crazy governing strategy is currently in place. That’s just not the way to do it. If you want your developers, for example, to follow particular coding conventions, formal code reviews are just going to foster alienation. Give them the tools to validate their coding approaches in real time, and work with the developers to help them develop the coding conventions in the first place. Promote things like collective code ownership and pair programming; those two things alone will do more to promote good code than code reviews will.



I have been to a number of conferences where I asked the audience how many of them have had to play the role of governance blocker, meaning that they were responsible for churning out documentation just to make external people happy and give the appearance of complying with some crazy governance framework. In every case a large number of people put up their hands. Then I asked, "how many of you actually got caught?", and everybody put their hands down. So there are all these instances of people pretending to follow governance when they’re really not. On one project I was working on, the client asked what was working and what wasn’t. This project was phenomenally successful in the eyes of the client, but I had to be honest and let them know that out of the 10 people on our project, we had 2 people who were dedicated just to churning out documentation. Documentation that was not used by any of us, and was produced just to shut these people up. So that means we had 20% overhead, overhead that could have been dedicated to valuable things like automating testing or refactoring the code so that it had better quality. Even worse, it means that there was a group of people, the bureaucrats, who were not only not adding value but actually taking value away from the team. The client didn’t really appreciate hearing this, but sometimes that kind of honesty is needed. I mean, we were successfully blocking them; who else in their organization is also successfully blocking them? How many people are basically pulling the wool over their eyes?



...instead of having all these reviews and controls, a lot of these things can be automated… ...we have tools to help you manage and monitor your process improvement.... self-help checks and retrospectives... how well they are doing with process improvement


(these tools give) a very good handle with these groups on what the quality is, what the time-to-market is going to be, project status and stuff like that. Management has the information they need to tell if the team is in trouble right away or whether they’re doing well...



So imagine a governance body is trying to enforce some constraint, say that you have to use "widget A", which may be more or less useful in different situations. The conventional governance approach would be to conduct a review of the solution to ensure that the different teams were using the widget. What would the alternative approach be?


The approach here is to tell teams that the default is to use widget A; however, if a team wants to use an alternative approach, they should be able to, provided they have a good story. But instead of having the review in the first place, it’s probably better to have some principles in place that say our developers are not in the business of building fancy widgets; they are in the business of being a bank, or an insurance company, or a retailer. Developers need to be educated that this is about getting the job done in the most efficient way possible, but there still needs to be some room for creativity. And instead of having all these reviews and controls, a lot of these things can be automated. I was working with a client that had a multi-month architecture review process, regardless of the size of the project. Holy crap. I mean, how do these architects justify their existence? If I were their boss I would fire them instantly. These people have to learn how to be pragmatic. The whole idea of lean governance is to streamline these things and look at behavior. People will tell you that governing developers is like herding cats. Well, actually, herding cats is not hard if you understand what motivates them. In traditional command-and-control you send out a couple of memos, maybe make them go through a couple of reviews, and the cats will ignore you; they’re just going to sit wherever it’s warmest. But getting cats out of a room is actually phenomenally easy: if I wave a fish in front of the cats, suddenly they’re really interested in me. So all I have to do is throw the fish into the next room and all the cats will go after it. I then close the door behind them and I’ve achieved my goal.



So what do we do to motivate developers? Or QA, or anybody in IT?


Knowledge workers are motivated by pride in their work. Allowing them to do good work and be recognized for it is important. Most developers actually want to do the right thing, and most of them understand that they’re working for a company that needs to make money; they understand that there needs to be a margin on the software they produce. Taking the time to educate developers on the fact that coding conventions could save the company up to 10% will help motivate them to take part in creating those conventions. Developers will understand this if you approach them in the right way. Likewise, if you’re in a regulatory environment, you need to do more documentation just to keep yourselves out of jail. And if you’re working on a mission-critical system, you need to do more testing, and the development cycle is going to be slower.



... developers need to realize that ...they are being governed, end of story. They need to understand the goals of governance, ...if developers can get to the root cause, then they can say "hey, there are other ways to achieve these governance goals".



A lot of people think governance is something you do above and around everyday work. How do you enable an environment where governance is everybody’s job, where everyday practitioners are responsible for owning governance? How does that scale?


First of all, developers need to realize that even though governance is a dirty word in their community, they are being governed, end of story. Developers have let bureaucratic people define the governance approach, so they have ended up with a bureaucratic governance approach. The reason developers are so ticked off with governance is that they let the bureaucrats define the governance structure; they need to get involved directly themselves. They need to understand what the goals of governance are, what the principles are, and where the value is. Developers could then say, hey, there are a number of pragmatic ways to achieve these goals. For example, Toyota uses a concept called the 5 whys: whenever there’s a problem, everyone keeps asking "why" until they get to the root of the problem. Developers need to start doing this with process, with governance, and so on. If developers can get to the root cause, then they can say "hey, there are other ways to achieve these governance goals".



Developers also need to realize that they are going to be monitored, and they’re often in denial about this. You can have all the agile processes and daily standups you want, but somebody, either you or your manager, is going to be asked to scrub the results and present them at a status meeting. Of course this comes off as a phenomenal waste of time; it takes a lot of effort to produce status reports. So instead of denying the need for these things, you can start using automated and integrated tooling to help you do the job. Again, Jazz and tools like RTC can help you automate some of these governance requirements.



MCIF (helps) service organizations take a look at client software development processes and how to improve them.... we have tools to help you manage and monitor your process improvement.... some metrics should only be consumed internally by the team, and almost always the process improvement metrics are those ones. ... Rational Insight can allow you to record daily activities... (using) very accessible workbenches, ...you have some hard metrics that management can look at to track effectiveness, but you also have some softer improvement metrics that are only consumed internally by the team.



You referred to some of these "next-generation" tools from Rational that can help scale agile. Can you go into more detail?


First of all there’s something called MCIF, which helps service organizations take a look at clients’ software development processes and how to improve them. It’s not specifically agile, but most of the core content is either agile or lean, because that’s where a lot of the process improvement is coming from. There are two aspects to the toolkit. The first is initial assessments, where service organizations can work with clients to figure out what they’re trying to achieve, what their objectives are, and what the challenges are. It’s your standard assessment, where you end up with some really good thinking and really good advice.



The second part of MCIF is execution: we have tools to help you manage and monitor your process improvement. We have a number of tools that help you not only do self-help checks and retrospectives, but also actually track what teams are doing: how well they are doing with agile or other process adoption, and how effective it is. We want to give teams evidence to prove and measure that the process improvement is actually creating value; we have a tool called Rational Self Check that helps organizations do this. A strange observation that we (IBM) have made is that some metrics should only be consumed internally by the team, and the process improvement metrics are almost always those ones. What we have observed, both within IBM and with our customers, is that when you report process improvement metrics up the food chain, all of a sudden the objective becomes to look good rather than to become good. So if you don’t report software process improvement metrics, the team’s objective stays on becoming good and actually improving.



So how does management get a good sense of how teams are doing if they don’t get to look at these metrics? Does management ever get to look at them?


They shouldn’t be looking at the MCIF metrics; they should be looking at other metrics. On the Jazz platform, you can track and start reporting on a number of software lifecycle transactions using a number of IBM products that give you very accessible workbenches. As an example, Rational Insight can record the daily activities of the IT workers who are producing software and create per-project workbenches that give visibility into what’s actually going on. This allows us to track things like burndown charts and build status, as well as various project and quality metrics. This approach allows us as an organization to be more efficient, and also gives management the metrics they need to effectively govern.
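Metrics like a burndown chart are straightforward to derive automatically from data a tool already records, which is the point about avoiding manually generated metrics. As a rough sketch (with invented numbers, not tied to any particular Rational product), remaining work per day falls directly out of completed task estimates:

```python
# Sketch: deriving a sprint burndown automatically from recorded task data.
# All numbers here are invented for illustration; in practice they would
# come from the team's work-item repository, not be hard-coded.
from datetime import date, timedelta

sprint_start = date(2010, 2, 1)
total_points = 40  # story points committed for the sprint

# Story points completed on each working day of a ten-day sprint.
completed_per_day = [0, 4, 3, 6, 2, 5, 4, 6, 5, 3]

remaining = total_points
burndown = []  # list of (day, points remaining) pairs
for day, done in enumerate(completed_per_day):
    remaining -= done
    burndown.append((sprint_start + timedelta(days=day), remaining))

for d, left in burndown:
    print(f"{d}: {left} points remaining")
```

Because the chart is computed, not hand-assembled, nobody has to "scrub" anything for a status meeting; the workbench just renders what the repository already knows.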



So essentially, you have some hard metrics that management can look at to track effectiveness, but you also have some softer process improvement metrics that are only consumed internally by the teams?


Yes, and you also have some very robust access controls, so you can control exactly who is looking at what; but effectively, how teams are trending on practices should remain private.



So if you’re never showing management the direct results of these process improvement initiatives, how do you justify them?


Management should be focusing on the goals, so the goal is to become more productive, or to deliver fewer bugs, and there should be specific metrics to measure that. Managers should have their own set of more business-oriented objectives, and that’s what Insight allows you to measure: metrics that are generated by the various products and rolled up into something consumable. We actually use this internally to manage our own software groups. We’ve got a very good handle with these groups on what the quality is, what the time-to-market is going to be, project status, and so on. Management has the information they need to tell right away whether a team is in trouble or doing well, and they don’t have to attend all the standup meetings directly. This automation is really important; my general rule of thumb is to question any metric that is manually generated. Certainly there are some metrics that are hard to generate automatically, things like stakeholder satisfaction. But for the most part, metrics can be automatically generated if your tools are sophisticated enough.



So in your lean governance paper you talk about how to set up an environment that constrains developers in a way that pushes them into doing the right thing.


Yes, a combination of education and then motivating people to do the right thing. And part of that is making it as easy as possible for them to do the right thing. A big part of this is automating as much of the governance process as possible.



How do you answer the charge that it is impossible to "do agile" on a particular type of project, whether it’s package work, integration, etc.?


Ultimately, when someone tells me that you can’t do something a certain way, what they’re really saying is that they don’t know how to do it. Sure, there are some situations where agile doesn’t make a whole lot of sense, but they are actually few and far between. The challenge is that if you fall for the mainstream agile rhetoric, you really see a focus on co-located teams and a lot of talk about the core agile practices. If agile practitioners want to actually be applicable to more complex domains and more complex team structures, they need to start addressing these issues with other practices, or a more complete flavor of agile. If you listen to the agile rhetoric around "we don’t do modeling up front, and we don’t do documentation", the agile community has pretty much shot itself in the foot in terms of addressing even remotely interesting situations.



(Architects and managers should) help people solve problems, ...get involved and be somebody that the team actually wants to go to... Look at your job as one of basically serving the people doing work.



So what’s some tactical advice you could give a senior manager or a senior architect, one who has a whole pile of projects that he is responsible for and is trying to manage effectively? How would you help him become more relevant to the work that is actually going on, while keeping the high-level view necessary to do his job?


Help people solve problems. Every team has a challenge and is always struggling with something; engage that team, roll up your sleeves, and help them solve whatever problem they’re having. Those problems could be the need for more resources, better facilities, whiteboards, or even some relief from creating unnecessary documentation. If you are an architect, start helping teams by pointing them at specific frameworks that already exist, existing examples, and so on. Instead of just creating all these models and partaking in reviews, get involved and be somebody the team actually wants to go to. Basically, as a manager, even at the highest level, you need to look at your job as one of serving the people doing the work. Be more of a coach and a leader, be a resource; don’t try to manage them. Don’t try to be a cop. Actually help them, and make yourself useful.



Actually, that’s a great closing line. Thank you for your time.


Thank you.

Sunday, February 14, 2010

It's time for Scrum to evolve

This week has seen a flurry of activity online concerning the current state of Scrum, its flaws, and whether it should be changed.

While online criticism of a particular method is nothing new, more attention was paid this week when Uncle Bob listed 8 issues that he has with Scrum:

1. No technical practices. Scrum is great at giving project management advice,
but provides no technical help for the developer. Any good implementation of
Scrum needs to borrow technical practices from some other method like XP. The
suite of technical practices that should be added probably include: TDD,
Continuous Integration, Acceptance Testing, Pair Programming, Refactoring.

2. 30 day sprints are too long. Most scrum teams have either shrunk them to 2
weeks or perform some kind of midpoint check at the two week mark. I know of
some teams that have two 2-week "iterations" inside a single 4-week "sprint".
The difference being that they use the sprint for reporting upwards, but use the
iterations for internal feedback and control.

3. The tendency of the scrum master to arrogate project management powers. This
is not a problem with Scrum out of the box so much as it is a problem with the
way scrum sometimes evolves. Perhaps it is related to the unfortunate use of
the word "master". Perhaps the XP term "Coach" might be a better word to use.
In any case, good implementations of scrum do not necessarily correlate scrum
masters and project managers.

4. The C in CSM is unfortunate. Again, this is not so much about scrum out of
the box as it is about the scrum community. That letter C has gotten far too
significant for its intention. It is true that the people in a scrum team need
to be trained. One of the things they should be trained about is the role of
the scrum master. The problem with the C is that it changes the notion of scrum
master from a role into a person. It is the person who has the C. In an ideal
case, the members of the scrum team will rotate through the scrum master role
the same way the members of an XP team rotate through the coach role. This
rotation is never perfect, and sometimes the role sticks to one or two people
more than others. But the idea was never to raise up a particular person with a
rank. We never wanted that C emblazoned on their chests.

5. Scrum provides insufficient guidance regarding the structure of the backlog.
We've learned, over the years, that backlogs are hierarchical entities
consisting of epics, themes, stories, etc. We've learned how to estimate them
statistically. We've learned how and when to break the higher level entities
down into lower level entities. Epics->Themes->Stories->Tasks.

6. Scrum carries an anti-management undercurrent that is counter-productive.
Scrum over-emphasizes the role of the team as self-managing. Self-organizing
and self-managing teams are a good thing. But there is a limit to how much a
team can self-X. Teams still need to be managed by someone who is responsible
to the business. Scrum does not describe this with enough balance.

7. Automated Testing. Although this could be considered a derivative of point
1, I thought it worth calling out as a separate point because it is so
fundamental. Scrum doesn't mention this, yet it is the foundation of every
agile effort. Agile teams work in short cycles because feedback only works well
in short cycles. But short cycles aren't enough. You also need objective
measurement of progress. The most reliable way to know how much a team has
gotten done is to run automated tests and count the tests that pass.

8. Multiple teams. Scrum has little to say about the coordination of multiple
teams. This is not a failing unique to scrum. Agile itself is virtually silent
on this issue. Scrum talked about the vague notion of a "Scrum of Scrums" but
that idea really hasn't played out all that well. Scrum-in-the-large remains in
the domain of certain consultants who claim to have an answer. There is no real
consensus on the issue.
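Point 7's idea of measuring progress by counting passing tests is easy to make concrete; most test runners expose exactly that number. A minimal sketch using Python's standard `unittest` runner, with trivial placeholder tests standing in for a real acceptance suite:

```python
import unittest

# Trivial placeholder tests standing in for a real acceptance suite.
class ExampleAcceptanceTests(unittest.TestCase):
    def test_feature_a(self):
        self.assertEqual(1 + 1, 2)

    def test_feature_b(self):
        self.assertTrue("agile".startswith("a"))

# Run the suite programmatically and report progress as the count of
# passing tests -- the objective measurement point 7 calls for.
suite = unittest.defaultTestLoader.loadTestsFromTestCase(ExampleAcceptanceTests)
result = unittest.TextTestRunner(verbosity=0).run(suite)
passed = result.testsRun - len(result.failures) - len(result.errors)
print(f"{passed}/{result.testsRun} tests passing")
```

Wired into a CI build, that one number gives a team (and its managers) an objective trend line rather than a subjective status report.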

Agile blogger Mike Cottmeyer has waded in, stating that the concept of an agile product owner needs to be significantly rethought, and Alan Shalloway of Net Objectives has long maintained that Scrum (and other agile approaches) suffers from an overtly developer-centric focus and an anti-management bias. Alan has stated that extending Scrum with lean/kanban can go a long way toward making it significantly more successful.

Jurgen Appelo has come out swinging in defense of Scrum, saying that the lack of technical practices is a strength, not a weakness, as it allows Scrum to be applied to professions besides development.

For my part, I am firmly with the critics on this one. Let me start by saying that I think Scrum has some great things in it. The management practices are a good way for teams to get started down the road to becoming more agile. Simple things like retrospectives and standups really help noobs understand what agile is.

But that's where it ends. Scrum is so simple and incomplete that it does not adequately serve its target audience. It's fine to say that Scrum is meant to serve multiple audiences, which is great in theory, but let's look at reality: the vast majority of Scrum users are developing software. They are not graphic designers or painters or management consultants. I've seen teams get into serious trouble with flaccid Scrum, i.e. no technical practices, because they thought they had what they needed. In reality they were worse off than their previous waterfall state: no documentation, external QA, but no CI or TDD. What a mess.

Scrum practices also come off as a little naive and almost condescending to teams with even a little emotional intelligence. Rules like "chickens" (i.e. managers) can't speak at certain meetings, and the notion that developers only have to deal with product owners, seem like an extension of the spoiled developer attitudes that came out of the dot-com era.

I actually think that for agile to be successful, teams need to take Scrum practices and mature towards lean and kanban. The language is more mature, and you can use more sophisticated but still simple tools to better model and manage situations more complex than one team serving one product owner.

Finally, I think that for such a simple approach, Scrum is a little dogmatic about what you have to do. Again this has to do with maturity: beginners might need to be told exactly what to do and who can talk in a standup meeting, but teams that move a little up the maturity curve quickly find this kind of advice pedantic.

I won't even get into the whole certification fiasco, which to me is a complete joke.

So in conclusion, I think the owners of scrum need to take a look at this criticism and seriously consider how they can incorporate public feedback into improving scrum. This is critical if we are to continue to take scrum seriously.

Saturday, January 30, 2010

Tackling LEAN change week 1

Well, I have finished my first week of a two month initiative to help a very large, conservative organization start its journey to becoming a solution factory based on lean principles.
http://agileconsulting.blogspot.com/2010/01/tackling-agile-organizational-change.html

First of all I want to thank the folks on the agile lean yahoo group for some great advice and feedback.

http://tech.groups.yahoo.com/group/leanagile/message/4692

I have had a chance to talk to the CIO and get a sense of what he means by a solution factory.
- clear visibility around how things are being done and progress (i.e. the catwalk over the factory floor)
- flexible assembly lines where parts of the supply chain can be interconnected in different ways to provide value
- an environment where workers can be proud of the work they do, and actually want to be more productive

I also had an opportunity to discuss lean with the architecture group. This was a great conversation. The group had a keen awareness that command and control and throwing things over the wall were not working, and we had a great discussion on lean IT governance (ftp://ftp.software.ibm.com/software/rational/web/whitepapers/Lean_Development_Governance.pdf) a la Scott Ambler and Per Kroll. The architects really got it, and were genuinely enthusiastic about breaking down the wall between architecture and delivery.

I asked the architects for their opinions on how we should proceed in building a vision and roadmap, asking whether they leaned to the left or right on the following:

visionary <*----> pragmatic
educational <*-----> self learning
e2e value stream <---*-> IT perspective
planning <-*---> doing

I was quite surprised by the almost unanimous desire for creating an ideal state. I think the ideal state is important; it motivates and energizes people to push beyond the possible and truly excel.

I perceived the really strong desire for education over self teaching as a general apprehension about what lean would mean to the organization. It's clear to me that people here are looking for answers. I agree that education from the outside is crucial, but I hope I can hammer home the concept that people need to get into a self learning mode. (baby steps)

I'm fairly concerned about the IT focus versus end to end value. The rationale given is that the IT group already knows the problems of the business, and that the IT group would like to get its house in order before approaching the business with its desire to go in a lean direction. My major concern is that the idea that "IT already knows" is a root cause of IT-business misalignment. I'm also concerned that some leadership is emphasizing efficiency over effectiveness.

That being said, I think the evidence I've collected so far signifies a genuine desire to provide better value to their customers, a hunger for better collaboration, and a real understanding that it will be the people on the shop floor who will make this successful.

That's it until next week.



Location:Brock Ave,Toronto,Canada

Sunday, January 24, 2010

Tackling Agile Organizational Change

I'm just about to start a new project where I will have the chance to assist a client in setting up a "software solution factory" based on Lean principles.

The client has had some serious challenges with software delivery in the past, and is supporting systems with a significant backlog of defects that doesn't seem to be getting any smaller any time soon. The client is also in the initial stages of some very ambitious systems replacement initiatives, and would like to get things right this time.

Using a mixture of lean and agile principles can clearly offer a lot of value to the client. But historically the organization has many elements that could make lean change challenging; a tendency to rely on bureaucracy and collective bargaining are a few examples.

The client wants help with the typical things that are part of any transformation effort, i.e.: establishing a vision, communicating the urgency of the problem, listing challenges, developing a target, and developing a plan.

I see a number of potential ways to tackle this problem, and would really appreciate other people submitting thoughts and ideas. The more fresh eyes I can get on this problem the better.


It should be noted that these ideas are not mutually exclusive. But there is a limit to what can be accomplished during this engagement, so I've tried to bucket the options in a sensible way.

OPTION 1: Realist and Careful

-In this approach the initial effort would be focused on carefully cataloging the current state.

-Structure, work habits, technology, and HR would all be reviewed and assessed.

-These inputs would be used to identify the biggest issues, and a realistic roadmap would be created showing target states over time (1 year, 3 years, etc.)

This approach has the benefit of being easy to scope, and allows the client some time to prepare the message in a way that minimally disrupts the way things currently work.

However, this is exactly my issue with the approach: if things are broken, what exactly is wrong with disrupting them if a better outcome is the result? A conservative approach has the greatest chance of becoming shelfware. Disruption is a critical part of any change; we need to test the organization's resolve at some point, and in my mind the sooner the better.

OPTION 2: Idealist and Disruptive

In this approach I would

- spend much less time collecting empirical evidence, doing only the minimum necessary to give me context.

- create an idealized vision of the way the organization should work. Every article I read on organizational change and lean seems to indicate that an idealized vision motivates people to stretch themselves toward excellence. This intuitively makes sense to me.

- quickly identify groups that could benefit from agile and lean right now, and coach them to some degree of success.

- set up a "supply chain" of analysis, education, and adoption. The idea is to get a repeatable process instantiated that would allow my client to increase internal capability as quickly as possible.

- instrument adoption across as many parts of the organization as possible, learn from the experience, then optimize the education supply chain. Plan, Do, Check, Act.

I really like this approach as it has the opportunity to offer real value, and it is inherently lean (use lean to bootstrap lean). However, this client is conservative, and may not be able to move this quickly. This approach is also really hard to scope; so much depends on the client stepping up to the table.

OPTION 3: Educate and Participate
- get context, focusing on skills and ethics gaps
- create an online education forum where practices, principles, and other educational material can be posted and improved in a structured and collaborative fashion
- create a repeatable practice around self-serve education, slowly releasing the training reins to the client
- hold an organization-wide conference with the dual purpose of education and collaboratively developing a transition roadmap

This approach has the benefit of involving, or attempting to involve, a large portion of the organization. It can also help spread the message far and wide across the organization. My main issue with this approach is that education quickly becomes stale if it's not used. Also, education alone is not sufficient for adoption; hands-on mentoring is essential as well.

Again, insight from the community would be sincerely appreciated. I promise to post updates on my progress in the hope that this will help others who are on the same journey. I am also hopeful that this level of public discourse will help my client get the most they can out of going in a lean direction.

Monday, December 28, 2009

Agile Documentation

As a new theme for my blog, each week I will pull out one of the LEAN practice cards to discuss some of the concepts behind it, as well as add any colour commentary I may have based on my own project experiences.

The first card I thought would be interesting to look at is a Management Practice titled "Agile Documentation". Documentation is often overemphasized by some project teams and ignored by others, depending on whether they subscribe to a waterfall or iterative process. There is no clear cut answer on exactly "how much" documentation should be done, as it depends on a variety of scaling factors.

Regardless of which development process your team subscribes to, if we go back to the "why" behind documentation, I think applying Agile Documentation will help any team minimize documentation waste while still providing the benefits of documentation. (Yes, documentation is a good thing, and even agile teams need to document; going agile is not an excuse to avoid documentation.)

The LEAN Agile Documentation card states:



Document...
- When the business asks you to
- For clarity across teams working in different locations
- To establish appropriate business and technical context for the solution
- To define interfaces across systems
- To enable your next effort
- Informally
- To understand
- To communicate

Don't Document...
- To specify
- Because your process says so
- Unproven ideas (prototype it first)
- Without obtaining the structure from your intended audience
- Without considering maintenance and cost of ownership
- Implementation and design details that can be just as easily expressed in the solution itself

There are three main themes I want to pull out from this card that I think are worthwhile to discuss.

1) Document for an audience - Documentation requires time and effort, and the business is paying for it just like they are paying for each feature built into the solution. Every project should contain estimates for the time/effort (i.e. cost) of documentation. If you look at many organizations and their development processes, you will notice that there is often a large set of documentation that needs to be completed as part of the project. However, it's worth asking: does the business really need each one of these documents? Do the technology teams building and supporting the solution need these documents? If the team is documenting...

- Because your process says so
- Without obtaining the structure from your intended audience

Then it's likely the team is not documenting for the business. Generally, the types of documentation I find invaluable to the business and the technology teams supporting the solution are operations and support manuals (e.g. run books), developer setup and build manuals, end user manuals, and the delivery or release plan. This set may need to be expanded depending on the project.

2) Avoid documenting things that will change - Just like code and test cases, documentation is impacted by changes. To avoid paying the "rework" cost, hold off on documenting until a steady state is achieved (document at the last responsible moment), and stay away from implementation and detailed design specifics that would require frequent changes to the documentation.

3) Document to understand, communicate and establish contracts - It's important to recognize when scaling factors impact your project; documentation is a great way to mitigate the associated risks and challenges. When your team is:

- Geographically distributed
- Large and difficult to manage
- Working in a complex business domain

Then documentation helps everyone speak the same language, communicates decisions and changes, and establishes contracts that help each member understand how their work interacts with everyone else's. The types of documentation I find myself most often using to help a team navigate these challenges are the high-level requirements (e.g. use case hierarchy) produced from the initial requirements envisioning sessions, the ubiquitous language dictionary and high-level domain model, system context diagrams and bounded contexts, and system interfaces. An extremely effective practice that I often apply when documenting system interfaces or complex business domains is to use "executable requirements". Writing test cases as a form of documentation is a great way to precisely capture the details while also validating for correctness.
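As a sketch of what an "executable requirement" can look like, here is a hypothetical business rule captured as tests that both document and verify it. The domain, function name, and numbers are all illustrative, not from any real project:

```python
# Hypothetical executable requirement: the rule, the function, and the
# numbers are illustrative only.

def apply_volume_discount(subtotal: float, units: int) -> float:
    """Orders of 100 or more units receive a 10% discount."""
    return subtotal * 0.9 if units >= 100 else subtotal

# Each test documents one business rule precisely, and keeps the
# "documentation" honest: if the behaviour drifts, the document fails.
def test_small_orders_pay_full_price():
    assert apply_volume_discount(500.0, 99) == 500.0

def test_bulk_orders_get_ten_percent_off():
    assert apply_volume_discount(500.0, 100) == 450.0
```

Unlike a written spec, these either stay correct or fail loudly, which is exactly the contract-like quality the card is after.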

I have found these three themes combined with the bullet point checklists in the Agile Documentation practice card to be incredibly helpful in deciding what and when to document.

Wednesday, November 4, 2009

Agile isn't Just About Change

Over the last couple of months, several clients of the firm I work for have stated that agile software delivery was not suitable for a particular development project because the need to continually embrace change was not there.

I hear this so often that I feel I need to emphatically point out that agile development practices can provide a lot of value to "fixed" projects. I've listed a few such practices below.

Iterative development: Even if change is not the order of the day, breaking up work into small chunks will go a long way toward mitigating a whole slew of implementation risks.

Test Driven Development: Anyone who has spent serious time practicing TDD can attest that it leads to better design than traditional development. It might not be practical in every situation, but where it is, TDD lets you safely take a second and third pass at your design.
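A minimal illustration of the red-green-refactor loop behind this claim. The `slugify` example is mine, not from the post:

```python
# Step 1 (red): write the test first, stating the behaviour you want.
def test_slugify_lowercases_and_hyphenates():
    assert slugify("Agile Documentation") == "agile-documentation"

# Step 2 (green): write the simplest code that makes the test pass.
def slugify(title: str) -> str:
    return title.strip().lower().replace(" ", "-")

# Step 3 (refactor): with the test as a safety net, you can take a second
# or third pass at the design without fear of breaking behaviour.
test_slugify_lowercases_and_hyphenates()
```

The safety net is the point: the second and third design passes are cheap because the tests tell you immediately if you broke something.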

Behaviour Driven Development: Having a consistent, business-friendly format for describing requirements and test cases that also supports automation is just plain common sense, and a great way to specify contracts between teams on large scale projects.
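A sketch of the given/when/then shape as a plain Python test; real teams often use tools like Cucumber or behave, and the account scenario here is invented:

```python
def test_withdrawal_is_refused_when_it_exceeds_the_balance():
    # Given an account with a balance of 50
    balance = 50
    # When the customer attempts to withdraw 80
    requested = 80
    approved = requested <= balance
    # Then the withdrawal is refused and the balance is unchanged
    assert not approved
    assert balance == 50

test_withdrawal_is_refused_when_it_exceeds_the_balance()
```

The given/when/then comments map one-to-one onto lines a business stakeholder can read, which is what makes the format workable as a contract between teams.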

Continuous Integration: Code integration is never a fun job, and the longer you put it off the harder it gets, regardless of how much change is expected in a project.

Planning Poker, Collaborative Modeling, Daily Standups, Retrospectives and Agile Planning Boards: Because "static" projects also benefit from approaches that increase collaboration, penetrate organizational silos, and encourage the people actually doing the work to take part in the planning process.

Hopefully the point is made: software projects of almost any shape should strive to incorporate as many agile practices as possible.

Practicality and common sense should be used to determine which practices apply, but rate of change should not be the only value driver. Agile practices reduce project risk, increase quality, and reduce churn and rework.

Tuesday, November 3, 2009

Complex Up Front Estimation Tools Drive Me Crazy

Way too often I see both my colleagues and clients putting way too much effort and stock into building the uber estimation toolkit.

I'm sure many of you have seen this type of tool: the spreadsheet that uses function points, component complexity factors, non functional adjustments, and some really clever math to produce a really impressive, but largely fictional and arbitrary, guess at what the actual effort of a software implementation will be.

You know you are using one of these tools when:
-you feel like you are doing the requirements modeling with a spreadsheet
-you are doing architecture and design in your head so that you can estimate the exact detailed code components that you think you need to build
-you are estimating at the hourly level for tasks that won't be completed until more than 6 months from now.

I know that many of us have been the authors and users of some of these tools (I know I have built some wicked estimation spreadsheets). But the problem with these tools is that they hide the fact that software development is an exercise in discovery and adaptation. Planning is essential, but creating uber detailed estimates and uber detailed plans for software development is a lot like creating an uber detailed plan for a vacation. Setting in stone that you have to go to the beach on a particular afternoon can easily get waylaid if a storm comes in; planning to bring swimwear because you might go to the beach on some afternoon just makes sense.

The same type of reasoning goes into software planning and estimation. Deciding that business requirement A is 50% harder to build than requirement B is a whole lot more sensible than trying to calculate the work involved for both requirements to the exact hour. The high level estimate is likely to be just as accurate as the one produced by the complex toolkit, and way easier to change if it's wrong.
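The arithmetic behind this kind of relative estimation is deliberately simple; here is a sketch with made-up numbers:

```python
# Relative estimation sketch: all numbers are hypothetical.
requirement_b = 2.0                  # points for a story the team knows well
requirement_a = requirement_b * 1.5  # "A is 50% harder than B" -> 3 points

# Forecasting comes from observed velocity, not up-front task breakdowns.
backlog_points = 120.0               # total relative size of the backlog
velocity = 20.0                      # points actually completed per iteration
iterations_needed = backlog_points / velocity  # forecast in iterations
```

When the team later discovers that A was really twice as hard as B, you change one number, not a thousand spreadsheet cells.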

Here are some pitfalls I see with using complex estimation toolkits

1) Requirements, even when well written, always end up being notoriously inaccurate.

Blame it on users always changing their minds, subpar analysts, the immaturity of the field, etc. But what you based your estimates on will often end up being dramatically different from what you end up building.

2) Technology effort is based on platforms that are changing dramatically, and the diversity is overwhelming.

Don't think you can take your fancy web estimation toolkit based on J2EE version XXX, apply a few tweaks, and then use it for your upcoming Ruby on Rails project, especially if you don't have significant Rails experience. It is just as likely that your innovative portal/SOA estimator won't cut it when trying to figure out an RIA/REST solution, especially if you are trying to estimate at a very detailed coding component level.

3) Development is not even close to a linear exercise.

The first few times a team builds a component of type X, it will take a lot longer and be a lot riskier than after the tenth. Over time the team should be building the solution in such a way that the tenth component of type X is way easier to implement, especially if they are working in an intelligent fashion.

4) A heavy dose of guesstimation fudge is applied to even the most rigorous of models.

Every complex estimation toolkit has a way to parameterize the output with a heavy dose of fudge. Frequently the estimator will look at the estimate produced by the wonderful tool, sniff, and apply a 40%+ adjustment to get it to feel right.

5) These tools are incapable of incorporating the biggest inputs to software scheduling and effort variance, which are human factors.

Below is a tag cloud that shows some of the biggest factors to consider when estimating software delivery.

While I'm happy to debate the exact priorities of these factors, I'm pretty adamant on a couple of key points.

- Human factors trump development factors: Now, I'm not saying that requirements complexity and target platform are not important; they are. But you can take two identical requirements on an identical platform, give them to two different teams, and see an exponential difference in output.

The same goes for the amount of organizational churn you have to go through to get something delivered. I have observed (and been guilty of being part of) teams whose failure to appropriately account for bureaucracy led to huge variances from original estimates. Or in simple terms: you won't know how long it takes a particular organization to digest and deploy new software until you have hands-on experience with that organization.


Business ownership is another huge one. Recently, more and more developers are realizing that software delivery is primarily an exercise in communication. Not having accountable business owners properly at the table will have a massive impact on estimates.

Another way of interpreting the above cloud is as follows:

Consistent tools don't lead to consistent results; consistent experience does.

Good estimates come from experts who have done the work before, have used the technology in question, understand the business, understand the organizational context and know the capability of their team members. In my experience this is a tall order, and as a result I have not heard of very many good estimates.

Good (and more likely) estimates are done frequently, by a cross functional team, by the people who will actually be doing the work, and at a level of detail that matches how soon the work will be done. (But that is a topic for another post.)

Recently my group was one of 4 asked to estimate a web site. One approach used function points, one used component complexity, one used a page complexity tool, and the last used planning poker. The estimators were all experienced web developers.

They all came within 20% of each other. So thanks, but I'm sticking with planning poker.

Do estimation tools have any use?

Absolutely, yes.

They can provide structure to the thinking of an experienced craftsman.

They can help articulate the problem space and highlight any unknowns.

But a little goes a long way, and no estimate should be confused with fact, no matter how detailed it is.