- "What do you prefer, waterfall or agile development? Which is best?"
Nonsensical and out of context for the most part.
Firstly, a development method is just that: a method, a process, an organisational tool. And just like any tool, there are suitable and unsuitable places, and ways, of using it.
Using chisels as screwdrivers springs to mind.
Agile (or any other) method isn't a silver bullet to cure all software ills, nor by any means a "radical" new way of doing things that will cure all bugs.
There are lots of different aspects to think about before answering the original question; I'll go over the following ones.
Company culture & history
How does the organisation currently produce software? Does it even do that in-house? Assuming there is a development team in-house, what are the current methods being used?
What are the discipline levels within the current development? How are version control, build management, documentation, target dates, etc. all handled?
Altering the way in which software is created is itself going to require being very aware of what is going on, what you are changing, and why.
I was recently asked in an interview about my thoughts and experience with CMM (along with ITIL, which I hadn't heard of until then).
I answered along the lines of: CMM is typically seen as a heavyweight, out-of-date system from the world of old-school development. It does, though, have a very valid premise which I like a lot:
- You have to know where you are to know how to get to where you want to go.
So before changing or "improving" the current way of doing things, you must first know where you are. Even if that is CMM level -2!
Given that a development method is a process, it's a guide and limit to human behaviour. How is the team going to react to that ?
Sure you can take the stance of "You're paid to do as you're told, get on with it", though this isn't the military, and I tend to have more respect for people than that.
If you are attempting to improve something, in this case the outcome of development, you should know what you're after. Be that:
- Speed of development
- Uniformity of work
- Fewer bugs
- Faster release cycle
For a method to be successful you have to state what you think it should achieve so you know if you're getting what you're after.
I've seen several teams where a certain method was introduced and things (i.e. quality) got very obviously worse.
So if you can identify what is trying to be achieved, and then share that with the team, it will be much easier.
Identifying issues with the current process and not the people, while focusing everyone on a different outcome via a "better way", is an approach I've found gets much more support.
A shared vision and understanding of why there is change is far more likely to be adopted and succeed, than a dictatorial approach to "fixing people's mistakes".
Skills and experience of facilitators
Managing change is in itself a massive topic; just google "managing change".
Attempting to influence a group of people to change their behaviour is no easy task. The all-too-familiar style of management will just "lay down the law", though this has a few main side effects:
- Short term wins over long term gains
- No stickiness of change (once the influence is gone, the system reverts)
What to do with conflict, resistance, apathy? That's far more important than getting a time estimate on a chunk of work, or drawing a shiny new graph for "senior management".
Also, talking of senior management: do they assume that agile means they get to change their mind completely every sprint without consequences?
Architecture & Design
Basically, does the method take them into account? My main criticism of agile (used in ignorance) is that upfront design and defining a starting point are avoided, as they don't seem "agile".
Then, with a constant focus on the minutiae of the code (i.e. what's happening in the next X weeks/days), architecture goes out the window, typically followed by technical strategic planning. It's as if there is a rush from one end of a polarised spectrum to the other.
No method I've used or researched actually advocates that, though I've seen it assumed in practice and the rush to skip it very evident.
There is a caveat here though: some accomplished and experienced developers can go straight from mental concept to code within an architecture naturally. But they are typically the top few percent at what they do, and they are working on their own or with one other person (XP style).
The context I'm talking about is development groups within a business.
I've been in skunkworks environments where you are allowed/expected to do this at work, though it's sinfully rare.
As it's just a tool, it all depends on the job at hand.
Though typically I find the best outcomes come from a blend of both. Take some of the initial aspects of waterfall to lay the foundations and vision of the project, identify the stakeholders, and gather initial requirements.
Then move into wire-framing and investigating different technologies, ensuring the whole team has involvement and input from day one.
While all the pieces are coming together, the call has to be made that "this is good enough", meaning the picture is clear enough to begin the build. This is where the process typically turns into what most people think agile is.
I believe being wholly evangelical about a tool, be that an agile method, Ruby on Rails, Drupal, etc., or anything else, compromises a project. You can't be objective if you believe there is one (and the same) answer for everything, all the time.
The most adaptable and successful systems I've observed are typically blends, show flexibility, and are focused on getting the job done, not being right all the time!