Ok, I admit that I find buzzwords to be annoying. Not just because that’s the conventional wisdom, but more because I think I’m a bit of a cynic at heart. When somebody tells me they have found something new and great, I often assume they’re just being emotional and there isn’t much truth to it. Enter into the picture: agile methodology.
Since this is 2013, it’s probably a bit hard to call agile development a buzzword. After all, people have been successfully delivering projects with it for a good number of years now. But what I really find challenging is when people openly admit that agile isn’t a silver bullet for managing all projects, yet at the same time believe that waterfall methodology is an archaic relic of the past. I don’t think that’s true; waterfall still works well for certain types of projects. But more interestingly, I love to ask the question: “If agile is really so great, why do you think we used waterfall for over 50 years before agile became a mainstream approach?” I get lots of answers (none of which I believe are correct):
- Requirements change too rapidly now – they didn’t back in the 1960s – or at least non-IT expectations move more rapidly today than they did then
- People didn’t really know any better; there were no alternatives to waterfall
- Software was much simpler back then, so it was easier to get it right
- Software is much simpler today because we use proven approaches – so we don’t need to spend as much time on design or architecture as a waterfall approach would demand
- Providing paper-based deliverables keeps people’s careers alive
I’m sure you could add your own to the list: some reasonable, some absurd, and some downright cynical. But here is my answer:
The underlying assumptions have changed between 1960 and 2013. First, computers are cheap in 2013. You can buy a computer for under $100 today that can do a tremendous amount. Heck, your iPhone does way more than a 1960s room full of computers. Back in the 1960s, it cost huge dollars just to keep the machine running. Second, computers are abundant today. In 1960, a company was likely to have only a handful of computers. Third, computer languages and operating systems are very different. A 1960s mainframe would typically run instructions sequentially until a job completed or errored out. There are also big differences in bandwidth, graphical user interfaces, and so on.
Imagine running an agile “sprint” where you could only run the code at night, the code would only run until it hit the first bug, the CEO was yelling at you that every bug cost $100,000, and you couldn’t just change things willy-nilly without incurring big costs. Wouldn’t you be more likely to make sure you got all the requirements completed first? Then make sure you have a sound design? Then make sure you code it right the first time as much as possible? And then make sure you test it in detail, because you can’t just look at the screen to confirm that it’s working correctly?
In other words, you’d run a waterfall.
But why waterfall has lasted so long, and why it still has a place in today’s IT environment, is a fair question. I’ll leave my answer for another day and another blog…