How do you refine your estimation process? [closed] - estimation

Estimating how long any given task will take seems to be one of the hardest parts of software development. At my current shop we estimate tasks in hours at the start of an iteration, but once a task is complete we do not use it to aid us in future estimations.
How do you use the information you gather from past estimations to refine future ones?

By far one of the most interesting approaches I've ever seen for scheduling realistically is Evidence Based Scheduling, which is part of Fog Creek's FogBugz 6.0 release. See Joel's blog post on it for a synopsis and some examples.
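The core of the idea, roughly: keep a history of each developer's estimate-to-actual ratios ("velocities"), then simulate the remaining work by dividing each outstanding estimate by a randomly drawn historical velocity. Here is a minimal sketch in Python with invented numbers; it illustrates the technique, not FogBugz's actual implementation:

```python
import random

# Historical velocities: estimated hours / actual hours for past tasks.
# A velocity of 0.5 means a task took twice as long as estimated.
velocity_history = [1.0, 0.8, 0.5, 1.2, 0.6, 0.9, 0.7]

# Remaining estimates (hours) for the current iteration.
remaining_estimates = [4, 8, 16, 2, 6]

def simulate_total_hours(estimates, velocities, rounds=1000):
    """Monte Carlo: each round, divide every estimate by a randomly
    sampled historical velocity and sum, yielding one possible total."""
    totals = []
    for _ in range(rounds):
        total = sum(est / random.choice(velocities) for est in estimates)
        totals.append(total)
    return sorted(totals)

totals = simulate_total_hours(remaining_estimates, velocity_history)

# Report a few points on the resulting probability curve instead of one number.
for pct in (50, 75, 95):
    idx = int(len(totals) * pct / 100) - 1
    print(f"{pct}% chance of finishing within {totals[idx]:.0f} hours")
```

The useful output is a distribution of ship dates rather than a single date, so you can say "95% confident by X" instead of promising one number.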

If an estimate blew out, try to identify whether it was just random (the environment broke, a one-off tricky bug, etc.) or whether there was something you didn't identify.
If an estimate was way too large, identify what it was that you thought was going to take so long and work out why it didn't.
Doing that enough will hopefully help developers with their estimates.
For example, if a dev thinks that writing tests for a controller is going to take ages and it ends up taking less time than he imagined, then the next time you estimate a controller of similar scope you can keep that in mind.

I estimate with my teammates iteratively until we reach consensus. Sure, we make mistakes, but we don't calculate the "velocity" factor explicitly; rather, we use the experience we've gathered in our new estimation debates.

I've found that estimating time will only get you so far. Interruptions from other tasks, unforeseen circumstances or project influences will inevitably change your time frames, and if you were to constantly reassess you would waste much time managing when you could be developing.
So for us here, we give an initial time estimate based on experience (we do not use a model; I've not found one that works well enough in our environment), but we do not judge our KPIs against it, nor do we assure the business that this deadline WILL be hit. Our development approach here is largely reactive, and it seems to fill the business's requirements of us very well.

When estimates are off, there is almost always a blatant cause, which leads to a lesson learned. Recent ones from memory:
The user interface assumed .NET functionality that did not exist (the ability to insert a new row and edit it inline in a GridView); lesson learned is to verify the functionality of chosen classes before committing to an estimate. This mistake cost a week.
The FTP process assumed that FtpWebRequest could talk to a bank's secure FTP server; it turned out that there's a known bug with this class if the FTP server returns anything other than a backslash for the current directory; lesson learned is to google for 'bug' and 'problem' along with the class name, not just 'tutorial' and 'example', to make sure there are no 'gotchas' lurking. This mistake cost three days.
These lessons go into a Project Estimation and Development "checklist" document, so they won't be forgotten on the next project.

Related

organizing code and how to hit deadlines in a programming deadline [closed]

I know this may not be exactly a coder question, but I feel it is still related to programming because I'm sure many developers have come across this before and might have some insight on how to resolve it or have advice. There is an actual programming question in here, though.
My issue as a developer.
I work in a small company, roughly 15 people, 5 of whom are developers, including myself; the rest are tech support and management. The problem I'm having is that when we get an SOW (Statement of Work), our clients give us a rough description of the project they are requesting, usually a 1-3 page brief, often including a Visio document. As a programmer, I'm responsible for going over the document and relaying a timeline for how long it should take me to complete the project.
Unfortunately, there have been times, and not only for me, where we under-estimated the project because we didn't fully get into it until we actually developed it, which ends up slapping us in the face: my boss is upset because he is being hounded by the client, who is now upset because we missed our promised deadline.
My question is, how do you handle organizing a basic project description when you need to give deadlines based on little more than a concept, and do you have any ideas on how to organize it?
I'm thinking of going to my boss and suggesting that, instead of always pushing an estimated deadline to our clients, who then expect us to hit it, we should write up a detailed document that is more step-by-step (more like a what-to-do list) on how to develop the application they want. It may take a lot more time, but at least if the project is moved to someone else it is laid out for them, and when I get back to it 4 months later I don't have to refresh my memory again; I can just follow the steps I wrote.
What do you guys think? Ideas? Or better ways to handle this?
If you switch your development to using an iterative methodology (Agile, XP, Scrum, etc), then the customer will see results much earlier than any deadline you feel you have to promise - usually every 1 or 2 weeks.
The moment they see what you've developed, I can pretty much guarantee that they'll make changes to their initial requirements as they now have a visual representation of the product and it may not be quite what they were thinking of. Some of their changes might be quite radical, so best to get the feedback as early as possible.
In all the projects where I've insisted we do this, the customer was delighted - they saw the results early, could influence the project outcome, and we hit their end deadline. Unexpectedly, a whole load of features got left behind and - guess what - the customer did not mind at all, as they got the top features they wanted and put the project/product straight into production; they'd had lots of time to refine it to suit their business, so they were already familiar with it.
It takes a lot of effort to get management, sales, creative, etc., to all buy in to an iterative style, so you may need to implement a hybrid solution in the meantime, but in my experience it is well worth it.
If a complete shift to iterative is not possible, split your project into tangible milestones and deliver on those milestones. As others have said, inflate your estimates. My previous manager doubled my estimates and the sales team doubled his too.
Inflate your project deadlines. It's something that most programmers should do (and I quote the VP of Freeverse, the company that I work at):
It is a well-known fact among people who work in the software industry that the last 5% of development always takes the longest.
If possible, divide the higher-level tasks as finely as you can so that you get a better approximation of how many man-hours each sub-task will take.
Also, adding hidden buffers to your task execution helps in covering some of the unseen contingencies.
cheers
If you mock up (Balsamiq or whatever) with your customer, you will get more details. Armed with those details and some experience, your estimates will be more accurate. And then double it and add 4 (hours, days, weeks, months).
First, unless you systematically under-estimate, your boss should not get upset. It's his job to answer to the client, and he should know that by definition, an estimate is NOT the future. Statistically, sometimes you should deliver earlier, sometimes later.
Personally, I think that the frame of "how long will it take" is not exactly the right discussion to have. Software development is a risky business, and change/surprises happen all the time. One approach which helps is to focus less on the "right" number, and more on the volatility. Look at the project, and consider the places where you are pretty clear on how long it will take (you have done it before and understand it well), and look at the places where you have uncertainty (unclear requirements, new technology), and for these, think about how bad it could go, and why. That will help you get not one number, but rather boundaries: what you think is reasonable, a worst-case scenario, maybe a best case scenario (which the client should never see :) ) - and convey that information to your boss, so that he can manage accordingly.
Additionally, this will allow you to identify the danger points of the project, and you can then prototype accordingly - look into the uncertainty points as early as possible, so that you can tighten up the timeline fast, and have early warnings for your boss and the client.
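One common way to turn that "reasonable / worst case / best case" thinking into boundaries you can hand upward is a three-point (PERT-style) estimate. A minimal sketch with invented figures, assuming the standard PERT weighting rather than anything from the answer above:

```python
def three_point(best, likely, worst):
    """Classic PERT weighting: the mean leans toward the 'likely' value,
    the spread comes from the distance between best and worst case."""
    mean = (best + 4 * likely + worst) / 6
    std_dev = (worst - best) / 6
    return mean, std_dev

# Invented numbers for one uncertain piece of work (days).
mean, std_dev = three_point(best=3, likely=5, worst=12)
print(f"expected: {mean:.1f} days, give or take {std_dev:.1f}")
print(f"roughly 95% range: {mean - 2 * std_dev:.1f} to {mean + 2 * std_dev:.1f} days")
```

The point is the range, not the midpoint: the wider the spread between best and worst case, the more that task deserves early prototyping.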

How small is too small for a project plan? [closed]

I have friends who have asked me to make websites, and most are very small, so usually I don't bother with a technical plan. But one friend in particular clearly had goals larger than my own, and the project is dragging on forever. If I had made a spec before the project, I feel this wouldn't have happened and our relationship would be as solid as before.
So my question is, how can you tell how small is too small? How do you tell when the project you're embarking on is going to end up in a guilt-ridden scope creep nightmare?
If you are going to be charging money (or don't want to be stuck doing the project forever), a project plan is always a good idea, even if it's just a one-pager outlining what the web site will have (how many pages, any special features) and who is responsible for what. If you also factor in that you'll spend 20% of your time (or whatever percentage past experience has taught you) on documentation and other non-coding work, you can give a better estimate of the effort needed. If it's a friend, you might want to tell them that you'll do the first X hours for free, but after that your rate is $Y per hour. Also, keep an accurate log of the time you've spent so that you can show them the amount of effort involved; an accurate log also helps you estimate future projects.
As you may have already figured out, no project is too small to have at least an informal, written plan. Even if it's just a features list.
A project that does not need a plan is a project that does not even need to be started. In my opinion everything needs a plan; what changes is the extent of that plan. A plan could be just a list of deliverables with a deadline attached to each one. A more robust plan would include time charts, cost, phases, communication plans, dependencies, etc. So I think everything needs a plan; the contents of the plan are what change depending on the project's complexity.
Dwight Eisenhower on planning:
In preparing for battle I have always found that plans are useless, but planning is indispensable.
It seems the same in many software projects: you'll find that your plans need to be continually updated and that your first plan was quite different from what you finally completed. But that's okay, it's much better to put some planning in up front than to try something by the seat of your pants.
Agilists try to accommodate such changes in plans by breaking longer term plans into small "sprints" of 2-4 weeks. They'll have more details on the near term sprints, and fewer details on the longer term goals.
You'll especially want to be more detailed and precise if the project is bigger, if you are doing this for an external customer, or if you're attempting something new for you. It's less important (though not unimportant) for smaller projects and types of work you've done before and are very familiar with.

Has Crashing or Fast-Tracking a project schedule ever worked? [closed]

I posted this question on Reddit Programming and did not get a single response. So I am hoping that Stack Overflow community will have an opinion.
Have any of you ever been on a software project that had fallen behind, where 'Crashing' or 'Fast-Tracking' the project schedule actually brought the schedule back on track? I have never seen either of these project management techniques actually work. All the articles on software development that I have read state that these two techniques do not work and actually push the project further behind (for example, the literature around The Mythical Man-Month). So who has seen it work?
Thanks Bill.
I have only ever seen it work once. It was a three or four month long project that was projected to run an extra two months over the original delivery date. The project got fast-tracked and things ended up getting back on track for the release.
...keep in mind though, that was only once. I've been on many more projects where the PM tried to use one of those two methods and they failed miserably, dragging the project out for months beyond the already extended date.
It can work. But there's a price to be paid: lower quality (more bugs, less testing) and turnover of burned-out programmers.
And in many cases, a fast-tracked project will both fail to deliver on time and will still pay the full negative price, for the reasons stated in Mythical man-month.
I've seen it work but it's not the norm.
Things I'd want to see before I thought it might be feasible:
1) Staff available with suitable skills and approach. By that I don't mean ".NET programmer", I mean detailed technical skills, business domain skills (so they understand the problem), personality fit and understand the tools and the approach (source control, methodology and so on). This can happen in large companies where there are common tools, standards and knowledge but you need to be sure that they're ticking pretty much all the boxes.
2) Tasks must be nicely divisible. The best situation is where there are whole modules, applications or tasks unstarted and you can put new people on that. It minimises upskilling, additional communication and so on. If you can't separate out what the new people will do you're likely to majorly disrupt the existing team.
3) The whole team must have bought into the approach. If the existing team don't agree that bringing people on board will be right they'll likely fight it and you're doomed.
4) You need to be sure you've addressed why it was running late in the first place. If it was just bad estimates then are you confident the new estimates are good? If it was scope creep have you got the scope and change control in hand now? If it was because the deadline moved, are you sure it won't move again?
If you can't tick all four of those off, it isn't going to work.
Crashing and Fast-Tracking are two very different things...
Fast Tracking is where you take something (tasks or work packages) out of sequence and do it early. This may be because of hardware delivery lead times, availability of resources, risk or whatever. So you might do things in parallel where originally you had planned to do them sequentially. I've fast tracked a lot of projects... and yes, it works.
Crashing a project is different in that you typically throw more resources at a problem to get it done quicker... this can be tricky. If it's done as a crisis response it can be painful adding extra people as you are already under the pump. In some situations you just add more problems.
Another alternative to crashing is to reduce scope. This is not always possible, but it should be considered.
With fast tracking or crashing, the sooner you know you need to make a schedule change, the easier it is to manage. This is why early deadlines are so important: they indicate how the rest of the project will go.
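To make the distinction concrete, here is a toy sketch with invented tasks and durations (not a real planning tool): fast-tracking removes a sequencing dependency so two tasks overlap, which shortens the longest dependency chain, whereas crashing would instead shorten an individual task by throwing resources at it.

```python
# Toy schedule: task -> (duration in days, list of prerequisite tasks).
baseline = {
    "design":   (5, []),
    "backend":  (10, ["design"]),
    "frontend": (8, ["backend"]),   # planned to start only after the backend
    "testing":  (4, ["backend", "frontend"]),
}

# Fast-tracked: start the frontend once the design is done, overlapping the backend.
fast_tracked = dict(baseline)
fast_tracked["frontend"] = (8, ["design"])

def project_length(tasks):
    """Earliest finish of the whole project = longest dependency chain."""
    finish = {}
    def finish_time(name):
        if name not in finish:
            duration, deps = tasks[name]
            finish[name] = duration + max((finish_time(d) for d in deps), default=0)
        return finish[name]
    return max(finish_time(name) for name in tasks)

print("baseline:    ", project_length(baseline), "days")      # 27
print("fast-tracked:", project_length(fast_tracked), "days")  # 19
```

The saving is real only if the overlapped tasks genuinely don't depend on each other; if they do, you pay it back in rework.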
Both of these project management techniques work well to maintain a schedule, but they should be used intelligently by judiciously analyzing the network diagram:
study the variance,
study leads and lags;
decide which suits your project: 'Crashing' or 'Fast-Tracking'.
There is a software management principle that says adding manpower to a late project makes it later.
That said, as long as the measures taken are sensible it should be ok. Don't expect too much of your staff and provide reasonable incentives and don't take short cuts. It won't make miracles happen but if you're practical and want to push things just that little bit faster it can definitely be done.
When people have a stake in the potential success of something it's amazing how much more effort they're willing to put in.
It depends on what you mean by "work". I don't think I've ever seen it make a way late project deliver on time, if that's what you are asking.
However, I have seen it make way late projects deliver only a bit late. From the fuzzy perspective of management, that might be called "working". I've also seen it significantly lower the customer-based pressure on the company. Some might also call that "working".
Of course the price is rather high. Employees burn out, develop health problems or big problems in their neglected personal lives, etc. All of that has large financial repercussions for the company. So I doubt the company comes out ahead in the long run. Is that "working"?

When do you blow the scope creep whistle? [closed]

Most people have been here at some point or another - in your project, you get really small requests along the way that you're happy to take care of, but at some point the little things add up. Sometimes it takes less time to implement something than it does to re-negotiate the project plan.
Provided the spec/requirements plan is decent and it isn't a doomed project to start with, at what point do you actually blow the whistle and start re-negotiating? At any request? When a request requires additional pages/forms? Or do you just feel it out? I would love to hear how you make the call.
Budget N hours of ad-hoc requests in your project plan. (You know it's going to happen, so why isn't it in there?) Then track your ad-hoc requests and renegotiate when the budget's blown.
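A minimal sketch of what that tracking could look like, with a made-up budget and request log; the point is that the renegotiation trigger becomes mechanical rather than a judgment call made in the heat of the moment:

```python
ADHOC_BUDGET_HOURS = 40  # the N hours set aside in the project plan (invented)

# Running log of small requests: (description, hours spent).
adhoc_log = [
    ("extra export button on report", 3),
    ("tweak validation message wording", 1),
    ("new admin-only search filter", 6),
]

spent = sum(hours for _, hours in adhoc_log)
remaining = ADHOC_BUDGET_HOURS - spent

if remaining <= 0:
    print("Ad-hoc budget blown - time to renegotiate the plan.")
else:
    print(f"{spent}h of ad-hoc work so far, {remaining}h of budget left.")
```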
At any request?
The real goal is to make the customer happy while not getting ripped off, right? Agile methods address these issues to a large extent. New requirements always come up, and if you don't address them as they come you end up building things that are obsolete or dysfunctional out of the box. So what you need is customer buy-in to the process, a working prototype as soon as possible, and lots of iterating. There's a ton more, of course, but that should be enough to be getting on with.
Edited to add: Customer buy-in means they are aware of you working on a new feature instead of whatever else you would be doing, and that they are in agreement. When you've gone through your schedule and budget and still aren't done, they've been there with you the whole way and know why. No big surprise: "What? You're not DONE?!"
I'd say when it's going to impact the schedule/release date. If that happens, it's definitely time to blow the whistle. If either the scope creep is of sufficient magnitude, or if there are enough cumulative changes that it's impacting your ability to ship on time, then you should push back.
The moment your budget gets blown. You can't keep doing all these "freebie" add-ons - unless you are doing it for charity.
Once you've put your foot down once, you'll find the requests drying up!
I have only been in this situation with internal tools where our stated goal was to best serve any whim of our "customers" in a situation where there was no way to predict needs in advance. So take my answer with a grain of salt.
My view is that the decision is often political, and unless you're the head of the company it might not even be up to you. The cost of unsatisfied customers going over your head to your boss can be more damaging.
I'm a big believer in agile and continuous requirement gathering that does involve seeing how users work with the product, and trying to match their needs. However, every user has his individual "nice to haves" and there's no way to satisfy everyone. If you have multiple target users, democracy is a good system - only implement things that the majority of the users can benefit from.
If your clients are a cohesive group (e.g., you're making it for users in a specific department in a specific organization), run a Wiki site or something like SO or other engines where they can list and then collaboratively vote on possible features. Make it clear that you will give priority (but no guarantees) about higher rated features, and that you're probably not going to give priority to things that don't get votes from others.
In doing so, you may be able to get the clients to apply some collaborative filtering (or peer pressure) on ideas. You will also get some visibility, so people can see why their wishes were not respected. An important side benefit is that whoever requested a feature now has an interest in formulating the request and its rationale well, so that they can get others to vote for them. This will eliminate some asinine half-baked ideas.
Of course, an underlying assumption of all this is that you budgeted some time for "misc features" with whoever is paying for the project.
The estimated completion date is more of a probability curve than a single date.
Any extra feature reduces the likelihood of meeting some particular date.
You should 'blow the whistle' if and when the decrease in likelihood becomes 'significant' or worth mentioning.
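One way to picture that curve: treat each task's duration as uncertain (a mean plus a spread), approximate the total as roughly normal, and watch the probability of hitting a fixed date drop as features are added. A sketch with invented numbers and a deliberately crude normal approximation:

```python
import math

def prob_done_by(deadline_days, tasks):
    """Tasks are (mean_days, std_dev_days) pairs. Approximate the total
    duration as normal and return the probability of finishing in time."""
    mean = sum(m for m, _ in tasks)
    std = math.sqrt(sum(s * s for _, s in tasks))
    z = (deadline_days - mean) / std
    return 0.5 * (1 + math.erf(z / math.sqrt(2)))  # normal CDF

planned = [(5, 1), (8, 2), (10, 3), (4, 1)]   # invented estimates (days)
extra_feature = (6, 2)

deadline = 32
print(f"without the extra feature: {prob_done_by(deadline, planned):.0%}")
print(f"with it: {prob_done_by(deadline, planned + [extra_feature]):.0%}")
```

When the drop in that probability becomes significant, that is the moment to raise it with whoever owns the date.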

How long does it really take to do something? [closed]

I mean name off a programming project you did and how long it took, please. The boss has never complained but I sometimes feel like things take too long. But this could be because I am impatient as well. Let me know your experiences for comparison.
I've also noticed that things always seem to take longer, sometimes much longer, than originally planned. I don't know why we don't start planning for it but then I think that maybe it's for motivational purposes.
Ryan
It is best to simply time yourself, record your estimates and determine the average percent you're off. Given that, as long as you are consistent, you can appropriately estimate actual times based on when you believed you'd get it done. It's not simply to determine how bad you are at estimating, but rather to take into account the regularity of inevitable distractions (both personal and boss/client-based).
This is based on Joel Spolsky's Evidence Based Scheduling, which is essential reading; he explains that the other important aspect is breaking your work down into bite-sized (16-hour max) tasks, estimating those, and adding them together to arrive at your final project total.
Gut-based estimates come with experience but you really need to detail out the tasks involved to get something reasonable.
If you have a spec or at least some constraints, you can start creating tasks (design users page, design tags page, implement users page, implement tags page, write tags query, ...).
Once you do this, add it up and double it. If you are going to have to coordinate with others, triple it.
Record your actual time in detail as you go so you can evaluate how accurate you were when the project is complete and hone your estimating skills.
I completely agree with the previous posters... don't forget your team's workload also. Just because you estimated a project would take 3 months, it doesn't mean it'll be done anywhere near that.
I work on a smaller team (5 devs, 1 lead), many of us work on several projects at a time - some big, some small. Depending on the priority of the project, the whims of management and the availability of other teams (if needed), work on a project gets interspersed amongst the others.
So, yes, 3 months worth of work may be dead on, but it might be 3 months worth of work over a 6 month period.
I've done projects between 1 and 6 months on my own, and I always tend to double or quadruple my original estimates.
It's effectively impossible to compare two programming projects, as there are too many factors that mean the metrics from one aren't applicable to another (e.g., specific technologies used, prior experience of the developers, shifting requirements). Unless you are stamping out another system that is almost identical to one you've built previously, your estimates are going to have a low probability of being accurate.
A caveat is when you're building the next revision of an existing system with the same team; the specific experience gained does improve the ability to estimate the next batch of work.
I've seen too many attempts at estimation methodology, and none have worked. They may have a pseudo-scientific allure, but they just don't work in practice.
The only meaningful answer is the relatively short iteration, as advocated by agile practitioners: choose a scope of work that can be executed within a short timeframe, deliver it, and then go for the next round. Budgets are then allocated on a short-term basis, with the stakeholders able to evaluate whether their money is being effectively spent. If it's taking too long to get anywhere, they can ditch the project.
Hofstadter's Law:
'It always takes longer than you expect, even when you take Hofstadter's Law into account.'
I believe this is because:
Work expands to fill the time available to do it. No matter how ruthless you are cutting unnecessary features, you would have been more brutal if the deadlines were even tighter.
Unexpected problems occur during the project.
In any case, it's really misleading to compare anecdotes, partly because people have selective memories. If I tell you it once took me two hours to write a fully-optimised quicksort, then maybe I'm forgetting the fact that I knew I'd have that task a week in advance, and had been thinking over ideas. Maybe I'm forgetting that there was a bug in it that I spent another two hours fixing a week later.
I'm almost certainly leaving out all the non-programming work that goes on: meetings, architecture design, consulting others who are stuck on something I happen to know about, admin. So it's unfair on yourself to think of a rate of work that seems plausible in terms of "sitting there coding", and expect that to be sustained all the time. This is the source of a lot of feelings after the fact that you "should have been quicker".
I do projects from 2 weeks to 1 year. Generally my estimates are quite good, a posteriori. At the beginning of the project, though, I generally get bashed because my estimates are considered too large.
This is because I consider a lot of things that people forget:
Time for bug fixing
Time for deployments
Time for management/meetings/interaction
Time to allow requirement owners to change their mind
etc
The trick is to use evidence based scheduling (see Joel on Software).
Thing is, if you plan for a little extra time, you will use it to improve the code base if no problems arise. If problems arise, you are still within the estimates.
I believe Joel has written an article on this: what you can do is ask each developer on the team to lay out their tasks in detail (all the steps that need to be done) and ask them to estimate the time needed for each step. Later, when the project is done, compare the real time to the estimated time, and you'll get a bias for each developer. When a new project is started, ask them to estimate the time again, and multiply that by each developer's bias to get values close to what can really be expected.
After a few projects, you should have very good estimates.
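A tiny sketch of that bookkeeping, with an invented history; here the "bias" is simply each developer's average actual-to-estimated ratio, applied to their next round of estimates:

```python
# Past tasks per developer: (estimated hours, actual hours). Invented data.
history = {
    "alice": [(4, 6), (10, 12), (3, 5)],
    "bob":   [(8, 8), (5, 4), (12, 15)],
}

def bias(records):
    """Average actual/estimate ratio; > 1 means the developer underestimates."""
    return sum(actual / est for est, actual in records) / len(records)

# New project: raw estimates per developer, corrected by their own bias.
new_estimates = {"alice": 20, "bob": 35}
for dev, raw in new_estimates.items():
    corrected = raw * bias(history[dev])
    print(f"{dev}: estimated {raw}h, expect about {corrected:.0f}h")
```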
