For us, an accurate delivery date on a 6-month project was mandatory. CX needed it so they could start onboarding high-priority customers. Marketing needed it so they could plan advertising collateral and make promises at conventions. Product needed it to understand what the Q3 roadmap should contain. Sales needed it to close deals. I was fortunate to work in a business where I respected the heads of these departments, which, believe it or not, should be the norm.
The challenge wasn't estimation - it's quite doable to break a large project down into a series of sprints (basically a sprint/waterfall hybrid). Delays usually came from unexpected sources, like reacting to a must-have interruption or critical bugs. Those you cannot estimate for, but you can collaborate on a solution: trim features, push the date, bring in extra help, or crunch. Whatever the decision, making sure to work with the other departments as collaborators was always beneficial.
In practice, developers have to "handle" the people requesting hard deadlines. Introduce padding into the estimate to account for the unexpected. Be very specific about milestones to avoid expectations of the impossible. Communicate missed milestones proactively, because there will be missed milestones. A date is given so people feel safe. And sometimes you'll cause unnecessary crunch so that a deadline you fought for gets met. Other times, you'll need to negotiate what to drop.
But an accurate breakdown of a project amounts to executing that project. Everything else is approximation and prone to error.
Things rarely went to plan, but as soon as any blip occurred, there'd be plans to trim scope, crunch more, or push the date with many months of notice.
Then I joined my first web SaaS startup and I think we didn't hit a single deadline in the entire time I worked there. Everyone thought that was fine and normal. Interestingly enough, I'm not convinced that's why we failed, but it was a huge culture shock.
Estimating in software is very hard, but that's not a good reason to give up on getting better at it.
There are problems with all of these. The company knows they can sell X of the product for $Y (often X is a bad guess, but sometimes it has a statistical range - I'll ignore this for space reasons, but it is important!). X times Y is your gross revenue. If the total cost to build the feature is too high relative to that, the whole thing shouldn't be done.
If you trim features, that affects either the number you can sell or the price you can sell at (sometimes both).
If you push the date, that also affects things - some customers will buy from a competitor (if possible - and the later date makes it more likely a competitor releases with that feature first).
Bringing in extra help means the total cost goes up. And worse, if you bring them in too late, that will slow down delivery.
Crunch is the easiest option - but it burns out your people and so is often a bad answer long term.
This is why COMPANIES NEED ACCURATE ESTIMATES. They are not optional to running a company. That they are impossible does not change the need. We pretend they are possible because you cannot run a company without them - and mostly we get by. However, they are a fundamental requirement.
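To make the arithmetic above concrete, here's a minimal back-of-the-envelope sketch. Every number, and the linear "delay loses sales" assumption, is purely illustrative and not part of the parent comment:

    # Hypothetical go/no-go check; every number is made up.
    units = 500                 # X: units we expect to sell
    price = 2_000               # Y: price per unit, in dollars
    revenue = units * price     # X times Y: gross revenue

    dev_cost = 600_000          # estimated cost to build the feature

    # Pushing the date loses some sales (assumed linear, for illustration only).
    delay_months = 2
    lost_units_per_month = 20
    delayed_revenue = (units - delay_months * lost_units_per_month) * price

    if dev_cost >= revenue:
        print("Don't build it: cost exceeds expected revenue")
    else:
        print(f"Margin if on time: ${revenue - dev_cost:,}")
        print(f"Margin if late:    ${delayed_revenue - dev_cost:,}")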
Saying "I don't know" is arguably more honest, even if it's not useful for budgets or planning.
But you're pretty spot on, as 'professionally acceptable' indeed means politically acceptable most of the time. Being honest and admitting one's limit is often unacceptable.
I *HATE* estimating roadmaps, because it feels unfair. I'm happy to estimate a sprint.
Former Test Engineer here. It was always fun when everyone else’s deadline slipped but ours stayed the same. Had to still ship on the same date even if I didn’t have silicon until much later than originally planned.
I completely agree. That's why I chose that example: they're also awful at it, especially these days in North America. But any contractor that tried to put in a bid claiming "it'll be done when it's done and cost what it costs" would not be considered professionally competent enough to be awarded a multi-million-dollar budget.
Usually management backs off if they have a good understanding of the impact a change will make. I can only give a good estimate of impact if I have a solid grip on the current scope of work and deadlines. I've found management to be super reasonable when they actually understand the cost of a feature change.
When there's clear communication and management decides a change is important to the product then great, we have a clear timeline of scope drift and we can review if our team's ever pulled up on delays.
A side effect is: no, there aren't. Allow me to explain that catty remark.
The experienced pros have figured out how to arrange their affairs so that delivery of software doesn't matter, i.e., is someone else's problem. The software either arrives or it doesn't.
For instance, my job is in technology development for "hardware" that depends on elaborate support software. I make sure that the hardware I'm working on has an API that I can code against to run the tests that I need. My department has gone all-in on vibe coding.
Customers aren't waiting because the mantra of all users is: "Never change anything," and they can demand continued support of the old software. New hardware with old software counts as "revenue" so the managers are happy.
Estimation is a real problem in a lot of industries, including ours, and I think that's probably common ground here -- I suppose my differing position is that I think the solution is to get better at it, not to refuse to do it.
I've been on projects where I've seen the budget explode and projects where I've seen the budget kept tight and on track. The latter is very hard and requires effort from ALL sides to work, but it's almost always achievable.
I actually empathize a little bit more with megaprojects because, in my experience, the larger the budget, the harder it is to keep on track. Most estimates we're asked to give in our day jobs are not even multi-million-dollar estimates.
Also, I'm using budget and estimate interchangeably, but these are of course different things -- one of my nitpicks is that we often treat them as the same thing when we talk about estimating being hard. A lot of individual estimates can be very wrong without affecting the ultimate budget.
Hence the separation into must-haves, highly desirable, and nice-to-haves. Hence the need for modularity and extensibility: if you don't get to build everything in one go, and can't always even predict which parts will be left outside the scope, you at least have more of a Lego-like structure.
BTW, maybe if we finally shook off the polite lie of planning how much work a project will take, and instead started to think in terms of possible deliverables within different time frames, the conversation would become saner.
I abandoned the plans from the previous PM and discussed the job with the developer who ballpark estimated that the work would take 2 months. After a quick analysis I adjusted this to 14 weeks.
But the account manager thought this sounded too long and insisted that we plug everything into a Gantt chart, define the shit out of everything, map the dependencies, etc., which showed that the development would only take 6 weeks.
The project ended up taking 14 weeks.
I think you were estimating time to build things that were out of R&D and you had specifications that were actual specifications you were building up to.
In SaaS my experience is: someone comes up with an idea without any clue about how the existing software works or is laid out, and has no specification beyond a vague, disorganized bunch of sentences. The software development team basically starts R&D to figure out the specification and what is possible - but is expected to deliver a final product.
Alternatively treat it like a bet and accept it may not pay off, just like any other business where uncertainty is the norm (movies, books, music).
Estimations in government contracts are as ridiculous as in software. They just pretend to be able to estimate when things will be done, when, in fact, the contractors are as clueless.
Not being able to say "it is impossible to estimate" does not mean your estimate will be correct. That estimate is usually a lie.
I think it's obvious that all software teams do some kind of estimates, because it's needed for prioritization. Giving out exact dates as estimates/deadlines is often completely unnecessary.
Sure, but even accurate estimates are only accurate as long as the assumptions hold.
Market conditions change, emergency requests happen, people leave, vendor promises turn out to be less than accurate.
And most estimates for non-routine work involve some amount of risk (R&D risk, customer risk, etc.).
So pounding the table and insisting on ACCURATE ESTIMATES without a realistic backup plan isn’t good business, it’s just pushing the blame onto the SWE team when (not if) something goes south.
There is a bridge in my town that is finally nearing completion, hopefully, this year. It was estimated to be completed 2 years ago.
This changes when it’s a project that has fewer unknowns, where they’ve built the same thing several times before. The same is true in software.
The reluctance to accept the reality that it cannot be made true achieves nothing positive for anybody. Rather, it results in energy being lost to heat that could otherwise be used for productive work.
This isn't about respect between functions, this isn't about what ought to be professionally acceptable in the hypothetical. It's about accepting and working downstream of a situation based in objective truth.
Believe me, I wish it were true that software estimates could be made reliable. Everyone does. It would make everything involved in making and selling software easier. But, unfortunately, it's not easy. That's why so few organisations succeed at it.
I don't present easy answers to the tensions that arise from working downstream of this reality. Yes, it's easier to make deals contingent on firm delivery dates when selling. Yes, it's easier to plan marketing to concrete launch dates. Yes, it's easier to plan ahead when you have reliable timeframes for how long things take.
But, again, unfortunately that is simply not the reality we live in. It is not easy. Flexibility, forward planning, working to where the puck is going to be, and accepting redundancy, lost work, or whatever if it never arrives there: that's all part of it.
That I think is what people in different functions are best served rallying and collaborating around. One team, who build, market and sell software with the understanding that reliable estimates are not possible. There simply is no other way.
In my work we have our core banking system, designed in the '80s on top of an Oracle DB, so everything is just boxes around it, with corresponding flexibility towards modern development methodologies. The complexity of just making a trimmed copy of the production servers for, say, a user acceptance test phase is quite something, connecting and syncing to hundreds of internal systems.
Needless to say, estimates vs. reality have been swinging wildly in all directions since forever. The processes, red tape, regulations and politics are consistently extreme, so from a software dev perspective it's a very lengthy process, while the actual code changes take an absolutely tiny slice of the whole project.
As far as estimates go, I've also struggled with the industry's cult(ural) rituals. I tried to put forward a Gaussian-based approach that took into account not only the estimate of time but also the expected uncertainty, which is still probably off the mark, but at least attempts to measure some of the variance. But again, the politics and the rigidity of the clergy that has built up around software development blocked it.
On the bright side, all this has helped me in my own development and when I think about software development and estimating projects. I know that outcomes become more chaotic as the number of pieces and steps in a project compounds (i.e. the project's normal curve widens). You may not even get the project at all as defined at the outset, so my normal-distribution approach is still not quite the right tool.
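This isn't the commenter's actual method, but a minimal sketch of the Gaussian idea: give each task a mean and a standard deviation, assume the tasks are independent (so variances add), and read confidence levels off the combined distribution. All task numbers are made up, and the independence assumption is exactly what breaks down when scope itself is uncertain:

    from statistics import NormalDist

    # Each task: (estimated days, standard deviation in days); values are hypothetical.
    tasks = [(5, 1), (8, 3), (3, 0.5), (13, 5)]

    mean = sum(m for m, s in tasks)
    sigma = sum(s ** 2 for m, s in tasks) ** 0.5   # independent tasks: variances add

    project = NormalDist(mean, sigma)
    print(f"50% confidence: {project.inv_cdf(0.50):.1f} days")
    print(f"90% confidence: {project.inv_cdf(0.90):.1f} days")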
I think this kind of thinking can be helpful when working solo or in a small group exposed to market forces. But for solo devs and small groups, the challenge isn't so much the estimates; it's how you're going to fight a battalion of mercenaries hired by big VC money and Big Tech. They can often afford to be inefficient and dump on the market, because their strategy is built around market control. These aren't practices small players can afford, so you need to get creative and try to avoid these market-participant kill boxes. And this is why, coming back to my earlier point, inefficient practices and politics often play a big role. They're trying to marshal a large number of troops into position and can afford to lose a few battles in order to win the war. The big money plays by a different set of rules, so don't worry if they're doing it wrong. Just recognize you're in the army, soldier!
It's not binary, it's a continuum.
With experience, it's possible to identify whether the new project or set of tasks is very similar to work done previously (possibly many times) or if it has substantial new territory with many unknowns.
The more similarity to past work, the higher the chance that reasonably accurate estimates can be created. More tasks in new territory increases unknowns and decreases estimate accuracy. Some people work in areas where new projects frequently are similar to previous projects, some people work in areas where that is not the case. I've worked in both.
Paying close attention to the patterns over the years and decades helps to improve the mapping of situation to estimate.
If you can't make firm delivery commitments to customers then they'll find someone who can. Losing customers, or not signing them in the first place, is the most harmful thing to everyone in the organization. Some engineers are oddly reluctant to accept that reality.
My favorite metaphor is building something like a new shopping mall. If you ask for an estimate you first need to architect the entire thing. This is equivalent to breaking down the task into sprints. In most companies the entire architecture phase is given very little value, which is insane to me.
Once we have our blueprints, we have other stakeholders, which is where things really go off the rails. For the mall, maybe there is an issue with a falcon that lives on the land and now we need to move the building site, or the fixtures we ordered will take 3 extra months to be delivered. This is the political part of estimating software and depends a lot on the org itself.
Then, finally, building. This is the easy part if we cleared the precursor work. Things can still go wrong: oops, we hit bedrock; oops, a fire broke out; oops, the design wasn't quite right; oops, we actually want to change the plan.
But yes, estimates are important to businesses. But businesses have a responsibility to recognize the difference. Get me to a fully ticketed and approved epic and most engineers can give you a pretty good estimate. That is what businesses want, but they don't consider the necessary precursor work when they Slack you "how long to build a mall?"
Estimating the delivery of a product whose absence means zero product for the customer is very different. A company that’s already humming along can be slow on a feature and customers wouldn’t even know. A company that’s not already humming is still trying to persuade customers that they deserve to not die.
Whereas with software, most of what was done previously is now an import statement, so up to 80-100% of the project is the novel stuff. Skilled leaders/teams know to direct upfront effort toward exploring the least understood parts of the plan to help reduce downstream risk, but to really benefit from that instinct the project plan has to regularly incorporate its findings.
1. Guess the order of magnitude of the task (hours vs days/months/years)
2. Add known planning overhead that is almost an order of magnitude more.
Example: if we guess that a task will take 30 min but it actually takes 60 min, that's a 100% error (30 min error / 30 min estimate).
But if the methodology is used correctly and we spend 2 h in a planning meeting, the same estimate and the same actual completion time result in only a 20% error, because we increased the known and reliable part of the estimate (30 min error / 2 h 30 min estimate).
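A quick check of that arithmetic, treating the 2 h meeting as fixed, known overhead added to both the estimate and the actual time, as in the comment's example:

    def relative_error(estimate_min, actual_min):
        return abs(actual_min - estimate_min) / estimate_min

    # Bare guess: 30 min estimated, 60 min actual -> 100% error.
    print(relative_error(30, 60))              # 1.0
    # Same task plus 2 h of known planning overhead on both sides -> 20% error.
    print(relative_error(30 + 120, 60 + 120))  # 0.2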
This is exactly what makes estimates categorically unreliable. The ones that aren't accurate will surprise you and mess things up.
In that sense, it does compress to being binary. To have a whole organisation work on the premise that estimates are reliable, they all have to be, at least within some pretty tight error bound (a small number of inaccuracies can be absorbed, but at some point the premise becomes de facto negated by inaccuracies).
https://en.wikipedia.org/wiki/Program_evaluation_and_review_...
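For those who don't click through: the heart of PERT is a three-point estimate, a weighted average of optimistic, most-likely and pessimistic guesses. A minimal sketch, with made-up task values:

    # Classic PERT three-point estimate for a single task.
    def pert(optimistic, most_likely, pessimistic):
        expected = (optimistic + 4 * most_likely + pessimistic) / 6
        std_dev = (pessimistic - optimistic) / 6
        return expected, std_dev

    # Hypothetical task, in days.
    e, s = pert(2, 5, 15)
    print(f"expected {e:.1f} days, std dev {s:.1f} days")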
this kind of (self-)deprecation is exactly the kind of thing that makes it impossible to be happy as a technical person in a startup
- Create urgency
- Keep scope creep under control
- Prioritize whatever is most valuable and/or can stand on its own
If you just say “I don’t know” and have no target, even if that’s more honest, the project is less likely to ever be shipped at all in any useful form.
If you're truly creating such unique and valuable software that it is to be compared to the world's engineering megaprojects in its challenge then perhaps it is beyond being beholden to a budget. Who am I to say?
But 99.9% of this industry isn't doing that and should probably be able to estimate their work.
Well, it is the truth. It won't be done before it is done. It is understandable that there is a business that needs to function, but the issue here is the question of asking for an estimate like you've already solved the problem, instead of actually sitting down with the engineer to discuss the business problems that need to be solved. That's what engineers are there for: To solve business problems. Estimates are irrelevant as the solution will be designed with the business constraints in mind.
> it's quite doable to break a large project down into a series of sprints
This too comes across like the problem is already solved. You don't need to break problems down into sprints. That is a ridiculous way to operate. This kind of thing only shows up where there is some weird effort to separate engineers from their jobs.
In fact, "sprint" comes from Scrum, which was designed to be a transitionary exercise to get engineers more comfortable with Agile, which is all about removal of managers. It is intended to teach engineers to think and act more like managers so that when you get rid of the managers completely that they don't flounder. If you are doing it as more than a temporary thing, you've entirely missed the point.
Rather: the customer will find someone who can confidently pretend that they can make firm delivery commitments.