Much has been written about the Estimating Problem, the optimism bias, the planning fallacy, and other related issues with estimating in the presence of Dilbert-esque management. The notion that the solution to the estimating problem is not to estimate, but to start work, measure the performance of that work, and use it to forecast completion dates and efforts, is essentially falling into the trap Steve Martin did in LA Story:
Using yesterday's weather because he was too lazy to make tomorrow's forecast.
By the way, each of those issues has a direct and applicable solution. So the next time you hear someone use them as the basis for a new idea, ask if they have tried the known-to-work solutions to the planning fallacy, the estimating bias, the optimism bias, and the myriad other project issues with known solutions.
All measuring performance to date does is measure yesterday's weather. This yesterday's weather paradigm has been well studied. If in fact your project operates in a stable climate, then yesterday's weather is likely a good indicator of tomorrow's weather.
The problem, of course, with the yesterday's weather approach is the same problem Steve Martin had in LA Story when he used a previously recorded weather forecast for the next day.
Today's weather turned out not to be like yesterday's weather.
Those posting that stories settle down to a rhythm assume - and we know what assume means: it makes an Ass out of U and Me - that the variances in the work efforts are settling down as well. That's a hugely naive approach without actual confirmation that the variances are small enough not to impact past performance. And when you have statistical processes looking like this, from small-sample projects in the absence of an actual reference class - in this case a self-reference class - you're also being hugely naive about the possible behaviours of stochastic processes.
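The small-sample danger can be seen in a minimal simulation (all numbers hypothetical, chosen only for illustration): draw weekly story throughput from a stochastic process with occasional bad weeks, and compare what a four-week "yesterday's weather" sample says against the long-run behaviour of the very same process.

```python
import random

random.seed(7)

# Hypothetical weekly story throughput: a stochastic process with
# occasional bad weeks (interruptions, rework), not a steady rhythm.
def weekly_throughput():
    return 2 if random.random() < 0.3 else random.randint(6, 10)

# "Yesterday's weather": forecast from only the last 4 weeks.
small_sample = [weekly_throughput() for _ in range(4)]

# The long-run behaviour of the same process.
large_sample = [weekly_throughput() for _ in range(1000)]

small_mean = sum(small_sample) / len(small_sample)
long_run_mean = sum(large_sample) / len(large_sample)

print(f"4-week mean:   {small_mean:.1f} stories/week")
print(f"long-run mean: {long_run_mean:.1f} stories/week")
```

Run it with different seeds and the four-week mean swings widely around the long-run mean - the sample is too small to confirm that the variance has settled down, which is exactly the naive assumption being made.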
Then when you slice the work into same-sized efforts - this is actually the process used in the domains we work in: DOD, DOE, ERP - you're estimating future performance based on a reference class and calling it Not Estimating.
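A sketch of what that slicing-and-forecasting actually produces (hypothetical throughput figures, resampled with a simple Monte Carlo - an assumption about how such a forecast might be built, not any particular team's method): the output is a distribution of completion dates with confidence levels, which is an estimate by any other name.

```python
import random

random.seed(11)

# Hypothetical observed throughput: same-sized slices finished per week.
observed = [7, 5, 8, 2, 6, 7]
remaining = 60  # slices still in the backlog

def weeks_to_finish():
    """Resample the self-referential class until the backlog is exhausted."""
    done, weeks = 0, 0
    while done < remaining:
        done += random.choice(observed)
        weeks += 1
    return weeks

trials = sorted(weeks_to_finish() for _ in range(5000))
p50 = trials[len(trials) // 2]
p85 = trials[int(len(trials) * 0.85)]

print(f"50% confidence: done in {p50} weeks")
print(f"85% confidence: done in {p85} weeks")
```

The forecast is a probability distribution over future performance, built from a reference class of past performance - the very definition of an estimate.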
So when you hear examples of Bad Management - over-commitment of work, assigning a project manager to a project that is 100's of times larger than any that PM has ever experienced and expecting success, getting a credible estimate and cutting it in half, or any other Dilbert-style management process - ask why the response starts with dropping the core processes needed to increase the probability of success.
This approach is itself contrary to good project management principles, which are quite simple.
If we start with a solution to a problem of Bad Management before assuring that the Principles and Practices of Good Management are in place, we'll be paving the cow path, as we say in our enterprise, space, and defense domain. This means the solution will not have actually fixed the problem. It will not have treated the root cause of the problem, just addressed the symptoms.
There is no substitute for Good Management.
And when you hear there is a smell of bad management, with no enumeration of the root causes and the corrective actions for those root causes, remember Inigo Montoya's retort to Vizzini:
You keep using that word. I do not think it means what you think it means.
Those words are dysfunction, smell, and root cause - all used while missing the actual enumerated root causes, the assessment of possible corrective actions, and the resulting removal of the symptoms.
I speak about this approach from my hands-on experience performing Performance Assessment and Root Cause Analysis on programs that are in the headlines.