Without a Plan, you're making it up as you go. And if you make it up as you go, that's the basis for getting it horribly wrong - Salman Rushdie
And the Principles, Practices, and Processes to Increase Probability of Success
There is a popular Agile and No Estimates phrase...
It is by doing the work we discover the work we must do.
This of course ignores the notion of engineering or designing a solution to the needed Capabilities of the customer BEFORE starting coding. It is certainly the case that some aspects of the software solution can only be confirmed when the working software is available for use. But like all good platitudes in the agile community, there is no domain or context as to where this phrase is applicable. How can coding start before there is some sort of framework for how the needed capabilities will be delivered?
This sounds not only naïve, but like we're wandering around looking for a solution without any definition of what the problem is. With that approach, when the solution appears it may not be recognized as the solution. Agile is certainly the basis for dealing with emerging requirements. But all good agile processes have some sense of what the customer is looking for.
This understanding of what capabilities the customer needs starts with a Product Roadmap. The Product Roadmap is a plan that matches short-term and long-term business goals with specific technology solutions to help meet those goals.
A Plan is a Strategy for success. All strategies have a hypothesis. A Hypothesis needs to be tested. This is what working software does. It tests the hypothesis of the Strategy described in the Product Roadmap.
So if you have to do the work to discover what work must be done, you've got an Open Loop control system. To close the loop, this emergent work needs a target to steer toward. With this target, the working software can be compared to the desired working software, and the variance between the two used to take corrective actions to steer toward the desired goals.
And of course, since the steering target (goal) and the path to this goal are both random variables - estimates will be needed to close the loop of the control processes used to reach the desired outcomes that meet the Capabilities requested by the customer.
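To make the closed-loop idea concrete, here is a toy sketch in Python. The planned rate, the throughput spread, and the gain are all invented for illustration; this is a control-loop cartoon, not a project management tool.

```python
import random

def run_project(planned_rate, periods, gain, seed=7):
    """Toy closed-loop project: each period the actual throughput is a
    random variable; gain > 0 uses the variance to plan as the
    corrective action that steers back toward the target."""
    rng = random.Random(seed)
    done, rate = 0.0, planned_rate
    for period in range(1, periods + 1):
        done += rng.uniform(0.7, 0.9) * rate       # actual output this period
        variance = planned_rate * period - done    # behind (+) or ahead (-)
        rate += gain * variance                    # corrective steering action
    return done

# Plan: 10 units/period for 10 periods, so the target is 100 units.
open_loop = run_project(planned_rate=10, periods=10, gain=0.0)
closed_loop = run_project(planned_rate=10, periods=10, gain=0.3)
# Without feedback the project drifts short of the target;
# with feedback the corrective actions bring it much closer.
```

The point is not the specific numbers: closing the loop requires both a target to steer toward and an estimate of the variance to that target each period.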
If you can't explain what you are doing as a process, then you don't know what you are doing - Deming
Process is the answer to the question How do we do things around here? All organizations should have a widely accepted Process for making decisions. "A New Engineering Profession is Emerging: Decision Coach," IEEE Engineering Management Review, Vol. 44, No. 2, Second Quarter, June 2016
A skeptic will question claims, then embrace the evidence. A denier will question claims, then reject the evidence. - Neil deGrasse Tyson
Think of this whenever there is a conjecture with no testable evidence for the claim. And think even more when those making the conjectured claim refuse to provide evidence. When that is the case, it is appropriate to ignore the conjecture altogether.
A common assertion in the Agile community is we focus on Value over Cost.
Both are equally needed. Both must be present to make informed decisions. Both are random variables. As random variables, both need estimates to make informed decisions.
To assess the value produced by the project, we first must have targets to steer toward. A target Value must be measured in units meaningful to the decision makers. Measures of Effectiveness and Performance can monetize this Value.
Value cannot be determined without knowing the cost to produce that Value. This is fundamental to the Microeconomics of Decision making for all business processes.
The Value must be assessed using...
Without these measures attached to the Value, there is no way to confirm that the cost to produce the Value will break even. The Return on Investment to deliver the needed Capability is, of course:
ROI = (Value - Cost)/Cost
So the numerator and the denominator must have the same units of measure. This is usually dollars, maybe hours. So when we hear ...
The focus on value is what makes the #NoEstimates idea valuable - ask: in what units of measure is that Value? Are those units of measure meaningful to the decision makers? Are those decision makers accountable for the financial performance of the firm?
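The breakeven arithmetic is simple enough to put in code. The dollar figures below are illustrative, not from any real project:

```python
def roi(value, cost):
    """Return on Investment: value and cost must share units (dollars here)."""
    return (value - cost) / cost

# A capability monetized at $180k that costs $120k to deliver:
print(roi(180_000, 120_000))   # 0.5, i.e. a 50% return
# Breakeven is ROI == 0: the monetized value exactly covers the cost.
print(roi(120_000, 120_000))   # 0.0
```

Note the formula is undefined without both terms: no monetized Value, no numerator; no cost estimate, no denominator.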
Where You Stand Depends On Where You Sit
This notion is the basis of all good discussions. It is also the basis of discussions that get us in trouble. For example, I sit in a FAR 34.2/DFARS 234.2 Federal Procurement paradigm and a similar paradigm for Enterprise IT based in ISO 12207, ITIL V3.1, CMMI, or similar governance processes.
Both these domains are guided by a Governance framework for spending other people's money, planning for that spend, performing the work with that money, reporting the progress to plan for that spend to produce the needed value, and taking corrective actions when the outcomes don't match the plan to increase the probability that the needed value (Capabilities) will be delivered for the planned cost to keep the Return On Investment on track.
This paradigm is independent of the software development method - traditional or agile.
If you work where the customer has a low need to know the cost, schedule, or what will be produced for the money, then you likely sit somewhere else.
On our morning road bike ride, the conversation came around to Systems. Some of our group are like me - techies - a few others are business people in finance and ops. The topic was: what's a system, and how does that notion impact our world? The retailer in the group had a notion of a system - grocery stores are systems that manage the entire supply chain from field to basket.
Here's my reading list that has served me well for those interested in Systems
These are all actionable outcomes books.
Systems of information-feedback control are fundamental to all life and human endeavor, from the slow pace of biological evolution to the launching of the latest space satellite ... Everything we do as individuals, as an industry, or as a society is done in the context of an information-feedback system. - Jay W. Forrester
There is a popular graph showing project performance versus the estimated project performance in "Schedule Estimation and Uncertainty Surrounding the Cone of Uncertainty," Todd Little, IEEE Software, May/June 2006.
This chart (above) shows data from samples of software development projects and is used by many in the agile community and by #NoEstimates advocates to conjecture that estimates are usually wrong. In fact estimates can easily be wrong for many reasons. But knowing why they are wrong is rarely the outcome of the discussion.
This approach is missing a critical understanding about assessing the effectiveness of any estimating process. It starts with the notion of an ideal estimate. That is a post hoc assessment of the project - the ideal estimate is only known after the project is over.
Next is another critical issue.
Did the projects (in the graph above) overrun the estimate because the estimate was wrong or because the project was executed poorly?
In our domain of complex projects, many of which are Software Intensive System of Systems, there are four primary Root Causes of Unanticipated Cost and Schedule Growth, shown below.
The top graph shows samples from programs in the business. But it does not show WHY those programs ended up greater than the initial estimates. The graph is just a collection of data after the fact. What was the Root Cause of this unanticipated cost (and schedule) growth? The paper goes on to speak about an Estimating Quality Factor, but that is only one of the four core Root Causes of cost growth from the PARCA research. The paper also mentions other causes of growth, some similar to the PARCA causes.
But each project in the graph is not assigned a specific cause (maybe even more than one). Without that assignment it is not possible to determine whether estimating was the Root Cause, or whether, as stated above, one or more of the three other causes was the reason the project was above the Ideal line.
For the notion of Ideal, I will assume the estimate was Ideal if the project actuals matched the estimate.
The problem here is that those advocating estimates are flawed, a waste, not needed, harmful, or even evil use this chart to support their claim - without stating the Cause of the deviation from the Ideal. Just that it IS, not why. Without the why, there is no basis to make any claims about the issues with estimating.
Without the Why, NO corrective action can be taken to improve the performance of both the project or any of the four listed (and many other) processes including the estimating process for the project. This is the fundamental basis of Root Cause Analysis. And it comes down to this ...
Without the Root Cause for the undesirable outcome, no corrective action can be taken. You're just treating the Symptom. Those conjecturing a Cause (say, Estimating is the smell of Dysfunction) have no basis on which to make that statement. It's a house built on sand. The same goes for the Cone of Uncertainty - also popularly misused, since the vertical scale is rarely calibrated for the specific domain; instead, some broad and uncalibrated range is provided for notional purposes.
As a notional concept the Cone of Uncertainty is useful. As a practical concept, only when the vertical scale is calibrated (Cardinal versus Ordinal) can it be used to assess the uncertainty of the estimates during specific periods of the project. Using it uncalibrated is another misuse of statistical data analysis.
There are databases that provide information needed to calibrate that chart as well. But that's another post.
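As a sketch of what calibration means, here is a calibrated band computed from an organization's own history. The ratio numbers below are invented for illustration; real ones would come from your project database:

```python
import statistics

# Hypothetical history: (estimate / final actual) ratios recorded at two
# points in past projects. Real calibration data would come from your own
# organization's database, not these invented numbers.
ratios_at_start   = [0.45, 0.6, 0.8, 1.3, 1.9, 2.2]
ratios_at_midterm = [0.8, 0.9, 0.95, 1.05, 1.15, 1.25]

def calibrated_band(ratios):
    """A calibrated uncertainty band: the observed spread of your own
    estimate-to-actual ratios, not the notional 4x/0.25x textbook cone."""
    return min(ratios), statistics.median(ratios), max(ratios)

start_band = calibrated_band(ratios_at_start)
mid_band = calibrated_band(ratios_at_midterm)
# The band narrows as the project progresses only if the organization's
# own data says it does - that is what calibration (Cardinal, not
# Ordinal) means here.
```

With bands like these, an estimate made at a given phase carries a defensible range instead of a decorative cone.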
For now, take care when seeing the first chart to ask: do you know the cause for the projects above the Ideal Line? If not, then all you can say is: we don't know why, but a lot of projects in our organization show the Symptom of cost overrun, and we have no way to know why - and therefore no way to fix the symptom. We've observed it, but we can't fix it.
To start to apply Root Cause Analysis, here is an introduction to the method we use on our Software Intensive Systems.
The question of the viability of #NoEstimates starts with a simple principles-based notion.
Can you make a non-trivial (NOT de minimis) decision in the presence of uncertainty without estimating?
The #NoEstimates advocates didn't start there. They started with "Estimates are a waste, stop doing them." Those advocates also started with the notion that estimates are a waste for the developers - not considering that those who pay their salaries have a fiduciary obligation to know something about the cash demands and profit resulting from a decision in the future.
The size of the "value at risk" is also the starting point for estimates. If the project is small (de minimis) - meaning no one cares if we overrun significantly - then estimating is likely a waste as well. Whether the project is small or multi-million dollar does not by itself determine the decision to estimate. That decision is driven by the "value at risk," and that is determined by those paying, NOT by those consuming. So the fact that I personally work on larger projects does not remove the principle of "value at risk." Any client's (internal or external) V@R may be much different from my personal experience - but it's not our money.
Next comes the original post from Woody: "you can make decisions with No Estimates." In a principles-based conversation (which the NE'ers avoid), that statement violates the principles of Microeconomics. Making decisions in the presence of uncertainty (and I'm assuming all projects of interest have uncertainty) requires estimates. In MicroEcon, those decisions are based on Opportunity Cost, and making the best choice for the project involves assessing the probabilistic outcomes of the choices - for which estimating is required.
Real Options is a similar process in IT based on estimating. Vasco stated long ago he was using Real Options along with Bayesian Decision Making. I suspect he was tossing around buzzwords without knowing what they actually mean.
From the business side, the final principle is Managerial Finance. This is the basis of a business's management of its financial assets. The balance sheet is one place these are found. Since the future returns from the expenses of today and the "potential" expenses of the future are carried on that balance sheet, estimating is needed there as well for the financial well-being of the firm.
These three principles Value at Risk, MicroEconomics of Decision Making, and Managerial Finance are ignored by the NE advocates when they start with the conjecture that “decisions can be made without estimates,” and continue on with “estimates are a waste of developers time, they should be coding not estimating.”
It's the view of the world that, as a developer, "it's all about me" - never looking at the paycheck and asking where the money came from. That's a common attitude, and one I held when I started my career 35 years ago as a FORTRAN developer for Missile Defense radar systems. Our boss had us take out our paychecks (a paper check in those days) and look at the upper left hand corner: "That money doesn't come from the Bank of America, El Segundo Blvd, Redondo Beach; it comes from the US Air Force. You young pups need to stay on schedule and make this thing work as it says in our contract."
In the end, the NE conversation can be about the issues in estimating - and there are many; Steve McConnell speaks to those. I work on large federal acquisition programs - IT and embedded systems. Many times the "over target baseline" root cause is "bad estimating." But the Root Cause of those bad estimates is not corrected by Not Estimating, as #NoEstimates would have us believe.
As posted on this blog before and sourced from the Director of “Performance Assessment and Root Cause Analysis,” unanticipated growth in cost has 4 major sources:
1. Unrealistic performance expectations – that will cost more money.
2. Unrealistic cost and schedule estimates – the source of poor estimating.
3. Inadequate assessment of risk and unmitigated risk exposure.
4. Unanticipated technical issues.
Research where I sometimes work (www.ida.org) has shown these are core to the problems of cost overruns in nearly every domain, from ERP to embedded software intensive systems. It is likely those 4 root causes are universal.
So what's #NoEstimates trying to fix? They don't say - just "exploring new ways." In what governance frameworks? They don't say. They don't actually say much of anything, just "estimates are waste, stop doing them and get back to coding."
As my boss in 1978 reminded us newly minted Master's-degreed coders, "It ain't your money, it's the USAF's money, act accordingly - give me an estimate of when this thing you're building can be used to find SS-9's coming our way." Since then I've never forgotten his words: "business (in that case TRW) needs to know how much, when, and what, if it's going to stay in business."
Sitting in our seats at last night's Rockies vs. Phillies game, it dawned on me there's an analogy between the Moneyball strategy and good management of software development. In Moneyball, Billy Beane was faced with a limited budget for players. He hired a statistics guy, and they figured out that getting on base was just as important as hitting a home run, and a hell of a lot cheaper.
Last night there were a few home runs. But most of the action was singles and a few doubles. If you do the simple-minded math, when the whole lineup gets on base with batting averages around 0.303, there'll be runs scored every inning. Four singles equal a home run. Getting on base is the key to winning games.
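For the skeptical, the simple-minded math can be simulated. This is a cartoon of an inning - invented on-base rules, no double plays, no extra bases - just enough to show that on-base percentage drives scoring:

```python
import random

def runs_per_inning(p_on_base, innings=10_000, seed=42):
    """Crude sketch of the 'four singles equal a home run' math: each
    batter either reaches base (probability p_on_base) or makes an out;
    the 4th, 8th, ... runner aboard in an inning forces a run home."""
    rng = random.Random(seed)
    total_runs = 0
    for _ in range(innings):
        outs = runners = 0
        while outs < 3:
            if rng.random() < p_on_base:
                runners += 1
                if runners % 4 == 0:   # bases were loaded; a single scores one
                    total_runs += 1
            else:
                outs += 1
    return total_runs / innings

# A lineup of .300-ish on-base hitters scores steadily without home runs,
# and raising the on-base probability raises the scoring rate.
singles_team = runs_per_inning(0.303)
patient_team = runs_per_inning(0.400)
```

The model is deliberately crude, but the ordering it shows is the Moneyball point: small, repeatable gains in getting on base compound into runs.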
Getting incrementally more software out the door - assuming it's the right software needed by the customer and that customer can put the software to work - then progress toward the win is being made.
So the Strawman of Waterfall and Big Bang is just that: a Strawman. The Strawman of No Projects is also nonsense, along with No Estimates. The manager of the Rockies has a plan in the presence of uncertainty and emerging situations. That's why he's called the MANAGER - because he manages in the presence of uncertainty. And in doing that job he makes estimates of the probability of success of the emerging play options.
We can learn a lot from baseball about managing projects. First get on base. You can't score unless you're on base. First, Second, Third, then Home. You can't count on hitting Home Runs to win the game. It doesn't work that way. Offense is good, but so is defense. Managing the risks is defense. Defense in baseball is more than just putting players on the field. It's how those players react when the ball is hit. Go for the out at first? Try for a double play? Hold the ball after a single bounce in the outfield?
While baseball is not a contact sport, it still requires teamwork. I played competitive Volleyball in college - the ultimate team sport, since you're only as strong as the weakest player - much like the software development team. But all teams have a strategy, a game plan that changes as the game emerges. And most of all - as Peter Kretzman has lambasted some NE advocates who have likely never played baseball - all the players are making estimates all the time in order to catch the ball and keep control of the ball and the emerging situation of the game.
There appears to be a resurgence in the No Projects conversation, similar to the No Estimates notion that has been around for awhile.
I'm going to suggest that most of the disconnects around ideas of software development - from No Estimates to No Projects to whatever - start with Developers and the assumption that it's their money. It's not their money. If it were, they could do with it as they please, and no one would care.
There are standard business accounting processes in any business that creates products or services - including software - for money. Unless the developers are Staff Aug (just labor) to another firm, the accounting processes are defined in several places: FASB 86, FASB 10, even GAAP for capitalization and expensing of the cost of developing a product for use internally or for sale externally.
Here's a start …
The separation of Products from Projects at the software development process level is understandable. I work on a Program that has both, each for good reasons. A Product Line enhancement is usually a continuing system - Version 9 of a legacy system is a Product Line extension of a system that has been in place for 11 years. A "new" product, Version 1.0 in the same domain, is treated as a Project. The Project is to establish Version 1.0, which will then be extended over its life.
If the No Projects approach goes the same way as the No Estimates approach, those paying for the work will be intentionally excluded from the conversation. When I asked one of the originators of the #NoEstimates conjecture to go check the idea with the CFO, I got silence.
Follow the Money is advice I received from my colleague – former NASA Cost Director
This is good advice for anyone proposing an initiative that claims to change the status quo, tilt at the windmill of supposed bad management, or fix any supposed evil seen by those spending money provided by someone else. Remember this when making suggestions...
It's not Your Money, act accordingly. Your opinion should be considered as appropriate, but it's not your money. Those whose money it is have a fiduciary responsibility to spend it in a manner compliant with the accounting principles of the firm, be that private or public.
I work on Agile software development programs. Most everyone in Boulder Colorado works on Agile development programs. We meet once a month or so for coffee and talk about Agile. We have formal MeetUps hosted by local vendors - Rally, Scaled Agile, and other agile development shops.
The range of projects at our morning coffee klatch goes from one-man shows to multi-billion dollar space flight programs and back again. There are times when we get pissed off at each other for all the right reasons - this is the Big-Endian versus Little-Endian argument of Gulliver's Travels, which became an actual argument about byte order in microprocessors when 32-bit machines started doing floating point calculations - see "On Holy Wars and a Plea for Peace." When I was working on embedded systems and we had to choose a 32-bit machine for our new signal processing system, we got in huge fights about how the floating point software was going to be moved over to the floating point hardware.
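For anyone who hasn't lived through the byte-order fight, here is a minimal illustration of what the Holy War is about, using Python's struct module to show the two conventions:

```python
import struct
import sys

# The same 32-bit float serializes to different byte sequences under the
# two conventions the Gulliver's Travels essay named Big- and Little-Endian.
big = struct.pack('>f', 1.0)      # big-endian: most significant byte first
little = struct.pack('<f', 1.0)   # little-endian: least significant byte first

assert big == b'\x3f\x80\x00\x00' # IEEE 754 single-precision 1.0
assert little == big[::-1]        # the same four bytes, reversed

host_order = sys.byteorder        # which convention this machine uses
```

The fights were (and are) real because both orderings are internally consistent; the pain shows up only at the boundary where two conventions meet.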
So the question is: what are the shared principles across a broad range of projects and business and technical domains? Here's some background on the domain I work in and the application of Agile in that domain. I'm speaking at the Boulder Agile Meetup July 27th if anyone is interested in hearing about Agile in Government.
Here's some background on Agile in the domain I work in
Typically these are also Software Intensive System of Systems. Here's some background on those
So when we hear something about exploring new approaches, ask first - in what domain have you tested that idea? By what Principles can that new idea be accepted into a domain outside your domain? Is there any evidence that your new idea can work outside your personal experience? How would I test your idea before spending my customer's money? What would be those test cases?
It's popular in the #NoEstimates community to claim Forecasting is not Estimating. Using English dictionaries, they build a case using logic like this. It's been repeated nearly continuously since the start of the discussion about how to make decisions in the presence of uncertainty without estimating.
Salviati: If you have a number of uniform sized tasks & know velocity you can predict ttc (Time to Complete) Simplicio responds: Indeed...gives us much stronger forecasting ability than estimation
Let's use the Oxford Dictionary of Mathematics, not the High School English Dictionary.
It ain't so much the things we don't know that gets us in trouble. It's the things we know that ain't so - Artemus Ward
Estimating is about the past, present, and future outcomes of a process, a model, or some external observation. We can estimate the number of leaves that have fallen and need to be raked come fall, to size how many bags have to be bought at Home Depot (past). We can estimate the number of people on the Pearl Street Mall right now to assess how long the walk will be to Starbucks from our parking spot (present). We can estimate the number of minutes we'll need to reach our favorite parking spot at Denver International Airport from the office parking lot (future).
Forecasting is Estimating about the future. The weather forecast for tomorrow is 70 degrees and a 30% chance of rain in northern Boulder County (where we live). With this forecast, we can estimate what time we need to tee off to have a high probability of getting all 18 holes of golf on our course in before the rain starts.
Forecasting is estimating about the future
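A velocity-based "forecast" makes the point: it is a probabilistic estimate about the future. Here is a sketch using Monte Carlo sampling of historical velocity; the backlog size and velocity samples are invented for illustration:

```python
import random

def forecast_ttc(backlog, velocity_samples, trials=5_000, seed=3):
    """A 'forecast' of sprints-to-complete is an estimate about the
    future: velocity is a random variable, so sample its history many
    times and report confidence levels, not a single number."""
    rng = random.Random(seed)
    outcomes = []
    for _ in range(trials):
        remaining, sprints = backlog, 0
        while remaining > 0:
            remaining -= rng.choice(velocity_samples)  # one simulated sprint
            sprints += 1
        outcomes.append(sprints)
    outcomes.sort()
    return outcomes[int(0.5 * trials)], outcomes[int(0.8 * trials)]

p50, p80 = forecast_ttc(backlog=100, velocity_samples=[8, 10, 12, 14])
# p50 and p80 are the 50% and 80% confidence sprint counts - probabilistic
# estimates, whatever label is preferred.
```

Call the outputs forecasts if you like; they are still estimates of a future outcome derived from past samples.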
The misuse of Hofstadter's Law is common in many agile development domains. The quote is ...
It always takes longer than you expect, even when you take into account Hofstadter's Law
This quote is misused to suggest that estimating can't be done. On page 152 of Gödel, Escher, Bach: an Eternal Golden Braid, Hofstadter explains the context and meaning of Hofstadter's Law.
He's speaking about the development of a chess-playing program - and doing so from the perspective of 1978-style software development. Game-playing programs use a look-ahead tree with branches of the moves and countermoves. The art of the program is to avoid exploring every branch of the look-ahead tree down to the terminal nodes. In actual chess, people - not the computer - have the skill to know which branches to look down and which to skip.
In the early days (before 1978), people used to estimate that it would be ten years until a computer was world champion. But after ten years (1988), it was estimated that that day was still ten years away.
This notion is part of the recursive Hofstadter's Law which is what the whole book is about. The principle of Recursion and Unpredictability is described at the bottom of page 152.
For a set to be recursively enumerable (the condition to traverse the look ahead tree for all position moves), means it can be generated from a set of starting points (axioms), by the repeated application of rules of inference. Thus, the set grows and grows, each new element being compounded somehow out of previous elements, in a sort of mathematical snowball. But this is the essence of recursion - something being defined in terms of simpler versions of itself, instead of explicitly.
Recursive enumeration is a process in which new things emerge from old things by fixed rules. There seem to be many surprises in such processes ...
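Hofstadter's mathematical snowball can be shown with a toy recursively enumerated set. The axioms and rules below are invented for illustration:

```python
def recursively_enumerate(axioms, rules, steps):
    """Grow a set from starting axioms by repeatedly applying rules of
    inference - each new element is compounded from previous elements."""
    generated = set(axioms)
    for _ in range(steps):
        # the comprehension is evaluated fully before the union, so the
        # set is not modified while being iterated
        generated |= {rule(x) for x in generated for rule in rules}
    return generated

# Toy system: start from 1; the rules are 'double it' and 'add three'.
result = recursively_enumerate({1}, [lambda n: 2 * n, lambda n: n + 3], steps=3)
# Even with two trivial rules the set snowballs in ways that are hard to
# predict without actually running the process.
```

That unpredictability is the context Hofstadter was writing in - and it applies to look-ahead trees, not to ordinary feature development.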
So if you work on the development of recursive-enumeration-based software designs, then yes - estimating when you'll have your program working is likely going to be hard. Or if you work on the development of software that has no stated Capabilities, no Product Roadmap, no Release Plan, no Product Owner or Customer with even the slightest notion of what Done looks like in units of measure meaningful to the decision makers, then you can probably apply Hofstadter's Law. Yourdon calls this type of project a Death March project - good luck with that.
If not, then DO NOT fall prey to the misuse of Hofstadter's Law by those who likely have not actually read Hofstadter's book, nor have the skills and experience to understand the processes needed to produce credible estimates.
So once again, it's time to call BS when quotes are misused.
There's another rash of Twitter posts supporting the conjecture that decisions can be made about how to spend other people's money without estimating the impact and outcome of that decision. This is a core premise of #NoEstimates from the Original Poster.
Here's a starting point for learning why that is simply not true...
There are hundreds more books, papers, and conference proceedings on the topic of decision making in the presence of uncertainty.
It comes down to a simple concept
All project work is uncertain. Reducible uncertainties (epistemic) and irreducible uncertainties (aleatory) are present on all projects. When money is being provided to develop software, those providing the money have a reasonable expectation to know something about when you will be providing the value in exchange for that money, how much money will be required to provide that value, and what capabilities will be provided in exchange for that money. The field of study where these questions and answers live is microeconomics of decision making. Software development decision making is a well studied subset of that field.
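The two kinds of uncertainty can be sketched in a few lines. The numbers below are invented: an aleatory spread in task duration plus an epistemic risk event that can, in principle, be bought down:

```python
import random

def p80_duration(p_risk=0.25, trials=10_000, seed=11):
    """Sketch: task duration with an irreducible (aleatory) spread plus a
    reducible (epistemic) risk event. All parameters are illustrative."""
    rng = random.Random(seed)
    durations = []
    for _ in range(trials):
        base = rng.triangular(10, 30, 15)            # aleatory: always there
        rework = 20 if rng.random() < p_risk else 0  # epistemic: can be retired
        durations.append(base + rework)
    durations.sort()
    return durations[int(0.8 * trials)]              # 80th-percentile commitment

with_risk = p80_duration(p_risk=0.25)
risk_retired = p80_duration(p_risk=0.0)
# Retiring the epistemic risk (a spike, a prototype) shrinks the 80%
# number; the aleatory spread remains and can only be handled with margin.
```

Both kinds of uncertainty have to be estimated before anything sensible can be said about when, how much, and what.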
When it is conjectured that decisions can be made in the presence of uncertainty without estimates - as the Original Poster did - and there is no supporting theory, principle, practice, or evidence of how that can be done, run away. It's a bogus claim.
I've started reading Vasco's book #NoEstimates and will write a detailed deconstruction. I got the Kindle version, so I have a $10 investment at risk. Let's start with some graphs that have been around and their misinformation that forms the basis of the book.
The Chaos Report graph is the 1st one. This graph is from 2004 numbers - 12-year-old data. Many times the book uses 12, 16, even 25 year old reports as the basis of the suggestion that Not Estimating fixes the problems in those reports. The Chaos reports have been thoroughly debunked as self-selected samples using uncalibrated surveys for the units of measure of project failure. Here are a few comments on the Standish reporting process. But first remember, Standish does not say what the units of measure are for Success, Challenged, or Failure. Without the units of measure, the actual statistics of the projects, and the statistical ranges of those projects for each of the three categories, the numbers are essentially bogus. Good fodder for selling consulting services or for use by those with an idea to sell, but worthless for decision making about the root cause of Failure, Challenged, or even Success. Any undergraduate design-of-experiments class would require all that information to be public.
So the 1st thing to read when you encounter data like this is "Project Success: A Multidimensional Strategic Concept," Aaron J. Shenhar, Dov Dvir, Ofer Levy, and Alan C. Maltz. Only then start to assess the numbers. Most likely, like the numbers in the book, they're not credible support for the hypothesis - which, by the way, was never stated: there is no hypothesis for "you can make decisions in the presence of uncertainty without estimating."
So let's look further at the difficulties with Standish and why NOT to use it as the basis of a conjecture
A simple Google search would have found all this research and much more. I get the sense V didn't do his homework. The bibliography has very few references to actual estimating - no actual estimating books, papers, or research sites. Just personal anecdotes from a set of experiences as a developer.
The Standish Report failure mode is described in Darrell Huff's How to Lie With Statistics - self-selecting the samples from the survey. Standish does not provide any population statistics for their survey.
None of these questions are answered in the Standish reports. No Estimates picks these serious statistical sampling errors up and uses them as the basis of the pure conjecture that Not Estimating is going to fix the problems. This would garner a high school D in the Stats class.
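The self-selection error is easy to demonstrate with a simulation. The response probabilities below are invented, but the mechanism is the one Huff describes:

```python
import random

def observed_failure_rate(true_fail_rate, respond_if_failed, respond_if_ok,
                          population=100_000, seed=5):
    """Sketch of a self-selected survey: troubled projects answer more
    often, so the reported failure rate exceeds the true one. All the
    probabilities here are invented for illustration."""
    rng = random.Random(seed)
    responses = failures_reported = 0
    for _ in range(population):
        failed = rng.random() < true_fail_rate
        # the failed projects are more motivated to respond
        p_respond = respond_if_failed if failed else respond_if_ok
        if rng.random() < p_respond:
            responses += 1
            failures_reported += failed
    return failures_reported / responses

reported = observed_failure_rate(0.20, respond_if_failed=0.6, respond_if_ok=0.2)
# The survey 'finds' roughly twice the true 20% failure rate, purely from
# who chose to answer - no project got any worse.
```

This is why population statistics matter: without them, a reported failure rate says as much about who responded as about the projects.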
Next comes a chart that makes a similar error. The reference is to Steve McConnell's book, but the chart is actually from another source. The No Estimates book does a poor job of keeping its references straight. It is common for it to misattribute a report, a graph, even a phrase. The book needs a professional editor.
The graph below is used to show that estimates are usually wrong. But there is a critical misunderstanding about the data.
I'm in the early parts of the book and already have a half dozen pages of notes on fallacies, incorrect principles, 30-year-old references, and other serious misunderstandings of how decisions are made in the presence of uncertainty. My short assessment is:
It's a concept built on sand.
When riding a single track like this one behind our neighborhood, I came to an understanding of the tradeoffs between stability and control. In our conference sessions, we speak about how the Wright Brothers learned this concept as well.
Here's another trail being ridden by our son at Nationals in 2013; he's massively better than I will ever be. But the same tradeoff between stability and control is needed to be ranked 33rd in Cross Country and 23rd in Short Track at DII in the nation, with the team taking 2nd overall.
In both cases, nationally ranked and rank amateur, control of the bike is the key. Stability is a relative term, depending on speed, terrain, skill, risk-taking tolerance, experience, technical equipment, and other intangible factors.
We speak at conferences where program planning and controls is the topic. Starting 2 years ago, our foundation for speaking about managing in the presence of uncertainty has been the Wright Brothers.
Most earlier experimenters in powered flight focused only on one or two of the primary problems - (a) A set of lifting surfaces, or wings, (b) A method of balancing and controlling the aircraft, and (c) A means of propulsion - and did not consider the final design from the outset. The Wrights recognized that each of these areas had to be successfully addressed to build a working airplane. They believed that the aerodynamic and propulsion problems would be comparatively easier to solve, so they first concentrated on how to maintain balance and control.
Many believed that air currents were too swift and unpredictable for human reflexes. Therefore, an aircraft had to be inherently stable for the pilot to be able to maintain control. Because of the Wrights’ extensive experience with the bicycle—a highly unstable but controllable machine (just like the mountain bike)—they saw no reason why an airplane could not be unstable yet controllable as well.
The notion of Control is misused in software development projects. Misused and misunderstood. Keeping performance inside the upper and lower control bounds is a critical success factor. What are those upper and lower control bounds? Good question. They have to be estimated at the start. They become clearer as the project progresses. They have to be adjusted as new information emerges.
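One way to make those upper and lower control bounds concrete is a Shewhart-style control chart on observed throughput. The weekly numbers below are invented for illustration:

```python
import statistics

def control_bounds(samples, k=3):
    """Shewhart-style bounds: mean +/- k standard deviations of observed
    performance, estimated from early data and re-estimated as new
    observations arrive."""
    mean = statistics.mean(samples)
    sigma = statistics.stdev(samples)
    return mean - k * sigma, mean + k * sigma

weekly_throughput = [21, 24, 19, 23, 22, 20, 25, 22]   # units per week
lo, hi = control_bounds(weekly_throughput)
# A week outside (lo, hi) signals a special cause worth a root cause look;
# points inside are normal variation, not something to 'correct'.
```

The bounds themselves are estimates, which is the point: they start wide, tighten as data accumulates, and get re-estimated when the process changes.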
In projects, the quest for Stability is a false quest. All project work is uncertain. Stability is short lived. New inputs arrive every day - just like new inputs arrive while riding the flat trail cruising around the open space in the neighborhood, or even more so on the bumpy high speed trail of collegiate racing.
Even if the trail is defined in front of you, the small disruptions - and many times the big disruptions - impact your control of the bike. The key is Control over Stability. Staying in control will get you across the finish line, down the trail, and home again to ride another day.
Skills of Maintaining Control on the Mountain Bike and the Project
Starting from our house there is a nice loop, Left Hand Trail, that is easy, has a few climbs and a few descents, is uncrowded, and provides a nice view of the small hills before the real mountains start. Combine that with the Eagle Trail, Sage Trail, and North Rim Trail and it's a nice 7-mile loop.
There's lots of misinformation going around again in the #NoEstimates community
Here's the starting point to clarify and debunk those concepts
Here's some more details, once it's been confirmed you have domain experience
In the end it's simple. Anyone speaking about the difficulties or the waste of estimating is likely on the spend side of the software development equation. Those on the pay side know (or should know) they need estimates. They can get those estimates from the people most qualified to provide them - the developers. Or they can just make them up. Probably better to have a bottom-up estimate.