In some "points of view" the notion of measuring software development parameters with Source Lines of Code is equivalent to the devil incarnate. This is of course another POV that makes little sense without understanding the domain and context. It's one of those irrationally held truths that has been passed down from on high by those NOT working in the domains where SLOC is a critical measure of project and system performance.
In the embedded real time systems domains - Software Intensive Systems - where the number of systems and the related code base dominate the desktop and server-side code base by 100X - the number of lines of code in a system is a direct measure of predicted cost and schedule, as well as predicted performance. Estimating in the presence of uncertainty for Software Intensive Systems is a critical success factor.
For some background on software intensive systems...
The importance of embedded systems is undisputed. Their market size is about 100 times the desktop market. Hardly any new product reaches the market without embedded systems any more. The number of embedded systems in a product ranges from one to tens in consumer products and to hundreds in large professional systems. […] This will grow at least one order of magnitude in this decade. […] The strong increasing penetration of embedded systems in products and services creates huge opportunities for all kinds of enterprises and institutions. At the same time, the fast pace of penetration poses an immense threat for most of them. It concerns enterprises and institutions in such diverse areas as agriculture, health care, environment, road construction, security, mechanics, shipbuilding, medical appliances, language products, consumer electronics, etc. (Embedded Systems Design: The ARTIST Roadmap for Research and Development, ed. Bruno Bouyssounouse and Joseph Sifakis, Lecture Notes in Computer Science, Vol. 3436, Berlin/Heidelberg: Springer, 2005, p. 72.)
There are some who are repelled by the notion of counting the lines of code, or estimating the number of lines of code that may be needed to produce the needed capabilities. But that's because of the domain problem again.
Databases exist that correlate SLOC with cost and schedule for business systems (www.qsm.com), and of course for real time systems as well.
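To make that correlation concrete, here is a minimal sketch of the kind of parametric relationship behind databases and tools like QSM's SLIM - Putnam's software equation. The productivity parameter and the example numbers below are illustrative assumptions; in practice they are calibrated from the reference class database, not picked by hand:

```python
# Putnam's "software equation" relates delivered size, schedule, and
# effort: Size = C_k * K^(1/3) * t_d^(4/3), where C_k is a process
# productivity parameter. Solving for effort:
#   K = (Size / (C_k * t_d^(4/3)))^3

def putnam_effort(esloc: float, schedule_years: float,
                  process_productivity: float) -> float:
    """Estimated total effort K in person-years."""
    return (esloc / (process_productivity * schedule_years ** (4 / 3))) ** 3

# Illustrative only: 75,000 ESLOC delivered in 2 years, with an assumed
# productivity parameter of 10,000. Calibrated values come from the
# reference class database, not from this sketch.
effort = putnam_effort(esloc=75_000, schedule_years=2.0,
                       process_productivity=10_000)
print(f"Estimated effort: {effort:.1f} person-years")
```

The cubic form is the point: compressing the schedule drives effort up nonlinearly, which is exactly the behavior the reference class databases record.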
So like it or not, consider it the devil incarnate or not, the numbers talk.
Predicting the computer performance requirements for a completed system early in the design and development lifecycle of that system is challenging. Software requirements and avionics or hardware systems many times mature in parallel, and, in the early stages of design, uncertainty about meeting the performance requirements makes determination of the processing architecture difficult.
Later in the design process, as details are finalized and prototypes can be developed, estimates of performance, cost, and schedule become increasingly accurate. But if we wait until later in the lifecycle to make architectural changes, those changes are much more costly. These changes also result in schedule and technical risks.
The earlier the performance needs are determined and the corresponding system architecture is established, the easier an appropriate computing platform (hardware and software) can be incorporated into the design.
A direct example I'm familiar with is NASA's Orion Crew Exploration Vehicle flight software. That approach uses available requirements documentation as the basis of the estimate and decouples input/output (I/O) and computation-based processing, estimating each separately and then combining them into a final result.
This approach was unique in that it was used to estimate the execution time of unwritten or partially specified software, gave a specific contribution for I/O, and also estimated the time needed to develop the code and therefore the cost of that code. The method for estimating I/O processing performance was based on quantifying the data, and the method for estimating algorithmic processing was based on approximated code size.
The result was used to predict processor types and quantities, allocate software to processors, predict communication bandwidth utilization, and manage processor margins. ("Requirements-based execution time prediction of a partitioned real-time system using I/O and SLOC estimates," Innovations in Systems and Software Engineering, Volume 8, Issue 4, December 2012, pp. 309-320.)
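Here is a minimal sketch of what that decoupled style of estimate looks like. Every parameter - the instructions-per-SLOC expansion ratio, processor throughput, bus rate, frame budget - is an illustrative assumption, not the published Orion method or its numbers:

```python
# Decoupled execution-time estimate: computation driven by estimated
# SLOC, I/O driven by quantified data volume, combined against a frame
# budget. All parameter values are illustrative assumptions.

def compute_time_ms(estimated_sloc: int,
                    instr_per_sloc: float = 8.0,    # assumed expansion ratio
                    processor_mips: float = 200.0,  # assumed throughput
                    executions_per_frame: int = 1) -> float:
    """Algorithmic processing time per frame, from estimated code size."""
    instructions = estimated_sloc * instr_per_sloc * executions_per_frame
    return instructions / (processor_mips * 1e6) * 1e3  # -> milliseconds

def io_time_ms(bytes_per_frame: int,
               bus_rate_mbps: float = 10.0,         # assumed bus bandwidth
               per_transfer_overhead_ms: float = 0.05,
               transfers: int = 1) -> float:
    """I/O processing time per frame, from quantified data."""
    transfer_ms = (bytes_per_frame * 8) / (bus_rate_mbps * 1e6) * 1e3
    return transfer_ms + per_transfer_overhead_ms * transfers

# Combine the two contributions and check them against the frame budget.
frame_budget_ms = 25.0                              # 40 Hz minor frame
total_ms = compute_time_ms(estimated_sloc=12_000) + \
           io_time_ms(bytes_per_frame=4_096, transfers=8)
margin = 1.0 - total_ms / frame_budget_ms
print(f"Estimated frame load: {total_ms:.2f} ms, margin {margin:.0%}")
```

The margin calculation at the end is the part that feeds the architectural decisions: if the estimated frame load eats the budget, you change processors or reallocate software before the design hardens.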
Now, is SLOC appropriate for you? Good question. Actually a theological question in some quarters, since the received truth from the agile community is that this is never an appropriate approach. Trouble is, research shows a direct correlation between the size of a software system - both measured and estimated - and its cost and schedule.
Databases exist showing this and other parametric measures that, used properly, produce estimates very useful to both business and technical management. At the ICEAA 2014 conference there was a briefing on the databases available for making estimates of software intensive systems; these databases are sources of reference classes for estimating cost and schedule for business and engineering systems. At that same conference a colleague and I presented a research paper showing how to apply Time Series Analysis (ARIMA) and Principal Component Analysis (PCA) to estimate the future performance of projects.
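As a minimal sketch of the time-series piece, here is an ARIMA forecast of a project performance index. The data below is made up for illustration - in practice you would use the project's actual monthly Cost Performance Index or a similar measure - and the PCA step, which reduces a set of correlated measures to a few dominant components before forecasting, is omitted for brevity:

```python
# Forecast a project's Cost Performance Index (CPI) six months ahead
# with a simple ARIMA(1,1,1) model. Requires pandas and statsmodels.
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA

# Hypothetical 18 months of CPI observations for one project.
cpi = pd.Series(
    [1.02, 0.99, 0.97, 0.98, 0.95, 0.96, 0.94, 0.95, 0.93,
     0.94, 0.92, 0.93, 0.91, 0.92, 0.90, 0.91, 0.89, 0.90],
    index=pd.date_range("2013-01-01", periods=18, freq="MS"),
)

# Fit the model and forecast the next six months of performance.
model = ARIMA(cpi, order=(1, 1, 1))
result = model.fit()
forecast = result.forecast(steps=6)
print(forecast.round(3))
```

A drifting forecast like this one is the early warning: it says where cost performance is headed if nothing changes, which is the whole point of estimating.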
So whatever your thoughts - and likely biases - SLOC is a very useful estimating tool in many domains - business and embedded systems - with reference class databases, if you're willing to do the work to estimate the complexity of the code. If you claim it can't be done, then for you that's likely true.