When we talk about documentation within software development, we really need to tackle the issue on two fronts: problem-domain documentation and solution-domain documentation. Most of us are familiar with the now-popularized Agile mantra "Working Software over Comprehensive Documentation". The intent of this value statement within the Agile Manifesto was to attack the then-prevalent practice of front-loading projects with extensive requirements specifications in one big batch. Arguably, it was also a veiled shot across the bow at model-based documentation of software, with a very strong bias towards reliance on the code level of abstraction for managing knowledge of how the software works, and an even stronger bias towards the test suite for documenting the intent and purpose of what the software was supposed to do. Gone were the days of big up-front requirements documents and, equally bad, big up-front design documents. The folks who drew a line in the sand were, in dramatic form, attempting to end "documentation for documentation's sake". Hard-core agilistas point out that the misuse of documentation as a crutch for communication reflects a bureaucratic mindset and culture that has festered for far too long, while traditionalists have been clinging to misinterpretations of a crusty old paper written in 1970 as somehow being state-of-the-art.
This is all well and good, but it unfortunately de-contextualizes the issue and is highly myopic. As I describe at length in "Value Stream: Generally Accepted Practice in Enterprise Software Development", part of this de-contextualization is due to an insensitivity towards, or lack of concern and respect for, certain stakeholders, and ignorance of the realities of some form factors that would follow in the footsteps of turn-of-the-millennium technologies. One case in point is COTS acquisition, which almost always means large ERP. These are the supertanker programs within large enterprises, and the objective is typically to mitigate the very real business risks of quarter-century-old legacy systems that serve as the vital organs of the business organism. When faced with organ failure, ripping out the "HR heart" or "Accounting liver" within the body is no laughing matter, and as has so often been the case, the price for lack of agility is on the order of hundreds of millions of dollars - or worse, spectacular failure on the order of billions; see DIMHRS if you need proof of this. Lack of documentation of the "requirements" can impede agility by several orders of magnitude. But the "requirements" in this case are not what traditionalists might like to think. The critical documentation in this context takes the form of the business processes (Business Use Case Realizations) that constitute the vendor's mass-market viewpoint of how to run the enterprise. These are what is being adopted and acquired, and in a gap-driven fashion, the Agile placeholders for conversation (fine-grained demand for IT work, "User Stories") must fall out in evolutionary fashion. Poor documentation on the part of the vendor will kill agility and lead to attempts to recreate this image of the future state by folks who are all too willing to fill potholes and invent "requirements" so as to justify their existence. Unfortunately, this negligence is what leads to massive train wrecks.
I can think of the above-noted DoD program, which went 10 years without a single jointly ratified requirement. Similarly, vendors who are unwilling to share their test suites (which are also a strong form of requirements specification) can cause order-of-magnitude reductions in agility.
Another example of how blindly following Agile rhetoric can kill agility is in the area of compliance and the quality-of-evidence issues related to standards of care. We live in a society of laws, like it or not, and whoever gets to be the judge of effective risk management unfortunately gets the final say on what constitutes a compliance exception. This is not to be taken lightly, as an exception on an SSAE-16 audit can lead to very real consequences - billions in capital outflows, for example. This is why ideals like continuous delivery must be tempered by their poor cousin, near-continuous delivery, due to the overhead from Legal related to reviews for misrepresentation, errors and omissions, and the like. Yet somewhere along the way, "comprehensive documentation" got conflated with "effective risk management". Anyone who cares to dig further, however, finds that management science is not on their side when it comes to such an assertion. Not only does such a completeness standard oversimplify the fitness-for-purpose of the practices being forced upon teams, it also ignores the risk tolerance and cultural realities of the organization in question. In effect, enforcing such "rigorous" documentation actually inserts risk. It does this by inserting delay into the feedback loop that enables true knowledge to emerge - the tacit and hidden knowledge that is beyond what can be "rigorously" and explicitly documented. This destabilizes the ecosystem dynamics, as described in SDLC 3.0: Beyond a Tacit Understanding of Agile. What does serve as essential documentation for achieving agility in these cases, yet is rarely achieved efficiently, is the release bill-of-materials.
This capability, rooted in efficient and effective configuration management, ensures that teams understand exactly what is being released from one segregated environment to the next. This form of documentation, ironically provided with agility through tools, can if missing severely impair the last mile as you attempt to deliver potentially shippable increments of value into production.
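To make the idea concrete, here is a minimal, hypothetical sketch of what a release bill-of-materials might capture: which component versions move between segregated environments, pinned by content hash. The component names, versions, and release label below are invented for illustration; real tooling would derive them from the configuration management system.

```python
# Hypothetical release bill-of-materials sketch. All names, versions,
# and artifact bytes below are invented placeholders.
import hashlib
import json

def bom_entry(name, version, content: bytes):
    """Record one component with a content hash so 'what shipped' is verifiable."""
    return {
        "component": name,
        "version": version,
        "sha256": hashlib.sha256(content).hexdigest(),
    }

release_bom = {
    "release": "2014.06-RC2",  # illustrative release label
    "components": [
        bom_entry("billing-service", "3.4.1", b"billing-artifact-bytes"),
        bom_entry("web-ui", "1.9.0", b"ui-artifact-bytes"),
    ],
}

# The manifest travels with the release from environment to environment,
# so release managers and auditors can verify what is actually deployed.
manifest = json.dumps(release_bom, indent=2)
```

The design point is that the manifest is generated from the build, not written by hand; hand-maintained release documentation is exactly the kind that decays and kills last-mile agility.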
When we shift from problem-domain documentation to solution-domain documentation, more - not less - documentation becomes essential, not only for near-term agility but for sustained agility and effective knowledge management. Again contextually driven, the era of Cloud and the various modern form factors of software delivery are heavily dependent on documentation for ensuring agility. When we embrace modern technologies, whether they be extremely powerful PaaS platforms (like IBM Bluemix or similar), or cultural movements like DevOps that rely heavily upon tool-chain integration, we realize that pure Agile dogma that is now 16 years old (an eternity in current-day half-lives) must be scrutinized through more mature and deeper reflection. "Individuals and Interactions over Processes and Tools" has very much become obsolete and must be reinterpreted as "Individuals and Interactions empowered by Processes and Tools". Within modern-day development, documentation becomes absolutely essential; a very real example is illustrated by attempts to leverage cloud-based APIs within the ALM (Application Lifecycle Management) tools environment. As a cloud-based REST API, the Rally Web Services API provides an extremely efficient way to extract JSON data from their SaaS repositories for ODS data integration/synchronization, delivery-intelligence reporting, or lightweight operational data warehousing, to name a few. At first glance the documentation is seemingly robust. Folks can invoke REST-based service calls from within their microservices quite efficiently - in other words, with high agility. However, this can grind to a crawl within the same API documentation environment: once you go beyond the operational contract and the off-the-shelf usage examples (in the technology of your choice, whether Java, C#, Ruby, Python, PHP, etc.), agility is severely hampered by the lack of documentation about which fields are actually available.
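The kind of extract call in question can be sketched as follows. This is not Rally's documented contract - the base URL, endpoint name, and parameter names here are illustrative assumptions modelled on typical ALM-style REST query APIs, and anyone doing this for real must confirm them against the vendor's current API documentation (which is exactly the point of this post).

```python
# Illustrative sketch of building a paged work-item extract query against a
# cloud ALM REST API. The host, endpoint, and field names are assumptions
# for illustration, NOT a vendor-documented contract.
import urllib.parse

BASE_URL = "https://alm.example.com/webservice/v2.0"  # placeholder host

def build_story_query(workspace_ref, fields, page_size=200):
    """Build the GET URL for a paged user-story extract."""
    params = {
        "workspace": workspace_ref,
        # Explicit field list: anything not listed here (or not documented
        # by the vendor) is silently absent from the JSON you get back.
        "fetch": ",".join(fields),
        "pagesize": page_size,
    }
    return f"{BASE_URL}/hierarchicalrequirement?{urllib.parse.urlencode(params)}"

url = build_story_query("/workspace/12345", ["FormattedID", "Name", "ScheduleState"])
# An ODS sync job would then GET this URL (with an API-key header) and walk
# the paged JSON results, e.g. with urllib.request or the requests library.
```

Note how the entire usefulness of the `fetch` list hinges on the vendor documenting which fields exist and in which API version - precisely where the documentation gaps bite.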
Even if one goes through the schema definitions to review the field inventory for a workspace or project, a developer will likely still bang their head for long periods of time. Only by piecing together clues does one discover the correlation with continuous Cloud API upgrades. One such frequent upgrade extends GET access within query definitions to certain Portfolio-level fields which, while accessible through the API, are not returned in earlier API versions. You might say that this is one isolated example. Unfortunately you would be mistaken, as a comparable standard that isn't even cloud-based has the same kind of hidden issues: OSLC (the Open Services for Lifecycle Collaboration standard). How many hours have been wasted fighting with that API, hunting for the needle in the haystack of obfuscated knowledge? Or ask anyone who has lived through a regression caused by browser upgrades; they will attest to the impact of missing documentation and the endless hours of soul-searching it causes. Gone are the days where "the documentation is in the code", because most of the time you don't even have access to the code - open source or not. The agility achieved through typical cloud assets like Google Code (Google Charts being a popular example) would be worthless if their documentation sucked.
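The head-banging described above can at least be systematized. A minimal sketch, under the assumption that you can fetch both a schema-declared field inventory and a sample record: diff what the schema claims against what a query actually returns, so the undocumented gaps surface as a list instead of hours of trial and error. The field names and payloads below are made-up stand-ins.

```python
# Illustrative sketch: diff the fields a schema endpoint claims exist
# against the fields a query actually returns, to surface undocumented
# version-dependent gaps. All field names below are invented examples.
def undocumented_gaps(schema_fields, returned_record):
    """Fields declared in the schema but absent from the API response."""
    return sorted(set(schema_fields) - set(returned_record))

schema = ["FormattedID", "Name", "PortfolioItem", "ScheduleState"]
record = {
    "FormattedID": "US42",
    "Name": "Checkout flow",
    "ScheduleState": "Defined",
}

gaps = undocumented_gaps(schema, record)
# 'PortfolioItem' is declared in the schema yet silently missing from
# this (hypothetical) older API version's response.
```

Running such a diff against each API version you target turns "piecing together clues" into a repeatable check, though it is a workaround for, not a substitute for, proper vendor documentation.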
Throughout this post, hopefully two critical ideas pop out to help you sift through all the noise out there: we must understand the contextual constraints and drivers of our software endeavors, and we must understand the function over the form of the practices we employ. This latter emphasis goes to the heart of the matter, and to why Value Stream emerged. It describes how to make internal contextualization real for the vast array of tactics, strategies, and innovative ideas related to ways of working. This is different from the various models of external contextualization of projects that are sometimes discussed out there. Value Stream delivered to the world the first practice-based systems-thinking Universal Kernel, which makes explaining how our software delivery ecosystems actually work very simple, yet provides an extremely credible basis for understanding why we see the symptoms we see within Enterprise IT. Armed with such deep insights, mere "pathos" rhetoric like "Working Software over Comprehensive Documentation" evolves towards "Working Software informed by Effective Documentation".