As Model-Based Software Engineering continues to gain traction in our industry, data models will become more commonplace in software architectures and acquisition programs. This paper proffers a framework for considering the maturity of data models. Not only does this provide modelers and architects a way to characterize their models, it also enables program managers and contracting officers to write requirements for the creation (and subsequent evaluation) of models. This paper will also relate each of the maturity levels to existing technology and indicate where they align with (and are enabled by) the FACE Data Architecture standard.
Abstract models that organize data elements and standardize how they relate to one another and to real-world properties are commonly referred to as data models. The architecture used to govern these data models (the data model architecture) has evolved alongside advancements in both theory and practice. Over those years of evolution, however, it has become increasingly challenging to differentiate best practices from the various tools, standards, and other approaches that may be used to achieve similar goals. This is understandable given the competing goals, costs, and requirements across the vastly different programs and projects in which data model architectures are applied. To engage more effectively with new data model architecture ideas, it is worth stepping back from the intricacies of the approaches and reminding ourselves why data models are important and what they strive to accomplish.
For any given interface, it is the anticipated use, and the anticipated extent of that use, that drives the rigor and specificity of its documentation. This is because, at a fundamental level, it is the documentation of interfaces that enables an architecture's interfaces to be managed, understood, shared, and extended.
As the strength of a chain is determined by its weakest link, the success of the communication between an interface and its consumer depends upon its weakest point of documentation. In other words, the minimum rigor to which the interface must be documented is set by the highest level of comprehension required of it. As the number of integrators and/or organizations increases, so do the required rigor and specificity of the interface documentation.
Since interface documentation is critical to integration, management and scalability of that documentation are two primary areas of importance to examine further. In fact, these two factors can provide a solid litmus test for categorically identifying levels of interoperability.
Management of interface documentation has notoriously been difficult. To remain effective, documentation must be maintained as interfaces change and updates are made. As systems grow larger and more complex, this process becomes more daunting. In an attempt to mitigate the difficulty, a single authoritative body is often entrusted with control over the single documented representation of those interfaces. While this may be sufficient for isolated, small, or one-off interface implementations, trends in distributed teaming make a single nexus of control increasingly difficult, if not impossible, to maintain. The need for distributed control of documentation grows year by year.
Scalability of interface documentation refers to the ability of that documentation to grow, change, and evolve over time, while retaining traceability to its original provenance. Traceability allows a characterization of the data's meaning to be retained throughout all future iterations of that documentation, without the need for it to be recharacterized.
Additionally, scalability affects the documentation's portability: the ability of interface documentation to be reused and shared. This avoids the need for the physical documentation to grow in proportion to the data it characterizes. Only new concepts need to be fully characterized in ongoing iterations of the documentation, since certain fundamental concepts, such as location, stand the test of time. For example, the concepts of knowing where you are, locating and identifying things and items of interest near oneself, deciding what to do about those things, and then actually doing something have been the same for a long time. The fact that how we go about doing these things has evolved does not change the concept of what it is we are doing. 1
1 As an exercise, consider the differences in a conceptual model describing an engagement for a modern naval combatant versus a 200-year-old ship of the line. The difference lies not in the interfaces and the specifics of the data, but in the structure of the concepts themselves.
Interface Documentation Maturity Levels (IDMLs)
The ease with which interface documentation may be managed and scaled provides critical insight into the level of a system's interoperability enabled through the use of that documentation. It is this level of interoperability that determines an architecture's place in the desired progression towards advanced approaches and best practices.
Interface Documentation Maturity Levels (IDMLs) provide a basis for qualifying and relating levels of interoperability based on interface documentation rigor and specificity. By identifying the specific IDML for a given interface or set of interfaces, it is possible to classify what level of interoperability that interface will have with all other interfaces also characterized by an IDML.
The IDML Model with its seven levels represents an additive maturity concept progressing towards today’s most advanced practices in interface documentation. As levels in the model increase, the rigor and specificity of the documentation increases, as does its usability, management and scalability. (It should be noted that the notion of an IDML scale is conceptually taken from the long-standing concept of Technology Readiness Levels (TRLs) (Sadin) but is not intended to carry any particular connotations or usages of TRL.)
Recall that we earlier likened interface documentation to links in a chain and interoperability to the strength of that chain. Although an interface may be categorized at a particular IDML, its effective interoperability is bounded not only by its own functionality but also by that of the interface interacting with it. The level of interoperability between any two interfaces is limited by the lower of their two IDMLs. For example, if an IDML6 interface interacts with an IDML5 interface, the interoperability will be limited to IDML5 capabilities.
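This limiting rule can be sketched as a small helper function. The function and its name are purely illustrative and not part of any standard or the FACE architecture:

```c
/* Illustrative sketch only: the effective interoperability between two
 * interacting interfaces is bounded by the lower of their two IDMLs,
 * just as a chain is bounded by its weakest link. */
static int effective_idml(int idml_a, int idml_b)
{
    return (idml_a < idml_b) ? idml_a : idml_b;
}
```

For instance, `effective_idml(6, 5)` yields 5, matching the IDML6/IDML5 example above.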
As an interface's requirements change, it may be necessary to further specify the interface documentation, requiring a higher IDML. The progression to a higher IDML ensures that the interface now meets higher standards; new interfaces at this level may now be discovered and utilized. Depending on the current IDML, the transition from that level to a desired level may require substantial data model architecture changes, so such possibilities need to be considered carefully during architecture planning. Because very few interfaces will only ever require interoperability with other low-level IDML interfaces, best practice recommends that an interface strive to reach the highest IDML reasonably attainable.
Early: Levels 1-2
When a program is small, relatively static, and/or wholly managed internally with no need for external interfaces, early maturity level documentation may suffice. Such documentation often meets the needs of the program, as management is simple and scalability is not required. The most basic forms of interface documentation are characterized by these Early Maturity levels, which are limited to the specification of content. In this early stage, interface documentation may be captured through source code or an ICD / IDD that simply captures the content of the interface without machine-readable syntax or semantic meaning.
These two levels of documentation are not addressed by the FACE Data Architecture.
1: Source Code
In IDML1, one could reasonably argue that no overt interface documentation exists, since the source code provides the extent of the documentation. The code explicitly specifies the algorithm, but only implicitly suggests syntax and semantics. Change, and the tracing of change, happens only in the source code, and no additional documentation is available to communicate the intent or use of that code. As long as the original intent is fully understood by those modifying the code, this approach is sufficient. Source code is the simplest possible form of interface documentation.
Figure 1: example source code
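To make this concrete, consider a minimal hypothetical sketch (not the paper's actual Figure 1) of an IDML1-style interface: the field layout is explicit in the code, but units, meaning, and byte order are only implied.

```c
#include <stdint.h>
#include <string.h>

/* IDML1 sketch: this source code is the only "documentation" of the
 * interface. The field layout is explicit, but the semantics are not:
 * nothing records whether altitude is in feet or meters, or what id
 * actually identifies. That intent lives only with the original authors. */
struct track_msg {
    uint32_t id;       /* implicit semantics: track number? sensor id? */
    int32_t  altitude; /* implicit units: feet? meters? */
};

/* Serialization copies raw bytes; byte order and padding assumptions
 * are likewise left implicit. */
static size_t pack_track(const struct track_msg *msg, uint8_t *buf)
{
    memcpy(buf, msg, sizeof *msg);
    return sizeof *msg;
}
```

A second integrator receiving these bytes has nothing but this code from which to reconstruct the interface's intent.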
2: Interface Control / Description Document (ICD / IDD)
An ICD / IDD is a familiar engineering artifact and a key element in systems engineering. It is often used as a tool to communicate information about interfaces, documenting the inputs and outputs of a system or the interfaces between systems. This solution provides additional information beyond the source code, including the meaning of the interface, how the data is transmitted, and the data formats used.
The ICD provides a useful vehicle for the communication of interface specifications beyond source code. However, use of the ICD also requires a knowledgeable user to read, interpret, and implement code structures and type-manipulation routines that align with the IDML2 documentation.
Figure 2: sample IDD template
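As a hypothetical illustration of the IDML2 workflow (not the actual template of Figure 2), the comment below plays the role of an ICD entry, and the code beneath it is what a developer must write, and keep aligned, by hand:

```c
#include <stdint.h>

/* IDML2 sketch (hypothetical ICD excerpt, human-readable only):
 *
 *   Field     Type    Units   Range          Notes
 *   -----     ----    -----   -----          -----
 *   track_id  uint32  n/a     1..65535       Unique per mission
 *   altitude  int32   feet    -2000..60000   MSL
 *
 * A developer must read this prose and manually translate it into the
 * matching code below; nothing machine-checks that the two stay in sync. */
struct track_msg {
    uint32_t track_id;
    int32_t  altitude_ft;  /* hand-aligned with the ICD: feet, MSL */
};

/* Range check implemented by hand from the ICD text. */
static int altitude_valid(int32_t altitude_ft)
{
    return altitude_ft >= -2000 && altitude_ft <= 60000;
}
```

The semantics (units, ranges, meaning) now exist beyond the code, but only as paper documentation that each integrator must interpret and re-implement.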
Mid: Levels 3-5
As complexity and scope increase, interface management becomes more challenging. External interfaces necessitate more rigorous documentation and, once users have experienced the (sometimes prolonged) pain of a paper-document-based architecture, a push is often made for the creation of “The Data Model.”
This Mid Phase is characterized by the capacity to be machine readable. Machine readability becomes