Best Practices: Standards
by Michael Tiemann, Chief Technical Officer
Simply stated, standards are conducive to developing and maintaining best practices. After all, when two systems behave in the same way because of some underlying standard, a good practice developed for one should apply equally to the other, multiplying the benefit.
Conversely, to the extent that two systems differ, the practices may need to be abstracted, re-engineered, or may simply not apply, reducing the benefit and increasing the cost of developing a best practice. Thus, the quality and robustness of standards is a major determinant of whether a good practice can become a best practice at all.
This article focuses not on standards for their own sake, but on how standards enable or frustrate efforts to implement best practices. From this perspective, let us consider three types of standards prevalent in the industry: de facto standards, de jure standards, and open standards.
de facto standards are standards because everybody assumes they are standards. In other words, they are approved by the market, even if they are neither officially prescribed nor documented. A great example of such a standard is the document format for Microsoft Word documents, the so-called "doc file format." It is a de facto standard because people exchange Word documents assuming that if they can read a document, so can anybody else. When the single source of software for reading and writing such documents changes the file format, the notion of the de facto standard remains while the utility of the standard is destroyed. To recreate the value, market participants must purchase new software that can read the altered format, and when they do, they become unwitting agents of change (because they begin writing documents that require the new software). In the case of proprietary standards, backward and forward compatibility become values that are subject to market manipulation. Over time, the value the market receives from the existence of those proprietary standards is transferred to their controller.
In a best practices context, proprietary de facto standards are dubious at best. First and foremost is the question: other than some product, what is the standard? The more carefully one tries to abstract the standard for best practices purposes, the more questionable the assumptions of the abstraction become. Alternatively, the more completely one accepts the product-as-standard model, the more visible one's vulnerability to change becomes to the product's owner. There really is no winning solution, which is why many best practices specifically say "avoid proprietary standards."
de jure standards are standards that are prescribed by a standards body. They may be in widespread use (such as the metric system) or just a specification awaiting implementation or adoption (such as proper etiquette for making historical references when traveling in time), but de jure standards all share one common property: they are documented and vendor neutral. de jure standards ensure that a 1/4" screw will fit a 1/4" nut, or that a Sony CD-ROM will play in a Philips CD player. de jure standards provide a strong foundation from which to build best practices, except for the fact that many more de jure standards exist than the market can support. What good is it to build a best practice around a technology that's never widely adopted?
So it seems that one can build a best practice on the quicksand of de facto standards or in the wilderness of de jure standards, but wouldn't it be nice to have standards that are both popular in the market and documented and prescribed by standards bodies?
Open standards take a middle road, which is very favorable for nurturing best practices. For the purposes of this document, we will define open standards as standards that are sufficiently documented to be implemented by and/or verified by a third party and which may be implemented freely, without payment of royalty for using the standard. Of course one may charge money for a specific implementation (if somebody's willing to pay), but a competitor cannot be taxed for developing its own implementation of the standard. The World Wide Web Consortium (W3C) is an example of a standards-setting body that has recently taken a position on patents to align their definition of a Web standard with this definition of an open standard.
Open standards have enjoyed considerable success in the marketplace. As we have seen with the adoption of open standards such as Ethernet, TCP/IP, HTTP, and XML, when there's nothing to be afraid of, the industry can move forward quickly and confidently. One of the best practices that enabled all of these standards to exist was the Open Systems Interconnection (OSI) model. This seven-layer model created a standard reference for the exchange of data between different systems. By separating the physical, data link, network, transport, session, presentation, and application layers (the best practice), and then letting markets focus on innovations at each of these separate layers, we saw a dramatic rise in both technical innovation and system compatibility that culminated in the Internet as we know it today. Moreover, open standards such as TCP/IP and Ethernet have transcended their origins as network building blocks and become true open systems interconnect: TCP/IP is now used not only for LAN and WAN, but to connect storage systems (SAN) and even CPUs and memory (PAN). Ethernet has scaled from the 3 Mbps implementation first demonstrated at Xerox PARC to 10 Gbps, fast enough to build the interconnect fabric for next-generation supercomputers.
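The layering described above can be made concrete with a short sketch (in Python, chosen here for illustration; it is not part of the original article). HTTP is an open, plain-text application-layer protocol that anyone can implement from its published specification, and the ordinary socket API hands those bytes to TCP, hiding every OSI layer below transport. The host name used is a placeholder, not a real endpoint the article refers to.

```python
import socket


def build_request(host: str, path: str = "/") -> bytes:
    """Assemble an HTTP/1.1 GET request at the application layer.

    Because HTTP is a documented, royalty-free text protocol, any
    independent party can produce these bytes from the spec alone.
    """
    return (
        f"GET {path} HTTP/1.1\r\n"
        f"Host: {host}\r\n"
        "Connection: close\r\n"
        "\r\n"
    ).encode("ascii")


def fetch(host: str, port: int = 80) -> bytes:
    """Hand the request to TCP (transport layer), which rides on IP
    (network layer); the socket API hides the layers below entirely."""
    with socket.create_connection((host, port)) as sock:
        sock.sendall(build_request(host))
        chunks = []
        while data := sock.recv(4096):
            chunks.append(data)
    return b"".join(chunks)


# Example: print the request a client at any vendor would emit.
print(build_request("example.com").decode("ascii"))
```

The point of the sketch is the separation of concerns the OSI model prescribes: `build_request` knows nothing about transport, and `fetch` knows nothing about HTTP, so either layer can be replaced by any conforming implementation.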
Had the boundaries of the OSI model been controlled by a single company as a de facto standard (which was the status quo for networks at the time), the popularization of the Internet (and the value such a universal resource represents to society) might well remain a frustrated dream.
There are of course good, best, and downright awful practices related to creating and maintaining open standards. Dispensing first with the bad, declaring something to be an open standard when it is neither open (documented, vendor-neutral, freely implementable) nor standard (universally acknowledged, if not accepted) is, sadly, all too common in this industry. In fact, last year in Federal District Court, in a case where several states were trying to remedy the behavior of an abusive monopolist, the monopolist argued (successfully) against a remedy that would enjoin them from claiming to adhere to industry standards when it was clear that they did not. Thus, while there is no guarantee that a claim of an open standard is anything more than puffery, a valid claim is worth investigating.
At the opposite end of the spectrum, we find the Internet Standards Process, Revision 3, or RFC 2026 for short. RFC 2026 is in the category of "Best Current Practice" and defines the nature and process of Internet standards. Reading this document helps one understand how thousands of companies and millions of individuals could all bring something to the Internet and not have it fragment the industry like a modern-day Tower of Babel.
Quoting from this document, we see:

The goals of the Internet Standards Process are:

- technical excellence;
- prior implementation and testing;
- clear, concise, and easily understood documentation;
- openness and fairness; and
- timeliness.

The procedures described in this document are designed to be fair, open, and objective; to reflect existing (proven) practice; and to be flexible:

- These procedures are intended to provide a fair, open, and objective basis for developing, evaluating, and adopting Internet Standards. They provide ample opportunity for participation and comment by all interested parties. At each stage of the standardization process, a specification is repeatedly discussed and its merits debated in open meetings and/or public electronic mailing lists, and it is made available for review via world-wide on-line directories.
- These procedures are explicitly aimed at recognizing and adopting generally-accepted practices. Thus, a candidate specification must be implemented and tested for correct operation and interoperability by multiple independent parties and utilized in increasingly demanding environments, before it can be adopted as an Internet Standard.
- These procedures provide a great deal of flexibility to adapt to the wide variety of circumstances that occur in the standardization process. Experience has shown this flexibility to be vital in achieving the goals listed above.

It is not difficult to see how the Internet could be architected within this framework, and also why open source was such a natural resource for its construction. Taken together, the best practice of open standards and that of open source have clearly delivered a surpassing result.