When I was a young university student I learned TCP/IP by reading RFCs. It gave me an exact idea of how the network worked. It trained me to recognize a good specification. And it somehow persuaded me to believe in standards. I maintained that belief for most of my professional life. However, it started to vanish a few years ago, and recently I lost that faith completely. There were two "last drops" that sent my naïveté down the drain.
The first drop was SCIM. I was interested in that protocol because I hoped that having a standard interface in midPoint would be a good thing. But as I went through the specification I found quite a lot of issues. That is a clear telltale of an interface which is still under development and is not suitable for real-world use, let alone for becoming a standard. I concluded that SCIM was a premature standardization effort and was ready to forget about it. But it was suggested that I post my comments to the SCIM mailing list, and in an attempt to be a good netizen I did just that. There was some discussion on the mailing list, but it ended in vain. What I figured out is that there is no will to improve the protocol, to make the specification more concrete and useful. SCIM is not a protocol and it is not an interface. It is a framework that can be changed almost beyond recognition while one can still call it SCIM. All hopes for practical interoperability are lost. Well, there is some public interoperability testing, but I have checked the scenarios that were actually tested, and they are the most basic, simplest cases, miles away from reality. The folks on the SCIM mailing list argue that most of the "advanced" features are to be implemented as protocol extensions, which most likely requires "profiling" the protocol for a specific use case. That means practically no interoperability out of the box: every real-world deployment will need some coding to make it work. I believe that SCIM is lost both as a protocol and as a standard.
The other drop was OAuth 2. I was not watching that one so closely, but recently a friend pointed me to Eran Hammer's blog entry. Eran describes a situation very similar to SCIM: a specification that does not really specify anything, and a lack of will to fix it. That was the point when I realized that I had seen this scenario in various other cases over the last few years. It looks like premature standardization is the method and vague specifications are the tools of current standardization efforts. I no longer believe in standards. They just don't work.
But we need interoperability. We need protocols and interfaces. How can we get them without standards? I think that open specifications are the way to go: specifications constructed outside of the standardization bodies; specifications backed by (open source) software that really works in practical situations before they are frozen and "standardized"; specifications based on something that actually works. That seems to be the only reasonable way.
But there is also a danger down this road. Great care must be taken to design responsibly, to specify things well, to reuse (where possible) instead of reinventing, and to learn from the experience of others. To avoid creating abominations such as OpenID.