I've noticed recently that I'm getting quite a bit of grey in my beard. While there are many downsides to being an aging geek (I can't pull all-nighter coding sessions any more without hell to pay for days afterward), one of the benefits is gaining some perspective on how innovations play out over time. I started building software in the late '80s on my beloved Amiga, began object-oriented programming in Objective-C on NeXTStep in 1990, and built my first web application in 1993. Along the way, I've seen all manner of approaches to standardizing new technologies.

Some ultimately yield successful standards, some do not.

Innovation Lifecycle

When a new technology with broad relevance arrives, it tends to develop along a lifecycle much like that of other innovations. It may or may not follow the classic diffusion-of-innovations curve, depending on how the market adopts it. Understanding this lifecycle is a prerequisite for knowing when and how to apply standards successfully.

  1. Invention Phase: This is the raw, dynamic phase of a technology’s life. The technology shows substantial promise so people are interested, but there are a lot of wrong assumptions, incomplete features, and wires hanging out of the implementations. The Web from 1993-1999 was in the invention phase. Netscape and Microsoft were fighting it out over what the user experience would be, and tons of new features were being added to browsers. In this phase, incompatibilities abound and implementation “camps” appear, advocating one implementation over another and jockeying for control.
  2. Evolution Phase: The big issues are largely resolved, but the technology still sees constant additions from many sources. There is incentive to build what’s compatible with other companies’ implementations because the market has usually grown to the point where adoption requires compatibility. From about 2000 to 2008 or so, we saw the evolution phase of the Web. The implementation camps in this phase still exist and remain important, but often one has become dominant.
  3. Maturity Phase: The technology is mature and relatively stable in terms of feature set. There are mature, competitive implementations in the market. Their compatibility is now given a high priority, and companies that don’t embrace compatibility lose market share. The maturity phase of the Web began around 2008, and we’re now at the tail end of it. Implementation camps in this phase usually cater more to personal preference, as most surviving implementations are broadly compatible and distinctions have become refined.
  4. Absorption Phase: The technology that was important unto itself becomes buried down in a stack. Innovation takes place atop the foundational technology, which is now largely stable. We are at the start of the absorption phase of the Web, with innovations like the cloud being built atop the now-stable stack.

People tend to react positively to anything labeled a standard, but I often don't unless the standard is evolutionary and iterative. Software standards imposed too early don't reflect the preferences and experience users accumulate over time. They are often subordinated to the goals of the companies writing them rather than responsive to the needs of users, and that makes them questionable.

The right time to standardize is almost always during the maturity phase.

Someone has to start trying earlier, but things turn out best if efforts to standardize are frustrated by competitive innovations well into the evolution phase. I recall the efforts of a particular professional organization in the ‘90s to convince people not to use proprietary browser features. Its members often debated with me in online forums, claiming that I was being irresponsible for using browser-specific proprietary features and that I was harming the development of the Web by doing so. One such feature, introduced by Microsoft in 1996, was the iframe element, which later facilitated the development of the AJAX pattern. We'd live in a poorer world if the early standardizers had won.
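
For those who never saw it, the trick the iframe enabled looked roughly like the sketch below: a minimal rendition in modern TypeScript rather than the JavaScript of the era. The /fragment endpoint and the panel element are hypothetical, but the shape (load data into an invisible frame, then read it from the parent page without a full reload) is the one the AJAX pattern grew out of.

```typescript
// A minimal, hypothetical sketch of the hidden-iframe technique.
// Same-origin only: the parent page can read the hidden document's contents.
function fetchViaIframe(url: string, onLoad: (text: string) => void): void {
  const frame = document.createElement("iframe");
  frame.style.display = "none"; // keep the workhorse invisible
  frame.addEventListener("load", () => {
    const body = frame.contentDocument?.body;
    onLoad(body?.textContent ?? "");
    frame.remove(); // clean up the throwaway frame
  });
  frame.src = url;
  document.body.appendChild(frame);
}

// Usage: update part of the page in place, with no full navigation.
fetchViaIframe("/fragment", (text) => {
  const panel = document.getElementById("panel");
  if (panel) panel.textContent = text;
});
```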

I've been using the Web as an example of the innovation lifecycle in action because I think it is the most successful technological innovation of the last 20 years, and that's largely because it wasn't prematurely standardized. A counterexample is SOAP and the WS-* standards that proliferated from it.

XML-RPC was a neat little thing. It was easy to adopt, worked pretty well, and was a helpful mental bridge from RPC-style programming to REST. People got excited about it and started rolling out libraries for it in various languages. Good times. Then some big companies got involved and decided to build a big stack of speculatively defined standards...
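
To see why people got excited, here is a minimal sketch of an XML-RPC call in TypeScript. The endpoint and the demo.sayHello method are hypothetical, but the envelope is the real wire format: the whole protocol is one POST carrying one small XML document.

```typescript
// A minimal, hypothetical XML-RPC call: one POST, one small XML envelope.
async function xmlRpcCall(endpoint: string, method: string, param: string): Promise<string> {
  const payload =
    `<?xml version="1.0"?>` +
    `<methodCall>` +
    `<methodName>${method}</methodName>` +
    `<params><param><value><string>${param}</string></value></param></params>` +
    `</methodCall>`;
  const response = await fetch(endpoint, {
    method: "POST",
    headers: { "Content-Type": "text/xml" },
    body: payload,
  });
  return response.text(); // a <methodResponse> envelope, easy to parse by hand
}

// Usage (hypothetical server):
xmlRpcCall("https://example.com/RPC2", "demo.sayHello", "world").then(console.log);
```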

Pretty soon, what was a cool little technology that was useful and usable became a massive pile of complexity. Big parts of the vision died on the vine because they weren't the product of competition in the marketplace and user preference. UDDI registries were supposed to be markets for services that you could search and then use by downloading their WSDL. Money had been invested. People had been enthusiastic, including a younger and less skeptical me. But despite the absence of user-driven market adoption and evolution, the big guys writing the standards kept proliferating them. You won't be shocked to learn that they were also selling expensive tools that made dealing with the Byzantine mess of standards "easier". This ultimately produced a monolith of bad ideas, implemented poorly.

The world responded by largely rejecting the whole stack. (Some rejected it sooner than others; a few haven't yet.) We chose, instead, the competitive, evolutionary alternative in REST. UDDI was replaced by Google search and by businesses promoting their APIs through attractive user interfaces and creative tools. The world's largest SOA is expressed partly in APIs and largely in English and other human languages. The water is certainly warmer without the dominance of a premature standard designed from on high.
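
The contrast in ceremony is stark. Here is an equally minimal sketch, with a hypothetical URL: discovery happens through search and human-readable documentation, and the call itself is a single HTTP GET.

```typescript
// A hypothetical REST call: no registry, no WSDL, no generated stubs.
// You find the URL in the provider's docs and issue a plain HTTP GET.
async function findServices(query: string): Promise<unknown> {
  const res = await fetch(
    `https://api.example.com/v1/services?query=${encodeURIComponent(query)}`
  );
  return res.json(); // plain JSON, readable by humans and machines alike
}

findServices("payments").then((services) => console.log(services));
```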

Standardization is great, but only when the technology is mature. Right now, the cloud lives in a blend of the invention and evolution phases. This means we will endure the pain of non-standard interfaces for several years. It also means we get to participate in evolving de facto or sanctioned standards, which are better than anything a room full of people would speculatively mandate.

Embrace it all - this is the fun part! It's hard and you'll build and rebuild much of your stuff, but you'll do it to leverage powerful new features. You wouldn't be a developer if you didn't enjoy hard problems.

P.S. Sometimes non-standard features go terribly wrong...
