My last post looked to the past to explain how productivity was a key to Microsoft’s initial success, but that new business models and the ascent of the internet have changed that equation.
One simple way to illustrate this is to look at software development. Historically, software development has had its roots in what is good for the IT organization: is it cheaper, easier, and more manageable for technologists to do what they need to accomplish? This was pretty evident in the systems design and command-and-control structure of technology as it evolved from the 50s through the 80s.
The advent of the PC opened up a new opportunity. It made technology more accessible to folks outside the traditional groups that controlled it. In effect, non-technology-oriented folks in consumer and enterprise markets had a new say in how things could be done, and if they didn’t like what the IT organization was saying or doing, they could often take things into their own hands. This basically led to the birth of modern operating systems and productivity software as we know them.
But this phase held on to some of the patterns from the command-and-control days: de facto standards were held by a few, and interoperability was something people thought about begrudgingly or not at all. In fact, the business models of all the competitors of the day were based on this thinking. In short, the models worked—for a time.
But the internet caused a disruption to this way of thinking. In the enterprise, folks had grown frustrated with the established hegemony, and the Web opened up a new way of thinking. One, it made it easier to write applications that anyone could use. Two, it simplified distribution. But compromises were made around ease of use, and productivity often took a back seat to the utility that the internet provided.
In addition, as the utility of the internet spread to consumer-focused applications, a whole new generation of developers abandoned the traditional practices of application development—in fact, many were never exposed to classic development projects in the first place.
There were benefits to this for all of us. As consumers we could do things online or get access to data that previously might have been very difficult to obtain. It might be difficult to use these new tools, but difficult was better than impossible.
These new models made life easier for developers too; they were a step down the path of ‘write once, run everywhere.’
In the late 90s, however, cracks began to appear in this revolution. Being easier for developers did not necessarily mean cheaper. It wasn’t uncommon for large-scale early commerce sites to cost anywhere from 15 million to 100 million US dollars—amounts that almost defy description today. Much of this was because IT in the late 90s was similar to the automotive industry in the early part of the 20th century, when there were hundreds of car manufacturers competing for consumers’ business: there was no standardization and no interoperability, and this made things really hard.
Worse still is what happened when these cobbled-together systems actually started generating revenue. It became increasingly difficult to modify mission-critical applications that were so complex that every feature rollout was the IT equivalent of performing brain surgery, where a single mistake could cost millions in lost revenue.
Two things happened that pulled this revolution back from the brink and both of them really boil down to productivity.
One is that developers and software makers started focusing on standards—regardless of whether the solutions they created were ‘free’ or ‘open’. Web developers demanded it and used technology that was exclusively standards-based, and enterprises that required more than standards could provide at least wanted the flexibility to interoperate with those standards and other technology. It’s hard to find any standards-based or proprietary technology today that doesn’t think about interoperability.
The second is that frustration with the usability of technology came to the forefront. It wasn’t just about whether an application actually worked, but whether a person could actually figure out how to use it.
Both of these drivers have been a key part of the Web 2.0 phenomenon, with the popularity of AJAX and rich media experiences enabled by things like Flash, Silverlight, etc. It even extends to the power of client software, such as iTunes, WPF applications, AIR applications, and other solutions that take advantage of client hardware versus a browser.
I think the next wave of productivity is starting now, and it’s focused on productivity in application development and life-cycle management, and on total cost of investment and return on investment.
When we think about productivity in application development and life-cycle management, we are really talking about the concept of the ‘Inverted-T’, which can be defined as the repeatable best practices that we can apply to every project rather than re-inventing the wheel. For example, why build a content management system when you can buy one or save money implementing an open source option? Why spend money designing an architectural work pattern for a manufacturing facility if you can license one? Why build an authentication system if you can leverage one as a service? This type of thinking represents the horizontal part of the ‘T’. It allows us to go deep and focus the majority of our attention on the parts of our business that allow for differentiation and innovation.
We can also think about life-cycle management. For example, does our workflow allow asynchronous round-tripping of projects and assets between designers and developers? Very few workflows do this today; one does (I’ll be polite and not mention it by name). I suspect this type of workflow will become standard in many environments, and that in some agile Web processes traditional design tools will be eclipsed by tools that work in the target delivery medium (vectors versus bitmaps, for example, or HTML and CSS that don’t need to be factored from static visual designs).
The final dimension in the next wave of productivity is probably services. We already have a myriad of services available to us around commerce, community, identity, location, and search, but the next wave of services, commonly called ‘cloud’ services, is going to go far beyond that. In the future, knowledge management or email systems might make more sense for many companies if they exist outside the enterprise’s data center and in a cloud—much like small to mid-size businesses host many of their Web sites today. These types of moves will start letting developers in the enterprise and smaller entities focus on the core strategies that allow their business to innovate.
We’re starting to see signs of this already in the market. Look, for example, at a service-based site like Mint.com. Their ability to innovate and roll out new features far outstrips that of some of their more traditional peers.
Lesser-known examples might be ClickOnce applications that can silently and quickly introduce new features without requiring proactive user activity.
The next post in this topic will dive into services and show how, once we’ve taken advantage of the productivity gains that can come in software development and services, we’re ready to set the table for real breakthrough innovation.