
July 18, 2005

End of the road for invasive middleware?

Why is open source software popular, even among customers who own commercial application servers? I don't think cost is the factor, given that license fees are pennies compared with development budgets. I think a big factor is flexibility and consumability.

Let's take object-relational mappers. Application servers provide a CMP implementation for persistence. But, despite this, some customers use Hibernate. I think the main advantage of Hibernate over a built-in CMP engine is that when Gavin ships a new Hibernate release, customers can take advantage of it without an application server upgrade. When a vendor updates its CMP implementation, it's typically in a new release, which is a drag for people: they have to wait until their operations guys certify and upgrade the application servers. Upgrading to a new Hibernate is a much easier thing to do than upgrading the whole application server.

Caches. Tangosol is in a similar position. If a customer uses their cache, then when Tangosol releases a new version the customer can take advantage of it immediately, whereas a cache built in to an application server would typically require a version upgrade. Tangosol isn't open source or free, but it's attractive to customers for the same reason.

Linux is a great example of this. Linux is a collection of components: the kernel, bash, X, etc. I can usually install a new bash shell without upgrading the whole Linux installation. I really like the Linux RPM model: an installer that rarely changes, components in standard packages. The installer allows components to be upgraded, dragging in newer versions of dependent components if necessary, and it DOESN'T intrude on the programming model of the components themselves. They are still written the way they always have been. There are lessons here for any component technology trying to muscle in. This pattern works:

  • Stable installer that doesn't change very often.
  • RPMs for packaging components and specifying dependencies.
  • Components themselves largely independent of the install/packaging technology.
  • Components upgradable independently of the platform.
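The bullets above are exactly what the RPM workflow gives you in practice. An illustrative session (the package names and versions here are made up, not from the post):

```
$ rpm -q bash                    # what's installed right now?
bash-2.05b-41

$ rpm -U bash-3.0-17.i386.rpm    # upgrade just this one component
# rpm checks the package's declared dependencies and refuses (or
# pulls in) other packages as needed; the rest of the OS is untouched.

$ rpm -q --requires bash         # the dependency metadata itself
```

The component (bash) was written with no knowledge of RPM at all; the packaging is bolted on afterwards, which is the low-coupling point being made here.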

Looks like low coupling has some advantages after all... Middleware vendors should not be writing installer technology; they should be using the one the platform uses.

Middleware vendors need to observe this and start shipping middleware components that work on existing platforms, making sure new componentry is designed to work standalone, on older platforms, and on competitive platforms. This is pretty different from the way vendors ship product right now, but I think the success of open source components is due at least in part to factors like this.

I think new middleware that forces you to replace your existing environment, insists on being the bootstrap component for your JVM, or changes the way you write your components (i.e., couples them to the bootstrap) is not going to be as successful as a result. Coexistence is the key, and significant upgrades of these kinds of components (cache, persistence, etc.) should be possible on legacy platforms as well as usable in J2SE or on competitive platforms.

The common denominator is a J2SE 1.4 environment, as it's pervasive right now. What's pervasive wins. If customers don't have it already, then it's a tough sell to force it down people's throats, even if it's better technology. Windows versus OS/2 is a good example here...

Components that require JDK 1.5 will be stifled until the market moves to 1.5 en masse, which won't be for a while. The lesson here is to make sure the components work with what people have, while still being able to leverage improvements in the platform.
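One common way to get "works on what people have, leverages what's new" (a sketch of mine, not something from the post; the class name is made up) is to probe for the newer platform API by reflection and fall back when it isn't there:

```java
// Hypothetical component that compiles and runs against J2SE 1.4,
// but detects and prefers a 5.0-only API when the customer's JVM
// happens to have it. No hard prereq on the newer platform.
public class PlatformAdaptiveTimer {
    public static String chosenImplementation() {
        try {
            // java.util.concurrent only arrived in J2SE 5.0
            Class.forName("java.util.concurrent.ScheduledThreadPoolExecutor");
            return "java.util.concurrent";
        } catch (ClassNotFoundException notOnThisPlatform) {
            return "java.util.Timer"; // available since 1.3
        }
    }
}
```

The customer installs one jar; the component adapts to whatever JVM it lands on, which is exactly the opposite of forcing a platform upgrade.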

What we need is a common underlying platform to enable this. This platform cannot change very frequently, for obvious reasons; it needs to be very stable over long periods of time. The Linux scenario, Microsoft's Windows Update, or Perl's CPAN are good examples of this. Eclipse/OGSI is another example, but it's probably too invasive: I have to make my component into a plugin; it's not automatic or transparent, whereas the Linux example is. I can write a new program for Linux and then, as an afterthought, package it as an RPM and ship it. But Eclipse is almost a complete Java component environment: it has auto-update built in, an installer, and versioning. Maybe Eclipse should be the new J2SE environment, and OGSI bundles become the RPM of Java. But even if it's better technology, it faces an almost impossible task displacing the incumbent technology: POJOs, lightweight IoC containers, and so on, which are not standing still and continue to improve and serve 95% of people's needs.
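The "bundles as the RPM of Java" idea looks roughly like this in an OSGi bundle manifest (the names and version ranges below are illustrative, not from any real product): the jar carries its identity, its version, and its dependencies as declarative metadata, much like an RPM spec does.

```
Manifest-Version: 1.0
Bundle-SymbolicName: com.example.cache
Bundle-Version: 1.2.0
Export-Package: com.example.cache;version="1.2.0"
Import-Package: com.example.persistence;version="[1.0,2.0)"
```

The catch, as noted above, is that unlike an RPM this metadata isn't an afterthought: the component has to be built and loaded as a bundle for the container to honor it.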

We'd need Eclipse to become pervasive, or the de facto standard that everybody has and runs their client applications and application servers in, for this to really take off, and I'm not sure people are ready for that yet. So that leaves a J2SE environment, with the jar as the component packaging. Nice and simple: this technology is pervasive and requires no work or hassle on the part of customers; they already have everything they need. Linux and Windows are the same in this regard: everybody already has the environment. This J2SE mode leaves versioning up to the customer, but people seem to be happy doing that.

But whatever: this is the world these components live in, and they will have to work with this infrastructure (Spring, etc.) or people just won't use them. If a customer is using Spring to configure the application, then a cache or persistence layer had better work seamlessly with Spring and similar components, or else these days it's a big negative on adopting that component, however slick it is. Integrated configuration is still needed when customers don't have Spring, but when they do, the customer had better be able to configure the component with Spring natively. It has to fit in with what the customer is doing.
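What "works seamlessly with Spring" mostly amounts to is being a plain JavaBean. A minimal sketch (the SimpleCache class is hypothetical, and written against J2SE 1.4 per the argument above): default constructor, setters, and an explicit init hook mean Spring, any other IoC container, or plain Java code can all configure it the same way.

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Hypothetical LRU cache component written as a plain JavaBean:
// no container interfaces, no static bootstrap, J2SE 1.4 compatible.
public class SimpleCache {
    private int maxEntries = 1000; // default, overridable by any container
    private Map entries;

    public void setMaxEntries(int maxEntries) {
        this.maxEntries = maxEntries;
    }

    // Wire-up hook: Spring can invoke this via init-method="init",
    // and plain Java code can just call it directly.
    public void init() {
        // access-ordered map that evicts the least recently used entry
        entries = new LinkedHashMap(16, 0.75f, true) {
            protected boolean removeEldestEntry(Map.Entry eldest) {
                return size() > maxEntries;
            }
        };
    }

    public void put(Object key, Object value) { entries.put(key, value); }
    public Object get(Object key) { return entries.get(key); }
}
```

In Spring this is just a `<bean class="SimpleCache" init-method="init">` with a `maxEntries` property; without Spring, `new SimpleCache()` plus a setter call does the same job. Nothing in the component knows which one is happening.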

It's all about consumability, and how we can remove the barriers that stop customers consuming the improvements in middleware technology. I think we need to rethink how consumable middleware components really are with this in mind, and start designing for this market reality. Monolithic products need to change in the face of the competition here, or else customers will just use the most consumable technology, as they have already started to do.


Comments

I think you will find that the attitude of integration architects and developers also needs to change. Currently most large corporations focus on using 'monolithic' solutions as a safety blanket for an ugly problem (integration). There is no use having flexible solution stacks if they are not used.
A lot of developers and architects struggle to get their heads around using non-proprietary WSDLs, XML Schemas and other openly defined components that can be used to provide reusable frameworks within an integration solution.
There is nothing wrong with proprietary technology if it is implemented in a flexible and modular way which maximises reuse. The sad fact is that across the world it appears that, on a project-by-project basis, there is a lack of good design in integration. Projects can sit inside company-internal silos but integration cannot. Good integration design and implementation takes effort.
My opinion is that OpenJMS and other projects already provide such open source solutions; the biggest problem is that of mindset. More components would of course help though.

Posted by: Andrew Pym | Jul 20, 2005 12:04:24 AM

A very good point, Billy. I agree with you totally. Big +1!

Posted by: Trustin Lee | Jul 20, 2005 8:35:09 PM

You make a good point, but I feel "invasive middleware" is at an end for another reason. The middleware "container" model was sold on the promise of "re-usable" system components.

The vendors built it, and we bought into it, but the promise turned out to be a false one. Instead what seems to work is judicious use of the programming language. Java allows you to extend the language into a platform by using standard APIs like JDBC and the Servlet API.

So specific service APIs work, but monolithic middleware doesn't. For me the lesson learned is don't buy into complexity, build the simplest thing that can possibly work for you.

Posted by: Paul Beckford | Jul 22, 2005 4:17:08 PM

I agree also. The trick now is producing middleware components that scale in capability: easy to use for simple tasks, but usable in more advanced ways without detracting from the simple ones.

Posted by: Billy | Jul 23, 2005 3:02:52 PM

Nice article, but I think it is OSGI and not OGSI.

Posted by: Georges Goebel | Jul 27, 2005 1:16:57 AM

i think this consumability/modularity/interface approach is essential for SOA. The concepts are closely tied. Why would we develop service-oriented apps that could only be deployed to gorpy monoliths with tons of interdependencies? i am quite surprised the Java crowd haven't gone after MS on this one.

Posted by: james governor | Jul 27, 2005 4:16:02 AM

When plugging any component into a platform (operating system, middleware, ...), making it work is one thing. In that case you may not need lots of platform infrastructure services; it is simple, and should be very pluggable, independent of platform version.

But component scalability/manageability is another topic entirely. In most cases, components need to rely on platform API contracts to be scaled and managed by the platform. This adds a dependency on the platform, especially if the platform API is not stable and keeps changing, as with J2EE.

Middleware will be less intrusive when it becomes mature and has a longer upgrade cycle, just like an OS.

Posted by: peter xu | Jul 27, 2005 3:25:09 PM
