Interoperability – The Second Front Opens

Often tired and frequently oversold, the struggle between the software industry’s giants, otherwise known as the Open Source/Linux debate, has, over the summer, opened a second front: interoperability, injecting more life into an argument of increasing strategic importance.

For a while now, the case for and against the introduction of Open Source/Linux solutions has centred on fundamental questions of reliability, security and Total Cost of Ownership (TCO).

A search of the analyst reports on any of the vendor websites fails to turn up a ‘silver bullet’ argument; what one finds mostly concerns the question of why one side’s TCO benefit is greater than the other’s. After all, TCO and ROI (return on investment) are less a function of the underlying operating system than of the applications and services that support the server platforms. Furthermore, it is argued that TCO doesn’t examine the underlying flexibility of a technology and, with it, any potential savings that accompany its implementation.
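
To make the point concrete, here is a deliberately simple, back-of-the-envelope sketch in Python. All figures and cost categories are hypothetical and invented purely for illustration, not drawn from any analyst report: when applications, support and staff dominate the bill, the operating system licence moves the five-year total by only a few per cent.

```python
# Hypothetical five-year cost breakdown for a single server estate.
# Every figure below is illustrative only.
def tco(os_licence, app_licences, support, staff, migration=0):
    """Sum the major cost categories into one five-year total."""
    return os_licence + app_licences + support + staff + migration

proprietary = tco(os_licence=3_000, app_licences=40_000, support=25_000, staff=120_000)
open_source = tco(os_licence=0, app_licences=40_000, support=28_000, staff=120_000, migration=5_000)

# The OS licence is a small slice of either total, so the comparison
# turns on applications, support and people rather than the OS itself.
print(proprietary, open_source)
```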

If the security and TCO argument isn’t conclusive, or appears to be moving in a new direction, then we have to start looking elsewhere if we’re going to have a better opportunity to “get the facts” or, just as importantly, judge how apparently opposing technologies can cost-effectively and productively co-exist if and when peace finally breaks out.

With the three main arguments of the Open Source/Linux offensive bogged down in the detail, where do we go next for some decisive action? In my view, it’s toward the promise of interoperability and being clear about the true cost of acquisition; more importantly, understanding why these are important questions which shouldn’t be neglected as a strategic consideration in the broader technology decision-making process.

With all of us now living in the same moment in an increasingly connected age, interoperability is simply the ability of different IT networks, applications, or components to exchange and use information, i.e., to “talk” to each other. This goal can be achieved through the development of software that is “interoperable by design” (e.g., inclusion of XML technology in software to facilitate the easy exchange of data across different applications), by licensing/cross-licensing of proprietary technologies and essential intellectual property, through collaboration with partners, competitors and customers, and through the implementation of industry standards (including open standards and broadly accessible proprietary standards) in products and services.
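
As a purely illustrative sketch of “interoperable by design”, the snippet below (my own example, not taken from any vendor’s product) shows two hypothetical applications exchanging a record as XML using Python’s standard library; the element and attribute names are invented for illustration.

```python
import xml.etree.ElementTree as ET

# Application A serialises an order record to XML, a format any
# platform (Windows, Linux, mainframe) can parse without knowing
# anything about the producing application's internals.
order = ET.Element("order", id="12345")
ET.SubElement(order, "customer").text = "Acme Ltd"
ET.SubElement(order, "total", currency="GBP").text = "249.99"
xml_payload = ET.tostring(order, encoding="unicode")

# Application B, written independently and possibly running on a
# different operating system, reads the same payload back.
received = ET.fromstring(xml_payload)
print(received.get("id"),
      received.find("customer").text,
      received.find("total").text,
      received.find("total").get("currency"))
```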

Whether you happen to be Microsoft or even IBM or Novell, this makes sense, as you may have read in an earlier Silicon story which mentioned the interoperability benefits of JBoss working with Windows. After all, in a future which operates from behind a cloud of Web and software services, a liberal ‘multicultural’ approach to software interoperability offers considerable benefits to all the players involved.

Where it may all go badly wrong, however, is when the interoperability that business and the public sector desire is ‘cobbled together’ or poorly engineered, because one solution or licensing model is mandated to the exclusion of another which may offer much greater flexibility and lower costs over time.

This is the main substance of the ‘second front’ argument, which is likely to occupy a great deal of editorial space over the next twelve months. Have business and, in particular, the public sector started along a path which, though recognising the benefits of interoperability, pays only lip-service to the sibling value of flexibility by promoting one technology over another and selecting a narrow, ‘single-source’ route for the delivery of interoperable products?

A catalogue of high-profile public sector project failures over the last twelve months may encourage everyone involved in planning large IT projects to think more pragmatically about how software and services can cooperate and be truly interoperable in a Rubik’s cube world of emerging complexity. In looking to a future in which both Microsoft and Open Source/Linux co-exist as facts of life, we might be better advised to explore the intrinsic flexibility and value-for-money faces of different solutions. The alternative is to continue along a narrow path which increasingly specifies ‘Open solutions’ that, in principle, deliver the opposite of what “Open” and interoperable should mean in the first place, by simply imposing one set of arguments or (proprietary/open) standards and technologies over another.

Or to quote Winston Churchill: “If we open a quarrel between past and present, we shall find that we have lost the future.”
