
Among the Believers

My last column on Open Source, "The Empire strikes back", provoked the inevitable remarks that I've come to expect every time I pop my head above the parapet, take a deep breath and attempt an objective look at the progress of Open Source Software (OSS), something I've been doing since I was one of the first columnists to take an interest in 'penguin' evolution.

Once again, I expect to be shot down in flames by the Open Source groupies and I sometimes wonder if the author Salman Rushdie gives an occasional sigh of relief that he wrote the ‘Satanic Verses’ and not a column on Linux.

OSS enthusiasts may be pleased to hear that, following an excellent and well-attended Westminster presentation by my colleague Dr Mohammed Al-Ubaydli of the Conservative Technology Forum, entitled "Open Source software for government", I'm involved in a project to explore the practical application of OSS in delivering integrated and workable public sector solutions; in particular, using bottom-up OSS in the Health Service to help "rescue" a still shaky-looking National Programme for IT (NPfIT).

It's now 2006, over seven years since I first started writing on this subject, and what continues to worry me is that objectivity in the debate is still blurred by the kind of emotional attachment to one side of the argument or the other that is more commonly found in hardened football supporters.

From the evidence around us, 2006 should (once again) be the year the OSS argument achieves critical mass and credibility, supported by the evidence of its success across the entire IT spectrum. For this to happen, its more vocal proponents need to leave the emotional baggage of "Microsoft bad, OSS good" behind and concentrate on resolving the central themes of the OSS case in a way that satisfies not just cynical columnists like me but the big enterprise and government customers who will make a real difference to its future as a sensible and commercially sound technology alternative.

This means recognising the fundamental differences between the Linux and Windows models and accepting that, while there are advantages in the OSS componentisation model, there are a number of valid potential customer objections, expressed in the complexity, management and cost of maintaining an OSS environment, which vary for each installation. For medium-sized organisations at least, OSS still presents many challenges, which most frequently involve a poorer grasp of OSS benefits, resource constraints and the worry over what is perceived as a brutal cost penalty incurred in any move away from a Windows environment.

It was Winston Churchill who said, "If we open a quarrel between past and present, we shall find that we have lost the future", a sentiment that can just as easily be applied to the TCO/ROI debate, which Microsoft and IBM have between them used so successfully to muddy the OSS argument that it has become very hard to establish not just where the truth is, but what it actually is.

A great deal of time and expensive consultancy has been focused on whether to use TCO, the all-inclusive cost of running specific workloads, or ROI when evaluating OSS; invariably with the latter as a rather abstract metric. Both are less a function of the underlying operating system than of the applications and services the servers support. Server choices are dictated by the application, and TCO in particular doesn't properly examine the real savings and flexibility that one server platform or another may offer.

These are just a few of the issues that need to be clearly resolved in 2006 if OSS is going to make real headway. It's no longer a question of a seismic event which sees Microsoft going the way of the dinosaurs; instead, it's visibly becoming an evolutionary process which supports an uneasy co-existence and recognises the arrival of OSS as a major software technology. The only real question in my mind is how long it will take Linux to pass 30 per cent, what I see as the critical server market share figure, the same question I asked in 1999. So I'll leave the last word to Winston Churchill, and why not: "I am an optimist. It does not seem too much use being anything else."


simon moores 2 said…
I have two comments here which you may find useful this time!

1. Your general discussion around OSS fails to distinguish between clients and servers. Like it or not, those of us who mistakenly followed the client/server computing model at great expense to our employers have realised that there are billions of clients and very few "servers" (that we actually talk to). The problem with OSS for clients, as you rightly identified, is the millions of permutations in configuration, which makes support far too costly for medium and large sized enterprises. Consequently, the MS model is appropriate. For small organisations, the reliance upon a single source of expertise is a lesson learnt through experience. My view is that Linux on the desktop will remain a niche solution.

Now for servers, it's a "different kettle of fish". Whilst many application vendors offer choices of server platform, experience leads us to select the platform the application developers produce the code on. Why? Because when bugs are uncovered you quickly learn that your business can be seriously impacted by the application vendor's ability to resolve the issue, and if you choose to buy a configuration they are not familiar with, then time is not on your side and money goes down the drain. So here, I'd advocate that the adoption of OSS will be driven by the application vendors.

2. My second comment relates to TCO/ROI etc., and of course what drives change. As you say, the cost of change is often hugely expensive and must be factored into any programme. Back in the late '80s I was visited by the IT manager from our India office, who was touting the idea of moving software development to his team. When asked about cost savings, he was offering between 30% and 40% on what it cost us to do in the UK. He was rather shocked when I informed him that it wasn't enough, and that I wouldn't consider anything that wasn't at least a ten-fold reduction. Where did I get that "rule of 10" from? It's simple. Just look at the technology changes we've all experienced to realise what it takes to make a change. CD-ROMs to DVDs (650MB to 4.7GB) or 10/100/1000 Ethernet (I bet you were a Token Ring man, Simon!) are very simple examples.
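The "rule of 10" arithmetic above is easy to check; a minimal sketch, using only the capacity and speed figures quoted in the comment:

```python
# Generational shifts that won wide adoption offered close to an
# order-of-magnitude gain; the figures are those quoted above.
transitions = {
    "CD-ROM -> DVD (MB)": (650, 4700),
    "10 -> 100 Mbit Ethernet": (10, 100),
    "100 -> 1000 Mbit Ethernet": (100, 1000),
}

for name, (old, new) in transitions.items():
    print(f"{name}: {new / old:.1f}x improvement")

# By contrast, a 30-40% cost saving is only about a 1.4x-1.7x
# improvement, well short of the ten-fold threshold argued for here.
```

(Strictly, CD-ROM to DVD is about a 7x jump rather than 10x, which is why "or thereabouts" belongs in the rule.)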

So my bottom line is that OSS needs to show a ten-fold (or thereabouts) improvement (in cost, performance, complexity/simplicity) to gain wide-scale adoption. Adoption = education, and education is pushing out mostly Microsoft users with a sprinkling of Mac OS X.
DrMoores said…
Thank you. An excellent analysis, and you are quite right, I should have said more about the separation between client and server. In mitigation, I was trying to pack it all into a tight space for Silicon.com. I quite agree with your view of Linux being niche on the client.
