A Matter of Trust

I’m going to present you with an argument. You may not agree with it, but I believe it’s time to take a fresh look at the ideological battle raging between two well-entrenched armies, the Open Source community and Microsoft.

Most of us have at one time read ‘Gulliver’s Travels’. A few of us may also remember why the Lilliputians were at war, and had been for as long as anyone could remember. It was the result of an argument over which end of a boiled egg should be opened and eaten first, the big end or the small end?

For over three years, I have been closely watching the progress of the Open Source software movement and Microsoft’s reaction to its challenge. Most of the time, it’s like watching two large dogs barking at each other on opposite sides of the same fence. There’s a great deal of noise but very little in the way of useful, sensible debate, the kind of thing which allows an IT Director to consider the many different arguments in a single, well-reasoned package and act accordingly.

You see, what has happened is that, like the story in Gulliver’s Travels, the Open Source vs. proprietary software debate has descended into the realms of emotional and political argument; Tux the cuddly Linux penguin vs. the great Seattle Satan.

But this is far from being a black and white argument and much of the time the different camps have an “apples and oranges” approach to the argument, in much the same manner as the Americans insist on discussing Iraq while the Arab World would rather resolve the Palestinian problem first.

What are the facts? Is this really as simple as a strategic choice between Microsoft and Linux? One of the two presents a greater security risk to business, costs more money in the long term and, from a developer perspective, is environmentally unfriendly, but which one? Do you know? I mean, do you really know, or are you, like me, caught in the middle of a propaganda battle which has left the truth scattered in a muddy no-man’s land between two colliding ideologies?

The existence of the Open Source debate is good for the industry and it’s good for business, but it needs to be just that, an intelligent debate and not the playground slanging match, the constant low-intensity warfare that leaves business wondering whether last month’s security alerts are a defining metric in an argument which is increasingly losing its way.

If 2002 started with a demand for Trustworthy Computing, then perhaps 2003 should be the year that introduces the concept of intelligent computing, or at the very least, intelligent and well-reasoned argument?

From a computing perspective, it’s what lies at the middle of the egg which is important and how you arrive there is simply a matter of personal preference.

