Between the Forest & The Trees

"He uses statistics as a drunken man uses lampposts--for support rather than for illumination". Andre Lang

Even Tony Blair would find it hard to argue against a 75% disapproval rating. A Forrester report, ‘Can Microsoft Be Secure?’, reveals that 77% of IT managers in companies with a $1 billion turnover list security as their principal concern and remain to be convinced by Microsoft’s ‘Trustworthy Computing’ security message.

In fact, the overall view of Microsoft’s progress isn’t as negative as one might suspect from the banner headlines. I’m presently putting the finishing touches to a thirty-five-page investigation of my own into ‘Trustworthy Computing’, and many of Forrester’s conclusions mirror what I have found elsewhere in taking a critical look at what Microsoft is attempting to achieve. Forrester, like Aberdeen Research and Symantec’s most recent ‘Internet Threat Report’, broadly supports the argument that Microsoft has been unfairly criticised for its efforts around ‘Trustworthy Computing’ and that its record is "both better and more complicated than conventional wisdom suggests".

Regardless of present concerns surrounding the overall security of the Windows platform, 89% of the Forrester respondents rely on Microsoft’s software for sensitive applications. The research shows, however, that "too few firms are taking responsibility for securing their Windows systems", with 40% of respondents not planning to make security improvements and only 59% of those who admitted to a past security breach making changes to the way they use Microsoft software. That leaves more than 40% of breached companies who have clearly not learned from previous experience.

My own research suggests that a significant problem lies in the absence of true standardisation across platforms and security solutions. As the many thousands of published vulnerabilities illustrate, not having installed the very latest version of a software application, a Service Pack or a patch leaves a system open to compromise, and even the most up-to-date infrastructure has no certainty of immunity unless it is isolated and physically locked down. Forrester recommends that Microsoft work more closely with independent software vendors (ISVs) to make sure that its patches do not break their applications.

Forrester notes, as has Computer Weekly, that designing a patch that won't crash an operating system is one thing, but designing one that is also safe for the applications running on top of it is another. The report also points out that while Microsoft's patches for the last nine high-profile Windows security holes predated the attacks that exploited them by an average of 305 days, too few customers applied the fixes, because "administrators lacked both the confidence that a patch won't bring down a production system and the tools and time to validate Microsoft's avalanche of patches".

In the end, it’s all a matter of confidence building, and clearly that confidence doesn’t yet exist among IT managers worried by information security, or the lack of it. Microsoft argues that it is constantly improving, and the evidence supports this, but it is confronted with two very real problems. The first is the high percentage of customers who still fail to remediate their systems in the light of previous experience; the second is the "time and tools" objection, which holds that the regular patching process represents a significant cost and source of business interruption for Microsoft’s customers.

In a perfect world, all software would interoperate perfectly; everyone would work within a BS/ISO 17799 security framework and would be running the most up-to-date products. If it were needed, patching would be seamless and would not involve any business interruption.

Unfortunately, our world is rather different, and the consistency that both confidence and security demand simply doesn’t exist. This makes Microsoft’s task difficult: without stronger support from customers and ISVs, it can only hope to steadily improve the present situation rather than solve the many different security challenges facing the company this side of 2010.
