
Short Story

A short cybercrime story on Al Jazeera TV on Sunday reminded me that the Infosec show in London had passed me by almost unnoticed this year. I had been speaking at the IDC conference in Milan and missed my annual pilgrimage to the great security bazaar at London’s Olympia.

I can’t honestly remember how long Infosec has been running, but an observer from another planet might be forgiven for asking why, after all this time, the security industry and government between them have failed to deliver any sure-fire solutions to a global problem, beyond throwing ever larger amounts of money at it.

In time for Infosec 2003, I wrote a Microsoft-sponsored report entitled ‘A Matter of Trust’, which as I’m sure you will guess, focused on the company’s ‘Trustworthy Computing’ initiative. Five years further into the search for this elusive digital equivalent of the Holy Grail, I’m reminded of one of my comments:

“This problem brings us to where we are today, at the beginning of 2003, looking back at a disastrous record of security incidents and exploits and wondering how long it will be before any new approach to the challenge of 'Trusted Computing' can inspire real confidence from those at most risk from the technology.”

Back in the distant past of 2003 the threats were different: viruses, spam, hackers, ‘SQL Slammer’ and so on. The broadband society, botnets and the Russian Business Network were still unimagined dangers, and yet billions of dollars were already being spent by individuals and companies to maintain the comforting illusion of security, or at least to mitigate its more immediate and damaging risks. Today, I look at my slide deck from Milan and see that entire Internet Relay Chat (IRC) networks are controlled by the criminal underground economy, that cybercrime could be almost as big as the global illegal drug trade – nobody really knows – and that as many as one billion personal computers – 12% of the world’s total internet-connected machines – could be hiding malware of one type or another.

Meanwhile, the burgeoning information security business soldiers on, threatening to overtake the Chinese army as the largest employer on earth, while having to admit that traditional anti-virus scanning approaches are no longer able to keep pace with the growth in malware, increasingly purpose-designed by sophisticated criminal gangs with product packers to defeat anti-virus signature detection.

Most recently, I was passed a copy of an FBI report to the US Congress from 2004, the last declassified year. The report notes that “56 million cyber events took place in the first six months of 2004, up from 500,000 events in 2002” and that “1/5 of suspicious incidents were committed by ‘foreign state actors’” in the same year.

Bearing in mind that four years is a very long time in internet terms, and that we know from our own experience how rapidly the many different threats and attack platforms have evolved, you’ll understand why I discovered in Italy that they are rather unhappy about a series of recent exploits, which they believe are targeting their large companies for espionage, extortion or simply the theft of trade secrets.

It’s only reasonable to assume that in another five years ‘Infosec 2013’ will still be at Olympia, but I struggle to imagine how much worse the problems it exists to solve can become. Winston Churchill once said: “Although personally I am quite content with existing explosives, I feel we must not stand in the path of improvement.” Unlike the Second World War, however, this is a struggle that we are losing, with no end in sight and an IT security industry arguably experiencing an expensive form of denial.

In 2003, I wrote: “The last twelve months have witnessed a worrying escalation in the number of vulnerabilities, which can lead to Internet-based attacks on organizations and the compromise of their information infrastructure.” So what’s really changed for the better since then? Perhaps I need to wait a little longer, for 2013 and the promised, long-awaited arrival of really trustworthy computing, to find out.
