A Visit to the Palladium

With the first anniversary of 9/11 almost upon us, I’ve been thinking about security and wondering whether much has really changed, at least in IT, over the last year. Airlines have, of course, tightened up their security, and if you’re foolish enough to produce a pair of nail clippers on a transatlantic flight, you’re looking at a possible life sentence, and quite right too. But quietly hack into a bank or wreak havoc with a new virus for Windows and the chances are that you’ll get away with it.

Microsoft might have discovered ‘Trustworthy Computing’, but we are unlikely to have it until 2004, when Longhorn, the next version of Windows XP, and Palladium, Microsoft’s new technology of trust, converge in a single product.
Palladium is fundamentally a secure and encrypted run-time environment for Windows and ‘other’ operating systems, one that allows the next generation of software applications and services to protect the user from every conceivable nasty that can be thrown at a PC. You name it: invasion of privacy, hacking, spam, viruses and more clever stuff than I have room to write about. Palladium will have the answer through its use of specially designed hardware security chips and microprocessors from Intel and AMD.

Palladium is not a separate operating system. It is based on architectural enhancements to the Windows kernel and to computer hardware, including the CPU, peripherals and chipsets, to create a new trusted execution subsystem. Palladium-based PCs will still look and feel much the same and will work happily with legacy operating systems and applications. Most importantly, though, unique ‘Palladium-aware’ applications and services will offer a range of features and functionality not found elsewhere, and one day, if Microsoft has its way, all software will be Palladium-based software.

It is important to grasp that while today's applications and devices will continue to work in "Palladium," they will gain little to no benefit from "Palladium" services. To take advantage of "Palladium," existing applications must be adapted to utilize the "Palladium" environment or new applications must be written. This software - whether it is a single component of a Microsoft Win32-based application or a new application - is called a "Trusted Agent."
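To make the “Trusted Agent” idea a little more concrete, here is a purely illustrative sketch. Palladium’s actual sealed storage and attestation live in hardware and an API Microsoft has not yet published, so every name below (`measure`, `SealedStore`, `seal`, `unseal`) is hypothetical; the sketch only models the underlying principle that a secret is bound to a measurement of the code allowed to read it, so a tampered copy of that code cannot get the secret back:

```python
import hashlib

# Conceptual model only. Real Palladium seals secrets inside the security
# chip; here we imitate the idea in plain Python. A secret is stored
# together with a hash ("measurement") of the trusted agent's code, and
# unsealing is refused if the code presented later has been modified.

def measure(code: bytes) -> str:
    """Return a measurement (digest) of a piece of agent code."""
    return hashlib.sha256(code).hexdigest()

class SealedStore:
    """Toy stand-in for the hardware chip's sealed storage."""

    def __init__(self):
        self._vault = {}

    def seal(self, secret: str, trusted_measurement: str) -> int:
        """Store a secret bound to the measurement of trusted code."""
        handle = len(self._vault)
        self._vault[handle] = (secret, trusted_measurement)
        return handle

    def unseal(self, handle: int, current_code: bytes) -> str:
        """Release the secret only if the requesting code is unmodified."""
        secret, expected = self._vault[handle]
        if measure(current_code) != expected:
            raise PermissionError("code measurement mismatch: refusing to unseal")
        return secret

# A "trusted agent" seals a key; an infected copy of it is turned away.
agent_code = b"def read_mail(): ..."
store = SealedStore()
handle = store.seal("mailbox-encryption-key", measure(agent_code))

print(store.unseal(handle, agent_code))  # the genuine agent succeeds
```

This is the same logic the antivirus example quoted below relies on: because the measurement is taken over the code itself, a virus that patches the agent changes its measurement and locks itself out of the sealed secrets.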

“That’s nice,” I hear you say. “No more offers of Herbal Viagra from Hotmail and an end to the constant fear of viruses.” Well, if it sounds too good to be true it probably is, and in Microsoft’s words:

“Palladium provides a trusted processing environment. Trusted code runs in memory that is physically isolated, protected, and inaccessible to the rest of the system, making it inherently impervious to viruses, spy-ware, or other software attacks”.
The company continues by saying: “With respect to viruses, the contribution from Palladium is fairly straightforward. Since Palladium does not interfere with the operation of any program running in the regular Windows environment, everything, including the native OS and viruses, runs there as it does today. So antivirus monitoring and detection software in Windows will still be needed. However, Palladium does provide antivirus software with a secure execution environment that cannot be corrupted by infected code, so an antivirus program built on top of a Palladium application could guarantee that it hasn't been corrupted. This grounding of the antivirus software allows it to bootstrap itself into a guaranteed execution state, something it can't do today”.

This sounds reassuring, but you’d be forgiven if, like me, you find yourself struggling to really understand Palladium. Even Microsoft might admit that these are early days: Palladium is still very much a blueprint, and the finer details remain a little vague and open to consultation.

But wait a minute, won’t Palladium extend Microsoft’s existing stranglehold on the software industry even further? This is, after all, a “Windows-only” technology. I asked Stuart Okin, Chief Security Officer for Microsoft UK, if this was the case and, as you might expect, his answer was “Certainly not. Palladium actually raises the bar in operating systems and should be portable to other environments, under licence”, even Linux, if an application is written to take advantage of the Palladium environment. But to be honest, I’m not entirely clear how this might work in practice, and Stuart Okin struggled to find a suitable metaphor to describe the process. After all, this is still Windows, even if it represents a ‘trusted space’ within the Windows operating system.

I asked Eddie Bleasdale of Net Project, the UK’s leading independent Linux applications expert, and he commented:
“Microsoft and trusted computing are oxymorons, but if MS is going to bring out a trusted OS it has to simultaneously change its file structure and get rid of macros and a whole number of APIs which aren’t secure, such as ActiveX. This represents a complete discontinuity, identical to that which IBM went through with OS/2, when it failed to grab market share because of that discontinuity. Microsoft is coming up with a secure operating system fifteen years too late.”

The prospect of a secure and trusted environment is good for us and bad news for the security and content-management companies. It also “stops the rot” from Microsoft’s point of view, removing one of the largest objections to using Microsoft technology in the first place: its miserable record on security. Now Microsoft seems keen to share the Palladium technology with other operating-system vendors, and you’ll recall that the last thing Microsoft shared freely with everyone was Internet Explorer, just as Netscape was becoming a universal metaphor for the browser. Call me suspicious or even paranoid, but where’s the catch? Answers on a postcard or by email, please!

