Game On

It wasn’t so long ago that I wrote about the appearance of Linux – at least as a developer kit – on Sony’s PlayStation 2, and warned that “The Penguin on the Playstation” might be the start of the first real competition that Windows has seen in over a decade.

Now I read that gaming is becoming a serious datacentre application in a release that tells me “IBM's ultra dense blade servers will be the backbone of a Linux-based grid that will enable millions of users and developers to access Sony Computer Entertainment Inc.'s PlayStation 2 games on the Internet”.

This is all about the arrival of ‘Grid computing’, a way of creating a single supercomputer by linking multiple servers together. It’s ‘The Matrix’ in real life, and resources can be shuffled between tens or even hundreds of computers on demand.
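Conceptually, that “shuffling on demand” is just workload scheduling across a pool of machines. A minimal sketch of the idea might look like the snippet below – purely illustrative Python, with worker threads standing in for blade servers, and names like run_session and NODES invented for the example rather than taken from any real PlayStation or IBM interface:

```python
# Hypothetical sketch: a pool of "nodes" (worker threads standing in for
# grid servers) picks up game-session workloads as they arrive.
from concurrent.futures import ThreadPoolExecutor
import random
import time

NODES = 8  # stand-in for the number of servers in the grid


def run_session(session_id: int) -> str:
    """Pretend to host one game session on whichever node is free."""
    time.sleep(random.uniform(0.01, 0.05))  # simulated work
    return f"session {session_id} served"


with ThreadPoolExecutor(max_workers=NODES) as grid:
    results = list(grid.map(run_session, range(100)))

print(f"{len(results)} sessions handled by a pool of {NODES} nodes")
```

The real thing, of course, spreads the work across physical servers in different hosting centres rather than threads in one process, but the principle – capacity allocated to whoever needs it, wherever it happens to be – is the same.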

When Microsoft originally planned its X-Box, it imagined that broadband would be pervasive by the time it reached the market. This is one reason it made such a strong ‘play’ for the cable market in the States, with the idea that very soon people would be running games over the Internet from Microsoft’s own ‘pay-per-game’ servers, with thousands of linked-up X-Boxes involved. This didn’t quite happen according to plan, but the concept isn’t so far away any more.

Grid computing on this scale isn’t quite ‘plug and play’ just yet, and some very powerful middleware is required to make it all happen, but IBM will reportedly use dual-Xeon BladeCenter servers in its hosting centres to run Butterfly.net’s middleware on the grid. This illustrates what Microsoft first imagined, and what encouraged it into a head-on clash with Sony: the migration of gaming from the console to the Internet.

IBM isn’t, of course, a culture you might readily associate with ‘Tomb Raider’, but it has 175 global hosting centres ready to support the PlayStation 2 grid if the anticipated demand appears.

Is Linux really up to the task? I met the Open Group’s CEO, Alan Brown, last week and we discussed Linux and where it works best. I mentioned that IBM, unlike Sun Microsystems, suggests that Linux is increasingly scalable, and Brown remarked that, in his opinion, Linux – which has a great ‘TCO’ message – would only “scale to thirty-two boxes”; if you want anything more, you’re left with UNIX.

This leaves me thinking that the magic figure of ‘thirty-two’ isn’t quite on the scale of ‘The Matrix’, or even of 175 global hosting centres. So I’m left wondering what “Linux-based” really means, and whether IBM executives will lose their blue suits and ties and start carrying skateboards in future.
