In Memoriam

I have been reading a White Paper from Kingston Technology on cost of ownership and Server memory, and the results are interesting enough to share.



Most businesses are trying to find ways of saving money on IT spend and driving down cost of ownership. Invariably, an expense that can’t be ignored is the Server upgrade. Many companies buy Servers with dual-processor capability, but only a relatively small percentage of owners actually choose to add a second CPU or more memory rather than buy a completely new Server when circumstances demand greater performance. Therefore, when a new Operating System, such as Windows 2003, comes along, there’s a tendency among many businesses, encouraged by the leading hardware vendors, to replace an existing Server with a shiny new one.

One objective of Kingston’s research, conducted by Mindcraft Labs, was to establish a ‘test-point’ which could help customers determine the best and most cost-effective configuration for their Servers, and this revealed two very interesting facts.

In tests with Windows 2000, Linux and Solaris, adding more memory increased Server performance by as much as 50% on a Web Server, and extra memory, rather than a second CPU, commonly represents the more cost-effective solution to the challenge of improving Web Server and DBMS Server performance across operating platforms. As an example, quadrupling the memory in a Windows 2000 Server from 512MB to 2GB is 40% more effective than simply adding a second processor, and the performance improvement could be as high as 1000% on dual-processor Solaris DBMS systems with the memory expanded from 512MB to 4GB.
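To make the ‘more cost-effective’ point concrete, here is a back-of-the-envelope sketch of how one might compare the two upgrade paths by performance gained per pound spent. Only the 50% memory figure comes from the tests quoted above; the prices and the second-CPU gain in the snippet are purely hypothetical assumptions for illustration, not figures from the Kingston/Mindcraft research.

```python
# Back-of-the-envelope comparison of upgrade options by performance gain per pound.
# All prices (and the second-CPU gain) are hypothetical assumptions for illustration;
# they are NOT figures from the Kingston/Mindcraft research.

def gain_per_pound(gain_pct: float, cost_gbp: float) -> float:
    """Percentage points of performance improvement bought per pound spent."""
    return gain_pct / cost_gbp

options = [
    ("512MB -> 2GB memory upgrade", 50.0, 400.0),   # 50% gain quoted above; price assumed
    ("Second processor",            35.0, 900.0),   # gain and price both assumed
]

for name, gain, cost in options:
    print(f"{name:30s}  {gain:5.1f}% gain for about £{cost:.0f}"
          f"  ->  {gain_per_pound(gain, cost):.3f} % per £")
```

With these placeholder numbers the memory upgrade delivers roughly three times the improvement per pound; swap in your own vendor quotes and benchmark results to see how the comparison falls for a particular Server.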

The results clearly indicate that, as a rule of thumb, adding memory first offers the best solution until an application’s need for memory is completely satisfied. Once that point has been reached, adding further processors is the logical next step, but one should always ensure that a system is making full use of its memory before buying more CPUs; a rough sketch of such a check follows.
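On a Linux box, one quick way to judge whether a Server is memory-bound before reaching for a second processor is to look at swap activity and effectively free memory. This is only a rough sketch reading the standard /proc interfaces; the thresholds are arbitrary illustrative assumptions, not figures from the research.

```python
#!/usr/bin/env python3
# Rough check: is this Linux Server memory-bound before we consider a CPU upgrade?
# Thresholds below are arbitrary illustrative assumptions.

def read_proc_kv(path):
    """Parse simple 'key: value' files such as /proc/meminfo and /proc/vmstat."""
    values = {}
    with open(path) as f:
        for line in f:
            parts = line.replace(":", " ").split()
            if len(parts) >= 2 and parts[1].isdigit():
                values[parts[0]] = int(parts[1])  # meminfo values are in kB, vmstat in pages
    return values

meminfo = read_proc_kv("/proc/meminfo")
vmstat = read_proc_kv("/proc/vmstat")

total_kb = meminfo.get("MemTotal", 0)
free_kb = meminfo.get("MemFree", 0) + meminfo.get("Buffers", 0) + meminfo.get("Cached", 0)
swapped_out = vmstat.get("pswpout", 0)  # pages swapped out since boot

free_pct = 100.0 * free_kb / total_kb if total_kb else 0.0
print(f"Effectively free memory: {free_pct:.1f}% | pages swapped out since boot: {swapped_out}")

# Heavy swapping or very little free memory suggests adding RAM before adding a CPU.
if swapped_out > 100_000 or free_pct < 10:
    print("Looks memory-bound: consider a memory upgrade first.")
else:
    print("Memory looks adequate: a CPU upgrade may be the sensible next step.")
```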

The research also argues against throwing out older Servers when new applications or Operating Systems appear. In a great many cases, a cheap memory upgrade is enough to maintain the status quo, or even to improve performance significantly over what it had been in the past. In some respects, it’s rather like buying a new TV because you want to watch BBC Four, when in fact all one needs is an equipment upgrade.

So unless your business really wants to spend its dwindling IT budget on new boxes, it’s worth remembering that memory is the key factor to consider in respect of application performance. Most of us don’t have enough of it, and a little extra can, as the research illustrates, make a significant impact on Web Server or Database Server performance.

The research can be found at http://www.kingston.com/europe/mindcraft/

