
Mainframe to Mobile

Not one of us has a clue what the world will look like in five years’ time, yet we are all preparing for that future. As computing power has become embedded in everything from our cars and our telephones to our financial markets, technological complexity has eclipsed our ability to comprehend its bigger-picture impact on the shape of tomorrow.

Our intuition has been formed by a set of experiences and ideas about how things worked during a time when changes were incremental and somewhat predictable. In March 1953, there were only 53 kilobytes of high-speed RAM on the entire planet.

Today, more than 80 per cent of the value of FTSE 500* firms is now ‘dark matter’: the intangible secret recipe of success. The physical stuff companies own, together with their wage bill, accounts for less than 20 per cent: a reversal of the pattern that prevailed in the 1970s. Very soon, everything at scale in this world will be managed by algorithms and data, and there is a need for effective platforms for managing these new and disruptive algorithmic businesses.

We are now firmly in a new industrial revolution, characterized by endless choice, digital technology, data, automation and artificial intelligence. The growing demands of machine learning are, in turn, rapidly raising the demands placed on silicon architecture. The emerging AI cycle captures the explosion of data and redistributes intelligence closer to billions of end products: cars, phones, IoT devices.

If I were to make a prediction for the next five years, I would say that the nations and businesses that hold the most data, the fastest servers and the most processing power will drive all growth from here on out.

The author Quinn Norton has noted that even your average desktop computer is “so complex that no one person on Earth really knows what all of it is doing, or how.”

A new iMac Pro, with 64 GB of RAM and a 4 TB SSD, has 11 times as many bytes of electronic memory as the Apple II. And by “Apple II” I mean the total of all the electronic memory ever installed in the six million Apple II computers ever made.
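As a rough back-of-envelope check (a sketch only, assuming the maximum 64 KB configuration per Apple II; many machines shipped with far less), the arithmetic lands close to the figure quoted:

```python
# Back-of-envelope check of the comparison above, assuming ~64 KB per Apple II.
KB, GB, TB = 1024, 1024**3, 1024**4

apple_ii_total = 6_000_000 * 64 * KB        # every Apple II ever made, combined
imac_pro_total = 64 * GB + 4 * TB           # one iMac Pro: RAM plus SSD

print(f"All Apple IIs combined: {apple_ii_total / GB:,.0f} GB")
print(f"One iMac Pro:           {imac_pro_total / GB:,.0f} GB")
print(f"Ratio: {imac_pro_total / apple_ii_total:.1f}x")   # comes out at roughly 11x
```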

Think of the iPhone in 2007, and that Steve Ballmer quote: “There's no chance that the iPhone is going to get any significant market share. No chance ...” Apple’s iPhone was incrementally familiar in each detail yet ridiculously better in totality. It instantly humbled its predecessors.

What happened in 2007 was that a variety of pre-existing enabling technologies (mobile CPUs, ubiquitous networking, touch screens, application deployment, social networks, cloud and mobile payments) simultaneously became “good enough” for a mass audience. A great example being indium, the metal behind the indium tin oxide coatings that make capacitive touch screens possible.

When a critical mass of building blocks arrives at “good enough,” the entire category can sprint to “great.” As a result, adoption of smartphones and apps grew with startling speed. In less than a decade, many deeply entrenched points of friction were written out of existence. Big companies tumbled, new ones were built, the world lurched forward.

If I have a single message to share this evening, it’s that innovation isn’t about ideas; it’s about solving problems. Companies are shrinking around their core competencies. The digital revolution has freed business to automate or outsource the routine tasks that are easiest to describe and quantify. In return, businesses have become information processing systems and, increasingly, that’s all they are.

As I look around this room at all of you who share a common purpose, it’s not technology that’s holding us back. It’s the conceptualization of how you use these new tools in an agile, opportunistic and practical way.

If you define technology as any sort of system that humans have built and engineered for a specific purpose, you notice that almost all of today’s most advanced technologies are complex systems: dynamic, functionally intricate, of vast size, and with an almost organic level of complexity. Tomorrow’s Smart Cities will be no exception.

In 2007, there were 10 million sensors connecting every type of human contrivance to the Internet of Things. By 2030, it is projected that 100 trillion sensors will connect to the Internet. Meanwhile, the price of micro-electromechanical systems (MEMS), including gyroscopes, accelerometers and pressure sensors, has fallen by 80 to 90 per cent in the past five years.

Very soon, the Internet of Things, an expression coined by MIT’s Kevin Ashton back in 1999, will connect everyone and everything in a new economic paradigm that is far more complex than the First and Second Industrial Revolutions, but one whose architecture is invisible and highly distributed rather than centralized along the familiar, clunky and narrow twentieth-century model.

In his book The Ingenuity Gap, Professor Thomas Homer-Dixon describes a visit he made in 1977 to the particle accelerator in Strasbourg.

When he asked one of the scientists affiliated with the facility if there was someone who understood the complexity of the entire machine, he was told that “no one understands this machine completely.”

Another way of saying this is that we simply have no idea of the huge number of ways these incredibly complex technologies can fail. Something has worked in the past until, well, it unexpectedly no longer does. Consider for a moment that the most unbelievable aspect of the Star Trek universe is that every ship they meet has compatible video-conferencing facilities.

When we build complex technologies, one of the most powerful techniques for constructing systems is what is known as abstraction.

Abstraction allows someone to build one technology on top of another, using what someone else has created without having to dwell on its internal details.
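To make that concrete, here is a minimal sketch in Python. The names (Storage, DiskStorage, CloudStorage, save_report) are hypothetical, not taken from any particular system; the point is simply that one layer is built against a contract rather than against internals:

```python
# A minimal sketch of abstraction: code built against an interface, not the internals.
from abc import ABC, abstractmethod

class Storage(ABC):
    """What callers rely on: a contract, not an implementation."""
    @abstractmethod
    def write(self, key: str, data: bytes) -> None: ...

class DiskStorage(Storage):
    def write(self, key: str, data: bytes) -> None:
        with open(key, "wb") as f:
            f.write(data)

class CloudStorage(Storage):
    def write(self, key: str, data: bytes) -> None:
        # Replication, retries and networking would live here, hidden from callers.
        ...

def save_report(store: Storage, report: bytes) -> None:
    # This layer has no idea, and no need to know, what sits underneath it.
    store.write("quarterly-report.pdf", report)

save_report(DiskStorage(), b"...")   # swap in CloudStorage() without touching save_report
```

The value of the abstraction is that save_report can be reused unchanged whichever implementation sits beneath it, which is exactly how technologies stack on top of one another.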

The most obvious reasons we end up with increasingly complex systems over time are the twin forces of accretion and interaction: adding more parts to something over time, and then adding more connections between those parts.
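A toy calculation, assuming every pair of parts could in principle interact, shows why those twin forces compound so quickly:

```python
# With n parts, the number of possible pairwise connections is n * (n - 1) / 2,
# so the interactions grow far faster than the part count itself.
for n in (10, 100, 1_000, 10_000):
    pairs = n * (n - 1) // 2
    print(f"{n:>6} parts -> up to {pairs:>12,} possible interactions")
```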

The most important point about complex systems is not how they work but how they fail, and this week’s news is preoccupied with one bank’s catalogue of migration problems.

Today, our urban environments struggle with hundreds of different systems and protocols that do not interoperate. A single building might have one set of control wires for the elevators, another for heating and ventilation, another for security, and yet another for lighting. Integrating a whole city full of these legacy networks presents an almost intractable problem.

The challenge now confronting city planners is one of rapid convergence and complexity: figuring out how to interconnect a naturally fragmented city infrastructure, built up over time, by using the Internet to bridge these gaps in the urban fabric in a robust, safe and secure manner. Securing those edge devices presents a swath of unique challenges, including the cost and availability of technology resources.

In a typical IoT deployment, most connected devices will be always connected and always on. Unlike mobile phones and laptops, such devices are likely to go through only a one-time authentication process that then persists across multiple sessions. This makes them attractive to hackers looking to infiltrate urban and business networks, as it allows easy control and sniffing of traffic; think of services such as Shodan, the search engine that indexes exposed connected devices for anyone who cares to look.
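As an illustration only, and not a description of any particular IoT stack, the sketch below contrasts a device credential that never expires with one that must be periodically renewed; all names (Session, authenticate, the device id) are hypothetical:

```python
import time
from dataclasses import dataclass
from typing import Optional

@dataclass
class Session:
    token: str
    expires_at: Optional[float]   # None models "authenticate once, stay on forever"

    def is_valid(self) -> bool:
        return self.expires_at is None or time.time() < self.expires_at

def authenticate(device_id: str, ttl: Optional[float]) -> Session:
    # A stand-in for a real key or certificate exchange.
    expiry = None if ttl is None else time.time() + ttl
    return Session(token=f"token-for-{device_id}", expires_at=expiry)

always_on = authenticate("cctv-42", ttl=None)      # one-time auth: a stolen token works indefinitely
short_lived = authenticate("cctv-42", ttl=3600.0)  # expiring auth: a stolen token is only briefly useful

print(always_on.is_valid(), short_lived.is_valid())
```

The design point is simply that forcing periodic re-authentication limits how long a compromised credential remains useful to an attacker.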

We don’t yet have a clear grasp of where the biggest risks may lie, when and how they might cause systems to fail, or what the chain-reaction consequences will be. You only have to think of TSB. To quote Douglas Adams, which I did quite happily on Radio 4 yesterday: “If you try and take a cat apart to see how it works, the first thing you have on your hands is a non-working cat.”

The TSB story illustrates only too well that elements of systems that were intended to be shielded from each other increasingly collide or expose themselves to risk in unexpected ways. This huge spider’s web of interactions produces what’s known in complexity science as emergence, where the interactions at one level end up creating unanticipated phenomena at another, as the Chinese author Cixin Liu observes in his science fiction work The Wandering Earth.

It’s the same for Smart Cities as we rush headlong towards the Cloud and the Internet of Things. In the future, much of the underlying fabric of Smart Cities, such as the CCTV cameras that can host botnets, traffic lights, waste bins and sensors, will be dark and inherently insecure: a potential form of dark energy, and that worries me.

Enabling both consolidation and business continuity for key applications over time has become a growing challenge for organisations, and virtualization is a well-managed path to reducing both the risk and the cost of an aging infrastructure.

Over and over, large systems are built on top of somewhat smaller and older systems. As long as the pieces work reasonably well, many organisations give little thought to the layering of new upon old. Consequently, we must accept the general principle that systems become more complicated over time, whatever technological system we look at.

Let me use an analogy from physics and astronomy. Some 96 per cent of our universe is composed of Dark Matter and Dark Energy. One keeps it together, the other tries to pull it apart. We can’t see it or measure it, but the evidence tells us it exists, and without it, we certainly wouldn’t.

The problem we share today is one of rapid convergence and complexity, which in turn complicates our ability to anticipate and manage risk. Each expert knows a piece of the puzzle, but the big picture is too big to comprehend. Not even the experts who built some of the largest and most reliable systems fully understand many of these technologies any longer, and those who did may have already retired.

According to Frank Oppenheimer, his brother Robert did not comment on the first successful nuclear bomb test with ‘I am become Death, destroyer of worlds’. He said: ‘I guess it worked’.
