Archive | April 2013

In a Minute: Software Defined Storage

Just as 2011 was the year we talked about “Cloud”, closely followed by the “Big Data” wave of 2012, so 2013 is shaping up nicely as the year of the “Software-Defined” entity, with multiple technologies gathered under the “SDx” banner. Let’s have a brief look at what this means for the world of storage.

In the world of data we are used to constants: controllers that manage the configuration of the environment and the placement of data, disks grouped together using RAID to protect data, and the presentation of that data to servers using fixed algorithms. In effect, when we wrote data we knew where it was going and could control its behaviour; we could replicate it, compress it, de-duplicate it and provide it with the performance level it needed, and when it needed less performance we simply moved it somewhere else – all controlled within the storage array itself.

Software-Defined Storage changes this model; it can be thought of as a software layer put in place to control any disks attached to it. The storage services we are used to (snapshots, replication, de-duplication, thin provisioning etc.) are then provided to the operating system from this layer. This control software will be capable of sitting on commodity server hardware – in effect becoming an appliance, initially at least – and will be able to control commodity disk storage.
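To make the idea concrete, here is a minimal, purely illustrative sketch in Python – every class and name below is invented for this post, not any vendor’s actual product or API – of how a software-defined control layer might pool commodity disks and deliver data services such as thin provisioning and snapshots from software rather than from an array controller:

```python
# Illustrative sketch only: invented classes, not a real SDS product or API.

class CommodityDisk:
    """Any block device the control layer can address, from any vendor."""
    def __init__(self, name, capacity_gb):
        self.name = name
        self.capacity_gb = capacity_gb

class SDSControlLayer:
    """Software layer providing data services independently of the hardware below."""
    def __init__(self):
        self.pool = []          # disks from any vendor, pooled together
        self.volumes = {}       # volume name -> provisioned size in GB (thin)
        self.snapshots = {}     # volume name -> list of snapshot labels

    def add_disk(self, disk):
        self.pool.append(disk)

    def create_volume(self, name, size_gb):
        # Thin provisioning: capacity is promised, not reserved up front.
        self.volumes[name] = size_gb
        self.snapshots[name] = []

    def snapshot(self, volume, label):
        # A data service delivered in software, not by an array controller.
        self.snapshots[volume].append(label)

layer = SDSControlLayer()
layer.add_disk(CommodityDisk("vendor-a-disk-1", 4000))
layer.add_disk(CommodityDisk("vendor-b-disk-1", 2000))
layer.create_volume("finance-data", size_gb=500)
layer.snapshot("finance-data", "pre-upgrade")
```

The point of the sketch is the shape of the architecture: the data services live in the software layer, so the disks underneath can come from anywhere.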

This is not quite the same as storage virtualisation, where a control plane manages a number of storage resources and pools them together into a single entity; rather, it separates out the management functionality altogether, removing the need for dedicated storage controllers – the most expensive part of a data solution. One of the driving factors for the uptake of Software-Defined Storage is therefore an obvious reduction in cost, along with the ability to provide data services regardless of the hardware you choose.

The challenge to this is that data must be regarded differently from other aspects of the environment: data is permanent, packets traversing a network are not, and even the virtual server environment does not require any real form of permanence. Data must still exist, and exist in the same place, whether power has been present or not. We are now starting to see a generation of storage devices – note I was careful not to use the word arrays – which look more capable of offering a software-defined storage service, through the abstraction of the data and controller layers.

So what does this all mean for storage in the datacentre?

My main observation is that physical storage arrays will be with us for a long time to come and are not going away. However, the potential for disruption to this model is greater than ever before: the ability to use commodity storage and create the environment you want is compelling. With software now emerging that can take commodity hardware, often from several vendors simultaneously, and abstract the data layer, the challenge to the traditional large storage vendors becomes a real and present danger.

I believe the move towards the software-defined storage environment will ultimately be more rapid, and see greater early adoption, than the now-proven concepts of server virtualisation. It will cause disruption to many existing major vendors, but end-users will still require copious amounts of disk technology, so the major players will remain exactly that. Whilst some niche players may make it through, the big boys will still dominate the playground.

One of the Greatest Security Road Shows is About to Roll into Town!

Infosecurity Europe starts on 23rd April and has historically seen the latest and greatest IT security products launched to fanfare, song and even scantily clad ladies, all vying for the industry’s acclaim and market share.

However, in recent years the market has changed and we no longer have the luxury of waiting for the annual Infosec to launch new products – they’re released when ready, as competitive edge has become all-consuming and the threat landscape unrelenting in its diversity and evolution. At least, the latter is what vendors would have you believe – the truth is that security mitigation is becoming a commoditised landscape, which is no doubt why certain vendors have stayed away in recent years.

But commoditisation doesn’t mean that the problem is fixed – you know how to mitigate known threats – it’s the unknown that’s the big issue. If you’re going to Infosec, the following should be on your to-do list – and if you want a differentiated view of the vendor landscape, please feel free to contact me:

  • DDoS – Distributed Denial of Service attacks – historically mitigated in the cloud, DDoS is getting smarter and moving closer to the application layer, making it a harder problem to resolve in the cloud alone – a blended approach of on-premise and cloud mitigation is evolving.
  • APTs – Advanced Persistent Threats – the threats we don’t know about, or have no method of detecting: malware written by teams focussed on breaching an individual organisation, brand-targeted and hell-bent on financial gain – ignorance is no longer a satisfactory excuse and IT security teams have to have an answer.
  • BYOD – securing the device isn’t enough – if always-on computing is going to become a reality, we need to secure communication within the device and, more importantly, between the applications communicating with one another on the device.
  • Risk-based computing – security used to be built around trusted devices, secure connections and two-factor authentication to identify the user. The threat landscape has changed this – it’s about untrusted devices, enablement and, did I mention, threat mitigation? A risk-based approach to computing, enablement and threat mitigation is about to be released to the market – remember you heard it here…
  • Cloud computing – this won’t become mainstream until we can secure the content – a cohesive approach to securing the cloud is the only way forward – naturally Computacenter has the answer.

I’ll be at the show on Wednesday – for a lively discussion you can contact me through your account manager or this page.

Keen to drive your business? Maximise your “Digital Fuel”

I hereby make the case for a new term to describe our rich flow of information, delivered over IP networks – “Digital Fuel”.

Put simply, the wealth of digital information circling continually around the developed world can be classified as a fuel source, one that drives everything from our social activity to the global economy. It now feeds the world, transported by IP networks, and ensures we can consume the ever-increasing volume of information created all the time, by everyone, everywhere.

But should it really be called a fuel – what does it drive or power? In this IT-centric day and age it may be easier to describe what “Digital Fuel” doesn’t drive than what it does. And if we loosely align the “Digital Fuel” term with its fossil equivalent, what do we really understand about it? How is “Digital Fuel” generated, and who dominates the supply chain? In the fossil fuel arena, certain geographic regions or nation states play a key role – does such regional dominance exist in the “Digital Fuel” arena? And closer to home, as you read this blog, where does your “Digital Fuel” originate – where is it refined and processed, and how is it secured and stored?

For the purpose of this blog the term “Digital Fuel” is used as a play on words, analogous to other “powered” system-based ideologies or indeed realities – but, on pausing for deeper thought, the term may ring truer than initially considered. As I sought additional insight to support the term “Digital Fuel”, I found the following definition online, in the midst of explanations aligning fuel with combustible fossil outcomes: fuel – “something that nourishes or builds up emotion, action, etc.”

Surely that definition resonates, and could support the notion of “Digital Fuel” as information transported in real time, all of the time, by networks, and now fundamental to our societal existence. But do you protect the pipelines – the networks – that deliver your “Digital Fuel” with the same level of diligence applied to our fossil fuel pipelines? Do you deem them part of your organisation’s essential critical infrastructure?

If “Digital Fuel” really exists, it raises serious questions about the use and importance of this fundamental and increasingly critical energy source. In too many circumstances, IP network readiness, design and deployment discussions are an afterthought, usually well behind other, more glamorous technology or business-centric outcomes. BYOD, VDI, cloud computing, end-user mobility (I could continue) – all create, process and utilise “Digital Fuel”. But without a network that is fit for purpose, available and secure all of the time, everywhere, the fuel delivery stops. And with it, so do we…

If you are seeking business change and need more fuel, it’s time to make the IP network readiness conversation your first one, not your worst one. If not, how will your “Digital Fuel” fuel?

Until next time

Colin W

Twitter @colinwccuk

The Paperless NHS…?

Recently Jeremy Hunt, the Health Secretary, stated that the NHS will become paperless by 2018 to “save billions”. But this is not a new project. Before the National Programme for IT (NPfIT) and Connecting for Health (CfH) were even a twinkle in a Health Secretary’s eye, the Information for Health (IfH) agenda, initially released in September 1998, clearly outlined the need for a paperless NHS.

In fact, successive Health Secretaries and other NHS leaders have often suggested dates by which the NHS must become paperless, and yet in 2013 we still have a mainly paper-led system. Granted, there have been great developments: for example, most GPs work in a paper-light fashion, and referrals, results and the like are all moving to more paperless systems. However, paper – and other hard-copy records (e.g. X-rays) – still exists across the NHS.

Most of the changes that have come about in this field have not happened because of some mandated requirement. Instead, they have often been brought in by clinical and business leaders to solve real business and clinical issues. Paperless solutions can lead to a reduction in treatment and medication errors, quicker diagnosis, shorter time to treatment, more collaborative diagnostics (allowing a wider range of specialists to be involved) and overall better patient care.

From a business perspective there are a number of benefits. As well as reducing the time taken by certain business processes (look at how email has transformed the business world), there is greater traceability, more accuracy and an overall change for the good in the behaviour of many organisations. Unfortunately, the Health Secretary fell short of announcing any new funding to assist with the paperless NHS vision. And so, again, organisations will attempt to become paper-light through localised procurement and innovation.

There are many suppliers in the “paperless office” space, and organisations need to ensure that they choose the right partner for their goals. The software solution alone is not the only consideration: what are you trying to achieve? Clinical notes digitisation has a number of specific issues which need to be carefully managed if the digitisation process is not to impact negatively on clinical care.

Considerations around the security model and the storage requirements will play heavily into the service definition, and it is often better to overestimate the growth of data by a small margin than to underestimate it. Many vendors will offer an assessment as part of their overall offering.
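To illustrate the point about estimating growth, here is a rough, purely illustrative sizing sketch in Python – every figure in it is an assumption made up for this post, not NHS data – of the kind of back-of-an-envelope calculation a digitisation project might run before defining the storage service:

```python
# Rough, purely illustrative sizing sketch: every figure below is an assumption,
# not NHS data. Adjust to your own scan volumes and retention policy.

pages_per_patient = 60          # assumed average size of a paper record
patients = 250_000              # assumed trust patient population
mb_per_scanned_page = 0.5       # assumed compressed scanned-page size in MB
annual_growth = 0.10            # assumed yearly growth in stored records
safety_margin = 1.2             # deliberate overestimate, as argued above

# Baseline: 60 * 250,000 * 0.5 MB = 7,500,000 MB, i.e. roughly 7.5 TB.
baseline_tb = pages_per_patient * patients * mb_per_scanned_page / 1_000_000

for year in range(1, 6):
    estimate_tb = baseline_tb * ((1 + annual_growth) ** year) * safety_margin
    print(f"Year {year}: plan for ~{estimate_tb:.1f} TB")
```

Swapping in your own scan volumes, page sizes and retention policy gives a defensible starting point for the service definition – with the deliberate overestimate built in.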

Organisations need to be sure that they are looking at how and where the information will be required. Make certain that a range of clinicians are part of the working group that defines how the information should be used. Too often projects like this become centred on the technology, when in fact the technology is simply there to enable the change in information flows. Clinical participation is critical to the success of the service.

Fear, Uncertainty and Doubt? Embrace the “New Normal”

Fear, Uncertainty and Doubt, or FUD, has become a mantra with vendors – put simply, get over it!

With one week to go until the RSA Security Summit, the world is changing, IT security is evolving and, if the vendors are to be believed, there’s a cyber war raging on the internet! Distributed Denial of Service (DDoS) attacks, state-led hacktivism and the ever-present Advanced Persistent Threat (APT) all challenge your business-led initiatives of mobility and enablement against a backdrop of governance, risk and compliance – and gaps will exist in most security strategies.

Welcome to the “New Normal” – we don’t know what tomorrow’s challenges will bring, but here, today, business outcomes need to be delivered, and the conventional network-led approach to security, whilst necessary, isn’t the most efficient route to success. Computacenter is speaking at the RSA Security Summit on 22nd April – come and hear about a different approach to resolving your security outcomes.

James