EMC World 2013 took place in the Venetian Hotel and Sands Conference Centre on 6th-10th May 2013. Attended by over 12,000 staff, partners and customers, the event featured several product announcements and a range of upgrades to existing technology. The main points of interest were as follows:
- ViPR (pronounced “Viper”): the major announcement, and EMC’s entry into the world of Software-Defined Storage. ViPR will be (initially at least) an appliance designed to abstract the control plane and data plane. The control plane will effectively be a storage hypervisor, managing the storage (data plane) underneath, which on day one will be EMC’s VNX, VMAX or Isilon and any NetApp arrays, with other vendors to follow; in the future the data plane can be commodity storage. First products are due to ship in late 2013, so the final verdict is reserved until then. The initial release sounds very much like a Gen 1 product, so expect push-back from other vendors, but the roadmap sounds fairly compelling and ViPR comes under the “product to watch” category. Rumour has it that EMC believes this to be its best Gen 1 product yet released, and its future. ViPR will offer pooled storage resources with block, file and object-based presentation, and will include simplified management and automation. Full review in a separate post.
- Pivotal – announced before EMC World, but with a lot of focus here. Pivotal is a partnership between EMC and VMware, with GE investing heavily, designed for next-generation Cloud and Big Data applications. Pivotal splits into three areas: Data Fabrics, Application Fabrics and Cloud Fabrics. Pivotal 1 launches late 2013 – again, one to watch.
- XtremIO – Available now in limited quantities but a big focus. EMC’s All-Flash Array (AFA). Provides a lot of the functionality expected of Enterprise class arrays, combined with very high performance. Want to see one? Contact me, I’ve got one!
- EMC Velocity Partner Program – the partner program changes to allow all partners to be “Business Partners” with specialities in relevant areas. Look out for Computacenter changing from one “Velocity Signature Solution Centre” logo to about 20 different Business Partner logos. Those PowerPoint slides suddenly got very busy.
- Isilon upgrades – Isilon is proving to be an excellent acquisition for EMC. Look out for forthcoming enhancements including deduplication, auditing capability and integration with HDFS, combined with additional scalability. The required enhancements to the SyncIQ replication functionality are also being delivered.
- SRM Enhancements – New suites of management products, sharing a common interface with ViPR. Let’s face it – these were needed.
- Continuous Availability enhancements – the ability to combine VSPEX with VPLEX is designed to eliminate complexity in this area for relevant customers.
- VNX upgrades are on the way, but still under NDA (if you are internal, ask me nicely).
- BRS (Backup & Recovery Services) – enhancements to the Data Domain range, together with further development of Avamar technology, mean this remains a focus area for both EMC and partners.
Summary: EMC World remains one of the must-attend events in the industry. Whilst some of the announcements are of future products which are still work in progress, these do give an insight into the direction the company is heading. Joe Tucci stated that EMC will remain true to its roots, but with an increasing investment in software-based products. EMC World proved a worthwhile investment in time.
Just as 2011 was the year of us all talking about “Cloud”, closely followed by the “Big Data” wave of 2012, 2013 is shaping up nicely as the year of the “Software-Defined” entity, where multiple technologies are being covered by the “SDx” banner. Let’s have a brief look at what this means for the world of storage.
In the world of data we are used to constants: controllers that manage the configuration of the environment and the placement of data, disks grouped together using RAID to protect data, and the presentation of this data to servers using fixed algorithms. In effect, when we wrote data we knew where it was going and could control its behaviour; we could replicate it, compress it, de-duplicate it and provide it with the performance level it needed, and when it needed less performance we simply moved it somewhere else – all controlled within the storage array itself.
Software-Defined Storage changes this model; it can be thought of as a software layer put in place to control any disks attached to it. The storage services we are used to (snapshots, replication, de-duplication, thin provisioning etc.) are then provided to the operating system from this layer. This element of control software will be capable of sitting on commodity server hardware – in effect becoming an appliance, initially at least – and will be able to control commodity disk storage.
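To make the model concrete, here is a minimal conceptual sketch in Python of a software control layer pooling commodity devices and delivering data services itself. All class and method names are illustrative assumptions, not any real SDS product’s API:

```python
# Conceptual sketch only: a software layer that pools whatever devices are
# attached to it and provides storage services (thin provisioning, snapshots)
# in software, independent of the underlying hardware.

class BackingDevice:
    """A commodity disk or array, known to the layer only by id and capacity."""
    def __init__(self, device_id, capacity_gb):
        self.device_id = device_id
        self.capacity_gb = capacity_gb

class SoftwareDefinedStorage:
    """The control plane: pools devices and serves volumes with data services."""
    def __init__(self):
        self.devices = []
        self.volumes = {}      # volume name -> provisioned size in GB
        self.snapshots = {}    # volume name -> list of point-in-time copies

    def add_device(self, device):
        self.devices.append(device)

    @property
    def pool_capacity_gb(self):
        # Devices from any vendor are aggregated into one pool.
        return sum(d.capacity_gb for d in self.devices)

    def create_volume(self, name, size_gb):
        # Thin provisioning: promise size_gb but consume nothing up front,
        # so provisioned sizes may legitimately exceed physical capacity.
        self.volumes[name] = size_gb
        self.snapshots[name] = []

    def snapshot(self, name):
        # A data service delivered by the software layer, not the hardware.
        self.snapshots[name].append(f"{name}-snap-{len(self.snapshots[name])}")
        return self.snapshots[name][-1]

sds = SoftwareDefinedStorage()
sds.add_device(BackingDevice("commodity-0", 1000))
sds.add_device(BackingDevice("commodity-1", 1000))
sds.create_volume("app-data", 2500)   # thin: over-provisioned vs. 2000 GB pool
snap = sds.snapshot("app-data")
print(sds.pool_capacity_gb, snap)     # → 2000 app-data-snap-0
```

The point of the sketch is simply that the services live in the `SoftwareDefinedStorage` layer, so the `BackingDevice` underneath can be any commodity hardware.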
This is not quite storage virtualisation, where a control plane manages a number of storage resources, pooling them together into a single entity; rather, it separates out the management functionality, removing the need for dedicated storage controllers – the most expensive part of a data solution. One of the driving factors for the uptake of Software-Defined Storage is therefore an obvious reduction in cost, along with the ability to provide data services regardless of the hardware you choose.
The challenge here is that data should be regarded differently to other aspects of the environment: data is permanent, packets traversing the network are not, and even the virtual server environment does not require any real form of permanence. Data must still exist, and exist in the same place, whether power has been present or not. We are now starting to see a generation of storage devices – note I was careful not to use the phrase arrays – which look more capable of offering a Software-Defined Storage service, through the abstraction of the data and controller layers.
So what does this all mean for storage in the datacentre?
My main observation is that physical storage arrays will be with us for a long time to come and are not going away. However, the potential for disruption to this model is greater than ever before: the ability to use commodity-type storage and create the environment you want is compelling. With the emerging ability of software to take commodity hardware, often from several vendors simultaneously, and abstract the data layer, the challenge to the traditional large storage vendors becomes a real and present danger.
I believe the rate of change towards the software-defined storage environment will ultimately be more rapid, and see greater early adoption, than the proven concept of server virtualisation did. It will cause disruption to many existing major vendors, but ultimately end-users will still require copious amounts of disk technology, so the major players will remain exactly that. Whilst some niche players may make it through, the big boys will still dominate the playground.
Infosecurity Europe starts on the 23rd April and historically has seen the latest and greatest IT security products launched to fanfare, song and even scantily clad ladies all vying for the industry’s acclaim and market share.
However, in recent years the market has changed and we no longer have the luxury of waiting for the annual Infosec to launch new products – they’re released when ready, as competitive edge has become all-consuming and the threat landscape unrelenting in its diversity and evolution. At least, the latter is what vendors would have you believe – the truth is that security mitigation is becoming a commoditised landscape, which is no doubt why certain vendors have stayed away in recent years.
But commoditisation doesn’t mean that the problem is fixed – you know how to mitigate known threats; it’s the unknown that’s the big issue. If you’re going to Infosec, the following should be on your to-do list – and if you want a differentiated view of the vendor landscape, please feel free to contact me:
- DDoS – Distributed Denial of Service attacks – historically mitigated in the cloud, DDoS is getting smarter and moving closer to the application layer, making it a harder problem to resolve in the cloud – a blended approach of on-premise and cloud mitigation is evolving.
- APTs – Advanced Persistent Threats – the threats we don’t know about, or have no method of detecting: pieces of malware written by teams focussed on breaching an individual organisation, brand-focussed and hell-bent on financial gain. Ignorance is no longer a satisfactory excuse, and IT security teams have to have an answer.
- BYOD – securing the device isn’t enough. If always-on computing is going to become a reality we need to secure communication within the device and, more importantly, the applications communicating with one another on the device.
- Risk-Based Computing – security used to be built around trusted devices, secure connections and two-factor authentication to identify the user. The threat landscape has changed this – it’s about untrusted devices, enablement and, did I mention, threat mitigation? A risk-based approach to computing, enablement and threat mitigation is about to be released to the market – remember, you heard it here…
- Cloud Computing – won’t become mainstream until we can secure the content – a cohesive approach to securing the cloud is the only way forward – naturally, Computacenter has the answer.
I’ll be at the show on Wednesday – for a lively discussion you can contact me through your account manager or this page.
I hereby make the case for a new term to describe our rich, IP network delivered, information flow – “Digital Fuel”.
Put simply, the wealth of digital information circling continually around the developed world could be classified as a fuel source, utilised to drive everything from our social activity to the global economy. It now feeds the world, transported by IP networks, and ensures we can consume the ever-increasing volume of information created all the time, by everyone, everywhere.
But should it really be called fuel? What does it drive or power? In this IT-centric day and age it may be easier to describe what “Digital Fuel” doesn’t drive than what it does. And if we loosely align the “Digital Fuel” term with its fossil equivalent, what do we really understand about it? How is “Digital Fuel” generated, and who dominates the supply chain? In the fossil fuel arena, certain geographic regions or nation states play a key role – does such regional dominance exist in the “Digital Fuel” arena? And closer to home, as you read this blog, where does your “Digital Fuel” originate – where is it refined and processed, and how is it secured and stored?
For the purpose of this blog the term “Digital Fuel” is used as a play on words, analogous to other “powered” system-based ideologies or indeed realities – but in a pause for deep thought the term may ring truer than initially considered. As I sought additional insight to support the term “Digital Fuel”, I located the following definition online, in the midst of explanations aligning fuel with combustible fossil outcomes: fuel – something that nourishes or builds up emotion, action, etc.
Surely that definition resonates, and could support the notion of “Digital Fuel” as information transported in real time, all of the time, by networks, and now fundamental to our societal existence. But do you protect the pipelines – or networks – that deliver your “Digital Fuel” with the same level of diligence we align with our fossil fuel pipelines? Do you deem it part of your organisation’s essential critical infrastructure?
If “Digital Fuel” really exists, it raises serious questions about the use and importance of this fundamental and increasingly critical energy source. In too many circumstances the IP network readiness, design and deployment discussions are an afterthought, usually well behind other more glamorous technology or business-centric outcomes. BYOD, VDI, cloud computing, end-user mobility (I could continue) – all create, process and utilise “Digital Fuel”. But without a network that is fit for purpose, available and secure all of the time, everywhere, the fuel delivery stops. And with it, so do we…
If you are seeking business change and need more fuel, it’s time to make the IP network readiness conversation your first one, not your worst one. If not, how will your “Digital Fuel” fuel?
Until next time
Recently Jeremy Hunt – the Health Secretary – has stated that the NHS will become paperless by 2018 to “save billions”. But this is not a new project. Before the National Programme for IT (NPfIT) and Connecting for Health (CfH) were even a twinkle in a Health Secretary’s eye, the Information for Health (IfH) agenda clearly outlined the need for a paperless NHS (initially released in September 1998).
In fact, successive Health Secretaries, and other NHS leaders, have often suggested dates by which the NHS must become paperless, and yet in 2013 we still have a mainly paper-led system. Granted, there have been great developments – for example, most GPs work in a paper-light fashion, and referrals, results etc. are all moving to a more paperless model. However, paper – and other hard-copy records (e.g. X-rays) – still exists in the NHS.
Most of the changes that have come about in the field have not happened due to some mandated requirement. Instead, they are often brought in by clinical and business leaders to solve real business and clinical issues. Paperless solutions can lead to a reduction in treatment/medication errors, quicker time to diagnosis, shorter time to treatment, more collaborative diagnostics (allowing a wider range of specialists to be involved) and overall better patient care.
From a business perspective there are a number of benefits. As well as reducing the time taken in certain business processes (look at how email has transformed the business world) there is greater traceability, more accuracy and an overall change in the behaviours of many organisations for the good. Unfortunately, the Health Secretary fell short of announcing any new funding to assist with the paperless NHS vision. And so, again, organisations will attempt to become paper-light through localised procurement and innovation.
There are many suppliers in the “paperless office” space, and organisations need to ensure they choose the right partner for what they are trying to achieve. The software solution alone is not the only consideration: what are you trying to achieve? Clinical notes digitisation has a number of specific issues which need to be carefully managed if the digitisation process is not to negatively impact clinical care.
Considerations as to the security model and the storage requirements will play heavily into the service definition, and it is often better to overestimate the growth of data by a small margin than to underestimate. Many vendors will offer an assessment as part of their overall offering.
Organisations need to be sure that they are looking at how and where the information will be required. Make certain that various clinicians are part of the working group which defines how the information should be used. Too often projects like this can become centred on the technology, when actually technology is just about enabling the change to information flows. Clinical participation is critical to service success.
Fear, Uncertainty and Doubt – or FUD – has become a mantra with vendors. Put simply: get over it!
With one week to go until the RSA Security Summit: the world’s changing, IT security is evolving, and if the vendors are to be believed there’s a cyber war raging on the internet! Distributed Denial of Service (DDoS), state-led hacktivism and the ever-present Advanced Persistent Threat (APT) all challenge your business-led initiatives of mobility and enablement against a backdrop of Governance, Risk and Compliance – and gaps will exist in most security strategies.
Welcome to the “New Normal” – we don’t know what tomorrow’s challenges will bring but here, today business outcomes need to be delivered and the conventional network led approach to security whilst necessary isn’t the most efficient route to success. Computacenter is speaking at RSA’s Security Summit on the 22nd April – come and hear about a different approach to resolving your security outcomes.
The internet is not new. Developed in the 60s for military purposes and evolving in both scope and popularity ever since, the internet has become second nature to much of the developed world. When Tim Berners-Lee formulated the linkage between the hypertext mark-up language (HTML) and the internet, spawning the graphical, interactive World Wide Web as we know it, who would have thought the internet would become the essential “commerce and communications” hub it now is?
But all of that “usefulness” and “interesting stuff” does not come without concern. Use of the internet is, for all intents and purposes, unpoliced, unlicensed and without service levels. For many, the internet has been used to transport and transact virtually every form of digital information that can be encapsulated into an IP network packet. To that end, the last decade has normalised the use of the internet for essential commercial and fundamental electronic communications – and in the eyes of many it is clear that we may now fail to function effectively without it.
It’s no longer just about technical topics like “internet security” or “latency”, but the quasi business aligned perspectives that include customer satisfaction, yield, loyalty, advocacy and customer retention. And magically, all of these elements are often realised at a marginal cost when compared to the “off internet” legacy approach. This has propelled the internet to become a real “critical national infrastructure” element as essential to the business world as it is the personal world. But what happens if the internet disappears, fails, or is compromised? – dare we think of the day the internet finally “stops”.
Thankfully, there are many supremely capable technical individuals around the world tasked with ensuring the internet doesn’t fail, and due to inherent multiple levels of technical resilience a full-scale internet shutdown is unlikely (but not impossible). However, it is now a straightforward activity to shut down a corporate web server, or the online presence of an organisation or group of organisations. The now infamous DDoS (distributed denial of service) attack is a commonly used approach: bombard a named web presence with unrequested traffic until it overloads and ceases to function.
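The principle behind much on-premise mitigation of that flood-until-overload mechanism is per-source rate limiting. Here is a minimal token-bucket sketch in Python; the class name, thresholds and addresses are illustrative assumptions, not any particular product’s implementation:

```python
# Minimal token-bucket rate limiter sketch: each source IP holds a bucket of
# tokens that refills slowly, so a flood from one source is rejected once its
# burst allowance is spent, while normal clients continue to get through.
# All names and numbers here are illustrative only.
import time

class TokenBucket:
    def __init__(self, rate_per_sec, burst):
        self.rate = rate_per_sec          # tokens refilled per second
        self.burst = burst                # maximum bucket size
        self.buckets = {}                 # source ip -> (tokens, last seen)

    def allow(self, source_ip, now=None):
        now = time.monotonic() if now is None else now
        tokens, last = self.buckets.get(source_ip, (self.burst, now))
        # Refill in proportion to elapsed time, capped at the burst size.
        tokens = min(self.burst, tokens + (now - last) * self.rate)
        if tokens >= 1.0:
            self.buckets[source_ip] = (tokens - 1.0, now)
            return True                   # request passes, one token spent
        self.buckets[source_ip] = (tokens, now)
        return False                      # bucket empty: request dropped

limiter = TokenBucket(rate_per_sec=5, burst=10)
# A flood of 100 requests arriving at the same instant from one source:
results = [limiter.allow("203.0.113.9", now=0.0) for _ in range(100)]
print(results.count(True))   # → 10: only the burst allowance gets through
```

Real DDoS protection is far more involved (distributed attacks defeat per-source limits, which is why cloud scrubbing exists), but the sketch shows why a single overloading source can be cut off without refusing legitimate traffic.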
Due to the essential commercial value delivered by corporate websites, and the financial revenue impact (and equally the customer loyalty and goodwill impact) of a period offline, protection against DDoS and other malicious hacking approaches that take a web platform offline must now be fundamental to all. This week we have seen heightened awareness due to one of the biggest cyber attacks of its kind, involving a DDoS attack on a particular organisation at a level fourfold greater than had ever previously been experienced. And for those aforementioned organisations underpinned by the internet, this mass DDoS attack has allegedly “slowed down worldwide internet traffic”.
It may be time for you to consider a number of key points: is the internet an essential communications and commercial transport layer for your organisation? If yes, what is the maximum period of offline activity your organisation could tolerate (i.e. no web presence, email availability or web access)? And finally, how slow is too slow for your organisation when discussing internet-related performance concerns?
The web-facing internet presence of an organisation performs many key functions, most importantly acting as the prospect’s or customer’s initial landing zone or gateway to the organisation. When discussing corporate visibility on the net, “now you see it, now you don’t” is definitely NOT a humorous customer experience.
It’s time for DDoS protection for all.
Until next time.