I started 2016 in bullish form with predictions for security based on the lows and highs of 2015. I touched on two of the many market catalysts set to transform both today's and tomorrow's worlds, enterprise mobility and the Internet of Things, but highlighted that I would mention three more. Part two of my security outline kicks off with my final three security focus areas for the first half of 2016: the journey to the "cloud", security for the SDDC, and the need for intelligent people to "act smart".
The enterprise journey to the cloud continues to be hindered by concerns robust enough to offset the unquestionable benefits. If enterprises are already challenged to secure local environments that benefit from additional levels of physical control and proximity, why would the need to secure information flowing through an external, often multi-tenanted service provider not highlight similar (and different) challenges? Pre-2016, it was straightforward for enterprises to deliver a blanket response of "we don't use the cloud", often citing security concerns with no need for further explanation. But with shadow IT research validating that both authorised and unauthorised cloud usage exists whatever the policy, neither authority nor ignorance seems to matter.
It's therefore time to go "back to basics", remove years of accumulated assumptions about business functions and application flows, and replace them with rigorous understanding. With a revisited and restated view of people, processes, application flows, controls and compliance expectations, "what" can be delivered via the cloud becomes clearer ("how" is a whole different ball game). Whether via internal or external assessments or audits, enterprises must obtain a robust and realistic "current state" view to calibrate the cloud trajectory and thus maximise the business benefits of cloud service delivery. This common-sense view is my consistent response to temper the many, often unfounded, concerns about cloud service delivery or published negative cloud consequences. And I frequently pose the question: "Can you really tell me, right now, the who, what and how of your business IT operations and applications, calibrated by relevant controls?" If the answer is no, effective security for the cloud journey may have no effect at all. Time for change, to make cloud service delivery a consistent, secure reality.
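As a hypothetical illustration of that "who, what, how" inventory, the restated view can be captured as structured data and checked against the controls an enterprise expects before a workload moves to the cloud. The field names and control names below are my assumptions for the sketch, not a prescribed audit format:

```python
from dataclasses import dataclass, field

@dataclass
class ApplicationFlow:
    """One entry in a restated 'who, what, how' inventory."""
    name: str                 # what: the application or service
    owner: str                # who: the accountable business owner
    data_classification: str  # e.g. "public", "internal", "confidential"
    controls: list = field(default_factory=list)  # how: controls in place

# Assumed policy baseline for this sketch.
REQUIRED_CONTROLS = {"encryption-at-rest", "access-logging"}

def cloud_ready(app: ApplicationFlow) -> bool:
    # An application is a cloud candidate only when every expected
    # control is already in place and understood.
    return REQUIRED_CONTROLS.issubset(app.controls)

payroll = ApplicationFlow("payroll", "HR", "confidential",
                          controls=["encryption-at-rest"])
print(cloud_ready(payroll))  # False: access-logging is missing
```

Even a toy structure like this forces the "current state" conversation: if the inventory cannot be populated, the cloud-readiness question cannot honestly be answered.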
Following on from the cloud is the software-defined datacenter (SDDC) snowball that continues to gather pace. SDDC ideals are no longer a question of if or when for enterprise organisations with substantial workloads or IT services already delivered primarily via software elements. It's the dynamic, frictionless, highly agile operational persona offered by a predominantly automated, software-driven environment that holds so much promise. But common to every "must have", "must do", "next big thing" IT trend is the "what about security?" question.
First off is a straightforward perspective – "avoid the security retrofit"; it's time for a security reset. Security must be a core deliverable of the SDDC outcome and can therefore never be deemed an add-on or optional extra. When application dependencies and process workflows are in early draft mode (potentially in the earlier stages of the development cycle), the security expectations must be identified, qualified and externalised. Deferring security to later phases, or accommodating it via an assumption of inherent safety delivered by default, is fundamentally flawed as applications and workloads become increasingly fluid in location and state.
A silver bullet of the SDDC ideology is the potential, and proven reality, of security moving away from a perimeter-based ideal to an intelligent functional state as close to the workload as possible (in fact the workload is no longer a workload to be secured, but instead a "secure workload"). This new attitude to application and workload delivery must drive a "blank sheet of paper" review of security to ensure one of the most compelling benefits of the SDDC journey can be fully realised. An enterprise journey to the cloud presents the long overdue opportunity (and investment) to "get security right" – use it, don't lose it.
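A minimal sketch of the "secure workload" idea, assuming a micro-segmentation-style model where the policy travels with the workload rather than living at a perimeter firewall. The class and tier names are illustrative, not any vendor's API:

```python
class SecureWorkload:
    """A workload that carries its own security policy wherever it runs."""

    def __init__(self, name: str, allowed_peers: set):
        self.name = name
        self.allowed_peers = set(allowed_peers)  # micro-segmentation rules

    def accept(self, peer: str) -> bool:
        # The decision is made at the workload itself, not at a perimeter,
        # so it holds whether the workload runs on-premises or in the cloud.
        return peer in self.allowed_peers

web = SecureWorkload("web-tier", allowed_peers={"app-tier"})
print(web.accept("app-tier"))  # True: an expected peer
print(web.accept("db-tier"))   # False: traffic denied at the workload
```

Because the rules are part of the workload's definition, moving it to a new host or location does not strand its protection behind the old perimeter.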
And lastly, it's "people time". The rise and rise and continued rise of the digital enterprise will fundamentally shift the way business services are operated, consumed and ultimately secured. We are venturing into the unknown and therefore wrestling to find answers to an endless stream of security questions. But is this state really unknown? I suggest not. The digital enterprise may be no more than the digital DNA that is already the vital fluid of the modern, social-network-driven arena, spilling over to and thus redefining the enterprise. Create and destroy information instantaneously, join and graft multiple and previously unconnected data sources together to create new insight and new opportunities, always on, always now – isn't this digitisation-defined "social world" already our norm?
And possibly with that Eureka moment appears an equivalent reality check: we still haven't solved the security problem(s) in the digital social network world; in fact, at times we are not even close. And the main reason? "People". As technology improves (both systems and security), people reduce their level of vigilance and diligence and increase their expectation that the "system will deliver protection". Nothing could be further from the truth. I fear we may arrive at a state where there is little more that can be done from a security systems-based, neural or autonomic perspective. In other words, we will have put as much logic and decision making into the system as we can, to determine and remediate as much as possible from a security perspective in an acceptable timeframe. And then what, or who, is left in the chain as the primary attack vector? The same primary attack vector that has always existed – "people".
Which drives me to highlight that 2016 may be the year enterprises revisit and reinforce individual accountability, ensuring all system users are vigilant, diligent and aware of the security implications of their actions. Or, sadly, those same users may be affected by the double-edged sword of compliance and personal liability. This is a step change forward from the never-read acceptable use and security policies. Tough talking and a disappointing road to traverse, but the enterprise may no longer have a choice – systems cannot secure the organisation alone. With flexible working, dynamic workplaces and fluid workloads set to be a normal business state, every corporate endpoint, whether human or system, has the same responsibility to evaluate and maintain a company-desired security state.
And this closes the security predictions overview for the first part of 2016. Whether it's the increasingly mobile user, interaction with intelligent devices or "things", or dynamic services delivered by highly innovative new market entrants, optimum security will ensure the unquestioned benefits of this increasingly "digital" world arrive with minimal sting in the tail. I am not implying optimum security has never been important before, or isn't delivered today by highly effective practitioners; it is, and the fact that it is minimises the negative consequences only a mouse click away. But everything we have delivered before is now under attack in a manner beyond our traditional level of understanding, so it's time to "deliver now" but with tomorrow's expectations in mind. Time to change (PS: I am not advocating "patch management" for people – or am I?).
Until next time
Chief Technologist, Computacenter UK – Networking, Security and Digital Collaboration.
The software-defined datacenter (or enterprise) is now the must-have discussion topic within the enterprise IT arena. It describes the evolution of IT services and solutions to leverage the power and flexibility of software to drive ever-changing business outcomes.
But has anything really changed? Hasn't software always supplied the intelligence to hardware, whether it's microcode in firmware, software that programs hardware dynamically, a basic operating system or, ultimately, a front-line application? In a word, yes – but this time things may be slightly different. Software defined, to varying degrees, incorporates all of the above, but this time with the onus on maximising the intelligence within software (and the speed at which new or different intelligence can be added, in cycles an order of magnitude quicker than a hardware-orientated design) and, for many, reducing the intelligence within hardware.
But as software defined starts to gain real momentum, with valid use cases more prevalent, many of the earlier perspectives are shifting. The Holy Grail may not be a world of intelligent software and dumb hardware, but smart everything (both hardware and software). The key to real software-defined success moving forward is an IT landscape built via systemic thinking, delivering almost living or neural IT. This implies the need for greater intimacy between software and hardware – real intelligent intimacy that uses intelligence to be "intelligent".
Picture a modern smartphone – at present one of the best exponents of software and hardware working tightly in sync to deliver an end-user experience (or service). Now think again about the real market leaders in the mobile space: emotion aside, aren't they the vendors where software and hardware (both highly optimised) operate in such a seamless and simple manner that enhanced agility and productivity become a by-product?
Does this challenge and dilute the possibly over-hyped dreams of cost reduction and normalisation aligned with the software-defined moniker? Not really, as commoditisation, the increased speed of virtually everything and improved inherent reliability within modern IT elements are resulting in more for less, more often. But for software defined to really be the road ALL traverse, the magical amalgam of software and hardware working together, with a level of almost human intimacy and intelligence, is the journey that will compel the masses to join and accelerate.
Times are changing, changing times.
Until next time
Colin Williams is a Networking, UC and Security Chief Technologist for Computacenter. Please note the content of this blog reflects the personal perspective of Colin Williams and not necessarily the viewpoint of Computacenter UK.
Congratulations to anyone who spotted the above to be a quote from the third President of the USA, Thomas Jefferson; although he may have said it in 1803, the relevance remains today.
I'm no longer sure which generation I belong to. I come from an age when disk drives could be measured in Megabytes; nowadays we don't talk in Gigabytes, and some of us don't even talk in Terabytes any more. We know data is exploding and we know technology develops to cope with this; however, that's evolution, not revolution.
I believe we are at the cusp of the next revolution in technology. To be the next big thing, a technology has to fundamentally change how we do things. It has to change how we look at and think about our world; it has to be revolutionary.
It used to be that we got excited by individual pieces of technology: maybe our first laptop, maybe our first 1TB drive, maybe our first smartphone that we just loved to hold and be seen with.
But whilst these may be considered revolutionary, they remain point solutions – they are single-dimensional.
We're moving into a multi-dimensional world of IT. We're moving from single-dimensional solutions to multi-dimensional solutions:
- Where everything has an impact on everything else
- Where every piece of technology has to interact with everything else
That's just in business; what about the personal world, where your smartphone has to interact with your car, which has to interact with your microwave, which has to interact with your television, so that when you get home everything is in its place? How do you choose? And more importantly, how do you control it all?
The problem with multi-dimensional solutions is that there are so many choices to be made. We are seeing the start of this wave now in the 'Software Defined' world, where it gets harder to identify the components of a solution – but really, why should we care anyway?
So what do we do in this multi-dimensional, software-defined world of IT?
- Should you ignore everyone and continue as you are? After all, it works, doesn't it?
- Maybe putting everything in the cloud and consuming it as a service is the answer
- Why not adopt all the new methods, be seen to be progressive but continue to do everything the same old way?
- What if you adopt every new solution out there and change all your processes to get all that benefit you've been promised? How much disruption would that cause? And what if it doesn't deliver?
It is a minefield out there, and as with all minefields it’s always good to have someone with experience to guide you through it. This is where Computacenter come in.
Every generation has an obligation to renew, reinvent, re-establish, re-create structures and redefine its realities for itself. Get ready for the next generation.
2011 was the year we talked about "Cloud", closely followed by the "Big Data" wave of 2012; 2013 is shaping up nicely as the year of the "Software-Defined" entity, where multiple technologies are being covered by the "SDx" banner. Let's have a brief look at what this means for the world of storage.
In the world of data we are used to constants: controllers that manage the configuration of the environment and the placement of data, disks grouped together using RAID to protect data, and the presentation of this data to servers using fixed algorithms. In effect, when we wrote data we knew where it was going and could control its behaviour; we could replicate it, compress it, de-duplicate it and provide it with the performance level it needed, and when it needed less performance we just moved it somewhere else – all controlled within the storage array itself.
Software-defined storage changes this model; it can be thought of as a software layer put in place to control any disks attached to it. The storage services we are used to (snapshots, replication, de-duplication, thin provisioning etc.) are then provided to the operating system from this layer. This element of control software will be capable of sitting on commodity server hardware, in effect becoming an appliance (initially at least), and will be able to control commodity disk storage.
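To make the model concrete, here is a toy sketch of such a control layer, assuming simplified in-memory volumes: thin provisioning consumes nothing until a write, and a snapshot is a frozen copy of a volume's block map. Real software-defined storage products differ greatly in detail; the names here are illustrative only.

```python
class SoftwareDefinedStorage:
    """A control layer providing data services over any attached disks."""

    def __init__(self):
        self.disks = []      # commodity disks, from any vendor
        self.volumes = {}    # thin-provisioned volumes: name -> block map
        self.snapshots = {}  # point-in-time copies of block maps

    def attach(self, capacity_gb: int):
        # The layer doesn't care whose hardware this is.
        self.disks.append(capacity_gb)

    def create_volume(self, name: str):
        # Thin provisioning: no space is consumed until data is written.
        self.volumes[name] = {}

    def write(self, volume: str, block: int, data: bytes):
        self.volumes[volume][block] = data

    def snapshot(self, volume: str, snap_name: str):
        # A snapshot is a frozen copy of the volume's block map.
        self.snapshots[snap_name] = dict(self.volumes[volume])

sds = SoftwareDefinedStorage()
sds.attach(1000)
sds.create_volume("vol1")
sds.write("vol1", 0, b"data")
sds.snapshot("vol1", "vol1-snap")
sds.write("vol1", 0, b"changed")
print(sds.snapshots["vol1-snap"][0])  # b'data': the snapshot is unchanged
```

The point of the sketch is that every service lives in the software layer; the disks underneath are interchangeable capacity.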
This is not quite storage virtualisation, where a control plane manages a number of storage resources, pooling them together into a single entity; rather, it separates out the management functionality, removing the need for storage controllers – the most expensive part of a data solution. Therefore one of the driving factors for the uptake of software-defined storage is an obvious reduction in cost, and the ability to provide data services regardless of the hardware you choose.
The challenge to this is that data should be regarded differently to other aspects of the environment; data is permanent, packets traversing a network are not, and even the virtual server environment does not require any real form of permanence. Data must still exist, and exist in the same place, whether power has been present or not. We are now starting to see a generation of storage devices (note I was careful not to use the phrase "arrays") which look more capable of offering a software-defined storage service, through the abstraction of the data and controller layers.
So what does this all mean for storage in the datacentre?
My main observation is that physical storage arrays will be with us for a long time to come and are not going away. However, the potential for disruption to this model is greater than ever before; the ability to use commodity-type storage and create the environment you want is compelling. With the emerging ability of software to take commodity hardware, often from several vendors simultaneously, and abstract the data layer, the challenge to the traditional large storage vendors becomes a real and present danger.
I believe the rate of change towards the software-defined storage environment will ultimately be more rapid, and see greater early adoption, than the proven concepts of server virtualisation. It will cause disruption to many existing major vendors, but ultimately end-users will still require copious amounts of disk technology, so the major players will remain exactly that. Whilst some niche players may make it through, the big boys will still dominate the playground.