Archive | November 2015

Antivirus in rugged health

Anyone who is fortunate or unfortunate enough to spend much time at security conferences will be used to being told that antivirus is dead, usually by people who want to sell you something else. To me, that has always sounded like more than a touch of exaggeration.

IT can be dogged by statements like this. Indeed, a very traditional antivirus product that simply compares files against a list of malware signatures is coming to the end of its usefulness, but that doesn’t make antivirus technology redundant, just that one method of detection.

Modern endpoint protection suites are something very different. Such products still have signatures at their core, but not only signatures for files: there are signatures for behaviour, network traffic and even file origin. These suites integrate with the computer to provide a firewall, intrusion prevention, browser protection and more, delivering a layered protection model that is far more effective than the old signature-only model. Nothing new there for anyone who is remotely interested in such things. Now compare this with the way modern malware has changed how it attacks clients. Two great examples are the generic downloader and cryptoware:

A generic downloader is normally embedded in an email attachment and is, in part, a return to the macro virus. It will attempt to manipulate a user into running an Office macro designed to collect malware from the Internet or from another infected machine. The generic downloader is not itself the malware. The attacker can change the downloader code almost constantly to avoid traditional signature-based scanners. However, its behaviour must remain broadly similar: connect to the Internet, download the true malware and then execute it. It is this sort of behaviour that modern systems can defend against and that traditional systems struggle with.
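
To make that concrete, here is a minimal sketch of what a behaviour-based rule might look like, assuming a hypothetical event feed from the endpoint. The event model, process names and rule below are illustrative assumptions, not any vendor’s detection engine.

    # A minimal behaviour-rule sketch (Python). Everything here is
    # illustrative: real engines correlate far richer telemetry.
    from dataclasses import dataclass

    @dataclass
    class ProcessEvent:
        process: str   # e.g. "WINWORD.EXE"
        action: str    # "network_connect", "file_write" or "execute"
        target: str    # remote host or file path

    def looks_like_downloader(events: list) -> bool:
        """Flag the downloader pattern: an Office process connects out,
        writes an executable to disk, then runs it."""
        office = {"WINWORD.EXE", "EXCEL.EXE", "POWERPNT.EXE"}
        connected = wrote_exe = False
        for e in events:
            if e.process.upper() not in office:
                continue
            if e.action == "network_connect":
                connected = True
            elif e.action == "file_write" and e.target.lower().endswith(".exe"):
                wrote_exe = True
            elif e.action == "execute" and connected and wrote_exe:
                return True  # the sequence matches, whatever the file hash
        return False

The point is that the rule keys on the sequence of actions rather than on any particular file signature, so constantly rewriting the downloader code does not evade it.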

It is much the same for cryptoware, also known as ransomware. A user is tricked into downloading a cryptographic client, which then starts encrypting files in a way almost indistinguishable from a legitimate request from the operating system, until the luckless user is asked to pay a ransom to get their files back. It is therefore the behaviour of the client as the malware loads and tries to contact the dark corners of the Internet that creates an opportunity for detection, even if an exact file signature is not available.
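
One common heuristic, sketched below under illustrative assumptions (the thresholds are mine, not any product’s), is to watch for a burst of high-entropy rewrites: encrypted output is close to random, so it measures near 8 bits of entropy per byte.

    # A hedged ransomware-heuristic sketch: flag a burst of writes whose
    # contents look like ciphertext. Thresholds are illustrative only.
    import math
    from collections import Counter

    def shannon_entropy(data: bytes) -> float:
        """Bits per byte; encrypted or compressed data sits close to 8.0."""
        if not data:
            return 0.0
        total = len(data)
        return -sum(c / total * math.log2(c / total)
                    for c in Counter(data).values())

    def suspicious_burst(recent_writes: list, entropy_floor: float = 7.5,
                         burst_size: int = 20) -> bool:
        """True when many recent file writes look encrypted."""
        high = [w for w in recent_writes if shannon_entropy(w) >= entropy_floor]
        return len(high) >= burst_size

A caveat: compressed files also score high, which is why such heuristics are combined with other behavioural signals, such as contact with known command-and-control hosts, rather than used alone.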

This leads to a couple of important security points. First, these modern suites only really work if the whole suite is installed and enabled within a suitable framework. Second, such suites need to be managed throughout their life to make sure they continue to deliver the required level of protection. For example, the anti-malware client itself might contain vulnerabilities, so patching all of your security software needs to be considered alongside patching your operating systems and applications.

Then there is always the cloud to think about. Antivirus made use of cloud services long before they were called that, for tasks such as downloading signature updates. Over the past few years much more interesting use has been made of cloud services for anti-malware. The cloud can support anti-malware software running on a client, for example by checking a file’s reputation or source against cloud databases, or to some extent replace it by forcing all Internet traffic through a proxy server. The cloud proxy server will have the latest signatures, reputation data, blacklists and so on, continuously refreshed.
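
The file-reputation idea is simple enough to sketch: hash the file locally and ask a cloud service what it knows about that hash. The service URL and response format below are hypothetical stand-ins, not a real vendor API.

    # A minimal file-reputation lookup sketch. The endpoint and JSON shape
    # are assumptions for illustration only.
    import hashlib
    import json
    from urllib.request import urlopen

    REPUTATION_URL = "https://reputation.example.com/v1/filehash/"  # hypothetical

    def file_reputation(path: str) -> str:
        digest = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(65536), b""):
                digest.update(chunk)
        with urlopen(REPUTATION_URL + digest.hexdigest()) as resp:
            verdict = json.load(resp)
        return verdict.get("classification", "unknown")

Only the hash leaves the machine, so the lookup is cheap and reveals little about the file’s contents, while the cloud database can be refreshed far faster than any locally held signature set.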

There have been, and remain, all sorts of ideas for protecting client computers using technology that doesn’t rely on the endpoint itself, especially when that client is virtualised. But to be really effective in delivering the protection needed, a complex local client is still required. Laptops need additional thought, as they are exposed to far more threats than a data-centre-supported virtual desktop.

I think we can be forgiven for occasionally referring to these modern solutions by the old name of antivirus, and the next time a salesman tells you AV is dead, just think what else can work with application-level encryption, third-party removable storage and airport hotspots hundreds of miles away from a friendly network.

Why wait until the year 2020 when “2020 IT” is needed now – “time to hurry up”

A few months ago I scribbled about the need to develop and deploy Information Technology systems (“IT”) now with 2020 in mind. In Arthur C. Clarke style I discussed the need for a change of thinking and the importance of considering all of the interconnected elements (many quite embryonic), given the astonishing level of business change currently affecting us all. Through 2015 it has become apparent that the year 2020 shouldn’t be deemed a distant milestone: we need whatever we envisage “IT” will deliver in 2020 – today.

Data isn’t exploding; it has already exploded, and it will keep doing so every second, minute and hour of every day. We may never successfully control it, but many will harness it to unlock unimaginable personal and business value. The connected society will continue to be the heartbeat of everything we do (and I do mean everything), and both personal and business expectations will increase every time benefits are realised. Whether it’s the relentless march of smart devices (even I have an Apple Watch), the rise and rise of the “app for everything” culture (OK, nearly everything), the Internet of Things optimising our everyday existence or always-available (but not always effective) Internet and device connectivity – we are now a connected-device-dependent society. Our imagination is the catalyst for digital entrepreneurship, energised by the view that IT “can”, but the gloss is not without a little matt. If digital business gain must be balanced or tempered by digital data loss, is it really a gain at all? Maybe agile security is the new must-have security persona, as systems that learn and evolve as threats and attacks evolve must be the only effective way forward.

And that means the personal and business outcomes previously considered “too radical” or “far out there” are many of the outcomes EXPECTED today. We have been here before, and dare I say it, many times through previous IT revolutions and business evolutions. Each time, the step change was delivered in somewhat controlled proportions and allowed the essential, but at times loose, coupling of IT and business to be maintained. But it feels different now, very different. The expectations of enterprises today, buoyed by the belief that software can achieve “anything” and that the connected enterprise can stitch together the business fabric required, are straining traditional IT operational models, architectural frameworks and delivery outcomes. The people-change impact is underplayed, often overlooked, but key to a successful and long-lasting evolution to a truly digitally enabled enterprise. The notion that IT and business can run as separate entities is a fallacy: IT and the business must be interlocked to such an intimate and fundamental degree that even non-IT-bound businesses may fail to be effective without IT in the midst of the current “digital economy”.

The expectation that “IT 2020” is realisable today is affecting application development and release to a profound degree. The change can no longer be avoided, and even for the more traditional enterprises, accelerated, iterative (“agile-like”) development and operational styles are no longer activities undertaken by “others” but essential modes required to keep up (forget about moving ahead) with a business landscape changing at warp speed. And as the power of “IT 2020” really accelerates, with the IoT/IoE quasi-social experience becoming the norm, we will start to experience today the benefits of the people-and-systems intimacy that will underpin our societal existence in 2020.

Things really are different now, and for me different is good, unlocking possibilities and opportunities for all. With the market change agents continuing to blaze the trail with everything from healthcare via video and personal payment systems on a watch to home energy management via a smartphone, the IT systems of today must change to ENABLE change or they will hinder it. That’s why 2020 is too late for “2020 IT” – the time is now.

Until next time.

Colin W

Twitter @colinwccuk

Chief Technologist, Computacenter UK – Networking, Security, UC

Storage Wars – The Force Awakens

I’m allowed some Star Wars geekery occasionally!

With the imminent launch of the latest Star Wars movie, I turned to thinking about the generation of images used in films. We think less and less about the computer-generated images we see in movies, simply accepting them as part of the action, even though the wow factor is still there.

We know that those buildings are not really destroyed; the Golden Gate Bridge has not really been devastated 20 times in recent movies. We know it’s Computer Generated Imagery (CGI), but have we ever thought about the technology required to create these sequences?

Most important in this process is the role of the storage environment; it’s imperative to be able to process images quickly and to render them in a timeframe that minimises cost and production time.

This is one of the places where Flash-based storage arrays really shine; the ability to deliver output rapidly means that my Star Wars experience happens in 2015, and not several years from now.

Remember, the original Disney cartoons took several years to make, but now several can be produced every year; Flash storage solutions are one of the key factors behind this.

Now, performance isn’t always everything, but in the film industry it can be.

Whilst I genuinely have no preference for technology vendors, occasionally there are some things you just have to highlight. One of these has been our recent testing of the HP StoreServ 20850 storage array. Having recently achieved world-record results in the SPC-2 benchmark, the 20850 became an obvious candidate for Computacenter to evaluate whether the claims could be substantiated in a real-world scenario.

The performance of this array has been blindingly fast, and it is one of the few that actually match the vendor’s claims. Having tested several vendors’ solutions, the HP 20850 has stood with the best of them in terms of both price and performance. Combining this with improved manageability makes the HP 20850 a compelling solution for customers across a wide range of applications, and supports customers in their move to the silicon datacentre.

The HP StoreServ represents a return to form for one of the major players in the storage industry, and is available for customer demonstration with a variety of simulated workloads or customer-specific tests using actual data, in the Computacenter Solution Centre in Hatfield.


To (almost) quote Darth Vader: ‘HP StoreServ 20850 – The Force is Strong with This One’.

Avoiding the Doping Scandal in Storage Performance

Some Performance is More Equal than Others

For those who don’t know, my background is in Mathematics & Physics which, as a wise man once pointed out to me, is why I have OCD tendencies around numbers.

I like precision. I don’t like estimates or guesstimates, and I’m not a big fan of vendor spreadsheets that show how their technology will reduce your capex or opex and provide virtually immediate ROI, because we all know there are so many variables that they cannot possibly be particularly accurate.

If I followed these models, I could ultimately go in ever-decreasing circles until I have ultimate performance, at little cost, with no footprint, and it pays for itself before I’ve bought it. Hooray for that!

Back in my precise world it’s important that we know what is realistically achievable, and more importantly what is achievable in specific environments with specific applications. One thing we have learned is that whilst all storage technology may look similar from the outside, it doesn’t always perform in a similar manner. One question I’m asked repeatedly is how to decide between vendor technologies and what the optimal solution for a customer is.

The answer is not simple: there are many variables that can affect the performance of any storage environment, which is why, for specific workloads, one solution will work better than others against specific criteria. When sizing storage solutions we need to look at a multitude of variables (a rough sizing sketch follows the list below):

  • Performance requirements in terms of IOPS, Latency & Bandwidth
  • Read / Write ratios
  • Application usage
  • Block size in use
  • Typical file sizes
  • Whether compression is applicable, and how well data may compress
  • Deduplication and how well data can be deduplicated
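
To make the arithmetic concrete, here is the rough sizing sketch promised above, under deliberately simple assumptions; real sizing also has to factor in latency targets, RAID overheads and how the read/write mix interacts with the media.

    # Back-of-envelope storage sizing (illustrative, not a vendor tool).

    def required_bandwidth_mb_s(iops: int, block_size_kb: int) -> float:
        """Sustained bandwidth implied by an IOPS target at a block size."""
        return iops * block_size_kb / 1024.0

    def effective_capacity_tb(raw_tb: float, compression_ratio: float,
                              dedupe_ratio: float) -> float:
        """Logical data stored, given data-reduction ratios."""
        return raw_tb * compression_ratio * dedupe_ratio

    # Example: 50,000 IOPS at 8 KB blocks implies ~391 MB/s of sustained
    # throughput; 100 TB raw at 2:1 compression and 1.5:1 dedupe holds
    # roughly 300 TB of logical data.
    print(required_bandwidth_mb_s(50_000, 8))    # 390.625
    print(effective_capacity_tb(100, 2.0, 1.5))  # 300.0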

Now here comes the challenge: 64% of IT organisations don’t know their applications’ storage I/O profiles and performance requirements, so they guess. The application owner may know the performance and capacity requirements reasonably well, but adds extra to accommodate growth and ‘just to be safe’. The IT department takes those requirements and adds some more for growth and ‘just to be safe’, because ultimately we cannot have a new storage subsystem that does not deliver the required performance. Those margins compound: two successive 50% safety buffers turn the real requirement into 2.25 times the purchase.

This means performance planning can be guesswork, with substantial under- or, more likely, over-provisioning, and the unseen costs of troubleshooting and administration adding more significant overheads than should be necessary.

The ultimate result of this can be a solution which meets all the performance requirements but is inefficient in terms of cost and utilisation.

This is where Computacenter comes in; working closely with our latest partner, LoadDynamix, we can:

  • ACQUIRE customer-specific workloads and understand the exact requirements
  • MODEL workloads to understand the scale of solution required, and ramp up workloads to find the tolerance of existing infrastructure
  • GENERATE workloads against proposed storage platforms to ascertain the optimal solution and how many workloads a platform can support
  • ANALYSE the performance of proposed solutions with factual data, not vendor marketing figures

This approach provides an exact science for sizing the storage solution, and coupling it with Computacenter’s real-world experience ensures my OCD tendencies can be fully satisfied.

The Computacenter / LoadDynamix partnership announcement can be found here:

http://www.computacenter.com/news/151006_Load_DynamiX.asp

I like accuracy; working together with LoadDynamix we can achieve that not just for me, but more importantly for our customers and their users.

Coming soon – look out for the #BillAwards2015, announced in December. Want to know who wins these prestigious awards? Follow me on Twitter @billmcgloin for all the answers.