The Associate Journey so far
It’s hard to believe I have been with Computacenter for over 6 months now. I am sure the rest of the Associates would agree with me when I say that the time has simply flown by. I’d like to use this blog as an opportunity to look over some of the highlights of our first 6 months in the business, which culminated in the Associates’ H1 review at the beginning of the month.

Last month Emily’s blog focused on Computacenter’s values, and what better place to start when looking back over the last 6 months. From day 1 it was clear that there is a definitive Computacenter culture, and its popularity is evident just by looking at the longevity of many people’s careers here. The CC values are infectious, epitomised by the number of people within the business who have given up time in their busy schedules to assist me and the other Associates. I can assure you, we are all very grateful.
During the programme we have all endeavoured to get involved in not only the business but the social side of CC as well. I (the self-anointed star player) and a few of the other Associates managed to get a team together for the annual 5-a-side Football Charity Tournament. This was no mean feat considering the team was made up of recent graduates and the start time was 10 AM… on a Saturday. Somehow we managed to walk away with the Winner’s Trophy, and whilst this was great, the opportunity to meet more people from all around the business and raise some money for charity was the real success.
Another highlight was the inaugural Services University hosted by Julie O’Hara (UK Customer Service Director). It was a fantastic day full of engaging workshops and insightful keynote speeches. It was also another great opportunity to meet more of the service management community. The theme for the evening was ‘Beach’, which resulted in a wide spectrum of horrifically bright outfits. I left Celtic Manor feeling energised and with a real desire to turn up the dial and make the most of my time left on the Associate programme.
The culmination of our first 6 months was the Associate H1 review. This is where each individual on the Associate Programme gets the opportunity to present to senior members of the organisation. We each had half an hour to cover the programme so far, our personal development and our plan for the future; the difficulty was covering all of this in such a short time slot. Everyone had their own approach, and although it was a somewhat nervy experience it ended up being thoroughly enjoyable. In addition, it was great to get some invaluable feedback from some of Computacenter’s most senior individuals. Having spoken to the other Associates, I think we all took away some valuable advice and guidance for both the remainder of the programme and the future.
We are now a third of the way through the programme. It is time for us all to begin turning up the dial and ensure we make the most of this valuable opportunity.
On that note I’d like to say thank you for reading and hope you stop by again for the next instalment, which will be brought to you by the enigmatic Tom Bateson. I’m sure it will be worth a read!
Let us know your thoughts,
In just a few short hours, Microsoft will press the button that will launch Windows 10. It is very eagerly awaited following the success of the Insider Preview that has been running for the past few months, as users across the globe have road-tested the new operating system, or “platform”.
Those of you using Windows PCs in your personal world may already have seen the notifications for, or signed up for, early deployment. This is going to be a phased release; we don’t yet know how it will be rolled out or who will get it first, but the anticipation is building and the mainstream media have been very positive.
Since my last blog on this topic we’ve continued to have conversations with key customers who are keen to adopt Windows 10 early and have been actively exploiting the innovative Insider Preview approach provided by Microsoft.
In terms of planning your move to Windows 10, here are just a few things to consider:
Windows as a Service
First and foremost, there is still some “confusion” over the new servicing model provided with Windows 10. Talk of “automatic updates” is a scary prospect for organisations that, by necessity, have a managed ecosystem of tools and applications supporting the desktop. History has shown that changes to the OS, such as patches and updates, introduce risks and occasional problems, so having this outside enterprise control for the first time is new territory.
Luckily, further details have been provided on this in recent months that show a more “managed” approach for businesses, with two deviations from the “automatic update” model, namely Current Branch for Business (CBB) and Long Term Servicing Branch (LTSB). I will not elaborate on these much further, as there is plenty of material in the mainstream media, but importantly they reflect enterprises’ need for more control over change in a way that does not compromise Microsoft’s new philosophy of continual upgrades and development of the platform. Hopefully a win/win!
The launch of Windows 10 brings several new features that rely on specific hardware capabilities. Cortana, the digital assistant, requires a higher-quality microphone than we usually get on devices. Features like Windows Hello and Passport may also require new hardware, such as a camera capable of biometric authentication. Even a relatively up-to-date device is unlikely to have these capabilities, and therefore we’re waiting for the key OEMs to release new devices specifically for Windows 10. Expect these in late 2015/early 2016.
The perpetual problem of operating system upgrades is application compatibility. Many organisations are still weary from the pain and expense of re-packaging or transforming their applications for the Windows XP to Windows 7 wave, and now face that prospect again; or do they?
Microsoft have made strong claims about application compatibility from Windows 7/8 to Windows 10, quoting approximately 90% compatibility. This is both a very big claim and a great message for organisations seeking to avoid cost and complexity as they accelerate adoption of Windows 10. However, it should not downplay the significance of the change being undertaken, and organisations will still need processes to evaluate and test key applications prior to migration to ensure the business can continue to operate. In the long term, the promise of “Universal Apps” may reduce some of this complexity, catering for the new world of regular platform updates as well as delivering a seamless experience across multiple devices. To exploit this, however, organisations still need to invest in transforming or replacing their current apps with Universal App equivalents.
How to start your Windows 10 migration
There is definitely sufficient promise and benefit from Windows 10 for organisations to be looking at it early. There are a number of key considerations that need to be evaluated, which means while time is still broadly on our side for migrating from Windows 7 and 8, we should consciously think about the future world.
Windows 10 is the destination platform for Microsoft, and given the continued prevalence of Windows applications despite the increasingly heterogeneous mobile world, will become the future platform for most organisations.
To start your Windows 10 journey, we’d suggest looking at specific Workstyles that would benefit from a modern, touch-enabled mobile device, where you may be able to exploit Windows 10 early. With the re-introduction of the Start menu, a solid touch interface and “Continuum” (a clever feature that intuitively switches between the two modes depending on whether the device is docked or not), Windows 10 lets you embrace the new platform to cater for your mobile working needs. The second area is perhaps addressing those users we tried to mobilise with Windows 8. The problems of Windows 8 are well documented: fundamentally it provided neither the mobile, intuitive touch experience that mobile users were seeking, nor the classic desktop environment that more static users demand. Therefore, targeted replacement and upgrade of Windows 8 with Windows 10 for these mobile users may help deliver an “early win” while you understand the impacts and implications of the new software, service and hardware models required in the future.
A final note on Windows Phone
It’s important that Windows 10 is a success across “classic” device types as well as mobile devices. The recent announcements regarding Nokia have made many people question Microsoft’s commitment to phone going forwards. What is clear is that Microsoft absolutely need to continue with their Windows Phone proposition; the hope is that Windows 10, with more focused hardware development (as they do with Surface), gives them the lever to make inroads into the mobile marketplace. While Microsoft are doing some good cross-platform work across Office 365 for their productivity apps, the phone needs to remain a core platform for them. Windows 10 will hopefully get right what has historically been a troublesome area.
Certificates, encryption and lots of TLAs
Things have been pretty quiet in the world of Internet encryption for some time; revelations from Snowden to Hacking Team have had surprisingly little to say on the subject. However, the calm is coming to an end as a raft of changes begin to make themselves felt.
Perhaps this lack of noise is because of where these changes are coming from, not from dramatic and media savvy vulnerabilities such as Heartbleed, Beast and Poodle but rather from some of the Internet’s biggest companies, especially Google and Microsoft.
Both are leaders in operating systems, web browsers and cloud services, so it is little surprise that these companies are trying to drive up the quality of Internet security, not only to help them be perceived as secure by their customers but also to provide a key differentiator against smaller players.
Given that this is a blog post, I have restricted myself to a couple of pages; a longer version of this article is available from the author.
Web browsers, and annoying everyone
All the browser manufacturers are working to improve the security of their products, and though there are subtleties, the core approach is the same. They are all starting to harden warning messages and to turn alerts into outright blocks on access to websites where configuration errors are found. Most importantly, such settings will become the defaults for the next generation of web browsers. Google has also pretty much won the argument on browser updating, and a policy of continuous rolling updates is going to become standard, even for Microsoft. Google’s aggressive deprecation of old products and standards may well become the norm too. This will leave many businesses with even more complex legacy app support issues than they have now.
Certificates for HTTPS secure web
Web encryption can be divided into two key components: the certificate, used to identify a website and commence secret communication, and the ciphersuite, responsible for the encryption of traffic between a user and a website.
There are moves by the industry to improve the quality and effectiveness of both these components. The first is increasing key lengths, which presents no real problem. The second is a little more interesting: the SHA-1 problem. Certificates are signed to prove they have not been tampered with, and the legacy algorithms used for this are starting to show their age. This brings us to SHA-2: Google and Microsoft have decided to require the modern signing algorithm SHA-2, and Google Chrome is already producing error messages which, as Chrome updates, are going to become more forceful. When you renew certificates you will probably have to use SHA-2, despite the fact that some very old browsers or systems might have interesting issues.
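To make the difference concrete, here is a minimal Python sketch using the standard library’s hashlib to contrast the two hash families; the message being hashed is purely illustrative, not real certificate data:

```python
import hashlib

msg = b"example certificate contents"

# SHA-1 produces a 160-bit digest; its weakening collision resistance
# is why browsers are phasing out SHA-1-signed certificates.
sha1 = hashlib.sha1(msg).hexdigest()

# SHA-256, the most common member of the SHA-2 family, produces a
# 256-bit digest and is the minimum browsers now expect for signatures.
sha256 = hashlib.sha256(msg).hexdigest()

# Hex digest length * 4 gives the digest size in bits.
print(len(sha1) * 4, len(sha256) * 4)  # 160 256
```

The longer digest is what buys the extra collision resistance; the algorithms are otherwise used identically, which is why re-signing a certificate with SHA-2 is operationally simple even though old clients may not understand it.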
The chain of trust
A full description of Internet certificates is beyond this article, but one component does need further examination: the chain of trust. To work, a certificate forms a chain with other certificates that links the user’s browser to the website’s server. Browsers will produce an error if the chain is not correct, and increasingly will fail to connect at all. Anyone commissioning a certificate needs to consider how that chain of trust will be presented to their customers and be confident that errors are not produced; look for problems in your supplier’s servers too.
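As a small illustration of why a broken chain fails rather than degrades: a properly configured client enforces chain and hostname checking by default. This sketch uses Python’s standard ssl module; a server that omits an intermediate certificate would fail the handshake against a context configured like this:

```python
import ssl

# create_default_context() verifies the full chain of trust against
# the platform's trusted root store and checks that the certificate
# matches the hostname being connected to.
ctx = ssl.create_default_context()

# Both protections are on by default; a missing intermediate
# certificate or a name mismatch aborts the connection outright.
print(ctx.verify_mode == ssl.CERT_REQUIRED)  # True
print(ctx.check_hostname)                    # True
```

In other words, the burden is on whoever deploys the certificate to serve the complete chain; clients are no longer forgiving about gaps in it.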
The ciphersuite, TLS and confusing names for things
The ciphersuite is a series of configuration items that determine how the encrypted conversation is created. Typically both client and server support a number of different cipher options and a suitable choice is negotiated.
Transport Layer Security (TLS) has replaced the older SSL standard. SSL is an old protocol with problems such as POODLE, together with architectural issues that are corrected in TLS. This seems to cause a lot of concern amongst system owners, because of a largely mistaken belief that clients will have problems connecting to TLS-only systems. In fact, TLS 1.0 appeared in 1999 and support hit mainstream products by 2006. TLS itself has undergone a number of revisions, and modern systems are now expected to support the latest version, TLS 1.2 (released in 2008), with TLS 1.3 currently in draft.
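Enforcing this at the client or server side is a one-line configuration change in most stacks. As a sketch, in modern Python (3.7 or later) the ssl module exposes a minimum protocol version directly:

```python
import ssl

ctx = ssl.create_default_context()

# Refuse SSLv3, TLS 1.0 and TLS 1.1 entirely; any peer that cannot
# speak TLS 1.2 or later will fail the handshake rather than silently
# negotiating down to a weaker protocol.
ctx.minimum_version = ssl.TLSVersion.TLSv1_2

print(ctx.minimum_version)
```

Given how long TLS 1.2-capable clients have been mainstream, setting a floor like this is usually a low-risk change; the clients it excludes are the ones you should be worrying about anyway.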
The encryption algorithm is generally the most recognised part of the ciphersuite; examples are triple DES (3DES) and AES256. This is the actual algorithm used to encrypt the information, and any current installation should offer AES256 to any client that can use it. It is also important to remove legacy algorithms that are no longer considered secure, as an attacker might be able to “negotiate down” the connection in order to decode the traffic; it looks scruffy too.
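Pruning the legacy algorithms is again a configuration exercise. A minimal sketch with Python’s ssl module, using an illustrative OpenSSL cipher string (the exact string you deploy should follow your own security policy):

```python
import ssl

ctx = ssl.create_default_context()

# Offer only AES-GCM suites with ECDHE key exchange, and explicitly
# exclude legacy algorithms an attacker could try to negotiate down to.
ctx.set_ciphers("ECDHE+AESGCM:!3DES:!RC4:!DES:!MD5")

# Inspect what the context will actually offer after pruning.
for cipher in ctx.get_ciphers():
    print(cipher["name"])
```

Whatever the stack, the principle is the same: enumerate what is actually offered after the change, rather than assuming the exclusion list worked.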
Perfect Forward Secrecy
Though not a new idea, PFS is starting to gain more support for web security, as it avoids a significant single point of failure in web encryption. Most commonly used systems have a single encryption key that is used for all connections; if this key is compromised then all traffic could be accessed, even traffic intercepted years earlier. PFS algorithms instead create a temporary, or ephemeral, key and generate a new one when required. Try Googling ECDHE for equations and graphs galore.
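The ephemeral-key idea can be shown with a toy finite-field Diffie-Hellman exchange. This is purely illustrative: the parameters below are far too small for real use, and real PFS ciphersuites use ECDHE over elliptic curves, but the mechanism is the same: fresh private values per session, so compromising one session reveals nothing about earlier traffic.

```python
import secrets

# Toy parameters: a Mersenne prime modulus and a small generator.
# Real deployments use standardised groups or elliptic curves.
P = 2**127 - 1
G = 3

def ephemeral_exchange():
    """One session's key agreement with freshly drawn private values."""
    a = secrets.randbelow(P - 2) + 1   # client's ephemeral private value
    b = secrets.randbelow(P - 2) + 1   # server's ephemeral private value
    A = pow(G, a, P)                   # public values exchanged in clear
    B = pow(G, b, P)
    # Each side combines its own private value with the other's public
    # value; both arrive at the same shared secret.
    return pow(B, a, P), pow(A, b, P)

s1, s2 = ephemeral_exchange()
print(s1 == s2)  # True: shared secret agreed without ever transmitting it
```

Because the private values are discarded after the session, there is no long-lived key whose later compromise would unlock recorded traffic; that is the whole point of “forward” secrecy.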
…and in conclusion
It’s easy to see web security as a solved problem; technologies such as public key infrastructure and HTTPS have been with us a long time, and to most users they appear stable and rather dull. The reality is that the Internet can be a hostile place and the cryptography that underpins it is under constant scrutiny. The result is a widening gap: legacy systems that were adequate for the task 10 years ago are now cause for serious concern. Such weaknesses are also very easy to detect, making automated attacks practical. The big technology and service providers are increasingly marketing security as a differentiator, and as cloud platforms become more prevalent the need for that security becomes even more pressing.
I remember the days when using awk, sed and grep on a log file was a really powerful way to extract useful data to help troubleshoot issues or better plan complex application deployment and management.
Now the amount of data that is generated by systems, applications and devices has proliferated to the extent that we are unable to use the old techniques to get information from the systems managed today.
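For a sense of the “old techniques” being described, the classic grep/awk pattern — filter lines by severity, extract a field, count occurrences — takes only a few lines. The log format below is a hypothetical syslog-style layout invented for illustration, not a real Computacenter log:

```python
import re
from collections import Counter

# Hypothetical syslog-style lines; the field layout is an assumption
# for illustration only.
log = """\
2015-07-28 09:01:12 WARN  disk    /dev/sda1 85% full
2015-07-28 09:02:40 ERROR app     connection refused to db01
2015-07-28 09:03:05 ERROR app     connection refused to db01
2015-07-28 09:05:11 INFO  backup  nightly job completed
"""

# date, time, severity, then the subsystem name.
pattern = re.compile(r"^\S+ \S+ (\w+)\s+(\w+)")

# The grep/awk equivalent: keep only ERROR lines, pull out the
# subsystem field, and tally how often each one appears.
counts = Counter(
    m.group(2)
    for line in log.splitlines()
    if (m := pattern.match(line)) and m.group(1) == "ERROR"
)
print(counts)  # Counter({'app': 2})
```

This works beautifully on one file on one box; it is exactly this approach that stops scaling once the data comes from thousands of systems, applications and devices at once, which is where tools like Splunk come in.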
A popular route for analysts is to download software on their laptops to help with this challenge and one of the more popular choices is visual analytics from Splunk. This personal need and learning has driven a “Shadow IT” style of adoption of the tool for Operational Intelligence in organisations. The Computacenter UK Infrastructure Operations team experienced some early success in this very manner. The initial benefits were amazing, but thought was needed on how to evaluate it as a corporate tool in order to drive operational efficiency and intelligence across the Global Managed Services Business.
Rather than use a traditional approach for a proof of concept and pilot phase, which would take weeks to plan and more time to execute, it was decided to try something different. Something more agile was needed in order to benefit from quick results, test the software and its ease of use, and explore the other business benefits it could drive.
So with a little gamification and the flexibility of our Solutions Center, a competition was conceived…
The competition ran for four weeks between teams from across the Computacenter Group. The challenge was to use Splunk’s visual analytics tool to address a Managed Services business problem in that time, with just an afternoon’s worth of training. The teams were based in the UK, Germany, South Africa, Hungary and Spain, drawn from both Service Desk and Infrastructure Operations.
All participants were given an overview of the tool and, as mentioned, half a day’s training, which was run from the Solutions Center in Hatfield and broadcast to the other countries via live presentation and video feed. A central infrastructure was provided with the software pre-installed.
The results were amazing. All the participants were data analysts, so they knew exactly what they wanted to get out of their data and were able to visualise it in the short space of time given to them. With varying degrees of help from Splunk experts, all were able to create compelling, business-relevant dashboards in just four weeks, with very little training and while still doing their operational day jobs.
The results have shown us the art of the possible, and we can now start further planning the use of this innovation-driving software.