
When was the last time you took a memory test?

Despite the proliferation of devices now available, it’s good to see recent surveys showing the continuing relevance of the humble PC.

One recent Intel survey on the ‘Importance of the laptop’ found that over 80% of respondents agreed with the statement “I often use my computer when I need to get things done that matter”, and a further 70% agreed that “I feel that the time I spend on my computer is time well spent”. Results like these confirm that, for most people, the PC is still the preferred tool for productivity and getting work done.

The business tasks and workloads we ask our PCs to handle continue to grow. Ever more multitasking, with numerous applications running in the foreground and background, risks turning an essential work tool into something that slows our productivity and detracts from the user experience.

Whilst silicon manufacturers do a fantastic job of evolving their CPUs to keep pace with our increasingly power-hungry requirements, other aspects of the system have started to become performance bottlenecks. A number of years ago it was considered a big advance when IT decision makers began realising the benefits of Solid-State Drives (SSDs) and approving their use over traditional ‘spinning’ Hard Disk Drives. Whilst SSDs do indeed offer greater performance than their spinning rivals, they still lag behind the volatile DRAM that makes up the PC’s main system memory.

Introducing Intel Optane

To increase SSD performance, Intel has developed Intel Optane Memory H10 with Solid State Storage. Originally conceived as a datacentre technology, Optane is now offered in a single-drive device that combines Optane memory with Intel’s high-speed QLC NAND SSD technology.

Products based on Intel’s Optane technology represent a different approach to the traditional SSD. A defining characteristic of Optane is that the memory is significantly faster than that used in current NAND SSD drives. Unlike DRAM, the PC’s main system memory, Optane is non-volatile, which means data written to it remains even after the PC restarts.

Although the Intel Optane Memory H10 with Solid State Storage solution combines both Optane and NAND SSD memory, the user sees only a single drive. Behind the scenes, an intelligent memory controller and Intel’s Rapid Storage Technology driver take care of the workload optimisation. The drive constantly monitors how the user works day to day – which applications are used most and which data is accessed most frequently – and dynamically moves those common workloads into the higher-performing Optane memory.
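
As a purely illustrative sketch of the idea – not Intel’s actual controller logic – the Python snippet below models frequency-based tiering: blocks that are read often enough get promoted into a small, fast tier, while colder blocks stay on the larger, slower one. All class names, thresholds and block IDs are hypothetical.

    from collections import Counter

    class TieredDrive:
        """Toy two-tier drive: a small fast tier (Optane-like) in front of a
        large slow tier (NAND-like). Hypothetical logic, for illustration only."""

        def __init__(self, fast_capacity_blocks=4, promote_after=3):
            self.fast_capacity = fast_capacity_blocks  # blocks the fast tier can hold
            self.promote_after = promote_after         # reads before a block counts as 'hot'
            self.fast_tier = set()                     # block IDs currently in the fast tier
            self.access_counts = Counter()             # running read counts per block

        def read(self, block_id):
            self.access_counts[block_id] += 1
            tier = "fast" if block_id in self.fast_tier else "slow"
            self._maybe_promote(block_id)
            return tier

        def _maybe_promote(self, block_id):
            if block_id in self.fast_tier or self.access_counts[block_id] < self.promote_after:
                return
            if len(self.fast_tier) >= self.fast_capacity:
                # Evict the least-read block, but only if it is colder than the candidate.
                coldest = min(self.fast_tier, key=lambda b: self.access_counts[b])
                if self.access_counts[coldest] >= self.access_counts[block_id]:
                    return
                self.fast_tier.discard(coldest)
            self.fast_tier.add(block_id)

    drive = TieredDrive()
    for _ in range(5):
        drive.read("daily-report.xlsx")      # frequently accessed data becomes 'hot'
    print(drive.read("daily-report.xlsx"))   # -> 'fast': later reads hit the faster tier

Real hybrid drives apply this kind of policy at the block level, transparently to the operating system, which is why the user and IT support only ever see a single volume.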

The Benefits of Intel Optane

Users rarely work with only one application at a time, so they demand systems that can cope with their multi-tasking needs. Even users who don’t think they are multi-tasking typically are, given the growing number of background tasks their machines run. User experience remains a challenge, and this is where Intel Optane can help by providing:

  • Improved performance – A more responsive PC that reduces time spent waiting for things to happen.
  • Security – Support for industry standard encryption, including secure erase. 
  • Ease of use – Despite there being two components, the user and IT support will only see a single storage device.

By utilising Intel Optane, organisations can continue to run today’s demanding applications whilst allowing users to get more work done faster, improving both productivity and user experience.

Intel Optane in Action

Adding more system DRAM has long been the popular choice for increasing a PC’s responsiveness, but with Intel claiming roughly a 2x performance increase over a standard SSD, an Optane-enabled SSD could offer a better route to improved performance and, ultimately, a better user experience.

The graph below highlights the potential performance gains of Intel Optane when considered as an alternative approach to doubling up on system memory.   

Intel Optane Memory H10 with Solid State Storage is currently available in the following capacities –

  • 256GB SSD featuring 16GB of Optane memory
  • 512GB SSD featuring 32GB of Optane memory
  • 1TB SSD featuring 32GB of Optane memory

What Next?

The leading PC manufacturers are already including Intel Optane storage options and configurations on the majority of their latest commercial products. If you are looking to add DRAM in the hope of increasing PC responsiveness and performance, talk to us about how you can test Optane technology for yourself – it is likely to deliver an improved end-user experience, and who doesn’t like happy users?

Windows Virtual Desktop – Why VDI? Why now?

In a previous blog (April 2019), while Windows Virtual Desktop (WVD) was still in beta, I explored its features and debated the importance of this move by Microsoft into the world of virtual desktop infrastructure. Computacenter has been working with Microsoft and tracking the development of WVD through Public Preview and General Release.

In this blog I will explain:

  • Why you should be interested in it
  • What it means for other vendors
  • How you can know if WVD is right for you

Why should you be interested in WVD?

Despite the initial excitement around virtualising desktops, born of its success in the server world, VDI has remained at 10-20% of the desktop estate of large organisations. From the early premise that everyone should have one, we now focus on specific use cases where the benefits stack up. With WVD, Microsoft are focusing on three scenarios:

  • Replace/migrate on-premises virtual desktop deployment

At some point you’re going to need to refresh your existing virtual desktop infrastructure, which will be both time-consuming and costly. With many companies boasting a ‘cloud first’ strategy and an ongoing modernisation of application portfolios, migrating those workloads must be considered.

  • New Windows virtualisation

The experience of using and managing virtual desktops has become significantly easier in the last few years, whilst the challenge of effectively maintaining physical desktops is arguably becoming harder. Whether it’s a tactical workload like third-party access or something more strategic, the ability to pilot and develop on a cloud platform removes a lot of initial investment.

  • Windows 7 end of support

There will be organisations out there whose Windows 10 plans are being hampered by problematic Windows 7 applications. Migrating those workloads to Azure will give you the extended support needed – and so the time – to allow that final remediation to take place. From a compliance point of view, it’s certainly a better place to be.

Single versus multi-cloud strategies

The main alternative to desktop virtualisation is giving people a laptop, but let’s assume you’ve addressed that and your use cases for virtualisation are defined. If WVD has a limitation, it is its dependence on Azure. If that is an issue, it’s worth remembering that WVD is in fact two separate constructs: a ‘broker’ and a ‘licensing entitlement’.

With the licensing entitlement, you can choose Citrix, VMware, or one of a number of systems integrators offering turnkey DaaS solutions as the broker to those Azure desktops. The advantage of those options is the ability to run workloads not just in Azure but on-premises and on other public clouds from a single management plane. This could expand the number of users that can be included within scope. It also means that public cloud could become your disaster recovery site of choice, offering constantly refreshed hardware at a fraction of the cost while not powered on.

You also need to consider where your desktops reside based on the applications they need to access. With so many legacy applications hamstrung by latency sensitivity, the proximity of the application to the desktop could be paramount to the user experience and, therefore, to the success of the project.

Assessing if WVD is right for your organisation

Whether you are new to desktop virtualisation or looking to transform an existing deployment, Computacenter would recommend the following approach:

  1. Understand your business requirements and the needs of your users

Ensure you are clear on what WVD can deliver that physical machines can’t, and match that to the needs of the business now and in the foreseeable future. Define your user workstyles at a conceptual level and use end user analytics to collect the empirical data that will help you understand which users are a good fit for WVD.

  2. Use proofs of concept and early user pilots to gain confidence and understanding

One of the most powerful aspects of public cloud is how fast you can be up and running. Test the scenarios you’ve identified and the applications that are in scope to confirm the user experience. Target users to pilot the environment and gain real-world feedback. Positive experiences will help gain momentum in the next phase.

  3. Build the business case and plan the deployment

Align identified business metrics to the capabilities of WVD. Baseline those metrics and be clear on how you can measure and so show improvement on them. Consider how ongoing application strategies may impact when people can be deployed and where their desktops should be placed.

Microsoft embracing desktop virtualisation is fascinating, and the long-term benefits for everyone must be a positive. Citrix and VMware have been talking about the benefits of public cloud for VDI for a long time, but few large-scale deployments have moved fully to it. It’s fair to say that many on-premises VDI deployments were not deployed optimally, and if you were to do it again you’d probably do it differently. Public cloud forces you to revisit those decisions from both an operational and a cost point of view. Revisiting desktop virtualisation also forces you to look at the use cases you are supporting and re-evaluate them. Are you supporting how your users and the business want to work, or making them work in a certain way because of the technologies you’ve implemented?

Desktop virtualisation offers capabilities that physical desktops cannot. Public cloud offers benefits that are hard to achieve on premises. Neither will bring success though if the right users and workloads aren’t identified.

Let Computacenter help you decide if WVD can benefit your organisation.

Are you ready for the app attack?


Guest blog from Simon Minton, Global Cyber Security Advisor at Cisco

Find out why taking a Zero Trust approach to developing and provisioning apps can help prevent security breaches

Sharing meeting notes. Processing customer transactions. Logging expenses. Signing contracts. More and more business processes are getting the app treatment. And that means more and more data is being exposed to potential security threats.

To ensure apps deliver on stakeholders’ agility and efficiency expectations, organisations are increasingly using the cloud to provision functionality to users both in the workplace and beyond. Apps aren’t just being provisioned via the cloud; they are being developed in the cloud too – and that introduces another layer of complexity and risk.

Cloud-native development enables organisations to build and update apps quickly. But the speed at which apps evolve can result in security being overlooked – especially as organisations increasingly bring application development back in-house due to its strategic and competitive importance.

Join the DevSecOps revolution

The need to balance security with agility has given rise to a new operating model in the app development world. DevSecOps isn’t just about adopting new processes and tools; it’s about adopting a new mindset in which everyone in the app lifecycle is responsible for security – whether they are a developer, a business stakeholder or a user.

DevSecOps shifts security from a bolt-on activity late in the process of application development, when much of the architecture has already been defined, to a fundamental part of design, build and continuous delivery.

In order for DevSecOps principles to take root in an organisation, developers need to be encouraged to take ownership of security, much like they are incentivised to develop metrics around application availability and performance.

Most data breaches arise from two interlinked scenarios: exploitation of the application itself and/or exploitation of the infrastructure hosting it. Several recent high-profile breaches occurred because of a misconfiguration of the supporting cloud infrastructure. The shared security model adopted by all cloud providers puts the onus on their customers to ensure that cloud services are properly configured.

Ensuring developers and IT security teams work together to proactively remediate misconfigurations in an application or infrastructure can help to reduce the impact from an incident or breach. Data analytics will be increasingly important for both teams when pinpointing application and cloud misconfigurations as well as malicious activity.

Monitoring solutions that leverage machine learning and behavioural modelling can provide visibility of activity not only on the network but also within the development environment and across cloud resources – which can act as an early warning of a potential security breach on an app or within the broader ecosystem.

For example, Cisco Stealthwatch collects and analyses network and cloud telemetry and correlates threat behaviours seen locally within the enterprise with those seen globally to detect anomalies that might be malicious.

To trust or not to trust

Advanced threat detection solutions can also help to identify policy violations and misconfigured cloud assets that could compromise the future security of an app. But visibility into potential app vulnerabilities needs to go one step further.

With internal and external developers increasingly using internet-based open source elements, such as software libraries, to accelerate time-to-market, apps have become a patchwork of unseen – and often unknown – components. All of which could introduce unexpected risks and dependencies.

Around 80% of an enterprise application is created using open source software libraries downloaded from the internet. Organisations often have very limited understanding of the risks inherent in these libraries or lack the policies needed to remediate known vulnerabilities.

By adopting a Zero Trust approach (where everything must be validated before it can be trusted) to app development, organisations will be able to identify potential security flaws much earlier. This will not only save time and money but also avoid reputational damage.

A Zero Trust approach can also be extended beyond the development stage to the entire lifecycle of the app. Users and devices accessing apps also need to be regularly validated to ensure they are not trying to launch an attack or steal data.
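
As a purely illustrative sketch of that principle – with hypothetical names and checks, not any specific vendor’s API – a Zero Trust gate re-evaluates identity, device posture and context on every request instead of granting standing trust:

    from dataclasses import dataclass

    @dataclass
    class AccessRequest:
        user_authenticated: bool   # e.g. identity verified with MFA
        device_compliant: bool     # e.g. managed, patched, disk-encrypted
        location_risk: str         # 'low', 'medium' or 'high'

    def allow_request(req: AccessRequest) -> bool:
        """Hypothetical Zero Trust gate: every request must prove itself;
        nothing is trusted simply because it originates 'inside' the network."""
        if not req.user_authenticated:
            return False
        if not req.device_compliant:
            return False
        if req.location_risk == "high":
            return False  # in practice this might trigger step-up authentication instead
        return True

    # Each access to the app re-runs the checks - there is no standing trust.
    print(allow_request(AccessRequest(True, True, "low")))    # True
    print(allow_request(AccessRequest(True, False, "low")))   # False: non-compliant device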

By getting smarter about how they provision and develop apps from the cloud, organisations will be able to protect thousands of employees and customers and provide a richer and safer app experience.

The Retail Engagement


Given the recent turbulence in retail, it’s difficult to imagine a company that is not looking at strategies and ways to arrest its decline. A big part of these strategies involves looking at ways to improve employee engagement.

Improving employee retention, for example, is a common topic in this sector. High staff turnover is somewhat expected in retail, and for some companies it’s considered a positive signal: attrition can help cycle unmotivated, unproductive employees out of the business, and voluntary turnover often results in more driven employees finding the company that works for them.

However, studies of global turnover found that retail had the second-highest turnover rate of all industries at 13%, above the worldwide average of 10%. [1]

Some of these statistics point to possible causes for the high turnover: a lack of professional development, an inability to advance within the role or the company, and limited or no training are common factors in this sector.

[1] https://business.linkedin.com/talent-solutions/blog/trends-and-research/2018/the-3-industries-with-the-highest-turnover-rates

Hard to Reach

First line workers are at the heart of the retail experience, but sadly they are often the forgotten employees when it comes to communication and collaboration technology.

They are considered too hard to reach compared to their office counterparts. It is quite common practice not to issue them company email addresses, for instance, which means they often miss out on important communications relaying key company information and goals.

This is a basic need that helps give meaning and purpose to their daily work. It’s quite common for communications of this nature to be passed on second-hand by management, who relay the information face-to-face or via more traditional paper notice boards. Because first line workers are typically bypassed by these communications, they often know very little about what their company stands for and how it differentiates itself from its competitors.

When it comes to matters of people management, research and real-world experience have shown repeatedly that the companies who rise above the competition are those that place value on employee engagement and consider the overall experience, ensuring that all of their employees are included.


Retailers need to examine how best to support their front-line workers, and they must carefully consider all the factors that drive employee engagement. It’s essential that these employees feel considered and included. Empowering them with the right tools to receive company communications and messaging lets them feel part of the company and connected to everyone.

The speed and coverage of communication need to improve. Decisions need to be relayed to the stores immediately with live communication, or within minutes with recorded communications. This speed of deployment can be an immense competitive advantage.

An employee communication strategy really matters in the modern retail environment. However, it’s also key to get the right medium: the rise of video for employee communication cannot be stressed enough. Industry research finds viewers retain 95% of a message when they watch it in a video, compared to 10% when reading it in text.

Allowing your employees to see key messages from your executives and leaders about what is happening within the company is extremely powerful: they get to hear the passion and commitment first-hand. This is the modern way of communicating with employees, and it goes a long way toward building your brand and meeting your goals.

It’s hard to point at any single factor behind the issues faced by the retail sector. However, a key part of the future puts first line workers front and centre in the competition for consumer spend. The question now is who will best prepare their staff for this challenge, and the answer starts with improving your employee experience.


Reuse, recycle, repeat: maximising the value of existing IT assets reduces cost and complexity


Whether it’s plastic containers or software licences, we all need to find ways to make the most of the resources we already have.

A new business requirement shouldn’t automatically trigger a new IT investment. This is particularly true within the networking and security arena where costs and complexity increase with every new purchase.

Instead of constantly expanding the security management portfolio, organisations need to start thinking about rationalising and integrating it. According to Cisco, 82% of companies want an integrated security portfolio but very few are achieving it.

Computacenter and Cisco are working together to help change this. As more IT providers move to an evergreen world with rolling releases and bundled packages, it’s getting harder for CIOs and their teams to stay up to date. As a result, features are not used, integrations are not leveraged, and opportunities are not exploited.

Making the switch

Let’s take an example. The Cisco Digital Network Architecture (DNA) subscription provides organisations with a new way of purchasing Cisco Catalyst 9K switches, which form the foundation of the modern intent-based network. But there’s a lot more to the software subscription than just the switches. And these added extras can make all the difference to the efficacy of security and networking operations in a digital age.

As well as providing organisations with access to software-driven rather than hardware-centric switches, the premier subscription also includes tools for managing security policies, detecting security threats, automating processes, configuring networks and generating actionable insights.

It’s hardly surprising then that the Catalyst 9K switches have been one of the fastest-selling new products in Cisco’s history: more than 1,100 customers signed up in the first quarter of availability. Eventually, the Catalyst 9K will become the standard platform for intent-based networking.

Although organisations have been quick to onboard the switches, the supporting software is still being under-utilised. As a result, CIOs often unnecessarily invest in additional point solutions or continue to use legacy tools that are no longer fit for purpose. They will also be missing out on the latest threat intelligence; the Catalyst 9K premier subscription enables CIOs to leverage the full power of Cisco Talos, which helps to protect people, data, and infrastructure assets.

Unlocking greater value

Computacenter has access to 10 Customer Success Managers – many of whom are Cisco Certified Internetwork Experts – who help organisations unlock the full potential of their technology investments.

We work with CIOs and their teams to drive greater value and enable business success. We share knowledge. We pinpoint existing resources that could be leveraged more effectively. We organise demonstrations and proofs of concept. And we flag potential new investments and integrations.

The shift to evergreen solutions and software subscriptions involves operational, cultural and financial change. Adapting to this new landscape while also delivering digital initiatives is a big ask for already stretched IT teams. But the quicker organisations can adapt, the quicker they can leverage the benefits.

Instead of fulfilling new requirements with new purchases, organisations will become accustomed to checking their existing IT entitlements and assets first. This will not only save money but also prevent the IT landscape from becoming even more cluttered. Maximising what we have today makes for a better tomorrow.


Stay in the security picture and avoid the ransomware revival

Everyone loves a sequel – just look at how well the latest Toy Story instalment is performing at the box office. But there’s one sequel that we could all do without: Ransomware 2. It’s back, and like the best horror movie villains, it’s nastier and bolder than ever before.

Ransomware 2 has already claimed a number of high-profile victims. At the end of June, two US cities paid around $500,000 each to get files and data unlocked following successful attacks. The bill for Norsk Hydro, a global aluminium producer, was even higher. It didn’t pay the ransom, but it still paid the price.

The entire workforce had to resort to pen and paper when ransomware took hold across 22,000 computers in 40 different countries – Norsk Hydro is still recovering nearly three months later. On average, a ransomware attack results in seven days of downtime.

Although Norsk Hydro’s tough stance has boosted its reputation, it has also damaged its bottom line – the cost of the attack has already topped £45 million. The company is not the first to end up with a multi-million dollar bill: the Baltimore City government was hit with a massive ransomware attack that left it crippled for over a month, with losses of more than $18 million.

The resurgence of ransomware is not surprising – it’s a proven business model and a repeatable one. It works not only at an enterprise level but at a personal level too. Individuals can be just as willing to pay a ransom to unlock personal data, such as family photos and financial files, if they are targeted by an attack.

So how do you avoid joining the ransomware ranks? Although ransomware is powered by malicious software, it still needs human interaction to succeed. Just one click on a spam email or an infected ad is all it takes for a ransomware attack to be initiated. Even a visit to a legitimate website can land you in trouble if the site is infected with code installed to redirect users to a malicious website.

Better user education can help prevent ransomware being unleashed – whether it’s on a home device or a business computer – but it will never completely eliminate the risk. So organisations need to be ready to fight back when the ransomware ball starts rolling, which means they need robust protection from the DNS layer to the email and the endpoint.

Blocking spam and phishing emails along with malicious attachments and URLs is an important first step. But the need to balance employee flexibility with IT security means the net can never be fully closed.

Even if someone clicks on a malicious link or file, organisations can still suppress an attack. If ransomware can’t connect back to the mothership, it can’t be activated.

With thousands of DNS requests being initiated across an enterprise every day, detecting which ones are genuine and which are malicious requires highly sophisticated technology. Instead of proxying all web traffic, intelligent ransomware defence solutions will route requests to risky domains for deeper URL and file inspection. They will also be able to draw on contextual security to identify unusual and potentially unsafe requests from individual endpoints.
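
As a simplified illustration of that selective approach – hypothetical domain lists and function names, not a description of any vendor’s product – known-good traffic resolves normally, known-bad call-backs are blocked outright, and only uncertain or risky domains pay the cost of deeper inspection:

    # Toy DNS-layer policy for illustration only; real solutions rely on live,
    # constantly updated threat intelligence rather than static sets.
    KNOWN_GOOD = {"intranet.example.com", "office.com"}
    KNOWN_BAD = {"ransom-c2.example.net"}

    def handle_dns_request(domain: str) -> str:
        if domain in KNOWN_BAD:
            return "block"                       # stop the call-back to the attacker's server
        if domain in KNOWN_GOOD:
            return "resolve directly"            # no need to proxy trusted traffic
        return "route via inspection proxy"      # unknown/risky: inspect URLs and files in depth

    for d in ("office.com", "ransom-c2.example.net", "brand-new-domain.io"):
        print(d, "->", handle_dns_request(d))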

These insights enable IT teams to make quick risk judgements that block threats without blocking genuine business activity. With new risks emerging all the time, ransomware defence solutions need to receive constant updates on the latest sources of malicious content.

If the call back to a command and control server is successful, there are still ways to contain a ransomware attack before it proliferates across an entire organisation. For example, dynamic segmentation can prevent ransomware from travelling across the network – helping to avoid a full-scale outage as experienced by Norsk Hydro.

By taking a layered approach to security, organisations and individuals can mount multiple defences against ransomware whether it’s launched via the web or email. And they will need every one of these defences because Ransomware 2 looks like it’s going to be a blockbuster. Ransomware damages are predicted to reach $11.5 billion in 2019.

Stay safe until next time.

Colin Williams

Business Line CTO Computacenter UK – Networking and Security

https://www.coveware.com/blog/2019/4/15/ransom-amounts-rise-90-in-q1-as-ryuk-ransomware-increases

https://cybersecurityventures.com/ransomware-damage-report-2017-part-2/

IT modernisation is just what the doctor ordered

Chris Price, Computacenter’s Public Sector Director, explores how the NHS can deliver better outcomes by adopting new technologies and digital processes

I recently used an online company to do a finger-prick blood test instead of going to see an NHS phlebotomist and was amazed by the speed and ease of the service. It’s a good illustration of how the NHS could – and should – be evolving to improve the patient experience.

The potential for transformation in the healthcare sector is huge: patients want it, frontline staff want it, and NHS leaders want it. Health Secretary Matt Hancock is a real technology advocate and aims to make the NHS the most cutting-edge in the world. And a Computacenter survey of more than 100 IT professionals across NHS trusts backs this up. The demand for new technology is high, with a wide-ranging wish list: for example, 30% of respondents want to implement tablet devices in the next 12 months, providing technology at the point of patient care.

But implementing and transforming technology needs financial resources – with many trusts claiming that budget constraints are the prime reason for their inability to upgrade ageing IT infrastructure.

With the current political turmoil, it is likely that modernisation of the NHS will take a back seat. But we can’t afford to neglect this: to deliver better patient outcomes and value for money, we have to push the digital roadmap forward. Computacenter is playing its part by working with NHS Digital and individual trusts to not only accelerate the adoption of new technologies but also to maximise benefits realisation.

Investing in the future

Greater digitalisation will require new skills, new processes, and new policies. Navigating this new landscape will not be easy: as well as highlighting a lack of budget and resources, our survey revealed that some IT system upgrades are not pursued as they are just too complicated.

Security will be a key priority – both at a national and a local level – as healthcare data becomes digital. As part of our work with NHS Digital, we have deployed a network analytics solution that will help to identify patterns of potential threats across the NHS Digital Health and Social Care Network.

NHS organisations will need to develop new skills to aid the implementation, optimisation and management of these and other new technologies.

At Computacenter, we are committed to investing in future talent both within our business and beyond. Each year we employ in excess of 100 young people across a number of different technical and business programmes to give them a springboard into a career in the tech industry. We also promote school and university outreach to show students the opportunities that can arise from working with technology. We are delighted that these efforts have been recognised; Computacenter recently won an award for the best medium-sized organisation’s undergraduate industrial placement programme.

The transformation of NHS IT is critical to the future of healthcare delivery and enabling a more preventative approach: modernising IT is the top priority in 2019 for our survey respondents. The government has already recognised the importance of technology and now is the time to step up its commitment.

The NHS faces unprecedented financial and operational challenges, and patient care is suffering despite the determined efforts of frontline staff. We want to help the NHS with its needs of today and also prepare it for the digital opportunities of tomorrow.

When it comes to health, a better experience is important for all of us: the quicker patients and clinicians receive information, the quicker they can take action to improve wellbeing. Receiving that blood test result so promptly meant I could make more informed decisions and catch any small health issues before they become big health issues. With the right technology, this can all be done in my time and without consuming valuable resources at the local GP surgery!

Find out more about Computacenter’s work in the public sector.