Archive | July 2012

How can a good enough network really be good enough?

A quick look at the current popular enterprise networking infrastructure platforms reveals that they all seem to suffer from a similar predicament: almost without exception, the functionality is good, reliability levels are high, and performance (in relevant terms) delivers against expectations.

The reasons for this rather stable state include a networking journey that embraced the pain of interoperability and standardisation many years ago, the common use of high-performance, off-the-shelf network processing ASICs (with a few notable vendor exceptions) and, until recently, no real need to change the status quo.

After numerous years of highly effective network solution design by extensively trained and highly talented network engineers, who embraced inherent technology limitations and extracted maximum performance, we now have our “good enough” networks. I reiterate that there are many great network engineers who underpin the largest enterprises in the world, make complex networking “just work” and deliver business outcome after outcome, helping in many cases to hide the fact that, below the surface, all is not as well as it may seem.

But surely, if you were given a blank sheet of paper and networking and security designs were architected with a clean view of the vendor landscape, plus tomorrow's business outcomes as well as today's, would you still design yesterday's way? If the business outcomes of today, and certainly of tomorrow, differ from the network usage patterns of yesteryear, then surely good enough can't still be “good enough”.

A five-year-old network designed and configured for large volumes of directly connected servers with one-Gigabit interfaces surely won't be good enough for a densely consolidated converged infrastructure requiring multiple ten-Gigabit network interfaces. Equally, a multi-layer network topology originally configured for hundreds, and potentially thousands, of physical servers with multiple physical network interfaces has very different operational and performance characteristics to a distributed-switch, hypervisor-virtualised network layer.

The stage is set for good enough (or worse) networks to be evolved in line with tomorrow's application and business requirements. Software-defined networks (SDN), underpinned by the open standards aligned with the OpenFlow and OpenStack protocols and frameworks, may in time enable the granular levels of flexibility and capability required to personalise today's “good enough” general-purpose networked infrastructure into outcome-specific networked topologies. This blog was set to discuss the well-crafted Cisco ONE strategy, which leverages the value delivered by OpenFlow and OpenStack and clearly positions a customer journey interfacing existing technologies with the emerging software network footprints. Equally, it was to cover HP's highly innovative VAN software-aligned network play, which weaves IMC and IRF tightly into those same open network software foundations to deliver tangible, application-aligned networking.
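To make the OpenFlow idea concrete: rather than configuring each device by hand, a controller programs switches with match/action flow rules, and packets are handled by the first rule they match. The following is a minimal, hypothetical sketch of that lookup model in Python; the field names and actions are illustrative only, not a real controller API.

```python
# Illustrative sketch of an OpenFlow-style match/action flow table.
# Field names ("in_port", "dst_ip") and actions are assumptions for
# illustration, not actual OpenFlow wire-protocol structures.

flow_table = [
    # (match criteria, action) -- the first matching entry wins.
    ({"in_port": 1, "dst_ip": "10.0.0.2"}, ("output", 2)),
    ({"in_port": 2, "dst_ip": "10.0.0.1"}, ("output", 1)),
    ({}, ("drop", None)),  # table-miss entry: drop anything unmatched
]

def lookup(packet: dict):
    """Return the action of the first flow entry matching the packet."""
    for match, action in flow_table:
        if all(packet.get(field) == value for field, value in match.items()):
            return action
    return ("drop", None)
```

The point of the model is that a central controller can repopulate `flow_table` on every switch to reshape traffic paths in software, which is what gives SDN its promised flexibility over box-by-box configuration.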

But both of those great stories may now seem somewhat pale when compared to VMware's shock acquisition of Nicira. Put simply, the world's dominant x86 hypervisor vendor now includes a highly regarded SDN networking core that can be leveraged in numerous, as yet unannounced, ways that could potentially paint a new picture for enterprise networking. (I'll save that for another blog.)

So “good enough networks” may, in the not too distant future, become a thing of the past. Will they ever be “perfect networks”? Unlikely, given the ever-changing nature of business and increasing levels of complexity. But could they become much more closely aligned with the levels of flexibility, adaptability and cost-effectiveness currently sought by enterprise network customers? Quite possibly…

And then they will be more than “Good enough”.

Until next time

Colin W

Twitter: @Colinwccuk

Windows 8 is on its way

This week we have been participating in Microsoft’s Worldwide Partner Conference, and this year was a record-breaking event with over 16,000 partners attending from over 156 different countries (4,000 were attending for the first time). When you witness the vast number of attendees and the diversity of partners you realise that it is one of the largest, most vibrant IT ecosystems in the world.

Just so you can get a sense of the scale yourself, you can see a picture of the keynote here.


One of the major announcements of the week was the new release of Microsoft's flagship desktop operating system, Windows 8. If you didn't catch it on the news wire, Microsoft confirmed that Windows 8 is on track to Release to Manufacturing (RTM) in the first week of August. Enterprise customers with Software Assurance benefits will have full access to the Windows 8 bits as early as August.

There were many new features and enhancements discussed during the course of the week, but most of the excitement centred around the potential for devices such as Surface and the new Metro-style user interface and applications. It is certainly going to be exciting to see how all of the OEMs, service providers and application developers innovate to exploit the platform for customer value. Here at Computacenter we have already started to look at how we integrate it into our existing ‘Contemporary Workplace’ framework of solutions and services, whether it be the advice, supply, deployment, integration or management needed for an effective outcome for our clients' users.

So, come the end of the year, there is going to be another credible option for enterprise organisations that wish to deliver touch-based applications and services on slick, lean and powerful tablet devices. With Apple's almost ubiquitous iPad already established as the market leader, and Microsoft's dominance of the corporate desktop platform (backed by the sort of ecosystem covered above), the fight for market dominance is going to be a monumental battle to watch.

The good news is that we can help our clients either way – but which way do you think the battle will swing or do you think there is room for both?

A sting in the tail for Apple Users?

There have been a couple of articles in the last week that have really got me thinking about the consequences of using products from the world's most valuable brand.

The first article, which appeared in Wired magazine, shows that the ratio of PC to Mac sales has narrowed to its lowest level in over a decade. Whilst the article cites industries that use video and photo editing as typically Mac-centric, I think it is easy to see Macs used in many more scenarios than this.

For the better part of the last two decades, former Apple CEO Steve Jobs focused on the outward appearance of his company's products with an enthusiasm unmatched by his competitors. The unique designs that resulted from this obsession have given Mac products the ‘hip’ image that they enjoy today. However, this ‘hip’ image also comes at a premium on acquisition, particularly when you consider that if you take apart a Mac and take apart a PC, you will find that they use the same parts and components. Both have a motherboard, processor, RAM, graphics card, optical drive, hard drive and so on.

However, they do not use the same software, which brings me to another hidden cost that I had not heard of until recently.

Over the weekend I read an article in the WSJ which stated that Apple Mac users booking holidays on the travel firm Orbitz's website were paying up to 30% more than Windows PC users! Mainly because they could, and would.

Orbitz are defending the tactic as an ‘experiment’ and believe some of the data has been taken out of context, with their CEO commenting: “However, just as Mac users are willing to pay more for higher end computers, at Orbitz we’ve seen that Mac users are 40% more likely to book 4 or 5-star hotels as compared to PC users, and that’s just one of many factors that determine which hotels to recommend a given customer as part of our efforts to show customers the most relevant hotels possible.”

So basically their website was detecting the type of software accessing their content and then using algorithms to render the more expensive options if it was Apple-based. Whilst this has created a flurry of social media objection and conjecture, marketing data for the company showed that Mac users are associated with a somewhat richer demographic than PC users, and Orbitz CEO Barney Harford defends the position, stating that its software is simply showing users what it thinks they will want to see and buy.
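The mechanics of this are surprisingly simple: every browser announces its platform in the HTTP `User-Agent` header, so a server can branch on it before choosing what to render. The sketch below is purely hypothetical, assuming a Mac-detection heuristic and a "pricier options first for Mac visitors" rule; it is in no way Orbitz's actual algorithm.

```python
def result_order_for(user_agent: str):
    """Return a hotel sort key based on the visitor's platform.

    Hypothetical illustration of user-agent-based tailoring: the rule
    that Mac visitors see the most expensive hotels first is an
    assumption made for this sketch, not any retailer's real logic.
    """
    is_mac = "Macintosh" in user_agent  # Mac browsers include this token
    if is_mac:
        return lambda hotel: -hotel["price"]  # most expensive first
    return lambda hotel: hotel["price"]       # cheapest first

hotels = [
    {"name": "Budget Inn", "price": 60},
    {"name": "City Suites", "price": 120},
    {"name": "Grand Plaza", "price": 240},
]

mac_ua = "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_7) AppleWebKit/534.57"
pc_ua = "Mozilla/5.0 (Windows NT 6.1) AppleWebKit/534.57"

mac_view = sorted(hotels, key=result_order_for(mac_ua))
pc_view = sorted(hotels, key=result_order_for(pc_ua))
```

Note that nothing here changes the prices themselves; it only reorders what each visitor sees first, which is precisely Harford's defence of the practice.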

The WSJ believes that the sort of target marketing undertaken by Orbitz will become more commonplace in the future as retailers become bigger users of predictive analytics.

Clearly, the challenge with this approach is the assumption that if you use a Mac, you stand out as a big spender. Whether that is true or not, I sense that other organisations will soon follow suit and will try to ensure that you place bigger orders as a result. In these austere times, it's just another factor of cost that sometimes isn't considered by the more well-heeled advocates of a completely corporate-wide BYOD scheme.

Personally, I think it’s just another example of how quickly the dynamics of the workplace and technology are moving – and as an Apple user myself I’ll be keeping a keen eye on my purchases!

If you are interested, you can read the WSJ article here.