Today’s news is dominated by Microsoft’s announcement that Office is being ported to iOS, delivering what all iPad users, particularly those in the enterprise, have been waiting for.
In the run-up to this announcement, I’ve been debating with colleagues whether this move will finally see the iPad elevated to a core productivity device, rather than, as we tend to see, an additive one.
For some, this news will be the final piece of the puzzle that allows them to use the iPad as their sole device, and they will take little persuading to adopt and embrace it. For me, I don’t think a tablet could handle everything I need to do.
The announcement came with a couple of key points. The free version doesn’t provide document editing; that requires an Office 365 subscription. The vast majority of business users will no doubt need the advanced features of the paid version, and will be able to easily justify the cost. This will also be positive for Office 365 uptake and may encourage customers to look at exploiting other 365 features.
I think bringing the power of Office to the iPad is a great move, and one which was inevitable given how the platform market has moved and the significance of the Office business to Microsoft. Whilst there are other productivity applications available, many of which I have used and are very good, native Office will provide a much better and more familiar experience and remove some of the “niggles” that can appear with file formats and the like. But going back to my personal use and needs: productivity apps are all about multitasking, and there are still some fundamental limitations to true multitasking on a tablet compared with a desktop. These are difficult to overcome and will keep me on a laptop for some time, I suspect.
We also need to consider this in the context of a full mobile delivery platform. While Office is arguably the key application suite for many use cases, and this move is significant for several reasons, we can’t get hung up on the apps alone. Of equal importance is content delivery: where will users store and access all of these Office files in a secure and governed way, across teams and across other devices? We consider an integrated mobility solution to comprise applications, content and policy controls, so we need to cater for each aspect equally for maximum value.
I will be keeping an eye out, around the office and at the customers I visit, for users who make the full shift to a single tablet device now. I know for many that’s the objective, so I won’t be surprised if some can make it work.
(written from Office on a laptop, for now!)
We are in the midst of interesting times. Is there ever a day when the bulk of the dialogue isn’t about the implications of “change”? Now, more than at any time in the last 25 years, the rate of “IT” (Information Technology) change is more likely to fill the average enterprise decision maker with dread than with the childlike excitement of yesteryear. But is “IT” really as transformational as commonly inferred? Is the much-discussed digital DNA that underpins modern business and society really that fundamental?
A very important topic, but one often discussed only in economics or business schools, is that of General Purpose Technologies (GPTs). A GPT (and fewer than 25 have been identified and universally accepted) is a technology introduction that permeates society and then fundamentally transforms a whole economy (the real definition is much, much longer than that). If we roll back in time, the advent of the steam engine, the internal combustion engine and electricity are examples of GPTs whose transformational impact is easy to quantify. It should be no surprise that “IT”, or Information Technology, is also considered a GPT, but in my opinion the current and future resonance of “IT” equips it with the potential to surpass many of the better-known GPTs to date by a magnitude (please note, this is a very loose concept: without electricity, for example, “IT” doesn’t function).
We have witnessed and been affected by “IT” over the last 40 years to a previously unimaginable degree. However, the last ten years have propelled this beyond the realms of science fiction, or even the minds of the freest-thinking individuals.
We are bordering on an era of IT evolution and advancement potentially tempered only by a lack of imaginative thinking or, dare I say it, initial financial funding, rather than by capability. It now seems virtually anything is possible. And therein lies the problem, paradigm or opportunity: the expectations of users and customers of IT are now at a level where they too believe that “anything is possible”, and they may pour scorn on anyone who fails to help them realise it. This means that not only must “IT” continue to change, but so too must the services and solutions providers that deliver “IT” outcomes.
Successful infrastructure product supply and installation services are expected, not optional. Moving forwards, the primary “IT” value-add is to help individuals or organisations realise the transformational effect or outcome of an IT solution deployment that is personal to them. This will require not only a different sales and consulting approach from today’s services and solutions providers but also an attitude change from customers, who may need to revise how they position or frame the business benefits they seek from any deployed solution.
Never has the “IT” landscape looked more exciting, with the promise of the future truly inspiring to an indescribable degree. In summary: if “IT” doesn’t change, “IT” stays the same. We know for certain it will change, and therefore we MUST all continue to change.
Until next time.
Don’t do SDN. Quite simply, there is nothing to be “done” as such. If the current industry hype is compelling you to “do SDN” or “get SDN”, you may find you already have it (or a version of it). If you use server virtualisation solutions with hypervisors and virtual switches, you are already leveraging networking elements defined and delivered by software (elements that MUST still drive hardware). To extend the discussion further: if your organisation uses carrier-based services (delivered by one of the major telecoms companies), you are already using network services like MPLS and VPLS that massively leverage elements defined in software to deliver the networking outcome you need (many call this network function virtualisation, but the distinction is somewhat semantic).
So are you missing anything, or are you already a customer of the next big thing while blissfully unaware? Enter that horrible response, “yes and yes”. Modern enterprise customers have embraced software-defined networking ideals for quite a while, yet the software-defined storyboard has been somewhat invisible to all but those learned technologists employed to design, build and support the platforms in question.
But the more recent networking elements defined in software and grouped together under the SDN banner paint a totally different picture, even if many of the legacy network infrastructure elements are retained. The brave new defined world of SDN is all about open standards (preventing vendor lock-in), accelerated innovation (by using open source ideals), potential for cost reduction (through hardware abstraction and an any-network-hardware-vendor ideology), true network agility (massively reducing the time to market of applications and new business services) and, most compelling of all, application awareness (to ensure applications control the network, not vice versa).
This means that striving to “do SDN” makes little sense unless you are clear on the business-outcome-aligned aspects that are essential to realise. With that in mind, the “big tip” is to understand the SDN or network virtualisation elements that can deliver tangible value against a realistic operational plan. This must be the primary action for now, not an unchecked move to a new platform based on a feature-biased evaluation.
To that end, now is the time to evaluate how ready the networking platform (and security footprint) that underpins your business is to deliver the speed, agility and dynamism your business requires. And “maybe” is not a valid response. By understanding and leveraging the most viable elements of traditional networking approaches, interfaced with validated software-defined and network virtualisation outcomes, the best of both worlds has the potential to deliver a best-in-the-world outcome for your organisation. The new dawn of the software-defined IT enterprise will potentially be your best dawn ever…
Big claims, maybe. Try me!
Until next time.
Welcome to the first blog written by Computacenter’s 2014 intake of new associates. The success of previous programmes has allowed this year’s intake to be bigger than ever, including a new programme for Service Manager Associates.
“We were fortunate enough to receive an insight into the company
from high profile employees such as Neil Muller”
Computacenter provided the 14 of us with a jam-packed induction during our first three weeks. The information we received during this time will be invaluable during our 18-month programme. During the induction, we were fortunate enough to receive an insight into the company from high-profile individuals such as Neil Muller (UK Managing Director) and Clare Parry-Jones (Director of Business Enablement), as well as a number of Sector Directors. We also benefitted from a number of 101 (introductory guide) presentations, including an introduction to hardware and software. This came as a huge relief to me, because I started the programme with zero technical knowledge, and although we were told this was not an issue, it was reassuring that this level of knowledge was catered for!
On behalf of the 2014 Associate Programme, I would like to thank all of those who took the time to present to us, as well as Sue Harris and Adriana Mills, who organised such a smoothly run and informative induction.
“With having fun being the main emphasis”
The next stage of the programme was one that we were all very much looking forward to: getting stuck into our rotations. I, along with three other Sales Associates, began my journey through the programme with a month in Partner Management. The objective of this rotation was to network with all members of Partner Management, with having fun being the main emphasis, and we certainly did! The team could not have been more welcoming and were happy to get us involved with events and vendor meetings, which we all found informative and enjoyable. Thank you to all of those in Partner Management!
“The feedback that we received from the day will be vital to us”
All of the associates were fortunate to attend a presentation skills training day during our second month. The feedback that we received on the day will be vital to us, as throughout the programme we are required to present to a number of senior people, up to and including Mike Norris. This is just one of the ways we are being assessed during the programme, so good presentation skills are something that all of the associates want to master early on!
I hope you have enjoyed reading the first blog brought to you by the 2014 Associates. Please tune back in next month to hear from one of our Line of Business Sales Associates, Ben Parry.
The big thing that caught my eye in the past week was VMware’s announcement of Horizon DaaS availability, offering Windows desktops and applications across the hybrid cloud. Significantly, this announcement unites vCloud Hybrid Service (VCHS) with the capabilities acquired in October 2013 through the purchase of Desktone. VMware are definitely not alone in looking at this area, but they are the first major player to launch in the UK. (Amazon announced WorkSpaces in late 2013, but the service is not yet generally available.)
So the major players are readying their DaaS offerings, but are we ready to see enterprises rush to adopt?
There are key technical challenges to overcome with DaaS. Despite the obvious promise it holds, a key aspect of any desktop or application virtualisation solution has been the proximity of the desktop/app client to the back-end platform, in many cases specifically implemented to deliver the best end-user experience. This is what drives investment in onsite infrastructure for desktop virtualisation. DaaS (in public cloud terms) takes that proximity away again by putting the desktop in the cloud.
It is not just the application traffic flows that are a concern; the entire application and data delivery approach needs considering, and potentially re-architecting, for DaaS. Given the challenges we see with organisations keeping their base platforms up to date and available, is this a key investment area in 2014?
If DaaS has an enterprise future, it is in the hybrid model, and VMware aligning Horizon DaaS with VCHS was inevitable. However, a truly hybrid model requires both clear logic and mechanisms for provisioning and managing desktops across the public and private cloud. At this stage it still seems too early for customers to adopt a platform that will allow them to consume services effectively in this manner.
Another major factor that will dictate the success of DaaS is licensing. VDI has struggled due to licensing constraints, particularly Microsoft’s VDA licensing requirement. The early signs are that the DaaS players are addressing this by basing DaaS on a server OS rather than a client OS, and as such working around the VDA licensing challenge. That said, many clients will want a client OS for reasons such as application compatibility, so how, and if, this aspect changes over time will be of key interest.
We have been watching the DaaS market develop with great interest recently, and no doubt it will continue to move at pace. A desktop is of limited use without a user’s apps, data and other resources, and for the majority of use cases we are yet to see how DaaS can deliver real value without the back-end transformation of the legacy app and data estate to support it.
The world wide web (www) turned 25 this week. Even Sir Tim Berners-Lee, with his most enthusiastic and optimistic perspective, could never have dreamt of the profound effect the world wide web would have on the world.
To roll the clock back slightly: the Internet grew out of a connectivity approach and environment called ARPANET, used to deliver resilient, disaster-tolerant computer-to-computer communication in research, university and military settings. The world wide web spawned from Tim Berners-Lee’s invention of a markup language called HTML, and associated work with colleagues at CERN, which allowed linked pages to be created, edited, searched and located across an internet-style environment. The result, computer-to-computer connectivity plus pages that could be linked and searched (we now call them web pages and websites), has delivered the rich tapestry we now call “the internet and the world wide web (www)”.
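To make the “linked pages” idea concrete, here is a minimal sketch of an HTML page of the kind Berners-Lee’s invention made possible (the page title and text are purely illustrative; the link points to info.cern.ch, the address of the first website):

```html
<!-- A minimal HTML page: the anchor element (<a href="...">) is the
     mechanism that turns isolated pages into a "web" of linked pages -->
<html>
  <head>
    <title>An illustrative page</title>
  </head>
  <body>
    <p>This page links to the
      <a href="http://info.cern.ch/">first ever website</a>,
      hosted at CERN.
    </p>
  </body>
</html>
```

Everything else on the modern web, from search to streaming, is layered on top of this simple ability for one page to point at another.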
I have greatly simplified something that is underpinned by extraordinary complexity that second by second is hidden from us all. Our engagement with the world wide web is public but personal, general but specific, real time and real world – in fact for most of us the internet is so bound into our day to day existence a world without it is now unimaginable.
But you may say that, in the midst of the mobile device explosion that has reprogrammed our understanding of “always connected”, a connected world without the world wide web is acceptable, because there is always an “app” available that can deliver the same value. I do not agree, as many of the “apps” in use today are no more than clever front ends to full-blown web sites (with content, application and database logic behind them). Without the web and its rich content, the mobile user experience would be nothing like the “anything is possible” digital world we embrace today.
So for many, the 25th birthday of the world wide web (www) came and went without so much as a raised eyebrow. But now that I have highlighted the significance of this monumental event, spend 60 seconds considering your own 24-hour existence with NO involvement, interaction or service delivered by the good old “www”. A painful thought, if you ask me… (I stopped after 15 seconds!)
Until next time.
The beginning of the wrong end – dare we consider the impact of a “multi-tier / multi-speed” Internet?
One of the most fundamental pieces of news for ALL Internet users broke last week in the USA, but it seems to have slipped under the radar over here.
Put simply, it’s the start of a change of stance by a number of major US carriers of “data”, who are moving to levy additional charges on content providers that generate large volumes of Internet traffic. To explain further: at present, if end users choose to use the service of “content provider A”, they access it via whatever internet connection or point of presence they choose to use (whether a paid or free service), fixed or mobile. The result can be a popular service (for example, social network content or video streaming) delivering a mass of internet traffic generated by “content provider A” across the major carriers’ networks, with no additional charge paid to the carrier (who must still maintain quality of service, manage bottlenecks, etc.).
However, the news broadcast last week highlighted a change of stance by a major US carrier, which is now requesting an additional charge from a major TV/film streaming company to carry its traffic across the carrier’s network. And what happens if the content provider refuses, or does not have a cost model that supports the payment of such a charge? Does that mean traffic generated by an end user of the service is discarded or rejected?
This overt change is seen by many as a first small step towards a multi-tiered internet, rather than the “free-ish” flowing internet we have today. And the worry is that what originates in the US has a tendency to quickly permeate to the UK and Europe (and how could this not?).
I revisit the title of this blog: could this be the beginning of the wrong end, where certain types of content only run at optimum levels when transported via “Carrier A’s” network but not “Carrier B’s”? And what happens when the networks join at various parts of the Internet? Will traffic be deliberately slowed at certain points because “content provider C” hasn’t paid the carrier’s “traffic transport” premium? As an end user of an internet service, should anyone really need to understand where, who and how to connect to gain not only the best experience but potentially the service at all?
This could be deemed an unsolvable problem. Many think it is not unreasonable for a carrier to seek to maximise the monetary income from transporting data, especially when carriers are fundamental to service delivery. But equally, it’s tough for content providers (and that is virtually everyone on the internet) to factor yet another variable into their income and cost model (if they have a cost model at all). The Internet is only of value if “content” is available to the widest audience and can deliver the optimum end-user experience; a variable end-user experience, without the end user understanding why, does not bode well.
This is one to watch, with very interesting times ahead. However this is resolved, and there are unlikely to be any true winners, the ramifications for every popular content provider on the internet are great (and likely to cascade down to the end user).
Watch this space.
Until next time.