A Suite of Changes
Certificates, encryption and lots of TLAs
Things have been pretty quiet in the world of Internet encryption for some time; revelations from Snowden to Hacking Team have had surprisingly little to say on the subject. However, the calm is coming to an end as a raft of changes begins to make itself felt.
Perhaps this lack of noise is down to where these changes are coming from: not from dramatic and media-savvy vulnerabilities such as Heartbleed, BEAST and POODLE, but from some of the Internet's biggest companies, especially Google and Microsoft.
Both are leaders in operating systems, web browsers and cloud services, so it is little surprise that these companies are trying to drive up the quality of Internet security, not only to be perceived as secure by their customers but to provide a key differentiator against smaller players.
Given that this is a blog post, I have restricted myself to a couple of pages; a longer version of this article is available from the author.
Web browsers, and annoying everyone
All the browser manufacturers are working to improve the security of their products, and though there are subtleties of approach, the core strategy is the same. They are all hardening warning messages and turning alerts into blocks on access to websites where configuration errors are found. Most importantly, such settings will become the defaults for the next generation of web browsers. Google has also pretty much won the argument on browser updating, and a policy of continuous rolling updates is going to become standard, even for Microsoft. Google's aggressive deprecation of old products and standards may well become the norm too. This will leave many businesses with even more complex legacy app support issues than they have now.
Certificates for HTTPS secure web
Web encryption can be divided into two key components: the certificate used to identify a website and begin secret communication, and the ciphersuite responsible for encrypting traffic between a user and a website.
The industry is moving to improve the quality and effectiveness of both these components. The first change is increasing key lengths, which presents no real problem. The second is a little more interesting: the SHA-1 problem. Certificates are signed to prove they have not been tampered with, and the legacy algorithms used for this are starting to show their age. Hence the SHA-2 issue: Google and Microsoft have decided to require the modern signing algorithm SHA-2. Google Chrome already produces error messages for SHA-1 certificates, and as Chrome updates they are going to become more forceful. When you renew certificates you will probably have to use SHA-2, despite the fact that some very old browsers or systems might have interesting issues.
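The difference is easy to see with Python's standard library. SHA-2 here usually means SHA-256, which produces a longer digest and, unlike SHA-1, has no practical collision attacks; the input below is just a stand-in for signed certificate data:

```python
import hashlib

data = b"stand-in for the signed portion of a certificate"

# SHA-1: 160-bit digest, now considered too weak for certificate signatures.
print(hashlib.sha1(data).hexdigest())

# SHA-256 (the most common SHA-2 variant): 256-bit digest.
print(hashlib.sha256(data).hexdigest())
```

A collision against the signature hash would let an attacker substitute a forged certificate with the same signature, which is why browsers are forcing the move.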
The chain of trust
A full description of Internet certificates is beyond this article, but one component does need further examination: the chain of trust. To work, a certificate forms a chain with other certificates that links the user's browser to the website's server. Browsers will produce an error if the chain is not correct, and increasingly will fail to connect at all. Anyone commissioning a certificate needs to consider how that chain of trust will be presented to their customers, be confident that errors are not produced, and look for problems in their supplier's servers too.
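From the client side, this checking is what a default TLS context does for you. A sketch using Python's ssl module: the helper and hostname are illustrative assumptions, and a server that omits an intermediate certificate from its chain will cause the connection to fail with a verification error.

```python
import socket
import ssl

# The default context loads the system trust store and verifies the
# whole chain; incomplete or untrusted chains are rejected outright.
ctx = ssl.create_default_context()
print(ctx.verify_mode == ssl.CERT_REQUIRED)  # True

def check_chain(host: str, port: int = 443) -> str:
    """Connect to a host (hypothetical example) and return the TLS version,
    raising ssl.SSLCertVerificationError if the chain cannot be built."""
    with socket.create_connection((host, port), timeout=5) as sock:
        with ctx.wrap_socket(sock, server_hostname=host) as tls:
            return tls.version() or "unknown"
```

A misconfigured server often works in browsers that cache intermediates but fails for strict clients like this one, which is exactly the kind of supplier problem worth testing for.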
The Ciphersuite, TLS and confusing names for things
The ciphersuite is a series of configuration items that determine how the encrypted conversation is established. Typically both client and server support a number of different cipher options, and a suitable choice is negotiated.
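That negotiation can be sketched as the server walking its own preference list and picking the first suite the client also offers. The suite names below are purely illustrative, not a real handshake:

```python
# Server-side preference order and the set of suites a client offered.
server_prefs = ["ECDHE-AES256-GCM", "ECDHE-AES128-GCM", "AES128-CBC"]
client_offers = {"AES128-CBC", "ECDHE-AES128-GCM"}

def negotiate(server_prefs, client_offers):
    # First mutually supported suite in the server's preference order wins.
    for suite in server_prefs:
        if suite in client_offers:
            return suite
    raise ValueError("no shared ciphersuite -- handshake fails")

print(negotiate(server_prefs, client_offers))  # ECDHE-AES128-GCM
```

This is why the server's list matters: an old client can drag the connection down to the weakest suite the server still admits, which motivates the pruning discussed below.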
Transport Layer Security (TLS) has replaced the older SSL standard. SSL is an old protocol with problems, such as POODLE, together with architectural issues that are corrected in TLS. This seems to cause a lot of concern amongst system owners, because of a largely mistaken belief that clients will have problems connecting to TLS-only systems. In fact, TLS 1.0 appeared in 1999 and support hit mainstream products by 2006. TLS itself has undergone a number of revisions, and modern systems are now expected to support the latest version, TLS 1.2 (released in 2008), with TLS 1.3 currently in draft.
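With modern tooling, refusing SSL and early TLS is a one-line policy rather than a risky migration. A sketch using Python's ssl module (this attribute needs Python 3.7+ with a reasonably recent OpenSSL):

```python
import ssl

# A server-side context that will not negotiate anything below TLS 1.2,
# shutting out SSLv3, TLS 1.0 and TLS 1.1 clients entirely.
ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
ctx.minimum_version = ssl.TLSVersion.TLSv1_2

print(ctx.minimum_version)
```

Equivalent switches exist in Apache, nginx and IIS configuration, so the change is usually administrative rather than architectural.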
The encryption algorithm is generally the most recognised part of the ciphersuite; examples are triple DES (3DES) or AES256. This is the actual algorithm used to encrypt the information, and any current installation should offer AES256 to any client that can use it. It is also important to remove legacy algorithms that are no longer considered secure, as an attacker might be able to "negotiate down" the connection in order to decode the traffic; it looks scruffy too.
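Removing legacy algorithms is usually a matter of configuration. The cipher string below is an illustrative policy rather than a recommendation, expressed through Python's ssl module using standard OpenSSL cipher-string syntax:

```python
import ssl

ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)

# Prefer ephemeral ECDHE key exchange with AES-GCM, and explicitly
# exclude 3DES, RC4, MD5-based and unauthenticated suites.
ctx.set_ciphers("ECDHE+AESGCM:!3DES:!RC4:!MD5:!aNULL")

names = [c["name"] for c in ctx.get_ciphers()]
print(all("RC4" not in n and "3DES" not in n for n in names))  # True
```

With the weak suites gone from the server's list, a downgrade negotiation has nothing to fall back to.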
Perfect Forward Secrecy
Though not a new idea, PFS is starting to gain more support for web security because it removes a significant single point of failure in web encryption. Most commonly used systems have a single encryption key that is used for all connections; if this key is compromised then all traffic could be accessed, even traffic intercepted years earlier. PFS algorithms instead create a temporary, or ephemeral, key and generate a new one when required; try Googling ECDHE for equations and graphs galore.
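The idea can be sketched with classic finite-field Diffie-Hellman; real ECDHE uses elliptic curves, but the ephemeral-key principle is the same. The prime below is the published RFC 3526 2048-bit group, and the code is a toy illustration, not production cryptography:

```python
import secrets

# 2048-bit MODP prime from RFC 3526 (group 14), generator 2.
P = int(
    "FFFFFFFFFFFFFFFFC90FDAA22168C234C4C6628B80DC1CD129024E08"
    "8A67CC74020BBEA63B139B22514A08798E3404DDEF9519B3CD3A431B"
    "302B0A6DF25F14374FE1356D6D51C245E485B576625E7EC6F44C42E9"
    "A637ED6B0BFF5CB6F406B7EDEE386BFB5A899FA5AE9F24117C4B1FE6"
    "49286651ECE45B3DC2007CB8A163BF0598DA48361C55D39A69163FA8"
    "FD24CF5F83655D23DCA3AD961C62F356208552BB9ED529077096966D"
    "670C354E4ABC9804F1746C08CA18217C32905E462E36CE3BE39E772C"
    "180E86039B2783A2EC07A28FB5C55DF06F4C52C9DE2BCBF695581718"
    "3995497CEA956AE515D2261898FA051015728E5A8AACAA68FFFFFFFF"
    "FFFFFFFF", 16)
G = 2

def ephemeral_keypair():
    # A fresh private exponent per session: nothing long-lived to leak later.
    priv = secrets.randbelow(P - 2) + 1
    return priv, pow(G, priv, P)

a_priv, a_pub = ephemeral_keypair()
b_priv, b_pub = ephemeral_keypair()

# Each side combines its own private key with the other's public value.
shared_a = pow(b_pub, a_priv, P)
shared_b = pow(a_pub, b_priv, P)
print(shared_a == shared_b)  # True: both derive the same session secret
```

Because the private exponents are discarded after the session, recording the traffic today gains an attacker nothing if the server's long-term key leaks tomorrow.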
…and in conclusion
It’s easy to see web security as a solved problem; technologies such as public key infrastructure and HTTPS have been with us a long time, and to most users they appear stable and rather dull. The reality is that the Internet can be a hostile place and the cryptography that underpins it is under constant scrutiny. The result is a widening gap: legacy systems that were adequate for the task 10 years ago are now cause for serious concern. Such weaknesses are also very easy to detect, making automated attacks practical. The big technology and service providers are increasingly marketing security as a differentiator, and as cloud platforms become more prevalent the need for that security becomes even more pressing.