On 17th February 2015, the 17th draft of the Hypertext Transfer Protocol version 2 (HTTP/2) standard was released.
HTTP/2 offers many improvements over its predecessor; however, it also creates many new challenges. These challenges need to be considered now, in order to prevent organisations from being caught out over the coming years.
The HTTP/2 specification enables more efficient use of network resources and reduces the perception of latency for the end user.

The current draft also proposes compulsory end-to-end encryption between the server and the client. Although this helps mitigate a number of risks, such as Man-In-The-Middle (MITM) attacks, it is a controversial topic among businesses and ISPs. It has many benefits for the end user; however, it renders services such as proxies, load balancing, web caching and profitable meta-data scrapers almost redundant.

HTTP/2 achieves its performance gains through several mechanisms:
– Multiplexing: multiple streams can be open concurrently within a single connection, with either endpoint interleaving frames from multiple streams.
– A single TCP connection: connection-tracking devices such as firewalls and NAT gateways typically have to maintain a large number of stateful connections. Collapsing traffic onto a single TCP connection reduces the overhead on these devices and could potentially create cost savings of up to 75% for larger websites by reducing the need for large-scale firewalls.
– Stream management: streams can be established and used unilaterally or shared by either the client or the server. This allows either end to terminate a stream, ensuring unused sessions are not left open or idle on the web servers.
– Flow control: flow control is built into the protocol itself, moving it away from networking devices. This allows virtual hosts to be prioritised within web farms, rather than relying on current IP-based technologies, and gives much more granular control.
– Server push: push technologies built into the protocol allow more intelligence to be placed in web servers, which can prepare and push information to the client based on typical requirements. For example, if a user requests the index page of a website, the server could pre-emptively push the next required file, such as the website's CSS file, to the user. This reduces website response time and the appearance of latency on a web connection.
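The multiplexing described above works because every HTTP/2 frame begins with a fixed 9-octet header that names its stream, letting a receiver demultiplex interleaved frames from one connection. A minimal sketch of decoding that header, following the layout in the HTTP/2 draft (24-bit length, 8-bit type, 8-bit flags, then a reserved bit and a 31-bit stream identifier):

```python
# Decode the fixed 9-octet header that precedes every HTTP/2 frame.
# Frame-type codes below are taken from the HTTP/2 draft specification.
FRAME_TYPES = {0x0: "DATA", 0x1: "HEADERS", 0x3: "RST_STREAM",
               0x4: "SETTINGS", 0x5: "PUSH_PROMISE", 0x8: "WINDOW_UPDATE"}

def parse_frame_header(octets: bytes):
    """Return (payload length, frame type, flags, stream id)."""
    if len(octets) < 9:
        raise ValueError("an HTTP/2 frame header is 9 octets")
    length = int.from_bytes(octets[0:3], "big")       # 24-bit payload length
    frame_type = FRAME_TYPES.get(octets[3], "UNKNOWN")
    flags = octets[4]
    stream_id = int.from_bytes(octets[5:9], "big") & 0x7FFFFFFF  # clear reserved bit
    return length, frame_type, flags, stream_id

# A 13-octet HEADERS frame carried on stream 1:
hdr = b"\x00\x00\x0d\x01\x04\x00\x00\x00\x01"
print(parse_frame_header(hdr))  # (13, 'HEADERS', 4, 1)
```

Because the stream identifier travels in every frame, frames belonging to different streams can arrive in any interleaved order and still be reassembled correctly.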
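The push behaviour in the index-page example can be sketched as follows: the server holds a mapping from a requested page to the assets it knows that page will need, and emits a PUSH_PROMISE for each asset before answering the original request. The dependency map and function names are assumptions for illustration, not a real server API:

```python
# Hedged sketch of server-push logic. PUSH_PROMISE announces a request the
# server intends to answer unasked, on a new server-initiated stream.
PUSH_MAP = {"/index.html": ["/styles/site.css"]}  # illustrative asset map

def handle_request(path: str):
    """Return the ordered frames a push-aware server might emit."""
    frames = []
    for pushed in PUSH_MAP.get(path, []):
        frames.append(("PUSH_PROMISE", pushed))   # promise the asset first
        frames.append(("HEADERS+DATA", pushed))   # then push its response
    frames.append(("HEADERS+DATA", path))         # finally answer the request
    return frames

print(handle_request("/index.html"))
# [('PUSH_PROMISE', '/styles/site.css'),
#  ('HEADERS+DATA', '/styles/site.css'),
#  ('HEADERS+DATA', '/index.html')]
```

Sending the promise before the pushed response lets the client cancel a push for an asset it already has cached, avoiding wasted bandwidth.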
There are a number of early deployments of this type of technology through SPDY ("Speedy"), and they are showing good results. While the advantages seem to outweigh the concerns, there will need to be a shift in responsibility from network infrastructure teams to server teams.