There have been many discussions in recent weeks about ‘net neutrality’ with mobile providers on one hand arguing that website operators like Facebook should contribute to their costs of maintaining a network and delivering content to users, whilst other parties are campaigning to force ISPs to treat all traffic ‘equally’.
A fact missed by many in this debate is that there is no concept of ‘net neutrality’ in the UK at this time, and I will argue that in fact network operators are already being selective in how they run their networks.
There are many good justifications for net neutrality; it encourages innovation and ensures that large network operators do not abuse their market position to increase the barriers to entry for new companies trying to break into the market.
We already see discrimination in, for example, how Apple approves applications into its iTunes store. Some level of quality control can protect consumers from malicious applications, or indeed encourage developers to be efficient in their use of resources such as battery life, something which is particularly significant on mobile platforms. By the same token, if websites had to pay some of the cost of routing traffic to mobile users, perhaps they would better optimise their sites.
Traffic management does, however, break the end-to-end principle which has been at the core of the Internet. That principle ensures that the range of competing software and hardware platforms can interoperate across the Internet using common standards, and requires that traffic is not interfered with in the middle of the stream, as this may have unintended consequences.
Traffic management, or traffic shaping, is the process of preferring certain traffic based on (a) its source/destination, (b) its protocol, or (c) some other factor. This can be useful, for example, to let ISPs prioritise streaming video over operating system downloads: the former is far more sensitive to performance issues, whereas the latter just ‘takes a bit longer’ and doesn’t overly impact the user experience.
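As a rough illustration of the mechanics, traffic shaping is commonly implemented with a token bucket: a flow is only forwarded when enough ‘tokens’ have accumulated, which caps its average rate while still allowing short bursts. The sketch below is a simplification (the rate, burst size and packet size are invented for the example, and a real shaper would queue rather than simply drop):

```python
class TokenBucket:
    """Shape a flow to an average rate, allowing bursts up to the bucket size."""

    def __init__(self, rate_bytes_per_s, burst_bytes):
        self.rate = rate_bytes_per_s   # long-term average rate (bytes/second)
        self.capacity = burst_bytes    # maximum burst the bucket will absorb
        self.tokens = burst_bytes      # start with a full bucket
        self.last = 0.0                # time of the previous packet

    def allow(self, packet_bytes, now):
        # Top the bucket up for the time elapsed since the last packet.
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.rate)
        self.last = now
        if packet_bytes <= self.tokens:
            self.tokens -= packet_bytes
            return True    # forward the packet
        return False       # drop (or queue) it: the flow is being shaped

# Offer 15 MB/s of 1500-byte packets to a 1 MB/s shaper with a 10 kB burst.
bucket = TokenBucket(rate_bytes_per_s=1_000_000, burst_bytes=10_000)
sent = sum(bucket.allow(1500, i * 0.0001) for i in range(1000))
```

After an initial burst, the shaper settles down to passing roughly one packet in fifteen, which is exactly the 1 MB/s it was configured for, however fast the traffic arrives.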
Let’s look at an example.
In the above diagram, a network operator “ISP X” runs a traffic management system which prioritises capacity to YouTube (owned by Google) and can be used to slow down videos from Flickr (owned by Yahoo) to ensure Google receive priority. This might happen if Google were to enter into an arrangement with the ISP to pay them for a higher quality service. (It should be pointed out that Google has generally supported net neutrality so the example is fictional):
“Allowing broadband carriers to control what people see and do online would fundamentally undermine the principles that have made the Internet such a success… A number of justifications have been created to support carrier control over consumer choices online; none stand up to scrutiny.” – Vint Cerf, Google Chief Internet Evangelist and co-developer of the Internet Protocol (IP)
Whilst there may seem to be no particular problem in paying for better access to a network, at what point does ‘standard’ access become so unusable that not paying effectively makes the service useless? If mobile operators had a commercial arrangement with, for example, Facebook that allowed its videos to stream in return for a fee, would this prevent a competitor such as Diaspora*, without the financial backing that comes with a large company like Facebook, from entering the market?
The problem already exists…
However, many people seem to think the Internet is ‘neutral’ at the moment. Those who have to manage networks know well that some networks already effectively cause traffic to some destinations to slow down. One manifestation of this is our speed test: one of the reasons some users get slower results is that a small number of (usually large) broadband operators intentionally allow their Internet connections to congest, and traffic from our speed tester just happens to be one of the applications affected. To us, this shows our speed tester experiencing real-life congestion on the broadband providers’ networks, and it’s something we can demonstrate by manually altering routing to send traffic to the broadband operator through “another route” (often a far longer one; think of using the country lanes rather than the motorway because the motorway isn’t wide enough to cope with rush-hour traffic).
Let me explain how this can occur:
The above diagram shows a user watching videos at YouTube and Flickr, owned by Google and Yahoo respectively. The ISP has made a decision to have a big ‘private interconnect’ (e.g. 10Gbps) with Google, whereas it only has a smaller interconnection with Yahoo (e.g. 1Gbps). These don’t necessarily have to be direct interconnections and may well flow through an Internet Exchange Point or a commercial transit provider; in the end, however, the ISP will have made a decision at some point in the network about the capacity of the link.
Let’s assume for a second that the ‘big’ and ‘small’ links above are 10Gbps and 1Gbps respectively, and that YouTube has 5Gbps of traffic whilst Flickr has only 1.5Gbps. Although YouTube has much more traffic, it can deliver its content without problems because the connection is faster than the actual traffic; Flickr, whose 1.5Gbps of traffic exceeds its 1Gbps link, is losing packets, resulting in poor performance.
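The arithmetic here can be made explicit. The sketch below uses the hypothetical figures from the example, and deliberately ignores queueing and TCP back-off: it simply assumes that any traffic offered beyond the link’s capacity has to be dropped.

```python
def loss_fraction(offered_gbps, link_gbps):
    """Fraction of offered traffic dropped when load exceeds link capacity.

    A simplification: real congestion also involves queueing delay and TCP
    backing off, but the excess traffic has to go somewhere.
    """
    excess = max(0.0, offered_gbps - link_gbps)
    return excess / offered_gbps

# Figures from the example: a 10Gbps interconnect to Google, 1Gbps to Yahoo.
youtube = loss_fraction(offered_gbps=5.0, link_gbps=10.0)  # headroom to spare
flickr = loss_fraction(offered_gbps=1.5, link_gbps=1.0)    # congested link

print(f"YouTube loss: {youtube:.0%}, Flickr loss: {flickr:.0%}")
```

Despite carrying three times as much traffic, YouTube loses nothing, while a third of Flickr’s traffic never arrives, which is more than enough to make video unwatchable.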
To remind you, your broadband provider makes a choice as to how big each of its Internet pipes is, and by allowing some to congest, they are making an active decision to prefer some traffic over other traffic.
Another way in which broadband providers can ensure their own services perform better is to connect them at high speed to their core network. So, for example, if the ISP above wants to start its own video hosting service, it can add a server cluster on a very fast connection, and therefore provide better connectivity for its own service than, say, Flickr can offer. Hosting content locally is a perfectly acceptable way to reduce costs; where it runs into problems is when the ISP allows its Internet connections to congest, effectively degrading the user’s browsing experience on competing services.
Many parties are often quick to condemn any type of traffic management activity and words like traffic shaping are often seen as negative influences on Internet services. I would however argue that traffic management is not evil per se, provided it is used transparently, and by that I mean in such a way that the user truly understands what is happening. I would further suggest the only effective way to ensure this is to give the user a live indicator which makes it clear when their traffic is being ‘shaped’ or ‘managed’.
We can’t build mass market broadband networks to deliver the peak bandwidth all of the time without seeing a significant increase in cost. It is possible to store electricity in dams and deliver it when the adverts start during an episode of Coronation Street, but network capacity has to be there all the time. For this reason, slowing down traffic which is not sensitive to speed or latency is an ideal way to make more efficient use of resources whilst ensuring the user experience is protected. It doesn’t really matter if the operating system update takes 10 minutes or 15, but it does matter if you’re getting stuttering during your Skype call with relatives on the other side of the world.
Where problems start happening is when service providers use these tools, and make decisions about running the network, not to improve general user experience, but to cause other traffic to deteriorate so much that it cannot compete. The Internet has flourished on being both open and innovative, often providing free services to users. A two-tier Internet risks strengthening the position of larger incumbents and discourages small start-ups from challenging the status quo, in the way open source software has been able to take on its commercial competitors. This ease of switching suppliers has ensured that no company can truly believe their premier status in any industry is assured, as consumers can quickly shift to competing services which are better. If someone develops a better social network than Facebook, a better search engine than Google, then they should be able to compete on quality, not merely on who has deeper pockets. Many of these large companies have benefited from this open network to grow to the position they are in now.
So in summary, I welcome trials of new business models as without them we can’t challenge the current assumptions, however I am most concerned about the way in which dominant broadband service providers (or content providers for that matter—it’s not inconceivable the BBC could ask ISPs to pay for the benefit of getting better connections to its network) may use their customers as a commodity to bargain preferential deals, sealed behind closed doors, and to the detriment of the consumer.