So the last year has seen the slow rise of a new video codec which, it is claimed, will provide a better-quality streaming video experience at the same bit rate, or let us squeeze 720p video into a stream that runs over a 1 Mbps connection. The codec is hitting the headlines because it has just been ratified by the ITU.

So why does this grind my gears? Simply because so many assume that existing HD video is as good as it gets. I cannot be alone in watching films and getting annoyed at the block artifacts that can appear on sharp scene changes or explosions, even on 4 Mbps H.264 streams that claim to be HD material.

I have seen the same event in raw 1080p, which means a bit rate of around 1.6 Gbps; in the edit suite, where it is usually produced as a master with some compression, perhaps down to 220 Mbps; transmitted over a satellite connection at 25 Mbps; and then compressed further until you get the block fest that is Freeview. Even the 10 Mbps HD streams from satellite TV pale into insignificance next to the original version.
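A quick back-of-envelope calculation shows just how extreme that compression chain is. The sketch below uses my own illustrative assumptions (4:2:2 chroma subsampling, 10 bits per sample, 25 fps — a common broadcast master format, not necessarily what was used here); with those numbers raw 1080p lands near 1 Gbps, and richer sampling such as 4:4:4 gets close to the 1.6 Gbps figure quoted above.

```python
# Back-of-envelope comparison of raw 1080p against the delivery
# bit rates mentioned in the post. The sampling assumptions are
# illustrative, not taken from the article.

def raw_bitrate_bps(width=1920, height=1080, fps=25,
                    bits_per_sample=10, samples_per_pixel=2.0):
    """Uncompressed video bit rate in bits per second.

    samples_per_pixel=2.0 models 4:2:2 (one luma plus, on average,
    one chroma sample per pixel); use 3.0 for 4:4:4, 1.5 for 4:2:0.
    """
    return width * height * samples_per_pixel * bits_per_sample * fps

raw = raw_bitrate_bps()  # ~1.04 Gbps under these assumptions
for name, mbps in [("edit-suite master", 220),
                   ("satellite contribution", 25),
                   ("satellite HD stream", 10),
                   ("4 Mbps 'HD' stream", 4)]:
    ratio = raw / (mbps * 1e6)
    print(f"{name:>22}: {mbps:>4} Mbps, roughly {ratio:.0f}:1 vs raw")
```

Even the 220 Mbps master is already around 5:1 compression, and a 4 Mbps stream is well past 250:1, which is why fast-moving scenes fall apart first.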

If H.265 results in content producers deciding to try to save bandwidth, then the codec will be a step back in time, in the same way that Freeview has created more channels at the expense of picture quality. The cost of satellite transponder capacity means there is a real chance of this happening, or that the next generation of TV standards will simply have the picture compressed down to fit existing infrastructure.

There is a Cisco demo of H.265 which looks impressive, but the key to H.265 will be not how it manages talking-head shows, but things like football matches, where the grass pitch can become a green blob with poor encoding, and, more importantly, how it manages the encoding and onward transmission of a live stream.

I am willing to bet that if you took a master of any film, played it on a TV in a store and labelled the set as ultra-HD, it would sell like hot cakes compared to the other models with their Freeview or Freesat feeds. Not because the TV is better, but because the feed has fewer artifacts and will blow people away with its clarity.

We should be increasing the bit rate of video at the same time as improving the codecs used; this is where the wow factor will come from. If we move towards using H.265 to squeeze something that people will buy believing it to be HD into 1 Mbps streams, then we will be giving the politicians less reason to bother with upgrading broadband and broadcast TV infrastructure.


4 Responses


  1. Alex Atkin on 09 Feb 2013

    I couldn’t agree more.

    We have already seen BBC HD and Sky pull this same trick. Both have switched H264 encoders to ones which SUPPOSEDLY could achieve the same quality at lower bitrates, and both were clearly far more blurry afterwards.

    I had episodes of Stargate Atlantis on my SkyHD box from both before and after the bitrate drop, it was incredibly obvious. The wow factor was lost, the clarity on textured areas no longer there.

    I was rather disappointed to find that, with Netflix changing their H264 encoder to one which “supposedly” is more efficient and thus gives a better picture at the same bitrate, all they are doing is using the SAME bitrate and calling it Super HD.

    If it’s the same bitrate then it’s just common sense that SOMETHING is being sacrificed.

    Why, when I have a 70Mbit uncapped broadband connection, can’t I have at LEAST 10Mbit for 1080p? Especially when this new Super HD feed is only enabled if your ISP has a Netflix cache on their network, or is peered directly to their CDN. It feels like a step backwards.

    I just watched Green Lantern on Now TV and the picture quality was HUGELY disappointing. On PC it showed terrible black crush and macroblocking; this doesn’t seem to be visible when played on an Xbox 360, but it’s still a VERY soft image, much worse than Netflix HD (and I don’t have access to Super HD on my ISP). The sound quality doesn’t fare any better either.

    I understand wanting to cram HD down people’s 1Mbit connections where possible, but that should not be a reason to impose this limitation on people with far better connections.

  2. Tim on 14 Feb 2013

    Where can I download a ‘master’ copy example of low-compressed 1080p HD and a heavily-compressed version to compare so I can see what you are talking about?

  3. Colin Barrett on 15 Feb 2013

    I absolutely agree. I spend most of my waking hours converting professional & amateur video from their analogue sources to something that enables them to survive in the digital world, so I spend a lot of time considering codecs, bitrates, file sizes, etc., and although I swear by H.264 for storage and playback on desktop screens, TV displays and mobile devices, I’m appalled at what I see on the telly.

    In my work, I have the advantage of relying on relatively slow software-based compression – often achieved in more than one pass – but TV transmission is a different kettle of fish and the real acid test of a codec.

    I’m very unimpressed with the mainstream “terrestrial” channels’ HD output for the reasons that Andrew states above – especially on fast scene changes, intricate detail and movement within detailed scenes.

    I haven’t yet seen H.265 in action, but I’ll be intrigued to see how it copes. My expectations aren’t high! I expect my gears to be similarly ground!

  4. andrew on 15 Feb 2013

    Tim – in terms of really low compression, I am having trouble finding stuff that can be shared.

    Random high bit rate stuff from the Internet can be a mixed bag, as some people only record at 20 to 30 Mbps and then re-encode at a higher bit rate.

    Perhaps one day I’ll have time to find some decent material and create lower bit rate copies to illustrate.

