The IP Development Network

Welcome to The IP Development Network Blog

Wednesday, 1 August 2007


iPlayer Technology Review

The last article described the iPlayer's service elements. My conclusion was that because it is tied to the PC and yet it lacks social networking, it misses the mark in a number of ways. In its current guise, it will be a niche application in spite of the wealth of content on offer. There is much work to be done on the device and service side to make it consumable - perhaps content isn't king after all...?

Here, I will be detailing my technical conclusions about the service.

File Sizes, Line Speeds & Encoding
As a user... and leaving aside the critical lack of a streaming capability for just a second, on a good quality LLU line the service delivery is fast. Files download at close to line rate: if you can get 10Mbps, you will download at just a little under that speed.

The Beta Test Blog shows a line trace sustained at around two thirds of the maximum speed on their 10M Be connection. Say 6-7Mbps of download speed in their results, which means that even the largest show, DanceX, a 75 minute 756MB extravaganza (suggesting encoding at 1.3Mbps), would only take 15 minutes to download. Top Gear, which is a more moderate 60 minute show at 387MB (860kbps encoding) would be ready to go in just under 8 minutes.

But only 30% of the population can get 10Mbps because of line lengths. My downloads also came down at close to my line speed, which for me was 2Mbps during the test period. The same Top Gear show took me 26 minutes to download, but at least the picture and sound quality are good, and I got the same end product as someone on a faster line.

Twenty-six minutes though... it's less time than it takes to play the show, but it's enough to make me lose interest and go and do something else. Can you imagine a 26 minute zap time on the Sky EPG...? The iPlayer is not readily consumable.

Perhaps with this in mind, the children's programmes seemed to be encoded at much, much lower rates - as low as 300kbps in some cases - which meant that zap times were almost bearable. I was surprised that this didn't seem to impact picture quality greatly. Iggle Piggle and his friends in The Night Garden were perhaps a touch fuzzy around the edges, but it was still watchable - if you are 3 years old and like that sort of thing.

Below is a table showing the file size, run time and the implied video encoding rate. I have also added a best case estimate of the download zap time at various line speeds.
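The implied encoding rates and best-case zap times above can be reproduced with a quick back-of-the-envelope calculation. Here is a sketch in Python, using the file sizes and run times quoted in this article (the line speeds are just illustrative points, not a claim about any particular connection):

```python
# Implied encoding rate and best-case download "zap time" estimates.
# File sizes and run times are the figures quoted in the article;
# the calculation ignores MB/MiB subtleties and protocol overhead.

MB_TO_MBIT = 8  # 1 megabyte = 8 megabits

shows = {
    # name: (file size in MB, run time in minutes)
    "DanceX": (756, 75),
    "Top Gear": (387, 60),
}

line_speeds_mbps = [1, 2, 6.5, 10]  # illustrative line rates

for name, (size_mb, runtime_min) in shows.items():
    size_mbit = size_mb * MB_TO_MBIT
    encoding_kbps = size_mbit * 1000 / (runtime_min * 60)
    print(f"{name}: implied encoding rate ~{encoding_kbps:.0f}kbps")
    for speed in line_speeds_mbps:
        # Best case: the download runs at full line rate throughout.
        zap_min = size_mbit / speed / 60
        print(f"  {speed}Mbps line -> ~{zap_min:.0f} min zap time")
```

This reproduces the figures in the text: DanceX works out at roughly 1.3Mbps encoding and about 15 minutes on a 6.5Mbps connection, and Top Gear at 860kbps encoding and about 26 minutes on my 2Mbps line.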

Download + Store, Zap Times & Progressive Downloads

My belief is that the future of this iteration of the iPlayer is confined to a small niche because it is built without progressive download capabilities. Of course, download-and-store is a nice intro to catch-up TV, but the PC is not the end-game for VOD, so the content is in a holding pattern waiting to be sent in two different directions.

Its current download-and-store model is suited to mobile TV because it avoids the unreliable and costly cellular data network. The mobile phone is now the jack of all trades, and it has replaced the PC in that role because it is more portable than a laptop. The mobile is best suited to the iPlayer's brand of personal, download-for-later viewing, where people who are short of time can snatch a few minutes here and there whilst commuting.

The other direction the iPlayer content needs to go is back onto the TV - but with the added on demand capability not yet in the new service. For this to be successful the zap time issue needs to be resolved and this requires progressive downloads (or Virgin Media's cable network).

This is not a bandwidth problem - at least for the majority of users at current resolutions - it's a service application problem. If I can download something in 26 minutes that takes 60 minutes to play, I have a decent buffer which would allow me to start viewing almost immediately while the rest of the file downloads in the background.
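That intuition is easy to put a number on: with a constant download rate and a constant playback (encoding) rate, if the line outpaces the encoding rate you can start almost immediately, and if not you need only wait long enough that the download finishes before playback catches it up. A hypothetical sketch of the arithmetic follows - this is my own formulation, not a description of how the iPlayer (or any progressive download client) is actually implemented:

```python
def start_delay_minutes(runtime_min, encoding_kbps, line_kbps):
    """Minimum wait before playback can start without stalling,
    assuming constant download and playback rates (a simplification)."""
    if line_kbps >= encoding_kbps:
        return 0.0  # download outpaces playback: start straight away
    # Otherwise the worst moment is the end of the file, so wait until
    # the whole download can complete before playback reaches it:
    # delay = runtime * (encoding_rate / line_rate - 1)
    return runtime_min * (encoding_kbps / line_kbps - 1)

# Top Gear (60 min at ~860kbps) on my 2Mbps line: no wait needed.
print(start_delay_minutes(60, 860, 2000))
# The same show on a 512kbps line: roughly a 41 minute wait.
print(round(start_delay_minutes(60, 860, 512)))
```

In other words, for anyone whose line rate exceeds the encoding rate - the majority at current resolutions - the 26 minute zap time is purely an artefact of the download-and-store design.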

Managing Compromises
For some, progressive downloads would mean lower picture quality, and it seems that the BBC is keen to avoid tarnishing its content with this brush. With progressive downloads, if the line rate is not comfortably above the standard encoding rate, the content needs to be re-encoded at a lower rate, delivering a lower resolution.

Right now I get the same end product as the other guy on a 10M pipe, I just have to wait longer for it. The compromise is very egalitarian, even if that means that very few will be truly satisfied.

Because the BBC is publicly funded, it has probably had to design for a wide base of users meaning that it has to manage these compromises where Joost, Babelgum & Veoh can simply ignore low speed users as "unsuitable". The flip side of this is that if BBC users expect to wait, then perhaps there is a window of opportunity to crank up the resolutions to true HD for the iPlayer's small niche audience. What is the difference between waiting 26 minutes and 2 hours? The mass market won't go for either.

The BSG report contained data showing that 20% of UK households will not get more than 1Mbps, even with LLU, because of long copper loops. Is the BBC brave enough to behave like a business and shrug its shoulders at the Digital Divide? Can it develop the service knowing that at least 20% of its licence fee payers won't be able to get it?

Peer to Peer or Client Server?
It is too early to judge the peer to peer network, but we have taken some benchmark readings. These show that on Friday night - before most of the new wave of Betas had been activated - even the headline content (Top Gear) was coming directly from the BBC's servers.

Chart 1 - Top Gear Download, Friday 27th July at 8pm
BBC in Red, Peer on Virgin Cable in Blue

The black total line (total inbound bytes over time) is very close to the red bars (the bytes that came from the BBC's servers). The small blue line about halfway through is a peer on Virgin's cable network that, for a short period, contributed some data.

Saturday morning's kiddie TV also came directly from the core sites, but by the time Monday morning came along and many of the new Betas appeared online, there was much more interesting stuff to consider.

The red lines are again traffic from the BBC, while the black outline shows the total inbound traffic including the BBC's. The behaviour of the peer to peer network is shown clearly here.

Chart 2 - Mountain Download I, Monday 30th July at 10am. BBC in red

First a small volume arrives, which the application deems to be insufficient so it calls up the BBC to get things going. No sooner does this happen than a peer appears and starts contributing data. This peer can be seen as the blue line on Chart 3 below.

The throughput was low and I was unsatisfied with the speed, so after about 500 seconds I paused the download and resumed it a few seconds later to see what would happen. Very interestingly, the BBC almost immediately began filling my requirement. The lesson? If your download is slow, pause and resume the transfer - you might get the BBC to notice you - although I'm not sure whether this is an intentional "design feature"...

In this middle period, the end of which is a second experimental pause, there is very little peer to peer traffic. After the second restart, you can see that the download begins to gather traction as the BBC gradually eases out, to be replaced by the green line in Chart 3, which is a computer at Edinburgh University. The pink line shows the aggregate of other peers within the sample.

Chart 3 - Mountain Download I, Monday 30th July at 10am
BBC in red, Loughborough Uni in Blue, Edinburgh Uni in Green, Others in Pink

iPlayer Kontiki P2P picks on key sources
That was the first part of Mountain. Because the Wireshark output was getting rather large, I stopped again and started a clean trace. Chart 4 below shows that the Edinburgh peer dominates the traffic sources for the rest of the download. This very much fits the overall pattern I observed, where the iPlayer seeks out the fastest single connection and tries to get as much as possible from that one source.

Chart 4 - Mountain Download II, Monday 30th July at 10am
Edinburgh Uni in Green, Peer on BT Central Plus in Blue

The final sample of iPlayer data was taken at peak internet viewing time on Monday evening. I downloaded the DanceX file which was by far the largest and longest playing. The first 5 minutes of the download are shown below in Chart 5. Again you can see an attempt to find peers is initiated before the BBC picks up the slack. This time, the release back to peers is supported better until Cambridge University gradually assumes almost the entire load.

Chart 5, DanceX Download, Monday 30th July at 8pm
BBC is red, Cambridge Uni is Blue, Peer on Virgin Cable in Green

My connection as a peer
The other side of the coin is that the application uses my upstream connectivity to share files with other peers. This shows some curious behaviour that is worth looking at.

Chart 6 - Mountain Upload I, Monday 30th July 10am
Peer on PIPEX is Green, Peer on BT is Blue, Peer on Hi Velocity is Pink

It almost looks like the PIPEX and BT peers are fighting over who gets my bandwidth. The PIPEX peer is the first to become established before the BT peer comes along and demands the files. Then, like two children squabbling over a toy, they play a game of tug-of-war before the BT peer seems to give up. Having "won" the battle, the PIPEX peer also seems to lose interest and eventually disappears.

Of course, the application may be designed to burst like this, but it does seem to end in a fairly inefficient use of the available resource. On top of this, the moderately lengthy spikes are not great for aggregation by the ISP - think of the space used in a jar of pencils against a jar of marbles and you can probably picture what I mean.

Uploads continue even when off!
I'm not going to make too big a thing of this, because my upstream usage is free - I pay for download usage only - but it is an application characteristic that deserves to be noted.

My traces have shown that in each case, after the Top Gear and Mountain downloads completed, uploading activity to one peer continued after closing the library and even after closing the application in the taskbar. The only way to stop it was to power down the PC...

I'll keep an eye on this and report again next time.

Ping & Traceroutes
A significant finding of my Monday downloads was that the majority of peer to peer traffic is coming to me not from within my own ISP's network, but from university networks throughout the UK. These are clearly on very high capacity connections, although their ping times were slower (~40ms) than the BBC sources (~30ms) that they replaced in my delivery chain.

Interestingly, both of these are faster than round-trip times to other broadband users on Zen's IPStream network (~65ms). Of course, Zen's servers are the first IP-layer devices that the traceroute sees, but it seems to be quicker to interconnect with JANET and get to a university campus than it is to remain on Zen and go back out through their BT Centrals. Connecting to other subscribers on BT's Central Plus shows the same phenomenon (~80ms), the extra time being the interconnection time between Zen and BT.

This perhaps explains why there are very few IPStream users acting as peers in my results. The majority of P2P is with users on university LANs and Virgin's cable network. Pings to cable customers are blocked by Virgin, but up to the point where they are blocked, times are fast (~40ms), suggesting that those users are quicker to get to than other IPStream users on my ISP's network.

The scarcity of IPStream peers is good for BT and their wholesale customers, but Virgin is also known to be short of upstream capacity, so the knock-on impact may not be good for them. Cable peers are also among the first UK sites to pop up in Joost traces.

Traceroutes of all the major sources of data show that Zen is taking delivery of the traffic at the LINX or MANAP peering points. LINX is where Zen interconnects with JANET who provide the UK university network backbone, and with BT. Interconnection with Virgin Cable seems to be preferred at MANAP.

The one exception I have noted is that a small amount of traffic was exchanged with a Hi Velocity subscriber - particularly on my upstream. Zen does not seem to peer with them, as the traces show these packets going through Cogent's network.

Comparing Kontiki and Joost
Unlike Joost, the iPlayer seems happy with a small number of high speed peers. Joost will try to reach out to as many as four sources simultaneously, with each of these responsible for only a small piece of the file. This is also a clear advantage for Joost's DRM, because no one peer has enough of the file to make it worth cracking.

Chart 7 - Joost Ferrari 340 Download, 11 July at 11am
Coloured lines show various peers, black is total download.

You can very clearly see how well structured the Joost P2P protocol is from this trace. This contrasts with the somewhat chaotic nature of the iPlayer traces. Each Joost peer seems to have a clearly pre-defined role, while the iPlayer Kontiki equivalent seems to be fighting with its sources as discussed above.

Looking at my connection as a peer on Joost, you can see the other side of the same equation. The green bars are my computer sending to a peer in Canada (via Time Warner Telecom), the red is to a destination in Norway (via Telia).

Chart 8 - Joost Ferrari 340 Upload, 11 July at 11am
Black shows total upload, Peer in Norway is Red, Peer in Canada is Green

By contrast, the iPlayer seems happiest with one very fast peer, such as the one on Edinburgh University's network that sent me 145MB of the total 267MB in the Mountain download.

It is almost certainly an oversimplification, but Kontiki seems to be very aggressive at pulling in contributors - like the community do-gooder that we all know and love. And as in that example, iPlayer peers seem reluctant to contribute, dropping off before bouncing back when no one else takes their place. By the looks of things, it is really only the universities that want to share the iPlayer, perhaps because so much is demanded of the volunteers who do step up.

Joost peers are much more distributed - everyone does a little bit, rather than one source getting burdened with the majority of the demand. This organisation is good enough to allow the Joost application to pause the download and wait until the buffer has been depleted before initiating further data downloads.

Although it is too early to draw conclusions on the iPlayer based on the first weekend's data set, it would appear that it has a lot to do to develop into the clean, controllable distribution mechanism that Joost clearly is already.

Designed to help ISPs?
In reply to my pre-launch post, Angus suggested that the solution was for ISPs to run the iPlayer on their own high speed servers, so as to serve all the traffic from within their networks.

I wonder whether the application is behaving as it does with the university networks because it is designed to work with ISPs in this way. It may well be that a few fast peers on gigabit links at the ISP's data centre could take responsibility for serving their user base - saving peering costs if nothing else.

Zen, at least, is not there yet. If / when they are, will the iPlayer prefer their sources ahead of those on JANET? The JANET response times are pretty good, but if anyone knows of ISPs hosting iPlayer servers, let me know and I'll run traces on their links to see...

Serving the traffic from within the network will eliminate the peering cost, but it still leaves a significant backhaul element on the ISP. I will be looking into the commercial implications of the iPlayer in a final article on Friday, where I will also write up my overall conclusions.


The problem with them relying upon the p2p to spread delivery, if that's their real aim, is just the same as you highlighted with Joost.

It's the "long tail" programmes that I would want to use this service for. Big programmes that I know about are already being recorded by my PVR each week.

The stuff *I* want to be able to download are the small, niche programmes that I don't find out about until after transmission (usually when seeing trails at the end of whatever my PVR did record.)

That's why content needs to be available for longer than a week, and beyond the transmission of subsequent episodes - we need to be able to "catch up" with what we missed on broadcast. P2P won't help much with that, either.

(But this is all academic. I've got a Mac.)

Now THAT's a comprehensive tech review! A quality post there - you must have spent more than a couple of hours on that.

Thanks for the linkage, I'll have to update my post to link back to you and this article because it's one of the most comprehensive reviews I've read that focuses on the iPlayer's underlying tech. A fascinating read, and one I'll be recommending to others.

Christopher (TBTB)