The IP Development Network

Welcome to The IP Development Network Blog

Thursday, 23 August 2007


Is there a Wizard at Ofcom?

Or is that a dunce's hat they are wearing?

Do Ofcom Have a Clue?
The most subjective point that I made in my recent iPlayer series, and in particular in my article on The Register, was that Ofcom know what they are doing. I have had a fair amount of feedback along the lines of the following:

"Ofcom really don't have a clue about anything and just are pushed from pillar to post by the amount of lobbying going on. You are really naive if you think that someone in Ofcom is really the Wizard pulling all the levers in the background."

Is BT the Power Behind the Scenes?
This thesis holds that actually, it is BT that holds power. The theory goes that BT are able to make a weak minded Ofcom accede to their every wish through their use of Jedi mind tricks.

I agree that the Force is strong with BT. Their Regulatory department is staffed by the brightest people in Telecoms, whose role it is to confuse the heck out of the rest of us. Furthermore, there is evidence that BT seems able to pull victory from the jaws of defeat when it seems that they have been beaten into submission - but does this mean that we are under the spell of a great magician? Or is it just that (unlike everyone else), they just get on with it when decisions go against them?

In the late 1990s, it is certainly true that Oftel were pulled from pillar to post. They were naive and believed Mercury every time a grievance was raised. Similarly, they fell under the spell of MFS magicians who worked tirelessly for changes to the market that altered the telecoms landscape in the UK and throughout Europe.

Many of those same people who fought BT in the 90s have now found themselves at BT since the company's renaissance in the 00s. Another example of poacher turned gamekeeper in telecoms...

BT has Good Reason to Like the iPlayer
It is certainly true that BT wins from the iPlayer's launch. Their wholesale product still covers 34% of the market (excluding Retail) and the Capacity Based Charging scheme means that any extra iPlayer-driven usage within this base just adds to BT's profits. The price of the BT Central product is worth noting - £155 per Mbps per month.

In the long term, iPlayer-driven migration to LLU may cost BT subscribers, but certainly within the short term, the additional usage will drive wholesale revenues that more than fill any gap. Over the long term, the iPlayer is likely to drive backhaul circuit investments from LLU operators, which is revenue to BT too.
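To see why extra usage flows straight to BT's wholesale line, here is a back-of-envelope sketch of Capacity Based Charging. The £155 per Mbps per month figure is from this article; the per-user peak contributions are hypothetical assumptions, chosen only to illustrate the shape of the economics.

```python
# Sketch of BT Central economics under Capacity Based Charging.
# The price is from the article; per-user peak figures are assumed.

CENTRAL_PRICE_PER_MBPS = 155.0  # £ per Mbps per month (from the article)

def monthly_cost_per_user(peak_kbps_per_user):
    """Wholesale backhaul cost attributable to one user, given the
    average bandwidth they contribute to the peak-hour load."""
    return (peak_kbps_per_user / 1000.0) * CENTRAL_PRICE_PER_MBPS

# Hypothetical: a light user contributing 20 kbps at peak vs. an
# iPlayer viewer contributing 100 kbps at peak.
light = monthly_cost_per_user(20)    # £3.10
viewer = monthly_cost_per_user(100)  # £15.50
print(f"light user: £{light:.2f}/month, iPlayer viewer: £{viewer:.2f}/month")
```

On these assumed numbers, every light user who turns into an iPlayer viewer multiplies the ISP's wholesale bill several times over - and all of that extra spend lands with BT.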

A Strategic Decision to Go Ahead
Of course the BBC Trust made the final decision to go ahead - but if Ofcom had turned to them and said, "look, there's an £831m bill, the ISPs don't have that kind of money", the Trust's hand would have been forced.

But Ofcom did not say that - the discussion probably went more along the lines of "look, there's an £831m bill, the ISPs will grumble but the investment is for their own good". In fact when the MIA was announced to the world, the press release made no note of the cost whatsoever and you have to work down to page 103 before you get to the detailed assessment! So did BT persuade Ofcom that the bill was an acceptable cost and not to make a fuss about it, or did Ofcom make its own mind up about that?

I see no evidence of BT being more prepared than the other players, which is often a dead giveaway that a decision has gone their way. In fact the initial suggestion that they too were fighting the launch, however off-the-record and however quickly retracted, suggests that there was no grand plan behind this. BT executives were clearly not all on-message.

My view is that Ofcom are now pulling the levers. More specifically, it seems that the levers are being jerked around violently as Ofcom battles to rein in the huge range of stakeholders over whom it has an influence. My conclusion is based on the fact that they seem able to do no right by anyone, which suggests to me that they are trying hard to balance opposing interests.

Equality of Hardship
Let's look at who Ofcom have upset recently... First came structural separation - it cannot be argued that this has helped BT because of the amount of work involved to demerge Openreach and create a set of systems and processes that could support the new design of wholesale market. It might be in the group's long term interests, but that is more due to the quid-pro-quo that saw the chains removed from BT Retail.

Then Ofcom upset the Broadband Stakeholder Group and a previous Ofcom boss, Kip Meek, who feel that we are not doing enough to prevent Britain becoming a digital backwater. Next Ofcom upset customers and the market with their tacit acceptance of two-tier pricing, before they upset the politicians who questioned whether the organisation was too big for its boots and was making political decisions.

Just when I was beginning to think that the interests of investors in LLU broadband were the only ones that had not been targeted, along came the iPlayer. Those players have had a good run of it for a couple of years, but the iPlayer brings down the hammer to signal the start of a new and much bigger wave of investment for them.

Ofcom Understand the Dynamics
Ofcom clearly understand the market - assuming that they are unique in having read their entire Communications Market Report. Their 2007 version was published today and is a veritable goliath as research pieces go. I have printed it double sided on A4 paper and even without the Radio sections, the report sits over an inch high on my desk. I am sorry for the tree, but this is the only way to digest so much information. I am assuming that Ofcom have digested it...

So is it a wizard at Ofcom or a Jedi at BT? It may well be both - but it does not appear to me that BT is being favoured by the regime. Where BT wins is that it plays the regulatory game where ISPs do not.

The Regulatory Game
The game consists of huge volumes of data and documentation being passed backwards and forwards. What tends to happen is that the crucial part is in a footnote on page 142 where one sentence changes the market. BT will take time to read this and reply in kind, whereas the ISPs just don't put enough effort into unravelling the mysteries that BT and Ofcom conjure up until it is too late - and the iPlayer launches.

Is it Ofcom's fault that the game is played this way? Should they be made to stand in the corner and be subjected to ridicule for allowing BT to run rings around them? In my view, the answer is no - detail is a feature of regulatory policy making or there will be loopholes. Every player in the market knows this and has used it to their advantage in the past. If some players in the market believe that there are higher priorities at a given moment than responding to consultation, then that is their decision.

Trying to play the victim when the result doesn't go your way just strikes me as bad sportsmanship.



Thursday, 16 August 2007


Unlimited* Broadband

Suddenly everyone is a power user. I bet you didn't see that coming...

Wakey Wakey!
The iPlayer is a wake up call because we can all now see the beginnings of the final product. ISPs have known about it for years - the market estimates have been openly shared - but perhaps because the development process and consultation took 4 years, they might have forgotten that this day would someday come. Now the product is out there - with a prized place on the BBC's web site - all those light users that made the economics work (just) are suddenly potential power users too.

The BBC's online media organisation is formidable and is a mass market proposition. It is the 5th most popular UK site according to Hitwise - most of us have sampled online video and radio from them already. Streaming clips validated the concept of online video, but the iPlayer brings the promise of what has been lacking so far - stuff that lots of people want to sit back and watch. Only the networks - with their threats of throttling and extra charges - stand in the way of mass market adoption.

They Got Themselves Into This Mess
It is hard to feel sympathy: we all know that ISPs have made decisions that have put them where they are today as they fought their way through the land grab of the last few years. The result is a market where customers think they are buying one thing, while their suppliers are delivering something different. What does "unlimited" mean to you? What does "unlimited*" mean to you?

The asterisk is vital as we all know, but even if you read the Terms & Conditions to find the Fair Use Policy (FUP), you are unlikely to be left with the impression that it is going to affect you. The policies talk of using P2P and filesharing applications as if they were some sort of nasty disease that you are very unlikely to catch. Some ISPs were up front about it - capped products were launched - but they really weren't very popular. Because they were trying to grow numbers in an expanding market, there remained the option to go unlimited* for just a little bit more money each month. And for a while, the model worked, especially when the market price hit the magic £17 per month tipping point.

Problem, What Problem?
Power users were simply not a problem for most ISPs because they became such a small corner of the base. As prices fell, adoption rates soared and ever lighter users were added to the network reducing average usage and actually making the price cuts work financially.

The trick with fixed price models is to set the price at a point where even light usage customers choose it anyway because it gives them certainty in their monthly bill for a reasonable price. The "under-utilisation" of your new customers actually makes average usage fall which reduces cost per customer. Set the fixed price too high and you only get the power users for whom the service is still cheap. Set it too low and you know what happens...
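The averaging effect described above can be put into a toy model: as a land grab adds ever lighter users to the base, average usage per customer falls, and with it the cost of serving each one. All the numbers below are illustrative assumptions, not market data.

```python
# Toy model of the fixed-price averaging effect: adding light users
# dilutes average usage, so cost per customer falls as the base grows.
# All figures are illustrative assumptions.

def average_usage(base):
    """base: list of (number_of_users, GB_per_month) tuples."""
    users = sum(n for n, _ in base)
    total = sum(n * gb for n, gb in base)
    return total / users

before = [(1_000, 30), (100, 300)]   # moderate users plus a few power users
after  = before + [(5_000, 2)]       # land-grab adds many light users

print(average_usage(before))  # ~54.5 GB per user per month
print(average_usage(after))   # ~11.5 GB per user per month
```

The power users are still there at the end, using exactly as much as before - they have just been buried in the average. Which is why the model only breaks when the light users start behaving like power users.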

But the chickens are coming home to roost. The market is saturating and the inevitable has happened: light users now have the urge to use video filesharing applications too. Only we're not talking about mininova or some diseased video pirates now, it's the iPlayer from that bastion of British media, the BBC.

P2P: The Disruptive Force
Has the BBC caught the disease too, or are the ISPs wrong to treat filesharing as a parasite? It was certainly easier when P2P meant bootleg content. Then, service providers probably held the moral high ground even perhaps protecting the interests of media organisations in a strange sort of way.

Now though, mainstream media is using P2P technology because it delivers them a lower cost of distribution. P2P was necessary in the piracy world because viewers were not paying customers and a way had to be found to offload the cost. The solution was brilliant - use the spare CPU, RAM, disk and bandwidth of all users to remove the need for central servers that would a) be traceable and b) cost money. Is this a necessary move from big businesses or is it predatory?

Big media have turned the poacher into the gamekeeper. Of course P2P saves them money but it also helps their DRM by fragmenting the file into disparate pieces on its journey across the internet. The technology works in their interests but it does so at the expense of the ISPs. I'll save writing about the black arts of P2P economics for another day, but suffice to say, P2P generates a lot of extra upstream traffic and disaggregates traffic flows making them very difficult to manage (ie. it costs more). There are solutions, but that too is another article.

If big media was paying their share of distribution costs then perhaps the ISPs' concerns would have a hollow tone. This is just not how the internet works: the BBC grant free peering in much the same way as peasants receive an invitation to one of the Queen's Garden Parties - something that is inconceivable in reverse. The fact is that users want this content out there and they don't care who their ISP is as long as the connection is free(ish). The ISPs are over a barrel.

What is Really Happening Now?
But let's take a reality check and look at traffic across the LINX peering point, where the iPlayer's impact on network bandwidth is likely to be seen first. Although traffic is up this week, it has almost certainly been due to wind and rain rather than diseases running wild over the network.

It certainly sounds like the apocalypse may be coming but in fact there is no real evidence of any iPlayer growth in demand although you would not expect to see growth in August. There may be signs that the seasonal lull may not be as obvious as in past years, but that could just be the terrible weather this summer. It will be interesting to keep an eye on these graphs in the autumn when the days get shorter.

Supply is NOT Infinite!
Before dismissing the problem, look at the year-on-year picture at LINX. Peak traffic loads are close to double what they were a year ago, while connection numbers are only increasing by 15% on an annualised basis. In other words, 115% of the customers are using 200% of the bandwidth used a year ago, indicating that usage per user, even before the iPlayer, may be going up by as much as 75% every year.
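The per-user figure follows directly from the two growth rates in the article - dividing traffic growth by connection growth gives the change in usage per connection:

```python
# Checking the year-on-year arithmetic: peak traffic roughly doubled
# while connections grew about 15%, so per-user usage grew by a factor
# of 2.0 / 1.15 ≈ 1.74 - around three-quarters in a single year.

traffic_growth = 2.00      # peak LINX traffic vs. a year ago (from the article)
connection_growth = 1.15   # broadband connections vs. a year ago (from the article)

per_user_growth = traffic_growth / connection_growth
print(f"usage per user: {(per_user_growth - 1) * 100:.0f}% up year on year")
```

Strictly it comes out at about 74%, so "as much as 75%" is a fair rounding.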

Has your broadband bill gone up by 75% in the last year? Probably not... you have an unlimited* product. Maybe though the * is getting bigger and more ominous? Am I going to get punished for watching the Beeb?

If you feel like this you are not alone - it's going to be an issue for everyone very soon. Looking at some estimates of the bandwidth impact, you can see the iPlayer itself - one application - being responsible for as much traffic in 2010 as is carried from every other source put together now. Total traffic will grow tenfold if ITV, Sky and the others follow suit.

It has amused me to see the rekindling of the network neutrality "debate" in response to the iPlayer launch. Network Neutrality is not a debate, it is a faith and the debate is no more constructive than arguing with someone about their religion. I agree with Martin Geddes - he's my god on this issue.

No, the problem is not that ISPs want extra money for carrying this traffic just to increase their bottom lines - although they would of course take it if they could. The problem is that most of them still haven't paid back the last loans that took them into broadband and are going to have to find more money from customers with unlimited* usage to pay the £831m iPlayer bill.

We Could All See it Coming
Someone asked me once if I had £6m would I put it into their LLU project. Only if I had £60m in the bank, I said, because it was clear a long time back that investment was a recurring theme of the broadband business model. That was before a public body came along and wanted to double the load on the networks and land them with the bill. My father researched black holes for a living in his career as an astronomer. I often feel like I am doing the same thing when I look at telecoms economics.

ISPs knew what was coming in the iPlayer. Perhaps they didn't believe it possible that the BBC would get this far. Perhaps they took their eye off the ball in the price war deathmatch? You don't want to worry your customers unnecessarily - especially when you are in land-grab mode - but more should have been done by the big players to clarify exactly what they mean by unlimited* before the problem arose. Maybe that is what is happening now?

Even now though there are gains to be made and there is a game being played out. Tiscali are playing chief bad-guy, perhaps because if TalkTalk had tried taking on that role, Dunstone would have been given the Graham Taylor treatment. Others are staying out of it knowing that they haven't dug themselves in quite as deeply and may be able to profit from the negative PR that the two fighting the case will surely receive.

In spite of the fact that the problem that the industry has caused itself has become so apparent*, they are still looking for ways to get one up on each other. That's competition and it shows that the market is working as it has been designed to.

But has it been designed well? Will ISPs find a way to make extra charges stick? If they do not, where is the money coming from to pay back the LLU bill, let alone the iPlayer bill? Will we see a further wave of Telco bankruptcies as yet another round of investment is written off and sold for pennies in the pound deepening the vicious circle of price decline and under-investment? If no money is made from LLU, who is going to lend the money to build fibre?

ISPs need to act as a cartel* on this, but then that is illegal... We're back to the natural monopoly issue again.

* subject to fair use policy. See Orange's as an example. Shockingly vague - I have captured it here for the historical record as I suspect this may have to change! Tiscali's is a little better, but it still tries to brush the problem aside "If you don't use Peer to Peer or file sharing software it is unlikely you will ever be affected by this Fair Usage Policy"



Wednesday, 15 August 2007


Offcuts and Afterthoughts

When you write to a word limit, as I did in my iPlayer Politics piece for The Register, there is often a fair amount that hits the cutting room floor. This article picks up on a few of those themes and tries to answer an excellent question I received on the piece.

The Question
The question from Chris Fraser really gets to the heart of the debate from a user's perspective. An educated user, yes, but a user nonetheless.

"Why is it that once again we are being told by UK ISPs that our systems are not capable of delivering the type of service that has been available on the continent for some time? I am willing to believe that maybe the infrastructure is not up to the task. If that is the case why are they not willing to make the same investments as their European counterparts?

"Please can you give me a legitimate reason why Ofcom should not be forcing these ISPs to put their hands in their pocket and actually pay for a less out of date infrastructure when some of them are posting huge profits in their yearly financial accounts reports?"

History of UK Internet Access
The answer to this starts in the mid 1990s when internet access was via dial up and the vast majority of internet content was in English (US English to be precise). At that time, France, Italy and Spain in particular lagged behind in adoption rates because of the language barrier, and so when DSL came along in the late 1990s, there was little to lose for the industry in making the step straight to DSL.

The other factor was the pricing schemes for dial up access. In the UK, thanks to Freeserve, as elsewhere, it became well established that dial up was via a local rate number. The difference was that local rate at peak times in the UK was close to 4p per minute. In France, Sweden and the Netherlands it was closer to 1p.

The UK's 0845 scheme was set up to provide artificially high margins (excess profits) to companies willing to invest in voice switches. This decision was made before dial up access was popular and was originally intended to spur competition in the voice market.

The result however was that dial up minutes swamped everything else and new entrant voice operators (OLOs) were guaranteed the lion's share (~70%) of the consumer price. A lot of this went in revenue share to ISPs, who had control over whose network their users' 0845 minutes were carried by.

So with this nice gravy train benefiting OLOs and ISPs alike, no-one really wanted to do broadband. It took until 2003 for the latent demand to explode and DSL to really be taken seriously. This was fully 4 years behind France - a gap which we are probably still seeing today.

Other Factors
Population density is also a factor - Paris, Amsterdam and Stockholm are where the most notable OLO FTTH projects are now appearing in Europe. There you have a lot of multi-tenancy buildings which are much cheaper to connect than the individual dwellings we prefer here in the UK.

Laying fibre is not cheap, £600 per home in a city or thereabouts. If all you are doing is spending money to serve the same amount of revenue (prices do not go up when bandwidth increases) you get into the Broadband Incentive Problem. That's an article in itself but it is worth noting that BT may well soon start building fibre networks in new build estates - no copper, just fibre - where there is not the cannibalisation issue.

The Digital Divide and Natural Monopolies
The other issue that needs to be thought through is the whole Digital Divide problem. Is it right that ever faster broadband speeds are made available where it is economic and not where it is not? We have the luxury of being able to consider this still, because when you get to where France is now, you are on a one way street and may have to use significant state funds to address the problem. Do we as taxpayers want to do this in the UK?

The problem in my view is that telecoms infrastructure is a natural monopoly and competition is artificially imposed. The most efficient model is one network big enough to serve all and using cross subsidisation to level out inequalities in pricing and access speeds. Of course then the issue is not an economic one but a behavioural one.

Behaviour is less of a problem when you have multiple companies essentially reselling the monopoly asset at a regulated price, but then the problems are economic. All ISPs can do is stick a badge on the product and do some creative packaging. The value we as consumers see is not from the bits and bytes - they are a necessary evil - so we look for our broadband to be as cheap or as free as possible.

The LLU Factor Makes the Price War Worse
Some would argue that the last paragraph only describes the IPStream-based offers. LLU is different because the operators own the kit in the exchanges and control the circuits back to their core networks. LLU is cheaper than IPStream if you have more than around 300 connections off an individual exchange, and it gets cheaper still the more users you have in that small geography.
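The crossover works because LLU trades a fixed cost per exchange (space, power, backhaul) for a lower per-line cost. The ~300-line figure is from the article; the cost numbers in this sketch are hypothetical, chosen only to reproduce that crossover:

```python
# Illustrative break-even sketch for LLU vs. IPStream at one exchange.
# The ~300-line crossover is from the article; the cost figures are
# assumed for illustration, not real wholesale prices.

LLU_FIXED_PER_EXCHANGE = 1500.0  # £/month: space, power, backhaul (assumed)
LLU_PER_LINE = 5.0               # £/month per unbundled line (assumed)
IPSTREAM_PER_LINE = 10.0         # £/month per wholesale line (assumed)

def cheaper_option(lines):
    """Return which supply model is cheaper at this exchange size."""
    llu = LLU_FIXED_PER_EXCHANGE + LLU_PER_LINE * lines
    ipstream = IPSTREAM_PER_LINE * lines
    return "LLU" if llu < ipstream else "IPStream"

print(cheaper_option(200))  # IPStream
print(cheaper_option(400))  # LLU
```

With these numbers the break-even sits at 1500 / (10 - 5) = 300 lines - and every line beyond that widens LLU's advantage, which is exactly the incentive that fuels the price war described next.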

There the ISP has an incentive to invest in LLU, but once you have made that investment there is an incentive to play out a price war to grab market share. Consider the game theory behind the price war.

The reason ISPs lose their investment is because once you have made the investment, the next rational move is to reduce pricing in order to fill the network you just built because your incremental costs are extremely low. Obviously if everyone sees it this way (and they do) you end up with a continuation of the price war, only this time with a much lower floor.

This is not a unique problem for telecoms. Washing powder and countless other FMCGs have the same dynamic - for them investment in LLU is replaced by investment in production capacity. Once you have invested and find competitors have done the same, you might as well throw the original business plan away because the pricing power assumptions you might have made are just no longer there.

Recreating the Monopoly
The way out of this predicament is to rebuild your monopoly power through acquisitions, but when you do that you are again paying over the odds because the companies are valued by stock markets knowing the very weak position in which the buyer finds themselves. That's why so many companies that went on acquisition sprees find themselves in a bankruptcy position. The Goodwill you are buying is just not real because the product is such a commodity.

The final problem is that if you are really successful and get too big or rebuild the monopoly too far, you have the spectre of regulation and consumer group pressure as the US carriers are now finding since the re-creation of AT&T.

The Incumbents
The above is a very long winded answer to most of the question, but it still only deals with the position of a competitive carrier. The position of an incumbent is very different indeed.

There are really two incumbents in the UK - BT of course, and Virgin Media with all the old cable franchise assets. I'll come back to BT in a minute because that is where the profits that Chris mentions in his question are being made.

Virgin has found itself in a strong market position - a technically superior service to BT's for the 47% on a cable run - but in a mess organisationally. Trust me, acquisition integration is nowhere near as easy as the CEOs will perhaps suggest in their briefings to investors. I liken it to spaghetti - a lot of customer service and network management systems that have been designed in isolation but have been brought together under one brand. I feel a huge amount of sympathy for support agents dealing with quad play customers because the information they have at their disposal is so poor!

BT is very different - they have a monopoly over the infrastructure serving the other 53% and they face a very different problem. They make huge, humongous investments on a periodic basis, as they are with 21CN, which give them far more capacity than they need immediately on the routes they build. They are regulated to wholesale this product but they can't suddenly drop the prices to their new cost base because there is no immediate demand, so they have to leak it out gradually.

Nevertheless, both incumbents are in the same position: they have made investments and their shareholders want a return before the next wave of spending is released. Virgin in particular need to give back before they take more - which is why Private Equity has been circling the company. Both BT & Virgin have assets and market power and are in a position to make significant cash returns by slowing investment.

There is also the point worth considering that if BT moved too quickly fighting tooth and nail for the attractive markets, it could obliterate the competition that Ofcom has strived for twenty-something years to foster. Sure that would address the speed and capacity issue for some, but leave BT as a re-established monopoly responsible for a widening Digital Divide. Be careful what you wish for...

Ofcom's Attempt to Solve the Problem
So onto the final part of the question: what is wrong with Ofcom's gunboat diplomacy - get all these players, the OLO/ISPs, BT and Virgin to invest in a network for the 21st century and not just the 202nd decade?

Perhaps nothing, it was a similar strategy that led to the change in attitude by BT which got us to where we are now. Line everyone's business models against the wall at gunpoint, shine a light in their eyes and ask them some difficult questions.

Notice that I haven't mentioned the iPlayer at all in this analysis because the picture is much, much bigger than the BBC's rather limited application. It could apply just as much to YouTube, Joost or any other mass traffic source like Google or Yahoo!

The problem it seems to me is that it is very difficult to work efficiently at gunpoint. What is needed is for the ISPs and the content owners to stand down from the confrontation that has been bubbling up ever since AT&T vs Google in the network neutrality debates and work on a better way to make sure that the money flows down the value chain. It is absurd to expect a commercial entity to invest without the promise of stability and an ROI. ISPs today have neither.

The iPlayer's unique position as a free, advert-less, quasi-publicly funded product makes it an ideal political football. We are behind because of the unintended consequence of a regulatory decision in the 1980s to regulate NTS - long before most people had ever heard of the internet - and Ofcom is using the iPlayer as a battering ram to solve the unintended consequence of its actions years ago.

There is a very real danger that yet more aggressive action could have further unintended consequences. Ofcom may argue that actually it was the BBC Trust that had the final say on the iPlayer, but it would be naive to believe that this approval would have been given had Ofcom raised the ISPs' concerns more strongly than they did in the MIA.

They may have plausible deniability but I think Ofcom knew exactly what they were doing and perhaps it will have the intended result - but it is a high risk strategy.



Tuesday, 14 August 2007


Dear Regulars

I'm not ignoring you, I promise. It was summer last week and it was also my son's birthday so I took him to the Emirates Stadium (it was Arsenal's members day). There are some elements in my children's education that I won't leave to chance and this is one of them ;-)

Then on Thursday I managed to secure a ticket to the Test match (cricket) and decided to make it all into an impromptu week off. It is one of the benefits of being self employed!

I'm preparing a piece on television content popularity based on BARB stats and the Long Tail model, but Andrew Orlowski called me up yesterday out of the blue and offered me a king's ransom to write a piece on the iPlayer for The Register. Well, perhaps not a king's ransom, but enough to pay for the beers I had at the cricket anyway.

So I agreed and you can read the result here. I came to the conclusion that all the fuss directed at the BBC is really a case of shooting the messenger. Ofcom are the ones responsible for the gunboat diplomacy in my view and it made me wonder (again) whose side they are on. Perhaps it is a regulator's job to be dismissive of the interests of all its stakeholders on an equal basis?

If you are not a regular and are here because you read the piece on The Register... welcome! Have a look around (hopefully the iPhone inspired menu will help), put me in your RSS reader (Feedburner link) and come back another day.

Comments, positive or negative are always welcome (unless they are spam, so I do moderate) and if you have a pet topic that you want me to write about, please let me know: jpenston at ipdev dot net. I can't promise to agree with you though!

If you are wondering "who does he think he is writing all this &*%$!", there is a picture of me on the left hand nav linking to the about section of the site. Further feedback can be provided through the questionnaire linked from the right hand nav.



Tuesday, 7 August 2007


iPlayer Conclusions

Of course, any conclusions now are based on the current beta so if you are looking back on this article in a year's time, don't be surprised to find that what I say is now horribly out of date.

The BBC's service is very limited. It is riddled with compromises which detract from the end result, and its Kontiki P2P delivery is both a source of inefficiency and controversy.

And yet it is a start. A journey of a thousand miles and all that, but by taking the leap and putting the bulk of its programming on the internet, the BBC has opened a range of opportunities for the development of the service.

Kontiki's P2P is really not very good. It is very disorganised in comparison to Joost, which makes iPlayer and 4oD downloads slow to start, and its traffic traces look a bit like kids bickering in the playground. I will be keeping an eye on peer hit ratios and will report back periodically on those, as it is too early to draw firm conclusions on the amount of traffic that the BBC is offloading onto peers. But for now, talk is of underhand tactics.

Underhand? Yes - absolutely. My iPlayer is closed in the taskbar and yet kservice.exe is still running according to my task manager (Ctrl+Alt+Del, Processes). Does it matter? Not to me as I don't pay for my upload, but ever since I installed and tested 4oD I have experienced a significant increase in used upload capacity. If I was a cable customer, the extra bandwidth used on the coax might degrade everything for me and for my neighbours.

I deliberately slotted the Arootz article in the middle of this iPlayer series because in that concept you can see how the BBC may be able to do it differently - multicasting to storage. If the BBC are committed to Kontiki, then they have their work cut out.

As a user application, the iPlayer is inferior even to its 4oD stablemate because of the strange disconnection between where you select the programme - the web - and the application you actually view it on - the iPlayer itself.

It is all very different from Joost, Babelgum, VeohTV or YouTube. Those services are for live entertainment. The iPlayer is not - it is a catchup download service where you have to wait to watch what you want. The lack of progressive streaming is a big shortfall.

The addition of progressive streaming would make the service feel a lot more like TV. Furthermore, it would open the door to further development of the client software onto set top boxes, freeing the service of the chains that currently attach it to the PC.

I am not saying that the BBC is wrong to offer catchup downloads. It is a part of the product set that they want to end up with. Perhaps it is the low hanging fruit, but the final solution also needs to replicate and add value to the core broadcast model. Here too, there is work to be done.

Catchup downloads have a number of avenues for development too. The capability will be very important in mobile TV, where the cellular networks are simply too immature to offer anything like an acceptable experience for streamed services. Here, latent demand for mobile TV can be met by bridging the mobile handset and the broadband network, sideloading the media onto the device while the user sleeps. The iPlayer's current design provides this as a further development option, if nothing else.

But all is not lost - and that's why I issued the word of caution in the opening paragraph. As a service to me as a license fee payer, it is very good simply because it's got BBC programmes on it and not Channel 4's, Joost's, Babelgum's or Veoh's. Its attractiveness is directly proportional to the BARB figures, which show the BBC's average viewing (for June) at 7 hours 24 minutes against Channel 4's 2 hours 15 minutes.

In my service review, I wondered whether content was really king, and I think on reflection it is. What I think I've learned is that the application and the distribution network play a vital role in the shadows, they are the king-makers...

The BBC's royal aspirations are still alive and well after this release, but they really need to think about the people they are surrounding themselves with and whether they can get to where they want to be with the baggage they are carrying. I'm not just referring to Kontiki: the BBC is also weighed down by bureaucracy, and that too is severely limiting the service.

If the BBC is serious about IP as a distribution technology for TV, and I believe they are, they need to evolve to a point where they simultaneously broadcast and offer up for time-limited storage their entire portfolio of programming. Quite what value they are creating by doing so is an interesting question given their unique commercial status - for competitors, the benefit is targeted adverts - but what does the iPlayer add economically? Something for another day perhaps...



Wednesday, 1 August 2007


iPlayer Technology Review

The last article described the iPlayer's service elements. My conclusion was that because it is tied to the PC and yet it lacks social networking, it misses the mark in a number of ways. In its current guise, it will be a niche application in spite of the wealth of content on offer. There is much work to be done on the device and service side to make it consumable - perhaps content isn't king after all...?

Here, I will be detailing my technical conclusions of the service.

File Sizes, Line Speeds & Encoding
As a user... and leaving aside the critical lack of a streaming capability for just a second, on a good quality LLU line, the service delivery is fast. The file(s) will download at close to line rate. If you can get 10Mbps, you will be able to download at just a small amount under that speed.

The Beta Test Blog shows a line trace sustained at around two thirds of the maximum speed on their 10M Be connection. Say 6-7Mbps of download speed in their results, which means that even the largest show, DanceX, a 75 minute 756MB extravaganza (suggesting encoding at 1.3Mbps), would only take 15 minutes to download. Top Gear, which is a more moderate 60 minute show at 387MB (860kbps encoding) would be ready to go in just under 8 minutes.

But only 30% of the population can get 10Mbps because of line lengths. My downloads also seemed to come down at close to my line speed, which for me was 2Mbps during the test period. The same Top Gear show took me 26 minutes to download, but at least the picture and sound quality were good and I got the same end product as someone on a faster line.

Twenty six minutes though... it's less time than it takes to play the show, but is enough to make me lose interest and go and do something else. Can you imagine a 26 minute zap time on the Sky EPG...? The iPlayer is not readily consumable.

Perhaps with this in mind, the children's programmes seemed to be encoded at much, much lower rates - as low as 300kbps in some cases - which meant that zap times were almost bearable. I was surprised that this didn't seem to impact picture quality greatly. Iggle Piggle and his friends in The Night Garden were perhaps a touch fuzzy around the edges, but it was still watchable - if you are 3 years old and like that sort of thing.

Below is a table showing the file size, run time and the implied video encoding rate. I have also added a best case estimate of the download zap time at various line speeds.
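The arithmetic behind that table is simple enough to sketch. Below is a rough reproduction using the two files quoted earlier in the article (DanceX and Top Gear); treating MB as 10^6 bytes and ignoring protocol overhead are my assumptions, so take the outputs as best-case estimates rather than measured results.

```python
# Implied video encoding rate and best-case "zap time" (the wait for a
# full download) for the shows quoted in the post. File sizes and run
# times are the post's own figures; MB = 10^6 bytes is an assumption.

SHOWS = {
    "DanceX":   {"size_mb": 756, "runtime_min": 75},
    "Top Gear": {"size_mb": 387, "runtime_min": 60},
}
LINE_SPEEDS_MBPS = [2.0, 6.5, 10.0]

for name, s in SHOWS.items():
    bits = s["size_mb"] * 1e6 * 8
    enc_kbps = bits / (s["runtime_min"] * 60) / 1e3
    waits = ", ".join(
        f"{v}Mbps -> {bits / (v * 1e6) / 60:.0f} min" for v in LINE_SPEEDS_MBPS
    )
    print(f"{name}: ~{enc_kbps:.0f}kbps encoding; zap times: {waits}")
```

Run against the quoted figures, this reproduces the 1.3Mbps and 860kbps encoding rates and the ~26 minute Top Gear wait on a 2Mbps line.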

Download + Store, Zap Times & Progressive Downloads

My belief is that the future of this iteration of the iPlayer is confined to a small niche because it is built without progressive download capabilities. Of course, the download and store is a nice intro to catchup TV, but the PC is not the end-game for VOD so the content is in a holding pattern waiting to be sent in two different directions.

Its current download and store model is suited to mobile TV because it takes the unreliable and costly cellular data network out of the delivery path. The mobile phone has become the jack-of-all-trades device, displacing the PC in that role because it is far more portable than a laptop. The mobile is best suited to the iPlayer's brand of personal, download-for-later viewing, where people who are short of time can snatch a few minutes here and there whilst commuting.

The other direction the iPlayer content needs to go is back onto the TV - but with the added on demand capability not yet in the new service. For this to be successful the zap time issue needs to be resolved and this requires progressive downloads (or Virgin Media's cable network).

This is not a bandwidth problem - at least for the majority of users at current resolutions - it's a service application problem. If I can download something in 26 minutes that takes 60 minutes to play, I have a decent buffer which would allow me to start viewing almost immediately while the rest of the file downloads in the background.
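The buffering argument can be made precise with a small sketch. Assuming a constant download rate and constant-bitrate video (a simplification - a real player would also keep a safety margin), the minimum delay before playback can start without stalling is just the excess of download time over run time:

```python
# Minimum start delay for stall-free progressive playback, under the
# simplifying assumptions of constant download rate and constant
# bitrate. A real client would add a safety buffer on top.

def min_start_delay_s(size_mb: float, runtime_min: float, line_mbps: float) -> float:
    download_s = size_mb * 1e6 * 8 / (line_mbps * 1e6)
    playback_s = runtime_min * 60
    # If the file arrives faster than it plays, start immediately;
    # otherwise wait until the remaining download fits inside the runtime.
    return max(0.0, download_s - playback_s)

# Top Gear (387MB, 60 min) on the author's 2Mbps line:
print(min_start_delay_s(387, 60, 2.0))   # 0.0 - playback could start straight away
# The same file on a hypothetical 0.5Mbps line would need a real wait:
print(min_start_delay_s(387, 60, 0.5))
```

Which is exactly the point: a 26 minute download of a 60 minute show means the 26 minute zap time is purely a software limitation, not a bandwidth one.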

Managing Compromises
For some, progressive downloads would mean lower picture quality, and it seems that the BBC is keen to avoid tarnishing its content with this brush. If the line rate is close to (or below) the standard encoding rate, progressive delivery needs to re-encode at a lower rate, delivering a lower resolution.

Right now I get the same end product as the other guy on a 10M pipe, I just have to wait longer for it. The compromise is very egalitarian, even if that means that very few will be truly satisfied.

Because the BBC is publicly funded, it has probably had to design for a wide base of users meaning that it has to manage these compromises where Joost, Babelgum & Veoh can simply ignore low speed users as "unsuitable". The flip side of this is that if BBC users expect to wait, then perhaps there is a window of opportunity to crank up the resolutions to true HD for the iPlayer's small niche audience. What is the difference between waiting 26 minutes and 2 hours? The mass market won't go for either.

The BSG report contained data showing that 20% of UK households will not get more than 1Mbps, even with LLU, because of long copper loops. Is the BBC brave enough to behave like a business and shrug its shoulders at the Digital Divide? Can it develop the service knowing that at least 20% of its license fee payers won't be able to get it?

Peer to Peer or Client Server?
It is too early to judge the Peer to Peer network, but we have taken some benchmark readings. These show that on Friday night - before most of the new wave of Betas had been activated - even the headline content (Top Gear) was coming directly from the BBC's servers.

Chart 1 - Top Gear Download, Friday 27th July at 8pm
BBC in Red, Peer on Virgin Cable in Blue

The black total line (total inbound bytes/time) is very close to the red bars (the bytes that came from the BBC's servers). The small blue line about half way through is a peer on Virgin's cable network that, for a short period, contributed some data.

Saturday morning's kiddie TV also came directly from the core sites, but by the time Monday morning came along and many of the new Betas appeared online, there was much more interesting stuff to consider.

The red lines are again traffic from the BBC, while the black outline shows the total inbound traffic including the BBC's. The behaviour of the peer to peer network is shown clearly here.

Chart 2 - Mountain Download I, Monday 30th July at 10am. BBC in red

First a small volume arrives, which the application deems to be insufficient so it calls up the BBC to get things going. No sooner does this happen than a peer appears and starts contributing data. This peer can be seen as the blue line on Chart 3 below.

The throughput was low and I was unsatisfied with the speed so after about 500 seconds I paused the download and resumed it a few seconds later to see what would happen. Very interestingly, the BBC almost immediately begins filling my requirement. The lesson? If your download is slow, pause it and resume the transfer - you might get the BBC to notice you - although I'm not sure whether this is an intentional "design feature"...

In this middle period, the end of which is a second experimental pause, there is very little peer to peer traffic. After the second restart, you can see that the download begins to gain traction as the BBC gradually eases out, to be replaced by the green line in Chart 3, which is a computer at Edinburgh University. The pink line shows the aggregate of other peers within the sample.

Chart 3 - Mountain Download I, Monday 30th July at 10am
BBC in red, Loughborough Uni in Blue, Edinburgh Uni in Green, Others in Pink

iPlayer Kontiki P2P picks on key sources
That was the first part of Mountain. Because the Wireshark output was getting rather large, I stopped again and started a clean trace. Chart 4 below shows that the Edinburgh peer dominates the traffic sources for the rest of the download. This very much fits the overall pattern I observed, where the iPlayer seeks out the fastest single connection and tries to get as much as possible from that one source.

Chart 4 - Mountain Download II, Monday 30th July at 10am
Edinburgh Uni in Green, Peer on BT Central Plus in Blue

The final sample of iPlayer data was taken at peak internet viewing time on Monday evening. I downloaded the DanceX file which was by far the largest and longest playing. The first 5 minutes of the download are shown below in Chart 5. Again you can see an attempt to find peers is initiated before the BBC picks up the slack. This time, the release back to peers is supported better until Cambridge University gradually assumes almost the entire load.

Chart 5, DanceX Download, Monday 30th July at 8pm
BBC is red, Cambridge Uni is Blue, Peer on Virgin Cable in Green

My connection as a peer
The other side of the coin is that the application uses my upstream connectivity to share files with other peers. This shows some curious behaviour that is worth looking at.

Chart 6 - Mountain Upload I, Monday 30th July 10am
Peer on PIPEX is Green, Peer on BT is Blue, Peer on Hi Velocity is Pink

It almost looks like the PIPEX and BT peers are fighting over who gets my bandwidth. The PIPEX peer is the first to become established before the BT peer comes along and demands the files. Then, like two children squabbling over a toy they play a game of tug-of-war before the BT peer seems to give up. Having "won" the battle, the PIPEX peer also seems to lose interest and eventually disappears.

Of course, the application may be designed to burst like this, but it does seem to result in fairly inefficient use of the available resource. On top of that, the moderately lengthy spikes are not great for aggregation by the ISP - think of the space used in a jar of pencils against a jar of marbles and you can probably picture what I mean.

Uploads continue even when off!
I'm not going to make too big a thing of this because my upstream usage is free - I pay for download usage only - but it is an application characteristic that deserves to be noted.

My traces have shown that in each case, after the Top Gear and Mountain downloads completed, uploading activity to one peer continued after I closed the library and even after I closed the application in the taskbar. The only way to stop this was to power down the PC...

I'll keep an eye on this and report again next time.

Ping & Traceroutes
A significant finding of my Monday downloads was that the majority of peer to peer traffic is coming to me not from within my own ISP's network, but from University networks throughout the UK. These are clearly on very high capacity connections, although their ping times were slower (~40ms) than the BBC sources (~30ms) that they replaced in my delivery chain.

Interestingly, both these are faster than round trip times to other broadband users on Zen's IPStream network (~65ms). Of course, Zen's servers are the first IP layer devices that the traceroute sees, but it seems to be quicker to interconnect with JANET and get to a university campus, than it is to remain on Zen and go back out their BT Centrals. Connecting to other subscribers on BT's Central Plus shows the same phenomenon (~80ms), the extra time being the interconnection time between Zen and BT.

This perhaps explains why there are very few IPStream users acting as peers in my results. The majority of P2P is with users on University LANs and Virgin's cable network. Pings to cable customers are blocked by Virgin, but up to the point where they are blocked, times are fast (~40ms), suggesting that those users are quicker to reach than other IPStream users on my ISP's network.

The scarcity of IPStream peers is good for BT and their wholesale customers, but Virgin is known to be short of upstream capacity, so the knock-on impact may not be good for them. Cable peers are among the first UK sites to pop up in Joost traces too.

Traceroutes of all the major sources of data show that Zen is taking delivery of the traffic at the LINX or MANAP peering points. LINX is where Zen interconnects with JANET who provide the UK university network backbone, and with BT. Interconnection with Virgin Cable seems to be preferred at MANAP.

The one exception I have noted is that a small amount of traffic was exchanged with a Hi Velocity subscriber - particularly on my upstream. Zen does not seem to peer with them, as the traces show these packets going through Cogent's network.

Comparing Kontiki and Joost
Unlike Joost, the iPlayer seems happy with a small number of high speed peers. Joost will try to reach out to as many as 4 sources simultaneously, with each of these responsible for only a small piece of the file. This is also a clear advantage for Joost's DRM, because no one peer has enough of the file to make it worth cracking.

Chart 7 - Joost Ferrari 340 Download, 11 July at 11am
Coloured lines show various peers, black is total download.

You can very clearly see how well structured the Joost P2P protocol is from this trace. This contrasts with the somewhat chaotic nature of the iPlayer traces. Each Joost peer seems to have a clearly pre-defined role, while the iPlayer Kontiki equivalent seems to be fighting with its sources as discussed above.

Looking at my connection as a peer on Joost, you can see the other side of the same equation. The green bars are my computer sending to a peer in Canada (via Time Warner Telecom), the red is to a destination in Norway (via Telia).

Chart 8 - Joost Ferrari 340 Upload, 11 July at 11am
Black shows total upload, Peer in Norway is Red, Peer in Canada is Green

By contrast, the iPlayer seems happiest with one very fast peer, such as the one on Edinburgh University's network that sent me 145MB of the 267MB total in the Mountain download.

It is almost certainly an oversimplification, but Kontiki seems to be very aggressive at pulling in contributors - like the community do-gooder that we all know and love. As with that example, iPlayer peers seem reluctant to contribute and drop off, before bouncing back when no-one else takes their place. By the looks of things, it is really only the universities that want to share the iPlayer, perhaps because so much is demanded of the volunteers who do step up.

Joost peers are much more distributed - everyone does a little bit, rather than one source getting burdened with the majority of the demand. This organisation is good enough to allow the Joost application to pause the download and wait until the buffer has been depleted before initiating further data downloads.

Although it is too early to draw conclusions on the iPlayer based on the first weekend's data set, it would appear that it has a lot to do to develop into a clean, controllable distribution mechanism like Joost clearly is already.

Designed to help ISPs?
In reply to my pre-launch post, Angus suggested that the solution was for ISPs to run the iPlayer on their own high speed servers, so as to serve all the traffic from within their networks.

I wonder whether the application is actually behaving as it is with the university networks because it is designed to work with ISPs in this way. It may well be that a few fast peers on gigabit links at the ISP's data centre could take responsibility for serving their user base - saving peering costs if nothing else.

Zen, at least, is not there yet. If / when they are, will the iPlayer prefer their sources ahead of those on JANET? The JANET response times are pretty good, but if anyone knows of ISPs hosting iPlayer servers, let me know and I'll run traces on their links to see...

Serving the traffic from within the network will eliminate the Peering cost, but it still leaves a significant backhaul element on the ISP. I will be looking into the commercial implications of the iPlayer in a final article on Friday, where I will also write up my overall conclusions.



Tuesday, 31 July 2007


iPlayer Service Review

It took just under 12 hours to assign me a beta test account, but just before 7.30pm on Friday night last week, my iPlayer login details arrived. Being Friday night, I cracked open a beer and then sat down to watch Top Gear - on my computer.

The Experience - would I do it again?
It was not very sociable... there was nowhere for my wife to sit because I was using the desktop. Perhaps if I'd been on the laptop in the lounge, it might have been different, but I can't see that sitting at the computer watching telly is going to be a regular Friday night thing in this house.

Maybe this is a feature of my demographics? If my children were older and wanted to watch their own thing, the iPlayer might offer an alternative to them having a TV in their room. Even that though doesn't feel right, because the service resolutely avoids any signs of the social networking that would attract the youth away from existing TV, SMS, MySpace & MMORPG activities.

Anyway, Saturday morning, pressing on with my sample of 1 household, I asked my three year old son to be a crash test dummy while I took another look at the Wireshark outputs. He thought it was an excellent idea because he got to watch "Beebies". He didn't care that it was on the computer - he didn't know any different - although he did at one stage ask me why my desk was messy. Perhaps he couldn't see over the piles of Wireshark data printouts?

He was very patient for 5 minutes while the first programme loaded and then he sat there quite happily for 15 minutes watching Clifford, The Big Red Dog. Cuairt le Calum was next up which he knew meant Bob the Builder, but not because my three year old English boy reads Welsh, but because the EPG had a picture of Scoop on it. Finally I let him watch The Night Garden, a 40 minute show which downloaded in about 7 minutes in the background while he watched the other stuff.

I got some great Long Tail data samples while he got to watch Saturday morning TV and my wife had a lie in. We were all happy... Perhaps we will use the iPlayer when we go on holiday and can take the laptop, although I can't help thinking that with a few DVDs competing for his attention, the BBC's content is not going to be top of the list.

Using the iPlayer
It's not as easy to use as television, it doesn't exploit the features that the PC brings, and it doesn't offer video on demand either. Well, not what I would call on demand, anyway.

I'll come onto the technical bit in my next article, but you have to download a programme fully before you can start watching it because the application does not (yet) have a progressive streaming capability. This means you have a wait between choosing to view something and actually watching it - the zap time - anything from a few minutes to half an hour depending on the length of programme and your line speed.

This is where the iPlayer is clearly inferior to Joost, Babelgum and Veoh as video delivery platforms. With each of those, after a short period of buffering lasting perhaps a few seconds, you get the video you demand. With the iPlayer, the wait is a serious issue as you get no instant gratification.

Once the file has been received, it's a simple click in the library to start watching the programme. I found the picture quality to be OK and comparable with Joost and Veoh. In my unscientific rankings, that makes it better than Babelgum, but not as clear as Channel 4 On Demand.

There are plenty of other blogs describing the service, so rather than repeat all that, here are a few links. Tech reckon it fills the need it is intended for, BBC News said that the iPlayer had received a cautious welcome and then went on to list some of the complaints from the user forums. I suppose you can't expect them to say that a lot of people found it somewhat lacking, although The Beta Test Blog said they were impressed.

My verdict on the service? Yes, I would use it again, it's just that I'm not sure when.

The next piece will look at the bandwidth usage of the application, file sizes, encoding rates and peer to peer analysis results.



Thursday, 26 July 2007



It's not even a month since the last i launch, but tomorrow sees the launch of another service that could disrupt its industry to an even greater degree than Apple promises to do with mobile telecoms. This time though, thankfully, we won't have to pay the homeless to wait in line for us to get hold of it.

The BBC launches the iPlayer tomorrow, but unlike the iPhone launch where all you could find was praise and hype, the BBC faces nothing but criticism, doomsday scenarios and even calls for a ban on the eve of its big announcement. No wonder the folks behind it have decided to find pastures new.

The problem is that the BBC is publicly funded. It gets its money from everyone in the UK with a TV set because we all need a license to own a TV. The BBC's license revenue comes in exchange for a responsibility to deliver a universal service, free of advertising to anyone who pays the license fee. Foreign readers may find this curiously eccentric in the 21st Century, but the BBC is a national institution and we are British so that's the kind of thing we do.

This is where the problems lie. The license fee was designed at a time when the BBC was broadcasting: it had no competition in 1922 when the license was introduced to cover radio. The TV + Radio license was introduced in 1946. The Sky empire was still just a twinkle in the eye of James Murdoch's grandfather at that time.

The company (if you can call it that) is now operating in a very different world, but for many reasons (most of them sentimental), the BBC is still funded this way. As a result, it competes with other TV channels (and web sites) on an unequal footing because their funding model does not expose them to market forces.

Because the BBC is publicly funded, it has been free of the commercial pressures that competitors face on a daily basis. Has this given it an unfair advantage...? How many R&D departments would be given 4 years and £3m to deliver a project? Surely, anyone else in the same position would have lost the faith of shareholders well before now and management would be history. The BBC's unique position has shielded the iPlayer and given it breathing space in which to develop the service.

On the other hand though, how many R&D departments would face an Ofcom Market Impact Assessment, a Public Value Assessment, a full review by the BBC Trust and scrutiny by parliament before they could launch? The kerfuffle about the lack of service on Macs and Vista - there is a petition with 11,000 signatures lodged with Downing Street asking the PM to ban it - is frankly pathetic. Do people really expect the BBC to be able to launch the service working 100% and available to everyone on day 1 with no testing?!?

Anyone who has ever been involved in product management will know that this is a recipe for disaster. The BBC cannot eat the elephant in one bite, but because of its funding model it will be forced (they might say "easily persuaded") to deal with standards issues like no other entity. The elephant will be consumed.

The Mac and Vista options might be addressed by making the content available through other media players, as long as DRM issues can be resolved. As I suggested in my LUI Part 6 piece, where we described a prototype of the future of IPTV, these players are likely to include the likes of Joost. Because of its universal service obligation, the BBC is not in a position to say no.

The BBC's obligation extends beyond the internet however. For those without a PC, the BBC is investigating Virgin Media's on demand platform. This still leaves a chunk of people with no access to the service because of technology constraints on the user's side (no PC, no cable, no broadband).

Even though Freeview does not offer the bandwidth, the BBC is sure to get embroiled in how to serve these users, where other competitors would simply write off the niche as too expensive to serve. This is the flip side to the breathing space they have had to develop the service.

We already have video on demand from Channel4, an evolving service from Sky and a promised launch of a service from ITV that looks spookily like that promised by the BBC. So what's the big deal with the BBC's launch tomorrow? I've said it could disrupt its industry to a greater degree than the iPhone, so I had better explain myself...

Driver for IPTV Adoption
Ofcom's MIA states that by 2011, the iPlayer is likely to account for 3% of TV viewing hours, which doesn't sound like a lot. This is in fact about 45 mins per household per week, assuming total viewing remains as today at around 25 hours per week.

But, as with Freeview, the BBC gives this new(ish) technology the credibility to go mass market very quickly. There will undoubtedly be a knock on effect on all other broadband television services because there may not be a more trusted organisation anywhere in the world than the BBC. If IPTV is good enough for the BBC, it's good enough for me...

Looking closer at the Ofcom projections: 3% of total viewing is 9% of the BBC's current viewing. It would be reasonable to suggest that competitors' services might grow in line with the BBC's. This would mean every household in the UK watching, on average, 2 hours and 23 minutes a week of IPTV by 2011. Over 3 billion hours a year...
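The projection arithmetic can be roughly reproduced as follows. The 25 hours per week, 3% share and 7h24m BARB figure are from the article; the 25 million UK households is my assumed round number, and the per-household result comes out slightly above the 2h23m quoted, presumably down to rounding of the underlying shares.

```python
# Rough reproduction of the Ofcom MIA projection arithmetic. All inputs
# except the household count come from the article; treat the outputs as
# order-of-magnitude figures.

TOTAL_VIEWING_H = 25.0   # hours per household per week (article's assumption)
IPLAYER_SHARE = 0.03     # Ofcom MIA: iPlayer = 3% of all viewing by 2011

iplayer_min = TOTAL_VIEWING_H * IPLAYER_SHARE * 60
print(f"iPlayer alone: {iplayer_min:.0f} min/household/week")  # ~45 min

# If every broadcaster's IPTV service grows in line with the BBC's,
# total IPTV viewing scales by 1 / (BBC's share of all viewing):
BBC_SHARE = (7 + 24 / 60) / TOTAL_VIEWING_H   # 7h24m of 25h, from BARB
all_iptv_h = TOTAL_VIEWING_H * IPLAYER_SHARE / BBC_SHARE
print(f"All IPTV: ~{all_iptv_h:.1f} h/household/week")

HOUSEHOLDS = 25e6   # assumed round figure for UK households
print(f"~{all_iptv_h * HOUSEHOLDS * 52 / 1e9:.1f} billion hours/year")
```

The final line lands just over 3 billion hours a year, consistent with the claim above.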

The MIA also says "The costs of the broadband capacity required to support the services could in aggregate be between £399 million and £831 million over the next 5 years." Once the capacity is there "the additional capacity would also be available for use by a wide range of other services, including commercial on-demand services, [so] it would not necessarily be appropriate to attribute the associated costs to the BBC services in isolation."

Ofcom's model says that the average capacity increase from the iPlayer will be 3GB per user per month by 2011.

Assuming that other broadcasters follow the same adoption curve, you are looking at almost exactly 9.5GB extra per user per month to serve the 9% of viewing hours at standard definition. This will add around 46kbps per user to an ISPs peak traffic load (approximately doubling what they have today). This is low, because I am using data that shows that early iPlayer alpha trial users had web-surfing-like peak to mean traffic profiles.

TV usage profiles tend to be much more peaky than web surfing traffic. Where you might get a peak to mean ratio on web traffic around 1.6, on TV viewing profiles, this looks more like 2.8. Cutting a long story short, this would push the traffic impact of the iPlayer from 46kbps per user up to around 81kbps additional traffic (easily tripling today's usage, from just one application).

Reverse engineering Ofcom's 3GB per user per month figure from the 3% penetration rate shows that they assume a 2Mbps encoding profile in their models. This suggests that high definition is not being taken into account.

If the BBC were to deliver at 1080p instead (as broadcasters in the US have announced they will), you might want to multiply the total capacity requirement by 5. With all content (ITV, Sky etc) in HD, the 9.5GB might become 45GB extra for every house connected to the broadband network. This would push the incremental peak load per user up by between 220kbps and 385kbps, depending on the peak to mean profile.
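The conversion from a monthly volume to an addition to the ISP's busy-hour load can be sketched in a few lines. The peak-to-mean ratios (1.6 for web-like profiles, 2.8 for TV-like) are taken from the discussion above; a 30-day month is my assumption, and the small differences from the quoted 46/81kbps and 220-385kbps figures come down to rounding.

```python
# Convert a monthly per-user volume (GB) into extra busy-hour load (kbps),
# using the peak-to-mean ratios discussed above. A 30-day month and
# GB = 10^9 bytes are assumptions.

def added_peak_kbps(gb_per_month: float, peak_to_mean: float) -> float:
    mean_bps = gb_per_month * 1e9 * 8 / (30 * 24 * 3600)
    return mean_bps * peak_to_mean / 1e3

for gb, label in [(9.5, "SD, all broadcasters"), (45, "HD at ~5x the bitrate")]:
    lo = added_peak_kbps(gb, 1.6)   # web-surfing-like profile
    hi = added_peak_kbps(gb, 2.8)   # TV-like profile
    print(f"{label}: {lo:.0f}-{hi:.0f} kbps extra per user at peak")
```

Run as-is, this gives roughly 47-82kbps for the SD case and 222-389kbps for HD, matching the ranges above to within rounding.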

Where there is demand, there is money, right...?

Actually, no. This is the other major problem with the BBC, the license fee and the universal service requirements. The BBC's iPlayer will not generate money from adverts (the BBC does not do ads) or from subscription (the license fee already covers the service), and any other creative sources of income (including abroad) are likely to be relatively trivial.

This is not an issue for the BBC because the content is paid for already (it's a catch-up service of stuff already produced for broadcast). The service creation costs have been kept under control at £3m, and rather than having to pay a big hosting bill, Kontiki's P2P client is being used, theoretically relieving the BBC of the burden of distribution costs.

The big losers are the networks who have to carry all this extra traffic and have no way of monetising it. This is again a BBC-specific problem because with other commercial broadcasters, the ISP is in a position to do an ad-revenue share agreement based on the unique element that the ISP can provide - the postcode. (We are going to come back to this point and the revenue opportunity from commercial broadcasters other than the BBC in LUI Part 10 early next week.)

The use of P2P actually makes the problem much bigger for the ISP. Historically, the BBC's web traffic, although significant, has been manageable via direct peering relationships between the ISPs and the BBC. Replacing this with P2P looks (to me at least) like a two-fingered salute to the businesses that have to transport the BBC's product.

Even using the lowest results in the analysis, the iPlayer promises to double the traffic on the UK internet between now and 2011. On top of that, the iPlayer opens the door to other broadcasters, which could mean that instead of doubling the volume of traffic, the iPlayer launch could drive a tenfold or greater increase.

I'm going to be watching the iPlayer's use of bandwidth very closely over the coming months. As I have done with Joost, Babelgum and 4oD, I will be running traffic source analysis and looking at where the Kontiki client gets its traffic from. Channel 4 also uses Kontiki, but using their service, I found that the scarcity of peers meant that much of the traffic was client server from the seed caches instead of actually using P2P.

I will be keenly examining the peer hit rates, as that will determine the BBC's cost base. I will also be looking at where these peers are and whether BBC/Kontiki keeps traffic within the service provider's network, or whether (like other P2P services I have tested) in-country traffic source selection is effectively random. I will be publishing the findings here at periodic intervals.

If I can get the client from the website, the first set of data will be published here by lunchtime tomorrow...

UPDATE: no client = no data = no update. Sorry folks...

I got to the site by 7.40am and registered, but have yet to receive the invite. I wouldn't say that the message board is on fire yet (10 or so people grumbling about the same thing), but there are people who stayed up until midnight to register who are in the same boat.

They let Mashable in though, so if you are looking for a sneak peek, that's the place to go. If you want a different perspective on possible adoption rates, I also found this.

IWR were able to run an initial test and reported that a 30-minute programme was 108MB, which suggests an encoding rate of 480kbps. The download speed is not known, and may differ from the encoding rate to allow for buffering. The picture defaulted to a 400 x 200 screen size, which sounds small.
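The 480kbps figure follows directly from the reported file size and duration (assuming decimal megabytes):

```python
# A 30-minute programme at 108MB implies the following encoding rate
# (decimal megabytes assumed).
file_bytes = 108e6          # 108 MB reported by IWR
duration_seconds = 30 * 60  # 30-minute programme

rate_kbps = file_bytes * 8 / duration_seconds / 1e3
print(f"~{rate_kbps:.0f} kbps")  # 480 kbps
```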

More on this when I get my prized invite...



