The IP Development Network

Welcome to The IP Development Network Blog

Friday, 28 September 2007

 

Back to Basics

One advantage of being your own boss is that you can tell granny how to suck eggs without getting the sack. So at the risk of teaching an old dog old tricks, today's article is going to go back to basics and draw a picture of a baseline network.

Under normal circumstances we might just draw a cloud in place of the detail below because the detail, like the network itself, is only relevant when there are problems that need to be understood. Once those problems are solved, we can all go back to drawing clouds.

The time to examine those problems is now. This week, Ofcom launched its Next Generation Access (NGA) consultation, which appears to run alongside the Broadband Stakeholder Group's (BSG) activities and Stephen Timms MP's efforts to convene a summit. Whether all these acronyms offer parallel efforts or something more in keeping with the market trend for convergence, we shall see.

While the focus is on the local loop and the peak capacity that it can deliver, the situation is a lot more complex than simply opening the gates yet further.

Transition
We are in transition from dial up to "true broadband". In fact we may always be in transition because after true broadband we should probably expect the next next-generation lobby to be pushing for something that they might have to call "true true supersuperfast broadband, honest". Drawing lines in the sand though - at say 100Mbps - gives a target that helps us plan for the next stages in the evolution, but it is worthwhile noting that there will never be enough to satisfy the high end users.

We know that video streams are an elephant in the room, which some have estimated will account for 98% of internet traffic within two years. This is a serious problem, and it means getting into more detail about the network and the factors that are creating the problems.


The first point to make is that for all the fuss about Fibre to the Home (FTTH), that is only one part of the jigsaw in delivering video to people's homes. The local loop is one of four major choke points on the access network side that need to be considered. Sitting alongside those access issues are considerations of how the internet routes packets and how and where those packets are stored.

In fact, for all the hype, it may be that FTTH is one of the least pressing of the issues that stand in the way of the internet's ability to deliver the larger and much more lumpy video traffic on the horizon. At some point, the last mile will again be the most significant bottleneck - as it was when all we had was dial up - but right now, how many people would be able to use a 100Mbps local loop if it magically turned up on their doorstep?

Would the architecture support it? Would there be enough capacity in the home and on the core network to use it? How many knock-on issues would we need to solve before we spend the money on fibre in the local loop? A chain is only as strong as its weakest link...

The Home Network (x, in the diagram above)
A good place to start: it is undoubtedly the most complex issue because of the anarchy that exists in this space. There is no control over end points and how they are attached to the network, or indeed which network they are attached to. Security? It is best not to ask - denial is a wonderful thing...

Consumers are truly left high and dry to build the wireless / ethernet / homeplug network for themselves. If they are really lucky, they can get a friend / son / daughter to do it for them.

This makes the introduction of new hardware and new services that would use the 100Mbps a significant challenge. The customer may not have a network, or it may be "a bit flaky" such that when they come home with shiny new CE equipment, they are disappointed (or worse) to find that it doesn't work. So they make a call to the ISP but after a long wait, they find that their "service" provider isn't there to help.

Do you phone a friend every time you want to install a new device in your home? How soon before your DIY network starts to creak and your friend's generosity starts to get seriously tested? If everyone suddenly had 100Mbps to the home today, very little of it would be usable because the capacity of the last yard is significantly below that. Before FTTH, we need a solution that simplifies the home network and extends management of that to a real "service provider".

The Local Loop (y)
No-one is happy with the current state of affairs. That is not to say that everyone agrees that access networks are too old and slow and are in dire need of an upgrade - LLU has yet to be fully exploited, so perhaps we should start getting the best out of that? There are two conflicting priorities that need to be managed: ultimate speed is one of them, but at least as important is ultimate reach.

Looking first at speed, there seems to be a clear assumption made by many that we need more than ADSL2+. This point is worth explaining because it is not about headline speeds: 24Mbps for all would be enough for a fair few years. But ADSL2+ speeds, like ADSL1 speeds, degrade with distance, so anything substantial can only be delivered in real time over relatively short distances.


This table is from the BSG report, Pipe Dreams. Links to coverage of that report and other related articles can be found in the Digital Divide section of this site.

In summary, only 30-40% of the population are close enough to their exchange to get 8Mbps or more on copper. You can get more on cable but cable also covers less than half the population, similarly concentrated into densely populated areas. For some therefore, there is a vibrant market and the local loop is no barrier at all. Certainly not one requiring life support from a quango or two.

The role of the quango should be to concern itself with the areas where the market does not have an answer. In the local loop, this includes a significant number of areas where there are signs emerging that the market for connectivity beyond LLU will fail. This failure will occur because the market needs huge investments by Openreach to shorten the copper loops, but for the monopoly it is hard to see any extra revenue to pay for the new investment.

So 60% of us might be stuck with the speed we have today - it doesn't matter whether we use ADSL1 or 2+, the result is the same because the line length is the problem, not the technology at the exchange. And because the copper replacement case is so weak, you might have to move to a new build estate to get fibre to your home...

BT IPStream and ADSL1 (1)
There are very clear signs here and now of market failure in the provision of basic broadband access. Fortunately this only impacts a very small minority who cannot get 512k or more - a rare enough occurrence that some even appear as "news" these days.

This market failure today is very small indeed, but there is the prospect of many more (perhaps 15-20% of the population) getting left behind in the rollout of LLU. Although there is no doubt that competition here has led to cheaper products for all, those price reductions came at the expense of the digitally divided for whom competition in the local loop is a double whammy.

Competition means that investment from all players, including BT, has focused on the denser locations where the business case is best. For most, that means higher speeds and lower prices, but the money taken out of the value chain through price competition is money that was once used to cross-subsidise services where the business case didn't make sense. For the minority on the other side of the divide, LLU enshrines a two tier system.

Two Tier Pricing
A two tier system means two tier pricing, but it is worthwhile understanding what that two tier system means. It does not mean people miss out on broadband: almost everyone can get affordable broadband connectivity if they want it - 99.x% have access to some form of broadband and prices are universally below £20.

Two tier pricing may mean that the basic product is available for free on LLU exchanges and for £10-£15 more on IPStream, but even that is not the problem. The problem of the two tier pricing system as it is evolving, is the impact that it is having on the affordability of broadband capacity once you have the basic connectivity.

This manifests itself as usage caps and fair use policies because broadband capacity (as distinct from broadband connectivity) is hundreds of times more expensive on IPStream than on LLU. For these consumers on the wrong side of the digital divide, competition in the market means that the cost of actually using the service is prohibitive.

Every action has an equal & opposite reaction
This two tier system is the direct product of "managed competition". IPStream's prices are maintained artificially high to allow room for competitors to build their own infrastructure at a cheaper rate than they can lease capacity from BT.

Solving the two tier pricing problem may distort the competition that has been so carefully created because it means BT selling IPStream at rates comparable to LLU. This would undoubtedly stop future LLU investments and throw into doubt the commercial viability of many existing deployments. It also requires that BT have an incentive to cut prices for the least profitable exchanges in an environment where there is no competitive pressure demanding that they do so.

There is a significant difference between now and 2004 when BT held back enabling the least profitable exchanges with ADSL1 because the promise of returns was non-existent. The difference is that now BT has to compete with LLU; in 2004 they were a monopoly and could cross-subsidise more effectively.

The key question that we need to be clear on is who are we trying to deliver fibre to? Is it the top x% where a little shove makes the business case work? Or are we going to let the market work on that while aligning regulation and politics to deal with the bottom y%?

Backhaul (z)
Backhaul is an issue that is best summarised quickly here. There are more details in a previous series of articles written by Keith McMahon and me a few months back.

Backhaul has been a severe inhibitor to the development of broadband in the UK for the past few years, but it appears that BT have been quietly upgrading the capacity of even some of the long tail of exchanges to fibre (I have heard anecdotes of exchanges on the 95th percentile being glassed up). This, combined with their new BNS product for LLU operators described by Keith in the above article, means that we are much closer to removing capacity constraints in backhaul.

That is not to say that backhaul is universally cheap though, as the model is heavily distance dependent and profitability is reliant on customer density. The pricing scheme is built to deliver service to those with their own core networks close to the exchanges being unbundled. The model is clearly designed to benefit the decreasing number of larger players.

Backhaul competition exists but the BNS introduction certainly took the price floor down a few notches. Additionally, there is a subset of exchanges colocated with the core network itself but these have a much easier life because there, core networks are cheap and plentiful and the backhaul circuits are simply internal wiring.

Would backhaul survive an overnight upgrade of local loops to 100Mbps? For the vast majority of users, the answer would have to be yes but what would break would be the business model because backhaul pricing is based on today's usage and not what you would see with 100Mbps in the last mile.

Backhaul Pricing
This final point deserves explanation because the way that prices are set is a self-fulfilling prophecy. In simple terms, there is a "budget" for backhaul - ISPs and even consumers buy as much as they can for that budget. They will expand their usage gradually to fill it and then throttle back use so that they fall within the budget, until the price falls and the cycle starts again.

Dropping prices means more capacity would be available within the budget, but it does not often lead to absolute gains in total revenue because people still spend the budget. Of course it is recurring revenue, so you need to keep cutting prices to keep the business - most assets require between 3 (hardware), 5 (systems) and 15 (infrastructure) years of use at a recurring fee to pay for themselves.

The problem for those selling capacity is that when you drop the price, it takes time to recover the revenues you have given away in the reduction, and even when you do, you often find yourself back at square one as the throttling caps the upside. So it makes most sense to hold tight and wait for someone else to make the first move.

This point is clearer working through an example... Say you have 100Mbps of used capacity at £5 per Mbps and are charging £10 per Mbps to your customers. Your total cost is £500, your revenue is £1,000. Say that you then upgrade that circuit to 1Gbps at £1.25 per Mbps (total cost £1,250).

At that point in time you are making a £250 loss - what do you do? If the market is saturated you face a problem because you somehow need to be able to get users to pay more than their budget (£1,000). Even if there is still some growth room from new users, do you hold on and sell slowly at £10 per Mbps? Cut the price by the same proportion as the cost to £2.50 per Mbps (total losses now £1,000 and a breakeven point of 5 times your existing sold capacity)? Or something in between? Does this change if I tell you that your competitor is selling at £4 per Mbps...? £3.95, perhaps?
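
The arithmetic in that example is easy to lay out as a quick Python sketch (all of the figures are the hypothetical ones used above, not real tariffs):

```python
# Illustrative backhaul pricing arithmetic from the example above.
# Every figure is the hypothetical one used in the text, not a real tariff.

sold_mbps = 100          # capacity currently sold to customers
old_cost_per_mbps = 5.0  # £ per Mbps on the 100Mbps circuit
price_per_mbps = 10.0    # £ per Mbps charged to customers

old_cost = sold_mbps * old_cost_per_mbps      # £500
revenue = sold_mbps * price_per_mbps          # £1,000

# Upgrade the circuit to 1Gbps at £1.25 per Mbps.
new_capacity_mbps = 1000
new_cost = new_capacity_mbps * 1.25           # £1,250
print(f"Loss straight after upgrade: £{new_cost - revenue:.0f}")              # £250

# Option: cut the price in proportion to the unit cost (£10 -> £2.50).
new_price = 2.50
print(f"Loss at £2.50/Mbps on existing sales: £{new_cost - sold_mbps * new_price:.0f}")  # £1,000
breakeven_mbps = new_cost / new_price
print(f"Breakeven sold capacity: {breakeven_mbps:.0f} Mbps "
      f"({breakeven_mbps / sold_mbps:.0f}x today's sales)")                    # 500 Mbps, 5x
```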

The problem is that prices are increasingly lumpy with ever larger upgrade steps (100Mbps to 1Gbps is 10x as is the next step to 10Gbps). Such steps cause problems because available capacity increases far in excess of demand. Pricing on the basis of availability would leave the owner with pennies in comparison to pricing on the basis of usage, although the latter has the effect of stagnating growth.

Backhaul Competition
Extending the core networks to get increasing numbers of exchanges on-net is the only way for operators to take the recurring cost off their books, should they want to. Putting their own fibre into exchanges sounds attractive, but it is even more attractive to wait until someone else does and then needs to sell the new capacity. At that point the wholesale customer can start to drive the price down aggressively at the expense of the facilities-based carriers, who progressively undercut each other.

There is an incentive problem for operators who may be considering investing in their own backhaul builds. They are better off waiting for some other idiot to make the first move...

If competition is going to stretch into the provision of local loops, it must first address the much simpler issue of backhaul competition. It is simpler because it is a fraction of the cost, but the issues are the same: protection of existing assets, build cost, site access, asset sharing, equivalence, price fixing, price regulation, period of regulation, certainty, etc. Perhaps it is a safer place to experiment with various solutions?

Content Issues
In simple terms, hosting content on your own servers is cheapest, next comes content on peer networks that can be reached through internet exchange points while Transit is the most expensive.

Transit originated as a way to get access to US content, but more and more of the big US properties are now hosted on caches that can be reached through in-country peering (from my ISP, you can get to google.com through LINX). Transit still plays a big part because it is the only way to reach everything else (youtube.com goes through transit). The difference is that, as a general rule, peering only gives you access to your peer's own network, whereas transit lets you reach networks beyond it.

Transit networks themselves host a lot of content but their value primarily lies in that you can go over one of these networks to reach something the other side. So instead of maintaining thousands of smaller circuits with everyone else, Transit takes care of that in one interface.
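
As a toy illustration of that cost ordering, here is a minimal sketch; the per-GB figures are invented purely to show the ranking, not real prices:

```python
# Toy model of the delivery-cost hierarchy described above.
# The per-GB costs are made-up placeholders; only the ordering matters:
# own hosting < peering (via an exchange point) < transit.

COST_PER_GB = {
    "on_net": 0.01,    # content on your own servers
    "peering": 0.05,   # reachable through an internet exchange (e.g. LINX)
    "transit": 0.25,   # everything else, reached via a transit provider
}

def delivery_cost(gb: float, route: str) -> float:
    """Cost of delivering `gb` gigabytes over the given route."""
    return gb * COST_PER_GB[route]

for route in ("on_net", "peering", "transit"):
    print(f"{route:8s}: £{delivery_cost(100, route):.2f} per 100GB")
```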

How would content hosting be impacted by 100Mbps in the last mile? It might be ugly for a while as the shock of an overnight upgrade kicks in, but as we are unlikely to wake up tomorrow and find the tooth fairy has given us all fibre, we have some time to consider the impact on the electrical grid.

Space is not a problem: in the late 1990s data centres the size of football pitches were constructed which are still being filled now. Network connections are not the problem as most are on multiple fibre rings.

Power on the other hand is a real concern, particularly given climate concerns and the ever increasing cost of energy. We really do not understand the power consumption increases driven by fibre to the home - this cannot be ignored as delivering new electrical capacity may be even more problematic than laying the fibre.

Bigger Lumps of Data
Video is not necessarily going to be the most popular internet application but it doesn't have to be to cause the predicted impact. It is not where people spend the most time that necessarily drives the traffic: a second of HD video is 65 times as much data as a second of high quality music. Put another way, 1 hour of video is 65 hours of music or 315,000 page views on google.com...
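
Those ratios are easy to sanity-check; the bitrates and page size in this sketch are my own assumptions rather than figures from the article:

```python
# Rough sanity check of the "bigger lumps of data" ratios.
# Bitrates and page size are assumptions for illustration only.
hd_video_bps = 8.3e6      # ~8.3Mbps HD stream (assumed)
music_bps = 128e3         # 128kbps "high quality" music (assumed)
page_bytes = 12e3         # ~12KB per google.com page view (assumed)

print(f"HD video vs music: {hd_video_bps / music_bps:.0f}x per second")   # ~65x

hour_of_video_bytes = hd_video_bps * 3600 / 8
print(f"1 hour of HD video = {hour_of_video_bytes / (music_bps * 3600 / 8):.0f} hours of music")
print(f"1 hour of HD video = {hour_of_video_bytes / page_bytes:,.0f} page views")  # ~300,000
```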

For video files of that size, there are storage implications, but storage capacity is far more advanced than network capacity, so it becomes more a question of where you keep the content to minimise the distance travelled and consequently the cost you incur. If you can control it, there are suggestions of charges for premium delivery to help monetise the downstream access network assets.

Whoa! Network Neutrality alert - but looking at how this is being played out, there is a question whether the content applications will cooperate to allow the ISP to exert such control. P2P is an example of how content companies are trying to work their way over the top of ISP platforms.

P2P vs Client Server
The concern with video applications is understanding the direction the market will develop. Will it be the wild west all over again with P2P data everywhere (forcing much of the traffic onto transit networks), or will the video market evolve to work with the networks (much of the content locally hosted)? At stake is the bill that ISPs pay transit providers for global access.

The choice of application is as much political as it is technical. If P2P wins, it will be increasingly hard for ISPs to do anything about monetising the increasing volumes of content but it may deliver an inferior user experience - something the ISPs can comfort themselves with. ISPs would be much happier with client server as that is something their networks have been built around and something they can control the quality and cost of.

Fibre and 100Mbps access certainly plays into the P2P corner as it blows away one of the fundamental limits of P2P - upstream bandwidth. In a DSL environment, the capacity to upload is perhaps a tenth of the capacity there is the potential to consume. In a fibre environment, it can all be P2P.

I believe that we need to look at how and where the networks route P2P and move routing closer to the edge to reduce tromboning. This is because applications would perform much better and network demands may well be lower, even allowing for the additional Layer 3 technical and operational overhead. Geo-aware P2P might work for everybody, but that is a story I have written up before.

Busy Hour Planning
There is a huge difference between how computers are used on the internet and how TVs are used. Watching television is much more heavily concentrated: peak audience (of all channels) is around 2.8 times the average audience over a 24 hour period, whereas for web surfing this is nearer 2.1. P2P actually generates very good peak load efficiency: the peak-to-mean ratio for download applications that use P2P is around 1.4.

What on earth does this mean...? In simple terms, you need to provide 33% more capacity for watching TV than you would for viewing the same volume of data on the web, because you have to build for the peak unless you want congestion on the network. Furthermore, congestion for streamed services like TV is far more serious than for web access (where building to the 95th percentile was commonplace).
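
A minimal sketch of that calculation, using the peak-to-mean ratios quoted above and an arbitrary daily volume:

```python
# Capacity needed to carry the same daily volume under different peak-to-mean ratios.
peak_to_mean = {"tv": 2.8, "web": 2.1, "p2p": 1.4}   # ratios quoted above

def capacity_needed(daily_gb: float, ratio: float) -> float:
    """Peak-hour capacity (Mbps) to carry daily_gb with the given peak-to-mean ratio."""
    mean_mbps = daily_gb * 8 * 1000 / 86400          # average rate over 24 hours
    return mean_mbps * ratio                         # build for the peak

daily_gb = 100.0   # arbitrary example volume
tv = capacity_needed(daily_gb, peak_to_mean["tv"])
web = capacity_needed(daily_gb, peak_to_mean["web"])
print(f"TV needs {tv / web - 1:.0%} more capacity than web for the same volume")  # ~33%
```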


The chart shows how the usage of various applications varies and where the peak loads are on the respective networks. There is no weighting for file size - the area under each line has been rebased to 1,000 units. The aim is merely to show the peak to mean traffic profile of video is significantly higher than for web and P2P.

This reiterates the point above that for the same volume of data, you need more network for video than you do for other internet applications.

This makes a situation which is already very bad even worse - the capacity that we actually use today is only a fraction of what is available on existing local loops. Average usage of around 5GB per month on a 2Mbps circuit uses 0.8% of the connection's maximum capacity. There are over 8 Exabytes (8,912 Petabytes) per month of unused capacity on existing networks.

A 2Mbps link is enough capacity to deliver 146 hours of 1080p programming per month - the average household watches just over 100 hours per month. What we have today could deliver what we need tomorrow.
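
A quick check of those utilisation figures, assuming a 30-day month and 1080p at roughly 10Mbps (the encode rate is my assumption):

```python
# Sanity check: how much of a 2Mbps line do we actually use?
line_mbps = 2.0
seconds_per_month = 30 * 86400

max_gb_per_month = line_mbps * 1e6 * seconds_per_month / 8 / 1e9   # ~648GB
used_gb = 5.0                                                      # average monthly usage quoted above
print(f"Utilisation: {used_gb / max_gb_per_month:.1%}")            # ~0.8%

# How many hours of 1080p would fill the line? (~10Mbps encode assumed)
encode_mbps = 10.0
hours_1080p = line_mbps * seconds_per_month / (encode_mbps * 3600)
print(f"Hours of 1080p per month at line rate: {hours_1080p:.0f}")  # ~145
```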

It highlights the inefficient use of the total available resource... The problem is "on-demand".

Conclusion
Video is the only application that looks remotely like driving demand for fibre. Assuming for today that we need to move video over from its existing broadcast platform - a case worth exploring in detail in another thread - it is clear that there are a number of key areas where we are not ready for fibre to the home.

We do not have the ability to deliver service because of networking issues in the home and commercial models in backhaul and hosting are going to have to change in light of the new traffic demand. But these are functions of evolution that will follow the technical capability as it grows.

Where there are serious questions to be asked is in the supply of power for the next generation capabilities and in the efficient use of the resource that is in place today. Every routing hop is another drop of oil gone forever: do we need to build nuclear power stations next to data centres to support the demand there?

It is also clear to me that we are not using what we have in place today. Perhaps we should stop to think about that too before ploughing huge sums into delivering yet more peak capacity?

The removal of the bottleneck in the backhaul means that it is only the commercial model preventing full-time wirespeed usage of connections. For 70%, this is 2Mbps plus. Even if you need 10Mbps for the video itself, technology is evolving that predicts what a user might want "on-demand" and pre-loads it for viewing at 10Mbps.

This offers the network provider a way of maximising resource usage. If you can fill the unused capacity on the network today instead of pushing the headline speed, you don't need the expensive infrastructure upgrades.

There are clearly areas where there is a vibrant market for connectivity because of recent regulatory efforts to encourage competition. But this risks leaving a subset of the population behind with access speeds below what might be necessary - 30% cannot get 2Mbps. This is the area where lobbying and regulation should concern itself, not with the drive to 100Mbps.


Thursday, 23 August 2007

 

Is there a Wizard at Ofcom?

Or is that a dunce's hat they are wearing?

Do Ofcom Have a Clue?
The most subjective point that I made in my recent iPlayer series, and in particular in my article on The Register, was that Ofcom know what they are doing. I have had a fair amount of feedback along the lines of the following:

"Ofcom really don't have a clue about anything and just are pushed from pillar to post by the amount of lobbying going on. You are really naive if you think that someone in Ofcom is really the Wizard pulling all the levers in the background."

Is BT the Power Behind the Scenes?
This thesis holds that actually, it is BT that holds power. The theory goes that BT are able to make a weak minded Ofcom accede to their every wish through their use of Jedi mind tricks.

I agree that the Force is strong with BT. Their Regulatory department is staffed by the brightest people in Telecoms, whose role it is to confuse the heck out of the rest of us. Furthermore, there is evidence that BT seems able to pull victory from the jaws of defeat when it seems that they have been beaten into submission - but does this mean that we are under the spell of a great magician? Or is it simply that, unlike everyone else, they just get on with it when decisions go against them?

In the late 1990s, it is certainly true that Oftel were pulled from pillar to post. They were naive and believed Mercury every time a grievance was raised. Similarly, they fell under the spell of MFS magicians who worked tirelessly for changes to the market that altered the telecoms landscape in the UK and throughout Europe.

Many of those same people who fought BT in the 90s have now found themselves at BT since the company's renaissance in the 00s. Another example of poacher turned gamekeeper in telecoms...

BT has Good Reason to Like the iPlayer
It is certainly true that BT wins from the iPlayer's launch. Their wholesale product still covers 34% of the market (excluding Retail) and the Capacity Based Charging scheme means that any extra iPlayer-driven usage within this base just adds to BT's profits. The price of the BT Central product is worth noting - £155 per Mbps per month.

In the long term, iPlayer-driven LLU migration may cost BT subscribers, but certainly in the short term, the additional usage will drive wholesale revenues that more than fill any gap. Over the long term, the iPlayer is likely to drive backhaul circuit investments from LLU operators, which is revenue to BT too.
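
To get a feel for what capacity-based charging means at £155 per Mbps per month, here is a hedged back-of-envelope sketch; the subscriber count and per-user iPlayer load are illustrative assumptions, not measured figures:

```python
# Back-of-envelope: extra BT Central cost from additional busy-hour iPlayer usage.
# The £155/Mbps/month price is quoted above; everything else is an assumption.
central_price_per_mbps = 155.0       # £ per Mbps per month (BT Central, quoted above)

subscribers = 100_000                # assumed ISP base on IPStream
extra_peak_kbps_per_sub = 20.0       # assumed extra busy-hour load per subscriber from iPlayer

extra_mbps = subscribers * extra_peak_kbps_per_sub / 1000
extra_cost = extra_mbps * central_price_per_mbps
print(f"Extra Central capacity: {extra_mbps:.0f} Mbps")
print(f"Extra wholesale bill: £{extra_cost:,.0f} per month "
      f"(about £{extra_cost / subscribers:.2f} per subscriber)")
```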

A Strategic Decision to Go Ahead
Of course the BBC Trust made the final decision to go ahead - but if Ofcom had turned to them and said, "look, there's an £831m bill, the ISPs don't have that kind of money", the Trust's hand would have been forced.

But Ofcom did not say that - the discussion probably went more along the lines of "look, there's an £831m bill, the ISPs will grumble but the investment is for their own good". In fact when the MIA was announced to the world, the press release made no note of the cost whatsoever and you have to work down to page 103 before you get to the detailed assessment! So did BT persuade Ofcom that the bill was an acceptable cost and not to make a fuss about it, or did Ofcom make its own mind up about that?

I see no evidence of BT being more prepared than the other players, which is often a dead giveaway that a decision has gone their way. In fact the initial suggestion that they too were fighting the launch, however off-the-record and however quickly retracted, suggests that there was no grand plan behind this. BT executives were clearly not all quite on-message.

My view is that Ofcom are now pulling the levers. More specifically, it seems that the levers are being jerked around violently as Ofcom battles to rein in the huge range of stakeholders over whom it has an influence. My conclusion is based on the fact that they seem to be able to do no right by anyone, which suggests to me that they are trying hard to balance opposing interests.

Equality of Hardship
Let's look at who Ofcom have upset recently... First came structural separation - it cannot be argued that this has helped BT because of the amount of work involved to demerge Openreach and create a set of systems and processes that could support the new design of wholesale market. It might be in the group's long term interests, but that is more due to the quid-pro-quo that saw the chains removed from BT Retail.

Then Ofcom upset the Broadband Stakeholder Group and a previous Ofcom boss, Kip Meek, who feel that we are not doing enough to prevent Britain becoming a digital backwater. Next Ofcom upset customers and the market with their tacit acceptance of two tier pricing, before they upset the politicians who questioned whether the organisation was too big for its boots and was making political decisions.

Just when I was beginning to think that the interests of investors in LLU broadband were the only ones that had not been targeted, came the iPlayer. Having had a good run of it for a couple of years, the iPlayer brings down the hammer to signal the start of a new and much bigger wave of investment for those players.

Ofcom Understand the Dynamics
Ofcom clearly understand the market - assuming that they are unique in reading their entire Communications Market Report. Their 2007 version was published today and is a veritable goliath as research pieces go. I have printed it double-sided on A4 paper and even without the Radio sections, the report sits over an inch high on my desk. I am sorry for the tree, but this is the only way to digest so much information. I am assuming that Ofcom have digested it...

So is it a wizard at Ofcom or a Jedi at BT? It may well be both - but it does not appear to me that BT is being favoured by the regime. Where BT wins is that it plays the regulatory game where ISPs do not.

The Regulatory Game
The game consists of huge volumes of data and documentation being passed backwards and forwards. What tends to happen is that the crucial part is in a footnote on page 142 where one sentence changes the market. BT will take time to read this and reply in kind, whereas the ISPs just don't put enough effort into unravelling the mysteries that BT and Ofcom conjure up until it is too late - and the iPlayer launches.

Is it Ofcom's fault that the game is played this way? Should they be made to stand in the corner and be subjected to ridicule for allowing BT to run rings around them? In my view, the answer is no - detail is a feature of regulatory policy making or there will be loopholes. Every player in the market knows this and has used it to their advantage in the past. If some players in the market believe that there are higher priorities at a given moment than responding to consultation, then that is their decision.

Trying to play the victim when the result doesn't go your way just strikes me as bad sportsmanship.


Monday, 6 August 2007

 

Arootz: One to Watch

Sometimes, you see something so elegant and simple, you wonder why no-one thought of it before. An old colleague and regular visitor to this blog contacted me the other day to advise me to check out Arootz - and I'm very pleased he did.

A Relative Unknown
I wish I could claim an exclusive, but I can't. They have been in BusinessWeek & the Jerusalem Post already but it appears that the point was lost on most people. There are only 53 blog entries on the company, the vast majority seeming to repeat BusinessWeek verbatim while most of the rest focus on the fact that the company raised some cash recently.

One article I found does add some value, commenting on the BusinessWeek piece rather than just repeating it - more questions than answers, concluded Businessofvideo.com - and I agree with that part at least, but perhaps Mr Rayburn had a lot on that day because he didn't actually ask the questions. I have asked the questions - I contacted Arootz and their CEO replied with a significant amount of detail - and having looked into it, I have to disagree with the negative thrust of what Dan says.

Old Technology, New Ideas
It is true that multicast has been around for ten years or more. I know, I was at UUNET 10 years ago when UUcast was being hyped and developed in parallel. As with most other attempts to use multicast, that product failed to find a market because in the end, all it was doing was replacing broadcast and as we all know, if something isn't broken...

Multicast has again come back into people's thinking recently as IPTV services have been rolled out using the technology for the linear (live viewing) portion of what they offer. The problem there remains that multicast has not catered for timeshift behaviour. If you want on-demand, IPTV has to unicast and that means that you use a whole stream all to yourself.

I was discussing this problem with an eminent industry architect back in April at the ISP Forum event - his suggestion was staggercast, which effectively means a multicast stream of a programme being distributed every N minutes, much like Sky's multistart service for prime time movies. It was a definite improvement on multicast / unicast combinations already in use, but doesn't really tick that on-demand box.

Businessofvideo.com is also correct in highlighting that personal storage has also been around for ever. Quite right, it has, but what Arootz has done is combine this with multicast so that the network sees one stream and yet everyone gets a copy that they can watch on demand. It's a mashup of two very well understood technologies and that is the simplicity that I refer to in my opening statement.

The Solution in a Nutshell
In summary, there are 3 elements - Distribution Servers where the content owner injects content, a Multicast enabled network and a set of user Multicast-2-Storage (M2S) agents sitting on PCs or STBs. Arootz sells this CDN as a managed service to content owners and works with the ISPs to make sure that multicast is turned on over the network. I'll come back to the web of relationships later in the article, but I will focus first on the service piece.

Targeted Adverts
My initial reaction was: OK, sounds good, they've dealt with on demand, but if you are multicasting, you miss the personalisation capability that must be at the centre of IPTV to make it a step beyond broadcast. Erm no, they've thought of that.

"The ads are delivered to storage ... based on the advertising targeted parameters, the decision which ad to show is targeted individually (based on a doubleclick server somewhere) and then the ad is inserted in real time into the video stream but since it comes from storage, it is fast, high quality and real time." Arootz's CEO Noam Bardin.

That's clever - the media and the ads are delivered separately and reassembled to create the final, personalised media file...

Navigating Uncharted Waters
What about navigation and finding what you want among the wealth of possibilities?

"We allow users to subscribe to RSS like feeds from a variety of sources ... We provide interfaces for preference engines to assist in selection of content such as 'the highest rated channel based on yesterdays actual viewership' or 'all content with the word Shark somewhere'" (Shark is of course an example of something you might be interested in.)

Hmm, I like that too. This is the elegance - mashing up social networking, RSS and an EPG into something that can cope with the huge volumes of content...

Huge Volumes of Content
Arootz estimates that the average user consumes 125GB of content per month. Obviously it depends on resolutions: it might be a fair bit less than that for standard definition TV, but if we were talking about 1080p, we could be looking at four times that figure. Is 500GB a lot of data? I think that depends on whether you are a unicast network or a hard-drive.
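
Those consumption estimates hang together if you assume a few hours of viewing a day; the bitrates and viewing hours in this sketch are my own assumptions:

```python
# Rough check of Arootz's consumption estimate (bitrates and hours are assumptions).
hours_per_day = 3.7         # assumed viewing time
sd_mbps = 2.5               # assumed standard-definition encode rate
hd_mbps = 10.0              # assumed 1080p encode rate

def monthly_gb(mbps: float) -> float:
    """Data consumed in a 30-day month at the given stream rate."""
    return mbps * 1e6 * hours_per_day * 3600 * 30 / 8 / 1e9

print(f"SD:    {monthly_gb(sd_mbps):.0f} GB/month")     # ~125GB
print(f"1080p: {monthly_gb(hd_mbps):.0f} GB/month")     # ~500GB
```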

Terabyte drives are the basis for Arootz's business model and that starts to explain why you have not seen this model previously. Storage has always been far too expensive to make plans like this work but Arootz reckons that by 2010, you will be seeing cost effective drives offering 5 Terabytes... At this point, the limitation is back on the network.

Multicast takes care of the core network capacity issue because, as with caching models I have discussed previously, each media file need only be sent once to each exchange and not once per user as with unicast delivery. This saves many thousands of identical 2Mbps-plus streams and brings us back to the point where the bottleneck is again the physical speed of the local loop.
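
The scale of that core-network saving is easy to see in a toy comparison; the viewer count, channel count and stream rate are assumptions:

```python
# Unicast vs multicast load on the link into an exchange (illustrative numbers only).
viewers_per_exchange = 2000      # assumed simultaneous viewers behind one exchange
stream_mbps = 2.0                # assumed per-stream rate
channels = 10                    # assumed distinct programmes being watched

unicast_mbps = viewers_per_exchange * stream_mbps    # one stream per viewer
multicast_mbps = channels * stream_mbps              # one stream per programme
print(f"Unicast:   {unicast_mbps:,.0f} Mbps into the exchange")
print(f"Multicast: {multicast_mbps:,.0f} Mbps into the exchange "
      f"({unicast_mbps / multicast_mbps:.0f}x saving)")
```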

Use Case Example
While Arootz claims that its service only uses off-peak capacity, this is a configuration option that can easily be changed. The idea is that you watch live TV via the live multicast feed. If you are not watching (or are watching and have some spare network capacity), other programmes are sent down to you and stored on your machine. You pull these up on demand.

Of course you can't download the entire programme catalogue. Let's say there are 10 channels that make up your regular viewing: you can't even download everything on each of those unless you have a very, very, very fast connection. Choosing which programme to download (because you might want to watch it) is the job of the M2S AI agent.

The question then becomes whether the AI is good enough to make sure that the file you want is already on your hard drive when you come to watch it. Backing that up, there's the fall-back unicast option in the event that you are feeling a bit wacky today. It looks like it might hold together.

You might even find that the model allows you to escape some of the shackles of the local loop speed as it allows you to watch delayed feeds at 720p (6.4Mbps) even though your line may only just be good enough for 480p (2.5Mbps). A 2.5Mbps connection maxed out enables you to receive something like 800GB per month. For 720p content you need the AI to give you a 40% hit rate (you watch 25 hours a week, it downloads 65 hours that you "might" be interested in).
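
The arithmetic behind that hit-rate claim checks out; the rates are those in the paragraph above, with a 30-day month assumed:

```python
# Can a 2.5Mbps line feed 720p viewing via pre-delivery? (figures from the text above)
line_mbps = 2.5
hd720_mbps = 6.4
watched_hours_per_week = 25
downloaded_hours_per_week = 65

line_gb_month = line_mbps * 1e6 * 30 * 86400 / 8 / 1e9
download_gb_month = hd720_mbps * 1e6 * downloaded_hours_per_week * 3600 * (30 / 7) / 8 / 1e9

print(f"Line capacity:      {line_gb_month:.0f} GB/month")       # ~810GB
print(f"Pre-delivered 720p: {download_gb_month:.0f} GB/month")   # ~800GB, i.e. the line is maxed out
print(f"Hit rate needed:    {watched_hours_per_week / downloaded_hours_per_week:.0%}")  # ~40%
```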

So Far, So Good
Arootz links multicast with storage and adds personalised RSS subscription with targeted advertising. Sounds good so far doesn't it? The software assets they have are clearly well thought out and fill a growing need. But what about the issues?

The weakness of the Arootz model is that it requires each link in the chain to be working in harmony. The content distributor must plant the content on the CDN, the ISP needs to multicast the channels and the PC or STB hardware requires the M2S software to take advantage of the storage and run the channel selection process.

Conflicts in the Supply Chain
These three parties are not exactly in cahoots. The interests of content and network are at odds, which led to the Network Neutrality issues of 2006. Sitting uncomfortably with a foot in each camp are the hardware companies. Only Sky, with assets in each area following their acquisition of Amstrad last week, seem remotely aware of the need to align the interests of the players in the supply chain.

Although Arootz has an elegant solution to the video unicast problem, they need all elements in the chain to see it and play along to make it work. Without any one element, it breaks down. If they work together, everyone wins. So who is pulling it through so that the vision becomes a reality? As with other efforts to bridge these gaps, is it a question of chicken or egg?

"Look at it differently – if you believe (and I do) that most of video is going to be consumed off IP networks then there is a scaling problem with the current technologies. This scaling is quality, cost and technical. The main bottleneck is the network and our solution is the only cost effective way for this to happen. It may not be us but it will be multicast based. Just like internet advertising was dead in 2001, premium content would never go online in 2005 – Multicast will have to rebound because unicast cannot scale to deliver the cost/quality we expect.
" Noam Bardin again.

I agree with everything he says there, but it doesn't really answer the question. What is the incentive for each party to play ball? I think the answer is actually much simpler.

Cash. Cold, Hard Cash.
The commercial model is that content owners pay Arootz either as a straight arrangement or as an advertising revenue share. Simple enough so far, but in parallel, they are selling to ISPs.

"Our offer to them is 'let us accelerate your content on your network such as VOD, Internet TV and other components'. We will then wholesale the access to 3rd party content owners and provide them with a revenue share back so that they get more content distributed on their networks, more efficient distribution (less load), they get a slice of the action and thus are part of the value chain, unlike P2P where they carry the cost but are not part of the upside."

Aha! Someone is taking the bull by the horns and putting in place a way to route the money so that the networks open up and get paid for carrying content. It would be easy to think that the netcos should be happy with cost savings, and try and keep the distribution fees to themselves, but this statement above all the others shows that Arootz is pragmatic and understands that a virtuous circle needs to start somewhere. And the hardware?

"It can be embedded in software or hardware, it provides distribution and targeted advertising capabilities" and "we are not the brand"

Where Next?
As with every small company, there are a million things that can go wrong as larger and better funded alternatives try and achieve the same thing. That said, the technology that Arootz has and the pragmatic approach shown by their commercial model is an excellent starting point.

Channels to market (they are in many markets) still threaten to derail the company, as putting together the video supply chain involves dealing with some very heavy hitters. It may require the sponsorship of one of these big players to get the ball rolling, but this is a solution where, without compromises, everyone wins. Worth keeping an eye on...


Wednesday, 1 August 2007

 

iPlayer Technology Review

The last article described the iPlayer's service elements. My conclusion was that because it is tied to the PC and yet it lacks social networking, it misses the mark in a number of ways. In its current guise, it will be a niche application in spite of the wealth of content on offer. There is much work to be done on the device and service side to make it consumable - perhaps content isn't king after all...?

Here, I will be detailing my technical conclusions of the service.

File Sizes, Line Speeds & Encoding
As a user... and leaving aside the critical lack of a streaming capability for just a second, on a good quality LLU line, the service delivery is fast. The file(s) will download at close to line rate. If you can get 10Mbps, you will be able to download at just a small amount under that speed.

The Beta Test Blog shows a line trace sustained at around two thirds of the maximum speed on their 10M Be connection. Say 6-7Mbps of download speed in their results, which means that even the largest show, DanceX, a 75 minute 756MB extravaganza (suggesting encoding at 1.3Mbps), would only take 15 minutes to download. Top Gear, which is a more moderate 60 minute show at 387MB (860kbps encoding) would be ready to go in just under 8 minutes.

But only 30% of the population can get 10Mbps because of line lengths. My downloads also seemed to come down at close to my line speed, which for me was 2Mbps during the test period. The same Top Gear show took me 26 minutes to download, but at least the picture / sound quality are good and I got the same end-product as someone on a faster line.

Twenty six minutes though... it's less time than it takes to play the show, but is enough to make me lose interest and go and do something else. Can you imagine a 26 minute zap time on the Sky EPG...? The iPlayer is not readily consumable.

Perhaps with this in mind, the children's programmes seemed to be encoded at much, much lower rates - as low as 300kbps in some cases - which meant that zap times were almost bearable. I was surprised that this didn't seem to impact picture quality greatly. Iggle Piggle and his friends in The Night Garden were perhaps a touch fuzzy around the edges, but it was still watchable - if you are 3 years old and like that sort of thing.

Below is a table showing the file size, run time and the implied video encoding rate. I have also added a best case estimate of the download zap time at various line speeds.
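
For reference, the calculation behind those columns is simple to reproduce; the file sizes and run times are those quoted earlier, and the line speeds are examples:

```python
# Implied encode rate and best-case download ("zap") time for the shows quoted above.
shows = {
    # name: (size_MB, run_time_minutes) -- figures from the text
    "DanceX":   (756, 75),
    "Top Gear": (387, 60),
}
line_speeds_mbps = [2, 6.5, 10]   # example line speeds

for name, (size_mb, run_min) in shows.items():
    encode_mbps = size_mb * 8 / (run_min * 60)
    print(f"{name}: encoded at ~{encode_mbps:.2f}Mbps")
    for speed in line_speeds_mbps:
        zap_min = size_mb * 8 / speed / 60
        print(f"  {speed}Mbps line -> ~{zap_min:.0f} min to download")
```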


Download + Store, Zap Times & Progressive Downloads

My belief is that the future of this iteration of the iPlayer is confined to a small niche because it is built without progressive download capabilities. Of course, the download and store is a nice intro to catchup TV, but the PC is not the end-game for VOD so the content is in a holding pattern waiting to be sent in two different directions.

Its current download and store model is suited to mobile TV because it removes the unreliable and costly cellular data network. Mobile phones now specialise as the jack of all trades and they have replaced the PC in that role because they are more portable than a laptop. The mobile is best suited to the iPlayer's brand of personal, download for later viewing application where people who are short of time can snatch a few minutes here and there whilst commuting.

The other direction the iPlayer content needs to go is back onto the TV - but with the added on demand capability not yet in the new service. For this to be successful the zap time issue needs to be resolved and this requires progressive downloads (or Virgin Media's cable network).

This is not a bandwidth problem - at least for the majority of users at current resolutions - it's a service application problem. If I can download something in 26 minutes that takes 60 minutes to play, I have a decent buffer which would allow me to start viewing almost immediately while the rest of the file downloads in the background.

Managing Compromises
For some, progressive downloads would mean lower picture quality and it seems that the BBC is keen to avoid tarnishing its content with this brush. With progressive delivery, if the line rate is close to or below the standard encoding rate, the content needs to be re-encoded at a lower rate, delivering a lower resolution.

Right now I get the same end product as the other guy on a 10M pipe, I just have to wait longer for it. The compromise is very egalitarian, even if that means that very few will be truly satisfied.

Because the BBC is publicly funded, it has probably had to design for a wide base of users meaning that it has to manage these compromises where Joost, Babelgum & Veoh can simply ignore low speed users as "unsuitable". The flip side of this is that if BBC users expect to wait, then perhaps there is a window of opportunity to crank up the resolutions to true HD for the iPlayer's small niche audience. What is the difference between waiting 26 minutes and 2 hours? The mass market won't go for either.

The BSG report contained data that showed that 20% of UK households will not get more than 1Mbps even with LLU because of long copper loops. Is the BBC brave enough to behave like a business and shrug its shoulders at the Digital Divide? Can it develop the service knowing that at least 20% of its license fee payers won't be able to get it?

Peer to Peer or Client Server?
It is too early to judge the Peer to Peer network, but we have taken some benchmark readings. These show that on Friday night - before most of the new wave of Betas had been activated - even the headline content (Top Gear) was coming directly from the BBC's servers.

Chart 1 - Top Gear Download, Friday 27th July at 8pm
BBC in Red, Peer on Virgin Cable in Blue

The black total line (total inbound bytes/time) is very close to the red bars (the bytes that came from the BBC's servers). The small blue line about half way through is a peer on Virgin's cable network that, for a short period, contributed some data.

Saturday morning's kiddie TV also came directly from the core sites, but by the time Monday morning came along and many of the new Betas appeared online, there was much more interesting stuff to consider.

The red lines are again traffic from the BBC, while the black outline shows the total inbound traffic including the BBC's. The behaviour of the peer to peer network is shown clearly here.

Chart 2 - Mountain Download I, Monday 30th July at 10am. BBC in red

First a small volume arrives, which the application deems to be insufficient so it calls up the BBC to get things going. No sooner does this happen than a peer appears and starts contributing data. This peer can be seen as the blue line on Chart 3 below.

The throughput was low and I was unsatisfied with the speed so after about 500 seconds I paused the download and resumed it a few seconds later to see what would happen. Very interestingly, the BBC almost immediately begins filling my requirement. The lesson? If your download is slow, pause it and resume the transfer - you might get the BBC to notice you - although I'm not sure whether this is an intentional "design feature"...

In this middle period, the end of which is a second experimental pause, there is very little peer to peer traffic. After the second restart, you can see that the download begins to gather traction as the BBC gradually eases out to be replaced by the green line in Chart 3, which is a computer at Edinburgh University. The pink line shows the aggregate of other peers within the sample.

Chart 3 - Mountain Download I, Monday 30th July at 10am
BBC in red, Loughborough Uni in Blue, Edinburgh Uni in Green, Others in Pink

iPlayer Kontiki P2P picks on key sources
That was the first part of Mountain. Because the wireshark output was getting rather large, I stopped again and started a clean trace. Chart 4 below shows that the Edinburgh peer dominates the traffic sources for the rest of the download. This very much fits with the overall pattern I observed, where the iPlayer seeks out the fastest single connection and tries to get as much as possible from that one source.

Chart 4 - Mountain Download II, Monday 30th July at 10am
Edinburgh Uni in Green, Peer on BT Central Plus in Blue

The final sample of iPlayer data was taken at peak internet viewing time on Monday evening. I downloaded the DanceX file which was by far the largest and longest playing. The first 5 minutes of the download are shown below in Chart 5. Again you can see an attempt to find peers is initiated before the BBC picks up the slack. This time, the release back to peers is supported better until Cambridge University gradually assumes almost the entire load.

Chart 5, DanceX Download, Monday 30th July at 8pm
BBC is red, Cambridge Uni is Blue, Peer on Virgin Cable in Green

My connection as a peer
The other side of the coin is that the application uses my upstream connectivity to share files with other peers. This shows some curious behaviour that is worth looking at.

Chart 6 - Mountain Upload I, Monday 30th July 10am
Peer on PIPEX is Green, Peer on BT is Blue, Peer on Hi Velocity is Pink

It almost looks like the PIPEX and BT peers are fighting over who gets my bandwidth. The PIPEX peer is the first to become established before the BT peer comes along and demands the files. Then, like two children squabbling over a toy they play a game of tug-of-war before the BT peer seems to give up. Having "won" the battle, the PIPEX peer also seems to lose interest and eventually disappears.

Of course, the application may be designed to burst like this, but it does seem to end in a fairly inefficient use of the available resource. In spite of this, the moderately lengthy spikes are not great for aggregation by the ISP - think of the space used in a jar of pencils against a jar of marbles and you can probably picture what I mean.

Uploads continue even when off!
I'm not going to make too big a thing of this because my upstream usage is free - I pay for download usage only - but it is an application characteristic that deserves to be noted.

My traces have shown that in each case after the Top Gear and Mountain downloads completed, uploading activity to one peer has continued after you close the library and even after you close the application in the taskbar. The only way to stop this was to power down the PC...

I'll keep an eye on this and report again next time.


Ping & Traceroutes
A significant finding of my Monday downloads was that the majority of peer to peer traffic is coming to me not from within my own ISP's network, but from University networks throughout the UK. These are clearly on very high capacity connections, although their ping times were slower (~40ms) than the BBC sources (~30ms) that they replaced in my delivery chain.

Interestingly, both these are faster than round trip times to other broadband users on Zen's IPStream network (~65ms). Of course, Zen's servers are the first IP layer devices that the traceroute sees, but it seems to be quicker to interconnect with JANET and get to a university campus, than it is to remain on Zen and go back out their BT Centrals. Connecting to other subscribers on BT's Central Plus shows the same phenomenon (~80ms), the extra time being the interconnection time between Zen and BT.

This perhaps explains the fact that there are very few IPStream users acting as peers in my results. The majority of P2P is with users on University LANs and Virgin's cable network. Pings to cable customers are blocked by Virgin, but up to the point where they are blocked, times are fast (~40ms), suggesting that those users are quicker to get to than other IPStream users on my ISP's network.

The scarcity of IPStream peers is good for BT and their wholesale customers, but Virgin is known to also be short of upstream capacity, so the knock-on impact may not be good for them. Cable peers are also among the first UK sites to pop up in Joost traces.

Traceroutes of all the major sources of data show that Zen is taking delivery of the traffic at the LINX or MANAP peering points. LINX is where Zen interconnects with JANET who provide the UK university network backbone, and with BT. Interconnection with Virgin Cable seems to be preferred at MANAP.

The one exception I have noted is that a small amount of traffic was exchanged with a Hi Velocity subscriber - particularly on my upstream. Zen does not seem to peer with them as the traffic traces show these packets going through Cogent's network.

Comparing Kontiki and Joost
Unlike Joost, the iPlayer seems happy with a small number of high speed peers. Joost will try and reach out to as many as 4 sources simultaneously, with each of these responsible for only a small piece of the file. This is a clear advantage also for Joost's DRM because no one peer has enough of the file to make it worthwhile cracking.

Chart 7 - Joost Ferrari 340 Download, 11 July at 11am
Coloured lines show various peers, black is total download.

You can very clearly see how well structured the Joost P2P protocol is from this trace. This contrasts with the somewhat chaotic nature of the iPlayer traces. Each Joost peer seems to have a clearly pre-defined role, while the iPlayer Kontiki equivalent seems to be fighting with its sources as discussed above.

Looking at my connection as a peer on Joost, you can see the other side of the same equation. The green bars are my computer sending to a peer in Canada (via Time Warner Telecom), the red is to a destination in Norway (via Telia).

Chart 8 - Joost Ferrari 340 Upload, 11 July at 11am
Black shows total upload, Peer in Norway is Red, Peer in Canada is Green

By contrast, the iPlayer seems happiest with one very fast peer, such as the one on Edinburgh University's network that sent me 145MB of the total 267MB in the Mountain download.

It is almost certainly an over simplification, but Kontiki seems to be very aggressive at pulling in contributors - like the community do-gooder that we all know and love. As with that example, iPlayer peers seem to be reluctant to contribute and drop off before bouncing back when no-one else takes their place. It is really only the universities that want to share the iPlayer by the looks of things, perhaps because so much is demanded of volunteers who do step up.

Joost peers are much more distributed - everyone does a little bit, rather than one source getting burdened with the majority of the demand. This organisation is good enough to allow the Joost application to pause the download and wait until the buffer has been depleted before initiating further data downloads.

Although it is too early to draw conclusions on the iPlayer based on the first weekend's data set, it would appear that it has a lot to do to develop into a clean, controllable distribution mechanism like Joost clearly is already.

Designed to help ISPs?
In reply to my pre-launch post, Angus suggested that the solution was for ISPs to run the iPlayer on their own high speed servers, so as to serve all the traffic from within their networks.

I wonder whether the application is behaving as it does with the university networks because it is designed to work with ISPs in this way. It may well be that a few fast peers on gigabit links in the ISP's data centre could take responsibility for serving its user base - saving peering costs if nothing else.

Zen, at least, is not there yet. If / when they are, will the iPlayer prefer their sources ahead of those on JANET? The JANET response times are pretty good, but if anyone knows of ISPs hosting iPlayer servers, let me know and I'll run traces on their links to see...

Serving the traffic from within the network will eliminate the peering cost, but it still leaves a significant backhaul burden on the ISP. I will be looking into the commercial implications of the iPlayer in a final article on Friday, where I will also write up my overall conclusions.


Thursday, 26 July 2007

 

iPlayer

It's not even a month since the last i-launch, but tomorrow sees the launch of another service that could disrupt its industry to an even greater degree than Apple promises to do with mobile telecoms. This time, thankfully, we won't have to pay the homeless to wait in line for us to get hold of it.

The BBC launches the iPlayer tomorrow, but unlike the iPhone launch, where all you could find was praise and hype, the BBC faces nothing but criticism, doomsday scenarios and even calls for a ban on the eve of its big announcement. No wonder the folks behind it have decided to find pastures new.

The problem is that the BBC is publicly funded. It gets its money from everyone in the UK with a TV set because we all need a license to own a TV. The BBC's license revenue comes in exchange for a responsibility to deliver a universal service, free of advertising to anyone who pays the license fee. Foreign readers may find this curiously eccentric in the 21st Century, but the BBC is a national institution and we are British so that's the kind of thing we do.

This is where the problems lie. The license fee was designed at a time when broadcasting was all the BBC did: it had no competition in 1922, when the license was introduced to cover radio. The combined TV and radio license followed in 1946. The Sky empire was still just a twinkle in the eye of James Murdoch's grandfather at that time.

The company (if you can call it that) is now operating in a very different world, but for many reasons (most of them sentimental), the BBC is still funded this way. As a result, it competes with other TV channels (and web sites) on an unequal footing, because its funding model does not expose it to market forces.

Because the BBC is publicly funded, it has been free of the commercial pressures that competitors face on a daily basis. Has this given it an unfair advantage...? How many R&D departments would be given 4 years and £3m to deliver a project? Surely, anyone else in the same position would have lost the faith of shareholders well before now and management would be history. The BBC's unique position has shielded the iPlayer and given it breathing space in which to develop the service.

On the other hand though, how many R&D departments would face an Ofcom Market Impact Assessment, a Public Value Assessment, a full review by the BBC Trust and scrutiny by parliament before they could launch? The kerfuffle about the lack of service on Macs and Vista - there is an 11,000-signature petition lodged with Downing Street asking the PM to ban it - is frankly pathetic. Do people really expect the BBC to launch the service on day one, working 100% and available to everyone, with no testing?!?

Anyone who has ever been involved in product management will know that this is a recipe for disaster. The BBC cannot eat the elephant in one bite, but because of its funding model it will be forced (they might say "easily persuaded") to deal with standards issues like no other entity. The elephant will be consumed.

The Mac and Vista options might be addressed by making the content available through other media players, as long as DRM issues can be resolved. As I suggested in my LUI Part 6 piece, where we described a prototype of the future of IPTV, these players are likely to include the likes of Joost. Because of its universal service obligation, the BBC is not in a position to say no.

The BBC's obligation extends beyond the internet however. For those without a PC, the BBC is investigating Virgin Media's on demand platform. This still leaves a chunk of people with no access to the service because of technology constraints on the user's side (no PC, no cable, no broadband).

Even though Freeview does not offer the bandwidth, the BBC is sure to get embroiled in how to serve these users, where other competitors would simply write off the niche as too expensive to serve. This is the flip side to the breathing space they have had to develop the service.

We already have video on demand from Channel4, an evolving service from Sky and a promised launch of a service from ITV that looks spookily like that promised by the BBC. So what's the big deal with the BBC's launch tomorrow? I've said it could disrupt its industry to a greater degree than the iPhone, so I had better explain myself...

Driver for IPTV Adoption
Ofcom's MIA states that by 2011, the iPlayer is likely to account for 3% of TV viewing hours, which doesn't sound like a lot. This is in fact about 45 mins per household per week, assuming total viewing remains as today at around 25 hours per week.

But, as with Freeview, the BBC gives this new(ish) technology the credibility to go mass market very quickly. There will undoubtedly be a knock on effect on all other broadband television services because there may not be a more trusted organisation anywhere in the world than the BBC. If IPTV is good enough for the BBC, it's good enough for me...

Looking closer at the Ofcom projections: 3% of total viewing is roughly 9% of the BBC's current viewing. It would be reasonable to suggest that competitors' services might grow in line with the BBC's. This would mean every household in the UK watching on average 2 hours and 23 minutes a week of IPTV by 2011. Over 3 billion hours a year...
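
For anyone who wants to check the arithmetic, here is the back-of-envelope version. The 25 hours per week and the penetration shares come from above; the 25 million UK households is my own round number.

```python
# Back-of-envelope check on the viewing projections above.
weekly_viewing_hours = 25.0
iplayer_share = 0.03      # Ofcom MIA projection for the iPlayer by 2011
all_iptv_share = 0.095    # competitors growing in line with the BBC (the "9%" above)
households = 25e6         # my own round-number assumption

print(weekly_viewing_hours * iplayer_share * 60)                      # ~45 min/week
print(weekly_viewing_hours * all_iptv_share * 60)                     # ~143 min = 2h23m
print(weekly_viewing_hours * all_iptv_share * 52 * households / 1e9)  # ~3.1 billion hours/year
```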

Bandwidth
The MIA also says "The costs of the broadband capacity required to support the services could in aggregate be between £399 million and £831 million over the next 5 years." Once the capacity is there "the additional capacity would also be available for use by a wide range of other services, including commercial on-demand services, [so] it would not necessarily be appropriate to attribute the associated costs to the BBC services in isolation."

Ofcom's model says that the average capacity increase from the iPlayer will be 3GB per user per month by 2011.

Assuming that other broadcasters follow the same adoption curve, you are looking at almost exactly 9.5GB extra per user per month to serve the 9% of viewing hours at standard definition. This will add around 46kbps per user to an ISP's peak traffic load (approximately doubling what they have today). This is low, because I am using data that shows that early iPlayer alpha trial users had web-surfing-like peak-to-mean traffic profiles.

TV usage profiles tend to be much more peaky than web surfing traffic. Where web traffic might have a peak-to-mean ratio of around 1.6, TV viewing profiles look more like 2.8. Cutting a long story short, this would push the traffic impact from 46kbps per user up to around 81kbps of additional traffic (easily tripling today's usage, from just one application).
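
The conversion from gigabytes per month to peak kilobits per second is simple enough to show in full; the only assumption I am adding is a 30-day month.

```python
# How the per-user peak figures fall out of the GB-per-month estimate.
def peak_kbps(gb_per_month, peak_to_mean):
    mean_bps = gb_per_month * 1e9 * 8 / (30 * 24 * 3600)   # average bit rate
    return mean_bps * peak_to_mean / 1e3

print(peak_kbps(9.5, 1.6))   # ~47 kbps - the web-surfing-like profile (quoted above as 46)
print(peak_kbps(9.5, 2.8))   # ~82 kbps - the TV-like profile (quoted above as 81)
```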

Reverse engineering Ofcom's 3GB per user per month figure from the 3% penetration rate shows that they assume a 2Mbps encoding profile in their models. This suggests that high definition is not being taken into account.

If the BBC were to deliver at 1080p instead (as ABC.com in the US has announced it will), you might want to multiply the total capacity requirement by 5. With all content (ITV, Sky etc.) in HD, the 9.5GB might become 45GB extra for every house connected to the broadband network. This would push the incremental peak load per user up by between 220kbps and 385kbps, depending on the peak-to-mean profile.
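
The same arithmetic covers the encoding rate and the HD case; the small differences from the figures above are just rounding.

```python
# Reverse-engineering the 3GB figure, then scaling to HD.
viewing_hours_per_month = 0.75 * 52 / 12                 # 45 min/week is ~3.25 hours/month
print(2e6 * viewing_hours_per_month * 3600 / 8 / 1e9)    # ~2.9 GB at 2 Mbps, i.e. roughly 3GB

print(9.5 * 5)                                           # ~47.5 GB, the ballpark of the 45GB above

for ratio in (1.6, 2.8):                                 # peak-to-mean profiles as before
    mean_bps = 45 * 1e9 * 8 / (30 * 24 * 3600)
    print(mean_bps * ratio / 1e3)                        # ~222 and ~389 kbps (quoted as 220-385)
```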

Money
Where there is demand, there is money, right...?

Actually, no. This is the other major problem with the BBC, the license fee and the universal service requirements. The BBC's iPlayer will not generate money from adverts (the BBC does not do ads) or from subscriptions (the license fee already covers the service), and any other creative sources of income (including from abroad) are likely to be relatively trivial.

This is not an issue for the BBC because the content is already paid for (it's a catch-up service for material already produced for broadcast). The service creation costs have been kept under control at £3m, and rather than paying a big hosting bill, the BBC is using Kontiki's P2P client, theoretically relieving it of the burden of distribution costs.

The big losers are the networks who have to carry all this extra traffic and have no way of monetising it. This is again a BBC-specific problem because with other commercial broadcasters, the ISP is in a position to do an ad-revenue share agreement based on the unique element that the ISP can provide - the postcode. (We are going to come back to this point and the revenue opportunity from commercial broadcasters other than the BBC in LUI Part 10 early next week.)

The use of P2P actually makes the problem much bigger for the ISP. Historically, the BBC's web traffic, although significant, has been manageable via direct peering relationships between the ISPs and the BBC. Replacing this with P2P looks (to me at least) like a two-fingered salute to the businesses that have to transport the BBC's product.

Summary
Even using the lowest results in the analysis, the iPlayer promises to double the traffic on the UK internet between now and 2011. On top of that, the iPlayer opens the door to other broadcasters, which could mean that instead of doubling the volume of traffic, the launch drives a tenfold increase or more.

I'm going to be watching the iPlayer's use of bandwidth very closely over the coming months. As I have done with Joost, Babelgum and 4oD, I will be running traffic source analysis and looking at where the Kontiki client gets its traffic from. Channel 4 also uses Kontiki, but when I used their service, I found that the scarcity of peers meant that much of the traffic was delivered client-server from the seed caches rather than via P2P.

I will be keenly examining the peer hit rates, as that will determine the BBC's cost base. I will also be looking at where these peers are and whether BBC/Kontiki keeps traffic within the service provider's network or whether (like other P2P I have tested) in-country traffic source selection is effectively random. I will be publishing the findings here at periodic intervals.

If I can get the client from the website, the first set of data will be published here by lunchtime tomorrow...

UPDATE: no client = no data = no update. Sorry folks...

I got to the site by 7.40am and registered, but have yet to receive the invite. I wouldn't say that the message board is on fire yet (10 or so people grumbling about the same thing), but there are people who stayed up until midnight to register who are in the same boat.

They let Mashable in though, so if you are looking for a sneak peek, that's the place to go. If you want a different perspective on possible adoption rates, I also found this.

IWR were able to run an initial test and reported that a 30 minute programme was 108MB, which suggests an encoding rate of 480kbps. It is not known what the download speed was, which may be different from the encoding rate to allow for buffering. The picture defaulted to 400 x 200 screen size, which sounds small.

More on this when I get my prized invite...


Monday, 23 July 2007

 

LUI Part 6 - Television over the Internet

This is part 6 in the Leeds Unbundled ISP (LUI) series that Keith McMahon and I are producing. The aim is to deliver a view on the commercial prospects of a hypothetical ISP, serving a niche community (Leeds in our example).

Before we can properly present the numbers though, we need to describe what those numbers are modelling. We have already looked at backhaul, staffing and our short and medium term product set. Today we look at the biggest variable in the future of our made up ISP: video.

IPTV is better for viewers than broadcast because it is truly on-demand. It gives viewers timeshift capabilities for BBC, ITV, C4, Five and Sky so that they can watch what they want on TV around the rest of their lives. So what is the variable?

While we are confident that video services like YouTube will continue to grow, we are not sure whether mainstream TV will successfully move online because of economic, marketing and technology challenges. IPTV is competing with established digital platforms (satellite, cable and freeview) that already penetrate 18m homes (more than have broadband). Getting mainstream TV online means replacing these distribution networks with the internet.

Consider the scale difference between the two extremes of service adoption: YouTube consumption is a few minutes at a time, a few times a week, while TV is 25 hours per household per week. YouTube is currently 200kbps; IPTV as a vehicle for HD means 10Mbps.

There is very little that LUI can do to make any money from YouTube, but conversely, once we have our gigabit backhaul links in place, we are not too concerned about the cost of carrying its traffic. If they cranked up the resolution to the levels used by Veoh (700kbps), we might be a bit more concerned but as it stands, we are happy enough to carry the traffic.

What cannot be allowed to happen is for us to end up as a simple transport network for everyone else's broadcast-replacement services. Our commercial model, and that of every other ISP in the world, is based on carrying relatively small files (peak traffic divided by total users equals around 35kbps). TV viewing moving over to the internet and adopting HD resolutions would make this closer to 5.5Mbps (159 times the current dimensioning).
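
To show how a figure of that order arises, here is one rough route to it. The 10Mbps HD stream and the 35kbps current dimensioning come from above; the peak-hour concurrency (the share of homes streaming at the busiest time) is my own assumption, broadly in line with prime-time TV reach.

```python
# One rough way to arrive at a per-user dimensioning figure of this order.
current_dimensioning_kbps = 35
hd_stream_mbps = 10
peak_concurrency = 0.55        # assumption: 55% of homes streaming at the peak hour

peak_mbps_per_user = hd_stream_mbps * peak_concurrency
print(peak_mbps_per_user)                                      # ~5.5 Mbps per user
print(peak_mbps_per_user * 1000 / current_dimensioning_kbps)   # ~157x, close to the 159x above
```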

This needs to be paid for and the value is in the content: people buy music, video and TV. They don't buy bits and bytes. This means that we need payment for our bits and bytes bundled with the payment for the music, video and TV services. This means adopting a Fed-Ex model for superfast delivery of premium, newly released content and ad supported models for the rest.

Will this happen? Maybe, but only if the economics are right. We know that IPTV offers better functionality than broadcast because the internet uplink opens new doors for interactivity. With the public becoming disillusioned with telephone-based interactivity on TV, we think the internet can rescue what had until recently been a popular genre of content.

Furthermore, local loop speeds of 20Mbps or so mean HD at 1080p is practical and can be made available on demand. It's all technically possible, but these developments will only be attractive at a price point that is competitive with broadcast.

So where does that leave LUI?

Nowhere, right now at least. We just have a basic access service with some customers on CPS. Next up, development-wise, is the photo blog (due Q4 2007), followed by starting work on the softswitch (due 2008, perhaps). Enhancements to the photo blog and community features are mid-2008 launches.

We looked at buying in a wholesale IPTV service, even before we unbundled the access service. When Tiscali bought HomeChoice, we heard some suggestions that the HomeChoice platform would be wholesaled alongside Tiscali's LLU platform. While the attractions were obvious, the differential advantage was not, so we rejected that option.

More recently, we have looked at Iliad's platform and the service that Fastweb offers with a view to buying that in lock, stock and barrel. These are not currently deployed in the UK and although we could overcome the competitive issue to some degree, we just felt a little underwhelmed by the idea of taking something that had been done before.

LUI wants to do something a little different and has to exploit our core concept: our localness. For LUI, IPTV has to be built around the community, but we also have to remember that it is still essentially a distribution network for mainstream TV, replacing the satellite dish, cable or Freeview aerial in the home.

LUI's problem is that our customers can get all that from other operators, notably Sky & HomeChoice, so where we need to be different is in the EPG. We need to offer social networking in the EPG that exploits our localness and the social groupings within our customer base.

We are a network company and a small one at that, we need someone bigger to bring us the content. That means the content won't be exclusive so we have to add value to it another way. Hence the EPG and social network mashup.

That means things like the ability to recommend a programme to your circle of friends and comment on what you have seen, perhaps with an SMS gateway tacked on for alerts. When you turn on your EPG you see the linear TV option and the timeshift scroll back for the mainstream channels plus a Friends Recommend Channel.

We also have the Leeds Community Channel, which will be developed before our IPTV service but which needs to evolve onto the TV when IPTV does arrive. The Community Channel is where local interest groupings (schools, community education, sports teams etc.) can post virtual private videos to their members - much like the Iliad offer. All this is built on top of a core of content based on today's free local press.

We are already lobbying for the publicly funded BBC content to be made available via a public API, so that anyone with a delivery solution can use it to deliver BBC content. The others are different because they are commercial entities, so why would ITV, C4, Five and Sky let us carry their stuff, sometimes in direct competition with their own offerings?

Money. Pure and simple. They can get more from our subscribers if they deliver content via our network than they can via other means.

How? Because we know the customer's postcode and we can deliver it when we place the request for content. We also deliver the IP address of course, but they would get that anyway. With the postcode, they can use geo-mapping databases to paint a very good picture of who the consumer is, enabling a) demographic and b) personalised advertising. They can't get this postcode without the ISP.
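
To make the idea concrete, here is a purely hypothetical sketch of what such a content request might look like from LUI's side. The endpoint, field names and partner are all invented for illustration; nothing like this API actually exists in the plan yet.

```python
# Hypothetical sketch: LUI's middleware requests a programme on a subscriber's
# behalf and attaches the postcode so the content partner can target ads.
# The endpoint and field names are invented for illustration.
import json
import urllib.request

def request_programme(content_id, subscriber_ip, postcode):
    payload = {
        "content_id": content_id,
        "client_ip": subscriber_ip,   # the partner would see this anyway
        "postcode": postcode,         # the piece only the ISP can add
    }
    req = urllib.request.Request(
        "https://partner.example.com/v1/play",   # hypothetical endpoint
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )
    return urllib.request.urlopen(req)

# request_programme("example-episode-id", "203.0.113.7", "LS1 4AP")
```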

We could also consider sharing any special interest profiles that the user may create on our social network but this raises some ethical issues I suspect, not to mention the technical challenges.

But all of this needs to be pulled together: content, advertisers, client software, DRM and CDN. We are looking for one party to bring this to us. LUI's plan is to work with them and vice versa to prototype the future of IPTV.

The prototype is based on Joost, or Babelgum or Veoh (we haven't stitched it all together yet, it's just a plan). Something that runs on either an AppleTV-like STB or on the TV itself. Their job is to aggregate the content and provide us with an efficient distribution using P2P and local caching. They also handle all the advertising including the targeting and pay us a revenue share.

In order for this to work, Joost (or whoever our chosen partner is) must bring a deal with the major broadcasters. Joost does the deals with the content owners for us because they can and we can't.

Our BBC, ITV, C4, Five and Sky content comes through the deals that Joost has with them. Joost can pay better ad revenue than the producer can get by themselves from broadcast because we give them the postcode. As a result, they can target ads much more effectively. We get a rev share because we are adding value to their proposition.

Furthermore, we are solving one of Joost's problems - the EPG and social networking, which are currently lacking in their product - and we are leaving them to concentrate on their role as the IP TV operating system. We carry Joost's traffic and help them develop their intelligent localised P2P routing.

We provide the EPG (or at least our software partner behind the photo blog / community web work does that for us), and it has a two-way API into Joost (or whoever). The EPG is our value add, our brand, our directory of content and the portal through which users can get to the array of services that we offer. Of course, they can go onto the open internet, but with our gateway offering them RSS-based access to the world, we reckon we can hold a fair proportion of screen-time on our own services. Which is great for our ad revenues.

Of course this is all made up. LUI doesn't exist and there are holes in the plan and some very rough edges. With any luck though, this might give you a few ideas...

Part 7, back on Telebusillis, is going to look at hardware; Keith will publish it later this week!


Thursday, 12 July 2007

 

Joost: further analysis of a bandwidth hog

Back in April, when I was writing my original "Joost: analysis of a bandwidth hog" piece, I had no idea that the piece was going to make up 24% of my pageviews since then. It is quite possible that this had nothing to do with my writing and everything to do with the strategy secrets to be found in the link colmmac posted in his comment. Unfortunately, these secrets are now gone but you can read all about it here.

Whichever it was, it was clear that people were interested in how the application was behaving, so I updated the analysis in late May with my "Joost about on track" post that also looked at the wider strategic issues facing the company. My summary was that they were kitted up to play, but by no means certain of making the cut. They had (and still have) a lot of work to do on their EPG to improve what is a very static and "been there, done that" channel guide.

The content delivery network though was improving and this has clearly moved the company towards the concept of being "a high quality ad-supported secure cost-effective delivery platform" that works within networks rather than over the top. Their new CEO used to head Cisco's network equipment division, remember... Joost: the operating system for IPTV?

Now in mid-July, having run the analysis for a third time, it is clear that one thing that is working really well for them is the peer-to-peer network. Success here is measured in hard currency: the bill they pay their network providers (Level3 mainly, but also BT Infonet). What I am looking for in my analysis is the proportion of traffic that comes from peers versus the proportion that is served from these centralised locations.

In order to get my results, I am using Wireshark, which many readers will know by its former name, Ethereal. This lets me see all the traffic on my LAN and, with everything except Joost turned off, shows what the application does when it is active.

What this shows is that for popular content at least, Joost has successfully offloaded the vast majority of traffic from its paid-for connections onto its free P2P network (free to Joost, that is). In April's study, 47% of bytes came from central servers; six weeks or so later, on 25th May, I reported that this was at 18%. Results from 11th July show that it is now 6.7%.

This popular content (the Fifth Gear Ferrari 430 Spider road test) is now well seeded, such that no one peer delivers more than 12% of the total file. There are some major sources: 6 peers between them give me 50% of what I am after and 14 peers account for 80%, but there is a long tail.
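
The concentration measurement itself is straightforward: rank the per-peer byte totals and count how many of the biggest peers it takes to reach a given share of the download. The sketch below shows the idea on placeholder numbers, not my actual capture data.

```python
# How many of the biggest peers account for a given share of the bytes.
def peers_needed(byte_totals, threshold):
    ranked = sorted(byte_totals, reverse=True)
    target = threshold * sum(ranked)
    running = 0
    for i, b in enumerate(ranked, start=1):
        running += b
        if running >= target:
            return i
    return len(ranked)

example = [120e6, 90e6, 60e6, 40e6, 30e6] + [5e6] * 50   # placeholder per-peer totals
print(peers_needed(example, 0.5))
print(peers_needed(example, 0.8))
```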


In total, there are 99 IP addresses on 75 different networks in 33 countries delivering traffic to me.


From what I have seen, the network is now starting to prefer peers on this side of the Atlantic (possibly because there are just more of them now). Where my May sample had 39% of traffic coming from the US and Canada, the figure in my July data is 4%. This is good news. Things are moving in the right direction.

So now that Joost's distribution capacity is seeded in Europe, expect to see this become more country-specific, aligning with the network interconnection arrangements already in place. Serving Joost punters in the UK from other users in the UK means that ISPs' costs are mitigated, as they will likely avoid the most expensive routes - transit.

But this is still only mitigation. The overall burden on the ISPs goes up with every Joost user that leaves their PC on, even if the application is minimised.

As I have mentioned before, Joost also uses the network when it is inactive (minimised), which I have questioned in the past. I don't know if this is new or if I missed it the first time, but the Terms and Conditions acceptance box now states clearly "I know that the Joost software will operate when minimised unless I fully exit the software".

Perhaps this is a necessary evil, but in a world where many people either have bandwidth limits or fair use policies set by their service providers, consuming someone's scarce resources for use by a complete stranger is still a big concern for me. This is where the house of cards could be vulnerable - you might give them a lift if you were going that way anyway, but would you let a stranger borrow your car when you are not using it (even if you were 100% sure you'd get it back)? Hmm...

Perhaps this won't be an issue. Certainly, when I was sampling yesterday, I was surprised to find that the Joost application was not using my bandwidth when idle. Granted, this was yesterday morning around 11am, so I can understand that demand from other users would be pretty low at that time - most people are at work, of course, and not watching Fifth Gear on Joost. I am certainly going to have to come back to this point and look again at what is happening at peak hours.


I speculated in June that once the peers are well established, the file sizes can be cranked up to increase resolution and user experience. That time may well be approaching - perhaps when the service finally goes from the most well known Beta trial in history to being the real thing?

Having said that, to my untrained eye the resolution already looks much better than it did in April. This might well be down to new codecs in the ever-evolving software (the latest download is build 8 of version 0.10, so the guys there have clearly been busy). Whatever it is, the picture quality is now almost as good as standard definition TV. Not quite up there with Channel4's on-demand service (which uses a lot more bandwidth), but a lot better than Babelgum.

It does not look like this improvement has come from cranking up the bandwidth used by the application. My traces show that this is about where it was before; if anything, the usage may even be ever so slightly lower. The picture quality, though, is a lot better.

So that was popular content; what about the stuff that is buried in the Joost channel guide? Wow, do you have to work to find something watchable here. In fact, I didn't watch it: I put it on, started Wireshark and went to make a cup of tea.

I chose the Community Channel because it seemed like the sort of thing that trendy Joost-heads would be most unlikely to be interested in. To be fair, the film was actually very well made, home made, but well made. It was about a couple of schools who decided to give their pupils a taste of the business world by letting them play Tycoon where they could design and develop a product for sale in their communities. Thankfully Peter Jones was nowhere to be seen.

So I was surprised when Wireshark revealed that this show about three very British schools was coming largely from servers in the US. 68% of bytes came all the way across the Atlantic, while almost all the rest was from the UK. The source networks? Level3 in both cases.

The aim of the experiment was to see how long-tail content is distributed, and here we have an answer. It makes me wonder (again) about the whole Long Tail thing. It is going to cost Joost an awful lot more to distribute this content, as it will be very hard to seed. They will be much better off if / when they get mainstream content on there that they can deliver as timeshifted linear TV.

My research has previously shown that on YouTube at least, the internet means the rich get richer. The 20th most popular clip there had 18% of the views of the most popular, while the 20th most popular DVD rental generated 79% of the revenue of the no. 1 title. Humans like to follow the herd, and social networking helps by showing the direction the herd is heading.

If it is far more expensive to distribute the unpopular stuff and demand for it is much lower (lower demand = lower prices), where's the business case? Time will tell, and I've been wrong before...


Wednesday, 16 May 2007

 

That precious hour

Ten days ago, quite unexpectedly, my wife had a baby. Of course, I knew she was pregnant, but the little guy wasn't due until the end of May so we weren't ready - I'd managed to paint the ceiling of his room, but not the walls. He didn't have anywhere to sleep or any clean clothes to wear. It was all a bit chaotic, as I'm sure you can imagine.

All's well that ends well, but all this meant that I was unable to write anything last week, as my job suddenly became Chief Entertainer for our two-year-old son. They say that it's easier the second time around and they are right - looking after a newborn is infinitely easier than looking after a two-year-old...

On Monday night this week I got a bit of time to myself. Not much, but between 8pm and 9pm I managed to catch a bit of telly. Fortunately for me, there was not one, but two programmes that I wanted to watch. Unfortunately for me, we don't have a Sky+ box and my old VHS recorder let me down badly when I tried to tape the Australian Grand Prix a few weeks back so I have consigned that to history. So I had to make a choice: did I want to watch Panorama's programme on Scientology or Dispatches' character assassination of our next Prime Minister, Gordon Brown?

Again, last night (Tuesday), I had an hour to myself. A precious hour. An hour I had worked hard for all day. But what was on? Nothing!


(Strictly speaking, that's not true as I have hundreds of channels, but there was nothing there to float my boat and I ended up watching Hugh Grant's truly dire American Dreamz on Sky's Comedy Movie Channel)

So one night I miss something I would like to watch and the next I'm forced to watch Hugh Grant pretending to be Simon Cowell because there's nothing else on.

"A sample of one", said an industry expert when I described my problem to them. "The best thing about television is that it is not interactive", and while I agree with them to a point, I can't help feeling that what I experienced last night (not for the first time on a Tuesday night by the way), would be enough to convince me to go for a genuine time-shifted / catch-up TV service. Given how precious my time is now, I would be willing to pay for it too - on a subscription basis, not pay-per-view as the last thing I want to do after reading The Gruffalo for an hour is to analyse whether watching X is worth £Y.

So why don't I get a PVR? Jolly good question, that - maybe I should - but I can't help thinking that it just adds one more problem to my already busy life. A PVR requires planning and foresight, which is something most people with kids are short of at 8pm. For sure, I would have been able to watch Lewis Hamilton's debut in Melbourne a couple of months back, because I would have set the thing to record (as I tried to with the VHS), but you don't often get a programme like that - one you know you are going to miss and want to watch enough to get off your backside and do something about it.

I am talking about that veg out time. That precious hour, when you really don't want to think "if only I'd set the PVR last night", when it would be so much easier just to go "back" through the programme guide and find something, anything, better than is on offer right now.

If you are the BBC, ITV or C4, making quality programmes, why do you want to restrict your audience to those who happen not to have anything better to do when your work is aired? Don't you think that more people would actually watch your stuff if they could do it on their own terms? Maybe they would watch less imported tosh, and maybe Sky has more to lose than gain from on-demand?

Clearly, the BBC and C4 are with me on this. The iPlayer is horribly caught up in bureaucracy, which is a shame because the BBC has most to offer when it comes to quality programmes. C4's 4oD is available now, so as part of my research for this article, I have caught up on the half hour or so of the Dispatches programme that I missed when I switched over to Panorama on Monday.

The quality of the 4oD download was good. The file was 348 MB for 48 minutes of film, which works out at very slightly under 1 Megabit per second. Interestingly, it was 48 minutes because the 1-hour broadcast had been stripped of the adverts - which I consider peculiar in the extreme. Surely here is a great vehicle for ads to help pay for the service provision. Although, with the ability to fast-forward in the current media player, you might not expect people to watch them...
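
For completeness, the bitrate is just file size over duration:

```python
# 348 MB over 48 minutes.
print(348 * 8 / (48 * 60))   # ~0.97 Mbit/s - just under 1 Megabit per second
```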

So I watched the re-run of Dispatches on my PC and, as I worried about whether my country was going to end up with a control freak as its next leader, I also considered some research that I had seen from CacheLogic into online video watching habits. Was I happy watching TV on my PC? Yes, but only "during my lunch break": I would not want to do this at 8pm during my precious hour of veg-out time. I didn't last night... I watched Hugh Grant "acting" the twit instead.

The picture quality was equivalent to broadcast TV (and better than Joost, which runs at around 700kbps). You have to download the 4oD application which has an embedded Ioko Kontiki P2P client, which obviously aims to spread the distribution burden.

I checked my download with Wireshark (formerly Ethereal) and found that over 90% of the data was coming from Ioko's own network, indicating that the seed file had not been distributed sufficiently for P2P to have much impact on Ioko's (and C4's) bandwidth costs. I'm sure they weren't helped by the fact that I turned off my 4oD client as soon as the download had finished in order to save my bandwidth cap. This all meant that someone on BT Central Plus who had been receiving data from me had to find somewhere else to get it from - probably back at Ioko...

It is also interesting to note why I wanted to watch Panorama. Like many other visitors to the BBC's web site I had been intrigued by John Sweeney's tirade at Tommy Davis (official BBC version). As a blogger with an emphasis on online video, I was further intrigued by the use of video by the BBC to promote their programme and the use of YouTube by the Church of Scientology to counter the position taken in the programme - Scientology getting its retaliation in first.

So whatever your views, and whether or not you believe we are descended from aliens, you can now make your views known to the world with a bit of time and effort (and £2,000 of electronic equipment). I have no doubt that there will be a lot of long-term fallout from this very high-profile example of an electronic propaganda war.

On demand allows me to fill my precious hour with something I want. But it's not just about me, it's also a vehicle for advertisers who want to reach me. Consider for a minute how much you have learned about me by reading this article and hearing what I like to watch.

Put yourself in an advertiser's shoes and ask yourself how good a profile you could build of me if you could monitor what I watch when I really do have a choice. The uplink on an IPTV service is pretty much ideal for transmitting such detailed one-to-one demographics back to you, the distributor. You could be pretty sure to "know me" with a picture of my likes and dislikes over time. You can hit me with what can genuinely be called targeted ads. And, because you are in control of the content feed to my TV set, you can place personalised ads, just for me, that you know will be most likely to get a reaction from me. Even when I am watching The Bill, just like everyone else.

