Welcome to The IP Development Network Blog
February 13, 2008
Last Thursday marked the start of the Chinese New Year. This year is the year of the Rat.
I’d love to tell you more about why this is important, but although I have been making great strides in my attempt to raise funds for my new venture, I am not yet in a position to share details with the world.
But, please… do stay tuned. The story starts here.
January 17, 2008
Simple – easy to understand, deal with, use, etc.
I want to make it simple to get data on and off mobile handsets. Today, it is not easy to understand, deal with or use mobile infotainment, so Nihat Karaoglu, MD of Centrillium-IT, and I have developed something to simplify the process!
It will cost less than £30 inc VAT.
I want to talk to people who are interested in selling the product. I need to create a virtuous circle between funding – manufacture – distribution and that starts here with an open invitation to anyone who wants to get involved to contact me.
Unfortunately, that means a dramatic scaling back in the amount of blogging I will be able to do in the future. I have always regarded my independence as a writer as paramount in what I blog. The articles, theses and philosophies were fun to write and will remain as a snapshot of how one telco commentator saw the market in 2007. It is going to be harder to guarantee that independence in the future, but I will try occasionally to comment if I have something useful to add.
The Dragon’s Den for me? A lot of people have suggested it, but the investment I need is much larger than they have dealt with before. I would not expect anyone to analyse such an opportunity in a 20 minute slot and it would be unfair to expect the Dragons as serious investors to do so. I have all the details available for analysis and although I believe that the concept can be sold in less than 10 seconds, I would expect investors to be very cautious about investing such a large sum.
Please email me directly if you want to get involved.
December 13, 2007
I have certainly been distracted over the last couple of weeks by a mountain of work. This is no bad thing considering that I have two kids, a wife with expensive tastes and Christmas is coming, but it has meant that I have been more than a little quiet on the blog.
While catching up, I learned that yesterday, the BBC sneaked out a streaming version of the iPlayer, based on Adobe Flash and Akamai’s CDN. I say sneaked, but they could have been standing behind me with a megaphone shouting “Streaming iPlayer is here” and I would still have missed it.
Anyway, a day late, I checked it out and I also re-read some of my earlier comments on the service, reflecting on a point made by The Guardian that there are only a “few thousand regular users” at present.
The lack of streaming was a huge hole in the initial version because it meant the user got no instant gratification from using the service – you had to wait for the programme to download for a while. Imagine waiting 20 minutes for the channel to change when you hit the Sky remote? No, me neither.
So now that streaming has been addressed, I would expect to see usage ramp very quickly. Perhaps not on Christmas Day (which has to be the day to soft-launch a product), but with boredom and family feuds setting in after the festivities have passed and a seasonal catalogue of programmes to catch up on, we may see some results very quickly.
The use of Akamai is highly significant – it will be costing the corporation real money to deliver the traffic now which may yet break the business case for free access to catchup content. But, it also makes it a lot harder to block the traffic as it no longer falls into the “bad” P2P bucket… I don’t know if traffic shapers will be able to distinguish one BBC Akamai bit from a non-BBC Akamai bit.
2007 saw the Beeb take a beating for the iPlayer, most of which was down to the use of Kontiki’s Kservice and Windows-only DRM. The new service addresses both in one go, so 2008 may yet see the iPlayer emerge as the bandwidth hog we all thought it would be. Where does this leave Kontiki? Up a certain creek, without a vital instrument in my view…
December 12, 2007
No summary of 2007 would be complete without mention of the iPhone and Facebook, so I will get them out of the way. There: they are out of the way. Now that I have dealt with sledgehammers masquerading as marketing phenomena, I can move on to the more subtle developments that happened in 2007 and what I think they mean for 2008.
For me, 2007 was about the reality of PTT separation, the future of fibre-to-somewhere (but where?), platforms and over-the-top applications & devices. The items that promised to dominate the agenda, like net neutrality and IPTV, were strangely subdued, perhaps by the burden of expectation that they would actually deliver. Today, I will be reflecting on broadband and how separation has changed the landscape.
2007 Brief – Separation Stimulates Broadband
Separation really started to deliver in 2007. The UK’s pragmatic move towards separation has led to 3.5m LLU lines (23% of total broadband) and an accelerated consolidation of the ISP market. The second tier is all but gone: PlusNet lives on, but is now a part of BT, while PIPEX Internet suffered a far more gruesome fate after its attempts to trigger an auction frightened off all and sundry.
Niches have begun to emerge: BT is about total service, Sky about its TV and Virgin about its cable again (a sign that technology can still be a benefit). TalkTalk and Tiscali are playing the transport game, leading the price war into ever darker depths and buying up those they suffocate. Zen lives on, insulated by its excellent and transparent service in an ever shrinking number of players.
BT Keep Going
BT Retail lead the way through consistency and evolution. The evolution has seen the home hub emerge as a central feature with BT making the best use of theirs to simplify connection and support. They are well positioned for their core, more mature and less technically able market.
Disappointingly for the company no doubt, BT Vision has been a failure so far in spite of massive advertising and technical achievement (something Tiscali TV could have perhaps warned them of). It is an overlay service and yet you can only get it on BT’s IP Stream based service…?
Feature-wise it is limited too and the service has not worked because it is simply a replacement distribution network that costs more than its predecessor and is more technically complicated. If I want multichannel, I get Freeview or Sky.
The Price War Wounded
TalkTalk stumbled through its AOL integration. That transaction changed Carphone in many ways, for better and for worse – the company stopped acting like a punk kid and got a lot more serious but lost some of its panache as a result. I wonder whether that deal is going to work out for them as I’m sure it confused more than it created.
Tiscali keep confounding the doubters and, unlike Carphone with AOL, are taking the direct approach to integrating an acquired customer base. It will probably work for them too – you sense an edginess about Tiscali keeping them hungry. If you had told anyone 5 years ago that Tiscali would be one of the major broadband players in 2007, you would have got some strange looks, but all credit to them for achieving what they have. Not sure where they go from here though as the squeeze will continue from BT and Sky.
Naive? Yes! Innocent? Hmm…
Virgin were still NTL Telewest on Jan 1. On the 8th February, they became Virgin. The name change was supposed to herald a newer, braver and racier image but they were quickly put in their place. The rebrand was instantly tarnished as a result of trying to take on Sky and coming out second best.
The row made Virgin look petty and naive, but it seems that in recent months they have moved on from that episode and the shambolic ousting of Steve Burch. Under acting CEO Neil Berkett, they have refocused on their core cable assets and have seen customers start to return. They need stability and they can put aside any notion of being bought out now that the credit markets are closed, which should be good for them in 2008.
While Virgin briefly hit the canvas in the bout with Sky, the satellite broadcaster may have been left nursing its wounds too. For all the progress on broadband, it is now £60m a year short from Virgin. Furthermore, the TV business is getting more expensive because of competition for rights, and the stake it took in ITV has diminished in value substantially. It may yet be forced to sell this at a discount, although James Murdoch will not be there to suffer the humiliation as he has been promoted within News Corp to play with some bigger toys.
On the broadband side though, Sky is going great guns. They are starting to build an effective hedge against the internet replacing broadcast – and now have their seat at the Telco table too from where they can influence how the market evolves. I don’t think Sky really wants the internet to “win”, but they know how to block and tackle as well as how to look good throwing a touchdown pass (this analogy is a Holiday Special for all my American Readers).
That leaves Orange and that’s it.
Poor Orange is stuck deep in French goo. It seems like broadband is actually dragging down Orange mobile and, although there have recently been changes there, it seems inevitable that Sky will go soaring past them and continue its pursuit of the big boys. It will be notable if and when Sky catches up organically with the two (CPW and Tiscali) who have reached that size via acquisition.
Six players account for 95% of UK broadband subscriptions. Each has over 1m subscribers so they all have a critical mass of sorts, but I would expect 2008 to see this number reduced to 5 or perhaps even 4. I cannot see France Telecom allowing the situation at Orange to continue with their UK units pulling each other down and the broadband one languishing in sixth place. Tiscali are an obvious target, but an acquisition by Orange in 2008 of CPW should not be ruled out. Alternatively, they could sell the Freeserve / Wanadoo business, but that would be a massive step backwards for what was once a great industry visionary.
O2 and Vodafone are sitting on the sidelines. O2 has of course dipped its toe in the water with Be; it even got its swimming costume on before doing so, but it looks like they are still testing the temperature, unsure of how broadband fits with its core business. Vodafone’s earlier intransigence might yet make Sarin look good compared to rivals who rushed in and are now feeling the chill.
2008 Prospects – Volume Growth Dies
Growth has already slowed down and we are now clearly into a new era. But, it seems that there is also a measure of price stability with the cheap, cheaper, free playground spat well and truly behind us. It is now all about switching customers and stopping your customers getting switched. Because LLU switching has not been required much to date, the process has not been tested, but it will be in 2008. ISPs will need to work out how to do this quickly or there will be trouble.
The TalkTalk base is coming out of contract and people are no longer buying “free”. This poses a considerable challenge for the company and the other price leader, Tiscali, especially as both will start to feel the pain of resources that were spent in 2007 integrating acquisitions rather than developing the product.
Expect both to continue along the price leadership path through ever lite-er services, aggressive policing of usage volumes and outsourcing of customer services to cheaper parts of the world. Expect very little in the way of feature development or product innovation but in a harsh economic climate, they are well placed to survive.
Who is Going to do R&D?
R&D is not CPW or Tiscali’s core competence, but innovation is needed in the saturated market. We don’t yet know how far the Telco needs to go beyond the connection, although others are playing out various scenarios. BT is good at R&D, so they deliver the integrated products with upsells that add revenues. Virgin too, on their cable platform can deliver incremental services that generate extra revenue.
Sky has also shown its ability to evolve additional revenues on top of a basic satellite platform, but it remains to be seen how well they can do this over broadband. They sell access today, how are they going to monetise their content on broadband? I think we will find out in 2008.
Step by Step Developments
I expect there to be a major focus on the home network by this time next year. The access market is mature now but is limited by the still difficult process of adding new IP devices that aren’t computers. This is a choke point on the path to video that will start to clear in 2008. The extension of the home hub will increasingly become a valued differentiator which will suit BT.
The other major development I expect to see in 2008 is the emergence of the son-of-IP Stream. The product that will offer an alternative to LLU is long overdue and I expect this to start to make a play in 2008.
I will be looking for signs that BT Wholesale is going to compete against LLU now that it is free to do so. BT Wholesale cannot simply give up the market for wholesale connectivity wherever there is LLU and an IP Stream alternative would allow niche ISPs to extend their competitive reach. How BT play this will be very interesting.
Also in son-of-IP Stream, I expect to see the results of BT’s R&D in a wholesale portfolio: use of BT engineers for home visits, home networking equipment, BT Vision (Wholesale), multicast, storage, backup, traffic management. It would be interesting to see whether these eventually get extended to LLU customers too.
I expect 2008 to be a hard grind for broadband access. Economic conditions look like they are going to be pretty tough so customers are going to be chasing value. On the other hand, customers are also going to increasingly chase service and honesty as a backlash against broken promises from free* unlimited* broadband bulls**t.
BT and Virgin look to be on the right track and I expect them to quietly improve throughout 2008 and maintain their lead over the pack, which will be joined by Sky. Sky look well placed, although they might suffer on the pay-TV side if the economy is really bad.
CPW and Tiscali are in danger of fighting over the scraps and hurting each other in the process so it may be that a merger between the two UK arms is a possibility in 2008. Combined, they would be bigger than BT Retail which might ruffle a few feathers and leave Sky a long way adrift in fourth… As long as Tiscali led the integration project and CPW the public image, it might even work.
Orange…? I cannot see how they can keep up with Sky or catch Tiscali or CPW. Can they run a cash cow and avoid hurting the mobile business? Do they want to play if they are going to be on 7% market share in 6th place? I think that the answer will be revealed in 2008 and it will be a “non”. I expect the Orange broadband unit will be sold off in the second half of 2008, possibly to Tiscali.
O2 and Vodafone look on. O2’s commitment will be tested in 2008. Neither of them are going to win 1 million plus customers organically like Sky without something special and something new, but neither of them have it. Do they want to swap positions with Orange? No, thanks, so they either buy one of the established players or they work out a way to go over the top.
My money is on the latter. We are well on the path towards the sort of consolidation we have seen in the US – where AT&T and Verizon form a virtual duopoly. It is not just France Telecom that doesn’t want to come 6th, no one does.
There is probably an ultimate market structure which supports BT, Virgin and one other at the access level with everyone else either running niche implementations on BTs platform or going fully over the top. We won’t get there in 2008, but we will continue to evolve that way.
December 10, 2007
Here is my response to Ofcom’s consultation in full…
We are nowhere near the limits of the existing access network. We use less than 2% of the local loop capacity, but peak throughput is an issue because of bottlenecks and immature architecture on other parts of the network. We have a lot to do before we are ready for NGA.
There is an assumption that we need more local loop speed but I believe that this is flawed. What we need to do is give the market time to work out whether this is true or whether the speed gains can be achieved in other ways.
Distributed networking, local caching and storage, and predictive off-peak downloads, all based on peer-to-peer principles, allow peak loads to be managed by spreading transfers across the whole day.
P2P is not just about free-loading and copyright theft – that was just the sandbox in which the technology developed – the technology itself provides an intelligent routing layer onto a dumb network that gets data where it is needed before it is needed. Zero network latency…
On the copper itself, there are a number of projects that may deliver yet another step change in performance. ASSIA Inc, working with the University of Melbourne, is one example – what they have found is that by minimising crosstalk you can deliver as much as 250 Mbps on existing copper loops. There are other, less credible, projects that promise even more.
The existing network has not yet been allowed to fully evolve so it would be premature to suggest that it needs to be replaced. More emphatically, it would be extremely dangerous to pre-empt what the most efficient solution was without giving some of the other options time to play themselves out. There is a False Sense of Certainty that FTTH is the answer.
New builds offer the opportunity to roll out fibre as cheaply as copper, so in those cases it would seem sensible to do this so that the fibre option can be fully evaluated. Elsewhere, replacement is an entirely different question as there is already a basic service with the potential to evolve.
What would seem clear is that evolution should happen from the core outwards, rather than from the access network inwards. Backhaul issues are being addressed, but there is a long way to go before the backhaul, exchange and cabinet facilities are at a level where an alternative operator has a real opportunity to innovate.
In general, improvements to copper performance come by shortening loops more than by using a better type of xDSL. The real issue in UK Broadband today is not Next Generation Access, it is the Digital Divide for Last Generation Access because of very long loops on existing infrastructure.
This is where policy and regulation should play. There should be a universal service right to enough speed and capacity at a reasonable price. This might start at 512k with 2GB for £9.99 so that focus can be put on areas where this is not possible.
Solving such problems will undoubtedly involve shortening loops, perhaps by converting some cabinets into exchanges. This is an expensive task in rural areas, so once policy has set the targets, regulation can work out whether artificial assistance is required to make addressing this issue an economic proposition. It might also be a good sandbox for FTTC.
Where LLU has proved the market to be competitive, Policy and Regulation needs to back-off and let it evolve. The real job of Policy and Regulation is to focus on how to level the playing field for LGA where there is no competition.
Definitions and Key Data
I think it is important to detail what I understand by NGA as opposed to NGN before I pile into the questions themselves. I am using the following demarcations:
- NGN represents the elements of the upstream network (towards the internet)
- Around 130 core BT nodes and the fibre between them
- OLO core networks (using wavelength or fibre)
- Does not include backbones built on leased capacity
- Backhaul describes the portion of the network that connects the NGN to the DSLAM in either the exchange (LLU) or the street cabinet (SLU). It also includes any point to point long haul circuits used by this traffic using STM-x or Ethernet on a backbone network to a centralised core infrastructure.
- Access describes the connection from the exchange to the wall socket in the customer’s premises
It is also worth splitting out the logical layers of the network in this discussion. I think you have to consider Layer 1 (physical), Layer 2 (data link) and Layer 3 (IP) together and in isolation.
Backhaul today runs layer 2 tunnels to the core nodes before the traffic is routed in the NGN. The extent of Layer 3 deployment is a very important consideration in the effective use of existing resources.
The effective use of existing resources is the centre of my response. A 2 Mbps link is capable of delivering 642 GB per month of data and yet average UK internet usage is what? 5 GB – 10 GB? Less than 2%…
Question 1: When do you consider it would be timely and efficient for NGA investment to take place in the UK?
NGAs should be built wherever there is a new build project like Ebbsfleet. It is possible that all new estates can follow this model, although there needs to be consideration given to what constitutes a critical mass of homes in a project.
Replacing existing copper is altogether different, obviously in part because of the scale of the task. It will take a great deal of time to achieve – perhaps 10 years – so obviously delays getting started are a concern. But, there is another angle to consider first.
There are a small number of homes for which even 1 Mbps is a pipe dream today. There is an excellent response to the consultation by Stephanie Northen which says it far better than I can. The Digital Divide should not be thought of as an issue at the margins. Of course, Northen’s case is extreme, but it should be noted that 20% can’t get 1 Mbps today and 40% can’t get more than 4 Mbps (source: BSG Pipe Dreams Report).
This country needs to address The Digital Divide first, before the need for speed at the top end. There needs to be a clear understanding of what basic services everyone has a right to receive – either Ofcom or the Government needs to set benchmarks and work out policies to achieve these targets.
We need to level the playing field before we extend the divide further. Everyone should have access to 2 Mbps before anyone has access to 100 Mbps.
The Need for Speed (or Capacity?)
10 GB divided by 642 GB = 1.6%. We use less than 2% of the capacity of a 2 Mbps loop.
Of course capacity is not the same as speed and this is where the problem lies. There is an obsession with headline speed because it is one of the few “marketing features” in broadband today. We jumped to 512 kbps then 2 Mbps; we wanted 8 Mbps then 24 Mbps. Now we want 100 Mbps… But do we know why?
What is so big and so live and so important that you need 100 Mbps there and then? With 100 Mbps, you can send / receive 32 Terabytes a month. We certainly don’t need that yet.
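The capacity figures quoted above can be reproduced with a few lines of arithmetic. A minimal sketch, assuming “2 Mbps” means binary megabits (2 × 1024² bits/s) and an average month of 30.4375 days, which is what makes the 642 GB figure come out:

```python
# Back-of-envelope: monthly capacity of an always-on link vs. typical usage.
# Assumes binary megabits and an average Gregorian month of 30.4375 days;
# decimal units (10^6 bits/s, 30 days) would give ~648 GB instead.

BITS_PER_MBPS = 1024 ** 2              # binary megabit
SECONDS_PER_MONTH = 86_400 * 30.4375   # average month length

def monthly_capacity_gib(speed_mbps: float) -> float:
    """Data a link could carry if saturated 24/7, in GiB."""
    bits = speed_mbps * BITS_PER_MBPS * SECONDS_PER_MONTH
    return bits / 8 / 1024 ** 3

capacity = monthly_capacity_gib(2)   # ~642 GiB for a 2 Mbps loop
utilisation = 10 / capacity          # against ~10 GB of actual monthly usage
print(round(capacity))               # 642
print(f"{utilisation:.1%}")          # 1.6%
```

The same function shows why headline speeds outrun any plausible usage: `monthly_capacity_gib(100)` is over 32,000 GiB, the “32 Terabytes a month” figure above.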
Furthermore, local loop speed is only one of the factors in the throughput – which is surely what we are actually talking about. Speed is important but it degrades with distance as the chart from Akamai shows (source Telco 2.0). There is no point in having 100 Mbps access unless you can throughput at 100 Mbps, so you need to consider where content is physically distributed from and place that closer to the user.
If speed is important, we should first look to lower latency and ask what the options for doing that are. The falling cost of storage is highly relevant to the broadband market as that allows more content to be deployed deeper into the network, shortening routes and lowering latency.
Tromboning & Routing Inefficiencies
Bringing Layer 3 into the backhaul network, and even eventually into the access network, should be considered as one of the options because it would help minimise use of the backhaul and core networks and thus improve performance. The first time a connection is routed is when the Layer 2 data reaches a Broadband Remote Access Server (BRAS), which is typically deep within the network. This long backhaul is the cause of the tromboning effect.
DSLAMs convert Layer 1 data to Layer 2 and a BRAS converts Layer 2 to Layer 3. Until it reaches the BRAS and becomes Layer 3, the data is heading down a tunnel with no exits. This tromboning effect is an inefficient use of the Access and Backhaul networks.
In the diagram above, the blue line represents a session between two users on the same ISP. You can see the tromboning on the backhaul circuits. Similarly, when the session is between users of different ISPs, there is even greater tromboning because the traffic has to go through a peering point.
Bringing routing deeper into the network makes it ever more like a grid – which might be what we need to get the best out of technologies like peer-to-peer. Shorter distances mean lower latency and lower jitter, and much greater scope to use the capacity on the fixed-price, uncontended portion of the network at zero extra cost.
With DSL, the local loop is dedicated capacity between the exchange and the home. It costs a set rate that doesn’t vary based on how much it is used. This is not the case at Layer 2, because more usage means more circuits and that means more cost.
The elimination of tromboning and the localisation of content reduce network load and latency. This leads to a more complex network, but one with better performance.
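The scale of the inefficiency is easy to see with a toy model. The distances below are invented for illustration, not real network data: two neighbours on the same exchange exchanging traffic via a deep BRAS, versus a hypothetical router placed at the exchange itself.

```python
# Toy model of tromboning: a session between two users on the
# same exchange. Distances in km are illustrative assumptions.

ACCESS_KM = 3      # home to exchange (local loop)
BACKHAUL_KM = 150  # exchange to the deep BRAS / core node

def path_km(route_at_exchange: bool) -> int:
    """Total distance a packet travels between the two homes."""
    if route_at_exchange:
        # up one local loop, routed locally, down the other
        return ACCESS_KM + ACCESS_KM
    # up the loop, tromboned out to the BRAS and back, down the loop
    return ACCESS_KM + BACKHAUL_KM + BACKHAUL_KM + ACCESS_KM

print(path_km(route_at_exchange=False))  # 306 km via the deep BRAS
print(path_km(route_at_exchange=True))   # 6 km routed at the exchange
```

The 50:1 ratio is made up, but the structure of the saving is not: every tromboned session consumes two backhaul legs that local routing would avoid entirely.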
Alternatives have not been explored
We need to question how much bandwidth we need. Bandwidth is two things: speed and capacity (speed x time). While there is a speed problem constraining certain applications, there is not a capacity problem on the access network.
We use less than 2% of the DSL access network’s capacity.
Furthermore, the cost of local storage is falling very quickly and predictive technology is improving, which could enable a user to receive their files before they need them. This could enable much more efficient use of the aggregate capacity installed today by filling up off-peak troughs on the backhaul networks.
It has to be considered that the satellite and Freeview broadcast models will probably be preferable for mass market content for a good time to come, particularly when, as with Sky+, local storage is incorporated at the edge to add catch-up and on-demand services to broadcast. Broadcast will have a long-term advantage in live events like news and sport.
The falling cost of storage means that the CDN can be extended into the home. Catch-up TV and other non-live media publications can use this to deliver their products while the network is quieter. In most cases, the time of delivery is flexible, and can be done in advance of publication by pulling down the files required ahead of time.
The internet should never need to match the encoding speed of a file – we don’t need an 8 Mbps connection to watch 8 Mbps HD. It can instead deliver any necessary files at any time of the day or night, and it has the broadcast channels for live events.
We may not need more than 10 Mbps (4 TBytes / month) for a considerable time, and perhaps never for applications like HD TV. If it is watched live, it is surely most efficient to use broadcast, while if it is anything other than a live event, it can be sent in advance. In any case, a file can be stored locally from either the broadcast or the internet feed.
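The off-peak delivery idea above can be sketched as a trivial scheduler. The 01:00–06:00 quiet window and the file names are illustrative assumptions, not anything from a real system: non-live files simply wait in a queue until the backhaul trough arrives.

```python
# Sketch of predictive off-peak delivery: queue non-live files and
# release them only during a quiet window on the backhaul.
# The 01:00-06:00 window is an illustrative assumption.

from datetime import time

OFF_PEAK_START = time(1, 0)
OFF_PEAK_END = time(6, 0)

def is_off_peak(now: time) -> bool:
    """True while the backhaul is in its overnight trough."""
    return OFF_PEAK_START <= now < OFF_PEAK_END

def releasable(queue: list[str], now: time) -> list[str]:
    """Files that may be transferred at this moment."""
    return list(queue) if is_off_peak(now) else []

queue = ["catchup_ep1.mp4", "catchup_ep2.mp4"]
print(releasable(queue, time(2, 30)))  # both files released overnight
print(releasable(queue, time(20, 0)))  # nothing moves at peak time
```

A real implementation would add deadlines and partial transfers, but even this crude gate captures the point: flexible-delivery traffic can be shifted entirely into capacity that is already paid for.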
For whatever reason, Multicast is not enabled on BT’s core network. This limits the use of a technology that offers an alternative to more capacity.
Why has it not been used? Is it because no attention has been paid to how multicast interconnection might work and how an aggregated network of active access components could deliver this, perhaps alongside a network of passive access components?
Multicast is another natural monopoly layer to the network like fibre (and other passive components). Unfortunately in a competitive market such as we have created, we lose the ability to cooperate for the greater good – multicast is one example of how this can be highly inefficient.
Consideration needs to be given to natural monopolies and whether it is necessarily a bad thing not to have competition. It may be that by trying to encourage competition (so that competition delivers investment), we may actually be killing any potential return on the value of the investment.
Question 2: Do you agree with the principles outlined for regulating next generation access?
This is a hard question to answer directly because the SRT principles are to a large extent “motherhood and apple pie”. What is missing is the detail as to how these principles will be employed.
i) promote competition at the deepest levels of infrastructure where it will be effective and sustainable;
Um, yes, OK. But what do you mean? What is this level of infrastructure, how do you define effective and how do you define sustainable? Effective and sustainable to whom – e.g. someone who already has X, Y and Z, or a new entrant?
ii) focus regulation to deliver equality of access beyond those levels;
I think you need to specify the levels. I think there should be regulated equality of access to the following network elements: backhaul, local loop, space in exchange, space in cabinet and local hands because these are natural monopoly resources that any operator would need to build their own service.
Others might agree with the overall principle, but may disagree with some of these levels: the devil is in the detail, and I think this consultation needs to base its principles at least one level of detail further down if it is not to gain a meaningless agreement.
iii) as soon as competitive conditions allow, withdraw from regulation at other levels
Ditto. What do you mean by “as soon as competitive conditions allow”? I think you also need to describe the process for end-of-lifing a regulation, much as you would sunset a product. How is this requested, contested and implemented?
iv) promote a favourable climate…
Hard to say no to this one…
v) accommodate varying regulatory solutions for different products and where appropriate different geographies
This one on the other hand, is hard to agree with.
There should be consistent regulatory solutions, not varying solutions. One of the biggest barriers to investment is uncertainty over alternative technologies and the treatment they get compared to the one that ABC co is interested in. Spectrum usage and future licensing of the analogue TV band threatens to offer an alternative to fixed NGAs depending on how/where it is implemented.
It also seems that licence terms can change, completely altering the basis on which investments either in this or another technology were based. The change in PCCW’s Now licence is just one example.
Every different technology should be treated equally. If there is an open access obligation on one, then there should be on all others or the situation is somewhat absurd. This may be uncomfortable to migrate to, but is surely a base principle that helps us remove uncertainty.
Such treatment should be technology agnostic, but the base principles in rural Wales (for all technology) need not be the same as the base principles (for all technology) in central London.
Question 3: How should Ofcom reflect risk in regulated access terms?
I think that the anchor product approach is correct, but I agree that definitions need to be tight. There are fundamental weaknesses in both other suggestions.
Price setting is too open to obfuscation – what is a cost, how is it attributed by product and how does a fixed price investment get reflected in a monthly recurring, usage based price? BT are the world-leaders in this… Cost-based pricing has turned into a game of confusion and hidden charges and that is bad for business everywhere.
Removing the burden to justify the price and going with a free market approach to monopoly pricing would remove much of the gaming that goes on, but is unlikely to lead to an attractive result. Price skimming is a clear risk of this approach.
One major change I would like to see is the elimination of usage (or capacity) based charging on monopoly components, because it is a distortion and one of the major weaknesses of the wholesale market. The backhaul cost per end-user rises for the wholesale customer while it remains flat for the wholesale monopoly provider, because backhaul is priced on usage even though the build cost (the dig and the passive components) is largely fixed.
At the same time, the ability of service providers to manage the volumes that drive these bills is limited to throttling. There are a number of alternative options, some of which require monopoly components like collocation space deeper in the network. More needs to be done to help ISPs manage the impact of usage on their costbase.
The cost of components required to build a competitive infrastructure should be fixed (per customer) or the monopoly carrier needs to provide tools that the service provider can use to manage usage – hosting space deeper in the network and traffic shaping ability – to help manage variable backhaul costs, on a similar anchor product basis.
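To make the asymmetry concrete, here is a toy model of the two cost curves described above. All figures are invented for illustration; they are not real BT tariffs.

```python
# Toy model of the backhaul cost asymmetry: the monopoly provider's cost
# is dominated by a fixed build, while the wholesale customer pays per
# unit of usage. All numbers are illustrative assumptions only.

def monopoly_cost_per_user(users, fixed_build_cost=100_000):
    """Per-user cost for the network owner: the dig and passive build are
    fixed, so the cost per user FALLS as the user base grows."""
    return fixed_build_cost / users

def wholesale_price_per_user(gb_per_user_per_month, price_per_gb=0.50):
    """Per-user monthly backhaul bill for the wholesale customer: priced
    on usage, so it RISES as average usage grows."""
    return gb_per_user_per_month * price_per_gb

if __name__ == "__main__":
    # As average usage triples, the wholesale bill triples, while the
    # monopoly's underlying cost per user is unchanged.
    for gb in (5, 20, 80):
        print(gb, "GB/month ->", wholesale_price_per_user(gb))
```

The point of the sketch is simply that the two parties face opposite cost dynamics from the same growth in usage, which is the distortion the text argues against.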
Major contracts like the award of local government network contracts can go a long way to provide the basis for a profitable investment in the network. This raises a multitude of issues that need to be considered.
Should these requirements be brought to the front of the rollout queue in order to provide a revenue base for the NGA network? Should these contracts default to the anchor tenant as part of a set of “soft guarantees” to the ROI? How can the effect of these indirect subsidies be measured and should they be controlled?
The award of government contracts that use a potential NGA network needs to be at the front of people’s minds when assessing the issue, as there are a number of very significant policy implications here.
It is important to note that the different treatment of NGA versus LGA (Last Generation Access) applies only because the NGA replaces the LGA before the end of the LGA’s natural economic life. This is not the case with new builds, where there was never an LGA.
Any preferential treatment of NGA investment should only apply where it replaces an LGA.
Question 4: Do you agree with the need for both passive and active access remedies to promote competition?
Passive access is defined in the document as direct access to physical elements like ducts, fibre or copper loops, while active access refers to products based on the physical elements (the example given is IP Stream, but also includes Ethernet handover products).
For me, this is a weakness of the framework. The active product described in the consultation document should be broken down further into:
1. the services which hand over at Layer 2 – eg. point to point data, backhaul
2. the services which hand over at Layer 3 – eg. IP Stream
Because passive capacity is physical and the Layer 2 product virtual, I don’t believe they can be grouped together. But I do not believe either that Layer 2 products can be analysed in combination with a Layer 3 option, because there is the potential for genuine competition in Layer 3 active products for perhaps 75% of homes.
This is not the case with Layer 2 competition, because there are very few operators with their own passive network backhaul to the exchange. The Layer 2 service is a critical component in LLU, alongside a number of passive access products, but the passive capacity needs to be shared between service providers, and that is where Layer 2 offers an intermediate option to Layer 3 access.
Furthermore, layer 3 (routed) products can bring very different economics than layer 2 (point to point) services, especially in combination with deeper routing in the network and locally cached content described earlier in the document. It must be possible to provide a wholesale alternative offering that includes a fully managed UK IP network. This is certainly a different market from the provision of an Ethernet-based backhaul circuit.
There are three subsets of the market: passive (layer 1), data link (layer 2 and L2TP) and active (routed layer 3) components. LLU consists of passive and data link components while IP Stream today is an unrouted L2TP service so it would be a data link service. There is much scope for BT Wholesale to offer a routed Layer 3 service in the future as an alternative to IP Stream and this must be considered.
IP Stream Lockdown is a Major Market Distortion
For a period of time, IP Stream made it all happen. Now though, capacity based charging has flipped the business on its head – operators make more money the less a customer uses their services. The incentive to ISPs is to throttle growth in usage.
While cost per unit throughout the industry drops with increases in usage (because of scale gains), IP Stream has been locked in an uncompetitive position so as to enable the creation of a new generation of unbundler.
The time has come to rebase IP Stream prices to a competitive level, because the creation of LLU competition in the cities has hurt the price of IP Stream access in the villages. The creation of LLU has widened the digital divide, not just by improving service in the cities but by making the service worse for those who cannot get LLU.
IP Stream is a vital element in a competitive market and services based on it need to be able to compete with LLU, but this competition is distorted today. IP Stream should be allowed to resume competition, even if it is at the expense of LLU.
IP Stream, Cross Subsidies and USO
The problem for IP Stream competing with LLU is that IP Stream needs to blend its costs between where it does and where it does not compete with LLU. A model whereby IP Stream is cheaper in locations where it has competition is a clear exploitation of the monopoly where it does not have such competition. There needs to be one price.
But this raises the key question: why should rural users be subsidised entirely by BT IP Stream users in the cities? It would seem that there should be a more equitable contribution to universal service.
All LLU circuits and IP Stream circuits combined should contribute equally to delivering a basic universal service where there is no LLU, and to the competition needed to make that happen.
Should BT Retail do LLU?
The perversion of the BT Retail / Wholesale split is that BT Retail is locked to Wholesale and the base IP Stream network. While it would clearly be cheaper for BT Retail in isolation to do what every other major player did and unbundle, it did not, because it and Wholesale are both part of the BT Group.
So far, this is an internal BT issue, but there is an external effect because the prices that BT Wholesale customers pay are locked to those that BT Retail pays.
Clearly, for Retail to unbundle would probably be more costly to BT Group as a whole, but the choice highlights the artificial position of BT Wholesale and the prices it offers. Charging excess profits to BT Retail is easy because there is no competition for that account, and those prices can then also be applied by Wholesale to other monopoly markets.
The role of BT Wholesale and its supply relationship with BT Retail needs to be investigated and boundaries set. Does BT Retail have to treat BTW as its network supplier? Does BTW have to offer other ISPs an equivalent product to what they offer BT Retail? Everywhere or just where there is a monopoly?
This whole area comes down to a question of what pricing principles are adopted. These principles will determine the competitive state of the wholesale markets.
Passive elements and the Layer 2 components that are necessary in LLU should be available at fixed prices that are certain for the period during which the market is determined to exist. The same principles should be applied to a number of additional natural monopoly components like collocation space and local hands.
BT Wholesale should buy these components at the same rates as 3rd parties and should be free to set IP Stream prices as it sees fit, as long as: a) the price paid by BT Retail is the same as the prices charged to other ISPs; b) scale discounts are negligible; and c) the price of IP Stream where there is LLU is the same as the price where there is not.
In order to achieve c), there needs to be a Universal Service Charge applied to all broadband lines and distributed to subsidise the development of infrastructure and price equality in areas where LLU competition will not deliver this improvement.
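The arithmetic of such a charge is straightforward. Here is a hypothetical worked example; every figure (line counts, subsidy gap) is an invented assumption, purely to show how a flat per-line levy would be derived.

```python
# Hypothetical Universal Service Charge (USC) arithmetic.
# All figures below are assumptions for illustration, not real data.

total_broadband_lines = 15_000_000   # assumed: all lines, LLU + IP Stream
rural_lines = 3_000_000              # assumed: lines with no LLU competition
subsidy_per_rural_line = 5.0         # assumed: monthly price gap vs LLU areas

# Fund needed each month to equalise rural prices with competitive areas.
monthly_fund_needed = rural_lines * subsidy_per_rural_line

# Spread equally across every broadband line, LLU and IP Stream alike.
usc_per_line = monthly_fund_needed / total_broadband_lines

print(f"USC per line per month: £{usc_per_line:.2f}")
```

Under these made-up numbers a charge of £1 per line per month would fund the gap; the mechanism, not the figures, is the point.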
Question 5: Do you consider there to be a role of direct regulatory or public policy intervention to create artificial incentives for earlier investment in NGA?
Emphatically not. The DSL evolution is only four years old and it cannot be possible to conclude that we need intervention or artificial incentives for earlier investment. Doing so would damage the value of investment in DSL and perhaps perversely even damage the rollout of NGA because of uncertainty about where, when and on what basis subsidy would be awarded.
At this stage in the market, regulatory and public policy needs to concentrate on the edges of the existing market – mitigating the Digital Divide – and not on widening it by making the rich richer.
What are the targets?
Ofcom needs to start addressing the issue of fair access to existing technologies for all before it starts to address the issue of access for a limited few to an NGA. In fact, there may even be a case to deliberately delay an NGA investment and to channel the funds to level the existing playing field before the “haves” again accelerate off into the distance.
There is a role here for either Ofcom or for Government to set out what this basic universal service requirement is. This should be done and those excluded dealt with before effort is put into NGA.
There should be a common baseline (speed, capacity, price) that everyone can expect: a universal access product available for an affordable price (TBD). Everyone should have access to a baseline.
There are however public and regulatory policy positions which are unclear and are perhaps confusing the case for NGA. Unfortunately, this consultation is one of those initiatives clouding the horizon because it raises the possibility of public subsidy.
This consultation, having asked the question, needs to put that question to rest in the final response – what are the direct and indirect subsidies that will and won’t be factored into the NGA investment case? How will technologies that compete with NGA, like wireless and LGA, also be regulated?
There should not be any artificial incentives in areas where LLU has proven that competition is viable. There needs to be a very clear statement on this subject, including where subsidies will be considered and where they will not and how indirect subsidies (as discussed earlier) will be applied.
Step by Step Upgrades
It’s a cliché, but we should beware of trying to eat the elephant whole. NGA is simply too big a question to put into one bucket, so the components need to be broken down and a passage plotted that takes us to where we want to be.
There is simply no point in upgrading access capabilities unless other elements of the network are in place to cope – you just shift the bottleneck and spend a lot of money for little return.
There are choke points in the infrastructure today, but these are being alleviated by investment. It was not long ago we were referring to a crisis in backhaul, but fibre is now being installed to connect even some very small exchanges (sub 1,000 homes).
Before we upgrade access, we need a number of key components: we need data centres with power, and we need them much deeper in the network. We need a commercial framework that takes care of the broadband incentive problem, and we need stability.
It is worth noting that cable and optical networks are typically configured differently, aggregating traffic in a ring rather than using a point-to-point topology like DSL from the exchange.
By | November 15, 2007 |
This will end in tears…
Who Plays the Piper?
Bureaucracy is supposed to be subservient to politics. At least that is what I thought, but we have somehow arrived at a model where an unelected body – The Commission – feels powerful enough to make a play for control over telecoms policy and regulation throughout the union.
The European Commission has publicly questioned whether the UK’s top telecoms regulator has any idea what he’s talking about when it comes to, er, telecoms regulation.
In a response to Ofcom chief executive Ed Richards’ letter to the Financial Times today, the commission said: “We note that Mr Richards takes position against a ‘central European telecoms regulator’. In this respect, we really wonder what Mr Richards is talking about.”
That was two weeks ago. Yesterday, the European Telecom Market Authority was announced.
“A more European regulatory approach is particularly justified in telecoms. After all airwaves know no borders. And the internet protocol has no nationality” said José Manuel Barroso, President of the European Commission.
The problem is that the deck is already loaded in Reding’s favour. The debate is set to be heard in the European Parliament, not at a national level, and it seems that the decision will never be sensitive enough (or the implications well understood enough) to make it worthwhile burning political points for the sake of the retention of sovereignty over such complicated matters.
The European Politics of Centralisation
The Eurocrats have skillfully crafted the words, using a hard sell to consumers promising lower prices and yet leaving enough ambiguity over what this means in practice.
It is clear to those who look however, that Viviane Reding is seeking to become the regulator, with underlings in existing national bodies tasked with implementing the central decrees. Call it a “super-regulator” or a “central European telecoms regulator” depending on how diplomatic you want to be, but if it has control over policy, that is what it is.
The grounds for such a bid are based on that over-arching principle of western civilisation: competition. It is argued that only through centralisation can the same grounds for competition be set in each member state.
Does anyone else see the irony of the clash between this and the other pillar of western political thought: democracy? If you leave it to the locals you end up with policies which favour local companies - so you have to dictate from the centre.
What’s in it for me?
The public is again being promised cheaper prices.
“The planned changes are designed to offer consumers cheaper broadband services and phone calls from fixed line and mobile handsets, the Commission also argues”, reports the BBC.
You mean to tell me that all this is necessary to reduce mobile roaming charges?!? You must think I’m stupid. If you want to achieve that simple goal, do what you did in May and mandate a cut in pricing.
What is this Really About?
This is about imposing a political philosophy on how the telecoms market should be structured. This philosophy says that monopolies are intrinsically bad and should be broken up.
This isn’t even about free markets anymore. The markets are in no way free to evolve because of constant interference, if not from government or local regulation, increasingly from the centre. Structural separation is the new creed and the crusade to impose it has begun.
For me this is hugely dangerous because as with any philosophical argument, your starting point determines your conclusion. I have a real problem with all of this because I cannot say which model is best, and the benefit of local control is that we are seeing a huge number of different variations played out and each can evolve from their own starting point.
When you reach perfection in policy, go ahead and impose it. Until then, diversity in the gene pool is good.
What is really needed, as I highlighted in my Dear Ofcom article, is a consistent set of objectives for regulators to meet with the policy levers already at their disposal. As described in that article, the broken bit is the bit that determines what Ofcom (and presumably other national regulators) are supposed to be achieving.
These are the elements that are left to politicians in the current model. There are weaknesses with that approach, not least the lack of expertise in government on what is undoubtedly a complex and specialised market. A further weakness is the short time frame under which politicians naturally operate and their need to “make announcements” to grab the attention of the people who decide their fate.
Perhaps that is where The Commission can play a role. By laying out targets for digital inclusion, defining minimum service criteria and so on, they can effectively set the baseline so that European citizens can expect the same level playing field. They can also take the long term view and set targets over many years.
What they do not need to get involved in is the “how” question, or the policy answers for achieving the goals. If they try to, it will end in tears, because there is such diversity in the baseline – how can you set a common policy for the range of countries between The Netherlands on one side and Bulgaria on the other?
At a time when the market needs certainty to enable the proper evaluation of investment in fibre, this is a bad move. It is based on enforcing an ideology that is at best, unproven in telecoms.
It is probable that The Commission will get its own way, because the issues are complex beyond the understanding of the people who have to make the decisions on whether to adopt the proposal. It centralises power, and yet the people deciding whether to adopt it are those that stand to benefit from such centralisation.
“From today onwards, a single market without borders for Europe’s telecoms operators and consumers is no longer only a dream,” said José Manuel Barroso, President of the European Commission. “Telecoms is a field where our single market can bring about very concrete results for every citizen in terms of more choice and lower prices, whether for mobile phones or for broadband internet connections. At the same time, a single market with 500 million consumers opens new opportunities for telecoms operators – if Europe helps to ensure effective competition and consistent rules of the game. This is why we act today.”
There is no such thing as a single European Telecoms Market and there never will be. The theory supporting competition for goods and services within Europe assumes the portability of those goods and the production and labour that goes into making them. A telecoms network is not portable – with the exception of mobile roaming – so you cannot go and buy your phone service from Germany if the suppliers there are cheaper than those at home because infrastructure is local.
What is needed is a common set of social and economic objectives – the “what” – but what I can see is a desire to control policy – the “how” – and this will be bad for telecoms users. The idea that The European Commission can make one policy that suits the needs of users operating on widely diverse historical networks is, I’m afraid, just wrong.
By | November 13, 2007 |
Guess what? Just as Sky used football to drag people through the doors, football again promises to bring content into the Brave New World of Fibre to the Home.
21st Century Television
We all know that deep down, the major driver for fibre is TV over the internet. It is the one internet application which smashes through the limits imposed by DSL today. Nothing else comes close to the bandwidth requirements of running a multi-channel HD home.
But what are you going to watch that isn’t already on the Sky or Virgin platforms, especially when you consider that these will be able to deliver video on demand by adding local storage into your set top box? It’s the niche stuff of course, that is perhaps so small that it doesn’t warrant a slot among the top 599 TV channels that can get space on the Sky platform.
Where are the Niches?
The problem with so many niches is that they are made up of small scatterings of people geographically distributed over the country or the world. And, in order to make a geographically dispersed niche model make commercial sense, you need to combine perhaps thousands of tiny niches until you get the volume of one mass market play. The volume, of course, is central to making the investment work.
So when you see a localised niche that may in fact resemble a local monopoly, you have to look at things slightly differently because it allows you to concentrate the investment. You may still have a very small niche in national terms – perhaps 10,000 people - but if they all live next door to each other, the small number doesn’t matter so much as when you have to lay fibre to 25 million households to get your 10,000 punters.
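The build economics behind that claim can be sketched in a few lines. The per-home cost and subscriber counts below are assumptions chosen only to show the shape of the comparison, not real deployment figures.

```python
# Back-of-envelope comparison: a niche of 10,000 subscribers, dispersed
# across the whole country vs concentrated in one new-build area.
# The cost per home passed is an invented assumption for illustration.

cost_per_home_passed = 500  # assumed fibre build cost per home (£)

# Dispersed niche: you must pass ~25m households to reach 10,000
# scattered subscribers.
dispersed_build = 25_000_000 * cost_per_home_passed
dispersed_per_subscriber = dispersed_build / 10_000

# Concentrated niche: the same 10,000 subscribers all live next door
# to each other, so you only build to them.
concentrated_build = 10_000 * cost_per_home_passed
concentrated_per_subscriber = concentrated_build / 10_000

print("Dispersed:    £%.0f per subscriber" % dispersed_per_subscriber)
print("Concentrated: £%.0f per subscriber" % concentrated_per_subscriber)
```

Whatever the real unit costs, the ratio between the two cases is what makes a geographically concentrated niche investable where a dispersed one is not.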
The Ebbsfleet Niche
We know that Ebbsfleet is getting fibre to the home as part of the new build taking place in the valley. Today we found out that Ebbsfleet United has been bought by a football fans’ community website MyFootballClub. While it is quite possible that this is the worst way to run a football club (although I would argue that Spurs run them close), it is not all about winning. When you are in a community, taking part is far more important.
While this may seem like a gimmick to many, the idea of a community owning a football team is extremely powerful, even if that community is not (currently) centred in Ebbsfleet itself. While MyFootballClub is all about football, it is a new media organisation that cannot fail to see the verticals that it may be able to address now that it has a focal point in the Ebbsfleet community.
Oh, and did I mention that Ebbsfleet is getting fibre? Yes, but it is worth reiterating…
Football as the Killer TV Application
We know football was the killer application for SkySports; it is not a stretch to imagine a 21st-century version that includes social networking and the ability to vote for who plays on Saturday being the killer application for Ebbsfleet’s next generation media. We like to have our say these days, and MyFootballClub promises fans will vote on who plays and who doesn’t. Audience participation is why the X Factor is such a success, even if (not unlike Ebbsfleet United) we know we are watching second-rate performers.
The really interesting stuff though is the offshoots that MyFootballClub can develop. Ebbsfleet United currently attracts an average attendance of 1,000 people to every home game, while the stadium in Northfleet holds 3,000 or so. There are about to be another 10,000 households moving to the area, all with fibre and all with a desire to know more about the area they have arrived in.
The football club used to be at the heart of local communities. While I am not arguing that a run down Blue Square Premier Division club, lying 9th in the table of England’s 5th division is ever going to rival Newcastle United, Rangers, Liverpool or Celtic, they do offer something new, something fresh for an audience that is attracted to the new town by promises of a town designed and built for the 21st century.
It has to be Local
When TeleBusillis (rest its soul) and I got together to write about how a hypothetical ISP in Leeds might be able to fight the big boys, we quickly concluded that the only way forward was to hook into the local community and take advantage of what would otherwise be a weakness – our small, local presence.
Ebbsfleet is slightly different in that the opportunity for residents afforded by the fibre connection to their homes is obvious, but because it is so avant-garde, they may find content simply doesn’t evolve to meet their needs until the rest of the country catches up with their infrastructure. Certainly, there aren’t too many big-media outlets that would develop content just for a small subsection of their target market (not least because it might risk alienating the other 99.x% who can’t get it).
So, for a while the opportunity is local and only local players with defined boundaries can really play hard. Whether or not the community can agree on who to pick to play right-back won’t matter if the community can also get a rich off-shoot of locally flavoured services as a result.
Garry Kasparov may have beaten The World Team because of the old adage that too many cooks spoil the broth, but that didn’t stop that 1999 game being described as “the greatest game in the history of chess“. This is the networked world – it may not be any better, but we are all a part of it.
Topics: FTTH | Comments Off
By | November 12, 2007 |
Now that the buzz caused by The Minister’s involvement has dissipated and we have all had a little time to rationalise our positions, the time has arrived to take a stance and write our responses to Ofcom’s consultation into “Future Broadband – Policy Approach for Next Generation Access“.
My Starting Point
So what do I think? Perhaps more importantly, why should you care..? After all, I have no real stake in the result – I don’t represent anyone other than jpenston.com trading as The IP Development Network. This is not a ghost written piece advancing the interests of a telco and I am not employed by anyone to lobby for a particular outcome.
As an independent consultant, my role might be more tuned to working out how to deal with whatever the outcome is, rather than trying to say what I think that outcome should be. But I have a view, and anyone who has met me or has read my blog over the last 12 months or so, will know that it is very difficult for me to keep quiet when I have an opinion.
Objectives and Objectivity
So what is the objective of the consultation? I understand that Ofcom want to trigger a “widespread debate”, but a debate about what? And more importantly, a debate with what boundaries? Perhaps at the highest, UK Plc level, you might see the debate being about:
- Does faster internet access correlate with economic expansion?
- Does slower internet access correlate with reduced global competitiveness?
- Does internet access need to be at a certain level to deliver social benefits?
- Is there a digital divide?
- If there is, is the digital divide a problem?
- Is there a right to high speed internet access?
- Is infrastructure a natural monopoly?
- What level of responsibility should the state take for infrastructure?
- What responsibilities does the incumbent have to the country?
- What is the role of competition?
- How can the various “shades of grey” between 8, 9 and 10 be managed?
- How should government contracts be awarded to avoid distorting the market?
Flaws in the Structure
There is a serious flaw in the discussions that I have heard on the NGA subject: some believe that many of these issues are outside of the scope of this consultation… When I attended the BSG review of the consultation, one of the common themes was that “addressing the digital divide is not Ofcom’s role”.
Those that made such statements were clearly experienced regulatory lawyers and I don’t know enough to say whether they are right or wrong, technically. But actually I don’t care for the minutiae of whether this is in or out of scope – it stinks of a culture where people are afraid to say what they think in case they tread on someone else’s toes. Corporate politics only leads to bad things.
What is clear to me is that any changes to Ofcom policy resulting from the consultation will determine whether the Digital Divide gets worse, better or stays the same. Anything that encourages “selective investment” will make the divide worse, with the opposite also true. Whether or not “addressing the Digital Divide” is Ofcom’s role, the divide is a natural result of Ofcom’s policies on the competitive structure of the market.
Political Objectives Should Drive Regulation
I have heard it said that this set of questions are actually for the politicians to answer. The structure that exists says that the politicians determine the objectives for Ofcom to achieve using the competition policy levers at its disposal as a regulator.
While conceptually valid, this puts Ofcom into an awkward position because perhaps it is not fair to expect the politicians to have the industry specific expertise to know what the right answers are without advice from Ofcom. It is therefore incumbent on Ofcom to advise the government on what the decisions should be which will inevitably mean asking Ofcom to question its own role.
It is therefore vital in my view that the role of Ofcom be at the centre of the NGA consultation. Avoiding this question will naturally lead to certain answers being rejected even before they have been analysed for their worth.
The Positioning of Responses
What is clear to me is that most respondents will take a much narrower view and that it will be up to Ofcom to interpret those positions to determine the answers to the Plc questions. Looking at the stakeholder positions is always going to be challenging because self-interest will immediately explain the conclusions. I suggest that the various stakeholder objectives may include the following:
- Increase “my” regulated rate of return
- Let “me” choose when and where to invest and allow me to build barriers to entry where I do
- Ensure availability of wholesale 100Mbps and allow “me” to choose where I buy it
- Ensure the infrastructure exists as an enabler for “my” services
- Ensure “I” have a choice of providers
I could go on, but you get the picture. The stakeholders are wide and varied: infrastructure owners, wholesale customers at various levels, service providers, application providers, users in schools, homes and businesses, investors…
While it may be very difficult for a commercial entity to do this, it seems necessary to me that we take collective responsibility for what is “best for us” rather than simply what is “best for me”. This is probably asking too much, but it makes decisions much harder if a filter has to be applied every time a response is read.
Such collective responsibility will involve understanding your place and what you need from others to achieve your objectives. It also means understanding what they need from you and occasionally putting short term self interest aside so that a bigger and better market results. Central to this is agreeing who makes the investments, where they get the money from and how much others should be prepared to pay as a premium for someone else taking the risk.
A key question for Virgin Media for example is how they approach NGA where they have cable compared to where they don’t…
Lessons from the Past
The historical background is important. What we have learned is that if you want to make a step change happen that makes binary changes to the country – internet access (yes or no), broadband (yes or no) – a monopoly investment is the way forward because you can cross subsidise to deliver the benefit to all.
However, once that monopoly investment is made, if you want to maximise utility, then competition is the way forward. Of course that leaves some people behind which will eventually run full circle until the point where monopoly investment (or taxation-based subsidies) is required to correct the imbalance.
We have lurched from one to the other and back again in the UK ever since competition was introduced in the 1980s, but I think we are seeing the problems that such lurches in policy cause. LLU brought with it a degree of certainty, but we can already see its flaws in the digitally excluded. One of the major hurdles to solving this is uncertainty over spectrum policy and how wireless will be used in the future.
Freeing up the analogue TV spectrum could bring a viable alternative to fixed infrastructure if it is regulated a certain way. Alternatively, it could be ring-fenced, assigned to certain niches or auctioned in an attempt to let the market itself decide. It may be that all of these are combined – which will it be? Until you know, you can’t compare that option to an LLU footprint expansion or a NGA BT Wholesale product.
Actually, what is done doesn’t matter as much as when it is done. If spectrum policy is decided before FTTx investments, then those investments can take wireless factors into account. If spectrum policy decisions are deferred, then the uncertainty injected into the LLU expansion case is only going to lead to sub-optimal decisions being made.
We need to gather together all the technology forces and try to look at the bigger picture. We need to decide what we want that bigger picture to look like and then (and only then) we should make decisions on what the best structure is to arrive at that objective. Once we make that decision, we shouldn’t tinker with it.
We are where we are
We cannot avoid the position we are in. My belief is that right or wrong, we have started down a path where the primary objective is competition. We are in an environment where our collective political economic philosophy is that competition is the best way to maximise the outcome in any given environment.
But competition can occur on many different levels and this is where regulation comes in because it artificially decides what that competition should look like. LLU was a major shift from the IP Stream (and CPS-like) model where competition was on service and not infrastructure.
LLU encouraged competitive investments deeper into the networks and allowed those that invested the chance to benefit from those investments by building scale. LLU undeniably forced the recent wave of consolidation and we now have a situation where 5 players rule the market.
Please, not again – let’s have some stability!
There would have to be very clear evidence that LLU has failed before we should alter or even tinker with the policy that last changed dramatically in 2004. I simply don’t believe that we have given this enough time to have such evidence of failure. I believe that the case for change is not proven.
This may be a very boring conclusion to some who want big money investment in fibre to the home, perhaps using some form of state aid to help oil the wheels. But “do nothing” is always a strategic option and should not be confused with inertia (where you know something needs to be done, but can’t decide what).
Do we need to “Do Something”?
I want to briefly unpick the case for “do something” as I think it is based on a False Sense of Certainty. The main false premise for me is that “we will not be able to deliver 100Mbps without massive investments in the access loop to the home”. We have seen in recent weeks how copper may not be quite as dead as we thought.
Furthermore, we conveniently ignore the fact that we use around 1% of the existing local loop capacity – perhaps because that does not sit comfortably with the far more sexy alternative of spending lots of someone else’s money.
Thirdly, we need to consider the implications of cheaper storage and the possibility that affordable terabit storage may find its way into people’s homes and businesses in a relatively short period of time (quite possibly much faster than we can lay new fibre optics). Such storage built into consumer devices would dramatically change the ability to offer on demand TV services that receive their signals from existing broadcast, internet-based multicast or even P2P services.
And last, but by no means least, we need to ask ourselves whether television really belongs on the internet. Not to be confused with video on demand, IPTV theory seems to centre around the belief that we would choose the long tail of archives over and above the current set of linear programming – if it were equally easy to select. We might well dabble occasionally, but with the popular stuff available through the storage model described above, it might be a case of 80% of the money being spent to deliver 20% of the value.
Summary & Conclusion
The Ofcom consultation does not ask the right questions. We should start with the political, social and economic requirements of the country and only then should we start looking at what to do to deliver what we want.
When we come to that second phase, we should look at all the technology options and we should lay out policy on all of them, on the same basis, at the same time. For example, if we are to mandate wholesale (open access) to fibre investments, we should do the same with cable, wireless and any other technology which achieves the same access result.
Once we have made those decisions, we should have a period of complete certainty lasting perhaps as long as 10 or 15 years during which we allow the policy time to become established and the market time to evolve. We should not tinker with it once we have established the structure. If we screw up, we should trust the market to find a workaround – if competition really is our primary goal, we should trust it to work.
That’s it for today. Over the coming weeks, I will be writing more in specific response to the actual questions in the consultation:
Question 1: When do you consider it would be timely and efficient for NGA investment to take place in the UK?
Question 2: Do you agree with the principles outlined for regulating NGA?
Question 3: How should Ofcom reflect risk in regulated access terms?
Question 4: Do you agree with the need for both passive and active access remedies to promote competition?
Question 5: Do you consider there to be a role of direct regulatory or public policy intervention to create artificial incentives for earlier investment in NGA?
By | November 9, 2007 |
It is all change here…
Firstly, my hosting provider objected when I queried an anonymous request for payment and gave me 24 hours to get off his server. After a 10 hour flirtation with Fasthost, during which time I learned that their version of subdomains didn’t include the ability to host a directory underneath it (unless I paid them 10 times more money as a “reseller”), I found a new home at EUKHost.
Meanwhile, Google Blogger’s FTP service did its best impersonation of striking French transport workers and would not publish – a problem that affected hundreds of users – and I decided to bite the bullet. Enough was enough: WordPress here I come.
The Rocket I Needed
I’d been casting envious glances at WordPress blogs ever since I hastily moved my blog from Blogware to Blogger in March. Having decided in haste back then, I have been repenting at leisure ever since, as the complexity of getting my heavily customised template into WordPress php gave me the chills. That was until Google’s FTP service went comatose and I read hundreds of group users pleading / begging / crying for help from their lords and masters at the Googleplex. Take control, Jeremy. So I did.
Google’s support is appalling. It’s not a joke and it certainly is not funny. Their help pages are shallow and they point you to the groups for support, where it seems users are left to wallow in a state close to desperation as they try to get the attention of an employee who might help. There are no obvious email support routes and god forbid they should offer support by telephone. You certainly get what you pay for with the Big G.
If you use Blogger and are prepared to let Google hold all your files, the service is ok when it works. They have been adding widgets to blogspot and custom domain services for a while, but I wanted the files on my own server. In March I lost the blog when yet another hosting company tried to charge my phone number instead of my credit card (and tried to ring my email address to tell me that payment wouldn’t go through). I decided then that come what may, I would hold the files myself and back them up to my hard drive regularly – something you can’t do with blogspot or custom domains.
There Must be Something Better
I wanted widgets. I wanted plugins. I wanted Web 2.0! Google’s FTP service is grudgingly provided and it does not give you any of these features. They want the content on the Google Cubes, not under my control on my hosting service. I wonder why…?
So I invested the time in learning the basics of php and WordPress and actually, moving the template was not nearly as hard as I expected once I found a similarly structured theme among the library of hundreds of freely available designs. All I had to do was change a few images, colours and add in a few bits and pieces that I had hacked in to the Blogger template.
Importing the content was relatively easy too, once I found a hack that enabled me to keep my permalink structure when I ran the Blogger to WordPress import (note: I had to switch Blogger from FTP publishing to Blogspot before it would let me do this). It takes a bit of courage to start playing around with the contents of php files, but I’m glad I did. Even so, there were one or two pages where the formatting ended up a bit odd, but the more I tried to fix those, the worse it got.
Happy with the Result?
So here we are, what do you think? I dropped the iPhone-inspired menu now that the novelty value of the “touch screen menu” has worn off. It feels a bit passé, I must say – ironically on the day that Apple launches the product here in the UK.
I have added a few of the widgets and plugins that I had been eyeing up. It will be a bit of trial and error for a while, but I want to see what you (readers) like. We shall see where we end up, but with the wealth of open source plugins available on WordPress, there is a lot of scope to experiment.
Understanding my Readership
You will no doubt have noticed Digg. I have a soft spot for that service and I am hoping that every now and again, you will like something you read here enough to give me a Digg. I need the feedback and I haven’t really been getting enough of it to keep up with what people are interested in me writing about.
For the same reason I have also added a rating widget, which is more anonymous than Digg and can give better qualitative feedback because you decide how many stars to give an individual article. Unlike with Digg, you don’t need an account, you just fire and forget.
You may be starting to think that I am a bit insecure because I have also added a Popularity Widget. This one though requires no input from you whatsoever. It ranks the articles based on views, numbers of comments, trackbacks (something else Blogger lacks that I wanted), etc. and tells the world how popular a piece is compared to the “most popular” on the site. It also dynamically lists the top ten in a menu on the left sidebar which might also be helpful for people who arrive and want to browse what I have written.
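The plugin’s exact formula isn’t published in the post, but the kind of weighted ranking it describes can be sketched in a few lines. The weights and figures below are purely illustrative assumptions, not the plugin’s real values:

```python
# Hypothetical sketch of a popularity score combining views, comments and
# trackbacks. The weights are illustrative guesses, not the plugin's own.
def popularity_score(views, comments, trackbacks,
                     w_views=1, w_comments=20, w_trackbacks=35):
    return views * w_views + comments * w_comments + trackbacks * w_trackbacks

def relative_popularity(article, most_popular):
    # Express an article as a percentage of the site's top scorer,
    # which is how the widget reports "compared to the most popular".
    return 100 * popularity_score(**article) / popularity_score(**most_popular)

top = {"views": 5000, "comments": 40, "trackbacks": 10}   # made-up figures
post = {"views": 1200, "comments": 8, "trackbacks": 2}

print(f"{relative_popularity(post, top):.0f}% as popular as the top article")
```

Sorting all articles by this score and taking the first ten would give the “top ten” sidebar list the widget maintains.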
Also in the left sidebar you will find another experiment, again based on Digg. This could crash and burn horribly or it could make the site more dynamic – I don’t know. What it does is list the top ten articles on Digg’s universe that have been dugg by “my friends”. Since I don’t have any friends at the moment, this could be a disaster, but for a while at least, I’m prepared to give it a whirl and see if anyone wants to contribute.
I am hoping that this will allow all readers to share interesting articles with each other. I know that ipdev.net is a small niche and I am expecting that the very bright people who read it, also read other interesting stuff. What content is going to show up? I have no idea…
Finally, there is an improved search facility, a tag cloud and a bigger and better set of options allowing you to subscribe to what I write through RSS, Bloglines etc. My Feedburner stats show that my readers use 37 different news aggregators between them, which is why every Tom, Dick and Harry that ever built an RSS Reader is listed there. I’m sure to offend someone now though, having missed off their fave…
Beginning of the End for Google?
So back to my friends at Google. So long and thanks for all the (free) fish, but my experience with them is also worth sharing in a wider context. We have all heard over the last couple of weeks about Open Social, the Google (software stack on a) Phone and the Google PC. In the recent past we have also seen the emergence of gmail and Google Docs and hundreds of small acquisitions like Jaiku.
These initiatives “beyond search” have certainly put the frighteners on all parts of the industry from Telco to Microsoft, but I wonder whether we are seeing the company expand too far beyond its area of core competence. All these new avenues take them closer to environments where they have to interact directly with customers and provide them with help.
My experience of Google’s help is very poor. Even by the low standards set by the industry in general, Google are bad. The sort of like it or lump it, or even the F-you attitude shown by the company I was hosting with until last weekend is bad for business and sooner or later, it may come back to haunt them.
Are they Starting to Panic?
Furthermore, Google seem to be in an undue hurry and to have neglected key security considerations: OpenSocial got hacked. It seems that it was announced to take the wind out of Microsoft’s sails after their Facebook investment, which to me sounds more like the cry of a jilted lover pawning the engagement ring to buy a flashy pair of shoes.
Some commentators initially went as far as to call it a Google checkmate when MySpace joined the party, but I believe that you cannot really compare a service with an established set of 3rd party applications to one that has simply “announced” an initiative. OpenSocial has a long way to go before it is real – announcing products before they are ready is the sure sign of a company with a closer eye on its stock price than its operational performance.
The same is true of the gPhone. Perhaps the hype didn’t come from Google, but they did nothing to quell it and what resulted was a very damp squib when we learned that it was simply a software stack. Again, this might have perhaps been more of a defensive measure against Apple than the product announcement that gFans wanted. Whatever comes of the initiative, we won’t be seeing it until next year.
Operating systems, user applications and in particular the provision of 700MHz-based access services will necessitate a quantum leap in Google’s ability to thoroughly develop products and support trouble tickets to deliver what customers expect. Sure they can make money from ads, but can they fix a problem with your software or your access if you have one? If they can’t, we might see a growing number of people saying goodbye to Google.
By | November 5, 2007 |
O2 is predicting that they will sell up to 200,000 iPhones over Christmas and the New Year here in the UK. On the face of it, that is quite an astonishing claim, but please folks, bear in mind this is telecoms where the words “up to” are very, very important.
Let me also make a prediction
Prices will fall after Christmas.
You would have to be a millionaire to afford the launch prices! There are approximately 450,000 millionaires in this country, and that may be where the 200,000 sales number comes from. Perhaps that many people are insensitive to price and just want an iPhone to brag about? I don’t know…
Christmas is of course a big factor in all this. Apple have historically sold 45% to 55% of a year’s total iPods (by volume) in the October to December quarter and Christmas is when people are most prone to irrational exuberance. Of course, skimming the market is a standard entry strategy, which Apple used in the US iPhone launch where they cut the price after just two months.
But you have to wonder how many people are going to treat themselves (or someone else) with a present worth upwards of £899 over 18 months. Let me simplify that – £50 a month, for a phone and a contract giving you 200 mins and 200 texts.
If you get an iPhone for Christmas, smile sweetly and say thank you, but do please ask whether the £35+ a month bill is also covered. It may not be very diplomatic but it’s kinda’ important so as not to wake up with a big hangover on New Year’s Day.
For £30 a month you can get a FREE Nokia N95 with 400 mins and 500 texts, although whether the shop would open specially for you to get one on a Friday evening is debatable. Perhaps the iPhone is marketed to the strong silent type? Or to Billy-No-Mates with no-one to call?
Now, the N95 may not be “The iPhone”, but it is £359 cheaper, and gives you twice as much talktime and up to 5,400 more texts included in the price. Ladies and gentlemen, Friday truly sees the launch of a premium product.
Are we going to buy it?
I know all the sceptics in the US had their words forced straight back down their throats and those same people have had to endure an even smugger than usual Steve Jobs announcing stupendous financial results for the Apple group. It would be a fool who says that it won’t happen here…
It won’t happen here.
Is that the headstrong non-conformist in me? Or even the intractable recalcitrant? (Don’t worry, my mother called me worse…) Maybe it is. After all, I am not putting any money on my views. I can write what I like and the worst that can happen is that I get tagged as yet another false prophet of doom predicting the premature death of the next big thing.
Perhaps I had better explain myself
Firstly, and by way of qualification, I am not saying that the iPhone will never catch on. I am saying that it won’t catch on at these prices. The iPhone is going to be able to carry a price premium over the competition, and a substantial one at that, but maybe 15-25% is what I have in mind. I’m going to play a little game here:
Let’s just say a minute has a nominal value of 1 groat and a text the same. On that basis, the above N95 package is worth 900 groats a month, the iPhone contract 400 groats. Over the course of an 18 month contract, the N95 costs £540, the iPhone £899. Correcting for price and minutes/texts the iPhone carries an astonishing 275% premium over the Nokia N95. Ok, so I haven’t taken into account the WiFi and Unlimited Data or any under usage of the fatter N95 plans, but this is a blog and not an analyst’s research note so you get the picture.
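For anyone who wants to check the back-of-envelope sums, here is the groat arithmetic as a minimal sketch, using only the figures quoted above (free N95 at £30/month with 400 mins and 500 texts, versus £269 up front plus £35/month with 200 mins and 200 texts):

```python
# Back-of-envelope comparison: value each minute and each text at 1 "groat",
# then compare cost per groat over an 18-month contract.
MONTHS = 18

# (total cost over the contract in GBP, groats of allowance per month)
n95 = (30 * MONTHS, 400 + 500)           # free handset, £30/month -> £540, 900 groats
iphone = (269 + 35 * MONTHS, 200 + 200)  # £269 up front + £35/month -> £899, 400 groats

def cost_per_groat(total_cost, groats_per_month):
    return total_cost / (groats_per_month * MONTHS)

premium = cost_per_groat(*iphone) / cost_per_groat(*n95) - 1
print(f"iPhone premium per groat: {premium:.0%}")  # roughly 275%
```

The same caveats apply as in the text: WiFi, unlimited data and under-usage of the fatter N95 plan are all ignored.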
Money Up Front
Factor number 2 in my sceptical forecast is the £269 up-front charge. Without any real evidence to back this up, I believe there is a cultural ocean between American consumers and Brits when it comes to paying up front. Even discounting the Scots among us, we are a tight-fisted bunch – witness how popular Free Internet has been here. It might look nice and have a fancy touch screen, but TWO HUNDRED AND SEVENTY QUID? You’re having a laugh, mate. We like to spread the cost, hence the mountain of credit card and mortgage debt here…
Finally, factor number three (again it is price based like the first two). We don’t like a superiority complex, and although we may secretly covet premium brands, if they are out of reach – particularly because of perceived greed – social envy can quickly turn “yuppie” into an insult. There is a delicate balancing act here for Apple to create aspiration and not kill it by stretching it too far.
So what is the conclusion?
Firstly, if you want an iPhone and don’t want to look like a yuppie, wait until the prices come down. Secondly, don’t bet on anything like the success in America being repeated here but thirdly, don’t think that an inability to hit volume targets is necessarily a bad thing for Apple.
The iPhone is not a volume play and its value is at least in part attributable to its niche positioning. This will change in time as it becomes more affordable, but by then the value will have been enhanced by the period during which the majority could only enviously disparage the lucky few.
Jobs and co know what they are doing. They have their partners where they want them to the point that you have to wonder whether it will really be Apple that funds the price cuts when they do come. You see, their interests are not necessarily in line…
Apple makes more selling 250 thousand iPhones at 4 times cost (250 x 3C) than they do by selling half a million at twice cost (500 x C). On the other side O2 make more simply by pumping more units because their costs are largely sunk in the network.
200,000 units would be an almighty achievement for O2 over the seven and a bit weeks between launch and New Year’s Day. It took 10 weeks to sell 1 million in the US, a country with a GDP 5.6 times greater than the UK. Two hundred thousand does sound a bit like wishful thinking, particularly with the added uncertainty in the market of the Google (Software Stack on a) Phone… My guess? 65,242, but what do I know.