The space is heating up – possibly because it's increasingly clear that P2P may be a required technology for extremely large file transfers like broadcast-quality TV programs. In fact, Robert Cringely, in a March post, pretty much sums up the problem:
“Twenty million viewers, on average, watch “Desperate Housewives” each week in about 10 million U.S. households. That’s 210 megabytes times 10 million downloads, or 2.1 petabytes of data to be downloaded per episode. Fortunately for the download business model, not everyone is trying to watch the show at the same time or in real time, so iTunes, in this example, has some time to do all those downloads. Let’s give them three days. The question on the table is what size Internet pipe would it take to transfer 2.1 petabytes in 72 hours? I did the math, and it requires 64 gigabits-per-second, which would require an OC-768 fiber link and two OC-256s to fulfill.
There isn’t an Internet backbone provider with that much capacity.”
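Cringely's back-of-the-envelope numbers check out. A minimal sketch of the arithmetic, assuming decimal units (1 megabyte = 10^6 bytes), which is what makes his figures line up:

```python
# Sanity check of Cringely's numbers, using the values from the quote above.
episode_mb = 210            # episode size in megabytes (decimal)
downloads = 10_000_000      # one download per viewing household
window_hours = 72           # the three-day delivery window he allows

total_bytes = episode_mb * 1_000_000 * downloads
total_bits = total_bytes * 8
gbps = total_bits / (window_hours * 3600) / 1e9

print(f"Total data: {total_bytes / 1e15:.1f} PB")       # → 2.1 PB
print(f"Sustained rate: {gbps:.1f} Gbps")               # → 64.8 Gbps
```

So roughly 64–65 gigabits per second, sustained for three straight days, for a single episode of a single show.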
Of course this assumes that everyone wants to start watching at different times, so a broadcast model won't work – but that seems reasonable given that everything else is going that way. It also assumes that broadcasters might be interested in making highly popular programs available this way – which is questionable.
As Marc Gerstein points out in his March review of TV on demand, moving to an on-demand model hurts networks in a number of ways:
The audience gets harder to measure, which hits ad revenue.
It also makes strategic programming more difficult, if not impossible.
Still, P2P has an attraction for networks: if time shifting does drive TV viewing, P2P at least weakens the hand of distributors by decentralizing distribution into millions of hands, with no distinct voice arguing for a share of the pie.
Add to that that P2P opens up a long tail of content that can attract some viewers back to TV, and you can see the value.
Now if I could just get invited to the beta.