I don’t buy the argument that we should reinvent the Internet because some applications work badly on congested networks (e.g. VoIP and streamed video). My view is that:
- Users understand and accept variable quality as the price of the huge choice afforded them by the open Internet. 2.5 billion paying customers can’t be wrong.
- Most of the time, on decent network connections, stuff works acceptably well.
- There’s a lot that can be done with clever technology such as adaptivity, intelligent post-processing to conceal dropped packets, multi-path routing and so forth, to mitigate the quality losses (see the sketch after this list for a flavour of the adaptivity idea).
- As humans, we’re pretty good at making choices. If VoIP doesn’t work temporarily, we can decide to do the call later or send an email instead. Better applications have some form of fallback mode, whether by design or by accident.
- Increasingly, we all have multiple access paths to the Internet – cellular, various WiFi networks and so forth. Where we can’t get online with enough quality, it’s often coverage that’s the problem, not capacity.
- Anything super-critical can go over separate managed networks rather than the public Internet, as already happens today.
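
To give a flavour of the adaptivity point above, here is a minimal sketch, in Python, of how a streaming client might pick its next video bitrate from recent throughput measurements. The bitrate ladder, the 80% safety margin and the function names are my own illustrative assumptions, not anyone’s actual player code, but the idea (measure what the network is really delivering, then ask for no more than that) is roughly what adaptive-bitrate video players do today.

```python
# Minimal sketch of throughput-based adaptive bitrate selection.
# The ladder and safety margin below are illustrative assumptions,
# not figures from any particular player or spec.

BITRATE_LADDER_KBPS = [235, 375, 750, 1750, 3000, 5800]  # hypothetical encodings
SAFETY_MARGIN = 0.8  # only budget 80% of measured throughput to absorb jitter


def estimate_throughput_kbps(recent_samples):
    """Harmonic mean of recent download-rate samples (kbps).

    The harmonic mean is conservative: it is pulled down by brief
    throughput dips, which is what we want on a congested link.
    """
    if not recent_samples:
        return 0.0
    return len(recent_samples) / sum(1.0 / s for s in recent_samples)


def pick_bitrate(recent_samples):
    """Choose the highest rung of the ladder that fits under the budget."""
    budget = estimate_throughput_kbps(recent_samples) * SAFETY_MARGIN
    chosen = BITRATE_LADDER_KBPS[0]  # always fall back to the lowest rung
    for rate in BITRATE_LADDER_KBPS:
        if rate <= budget:
            chosen = rate
    return chosen


if __name__ == "__main__":
    # e.g. the last few segment downloads averaged ~2 Mbps with one dip
    samples_kbps = [2400, 2100, 900, 2300]
    print(pick_bitrate(samples_kbps))  # -> 750 with these made-up numbers
```

Real players layer buffer-occupancy logic and smoother switching on top of this, but even a crude rule like this degrades picture quality gracefully on a congested link rather than stalling.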