HUH, Good god y'all. What is it... wait I think I've used this joke before...
One of the problems with the discussion around the NBN is the quality of the technical and economic arguments being put forward. The idea that economics will somehow magically dissolve physical and mathematical limitations is fanciful at best. So here are some of the limitations of the materials and technologies in question.
First, let’s talk about wireless. To find the maximum bit rate of a wireless channel, we look to Shannon’s theorem (the Shannon–Hartley theorem). Note that this theorem says nothing about “given certain economic conditions” or anything of the sort. It’s straight-up maths.
This is why modems work at the speed they do. Currently, the Wikipedia entry states:
When calculated, the Shannon capacity of a narrowband line is Bandwidth * log_2 (1 + P_u/P_n), with P_u/P_n the (linear) signal-to-noise ratio. Narrowband phone lines have a bandwidth from 300-4000 Hz, so using P_u/P_n=1000 (SNR = 30dB): capacity is approximately 35 kbit/s.
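The calculation quoted above can be reproduced in a few lines (the 300–4000 Hz band and 30 dB SNR are the figures from the quote):

```python
import math

def shannon_capacity(bandwidth_hz, snr_db):
    """Shannon-Hartley channel capacity in bit/s."""
    snr_linear = 10 ** (snr_db / 10)  # convert dB to a linear power ratio
    return bandwidth_hz * math.log2(1 + snr_linear)

# Narrowband phone line: roughly 300-4000 Hz usable, SNR around 30 dB
capacity = shannon_capacity(4000 - 300, 30)
print(f"{capacity / 1000:.1f} kbit/s")  # -> 36.9 kbit/s
```

Note this is a hard ceiling: no codec, modulation scheme, or market incentive gets you past it on that channel.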
That is to say, a ye olde phone-line modem operates near the theoretical peak. There is no way to go faster on that medium without switching technologies. When you look at the theoretical maximums for ADSL2+, realistically only about half of connected users get even 10 Mbit/s. And that’s in cities. Could better technology be used? Given that the graph shows signal attenuation over distance, I doubt it (unless you live less than 900 meters from the exchange).
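To see why distance, not technology, is the bottleneck, we can feed Shannon’s formula an SNR that falls off with loop length. The starting SNR and the per-km loss below are illustrative guesses (chosen to roughly match real-world ADSL2+ behaviour), not measured line parameters:

```python
import math

def adsl_capacity_mbits(distance_km, bandwidth_hz=2.2e6,
                        snr0_db=55.0, loss_db_per_km=13.8):
    """Rough Shannon bound for an ADSL2+ line whose SNR decays with
    loop length. snr0_db and loss_db_per_km are illustrative values."""
    snr_db = snr0_db - loss_db_per_km * distance_km
    snr_linear = 10 ** (snr_db / 10)
    return bandwidth_hz * math.log2(1 + snr_linear) / 1e6

for d in (0.5, 1.5, 3.0, 5.0):
    print(f"{d:.1f} km -> {adsl_capacity_mbits(d):.1f} Mbit/s")
```

Whatever exact numbers you plug in, the shape is the same: capacity collapses as the copper run gets longer, and no amount of cleverer electronics at either end changes that.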
For wireless, WiMAX is probably the technology we ought to go with, and you can bet it’s skirting Shannon’s limit. While it offers a theoretical peak of 1 Gbit/s (with the annex), that’s like a tiny PC speaker boasting 1000 W PMPO: look closer and it’s a little 3 W speaker. Offered over the kinds of distances, and serving the number of customers, that would make it cost-competitive with copper or fibre, we’re looking at maybe 1 Mbit/s at most. And that’s before we start talking about QoS (which in many cases matters more than bandwidth, for example for phone calls). Worse, because the spectrum is shared not only between users but also with electrical machinery, wireless technology means ever-flaky internet access.
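The headline-versus-reality gap comes down to the fact that a wireless cell’s capacity is shared. A back-of-the-envelope sketch (the sector capacity, user count, and overhead fraction are all hypothetical numbers, picked only to show the arithmetic):

```python
def per_user_mbits(cell_capacity_mbits, active_users, overhead=0.3):
    """Average throughput per active user on a shared wireless cell.
    The overhead fraction (contention, retransmits, signalling) is a
    guessed figure, not a measured one."""
    return cell_capacity_mbits * (1 - overhead) / active_users

# Hypothetical WiMAX sector: 40 Mbit/s shared among 100 active users
print(f"{per_user_mbits(40, 100):.2f} Mbit/s each")  # -> 0.28 Mbit/s each
```

Divide a realistic sector capacity among enough customers to make the economics work, and the per-user figure lands around the 1 Mbit/s mark or below, exactly as above.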
Second, let’s talk about technology, specifically the kind of “future-looking” technologies we could be using but aren’t. The argument goes that we don’t need more bandwidth than we currently have for things like HD video conferencing, because current codecs can encode at bit rates low enough to be good enough. To see why this doesn’t hold, one need only look back at YouTube and how it succeeded. Early YouTube was similar in quality to RealPlayer. The difference was that RealPlayer used more sophisticated codecs that could work over dial-up, while YouTube required broadband. That is, YouTube’s codecs were far worse than today’s.
The reason is that Flash, which YouTube was built on, added video as an afterthought, using outdated video codecs. Even then, it was more convenient than RealPlayer or any competing video technology. As a result, huge amounts of bandwidth were wasted, and now YouTube is a media behemoth. Only now does Flash have more efficient codecs, which let YouTube go HD and otherwise stop wasting bandwidth. By the arguments used against the NBN, YouTube could have existed without broadband. It couldn’t, because the codecs simply weren’t in the right place at the right time for the right price. Many new companies simply cannot use the most efficient codecs to squeeze into current bandwidth limitations.
Third, on to uploads. Right now, the “A” in ADSL seems to be as big a problem as the copper infrastructure itself. With a paltry 3.5 Mbit/s maximum upload speed, and modern uses of the internet such as uploading videos, music, or Facebook photos, or voice and video calling, you can see that roughly 450 KB/s isn’t really cutting it any more. When services like Dropbox or Amazon’s S3 let you keep practically your entire hard drive in the cloud, a half-a-megabyte-per-second cap on uploads is quite severe.
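To make the asymmetry concrete, here is a small sketch of how long a large upload takes at different uplink speeds (the file size and the fibre uplink figure are illustrative; protocol overhead is ignored):

```python
def upload_time_minutes(size_mb, uplink_mbits):
    """Time to push a file of size_mb megabytes over an uplink of
    uplink_mbits megabits per second, ignoring protocol overhead."""
    return (size_mb * 8) / uplink_mbits / 60

# A 700 MB video over a 1 Mbit/s ADSL uplink vs a 40 Mbit/s fibre uplink
print(f"ADSL:  {upload_time_minutes(700, 1):.0f} min")   # -> 93 min
print(f"Fibre: {upload_time_minutes(700, 40):.1f} min")  # -> 2.3 min
```

An hour and a half versus a couple of minutes is the difference between “I’ll do it overnight” and “done before I’ve made coffee”.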
I haven’t talked about the network effect, which is more of a social effect than a technical one, but it’s a prime example of why you can’t use a cost-benefit analysis to justify, say, a telephone network. Similarly, you can’t do one for the NBN.
I also haven’t talked about peak download speeds versus streaming download speeds. Most people against the NBN argue that current download speeds are good enough for many kinds of streaming, but none talk about throughput-bound data (that is, data whose usefulness depends on throughput, such as photo uploads or web-page downloads). With the internet landscape looking the way it does, web applications should define the future of internet usage, and these need high throughput at particular moments: it’s what makes your computer feel “snappy” or “sluggish”. For example, when you go to Gmail, the “loading” screen is effectively the entire application downloading to your PC; until that application is there, you can’t do anything. Similarly, at critical points such as saving data or switching application context, there is real potential for wasted productivity. Video-editing and high-resolution photo-editing web sites are arguably stymied by the kind of bandwidth currently provided, simply because of the user experience.
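The “snappy versus sluggish” point can be sketched with some rough arithmetic. The bundle size, round-trip count, and latency below are all hypothetical figures, chosen only to show how first-load time scales with throughput:

```python
def app_load_seconds(payload_mb, throughput_mbits, rtt_ms=50, round_trips=6):
    """Very rough first-load time for a web app: raw transfer time plus
    a handful of request round trips. All parameters are illustrative."""
    transfer = payload_mb * 8 / throughput_mbits
    return transfer + round_trips * rtt_ms / 1000

# A hypothetical 5 MB application bundle at three connection speeds
for mbits in (1.5, 12, 100):
    print(f"{mbits:>5} Mbit/s -> {app_load_seconds(5, mbits):.1f} s")
```

Below a few megabits per second the transfer term dominates and the app feels broken; past a certain speed, latency takes over and extra bandwidth stops mattering, which is exactly why throughput-bound applications need headroom, not “good enough for streaming”.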
So there it is. I’m not pro- or anti-NBN, because it’s basically a policy and I’m only interested in the technology. But I dislike how people talk about the “magic of the markets” somehow solving bandwidth issues that are skirting the laws of physics.