Broadcasting Your Audio: Part 2

In the final instalment of his two‑part Net Notes series about broadcasting your audio and video on the Internet, Dave Shapton discusses error correction and quality control, helps you prepare for your webcast, and suggests some solutions for broadcasting live events.

Last month we looked at the basic mechanisms for transmitting real‑time digital media (audio and video) over the Internet. What we found was that the Internet isn't like a small network. Small networks are easy to broadcast on because everyone sees every little chunk of information. They are simple but inefficient: since everyone has to share the available bandwidth (the capacity of the network to move data around in a given time), the effective speed of the network diminishes in proportion to the number of people using it. The Internet simply wouldn't work with the number of people who use it today if it were a simple, unbranched network. The downside, as we'll see, is that setting up your own Internet radio or television station takes rather more than simply being connected.

As I mentioned last month, the only way for so many people to use the Internet in an efficient way (or even one that works at all), is to actively route data between the sender and the recipient.

No Room For Error

Strictly, there is no difference between digital media data and ordinary data — it's all ones and zeros. What is different, though, is the consequence of losing some of the data. If you're downloading a computer program, or even a data file (such as a word processing or spreadsheet document), there's no room for error. Let's say you've downloaded an update to Cubase — any error, however small, even a one‑bit error in ten million, will most likely stop the program from working altogether. Think of a computer program as a list of directions in a strange town. It's just a set of instructions. So if you're told to go left, left, right, left, right, right and you get any one of these directions wrong, you'll end up in the wrong place.
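
To make the 'one bit in ten million' point concrete, here's a small Python sketch (the file contents and the position of the flipped bit are invented for illustration) showing that a single flipped bit changes a download's checksum entirely, so any installer that verifies its integrity would refuse to run:

```python
import hashlib

# A hypothetical downloaded "program": 1.25 million bytes, i.e. ten
# million bits of information.
data = bytearray(b"\x42" * 1_250_000)
good_digest = hashlib.sha256(data).hexdigest()

# Flip just one bit somewhere in the middle of the file.
data[600_000] ^= 0b00000001

# The checksums no longer match: one bit in ten million is enough to
# make the whole download fail an integrity check.
bad_digest = hashlib.sha256(data).hexdigest()
print(good_digest == bad_digest)  # False
```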

The languages, or 'protocols', that the Internet uses are designed to take account of how critical the data is. The most widely used protocol suite, TCP/IP, contains procedures for checking the integrity of received data and can even arrange for missing or corrupt packets to be resent. It works extremely well: even with a slow computer and the dodgiest analogue phone line, it's still perfectly possible to download programs and files with confidence, albeit very slowly. The recent release of the free version of the Be operating system required a massive 60MB download, but, if you were prepared to foot the phone bill, there was no reason why you shouldn't have had an error‑free transfer, even though that download contained 480,000,000 bits of information!

Losing even one bit of information when downloading the 60MB BeOS would stop it from working. So what would happen if you lost some data while watching a digital video stream? Well, nothing much: you'd probably see the picture jerk or part of the scenery turn into Lego blocks. And, of course, if it happened halfway through a streamed transmission, it couldn't affect the bit you've already seen!

Guaranteed data integrity is a luxury we can — and have to — do without when streaming. The reason we have to do without it is speed. Error correction works in several ways. Some of it is single‑ended: data can be reconstructed at the receiving end from extra (or redundant) information intermixed with the original data. That extra data, though, takes time to send and receive, and therefore reduces the overall (unique) data bandwidth; as a consequence, the quality of any streamed media would suffer (see the box below).
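
As a sketch of how such single‑ended redundancy can work, here's a minimal XOR‑parity scheme in Python (an illustrative toy, not the actual algorithm any streaming protocol uses): one extra parity packet lets the receiver rebuild any single lost packet without asking the sender for anything.

```python
from functools import reduce

def xor_parity(packets):
    """XOR equal-length packets together into one redundant packet."""
    return reduce(lambda a, b: bytes(x ^ y for x, y in zip(a, b)), packets)

data_packets = [b"chunk-01", b"chunk-02", b"chunk-03"]
parity = xor_parity(data_packets)  # the extra data sent alongside

# Suppose the second packet is lost in transit. The receiver still
# holds the other two plus the parity packet, and can reconstruct the
# missing one entirely at its own end.
recovered = xor_parity([data_packets[0], data_packets[2], parity])
print(recovered == data_packets[1])  # True
```

The cost is exactly what the text describes: one packet in four carries no unique media data, so the redundancy eats into the bandwidth available for the stream itself.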

Quality Control

The protocols used to stream digital media on the Internet take the view that what's happened has happened, and it's more important to concentrate on what's going on now. The protocols are therefore simpler and, because there is less handshaking between the transmitting and receiving ends, there is more bandwidth available for the actual media data, resulting in a better overall quality.

Talking of quality, there is no real way to ensure that your Internet broadcasts are received with a consistent quality. Network experts usually depict the Internet as a cloud in diagrams. It's a bit vague, but it's probably appropriate because the Internet is actually very non‑deterministic: in other words, the Internet is so complex that it's almost impossible to say what state it will be in at any given time. All you can talk about is the sort of performance you can expect on average. This is completely at odds with the requirements for transmitting digital media in real time!

The word 'stream' itself implies something that flows at a steady rate. If you look at how audio and video are transmitted conventionally, you'll see that they require an absolutely constant transmission rate: CD audio at 44,100 samples per second (44.1kHz) and video at 25 (or 30 in the US and Japan) frames per second. Any deviation from this (such as jitter) results in gross audio and video distortion.
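
To put a number on what 'absolutely constant' means for CD audio, here's the arithmetic (assuming the standard 16‑bit samples in stereo):

```python
# The constant data rate a CD-quality stream must sustain.
sample_rate = 44_100      # samples per second, per channel
bits_per_sample = 16      # standard CD word length
channels = 2              # stereo

cd_bits_per_second = sample_rate * bits_per_sample * channels
print(cd_bits_per_second)  # 1411200 bits, every second, without fail
```

Over 1.4 million bits have to arrive every single second, on time, which is precisely what a non‑deterministic network finds hardest to promise.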

But the Internet does not guarantee any sort of Quality of Service (QoS). QoS guarantees are available through network architectures such as ATM (the basis of much of the worldwide communications network), and the next generation of the Internet Protocol (IPv6) may provide mechanisms for more consistent bandwidth, but I don't think we're going to see anything like guaranteed service across the Internet for a number of years yet.

Back To Broadcasting

Now, we've already established that you can't just connect yourself to the Internet and expect everyone in the world to be able to hear or see your broadcast. Deep in the Internet protocol specifications there is a broadcast mode, but it only works on local area networks: since broadcast traffic would essentially flood the entire Internet, most routers simply refuse to forward it.

So, back to a very basic question: what do you do if you want millions of people to watch your broadcast? Well, the first thing you have to figure out is how many people you want, or are likely, to receive your webcast. Then you have to look at some very simple calculations. If you are just making streaming media files (as opposed to a live stream) available over the Internet, you have to make sure that your connection to the Internet is fast enough to accommodate a new stream for each user. So if your media is configured to stream at 20kbits/second, and you have ten simultaneous users, you will need a 200kbits/second connection to the Internet. In fact, you'll actually need a rather faster link than that because of the inevitable 'overhead': the data bandwidth that is wasted in administering the connection.
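
The calculation above can be sketched in a few lines of Python (the 20 percent overhead factor here is an illustrative assumption, not a measured figure; real overhead varies with the protocol):

```python
def required_uplink_kbps(stream_kbps, users, overhead=1.2):
    """Back-of-envelope uplink estimate: one stream per simultaneous
    user, padded by an assumed 20% for connection admin overhead."""
    return stream_kbps * users * overhead

# Ten listeners on a 20kbit/s stream: 200kbit/s of raw media
# bandwidth, plus the overhead, comes to 240kbit/s.
print(required_uplink_kbps(20, 10))  # 240.0
```

Run the same sum for a thousand users and the answer is 24Mbit/s, which makes it obvious why hosting this yourself gets expensive quickly.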

Not many of us have a fast Internet connection, and ADSL is not necessarily the answer because the 'A' stands for 'asymmetrical': uplink speeds (the rate at which you can send information to the Internet) are typically much slower than downlink speeds. You must also have a fixed (as opposed to dynamically assigned) IP address, and it looks as though most ADSL schemes will not allow this.

Still, that may be a blessing in disguise because, unless you can afford a massive Internet setup, it's probably best to let your Internet Service Provider take responsibility for getting your media on the Net. You must, however, check whether your ISP is able and willing to host streaming media. And don't assume the answer will always be yes.

Resending Corrupt/Missing Data

The ultimate form of error correction is not really correction at all but is actually the process of requesting that any corrupt or missing data is resent. The efficiency of this depends on the quality of the link: if there's a good link, then no replacement packets need to be resent. As the quality of the link drops, the amount of resending can increase exponentially to the point where the whole thing seizes up completely. This just wouldn't work with a real‑time stream and the problem is worse than it seems — you might think that if you're only missing one packet in a hundred then the amount of data re‑transmitted would only amount to about one percent. But what goes on behind the scenes is actually a two‑way conversation along the lines of "Did you get that packet?" "No." "Shall I resend it?" "Yes please." "Did you get it this time?" "Yes thanks." "Did you get the next packet?" and so on. (This is a huge over‑simplification, and these conversations actually happen at a very low level and bear little resemblance to the pseudo‑English version above: but the principle is correct). You can imagine how much time interactions like this can take.
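
A toy stop‑and‑wait simulation makes the cost of that conversation visible (the timings, loss rates, and one‑packet‑at‑a‑time model are invented for illustration; real TCP is far more sophisticated):

```python
import random

def transfer_time(packets, loss_rate, rtt=0.1, send_time=0.01, seed=1):
    """Simulate a naive 'did you get that packet?' exchange: every
    packet waits for an acknowledgement, and a lost packet repeats
    the entire send-and-wait round trip."""
    random.seed(seed)
    total = 0.0
    for _ in range(packets):
        while True:
            total += send_time + rtt      # send, then wait for the reply
            if random.random() > loss_rate:
                break                     # acknowledged; next packet
    return total

print(transfer_time(100, 0.0))   # a clean link
print(transfer_time(100, 0.2))   # a poor link: markedly slower
```

Every packet pays the round‑trip wait even when nothing goes wrong, and each loss repeats that wait in full, which is why this approach is hopeless for a real‑time stream.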

Broadcasting Live Events

What we have been talking about, so far, is 'on demand' streaming. With a small number of simultaneous users, it's quite possible that small or non‑specialist ISPs will let you do it. When thousands of users want access to the same file, it can get expensive, but the solutions are straightforward. To allow thousands, possibly millions, of users simultaneous access to the same material, you have to duplicate the media on what are called Caching Servers. These are physically separate computers that are available to stream the same material. Most caching server setups use network load‑balancing software and hardware to ensure that if you have ten servers, they are accessed evenly.
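
The simplest load‑balancing policy, round robin, can be sketched like this (the server pool and its names are hypothetical):

```python
from collections import Counter
from itertools import cycle

# A hypothetical pool of ten caching servers, each holding a copy of
# the same streaming media file.
servers = [f"cache-{n:02d}.example.net" for n in range(10)]
rotation = cycle(servers)

# Hand each incoming viewer to the next server in rotation, so the
# load spreads evenly across the pool.
load = Counter(next(rotation) for _ in range(1000))
print(set(load.values()))  # every server gets exactly 100 viewers: {100}
```

Real load balancers are cleverer than this — they weight servers by current load and route users to nearby caches — but the principle of spreading identical copies across many machines is the same.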

For me, Internet broadcasting gets really interesting when we start talking about live events. It's possible, with the aid of a few computers, to send live material over the Internet. The normal arrangement would consist of a web server, a streaming server, and an encoding computer. The web server, which hosts the web site referring the user to the live webcast, can be anywhere in the world, or indeed right next to the streaming server — it really doesn't matter. Web servers and streaming servers are normally on separate machines because you wouldn't want heavy access to the streaming material to stop other users getting onto your website. The encoder computer has an audio and/or video capture card in it. Encoding software then sends data to the streaming server for presentation to the Internet.

The encoder and streaming server don't necessarily have to be apart from each other, though. In fact, one of the best live streaming solutions I've seen so far is the Stream Genie (ideal for David Bowie concerts!) from Pinnacle. It's a one‑box solution: a 'lunch‑box' computer with six audio and video inputs and an Ethernet socket as its output, ready for connection to the Internet. In addition to encoding facilities, the Stream Genie has a built‑in video mixer that can dissolve (what video people call a crossfade) between any two video sources. Remarkably, it can also accomplish 3D transitions, like page‑turns and pond‑ripples, in real time as your material is being broadcast. Possibly most useful is the facility to generate titles and captions for superimposing on the video stream. It comes with a small Mackie mixer to handle audio. With it you can just turn up at a venue, set up your microphones and cameras, and start streaming.

Pinnacle also provides a solution to the awkward question of how you get your live material to millions of people. Using a dedicated site called Cast Connect, you can set up a live webcast. All you have to do is book a time, and give a few details, including your credit card number. Then you plug your Stream Genie into the Internet, via your local ISP, and millions of people can tune in to your live concert. It's not cheap, but then what other method of transmitting audio and video live to a potential audience of millions is?

There are other, more generic solutions to live broadcasting and you might want to check out companies such as Vewcast.com, who make capture cards for streaming, and Akamai and Edgecast, who provide streaming infrastructures like Cast Connect.

Next month I'm going to look at multicasting, preparing your media for web streaming, and MPEG‑4 Structured Audio, a method of streaming audio with perfect (I mean perfect) quality at very, very low bandwidth. But there's a catch...