Data streaming, as it works today, lets a single machine send real-time data to a number of clients. That number is limited by bandwidth constraints, since bandwidth is expensive (especially on the upload channel). The resulting problem is that individuals or small organizations can't set up reliable streaming servers: they can't afford the costs, so they can only serve a very limited number of clients.
Where am I going with this? Well, two friends and I have been working on a small project (a university assignment) to improve this situation. The idea is to apply the peer-to-peer networking scheme to (multimedia) data streaming, so that each client also acts as a server. This way, the number of clients is theoretically unlimited; however, reception latency increases, and you may suffer data cuts if a node drops out (the time needed to find and connect to another peer).
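To make the "each client also acts as a server" idea concrete, here is a minimal sketch (hypothetical names, not our actual code): every peer that receives a chunk of the stream immediately forwards it to the peers connected below it, so the source only ever uploads to a handful of direct peers, no matter how big the audience grows.

```python
class Peer:
    def __init__(self, name):
        self.name = name
        self.downstream = []   # peers fed by this one
        self.received = []     # chunks seen so far

    def connect(self, child):
        self.downstream.append(child)

    def receive(self, chunk):
        self.received.append(chunk)
        for peer in self.downstream:   # act as a server for the peers below
            peer.receive(chunk)

# The source feeds only two peers directly; everyone else gets the
# stream relayed through the tree.
source = Peer("source")
a, b, c, d = (Peer(n) for n in "abcd")
source.connect(a)
source.connect(b)
a.connect(c)
a.connect(d)

source.receive(b"chunk-0")
source.receive(b"chunk-1")
print(d.received)  # the leaf got both chunks without touching the source
```

Of course, the real thing has to deal with sockets, buffering, and peers disappearing mid-stream (which is where the latency and data cuts mentioned above come from), but the topology is essentially this.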
While testing our application, we set up a server on a 56 kbit/s line and were able to connect more than 5 clients to it (yeah, it's a non-impressive number, but we didn't have more machines to test with), each one receiving a constant data flow of (almost) 4 KB/s. Serving them all directly would take 20 KB/s of upload bandwidth, impossible to reach on our telephone line (which barely sustains the flow needed by a single stream).
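The back-of-the-envelope math behind that claim:

```python
# Figures from our test; the "one direct peer" value is illustrative.
clients = 5
stream_rate_kb = 4            # KB/s per client
line_capacity_kb = 56 / 8     # 56 kbit/s line is at most 7 KB/s

direct_upload = clients * stream_rate_kb   # server feeding every client itself
print(direct_upload)          # 20 KB/s needed, far over what the line can do

# With peer-to-peer relaying, the server only feeds its direct peers,
# regardless of the total audience size.
direct_peers = 1
p2p_upload = direct_peers * stream_rate_kb
print(p2p_upload)             # 4 KB/s, which the line can (barely) sustain
```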
BTW, the application is network-transparent, so it works with existing applications and protocols. At the moment it only handles SHOUTcast, as that's the protocol we focused on, but it should be extensible to others without much trouble.
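Transparency is easier than it sounds for SHOUTcast, because the server answers with an HTTP-like "ICY" response followed by raw audio data: a relay only needs to pass the headers through and keep forwarding bytes. A rough sketch of the header handling (the field names come from SHOUTcast responses; the relay logic itself is omitted):

```python
def parse_icy_headers(raw: bytes):
    """Split a SHOUTcast-style response into its ICY headers and the
    start of the audio payload, which a relay can forward untouched."""
    head, _, body = raw.partition(b"\r\n\r\n")
    lines = head.decode("latin-1").split("\r\n")
    status = lines[0]                      # e.g. "ICY 200 OK"
    headers = {}
    for line in lines[1:]:
        key, _, value = line.partition(":")
        headers[key.strip().lower()] = value.strip()
    return status, headers, body

# What a client (or one of our peers) sees from the upstream server;
# a transparent relay hands this on unmodified and keeps streaming.
raw = (b"ICY 200 OK\r\n"
       b"icy-name: Some Station\r\n"
       b"icy-br: 32\r\n"
       b"\r\n"
       b"...mp3 data...")
status, headers, body = parse_icy_headers(raw)
print(status, headers["icy-name"])
```

Since the client never sees anything but a normal SHOUTcast response, any existing player keeps working unchanged, which is what makes the whole thing network-transparent.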
I don't know if we will make the code available (I have nothing against it, and I don't think my friends will mind either). The thing is, I'm afraid it would stagnate, just like our previous university-related project (Vigipac) did.
So, if you find the idea interesting, check out Peercast, a more mature application. Hmm... why did we reinvent the wheel? Well, we only discovered it a few days ago, when the project was already finished.