Understanding the technology behind audio streaming can be a challenge. A share of the problem lies in the acronyms used to describe particular equipment and procedures, so the best way to grasp what is going on is to define some of the critical terminology.
Let’s begin with the phrase audio streaming. True streaming occurs when one listens to music in “real time,” rather than by means of progressive streaming, which saves the data as a file for later listening. A live stream is transient: on music services such as Spotify, it is available for only one listen, as it happens.
With true streaming, an audio file is delivered to you in small “packets” so the data can be buffered on your device and played almost immediately. As long as a steady stream of these packets reaches the device, playback continues without interruption.
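The buffering just described can be sketched as a small queue: packets are held until a safety margin has accumulated, then played in arrival order. This is a hypothetical illustration, not any real player's logic; the packet names and the three-packet threshold are invented.

```python
from collections import deque

class PlaybackBuffer:
    """Toy playback buffer that absorbs network jitter (illustrative only)."""

    def __init__(self, min_packets=3):
        self.queue = deque()
        self.min_packets = min_packets  # buffer this many packets before playing

    def receive(self, packet):
        # Packets arrive from the network and are queued in order.
        self.queue.append(packet)

    def ready(self):
        # Playback starts only once a safety margin has accumulated.
        return len(self.queue) >= self.min_packets

    def play_next(self):
        # An empty queue at play time would mean an audible interruption.
        return self.queue.popleft() if self.queue else None

buf = PlaybackBuffer(min_packets=3)
for i in range(3):
    buf.receive(f"packet-{i}")
started = buf.ready()   # True: enough data buffered to begin playback
first = buf.play_next() # packets play in the order they arrived
print(started, first)
```

A real player refills the queue continuously while playing; if the network falls behind, the queue empties and the listener hears a rebuffering pause.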
The source of the traveling data is a streaming server (1). On the listener’s side, a stand-alone player or a browser plugin acts as the decoder. Typically the streaming server works alongside a Web server: the listener visits a Web page, and choosing a file there sends a request to the streaming server. The server then sends the file directly to the listener, bypassing the Web server. Together, the server, the data stream, and the decoder allow the listener to hear live and prerecorded broadcasts.
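The handoff above can be modeled in a few lines: the Web server only hands back a reference to the stream, and the streaming server delivers the audio itself. All names and URLs here are hypothetical, and the two servers are stand-ins for real network services.

```python
# Hypothetical catalog held by the Web server: track id -> stream address.
CATALOG = {"show-42": "rtsp://stream.example.com/show-42"}

def web_server_page_request(track_id):
    # The Web server sends no audio; it only tells the player where
    # the streaming server can be reached for this track.
    return {
        "player_page": f"http://www.example.com/listen/{track_id}",
        "stream_url": CATALOG[track_id],
    }

def streaming_server_deliver(stream_url, n_packets=3):
    # The streaming server sends the audio packets directly to the
    # listener, bypassing the Web server entirely.
    return [f"{stream_url}#packet-{i}" for i in range(n_packets)]

info = web_server_page_request("show-42")
packets = streaming_server_deliver(info["stream_url"])
print(info["stream_url"], len(packets))
```

The key point the sketch captures is the division of labor: page delivery and audio delivery travel over separate paths.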
There is truly an art to designing a network to support streaming media. The User Datagram Protocol (UDP) sends the audio stream in the form of the small packets mentioned above. UDP does not guarantee delivery, so if data are lost it is the responsibility of the receiving application to cope, with the help of the Real-time Streaming Protocol (RTSP), the Real-time Transport Protocol (RTP), or the Real-time Transport Control Protocol (RTCP) (2).
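A minimal sketch of UDP delivery can be run on the loopback interface. Real audio streams wrap each payload in an RTP header carrying a sequence number and timestamp so the receiver can detect loss and reorder packets; here a two-byte sequence number fakes that idea. The port is chosen by the OS and the payload is a placeholder, not real audio.

```python
import socket

# Receiver: bind a UDP socket on loopback; the OS picks a free port.
recv_sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
recv_sock.bind(("127.0.0.1", 0))
recv_sock.settimeout(5)
port = recv_sock.getsockname()[1]

# Sender: fire three datagrams, each prefixed with a sequence number,
# loosely imitating what an RTP header provides.
send_sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
for seq in range(3):
    payload = seq.to_bytes(2, "big") + b"audio-bytes"
    send_sock.sendto(payload, ("127.0.0.1", port))

# Receiver: UDP itself gives no delivery guarantee; a gap in the
# sequence numbers is how the application would notice a lost packet.
received = []
for _ in range(3):
    data, _ = recv_sock.recvfrom(2048)
    received.append(int.from_bytes(data[:2], "big"))

send_sock.close()
recv_sock.close()
print(received)
```

On loopback the packets arrive in order; over a real network the application must tolerate gaps and reordering, which is exactly the job RTP sequence numbers and RTCP feedback are designed for.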
The streaming files start out as very large and, one hopes, high-quality files referred to as raw files. These files then undergo compression using an audio coding format such as MP3, Vorbis, AAC, or Opus. The encoded audio is assembled into a container bitstream such as MP4, FLV, WebM, ASF, or ISMA. The bitstream is then delivered from the server to a user’s device with Internet-communication capability (2).
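A little arithmetic shows why that compression step matters. The figures below use standard CD-quality parameters for the raw file and a common (though not universal) 128 kbit/s MP3 bitrate for the compressed stream.

```python
# One minute of uncompressed CD-quality stereo audio.
sample_rate = 44_100   # samples per second
bit_depth = 16         # bits per sample
channels = 2           # stereo

raw_bits_per_second = sample_rate * bit_depth * channels
raw_mb_per_minute = raw_bits_per_second * 60 / 8 / 1_000_000
print(round(raw_mb_per_minute, 1))   # -> 10.6 MB per minute, uncompressed

# The same minute as a 128 kbit/s MP3 stream (a typical setting).
mp3_mb_per_minute = 128_000 * 60 / 8 / 1_000_000
print(round(mp3_mb_per_minute, 2))   # -> 0.96 MB per minute
```

Roughly a tenfold reduction, which is the difference between a stream that fits comfortably in an ordinary connection and one that constantly stalls.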
Rules, referred to as protocols, also govern how this traffic moves across the Web and can be used to balance the load across servers.
- Wilson, Tracy V. “How Streaming Video and Audio Work.” HowStuffWorks. http://computer.howstuffworks.com/internet/basics/streaming-video-and-audio2.htm. Retrieved September 24, 2016.
- “Streaming media.” Wikipedia, the free encyclopedia. https://en.wikipedia.org/wiki/Streaming_media.