Our broadcast of the iPhoneDevCamp event to as many as 1,000 remote users around the world on their iPhones has been a great success. We believe this is the first time a live event has been broadcast with audio and video directly to iPhone end users. Using the new HTTP video streaming protocol built into iPhone OS 3.0, we combined some existing technologies to pull video from a high-quality Axis camera, transcode it, chunk it into the series of small files this new protocol requires, and make it available on an iPhone-optimized web site.
We gave a presentation last night on how we set up the video infrastructure and how this new video protocol works.
The basics of our implementation are:
- An Axis HD video camera streaming H.264 over RTSP, wrapped in an MPEG-2 transport stream
- VLC grabbing the RTSP stream and re-serving it over HTTP. This feed is also split off in HD and made available to the satellite locations (running on MacBook Pro #1)
- FFmpeg transcoding the video to 320×240 for the iPhone and piping it to stdout (running on MacBook Pro #2)
- Additional FFmpeg instances running simultaneously so we can stream live and cut between events more smoothly (also running on MacBook Pro #2)
- An open-source media segmenter reading stdin and outputting the required series of small video files to the file system (running on MacBook Pro #2)
- A remote file system mounted directly from one of our servers using MacFUSE over SSH, to securely upload the files out of Yahoo
- Apache httpd serving the files to users on their iPhones (remote web server at Lextech)
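The "file based protocol" is simple: the segmenter maintains a small playlist file that lists the most recent chunks, and the iPhone polls that playlist and fetches each chunk over plain HTTP. A playlist along these lines (chunk names and durations are hypothetical) looks like:

```
#EXTM3U
#EXT-X-TARGETDURATION:10
#EXT-X-MEDIA-SEQUENCE:120
#EXTINF:10,
segment-120.ts
#EXTINF:10,
segment-121.ts
#EXTINF:10,
segment-122.ts
```

As the broadcast runs, the segmenter bumps the media sequence number and rotates old chunks out of the playlist, so the client always has a sliding window of recent video to download.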
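On the viewer side, the iPhone-optimized page needs very little: Safari on iPhone OS 3.0 plays the stream natively when the playlist URL is used as a video source. A minimal page (URL is a placeholder) might be:

```html
<!-- Illustrative iPhone-facing page; the playlist URL is hypothetical -->
<video src="http://example.com/stream/index.m3u8" controls autoplay></video>
```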
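To make the chain above concrete, here is a sketch of roughly how the pieces wire together on the command line. The hostnames, paths, codec flags, and the segmenter's argument order are illustrative, not the exact invocations we ran:

```shell
# 0. Mount the remote web server's document root locally via sshfs (MacFUSE),
#    so segments written here land on the server as soon as they are closed.
#    (user, host, and paths are placeholders)
sshfs user@example.com:/var/www/stream /mnt/stream

# 1. VLC pulls the camera's RTSP feed and re-serves it over HTTP (MacBook Pro #1).
vlc rtsp://camera.local/axis-media/media.amp \
    --sout '#standard{access=http,mux=ts,dst=:8080/live.ts}'

# 2. FFmpeg transcodes to a 320x240 MPEG-2 transport stream on stdout
#    (MacBook Pro #2), piped straight into...
ffmpeg -i http://mbp1.local:8080/live.ts \
    -s 320x240 -vcodec libx264 -acodec libmp3lame -f mpegts - |
# 3. ...the segmenter, which reads stdin, writes numbered .ts chunks of a few
#    seconds each into the mounted directory, and keeps an index playlist
#    up to date. (segmenter name and arguments are illustrative)
segmenter - 10 /mnt/stream/segment /mnt/stream/index.m3u8 http://example.com/stream/
```

Because the upload directory is just an sshfs mount, step 3 needs no separate upload script: Apache on the remote box sees each chunk the moment the segmenter finishes writing it.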