I have mounted a D-Link DCS-923L IP-cam in a nesting box to watch birds nesting and their home life. The IP-cam is a simple surveillance camera with night vision, which allows it to capture video in complete darkness, and it is equipped with a built-in web server that serves the live video feed. I set up the camera last spring and configured my firewall to allow friends and family to watch the birds. More and more people watched the feed, and when the chicks were about to leave the nest (apparently the most popular part of nesting) my internet connection was completely choked. This year I figured I had to set up a mirror that could take the load off my network, and I set out to find a solution. I found Bambuser, a free live video streaming service; why not use it?
So, how do you connect an IP-cam to Bambuser? It turned out that it cannot be connected directly, but with some extra work it is possible, almost for free.
The first problem was to figure out how the camera delivers the video. There is no information in the manuals or in the admin GUI; you can set resolution, frame rate and quality, but not much more than that. When watching the live feed in a browser you are presented with the two options “ActiveX” and “Java”, which is not very helpful when it comes to understanding the video format. After some reverse engineering I discovered that the feed is Motion JPEG and can be fetched from the URL http://camera/MJPEG.CGI.
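A Motion JPEG feed over HTTP is just a multipart stream of complete JPEG images, so splitting it into frames is simple. Here is a minimal sketch; the boundary string `--myboundary` and the frame bytes are invented for illustration (a real camera announces its boundary in the Content-Type response header):

```python
# Sketch: splitting a Motion JPEG (multipart) HTTP stream into JPEG frames.
# The boundary string and frame payloads below are made up for illustration.

def split_mjpeg(data: bytes, boundary: bytes) -> list:
    """Return the JPEG frames found between multipart boundaries."""
    frames = []
    for part in data.split(boundary):
        # Each part holds some part headers, a blank line, then JPEG data
        # starting with the SOI marker 0xFFD8 and ending with EOI 0xFFD9.
        start = part.find(b"\xff\xd8")
        end = part.rfind(b"\xff\xd9")
        if start != -1 and end != -1:
            frames.append(part[start:end + 2])
    return frames

# Two fake "frames": just SOI marker + payload + EOI marker.
stream = (
    b"--myboundary\r\nContent-Type: image/jpeg\r\n\r\n\xff\xd8frame1\xff\xd9\r\n"
    b"--myboundary\r\nContent-Type: image/jpeg\r\n\r\n\xff\xd8frame2\xff\xd9\r\n"
)
frames = split_mjpeg(stream, b"--myboundary")
print(len(frames))  # 2
```

This is what avconv does for us internally when told the input format is mjpeg; the sketch is only meant to show why the format is so easy to consume.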
The next problem was to understand what Bambuser consumes. There is not much information on their site; you basically download a client to your phone and off you go. There is some information about desktop clients that can be used to broadcast video captured with a USB-cam, but nothing for IP-cams. I found a link on Bambuser where I could download a configuration (an fmle profile) to use with the Flash Media Live Encoder, and in it I found a URL that seemed to be what I was looking for. A section from the file looked something like this:
<output>
  <rtmp>
    <url>rtmp://123456.fme.bambuser.com/b-fme</url>
    <stream>xxx...xxx</stream>
  </rtmp>
</output>
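The two values in the snippet are all you need: the full publishing address is the `<url>` value and the `<stream>` value joined by a slash. A small sketch of extracting it with Python's standard XML parser:

```python
# Sketch: build the RTMP publishing address from the fmle profile snippet.
import xml.etree.ElementTree as ET

profile = """
<output>
  <rtmp>
    <url>rtmp://123456.fme.bambuser.com/b-fme</url>
    <stream>xxx...xxx</stream>
  </rtmp>
</output>
"""

root = ET.fromstring(profile)
url = root.findtext("rtmp/url")
stream = root.findtext("rtmp/stream")
destination = url + "/" + stream
print(destination)  # rtmp://123456.fme.bambuser.com/b-fme/xxx...xxx
```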
RTMP turned out to be Adobe's proprietary Flash streaming protocol (Real-Time Messaging Protocol), and another section of the file indicated that they use a video encoding format called VP6.
A nice tool when it comes to converting video is avconv (a fork of ffmpeg), so I hoped it would be able to do the job, and I found a post that confirmed it could. The plan was to use avconv to transform and package the Motion JPEG from the camera's web server into the Flash format in real time.
Raspberry Pi and avconv
All I needed was some hardware to run avconv on. As it happened, I had a Raspberry Pi that I wasn't using. Was it powerful enough? I installed avconv:
sudo apt-get install ffmpeg
The next step was to figure out the parameters for avconv. I ended up with
avconv -an -f mjpeg -r 5 -i http://camera/MJPEG.CGI -f flv -same_quant rtmp://123456.fme.bambuser.com/b-fme/xxx...xxxx
-an : no audio
-f mjpeg : format of the input stream, Motion JPEG in this case
-r 5 : 5 frames per second
-i http://camera/MJPEG.CGI : the input source, the address of my camera
-f flv : format of output, FLV format
-same_quant : use the same quantizers as in the MJPEG images. This should hopefully reduce the work for avconv.
rtmp://123456.fme.bambuser.com/b-fme/xxx...xxxx : the destination. 123456 and xxx...xxx identify the user in Bambuser and can be found in the fmle profile described earlier. The full address is the URL found in the <url> tag concatenated with the contents of the <stream> tag, separated by a slash.
A quick check on Bambuser confirmed the success!
With an image size of 640×480 and 5 frames per second, the resulting bitstream is around 120 kbit/s. The Raspberry Pi's processor runs at 100% more or less all the time…
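As a back-of-the-envelope sanity check on that number, 120 kbit/s at 5 frames per second works out to about 3 kB per JPEG frame, which suggests the camera compresses the 640×480 images quite aggressively:

```python
# Back-of-the-envelope check of the measured bitrate.
bitrate = 120_000        # bits per second, as measured
fps = 5                  # frames per second
bits_per_frame = bitrate / fps
bytes_per_frame = bits_per_frame / 8
print(bytes_per_frame)   # 3000.0 -> about 3 kB per JPEG frame
```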
I have discovered some stability issues with this solution, as avconv hangs after 15–20 minutes. I suppose this requires some further investigation, some other day.
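Until the hang is understood, a crude workaround would be to supervise avconv and start it again whenever it exits or is killed. A minimal sketch of such a watchdog (the encoder command is the one from earlier; the `max_runs` parameter only exists to keep the example finite, a real watchdog would loop forever):

```python
# Sketch: restart the encoder whenever it exits (or after it is killed).
import subprocess
import sys
import time

# The avconv invocation from earlier, as an argument list.
ENCODER_CMD = [
    "avconv", "-an", "-f", "mjpeg", "-r", "5",
    "-i", "http://camera/MJPEG.CGI",
    "-f", "flv", "-same_quant",
    "rtmp://123456.fme.bambuser.com/b-fme/xxx...xxxx",
]

def supervise(cmd, max_runs=None, delay=5):
    """Run cmd repeatedly; max_runs=None means restart forever."""
    runs = 0
    while max_runs is None or runs < max_runs:
        subprocess.run(cmd)
        runs += 1
        time.sleep(delay)
    return runs

# Harmless demonstration: supervise a no-op command a couple of times.
# (A real deployment would call supervise(ENCODER_CMD) instead.)
runs = supervise([sys.executable, "-c", "pass"], max_runs=2, delay=0)
print(runs)  # 2
```

This only helps if avconv actually exits; detecting a silent hang (the process alive but no longer pushing frames) would need something smarter, such as watching the outgoing traffic.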