We want super-fast live streaming. We want razor-sharp images. And we want it easy and cheap. How do we do it? We livestream with Raspberry Pi and GStreamer – and we’ll show you how.
At CodeFlügel, we are currently developing a telepresence system that can be used anytime, anywhere, by anyone – and, most importantly, with any internet connection. Naturally, I can’t go blabbing about everything we’ve figured out over the past few years in this blog, but I can give you a few pointers on creating a quick and cost-efficient live stream.
Well, one of the issues we had to deal with was this: streaming a video signal from A to B with as little latency as possible. The geographical positions of the two places shouldn’t matter, the camera should be mobile and the viewer should be able to watch on a PC. Our goal was to keep the latency under 1 second. Since the camera kept moving around and the quality of the internet connection varied, we had to figure a few things out.
Prior to transmission, we needed to compress the video data, since it had to be packed into a streaming protocol or container format. At first, we thought WebRTC was the best solution, since it is hailed as the Holy Grail of live streaming. At that time, however, the technology had only basic support, and even now the implementation is still not comprehensive. So we settled on RTP (Real-time Transport Protocol). RTP is old, robust, lean and well tested. Moreover, you can stream it via UDP, which promised low latency, since UDP does not wait for packets that arrive late.
We wanted to stream the video to an online server, where users could then access it – in a web browser, because that’s what people use to look at stuff on the internet. An RTSP server seemed fitting, since it doesn’t need much transcoding. However, playing RTSP in a web browser requires VLC, QuickTime or RealPlayer. The latter two have too much latency; VLC is acceptable in this respect. Installing additional software is not optimal for our purposes, though. Eventually we decided on RTMP. Now, I know, some of you are going to say, “hey, in order to make that work I need to install the Flash Player!”. And that’s true, but most people already have it. Frighteningly many, actually. Others will say, “why not MPEG-DASH or HLS?” And to you I say: remember that we want to limit the latency. All that segmenting and buffering for HTTP streaming just adds way too much delay.
For the compression, H.264 seemed like the only choice. Although this video codec requires a lot of computing power, it shrinks the video data considerably. You can see how it compares to other popular video codecs in the image below.
At that point, we thought it would be awesome if we didn’t have to do the computing for the video compression ourselves, meaning with additional software. We wondered whether there was any equipment, such as cameras, which could do it for us. The answer is YES, there’s a whole bunch! Especially in the area of IP surveillance cameras, integrated hardware H.264 compression is the norm. With webcams, there unfortunately isn’t as much to choose from: Creative and Logitech are the ones you want to go for here. For our purposes, we chose the Logitech C920. This neat little webcam is an H.264 UVC (USB Video Class) camera, so it can stream H.264 directly via USB. Bingo!
Now, all we needed was a device we could connect to the camera via USB and which would package the video stream into RTP packets for us. I’m not even going to pretend I can create any kind of suspense at this point. As you know from the title of this blog, we built the livestream with a Raspberry Pi 2.
The Whole Thing
So let’s look at the whole recipe again: We take a Logitech C920, connect it to a Raspberry Pi 2, add some internet with a Wi-Fi stick and send the H.264 video to our server with RTP over UDP – and off it goes into the depths of the internet. Meanwhile on the server, GStreamer is waiting to repackage everything from RTP to RTMP. Then, the stream moves on to an nginx media server for distribution to multiple viewers and finally into a web browser via TCP.
As promised, here are some GStreamer pipelines. I’m not going to walk you through the software installation, but I’m sure you’ll manage – if not, please shoot me a message.
First, here’s a pipeline with which you can get the H.264 stream from the C920 and send it to the server with RTP and UDP:
gst-launch-1.0 uvch264src average-bitrate=1000000 iframe-period=1000 device=/dev/video0 name=src auto-start=true src.vidsrc ! video/x-h264,width=1920,height=1080,framerate=30/1,profile=constrained-baseline ! h264parse ! rtph264pay pt=127 config-interval=2 ! udpsink sync=false host=your.server.ip port=5000
Please note: We’re using uvch264src to get the stream. The parameters for bitrate, resolution, framerate etc. are sent to the camera when it is started. They cannot be changed at runtime, so you will need to restart the camera (and the pipeline) to change them.
If you have two servers you want to stream to, you can also split the stream:
gst-launch-1.0 uvch264src average-bitrate=1000000 iframe-period=1000 device=/dev/video0 name=src auto-start=true src.vidsrc ! video/x-h264,width=1920,height=1080,framerate=30/1,profile=constrained-baseline ! h264parse ! rtph264pay pt=127 config-interval=2 ! tee name=t ! queue max-size-time=10000000 ! udpsink sync=false host=your.server.ip port=5000 t. ! queue max-size-time=10000000 ! udpsink sync=false host=your.other.server.ip port=5000
On our server, RTP is converted to RTMP. Since we transmitted it via UDP before, we have to use udpsrc as the source. The target is a local nginx RTMP media server.
gst-launch-1.0 udpsrc caps="application/x-rtp,payload=127" port=5000 ! rtph264depay ! capsfilter caps="video/x-h264,width=1920,height=1080" ! flvmux streamable=true ! rtmpsink location='rtmp://127.0.0.1:1935/live1/sers'
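For completeness, nginx also needs its RTMP module configured to accept the incoming stream. Here is a minimal sketch of the relevant nginx.conf block, assuming the nginx-rtmp-module is installed; the application name live1 matches the rtmpsink location above, and everything else is a plain default you may want to tune:

```nginx
rtmp {
    server {
        # Default RTMP port, matching rtmp://127.0.0.1:1935/... above
        listen 1935;
        chunk_size 4096;

        application live1 {
            live on;     # accept and relay live streams
            record off;  # don't write recordings to disk
        }
    }
}
```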
Finally, there is a pipeline which we also use but which doesn’t really fit in with the rest. It extracts 30 JPEGs per second from the C920 stream and saves the images to the working memory. We do this because the Pi runs off an SD card, which does not like too many write cycles. So we create a RAM disk, which lives, as mentioned before, in the working memory.
gst-launch-1.0 uvch264src device=/dev/video0 ! image/jpeg,width=1920,height=1080,framerate=30/1 ! multifilesink location=/home/pi/ramdisk/snap%d.jpeg max-files=60
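The RAM disk itself is just a tmpfs mount. A sketch of the /etc/fstab entry we’d use, assuming the /home/pi/ramdisk path from the pipeline above (the 64 MB size is an assumption – pick whatever fits your image rate):

```
# /etc/fstab – mount a tmpfs RAM disk at the snapshot path
tmpfs  /home/pi/ramdisk  tmpfs  defaults,noatime,size=64m  0  0
```

After creating the directory and adding the entry, `sudo mount /home/pi/ramdisk` activates it without a reboot.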
In addition to those pipelines, GStreamer also offers a C API. With it, you can solve more complex tasks, such as implementing custom protocols or an RTSP server. An alternative to GStreamer would have been FFmpeg, but it just couldn’t handle the H.264 UVC cameras. Also, I think this is as good a time as any to say thanks for all the open source stuff. Even though it is often just a marketing ploy, it still gives me a lot of positive feelings.
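To give you an idea of what the C side looks like, here is a minimal sketch that builds the RTP-to-RTMP receiver pipeline from above with gst_parse_launch and runs it until an error or end-of-stream. This is not our production code, just the basic skeleton (compile with pkg-config’s gstreamer-1.0 flags; the pipeline string is the one from the server step):

```c
#include <gst/gst.h>

int main(int argc, char *argv[]) {
    GstElement *pipeline;
    GstBus *bus;
    GstMessage *msg;

    gst_init(&argc, &argv);

    /* Build the same receiver pipeline as the gst-launch line above,
     * but from a description string inside a program. */
    pipeline = gst_parse_launch(
        "udpsrc caps=\"application/x-rtp,payload=127\" port=5000 ! "
        "rtph264depay ! video/x-h264,width=1920,height=1080 ! "
        "flvmux streamable=true ! "
        "rtmpsink location=rtmp://127.0.0.1:1935/live1/sers", NULL);

    gst_element_set_state(pipeline, GST_STATE_PLAYING);

    /* Block until an error occurs or the stream ends. */
    bus = gst_element_get_bus(pipeline);
    msg = gst_bus_timed_pop_filtered(bus, GST_CLOCK_TIME_NONE,
                                     GST_MESSAGE_ERROR | GST_MESSAGE_EOS);

    if (msg != NULL)
        gst_message_unref(msg);
    gst_object_unref(bus);
    gst_element_set_state(pipeline, GST_STATE_NULL);
    gst_object_unref(pipeline);
    return 0;
}
```

From here, the usual next step is swapping the blocking wait for a GMainLoop and a bus watch, so you can react to errors and reconnects programmatically.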