Install GStreamer on Raspberry Pi 4

Install GStreamer 1.18 on Raspberry Pi 4.

Last updated: September 14, 2021

Introduction.

GStreamer is a pipeline-based multimedia framework that links various media processes into a complex workflow. For example, with a single line of code, it can retrieve images from a camera, convert them to MPEG, and send them as UDP packets over Ethernet to another computer. Obviously, GStreamer is complex software, used mostly by more advanced programmers.

One of the main reasons for using GStreamer is its low latency. The OpenCV video capture module uses large video buffers to hold the frames. If, for example, your camera delivers 30 FPS and your image processing algorithm handles at most 20 FPS, synchronisation is lost very quickly due to the stacking of frames in the video buffer. The absence of buffer flushing makes things even worse.
In this situation, GStreamer comes to the rescue. By buffering just one frame, you always get a recent frame as output.
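As an illustration, a minimal sketch of such a low-latency capture, assuming a camera at /dev/video0; the max-buffers and drop properties belong to the appsink element that hands the frames to an application.
# keep at most one frame in the sink and drop stale ones instead of queueing them
$ gst-launch-1.0 v4l2src device=/dev/video0 ! video/x-raw, width=640, height=480, framerate=30/1 ! videoconvert ! appsink max-buffers=1 drop=true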

The guide covers the following topics.
  • Version 1.14.4.
    We start with a survey of the current version 1.14.4 and the installation of rpicamsrc on 32-bit systems.
  • Version 1.18.4.
    The second part covers installing GStreamer 1.18.4 on your Raspberry Pi.
  • Streaming examples.
    In the last part, a lot of streaming examples, including streaming to YouTube, are explored.


Version 1.14.4.

When you scan your system for GStreamer, you will find several packages already installed. They are essential for your Raspberry desktop.

[Screen dump: the pre-installed GStreamer 1.14.4 packages]

If you remove these packages, your desktop will fall back on the standard Debian LXDE desktop.

[Screen dump: the standard LXDE desktop]

There are a few additional plugins you can install. These are especially useful if you want to start streaming, one of GStreamer's popular applications.
Please follow the commands below.
# install missing dependencies
$ sudo apt-get install libx264-dev libjpeg-dev
# install the remaining plugins
$ sudo apt-get install libgstreamer1.0-dev \
     libgstreamer-plugins-base1.0-dev \
     libgstreamer-plugins-bad1.0-dev \
     gstreamer1.0-plugins-ugly \
     gstreamer1.0-tools
# install some optional plugins
$ sudo apt-get install gstreamer1.0-gl gstreamer1.0-gtk3
# if you have Qt5 install this plugin
$ sudo apt-get install gstreamer1.0-qt5
# install if you want to work with audio
$ sudo apt-get install gstreamer1.0-pulseaudio
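You can verify the result right away; both tools below come with the gstreamer1.0-tools package.
# print the installed version
$ gst-launch-1.0 --version
# list one of the newly installed plugins, for instance x264 (from the ugly set)
$ gst-inspect-1.0 x264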

Streaming.

With all GStreamer modules installed, let's test the installation with $ gst-launch-1.0 videotestsrc ! videoconvert ! autovideosink.

[Screen dump: the videotestsrc test pattern]

The Raspicam can be invoked with this rather large pipeline: $ gst-launch-1.0 v4l2src device=/dev/video0 ! video/x-raw, width=1280, height=720, framerate=30/1 ! videoconvert ! videoscale ! clockoverlay time-format="%D %H:%M:%S" ! video/x-raw, width=640, height=360 ! autovideosink. Remember to enable the Raspicam beforehand in your Raspberry Pi configuration menu.

[Screen dump: the Raspicam pipeline with clock overlay]

All pipeline commands are constructed in the same way. First the source is named, followed by several operations, after which the sink is determined. All parts of the pipeline are separated from each other by exclamation marks. For instance, in the example above, you could remove the clockoverlay part, which prints the date and time on the screen.
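Without the overlay, the same pipeline becomes:
$ gst-launch-1.0 v4l2src device=/dev/video0 ! video/x-raw, width=1280, height=720, framerate=30/1 ! videoconvert ! videoscale ! video/x-raw, width=640, height=360 ! autovideosink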
Common sense is required when composing a pipeline. The limited computing power of the Raspberry Pi does not allow for overly complex pipelines, such as parsing an MPEG stream into the original frames, adding a timestamp and compressing the stream back to, say, mp4.
UDP streaming.
There are many types of streaming possible with GStreamer. UDP and TCP are the most used to connect two devices. The name of the streaming method refers to the transport protocol used.
Let's start with UDP. We use two Raspberry Pis, both connected to the same home network. However, it could just as easily be an RPi and a laptop on the other side of the world. You need to know the IP address of the receiving Raspberry Pi. Follow the commands below.
Raspberry Pi 32 or 64-bit OS
# get the IP address of the receiving RPi first
$ hostname -I
# start the sender, the one with the Raspicam
$ gst-launch-1.0 -v v4l2src device=/dev/video0 num-buffers=-1 ! video/x-raw, width=640, height=480, framerate=30/1 ! videoconvert ! jpegenc ! rtpjpegpay ! udpsink host=192.168.178.84 port=5200
# start the receiver, the one with IP 192.168.178.84
$ gst-launch-1.0 -v udpsrc port=5200 ! application/x-rtp, media=video, clock-rate=90000, payload=96 ! rtpjpegdepay ! jpegdec ! videoconvert ! autovideosink
Now you can clearly see the structure of a pipeline. The sender has the Raspicam, located at /dev/video0, as source and sinks to the IP address of the other Raspberry Pi. The receiving RPi has UDP port 5200 as source and sinks to the screen (autovideosink).

[Screen dump: the UDP stream received on the second RPi]

TCP streaming.
The other method of streaming is TCP. The difference with UDP is the latency; UDP is faster.
The commands are listed below. Note the different IP addresses. With TCP streaming, you use the address of the server (the sender) instead of that of the receiver, as we saw with UDP streaming.
Raspberry Pi 32 or 64-bit OS
# get the IP address of the sending RPi first
$ hostname -I
# start the sender, the one with the Raspicam and IP 192.168.178.32
$ gst-launch-1.0 -v v4l2src device=/dev/video0 num-buffers=-1 ! video/x-raw,width=640,height=480, framerate=30/1 ! videoconvert ! jpegenc ! tcpserversink  host=192.168.178.32 port=5000
# start the receiver and connect to the server with IP 192.168.178.32
$ gst-launch-1.0 tcpclientsrc host=192.168.178.32 port=5000 ! jpegdec ! videoconvert ! autovideosink
Both streams, UDP and TCP, start with single frames (video/x-raw). A timestamp is inserted if necessary. Then each image is compressed with JPEG to reduce its size, decreasing the required bandwidth. Once received, the JPEG image is decompressed and displayed on the screen. You can always change the resolution. Frame sizes of 1280x960 at 30 FPS were no problem here in the office.
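For instance, a variant of the UDP sender above at 1280x960; the receiver command stays the same.
$ gst-launch-1.0 -v v4l2src device=/dev/video0 num-buffers=-1 ! video/x-raw, width=1280, height=960, framerate=30/1 ! videoconvert ! jpegenc ! rtpjpegpay ! udpsink host=192.168.178.84 port=5200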
RTSP streaming.
If you want to stream RTSP (Real-Time Streaming Protocol), you need a server. GStreamer has its own server available for RTSP. If you don't want to stream RTSP, this additional software isn't necessary. The streaming examples section provides some pipelines and other information about setting up an RTSP stream. For now, just the installation commands.
# install the rtsp server
$ wget https://gstreamer.freedesktop.org/src/gst-rtsp-server/gst-rtsp-server-1.14.4.tar.xz
$ tar -xf gst-rtsp-server-1.14.4.tar.xz
$ cd gst-rtsp-server-1.14.4
$ ./configure
$ make
$ sudo make install
$ sudo ldconfig
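If the build succeeded, the example server used later in this guide should be present in the examples folder; a quick check, assuming you unpacked in your home directory.
# the test-launch example is built along with the library
$ ls ~/gst-rtsp-server-1.14.4/examples/test-launch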
rpicamsrc.
Thanks to the impressive work of Jan Schmidt (thaytan), GStreamer now fully supports the Raspicam. The source is incorporated in the later versions.
In the case of the default Raspberry Pi version 1.14.4, you have to install the plugin yourself.

👉 Please note, rpicamsrc works only on a 32-bit operating system. Due to missing Userland components, it will never work on a 64-bit OS. Use the default v4l2src device=/dev/video0 source in this situation, as shown in the above examples.
Only Raspberry Pi 32-bit OS
# install rpicamsrc in 1.14.4
$ git clone https://github.com/thaytan/gst-rpicamsrc.git
$ cd gst-rpicamsrc
$ ./autogen.sh
$ make
$ sudo make install
$ sudo ldconfig
[Screen dump: the rpicamsrc build]

Check with $ gst-inspect-1.0 rpicamsrc.

[Screen dump: gst-inspect-1.0 rpicamsrc output]

The UDP or TCP streaming commands are shown below. For fun, we've put timestamps in the corners of the videos.
ONLY Raspberry Pi 32-bit OS with rpicamsrc
UDP
# get the IP address of the receiving RPi first
$ hostname -I
# start the sender, the one with the Raspicam
$ gst-launch-1.0 -v rpicamsrc num-buffers=-1 ! video/x-raw, width=640, height=480, framerate=30/1 ! clockoverlay time-format="%D %H:%M:%S" ! videoconvert ! jpegenc ! rtpjpegpay ! udpsink host=192.168.178.84 port=5200
# start the receiver, the one with IP 192.168.178.84
$ gst-launch-1.0 -v udpsrc port=5200 ! application/x-rtp, media=video, clock-rate=90000, payload=96 ! rtpjpegdepay ! jpegdec ! videoconvert ! autovideosink
TCP
# get the IP address of the sending RPi first
$ hostname -I
# start the sender, the one with the Raspicam and IP 192.168.178.32
$ gst-launch-1.0 -v rpicamsrc preview=false num-buffers=-1 ! video/x-raw,width=640,height=480, framerate=30/1 ! timeoverlay time-mode="buffer-time" ! videoconvert ! jpegenc ! tcpserversink  host=192.168.178.32 port=5000
# start the receiver and connect to the server with IP 192.168.178.32
$ gst-launch-1.0 tcpclientsrc host=192.168.178.32 port=5000 ! jpegdec ! videoconvert ! autovideosink


Version 1.18.4.

This section walks you through the installation of GStreamer 1.18 on a Raspberry Pi 4. With version 1.18, GStreamer fully supports the Raspicam on 32-bit operating systems. Unfortunately, not on 64-bit systems, due to the missing Userland video engine.

Because GStreamer is deeply embedded in the Raspberry Pi desktop, it is not easy to customize or upgrade. Parts of the old version may remain active, which sometimes causes unexpected behaviour. Completely removing the old version will not only destroy your desktop but will also remove needed dependencies, leaving the newly installed version without many features.

👉 By the way, keep in mind that you will have to reinstall OpenCV once GStreamer 1.18 is installed.
Version 1.19.1.
On June 1, 2021, version 1.19.1 was released. There is not much new to report regarding the Raspberry Pi. Version 1.19.1 is more or less an 'intermediate version', as pointed out by the GStreamer community here. The most notable fact is the Rust implementation in this version, a language only a few use on an RPi.

Preparation.

You have to install at least three GStreamer packages: the core gstreamer, the plugins-base and the plugins-good. As mentioned, OpenCV also has to be rebuilt after installing GStreamer. But first remove, if present, any earlier user-installed GStreamer versions.
# remove the old version
$ sudo rm -rf /usr/bin/gst-*
$ sudo rm -rf /usr/include/gstreamer-1.0
# install a few dependencies
$ sudo apt-get install cmake meson
$ sudo apt-get install flex bison
$ sudo apt-get install libglib2.0-dev

Installation.

Next, install the core GStreamer libraries.
Core
# download and unpack the lib
$ wget https://gstreamer.freedesktop.org/src/gstreamer/gstreamer-1.18.4.tar.xz
$ sudo tar -xf gstreamer-1.18.4.tar.xz
# make an installation folder
$ cd gstreamer-1.18.4
$ mkdir build
$ cd build
# run meson (a kind of cmake)
$ meson --prefix=/usr       \
        --wrap-mode=nofallback \
        -D buildtype=release \
        -D gst_debug=false   \
        -D package-origin=https://gstreamer.freedesktop.org/src/gstreamer/ \
        -D package-name="GStreamer 1.18.4 BLFS" ..

If everything went well, you end up with the screen below.

[Screen dump: meson configuration completed]

Now build and test GStreamer with the next commands.
# build the software
$ ninja -j4
# test the software (optional)
$ ninja test
# install the libraries
$ sudo ninja install
$ sudo ldconfig
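A quick sanity check that the new core is the one found first in your path:
$ gst-launch-1.0 --version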
[Screen dump: ninja test results]

Testing.
It is common for some tests to fail. As long as the ninja build itself doesn't throw an error, don't worry.
Tests fail mainly for two reasons.
First, the required hardware (GPU) is not available on the Raspberry Pi. While you may think those parts shouldn't be compiled at all, GStreamer sometimes needs portions of these libraries later in the build.
Second, and most commonly, a test simply takes too much time on a (simple) Raspberry Pi. Think of parsing an MPEG stream into its original frames; the RPi has a lot of work ahead of it. Each test is given a certain amount of time, after which it fails.
When you later install the Bad or Ugly package, even more tests will fail.
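If you suspect a pure timeout, meson can rerun the suite with a longer limit; a sketch, executed from the build folder (--timeout-multiplier is a standard meson test option).
# give each test four times the default time budget
$ meson test --timeout-multiplier 4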

OS check.
Before the next step, please check your operating system. Run the command uname -a and verify the reported architecture with the screen dump below.
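As a rule of thumb, the architecture field tells you which OS you run.
# a 32-bit Raspberry Pi OS typically reports armv7l, a 64-bit OS aarch64
$ uname -a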

[Screen dump: uname -a output on a 32- and a 64-bit OS]

Plugins-base.

Once the core libraries are installed, the next step is to install the two additional packages, starting with plugins-base. Both follow the same procedure as above, so we'll give the installation without much explanation. There are two variants: the 64-bit OS needs some extra GL dependencies and the Wayland winsys, the 32-bit OS does not.
Plugins Base (64-bit OS)
# some additional dependencies
$ sudo apt-get install libgtk2.0-dev libcanberra-gtk* libgtk-3-dev
$ sudo apt-get install libegl1-mesa-dev libglfw3-dev libgles2-mesa-dev
$ cd ~
# download and unpack the plug-ins base
$ wget https://gstreamer.freedesktop.org/src/gst-plugins-base/gst-plugins-base-1.18.4.tar.xz
$ sudo tar -xf gst-plugins-base-1.18.4.tar.xz
# make an installation folder
$ cd gst-plugins-base-1.18.4
$ mkdir build
$ cd build
# run meson
$ meson --prefix=/usr \
-D gl_winsys=wayland \
-D buildtype=release \
-D package-origin=https://gstreamer.freedesktop.org/src/gstreamer/ ..
$ ninja -j4
# optional
$ ninja test
# install the libraries
$ sudo ninja install
$ sudo ldconfig
Plugins Base (32-bit OS)
$ cd ~
# download and unpack the plug-ins base
$ wget https://gstreamer.freedesktop.org/src/gst-plugins-base/gst-plugins-base-1.18.4.tar.xz
$ sudo tar -xf gst-plugins-base-1.18.4.tar.xz
# make an installation folder
$ cd gst-plugins-base-1.18.4
$ mkdir build
$ cd build
# run meson
$ meson --prefix=/usr \
-D buildtype=release \
-D package-origin=https://gstreamer.freedesktop.org/src/gstreamer/ ..
$ ninja -j4
# optional
$ ninja test
# install the libraries
$ sudo ninja install
$ sudo ldconfig
Not all tests will pass on a 64-bit system. This is normal due to unsupported features in the Wayland protocol. The screen dump below shows the result of the 32-bit version.

[Screen dump: plugins-base test results on the 32-bit OS]

Plugins-good.

The last step is the installation of the plugins-good package.
Plugins Good
$ cd ~
$ sudo apt-get install libjpeg-dev
# download and unpack the plug-ins good
$ wget https://gstreamer.freedesktop.org/src/gst-plugins-good/gst-plugins-good-1.18.4.tar.xz
$ sudo tar -xf gst-plugins-good-1.18.4.tar.xz
# make an installation folder
$ cd gst-plugins-good-1.18.4
$ mkdir build
$ cd build
# run meson
$ meson --prefix=/usr       \
       -D buildtype=release \
       -D package-origin=https://gstreamer.freedesktop.org/src/gstreamer/ \
       -D package-name="GStreamer 1.18.4 BLFS" ..
$ ninja -j4
# optional
$ ninja test
# install the libraries
$ sudo ninja install
$ sudo ldconfig

Plugins-bad.

There are two additional packages that you should be aware of. The plugins-bad package does not meet the same high quality standards as the other packages. It may not have been thoroughly tested, or some documentation is missing or in progress. Commonly used format conversion tools, such as h264parse or matroskamux, can be found in this plugin. Keep in mind that the Raspberry Pi 4 still has modest computing power, so converting high frame rates live from MPEG to Matroska will give disappointing results.
Plugins-ugly.
The other package is plugins-ugly. The code is good but may present distribution problems. There may also be patent issues with the libraries on which the plugin depends. See this GStreamer site for more information.

Both packages can be installed on the Raspberry Pi 4 without any problem. The procedure is identical to that of plugins-good; you only need to replace the name good with bad or ugly in the command lines, as sketched below.
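For instance, a sketch for the bad package; the download URL follows the same pattern as the other packages, and meson simply skips plugins whose underlying libraries are missing.
$ cd ~
# download and unpack the plug-ins bad
$ wget https://gstreamer.freedesktop.org/src/gst-plugins-bad/gst-plugins-bad-1.18.4.tar.xz
$ sudo tar -xf gst-plugins-bad-1.18.4.tar.xz
$ cd gst-plugins-bad-1.18.4
$ mkdir build && cd build
$ meson --prefix=/usr       \
       -D buildtype=release \
       -D package-origin=https://gstreamer.freedesktop.org/src/gstreamer/ \
       -D package-name="GStreamer 1.18.4 BLFS" ..
$ ninja -j4
$ sudo ninja install
$ sudo ldconfig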

Over time, you may find that a needed module is not present in the current installation. If it is missing even though all four plugins are installed, chances are the underlying library isn't installed on your Raspberry Pi. You need to install the required library first and then rebuild the plugin. As an example, the missing x264enc module, found in the ugly package, is installed below. Note that there is no need to recompile OpenCV in this case.
# test if the module exists (for instance x264enc)
$ gst-inspect-1.0 x264enc
# if not, make sure you have the libraries installed
# stackoverflow is your friend here
$ sudo apt-get install libx264-dev
# check on the GStreamer site which plugin holds the module
# rebuild the module (in this case the ugly)
$ cd gst-plugins-ugly-1.18.4
# remove the previous build
$ rm -rf build
# make a new build folder
$ mkdir build && cd build
$ meson --prefix=/usr       \
      -D buildtype=release \
      -D package-origin=https://gstreamer.freedesktop.org/src/gstreamer/ \
      -D package-name="GStreamer 1.18.4 BLFS" ..
$ ninja -j4
$ sudo ninja install
$ sudo ldconfig

Testing.

With the three packages installed, you can test GStreamer 1.18.4 on your Raspberry Pi 4. The screen dump below shows the installed version and a test pipeline $ gst-launch-1.0 videotestsrc ! videoconvert ! autovideosink to see if everything works.

[Screen dump: GStreamer 1.18.4 version and the test pipeline]

Unfortunately, the rpicamsrc module only works on a 32-bit operating system, because the Userland GPU interface does not yet fully support the new 64-bit operating system. Several modules have already been ported to 64-bit. However, the mmal core, which GStreamer needs, is still 32-bit only. You can still use GStreamer with all its features, but as source you need the old-fashioned /dev/video0 declaration.
32-bit OS testing.
The next test is to see rpicamsrc working. If the $ gst-inspect-1.0 rpicamsrc command executes without errors, the plugin is correctly installed.

[Screen dump: gst-inspect-1.0 rpicamsrc output]

The last test has the Raspicam project a preview on the screen. Execute the following command.
$ gst-launch-1.0 -v rpicamsrc preview=true ! fakesink
Basically, rpicamsrc has almost the same properties as the raspivid application. More information about rpicamsrc can be found on the GStreamer site.
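A sketch with a few of these properties in use; check $ gst-inspect-1.0 rpicamsrc for the full list and exact names on your system.
# request H.264 from the camera, rotated 180 degrees, without a preview window
$ gst-launch-1.0 rpicamsrc bitrate=1000000 rotation=180 preview=false ! video/x-h264, width=640, height=480, framerate=30/1 ! h264parse ! fakesink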
64-bit OS testing.
The 64-bit version uses the v4l2src module. First, see if the module is correctly installed with the command $ gst-inspect-1.0 v4l2src.

[Screen dump: gst-inspect-1.0 v4l2src output]

If the Raspicam is connected and enabled in the Raspberry Pi configuration, you will get the following screen when probing /dev/video0, indicating the camera is working.

[Screen dump: probing /dev/video0]

TCP and UDP streaming.
The streaming commands are identical to those of version 1.14.4. On a 32-bit OS you can use rpicamsrc; otherwise, use v4l2src device=/dev/video0.

Cleaning.

Once you are certain GStreamer is working well, you can remove all the code and zip files.
# remove the core code and archive files
$ rm gstreamer-1.18.4.tar.xz
$ rm -rf gstreamer-1.18.4
# remove the plugins
$ rm -rf gst-plugins*

OpenCV.

For OpenCV to work with GStreamer 1.18.4, it must be recompiled, even if OpenCV was previously built with GStreamer 1.14. An OpenCV built against version 1.14 will not work with the newly installed 1.18 version.
Please follow the steps given in our tutorials. The only difference is setting the -D WITH_GSTREAMER=ON argument in the build.
$ cmake -D CMAKE_BUILD_TYPE=RELEASE \
        -D CMAKE_INSTALL_PREFIX=/usr/local \
        -D OPENCV_EXTRA_MODULES_PATH=~/opencv_contrib/modules \
        ..................
        -D WITH_GSTREAMER=ON \
        ..................
        -D OPENCV_GENERATE_PKGCONFIG=ON \
        -D BUILD_EXAMPLES=OFF ..
[Screen dump: OpenCV build information with GStreamer detected]

If the generated build information shows the detected GStreamer version, you can be confident OpenCV will support the new version once built and installed. On our GitHub page, you can find a simple GStreamer example with the Raspicam for a Raspberry Pi 4 with a 32- or 64-bit OS.

WebCam.

GStreamer can also work with a USB webcam. There are two points to keep in mind.

First, there is the address of the camera. The Raspicam will always occupy /dev/video0 once enabled. All other USB cameras get the following numbers. So the first USB camera gets /dev/video1, the next one /dev/video2, and so on. If no Raspicam is enabled, the numbering of course starts at /dev/video0.
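If in doubt, you can list the devices yourself; the v4l2-ctl tool ships with the v4l-utils package.
$ sudo apt-get install v4l-utils
$ v4l2-ctl --list-devices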

The second point is the output formats your USB webcam supports. Below, the formats of a popular Logitech c720 webcam are shown.

[Screen dumps: supported output formats of the Logitech c720]

The source format is declared in the pipeline. It is the first part of the command, starting with video/x-raw, width=1280, height=720, framerate=30/1. Choose only a format supported by your webcam. In the list of the Logitech c720, you will find formats such as 640x480@30 FPS or 1280x960@5 FPS. Other formats give an error. The same goes for the MJPG formats; we specified that the source is video/x-raw, not video/x-mpeg. Using the MJPG stream is more for advanced users, especially with the limited resources of the Raspberry Pi 4.
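You can query the formats of your own camera the same way; a sketch for a webcam at /dev/video1, using the v4l2-ctl tool installed above.
# list every pixel format, resolution and frame rate the camera offers
$ v4l2-ctl -d /dev/video1 --list-formats-ext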


Streaming examples.

All samples are suitable for the 32- and 64-bit Raspberry Pi operating systems. To be as generic as possible, we don't use the rpicamsrc found on the 32-bit version. You can always replace v4l2src device=/dev/video0 with rpicamsrc in your pipeline if you want.
We have used the Raspicam V2 (IMX219) camera. Of course, you need to enable the camera first in the settings. As explained above, the pipelines can also be applied to other cameras, such as webcams, if you take care of the resolution and/or other properties.
Streaming to screen.
The first example is simple streaming to the screen. You have already seen it working in the sections above. Note the scaling at the end of the pipeline.
$ gst-launch-1.0 v4l2src device=/dev/video0 ! video/x-raw, width=1280, height=720, framerate=30/1 ! videoconvert ! videoscale ! clockoverlay time-format="%D %H:%M:%S" ! video/x-raw, width=640, height=360 ! autovideosink
Streaming to file.
Another commonly used option of GStreamer is streaming to a file. Here are some examples.
The omxh264 plugin has recently been deprecated. It is better to use an encoder closer to the Raspberry Pi hardware, such as v4l2h264enc. However, looking at the processor load, omxh264 does not make that much of a difference.
Please note the -e option. It lets the file close gracefully when the pipeline stops. In other words, it prevents a corrupted file end when the pipe suddenly closes.
A final remark: the SD card in your Raspberry Pi will wear out quickly if you continuously write video files to it. It is better to write them to a USB stick or other media.
# mkv
$ gst-launch-1.0 -e v4l2src device=/dev/video0 ! image/jpeg, width=1280, height=720, framerate=30/1 ! jpegdec ! omxh264enc ! video/x-h264, profile=high ! h264parse ! matroskamux ! filesink location=output.mkv
# mkv (without omxh264enc)
$ gst-launch-1.0 -e v4l2src device=/dev/video0 ! video/x-raw, width=1920, height=1080, framerate=15/1 ! v4l2h264enc extra-controls="controls, h264_profile=4, video_bitrate=620000" ! h264parse ! matroskamux ! filesink location=output.mkv
# mp4
$ gst-launch-1.0 -e v4l2src device=/dev/video0 ! video/x-raw, width=1280, height=720, framerate=30/1 ! v4l2h264enc extra-controls="controls, h264_profile=4, video_bitrate=620000" ! h264parse ! mp4mux ! filesink location=video.mp4
# flv
$ gst-launch-1.0 -e v4l2src device=/dev/video0 ! video/x-raw, width=1280, height=720, framerate=30/1 ! videoconvert ! omxh264enc ! video/x-h264 ! h264parse ! video/x-h264 ! queue ! flvmux name=mux ! filesink location=video.flv
Streaming to OpenCV.
The following example streams to your OpenCV application. Best practice is to use raw images instead of motion-compressed streams like mp4. The sink is now called appsink. The pipeline is encapsulated in a single routine. We show you a snippet of the code; more can be found on our GitHub page.

#include <opencv2/opencv.hpp>
#include <iostream>

std::string gstreamer_pipeline(int device, int capture_width, int capture_height, int framerate, int display_width, int display_height) {
    return
            " v4l2src device=/dev/video" + std::to_string(device) + " !"
            " video/x-raw,"
            " width=(int)" + std::to_string(capture_width) + ","
            " height=(int)" + std::to_string(capture_height) + ","
            " framerate=(fraction)" + std::to_string(framerate) + "/1 !"
            " videoconvert ! videoscale !"
            " video/x-raw,"
            " width=(int)" + std::to_string(display_width) + ","
            " height=(int)" + std::to_string(display_height) + " ! appsink";
}

int main()
{
    //pipeline parameters
    int capture_width = 1280;
    int capture_height = 720;
    int display_width = 640;
    int display_height = 360;
    int framerate = 30;

    //note the argument order: the frame rate comes before the display sizes
    std::string pipeline = gstreamer_pipeline(0, capture_width, capture_height,
                                              framerate, display_width, display_height);

    cv::VideoCapture cap(pipeline, cv::CAP_GSTREAMER);
    if(!cap.isOpened()) {
        std::cout << "Failed to open camera." << std::endl;
        return (-1);
    }

    //grab and show frames until the Esc key is pressed
    cv::Mat frame;
    while(true) {
        if(!cap.read(frame)) break;
        cv::imshow("Camera", frame);
        if(cv::waitKey(1) == 27) break;
    }

    cap.release();
    cv::destroyAllWindows();
    return 0;
}
UDP streaming.
For completeness, the UDP streamer again. The port number (5200) is arbitrary. You can choose any number you want, preferably above 1024, as long as the sender and receiver use the same number.
# get the IP address of the receiving RPi first
$ hostname -I
# start the sender, the one with the Raspicam
$ gst-launch-1.0 -v v4l2src device=/dev/video0 num-buffers=-1 ! video/x-raw, width=640, height=480, framerate=30/1 ! videoconvert ! jpegenc ! rtpjpegpay ! udpsink host=192.168.178.84 port=5200
# start the receiver, the one with IP 192.168.178.84
$ gst-launch-1.0 -v udpsrc port=5200 ! application/x-rtp, media=video, clock-rate=90000, payload=96 ! rtpjpegdepay ! jpegdec ! videoconvert ! autovideosink
TCP streaming.
And the TCP streamer once more. As with UDP, you can choose any port you like.
# get the IP address of the sending RPi first
$ hostname -I
# start the sender, the one with the Raspicam and IP 192.168.178.32
$ gst-launch-1.0 -v v4l2src device=/dev/video0 num-buffers=-1 ! video/x-raw,width=640,height=480, framerate=30/1 ! videoconvert ! jpegenc ! tcpserversink  host=192.168.178.32 port=5000
# start the receiver and connect to the server with IP 192.168.178.32
$ gst-launch-1.0 tcpclientsrc host=192.168.178.32 port=5000 ! jpegdec ! videoconvert ! autovideosink
RTSP streaming.
RTSP streaming is widespread. It is designed to control media streaming sessions between endpoints. In contrast to the single-client connection of TCP and UDP, RTSP can connect a single server to multiple clients. In practice, the number of clients will be limited by the bandwidth capacity of the Raspberry Pi.
Before you can start streaming RTSP, you need the gst-rtsp-server and its examples. See the installation instructions in the section of your GStreamer version, 1.14.4 or 1.18.4. As you can see, the pipeline assumes that your source can deliver the x-h264 format, like the Raspicam. If not, you need to convert the format first.
# select the proper folder (1.14.4 or 1.18.4)
$ cd ~/gst-rtsp-server-1.18.4/examples
# run the pipeline
$ ./test-launch "v4l2src device=/dev/video0 ! video/x-h264, width=640, height=480, framerate=30/1 ! h264parse config-interval=1 ! rtph264pay name=pay0 pt=96"
Only 32-bit OS with rpicamsrc installed
# check if you have rpicamsrc running
$ gst-inspect-1.0 rpicamsrc
# select the proper folder
$ cd ~/gst-rtsp-server-1.14.4/examples
# run the pipeline
$ ./test-launch "rpicamsrc bitrate=8000000 preview=false ! video/x-h264, width=640, height=480, framerate=30/1 ! h264parse ! rtph264pay name=pay0 pt=96"
You can receive the stream with, for example, the VLC player. In the Media menu, select the option Open Network Stream... and enter the address of your Raspberry Pi (in our case 192.168.178.32) followed by the postfix :8554/test, as you can see in the screen dump below.

[Screen dumps: the VLC Open Network Stream dialog and playback]

If you'd like to use GStreamer to receive the stream, use the following command.
# use the IP address of the sending RPi (192.168.178.32) first
$ gst-launch-1.0 rtspsrc location=rtsp://192.168.178.32:8554/test/ latency=10 ! decodebin ! autovideosink