README: Extend gstreamer examples

There have been many reports of difficulties getting the libcamerasrc
gstreamer element to successfully negotiate with other gstreamer
elements.

This is often due to the current limitations on colorimetry and frame
rate support in the element, and can usually be worked around by
specifying both explicitly in the caps.

Provide a tested example in the gstreamer section of the getting
started readme that captures images, encodes them as JPEG, and streams
them to a remote device.

Reviewed-by: Umang Jain <umang.jain@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
diff --git a/README.rst b/README.rst
index ca8a97c..aae6b79 100644
--- a/README.rst
+++ b/README.rst
@@ -139,6 +139,25 @@
 All corresponding debug messages can be enabled by setting the ``GST_DEBUG``
 environment variable to ``libcamera*:7``.
 
+At present, the colorimetry and framerate must be specified as part of the
+pipeline construction to prevent element negotiation failures. For instance,
+to capture, encode as a JPEG stream, and receive it on another device, the
+following example can be used as a starting point:
+
+.. code::
+
+   gst-launch-1.0 libcamerasrc ! \
+        video/x-raw,colorimetry=bt709,format=NV12,width=1280,height=720,framerate=30/1 ! \
+        jpegenc ! multipartmux ! \
+        tcpserversink host=0.0.0.0 port=5000
+
+The stream can be received on another device over the network with:
+
+.. code::
+
+   gst-launch-1.0 tcpclientsrc host=$DEVICE_IP port=5000 ! \
+        multipartdemux ! jpegdec ! autovideosink
+
 .. section-end-getting-started
 
 Troubleshooting