Situation: I have a Basler camera connected to a Raspberry Pi, and I am trying to livestream its feed with FFmpeg to a TCP port on my Windows PC in order to monitor what's happening in front of the camera.
Things that work: I managed to set up a Python script on the Raspberry Pi that grabs the frames, feeds them to a pipe, and streams them to a TCP port. From that port, I am able to display the stream using FFplay.
My problem: FFplay is great for quickly and easily testing whether the direction you are heading is correct, but I want to "read" every frame from the stream, do some processing, and then display the stream with OpenCV. That I am not yet able to do.
Minimally represented, this is the code I use on the Raspberry Pi side of things:
import subprocess
from pypylon import pylon  # Basler camera SDK (camera setup omitted in this minimal example)

command = ['ffmpeg',
           '-y',
           '-i', '-',           # read input from stdin (the pipe)
           '-an',               # no audio
           '-c:v', 'mpeg4',
           '-r', '50',
           '-f', 'rtsp',
           '-rtsp_transport', 'tcp',
           'rtsp://192.168.1.xxxx:5555/live.sdp']
p = subprocess.Popen(command, stdin=subprocess.PIPE)

while camera.IsGrabbing():  # send images as stream until Ctrl-C
    grabResult = camera.RetrieveResult(100, pylon.TimeoutHandling_ThrowException)
    if grabResult.GrabSucceeded():
        image = grabResult.Array
        image = resize_compress(image)
        p.stdin.write(image)
    grabResult.Release()
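resize_compress is a helper of mine that is not shown above. As a purely illustrative sketch (not my actual implementation), such a helper could downscale the frame and JPEG-encode it, so that the bytes piped to FFmpeg form an MJPEG-like stream it can probe; the scale and quality values here are placeholder assumptions:

import cv2

def resize_compress(image, scale=0.5, quality=80):
    # Downscale the frame, then JPEG-encode it so that the piped bytes
    # form a stream FFmpeg can probe and decode (illustrative sketch only).
    small = cv2.resize(image, None, fx=scale, fy=scale)
    ok, buf = cv2.imencode('.jpg', small, [cv2.IMWRITE_JPEG_QUALITY, quality])
    return buf.tobytes()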
On my PC, if I use the following FFplay command in a terminal, it works and displays the stream in real time:
ffplay -rtsp_flags listen rtsp://192.168.1.xxxx:5555/live.sdp?tcp
On my PC, if I use the following Python script, the stream begins, but it fails at the cv2.imshow call because I am not sure how to decode the incoming bytes (a sketch of what I suspect is missing follows the script):
import subprocess
import cv2

command = ['C:/ffmpeg/bin/ffmpeg.exe',
           '-rtsp_flags', 'listen',
           '-i', 'rtsp://192.168.1.xxxx:5555/live.sdp?tcp',
           '-']
p1 = subprocess.Popen(command, stdin=subprocess.PIPE, stdout=subprocess.PIPE)

while True:
    frame = p1.stdout.read()    # blocks and returns raw bytes, not an image
    cv2.imshow('image', frame)  # fails here: imshow expects a numpy array
    cv2.waitKey(1)
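My suspicion is that FFmpeg's stdout delivers an undecoded byte stream rather than ready-made images, so the script would need to ask FFmpeg for raw frames of a known pixel format and size, read exactly one frame's worth of bytes per iteration, and reshape them with numpy. A minimal sketch of that idea, assuming the stream decodes to 640x480 BGR (the frame size and the rawvideo output flags are my assumptions, untested):

import subprocess
import cv2
import numpy as np

WIDTH, HEIGHT = 640, 480  # assumed frame size; must match the actual stream

command = ['C:/ffmpeg/bin/ffmpeg.exe',
           '-rtsp_flags', 'listen',
           '-i', 'rtsp://192.168.1.xxxx:5555/live.sdp?tcp',
           '-f', 'rawvideo',     # emit raw frames instead of a container
           '-pix_fmt', 'bgr24',  # 3 bytes per pixel, OpenCV's native channel order
           '-']
p1 = subprocess.Popen(command, stdout=subprocess.PIPE)

frame_size = WIDTH * HEIGHT * 3
while True:
    raw = p1.stdout.read(frame_size)  # read exactly one frame's worth of bytes
    if len(raw) < frame_size:         # stream ended or broke
        break
    frame = np.frombuffer(raw, dtype=np.uint8).reshape((HEIGHT, WIDTH, 3))
    cv2.imshow('image', frame)
    if cv2.waitKey(1) & 0xFF == ord('q'):
        break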
Does anyone know what I need to change in either of these scripts in order to get it to work?
Thank you in advance for any tips.
