I wrote an application in C++ using Qt 5.7; it is basically a WebSocket server (using QWebSocket) that sends an image processed with OpenCV to an HTML client.
To do this I encode the image in base64, transmit it, and on the client side I put the encoded string into the src of an image tag.
SERVER
// Read the frame, re-encode it as PNG and send it as a base64 text message.
imgIn = cv::imread("/home/me/color.png", CV_LOAD_IMAGE_UNCHANGED);
std::vector<uchar> buffer;
cv::imencode(".png", imgIn, buffer);
// base64_encode() is a custom helper (not from Qt or OpenCV).
std::string s = base64_encode(buffer.data(), buffer.size());
pClient->sendTextMessage(QString::fromStdString(s));
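As an aside, the base64 step could also be done with Qt itself; a minimal sketch, assuming the same buffer and pClient as above:

// Same base64 + text-frame step, using QByteArray::toBase64() instead of the custom helper.
QByteArray png(reinterpret_cast<const char*>(buffer.data()),
               static_cast<int>(buffer.size()));
pClient->sendTextMessage(QString::fromLatin1(png.toBase64()));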
CLIENT
<img id="ItemPreview" src="" style="border:5px solid black" />
websocket.onmessage = function (evt) {
    console.log("Message received:", evt.data);
    // evt.data is the base64 string sent by the server; use it directly in a data URL.
    document.getElementById("ItemPreview").src = "data:image/png;base64," + evt.data;
};
The images come from video processing: I extract frames, do some image processing on them, and send them so the HTML client can display the video. When the processing gets heavier, or when the client is remote (but on the same Wi-Fi network), I experience lag and the video is not smooth.
Is there a better way to transmit images over a WebSocket? Maybe a technique other than base64 encoding, to reduce both the amount of data transmitted and the computation time.
I know the TCP protocol used by WebSocket is not ideal for streaming video, and UDP would be preferable to avoid lag, but since the images are small (320×240, ~100 kB) I would like to try this approach.
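One thing I was considering as "something other than base64" is sending the encoded PNG bytes as a binary frame, which would avoid the roughly 33% base64 size overhead and the encoding step. A minimal server-side sketch, assuming the same buffer and pClient as above:

// Send the PNG produced by cv::imencode() as a binary WebSocket frame instead of base64 text.
QByteArray payload(reinterpret_cast<const char*>(buffer.data()),
                   static_cast<int>(buffer.size()));
pClient->sendBinaryMessage(payload);

On the client side, onmessage would then receive a Blob (or an ArrayBuffer, depending on websocket.binaryType), which could be displayed with URL.createObjectURL() instead of a data URL. Would that be the right direction?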