I am trying to send binary data from C++ to JavaScript and interpret it byte by byte. For this I use a simple TCP server and the net module from node.js. Here is the C++ code:
// cast needed: plain char is usually signed, so 128 would overflow it
char a[4] = {(char)128, 0, 0, 1};
std::stringstream stream;   // needs #include <sstream>
stream.write(a, sizeof(a)); // write the raw 4 bytes
server.send(stream);        // application-specific send helper
And for the JavaScript side:
var net = require('net');

var client = new net.Socket();
client.setEncoding('utf8'); // decodes every incoming chunk as UTF-8 text
client.setNoDelay(true);
client.connect({port: 25003}, handler);

var buffer1 = new Uint8Array(4);
client.on('data', function(data) {
    console.log('Received from C++: ' + data.length + ' bytes');
    console.log('DATA: ' + data);
    for (var i = 0; i < data.length; i++) {
        buffer1[i] = data.charCodeAt(i);
    }
    console.log(buffer1[0]);
});
The code, explained briefly:
C++ sends 4 bytes over TCP: 128 0 0 1
JavaScript stores the received data in a Uint8Array using the charCodeAt function.
So basically my problem is that I can only send "numbers" from 0-127, because on the JavaScript side I seem to need the charCodeAt function, which returns garbage for values over 127 (the byte 128 comes back as 65533, the Unicode replacement character). And even worse, when I try to send this over a socket.io connection it creates a 16-bit string.
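A quick way to see what is going on (a minimal sketch, assuming a current Node version with Buffer.from): decoding the byte 128 as UTF-8, which is exactly what setEncoding('utf8') does to the socket data, yields the replacement character U+FFFD instead of the original byte.

// byte 0x80 is not valid UTF-8 on its own, so the decoder
// replaces it with U+FFFD (char code 65533)
var mangled = Buffer.from([128, 0, 0, 1]).toString('utf8');
console.log(mangled.charCodeAt(0)); // 65533, not 128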
How can I handle the bytes on the JavaScript side? I mean, simply interpret them back as 128 0 0 1?
Don't call .setEncoding() at all - you don't want Node to interpret the stream of bytes as characters in any particular encoding. If you don't call it, then your "data" events will be passed Buffer instances instead of strings.
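For illustration, a minimal sketch of the receiving side with the setEncoding() call removed; the port 25003 and the Uint8Array are taken from the question, the rest is the standard net/Buffer API:

var net = require('net');

var client = new net.Socket();
// no setEncoding(): 'data' now delivers Buffer instances,
// so every byte arrives untouched as a value from 0 to 255
client.setNoDelay(true);
client.connect({port: 25003}, function() {
    console.log('connected');
});

client.on('data', function(data) {
    console.log('Received from C++: ' + data.length + ' bytes');
    var buffer1 = new Uint8Array(data); // copies the raw byte values
    console.log(buffer1[0]);            // 128
    console.log(data.readUInt8(3));     // 1 - Buffers can also be read directly
});

One caveat: TCP is a byte stream, so the four bytes are not guaranteed to arrive in a single 'data' event; if exact framing matters, accumulate the chunks (for example with Buffer.concat) before interpreting them.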