I am developing an embedded system that sends data over TCP. The system is ARM-based and its code is written in C. On the C side, I have an array of char (or unsigned bytes, i.e. uint8_t) that represents some encoded data:
uint8_t buffer[BUFFER_SIZE] = {0, 11, 34, 176, 255}; // for example; uint8_t, since 176 and 255 don't fit in a signed char
This buffer will be sent to a server over TCP/IP, using a popular GPRS module called the SIM800. The connection between the microcontroller and the SIM800 is a UART, i.e. standard serial communication. I can send the data as either a uint8_t array or a char array; in C the two are just different views of the same 8-bit values.
On the server side, there are some Java services that receive and parse this array.
The problem is: in C, uint8_t and char are essentially interchangeable, and the range 0 to 255 covers the whole extended ASCII table. But as far as I know, this is not true on the server. In Java, the byte type is intrinsically signed, with a range of -128 to 127. Moreover, the extended ASCII characters from 128 to 255 are non-standard and differ from system to system (they depend on the code page/charset in use).
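To make the signed/unsigned issue concrete, here is a small self-contained sketch (the byte values are the ones from the example buffer above): a Java byte holding the bit pattern 0xB0 prints as -80, but masking with `& 0xFF` recovers the original unsigned value 176.

```java
public class ByteReinterpret {
    public static void main(String[] args) {
        // The same bit patterns the C side sends: {0, 11, 34, 176, 255}.
        // Casting 176 and 255 to byte keeps the identical 8 bits,
        // but Java interprets them as negative two's-complement values.
        byte[] received = { (byte) 0, (byte) 11, (byte) 34, (byte) 176, (byte) 255 };
        for (byte b : received) {
            int unsigned = b & 0xFF; // widen to int, zero the sign extension
            System.out.println(b + " -> " + unsigned);
        }
        // The last two lines print: -80 -> 176 and -1 -> 255
    }
}
```

So the bits survive the trip unchanged; only their interpretation differs, and `& 0xFF` (or `Byte.toUnsignedInt` in Java 8+) restores the 0-255 view.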
The Java service receives the data as a String and then converts it to an array of bytes.
I am confused. What will happen if I send the above-mentioned array to the server? Can the Java service reinterpret it correctly?
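I don't know exactly how the service is implemented, but if it decodes the incoming bytes into a String, values in the 128-255 range can be mangled by the charset decoder before the service ever sees them. A safer approach (sketched below under the assumption that the service can read the socket's InputStream directly; the stream is simulated here with a ByteArrayInputStream) is to read raw bytes into a byte[] and skip String decoding entirely:

```java
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.io.InputStream;

public class RawByteReceive {
    // Reads exactly n bytes from the stream (e.g. a Socket's InputStream);
    // read() may return fewer bytes than requested, so loop until full.
    static byte[] readExactly(InputStream in, int n) throws IOException {
        byte[] buf = new byte[n];
        int off = 0;
        while (off < n) {
            int r = in.read(buf, off, n - off);
            if (r < 0) throw new IOException("stream closed early");
            off += r;
        }
        return buf;
    }

    public static void main(String[] args) throws IOException {
        // Simulate the five example bytes arriving from the socket.
        byte[] wire = { 0, 11, 34, (byte) 176, (byte) 255 };
        InputStream in = new ByteArrayInputStream(wire);
        byte[] data = readExactly(in, 5);
        for (byte b : data) {
            System.out.print((b & 0xFF) + " "); // prints 0 11 34 176 255
        }
        System.out.println();
    }
}
```

This keeps the payload as opaque bytes end to end; any String conversion, if needed at all, can then be done explicitly with a known charset.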