I have a PC server and an Android client; the Android client opens a socket connection to the server. While connected, the client also receives data from the server.
Here is my code:
Socket socket = null;
DataOutputStream out = null;
DataInputStream in = null;
InputStream inputStream = null;
OutputStream outputStream = null;
...
public void connectToTCP()
{
    try
    {
        socket = new Socket(HOST_ADDRESS, PORT);
        socket.setSoTimeout(30000); // 30s read timeout
        outputStream = socket.getOutputStream();
        out = new DataOutputStream(outputStream);
        inputStream = socket.getInputStream();
        in = new DataInputStream(inputStream);
        Log.e("TCP-", "Connected");
        // Keep polling for incoming data as long as the socket is connected
        while (socket.isConnected()) { readBytes(); }
    }
    catch (UnknownHostException e)
    {
        Log.e("Error in tcp connection", "Unknown Host");
    }
    catch (IOException e)
    {
        Log.e("Error in tcp connection", "Couldn't get I/O for the connection");
    }
}
public void readBytes() throws IOException
{
    // Only read when data is already buffered, so read() never blocks;
    // when nothing is available this returns immediately and the caller loops again
    if (in.available() > 0)
    {
        byte[] buffer = new byte[in.available()];
        if (buffer.length > 0)
        {
            if (mListener != null)
            {
                int numberOfBytes = in.read(buffer);
                mListener.tcpConnectionDataReceived(buffer, numberOfBytes);
            }
        }
    }
}
My problem is performance. I tested the code on the device and noticed (from the task manager) that the app consumes a lot of resources: CPU usage is above 50%. But when I stop reading from the socket by deleting the loop while (socket.isConnected()) { readBytes(); }, CPU usage drops below 1%.
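For reference, here is a blocking-read sketch of what I think the loop could look like instead (mListener and the in stream come from my code above; the 4096 buffer size is an arbitrary choice on my part). As far as I understand, in.read() blocks until data arrives, so the thread sleeps instead of spinning, but I am not sure this is the right pattern:

public void readLoop() throws IOException
{
    // in.read() parks the thread until bytes arrive, so there is no busy-wait.
    // Note: with setSoTimeout(30000) still set, read() throws
    // SocketTimeoutException after 30s of silence, which would need handling.
    byte[] buffer = new byte[4096]; // reused across reads instead of one per call
    int numberOfBytes;
    // read() returns -1 once the server closes the connection
    while ((numberOfBytes = in.read(buffer)) != -1)
    {
        if (mListener != null)
        {
            // the listener should copy the bytes it needs,
            // since the next read overwrites the same buffer
            mListener.tcpConnectionDataReceived(buffer, numberOfBytes);
        }
    }
}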
Any ideas on how to solve this?