I'm working on a cross-platform encryption system. One of the requirements is to easily encrypt and decrypt strings in our application code.
The encryption class works flawlessly, but I'm having trouble with string encoding on the Java side.
Currently, I have the following static methods:
public static String encrypt(String key, String data)
{
    byte[] decoded_key;
    byte[] decoded_data;
    try
    {
        decoded_key = key.getBytes("UTF-8");
        decoded_data = data.getBytes("UTF-8");
    }
    catch (Exception e)
    {
        // Not supposed to happen: UTF-8 is always supported.
        throw new RuntimeException(e);
    }
    if (decoded_key.length != 16)
        throw new IllegalArgumentException("Key length must be 16 bytes. Given is " + decoded_key.length + ".");
    try
    {
        return IOUtils.toString(encrypt(decoded_key, decoded_data), "UTF-8");
    }
    catch (Exception e)
    {
        // Not supposed to happen.
        throw new RuntimeException(e);
    }
}
public static String decrypt(String key, String data)
{
    byte[] decoded_key;
    byte[] decoded_data;
    try
    {
        decoded_key = key.getBytes("UTF-8");
        decoded_data = data.getBytes("UTF-8");
    }
    catch (Exception e)
    {
        // Not supposed to happen: UTF-8 is always supported.
        throw new RuntimeException(e);
    }
    if (decoded_key.length != 16)
        throw new IllegalArgumentException("Key length must be 16 bytes. Given is " + decoded_key.length + ".");
    try
    {
        return IOUtils.toString(decrypt(decoded_key, decoded_data), "UTF-8");
    }
    catch (Exception e)
    {
        // Not supposed to happen.
        throw new RuntimeException(e);
    }
}
My unit tests fail when decrypting. In one test I compared a byte array of UTF-8-encoded data, encoded_data, with IOUtils.toString(encoded_data, "UTF-8").getBytes("UTF-8"), and for some reason the two arrays turned out to be different altogether. No wonder my decryption algorithm is failing.
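To illustrate, here is a minimal, self-contained sketch of the round-trip I tested, using new String(bytes, UTF_8) in place of IOUtils.toString (which I assume behaves the same way) and some made-up bytes standing in for ciphertext:

```java
import java.nio.charset.StandardCharsets;
import java.util.Arrays;

public class RoundTrip {
    public static void main(String[] args) {
        // Arbitrary ciphertext-like bytes; 0x8F is a lone continuation
        // byte, so this is not a valid UTF-8 sequence.
        byte[] raw = { (byte) 0x8F, (byte) 0xC3, 0x28, (byte) 0xE2, 0x41 };

        // Decoding replaces invalid sequences with U+FFFD, so the
        // original bytes are lost before we ever re-encode them.
        byte[] roundTripped = new String(raw, StandardCharsets.UTF_8)
                .getBytes(StandardCharsets.UTF_8);

        System.out.println(Arrays.equals(raw, roundTripped)); // prints false
    }
}
```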
What is the proper procedure to convert from a Java string to a UTF-8 byte array and back to a Java string?