
I'm trying to translate existing code from Java into JavaScript (Node.js, to be specific). Creating an MD5 hash of the same string in both languages leads to the following different results:

In Java,

    Arrays.toString(MessageDigest.getInstance("MD5").digest("test".getBytes()));

returns

[9, -113, 107, -51, 70, 33, -45, 115, -54, -34, 78, -125, 38, 39, -76, -10]

while in JS,

    crypto.createHash("md5").update("test", "ascii").digest();

returns

[9, 143, 107, 205, 70, 33, 211, 115, …]

using crypto 1.0.1. Can anyone explain this to me? I have already played around with different encodings, but that did not affect the result.
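For anyone reproducing this, here is a minimal Node.js sketch that prints the digest bytes as plain numbers (the Array.from printing is my own addition, not part of the original snippet):

    const crypto = require("crypto");

    const digest = crypto.createHash("md5").update("test", "ascii").digest();
    console.log(Array.from(digest));
    // [ 9, 143, 107, 205, 70, 33, 211, 115, 202, 222, 78, 131, 38, 39, 180, 246 ]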

2 Answers


You're using signed bytes in Java; the two digests are actually the same. I would go with the unsigned version, but if you have to stay compatible with the Java output, just map the bytes: arr.map(function(e) { return e >= 128 ? e - 256 : e })
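A minimal sketch of the full conversion in Node.js, assuming the goal is to reproduce the Java output exactly (the variable names are my own):

    const crypto = require("crypto");

    // digest() returns a Buffer, i.e. unsigned bytes in the range 0..255
    const unsigned = crypto.createHash("md5").update("test", "ascii").digest();

    // copy into a plain array, then map to Java-style signed bytes (-128..127)
    const signed = Array.from(unsigned).map(function (e) { return e >= 128 ? e - 256 : e; });

    console.log(signed);
    // [ 9, -113, 107, -51, 70, 33, -45, 115, -54, -34, 78, -125, 38, 39, -76, -10 ]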


1 Comment

Oh well, that makes sense, because I'm actually using a Node.js Buffer, and the provided mapping function does not change that behavior. Nevertheless, thanks for your help!
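If I understand the Buffer issue correctly, mapping over the Buffer itself produces another 8-bit typed array, so the negative values wrap straight back into 0..255; copying into a plain Array first avoids that. A small sketch of the difference (my own code, not from the comment):

    const crypto = require("crypto");

    const buf = crypto.createHash("md5").update("test", "ascii").digest();

    // map() on the Buffer keeps the 8-bit storage, so the "signed" values wrap back:
    console.log(buf.map(function (e) { return e >= 128 ? e - 256 : e; }));  // bytes unchanged

    // copy into a plain Array first and the signed values survive:
    console.log(Array.from(buf).map(function (e) { return e >= 128 ? e - 256 : e; }));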

Both hashes are the same. Because of the unsigned/signed types you are seeing different numbers, but they have the same values. Try casting the values to unsigned.
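Going the other direction works too: in Java the usual idiom is masking each byte with & 0xFF, and the same mask applies in JavaScript. A sketch of that normalization (my own illustration, starting from the signed array the Java side prints):

    // signed bytes, as printed on the Java side
    const signed = [9, -113, 107, -51, 70, 33, -45, 115, -54, -34, 78, -125, 38, 39, -76, -10];

    // mask to unsigned (0..255), matching what the Node.js Buffer already contains
    const unsigned = signed.map(function (e) { return e & 0xFF; });

    console.log(unsigned);
    // [ 9, 143, 107, 205, 70, 33, 211, 115, 202, 222, 78, 131, 38, 39, 180, 246 ]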
