I'm using JavaScript, and I'm a little rusty on my bit arithmetic.
Ultimately, my goal is to convert a Uint8Array into 11-bit numbers for use with a BIP39 wordlist, so I can turn a libsodium private box key into a mnemonic (I'm building a small p2p-ish chat app).
So, my thought process is:
1. A Uint8Array is returned from libsodium.crypto_box_keypair()
2. Convert the Uint8Array into a 256-bit (boolean) array
3. Divide the 256-bit array into 11-bit buckets (a 2d array: ~24 x 11 bits)
4. Convert each 11-bit array to a base-10 number (between 0 and 2047)
Steps 2, 3, and 4 can be combined into the same loop.
The goal of all this is to efficiently convert from a Uint8Array to an array of 11-bit numbers (efficient for the computer -- it hasn't been efficient for me).
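For concreteness, here is a minimal sketch of what that combined loop might look like, assuming BIP39's convention (bytes concatenated most-significant-bit-first, 11-bit groups taken from the left); bytesToUint11 is just a placeholder name, and this is the shape I'm aiming for rather than code I've verified:

function bytesToUint11(input: Uint8Array): number[] {
  const out: number[] = [];
  let acc = 0;  // bit accumulator; holds at most 18 bits at a time
  let bits = 0; // number of bits currently in acc
  for (const byte of input) {
    acc = (acc << 8) | byte; // append the next 8 bits, MSB-first
    bits += 8;
    while (bits >= 11) {
      bits -= 11;
      out.push((acc >>> bits) & 0x7ff); // peel off the top 11 bits
      acc &= (1 << bits) - 1;           // keep only the remaining low bits
    }
  }
  // Note: 256 = 23 * 11 + 3, so a 32-byte key leaves 3 bits over here;
  // BIP39 normally appends a checksum so the total divides evenly by 11.
  return out;
}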
I have this at the moment, but it isn't quite right, and it feels somewhat hacky (it only covers steps 2 and 3, where I try to create the 11-bit buckets):
// inspired by: https://github.com/pvorb/node-md5/issues/25
export function toUint11Array(input: Uint8Array): boolean[][] {
  let result: boolean[][] = [];
  let currentChunk: boolean[] = [];
  input.forEach(byte => {
    // walk each byte's bits from most significant (j = 7) to least (j = 0)
    for (var j = 7; j >= 0; j--) {
      var b = ((byte >> j) & 0x1) > 0;
      // flush the current chunk once it holds 11 bits, then start a new one
      if (currentChunk.length === 11) {
        result.push(currentChunk);
        currentChunk = [];
      }
      currentChunk.push(b);
    }
  });
  return result;
}
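For step 4, converting one of these boolean chunks to a number could be a simple fold; a small sketch, assuming the first element of a chunk is the most significant bit (bitsToNumber is just a name I made up):

function bitsToNumber(bits: boolean[]): number {
  // fold MSB-first: shift the accumulator left and OR in the next bit
  return bits.reduce((acc, bit) => (acc << 1) | (bit ? 1 : 0), 0);
}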
Currently, for 2048, I get two 11-bit arrays (expected), but the content/order is unexpected:
[
  false, false, false, false,
  false, false, false, false,
  false, false, false
],
[
  false, true, false, false,
  false, false, false, false,
  false, false, false
]
2048 is 0b100_000_000_000, where the 12th digit from the right is the 1 (underscores added for easier reading).
So maybe I have an endianness problem and an off-by-one issue? The true in my 2d array is in the 13th position from the left.
Though, when I test with 4096 (13 bits: 0b1_000_000_000_000), I get this:
[
  false, false, false, false,
  false, false, false, false,
  false, false, false
],
[
  true, false, false, false,
  false, false, false, false,
  false, false, false
],
[
  false, false, false, false,
  false, false, false, false,
  false, false, false
]
Here, the true is 12th from the left, and 22nd from the right.
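To compare positions more easily, a throwaway way to eyeball the chunks as bit strings (using the toUint11Array above):

const chunks = toUint11Array(new Uint8Array([16, 0]));
console.log(chunks.map(chunk => chunk.map(b => (b ? '1' : '0')).join('')));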
Update
Per @bergi, who asked about endianness:
I don't know what endianness this is. :-\
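For what it's worth, the inner loop reads each byte's bits most-significant-first (j runs from 7 down to 0), which can be sanity-checked in isolation:

// bit order produced by ((byte >> j) & 0x1) for j = 7..0
const byte = 8; // 0b00001000
const bits: number[] = [];
for (let j = 7; j >= 0; j--) bits.push((byte >> j) & 0x1);
console.log(bits.join('')); // "00001000" -- most significant bit first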
Update 2
Thanks to @harold for coming up with the answer. I have some tests that I think confirm correctness.
const numbers = {
  ['32']: new Uint8Array([32]),
  ['64']: new Uint8Array([64]),
  ['2048']: new Uint8Array([8, 0]),
  ['4096']: new Uint8Array([16, 0]),
  ['7331']: new Uint8Array([28, 163])
}
test('toUint11Array | converts | 32 (8 bits)', function(assert) {
  const result = toUint11Array(numbers['32']);
  const expected = [32];
  assert.deepEqual(result, expected);
});

test('toUint11Array | converts | 2048 (12 bits)', function(assert) {
  const result = toUint11Array(numbers['2048']);
  const expected = [8, 0];
  assert.deepEqual(result, expected);
});

test('toUint11Array | converts | 4096 (13 bits)', function(assert) {
  const result = toUint11Array(numbers['4096']);
  const expected = [16, 0];
  assert.deepEqual(result, expected);
});

test('toUint11Array | converts | 7331 (13 bits)', function(assert) {
  const result = toUint11Array(numbers['7331']);
  const expected = [3, 1187];
  assert.deepEqual(result, expected);
});
The first three pass, but the last does not.
When converting new Uint8Array([28, 163]), I get [796, 28].
I'm not 100% sure that I converted 7331 into the appropriate bytes correctly, but I did:
7331 = 0b1_1100_1010_0011, split into bytes: [1_1100, 1010_0011] -> [28, 163].
I suppose the output should be [11, 100_1010_0011], which is [3, 1187] -- and that doesn't match the output either.
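A quick console sanity check of that arithmetic (plain bit operations, nothing project-specific):

7331 >> 8;    // 28   -> high byte   (0b1_1100)
7331 & 0xff;  // 163  -> low byte    (0b1010_0011)
7331 >> 11;   // 3    -> high 2 bits (0b11)
7331 & 0x7ff; // 1187 -> low 11 bits (0b100_1010_0011)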

Does crypto_box_keypair() always return a 32-byte array?
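If I'm reading the libsodium docs right, crypto_box keys are X25519 keys, so both the public and private halves should always be 32 bytes; a quick check with libsodium-wrappers:

import sodium from 'libsodium-wrappers';

await sodium.ready; // inside an async context
const { publicKey, privateKey } = sodium.crypto_box_keypair();
console.log(privateKey.length);                // 32
console.log(sodium.crypto_box_SECRETKEYBYTES); // 32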