
I'm trying to initialize a System.BitArray instance from an integer value. However, it looks like I don't get the right values.

My code is

        var b = new BitArray(BitConverter.GetBytes(0xfa2));
        for (int i = 0; i < b.Count; i++)
        {
            char c = b[i] ? '1' : '0';
            Console.Write(c);
        }
        Console.WriteLine();

I've also tried it without BitConverter:

        var b = new BitArray(new int[] { 0xfa2 });

But neither of these attempts seems to work. These are the approaches that were suggested here: Convert int to a bit array in .NET

My output: 01000101111100000000000000000000. The expected output: 111110100010.

Any help will be really appreciated!

  • Seems about right, except for some leading zeroes. Commented Aug 29, 2014 at 17:57
  • It has to do with the endianness of the architecture. Possible duplicate of stackoverflow.com/questions/217980/… Commented Aug 29, 2014 at 17:57
  • Do you expect 12 bits instead of 32, or are you talking about the order of the bits? Commented Aug 29, 2014 at 18:09

3 Answers


You are looping in the wrong direction. Try this:

    var b = new BitArray(BitConverter.GetBytes(0xfa2));
    for (int i = b.Count-1; i >= 0; i--)
    {
        char c = b[i] ? '1' : '0';
        Console.Write(c);
    }
    Console.WriteLine();
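
Note that this still prints all 32 bits of the int, so the output begins with twenty leading zeros (00000000000000000000111110100010). As a quick side check of the expected pattern, not part of the fix above, Convert.ToString with base 2 prints the same value without the leading zeros:

    // Side check: base-2 formatting drops leading zeros.
    Console.WriteLine(Convert.ToString(0xfa2, 2)); // 111110100010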


This is a little-endian vs. big-endian issue, so you need to take the endianness of the hardware architecture into consideration. Also, based on the documentation, you need to change the way you print the BitArray.

byte[] buffer = BitConverter.GetBytes((ushort)0xfa2);

// BitConverter returns the bytes in the machine's byte order,
// so normalize to big-endian before building the BitArray.
if (BitConverter.IsLittleEndian) Array.Reverse(buffer);

var b = new BitArray(buffer);

// Within each byte, BitArray index 0 is the least significant bit,
// so print each group of 8 bits from the highest index down.
for (int i = 0; i < b.Count; i += 8)
{
    for (int j = i + 7; j >= i; j--)
    {
        char c = b[j] ? '1' : '0';
        Console.Write(c);
    }
}

Console.WriteLine();
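
With the cast to ushort this should print 16 bits, 0000111110100010; the expected 12-digit form 111110100010 is the same value minus the four leading zeros.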

Reference:

The first byte in the array represents bits 0 through 7, the second byte represents bits 8 through 15, and so on. The Least Significant Bit of each byte represents the lowest index value: "bytes[0] & 1" represents bit 0, "bytes[0] & 2" represents bit 1, "bytes[0] & 4" represents bit 2, and so on.
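
To illustrate that mapping concretely (a small sketch of my own, not part of the answer above, and assuming the same using System; and using System.Collections; context as the snippet), each BitArray element can be checked against the documented byte-and-mask expression:

    // Sketch: BitArray index i should match bytes[i / 8] & (1 << (i % 8)),
    // i.e. low byte first, least significant bit first.
    byte[] bytes = { 0xa2, 0x0f };   // 0x0fa2 with the low byte in bytes[0]
    var bits = new BitArray(bytes);

    for (int i = 0; i < bits.Count; i++)
    {
        bool expected = (bytes[i / 8] & (1 << (i % 8))) != 0;
        Console.WriteLine("bit {0}: {1} (expected {2})", i, bits[i] ? 1 : 0, expected ? 1 : 0);
    }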



As stated in the documentation for the BitArray Constructor (Int32[]):

The number in the first values array element represents bits 0 through 31, the second number in the array represents bits 32 through 63, and so on. The Least Significant Bit of each integer represents the lowest index value: "values[0] & 1" represents bit 0, "values[0] & 2" represents bit 1, "values[0] & 4" represents bit 2, and so on.

When you use this constructor there is no need to check endianness; just reverse the output order:

var b = new BitArray(new int[] { 0xfa2 });
// skip leading zeros, but leave least significant bit:
int count = b.Count;
while (count > 1 && !b[count-1])
    count--;
// output
for (int i = count - 1; i >= 0; i--)
{
    char c = b[i] ? '1' : '0';
    Console.Write(c);
}
Console.WriteLine();
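
For 0xfa2 this prints exactly 111110100010. If the same formatting is needed in several places, the loop can be wrapped in a small helper; the class and method names below (BitFormatting, ToBitString) are purely illustrative choices, not anything from the original answer:

    using System;
    using System.Collections;
    using System.Text;

    static class BitFormatting
    {
        // Format an int as its binary digits, most significant set bit first,
        // using the BitArray(Int32[]) constructor described above.
        public static string ToBitString(int value)
        {
            var b = new BitArray(new int[] { value });

            int count = b.Count;
            while (count > 1 && !b[count - 1])   // skip leading zeros
                count--;

            var sb = new StringBuilder(count);
            for (int i = count - 1; i >= 0; i--)
                sb.Append(b[i] ? '1' : '0');
            return sb.ToString();
        }
    }

    // Usage: Console.WriteLine(BitFormatting.ToBitString(0xfa2)); // 111110100010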

