
Could someone help me convert a BitArray to a string properly? I wrote this:

static String BitArrayToStr(BitArray ba)
{
    byte[] strArr = new byte[ba.Length / 8];

    System.Text.ASCIIEncoding encoding = new System.Text.ASCIIEncoding();

    for (int i = 0; i < ba.Length / 8; i++)
    {
        // Pack 8 bits into one byte, least significant bit first.
        for (int index = i * 8, m = 1; index < i * 8 + 8; index++, m *= 2)
        {
            strArr[i] += ba.Get(index) ? (byte)m : (byte)0;
        }
    }

    return encoding.GetString(strArr);
}

but the output contains many unrecognised symbols, like this: "���*Ȱ&����L9��q�zȲP���*Ȱ&����L9��q�zȲP���*Ȱ&Y(W�". What should I do?


4 Answers


You can use this extension method:

public static string ToBitString(this BitArray bits)
{
    var sb = new StringBuilder();

    for (int i = 0; i < bits.Count; i++)
    {
        char c = bits[i] ? '1' : '0';
        sb.Append(c);
    }

    return sb.ToString();
}

1 Comment

Note that the order of bits may be the opposite of what you expect, e.g. if you want to print the binary representation of an integer or float. Usually, the least significant bit is on the right. With the above code, it may be the reverse. Of course, this is a question of interpretation and depends on what you are doing exactly.
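For instance (a quick sketch of my own, using the extension method above): new BitArray(byte[]) puts each byte's least significant bit at index 0, so the printed string reads LSB-first.

BitArray bits = new BitArray(new byte[] { 1 });
Console.WriteLine(bits.ToBitString()); // "10000000", not "00000001"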

Are you sure your input bit array is from a string that was encoded as ASCII?

Using your code, I ran the following test:

string s = "Hello World";
byte[] bytes = Encoding.ASCII.GetBytes(s);
BitArray b = new BitArray(bytes);
string s2 = BitArrayToStr(b);

And s2 came out with the value Hello World as expected.

Update:

As described in the comments, the ASCII encoder only treats bytes 32-126 as printable characters; bytes 0-31 (and 127) are control characters that display as odd symbols, and everything above 127 goes through the ASCII decoder fallback to handle the bad bytes.

Here is a quote from MSDN:

http://msdn.microsoft.com/en-us/library/system.text.decoderfallback.aspx

A decoding operation can fail if the input byte sequence cannot be mapped by the encoding. For example, an ASCIIEncoding object cannot decode a byte sequence that yields a character having a code point value that is outside the range U+0000 to U+007F.

When an encoding or decoding conversion cannot be performed, the .NET Framework provides a failure-handling mechanism called a fallback. Your application can use predefined .NET Framework encoder and decoder fallbacks, or it can create a custom encoder fallback derived from the EncoderFallback and EncoderFallbackBuffer classes or a custom decoder fallback derived from the DecoderFallback and DecoderFallbackBuffer classes.
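To make the fallback behaviour concrete, here is a small sketch of my own (not from the original answer). Encoding.ASCII silently replaces out-of-range bytes with '?'; swapping in an exception fallback makes the failure visible instead:

byte[] data = { 72, 105, 200 };                    // 200 is outside the 0-127 ASCII range
Console.WriteLine(Encoding.ASCII.GetString(data)); // "Hi?" - default replacement fallback

Encoding strict = Encoding.GetEncoding(
    "us-ascii",
    EncoderFallback.ExceptionFallback,
    DecoderFallback.ExceptionFallback);
// strict.GetString(data) now throws a DecoderFallbackException instead of substituting '?'.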

4 Comments

The problem is that I have random values in the BitArray.
The ASCII character set only covers 7-bit characters; the printable characters are in the byte range 32-126, and all other characters are non-printable. Below 32 are control "characters", and 128-255 are extended ASCII characters which require 8 bits; the symbols you see will depend on the font being used, or, if you use a code page encoding, on the code page you select. So if your random numbers fall outside the printable range, you will get the funny characters you are seeing.
So which encoding should I use that covers all possible binary variants?
Well, no encoding is guaranteed to give you a valid character for every byte combination. Even the Unicode transforms have things like surrogate pairs, i.e. byte sequences that form a character code point, but not all byte sequences necessarily form valid code points. You can experiment with the various encodings and possibly provide a custom decoder fallback routine to handle errors in a way that suits your requirements.
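For what it's worth, two common escape hatches (my suggestion, not something the thread settles on): ISO-8859-1 maps every byte value 0-255 to exactly one character, so it round-trips arbitrary bytes, and Base64 gives a printable string when readability matters.

byte[] random = { 0, 31, 128, 200, 255 };

// Latin-1 (ISO-8859-1) is a 1:1 byte-to-character mapping, so decoding always round-trips.
Encoding latin1 = Encoding.GetEncoding("ISO-8859-1");
string s = latin1.GetString(random);
byte[] back = latin1.GetBytes(s);        // identical to random

// Base64 if you need a printable, copy-paste-safe representation instead.
string printable = Convert.ToBase64String(random);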

You could use this method to convert the BitArray into a byte array:

public static byte[] ToByteArray(this BitArray bits) 
{
    int numBytes = bits.Count / 8;
    if (bits.Count % 8 != 0) numBytes++;

    byte[] bytes = new byte[numBytes];
    int byteIndex = 0, bitIndex = 0;

    for (int i = 0; i < bits.Count; i++) 
    {
        // Set bits most-significant-bit-first within each byte.
        if (bits[i])
            bytes[byteIndex] |= (byte)(1 << (7 - bitIndex));

        bitIndex++;
        if (bitIndex == 8) {
            bitIndex = 0;
            byteIndex++;
        }
    }

    return bytes;
}

And then:

BitArray ba = Fill();
string result = Encoding.ASCII.GetString(ba.ToByteArray());
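One caveat worth noting (my observation, not part of the original answer): this method packs bits most-significant-first, while new BitArray(byte[]) stores them least-significant-first, so combining the two reverses the bits within each byte:

BitArray ba2 = new BitArray(Encoding.ASCII.GetBytes("H")); // 'H' = 0x48
byte[] packed = ba2.ToByteArray();
// packed[0] is 0x12, the bit-reversal of 0x48 - decoding it will not give 'H' back.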

3 Comments

I was trying it this way, but I still get many "�" in the output. How can I fix this?
Well, then your BitArray is probably corrupted or not ASCII-encoded.
@Darin Dimitrov, the OP is correct: you need to use a different encoding to make this work. "ASCIIEncoding corresponds to the Windows code page 20127. Because ASCII is a 7-bit encoding, ASCII characters are limited to the lowest 128 Unicode characters" link
static string BitArrayToStr(BitArray ba, bool reverseBits) {
  // One byte per 8 bits, rounding up for a partial final byte.
  byte[] bytes = new byte[(ba.Length + 7) / 8];
  ba.CopyTo(bytes, 0); // packs bit i into the low-order end of each byte
  if (reverseBits) ReverseBits(bytes);
  return Encoding.ASCII.GetString(bytes);
}

static void ReverseBits(byte[] bytes) {
  for(int i=0; i<bytes.Length; i++)
    bytes[i] = ReverseBits(bytes[i]);
}

static byte ReverseBits(byte b) {
  b = (byte)(((b >> 1) & 0x55) | ((b & 0x55) << 1)); // swap adjacent bits
  b = (byte)(((b >> 2) & 0x33) | ((b & 0x33) << 2)); // swap adjacent 2-bit pairs
  b = (byte)(((b >> 4) & 0x0F) | ((b & 0x0F) << 4)); // swap nibbles
  return b;
}
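A brief usage sketch (my addition): BitArray.CopyTo packs index 0 into each byte's least significant bit, which matches how new BitArray(byte[]) loads bytes, so reverseBits: false round-trips; pass true only when your bits are stored most-significant-first.

BitArray ba = new BitArray(Encoding.ASCII.GetBytes("Hi"));
Console.WriteLine(BitArrayToStr(ba, false)); // "Hi"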

