If you look at your asciiString variable in the debugger, you will see that all three letters are there, but between them there is always a 0x00 char.
(screenshot from LINQPad Dump)

Such an embedded 0x00 is unfortunately interpreted as the end of the string, which is why you see only the first byte/letter.
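The effect can be reproduced with a minimal sketch, assuming the byte array was built by concatenating the BitConverter.GetBytes output for each char (a guess at the original code, not shown in the question):

```csharp
using System;
using System.Linq;
using System.Text;

class Program
{
    static void Main()
    {
        // Hypothetical reconstruction: one two-byte array per char, concatenated.
        byte[] bytes = new[] { 'a', 'b', 'c' }
            .SelectMany(c => BitConverter.GetBytes(c))
            .ToArray();

        // On a little-endian system this prints 61-00-62-00-63-00:
        // every letter is followed by a 0x00 byte.
        Console.WriteLine(BitConverter.ToString(bytes));

        string asciiString = Encoding.ASCII.GetString(bytes, 0, bytes.Length);

        // The string really contains all six chars (length 6), but the
        // embedded '\0' chars make many UI controls display only "a".
        Console.WriteLine(asciiString.Length);
    }
}
```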
The documentation of GetBytes(char) says that it returns:
An array of bytes with length 2.
If you now get the bytes from a single char:
byte[] byte1 = BitConverter.GetBytes('a');
you get the following result (on a little-endian system): 61 00 — the letter's ASCII value followed by a 0x00 padding byte.
The solution would be to pick only the bytes that are not 0x00:
bytes = bytes.Where(x => x != 0x00).ToArray();
string asciiString = Encoding.ASCII.GetString(bytes, 0, bytes.Length);
label1.Text = asciiString;
This example is based on the char overload of GetBytes, but the same holds for all other overloads of this method: each returns an array as wide as the corresponding data type. So this will always happen when the value is small enough that the trailing bytes of the array are left at 0!
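The padding is easy to see by comparing a few overloads side by side; GetBytes(int) always returns 4 bytes and GetBytes(long) always returns 8, regardless of the value:

```csharp
using System;

class Program
{
    static void Main()
    {
        byte[] fromChar = BitConverter.GetBytes('a');  // length 2
        byte[] fromInt  = BitConverter.GetBytes(97);   // length 4
        byte[] fromLong = BitConverter.GetBytes(97L);  // length 8

        // On a little-endian system all three start with 0x61 and
        // fill the remaining positions with 0x00 bytes.
        Console.WriteLine(BitConverter.ToString(fromChar)); // 61-00
        Console.WriteLine(BitConverter.ToString(fromInt));  // 61-00-00-00
        Console.WriteLine(BitConverter.ToString(fromLong)); // 61-00-00-00-00-00-00-00
    }
}
```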
a, b and c? Encoding is a bad idea. Assuming you want to be able to get back the original binary data, I'd advise Convert.ToBase64String instead.
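A minimal round-trip sketch with Convert.ToBase64String, which preserves every byte (including the 0x00 padding) instead of dropping or misinterpreting it:

```csharp
using System;

class Program
{
    static void Main()
    {
        // On a little-endian system: { 0x61, 0x00 }
        byte[] original = BitConverter.GetBytes('a');

        // Base64 encodes the raw bytes, so nothing is lost in transit.
        string text = Convert.ToBase64String(original);
        byte[] roundTripped = Convert.FromBase64String(text);

        Console.WriteLine(text);                // YQA=
        Console.WriteLine(roundTripped.Length); // 2
    }
}
```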