
I am a bit confused about how Java handles this conversion.

I have a char array that contains a 0 (not '0'):

char[] bar = {0, 'a', 'b'};
System.out.println(String.valueOf(bar));

When I run this, the println method does not output anything. But when I treat the zero as a character:

char[] bar = {'0', 'a', 'b'};
System.out.println(String.valueOf(bar));

it outputs "0ab" as expected.

My understanding was that if you declare an array of a primitive type without initializing its elements, like:

char[] foo = new char[10];

the empty cells get a default value of 0 in Java, so I thought it was OK to have 0 in the char array, but apparently not. Could anyone explain why the print method does not even output 'a' and 'b'?
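
For what it's worth, a quick check (just a sketch) seems to confirm the default element really is the numeric zero, not the digit character:

char[] foo = new char[10];
System.out.println(foo[0] == 0);   // true  - the default value is the integer 0
System.out.println(foo[0] == '0'); // false - not the digit character '0'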

  • It is OK to have 0 in a char array in the sense that you won't be arrested. But why would you expect the output to be the character "0"? That's what you would get if the first element of bar was 48. 0 corresponds to the NUL character, and maybe your display device does something "funny" when it gets that character. Commented Jul 10, 2010 at 2:08
  • I'm getting ` ab` when I run this in Eclipse, so I'd wager that @GregS is right. Commented Jul 10, 2010 at 2:12

2 Answers


As you have discovered, the character value of the integer 0 and the character literal '0' are different. To convert a number to the character representation of a decimal digit, you need to do something like this:

int digit = ... // some value between 0 and 9
char ch = (char)(digit + '0');

You can also do this as:

int digit = ... 
char ch = Character.forDigit(digit, radix);

where radix has a value between Character.MIN_RADIX and Character.MAX_RADIX. A radix of 10 gives you a decimal digit. See Character.forDigit(int,int).
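
For example, a small self-contained sketch (the variable names are just illustrative) comparing the two approaches:

int digit = 7;
char ch1 = (char)(digit + '0');           // arithmetic: offset into the contiguous '0'..'9' range
char ch2 = Character.forDigit(digit, 10); // library call, radix 10
System.out.println(ch1); // 7
System.out.println(ch2); // 7
System.out.println(Character.forDigit(15, 16)); // f - letters stand in for digits above 9 in radix 16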

The character value zero is the Unicode codepoint NULL (0x0000). This is a non-displayable "control character". If you try to display it, you are liable to see either nothing at all (as you are getting) or a special glyph that means "I cannot display this character".
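
You can verify that the character really is in the string even though nothing visible is printed. A minimal sketch, reusing the bar array from your question:

char[] bar = {0, 'a', 'b'};
String s = String.valueOf(bar);
System.out.println(s.length());        // 3  - all three chars are there, including NUL
System.out.println((int) s.charAt(0)); // 0  - the codepoint of the first char
System.out.println((int) s.charAt(1)); // 97 - 'a' is still there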



Your confusion is more basic than the title suggests, and it's related neither to arrays nor to printing, but to the char datatype in Java (and in many other languages, such as C). Try this:

char c1 = 0;
char c2 = '0';
System.out.println(c1 == c2); // false

And further:

char c1 = 48;  // Unicode codepoint of character '0' = 0x30 = 48
char c2 = '0';
System.out.println(c1 == c2); // true
c1 = 97; // Unicode codepoint of character 'a' = 0x61 = 97
c2 = 'a';
System.out.println(c1 == c2); // true

See, a char datatype is really an integer value which (conventionally) denotes a codepoint, i.e., the numeric value assigned to a textual character in Unicode (assuming we're inside the BMP). Recall that for ASCII characters, i.e. codepoints below 128, ASCII and Unicode assign the same values. So, char c1 = 'a' is sort of a shortcut for saying "assign c1 the integer value that corresponds to the codepoint of the character 'a'".
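
Because char is just an unsigned 16-bit integer type, you can even do arithmetic on it directly; a short sketch:

char c = 'a';
System.out.println((int) c);        // 97 - the underlying codepoint
System.out.println((char)(c + 1));  // b  - arithmetic works on codepoints
System.out.println(c + 1);          // 98 - in an int context the char is promoted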

