
When I call Integer.valueOf('a') it returns 97, which is the ASCII value of the char a.

I want to know where the char is converted to its ASCII value in the valueOf method.

  • Internally, parseInt is used. Commented Jul 18, 2017 at 9:16
  • It isn't converted, strictly speaking. ASCII is a character encoding standard which essentially maps a set of characters to numbers. The Java VM has this map with integers on one side and chars on the other. Commented Jul 18, 2017 at 9:20
  • @marekful Could you please explain more, as I can see that the parseInt method is being called internally. Commented Jul 18, 2017 at 9:29

2 Answers


A char is a numeric type. You are calling the Integer.valueOf(int i) method, which takes an int. The char is promoted from 16 to 32 bits by zero-extending it, and that value is passed to the valueOf method.
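The promotion can be seen directly. A minimal sketch (the explicit cast is only there to make the implicit widening visible):

```java
public class CharPromotion {
    public static void main(String[] args) {
        char c = 'a'; // code point 97
        // The compiler widens the char to int before the call,
        // so this resolves to Integer.valueOf(int):
        Integer boxed = Integer.valueOf(c);
        System.out.println(boxed); // prints 97
        // Equivalent, with the widening written out explicitly:
        System.out.println(Integer.valueOf((int) c)); // prints 97
    }
}
```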

Note that ASCII is almost certainly not involved here. Most Java compilers, like the Oracle javac reference compiler, will use a platform default character set such as Windows-1252 when parsing your source code, and internally Java uses Unicode. However, ASCII is a subset of both of those character sets and Unicode code point 97 LATIN SMALL LETTER A is common to all three.

I realize it's common to refer to code points as "ASCII values" but that can lead to confusion if one tries to speak of the "ASCII value" of, say, a Greek letter. It's better to say "code point" unless you're actually talking about the US-ASCII character set.
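To see why "code point" is the safer term, consider a character outside US-ASCII. A small sketch (GREEK SMALL LETTER ALPHA chosen as an arbitrary non-ASCII example):

```java
public class CodePointDemo {
    public static void main(String[] args) {
        char alpha = '\u03B1'; // GREEK SMALL LETTER ALPHA
        // Integer.valueOf still works: it simply receives the widened code point.
        System.out.println(Integer.valueOf(alpha)); // prints 945
        // 945 is not an "ASCII value" -- US-ASCII only covers 0..127.
        System.out.println(alpha <= 127); // prints false
    }
}
```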




What happens here is an implicit type conversion.

So your code is equivalent to this:

char a = 'a';
int aint = (int)a;
Integer aInteger = Integer.valueOf(aint);
System.out.println(aInteger.toString());

Quoting the Oracle Java tutorial: https://docs.oracle.com/javase/tutorial/java/nutsandbolts/datatypes.html

char: The char data type is a single 16-bit Unicode character. It has a minimum value of '\u0000' (or 0) and a maximum value of '\uffff' (or 65,535 inclusive).

So the variable a holds the Unicode equivalent of the character 'a' (97).

What then happens is a widening primitive conversion: https://docs.oracle.com/javase/specs/jls/se7/html/jls-5.html

5.1.2. Widening Primitive Conversion

[...]

A widening conversion of a char to an integral type T zero-extends the representation of the char value to fill the wider format.

So your char (16 bits) will be widened to an int (32 bits).

That int is what's being fed into Integer.valueOf().
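The "zero-extends" wording in the spec matters: even a char with its high bit set widens to a non-negative int. A small sketch contrasting char with byte, which sign-extends instead:

```java
public class WideningDemo {
    public static void main(String[] args) {
        char maxChar = '\uffff';
        int fromChar = maxChar;       // zero-extended: 65535, not -1
        System.out.println(fromChar); // prints 65535

        byte b = (byte) 0xFF;
        int fromByte = b;             // sign-extended: -1
        System.out.println(fromByte); // prints -1
    }
}
```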

Hope that answers your question.

Tobi

