I am writing a program that deals with letters from a foreign alphabet. The program takes as input a number corresponding to a character's Unicode code point. For example, 062A is the code point assigned to one such character.
I first ask the user to input a number that corresponds to a specific letter, e.g. 062A. I am now attempting to turn that number into a 16-bit integer that Python can decode, so I can print the character back to the user.
example:
for \u0394
print(bytes([0x94, 0x03]).decode('utf-16'))
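For context, the hard-coded byte pair above can also be produced from the code point itself; this is a minimal sketch (using 0x0394 purely as the example value), not necessarily the approach I should take:

```python
# Sketch: build the little-endian UTF-16 bytes for a code point
# instead of hard-coding them, then decode back to the character.
code_point = 0x0394
raw = code_point.to_bytes(2, byteorder='little')  # b'\x94\x03'
print(raw.decode('utf-16-le'))
```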
However, when I use
int('062A', '16')
I receive this error:
ValueError: invalid literal for int() with base 10: '062A'
I know it is because of the 'A' in the string, but that is part of the character's Unicode code point. Can anyone help me?
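To show concretely what I am trying to achieve, here is a minimal sketch of the intended conversion (the variable names are just illustrative; note that the base argument would need to be the integer 16, not the string '16'):

```python
# Parse the user's hex string with base 16 (an int, not '16'),
# then chr() turns the code point directly into the character.
user_input = '062A'            # hypothetical user input
code_point = int(user_input, 16)
print(chr(code_point))         # prints the corresponding letter
```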