Issue
This works fine:
int foo = bar.charAt(1) - '0';
Yet this doesn't - because bar.charAt(x) returns a char:
int foo = bar.charAt(1);
It seems that subtracting '0' from the char is casting it to an integer.
Why, or how, does subtracting '0' (a char, not a string) convert another char into an integer?
Solution
That's a clever trick. In Java, char is an unsigned 16-bit integral type (the same width as short, although short is signed). The digit characters '0' through '9' occupy consecutive code points (48 through 57 in ASCII/Unicode), so when you have a char that represents a digit (like '1') and you subtract the smallest digit character '0' from it, you're left with the digit's numeric value (hence, 1).
The conversion isn't really a cast triggered by the subtraction itself: whenever a char takes part in arithmetic, both operands undergo binary numeric promotion to int, so the result of bar.charAt(1) - '0' is already an int. That promotion happens automatically; no explicit cast is needed.
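To make the promotion visible, here's a minimal sketch (the string literal and class name are made up for illustration):

```java
public class CharDigitDemo {
    public static void main(String[] args) {
        String bar = "a1b";

        // bar.charAt(1) is the char '1', whose code point is 49.
        // Subtracting '0' (code point 48) promotes both chars to int,
        // so foo holds the digit's numeric value: 49 - 48 = 1.
        int foo = bar.charAt(1) - '0';
        System.out.println(foo); // prints 1

        // Without the subtraction, the char is merely widened to int,
        // so you get the code point instead of the digit value.
        int codePoint = bar.charAt(1);
        System.out.println(codePoint); // prints 49
    }
}
```

For arbitrary digits (including non-Latin ones), Character.getNumericValue(char) does the same job without relying on code-point arithmetic.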
Answered By - Lukas Eder