Issue
I'm confused about why Java integer literals default to int instead of long. This seems to cause unnecessary confusion.
First, it requires the programmer to adopt a special syntax (appending "L" to the literal) when assigning to a long a value that exceeds the maximum int value (2147483647).
long x = 2147483647; // Compiles
long y = 2147483648; // Does not compile
long z = 2147483648L; // Compiles
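Note that the literal's type matters in arithmetic as well: an expression built only from int literals is evaluated entirely in int, so it can silently overflow before the result is widened to long. A small illustration (the first line compiles without complaint but wraps around):

// All operands are int, so the multiplication overflows in int
// arithmetic before the result is widened to long.
long microsPerDay = 24 * 60 * 60 * 1000 * 1000;  // 500654080 (wrong)
long correct = 24L * 60 * 60 * 1000 * 1000;      // 86400000000 (right)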
Second, when using the Long wrapper class, the programmer must always use the long literal notation, as explained in this SO question.
Long x = 250; // Does not compile
Long y = 250L; // Compiles
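For what it's worth, the usual workarounds sketch out as follows: assignment conversion will box an int to Integer, or widen an int to long, but it won't widen and box in the same step, so the value has to already be a long before it can be boxed.

Long a = 250L;              // long literal is boxed to Long
Long b = (long) 250;        // explicit widening cast, then boxing
Long c = Long.valueOf(250); // 250 is widened to long at the call site
Integer d = 250;            // boxing int to Integer needs no widening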
Third, considering that implicit conversion from int literals to the "narrower" data types (short and byte) works just fine in all situations (that I know of), it seems that simply making all integer literals type long would have been the obvious solution... right? Wouldn't this completely remove the need for this odd system of appending "L" to integer literals in special cases?
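To make that third point concrete, here is a sketch of the implicit narrowing: an int constant expression is narrowed automatically in an assignment, but only when its value fits the target type.

byte b = 100;    // Compiles: 100 fits in a byte (max 127)
short s = 30000; // Compiles: 30000 fits in a short (max 32767)
byte big = 200;  // Does not compile: 200 exceeds byte's range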
Solution
This behavior is by design¹ and is codified in the Java Language Specification (JLS).
First, note that this is not about widening, which is how a (valid) int literal is promoted to a long value. Instead, it comes down to the specification of the int literal itself:
It is a compile-time error if a hexadecimal, octal, or binary int literal does not fit in 32 bits.
A parallel rule in the JLS covers decimal literals: a decimal int literal larger than 2147483647 is likewise a compile-time error, with 2147483648 permitted only as the operand of the unary minus operator. The smallest and largest signed 32-bit integer values are -2147483648 and 2147483647, respectively.
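A quick sketch of these rules in action (the commented-out lines are the ones the compiler rejects):

int a = 0xFFFFFFFF;     // Compiles: fits in 32 bits, value is -1
// int b = 0x100000000; // Does not compile: needs 33 bits
int c = -2147483648;    // Compiles: 2147483648 under unary minus
// int d = 2147483648;  // Does not compile: decimal literal too large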
¹ I'd rather not speculate on why it works this way; note that languages like C# have different rules.
Answered By - user166390