Issue
import Foundation

let decimalA: Decimal = 3.24
let decimalB: Double = 3.24
let decimalC: Decimal = 3.0 + 0.2 + 0.04

print(decimalA) // Prints 3.240000000000000512
print(decimalB) // Prints 3.24
print(decimalC) // Prints 3.24
I'm totally confused. Why does this happen? I know why binary floating point numbers lose precision, but I can't understand why Decimal loses precision when it is supposed to store decimal numbers.
I want to know how I can initialize a Decimal without losing precision. An explanation of why this happens would also be very helpful. Sorry for my poor English.
Solution
The problem is that a floating point literal is represented as a binary Double value before the Decimal is created: Decimal's ExpressibleByFloatLiteral conformance receives a Double, so the precision is already lost at that point. Unfortunately Swift can't initialise a Decimal from a floating point literal without going through that binary representation.
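As a rough illustration of that intermediate Double step (the printed digits below are the ones reported in the question; they may vary between Foundation versions), the explicit Double-based initialiser shows the same drift as the literal form:

import Foundation

// Both of these go through a binary Double representation of 3.24,
// so both carry the same small error:
let fromLiteral: Decimal = 3.24    // init(floatLiteral:) receives a Double
let fromDouble = Decimal(3.24)     // explicit init(_: Double)

print(fromLiteral)                 // e.g. 3.240000000000000512
print(fromDouble)                  // e.g. 3.240000000000000512

// Parsing the decimal digits directly avoids the binary step entirely:
print(Decimal(string: "3.24")!)    // 3.24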
If you want to keep precision, you need to initialise the Decimal from a String literal rather than a floating point literal.
let decimalA = Decimal(string: "3.24")!
let double = 3.24
let decimalC: Decimal = 3.0 + 0.2 + 0.04
print(decimalA) // Prints 3.24
print(double) // Prints 3.24
print(decimalC) // Prints 3.24
Bear in mind that this issue only happens with floating point literals, so if your floating point numbers are generated or parsed at runtime (such as reading from a file or parsing JSON), you shouldn't face this precision loss.
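For example, here is a minimal sketch (with made-up input values) of converting numbers that arrive as text at runtime; Decimal(string:) parses the decimal digits directly, so no binary floating point value is involved:

import Foundation

// Hypothetical price strings, e.g. read from a file or a JSON payload.
let rawPrices = ["3.24", "0.1", "19.99"]

// Convert each string straight to Decimal, dropping any that fail to parse.
let prices = rawPrices.compactMap { Decimal(string: $0) }
print(prices) // [3.24, 0.1, 19.99]

// Decimal arithmetic keeps the exact decimal values.
let total = prices.reduce(Decimal(0), +)
print(total) // 23.33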
Answered By - Dávid Pásztor