Many SQL developers like to use the MONEY data type because it tends to be faster during computations and a byte cheaper to store (8 bytes, versus 9 bytes for an equivalent decimal(19,4)). Still, in the next few examples I am going to show some issues that can lead to a loss of precision.
Let us try the following code.
```sql
declare @d1 money, @d2 money, @res money
set @d1 = 18.4172
set @d2 = 1.00

set @res = @d2/@d1
select @res
```
The exact result is 0.054297..., and by all mathematical rules it should be rounded to 0.0543. Still, the query returns 0.0542.
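To see the difference, here is a minimal side-by-side of the same division carried out on money and on decimal operands (the @m*/@p* variable names are just for illustration):

```sql
declare @m1 money, @m2 money
declare @p1 decimal(19,4), @p2 decimal(19,4)
set @m1 = 18.4172
set @m2 = 1.00
set @p1 = 18.4172
set @p2 = 1.00

-- money/money keeps only four decimal places and drops the rest
select @m2/@m1                          -- 0.0542

-- decimal/decimal keeps full precision, so reducing the scale
-- afterwards rounds instead of truncating
select convert(decimal(10,4), @p2/@p1)  -- 0.0543
```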
Let's look at another example.
```sql
declare @d1 money, @d2 money, @res money
set @d1 = 18.4172
set @d2 = 1.1562

set @res = @d2*@d1
select @res
```
The exact product is 21.29396664, but the query returns 21.294: the extra decimal places are lost.
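The full-precision product is easy to confirm by widening one operand to decimal first, which makes the whole expression decimal; a small check (decimal(19,8) is just an illustrative choice of precision):

```sql
declare @d1 money, @d2 money
set @d1 = 18.4172
set @d2 = 1.1562

-- with a decimal operand, the multiplication keeps all eight
-- decimal places of the exact product
select convert(decimal(19,8), convert(decimal(19,8), @d2) * @d1)  -- 21.29396664
```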
What happens here is that SQL Server carries out money arithmetic on the underlying scaled integers and keeps only four decimal places at every step: the division in the first example was truncated to 0.0542, and the exact product in the second was rounded to 21.2940. If you use a SQL Server database combined with a .NET application, this can lead to precision loss, because .NET's decimal type keeps full precision through intermediate calculations and may therefore arrive at a different final digit than the database.
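Under the hood, money holds its value multiplied by 10,000 in an eight-byte integer, so the first example is really integer division; a sketch of that arithmetic (the literals are the scaled forms of 1.00 and 18.4172):

```sql
-- money stores value * 10000, so 1.00 / 18.4172 becomes integer
-- arithmetic on 10000 and 184172: the dividend is rescaled by
-- 10000 and the remainder of the division is simply dropped
select (10000 * 10000) / 184172    -- 542, i.e. 0.0542
select (10000.0 * 10000) / 184172  -- 542.9707..., the digits that are lost
```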
One possible solution is to cast money values to a more precise data type before such operations. The following example works correctly.
```sql
declare @d1 money, @d2 money, @res money
set @d1 = 18.4172
set @d2 = 1.1562

set @res = convert(decimal(19,4), @d2)/@d1
select @res
```
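For comparison, here is the same division with and without the conversion (the result comments assume the inputs above); assigning the decimal quotient to the money variable rounds it to 0.0628, whereas plain money division truncates to 0.0627:

```sql
declare @d1 money, @d2 money
set @d1 = 18.4172
set @d2 = 1.1562

select @d2/@d1,                         -- 0.0627 (money division, truncated)
       convert(decimal(19,4), @d2)/@d1  -- 0.062778..., rounds to 0.0628 in @res
```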