Details
- Type: Bug
- Status: Open
- Priority: Minor
- Resolution: Unresolved
- Affects Version/s: 5.5.40
- Fix Version/s: 5.5
- Component/s: None
- Labels: None
Description
MariaDB [test]> select 1 % 0.00000000000000000000000000000032349 ;
+-------------------------------------------+
| 1 % 0.00000000000000000000000000000032349 |
+-------------------------------------------+
| 0.000000000000000000000000000000          |
+-------------------------------------------+
1 row in set (0.00 sec)
Technically, the second operand exceeds the maximum DECIMAL scale (it has 30 zeros after the decimal point, so all of its significant digits fall beyond scale 30), so it wouldn't be strange if the query treated it as 0 and produced 0. But the current result is difficult to explain.
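For reference, DECIMAL in MariaDB/MySQL supports at most 30 fractional digits, so rounding the operand to the largest representable scale does give zero. A minimal sketch of that check (assuming standard CAST rounding behavior; this is not a captured transcript):

SELECT CAST(0.00000000000000000000000000000032349 AS DECIMAL(65,30));
-- every significant digit lies beyond scale 30, so the cast rounds
-- the value to 0.000000000000000000000000000000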
MySQL 5.5 and PostgreSQL return a non-zero value:
postgres=# select 1 % 0.00000000000000000000000000000032349 ;
?column?
---------------------------------------
0.00000000000000000000000000000018574
MySQL [test]> select 1 % 0.00000000000000000000000000000032349 ;
+-------------------------------------------+
| 1 % 0.00000000000000000000000000000032349 |
+-------------------------------------------+
| 0.00000000000000000000000000000018574     |
+-------------------------------------------+
1 row in set (0.00 sec)
Issue Links
- relates to MDEV-6841 test decimal math (Open)
Activity
Actually, it's not really 0. If I assign the result of the expression to a variable, I can see the real value:
MariaDB [test]> set @a = 1 % 0.00000000000000000000000000000032349;
Query OK, 0 rows affected (0.00 sec)

MariaDB [test]> select @a;
+---------------------------------------+
| @a                                    |
+---------------------------------------+
| 0.00000000000000000000000000000018574 |
+---------------------------------------+
1 row in set (0.00 sec)

But if the problem is with the default precision of a returned value, it can be seen in much simpler examples:
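(A hypothetical simpler illustration, added here rather than taken from the original report: the default precision of a returned value is easiest to see with division, whose result scale comes from the div_precision_increment system variable rather than from the operands.)

SELECT 1/3;
-- returns 0.3333: the result scale is set by div_precision_increment
-- (default 4), not by the exact value of the quotient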
It is strange; I can't find anything in the documentation that would explain such low precision in the result.