Details
- Type: Bug
- Status: Closed
- Priority: Major
- Resolution: Fixed
- Affects Version/s: 10.0.9
- Fix Version/s: 10.0.12
- Component/s: None
- Labels: None
Description
After fix for MDEV-4410, EXPLAIN still shows weird #rows:
create table t2(a int);
insert into t2 values (0),(1),(2),(3),(4),(5),(6),(7),(8),(9);
create table t1 (key1 int, col1 int, key(key1));
insert into t1 select A.a + 10 * B.a + 100 * C.a, 1234 from t2 A, t2 B, t2 C;
mysql> explain update t1 set key1=key1+1 where key1 between 10 and 110 order by key1 limit 2;
+------+-------------+-------+-------+---------------+------+---------+------+------+---------------------------+
| id   | select_type | table | type  | possible_keys | key  | key_len | ref  | rows | Extra                     |
+------+-------------+-------+-------+---------------+------+---------+------+------+---------------------------+
|    1 | SIMPLE      | t1    | range | key1          | key1 | 5       | NULL |  101 | Using where; Using buffer |
+------+-------------+-------+-------+---------------+------+---------+------+------+---------------------------+
The output suggests that 101 rows are to be scanned. Actually, we expect to scan LIMIT / where_selectivity rows, where where_selectivity is the selectivity of the WHERE clause over the rows produced by the range scan.
If we assume where_selectivity=1 (as is often done across the optimizer when no other sources of information are present), then we expect to scan only LIMIT rows.
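The expected estimate above can be written out as a small calculation. This is a hypothetical illustration of the formula, not MariaDB optimizer code; the function name is invented:

```python
def expected_scanned_rows(limit, where_selectivity=1.0):
    """Rows EXPLAIN should report for an ordered, limited scan:
    LIMIT / where_selectivity.

    where_selectivity is the fraction of range-scan rows that
    satisfy the WHERE clause; it is assumed to be 1.0 when the
    optimizer has no other information.
    """
    return limit / where_selectivity

# For the query above: LIMIT 2 with assumed selectivity 1 gives an
# expected estimate of about 2 rows, not the 101 of the full range.
print(expected_scanned_rows(2))
```

With a selective WHERE (say selectivity 0.5), the same formula predicts scanning about 4 rows to find 2 matches, which is the case the last comment below worries about.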
Activity
This patch makes the issue go away.
Surprisingly, making get_index_for_order() update the *scanned_limit parameter caused fewer changes than I expected.
Somehow the change in that function only affects EXPLAIN UPDATE output.
I'm still concerned about what will happen for selective WHERE clauses.