Introduction
The purpose of this post is to warn of an edge case where the cardinality estimate can increase significantly after adding a conjunctive filter predicate. The problem can arise in Oracle 12c due to SQL plan directives on partitioned tables when a value is rare in one partition but appears frequently in other partitions.
Test data
Create the table with two partitions:
CREATE TABLE T
(
id number,
value varchar2(10)
)
PARTITION BY RANGE (id)
(
PARTITION t_id_1 VALUES LESS THAN (2) ,
PARTITION t_id_2 VALUES LESS THAN (3)
);
Firstly, fill the second partition with both rare and popular data:
INSERT INTO T
SELECT 2,'RARE_IN_2' value
FROM dual
CONNECT BY level <= 1;
INSERT INTO T
SELECT 2,'POPULAR' value
FROM dual
CONNECT BY level <= 1000000 ;
Secondly, insert the rare values from the second partition into the first partition:
INSERT INTO T SELECT 1,'RARE_IN_2' value FROM dual CONNECT BY level <= 1000;
Finally, gather statistics:
EXEC DBMS_STATS.gather_table_stats('SYS', 'T');
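If you want to verify the setup, the partition-level row counts should now reflect the skew built up by the inserts above: 1,000 rows in T_ID_1 and 1,000,001 rows in T_ID_2.
-- Partition-level row counts after gathering statistics
SELECT partition_name, num_rows
FROM dba_tab_partitions
WHERE table_owner = 'SYS' AND table_name = 'T'
ORDER BY partition_position;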
SQL plan directives
The following query selects the rare values from the second partition:
SELECT /*+ GATHER_PLAN_STATISTICS */ *
FROM t
WHERE id = 2 and value = 'RARE_IN_2' ;
As expected, the estimated cardinality is wrong. Since the VALUE column has only two distinct values and no histogram, the optimizer assumes it will get half of the partition's rows back (500K):
-----------------------------------------------------------------------------------------
| Id | Operation | Name | Starts | E-Rows | A-Rows | A-Time | Buffers |
-----------------------------------------------------------------------------------------
| 0 | SELECT STATEMENT | | 1 | | 1 |00:00:00.02 | 2211 |
| 1 | PARTITION RANGE SINGLE| | 1 | 500K| 1 |00:00:00.02 | 2211 |
|* 2 | TABLE ACCESS FULL | T | 1 | 500K| 1 |00:00:00.02 | 2211 |
-----------------------------------------------------------------------------------------
Predicate Information (identified by operation id):
---------------------------------------------------
2 - filter(("VALUE"='RARE_IN_2' AND "ID"=2))
However, because of the "AND" clause in the filter predicate, statistics feedback kicked in and, as a result, a SQL plan directive was created:
EXEC DBMS_SPD.flush_sql_plan_directive;
select type,state,reason,notes
from dba_sql_plan_directives
where directive_id in(
select directive_id from dba_sql_plan_dir_objects
where owner='SYS' and object_name='T'
);
TYPE STATE REASON
-------------------- ------ ------------------------------------------------------------
NOTES
----------------------------------------------------------------------------------------
DYNAMIC_SAMPLING USABLE SINGLE TABLE CARDINALITY MISESTIMATE
<spd_note>
<internal_state>NEW</internal_state>
<redundant>NO</redundant>
<spd_text>{EC(SYS.T)[ID, VALUE]}</spd_text>
</spd_note>
Thanks to dynamic sampling, the cardinality estimate for the second execution is correct:
SELECT /*+ GATHER_PLAN_STATISTICS */ *
FROM t t2 WHERE id = 2
and value = 'RARE_IN_2' ;
-----------------------------------------------------------------------------------------
| Id | Operation | Name | Starts | E-Rows | A-Rows | A-Time | Buffers |
-----------------------------------------------------------------------------------------
| 0 | SELECT STATEMENT | | 1 | | 1 |00:00:00.02 | 2211 |
| 1 | PARTITION RANGE SINGLE| | 1 | 1 | 1 |00:00:00.02 | 2211 |
|* 2 | TABLE ACCESS FULL | T | 1 | 1 | 1 |00:00:00.02 | 2211 |
-----------------------------------------------------------------------------------------
Predicate Information (identified by operation id):
---------------------------------------------------
2 - filter(("VALUE"='RARE_IN_2' AND "ID"=2))
Note
-----
- dynamic statistics used: dynamic sampling (level=2)
- 1 Sql Plan Directive used for this statement
Now, we are selecting all the data from the first partition. No surprise there, the estimate is correct:
SELECT /*+ GATHER_PLAN_STATISTICS */ *
FROM t WHERE id = 1 ;
-----------------------------------------------------------------------------------------
| Id | Operation | Name | Starts | E-Rows | A-Rows | A-Time | Buffers |
-----------------------------------------------------------------------------------------
| 0 | SELECT STATEMENT | | 1 | | 1000 |00:00:00.01 | 72 |
| 1 | PARTITION RANGE SINGLE| | 1 | 1000 | 1000 |00:00:00.01 | 72 |
|* 2 | TABLE ACCESS FULL | T | 1 | 1000 | 1000 |00:00:00.01 | 72 |
-----------------------------------------------------------------------------------------
Predicate Information (identified by operation id):
---------------------------------------------------
2 - filter("ID"=1)
To stir things up, we will add a conjunctive filter predicate:
SELECT /*+ GATHER_PLAN_STATISTICS */ *
FROM t WHERE id = 1 and value = 'RARE_IN_2' ;
-----------------------------------------------------------------------------------------
| Id | Operation | Name | Starts | E-Rows | A-Rows | A-Time | Buffers |
-----------------------------------------------------------------------------------------
| 0 | SELECT STATEMENT | | 1 | | 1000 |00:00:00.01 | 72 |
| 1 | PARTITION RANGE SINGLE| | 1 | 2760 | 1000 |00:00:00.01 | 72 |
|* 2 | TABLE ACCESS FULL | T | 1 | 2760 | 1000 |00:00:00.01 | 72 |
-----------------------------------------------------------------------------------------
Predicate Information (identified by operation id):
---------------------------------------------------
2 - filter(("ID"=1 AND "VALUE"='RARE_IN_2'))
Note
-----
- dynamic statistics used: dynamic sampling (level=2)
- 1 Sql Plan Directive used for this statement
The cardinality estimate soared, which is counterintuitive because an additional "AND" filter predicate can only reduce the number of returned rows. What happened here? Because of the additional filter predicate, the SQL plan directive and dynamic sampling kicked in and, owing to the skewed data, produced a wrong estimate. In this case, the basic column statistics would have led to a far better outcome:
SELECT /*+ GATHER_PLAN_STATISTICS dynamic_sampling(0) */ *
FROM t WHERE id = 1 and value = 'RARE_IN_2' ;
-----------------------------------------------------------------------------------------
| Id | Operation | Name | Starts | E-Rows | A-Rows | A-Time | Buffers |
-----------------------------------------------------------------------------------------
| 0 | SELECT STATEMENT | | 1 | | 1000 |00:00:00.01 | 72 |
| 1 | PARTITION RANGE SINGLE| | 1 | 1000 | 1000 |00:00:00.01 | 72 |
|* 2 | TABLE ACCESS FULL | T | 1 | 1000 | 1000 |00:00:00.01 | 72 |
-----------------------------------------------------------------------------------------
Predicate Information (identified by operation id):
---------------------------------------------------
2 - filter(("ID"=1 AND "VALUE"='RARE_IN_2'))
As a workaround, you might permanently disable the misleading SQL plan directive by using the procedure described in the post by Franck Pachot, and create a histogram on the skewed column.
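A minimal sketch of that workaround follows; &dir_id is a placeholder for the DIRECTIVE_ID returned by the dba_sql_plan_directives query above, and the histogram size is just an example.
-- Prevent the directive from being purged automatically ...
EXEC DBMS_SPD.alter_sql_plan_directive(&dir_id, 'AUTO_DROP', 'NO');
-- ... and permanently disable it
EXEC DBMS_SPD.alter_sql_plan_directive(&dir_id, 'ENABLED', 'NO');
-- Create a histogram on the skewed column
EXEC DBMS_STATS.gather_table_stats('SYS', 'T', method_opt => 'FOR COLUMNS VALUE SIZE 254');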
Update 18.11.2015: Oracle Support filed a bug for this problem: Bug 22229118 : WRONG CARDINALITY ESTIMATE WITH SQL PLAN DIRECTIVE.
Comments
Hi Nenad,
interesting case again. I think that this is a general CBO bug, as the DS result itself is plausible, but the corresponding cardinality adjustment is not. I rebuilt your test case for illustration purposes and, in addition, flushed the result cache to get valid/full iteration statistics in the dynamic sampling trace output.
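For completeness, flushing the result cache is a one-liner (the adaptive dynamic sampling queries cache their results via the result_cache hint visible in the DS query text below):
EXEC DBMS_RESULT_CACHE.flush;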
…
Table: T Alias: T
Card: Original: 1000.000000
qksdsSInitCtx(): qksdsSInitCtx(): timeLimit(ms) = 2500
…
DS Query Text:
SELECT /* DS_SVC */ /*+ dynamic_sampling(0) no_sql_tune no_monitoring optimizer_features_enable(default) no_parallel result_cache(snapshot=3600) */ SUM(C1) FROM (SELECT /*+ qb_name("innerQuery") NO_INDEX_FFS( "T") */ 1 AS C1 FROM "T" SAMPLE BLOCK(19.861, 8) SEED(1) "T" WHERE ("T"."ID"=1) AND ("T"."VALUE"='RARE_IN_2')) innerQuery
…
Iteration 1
Exec count: 3
CR gets: 27
CU gets: 0
Disk Reads: 19
Disk Writes: 0
IO Read Requests: 4
IO Write Requests: 0
Bytes Read: 155648
Bytes Written: 0
Bytes Exchanged with Storage: 155648
Bytes Exchanged with Disk: 155648
Bytes Simulated Read: 0
Bytes Simulated Returned: 0
Elapsed Time: 3949 (us)
CPU Time: 3950 (us)
User I/O Time: 113 (us)
qksdsDumpEStats(): Sampling Input
IO Size: 8
Sample Size: 19.860973
Post S. Size: 100.000000
…
DS Query Text:
SELECT /* DS_SVC */ /*+ dynamic_sampling(0) no_sql_tune no_monitoring optimizer_features_enable(default) no_parallel result_cache(snapshot=3600) */ SUM(C1) FROM (SELECT /*+ qb_name("innerQuery") NO_INDEX_FFS( "T") */ 1 AS C1 FROM "T" SAMPLE BLOCK(39.7219, 8) SEED(2) "T" WHERE ("T"."ID"=1) AND ("T"."VALUE"='RARE_IN_2')) innerQuery
…
Iteration 2
Exec count: 1
CR gets: 32
CU gets: 0
Disk Reads: 9
Disk Writes: 0
IO Read Requests: 2
IO Write Requests: 0
Bytes Read: 73728
Bytes Written: 0
Bytes Exchanged with Storage: 73728
Bytes Exchanged with Disk: 73728
Bytes Simulated Read: 0
Bytes Simulated Returned: 0
Elapsed Time: 1967 (us)
CPU Time: 1968 (us)
User I/O Time: 59 (us)
qksdsDumpEStats(): Sampling Input
IO Size: 8
Sample Size: 39.721946
Post S. Size: 100.000000
…
Final
Exec count: 4
CR gets: 59
CU gets: 0
Disk Reads: 28
Disk Writes: 0
IO Read Requests: 6
IO Write Requests: 0
Bytes Read: 229376
Bytes Written: 0
Bytes Exchanged with Storage: 229376
Bytes Exchanged with Disk: 229376
Bytes Simulated Read: 0
Bytes Simulated Returned: 0
Elapsed Time: 5916 (us)
CPU Time: 5918 (us)
User I/O Time: 172 (us)
qksdsDumpEStats(): Sampling Input
IO Size: 8
Sample Size:
Post S. Size: 100.000000
…
DS Service Statistics
qksdsDumpStats(): Executions: 2
Retries: 1
Timeouts: 0
ParseFails: 0
ExecFails: 0
qksdsDumpStats():
qksdsDumpResult(): DS Results: #exps=1, smp obj=T
qksdsDumpResult(): T.CARD = (mid=2029.1, low=2029.1, hig=2029.1)
qksdsDumpResult(): end dumping results
qksdsExecute(): exit
>> Single Tab Card adjusted from 1000.000000 to 2029.105000 due to adaptive dynamic sampling
…
The DS query with the block sample of 19.860973 returns NULL and the DS query with the block sample of 39.721946 returns 806. The DS itself got no errors, but one retry. However, the qksds-adjusted result is just plain wrong, even though the DS query itself produces a valid result. The sample clause itself is explained by Jonathan Lewis in one of his blog posts: https://jonathanlewis.wordpress.com/2010/03/12/sample-clause/
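The adjusted cardinality appears to be simply the sampled row count scaled up by the block sample percentage of the second iteration, which matches the 2029.105 in the trace:
-- 806 rows found in a 39.721946 % block sample, scaled up to the whole segment
SELECT 806 / (39.721946 / 100) AS adjusted_card FROM dual; -- ~2029.1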
Regards
Stefan
Hi Stefan,
Yes, the adjustment after the DS is way off for this kind of data.
I filed an SR and will update the post once I get feedback from Oracle Support.
Regards,
Nenad
Hi Stefan,
Oracle Support filed a new bug, Bug 22229118 : WRONG CARDINALITY ESTIMATE WITH SQL PLAN DIRECTIVE.
Regards,
Nenad