
SQL: Optimize the query on large table with indexing

Ask Time:2019-10-02T00:17:19         Author:Trần Kim Dự


For example, I have the following table:

table Product
------------
id
category_id 
processed
product_name

This table has single-column indexes on id, category_id, and processed, plus a composite index on (category_id, processed). The statistics on this table are:

select count(*) from Product; -- 50M records
select count(*) from Product where category_id=10; -- 1M records
select count(*) from Product where processed=1; -- 30M records

The simplest query I want to run is (select * is a must):

select * from Product 
where category_id=10 and processed=1 
order by id ASC LIMIT 100  

Without the LIMIT, the above query returns only about 10,000 records.

I want to run the above query many times. After each batch I update the processed field of the returned rows to 0, so they will not appear in the next query. When I test on real data, the optimizer sometimes chooses the index on id, which costs a lot of time.
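The fetch-then-mark loop described above can be sketched as follows. This is a minimal illustration using SQLite in place of the (unspecified) database from the question; the table layout and column names come from the question, while the helper name next_batch and the sample data are made up for the demo.

```python
import sqlite3

# In-memory stand-in for the 50M-row Product table from the question.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE Product (
        id INTEGER PRIMARY KEY,
        category_id INTEGER,
        processed INTEGER,
        product_name TEXT
    )
""")
conn.execute("CREATE INDEX idx_cat_proc ON Product (category_id, processed)")
conn.executemany(
    "INSERT INTO Product (id, category_id, processed, product_name) VALUES (?, ?, ?, ?)",
    [(i, 10, 1, "p%d" % i) for i in range(1, 251)],
)

def next_batch(conn, category_id, limit=100):
    """Fetch the next unprocessed ids, then mark them processed = 0
    so they do not appear in the following call."""
    rows = conn.execute(
        "SELECT id FROM Product WHERE category_id = ? AND processed = 1 "
        "ORDER BY id ASC LIMIT ?",
        (category_id, limit),
    ).fetchall()
    ids = [r[0] for r in rows]
    conn.executemany(
        "UPDATE Product SET processed = 0 WHERE id = ?",
        [(i,) for i in ids],
    )
    return ids

first = next_batch(conn, 10)   # lowest 100 unprocessed ids
second = next_batch(conn, 10)  # the next 100, no overlap with the first batch
```

Each call consumes the lowest remaining ids, so successive batches are disjoint and eventually drain the category.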

How can I optimize the above query (in general terms)?

P/S: to avoid confusion, I know that the best index would be (category_id, processed, id). But I cannot change the indexes. My question is only about optimizing the query.
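Given the constraint that the indexes cannot change, one common workaround is to pin the planner to the composite index so it cannot fall back to the index on id. The sketch below uses SQLite's INDEXED BY clause for this; in MySQL the analogous hint would be FORCE INDEX, so treat the exact syntax as database-specific and only the idea as portable.

```python
import sqlite3

# Table and indexes mirror the question's setup (minus the 50M rows).
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE Product (
        id INTEGER PRIMARY KEY,
        category_id INTEGER,
        processed INTEGER,
        product_name TEXT
    )
""")
conn.execute("CREATE INDEX idx_cat_proc ON Product (category_id, processed)")

# INDEXED BY forces the named index; the planner may no longer choose
# an id-ordered scan for this statement.
plan = conn.execute("""
    EXPLAIN QUERY PLAN
    SELECT * FROM Product INDEXED BY idx_cat_proc
    WHERE category_id = 10 AND processed = 1
    ORDER BY id ASC LIMIT 100
""").fetchall()

# Each plan row is (id, parent, notused, detail); the detail text names
# the access path actually chosen.
detail = " ".join(row[3] for row in plan)
print(detail)
```

Inspecting the plan text confirms the forced index is the one being searched.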

Thanks

Author:Trần Kim Dự, reproduced under the CC 4.0 BY-SA copyright license with a link to the original source and this disclaimer.
Link to original article:https://stackoverflow.com/questions/58188810/sql-optimize-the-query-on-large-table-with-indexing
Gordon Linoff:

For this query:

select *
from Product
where category_id = 10 and processed = 1
order by id asc
limit 100;

The optimal index is on product(category_id, processed, id). This is a single index with a three-part key, with the keys in this order.
2019-10-01T16:18:34
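The claim in the answer can be checked mechanically: with a (category_id, processed, id) index, the two equality predicates pin the first two key parts and the rows come back already ordered by id, so no separate sort step is needed. A small sketch using SQLite (whose planner differs from MySQL's, but illustrates the same index shape):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE Product (
        id INTEGER PRIMARY KEY,
        category_id INTEGER,
        processed INTEGER,
        product_name TEXT
    )
""")
# The three-part index recommended in the answer.
conn.execute("CREATE INDEX idx_cat_proc_id ON Product (category_id, processed, id)")

plan = conn.execute("""
    EXPLAIN QUERY PLAN
    SELECT * FROM Product
    WHERE category_id = 10 AND processed = 1
    ORDER BY id ASC LIMIT 100
""").fetchall()

# Each plan row is (id, parent, notused, detail); if the plan mentioned
# "USE TEMP B-TREE FOR ORDER BY", the sort would not be coming for free.
detail = " ".join(row[3] for row in plan)
print(detail)
```

The plan searches the composite index and contains no temporary B-tree step, which is exactly why this index order serves the LIMIT query cheaply.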