SQLite: Query Optimization for a Large Dataset with Multiple Joins and Conditions
I'm currently working on a report-generation feature in my SQLite-based application that retrieves data from multiple tables using several joins and conditions. As the dataset grows, query performance has degraded significantly, causing noticeable delays in response time. Here's my current SQL query:

```sql
SELECT a.name, b.amount, c.date
FROM customers AS a
JOIN orders AS b ON a.id = b.customer_id
JOIN payments AS c ON b.id = c.order_id
WHERE b.status = 'completed'
  AND c.date BETWEEN '2023-01-01' AND '2023-12-31'
ORDER BY c.date DESC;
```

Despite adding indices on `customer_id` in the `orders` table and `order_id` in the `payments` table, the query is still running slowly. I used `EXPLAIN QUERY PLAN` to analyze the execution plan; here's the output I received:

```
0|0|0|SCAN TABLE customers
0|0|1|SEARCH TABLE orders USING INDEX orders_customer_id_index (customer_id=?)
0|0|2|SEARCH TABLE payments USING INDEX payments_order_id_index (order_id=?)
```

From what I can tell, the joins do use the indices I created, but `customers` is still read with a full table scan, and nothing seems to cover the `status` filter or the `date` range and sort. I also considered breaking the query into smaller subqueries, but that didn't yield any improvement.

I'm currently on SQLite version 3.36.0 (upgrading from an older version). Are there any best practices or specific optimizations I can apply to improve this query's performance without changing the database schema? This is part of a larger application I'm building. What am I doing wrong? Any suggestions would be appreciated!