In my case, we have a table ABC containing, say, a single row.
When retrieving, I would like to get multiple rows where the start date increments by 1 day until it is equal to the end date.
Please let me know how we can do that.
How to solve:
You can do this in a couple of ways (all the code below is available on the fiddle here).
Solution using GENERATE_SERIES:
For starters, I inserted a few more records to make the use case a bit more realistic. I also added a couple of reasonable constraints to the table.
CREATE TABLE test
(
  id SMALLINT NOT NULL PRIMARY KEY,  -- PK added
  name TEXT NOT NULL,
  start_date DATE NOT NULL,
  end_date DATE NOT NULL,
  CONSTRAINT test_name_uq UNIQUE (name),                -- maybe, maybe not?
  CONSTRAINT sd_lt_ed_ck CHECK (start_date < end_date)  -- important!
);
INSERT INTO test VALUES
  (1, 'Rahul', '2021-06-01', '2021-06-06'),
  (2, 'Bill' , '2021-06-02', '2021-06-10'),
  (3, 'Mary' , '2021-07-15', '2021-07-22'),
  (4, 'Fred' , '2021-07-20', '2021-07-27'),
  (5, 'Joe'  , '2021-08-01', '2021-08-04'),
  (6, 'Jim'  , '2021-09-04', '2021-09-05'),
  (7, 'John' , '2021-09-17', '2021-09-21');
and then I ran the following query:
SELECT
  t.id,
  GENERATE_SERIES(t.start_date, t.end_date, '1 DAY')::DATE AS sd,
  t.end_date
FROM test t
ORDER BY t.id, t.start_date;
Result (snipped for brevity – see fiddle):
id  sd          end_date
 1  2021-06-01  2021-06-06
 1  2021-06-02  2021-06-06
 1  2021-06-03  2021-06-06
 1  2021-06-04  2021-06-06
 1  2021-06-05  2021-06-06
 1  2021-06-06  2021-06-06
 2  2021-06-02  2021-06-10
 2  2021-06-03  2021-06-10
... more records ...
An "edge case" of just two consecutive days also works nicely:

 6  2021-09-04  2021-09-05
 6  2021-09-05  2021-09-05
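If you want to sanity-check this expansion logic outside the database, here is a small Python sketch (not part of the original answer) that reproduces the inclusive, day-by-day behaviour of GENERATE_SERIES with a '1 DAY' step, including the two-consecutive-days edge case:

```python
from datetime import date, timedelta

def expand_range(start: date, end: date) -> list[date]:
    """Return every date from start to end inclusive, one day at a time,
    mirroring GENERATE_SERIES(start, end, '1 DAY') in PostgreSQL."""
    out = []
    d = start
    while d <= end:  # the endpoint is included, as with generate_series
        out.append(d)
        d += timedelta(days=1)
    return out

# Row id 6 above: two consecutive days yield exactly two rows.
edge = expand_range(date(2021, 9, 4), date(2021, 9, 5))
print(edge)  # [datetime.date(2021, 9, 4), datetime.date(2021, 9, 5)]
```

Note that both endpoints are included, which is why row 1 (2021-06-01 to 2021-06-06) expands to six rows, not five.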
I did a performance analysis of my solution versus the one offered by @Akina and, while mine appears to involve more operations, it is consistently faster – typically ~2/3 of the time, although occasionally it is slower. See the fiddle here.
Now, it is impossible to properly benchmark a solution when we’re only looking at 7 records on a server over which we have no control – what’s happening elsewhere on the machine? I would urge you to test any chosen solution using your own (test) system(s) to clarify this to your own satisfaction.
Solution using a RECURSIVE CTE (RCTE):
Another interesting solution uses an RCTE, as follows:
WITH RECURSIVE cte (n, id, sd, ed) AS
(
  SELECT 1, t.id, t.start_date, t.end_date
  FROM test t
  UNION ALL
  SELECT n + 1, c.id, (c.sd + INTERVAL '1 DAY')::DATE, c.ed
  FROM cte c
  WHERE c.sd < (SELECT z.end_date FROM test z WHERE z.id = c.id)
)
SELECT * FROM cte c2
ORDER BY c2.id, c2.sd, c2.ed;
The result is the same as for the query above, including the two-consecutive-days edge case.
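As a self-contained illustration of the same recursive idea, the following sketch (mine, not part of the original answer) runs the recursive CTE against SQLite via Python's sqlite3 module; SQLite spells the date increment as DATE(sd, '+1 day') instead of PostgreSQL's (sd + INTERVAL '1 DAY')::DATE, but the recursion is otherwise identical:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE test (
        id INTEGER PRIMARY KEY,
        name TEXT NOT NULL,
        start_date TEXT NOT NULL,
        end_date TEXT NOT NULL,
        CHECK (start_date < end_date)
    );
    INSERT INTO test VALUES (6, 'Jim', '2021-09-04', '2021-09-05');
""")

# Anchor: one row per source record; recursive step: advance sd by one day
# until it reaches ed (SQLite date syntax differs from PostgreSQL's).
rows = conn.execute("""
    WITH RECURSIVE cte (id, sd, ed) AS (
        SELECT id, start_date, end_date FROM test
        UNION ALL
        SELECT id, DATE(sd, '+1 day'), ed FROM cte WHERE sd < ed
    )
    SELECT id, sd, ed FROM cte ORDER BY id, sd
""").fetchall()
print(rows)  # [(6, '2021-09-04', '2021-09-05'), (6, '2021-09-05', '2021-09-05')]
```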
I’ve included the output of EXPLAIN (ANALYZE, BUFFERS) for both queries in the fiddle. As you can see, the GENERATE_SERIES query takes about 50% of the time of the RCTE. So why, you might ask, would one bother with the RCTE?
Well, they are very powerful and they allow the programmer to add complex logic to their queries – I would urge you to explore them – maybe not for this case, but they are worth bearing in mind for future scenarios.
P.S. Welcome to dba.se. For future reference, please do not post the same question both here and on Stack Overflow.
Database questions are very much on-topic here. If you’ve asked a question and don’t receive a response within a reasonable amount of time, feel free to ask elsewhere, but, in that case, please put a link from the old question to the new to avoid duplication of effort!
An alternative formulation joins the table directly to the series:

SELECT
  test.id,
  test.name,
  series.start_date::DATE,
  test.end_date
FROM test
CROSS JOIN generate_series(test.start_date, test.end_date, '1 day') AS series (start_date);