Anyway, he says:
(He first explains why this is easy in a normal client-server setup.) With an Internet connection, though, the customer comes back to the system and asks, "Remember me? I was here a little while ago and asked for some data. Can I have the next bit, please?" And Oracle says, "No, you terminated the session, so I closed the cursor. I can't fetch the next set of rows."
So what's the solution in the Internet environment? Option 1 is for the front end to know how many rows it has received so far and to keep repeating the query, possibly changing it slightly each time. For instance, on the first call it says, "Get me the first 20 rows." The second time it says, "Get me the first 40 rows and discard the first 20." (...snip - you get the idea)
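For what it's worth, Option 1 doesn't have to discard rows on the client side: Oracle's usual ROWNUM trick lets the server do the windowing. A minimal sketch, assuming a DBI handle $dbh; the customers table and its columns are invented for the example:

    # Fetch one "page" by re-running the query with a ROWNUM window.
    # $first is 1-based; e.g. the second page of 20 => $first=21, $last=40.
    sub fetch_page {
        my ($dbh, $first, $last) = @_;
        my $sth = $dbh->prepare(q{
            SELECT *
            FROM ( SELECT a.*, ROWNUM rnum
                   FROM ( SELECT id, name FROM customers ORDER BY name ) a
                   WHERE ROWNUM <= ? )
            WHERE rnum >= ?
        });
        $sth->execute($last, $first);
        return $sth->fetchall_arrayref;    # at most one page of rows
    }

Re-running the query each time still costs the server the full query work for every page, which is exactly his point.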
Option 2 is to write all the results into a permanent table on the first call, adding a spurious line number and a meaningless ID (passed back as a cookie, perhaps) to each row written; then the second call says, "Get the stored result lines 21 to 40 for cookie code XXX." (...snip)
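In DBI terms that might look roughly like this (a sketch only: the stored_results table, its columns, and the token scheme are all invented for the example):

    # First call: run the real query once and spool the rows into a
    # scratch table, numbering each row and tagging it with a token
    # that goes back to the browser as a cookie.
    my $token = sprintf "%08x%08x", time, int rand 0xffffffff;
    my $ins = $dbh->prepare(
        "INSERT INTO stored_results (token, line_no, id, name) VALUES (?,?,?,?)");
    my $sth = $dbh->prepare("SELECT id, name FROM customers ORDER BY name");
    $sth->execute;
    my $line = 0;
    while (my @row = $sth->fetchrow_array) {
        $ins->execute($token, ++$line, @row);
    }
    # ...send $token back to the browser as a cookie...

    # Later calls: pull one window of stored lines for that cookie.
    my $page = $dbh->selectall_arrayref(
        "SELECT id, name FROM stored_results
          WHERE token = ? AND line_no BETWEEN ? AND ?
          ORDER BY line_no",
        undef, $token, 21, 40);

You'd also need something to purge rows for abandoned sessions, which is part of why this is heavier than client/server.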
He points out that both approaches are heavier on the server than a traditional client/server approach.
If anyone has a good, efficient solution to this problem, I for one would be v. interested!
andy.
(I suppose that if you didn't mind a long wait before the first row appeared to the user, you could pass the whole result set in one go and use JavaScript to show only part of it at any one time. Pretty impractical, though.)
Update: it's just occurred to me that what you really need is a persistent cursor between requests. I wonder whether it would be possible to hold the statement handle between requests; then you could call fetchrow_arrayref() say 20 times, show the page, fetch the next 20 rows for the next request, and so on. merlyn has two columns that might be relevant.
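If the pieces cooperated, it might look something like the sketch below. Big caveats: this only works in a persistent interpreter (mod_perl, say), and only if every request from a given user lands in the same process, since statement handles can't be serialized into a cookie or shared across processes. The query and the %cursor cache are invented for the example:

    our %cursor;    # survives between requests in a persistent process

    sub next_page {
        my ($dbh, $session_id, $rows_per_page) = @_;

        # Reuse this session's open statement handle, or open a new one.
        my $sth = $cursor{$session_id} ||= do {
            my $s = $dbh->prepare("SELECT id, name FROM customers ORDER BY name");
            $s->execute;
            $s;
        };

        my @page;
        for (1 .. $rows_per_page) {
            my $row = $sth->fetchrow_arrayref or last;
            push @page, [@$row];   # copy: DBI reuses the same arrayref per fetch
        }

        # Cursor exhausted: let go of the handle.
        delete $cursor{$session_id} if @page < $rows_per_page;
        return \@page;
    }

Even then you'd want to time out abandoned cursors somehow, or each one holds Oracle resources until the process exits.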