LIMIT is indeed, er... limited. But there is usually some equivalent. For example, MS SQL Server allows
SELECT TOP 100 * FROM table
As ikegami said, it isn't portable. But if you use stored procedures for your queries, or something like SQL::Abstract, you should still be able to offload the limiting to the RDBMS.
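Here's a minimal hand-rolled sketch of that idea with plain DBI -- the DSN, credentials, and table name are placeholders, and modules like SQL::Abstract::Limit automate this same dialect-switching for you:

use strict;
use warnings;
use DBI;

# Map DBD driver names to that engine's row-limiting syntax.
my %limit_sql = (
    mysql  => 'SELECT * FROM table LIMIT 100',
    Pg     => 'SELECT * FROM table LIMIT 100',
    SQLite => 'SELECT * FROM table LIMIT 100',
    ODBC   => 'SELECT TOP 100 * FROM table',             # MS SQL Server
    Oracle => 'SELECT * FROM table WHERE ROWNUM <= 100',
);

my ($dsn, $user, $pass) = @ARGV;   # supply your own connection info
my $dbh = DBI->connect($dsn, $user, $pass, { RaiseError => 1 });
my $sql = $limit_sql{ $dbh->{Driver}{Name} }
    or die "No limit syntax known for driver $dbh->{Driver}{Name}\n";
my $sth = $dbh->prepare($sql);
$sth->execute;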
But, to answer the OP's question...
From a memory-conservation POV, you are best off looping to grab the first 100 rows. From a performance standpoint, you'll really have to benchmark, because it depends a lot on your RDBMS configuration. Even though the query has returned, all that means is that the DB has located the data -- it hasn't sent it yet. So, asking for all 10_000 rows might be a significant performance hit, especially if the results will be sent over a slow net link.
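A rough sketch of that benchmark, reusing the $dbh from above (the query is a placeholder; real numbers depend on the driver, the network, and the RDBMS configuration):

use Benchmark qw(cmpthese);

cmpthese(-2, {
    first_100 => sub {
        my $sth = $dbh->prepare('SELECT * FROM table');
        $sth->execute;
        my @rows;
        while (my $row = $sth->fetchrow_hashref) {
            push @rows, $row;
            last if @rows >= 100;
        }
        $sth->finish;   # tell the driver we're done early
    },
    all_rows => sub {
        my $sth = $dbh->prepare('SELECT * FROM table');
        $sth->execute;
        my @rows;
        while (my $row = $sth->fetchrow_hashref) { push @rows, $row }
    },
});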
The only issue with the loop approach is that you must either make sure your query returned at least 100 records or handle running out of data partway through. Something like:
my @results;
while (my $row = $sth->fetchrow_hashref) {
    push @results, $row;
    last if @results >= 100;   # checking after the push avoids fetching a 101st row
}
warn "Wasn't able to get 100 records\n" if @results < 100;