erix,
Sorry to take so long, real work got in the way.
Here's the updated info. I used BrowserUk's sub to generate the data. For the key part it was 40 bytes long, but the data part came out to 320 bytes, so I 'substr'-ed it to 80 (if 320 is what you intended, I can run the tests again). I ran the test with 1_000_000 records, since for your purposes the exact count didn't matter. I also multiplied the times by 1000 to get the results in milliseconds, and I generated the random keys at the beginning of the run so that they wouldn't already be in the cache.
Here it is:
    # Load phase: insert $howmany random records; every 113th key is
    # remembered in %khash (up to 10 of them) for the timed lookups below.
    while ( $cnt < $howmany ) {
        $key  = rndStr( 40, 'a'..'z' );                      # 40-byte key
        $data = substr( rndStr( 80, qw[a c g t] ), 0, 80 );  # 80-byte data
        if ( ( ( $cnt % 113 ) == 0 ) && ( scalar keys %khash < 10 ) ) {
            $khash{$key} = 0;
        }
        . . .

    # Lookup phase: fetch each remembered key four times and report the
    # elapsed time in milliseconds.
    for ( 1 .. 4 ) {
        foreach $key ( keys %khash ) {
            $stime = gettimeofday;
            $ret   = $cursor->c_get( $key, $data, DB_SET );
            $etime = sprintf( "%.6f", ( gettimeofday - $stime ) * 1_000 );
            print " $key Time: $etime ms\n";
        }
    }
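For reference, here is a minimal sketch of a rndStr( LENGTH, CHAR_LIST ) helper that matches the calls above; it's my own approximation, not necessarily the exact sub BrowserUk posted:

    # Build a string of $len characters drawn at random from @chars.
    sub rndStr {
        my $len   = shift;
        my @chars = @_;
        return join '', map { $chars[ rand @chars ] } 1 .. $len;
    }

    # Example: my $key  = rndStr( 40, 'a'..'z' );
    #          my $data = rndStr( 80, qw[a c g t] );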
Running it, here is the output:
    # time perl Show11M_mod.plx
    cds_enabled
    ## Start: VSZ-10292_KB RSS-4828_KB   BLOCK: 512  ( 1000000 )
     Write:    1049.66578292847    952/sec   1000000
     ReadNext: 28.9542100429535  34537/sec   Total: 1000000
    ## End:   VSZ-10292_KB RSS-6284_KB   Diff:0|1456_KB   BLOCK: 512
     rijrxyzhfvfhvpktkiedvmnpwdphswhavejjwqvr Time: 0.164032 ms
     evxacpuyerimyidhwfqnvqsjqzrdpgwxzywssakk Time: 0.089884 ms
     qrckdiakaaanjsrnvsswzuebxmtxeaznhpwdqgfn Time: 0.064135 ms
     pxlyvhbaujsfdwzsdjterlqeiothhpdzljizypbi Time: 0.066996 ms
     wfbqhvgjnltboojbctaszbaxlcwibjdjgmwzcusu Time: 0.050068 ms
     ukotkvoceuchbrrdegkixjdegzqclfxbwkdvrnkj Time: 0.043869 ms
     dcrcpnxnuhfrwmysbxnfmbzqhgeblvoyczoqboef Time: 0.052929 ms
     xsgzxvlivfwqirwmpjpdnbtifuvjqmbthmgtnbxh Time: 0.050068 ms
     qntwonibxslleldmlvanodhzlqhweeihlsarfznj Time: 0.053167 ms
     rpflfufduuqvtkydqswvgnyionloswworrdraplt Time: 0.057936 ms
     rijrxyzhfvfhvpktkiedvmnpwdphswhavejjwqvr Time: 0.012875 ms
     evxacpuyerimyidhwfqnvqsjqzrdpgwxzywssakk Time: 0.011921 ms
     qrckdiakaaanjsrnvsswzuebxmtxeaznhpwdqgfn Time: 0.010967 ms
     pxlyvhbaujsfdwzsdjterlqeiothhpdzljizypbi Time: 0.010967 ms
     wfbqhvgjnltboojbctaszbaxlcwibjdjgmwzcusu Time: 0.010967 ms
     ukotkvoceuchbrrdegkixjdegzqclfxbwkdvrnkj Time: 0.011206 ms
     dcrcpnxnuhfrwmysbxnfmbzqhgeblvoyczoqboef Time: 0.010967 ms
     xsgzxvlivfwqirwmpjpdnbtifuvjqmbthmgtnbxh Time: 0.010967 ms
     qntwonibxslleldmlvanodhzlqhweeihlsarfznj Time: 0.012159 ms
     rpflfufduuqvtkydqswvgnyionloswworrdraplt Time: 0.010967 ms
     rijrxyzhfvfhvpktkiedvmnpwdphswhavejjwqvr Time: 0.011921 ms
     evxacpuyerimyidhwfqnvqsjqzrdpgwxzywssakk Time: 0.012159 ms
     qrckdiakaaanjsrnvsswzuebxmtxeaznhpwdqgfn Time: 0.012159 ms
     pxlyvhbaujsfdwzsdjterlqeiothhpdzljizypbi Time: 0.010967 ms
     wfbqhvgjnltboojbctaszbaxlcwibjdjgmwzcusu Time: 0.010014 ms
     ukotkvoceuchbrrdegkixjdegzqclfxbwkdvrnkj Time: 0.010967 ms
     dcrcpnxnuhfrwmysbxnfmbzqhgeblvoyczoqboef Time: 0.010014 ms
     xsgzxvlivfwqirwmpjpdnbtifuvjqmbthmgtnbxh Time: 0.010967 ms
     qntwonibxslleldmlvanodhzlqhweeihlsarfznj Time: 0.010967 ms
     rpflfufduuqvtkydqswvgnyionloswworrdraplt Time: 0.010014 ms
     rijrxyzhfvfhvpktkiedvmnpwdphswhavejjwqvr Time: 0.011921 ms
     evxacpuyerimyidhwfqnvqsjqzrdpgwxzywssakk Time: 0.011921 ms
     qrckdiakaaanjsrnvsswzuebxmtxeaznhpwdqgfn Time: 0.010967 ms
     pxlyvhbaujsfdwzsdjterlqeiothhpdzljizypbi Time: 0.010967 ms
     wfbqhvgjnltboojbctaszbaxlcwibjdjgmwzcusu Time: 0.010967 ms
     ukotkvoceuchbrrdegkixjdegzqclfxbwkdvrnkj Time: 0.010967 ms
     dcrcpnxnuhfrwmysbxnfmbzqhgeblvoyczoqboef Time: 0.010967 ms
     xsgzxvlivfwqirwmpjpdnbtifuvjqmbthmgtnbxh Time: 0.010967 ms
     qntwonibxslleldmlvanodhzlqhweeihlsarfznj Time: 0.010967 ms
     rpflfufduuqvtkydqswvgnyionloswworrdraplt Time: 0.010967 ms

    real    18m17.387s
    user    1m52.459s
    sys     0m34.850s
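The pattern is what I was hoping to see: the first pass over the ten keys is slower (roughly 0.04-0.16 ms each), presumably because those pages have to come off disk, while the three repeat passes come back from the Berkeley DB cache at around 0.01 ms. For anyone following along, the $cursor in the snippet above comes from the BerkeleyDB module; this is only a rough sketch of the elided setup, with a placeholder filename and none of the CDS/environment settings my real script uses:

    use BerkeleyDB;                       # BerkeleyDB::Btree, DB_CREATE, DB_SET
    use Time::HiRes qw( gettimeofday );

    # Hypothetical, simplified setup -- the real Show11M_mod.plx opens its
    # database through a CDS environment, which is not shown here.
    my $db = BerkeleyDB::Btree->new(
        -Filename => 'test.db',
        -Flags    => DB_CREATE,
    ) or die "cannot open db: $BerkeleyDB::Error";

    my $cursor = $db->db_cursor();

    my ( $key, $data ) = ( 'some-40-byte-key', '' );
    my $stime = gettimeofday;
    my $ret   = $cursor->c_get( $key, $data, DB_SET );   # exact-key lookup
    my $etime = sprintf '%.6f', ( gettimeofday - $stime ) * 1_000;
    print " $key Time: $etime ms (ret=$ret)\n";          # ret 0 means found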
Regards...Ed
"Well done is better than well said." - Benjamin Franklin