Thanks, it looks like I overcomplicated it a lot; this is much faster and simpler. Perhaps because there was no answer like yours on SO? :-) FWIW, if compressibility (Compress::Zlib::compress) can serve as a measure of entropy, then deflating the joined output of 100 runs on the same 2096-item array input produces a string of practically the same length as with my solution. So, yeah, one initial shuffle followed by unshift/push/first seems OK to ensure enough randomness (and speed). A rough sketch of that compressibility check is below.
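For what it's worth, here is roughly what such a check can look like -- a minimal sketch only, assuming the "100 runs" are independent passes over the same input. The my_shuffle stand-in is hypothetical (just List::Util::shuffle), not the actual algorithm from the thread; substitute whatever implementation you want to compare.

    #!/usr/bin/perl
    use strict;
    use warnings;
    use List::Util qw(shuffle);
    use Compress::Zlib qw(compress);

    # Stand-in for the shuffle under test (hypothetical; swap in the real one).
    sub my_shuffle { return shuffle(@_) }

    my @items = (1 .. 2096);    # same input for every run

    # Join the output of 100 runs and deflate it; comparable compressed
    # lengths across implementations suggest comparable "randomness".
    my $joined = join "\n", map { join ',', my_shuffle(@items) } 1 .. 100;
    printf "raw: %d bytes, deflated: %d bytes\n",
        length($joined), length(compress($joined));

Comparing the deflated length from two implementations on the same input is only a crude proxy, of course, but a clearly shorter result would point to structure the compressor can exploit.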