in reply to Re: Handling HUGE amounts of data
in thread Handling HUGE amounts of data
This produces a file of 8,400 lines, each containing 17,000 random numbers (~1 GB), in a little under 7 minutes.
```perl
#! perl -slw
use strict;

use Math::Random::MT qw[ rand ];

for ( 1 .. 8400 ) {
    print join ',', map 1e5 + int( rand 9e5 ), 1 .. 17000;
}

__END__
[11:04:48.11] C:\test>885103 > junk.dat

[11:12:10.44] C:\test>dir junk.dat
30/01/2011  11:12       999,608,400 junk.dat
```
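As a quick sanity check (my own arithmetic sketch, not part of the original post), the reported file size follows directly from the generator's parameters: `1e5 + int( rand 9e5 )` always yields a six-digit number in 100000..999999, each line joins 17,000 of them with 16,999 commas plus a two-byte CRLF line ending on Windows, and there are 8,400 lines:

```python
# Derive the expected size of junk.dat from the generator's parameters.
digits_per_number = 6          # 1e5 + int(rand 9e5) -> 100000..999999
numbers_per_line = 17000
commas_per_line = numbers_per_line - 1
line_ending = 2                # CRLF, since the session runs on Windows

bytes_per_line = numbers_per_line * digits_per_number + commas_per_line + line_ending
total_bytes = bytes_per_line * 8400
print(total_bytes)  # 999608400, matching the `dir junk.dat` output above
```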
I appreciate that your application is doing something more complicated in terms of the numbers produced, but my point is that creating a file this size isn't hard.
So the question becomes, what problems are you having? What is it that your current code isn't doing? Basically, what is it that you are asking for help with? Because so far, that is completely unstated.
Replies:
- Re^3: Handling HUGE amounts of data by Dandello (Monk) on Jan 30, 2011 at 17:32 UTC
- by BrowserUk (Patriarch) on Jan 30, 2011 at 18:37 UTC
- by bart (Canon) on Jan 31, 2011 at 08:06 UTC