in reply to Re: Tokenising a 10MB file trashes a 2GB machine
in thread Tokenising a 10MB file trashes a 2GB machine
Here's one more output:
$ perl ./tokenizer.pl
8000028
68000056
Vsize: 322.05 MiB ( 337694720)
RSS  : 79143 pages
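(For reference, the Vsize/RSS lines above could be produced by something like the following sketch; this is not the actual tokenizer.pl, and it assumes Linux's /proc/self/statm and 4 KiB pages.)

#!/usr/bin/perl
use strict;
use warnings;

# Hypothetical helper: report the current process's virtual size and
# resident set size by reading /proc/self/statm (Linux-specific, in pages).
sub report_mem {
    open my $fh, '<', '/proc/self/statm' or die "statm: $!";
    my ($vsz_pages, $rss_pages) = split ' ', <$fh>;
    close $fh;
    my $page = 4096;    # assumed 4 KiB page size on i686
    printf "Vsize: %.2f MiB (%10d)\n",
        $vsz_pages * $page / 2**20, $vsz_pages * $page;
    printf "RSS  : %d pages\n", $rss_pages;
}

report_mem();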
OS:
$ uname -a
Linux ubuntu 2.6.24-19-generic #1 SMP Fri Jul 11 23:41:49 UTC 2008 i686 GNU/Linux
GCC version:
$ gcc -v
gcc version 4.2.3 (Ubuntu 4.2.3-2ubuntu7)