in reply to Re: string arrays
in thread string arrays

Fast native C import routines are pretty standard with any halfway decent RDBMS.

MySQL:
    LOAD DATA INFILE '/tmp/blah.txt' INTO TABLE mytable
        FIELDS TERMINATED BY '\t' LINES TERMINATED BY '\n';

MS SQL:
    BULK INSERT [table name] FROM [filename to insert]
        WITH (FIELDTERMINATOR = '\t', FIRSTROW = 2, ROWTERMINATOR = '\n');

Oracle:
    Two-stage control file insert (SQL*Loader), methodology similar to the above.

I would make the rash presumption that the user already has the data in a (text) file (hardly likely to type in 100 million strings by hand, as that would take literally years) and wants to manipulate them in some way. An RDBMS is probably the best solution. Multiple gigabytes of RAM is another possibility. A plain text file would seem to have rather dubious utility...
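For anyone who wants to try the RDBMS route without installing a server, here's a minimal sketch of the same idea using Python's stdlib sqlite3 as a stand-in for the LOAD DATA / BULK INSERT commands above. The file path, table, and column names are illustrative, not from the original post; a real MySQL or MS SQL bulk load will be much faster than this for 100 million rows.

```python
# Sketch: bulk-loading a tab-separated text file into a database.
# SQLite stands in for MySQL/MS SQL/Oracle; names are illustrative.
import csv
import os
import sqlite3
import tempfile

# Create a small sample file standing in for the user's big data file.
path = os.path.join(tempfile.gettempdir(), "blah.txt")
with open(path, "w", newline="") as f:
    w = csv.writer(f, delimiter="\t")
    w.writerows([("foo", 1), ("bar", 2), ("baz", 3)])

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE mytable (name TEXT, n INTEGER)")

# executemany over a csv.reader streams rows from disk instead of
# slurping the whole file into RAM -- a poor man's LOAD DATA INFILE.
with open(path, newline="") as f:
    rows = csv.reader(f, delimiter="\t")
    con.executemany("INSERT INTO mytable VALUES (?, ?)", rows)
con.commit()

print(con.execute("SELECT COUNT(*) FROM mytable").fetchone()[0])  # 3
```

Once the rows are in, the "manipulate them in some way" part becomes ordinary SQL (sort, dedupe, join) and the RAM question largely goes away.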

We use big RAM or a DB depending on the task. We have one data munging widget that takes several hours to run and consumes up to 4 GB of RAM. The same algorithm tied to disk to save memory took weeks to run. In this case the extra speed of in-memory processing easily offsets the RAM cost.

cheers

tachyon

s&&rsenoyhcatreve&&&s&n.+t&"$'$`$\"$\&"&ee&&y&srve&&d&&print