in reply to Memory utilization and hashes

SQLite is exactly what I would recommend in this case: "it's just a disk file," but it's ideally suited to this sort of thing. You can import data very rapidly into an SQLite table, and you can also use its ATTACH DATABASE feature to work with more than one database (file ...) at a time. It has a very fast indexer and a good query engine, and it won't blink at this number of rows. And, since you can easily open SQLite files with spreadsheets and so forth, you may well find that your need for custom programming is severely reduced or even eliminated. Hands down, this is the way I would do this.
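To make the bulk-import and ATTACH DATABASE points concrete, here is a minimal sketch in Python's built-in sqlite3 module. The table and column names (events, answers, ip, host) are illustrative assumptions, and in-memory databases stand in for the disk files you would use in practice:

```python
import sqlite3

# In real use these would be file paths, e.g. sqlite3.connect("main.db");
# ":memory:" keeps the sketch self-contained.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE events (id TEXT, host TEXT)")

# executemany() inside one transaction gives a fast bulk insert.
with con:
    con.executemany("INSERT INTO events VALUES (?, ?)",
                    [("4", "www.google.com"), ("5", "www.example.com")])

# ATTACH DATABASE lets one connection work with a second database file;
# its tables are then addressed with the "ans." prefix.
con.execute("ATTACH DATABASE ':memory:' AS ans")
con.execute("CREATE TABLE ans.answers (id TEXT, ip TEXT)")
with con:
    con.executemany("INSERT INTO ans.answers VALUES (?, ?)",
                    [("4", "3.4.5.6"), ("4", "1.2.4.5")])

# A cross-database join works just like a normal join.
results = list(con.execute(
    "SELECT e.id, e.host, a.ip FROM events e "
    "JOIN ans.answers a ON a.id = e.id ORDER BY a.ip"))
print(results)
```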

Re^2: Memory utilization and hashes
by bfdi533 (Friar) on Jan 18, 2018 at 23:51 UTC

    Not a bad thought, but note that I had an array in my hash which I needed in the JSON output:

    {"Answer":[{"ip":"3.4.5.6"},{"ip":"1.2.4.5"}],"id":"4","host":"www.google.com"}

    This is certainly doable in a database (SQLite or PostgreSQL), but it would involve a second table and then a more complicated query to get the rows back into the proper shape for the JSON.
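    The two-table layout hinted at above can be sketched as follows: a parent table with one row per record, a child table holding the repeated "ip" values, and a join whose rows are reassembled into the nested JSON on the application side. All names here are illustrative assumptions, not the poster's actual schema:

```python
import json
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE records (id TEXT PRIMARY KEY, host TEXT);
CREATE TABLE answers (id TEXT, ip TEXT);   -- one row per ip, keyed by record id
INSERT INTO records VALUES ('4', 'www.google.com');
INSERT INTO answers VALUES ('4', '3.4.5.6'), ('4', '1.2.4.5');
""")

# Rebuild the nested structure: one dict per record, with the child
# rows gathered into the "Answer" array.
docs = []
for rec_id, host in con.execute("SELECT id, host FROM records"):
    ips = [{"ip": ip} for (ip,) in
           con.execute("SELECT ip FROM answers WHERE id = ?", (rec_id,))]
    docs.append({"Answer": ips, "id": rec_id, "host": host})

print(json.dumps(docs[0]))
```

    Newer SQLite builds can also do the aggregation in SQL with the JSON1 functions (json_object, json_group_array), which trades the second query per record for a single grouped join.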

    Not as easy as it sounds in my specific use case, but certainly something I had considered at one point.

    Thanks for the pointer in this direction and the friendly reminder.

    2018-01-28 Athanasius changed pre to code tags