mohangopa has asked for the wisdom of the Perl Monks concerning the following question:


Replies are listed 'Best First'.
Re: Perl Caching
by roboticus (Chancellor) on Aug 08, 2013 at 01:12 UTC

    mohangopa:

    For the high level logic, I'd suggest something like:

    1) Initialize
    2) Establish communication
    3) Get row
    4) Generate ID
    5) Process row
    6) If you have more rows, go to step 3
    7) Profit!

    Seriously, though, with the rather sparse description of your problem, that's about as specific as I can be. Rather than providing a few incomprehensibly abbreviated data column names, you might actually describe your problem. With a proper specification, I could suggest a simple function that meets your requirements for generating IDs.

    ...roboticus

    When your only tool is a hammer, all problems look like your thumb.

      I am really sorry, guys. I did not state the problem correctly. Typically we update the securities once a day, and our transaction feeds run throughout the day. After the process that updates the security master runs, I build a cache file (huge -- close to a couple of million rows). I want to use this cache file to store the asset_id as a new column in my transaction feeds, so I can avoid the lookup while loading the warehouse. I can add a flag to indicate whether my cache file is ready to be used or not. Should I be using a hash to look up cusip/sedol/isin and pick up the asset_id, and how does this perform with large data files? Thanks, and I appreciate your response. Cheers. Raj

        mohangopa:

        I've found that hashes work well even when the datasets are fairly large, especially if the individual hash entries aren't too big. I'd go ahead and try it and see how well a hash performs in your application.

        Once you have it running, feed it a file twice the normal size, and watch the memory it consumes as it runs. If all goes well, then I'd guess you're done until your processing volume doubles. If it has RAM trouble, then I'd suggest using something like SQLite or DBM to give you a quick local cache with disk backing. That way, you can handle larger data sets without the overhead of communicating with the external database.
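        A minimal sketch of the in-memory hash approach. The pipe-delimited file layout (identifier|asset_id) and the function names are assumptions for illustration -- adjust the split to match the real cache file format:

        ```perl
        use strict;
        use warnings;

        # Build a lookup hash keyed by security identifier (cusip/sedol/isin).
        # Assumes one "identifier|asset_id" pair per line; adapt as needed.
        sub build_cache {
            my ($path) = @_;
            my %asset_id_for;
            open my $fh, '<', $path or die "Can't open $path: $!";
            while (my $line = <$fh>) {
                chomp $line;
                my ($id, $asset_id) = split /\|/, $line, 2;
                $asset_id_for{$id} = $asset_id;
            }
            close $fh;
            return \%asset_id_for;
        }

        # Fetch the asset_id for one transaction row; undef on a miss.
        sub lookup_asset_id {
            my ($cache, $id) = @_;
            return $cache->{$id};
        }
        ```

        Each lookup is a constant-time hash access, so the per-row cost stays flat regardless of file size; the limiting factor for a couple of million entries is RAM, not speed.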

        ...roboticus

        When your only tool is a hammer, all problems look like your thumb.
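        If the hash outgrows RAM, the DBM route mentioned above can be sketched with core SDBM_File via tie; the filename and keys here are illustrative:

        ```perl
        use strict;
        use warnings;
        use Fcntl qw(O_RDWR O_CREAT);
        use SDBM_File;

        # Tie a hash to an on-disk SDBM file so lookups don't need the
        # whole cache in memory. 'asset_cache' is an illustrative name;
        # SDBM creates asset_cache.pag/.dir in the current directory.
        tie my %asset_id_for, 'SDBM_File', 'asset_cache',
            O_RDWR | O_CREAT, 0644
            or die "Can't tie SDBM file: $!";

        # Store and fetch exactly like an ordinary hash.
        $asset_id_for{'037833100'} = '12345';    # cusip => asset_id
        my $asset_id = $asset_id_for{'037833100'};

        untie %asset_id_for;
        ```

        Note that SDBM caps each key+value pair at roughly 1KB, which is fine for short security identifiers; for bigger records, DB_File, BerkeleyDB, or SQLite via DBD::SQLite would be the usual upgrades.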

Re: Perl Caching
by Jenda (Abbot) on Aug 08, 2013 at 22:41 UTC

    Hire an actual software developer. Preferably one who has some experience with databases.

    Oh my, the joys of offshore outsourcing. I bet this guy also has ten years experience with Sybase in his CV.

    Jenda
    Enoch was right!
    Enjoy the last years of Rome.

      I bet this guy who can't resist poking fun is Jenda