in reply to Optimize Large, Complex JSON decoding

It seems to take a long time to simply unthread the JSON files, so I'm wondering if there are any tips on optimizing this process.

What is that? How did you determine the bottleneck?

Because on my really old laptop (9 years old), it takes 0.96875 seconds to slurp + decode + foreach 189279 "records" from a 21MB JSON file.
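For reference, a minimal timing harness along those lines might look like this (a sketch, assuming JSON::XS and a hypothetical `data.json` holding a top-level array of records):

```perl
use strict;
use warnings;
use JSON::XS qw(decode_json);
use Time::HiRes qw(time);

my $t0 = time;

# slurp: read the whole file into one scalar
open my $fh, '<:raw', 'data.json' or die "open: $!";   # hypothetical filename
my $json = do { local $/; <$fh> };
close $fh;

# decode: one shot, builds the full Perl structure in memory
my $data = decode_json($json);

# foreach: walk every record once
my $count = 0;
for my $rec (@$data) {
    $count++;
}

printf "decoded %d records in %.3f seconds\n", $count, time - $t0;
```

Wrapping each of the three phases in its own timer is the easiest way to see where the time actually goes before reaching for a different decoder.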

If I add in some Time::Piece strftime/strptime, it goes up to 9.984375 seconds.
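If strptime turns out to be the dominant cost, one common workaround (my suggestion, not from the post) is to memoize the parse, since record data often repeats the same timestamp strings — the format string below is a hypothetical example:

```perl
use strict;
use warnings;
use Time::Piece;

my %epoch_for;   # cache: raw timestamp string => epoch seconds

sub to_epoch {
    my ($str) = @_;
    # parse each distinct timestamp string only once
    $epoch_for{$str} //= Time::Piece->strptime($str, '%Y-%m-%d %H:%M:%S')->epoch;
    return $epoch_for{$str};
}

print to_epoch('2024-01-02 03:04:05'), "\n";
```

With a few hundred thousand records and far fewer distinct timestamps, this can cut the strptime overhead dramatically.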

I don't see much room for improvement in the decoding itself, although it looks like you could reduce the memory requirement with JSON::Streaming::Reader.
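A sketch of that module's pull-style token interface, as I understand it (the filename and record-counting logic are illustrative assumptions):

```perl
use strict;
use warnings;
use JSON::Streaming::Reader;

open my $fh, '<:raw', 'data.json' or die "open: $!";   # hypothetical filename
my $jsonr = JSON::Streaming::Reader->for_stream($fh);

my $depth   = 0;
my $records = 0;

# pull tokens one at a time instead of building the full structure in memory
while (my $token = $jsonr->get_token) {
    my ($type) = @$token;
    if ($type eq 'start_object') {
        $records++ if $depth == 1;   # objects directly inside the top-level array
        $depth++;
    }
    elsif ($type eq 'start_array') { $depth++ }
    elsif ($type eq 'end_object' or $type eq 'end_array') { $depth-- }
}

print "saw $records records\n";
```

Streaming trades some speed for a flat memory profile, so it helps most when the 21MB file grows beyond what decoding in one shot can comfortably hold.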
