Technically, yes. But nearly all JSON parsers I've seen are designed to slurp everything in at once and turn it into an in-memory data structure. So for very large files, you might (or might not) have to cobble together a custom parser that takes a stream-as-you-go approach.
Of course, that's where Perl comes into its own. Munching insanely huge text files is what it was designed for in the first place ;-)
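For what it's worth, JSON::XS ships an incremental parser that gets you most of the way there without hand-rolling anything, provided the file is a stream of concatenated (or newline-delimited) top-level JSON values rather than one giant array. A minimal sketch, with the file name and chunk size invented for illustration:

use strict;
use warnings;
use JSON::XS;

my $json = JSON::XS->new;

# "records.json" is a made-up example file holding a stream of
# whitespace/newline-separated JSON values.
open my $fh, '<', 'records.json' or die "open: $!";

my $chunk;
while (read($fh, $chunk, 65536)) {
    # Feed each 64 KiB chunk to the incremental parser; in list context
    # incr_parse returns every complete top-level value decoded so far
    # and keeps any trailing partial data buffered for the next chunk.
    for my $record ($json->incr_parse($chunk)) {
        # Process one decoded value at a time instead of slurping the file.
        print "decoded a top-level ", (ref($record) || 'scalar'), " value\n";
    }
}
close $fh;

If the whole file is wrapped in a single outer array, incr_parse won't hand anything back until the closing bracket, so at that point you're into JSON::SL / incr_text territory (or, indeed, a hand-rolled parser).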
perl -e 'use Crypt::Digest::SHA256 qw[sha256_hex]; print substr(sha256_hex("the Answer To Life, The Universe And Everything"), 6, 2), "\n";'
You're an Oracle man, right? Oracle has JSON_TABLE to alleviate the JSON weirdness, I think.
Postgres has the same functionality, but not yet the JSON_TABLE API/functions that the SQL standard prescribes. That JSON_TABLE work (for Postgres) is largely done, although not yet committed. Somewhat understandably, there seems to be a lack of interest: most DBAs look down a bit on the strangeness of the JSON data type and prefer tables of more conventional data types. I take it that in the Oracle world there is the same reluctance towards this encroachment of NoSQL-y types. I feel the same reluctance myself.
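For anyone who hasn't met it: JSON_TABLE projects a JSON document into ordinary rows and columns, so the rest of the query stays relational. A rough sketch from the Perl side via DBI/DBD::Oracle, with the table, columns and connection details all invented for the example:

use strict;
use warnings;
use DBI;

# Hypothetical database, credentials, "orders" table and "doc" JSON column.
my $dbh = DBI->connect('dbi:Oracle:mydb', 'user', 'pass', { RaiseError => 1 });

# JSON_TABLE unnests the $.items array of each document into plain rows.
my $sth = $dbh->prepare(q{
    SELECT jt.item_name, jt.qty
      FROM orders o,
           JSON_TABLE(o.doc, '$.items[*]'
             COLUMNS (
               item_name VARCHAR2(100) PATH '$.name',
               qty       NUMBER        PATH '$.qty'
             )
           ) jt
});
$sth->execute;
while (my ($name, $qty) = $sth->fetchrow_array) {
    print "$name: $qty\n";
}
$dbh->disconnect;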
I used to use Oracle back in the day. Nowadays, it's PostgreSQL all the way.
There are a few cases where I use JSON data types and similar, but they are very limited.
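For illustration, one typical limited use is keeping a document in a jsonb column and pulling the odd field out with PostgreSQL's ->> operator; a minimal DBI/DBD::Pg sketch, with the database, table and column names made up:

use strict;
use warnings;
use DBI;

# Hypothetical connection and schema: an "events" table with a jsonb
# column "payload", assumed purely for the example.
my $dbh = DBI->connect('dbi:Pg:dbname=mydb', 'user', 'pass', { RaiseError => 1 });

# ->> extracts a field from the jsonb document as text, so the JSON stays
# confined to one column while the query result is plain relational data.
my $sth = $dbh->prepare(q{
    SELECT id, payload->>'status' AS status
      FROM events
     WHERE payload->>'status' = 'error'
});
$sth->execute;
while (my ($id, $status) = $sth->fetchrow_array) {
    print "$id: $status\n";
}
$dbh->disconnect;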
perl -e 'use Crypt::Digest::SHA256 qw[sha256_hex]; print substr(sha256_hex("the Answer To Life, The Universe And Everything"), 6, 2), "\n";'