http://qs1969.pair.com?node_id=11129468


in reply to Persistent data structures -- why so complicated?

The thread "Is there any cache mechanism for running perl script" might be an interesting read, where the replies show how to store data structures using Storable, JSON, or Path::Class (the latter only for simple arrays). As for editing a file "in-place", I showed several variations of that in this node.
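For example, a minimal sketch of the Storable approach might look like the following (the filename and the sample record here are just made up for illustration):

use Storable qw(store retrieve);

# build a nested data structure and write it to disk ...
my %students = (
    'Bob Kowalski' => { hometown => 'Vero Beach, FL', hobbies => ['ham radio'] },
);
store \%students, 'students.stor';

# ... and read it back later, getting the same structure
my $loaded = retrieve 'students.stor';
print $loaded->{'Bob Kowalski'}{hometown}, "\n";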

Note that, with modern databases supporting JSON*, and with modern Perl modules, databases are IMHO pretty nice to work with too. For example, given a Postgres table:

CREATE TABLE students (
    name     TEXT NOT NULL PRIMARY KEY,
    hometown TEXT NOT NULL,
    grade    TEXT,
    data     JSONB NOT NULL DEFAULT '{}'::jsonb
);

Here's some code using Mojo::Pg showing INSERT, SELECT, and UPDATE:

use Mojo::Pg;

my $pg = Mojo::Pg->new('postgres://localhost:54321/testing')
    ->password('BarFoo');

# insert two records, each with a nested data structure in the JSONB column
$pg->db->insert('students', {
    name     => 'Bob Kowalski',
    hometown => 'Vero Beach, FL',
    data => { -json => {
        hobbies => [ 'ham radio', 'Python programming', 'running' ],
    } },
});
$pg->db->insert('students', {
    name     => 'Kranessa Evans',
    hometown => 'Dallas, TX',
    data => { -json => {
        hobbies => [ 'Perl programming', 'writing', 'polo' ],
    } },
});

# select all students, expanding the JSONB column back into a Perl data
# structure, and give everyone with a Perl-related hobby an 'A'
my $res = $pg->db->select('students')->expand;
while ( my $rec = $res->hash ) {
    if ( grep {/perl/i} @{ $rec->{data}{hobbies} } ) {
        $pg->db->update('students', { grade => 'A' }, { name => $rec->{name} } );
    }
}
$res->finish;

For a database that doesn't even require a server, Mojo::SQLite has a very similar API to the above (Edit: though I haven't used JSON in SQLite yet).
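For example, a rough (untested) sketch of the same kind of code against a local test.db file, leaving out the JSON column, might look like this:

use Mojo::SQLite;

my $sql = Mojo::SQLite->new('sqlite:test.db');
$sql->db->query(q{CREATE TABLE IF NOT EXISTS students
    (name TEXT NOT NULL PRIMARY KEY, hometown TEXT NOT NULL, grade TEXT)});

# same insert/select interface as Mojo::Pg above
$sql->db->insert('students',
    { name => 'Bob Kowalski', hometown => 'Vero Beach, FL' });
my $res = $sql->db->select('students');
while ( my $rec = $res->hash ) { print "$rec->{name}\n" }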

Update 2: I've modified the connection string in the above to be more useful than using the postgres superuser - and normally one wouldn't hardcode the password of course, see e.g. ~/.pgpass. The following is how I spun up the test database. I'm using port 54321 instead of the default 5432.

$ docker run --rm -p54321:5432 --name pgtestdb -e POSTGRES_PASSWORD=FooBar -d postgres:13
# wait a few seconds for it to start
$ echo "CREATE USER $USER PASSWORD 'BarFoo'; CREATE DATABASE testing; GRANT ALL PRIVILEGES ON DATABASE testing TO $USER;" | psql postgresql://postgres:FooBar@localhost:54321
$ psql postgres://localhost:54321/testing
# log in and create the above table
# run the above Perl script
$ PGPASSWORD=BarFoo psql postgres://localhost:54321/testing -c 'SELECT * FROM students'
$ docker stop pgtestdb
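As for the ~/.pgpass file mentioned above: each line has the format hostname:port:database:username:password, so an entry for this test setup would look something like the following (with your own database user name filled in, and note the file must be chmod 0600 or it will be ignored):

localhost:54321:testing:your_db_user:BarFoo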

* Update 3: One more thought here: I'm definitely not advocating just dumping everything into an unstructured JSON blob - one should still try to follow the rules of good database design and normalization as much as possible. But sometimes nested data structures can be an advantage, in which case having support for them in the database can be very useful. Update 4: In that regard, see erix's reply below!

Re^2: Persistent data structures -- why so complicated?
by dsheroh (Monsignor) on Mar 12, 2021 at 07:58 UTC
    For a database that doesn't even require a server, Mojo::SQLite has a very similar API to the above
    Or standard DBI and DBD::SQLite if you don't feel the need to drag a random web application framework into your code.
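    For example, a quick (untested) sketch of that approach, against a simplified version of the table in a local students.db file:

    use DBI;

    my $dbh = DBI->connect('dbi:SQLite:dbname=students.db', '', '',
        { RaiseError => 1 });
    $dbh->do(q{CREATE TABLE IF NOT EXISTS students
        (name TEXT NOT NULL PRIMARY KEY, hometown TEXT NOT NULL, grade TEXT)});

    # plain placeholders for the insert, and hashrefs back from the select
    $dbh->do('INSERT INTO students (name, hometown) VALUES (?, ?)',
        undef, 'Bob Kowalski', 'Vero Beach, FL');
    my $rows = $dbh->selectall_arrayref('SELECT name, hometown FROM students',
        { Slice => {} });
    print "$_->{name} ($_->{hometown})\n" for @$rows;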
      Or standard DBI and DBD::SQLite if you don't feel the need to drag a random web application framework into your code.

      On the one hand, I understand the sentiment, since loading it does add overhead (although perhaps you should have said so more clearly instead of just expressing your apparent distaste for it); on the other hand, I think Mojo does have its advantages in simplifying code - IMHO it's extremely Perlish. I did list the Mojo solution last (Edit: did you look at the threads I linked to?), since it has the largest learning curve, but I thought it was worth mentioning in the spirit of TIMTOWTDI.

      Since the OP does talk about "All the teacher ... wants to do is store the information on a server or other computer so that he or she can call up the students' records at some point in the future -- maybe to browse all the information, or maybe to read or edit a particular student's information ...", using a web interface as a solution is not unthinkable. There's no hint of needing to process billions of records or accesses or other hints that the aforementioned performance overhead is a concern - though it's certainly worth keeping in mind.

        While you are correct that I'm no fan of Mojo, my response would have been the same (and probably worded the same) regardless of what web framework the database interface might be hanging off of. My real distaste here is for the idea that, if you want to use a database, you should do so by passing through a "web development toolkit" (how Mojo describes itself in its documentation), regardless of whether you're doing web development or not.
Re^2: Persistent data structures -- why so complicated? (updated)
by erix (Prior) on Mar 13, 2021 at 15:02 UTC

    For PostgreSQL jsonb one might mention the advantage of json(b) indexing: it makes access to large json tables fast (rule of thumb: 100x faster - of course, it only matters for large datasets).

    (see the PostgreSQL fine manual on JSON-indexing)
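    For example, for the students table above, a GIN index on the jsonb column would look something like this (just a sketch - the manual covers jsonb_path_ops and the other options):

    CREATE INDEX students_data_idx ON students USING GIN (data);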