in reply to Re^5: Hashing temporal tables in PostgreSQL
in thread Hash generate and find tampered tables in PostgreSQL

No, Erix, sorry for the miscommunication. What I meant was a CSV file produced by a script that reads all the tables and generates a hash/checksum value for each one. The tables are not temporary or temporal. The goal is to identify "tampered" tables by using the pre-generated hash/checksum values.

Re^7: Hashing temporal tables in PostgreSQL
by erix (Prior) on Sep 25, 2017 at 10:32 UTC

    ok, so it's apparently a bit different from what we thought. It's still pretty unclear what you're up to or aiming for (or whether you have already succeeded or not).

    If you want help (from me, anyway) you'll have to provide a better explanation of what it is that you're stuck with.

      Hi Erix,

      Please see the points below for the things I have completed and the things I haven't.

      Requirement: I want to check, before each new release, whether any tables in a production schema have been tampered with, because unauthorized changes to table columns or column sizes can cause issues. My goal is to generate a hash/checksum value for each table in the schema and store the values in a table. At the next release deployment, I can run the script again and check whether the previously generated hash/checksum values match the new list (this is done mainly at the table DDL level). Any mismatch between the hash values means someone has changed or altered that table.
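      The comparison step described above can be sketched in plain Perl. This is only an illustration: the table names and checksum values below are made up, and in practice both lists would be loaded from the exported CSV files rather than hard-coded.

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Compare a stored baseline of per-table checksums against a freshly
# generated list and report every table whose definition changed or
# disappeared. Both arguments are hashrefs: table name => checksum.
sub find_tampered {
    my ($baseline, $current) = @_;
    my @tampered;
    for my $table (sort keys %$baseline) {
        if (!exists $current->{$table}) {
            push @tampered, "$table (missing)";
        }
        elsif ($baseline->{$table} ne $current->{$table}) {
            push @tampered, "$table (definition changed)";
        }
    }
    return @tampered;
}

# Illustrative data only -- in the real script these come from the CSVs.
my %baseline = ( employees => 'a1b2', departments => 'c3d4' );
my %current  = ( employees => 'a1b2', departments => 'ffff' );

print "$_\n" for find_tampered( \%baseline, \%current );
# prints: departments (definition changed)
```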

      Current Status: I have automated this via a Perl script for Oracle by using the DBMS_CRYPTO package. In Oracle I can read the tables, generate a hash/checksum value per table, store the results in a temporary table, and export them to a CSV file. We can regenerate these lists from time to time in the same way.

      Current Issue: As I explained, I have completed this task for Oracle and now need to extend the automation to PostgreSQL and DB2. So far I haven't found a way in those databases to generate a hash/checksum value for each table at the DDL level (the table definition) in a schema. That is the problem I currently have in my script: generating the hash/checksum (a unique key for identifying any later alterations) in PostgreSQL and DB2.
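      One possible approach, offered as a sketch rather than a tested solution: instead of hashing inside the database (in PostgreSQL you could do it server-side with md5() over an aggregated query on information_schema.columns, but DB2 has no directly equivalent built-in), pull the ordered column metadata over DBI and compute the checksum on the client side with Digest::MD5. That keeps the hashing logic identical across PostgreSQL, DB2, and Oracle. The column data below is hard-coded for illustration; in the real script each row would come from a catalog query (information_schema.columns in PostgreSQL, SYSCAT.COLUMNS in DB2) ordered by column position.

```perl
#!/usr/bin/perl
use strict;
use warnings;
use Digest::MD5 qw(md5_hex);

# Build a canonical string from a table's column metadata and hash it.
# Each column is [ name, data type, max length ], in definition order,
# so a renamed, retyped, resized, added, or dropped column all change
# the checksum.
sub table_ddl_checksum {
    my @columns = @_;
    my $canonical = join ',',
        map { join ':', $_->[0], $_->[1], defined $_->[2] ? $_->[2] : '' }
        @columns;
    return md5_hex($canonical);
}

# Illustrative column list for a hypothetical "employees" table.
my @emp_columns = (
    [ 'id',   'integer',           undef ],
    [ 'name', 'character varying', 100   ],
);

print table_ddl_checksum(@emp_columns), "\n";
```

      The same canonical-string rule must be applied in every database for the checksums to be comparable, so it may also be worth replacing the Oracle DBMS_CRYPTO step with this client-side hash to have one implementation everywhere.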