Hi Erix,
Please see the points below for the things I have completed and the things I have not.
Requirement: I want to check, before each new release, whether any of the tables in a production schema have been tampered with, because unauthorized changes to table columns or column sizes can cause issues. My goal is to generate a hash/checksum value for each table in a schema and store those values in a table. At the next release deployment, I can run the script again and check whether the previously generated hash/checksum values match the new list (this is done at the table DDL level). Any mismatch between the hash values means someone has changed or altered the tables.
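The compare step described above can be sketched roughly as follows. This is an illustrative Python sketch, not the actual Perl script; it assumes the per-table hashes from each run are exported as CSV rows of `table_name,hash_value`, and the file layout and helper names are my own.

```python
import csv


def load_hashes(csv_path):
    """Read a CSV of table_name,hash_value rows into a dict."""
    with open(csv_path, newline="") as f:
        return {row[0]: row[1] for row in csv.reader(f) if row}


def compare_hashes(previous, current):
    """Return the tables whose DDL hash changed between two runs."""
    return sorted(
        table for table, h in current.items()
        if previous.get(table) != h
    )
```

A mismatch reported by `compare_hashes` flags a table whose definition was altered between the two runs.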
Current Status: I have automated this via a Perl script for Oracle DB using the DBMS_CRYPTO package. In Oracle, I can list the tables and generate a hash/checksum value per table, then store the values in a temporary table and export them to a CSV file. We can regenerate these lists from time to time.
Current Issue: As explained, I have completed this task for Oracle DB and now need to extend the automation to PostgreSQL and DB2. Due to limitations in those databases, I couldn't find a way to generate a hash/checksum value at the table DDL level (table definition) for each table in a schema. So the problem I currently have in my script is hash/checksum generation (a unique key for identifying any later alterations) in PostgreSQL and DB2.
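One possible workaround, sketched below under my own assumptions: since the missing piece is a server-side DDL-hashing package rather than the hash itself, the script could pull each table's column definitions from the system catalog (`information_schema.columns` in PostgreSQL, `SYSCAT.COLUMNS` in DB2), build a canonical string from them, and compute the hash on the client side. The row shape and function name here are illustrative, not part of the existing script.

```python
import hashlib


def ddl_fingerprint(columns):
    """Hash a table definition given catalog rows such as
    (column_name, data_type, length, nullable).

    Sorting by column name first makes the hash independent of the
    order in which the catalog query returns the rows.
    """
    canonical = "|".join(
        ",".join(str(part) for part in col)
        for col in sorted(columns)
    )
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()
```

Because the hashing happens in the client (Perl could use Digest::SHA the same way), the identical logic would cover Oracle, PostgreSQL, and DB2 with only the catalog query differing per database.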