Audit Your SOX Off, and Other Uses of a DB2 Log Tool Rick Weaver, BMC Software
Overview • Mostly generic look at Log Tools • Basics • Examples • Auditing • Recovery • Data Migration • BMC Log Master for DB2 used for examples • High-Speed Apply Engine benefits
Log Tool Basics • Read the log • Provide context • Associate Unit of Recovery info with DML activity • Combine multiple records into single entities • Manage committed versus rolled-back activity • Allow for robust selection criteria • Allow for “ONGOING” processes • Produce outputs • SQL, DDL, LOAD, REPORTS, reusable inputs
Log Tool Basics • Externalization of data • Completion of partially logged updates • Potentially costly process • Avoid with DATA CAPTURE CHANGES • Decompression • Invoke Edit and Field Proc Decode functions • Decode DB2 Internal Formats • Numerics, Date, Time, and Timestamps • Normalize Log Records to the current Version • Serialize XML
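Completing partially logged updates can be avoided entirely by asking DB2 to log full row images. A minimal sketch in standard DB2 DDL (the table name is hypothetical):

    -- Log complete before/after row images so the log tool never has to
    -- materialize a partial update from image copies and older log records
    ALTER TABLE PROD.ACCOUNT DATA CAPTURE CHANGES;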
Log Tool Basics • Resources Used • BSDS – Boot Strap Datasets • ARCHIVE and ACTIVE Log Datasets • DB2 Catalog • EDM Pool • TABLESPACE VSAM Datasets • IMAGE COPY Datasets • Reusable inputs for reprocessing, SQL application, etc.
[Diagram: Mining the DB2 Log Data – BMC Log Master. A batch log scan reads each data sharing member’s BSDS, active logs, and archive logs and writes a DB2 logical log; from there the report writer produces reports, the SQL generator feeds the SQL processor and High Speed Apply, the DDL generator produces DDL, and the load generator produces a load file for the Load utility, all coordinated through a repository and the online interface.]
Audit Your SOX Off: Approach(es) • Ongoing process storing data for later reprocessing • Logical Log – merged into Dailies, Weeklies… • Report Generation • Audit, Detail, Summary, Catalog Activity • User control over content • Verbose detailed output • SQL or DDL • LOAD or LLOG file output to post-process
Audit Your SOX Off: Examples • Power user (SYSADM, DBADM) activity • Track authorization changes • Audit all schema-changing activities • Look for changes to sensitive columns and tables • Detect dynamic SQL activity • Report on utility execution • Ad hoc research into the cause of data corruption
Audit Example: Power User Activity • Need to reconcile power user activity to change control • Detail Report produced on a daily basis for listed AUTHIDs and sensitive DATABASEs (illustrative filter below) • Report ORDERed BY • CORRELATION ID (Jobname, TSO ID, etc.) • UNIT OF RECOVERY • Each activity has to be reconciled to a Change Control Ticket
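A filter for such a report might look like the following, in the spirit of the WHERE syntax shown in the dynamic SQL example later in this deck; the AUTHIDs, database names, and exact keywords are illustrative, not verified Log Master syntax:

    WHERE AUTHID IN ('SYSADM1', 'DBADM01', 'DBADM02')
      AND DATABASE IN ('PAYROLL', 'GLEDGER')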
ONGOING versus OR LIMIT • ONGOING – TO point resolution • TO CURRENT • TO DATE(*) TIME(-00.30.00) • OR LIMIT caps the work of a single run • 5 LOG FILES • 7 DAYS • 24.00.00
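Combined, the two clauses might read as follows, resolving the TO point to 30 minutes ago but stopping early once five log files have been scanned in one run (composed from the tokens above; illustrative rather than verified syntax):

    ONGOING TO DATE(*) TIME(-00.30.00) OR LIMIT 5 LOG FILES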
Audit Example: Track Authorization Changes • Need to track all authorization activity • Catalog Activity Report produced daily • Filter was for all GRANT and REVOKE activity • The Catalog Activity Report is summary level • Another option – VERBOSE DDL
Audit Example: Schema-Changing Activities • Report on ALL DDL activity in the system • Method • Generate Logical Log on an ONGOING basis • Merge into Weeklies, Monthlies • Generate Reports from the Monthly Logical Log
Audit Example: Changes to Sensitive Columns or Tables • Need was to track the life cycle of a check (cheque) • Customer wrote an ISPF front end for auditors • Auditor would enter • Account • Check number • Range for activity • ISPF application would generate and submit a log scan job • Report could be viewed after job execution
Audit Example: Dynamic SQL Activity

    WHERE PLAN NAME IN ('DSNTEP2', 'DSNTIAUL', 'DSNTIAD', 'DISTSERV', 'DSNESPCS')
       OR (PLAN NAME = 'ADMPROD' AND CORRELATION ID <> 'ALLOWJOB')
       OR PLAN NAME LIKE 'QMF%'
       OR PLAN NAME LIKE 'CDB%'
       OR PLAN NAME LIKE 'ACT%'
Audit Example: Report on Utility Execution • Basically, an extract of SYSIBM.SYSCOPY • Can exclude non-utility ICTYPEs • ‘A’ – Alter • ‘C’ – Create • Approach • Generate a LOAD CSV file to post-process • Transform ICTYPE to a meaningful value (decode sketched below)
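The same raw data can be previewed straight from the catalog. A minimal sketch in DB2 SQL; the ICTYPE decodes shown are a common subset and should be checked against the DB2 version in use:

    SELECT DBNAME, TSNAME, TIMESTAMP,
           CASE ICTYPE
             WHEN 'F' THEN 'FULL IMAGE COPY'
             WHEN 'I' THEN 'INCREMENTAL IMAGE COPY'
             WHEN 'Q' THEN 'QUIESCE'
             WHEN 'R' THEN 'LOAD REPLACE LOG(YES)'
             WHEN 'X' THEN 'REORG LOG(YES)'
             ELSE ICTYPE
           END AS UTILITY_EVENT
      FROM SYSIBM.SYSCOPY
     WHERE ICTYPE NOT IN ('A', 'C')   -- exclude ALTER and CREATE entries
     ORDER BY TIMESTAMP DESC;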
Recovery Capabilities • Transaction Level Recovery • Avoid RECOVER outages and FIX IT programs • UNDO the bad • REDO the good after a PIT • Back out Integrity Checking • Discover and generate QUIET POINTs • DROP RECOVERY • Test System Regression • Customized Post Processing
Recovery Capabilities: High-Speed Apply Engine • SQL or LLOG input • LLOG processing almost always faster • Avoids SQL generation and parsing • Multi-threaded • Control over object distribution • Robust conflict resolution • Restartable • Used as the method for Migration as well as Recovery
UNDO - Take Away Only the Bad Data • BMC Log Master for DB2 can apply UNDO SQL to get rid of bad transactions • Business Value – ZERO downtime for transaction-level recovery! [Diagram: Good Transaction 1, Bad Transaction, Good Transaction 2 → Generate UNDO SQL for the bad transaction → Apply UNDO SQL.]
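Conceptually, the generated UNDO SQL reverses each logged change using before-images from the log. A hypothetical sketch (table names, keys, and values are invented):

    -- Logged change: UPDATE ACCT SET BAL = 0 WHERE ACCT_NO = 1234
    -- Generated UNDO restores the before-image captured from the log:
    UPDATE ACCT SET BAL = 5017.42 WHERE ACCT_NO = 1234;
    -- A logged INSERT is undone with a DELETE (and a DELETE with an INSERT):
    DELETE FROM ACCT_HIST WHERE ACCT_NO = 1234 AND TXN_ID = 98765;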
Recovery: Back out Integrity Report • You choose to do an UNDO… • But has anything happened to the rows between the UNDO range and the current state? • BACKOUT INTEGRITY checking answers that question
Recovery Example: QUIET POINT Analysis • Avoid QUIESCEs • Customer historically ran a QUIESCE during a ‘low’ processing time every night • Still received 100-200 application time-outs • Now uses log processing to find quiet points and record them as QUIESCEs (catalog sketch below) • Finds points with no open Units of Recovery when RECOVER is necessary, instead of running QUIESCE
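Quiet points recorded this way appear in SYSIBM.SYSCOPY just as a real QUIESCE does, so they can be listed with ordinary catalog SQL. A sketch (the database name is hypothetical):

    SELECT DBNAME, TSNAME, HEX(START_RBA) AS RBA, TIMESTAMP
      FROM SYSIBM.SYSCOPY
     WHERE ICTYPE = 'Q'              -- quiesce / quiet point entries
       AND DBNAME = 'PAYROLL'
     ORDER BY TIMESTAMP DESC;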
Automated Drop Recovery – Reduce Risk • Process is initiated from the online interface • Log Master technology scans DB2 log records and generates JCL and outputs to automate drop recovery • UNDO DDL to recreate the dropped object • Syntax for recovery and object ID (OBID) translation • DB2 commands to rebind application plans that were invalidated when the object was dropped • Drop Recovery Report • RECOVER PLUS technology drives the recovery using the copy and log from the dropped object, applies log to the point of the DROP, then runs post-recovery SQL and rebinds
Recovery Example: System Regression • Test System Regression • Back out the test cycle instead of recovering to a PIT • Set LOGMARK at beginning of test • Set LOGMARK at end of test • Generate LLOG from Begin to End LOGMARKs • Execute High-Speed Apply Engine to UNDO the LLOG • Works for Production too…
Recovery Example: Post Processing • Client needs to reverse changes up to 90 days old • Many updates have occurred since • Solution • Produce LLOG or LOAD output for the target data • Post-process to reverse the old changes while preserving current data (guarded UPDATE sketched below)
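One way to express “reverse the old change only if nothing has touched the row since” is a guarded UPDATE built from the log’s before- and after-images. A hypothetical sketch:

    -- Reverse the 90-day-old change only when the column still holds the
    -- after-image; rows updated since then are left untouched
    UPDATE ORDERS
       SET STATUS = 'SHIPPED'       -- before-image from the log
     WHERE ORDER_NO = 447192
       AND STATUS = 'CANCELLED';    -- after-image as a guard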
Migration • Data Warehouse • Test System Synchronization • To other Platforms • High-Speed Apply Engine • LOAD CSV or SDF (all character) Formats
DB2 Data Migration • Don’t replicate entire files, just migrate the changes! [Diagram: successive Log Master batch runs each scan one RBA range of the DB2 log (1000-2000, 2000-3000, 3000-4000) and write a Logical Log(+1); an in-flight URID (1988) excluded from the 1000-2000 range is picked up in the 2000-3000 run. A repository tracks the migrated RBA range and in-flight URIDs; the output becomes input to the LOAD utility or an Apply SQL process.]
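The repository’s bookkeeping can be pictured as a small table recording what each run migrated and which URIDs were still in flight. A hypothetical sketch (names and types are invented):

    CREATE TABLE LOGMIG.MIGRATION_RUNS
    ( RUN_TS        TIMESTAMP NOT NULL,  -- when the batch run executed
      FROM_RBA      CHAR(12)  NOT NULL,  -- start of the migrated log range
      TO_RBA        CHAR(12)  NOT NULL,  -- end of the migrated log range
      INFLIGHT_URID CHAR(12)             -- URID open at TO_RBA, carried
    );                                   -- forward into the next run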
Migration Example: To a Teradata Decision Support System • Used SQL to port • Post-processed the SQL to change the comma delimiter • Wanted to limit the size of any given extract • Used OR LIMIT # LOG FILES • Selection criteria • List of 52 tables to include and 18 batch jobs to exclude • > 800 partitions – all compressed • Volume ~24 billion transactions a year
Migration Example: Synchronizing a Test System • Needed to keep the Regression Test system in sync with production to prove changes before implementation (462 tables) • Use LLOG as the capture format • Use the High-Speed Apply Engine to process the LLOG • A process is in place to “Refresh” the test system after major production DDL changes or utility windows • RESET the ONGOING capture after each refresh
UDCL: Unload – Drop – Create – Load • Reduce the object restructuring outage window • Standard Practice • Put the object into Read Only • Unload the object being restructured • STOP all access to the object • Drop the object • Create the object with the appropriate changes • Repartitioning • COLUMN changes (metrics or location) • Load the table • REBIND plans • START access to the object • The outage lasts the entire length of the UNLOAD/LOAD window
Traditional ‘UDCL’ Method [Flow: Read Only – limited RO activity: Pre Image Copy, Unload Data. Stop all objects – OUTAGE: Drop Tablespace, Create Tablespace, Load Data, Post Image Copy, Install New Programs, Runstats, Bind/Rebind. Application Available.]
UDCL: Unload – Drop – Create – Load • Reduce the object restructuring outage window • Log Master / Apply Plus Alternative • Create the new object with the appropriate changes • Unload the old object • Load the new object • Run a job to capture changes between the Unload point and Current (and Apply them) • (Iteratively, until ready to switch over) • Put the old object into read only • Do a final Capture / Apply (reconcile) • STOP objects and RENAME (rename sketch below) • Old object to a temporary name • New object to the old name • REBIND plans • START and carry on • The outage lasts only from the point of final capture through the rename and follow-on actions
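The switch-over itself is a pair of standard DB2 renames; the object names here are hypothetical:

    -- Move the old table out of the way, then give the new table its name
    RENAME TABLE PROD.ACCT     TO ACCT_BACKUP;
    RENAME TABLE PROD.ACCT_NEW TO ACCT;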
SHRLEVEL CHANGE TRANSFORM [Flow: Prep activities – current tables, data, online and batch unaffected: create new structures (TS_New, TB_New, IX_New); create OCC from TS_Orig and set Log Mark(0); transform TS_New, rebuild IX_New, inline copy, Runstats. Application available: execute log scan of TB_Orig against the DB2 log from Log Mark(0) to current time, creating Apply SQL; apply the SQL files to update TB_New; set a new Log Mark(+1) and repeat. Application RO: start TS_Orig Read Only, apply the final log updates to TB_New. Application unavailable: STOP all; rename TB/IX_Orig to TB/IX_Backup; rename TB/IX_New to TB/IX_Orig. Application available: start TS_New, TB_Orig, IX_Orig; Rebind; update all TS/IX jobs and utilities; DROP TS_Orig.]
Wrap up • Auditing • Recovery • Migration • Alternative uses of migration • Additional ideas to share? • Questions?
Audit Your SOX Off, and Other Uses of a DB2 Log Tool Ken McDonald, BMC Software Rick_Weaver@bmc.com