Database Auditing Essentials... or... Who did what to which data when and how?
The combination of increasing government regulation and the need to secure corporate data has driven demand for tracking who is accessing the data in our corporate databases. This presentation discusses these drivers and presents the requirements for auditing data access in corporate databases.
The goal of this presentation is to review the regulations impacting the need to audit, and then to discuss in detail the kinds of things that may need to be audited, along with the several ways of accomplishing this.
3. INTELLIGENCE. INNOVATION. INTEGRITY
DBA versus Data Management
Database Administration
— Database Security
— Backup/Recovery
— Disaster Recovery
— Reorganization
— Performance Monitoring
— Application Call Level Tuning
— Data Structure Tuning
— Capacity Planning
(Database Administration manages the database environment; Data Management manages the content and uses of data.)
Data Management
— Data Protection
— Data Privacy Protection
— Data Quality Improvement
— Data Quality Monitoring
— Database Archiving
— Data Extraction
— Metadata Management
Top Data Protection
Challenges
Where is my sensitive data located &
who is using it?
How do I simplify & automate compliance?
How can I enforce access & change
control policies for critical databases?
Security By the Numbers
(per TechTarget)
54% of companies say security budgets are growing
69% of security pros said their job will be more strategic in 2008
75% of security pros said their job will involve more compliance
work
62% of security pros said their company’s top IT person will care
more about security in 2008
52% said their top priority will have something to do with
network/security integration
80% said they would “prefer to purchase best of breed products
and integrate them into my network so I have the strongest
security possible for my budget.”
Database Security By the
Numbers (per TechTarget)
67% of 1,100 security pros said securing databases was an
important or very important challenge for 2008
— #1 in a list of 13 data protection initiatives
45% of security pros will spend more time on data protection
in 2008 vs. 2007.
— #3 on a list of 19 security activities
41% will spend more time on database security specifically
Data Breaches
According to the Privacy Rights
Clearinghouse, the total number of
records containing sensitive personal
information involved in security breaches
in the U.S. since January 2005 is:
218,198,364
As of February 5, 2008
http://www.privacyrights.org/ar/ChronDataBreaches.htm
How Prevalent is this Problem?
68% of companies are losing sensitive data or having it
stolen out from under them six times a year
An additional 20%
are losing sensitive
data 22 times or
more per year.
Sources: eWeek, March 3, 2007
IT Policy Compliance Group
http://www.eweek.com/c/a/Desktops-and-Notebooks/Report-Some-Companies-Lose-Data-Six-Times-a-Year/
Regulations Impacting Security
Governance vs. Privacy
Governance (protect and control the process):
1. Basel II
2. Sarbanes-Oxley
3. OFAC
4. Turnbull Report
Privacy (protect the data):
1. EU DPD
2. AU/NZ NPP
3. SB 1386/AB 1950
4. GLBA
5. HIPAA
6. PCI
7. FCRA -- “Red Flag”
FCRA: “Red Flag” Rules
All federally regulated financial institutions must be in full compliance by Nov.
1, 2008, with the so-called "Red Flag" provisions of the Fair and Accurate
Credit Transactions Act of 2003 (FACTA).
Part of FCRA – the Fair Credit Reporting Act.
Requires that financial institutions and creditors develop and deploy an
Identity Theft Prevention Program for combating ID theft on new and existing
accounts.
Each institution must develop a program that will:
Identify relevant patterns, practices, and specific forms of activity that are "red flags"
signaling possible ID theft.
Include a mechanism to detect red flags identified by the program.
Quickly respond to detected red flags in a way to both prevent and mitigate ID theft.
Be updated regularly to reflect changes in real world risks from ID theft.
Audit Requirements
CobiT (SOX), PCI DSS, HIPAA, CMS ARS, GLBA, ISO 17799 (Basel II), NERC, NIST 800-53 (FISMA)
1. Access to Sensitive Data
(Successful/Failed SELECTs)
2. Schema Changes (DDL)
(Create/Drop/Alter Tables, etc.)
3. Data Changes (DML)
(Insert, Update, Delete)
4. Security Exceptions
(Failed logins, SQL errors, etc.)
5. Accounts, Roles &
Permissions (DCL)
(GRANT, REVOKE)
Top Regulations Impacting DB Security
DDL = Data Definition Language (aka schema changes)
DML = Data Manipulation Language (data value changes)
DCL = Data Control Language
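As a rough illustration of how captured statements map onto the three categories above, the sketch below buckets SQL by its leading verb. The keyword sets and function name are purely illustrative, not taken from any auditing product; real tools parse statements far more carefully.

```python
# Illustrative only: classify a captured SQL statement into the audit
# categories above (DDL, DML, DCL) by its leading verb.
DDL_VERBS = {"CREATE", "DROP", "ALTER"}               # schema changes
DML_VERBS = {"INSERT", "UPDATE", "DELETE", "SELECT"}  # data changes/reads
DCL_VERBS = {"GRANT", "REVOKE"}                       # access control

def audit_category(sql: str) -> str:
    verb = sql.strip().split(None, 1)[0].upper()
    if verb in DDL_VERBS:
        return "DDL"
    if verb in DML_VERBS:
        return "DML"
    if verb in DCL_VERBS:
        return "DCL"
    return "OTHER"
```

A regulation that requires auditing of schema changes, for example, would then be satisfied by retaining every statement bucketed as DDL.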
Enterprise Data Protection
— Protect enterprise data
— Simplify & automate compliance process
— Rapidly address auditors’ requirements
[Chart: cost associated with non-compliance & breach of data, ranging from increased staff (tactical need) to fines and loss/out of business (strategic need)]
Regulatory Compliance and…
Impact: upper-level management is keenly aware of the need to
comply, even if not of all the details that compliance involves.
Prosecution: prosecution can result in huge fines
and even imprisonment.
Cost: the cost of complete compliance can be
significant.
Durability: although there have been discussions about scaling back
some laws (e.g., SOX), regulations keep increasing, and therefore
increasing time, effort, and capital will be spent on compliance.
That is, the issue will not just disappear if you ignore it long enough!
Cost of Breach
Identity theft, credit cards etc.
Regulatory Compliance
Database monitoring is a key requirement
Database Leak Prevention
Sensitive data needs protection
Open Accessibility
Web 2.0 applications dissolving traditional
perimeter controls
Why Audit & Secure Databases
Data and Database Protection,
Security, and Auditing Trends
[Chart: amount of data, accessibility, compliance requirements, and protection needs all trending upward over time]
Data Protection Issues:
Volume of data
Increased accessibility
of data
Regulatory compliance
Increased number and
type of threats
Database Auditing
In a world replete with regulations and
threats, organizations have to go well
beyond securing their data. Essentially,
they have to perpetually monitor their
data in order to know who or what did
exactly what, when and how – to all their
data.
HIPAA, for example, requires patients to
be informed any time someone has even
looked at their data.
Source: Audit the Data – or Else:
Un-audited Data Access Puts Business at High Risk.
Baroudi-Bloor, 2004.
What is Database Auditing?
There are many names used for basically
the same thing.
I’ll call it database auditing, but you may
also know it as:
Data Access Auditing
Data Monitoring
Data Activity Monitoring (DAM)
Database Auditing definition
Database Auditing:
The process of monitoring access to and modification
of selected database objects and resources within operational
databases, and retaining a detailed record of that access,
where the record can be used to proactively trigger actions
and can be retrieved and analyzed as needed.
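The definition has two working parts: retain a detailed record of every access, and proactively trigger actions when something suspicious happens. A minimal sketch, with all names (rule_sensitive_table, on_access, the event shape) invented for illustration:

```python
# Minimal sketch of the definition above: every access is retained for
# later retrieval and analysis, and rules can proactively trigger actions.
audit_log = []   # the retained, detailed record of access
alerts = []      # actions proactively triggered by rules

def rule_sensitive_table(event):
    # Hypothetical rule: flag any statement touching the PAYROLL table.
    return "PAYROLL" in event["sql"].upper()

def on_access(event, rules):
    audit_log.append(event)       # retain a detailed record
    for rule in rules:
        if rule(event):
            alerts.append(event)  # proactively trigger an action

on_access({"user": "dba1", "sql": "SELECT * FROM payroll"},
          [rule_sensitive_table])
```

The key design point is that retention is unconditional while alerting is rules-based, so auditors can still answer questions after the fact about accesses no rule flagged at the time.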
Database Auditing
Types of Database Auditing
Database Auditing Requirements
Database Auditing Challenges
Native DBMS Database Auditing?
Reporting and Analyzing the Audit Data
Types of Database Auditing
Authorization Auditing
Who can do what.
Access Auditing
Who did do what.
Modifications: INSERT, UPDATE, DELETE
Reads: SELECT
Other: DDL (CREATE / DROP/ALTER), DCL (GRANT / REVOKE),
Utilities, SQL errors, failed logins, etc.
Replication Auditing
Who copied which data where.
Database Auditing Approaches
What methods are available?
Audit within the DBMS (traces)
— Must start performance trace
– Overhead as trace records are written by the DBMS
— DDL changes required to traced tables?
Audit from the database transaction log files
— Modifications are on the log anyway so…
Audit over the network
— Capture SQL requests as they are sent over the network
— What about non-network requests? (e.g. CICS w/DB2)
Audit directly against the DBMS server (software tap)
Native DBMS Audit
The native DBMS audit capability may not be
optimal:
Separation of duties – logging typically is turned on and
off by DBAs, who need to be audited
Overhead – many require traces to be started, which
can consume precious resources (as much as 10%
overhead?)
Comprehensive capture – may not capture everything
that needs to be captured for compliance
DB2’s Native Audit Trace
The DB2 Audit Trace can record:
Changes in authorization IDs
Changes to the structure of data (such as dropping a table)
Changes to data values (such as updating or inserting data)
Access attempts by unauthorized IDs
Results of GRANT statements and REVOKE statements
Mapping of Kerberos security tickets to IDs
Other activities that are of interest to auditors
FYI: Audit Trace Classes are listed on page 287, IBM DB2 Admin Guide
-START TRACE (AUDIT) CLASS (4,6) DEST (GTF) LOCATION (*)
CREATE TABLE . . . AUDIT ALL . . .
Limitations of DB2 Audit
The DB2 audit trace does not record everything. Consider the
following limitations:
The trace does not record old data after it is changed
(the database transaction log records old data).
If an agent or transaction accesses a table more than once in a single unit of
recovery, the audit trace records only the first access.
Although plan and authid can be used to limit audited data, wildcarding is not
supported, so starting appropriate traces is prohibitive.
Must start audit traces that have to be set up to go to an appropriate trace
destination: requires system programmer assistance.
The audit trace does not record accesses if you do not start the audit trace for
the appropriate class of events.
The audit trace does not audit some utilities. The trace audits the first access
of a table with the LOAD utility, but it does not audit access by the COPY,
RECOVER, and REPAIR utilities. The audit trace does not audit access by stand-
alone utilities, such as DSN1CHKR and DSN1PRNT.
Limitations exist as to what tables can be audited: you cannot audit access to
auxiliary tables or to the system catalog tables.
What About Using the Log?
Database transaction log(s) capture ALL*
changes
made to data.
[Diagram: SQL changes flow through the DBMS into the database and its transaction log(s)]
*
Well, maybe not all changes, all the time.
Issues With Database Log Auditing & Analysis
Log format is proprietary
Volume can be an issue
Easy access to online and archive logs?
— But how long do you keep your archive logs?
Dual usage of data could cause problems?
— Recovery and protection
— Audit
Tracks database modifications, but what about reads?
— Transaction logs do not record information about SELECT.
And what about non-logged operations?
— LOAD LOG NO, REORG LOG NO
— Non-logged table spaces (new feature in DB2 9 for z/OS)
Cannot invoke real-time actions using log-based auditing
Network Capture
Database auditing via network sniffing captures
SQL requests as they go across the network.
But not all requests go across the wire
— Mainframe applications
— DBA access directly on the server
Be careful: many third-party database auditing solutions
use this approach
A Better Approach
Audit database calls at the server
— Capture all SQL requests at the server
— All SQL access is audited, not just network calls
— Retain all pertinent audited information
– No reliance on the DBMS
— No need to keep the active/archive log files
— No need to start a DBMS trace
— No need to modify the database schema
— Requires purchasing additional ISV software
Visibility Across all DB Activities, Platforms, and Access Methods
Enterprise DBMS Platforms
— Local Access: Oracle Bequeath/IPC; DB2 Shared Memory; SQL Server Named Pipes/Shared Memory; Sybase TLI; Informix Shared Memory/TLI
— IP Network Access: Distributed Environments
— Mainframe Environments: DB2, IMS; CICS, MQ, IMS/TM, TSO, RRSAF, etc.; DB2 Connect/JDBC
Server-Based Database Auditing
Requirements
Surveillance Mechanism
Selective – must be rules-based to enable the
capture of audit details only on the specific
data that requires auditing.
Comprehensive – must be able to capture the
complete scenario of auditable information.
Non-invasive – must be able to audit access to
data without incurring expensive performance
degradation.
Comprehensive
What Must be Captured?
Collect data from a variety of environments
— Any Supported Connection
— Web, App Server, CICS, IMS TM, Batch, etc.
— As well as multiple data access/update types
– INSERT, UPDATE, DELETE
– SELECT – capture information about queries
– Database Commands, DDL, DCL
– Database Utilities: REORG, RUNSTATS, COPY, RECOVER,
etc.
What Must be Captured?
Environmental Data
DBMS Related Information
DBMS Data Object Descriptions
DBMS Specific Data
System Information
User information
Information dependent upon DBMS
— For example: IMS TM/DB Transaction Data Collected
— Or, for DB2: DBRM Information
Data Captured?
Modification
Before and After Image of Data
— May require access to transaction logs
Access
Specifics of request (with host variables)
Image of all data accessed?
Number of rows accessed?
Non-Invasive
Database Auditing needs to be implementable with as little
interference to the production environment as possible:
Should not require modifications to the database
schema
Should not require application changes
Should be performance-sensitive
— That is, you can’t “dim the lights” when
you start auditing database accesses
Audit Trails in Tables
Sometimes people add “audit columns” to tables,
such as LAST_MODIFIED_DATE
Auditors don’t like this; it is a problem because:
Audit trails should be kept outside of the database
(because if you delete the row you lose the audit data)
Can you guarantee that LAST_MODIFIED_DATE is
accurate?
— Couldn’t someone have set it by accident (or nefariously)?
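The weakness of the audit-column approach can be demonstrated concretely. The sketch below (sqlite3, with an invented account schema) contrasts an in-table LAST_MODIFIED_DATE column with a trigger-maintained audit table: when the row is deleted, the audit column disappears with it, while the external trail survives.

```python
# Sketch (sqlite3, invented schema): an in-table "audit column" vanishes
# with the row, but an audit trail kept outside the table survives a delete.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE account (
    id INTEGER PRIMARY KEY,
    balance REAL,
    last_modified_date TEXT        -- the "audit column" approach
);
CREATE TABLE account_audit (       -- a separate audit trail
    id INTEGER, balance REAL, action TEXT
);
CREATE TRIGGER audit_delete AFTER DELETE ON account
BEGIN
    INSERT INTO account_audit (id, balance, action)
    VALUES (old.id, old.balance, 'DELETE');
END;
""")
conn.execute("INSERT INTO account VALUES (1, 100.0, '2008-02-05')")
conn.execute("DELETE FROM account WHERE id = 1")

rows_left = conn.execute("SELECT COUNT(*) FROM account").fetchone()[0]
trail = conn.execute("SELECT action FROM account_audit").fetchone()[0]
```

Of course, a trigger-written audit table still lives inside the same database, so a privileged user could tamper with it; the auditors' preference is for the trail to be captured and retained outside the database entirely.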
DBA Auditing
One of the biggest needs for database auditing is
privileged user (i.e., DBA) auditing.
In other words, who watches the watchers?
Super users (SYSADM, DBADM)
Can do anything to any data
Database auditing can be used to verify the
integrity and accuracy of the DBA group
Long-Term Retention and Discard
Why:
— Need to be able to find who accessed data long after the
database logs would be recycled
— But there could be a legal exposure for data kept too long
Implications:
— Data must be kept long enough to be useful, but not beyond
a specific retention period
— Must be removed with no exposure to forensic software
How:
— Long-term storage mechanism
— Policy based discard
— Tightly controlled and audited
— True “zero out” capability
Audit Analysis
Access and analyze the audit details to
produce multiple useful reports on
database activities:
Who did what to which piece of data when
Pattern analysis and trending
Provide insight such as . . .
Who is changing database schemas or dropping tables?
When are unauthorized source programs changing data?
What are DBAs or outsourced staff doing to the databases?
How many failed login attempts have occurred?
Who is extracting credit card data?
What data is being accessed from which network node?
What data is being accessed by which application?
How is data being accessed?
What are the access patterns based on time of day?
What database errors are being generated?
What is the exposure to sensitive objects?
When is someone attempting an SQL injection attack?
Gartner Recommendations
Implement Data Access Monitoring (DAM) functionality to mitigate the high levels of risk
resulting from database vulnerabilities and to address audit findings in such areas as
database segregation of duties (SOD) and change management.
Use DAM technology when there is a need for granular monitoring, or the overhead of
database audit functions is unacceptable.
Use security information and event management (SIEM) technology when the additional
resource consumption associated with native auditing is acceptable, and there is no
need for advanced, database-focused analytics.
If specialized DAM technology is required, then the evaluation should encompass data
capture methods, connection-pooling support, exception analysis, data retention,
compliance reporting, blocking capabilities and incident management/workflow support.
Examples of related functions that may be provided by a DAM vendor include: Database
vulnerability assessment and configuration audit, data loss prevention (discovery and
control of sensitive structured data at rest and in motion), database change discovery
and reconciliation to change management records.
Source: Gartner Research Report G00153063 (November 2007)
Summary Points
Database auditing requirements go far beyond the
capabilities of today’s DBMS software
Log analysis software does not solve access auditing
requirements
Network sniffing does not provide sufficient
auditability for mainframe databases
Long-term proof of database access not viable without
archival of audit surveillance details
Audit details must be continuously managed
— Copying data for storage problems (e.g. media rot)
— Copying data for system changes
— Copying data for data encoding standard changes
— Logging, auditing, and monitoring
Craig S. Mullins
NEON Enterprise Software, Inc.
craig.mullins@neonesoft.com
www.neonesoft.com
www.craigsmullins.com
www.DB2portal.com
My Books: http://www.craigsmullins.com/booklnks.htm
My Blogs: http://www.craigsmullins.com/blogs.htm
This slide can basically be broken down into two categories: database administration (or managing the containers) and data management (or managing the content within the containers). The DBA roles and responsibilities are well-known and well-supported today. Indeed, if you have an enterprise DBMS there is usually no argument from management that a DBA staff will be needed to support the care and feeding of the database system.
On the other hand, things are not as clear cut when it comes to managing the content of the databases. The data management side of this equation is growing, with increasing support for these requirements. Today we will discuss one aspect of one of these areas: data protection, or data security.
There are many aspects of data protection. Some of the biggest challenges being faced by organizations today include the identification and classification of sensitive data, enacting and enforcing appropriate policies for who can access and change data and database structures, and addressing and improving compliance.
Down from 2006-07, when 65% said budgets were growing and 27% said flat.
As of February 5, 2008 the total number of sensitive records breached since 2005 is: 218,198,364
Almost seven out of 10 companies—68 percent—are losing sensitive data or having it stolen out from under them six times a year, according to new research from the IT Policy Compliance Group. TJX’s massive data loss is just the tip of the iceberg. An additional 20 percent are losing sensitive data a whopping 22 times or more per year.
The ITPCG is a security and compliance policy industry group that counts among its members the Institute of Internal Auditors, the Computer Security Institute and Symantec.
California AB 1950 and SB 1386 are two privacy bills, now laws in the State of California, that require organizations to notify Californians if their personal information is disclosed during a security breach.
SB 1386 was passed in 2002 and became effective on July 1, 2003. This law is directed at state agencies and businesses operating in California. Personal information is defined as an individual’s first name or first initial and last name combined with any of the following:
• Social Security number,
• Driver’s license number or California Identification Card number, or
• Account number, credit or debit card number, in combination with something like a PIN or password which would allow access to the account.
AB 1950 was passed in 2004, and became effective January 1, 2005. It added medical information to the information to be protected and extended the responsibility to organizations outside of the State, if they collect information about California residents. It does not apply to organizations that are subject to other privacy laws.
OFAC – Office of Foreign Assets Control
Sarbanes-Oxley (SOX)
The U.S. Public Accounting Reform and Investor Protection Act of 2002 requires executives and outside auditors to attest to the effectiveness of internal controls for financial reporting.
GLB
The Gramm-Leach-Bliley Act (GLB Act), also known as the Financial Modernization Act of 1999, is a federal law enacted in the United States to control the ways that financial institutions deal with the private information of individuals.
HIPAA
The HIPAA Privacy Rule creates national standards to protect the privacy of individuals' medical records and other personal health information and to give patients more control over their health information.
Basel II
Basel II is a round of deliberations by central bankers from around the world, the goal of which is to produce uniformity in the way banks and banking regulators approach risk management across national borders.
Beginning Jan. 1, 2008, with all federally regulated financial institutions ordered to be in full compliance by Nov. 1, 2008, the so-called "Red Flag" provisions of the Fair and Accurate Credit Transactions Act of 2003 (FACTA) require that financial institutions and creditors develop and deploy an Identity Theft Prevention Program for combating ID theft on new and existing accounts. The Red Flag regulations are included in the same massive regulatory overhaul of the Fair Credit Reporting Act (FCRA) that gave consumers free credit reports and a host of additional protections. Under the Red Flag provisions, each institution must develop a program that will:
• Identify relevant patterns, practices, and specific forms of activity that are "red flags" signaling possible ID theft.
• Include a mechanism to detect red flags identified by the program.
• Quickly respond to detected red flags in a way to both prevent and mitigate ID theft.
• Be updated regularly to reflect changes in real world risks from ID theft.
1. DDL = Data Definition Language
AKA “schema changes”
Manipulates database structure
CREATE, DROP, ALTER objects such as TABLES
2. DML = Data Manipulation Language
Manipulates & retrieves data
INSERT, UPDATE, DELETE and SELECT
3. DCL = Data Control Language
Controls access
GRANT & REVOKE
CONNECT to the database or schema.
SELECT, INSERT, UPDATE, DELETE records.
USAGE -- use a database object such as a schema or a function
Organizations are processing and storing more and more data every year. Average yearly rate of growth: 125%
- large volumes of data interfering with operations (the more data in the operational database, the slower all processes may run)
And regulations (as well as business practices) dictate that data once stored, be retained for longer periods of time.
- No longer months or years, but in some cases multiple decades
More varied types of data are being stored in databases. Not just (structured) characters, numbers, dates & times, but also (unstructured) large text, images, video, and more are being stored.
- Unstructured data greatly expands storage needs
The retained data must be protected from modification – it must represent the authentic business transactions at the time the business was conducted.
- need for better protection from modification
- need for isolation of content from changes
Auditing helps to answer questions like “Who changed data?”, “When was the data changed?”,
and “What was the old content prior to the change?” Your ability to answer such questions is
very important for regulatory compliance. Sometimes it may be necessary to review certain audit
data in greater detail to determine how, when, and by whom the data was changed.
In a nutshell, database auditing means monitoring how stored business data is being accessed. I have seen several alternative terms used, including data monitoring, activity monitoring, and database auditing. Data auditing usually involves monitoring core data assets stored on a server in the data center, such as a database, a file server, or perhaps a mainframe system. The value of data auditing is in creating accountability for the data assets being transacted (for compliance), in helping assess and mitigate data-level risk (for security), and in ultimately offering data assurance and governance.
What we will discuss…
Authorization auditing basically is the process of reviewing who has been granted what level of database access authority.
Access auditing is the topic of this presentation and what we will hereafter refer to as database auditing –or- database access auditing.
Replication auditing is keeping track of what data is copied where and by whom. This type of auditing is also much-needed, but not the topic of this presentation.
Limitations of the audit trace
The audit trace does not record everything, as the following list of limitations indicates:
Auditing takes place only when the audit trace is on.
The trace does not record old data after it is changed (the log records old data).
If an agent or transaction accesses a table more than once in a single unit of recovery, the audit trace records only the first access.
The audit trace does not record accesses if you do not start the audit trace for the appropriate class of events.
The audit trace does not audit some utilities. The trace audits the first access of a table with the LOAD utility, but it does not audit access by the COPY, RECOVER, and REPAIR utilities. The audit trace does not audit access by stand-alone utilities, such as DSN1CHKR and DSN1PRNT.
The trace audits only the tables that you specifically choose to audit.
You cannot audit access to auxiliary tables.
You cannot audit the catalog tables because you cannot create or alter catalog tables.
DB2 Audit Trace
Different trace classes can be used to limit the amount of data.
Plan and Authorization ID can be used to limit Audit data.
Wildcarding is not supported!
Three output destinations
GTF
Requires GTF trace to be active.
Requires system programmer involvement
Real Time processing not available
SMF
Requires access to SMF records.
Requires system programmer involvement
Real Time processing not available.
OPx
Runs in real time.
Requires program to extract audit data from OPx buffers
Requires buffers to be properly sized to prevent system impact.
The log is the database – the database tables are just optimized access paths to the current state of the log.
And is all your audit trail being written to write-once media, so that it can't be manipulated to hide the culprit's tracks?
Using the log can be useful to confirm approved modifications and identify unauthorized changes. The log can show object changes that might affect financial reporting, unauthorized transactions (Who? What? Detect, report, and correct . . .) And it has the additional benefits of not requiring additional hardware and potentially being able to ensure SQL-based data recoverability.
But some companies may discover that scanning the entire log with any of the currently available analyzers could take more than 24 hours (think about large data sharing groups). I think we should work to charge the scanning costs to the accounting department, including the hardware upgrades that will undoubtedly be necessary to support such an enormous process. It might bring some sanity to the discussion.
Data Access Auditing:
Data access auditing is a surveillance mechanism that watches over access to all sensitive information contained within the database. This mechanism brings only the suspicious activity to the auditor’s attention. As was previously discussed, databases generally organize data into tables containing columns. Access to the data generally occurs through a language called Structured Query Language or SQL (Richardson 2). The perfect data access auditing solution would address the following six questions:
Who accessed the data?
When?
Using what computer program or client software?
From what location on the network?
What was the SQL query that accessed the data?
Was it successful; and if so, how many rows of data were retrieved?
The auditor can choose to audit within the client, audit within the database, or audit between the client and the database.
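The six questions above amount to a record schema. An illustrative sketch of such a record follows; the field names are assumptions for this example, not a standard audit format.

```python
# Illustrative audit record answering the six data access auditing
# questions above; field names are invented, not a standard schema.
from dataclasses import dataclass

@dataclass
class AccessAuditRecord:
    user: str           # who accessed the data?
    timestamp: str      # when?
    program: str        # using what computer program or client software?
    network_node: str   # from what location on the network?
    sql_text: str       # what was the SQL query that accessed the data?
    success: bool       # was it successful?
    rows_returned: int  # and if so, how many rows of data were retrieved?

rec = AccessAuditRecord("appuser1", "2008-02-05T10:15:00", "payroll.exe",
                        "10.0.0.17", "SELECT * FROM payroll", True, 42)
```

Whichever capture point the auditor chooses, a solution that cannot populate all seven fields for every access leaves one of the six questions unanswerable.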
The ability to discard the data exactly when it must be gotten rid of is important. If the legal requirement is to retain the data for 35 years, then 35 years + 1 day is not acceptable! The data is there, in many cases, to support lawsuits against your company. If there is no legal mandate to keep it then why go to the expense of keeping it to help others sue you?
Of course, you must be able to deliver on discovery in a lawsuit. This means querying the Archive to deliver the requested data. This is a potentially HUGE issue. If you lose once in court because you cannot produce data for discovery, then lawyers will descend upon your company filing suit against that same data they know you don’t have. And you will lose.
Now keep in mind that Discard does not just mean delete. It means REALLY and TRULY delete forever. Zero out the data by writing over top of it. It must not be able to be recovered by forensic experts or software.
Mark Nicolett, VP and Distinguished Analyst; Jeffrey Wheatman, Research Director.
Gartner Research Report G00153063 (November 2007)