SHARE Orlando, 2011 – Ulf Mattsson, Session 9353
1. PCI Compliance Without Compensating Controls – How to Take Your Mainframe Out of Scope
Ulf Mattsson, CTO, Protegrity
August 8, 2011
Session 9353
2. Ulf Mattsson
• 20 years with IBM Software Development
• Received US Green Card ‘EB 11 – Individual of Extraordinary Ability’
endorsed by IBM Research
• Inventor of 21 Patents
• Encryption Key Management, Policy Driven Data Encryption, Distributed
Tokenization and Intrusion Prevention
• Research member of the International Federation for Information
Processing (IFIP) WG 11.3 Data and Application Security
• Created the Architecture of the Protegrity Database Security
Technology
• Received Industry's 2008 Most Valuable Performers (MVP) award
together with technology leaders from IBM, Google, Cisco, Ingres
and other leading companies
5. Best Source of Incident Data
“It is fascinating that the top threat events
in both 2010 and 2011 are the same
and involve external agents hacking and installing malware
to compromise the confidentiality and integrity of servers.”
Source: 2011 Data Breach Investigations Report, Verizon Business RISK team
6. Compromised Data Types – Number of Records and Percent
[Chart: compromised data types by record count and percentage]
Source: 2011 Data Breach Investigations Report, Verizon Business RISK team and USSS
7. Attacks at Different System Layers
“The perimeter is gone – need for new network security approaches”
[Diagram: attack types mapped to system layers, with the actors that touch each layer]
• Actors: authorized/un-authorized users, contractors, vendors, database admins, system admins
• Data Entry – sniffer attack
• Application – SQL injection
• HW Service – malware / trojan
• Database – database attack
• File System – file attack
• Storage / Backup – media attack
8. PCI DSS – Payment Card Industry Data Security Standard
• Applies to all organizations that hold, process, or
exchange cardholder information
• A worldwide information security standard defined by the
Payment Card Industry Security Standards Council
(formed in 2004)
• Began as five different programs:
• Visa Card Information Security Program, MasterCard Site Data
Protection, American Express Data Security Operating Policy,
Discover Information and Compliance, and the JCB Data Security
Program.
• 12 requirements for compliance, organized into six
logically related groups, which are called "control
objectives."
9. PCI DSS #3, 6, 7, 10 & 12
Build and maintain a secure network.
• Install and maintain a firewall configuration to protect data
• Do not use vendor-supplied defaults for system passwords and other security parameters
Protect cardholder data.
• Protect stored data
• Encrypt transmission of cardholder data and sensitive information across public networks
Maintain a vulnerability management program.
• Use and regularly update anti-virus software
• Develop and maintain secure systems and applications
Implement strong access control measures.
• Restrict access to data by business need-to-know
• Assign a unique ID to each person with computer access
• Restrict physical access to cardholder data
Regularly monitor and test networks.
• Track and monitor all access to network resources and cardholder data
• Regularly test security systems and processes
Maintain an information security policy.
• Maintain a policy that addresses information security
10. PCI DSS #3 & 4 – Protect Cardholder Data
• 3.4 Render PAN, at minimum, unreadable anywhere it is
stored by using any of the following approaches:
• One-way hashes based on strong cryptography
• Truncation
• Index tokens and pads (pads must be securely stored)
• Strong cryptography with associated key-management processes and
procedures
• 4.1 Use strong cryptography to safeguard sensitive
cardholder data during transmission over open, public
networks.
• Comments – cost-effective compliance
• Encrypted PAN is always “in PCI scope”
• Tokens can be “out of PCI scope”
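Two of the 3.4 approaches above (truncation and one-way hashing) can be sketched in a few lines; the helper names and the keyed-hash choice are illustrative assumptions, not part of the standard:

```python
import hashlib
import hmac
import os

def truncate_pan(pan: str) -> str:
    """Truncation: discard all but the last 4 digits (a common policy)."""
    return "x" * (len(pan) - 4) + pan[-4:]

def hash_pan(pan: str, secret: bytes) -> str:
    """One-way hash based on strong cryptography. A keyed hash (HMAC)
    is used here because a bare hash of a 16-digit PAN can be
    brute-forced over the relatively small input space."""
    return hmac.new(secret, pan.encode(), hashlib.sha256).hexdigest()

pan = "4000001234567899"
print(truncate_pan(pan))                   # xxxxxxxxxxxx7899
print(hash_pan(pan, os.urandom(32))[:12])  # first 12 hex chars of the digest
```

Note that the encryption route (approach four) keeps the PAN recoverable and so, as stated above, stays in PCI scope, while truncated values and well-generated tokens do not.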
11. PCI DSS - Appendix B: Compensating
Controls
• Compensating controls may be considered for most PCI DSS requirements
when an entity cannot meet a requirement explicitly as stated, due to
legitimate technical or documented business constraints, but has sufficiently
mitigated the risk associated with the requirement through implementation of
other, or compensating, controls.
• Compensating controls must satisfy the following criteria:
• Meet the intent and rigor of the original PCI DSS requirement.
• Provide a similar level of defense as the original PCI DSS requirement, such that the
compensating control sufficiently offsets the risk that the original PCI DSS
requirement was designed to defend against.
• Be “above and beyond” other PCI DSS requirements. (Simply being in compliance
with other PCI DSS requirements is not a compensating control.)
12. PCI DSS - Network Segmentation
• Network segmentation – isolating (segmenting) the cardholder data environment from the remainder of an entity’s network – is not a PCI DSS requirement.
• However, it is strongly recommended as a method that may reduce:
• The scope of the PCI DSS assessment
• The cost of the PCI DSS assessment
• The cost and difficulty of implementing and maintaining PCI DSS controls
• The risk to an organization (reduced by consolidating cardholder data into
fewer, more controlled locations)
13. Speed of Different Protection Methods
[Chart: transactions per second (16 digits)*, from 100 to 10,000,000 on a log scale, for five methods: Traditional Data Tokenization, Format Preserving Encryption, Data Type Preservation, AES CBC Encryption Standard, Memory Data Tokenization]
*: Speed will depend on the configuration
14. Security of Different Protection Methods
[Chart: security level, from Low to High, for the same five methods: Traditional Data Tokenization, Format Preserving Encryption, Data Type Preservation, AES CBC Encryption Standard, Memory Data Tokenization]
15. Speed and Security
[Chart: transactions per second (16 digits)* and security level (Low to High) plotted together for the same five methods: Traditional Data Tokenization, Format Preserving Encryption, Data Type Preservation, AES CBC Encryption Standard, Memory Data Tokenization]
*: Speed will depend on the configuration
16. Different Approaches for Tokenization
• Traditional Tokenization
• Dynamic Model or Pre-Generated Model
• 5 to 5,000 tokenizations per second
• Next Generation Tokenization
• Memory-tokenization
• 200,000 - 9,000,000+ tokenizations per second
• “The tokenization scheme offers excellent security, since it is
based on fully randomized tables.” *
• “This is a fully distributed tokenization approach with no need
for synchronization and there is no risk for collisions.“ *
*: Prof. Dr. Ir. Bart Preneel, Katholieke University Leuven, Belgium
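The table-based idea behind memory tokenization can be illustrated with a toy sketch. This is my own simplification, not Protegrity's actual algorithm: a static, fully randomized substitution table is generated once and replicated to every token server, so lookups are local, deterministic, and need no synchronization.

```python
import random

rng = random.Random(42)  # stand-in for a securely generated, protected table seed

# One randomized digit-substitution table per position of the tokenized part.
TABLES = [rng.sample(range(10), 10) for _ in range(6)]

def tokenize(pan: str) -> str:
    """Replace the middle 6 digits via table lookup; keep the first 6
    and last 4 (a common token format for a 16-digit PAN)."""
    head, mid, tail = pan[:6], pan[6:12], pan[12:]
    token_mid = "".join(str(TABLES[i][int(d)]) for i, d in enumerate(mid))
    return head + token_mid + tail

pan = "4000001234567899"
token = tokenize(pan)
assert token[:6] == pan[:6] and token[-4:] == pan[-4:]
assert tokenize(pan) == token  # no shared state, so no synchronization needed
```

Because each position's table is a permutation of 0–9, two distinct inputs can never map to the same token, which mirrors the no-collision property quoted above; a real product would use a stronger construction than this single-pass substitution.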
17. Evaluating Encryption & Tokenization Approaches
Approaches compared: Database File Encryption, Database Column Encryption, Centralized Tokenization (old), Memory Tokenization (new)
Evaluation criteria (each approach rated Best to Worst):
• Impact: Availability, Scalability, Latency, CPU Consumption, Data Flow Protection
• Security: Compliance Scoping, Key Management, Randomness, Separation of Duties
18. Evaluating Field Encryption & Distributed Tokenization
Approaches compared: Strong Field Encryption, Formatted Encryption, Memory Tokenization
Evaluation criteria (each approach rated Best to Worst):
• Disconnected environments
• Distributed environments
• Performance impact when loading data
• Transparent to applications
• Expanded storage size
• Transparent to database schema
• Long life-cycle data
• Unix or Windows mixed with “big iron” (EBCDIC)
• Easy re-keying of data in a data flow
• High risk data
• Security – compliance to PCI, NIST
19. Token Flexibility for Different Categories of Data
Type of Data | Input | Token | Comment
Credit Card | 3872 3789 1620 3675 | 8278 2789 2990 2789 | Numeric
Medical ID | 29M2009ID | 497HF390D | Alpha-numeric
Date | 10/30/1955 | 12/25/2034 | Date
E-mail Address | bob.hope@protegrity.com | empo.snaugs@svtiensnni.snk | Alpha-numeric; delimiters in input preserved
SSN | 075-67-2278 | 287-38-2567 | Numeric; delimiters in input preserved
Credit Card (policy) | 3872 3789 1620 3675 | 8278 2789 2990 3675 | Numeric; last 4 digits exposed
Credit Card (masking) | 3872 3789 1620 3675 | 3872 37## #### #### | Presentation mask: expose first 6 digits; stored value may be clear, encrypted, or tokenized at rest
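The presentation-mask row above (expose the first 6 digits) is easy to sketch; `present_masked` is a hypothetical helper, shown here preserving any delimiters in the input:

```python
def present_masked(pan: str, expose: int = 6) -> str:
    """Mask all digits after the first `expose`, keeping delimiters
    (spaces, dashes) exactly where they appear in the input."""
    out, shown = [], 0
    for ch in pan:
        if ch.isdigit():
            out.append(ch if shown < expose else "#")
            shown += 1
        else:
            out.append(ch)  # delimiter: copy through unchanged
    return "".join(out)

print(present_masked("3872 3789 1620 3675"))  # 3872 37## #### ####
```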
20. Some Tokenization Use Cases
• Customer 1
• Vendor lock-in: What if we want to switch payment processor?
• Performance challenge: What if we want to rotate the tokens?
• Performance challenge with initial tokenization
• Customer 2
• Reduced PCI compliance cost by 50%
• Performance challenge with initial tokenization
• End-to-end: looking to expand tokenization to all stores
• Customer 3
• Desired a single vendor
• Desired use of encryption and tokenization
• Looking to expand tokens beyond CCN to PII
• Customer 4
• Remove compensating controls on the mainframe
• Pushing tokens through to avoid compensating controls
21. Tokenization Use Case #2
• A leading retail chain
• 1500 locations in the U.S. market
• Simplify PCI Compliance
• 98% of Use Cases out of audit scope
• Ease of install (had 18 PCI initiatives at one time)
• Tokenization solution was implemented in 2 weeks
• Reduced PCI Audit from 7 months to 3 months
• No 3rd Party code modifications
• Proved to be the best performance option
• 700,000 transactions per day
• 50 million card holder data records
• Conversion took 90 minutes (plan was 30 days)
• Next step – tokenization servers at 1500 locations
22. Case Study 1: Goal – PCI Compliance & Application Transparency
[Diagram: credit card entry at the retail store; files flow over FTP to the central HQ location]
• Retail store: applications; file encryption (Windows)
• Central HQ location: applications; file decryption; database encryption for DB2 (z/OS, iSeries), Oracle, SQL Server; file encryption for Windows, UNIX, Linux, z/OS
23. Case Study 2: Goal – Addressing Advanced Attacks & PCI DSS
[Diagram: credit card entry at the retail store with end-to-end encryption (E2EE) from the point of capture; applications exchange files over FTP with the central HQ location]
• Retail store: encryption at credit card entry; applications
• Central HQ location: applications; database encryption for DB2, SQL Server; file encryption for Windows, UNIX, z/OS
24. Encryption Topologies – Mainframe Example
[Diagram: DB2 encryption topologies on a z/OS mainframe, with keys supplied by a key server]
• Local DB2 encryption (~1 microsecond*): VIEW + UDF (User Defined Function), EDITPROC, or FIELDPROC, using CPACF (CP Assist for Cryptographic Function; formerly CCF) through ICSF (Integrated Cryptographic Services Facility)
• Remote DB2 encryption (~1,000 microseconds*): VIEW + UDF calling a separate crypto server over TCP/IP
*: per 20-byte field
25. Column Encryption Performance – Different Topologies
[Chart: rows decrypted per second (100 bytes), from 1,000 to 1,000,000, by encryption topology (network-attached encryption (SW/HW) vs. local encryption (SW/HW)), shown for data loading (batch), queries (data warehouse & OLTP), and z/OS hardware crypto – CPACF (all operations)]
26. Evaluation of Encryption Options for DB2 on z/OS
Encryption interfaces compared: API, UDF (DB2 V8), UDF (DB2 V9), Fieldproc, Editproc
Criteria (each rated Best to Worst): Performance, PCI DSS, Security, Transparency
27. Different Tokenization Approaches – Performance
[Chart: PAN tokenizations per second, old vs. new topology]
• Old centralized tokenization approach: ~5 per second (outsourced tokenization) up to ~1,000 per second on-site (enterprise total)
• New distributed tokenization approach: ~200,000 per second on-site (per deployed token server)
28. Evaluating Different Tokenization Solutions
Deployment models compared: Hosted/Outsourced (Central (old), Distributed) and On-site/On-premises (Central (old), Distributed, Integrated)
Evaluation areas (each rated Best to Worst):
• Operational needs: Availability, Scalability, Performance
• Pricing model: Per Server, Per Transaction
• Data types: Identifiable – PII, Cardholder – PCI
• Security: Separation, Compliance Scope
29. PCI DSS – Ways to Render the PAN* Unreadable
• Two-way cryptography with associated key management processes
• One-way cryptographic hash functions
• Index tokens and pads
• Truncation (or masking – xxxxxx xxxxxx 6781)
* PAN: Primary Account Number (Credit Card Number)
30. How to Not Break the Data Format
Protection methods applied to a clear-text CCN/PAN (123456 123456 1234):
• Hashing – !@#$%a^&*B()_+!@4#$2%p^&* (length and type changed)
• Binary encryption – !@#$%a^&*B()_+!@ (type changed)
• Alpha encoding – aVdSaH gF4fJh sDla (tokenizing or formatted encryption)
• Numeric encoding – 666666 777777 8888 (tokenizing or formatted encryption)
• Partial encoding – 123456 777777 1234 (tokenizing or formatted encryption; data field length preserved)
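The format impact of each method can be checked directly in code; a small illustrative sketch (the token digits in the partial-encoding line are arbitrary):

```python
import hashlib

pan = "1234561234561234"  # clear-text CCN/PAN, digits only

# Hashing: length and character type both change (64 hex characters).
hashed = hashlib.sha256(pan.encode()).hexdigest()

# Binary encryption (simulated here with raw digest bytes): the type
# changes, so the value no longer fits a numeric CHAR(16) column.
binary = hashlib.sha256(pan.encode()).digest()[:16]

# Partial encoding / tokenization: keep first 6 and last 4 digits and
# replace the middle, so length and type are both preserved.
partial = pan[:6] + "777777" + pan[-4:]

print(len(hashed))                      # 64
print(len(partial), partial.isdigit())  # 16 True
```

Only the encoded forms drop into existing schemas and applications unchanged, which is the point of the slide.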
31. Different Security Options for Data Fields
Approaches compared: Strong Encryption, Formatted Encryption, New Distributed Tokenization, Old Central Tokenization
Evaluation criteria (each approach rated Best to Worst):
• Disconnected environments
• Distributed environments
• Performance impact – data loading
• Transparent to applications
• Expanded storage size
• Transparent to database schema
• Long life-cycle data
• Unix or Windows & “big iron”
• Re-keying of data in a data flow
• High risk data
• Compliance to PCI, NIST
32. Choose Your Defenses – Positioning of Alternatives
Database protection approaches compared: Monitoring/Blocking/Masking; Column-Level Formatted Encryption; Column-Level Strong Encryption; Distributed Tokenization; Central Tokenization; Database File Encryption
Criteria (each rated Best to Worst): Performance, Storage, Availability, Transparency, Security
33. Data Protection Challenges
• Actual protection is not the challenge
• Management of solutions
• Key management
• Security policy
• Auditing and reporting
• Minimizing impact on business operations
• Transparency
• Performance vs. security
• Minimizing the cost implications
• Maintaining compliance
• Implementation Time
34. Single Point of Control for Data Encryption
[Diagram: one encryption solution as the single point of control across platforms]
• z/OS mainframe: applications (via API), RACF, ICSF, hardware security; protects DB2 z/OS and files
• Distributed platforms: DB2 LUW, Informix, System i (hardware security), and others
• Central manager for: encryption keys, security policy, reporting
35. Data Security Management
[Diagram: the Enterprise Data Security Administrator distributes policy securely to the protectors and securely collects their audit logs]
• Protectors: File System Protector, Database Protector, Application Protector
• Components: Policy, Audit Log, Tokenization Server, Secure Archive
• Broad platform support
36. Hiding Data in Plain Sight – Data Tokenization
[Diagram: at data entry, the PAN 400000 123456 7899 is sent to the tokenization server and replaced by the data token 400000 222222 7899; the application and databases hold only the token, while the tokenization server keeps the protected value (Y&SFD%))S()]
37. What is Encryption and Tokenization?
Encryption – a cipher system: cryptographic algorithms and cryptographic keys
Tokenization – a code system: code books and index tokens
Source: McGraw-Hill Encyclopedia of Science & Technology
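The cipher-system vs. code-system distinction can be made concrete with a toy sketch (modular digit addition stands in for a real cipher such as AES; illustrative only):

```python
import secrets

KEY = 7  # toy key; a real cipher system would use AES with managed keys

def encrypt(pan: str) -> str:
    """Cipher system: output is a mathematical function of input and key."""
    return "".join(str((int(d) + KEY) % 10) for d in pan)

_codebook: dict[str, str] = {}

def tokenize(pan: str) -> str:
    """Code system: output is a random value from a codebook and has no
    mathematical relation to the input."""
    if pan not in _codebook:
        _codebook[pan] = "".join(secrets.choice("0123456789") for _ in pan)
    return _codebook[pan]

pan = "1234"
print(encrypt(pan))                    # 8901 - recoverable by anyone with the key
print(tokenize(pan) == tokenize(pan))  # True - recoverable only via the codebook
```

The practical consequence is the scoping difference noted earlier: an encrypted value can always be reversed with the key, while a token is only a reference into the code book.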
38. Comments on Visa’s Tokenization Best Practices
• Visa’s recommendation should simply have been to use a random number
• You should not write your own “home-grown” token servers
39. Reducing the Attack Surface
[Diagram: fields flowing through applications & databases; most fields carry data tokens (e.g. 123456 999999 1234), shrinking the points of unprotected sensitive information and leaving the rest as protected sensitive information]
40. Positioning of Different Protection Options
Approaches compared: Strong Encryption, Formatted Encryption, Data Tokens
Criteria (each rated Best to Worst): Security & Compliance, Total Cost of Ownership, Use of Encoded Data
41. Why Tokenization – A Triple Play
1. No Masking
2. No Encryption
3. No Key Management
42. Why In-Memory Tokenization
1. Better
2. Faster
3. Lower Cost / TCO
Editor’s Notes
ULF
• Performance – impact on operations: end users, data processing windows
• Storage – impact on data storage requirements
• Security – how secure is the data at rest; impact on data access (separation of duties)
• Transparency – changes to application(s); impact on supporting utilities and processes
ULF
These are particular use cases where you should “watch out”. They do not capture all criteria and use cases.