5. Why Caching
The speed up from caching depends on several factors:
• How many times a cached piece of data can be, and actually is, reused by the application.
• The proportion of the response time that is alleviated by caching.
• In applications that are I/O bound, which is most business applications, most of the response time is spent getting data from a database. Therefore the speed up mostly depends on how much reuse a piece of data gets.
6. Why Caching
Amdahl's Law & How to calculate entire system
speed up.
• Amdahl's law, named after Gene Amdahl, is used to find the overall system speed up resulting from a speed up in one part of the system.
1 / ((1 - Proportion Sped Up) + (Proportion Sped Up / Speed Up))
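The formula can be sketched as a small Java method (the class and method names here are my own, for illustration), applied to the two worked examples on the slides that follow:

```java
// Amdahl's law: overall system speed up from speeding up one part.
public class Amdahl {
    /**
     * @param proportion fraction of total response time affected by the speed up (0..1)
     * @param speedUp    factor by which that part is sped up
     * @return overall system speed up
     */
    public static double speedup(double proportion, double speedUp) {
        return 1.0 / ((1.0 - proportion) + proportion / speedUp);
    }

    public static void main(String[] args) {
        // Web page cache: 100% of a 2 s response served from a 2 ms cache
        System.out.println(speedup(1.0, 2000.0 / 2.0));   // ≈ 1000x
        // Database cache: 75% of the time sped up 750x (1500 ms / 2 ms)
        System.out.println(speedup(0.75, 1500.0 / 2.0));  // ≈ 3.98x
    }
}
```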
7. Why Caching
Speed up from a Web Page Cache
Un-cached page time: 2 seconds
Cache retrieval time: 2ms
Proportion: 100%
The expected server side system speedup is:
1 / ((1 - 1) + 1 / 1000)
= 1 / (0 + .001)
= 1000 times system speedup
The speed up of the whole "system" as seen by the browser is much less, since time spent outside the server is not sped up.
8. Why Caching
Speed up from a Database Level Cache
Un-cached page time: 2 seconds
Database time: 1.5 seconds
Cache retrieval time: 2ms
Proportion: 75% (1.5/2)
The expected server side system speedup is:
1 / ((1 - .75) + .75 / (1500/2))
= 1 / (.25 + .75/750)
= 3.98 times system speedup
The speed up of the whole "system" as seen by the browser is much less, since time spent outside the server is not sped up.
11. ORM Caching
1st Level Cache
Session Caching:
– Enabled by default.
– Used transparently during the session.
– All objects that were saved or retrieved, e.g. via:
• save
• get
• list
12. ORM Caching
1st Level Cache
Session Caching:
– flush() - will synchronize the session cache to the database.
– clear() - will evict all objects from the session cache.
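The behaviour above can be modelled with a tiny self-contained sketch, where a plain map stands in for the real session cache (class and method names are illustrative, not an ORM API):

```java
import java.util.HashMap;
import java.util.Map;

// Simplified model of a 1st level (session) cache: get() only hits the
// "database" on a cache miss; clear() evicts everything, so the next
// get() must go back to the database.
public class SessionModel {
    private final Map<Long, String> cache = new HashMap<>();
    int dbHits = 0; // counts simulated database round trips

    public String get(Long id) {
        return cache.computeIfAbsent(id, k -> {
            dbHits++;            // cache miss: load from the database
            return "row-" + k;   // stand-in for a mapped entity
        });
    }

    public void clear() {
        cache.clear();           // evict all objects from the session
    }
}
```

Calling get(1L) twice costs one database hit; after clear(), the next get(1L) goes back to the database.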
Cache concurrency strategies:
– Read-only: suitable for data which never changes. Use it for reference data only.
– Read-write: use this strategy for read-mostly data where it is critical to prevent stale data in concurrent transactions, in the rare case of an update.
– Transactional: again, use this strategy for read-mostly data where it is critical to prevent stale data in concurrent transactions, in the rare case of an update.
– Nonstrict-read-write: this strategy makes no guarantee of consistency between the cache and the database. Use it if data hardly ever changes and a small likelihood of stale data is not of critical concern.
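In Hibernate (one common ORM), a strategy is chosen per entity via the @Cache annotation and the CacheConcurrencyStrategy enum (READ_ONLY, READ_WRITE, NONSTRICT_READ_WRITE, TRANSACTIONAL). A mapping sketch, with a hypothetical Country entity as reference data:

```java
import javax.persistence.Entity;
import javax.persistence.Id;
import org.hibernate.annotations.Cache;
import org.hibernate.annotations.CacheConcurrencyStrategy;

// Reference data that never changes: safe to cache read-only.
@Entity
@Cache(usage = CacheConcurrencyStrategy.READ_ONLY)
public class Country {
    @Id
    private Long id;
    private String name;
}
```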
GemFire server cache & Spring Data integration.