A 40-minute version of my previous talk on layered caching in OpenResty, given at OpenResty Con 2018 in Hangzhou, China.
In this talk, we will dive into the new "mlcache" library, which aims at providing a simple abstraction for layered caching in OpenResty. We will explore several practical use-cases for it that will help you achieve high-performance goals for your applications.
What type of caching?
Any Lua state fetched from I/O
Dynamic configuration
Dynamic logic (injected code)
Client sessions
...
Eventually, every OpenResty application needs Lua-land caching.
The challenges of Lua-land caching
LuaJIT VM limitations
lua shared dict limitations
Split nginx workers
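To make the shared dict limitation concrete, here is a minimal, hedged sketch (the dict name and keys are illustrative; it assumes a `lua_shared_dict my_cache 10m;` directive in nginx.conf). A shared dict only stores scalar types, so Lua tables must be serialized on every write and deserialized on every read, in every worker:

```lua
local cjson = require "cjson.safe"

local dict = ngx.shared.my_cache

local user = { id = 123, name = "alice" }

-- dict:set("user:123", user) would fail: tables are not supported
local ok, err = dict:set("user:123", cjson.encode(user))
if not ok then
  ngx.log(ngx.ERR, "failed to cache user: ", err)
end

local json = dict:get("user:123")
if json then
  user = cjson.decode(json) -- a fresh table allocated in this worker
end
```

Every worker that reads this entry pays the deserialization cost and allocates its own copy of the table, which is one of the motivations for an L1/L2 layered design.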
Example: Caching in Kong
-- Retrieve a value from the cache, or fetch it
-- if it's a miss
function _M.cache_get_and_set(key, cb)
  local val = _M.cache_get(key)
  if not val then
    val = cb()
    if val then
      local succ, err = _M.cache_set(key, val)
      if not succ and ngx then
        ngx.log(ngx.ERR, err)
      end
    end
  end

  return val
end
Caching in Kong in 2015. Not ideal. Let’s switch to a library?
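As a point of comparison, here is a hedged sketch of setting up lua-resty-mlcache to replace the hand-rolled pattern above (the shared dict name and option values are illustrative; it assumes a `lua_shared_dict cache_shared_dict 10m;` directive in nginx.conf):

```lua
local mlcache = require "resty.mlcache"

local cache, err = mlcache.new("my_cache", "cache_shared_dict", {
  lru_size = 1000, -- L1: per-worker, Lua-land LRU cache size
  ttl      = 3600, -- cache hits for 1 hour
  neg_ttl  = 60,   -- cache misses (nil values) for 1 minute
})
if not cache then
  error("failed to create mlcache instance: " .. err)
end
```

With an instance in hand, the get-or-fetch logic from the previous slide collapses into a single `cache:get()` call, as the next examples show.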
Caching primitives in OpenResty
OpenResty offers a few off-the-shelf options for caching data in Lua-land:
The Lua VM itself
lua-resty-lrucache
lua shared dict
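A minimal, hedged sketch of lua-resty-lrucache, the per-worker Lua-land primitive (sizes and keys are illustrative):

```lua
local lrucache = require "resty.lrucache"

local c, err = lrucache.new(200) -- hold up to 200 items
if not c then
  error("failed to create lrucache: " .. (err or "unknown"))
end

c:set("dog", 32)      -- no TTL: cached until evicted by the LRU policy
c:set("cat", 56, 0.5) -- optional per-item TTL, in seconds

local dog = c:get("dog") -- no (de)serialization cost, but per-worker only
```

Unlike a shared dict, lrucache can hold arbitrary Lua values (tables, functions) with no serialization, but each worker keeps its own independent copy.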
Example: Database caching
local function fetch_user(id)
  return db:query_user(id) -- row or nil
end

local id = 123

-- mlcache keys must be strings, hence the concatenation
local user, err = cache:get("users:" .. id, nil, fetch_user, id)
if err then
  ngx.log(ngx.ERR, "failed to fetch user: ", err)
  return
end

if user then
  print(user.id) -- 123
else
  -- miss is cached
end
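Since the nil returned by fetch_user is cached too (a negative hit), repeated lookups for a missing user will not reach the database until neg_ttl expires. A hedged sketch of inspecting such an entry with peek(), which reads the L2 shared dict without promoting the value to L1 (the key is illustrative):

```lua
local ttl, err, value = cache:peek("users:404")
if err then
  ngx.log(ngx.ERR, "failed to peek: ", err)
elseif ttl then
  -- value is nil for a cached miss
  ngx.say("entry cached for another ", ttl, "s")
end
```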
Example: DNS records caching
local resolver = require "resty.dns.resolver"

local r, err = resolver.new({ nameservers = { "1.1.1.1" } })

local function resolve(name)
  local answers, err = r:query(name)
  if not answers then
    return nil, err
  end

  local ip  = answers[1].address
  local ttl = answers[1].ttl

  return ip, nil, ttl -- provide TTL from the DNS record
end

local host = "openresty.org"

local ip, err = cache:get(host, nil, resolve, host) -- callback arg
Example: Injected logic
local function compile_code(row)
  row.f = loadstring(row.code) -- load only once per worker
  return row
end

local user, err = cache:get(user_id, {
  l1_serializer = compile_code
}, fetch_code) -- fetch Lua code from the DB

user.f() -- now a valid Lua function
Resiliency: Serving Stale Data
local opts = { resurrect_ttl = 30 }

local value, err = cache:get(key, opts, db.fetch)
If the L3 lookup fails (e.g. timeout), value may be served as stale
data, ensuring resiliency.
Observability
Ok, caching is great. But how do you ensure it is properly
configured?
lua-resty-mlcache takes the next step as a caching library by
providing caching metrics.
Observability
local value, err, hit_level = cache:get(key, nil, callback, ...)

print(hit_level) -- 1, 2, 3, or 4

By keeping track of the hit level, we can compute hit/miss ratios for each layer, as well as track stale data servings.
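For instance, the hit level can feed per-layer counters. A hypothetical sketch (the counter table and wrapper are illustrative, not part of the mlcache API):

```lua
-- 1 = L1 (LRU), 2 = L2 (shared dict), 3 = L3 (callback),
-- 4 = stale data served via resurrect_ttl
local hits = { 0, 0, 0, 0 }

local function instrumented_get(key, opts, cb, ...)
  local value, err, hit_level = cache:get(key, opts, cb, ...)
  if hit_level then
    hits[hit_level] = hits[hit_level] + 1
  end
  return value, err
end
```

Periodically exporting these counters (e.g. via a log phase handler or a metrics endpoint) reveals whether the L1/L2 sizes and TTLs are properly tuned.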
Observability
Plans for providing metrics do not stop here. In the future, we will:
Provide more information than the hit level on lookups
Use the new shm:ttl and shm:free_space APIs
A complete solution
Layered caching architecture
Built-in cache-stampede prevention
Efficient design (JIT)
Caching of Lua tables + negative hits
Custom serializers
Flexible deployment capabilities
Cache invalidation
With built-in IPC, or custom ones:
lua-resty-worker-events
slact/ngx_lua_ipc
Stale data serving
Observability
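The cache invalidation bullet above can be sketched with the built-in IPC. This assumes the mlcache instance was created with the ipc_shm option pointing to a dedicated shared dict (e.g. `lua_shared_dict ipc_shared_dict 1m;`); the key is illustrative:

```lua
-- In the worker performing the change: delete() invalidates the entry
-- in L1 and L2, and broadcasts the event to other workers via ipc_shm.
local ok, err = cache:delete("users:123")
if not ok then
  ngx.log(ngx.ERR, "failed to delete: ", err)
end

-- In every worker, before doing lookups (e.g. in an access phase
-- handler): poll pending events so stale L1 values are evicted.
local ok, err = cache:update()
if not ok then
  ngx.log(ngx.ERR, "failed to poll ipc events: ", err)
end
```

Alternatively, the ipc option accepts custom hooks so the same events can ride on lua-resty-worker-events or slact/ngx_lua_ipc instead of the built-in mechanism.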
In the wild
Used in Kong for over a year
Contributions from Cloudflare and Kong Inc.
Extensive test suite
OpenResty 1.11.2.2 to 1.13.6.2 (current)
OpenResty contributions & Improvements
User flags support - openresty/lua-resty-lrucache#35
lua shared dict R/W lock - openresty/lua-nginx-module#1287
TODO: custom l2 serializer
TODO: Lua-land R/W lock
TODO: A native IPC solution for OpenResty