Module LRU Cache #3703
Hi, sorry I couldn't make it to the last PMC; I promised to give some feedback on the caching matter.

I assume that "local" means same network/data center and additionally, but optionally, in-memory. Since this is very heavy on implementation details, I would argue that it's okay if the Java and Go versions have no feature parity.

Prebid Server Java

I can only speak from the prebid-server-java perspective; however, I assume there are equally suitable alternatives in Go. In Java, https://github.com/ben-manes/caffeine is already available as an in-memory caching implementation. It's being used in the […]. There are some hand-written cache systems that might benefit from it, as I would argue that we should […]
This caching component should be generic enough to be used in modules, but also in the rest of the code base, e.g. for the VendorList, geo lookups or stored impressions. This is a rough sketch of how it might look, specifically for modules:

```yaml
modules:
  my-module-1:
    cache:
      storage: memory
      max-size: 10000
  my-module-2:
    # this configuration is modelled as its own data class
    # that can be instantiated from a settings property
    cache:
      storage: redis
      host: redis.mynetwork.local
      port: 8467
      # eviction strategies: https://github.com/ben-manes/caffeine/wiki/Eviction
      max-size: 100000
      retention: 3600
```

A module should be able to instantiate a cache from its configuration. Something like:

```java
var cacheConfig = CacheConfig.fromSettings(...);
var cache = new Cache(cacheConfig);
```

The […]
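To make the idea concrete, such a `CacheConfig` data class could be sketched roughly as below. All names and defaults here are illustrative assumptions, not an actual Prebid Server API:

```java
import java.util.Map;

// Hypothetical sketch: a CacheConfig data class built from flat settings
// properties, mirroring the YAML keys above. Defaults are assumptions.
public record CacheConfig(String storage, String host, int port,
                          int maxSize, int retentionSeconds) {

    public static CacheConfig fromSettings(Map<String, String> settings) {
        return new CacheConfig(
                settings.getOrDefault("storage", "memory"),
                settings.getOrDefault("host", ""),
                Integer.parseInt(settings.getOrDefault("port", "0")),
                Integer.parseInt(settings.getOrDefault("max-size", "10000")),
                Integer.parseInt(settings.getOrDefault("retention", "3600")));
    }
}
```

A record keeps the configuration immutable, and unknown module-specific keys can simply be ignored by the factory method.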
Option two may look like this:

```scala
// scala... sorry, I'm faster in this
import java.nio.ByteBuffer
import scala.concurrent.Future

// result is either an exception, or nothing
type Result = Either[CacheException, Unit]

trait Encoder[T] {
  def encode(value: T): ByteBuffer // or some other low-level primitive
}

trait Decoder[T] {
  def decode(value: ByteBuffer): Either[DecodeException, T]
}

trait Cache[Key, Value] {
  def put(key: Key, value: Value)(implicit enc: Encoder[Value]): Future[Result]
  // the result is empty if the key was not found
  def get(key: Key)(implicit dec: Decoder[Value]): Future[Option[Value]]
}
```

The second option requires module maintainers to write […]
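Since the target here is prebid-server-java, the same shape could be sketched in Java as well. This is a rough illustration under the same assumptions as the Scala sketch (caller-supplied byte-level codecs, async results); the names are hypothetical, not an existing Prebid Server interface:

```java
import java.nio.ByteBuffer;
import java.nio.charset.StandardCharsets;
import java.util.Optional;
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.ConcurrentHashMap;

// Hypothetical Java counterpart: the cache stores raw bytes, and
// (de)serialization is supplied by the caller per operation.
interface Encoder<T> {
    ByteBuffer encode(T value);
}

interface Decoder<T> {
    T decode(ByteBuffer bytes); // may throw on malformed input
}

interface Cache<K, V> {
    CompletableFuture<Void> put(K key, V value, Encoder<V> encoder);
    // the Optional is empty if the key is not present
    CompletableFuture<Optional<V>> get(K key, Decoder<V> decoder);
}

// Trivial in-memory implementation, just to show the contract in action.
final class InMemoryCache<K, V> implements Cache<K, V> {
    private final ConcurrentHashMap<K, ByteBuffer> store = new ConcurrentHashMap<>();

    public CompletableFuture<Void> put(K key, V value, Encoder<V> encoder) {
        store.put(key, encoder.encode(value));
        return CompletableFuture.completedFuture(null);
    }

    public CompletableFuture<Optional<V>> get(K key, Decoder<V> decoder) {
        ByteBuffer bytes = store.get(key);
        return CompletableFuture.completedFuture(
                Optional.ofNullable(bytes).map(b -> decoder.decode(b.duplicate())));
    }
}
```

A Redis-backed implementation would satisfy the same interface, which is the point of keeping the codec out of the cache itself.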
The committee discussed providing caching services to modules in the context of #3512
We agreed that another important component would be a controlled cache local to Prebid Server. Some modules may have a local cache hit ratio high enough to help overall latency.
Requirements:
Note that the 51Degrees module implemented an LRU cache that might be applicable more broadly:
https://github.com/51Degrees/Java-Device-Detection/blob/cbc9a1acf55a863054530152fe42daa9bde2ed07/device-detection-core/src/main/java/fiftyone/mobile/detection/cache/LruCache.java#L67
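For reference, the core of such an LRU cache can be sketched in plain Java using `LinkedHashMap`'s access-order mode. This is a simplified illustration of the general technique, not the 51Degrees implementation linked above:

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Minimal LRU cache sketch: LinkedHashMap in access-order mode moves each
// accessed entry to the end, and removeEldestEntry evicts the
// least-recently-used entry once capacity is exceeded. Not thread-safe;
// a production version would need synchronization or a library like Caffeine.
final class SimpleLruCache<K, V> extends LinkedHashMap<K, V> {
    private final int capacity;

    SimpleLruCache(int capacity) {
        super(16, 0.75f, true); // accessOrder = true
        this.capacity = capacity;
    }

    @Override
    protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
        return size() > capacity;
    }
}
```

Usage:

```java
var cache = new SimpleLruCache<String, Integer>(2);
cache.put("a", 1);
cache.put("b", 2);
cache.get("a");    // touch "a", so "b" becomes least recently used
cache.put("c", 3); // evicts "b"
```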
Looking for community members to help flesh out a technical proposal.