Building a Highly Concurrent Cache in Go: A Hitchhiker's Guide

February 25, 2024 · 41 min

Description

This talk explores the design, implementation, and optimization of a concurrent cache in Go, incorporating LRU and LFU eviction policies and concurrency patterns beyond sync.Mutex. It covers building custom data structures for specific needs, applying Go's concurrency features, and using benchmarking and profiling to measure and improve performance. The discussion surveys cache replacement algorithms, including hybrid approaches and decaying LRU, and examines the trade-offs of optimizing for speed and efficiency in concurrent environments.
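As a baseline for the ideas the talk builds on, here is a minimal sketch of a mutex-guarded LRU cache in Go. All type and method names here are illustrative assumptions, not code from the talk; the talk itself goes beyond a single sync.Mutex, but this is the starting point such designs typically improve upon.

```go
package main

import (
	"container/list"
	"fmt"
	"sync"
)

// entry is what we store in the list: the key is kept alongside
// the value so eviction can remove the map entry too.
type entry struct {
	key   string
	value int
}

// lruCache guards all state with one mutex; the doubly linked list
// keeps the most recently used element at the front.
type lruCache struct {
	mu       sync.Mutex
	capacity int
	ll       *list.List               // front = most recently used
	items    map[string]*list.Element // key -> element in ll
}

func newLRUCache(capacity int) *lruCache {
	return &lruCache{
		capacity: capacity,
		ll:       list.New(),
		items:    make(map[string]*list.Element),
	}
}

func (c *lruCache) Get(key string) (int, bool) {
	c.mu.Lock()
	defer c.mu.Unlock()
	el, ok := c.items[key]
	if !ok {
		return 0, false
	}
	c.ll.MoveToFront(el) // a read counts as a "use"
	return el.Value.(*entry).value, true
}

func (c *lruCache) Put(key string, value int) {
	c.mu.Lock()
	defer c.mu.Unlock()
	if el, ok := c.items[key]; ok {
		el.Value.(*entry).value = value
		c.ll.MoveToFront(el)
		return
	}
	c.items[key] = c.ll.PushFront(&entry{key, value})
	if c.ll.Len() > c.capacity {
		// Evict the least recently used entry (back of the list).
		oldest := c.ll.Back()
		c.ll.Remove(oldest)
		delete(c.items, oldest.Value.(*entry).key)
	}
}

func main() {
	c := newLRUCache(2)
	c.Put("a", 1)
	c.Put("b", 2)
	c.Get("a")    // touch "a"; "b" is now least recently used
	c.Put("c", 3) // evicts "b"
	_, ok := c.Get("b")
	fmt.Println(ok) // false: "b" was evicted
}
```

The single mutex makes every operation serialize, which is exactly the contention problem the talk's more advanced patterns (sharding, lock-free reads, and so on) aim to reduce.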