Hello, I’m Ganesh. I’m working on FreeDevTools online, currently building a single platform for all development tools, cheat codes, and TL;DRs — a free, open-source hub where developers can quickly find and use tools without the hassle of searching the internet.
There is a famous saying in computer science:
There are only two hard things in Computer Science: cache invalidation and naming things. – Phil Karlton
It is really hard to balance performance (caching everything) with accuracy (showing new data immediately). If you cache too much, users see old data. If you invalidate too often, your databases get hammered with traffic.
Performance is a crucial factor. Whether you’re building a web application, a microservice, or any other software system, optimizing for latency on your APIs (response times) can enhance user experience and reduce operational costs (CPU/Memory).
In-memory caching and its advantages
In-memory caching involves storing frequently accessed data in memory for quick retrieval, rather than fetching it from a slower data store, such as a database, every time it’s needed.
Because the data lives in memory, reads avoid a round trip to the backing store, which significantly reduces latency and improves overall system performance.
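To make that concrete, here is a minimal cache-aside sketch; fetchUserFromDB is a hypothetical stand-in for a slow database call, not a real API:

package main

import "fmt"

// cache holds previously fetched records in memory.
var cache = map[string]string{}

// fetchUserFromDB stands in for a slow data-store lookup (hypothetical).
func fetchUserFromDB(id string) string {
    return "record-for-" + id
}

// getUser serves from memory when possible and only falls back
// to the data store on a miss.
func getUser(id string) string {
    if v, ok := cache[id]; ok {
        return v // hit: no round trip to the database
    }
    v := fetchUserFromDB(id) // miss: fetch once...
    cache[id] = v            // ...and remember it for next time
    return v
}

func main() {
    fmt.Println(getUser("42")) // first call goes to the "database"
    fmt.Println(getUser("42")) // second call is served from memory
}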
Go, with its lightweight goroutines and channels, makes it straightforward to build highly concurrent systems, which is ideal for handling cache operations efficiently, especially in scenarios with high read and write throughput.
Note: There are some popular packages like go-cache that can be used directly.
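For instance, here is roughly how go-cache (github.com/patrickmn/go-cache) is used; treat this as a sketch and check the package docs for the current API:

package main

import (
    "fmt"
    "time"

    "github.com/patrickmn/go-cache"
)

func main() {
    // New cache: items expire after 5 minutes by default,
    // and expired items are purged every 10 minutes.
    c := cache.New(5*time.Minute, 10*time.Minute)

    // Store a value with the default expiration.
    c.Set("key1", "value1", cache.DefaultExpiration)

    // Look it up; the second return value reports hit or miss.
    if v, found := c.Get("key1"); found {
        fmt.Println("Value for key1:", v)
    }
}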
Implementation
Now, let’s explore how to implement a basic in-memory cache in Go.
Generics have been available since Go 1.18, so a type-safe generic cache is also possible; a sketch of that approach follows, though the rest of this article keeps things simple with map[string]interface{}.
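Here is a minimal generic sketch (my own illustration, assuming Go 1.18+; the identifiers are arbitrary):

package main

import (
    "fmt"
    "sync"
)

// Cache is a type-safe generic variant of the cache we build below.
type Cache[K comparable, V any] struct {
    data map[K]V
    mu   sync.RWMutex
}

func NewCache[K comparable, V any]() *Cache[K, V] {
    return &Cache[K, V]{data: make(map[K]V)}
}

func (c *Cache[K, V]) Set(key K, value V) {
    c.mu.Lock()
    defer c.mu.Unlock()
    c.data[key] = value
}

func (c *Cache[K, V]) Get(key K) (V, bool) {
    c.mu.RLock()
    defer c.mu.RUnlock()
    v, ok := c.data[key]
    return v, ok
}

func main() {
    c := NewCache[string, int]()
    c.Set("answer", 42)
    if v, ok := c.Get("answer"); ok {
        fmt.Println(v + 1) // v is an int; no type assertion needed
    }
}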
Using map[string]interface{}
For a cache, we need 4 main functions:
- Get() - to access a value by key
- Set() - to set a value for a key
- Clear() - to clear/reset the cache
- Delete() - to remove a specific key

We create a Cache struct and bind the 4 methods defined above.
We use map[string]interface{} here since it allows us to store a value of any type against a string key. You can learn more about it here.
package main

import (
    "fmt"
    "sync"
)

// Cache represents an in-memory key-value store.
type Cache struct {
    data map[string]interface{}
    mu   sync.RWMutex
}

// NewCache creates and initializes a new Cache instance.
func NewCache() *Cache {
    return &Cache{
        data: make(map[string]interface{}),
    }
}

// Set adds or updates a key-value pair in the cache.
func (c *Cache) Set(key string, value interface{}) {
    c.mu.Lock()
    defer c.mu.Unlock()
    c.data[key] = value
}

// Get retrieves the value associated with the given key from the cache.
func (c *Cache) Get(key string) (interface{}, bool) {
    c.mu.RLock()
    defer c.mu.RUnlock()
    value, ok := c.data[key]
    return value, ok
}

// Delete removes a key-value pair from the cache.
func (c *Cache) Delete(key string) {
    c.mu.Lock()
    defer c.mu.Unlock()
    delete(c.data, key)
}

// Clear removes all key-value pairs from the cache.
func (c *Cache) Clear() {
    c.mu.Lock()
    defer c.mu.Unlock()
    c.data = make(map[string]interface{})
}

func main() {
    cache := NewCache()

    // Adding data to the cache
    cache.Set("key1", "value1")
    cache.Set("key2", 123)

    // Retrieving data from the cache
    if val, ok := cache.Get("key1"); ok {
        fmt.Println("Value for key1:", val)
    }

    // Deleting data from the cache
    cache.Delete("key2")

    // Clearing the cache (all operations are synchronous, so no sleep is needed)
    cache.Clear()
}
We now have a working cache.
We used a sync.RWMutex to avoid data races under concurrent access: Set, Delete, and Clear take the write lock, while Get takes the read lock so multiple readers can proceed in parallel.
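As a quick sanity check (my own addition, not part of the original walkthrough), the snippet below hammers the cache from 100 goroutines; with the RWMutex in place it should run cleanly under go run -race. It reuses the Cache type defined above, replacing the earlier main:

package main

import (
    "fmt"
    "sync"
)

func main() {
    cache := NewCache() // the Cache defined above
    var wg sync.WaitGroup

    // Spawn 100 goroutines that write and read concurrently.
    for i := 0; i < 100; i++ {
        wg.Add(1)
        go func(n int) {
            defer wg.Done()
            key := fmt.Sprintf("key%d", n)
            cache.Set(key, n)
            if v, ok := cache.Get(key); ok && v != n {
                fmt.Println("unexpected value for", key)
            }
        }(i)
    }
    wg.Wait()
    fmt.Println("done: all operations completed without data races")
}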
Conclusion
We covered the importance of caching and how it helps improve performance and efficiency.
If you noticed, the current cache doesn’t support key expiry, i.e., a TTL (time to live).
In the next article, we’ll add expiry support so that an item is removed from the cache once its defined TTL elapses.
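As a rough preview (a sketch of the general approach, not the final design from the next article), the usual trick is to store an expiry timestamp alongside each value and treat anything past its deadline as a miss:

package main

import (
    "fmt"
    "time"
)

// item wraps a cached value with its expiry deadline.
type item struct {
    value     interface{}
    expiresAt time.Time
}

func main() {
    it := item{value: "value1", expiresAt: time.Now().Add(50 * time.Millisecond)}

    time.Sleep(100 * time.Millisecond)

    // On Get, an entry past its deadline is treated as a miss
    // (and can be deleted lazily or by a background janitor).
    if time.Now().After(it.expiresAt) {
        fmt.Println("key expired: treat as a cache miss")
    }
}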
I’ve been building FreeDevTools, a collection of UI/UX-focused tools crafted to simplify workflows, save time, and reduce friction when searching for tools and materials.
Any feedback or contributions are welcome!
It’s online, open-source, and ready for anyone to use.