Memory Efficiency


Reason #1: Minimize Memory Usage

Legacy memcached was never designed for memory efficiency.

Problem

memcached wastes expensive RAM with unnecessary memory overhead.

memcached combines a large fixed per-entry storage overhead (~100 bytes) with inefficient over-allocation caused by its fixed-size slab allocator.
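To make the over-allocation concrete, here is a minimal Python sketch of slab-class rounding. The 1.25 growth factor is memcached's documented default; the 96-byte smallest chunk and the item sizes are illustrative, and real chunk sizes also include the per-item header.

    # Sketch of fixed-size slab rounding: each item lands in the smallest slab
    # class whose chunk fits it, so anything between two class sizes wastes the
    # difference. Growth factor 1.25 is memcached's default; the 96-byte smallest
    # chunk and the item sizes below are illustrative.

    def slab_classes(smallest=96, factor=1.25, largest=1024 * 1024):
        size = smallest
        while size < largest:
            yield size
            size = int(size * factor)

    def chunk_for(item_bytes):
        return next(c for c in slab_classes() if c >= item_bytes)

    for item in (100, 250, 700, 3000):
        chunk = chunk_for(item)
        print(f"{item:5d}-byte item -> {chunk:5d}-byte chunk ({chunk - item} bytes wasted)")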

We Fixed It

Carrier Cache is designed for the most efficient data storage possible, using pointer-free data structures with metadata that scales in proportion to your data.

Carrier Cache's minimum overhead is 6 bytes per key-value pair (overhead scales automatically with larger data). In memcached, storage overhead is fixed at roughly 100 bytes per key, regardless of whether you store 5 bytes or thousands of bytes per entry.

View Carrier Cache vs. memcached memory savings at Carrier Cache Benchmarks

Memory Savings Comparison

For small data, memcached's RAM growth is dominated by per-key overhead rather than by the data itself. Storing 300 million integers in memcached takes 30 GB of RAM because of that overhead; storing the same 300 million integers in Carrier Cache takes only 3.83 GB.


64-Byte Overhead Example (legacy memcached)
| Item Count (billions) | Item Size | Data Size | Accounting Overhead | Total Size | Overhead Percent |
| 1                     | 100 bytes | 100 GB    | 64 GB               | 164 GB     | 64%              |
| 10                    | 10 bytes  | 100 GB    | 640 GB              | 740 GB     | 640%             |
| 100                   | 10 bytes  | 1,000 GB  | 6,400 GB            | 7,400 GB   | 640%             |
| 500                   | 5 bytes   | 2,500 GB  | 32,000 GB           | 34,500 GB  | 1,280%           |
| 100                   | 5 bytes   | 500 GB    | 6,400 GB            | 6,900 GB   | 1,280%           |
| 200                   | 3 bytes   | 600 GB    | 12,800 GB           | 13,400 GB  | 2,133%           |

6-Byte Overhead Example (Carrier Cache)
| Item Count (billions) | Item Size | Data Size | Accounting Overhead | Total Size | Overhead Percent |
| 1                     | 100 bytes | 100 GB    | 6 GB                | 106 GB     | 6%               |
| 10                    | 10 bytes  | 100 GB    | 60 GB               | 160 GB     | 60%              |
| 100                   | 10 bytes  | 1,000 GB  | 600 GB              | 1,600 GB   | 60%              |
| 500                   | 5 bytes   | 2,500 GB  | 3,000 GB            | 5,500 GB   | 120%             |
| 100                   | 5 bytes   | 500 GB    | 600 GB              | 1,100 GB   | 120%             |
| 200                   | 3 bytes   | 600 GB    | 1,200 GB            | 1,800 GB   | 200%             |
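Both tables follow from simple per-entry arithmetic. Here is a short Python sketch that reproduces the rows above, using the table's 64-byte memcached figure (the prose above cites roughly 100 bytes) and Carrier Cache's 6-byte minimum; GB means 10^9 bytes to match the round numbers.

    # Recompute the overhead-comparison rows from per-entry arithmetic.
    GB = 1_000_000_000

    def row(items_billions, item_size, overhead_per_item):
        items = items_billions * 1_000_000_000
        data_gb = items * item_size / GB
        overhead_gb = items * overhead_per_item / GB
        pct = 100 * overhead_per_item / item_size
        return data_gb, overhead_gb, data_gb + overhead_gb, pct

    for count, size in [(1, 100), (10, 10), (100, 10), (500, 5), (100, 5), (200, 3)]:
        for label, per_item in (("memcached (64 B)", 64), ("Carrier Cache (6 B)", 6)):
            data, over, total, pct = row(count, size, per_item)
            print(f"{label:20s} {count:3d}B x {size:3d} B: data={data:,.0f} GB "
                  f"overhead={over:,.0f} GB total={total:,.0f} GB ({pct:,.0f}%)")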

Security First


Reason #2: Add Encryption

Encryption—have you heard of it? Legacy memcached has no built-in TLS support.

Problem

You can't trust your network. We're in a golden age of network exploitation and data exfiltration to the point where running unencrypted services is professionally irresponsible.

memcached has been in legacy maintenance mode for years and has never supported encrypted connections.

We Fixed It

Carrier Cache includes built-in support for modern, CPU-accelerated TLS encryption, including elliptic-curve key exchange with forward secrecy.

Carrier Cache's unique design also lets it decrypt incoming requests concurrently with encrypting outgoing responses, using as many cores as you have available.
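As a rough illustration of what a TLS-protected cache connection looks like from the client side, here is a minimal Python sketch using only the standard library. The host, port, and anything sent over the connection are placeholders, not Carrier Cache's actual client API or wire protocol.

    # Connect to a hypothetical TLS-enabled cache endpoint and report the
    # negotiated protocol and cipher. Host and port are placeholders.
    import socket
    import ssl

    HOST, PORT = "cache.internal.example", 48211   # hypothetical endpoint

    context = ssl.create_default_context()   # modern defaults: certificate
                                             # validation, TLS 1.2+, ECDHE
    with socket.create_connection((HOST, PORT)) as raw:
        with context.wrap_socket(raw, server_hostname=HOST) as tls:
            # Expect an elliptic-curve, forward-secret suite, e.g. TLS 1.3
            # with x25519 key exchange and AES-GCM or ChaCha20-Poly1305.
            print(tls.version(), tls.cipher())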

Modern Operation


Reason #3: Modern Ops

Modern architectures have so many components that your software must protect itself against misuse.

Problem

memcached's insecure defaults have opened thousands of servers to data loss.

memcached's defaults included listening on all interfaces, enabling accidental data exfiltration, and listening on UDP, which fueled the largest terabit-scale DDoS amplification attacks ever seen.

Even after memcached fixed those defaults in its upstream repository, it can take 5+ years for the changes to reach your production OS's package repositories.

You probably can't trust memcached patch levels from your OS vendor, and you likely aren't building memcached from upstream origin/master yourself.

We Fixed It

Carrier Cache defaults to absolute operational security.

Carrier Cache won't even let you listen on a public interface unless you explicitly set the securityOverrideAllowPublicIP configuration override.

Carrier Cache won't let you listen on wildcard interfaces unless you explicitly set the securityOverrideListenOnAllInterfaces configuration override.

With Carrier Cache, you will never accidentally expose all your data to the public Internet without intentionally configuring it that way yourself.

Carrier Cache also intentionally has no well-known service port, so Internet-wide scans can't trivially discover Carrier Cache instances. Because each deployment defines its own port, global port scans of the kind Shodan enjoys sharing with everybody are far less useful for finding improperly administered servers.
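For illustration only, a deployment that deliberately exposes a public, wildcard listener might look something like the excerpt below. Only the two securityOverride setting names come from the text above; the file format, the listen and port keys, and every value are hypothetical.

    # Hypothetical Carrier Cache configuration excerpt (format and key names
    # other than the securityOverride settings are illustrative).
    listen = 0.0.0.0      # wildcard bind: refused unless explicitly overridden below
    port   = 48211        # no well-known default port; every deployment picks its own

    securityOverrideListenOnAllInterfaces = true   # required to bind the wildcard address
    securityOverrideAllowPublicIP         = true   # required to bind a public interface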


Modern Internals


Reason #4: Modern Internals

Redesigned from the ground up, Carrier Cache has an architecture built for 2025 and beyond, not the 1990s architecture of legacy memcached.

Problem

memcached was designed for a very specific purpose: to use spare RAM on web servers to cache database data for LiveJournal because MySQL wasn't fast enough.

memcached was never designed for operational excellence at scale. It was never designed to run on the multi-terabyte RAM servers of today. Heck, it doesn't even have configuration files you can audit inside revision control.

memcached was built on the best design decisions available 15 years ago, but it has stagnated and become a gaping Internet-wide liability.

We Fixed It

Carrier Cache is built on modern (even post-modern) data structures that take advantage of today's processors and their instruction sets.

Carrier Cache is designed for maximum memory efficiency without resorting to bloated small-data slab-allocator schemes (which inevitably get wedged in production, generating on-call alerts with no remedy other than a server reboot).

Carrier Cache also abandons memcached's wasteful hash-table architecture, with its bloated overhead and online-resizing latency penalties, in favor of self-tuning data management that saves you money by minimizing memory usage while targeting consistent performance.
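To see why online resizing is a latency hazard, here is a deliberately naive Python sketch: an open-addressing table that rehashes every key when it doubles, so one unlucky insert pays for the whole table. Production caches mitigate the spike in various ways, but growing a table online still costs extra memory and work while it happens.

    # A naive open-addressing hash table that doubles in place: the insert that
    # triggers a resize must rehash every existing key, producing a latency spike
    # (and both the old and new arrays are briefly alive, costing extra memory).
    import time

    class NaiveTable:
        def __init__(self):
            self.capacity = 8
            self.slots = [None] * self.capacity
            self.count = 0

        def _index(self, key):
            i = hash(key) % self.capacity
            while self.slots[i] is not None and self.slots[i][0] != key:
                i = (i + 1) % self.capacity
            return i

        def put(self, key, value):
            if (self.count + 1) * 4 > self.capacity * 3:      # keep load under 75%
                old = [slot for slot in self.slots if slot is not None]
                self.capacity *= 2
                self.slots = [None] * self.capacity
                for k, v in old:                              # rehash every key
                    self.slots[self._index(k)] = (k, v)
            i = self._index(key)
            if self.slots[i] is None:
                self.count += 1
            self.slots[i] = (key, value)

    table, worst = NaiveTable(), 0.0
    for n in range(200_000):
        start = time.perf_counter()
        table.put(f"key:{n}", n)
        worst = max(worst, time.perf_counter() - start)
    print(f"worst single insert: {worst * 1000:.1f} ms")      # spikes at each doubling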


Carrier Cache: You'll Love It

Give Carrier Cache a spin at our getting started page, then come back and grab a dozen production licenses once you wonder how you ever served data without Carrier Cache.

Subscribe to receive announcements, release notes, feature updates, progress reports, and maybe even some surprises along the way.