multi-terabyte caching

Eric Bowman ebowman at
Fri Nov 20 23:19:29 CET 2009


Apologies if this has been hashed out before.  I did some googling and
read the FAQ, but I could have been more thorough... ;)

I'm considering using Varnish to handle caching for a mapping
application.  From what I've read so far, it seems Varnish may not be a
good choice for this.  In short, I need to cache something like
500,000,000 files that take up about 2 TB of storage.

Using more 1975-era technologies, one of the challenges has been how to
distribute these files across the filesystem without putting too many
files in any one directory.  We have a solution we kind of like, and
there are others out there.
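For context, the kind of scheme I mean is a hash-based fan-out: derive a
stable digest from each object's key and use a couple of hex characters
per directory level.  A minimal sketch (function name and parameters are
just illustrative, not our actual code):

```python
import hashlib
import os

def shard_path(root, key, levels=2, width=2):
    """Map a cache key to a nested directory path so that no single
    directory holds too many files.  With levels=2 and width=2 there
    are 16**4 = 65,536 leaf directories, so 500M files average out
    to roughly 7,600 files per directory."""
    digest = hashlib.sha1(key.encode("utf-8")).hexdigest()
    parts = [digest[i * width:(i + 1) * width] for i in range(levels)]
    return os.path.join(root, *parts, digest)

# Example: a tile key always maps to the same two-level path.
print(shard_path("/cache", "tile/12/2048/1365.png"))
```

The path is deterministic, so lookups need no index -- the key alone
tells you where the file lives.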

My impression is that using Varnish in the standard way here would put a
big strain on Varnish and the OS.  But maybe I'm wrong.  Or, is there a
way to plug in a storage backend to manage this, without running into
the VM thrashing that Squid suffers from?

Thanks for any advice -- Varnish gets such good press I'd really love if
it were straightforward to use it in this case.


Eric Bowman

More information about the varnish-misc mailing list