<div dir="ltr">I have seen the same question asked before, and someone answered:<div><a href="https://www.varnish-cache.org/lists/pipermail/varnish-misc/2012-April/021957.html">https://www.varnish-cache.org/lists/pipermail/varnish-misc/2012-April/021957.html</a> </div>
<div><br></div><div>"</div><div><pre style="white-space:pre-wrap;color:rgb(0,0,0)">I don't think what you have in mind would work. Varnish requires an
explicit lock on the files it manages. Sharing a cache between Varnish
instances won't ever work.
What I would recommend you do is to hash incoming requests based on URL so
each time the same URL is hit it is served from the same server. That way
you don't duplicate the content between caches. Varnish can do this, F5's
can do it, haproxy should be able to do this as well.</pre></div><div>"</div><div><br></div><div>So I don't think what I have in mind would work.</div><div><br><br><div class="gmail_quote">---------- Forwarded message ----------<br>
From: <b class="gmail_sendername">Xianzhe Wang</b> <span dir="ltr"><<a href="mailto:wxz19861013@gmail.com">wxz19861013@gmail.com</a>></span><br>Date: 2013/2/20<br>Subject: Re: How to duplicate cached data on all Varnish instance?<br>
To: Sascha Ottolski <<a href="mailto:ottolski@web.de">ottolski@web.de</a>><br>Cc: Varnish misc <<a href="mailto:varnish-misc@varnish-cache.org">varnish-misc@varnish-cache.org</a>><br><br><br><div dir="ltr">Thank you for your help.<div>
Everything you said is correct.</div><div><br></div><div>I think I was not clear. What I want is exactly this: </div><div>two Varnish instances sharing a single cache file (or cache memory), so that for any given request there is only one cache object, shared by both.</div>
<div>Something like this: </div><div>varnish1 --> cache file <-- varnish2</div><div><br></div><div>Do you have any suggestions or experience with this?</div><div>Anything would be appreciated.</div><div><br></div><div>Regards,</div>
<div><br></div><div>Shawn</div><div><br></div></div><div class=""><div class="h5"><div class="gmail_extra"><br><br><div class="gmail_quote">2013/2/20 Sascha Ottolski <span dir="ltr"><<a href="mailto:ottolski@web.de" target="_blank">ottolski@web.de</a>></span><br>
<blockquote class="gmail_quote" style="margin-top:0px;margin-right:0px;margin-bottom:0px;margin-left:0.8ex;border-left-width:1px;border-left-color:rgb(204,204,204);border-left-style:solid;padding-left:1ex">Am Dienstag, 19. Februar 2013, 19:44:53 schrieb Xianzhe Wang:<br>
<div><div>> Here I take nginx as a load balancer, and it connects to 2 Varnish servers.<br>
> I want to share the cached objects between the 2 Varnishes.<br>
> When one Varnish is down, the remaining one will keep working, and the<br>
> cached objects will still be usable.<br>
> Is there anything I can do to achieve this?<br>
><br>
> I also saw an example, something like this:<br>
> <a href="https://www.varnish-cache.org/trac/wiki/VCLExampleHashIgnoreBusy" target="_blank">https://www.varnish-cache.org/trac/wiki/VCLExampleHashIgnoreBusy</a><br>
> But I think it will increase network delay, so I don't want to do it like<br>
> this.<br>
><br>
> Can someone share their experience? Thanks a lot.<br>
><br>
> Shawn<br>
<br>
</div></div>I would say you already have your solution. If nginx sends the requests<br>
randomly to either of the two servers, each will obviously fill its own cache;<br>
so if one goes down, the other is still there. The two caches may not be<br>
completely identical, depending on the size of your cacheable content,<br>
but each should be "warm" enough to serve most requests from its cache.<br>
<br>
And you're not limited to two Varnish servers, of course. The more you<br>
put into your load-balanced cluster, the lower the impact if one fails.<br>
<br>
Cheers<br>
<span><font color="#888888"><br>
Sascha<br>
</font></span></blockquote></div><br></div>
</div></div></div><br></div></div>
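<div dir="ltr"><div><br></div><div>For reference, the URL-hashing approach suggested in the quoted answer above can be sketched as an nginx upstream configuration. This is only a minimal sketch: the upstream name, server addresses, and ports are assumptions for illustration, not taken from this thread.</div><div><pre style="white-space:pre-wrap;color:rgb(0,0,0)"># Route each URL to the same Varnish instance so content is not
# duplicated across caches (hash directive of ngx_http_upstream_module).
upstream varnish_pool {
    # Consistent hashing limits how many URLs are remapped
    # when a server is added or removed.
    hash $request_uri consistent;
    server 10.0.0.11:6081;  # varnish1 (hypothetical address)
    server 10.0.0.12:6081;  # varnish2 (hypothetical address)
}

server {
    listen 80;
    location / {
        proxy_pass http://varnish_pool;
    }
}</pre></div><div>HAProxy can do the same with "balance uri" on the backend. Note that if one Varnish fails, its URLs are remapped to the surviving server, which then warms its own cache from the backend.</div></div>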