How to make multiple clients get the response at the same time by streaming
Jakub Słociński
kuba at ovh.net
Wed Jan 30 16:19:23 CET 2013
OK, thanks for the clarification, Per.
Per Buer wrote:
> No. It doesn't work.
>
> You'll end up with multiple versions of the same object in memory, and
> you'll disable caching while multiple requests are taking place. Sort of
> like a braindamaged "pass" that doesn't free up the memory when the
> request is done.
>
>
>
> On Wed, Jan 30, 2013 at 11:22 AM, Jakub Słociński <kuba at ovh.net> wrote:
>
> >
> > Hi,
> > I would bet that hash_ignore_busy works under the version Xianzhe Wang
> > mentioned - 3.0.2-streaming :)
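> >
> > For reference, a minimal sketch of that combination (assuming the
> > experimental 3.0.2-streaming build; as Per notes, on plain Varnish
> > this just duplicates the object in memory):
> >
> >     sub vcl_recv {
> >         # Don't queue behind a busy object at lookup time.
> >         set req.hash_ignore_busy = true;
> >     }
> >
> >     sub vcl_fetch {
> >         # Start delivering to clients while the body is still being
> >         # fetched from the backend.
> >         set beresp.do_stream = true;
> >     }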
> >
> > --
> > Regards,
> > Jakub Słociński
> >
> > Per Buer wrote:
> > > Hi.
> > >
> > > On Tue, Jan 29, 2013 at 1:13 PM, Jakub Słociński <kuba at ovh.net> wrote:
> > >
> > > > Hi Xianzhe Wang,
> > > > you should try the option
> > > > "set req.hash_ignore_busy = true;"
> > > > in vcl_recv.
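> > > >
> > > > A minimal sketch of that placement (assuming Varnish 3.x VCL
> > > > syntax):
> > > >
> > > >     sub vcl_recv {
> > > >         # Ignore busy objects at lookup time instead of waiting
> > > >         # for an in-flight fetch of the same object to finish.
> > > >         set req.hash_ignore_busy = true;
> > > >     }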
> > > >
> > >
> > > No. This won't work.
> > >
> > > Plain vanilla Varnish doesn't support streaming properly. You should use
> > > the "plus" branch of Varnish if you need it.
> > >
> > >
> > > >
> > > > Regards,
> > > > --
> > > > Jakub S.
> > > >
> > > >
> > > > Xianzhe Wang wrote:
> > > > > Hello everyone,
> > > > > My Varnish version is the 3.0.2-streaming release, and I set
> > > > > "beresp.do_stream = true" in vcl_fetch in order to "deliver the
> > > > > object to the client directly without fetching the whole object
> > > > > into varnish";
> > > > >
> > > > > This is part of my *.vcl file:
> > > > >
> > > > > sub vcl_fetch {
> > > > >     set beresp.grace = 30m;
> > > > >
> > > > >     # Deliver the object to the client while it is still being
> > > > >     # fetched from the backend.
> > > > >     set beresp.do_stream = true;
> > > > >
> > > > >     # Don't cache large objects (a Content-Length of 8+ digits,
> > > > >     # i.e. 10 MB or more).
> > > > >     if (beresp.http.Content-Length &&
> > > > >         beresp.http.Content-Length ~ "[0-9]{8,}") {
> > > > >         return (hit_for_pass);
> > > > >     }
> > > > >
> > > > >     # Don't cache responses the backend marks as uncacheable.
> > > > >     if (beresp.http.Pragma ~ "no-cache" ||
> > > > >         beresp.http.Cache-Control ~ "no-cache" ||
> > > > >         beresp.http.Cache-Control ~ "private") {
> > > > >         return (hit_for_pass);
> > > > >     }
> > > > >
> > > > >     # Otherwise uncacheable: remember the pass decision for 120s.
> > > > >     if (beresp.ttl <= 0s ||
> > > > >         beresp.http.Set-Cookie ||
> > > > >         beresp.http.Vary == "*") {
> > > > >         set beresp.ttl = 120s;
> > > > >         return (hit_for_pass);
> > > > >     }
> > > > >
> > > > >     return (deliver);
> > > > > }
> > > > >
> > > > > Then I request a big file (about 100 MB+) like "xxx.zip" from
> > > > > clients. Only one client can access the object, because "the
> > > > > object will be marked as busy as it is delivered."
> > > > >
> > > > > But if the request goes directly to "pass", multiple clients can
> > > > > get the response at the same time.
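> > > > >
> > > > > A hypothetical way to send such requests straight to "pass" from
> > > > > vcl_recv (the ".zip" URL pattern here is made up for
> > > > > illustration):
> > > > >
> > > > >     sub vcl_recv {
> > > > >         # Bypass the cache entirely so concurrent clients are
> > > > >         # never serialized behind a single busy object.
> > > > >         if (req.url ~ "\.zip$") {
> > > > >             return (pass);
> > > > >         }
> > > > >     }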
> > > > >
> > > > > Also, if I remove
> > > > >
> > > > >     if (beresp.http.Content-Length &&
> > > > >         beresp.http.Content-Length ~ "[0-9]{8,}") {
> > > > >         return (hit_for_pass);
> > > > >     }
> > > > >
> > > > > to make the file cacheable, multiple clients can get the response
> > > > > at the same time.
> > > > >
> > > > > Now I want multiple clients to be able to get the response at the
> > > > > same time in all situations ("pass", "hit", "hit_for_pass").
> > > > >
> > > > > What can I do to achieve this?
> > > > > Any suggestions will be appreciated.
> > > > > Thank you.
> > > > >
> > > > > -Shawn Wang
> > > >
> >
>
>
>
> --
> <http://www.varnish-software.com/> *Per Buer*
> CEO | Varnish Software AS
> Phone: +47 958 39 117 | Skype: per.buer
> We Make Websites Fly!