How to let multiple clients get the response at the same time via streaming

Xianzhe Wang wxz19861013 at gmail.com
Fri Feb 1 08:01:53 CET 2013


Hi,
Thanks for the clarification. What you say is very clear.
I am sorry for my poor English, but I have tried my best to communicate.

There is another question. For example, if we request a .jpg file
(cacheable), Varnish will encapsulate it as an object and insert it into
memory. How can we get the .jpg file from the object?

Thank you for your help again.

-Shawn Wang


2013/1/30 Per Buer <perbu at varnish-software.com>

> Hi,
>
> I was a bit quick and didn't read the whole email the first time. Sorry
> about that. You're actually using the streaming branch already, I see.
> What you're describing is really, really odd. There is a brief lock while
> the "first" object is being fetched, during which other requests are put
> on the waiting list. However, once the hit-for-pass object is created,
> these should be released and pass'ed to the clients.
>
> If the backend takes forever to come back with the response headers, then
> the situation would be something like what you describe. However, that
> would be odd and doesn't make much sense.
>
> PS: The streaming branch was renamed "plus" when it got other experimental
> features. You'll find the source at
> https://github.com/mbgrydeland/varnish-cache and packages at
> repo.varnish-cache.org/test, if I recall correctly.
>
>
>
>
> On Wed, Jan 30, 2013 at 3:32 AM, Xianzhe Wang <wxz19861013 at gmail.com> wrote:
>
>>
>> Hi,
>> Thanks a lot.
>>
>> I tried the option
>> "set req.hash_ignore_busy = true;"
>> in vcl_recv.
>> I think it works, but there is a side effect: it increases backend
>> load.
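>>
>> For reference, this is roughly where it sits (a minimal sketch; nothing
>> else in my vcl_recv is shown):
>>
>>     sub vcl_recv {
>>         # Do not queue behind the busy object; let every client trigger
>>         # its own backend fetch: lower latency, but more backend traffic.
>>         set req.hash_ignore_busy = true;
>>     }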
>>
>> I have an idea about it in my previous email. What do you think about it?
>>
>> Another question: where can I find the "plus" branch of Varnish
>> that matches this issue?
>>
>> Any suggestions will be appreciated.
>> Thanks again for the help.
>>
>> Regards,
>> --
>> Shawn Wang
>>
>>
>> ---------- Forwarded message ----------
>> From: Xianzhe Wang <wxz19861013 at gmail.com>
>> Date: 2013/1/30
>> Subject: Re: How to let multiple clients get the response at the same
>> time via streaming
>> To: Jakub Słociński <kuba at ovh.net>
>>
>>
>> Hi Jakub S.,
>> Thank you very much.
>> I tried it and ran a simple test: two clients requested the big file at
>> the same time, and both got the response stream immediately, so it works.
>> In that case, multiple requests go directly to "pass"; they do not need
>> to wait, but this increases backend load.
>> We need to balance the benefits and drawbacks.
>>
>> What I want is this (see the sketch after the steps):
>>     Client 1 requests URL /foo.
>>     Clients 2..N request URL /foo.
>>     Varnish tasks a worker with fetching /foo for Client 1.
>>     Clients 2..N are now queued, pending the response from that worker.
>>     The worker fetches the response header (just the header, not the
>> body) from the backend and finds the response non-cacheable, then makes
>> the remaining requests (Clients 2..N) go directly to "pass", creating the
>> hit_for_pass object synchronously during the first request (Client 1).
>>     Subsequent requests are now given the hit_for_pass object, instructing
>> them to go to the backend for as long as the hit_for_pass object exists.
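>>
>> In VCL terms, the closest approximation I can sketch (reusing the 120s
>> pass window from my vcl_fetch quoted below; only the non-cacheable branch
>> is shown) is:
>>
>>     sub vcl_fetch {
>>         if (beresp.ttl <= 0s || beresp.http.Set-Cookie) {
>>             # Create the hit_for_pass object as soon as the headers show
>>             # the response is non-cacheable; queued clients (2..N) are
>>             # then released and pass'ed for the next 120 seconds.
>>             set beresp.ttl = 120s;
>>             return (hit_for_pass);
>>         }
>>         return (deliver);
>>     }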
>>
>> As I outlined above, is it feasible? Or do you have any suggestions?
>>
>> Thanks again for the help.
>>
>> Regards,
>> --
>> Shawn Wang
>>
>>
>>
>> 2013/1/29 Jakub Słociński <kuba at ovh.net>
>>
>>> Hi Xianzhe Wang,
>>> you should try the option
>>> "set req.hash_ignore_busy = true;"
>>> in vcl_recv.
>>>
>>> Regards,
>>> --
>>> Jakub S.
>>>
>>>
>>> Xianzhe Wang wrote:
>>> > Hello everyone,
>>> >     My Varnish version is the 3.0.2-streaming release, and I set
>>> > "beresp.do_stream = true" in vcl_fetch in order to "Deliver the object
>>> > to the client directly without fetching the whole object into varnish";
>>> >
>>> > This is a part of my *.vcl file:
>>> >
>>> > sub vcl_fetch {
>>> >     set beresp.grace = 30m;
>>> >
>>> >     set beresp.do_stream = true;
>>> >
>>> >     if (beresp.http.Content-Length && beresp.http.Content-Length ~
>>> >         "[0-9]{8,}") {
>>> >         return (hit_for_pass);
>>> >     }
>>> >
>>> >     if (beresp.http.Pragma ~ "no-cache" ||
>>> >         beresp.http.Cache-Control ~ "no-cache" ||
>>> >         beresp.http.Cache-Control ~ "private") {
>>> >         return (hit_for_pass);
>>> >     }
>>> >
>>> >     if (beresp.ttl <= 0s ||
>>> >         beresp.http.Set-Cookie ||
>>> >         beresp.http.Vary == "*") {
>>> >         set beresp.ttl = 120s;
>>> >         return (hit_for_pass);
>>> >     }
>>> >
>>> >     return (deliver);
>>> > }
>>> >
>>> > Then I request a big file (about 100 MB+) like "xxx.zip" from several
>>> > clients. Only one client can access the object, because "the object will
>>> > be marked as busy as it is delivered."
>>> >
>>> > But if the request goes directly to "pass", multiple clients can get
>>> > the response at the same time.
>>> >
>>> > Also, if I remove
>>> >     if (beresp.http.Content-Length && beresp.http.Content-Length ~
>>> >         "[0-9]{8,}") {
>>> >         return (hit_for_pass);
>>> >     }
>>> > to make the file cacheable, multiple clients can get the response at the
>>> > same time.
>>> >
>>> > Now I want multiple clients to be able to get the response at the same
>>> > time in all situations ("pass", "hit", "hit_for_pass").
>>> >
>>> > What can I do to achieve that?
>>> > Any suggestions will be appreciated.
>>> > Thank you.
>>> >
>>> >  -Shawn Wang
>>>
>>>
>>
>>
>
>
> --
>  <http://www.varnish-software.com/> *Per Buer*
>
> CEO | Varnish Software AS
> Phone: +47 958 39 117 | Skype: per.buer
> We Make Websites Fly!
>
>

