Rewriting served URLs dependent on user agent

Ian Evans dheianevans at gmail.com
Fri Mar 1 00:33:23 CET 2013


I've been looking at this site's discussion of how they're handling
the traffic loss caused by Google's redesign of their image search.

http://pixabay.com/en/blog/posts/hotlinking-protection-and-watermarking-for-google-32/

One of the ways they handle it is by having the img URLs served to humans
end with "?i", while robots like Googlebot just see the img URL in the
source without the "?i".

They said their solution doesn't scale too well, and I began wondering
if there was a way to use Varnish in the process. I've only just started
reading about Varnish and VCL, but from the articles I've read, it sounds
like it might be a great fit.

Let's say the backend already serves all the img src URLs ending in "?i",
as in img src="http://www.example.com/test.jpg?i".

Is there a way that Varnish could cache two versions of the page?

One, human visitors would get the cached page with the ?i.
Two, robot user agents would get a cached version where Varnish has
stripped all the ?i from the URLs.

Is that possible? Thanks for any pointers.
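
For what it's worth, here is the rough shape of what I was imagining, as
far as I can follow the VCL docs so far (Varnish 3 syntax, completely
untested; the bot regex and the X-UA-Class header name are just
placeholders I made up):

sub vcl_recv {
    # Placeholder check -- a real list of bot user agents would be longer.
    if (req.http.User-Agent ~ "(?i)googlebot|bingbot|yandex") {
        set req.http.X-UA-Class = "bot";
    } else {
        set req.http.X-UA-Class = "human";
    }
}

sub vcl_hash {
    # Same as the default hash, plus the bot/human classification, so
    # Varnish keeps a separate cached copy of the page for each class.
    hash_data(req.url);
    if (req.http.host) {
        hash_data(req.http.host);
    } else {
        hash_data(server.ip);
    }
    hash_data(req.http.X-UA-Class);
    return (hash);
}

That should at least keep two copies of the page in the cache, one per
class. What I can't work out yet is whether Varnish itself can strip the
?i out of the HTML it serves to bots, or whether the backend would have
to look at the X-UA-Class header (it's set on the request, so it gets
forwarded) and render the bot variant without the ?i itself.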


