pass uses higher cpu than hit
kioto mitsubisi
reader_1000 at hotmail.com
Sun May 17 11:59:18 CEST 2015
Hi,
I was trying to load test my Varnish setup with wrk[1]. I have Varnish and Apache http server (backend) on same machine (RedHat 6).
I am testing for two scenarios.
First, I am making requests to a page that Varnish caches, and Varnish's CPU usage averages around 17%:
wrk -t50 -c200 -d90s --latency http://host/cached/hit_page.html
Second, I am making requests to a page that Varnish passes to the backend. It is a static page, but I cannot purge it promptly when it changes, so I prefer that Varnish not cache it. For this scenario, however, Varnish's CPU usage averages around 78%:
wrk -t50 -c200 -d90s --latency http://host/nocache/pass_page.txt
This seemed a little strange to me, since I pass the request directly to the backend without any additional processing. Could you please help me find the reason? Where should I look: the operating system, the backend, Varnish parameters? I have put my VCL and varnishstat -1 output below. I also ran "perf record" system-wide and included its output, but I am not sure it is useful without the kernel debug packages.
Thanks in advance.
[1] https://github.com/wg/wrk
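(As an aside on methodology: a per-process CPU average can be derived from two /proc/<pid>/stat samples. A minimal Python sketch of that calculation; sampling this script's own pid here purely for illustration, not varnishd:)

```python
import os
import time

def cpu_ticks(pid: int) -> int:
    """Sum of utime + stime for a process, read from /proc/<pid>/stat."""
    with open(f"/proc/{pid}/stat") as f:
        # Split after the ")" that closes the comm field; utime and stime
        # are then the 12th and 13th remaining fields (0-indexed 11, 12).
        fields = f.read().rsplit(")", 1)[1].split()
    return int(fields[11]) + int(fields[12])

def avg_cpu_percent(t0: int, t1: int, seconds: float,
                    hz: int = os.sysconf("SC_CLK_TCK")) -> float:
    """Average %CPU over a window, given two tick samples and its length."""
    return 100.0 * (t1 - t0) / (hz * seconds)

# Example: sample this process over one second.
pid = os.getpid()
a = cpu_ticks(pid)
time.sleep(1.0)
b = cpu_ticks(pid)
print(f"{avg_cpu_percent(a, b, 1.0):.1f}%")
```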
sub vcl_recv {
    if (req.url !~ "^/cached") {
        return (pass);
    }

    unset req.http.cookie;

    if (req.http.host ~ "(?i)^(www.)?example.com") {
        set req.http.host = "example.com";
    }

    # Strip hash, server doesn't need it.
    if (req.url ~ "\#") {
        set req.url = regsub(req.url, "\#.*$", "");
    }

    # Remove query string.
    set req.url = regsub(req.url, "\?.*$", "");

    # Remove trailing slash.
    set req.url = regsub(req.url, "/$", "");

    if (req.method == "PURGE") {
        if (!client.ip ~ purgers) {
            return (synth(405, "Not allowed"));
        }
        return (purge);
    }
}
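(For reference, the three regsub rewrites in vcl_recv can be mirrored in a small Python sketch to sanity-check the intended normalization; the example URLs are made up:)

```python
import re

def normalize(url: str) -> str:
    """Mirror the regsub calls in vcl_recv: strip fragment, query, trailing slash."""
    url = re.sub(r"#.*$", "", url)   # strip hash/fragment
    url = re.sub(r"\?.*$", "", url)  # strip query string
    url = re.sub(r"/$", "", url)     # strip trailing slash
    return url

print(normalize("/cached/hit_page.html#top"))     # -> /cached/hit_page.html
print(normalize("/cached/dir/?utm_source=mail"))  # -> /cached/dir
```

(VCL's regsub replaces only the first match, but since each pattern is anchored to the end of the string the behavior is equivalent here.)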
varnishstat -1:
MAIN.uptime 47764 1.00 Child process uptime
MAIN.sess_conn 848 0.02 Sessions accepted
MAIN.sess_drop 0 0.00 Sessions dropped
MAIN.sess_fail 0 0.00 Session accept failures
MAIN.sess_pipe_overflow 0 0.00 Session pipe overflow
MAIN.client_req_400 0 0.00 Client requests received, subject to 400 errors
MAIN.client_req_411 0 0.00 Client requests received, subject to 411 errors
MAIN.client_req_413 0 0.00 Client requests received, subject to 413 errors
MAIN.client_req_417 0 0.00 Client requests received, subject to 417 errors
MAIN.client_req 427203 8.94 Good client requests received
MAIN.cache_hit 72990 1.53 Cache hits
MAIN.cache_hitpass 0 0.00 Cache hits for pass
MAIN.cache_miss 152 0.00 Cache misses
MAIN.backend_conn 3537 0.07 Backend conn. success
MAIN.backend_unhealthy 0 0.00 Backend conn. not attempted
MAIN.backend_busy 0 0.00 Backend conn. too many
MAIN.backend_fail 0 0.00 Backend conn. failures
MAIN.backend_reuse 350666 7.34 Backend conn. reuses
MAIN.backend_toolate 30 0.00 Backend conn. was closed
MAIN.backend_recycle 350705 7.34 Backend conn. recycles
MAIN.backend_retry 0 0.00 Backend conn. retry
MAIN.fetch_head 0 0.00 Fetch no body (HEAD)
MAIN.fetch_length 354193 7.42 Fetch with Length
MAIN.fetch_chunked 13 0.00 Fetch chunked
MAIN.fetch_eof 0 0.00 Fetch EOF
MAIN.fetch_bad 0 0.00 Fetch bad T-E
MAIN.fetch_close 0 0.00 Fetch wanted close
MAIN.fetch_oldhttp 0 0.00 Fetch pre HTTP/1.1 closed
MAIN.fetch_zero 0 0.00 Fetch zero len body
MAIN.fetch_1xx 0 0.00 Fetch no body (1xx)
MAIN.fetch_204 0 0.00 Fetch no body (204)
MAIN.fetch_304 0 0.00 Fetch no body (304)
MAIN.fetch_failed 5 0.00 Fetch failed (all causes)
MAIN.fetch_no_thread 5 0.00 Fetch failed (no thread)
MAIN.pools 2 . Number of thread pools
MAIN.threads 100 . Total number of threads
MAIN.threads_limited 2 0.00 Threads hit max
MAIN.threads_created 453 0.01 Threads created
MAIN.threads_destroyed 353 0.01 Threads destroyed
MAIN.threads_failed 0 0.00 Thread creation failed
MAIN.thread_queue_len 0 . Length of session queue
MAIN.busy_sleep 0 0.00 Number of requests sent to sleep on busy objhdr
MAIN.busy_wakeup 0 0.00 Number of requests woken after sleep on busy objhdr
MAIN.sess_queued 313 0.01 Sessions queued for thread
MAIN.sess_dropped 5 0.00 Sessions dropped for thread
MAIN.n_object 89 . object structs made
MAIN.n_vampireobject 0 . unresurrected objects
MAIN.n_objectcore 115 . objectcore structs made
MAIN.n_objecthead 115 . objecthead structs made
MAIN.n_waitinglist 26 . waitinglist structs made
MAIN.n_backend 1 . Number of backends
MAIN.n_expired 61 . Number of expired objects
MAIN.n_lru_nuked 0 . Number of LRU nuked objects
MAIN.n_lru_moved 120 . Number of LRU moved objects
MAIN.losthdr 0 0.00 HTTP header overflows
MAIN.s_sess 848 0.02 Total sessions seen
MAIN.s_req 427203 8.94 Total requests seen
MAIN.s_pipe 0 0.00 Total pipe sessions seen
MAIN.s_pass 354059 7.41 Total pass-ed requests seen
MAIN.s_fetch 354211 7.42 Total backend fetches initiated
MAIN.s_synth 7 0.00 Total synthethic responses made
MAIN.s_req_hdrbytes 23873180 499.82 Request header bytes
MAIN.s_req_bodybytes 0 0.00 Request body bytes
MAIN.s_resp_hdrbytes 109855624 2299.97 Response header bytes
MAIN.s_resp_bodybytes 6030354123 126253.12 Response body bytes
MAIN.s_pipe_hdrbytes 0 0.00 Pipe request header bytes
MAIN.s_pipe_in 0 0.00 Piped bytes from client
MAIN.s_pipe_out 0 0.00 Piped bytes to client
MAIN.sess_closed 0 0.00 Session Closed
MAIN.sess_pipeline 0 0.00 Session Pipeline
MAIN.sess_readahead 0 0.00 Session Read Ahead
MAIN.sess_herd 426561 8.93 Session herd
MAIN.shm_records 33794054 707.52 SHM records
MAIN.shm_writes 1598314 33.46 SHM writes
MAIN.shm_flushes 0 0.00 SHM flushes due to overflow
MAIN.shm_cont 3312 0.07 SHM MTX contention
MAIN.shm_cycles 12 0.00 SHM cycles through buffer
MAIN.sms_nreq 0 0.00 SMS allocator requests
MAIN.sms_nobj 0 . SMS outstanding allocations
MAIN.sms_nbytes 0 . SMS outstanding bytes
MAIN.sms_balloc 0 . SMS bytes allocated
MAIN.sms_bfree 0 . SMS bytes freed
MAIN.backend_req 354204 7.42 Backend requests made
MAIN.n_vcl 2 0.00 Number of loaded VCLs in total
MAIN.n_vcl_avail 2 0.00 Number of VCLs available
MAIN.n_vcl_discard 0 0.00 Number of discarded VCLs
MAIN.bans 1 . Count of bans
MAIN.bans_completed 1 . Number of bans marked 'completed'
MAIN.bans_obj 0 . Number of bans using obj.*
MAIN.bans_req 0 . Number of bans using req.*
MAIN.bans_added 1 0.00 Bans added
MAIN.bans_deleted 0 0.00 Bans deleted
MAIN.bans_tested 0 0.00 Bans tested against objects (lookup)
MAIN.bans_obj_killed 0 0.00 Objects killed by bans (lookup)
MAIN.bans_lurker_tested 0 0.00 Bans tested against objects (lurker)
MAIN.bans_tests_tested 0 0.00 Ban tests tested against objects (lookup)
MAIN.bans_lurker_tests_tested 0 0.00 Ban tests tested against objects (lurker)
MAIN.bans_lurker_obj_killed 0 0.00 Objects killed by bans (lurker)
MAIN.bans_dups 0 0.00 Bans superseded by other bans
MAIN.bans_lurker_contention 0 0.00 Lurker gave way for lookup
MAIN.bans_persisted_bytes 13 . Bytes used by the persisted ban lists
MAIN.bans_persisted_fragmentation 0 . Extra bytes in persisted ban lists due to fragmentation
MAIN.n_purges 2 . Number of purge operations executed
MAIN.n_obj_purged 2 . Number of purged objects
MAIN.exp_mailed 154 0.00 Number of objects mailed to expiry thread
MAIN.exp_received 154 0.00 Number of objects received by expiry thread
MAIN.hcb_nolock 73144 1.53 HCB Lookups without lock
MAIN.hcb_lock 152 0.00 HCB Lookups with lock
MAIN.hcb_insert 152 0.00 HCB Inserts
MAIN.esi_errors 0 0.00 ESI parse errors (unlock)
MAIN.esi_warnings 0 0.00 ESI parse warnings (unlock)
MAIN.vmods 0 . Loaded VMODs
MAIN.n_gzip 0 0.00 Gzip operations
MAIN.n_gunzip 0 0.00 Gunzip operations
MAIN.vsm_free 972544 . Free VSM space
MAIN.vsm_used 83962064 . Used VSM space
MAIN.vsm_cooling 0 . Cooling VSM space
MAIN.vsm_overflow 0 . Overflow VSM space
MAIN.vsm_overflowed 0 0.00 Overflowed VSM space
MGT.uptime 47764 1.00 Management process uptime
MGT.child_start 1 0.00 Child process started
MGT.child_exit 0 0.00 Child process normal exit
MGT.child_stop 0 0.00 Child process unexpected exit
MGT.child_died 0 0.00 Child process died (signal)
MGT.child_dump 0 0.00 Child process core dumped
MGT.child_panic 0 0.00 Child process panic
MEMPOOL.vbc.live 6 . In use
MEMPOOL.vbc.pool 10 . In Pool
MEMPOOL.vbc.sz_wanted 88 . Size requested
MEMPOOL.vbc.sz_needed 120 . Size allocated
MEMPOOL.vbc.allocs 3537 0.07 Allocations
MEMPOOL.vbc.frees 3531 0.07 Frees
MEMPOOL.vbc.recycle 3469 0.07 Recycled from pool
MEMPOOL.vbc.timeout 182 0.00 Timed out from pool
MEMPOOL.vbc.toosmall 0 0.00 Too small to recycle
MEMPOOL.vbc.surplus 0 0.00 Too many for pool
MEMPOOL.vbc.randry 68 0.00 Pool ran dry
MEMPOOL.busyobj.live 0 . In use
MEMPOOL.busyobj.pool 10 . In Pool
MEMPOOL.busyobj.sz_wanted 65536 . Size requested
MEMPOOL.busyobj.sz_needed 65568 . Size allocated
MEMPOOL.busyobj.allocs 354211 7.42 Allocations
MEMPOOL.busyobj.frees 354211 7.42 Frees
MEMPOOL.busyobj.recycle 354063 7.41 Recycled from pool
MEMPOOL.busyobj.timeout 212 0.00 Timed out from pool
MEMPOOL.busyobj.toosmall 0 0.00 Too small to recycle
MEMPOOL.busyobj.surplus 0 0.00 Too many for pool
MEMPOOL.busyobj.randry 148 0.00 Pool ran dry
MEMPOOL.req0.live 0 . In use
MEMPOOL.req0.pool 10 . In Pool
MEMPOOL.req0.sz_wanted 65536 . Size requested
MEMPOOL.req0.sz_needed 65568 . Size allocated
MEMPOOL.req0.allocs 249219 5.22 Allocations
MEMPOOL.req0.frees 249219 5.22 Frees
MEMPOOL.req0.recycle 248702 5.21 Recycled from pool
MEMPOOL.req0.timeout 559 0.01 Timed out from pool
MEMPOOL.req0.toosmall 0 0.00 Too small to recycle
MEMPOOL.req0.surplus 40 0.00 Too many for pool
MEMPOOL.req0.randry 517 0.01 Pool ran dry
MEMPOOL.sess0.live 0 . In use
MEMPOOL.sess0.pool 10 . In Pool
MEMPOOL.sess0.sz_wanted 384 . Size requested
MEMPOOL.sess0.sz_needed 416 . Size allocated
MEMPOOL.sess0.allocs 441 0.01 Allocations
MEMPOOL.sess0.frees 441 0.01 Frees
MEMPOOL.sess0.recycle 64 0.00 Recycled from pool
MEMPOOL.sess0.timeout 369 0.01 Timed out from pool
MEMPOOL.sess0.toosmall 0 0.00 Too small to recycle
MEMPOOL.sess0.surplus 67 0.00 Too many for pool
MEMPOOL.sess0.randry 377 0.01 Pool ran dry
MEMPOOL.req1.live 0 . In use
MEMPOOL.req1.pool 10 . In Pool
MEMPOOL.req1.sz_wanted 65536 . Size requested
MEMPOOL.req1.sz_needed 65568 . Size allocated
MEMPOOL.req1.allocs 178151 3.73 Allocations
MEMPOOL.req1.frees 178151 3.73 Frees
MEMPOOL.req1.recycle 177736 3.72 Recycled from pool
MEMPOOL.req1.timeout 481 0.01 Timed out from pool
MEMPOOL.req1.toosmall 0 0.00 Too small to recycle
MEMPOOL.req1.surplus 23 0.00 Too many for pool
MEMPOOL.req1.randry 415 0.01 Pool ran dry
MEMPOOL.sess1.live 0 . In use
MEMPOOL.sess1.pool 10 . In Pool
MEMPOOL.sess1.sz_wanted 384 . Size requested
MEMPOOL.sess1.sz_needed 416 . Size allocated
MEMPOOL.sess1.allocs 407 0.01 Allocations
MEMPOOL.sess1.frees 407 0.01 Frees
MEMPOOL.sess1.recycle 64 0.00 Recycled from pool
MEMPOOL.sess1.timeout 339 0.01 Timed out from pool
MEMPOOL.sess1.toosmall 0 0.00 Too small to recycle
MEMPOOL.sess1.surplus 64 0.00 Too many for pool
MEMPOOL.sess1.randry 343 0.01 Pool ran dry
SMA.s0.c_req 327 0.01 Allocator requests
SMA.s0.c_fail 0 0.00 Allocator failures
SMA.s0.c_bytes 5656978 118.44 Bytes allocated
SMA.s0.c_freed 2492444 52.18 Bytes freed
SMA.s0.g_alloc 194 . Allocations outstanding
SMA.s0.g_bytes 3164534 . Bytes outstanding
SMA.s0.g_space 1070577290 . Bytes available
SMA.Transient.c_req 708108 14.83 Allocator requests
SMA.Transient.c_fail 0 0.00 Allocator failures
SMA.Transient.c_bytes 2703443424 56600.02 Bytes allocated
SMA.Transient.c_freed 2703443424 56600.02 Bytes freed
SMA.Transient.g_alloc 0 . Allocations outstanding
SMA.Transient.g_bytes 0 . Bytes outstanding
SMA.Transient.g_space 0 . Bytes available
VBE.default(127.0.0.1,,8080).vcls 2 . VCL references
VBE.default(127.0.0.1,,8080).happy 0 . Happy health probes
VBE.default(127.0.0.1,,8080).bereq_hdrbytes 47588248 996.32 Request header bytes
VBE.default(127.0.0.1,,8080).bereq_bodybytes 0 0.00 Request body bytes
VBE.default(127.0.0.1,,8080).beresp_hdrbytes 72693732 1521.94 Response header bytes
VBE.default(127.0.0.1,,8080).beresp_bodybytes 2473817104 51792.50 Response body bytes
VBE.default(127.0.0.1,,8080).pipe_hdrbytes 0 0.00 Pipe request header bytes
VBE.default(127.0.0.1,,8080).pipe_out 0 0.00 Piped bytes to backend
VBE.default(127.0.0.1,,8080).pipe_in 0 0.00 Piped bytes from backend
LCK.sms.creat 0 0.00 Created locks
LCK.sms.destroy 0 0.00 Destroyed locks
LCK.sms.locks 0 0.00 Lock Operations
LCK.smp.creat 0 0.00 Created locks
LCK.smp.destroy 0 0.00 Destroyed locks
LCK.smp.locks 0 0.00 Lock Operations
LCK.sma.creat 2 0.00 Created locks
LCK.sma.destroy 0 0.00 Destroyed locks
LCK.sma.locks 1416676 29.66 Lock Operations
LCK.smf.creat 0 0.00 Created locks
LCK.smf.destroy 0 0.00 Destroyed locks
LCK.smf.locks 0 0.00 Lock Operations
LCK.hsl.creat 0 0.00 Created locks
LCK.hsl.destroy 0 0.00 Destroyed locks
LCK.hsl.locks 0 0.00 Lock Operations
LCK.hcb.creat 1 0.00 Created locks
LCK.hcb.destroy 0 0.00 Destroyed locks
LCK.hcb.locks 481 0.01 Lock Operations
LCK.hcl.creat 0 0.00 Created locks
LCK.hcl.destroy 0 0.00 Destroyed locks
LCK.hcl.locks 0 0.00 Lock Operations
LCK.vcl.creat 1 0.00 Created locks
LCK.vcl.destroy 0 0.00 Destroyed locks
LCK.vcl.locks 710015 14.87 Lock Operations
LCK.sessmem.creat 0 0.00 Created locks
LCK.sessmem.destroy 0 0.00 Destroyed locks
LCK.sessmem.locks 0 0.00 Lock Operations
LCK.sess.creat 848 0.02 Created locks
LCK.sess.destroy 848 0.02 Destroyed locks
LCK.sess.locks 0 0.00 Lock Operations
LCK.wstat.creat 1 0.00 Created locks
LCK.wstat.destroy 0 0.00 Destroyed locks
LCK.wstat.locks 791316 16.57 Lock Operations
LCK.herder.creat 0 0.00 Created locks
LCK.herder.destroy 0 0.00 Destroyed locks
LCK.herder.locks 0 0.00 Lock Operations
LCK.wq.creat 3 0.00 Created locks
LCK.wq.destroy 0 0.00 Destroyed locks
LCK.wq.locks 2365705 49.53 Lock Operations
LCK.objhdr.creat 353 0.01 Created locks
LCK.objhdr.destroy 237 0.00 Destroyed locks
LCK.objhdr.locks 4623042 96.79 Lock Operations
LCK.exp.creat 1 0.00 Created locks
LCK.exp.destroy 0 0.00 Destroyed locks
LCK.exp.locks 10003 0.21 Lock Operations
LCK.lru.creat 2 0.00 Created locks
LCK.lru.destroy 0 0.00 Destroyed locks
LCK.lru.locks 505 0.01 Lock Operations
LCK.cli.creat 1 0.00 Created locks
LCK.cli.destroy 0 0.00 Destroyed locks
LCK.cli.locks 15939 0.33 Lock Operations
LCK.ban.creat 1 0.00 Created locks
LCK.ban.destroy 0 0.00 Destroyed locks
LCK.ban.locks 157052 3.29 Lock Operations
LCK.vbp.creat 1 0.00 Created locks
LCK.vbp.destroy 0 0.00 Destroyed locks
LCK.vbp.locks 0 0.00 Lock Operations
LCK.backend.creat 1 0.00 Created locks
LCK.backend.destroy 0 0.00 Destroyed locks
LCK.backend.locks 712009 14.91 Lock Operations
LCK.vcapace.creat 1 0.00 Created locks
LCK.vcapace.destroy 0 0.00 Destroyed locks
LCK.vcapace.locks 0 0.00 Lock Operations
LCK.nbusyobj.creat 0 0.00 Created locks
LCK.nbusyobj.destroy 0 0.00 Destroyed locks
LCK.nbusyobj.locks 0 0.00 Lock Operations
LCK.busyobj.creat 354194 7.42 Created locks
LCK.busyobj.destroy 354208 7.42 Destroyed locks
LCK.busyobj.locks 2658757 55.66 Lock Operations
LCK.mempool.creat 6 0.00 Created locks
LCK.mempool.destroy 0 0.00 Destroyed locks
LCK.mempool.locks 1828426 38.28 Lock Operations
LCK.vxid.creat 1 0.00 Created locks
LCK.vxid.destroy 0 0.00 Destroyed locks
LCK.vxid.locks 417 0.01 Lock Operations
LCK.pipestat.creat 1 0.00 Created locks
LCK.pipestat.destroy 0 0.00 Destroyed locks
LCK.pipestat.locks 0 0.00 Lock Operations
perf report:
- 40.48% init [kernel.kallsyms] [k] native_safe_halt
native_safe_halt
- 20.92% swapper [kernel.kallsyms] [k] native_safe_halt
native_safe_halt
- 2.33% varnishd [kernel.kallsyms] [k] _spin_unlock_irqrestore
- _spin_unlock_irqrestore
+ 30.72% pthread_cond_broadcast@@GLIBC_2.3.2
+ 28.50% pthread_cond_signal@@GLIBC_2.3.2
+ 15.73% __libc_writev
+ 11.28% 0x351940e6fd
+ 3.27% pthread_cond_timedwait@@GLIBC_2.3.2
+ 3.09% __poll
+ 2.78% 0x351940e75d
+ 1.41% epoll_wait
+ 0.75% __lll_unlock_wake
- 1.06% varnishd libc-2.12.so [.] __memset_sse2
__memset_sse2
- 0.88% varnishd [kernel.kallsyms] [k] finish_task_switch
- finish_task_switch
+ 28.86% pthread_cond_wait@@GLIBC_2.3.2
+ 23.55% pthread_cond_timedwait@@GLIBC_2.3.2
+ 16.71% 0x351940e75d
+ 15.35% epoll_wait
7.22% pthread_cond_signal@@GLIBC_2.3.2
+ 7.22% __poll
+ 0.74% __lll_lock_wait
- 0.88% varnishd libc-2.12.so [.] vfprintf
vfprintf
- 0.81% varnishd libc-2.12.so [.] __printf_fp
__printf_fp
+ 0.66% varnishd varnishd [.] 0x00000000000200fd
+ 0.59% httpd [kernel.kallsyms] [k] _spin_unlock_irqrestore
+ 0.54% varnishd [kernel.kallsyms] [k] native_read_tsc
More information about the varnish-misc mailing list