<div dir="ltr">adding varnish-misc back<div><br></div><div>Try mounting it as tmpfs (with exec rights), this files will get a log of IO, so one usual suspects is that you are killing your filesystem with them<br clear="all"><div><div dir="ltr" class="gmail_signature" data-smartmail="gmail_signature"><div dir="ltr"><div>-- <br></div>Guillaume Quintard<br></div></div></div><br></div></div><br><div class="gmail_quote"><div dir="ltr">On Mon, Dec 10, 2018 at 4:48 PM Marc Fournier <<a href="mailto:marc.fournier@camptocamp.com">marc.fournier@camptocamp.com</a>> wrote:<br></div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex"><br>
Hello Guillaume,<br>
<br>
No, the binaries are on an overlay2 filesystem (docker container), but<br>
/var/lib/varnish itself is bind-mounted from a directory on the host's<br>
filesystem, which is ext4.<br>
<br>
This machine does run in a VM (Amazon EC2) though, but nothing in dmesg<br>
indicates any OS- or hardware-level problem.<br>
<br>
Thanks!<br>
<br>
Marc<br>
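<br>A rough sketch of what the tmpfs suggestion above could look like with this docker + host-bind setup; the mount point, size, and exact docker flags are assumptions, not taken from this thread:<br>
<pre>
# Host-level tmpfs mount. "exec" is the important option: varnishd
# compiles VCL into a shared object under its working directory and
# dlopen()s it, which a noexec filesystem would break.
mount -t tmpfs -o rw,exec,size=256m tmpfs /var/lib/varnish

# For the docker case, a container tmpfs mount could replace the host
# bind mount (size is an arbitrary placeholder):
docker run ... --tmpfs /var/lib/varnish:rw,exec,size=256m ...
</pre>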
<br>
Guillaume Quintard <<a href="mailto:guillaume@varnish-software.com" target="_blank">guillaume@varnish-software.com</a>> writes:<br>
<br>
> Is /var/lib/varnish mounted as tmpfs?<br>
><br>
> -- <br>
> Guillaume Quintard<br>
><br>
><br>
> On Fri, Nov 9, 2018 at 5:25 PM Marc Fournier <<a href="mailto:marc.fournier@camptocamp.com" target="_blank">marc.fournier@camptocamp.com</a>><br>
> wrote:<br>
><br>
>><br>
>> Hello,<br>
>><br>
>> I'm struggling to figure out a problem that occurs rarely, at random<br>
>> times, on different machines with the same setup (which rules out hardware<br>
>> problems). When it happens, Varnish stops answering requests for<br>
>> several minutes.<br>
>><br>
>> Running on an 8-CPU Amazon EC2 instance with the following parameters:<br>
>> varnishd -F -f /etc/varnish/config.vcl -I /etc/varnish/settings.cli -n varnishd -s malloc,8G -a 127.0.0.1:8080,PROXY<br>
>><br>
>> When the lockup occurs, the load average increases from 0 to about 20, and<br>
>> 2 out of 8 CPUs sit at 100% iowait (maybe because I use 2 thread pools?).<br>
>><br>
>> There is plenty of free memory, no disk I/O activity, and nothing in<br>
>> dmesg. Apart from varnish, there are haproxy, a couple of varnishlog<br>
>> processes, and other monitoring tools running on these machines.<br>
>><br>
>> The 2 non-default settings I set are:<br>
>> thread_pool_min 1000<br>
>> timeout_idle 45<br>
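<br>A sketch of how these two parameters might be applied, assuming they are set either as -p options on the command line or in the CLI file passed with -I (the thread does not say which):<br>
<pre>
# On the varnishd command line:
varnishd ... -p thread_pool_min=1000 -p timeout_idle=45 ...

# Or in the CLI script given to -I (e.g. /etc/varnish/settings.cli):
param.set thread_pool_min 1000
param.set timeout_idle 45
</pre>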
>><br>
>> One specificity: the backends set quite short TTLs on the objects they<br>
>> produce (less than 10s), but grace time is set to 300s, so I was<br>
>> wondering if there might be a side-effect with the Transient store.<br>
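<br>One way to check whether Transient is involved is to watch its counters; the Transient store can also be given an explicit size instead of the default unbounded malloc. A sketch, not part of the setup described here:<br>
<pre>
# Current Transient usage (the same SMA.Transient.* counters appear in
# the varnishstat dump below):
varnishstat -1 -f 'SMA.Transient.*'

# Optionally cap Transient by declaring a storage named "Transient":
varnishd ... -s malloc,8G -s Transient=malloc,512M ...
</pre>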
>><br>
>> Any hint or clue would be very welcome!<br>
>><br>
>> Below are the log messages and the output of varnishstat -1:<br>
>><br>
>> 2018-10-24T02:40:27.455363000Z Error: Child (14) not responding to CLI,<br>
>> killed it.<br>
>> 2018-10-24T02:40:27.455899000Z Error: Unexpected reply from ping: 400 CLI<br>
>> communication error (hdr)<br>
>> 2018-10-24T02:41:27.512111000Z Error: Child (14) not responding to CLI,<br>
>> killed it.<br>
>> 2018-10-24T02:41:27.512321000Z Error: Unexpected reply from ping: 400 CLI<br>
>> communication error (hdr)<br>
>> 2018-10-24T02:42:27.571685000Z Error: Child (14) not responding to CLI,<br>
>> killed it.<br>
>> 2018-10-24T02:42:27.571883000Z Error: Unexpected reply from ping: 400 CLI<br>
>> communication error (hdr)<br>
>> 2018-10-24T02:43:27.598954000Z Error: Child (14) not responding to CLI,<br>
>> killed it.<br>
>> 2018-10-24T02:43:27.599156000Z Error: Unexpected reply from ping: 400 CLI<br>
>> communication error (hdr)<br>
>> 2018-10-24T02:44:27.645895000Z Error: Child (14) not responding to CLI,<br>
>> killed it.<br>
>> 2018-10-24T02:44:27.646108000Z Error: Unexpected reply from ping: 400 CLI<br>
>> communication error (hdr)<br>
>> 2018-10-24T02:45:03.252791000Z Error: Child (14) not responding to CLI,<br>
>> killed it.<br>
>> 2018-10-24T02:45:03.253069000Z Error: Unexpected reply from ping: 400 CLI<br>
>> communication error (hdr)<br>
>> 2018-10-24T02:45:03.253274000Z Error: Child (14) not responding to CLI,<br>
>> killed it.<br>
>> 2018-10-24T02:45:03.253465000Z Error: Unexpected reply from ping: 400 CLI<br>
>> communication error<br>
>> 2018-10-24T02:45:03.253769000Z Error: Child (14) died signal=3 (core<br>
>> dumped)<br>
>> 2018-10-24T02:45:03.254007000Z Debug: Child cleanup complete<br>
>> 2018-10-24T02:45:03.255242000Z Debug: Child (2202) Started<br>
>> 2018-10-24T02:45:03.339992000Z Info: Child (2202) said Child starts<br>
>><br>
>> 2018-11-06T20:03:29.675883000Z Error: Child (2202) not responding to CLI,<br>
>> killed it.<br>
>> 2018-11-06T20:03:29.676394000Z Error: Unexpected reply from ping: 400 CLI<br>
>> communication error (hdr)<br>
>> 2018-11-06T20:04:29.701217000Z Error: Child (2202) not responding to CLI,<br>
>> killed it.<br>
>> 2018-11-06T20:04:29.701422000Z Error: Unexpected reply from ping: 400 CLI<br>
>> communication error (hdr)<br>
>> 2018-11-06T20:05:29.758068000Z Error: Child (2202) not responding to CLI,<br>
>> killed it.<br>
>> 2018-11-06T20:05:29.758269000Z Error: Unexpected reply from ping: 400 CLI<br>
>> communication error (hdr)<br>
>> 2018-11-06T20:06:29.781659000Z Error: Child (2202) not responding to CLI,<br>
>> killed it.<br>
>> 2018-11-06T20:06:29.781867000Z Error: Unexpected reply from ping: 400 CLI<br>
>> communication error (hdr)<br>
>> 2018-11-06T20:07:29.804125000Z Error: Child (2202) not responding to CLI,<br>
>> killed it.<br>
>> 2018-11-06T20:07:29.804336000Z Error: Unexpected reply from ping: 400 CLI<br>
>> communication error (hdr)<br>
>> 2018-11-06T20:08:29.864111000Z Error: Child (2202) not responding to CLI,<br>
>> killed it.<br>
>> 2018-11-06T20:08:29.864318000Z Error: Unexpected reply from ping: 400 CLI<br>
>> communication error (hdr)<br>
>> 2018-11-06T20:09:29.873115000Z Error: Child (2202) not responding to CLI,<br>
>> killed it.<br>
>> 2018-11-06T20:09:29.873321000Z Error: Unexpected reply from ping: 400 CLI<br>
>> communication error (hdr)<br>
>> 2018-11-06T20:10:29.932102000Z Error: Child (2202) not responding to CLI,<br>
>> killed it.<br>
>> 2018-11-06T20:10:29.932307000Z Error: Unexpected reply from ping: 400 CLI<br>
>> communication error (hdr)<br>
>> 2018-11-06T20:11:29.992180000Z Error: Child (2202) not responding to CLI,<br>
>> killed it.<br>
>> 2018-11-06T20:11:29.992387000Z Error: Unexpected reply from ping: 400 CLI<br>
>> communication error (hdr)<br>
>> 2018-11-06T20:12:30.052123000Z Error: Child (2202) not responding to CLI,<br>
>> killed it.<br>
>> 2018-11-06T20:12:30.052328000Z Error: Unexpected reply from ping: 400 CLI<br>
>> communication error (hdr)<br>
>> 2018-11-06T20:13:30.112074000Z Error: Child (2202) not responding to CLI,<br>
>> killed it.<br>
>> 2018-11-06T20:13:30.112275000Z Error: Unexpected reply from ping: 400 CLI<br>
>> communication error (hdr)<br>
>> 2018-11-06T20:14:30.172123000Z Error: Child (2202) not responding to CLI,<br>
>> killed it.<br>
>> 2018-11-06T20:14:30.172326000Z Error: Unexpected reply from ping: 400 CLI<br>
>> communication error (hdr)<br>
>> 2018-11-06T20:15:03.116961000Z Error: Child (2202) not responding to CLI,<br>
>> killed it.<br>
>> 2018-11-06T20:15:03.117291000Z Error: Unexpected reply from ping: 400 CLI<br>
>> communication error (hdr)<br>
>> 2018-11-06T20:15:03.117517000Z Error: Child (2202) not responding to CLI,<br>
>> killed it.<br>
>> 2018-11-06T20:15:03.117747000Z Error: Unexpected reply from ping: 400 CLI<br>
>> communication error<br>
>> 2018-11-06T20:15:03.119003000Z Error: Child (2202) died signal=3 (core<br>
>> dumped)<br>
>> 2018-11-06T20:15:03.119377000Z Debug: Child cleanup complete<br>
>> 2018-11-06T20:15:03.120560000Z Debug: Child (4327) Started<br>
>> 2018-11-06T20:15:03.207214000Z Info: Child (4327) said Child starts<br>
>><br>
>> MAIN.uptime 244472 1.00 Child process uptime<br>
>> MAIN.sess_conn 10813893 44.23 Sessions accepted<br>
>> MAIN.sess_drop 0 0.00 Sessions dropped<br>
>> MAIN.sess_fail 0 0.00 Session accept failures<br>
>> MAIN.client_req_400 5 0.00 Client requests received,<br>
>> subject to 400 errors<br>
>> MAIN.client_req_417 0 0.00 Client requests received,<br>
>> subject to 417 errors<br>
>> MAIN.client_req 11329721 46.34 Good client requests received<br>
>> MAIN.cache_hit 9785347 40.03 Cache hits<br>
>> MAIN.cache_hitpass 0 0.00 Cache hits for pass.<br>
>> MAIN.cache_hitmiss 87463 0.36 Cache hits for miss.<br>
>> MAIN.cache_miss 439975 1.80 Cache misses<br>
>> MAIN.backend_conn 1233017 5.04 Backend conn. success<br>
>> MAIN.backend_unhealthy 0 0.00 Backend conn. not<br>
>> attempted<br>
>> MAIN.backend_busy 0 0.00 Backend conn. too many<br>
>> MAIN.backend_fail 0 0.00 Backend conn. failures<br>
>> MAIN.backend_reuse 1641618 6.71 Backend conn. reuses<br>
>> MAIN.backend_recycle 1647099 6.74 Backend conn. recycles<br>
>> MAIN.backend_retry 4 0.00 Backend conn. retry<br>
>> MAIN.fetch_head 6 0.00 Fetch no body (HEAD)<br>
>> MAIN.fetch_length 264385 1.08 Fetch with Length<br>
>> MAIN.fetch_chunked 15594 0.06 Fetch chunked<br>
>> MAIN.fetch_eof 1226485 5.02 Fetch EOF<br>
>> MAIN.fetch_bad 0 0.00 Fetch bad T-E<br>
>> MAIN.fetch_none 1 0.00 Fetch no body<br>
>> MAIN.fetch_1xx 0 0.00 Fetch no body (1xx)<br>
>> MAIN.fetch_204 8 0.00 Fetch no body (204)<br>
>> MAIN.fetch_304 1367111 5.59 Fetch no body (304)<br>
>> MAIN.fetch_failed 2 0.00 Fetch failed (all causes)<br>
>> MAIN.fetch_no_thread 0 0.00 Fetch failed (no thread)<br>
>> MAIN.pools 2 . Number of thread pools<br>
>> MAIN.threads 2000 . Total number of threads<br>
>> MAIN.threads_limited 0 0.00 Threads hit max<br>
>> MAIN.threads_created 2012 0.01 Threads created<br>
>> MAIN.threads_destroyed 12 0.00 Threads destroyed<br>
>> MAIN.threads_failed 0 0.00 Thread creation failed<br>
>> MAIN.thread_queue_len 0 . Length of session queue<br>
>> MAIN.busy_sleep 3558 0.01 Number of requests sent<br>
>> to sleep on busy objhdr<br>
>> MAIN.busy_wakeup 3558 0.01 Number of requests woken<br>
>> after sleep on busy objhdr<br>
>> MAIN.busy_killed 0 0.00 Number of requests killed<br>
>> after sleep on busy objhdr<br>
>> MAIN.sess_queued 1288 0.01 Sessions queued for thread<br>
>> MAIN.sess_dropped 0 0.00 Sessions dropped for<br>
>> thread<br>
>> MAIN.n_object 1645 . object structs made<br>
>> MAIN.n_vampireobject 0 . unresurrected objects<br>
>> MAIN.n_objectcore 3407 . objectcore structs made<br>
>> MAIN.n_objecthead 3649 . objecthead structs made<br>
>> MAIN.n_backend 2 . Number of backends<br>
>> MAIN.n_expired 438348 . Number of expired objects<br>
>> MAIN.n_lru_nuked 0 . Number of LRU nuked<br>
>> objects<br>
>> MAIN.n_lru_moved 5255908 . Number of LRU moved<br>
>> objects<br>
>> MAIN.losthdr 0 0.00 HTTP header overflows<br>
>> MAIN.s_sess 10813893 44.23 Total sessions seen<br>
>> MAIN.s_req 11329721 46.34 Total requests seen<br>
>> MAIN.s_pipe 1724 0.01 Total pipe sessions seen<br>
>> MAIN.s_pass 1017203 4.16 Total pass-ed requests<br>
>> seen<br>
>> MAIN.s_fetch 1457178 5.96 Total backend fetches<br>
>> initiated<br>
>> MAIN.s_synth 85381 0.35 Total synthethic<br>
>> responses made<br>
>> MAIN.s_req_hdrbytes 6238979422 25520.22 Request header bytes<br>
>> MAIN.s_req_bodybytes 197010427 805.86 Request body bytes<br>
>> MAIN.s_resp_hdrbytes 6906180092 28249.37 Response header bytes<br>
>> MAIN.s_resp_bodybytes 1332774227295 5451643.65 Response body bytes<br>
>> MAIN.s_pipe_hdrbytes 2686309 10.99 Pipe request header bytes<br>
>> MAIN.s_pipe_in 1409866 5.77 Piped bytes from client<br>
>> MAIN.s_pipe_out 17882349 73.15 Piped bytes to client<br>
>> MAIN.sess_closed 33230 0.14 Session Closed<br>
>> MAIN.sess_closed_err 5187 0.02 Session Closed with error<br>
>> MAIN.sess_readahead 0 0.00 Session Read Ahead<br>
>> MAIN.sess_herd 4943201 20.22 Session herd<br>
>> MAIN.sc_rem_close 10784144 44.11 Session OK REM_CLOSE<br>
>> MAIN.sc_req_close 21829 0.09 Session OK REQ_CLOSE<br>
>> MAIN.sc_req_http10 932 0.00 Session Err REQ_HTTP10<br>
>> MAIN.sc_rx_bad 0 0.00 Session Err RX_BAD<br>
>> MAIN.sc_rx_body 0 0.00 Session Err RX_BODY<br>
>> MAIN.sc_rx_junk 5 0.00 Session Err RX_JUNK<br>
>> MAIN.sc_rx_overflow 0 0.00 Session Err RX_OVERFLOW<br>
>> MAIN.sc_rx_timeout 4250 0.02 Session Err RX_TIMEOUT<br>
>> MAIN.sc_tx_pipe 1724 0.01 Session OK TX_PIPE<br>
>> MAIN.sc_tx_error 0 0.00 Session Err TX_ERROR<br>
>> MAIN.sc_tx_eof 543 0.00 Session OK TX_EOF<br>
>> MAIN.sc_resp_close 0 0.00 Session OK RESP_CLOSE<br>
>> MAIN.sc_overload 0 0.00 Session Err OVERLOAD<br>
>> MAIN.sc_pipe_overflow 0 0.00 Session Err PIPE_OVERFLOW<br>
>> MAIN.sc_range_short 0 0.00 Session Err RANGE_SHORT<br>
>> MAIN.sc_req_http20 0 0.00 Session Err REQ_HTTP20<br>
>> MAIN.sc_vcl_failure 0 0.00 Session Err VCL_FAILURE<br>
>> MAIN.shm_records 921799607 3770.57 SHM records<br>
>> MAIN.shm_writes 82947584 339.29 SHM writes<br>
>> MAIN.shm_flushes 1821 0.01 SHM flushes due to<br>
>> overflow<br>
>> MAIN.shm_cont 961287 3.93 SHM MTX contention<br>
>> MAIN.shm_cycles 436 0.00 SHM cycles through buffer<br>
>> MAIN.backend_req 2862111 11.71 Backend requests made<br>
>> MAIN.n_vcl 2 0.00 Number of loaded VCLs in<br>
>> total<br>
>> MAIN.n_vcl_avail 2 0.00 Number of VCLs available<br>
>> MAIN.n_vcl_discard 0 0.00 Number of discarded VCLs<br>
>> MAIN.vcl_fail 0 0.00 VCL failures<br>
>> MAIN.bans 1 . Count of bans<br>
>> MAIN.bans_completed 1 . Number of bans marked<br>
>> 'completed'<br>
>> MAIN.bans_obj 0 . Number of bans using obj.*<br>
>> MAIN.bans_req 0 . Number of bans using req.*<br>
>> MAIN.bans_added 1 0.00 Bans added<br>
>> MAIN.bans_deleted 0 0.00 Bans deleted<br>
>> MAIN.bans_tested 0 0.00 Bans tested against<br>
>> objects (lookup)<br>
>> MAIN.bans_obj_killed 0 0.00 Objects killed by bans<br>
>> (lookup)<br>
>> MAIN.bans_lurker_tested 0 0.00 Bans tested against<br>
>> objects (lurker)<br>
>> MAIN.bans_tests_tested 0 0.00 Ban tests tested against<br>
>> objects (lookup)<br>
>> MAIN.bans_lurker_tests_tested 0 0.00 Ban tests tested<br>
>> against objects (lurker)<br>
>> MAIN.bans_lurker_obj_killed 0 0.00 Objects killed by<br>
>> bans (lurker)<br>
>> MAIN.bans_lurker_obj_killed_cutoff 0 0.00 Objects<br>
>> killed by bans for cutoff (lurker)<br>
>> MAIN.bans_dups 0 0.00 Bans<br>
>> superseded by other bans<br>
>> MAIN.bans_lurker_contention 0 0.00 Lurker gave<br>
>> way for lookup<br>
>> MAIN.bans_persisted_bytes 16 . Bytes used by<br>
>> the persisted ban lists<br>
>> MAIN.bans_persisted_fragmentation 0 . Extra bytes<br>
>> in persisted ban lists due to fragmentation<br>
>> MAIN.n_purges 0 . Number of<br>
>> purge operations executed<br>
>> MAIN.n_obj_purged 0 . Number of<br>
>> purged objects<br>
>> MAIN.exp_mailed 3272785 13.39 Number of<br>
>> objects mailed to expiry thread<br>
>> MAIN.exp_received 3272785 13.39 Number of<br>
>> objects received by expiry thread<br>
>> MAIN.hcb_nolock 10225413 41.83 HCB Lookups<br>
>> without lock<br>
>> MAIN.hcb_lock 348309 1.42 HCB Lookups<br>
>> with lock<br>
>> MAIN.hcb_insert 348094 1.42 HCB Inserts<br>
>> MAIN.esi_errors 0 0.00 ESI parse<br>
>> errors (unlock)<br>
>> MAIN.esi_warnings 0 0.00 ESI parse<br>
>> warnings (unlock)<br>
>> MAIN.vmods 1 . Loaded VMODs<br>
>> MAIN.n_gzip 1262083 5.16 Gzip<br>
>> operations<br>
>> MAIN.n_gunzip 41886 0.17 Gunzip<br>
>> operations<br>
>> MAIN.n_test_gunzip 0 0.00 Test gunzip<br>
>> operations<br>
>> MAIN.vsm_free 973840 . Free VSM space<br>
>> MAIN.vsm_used 83960768 . Used VSM space<br>
>> MAIN.vsm_cooling 0 . Cooling VSM<br>
>> space<br>
>> MAIN.vsm_overflow 0 . Overflow VSM<br>
>> space<br>
>> MAIN.vsm_overflowed 0 0.00 Overflowed<br>
>> VSM space<br>
>> MGT.uptime 3145114 12.86 Management<br>
>> process uptime<br>
>> MGT.child_start 3 0.00 Child process<br>
>> started<br>
>> MGT.child_exit 0 0.00 Child process<br>
>> normal exit<br>
>> MGT.child_stop 0 0.00 Child process<br>
>> unexpected exit<br>
>> MGT.child_died 2 0.00 Child process<br>
>> died (signal)<br>
>> MGT.child_dump 2 0.00 Child process<br>
>> core dumped<br>
>> MGT.child_panic 0 0.00 Child process<br>
>> panic<br>
>> MEMPOOL.busyobj.live 8 . In use<br>
>> MEMPOOL.busyobj.pool 11 . In Pool<br>
>> MEMPOOL.busyobj.sz_wanted 65536 . Size requested<br>
>> MEMPOOL.busyobj.sz_actual 65504 . Size allocated<br>
>> MEMPOOL.busyobj.allocs 2875324 11.76 Allocations<br>
>> MEMPOOL.busyobj.frees 2875316 11.76 Frees<br>
>> MEMPOOL.busyobj.recycle 2862130 11.71 Recycled from<br>
>> pool<br>
>> MEMPOOL.busyobj.timeout 108162 0.44 Timed out<br>
>> from pool<br>
>> MEMPOOL.busyobj.toosmall 0 0.00 Too small to<br>
>> recycle<br>
>> MEMPOOL.busyobj.surplus 0 0.00 Too many for<br>
>> pool<br>
>> MEMPOOL.busyobj.randry 13194 0.05 Pool ran dry<br>
>> MEMPOOL.req0.live 3 . In use<br>
>> MEMPOOL.req0.pool 14 . In Pool<br>
>> MEMPOOL.req0.sz_wanted 65536 . Size requested<br>
>> MEMPOOL.req0.sz_actual 65504 . Size allocated<br>
>> MEMPOOL.req0.allocs 7871304 32.20 Allocations<br>
>> MEMPOOL.req0.frees 7871301 32.20 Frees<br>
>> MEMPOOL.req0.recycle 7850975 32.11 Recycled from<br>
>> pool<br>
>> MEMPOOL.req0.timeout 82821 0.34 Timed out<br>
>> from pool<br>
>> MEMPOOL.req0.toosmall 0 0.00 Too small to<br>
>> recycle<br>
>> MEMPOOL.req0.surplus 2158 0.01 Too many for<br>
>> pool<br>
>> MEMPOOL.req0.randry 20329 0.08 Pool ran dry<br>
>> MEMPOOL.sess0.live 14 . In use<br>
>> MEMPOOL.sess0.pool 13 . In Pool<br>
>> MEMPOOL.sess0.sz_wanted 512 . Size requested<br>
>> MEMPOOL.sess0.sz_actual 480 . Size allocated<br>
>> MEMPOOL.sess0.allocs 5407625 22.12 Allocations<br>
>> MEMPOOL.sess0.frees 5407611 22.12 Frees<br>
>> MEMPOOL.sess0.recycle 5342462 21.85 Recycled from<br>
>> pool<br>
>> MEMPOOL.sess0.timeout 184908 0.76 Timed out<br>
>> from pool<br>
>> MEMPOOL.sess0.toosmall 0 0.00 Too small to<br>
>> recycle<br>
>> MEMPOOL.sess0.surplus 30850 0.13 Too many for<br>
>> pool<br>
>> MEMPOOL.sess0.randry 65163 0.27 Pool ran dry<br>
>> MEMPOOL.req1.live 4 . In use<br>
>> MEMPOOL.req1.pool 18 . In Pool<br>
>> MEMPOOL.req1.sz_wanted 65536 . Size requested<br>
>> MEMPOOL.req1.sz_actual 65504 . Size allocated<br>
>> MEMPOOL.req1.allocs 7870901 32.20 Allocations<br>
>> MEMPOOL.req1.frees 7870897 32.20 Frees<br>
>> MEMPOOL.req1.recycle 7850560 32.11 Recycled from<br>
>> pool<br>
>> MEMPOOL.req1.timeout 82989 0.34 Timed out<br>
>> from pool<br>
>> MEMPOOL.req1.toosmall 0 0.00 Too small to<br>
>> recycle<br>
>> MEMPOOL.req1.surplus 1794 0.01 Too many for<br>
>> pool<br>
>> MEMPOOL.req1.randry 20341 0.08 Pool ran dry<br>
>> MEMPOOL.sess1.live 17 . In use<br>
>> MEMPOOL.sess1.pool 18 . In Pool<br>
>> MEMPOOL.sess1.sz_wanted 512 . Size requested<br>
>> MEMPOOL.sess1.sz_actual 480 . Size allocated<br>
>> MEMPOOL.sess1.allocs 5406276 22.11 Allocations<br>
>> MEMPOOL.sess1.frees 5406259 22.11 Frees<br>
>> MEMPOOL.sess1.recycle 5340931 21.85 Recycled from<br>
>> pool<br>
>> MEMPOOL.sess1.timeout 184777 0.76 Timed out<br>
>> from pool<br>
>> MEMPOOL.sess1.toosmall 0 0.00 Too small to<br>
>> recycle<br>
>> MEMPOOL.sess1.surplus 30947 0.13 Too many for<br>
>> pool<br>
>> MEMPOOL.sess1.randry 65345 0.27 Pool ran dry<br>
>> SMA.s0.c_req 4309927 17.63 Allocator<br>
>> requests<br>
>> SMA.s0.c_fail 0 0.00 Allocator<br>
>> failures<br>
>> SMA.s0.c_bytes 127888920949 523122.98 Bytes<br>
>> allocated<br>
>> SMA.s0.c_freed 127714379518 522409.03 Bytes freed<br>
>> SMA.s0.g_alloc 3072 . Allocations<br>
>> outstanding<br>
>> SMA.s0.g_bytes 174541431 . Bytes<br>
>> outstanding<br>
>> SMA.s0.g_space 8415393161 . Bytes<br>
>> available<br>
>> SMA.Transient.c_req 3511659 14.36 Allocator<br>
>> requests<br>
>> SMA.Transient.c_fail 0 0.00 Allocator<br>
>> failures<br>
>> SMA.Transient.c_bytes 17678767500 72314.08 Bytes<br>
>> allocated<br>
>> SMA.Transient.c_freed 17678606604 72313.42 Bytes freed<br>
>> SMA.Transient.g_alloc 196 . Allocations<br>
>> outstanding<br>
>> SMA.Transient.g_bytes 160896 . Bytes<br>
>> outstanding<br>
>> SMA.Transient.g_space 0 . Bytes<br>
>> available<br>
>> VBE.conf-2018-11-06_142400.default.happy 0 . Happy<br>
>> health probes<br>
>> VBE.conf-2018-11-06_142400.default.bereq_hdrbytes 1601813566<br>
>> 6552.14 Request header bytes<br>
>> VBE.conf-2018-11-06_142400.default.bereq_bodybytes 197010427<br>
>> 805.86 Request body bytes<br>
>> VBE.conf-2018-11-06_142400.default.beresp_hdrbytes 1481249422<br>
>> 6058.97 Response header bytes<br>
>> VBE.conf-2018-11-06_142400.default.beresp_bodybytes 47877555284<br>
>> 195840.65 Response body bytes<br>
>> VBE.conf-2018-11-06_142400.default.pipe_hdrbytes 2718879<br>
>> 11.12 Pipe request header bytes<br>
>> VBE.conf-2018-11-06_142400.default.pipe_out 1409866<br>
>> 5.77 Piped bytes to backend<br>
>> VBE.conf-2018-11-06_142400.default.pipe_in 17882349<br>
>> 73.15 Piped bytes from backend<br>
>> VBE.conf-2018-11-06_142400.default.conn 8<br>
>> . Concurrent connections to backend<br>
>> VBE.conf-2018-11-06_142400.default.req 2875327<br>
>> 11.76 Backend requests sent<br>
>> LCK.backend.creat 3<br>
>> 0.00 Created locks<br>
>> LCK.backend.destroy 0<br>
>> 0.00 Destroyed locks<br>
>> LCK.backend.locks 5750647<br>
>> 23.52 Lock Operations<br>
>> LCK.backend_tcp.creat 1<br>
>> 0.00 Created locks<br>
>> LCK.backend_tcp.destroy 0<br>
>> 0.00 Destroyed locks<br>
>> LCK.backend_tcp.locks 9038321<br>
>> 36.97 Lock Operations<br>
>> LCK.ban.creat 1<br>
>> 0.00 Created locks<br>
>> LCK.ban.destroy 0<br>
>> 0.00 Destroyed locks<br>
>> LCK.ban.locks 5084447<br>
>> 20.80 Lock Operations<br>
>> LCK.busyobj.creat 3213172<br>
>> 13.14 Created locks<br>
>> LCK.busyobj.destroy 3214904<br>
>> 13.15 Destroyed locks<br>
>> LCK.busyobj.locks 30427598<br>
>> 124.46 Lock Operations<br>
>> LCK.cli.creat 1<br>
>> 0.00 Created locks<br>
>> LCK.cli.destroy 0<br>
>> 0.00 Destroyed locks<br>
>> LCK.cli.locks 81496<br>
>> 0.33 Lock Operations<br>
>> LCK.exp.creat 1<br>
>> 0.00 Created locks<br>
>> LCK.exp.destroy 0<br>
>> 0.00 Destroyed locks<br>
>> LCK.exp.locks 13382080<br>
>> 54.74 Lock Operations<br>
>> LCK.hcb.creat 1<br>
>> 0.00 Created locks<br>
>> LCK.hcb.destroy 0<br>
>> 0.00 Destroyed locks<br>
>> LCK.hcb.locks 696280<br>
>> 2.85 Lock Operations<br>
>> LCK.lru.creat 2<br>
>> 0.00 Created locks<br>
>> LCK.lru.destroy 0<br>
>> 0.00 Destroyed locks<br>
>> LCK.lru.locks 8967041<br>
>> 36.68 Lock Operations<br>
>> LCK.mempool.creat 5<br>
>> 0.00 Created locks<br>
>> LCK.mempool.destroy 0<br>
>> 0.00 Destroyed locks<br>
>> LCK.mempool.locks 60929423<br>
>> 249.23 Lock Operations<br>
>> LCK.objhdr.creat 349642<br>
>> 1.43 Created locks<br>
>> LCK.objhdr.destroy 346251<br>
>> 1.42 Destroyed locks<br>
>> LCK.objhdr.locks 75491395<br>
>> 308.79 Lock Operations<br>
>> LCK.pipestat.creat 1<br>
>> 0.00 Created locks<br>
>> LCK.pipestat.destroy 0<br>
>> 0.00 Destroyed locks<br>
>> LCK.pipestat.locks 1724<br>
>> 0.01 Lock Operations<br>
>> LCK.sess.creat 10813321<br>
>> 44.23 Created locks<br>
>> LCK.sess.destroy 10813836<br>
>> 44.23 Destroyed locks<br>
>> LCK.sess.locks 16814084<br>
>> 68.78 Lock Operations<br>
>> LCK.vbe.creat 1<br>
>> 0.00 Created locks<br>
>> LCK.vbe.destroy 0<br>
>> 0.00 Destroyed locks<br>
>> LCK.vbe.locks 81489<br>
>> 0.33 Lock Operations<br>
>> LCK.vcapace.creat 1<br>
>> 0.00 Created locks<br>
>> LCK.vcapace.destroy 0<br>
>> 0.00 Destroyed locks<br>
>> LCK.vcapace.locks 0<br>
>> 0.00 Lock Operations<br>
>> LCK.vcl.creat 1<br>
>> 0.00 Created locks<br>
>> LCK.vcl.destroy 0<br>
>> 0.00 Destroyed locks<br>
>> LCK.vcl.locks 5815522<br>
>> 23.79 Lock Operations<br>
>> LCK.vxid.creat 1<br>
>> 0.00 Created locks<br>
>> LCK.vxid.destroy 0<br>
>> 0.00 Destroyed locks<br>
>> LCK.vxid.locks 2346<br>
>> 0.01 Lock Operations<br>
>> LCK.waiter.creat 2<br>
>> 0.00 Created locks<br>
>> LCK.waiter.destroy 0<br>
>> 0.00 Destroyed locks<br>
>> LCK.waiter.locks 19733708<br>
>> 80.72 Lock Operations<br>
>> LCK.wq.creat 3<br>
>> 0.00 Created locks<br>
>> LCK.wq.destroy 0<br>
>> 0.00 Destroyed locks<br>
>> LCK.wq.locks 56100309<br>
>> 229.48 Lock Operations<br>
>> LCK.wstat.creat 1<br>
>> 0.00 Created locks<br>
>> LCK.wstat.destroy 0<br>
>> 0.00 Destroyed locks<br>
>> LCK.wstat.locks 21802570<br>
>> 89.18 Lock Operations<br>
>> LCK.sma.creat 2<br>
>> 0.00 Created locks<br>
>> LCK.sma.destroy 0<br>
>> 0.00 Destroyed locks<br>
>> LCK.sma.locks 15639885<br>
>> 63.97 Lock Operations<br>
>><br>
>> Thanks!<br>
>><br>
>> Marc Fournier<br>
>> _______________________________________________<br>
>> varnish-misc mailing list<br>
>> <a href="mailto:varnish-misc@varnish-cache.org" target="_blank">varnish-misc@varnish-cache.org</a><br>
>> <a href="https://www.varnish-cache.org/lists/mailman/listinfo/varnish-misc" rel="noreferrer" target="_blank">https://www.varnish-cache.org/lists/mailman/listinfo/varnish-misc</a><br>
>><br>
> _______________________________________________<br>
> varnish-misc mailing list<br>
> <a href="mailto:varnish-misc@varnish-cache.org" target="_blank">varnish-misc@varnish-cache.org</a><br>
> <a href="https://www.varnish-cache.org/lists/mailman/listinfo/varnish-misc" rel="noreferrer" target="_blank">https://www.varnish-cache.org/lists/mailman/listinfo/varnish-misc</a><br>
</blockquote></div>