[Fryer] master FAIL. 9 of 25 tests succeeded.
fryer at oneiros.varnish-software.com
Tue Jun 5 16:04:35 CEST 2012
Tests Failed: httperf-lru-nostream-gzip httperf-lru-nostream-gzip-deflateoff httperf-lru-default httperf-lru-stream-default httperf-hot httperf-lru-nostream-nogzip cold-gzip httperf-lru-stream-gzip httperf-lru-stream-nogzip cold-nogzip siege-test httperf-lru-nostream-default httperf-rapid-expire streaming-grace cold-default purge-fail
Tests OK: streaming memleak 4gpluss-stream 4gpluss basic-fryer 4gpluss-nostream lru-random 4gpluss-nogzip streaming-gzip
2012-06-05 11:22:20 [1,18]: Server pantoum checked out varnish-3.0.0-beta2-1014-ge75cd2e of branch master
2012-06-05 11:22:45 [2,24]: httperf-lru-nostream-gzip(httperf): Starting test
2012-06-05 11:25:56 WARNING [0,191]: httperf-lru-nostream-gzip(httperf): Panic detected. I think!
2012-06-05 11:25:56 WARNING [0, 0]: httperf-lru-nostream-gzip(httperf):
Last panic at: Tue, 05 Jun 2012 11:24:34 GMT
Assert error in VCL_recv_method(), ../../include/tbl/vcl_returns.h line 27:
Condition((req->sp) != NULL) not true.
thread = (cache-worker)
ident = Linux,2.6.32-38-generic,x86_64,-smalloc,-smalloc,-hcritbit,epoll
Backtrace:
0x431d88: pan_ic+d8
0x43a3b5: VCL_recv_method+1a5
0x4164c5: cnt_recv+1e5
0x41ae1d: CNT_Session+57d
0x43670d: ses_pool_task+fd
0x433592: Pool_Work_Thread+112
0x441128: wrk_thread_real+c8
0x7fa38c97c9ca: _end+7fa38c2f8202
0x7fa38c6d9cdd: _end+7fa38c055515
sp = 0x7fa37f10cc20 {
fd = 20, id = 20, xid = 1855245884,
client = 10.20.100.9 14474,
step = STP_RECV,
handling = deliver,
restarts = 0, esi_level = 0
ws = 0x7fa380314158 {
id = "req",
{s,f,r,e} = {0x7fa380315730,+816,(nil),+59632},
},
http[req] = {
ws = 0x7fa380314158[req]
"GET",
"/1/6/4/8/8/3.html",
"HTTP/1.1",
"User-Agent: httperf/0.9.0",
"Host: 10.20.100.12",
},
worker = 0x7fa37c8dac60 {
ws = 0x7fa37c8dae20 {
id = "wrk",
{s,f,r,e} = {0x7fa37c8da450,0x7fa37c8da450,(nil),+2048},
},
},
vcl = {
srcname = {
"input",
"Default",
},
},
},
2012-06-05 11:25:56 WARNING [0, 0]: httperf-lru-nostream-gzip(httperf): Varnishstat uptime and measured run-time is too large (measured: 187 stat: 81 diff: 106). Did we crash?
2012-06-05 11:25:56 WARNING [0, 0]: httperf-lru-nostream-gzip(httperf): Out of bounds: n_lru_nuked(0) less than lower boundary 80000
2012-06-05 11:25:56 WARNING [0, 0]: httperf-lru-nostream-gzip(httperf): Out of bounds: client_req(212090) less than lower boundary 1989920
2012-06-05 11:25:57 [1, 0]: httperf-lru-nostream-gzip(httperf): Load: 13:25:57 up 4 days, 23:48, 3 users, load average: 0.64, 0.59, 0.71
2012-06-05 11:25:57 [1, 0]: httperf-lru-nostream-gzip(httperf): Test name: httperf-lru-nostream-gzip
2012-06-05 11:25:57 [1, 0]: httperf-lru-nostream-gzip(httperf): Varnish options:
2012-06-05 11:25:57 [1, 0]: httperf-lru-nostream-gzip(httperf): -t=3600
2012-06-05 11:25:57 [1, 0]: httperf-lru-nostream-gzip(httperf): -s=malloc,30M
2012-06-05 11:25:57 [1, 0]: httperf-lru-nostream-gzip(httperf): Varnish parameters:
2012-06-05 11:25:57 [1, 0]: httperf-lru-nostream-gzip(httperf): thread_stats_rate=1
2012-06-05 11:25:57 [1, 0]: httperf-lru-nostream-gzip(httperf): thread_pool_max=5000
2012-06-05 11:25:57 [1, 0]: httperf-lru-nostream-gzip(httperf): nuke_limit=250
2012-06-05 11:25:57 [1, 0]: httperf-lru-nostream-gzip(httperf): http_gzip_support=on
2012-06-05 11:25:57 [1, 0]: httperf-lru-nostream-gzip(httperf): thread_pool_min=200
2012-06-05 11:25:57 [1, 0]: httperf-lru-nostream-gzip(httperf): Payload size (excludes headers): 10K
2012-06-05 11:25:57 [1, 0]: httperf-lru-nostream-gzip(httperf): Branch: master
2012-06-05 11:25:57 [1, 0]: httperf-lru-nostream-gzip(httperf): Number of clients involved: 24
2012-06-05 11:25:57 [1, 0]: httperf-lru-nostream-gzip(httperf): Type of test: httperf
2012-06-05 11:25:57 [1, 0]: httperf-lru-nostream-gzip(httperf): Test iterations: 1
2012-06-05 11:25:57 [1, 0]: httperf-lru-nostream-gzip(httperf): Runtime: 187 seconds
2012-06-05 11:25:57 [1, 0]: httperf-lru-nostream-gzip(httperf): VCL:
backend foo {
.host = "localhost";
.port = "80";
}
sub vcl_fetch {
set beresp.do_stream = false;
}
2012-06-05 11:25:57 [1, 0]: httperf-lru-nostream-gzip(httperf): Number of total connections: 200000
2012-06-05 11:25:57 [1, 0]: httperf-lru-nostream-gzip(httperf): Note: connections are subject to rounding when divided among clients. Expect slight deviations.
2012-06-05 11:25:57 [1, 0]: httperf-lru-nostream-gzip(httperf): Requests per connection: 10
2012-06-05 11:25:57 [1, 0]: httperf-lru-nostream-gzip(httperf): Extra options to httperf: --wset=1000000,0.1
2012-06-05 11:25:57 [1, 0]: httperf-lru-nostream-gzip(httperf): Httperf command (last client): httperf --hog --timeout 60 --num-calls 10 --num-conns 8333 --port 8080 --burst-length 10 --client 23/24 --server 10.20.100.12 --wset=1000000,0.1
2012-06-05 11:26:03 [2, 6]: httperf-lru-nostream-gzip-deflateoff(httperf): Starting test
2012-06-05 11:29:28 WARNING [0,204]: httperf-lru-nostream-gzip-deflateoff(httperf): Panic detected. I think!
2012-06-05 11:29:28 WARNING [0, 0]: httperf-lru-nostream-gzip-deflateoff(httperf):
Last panic at: Tue, 05 Jun 2012 11:28:56 GMT
Assert error in VCL_recv_method(), ../../include/tbl/vcl_returns.h line 27:
Condition((req->sp) != NULL) not true.
thread = (cache-worker)
ident = Linux,2.6.32-38-generic,x86_64,-smalloc,-smalloc,-hcritbit,epoll
Backtrace:
0x431d88: pan_ic+d8
0x43a3b5: VCL_recv_method+1a5
0x4164c5: cnt_recv+1e5
0x41ae1d: CNT_Session+57d
0x43670d: ses_pool_task+fd
0x433592: Pool_Work_Thread+112
0x441128: wrk_thread_real+c8
0x7fdf80d129ca: _end+7fdf8068e202
0x7fdf80a6fcdd: _end+7fdf803eb515
sp = 0x7fdf6fdc0d20 {
fd = 14, id = 14, xid = 413360242,
client = 10.20.100.9 16926,
step = STP_RECV,
handling = deliver,
restarts = 0, esi_level = 0
ws = 0x7fdf70c27158 {
id = "req",
{s,f,r,e} = {0x7fdf70c28730,+248,(nil),+59632},
},
http[req] = {
ws = 0x7fdf70c27158[req]
"GET",
"/1/9/2/8/6/5.html",
"HTTP/1.1",
"User-Agent: httperf/0.9.0",
"Host: 10.20.100.12",
},
worker = 0x7fdf709b2c60 {
ws = 0x7fdf709b2e20 {
id = "wrk",
{s,f,r,e} = {0x7fdf709b2450,0x7fdf709b2450,(nil),+2048},
},
},
vcl = {
srcname = {
"input",
"Default",
},
},
},
2012-06-05 11:29:28 WARNING [0, 0]: httperf-lru-nostream-gzip-deflateoff(httperf): Varnishstat uptime and measured run-time is too large (measured: 200 stat: 31 diff: 169). Did we crash?
2012-06-05 11:29:28 WARNING [0, 0]: httperf-lru-nostream-gzip-deflateoff(httperf): Out of bounds: n_lru_nuked(0) less than lower boundary 80000
2012-06-05 11:29:28 WARNING [0, 0]: httperf-lru-nostream-gzip-deflateoff(httperf): Out of bounds: client_req(10040) less than lower boundary 1989920
2012-06-05 11:29:28 [1, 0]: httperf-lru-nostream-gzip-deflateoff(httperf): Load: 13:29:28 up 4 days, 23:52, 3 users, load average: 0.50, 0.70, 0.74
2012-06-05 11:29:28 [1, 0]: httperf-lru-nostream-gzip-deflateoff(httperf): Test name: httperf-lru-nostream-gzip-deflateoff
2012-06-05 11:29:28 [1, 0]: httperf-lru-nostream-gzip-deflateoff(httperf): Varnish options:
2012-06-05 11:29:28 [1, 0]: httperf-lru-nostream-gzip-deflateoff(httperf): -t=3600
2012-06-05 11:29:28 [1, 0]: httperf-lru-nostream-gzip-deflateoff(httperf): -s=malloc,30M
2012-06-05 11:29:28 [1, 0]: httperf-lru-nostream-gzip-deflateoff(httperf): Varnish parameters:
2012-06-05 11:29:28 [1, 0]: httperf-lru-nostream-gzip-deflateoff(httperf): thread_stats_rate=1
2012-06-05 11:29:28 [1, 0]: httperf-lru-nostream-gzip-deflateoff(httperf): thread_pool_max=5000
2012-06-05 11:29:28 [1, 0]: httperf-lru-nostream-gzip-deflateoff(httperf): nuke_limit=250
2012-06-05 11:29:28 [1, 0]: httperf-lru-nostream-gzip-deflateoff(httperf): http_gzip_support=on
2012-06-05 11:29:28 [1, 0]: httperf-lru-nostream-gzip-deflateoff(httperf): thread_pool_min=200
2012-06-05 11:29:28 [1, 0]: httperf-lru-nostream-gzip-deflateoff(httperf): Payload size (excludes headers): 10K
2012-06-05 11:29:28 [1, 0]: httperf-lru-nostream-gzip-deflateoff(httperf): Branch: master
2012-06-05 11:29:28 [1, 0]: httperf-lru-nostream-gzip-deflateoff(httperf): Number of clients involved: 24
2012-06-05 11:29:28 [1, 0]: httperf-lru-nostream-gzip-deflateoff(httperf): Type of test: httperf
2012-06-05 11:29:28 [1, 0]: httperf-lru-nostream-gzip-deflateoff(httperf): Test iterations: 1
2012-06-05 11:29:28 [1, 0]: httperf-lru-nostream-gzip-deflateoff(httperf): Runtime: 200 seconds
2012-06-05 11:29:28 [1, 0]: httperf-lru-nostream-gzip-deflateoff(httperf): VCL:
backend foo {
.host = "localhost";
.port = "80";
}
sub vcl_fetch {
set beresp.do_stream = false;
set beresp.do_gzip = true;
}
2012-06-05 11:29:28 [1, 0]: httperf-lru-nostream-gzip-deflateoff(httperf): Number of total connections: 200000
2012-06-05 11:29:28 [1, 0]: httperf-lru-nostream-gzip-deflateoff(httperf): Note: connections are subject to rounding when divided among clients. Expect slight deviations.
2012-06-05 11:29:28 [1, 0]: httperf-lru-nostream-gzip-deflateoff(httperf): Requests per connection: 10
2012-06-05 11:29:28 [1, 0]: httperf-lru-nostream-gzip-deflateoff(httperf): Extra options to httperf: --wset=1000000,0.1
2012-06-05 11:29:28 [1, 0]: httperf-lru-nostream-gzip-deflateoff(httperf): Httperf command (last client): httperf --hog --timeout 60 --num-calls 10 --num-conns 8333 --port 8080 --burst-length 10 --client 23/24 --server 10.20.100.12 --wset=1000000,0.1
2012-06-05 11:29:35 [2, 6]: streaming(httperf): Starting test
2012-06-05 11:32:29 [2,173]: httperf-lru-default(httperf): Starting test
2012-06-05 11:35:33 WARNING [0,183]: httperf-lru-default(httperf): Panic detected. I think!
2012-06-05 11:35:33 WARNING [0, 0]: httperf-lru-default(httperf):
Last panic at: Tue, 05 Jun 2012 11:34:30 GMT
Assert error in VCL_recv_method(), ../../include/tbl/vcl_returns.h line 27:
Condition((req->sp) != NULL) not true.
thread = (cache-worker)
ident = Linux,2.6.32-38-generic,x86_64,-smalloc,-smalloc,-hcritbit,epoll
Backtrace:
0x431d88: pan_ic+d8
0x43a3b5: VCL_recv_method+1a5
0x4164c5: cnt_recv+1e5
0x41ae1d: CNT_Session+57d
0x43670d: ses_pool_task+fd
0x433592: Pool_Work_Thread+112
0x441128: wrk_thread_real+c8
0x7fc4c805d9ca: _end+7fc4c79d9202
0x7fc4c7dbacdd: _end+7fc4c7736515
sp = 0x7fc4b9502320 {
fd = 13, id = 13, xid = 1709036440,
client = 10.20.100.8 13248,
step = STP_RECV,
handling = deliver,
restarts = 0, esi_level = 0
ws = 0x7fc4b8ba0158 {
id = "req",
{s,f,r,e} = {0x7fc4b8ba1730,+816,(nil),+59632},
},
http[req] = {
ws = 0x7fc4b8ba0158[req]
"GET",
"/1/4/6/9/1/3.html",
"HTTP/1.1",
"User-Agent: httperf/0.9.0",
"Host: 10.20.100.12",
},
worker = 0x7fc4b8142c60 {
ws = 0x7fc4b8142e20 {
id = "wrk",
{s,f,r,e} = {0x7fc4b8142450,0x7fc4b8142450,(nil),+2048},
},
},
vcl = {
srcname = {
"input",
"Default",
},
},
},
2012-06-05 11:35:33 WARNING [0, 0]: httperf-lru-default(httperf): Varnishstat uptime and measured run-time is too large (measured: 179 stat: 61 diff: 118). Did we crash?
2012-06-05 11:35:33 WARNING [0, 0]: httperf-lru-default(httperf): Out of bounds: n_lru_nuked(0) less than lower boundary 80000
2012-06-05 11:35:33 WARNING [0, 0]: httperf-lru-default(httperf): Out of bounds: client_req(142190) less than lower boundary 1989920
2012-06-05 11:35:33 [1, 0]: httperf-lru-default(httperf): Load: 13:35:33 up 4 days, 23:58, 3 users, load average: 0.67, 0.74, 0.73
2012-06-05 11:35:33 [1, 0]: httperf-lru-default(httperf): Test name: httperf-lru-default
2012-06-05 11:35:33 [1, 0]: httperf-lru-default(httperf): Varnish options:
2012-06-05 11:35:33 [1, 0]: httperf-lru-default(httperf): -t=3600
2012-06-05 11:35:33 [1, 0]: httperf-lru-default(httperf): -s=malloc,30M
2012-06-05 11:35:33 [1, 0]: httperf-lru-default(httperf): Varnish parameters:
2012-06-05 11:35:33 [1, 0]: httperf-lru-default(httperf): thread_stats_rate=1
2012-06-05 11:35:33 [1, 0]: httperf-lru-default(httperf): thread_pool_max=5000
2012-06-05 11:35:33 [1, 0]: httperf-lru-default(httperf): nuke_limit=250
2012-06-05 11:35:33 [1, 0]: httperf-lru-default(httperf): thread_pool_min=200
2012-06-05 11:35:33 [1, 0]: httperf-lru-default(httperf): Payload size (excludes headers): 10K
2012-06-05 11:35:33 [1, 0]: httperf-lru-default(httperf): Branch: master
2012-06-05 11:35:33 [1, 0]: httperf-lru-default(httperf): Number of clients involved: 24
2012-06-05 11:35:33 [1, 0]: httperf-lru-default(httperf): Type of test: httperf
2012-06-05 11:35:33 [1, 0]: httperf-lru-default(httperf): Test iterations: 1
2012-06-05 11:35:33 [1, 0]: httperf-lru-default(httperf): Runtime: 179 seconds
2012-06-05 11:35:33 [1, 0]: httperf-lru-default(httperf): VCL:
backend foo {
.host = "localhost";
.port = "80";
}
sub vcl_fetch {
set beresp.do_stream = true;
}
2012-06-05 11:35:33 [1, 0]: httperf-lru-default(httperf): Number of total connections: 200000
2012-06-05 11:35:33 [1, 0]: httperf-lru-default(httperf): Note: connections are subject to rounding when divided among clients. Expect slight deviations.
2012-06-05 11:35:33 [1, 0]: httperf-lru-default(httperf): Requests per connection: 10
2012-06-05 11:35:33 [1, 0]: httperf-lru-default(httperf): Extra options to httperf: --wset=1000000,0.1
2012-06-05 11:35:33 [1, 0]: httperf-lru-default(httperf): Httperf command (last client): httperf --hog --timeout 60 --num-calls 10 --num-conns 8333 --port 8080 --burst-length 10 --client 23/24 --server 10.20.100.12 --wset=1000000,0.1
2012-06-05 11:35:40 [2, 6]: memleak(httperf): Starting test
2012-06-05 11:38:03 [2,143]: 4gpluss-stream(httperf): Starting test
2012-06-05 11:38:06 WARNING [0, 2]: Varnish failed to start. Fallback attempts starting
2012-06-05 11:38:06 [1, 0]: Removed (hopefully incompatible) parameter to try to start varnish. Key: thread_stats_rate=1
2012-06-05 11:38:07 [1, 0]: Removed (hopefully incompatible) parameter to try to start varnish. Key: thread_pool_max=300
2012-06-05 11:38:07 [1, 0]: Removed (hopefully incompatible) parameter to try to start varnish. Key: sess_timeout=60000s
2012-06-05 11:38:08 [1, 0]: Fallback worked. Parameter that seemed to cause problems: sess_timeout
2012-06-05 11:58:33 [2,1224]: httperf-lru-stream-default(httperf): Starting test
2012-06-05 12:02:42 WARNING [0,249]: httperf-lru-stream-default(httperf): Panic detected. I think!
2012-06-05 12:02:42 WARNING [0, 0]: httperf-lru-stream-default(httperf):
Last panic at: Tue, 05 Jun 2012 12:00:31 GMT
Assert error in VCL_recv_method(), ../../include/tbl/vcl_returns.h line 27:
Condition((req->sp) != NULL) not true.
thread = (cache-worker)
ident = Linux,2.6.32-38-generic,x86_64,-smalloc,-smalloc,-hcritbit,epoll
Backtrace:
0x431d88: pan_ic+d8
0x43a3b5: VCL_recv_method+1a5
0x4164c5: cnt_recv+1e5
0x41ae1d: CNT_Session+57d
0x43670d: ses_pool_task+fd
0x433592: Pool_Work_Thread+112
0x441128: wrk_thread_real+c8
0x7fdf7c0cf9ca: _end+7fdf7ba4b202
0x7fdf7be2ccdd: _end+7fdf7b7a8515
sp = 0x7fdf6df18720 {
fd = 30, id = 30, xid = 1331107874,
client = 10.20.100.9 13392,
step = STP_RECV,
handling = deliver,
restarts = 0, esi_level = 0
ws = 0x7fdf6c1c6158 {
id = "req",
{s,f,r,e} = {0x7fdf6c1c7730,+248,(nil),+59632},
},
http[req] = {
ws = 0x7fdf6c1c6158[req]
"GET",
"/1/5/1/8/2/6.html",
"HTTP/1.1",
"User-Agent: httperf/0.9.0",
"Host: 10.20.100.12",
},
worker = 0x7fdf6c091c60 {
ws = 0x7fdf6c091e20 {
id = "wrk",
{s,f,r,e} = {0x7fdf6c091450,0x7fdf6c091450,(nil),+2048},
},
},
vcl = {
srcname = {
"input",
"Default",
},
},
},
2012-06-05 12:02:42 WARNING [0, 0]: httperf-lru-stream-default(httperf): Varnishstat uptime and measured run-time is too large (measured: 245 stat: 130 diff: 115). Did we crash?
2012-06-05 12:02:43 WARNING [0, 0]: httperf-lru-stream-default(httperf): Out of bounds: n_lru_nuked(0) less than lower boundary 80000
2012-06-05 12:02:43 WARNING [0, 0]: httperf-lru-stream-default(httperf): Out of bounds: client_req(304670) less than lower boundary 1989920
2012-06-05 12:02:43 [1, 0]: httperf-lru-stream-default(httperf): Load: 14:02:43 up 5 days, 25 min, 3 users, load average: 0.69, 0.86, 0.90
2012-06-05 12:02:43 [1, 0]: httperf-lru-stream-default(httperf): Test name: httperf-lru-stream-default
2012-06-05 12:02:43 [1, 0]: httperf-lru-stream-default(httperf): Varnish options:
2012-06-05 12:02:43 [1, 0]: httperf-lru-stream-default(httperf): -t=3600
2012-06-05 12:02:43 [1, 0]: httperf-lru-stream-default(httperf): -s=malloc,30M
2012-06-05 12:02:43 [1, 0]: httperf-lru-stream-default(httperf): Varnish parameters:
2012-06-05 12:02:43 [1, 0]: httperf-lru-stream-default(httperf): thread_stats_rate=1
2012-06-05 12:02:43 [1, 0]: httperf-lru-stream-default(httperf): thread_pool_max=5000
2012-06-05 12:02:43 [1, 0]: httperf-lru-stream-default(httperf): nuke_limit=250
2012-06-05 12:02:43 [1, 0]: httperf-lru-stream-default(httperf): thread_pool_min=200
2012-06-05 12:02:43 [1, 0]: httperf-lru-stream-default(httperf): Payload size (excludes headers): 10K
2012-06-05 12:02:43 [1, 0]: httperf-lru-stream-default(httperf): Branch: master
2012-06-05 12:02:43 [1, 0]: httperf-lru-stream-default(httperf): Number of clients involved: 24
2012-06-05 12:02:43 [1, 0]: httperf-lru-stream-default(httperf): Type of test: httperf
2012-06-05 12:02:43 [1, 0]: httperf-lru-stream-default(httperf): Test iterations: 1
2012-06-05 12:02:43 [1, 0]: httperf-lru-stream-default(httperf): Runtime: 245 seconds
2012-06-05 12:02:43 [1, 0]: httperf-lru-stream-default(httperf): VCL:
backend foo {
.host = "localhost";
.port = "80";
}
sub vcl_fetch {
set beresp.do_stream = true;
}
2012-06-05 12:02:43 [1, 0]: httperf-lru-stream-default(httperf): Number of total connections: 200000
2012-06-05 12:02:43 [1, 0]: httperf-lru-stream-default(httperf): Note: connections are subject to rounding when divided among clients. Expect slight deviations.
2012-06-05 12:02:43 [1, 0]: httperf-lru-stream-default(httperf): Requests per connection: 10
2012-06-05 12:02:43 [1, 0]: httperf-lru-stream-default(httperf): Extra options to httperf: --wset=1000000,0.1
2012-06-05 12:02:43 [1, 0]: httperf-lru-stream-default(httperf): Httperf command (last client): httperf --hog --timeout 60 --num-calls 10 --num-conns 8333 --port 8080 --burst-length 10 --client 23/24 --server 10.20.100.12 --wset=1000000,0.1
2012-06-05 12:02:50 [2, 6]: httperf-hot(httperf): Starting test
2012-06-05 12:04:04 WARNING [0,74]: httperf-hot(httperf): Panic detected. I think!
2012-06-05 12:04:04 WARNING [0, 0]: httperf-hot(httperf):
Last panic at: Tue, 05 Jun 2012 12:03:28 GMT
Assert error in VCL_recv_method(), ../../include/tbl/vcl_returns.h line 27:
Condition((req->sp) != NULL) not true.
thread = (cache-worker)
ident = Linux,2.6.32-38-generic,x86_64,-smalloc,-smalloc,-hcritbit,epoll
Backtrace:
0x431d88: pan_ic+d8
0x43a3b5: VCL_recv_method+1a5
0x4164c5: cnt_recv+1e5
0x41ae1d: CNT_Session+57d
0x43670d: ses_pool_task+fd
0x433592: Pool_Work_Thread+112
0x441128: wrk_thread_real+c8
0x7ff174fe69ca: _end+7ff174962202
0x7ff174d43cdd: _end+7ff1746bf515
sp = 0x7ff168a05120 {
fd = 13, id = 13, xid = 142322634,
client = 10.20.100.9 7959,
step = STP_RECV,
handling = deliver,
restarts = 0, esi_level = 0
ws = 0x7ff165d13158 {
id = "req",
{s,f,r,e} = {0x7ff165d14730,+232,(nil),+59632},
},
http[req] = {
ws = 0x7ff165d13158[req]
"GET",
"/8/4/0.html",
"HTTP/1.1",
"User-Agent: httperf/0.9.0",
"Host: 10.20.100.12",
},
worker = 0x7ff1663a5c60 {
ws = 0x7ff1663a5e20 {
id = "wrk",
{s,f,r,e} = {0x7ff1663a5450,0x7ff1663a5450,(nil),+2048},
},
},
vcl = {
srcname = {
"input",
"Default",
},
},
},
2012-06-05 12:04:04 WARNING [0, 0]: httperf-hot(httperf): Varnishstat uptime and measured run-time is too large (measured: 71 stat: 35 diff: 36). Did we crash?
2012-06-05 12:04:05 WARNING [0, 0]: httperf-hot(httperf): Out of bounds: client_req(104880) less than lower boundary 989840
2012-06-05 12:04:05 [1, 0]: httperf-hot(httperf): Load: 14:04:05 up 5 days, 26 min, 3 users, load average: 0.45, 0.77, 0.87
2012-06-05 12:04:05 [1, 0]: httperf-hot(httperf): Test name: httperf-hot
2012-06-05 12:04:05 [1, 0]: httperf-hot(httperf): Varnish options:
2012-06-05 12:04:05 [1, 0]: httperf-hot(httperf): Varnish parameters:
2012-06-05 12:04:05 [1, 0]: httperf-hot(httperf): thread_stats_rate=1
2012-06-05 12:04:05 [1, 0]: httperf-hot(httperf): Payload size (excludes headers): 256
2012-06-05 12:04:05 [1, 0]: httperf-hot(httperf): Branch: master
2012-06-05 12:04:05 [1, 0]: httperf-hot(httperf): Number of clients involved: 24
2012-06-05 12:04:05 [1, 0]: httperf-hot(httperf): Type of test: httperf
2012-06-05 12:04:05 [1, 0]: httperf-hot(httperf): Test iterations: 1
2012-06-05 12:04:05 [1, 0]: httperf-hot(httperf): Runtime: 71 seconds
2012-06-05 12:04:05 [1, 0]: httperf-hot(httperf): VCL:
backend foo {
.host = "localhost";
.port = "80";
}
sub vcl_fetch {
set beresp.do_stream = true;
}
2012-06-05 12:04:05 [1, 0]: httperf-hot(httperf): Number of total connections: 100000
2012-06-05 12:04:05 [1, 0]: httperf-hot(httperf): Note: connections are subject to rounding when divided among clients. Expect slight deviations.
2012-06-05 12:04:05 [1, 0]: httperf-hot(httperf): Requests per connection: 10
2012-06-05 12:04:05 [1, 0]: httperf-hot(httperf): Extra options to httperf: --wset=1000,0.25
2012-06-05 12:04:05 [1, 0]: httperf-hot(httperf): Httperf command (last client): httperf --hog --timeout 60 --num-calls 10 --num-conns 4166 --port 8080 --burst-length 10 --client 23/24 --server 10.20.100.12 --wset=1000,0.25
2012-06-05 12:04:12 [2, 6]: httperf-lru-nostream-nogzip(httperf): Starting test
2012-06-05 12:07:27 WARNING [0,195]: httperf-lru-nostream-nogzip(httperf): Panic detected. I think!
2012-06-05 12:07:27 WARNING [0, 0]: httperf-lru-nostream-nogzip(httperf):
Last panic at: Tue, 05 Jun 2012 12:06:15 GMT
Assert error in VCL_recv_method(), ../../include/tbl/vcl_returns.h line 27:
Condition((req->sp) != NULL) not true.
thread = (cache-worker)
ident = Linux,2.6.32-38-generic,x86_64,-smalloc,-smalloc,-hcritbit,epoll
Backtrace:
0x431d88: pan_ic+d8
0x43a3b5: VCL_recv_method+1a5
0x4164c5: cnt_recv+1e5
0x41ae1d: CNT_Session+57d
0x43670d: ses_pool_task+fd
0x433592: Pool_Work_Thread+112
0x441128: wrk_thread_real+c8
0x7fe79a9fa9ca: _end+7fe79a376202
0x7fe79a757cdd: _end+7fe79a0d3515
sp = 0x7fe78b603e20 {
fd = 23, id = 23, xid = 1424899721,
client = 10.20.100.9 15432,
step = STP_RECV,
handling = deliver,
restarts = 0, esi_level = 0
ws = 0x7fe78b779158 {
id = "req",
{s,f,r,e} = {0x7fe78b77a730,+248,(nil),+59632},
},
http[req] = {
ws = 0x7fe78b779158[req]
"GET",
"/1/7/4/0/5/0.html",
"HTTP/1.1",
"User-Agent: httperf/0.9.0",
"Host: 10.20.100.12",
},
worker = 0x7fe78a926c60 {
ws = 0x7fe78a926e20 {
id = "wrk",
{s,f,r,e} = {0x7fe78a926450,0x7fe78a926450,(nil),+2048},
},
},
vcl = {
srcname = {
"input",
"Default",
},
},
},
2012-06-05 12:07:27 WARNING [0, 0]: httperf-lru-nostream-nogzip(httperf): Varnishstat uptime and measured run-time is too large (measured: 191 stat: 71 diff: 120). Did we crash?
2012-06-05 12:07:27 WARNING [0, 0]: httperf-lru-nostream-nogzip(httperf): Out of bounds: n_lru_nuked(13509) less than lower boundary 80000
2012-06-05 12:07:27 WARNING [0, 0]: httperf-lru-nostream-nogzip(httperf): Out of bounds: client_req(163712) less than lower boundary 1989920
2012-06-05 12:07:28 [1, 0]: httperf-lru-nostream-nogzip(httperf): Load: 14:07:28 up 5 days, 30 min, 3 users, load average: 0.60, 0.83, 0.88
2012-06-05 12:07:28 [1, 0]: httperf-lru-nostream-nogzip(httperf): Test name: httperf-lru-nostream-nogzip
2012-06-05 12:07:28 [1, 0]: httperf-lru-nostream-nogzip(httperf): Varnish options:
2012-06-05 12:07:28 [1, 0]: httperf-lru-nostream-nogzip(httperf): -t=3600
2012-06-05 12:07:28 [1, 0]: httperf-lru-nostream-nogzip(httperf): -s=malloc,30M
2012-06-05 12:07:28 [1, 0]: httperf-lru-nostream-nogzip(httperf): Varnish parameters:
2012-06-05 12:07:28 [1, 0]: httperf-lru-nostream-nogzip(httperf): thread_stats_rate=1
2012-06-05 12:07:28 [1, 0]: httperf-lru-nostream-nogzip(httperf): thread_pool_max=5000
2012-06-05 12:07:28 [1, 0]: httperf-lru-nostream-nogzip(httperf): nuke_limit=250
2012-06-05 12:07:28 [1, 0]: httperf-lru-nostream-nogzip(httperf): http_gzip_support=off
2012-06-05 12:07:28 [1, 0]: httperf-lru-nostream-nogzip(httperf): thread_pool_min=200
2012-06-05 12:07:28 [1, 0]: httperf-lru-nostream-nogzip(httperf): Payload size (excludes headers): 10K
2012-06-05 12:07:28 [1, 0]: httperf-lru-nostream-nogzip(httperf): Branch: master
2012-06-05 12:07:28 [1, 0]: httperf-lru-nostream-nogzip(httperf): Number of clients involved: 24
2012-06-05 12:07:28 [1, 0]: httperf-lru-nostream-nogzip(httperf): Type of test: httperf
2012-06-05 12:07:28 [1, 0]: httperf-lru-nostream-nogzip(httperf): Test iterations: 1
2012-06-05 12:07:28 [1, 0]: httperf-lru-nostream-nogzip(httperf): Runtime: 191 seconds
2012-06-05 12:07:28 [1, 0]: httperf-lru-nostream-nogzip(httperf): VCL:
backend foo {
.host = "localhost";
.port = "80";
}
sub vcl_fetch {
set beresp.do_stream = false;
}
2012-06-05 12:07:28 [1, 0]: httperf-lru-nostream-nogzip(httperf): Number of total connections: 200000
2012-06-05 12:07:28 [1, 0]: httperf-lru-nostream-nogzip(httperf): Note: connections are subject to rounding when divided among clients. Expect slight deviations.
2012-06-05 12:07:28 [1, 0]: httperf-lru-nostream-nogzip(httperf): Requests per connection: 10
2012-06-05 12:07:28 [1, 0]: httperf-lru-nostream-nogzip(httperf): Extra options to httperf: --wset=1000000,0.1
2012-06-05 12:07:28 [1, 0]: httperf-lru-nostream-nogzip(httperf): Httperf command (last client): httperf --hog --timeout 60 --num-calls 10 --num-conns 8333 --port 8080 --burst-length 10 --client 23/24 --server 10.20.100.12 --wset=1000000,0.1
2012-06-05 12:07:34 [2, 6]: cold-gzip(httperf): Starting test
2012-06-05 12:10:36 WARNING [0,182]: cold-gzip(httperf): Panic detected. I think!
2012-06-05 12:10:36 WARNING [0, 0]: cold-gzip(httperf):
Last panic at: Tue, 05 Jun 2012 12:09:38 GMT
Assert error in VCL_recv_method(), ../../include/tbl/vcl_returns.h line 27:
Condition((req->sp) != NULL) not true.
thread = (cache-worker)
ident = Linux,2.6.32-38-generic,x86_64,-smalloc,-smalloc,-hcritbit,epoll
Backtrace:
0x431d88: pan_ic+d8
0x43a3b5: VCL_recv_method+1a5
0x4164c5: cnt_recv+1e5
0x41ae1d: CNT_Session+57d
0x43670d: ses_pool_task+fd
0x433592: Pool_Work_Thread+112
0x441128: wrk_thread_real+c8
0x7faf04f069ca: _end+7faf04882202
0x7faf04c63cdd: _end+7faf045df515
sp = 0x7faefb511f20 {
fd = 15, id = 15, xid = 753706176,
client = 10.20.100.8 5446,
step = STP_RECV,
handling = deliver,
restarts = 0, esi_level = 0
ws = 0x7faef16bc158 {
id = "req",
{s,f,r,e} = {0x7faef16bd730,+256,(nil),+59632},
},
http[req] = {
ws = 0x7faef16bc158[req]
"GET",
"/0/2/7/0/2/5/2.html",
"HTTP/1.1",
"User-Agent: httperf/0.9.0",
"Host: 10.20.100.12",
},
worker = 0x7faef63c8c60 {
ws = 0x7faef63c8e20 {
id = "wrk",
{s,f,r,e} = {0x7faef63c8450,0x7faef63c8450,(nil),+2048},
},
},
vcl = {
srcname = {
"input",
"Default",
},
},
},
2012-06-05 12:10:36 WARNING [0, 0]: cold-gzip(httperf): Varnishstat uptime and measured run-time is too large (measured: 178 stat: 57 diff: 121). Did we crash?
2012-06-05 12:14:03 WARNING [0,206]: cold-gzip(httperf): Panic detected. I think!
2012-06-05 12:14:03 WARNING [0, 0]: cold-gzip(httperf):
Last panic at: Tue, 05 Jun 2012 12:12:18 GMT
Assert error in cnt_first(), cache/cache_center.c line 943:
Condition(req->sp == sp) not true.
thread = (cache-worker)
ident = Linux,2.6.32-38-generic,x86_64,-smalloc,-smalloc,-hcritbit,epoll
Backtrace:
0x431d88: pan_ic+d8
0x41b021: CNT_Session+781
0x43670d: ses_pool_task+fd
0x433592: Pool_Work_Thread+112
0x441128: wrk_thread_real+c8
0x7faf04f069ca: _end+7faf04882202
0x7faf04c63cdd: _end+7faf045df515
sp = 0x7faeed7a9420 {
fd = 29, id = 29, xid = 0,
client = ,
step = STP_FIRST,
handling = deliver,
restarts = 0, esi_level = 0
ws = 0x7faef5a8a158 {
id = "req",
{s,f,r,e} = {0x7faef5a8b730,0x7faef5a8b730,(nil),+59632},
},
http[req] = {
ws = (nil)[]
},
worker = 0x7faef6c14c60 {
ws = 0x7faef6c14e20 {
id = "wrk",
{s,f,r,e} = {0x7faef6c14450,0x7faef6c14450,(nil),+2048},
},
},
},
2012-06-05 12:14:03 WARNING [0, 0]: cold-gzip(httperf): Varnishstat uptime and measured run-time is too large (measured: 384 stat: 104 diff: 280). Did we crash?
2012-06-05 12:14:03 WARNING [0, 0]: cold-gzip(httperf): Out of bounds: client_req(207324) less than lower boundary 1589840
2012-06-05 12:14:04 [1, 0]: cold-gzip(httperf): Load: 14:14:04 up 5 days, 36 min, 3 users, load average: 0.42, 1.01, 1.02
2012-06-05 12:14:04 [1, 0]: cold-gzip(httperf): Test name: cold-gzip
2012-06-05 12:14:04 [1, 0]: cold-gzip(httperf): Varnish options:
2012-06-05 12:14:04 [1, 0]: cold-gzip(httperf): -t=3600
2012-06-05 12:14:04 [1, 0]: cold-gzip(httperf): -s=malloc,10G
2012-06-05 12:14:04 [1, 0]: cold-gzip(httperf): Varnish parameters:
2012-06-05 12:14:04 [1, 0]: cold-gzip(httperf): thread_stats_rate=1
2012-06-05 12:14:04 [1, 0]: cold-gzip(httperf): http_gzip_support=on
2012-06-05 12:14:04 [1, 0]: cold-gzip(httperf): Payload size (excludes headers): 256
2012-06-05 12:14:04 [1, 0]: cold-gzip(httperf): Branch: master
2012-06-05 12:14:04 [1, 0]: cold-gzip(httperf): Number of clients involved: 24
2012-06-05 12:14:04 [1, 0]: cold-gzip(httperf): Type of test: httperf
2012-06-05 12:14:04 [1, 0]: cold-gzip(httperf): Test iterations: 2
2012-06-05 12:14:04 [1, 0]: cold-gzip(httperf): Runtime: 384 seconds
2012-06-05 12:14:04 [1, 0]: cold-gzip(httperf): VCL:
backend foo {
.host = "localhost";
.port = "80";
}
sub vcl_fetch {
set beresp.do_stream = true;
}
2012-06-05 12:14:04 [1, 0]: cold-gzip(httperf): Number of total connections: 80000
2012-06-05 12:14:04 [1, 0]: cold-gzip(httperf): Note: connections are subject to rounding when divided among clients. Expect slight deviations.
2012-06-05 12:14:04 [1, 0]: cold-gzip(httperf): Requests per connection: 10
2012-06-05 12:14:04 [1, 0]: cold-gzip(httperf): Extra options to httperf: --wset=4000000,0.50
2012-06-05 12:14:04 [1, 0]: cold-gzip(httperf): Httperf command (last client): httperf --hog --timeout 60 --num-calls 10 --num-conns 3333 --port 8080 --burst-length 10 --client 23/24 --server 10.20.100.12 --wset=4000000,0.50
2012-06-05 12:14:10 [2, 6]: 4gpluss(httperf): Starting test
2012-06-05 12:14:13 WARNING [0, 2]: Varnish failed to start. Fallback attempts starting
2012-06-05 12:14:13 [1, 0]: Removed (hopefully incompatible) parameter to try to start varnish. Key: thread_stats_rate=1
2012-06-05 12:14:14 [1, 0]: Removed (hopefully incompatible) parameter to try to start varnish. Key: thread_pool_max=300
2012-06-05 12:14:14 [1, 0]: Removed (hopefully incompatible) parameter to try to start varnish. Key: sess_timeout=60000s
2012-06-05 12:14:15 [1, 0]: Fallback worked. Parameter that seemed to cause problems: sess_timeout
2012-06-05 12:35:26 [2,1271]: httperf-lru-stream-gzip(httperf): Starting test
2012-06-05 12:39:12 WARNING [0,225]: httperf-lru-stream-gzip(httperf): Panic detected. I think!
2012-06-05 12:39:12 WARNING [0, 0]: httperf-lru-stream-gzip(httperf):
Last panic at: Tue, 05 Jun 2012 12:38:44 GMT
Assert error in VCL_fetch_method(), ../../include/tbl/vcl_returns.h line 56:
Condition((req->sp) != NULL) not true.
thread = (cache-worker)
ident = Linux,2.6.32-38-generic,x86_64,-smalloc,-smalloc,-hcritbit,epoll
Backtrace:
0x431d88: pan_ic+d8
0x439875: VCL_fetch_method+1a5
0x4186a9: cnt_fetch+479
0x41accd: CNT_Session+42d
0x43670d: ses_pool_task+fd
0x433592: Pool_Work_Thread+112
0x441128: wrk_thread_real+c8
0x7f10f69d99ca: _end+7f10f6355202
0x7f10f6736cdd: _end+7f10f60b2515
sp = 0x7f10e8914420 {
fd = 21, id = 21, xid = 1528589147,
client = 10.20.100.9 17754,
step = STP_FETCH,
handling = fetch,
err_code = 200, err_reason = (null),
restarts = 0, esi_level = 0
busyobj = 0x7f10e7502020 {
ws = 0x7f10e7502070 {
id = "bo",
{s,f,r,e} = {0x7f10e7503aa0,+512,(nil),+58752},
},
do_stream
bodystatus = 3 (chunked),
},
http[bereq] = {
ws = 0x7f10e7502070[bo]
"GET",
"/1/9/2/7/9/8.html",
"HTTP/1.1",
"User-Agent: httperf/0.9.0",
"Host: 10.20.100.12",
"X-Forwarded-For: 10.20.100.9",
"X-Varnish: 1528589147",
"Accept-Encoding: gzip",
},
http[beresp] = {
ws = 0x7f10e7502070[bo]
"HTTP/1.1",
"200",
"OK",
"Server: nginx/0.7.65",
"Date: Tue, 05 Jun 2012 12:38:44 GMT",
"Content-Type: text/plain",
"Last-Modified: Tue, 05 Jun 2012 12:35:30 GMT",
"Transfer-Encoding: chunked",
"Connection: keep-alive",
"Content-Encoding: gzip",
},
ws = 0x7f10e9390158 {
id = "req",
{s,f,r,e} = {0x7f10e9391730,+136,(nil),+59632},
},
http[req] = {
ws = 0x7f10e9390158[req]
"GET",
"/1/9/2/7/9/8.html",
"HTTP/1.1",
"User-Agent: httperf/0.9.0",
"Host: 10.20.100.12",
"X-Forwarded-For: 10.20.100.9",
},
worker = 0x7f10e6932c60 {
ws = 0x7f10e6932e20 {
id = "wrk",
{s,f,r,e} = {0x7f10e6932450,0x7f10e6932450,(nil),+2048},
},
},
vcl = {
srcname = {
"input",
"Default",
},
},
},
2012-06-05 12:39:12 WARNING [0, 0]: httperf-lru-stream-gzip(httperf): Varnishstat uptime and measured run-time is too large (measured: 221 stat: 27 diff: 194). Did we crash?
2012-06-05 12:39:12 WARNING [0, 0]: httperf-lru-stream-gzip(httperf): Out of bounds: n_lru_nuked(0) less than lower boundary 80000
2012-06-05 12:39:12 WARNING [0, 0]: httperf-lru-stream-gzip(httperf): Out of bounds: client_req(7000) less than lower boundary 1989920
2012-06-05 12:39:13 [1, 0]: httperf-lru-stream-gzip(httperf): Load: 14:39:13 up 5 days, 1:02, 3 users, load average: 0.67, 0.82, 1.10
2012-06-05 12:39:13 [1, 0]: httperf-lru-stream-gzip(httperf): Test name: httperf-lru-stream-gzip
2012-06-05 12:39:13 [1, 0]: httperf-lru-stream-gzip(httperf): Varnish options:
2012-06-05 12:39:13 [1, 0]: httperf-lru-stream-gzip(httperf): -t=3600
2012-06-05 12:39:13 [1, 0]: httperf-lru-stream-gzip(httperf): -s=malloc,30M
2012-06-05 12:39:13 [1, 0]: httperf-lru-stream-gzip(httperf): Varnish parameters:
2012-06-05 12:39:13 [1, 0]: httperf-lru-stream-gzip(httperf): thread_stats_rate=1
2012-06-05 12:39:13 [1, 0]: httperf-lru-stream-gzip(httperf): thread_pool_max=5000
2012-06-05 12:39:13 [1, 0]: httperf-lru-stream-gzip(httperf): nuke_limit=250
2012-06-05 12:39:13 [1, 0]: httperf-lru-stream-gzip(httperf): http_gzip_support=on
2012-06-05 12:39:13 [1, 0]: httperf-lru-stream-gzip(httperf): thread_pool_min=200
2012-06-05 12:39:13 [1, 0]: httperf-lru-stream-gzip(httperf): Payload size (excludes headers): 10K
2012-06-05 12:39:13 [1, 0]: httperf-lru-stream-gzip(httperf): Branch: master
2012-06-05 12:39:13 [1, 0]: httperf-lru-stream-gzip(httperf): Number of clients involved: 24
2012-06-05 12:39:13 [1, 0]: httperf-lru-stream-gzip(httperf): Type of test: httperf
2012-06-05 12:39:13 [1, 0]: httperf-lru-stream-gzip(httperf): Test iterations: 1
2012-06-05 12:39:13 [1, 0]: httperf-lru-stream-gzip(httperf): Runtime: 221 seconds
2012-06-05 12:39:13 [1, 0]: httperf-lru-stream-gzip(httperf): VCL:
backend foo {
.host = "localhost";
.port = "80";
}
sub vcl_fetch {
set beresp.do_stream = true;
}
2012-06-05 12:39:13 [1, 0]: httperf-lru-stream-gzip(httperf): Number of total connections: 200000
2012-06-05 12:39:13 [1, 0]: httperf-lru-stream-gzip(httperf): Note: connections are subject to rounding when divided among clients. Expect slight deviations.
2012-06-05 12:39:13 [1, 0]: httperf-lru-stream-gzip(httperf): Requests per connection: 10
2012-06-05 12:39:13 [1, 0]: httperf-lru-stream-gzip(httperf): Extra options to httperf: --wset=1000000,0.1
2012-06-05 12:39:13 [1, 0]: httperf-lru-stream-gzip(httperf): Httperf command (last client): httperf --hog --timeout 60 --num-calls 10 --num-conns 8333 --port 8080 --burst-length 10 --client 23/24 --server 10.20.100.12 --wset=1000000,0.1
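The "Out of bounds" warnings in the report above come from comparing measured varnishstat counters against per-test expected floors. A minimal sketch of that kind of check (function and parameter names are illustrative, not the fryer's actual code):

```python
def check_bounds(stats, bounds):
    """Compare measured counters against per-test lower bounds.

    stats:  dict of counter name -> measured value
    bounds: dict of counter name -> required minimum
    Returns a list of warning strings, one per violated bound.
    """
    warnings = []
    for key, floor in bounds.items():
        value = stats.get(key, 0)
        if value < floor:
            warnings.append(
                "Out of bounds: %s(%d) less than lower boundary %d"
                % (key, value, floor))
    return warnings

# Values taken from the httperf-lru-stream-gzip report above.
for w in check_bounds({"n_lru_nuked": 0, "client_req": 7000},
                      {"n_lru_nuked": 80000, "client_req": 1989920}):
    print(w)
```

With the numbers from this run, both counters fall far below their floors, which is consistent with varnishd having panicked partway through the test.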
2012-06-05 12:39:19 [2, 6]: httperf-lru-stream-nogzip(httperf): Starting test
2012-06-05 12:42:36 WARNING [0,196]: httperf-lru-stream-nogzip(httperf): Panic detected. I think!
2012-06-05 12:42:36 WARNING [0, 0]: httperf-lru-stream-nogzip(httperf):
Last panic at: Tue, 05 Jun 2012 12:41:18 GMT
Assert error in VCL_deliver_method(), ../../include/tbl/vcl_returns.h line 60:
Condition((req->sp) != NULL) not true.
thread = (cache-worker)
ident = Linux,2.6.32-38-generic,x86_64,-smalloc,-smalloc,-hcritbit,epoll
Backtrace:
0x431d88: pan_ic+d8
0x439695: VCL_deliver_method+1a5
0x419729: cnt_prepresp+269
0x41ac6d: CNT_Session+3cd
0x43670d: ses_pool_task+fd
0x433592: Pool_Work_Thread+112
0x441128: wrk_thread_real+c8
0x7fca53d5b9ca: _end+7fca536d7202
0x7fca53ab8cdd: _end+7fca53434515
sp = 0x7fca4501af20 {
fd = 16, id = 16, xid = 777659782,
client = 10.20.100.8 14047,
step = STP_PREPRESP,
handling = deliver,
restarts = 0, esi_level = 0
ws = 0x7fca43ab6158 {
id = "req",
{s,f,r,e} = {0x7fca43ab7730,+400,(nil),+59632},
},
http[req] = {
ws = 0x7fca43ab6158[req]
"GET",
"/1/5/6/9/9/6.html",
"HTTP/1.1",
"User-Agent: httperf/0.9.0",
"Host: 10.20.100.12",
"X-Forwarded-For: 10.20.100.8",
},
http[resp] = {
ws = 0x7fca43ab6158[req]
"HTTP/1.1",
"OK",
"Server: nginx/0.7.65",
"Content-Type: text/plain",
"Last-Modified: Tue, 05 Jun 2012 12:39:23 GMT",
"Content-Length: 10240",
"Accept-Ranges: bytes",
"Date: Tue, 05 Jun 2012 12:41:18 GMT",
"X-Varnish: 777659782 777659666",
"Age: 0",
"Via: 1.1 varnish",
"Connection: keep-alive",
},
worker = 0x7fca43cc2c60 {
ws = 0x7fca43cc2e20 {
id = "wrk",
{s,f,r,e} = {0x7fca43cc2450,0x7fca43cc2450,(nil),+2048},
},
},
vcl = {
srcname = {
"input",
"Default",
},
},
obj = 0x7fca43d17400 {
xid = 777659666,
ws = 0x7fca43d17418 {
id = "obj",
{s,f,r,e} = {0x7fca43d175d8,+192,(nil),+224},
},
http[obj] = {
ws = 0x7fca43d17418[obj]
"HTTP/1.1",
"OK",
"Server: nginx/0.7.65",
"Date: Tue, 05 Jun 2012 12:41:18 GMT",
"Content-Type: text/plain",
"Last-Modified: Tue, 05 Jun 2012 12:39:23 GMT",
"Content-Length: 10240",
},
len = 10240,
store = {
10240 {
4c 6f 72 65 6d 20 69 70 73 75 6d 20 64 6f 6c 6f |Lorem ipsum dolo|
72 20 73 69 74 20 61 6d 65 74 2c 20 63 6f 6e 73 |r sit amet, cons|
65 63 74 65 74 75 72 20 61 64 69 70 69 73 69 63 |ectetur adipisic|
69 6e 67 20 65 6c 69 74 2c 20 73 65 64 20 64 6f |ing elit, sed do|
[10176 more]
},
},
},
},
2012-06-05 12:42:36 WARNING [0, 0]: httperf-lru-stream-nogzip(httperf): Varnishstat uptime and measured run-time is too large (measured: 193 stat: 77 diff: 116). Did we crash?
2012-06-05 12:42:36 WARNING [0, 0]: httperf-lru-stream-nogzip(httperf): Out of bounds: n_lru_nuked(21958) less than lower boundary 80000
2012-06-05 12:42:36 WARNING [0, 0]: httperf-lru-stream-nogzip(httperf): Out of bounds: client_req(248170) less than lower boundary 1989920
2012-06-05 12:42:37 [1, 0]: httperf-lru-stream-nogzip(httperf): Load: 14:42:37 up 5 days, 1:05, 3 users, load average: 0.50, 0.76, 1.02
2012-06-05 12:42:37 [1, 0]: httperf-lru-stream-nogzip(httperf): Test name: httperf-lru-stream-nogzip
2012-06-05 12:42:37 [1, 0]: httperf-lru-stream-nogzip(httperf): Varnish options:
2012-06-05 12:42:37 [1, 0]: httperf-lru-stream-nogzip(httperf): -t=3600
2012-06-05 12:42:37 [1, 0]: httperf-lru-stream-nogzip(httperf): -s=malloc,30M
2012-06-05 12:42:37 [1, 0]: httperf-lru-stream-nogzip(httperf): Varnish parameters:
2012-06-05 12:42:37 [1, 0]: httperf-lru-stream-nogzip(httperf): thread_stats_rate=1
2012-06-05 12:42:37 [1, 0]: httperf-lru-stream-nogzip(httperf): thread_pool_max=5000
2012-06-05 12:42:37 [1, 0]: httperf-lru-stream-nogzip(httperf): nuke_limit=250
2012-06-05 12:42:37 [1, 0]: httperf-lru-stream-nogzip(httperf): http_gzip_support=off
2012-06-05 12:42:37 [1, 0]: httperf-lru-stream-nogzip(httperf): thread_pool_min=200
2012-06-05 12:42:37 [1, 0]: httperf-lru-stream-nogzip(httperf): Payload size (excludes headers): 10K
2012-06-05 12:42:37 [1, 0]: httperf-lru-stream-nogzip(httperf): Branch: master
2012-06-05 12:42:37 [1, 0]: httperf-lru-stream-nogzip(httperf): Number of clients involved: 24
2012-06-05 12:42:37 [1, 0]: httperf-lru-stream-nogzip(httperf): Type of test: httperf
2012-06-05 12:42:37 [1, 0]: httperf-lru-stream-nogzip(httperf): Test iterations: 1
2012-06-05 12:42:37 [1, 0]: httperf-lru-stream-nogzip(httperf): Runtime: 193 seconds
2012-06-05 12:42:37 [1, 0]: httperf-lru-stream-nogzip(httperf): VCL:
backend foo {
.host = "localhost";
.port = "80";
}
sub vcl_fetch {
set beresp.do_stream = true;
}
2012-06-05 12:42:37 [1, 0]: httperf-lru-stream-nogzip(httperf): Number of total connections: 200000
2012-06-05 12:42:37 [1, 0]: httperf-lru-stream-nogzip(httperf): Note: connections are subject to rounding when divided among clients. Expect slight deviations.
2012-06-05 12:42:37 [1, 0]: httperf-lru-stream-nogzip(httperf): Requests per connection: 10
2012-06-05 12:42:37 [1, 0]: httperf-lru-stream-nogzip(httperf): Extra options to httperf: --wset=1000000,0.1
2012-06-05 12:42:37 [1, 0]: httperf-lru-stream-nogzip(httperf): Httperf command (last client): httperf --hog --timeout 60 --num-calls 10 --num-conns 8333 --port 8080 --burst-length 10 --client 23/24 --server 10.20.100.12 --wset=1000000,0.1
2012-06-05 12:42:43 [2, 6]: basic-fryer(httperf): Starting test
2012-06-05 12:43:08 [2,24]: cold-nogzip(httperf): Starting test
2012-06-05 12:46:26 WARNING [0,197]: cold-nogzip(httperf): Panic detected. I think!
2012-06-05 12:46:26 WARNING [0, 0]: cold-nogzip(httperf):
Last panic at: Tue, 05 Jun 2012 12:44:40 GMT
Assert error in VCL_fetch_method(), ../../include/tbl/vcl_returns.h line 56:
Condition((req->sp) != NULL) not true.
thread = (cache-worker)
ident = Linux,2.6.32-38-generic,x86_64,-smalloc,-smalloc,-hcritbit,epoll
Backtrace:
0x431d88: pan_ic+d8
0x439875: VCL_fetch_method+1a5
0x4186a9: cnt_fetch+479
0x41accd: CNT_Session+42d
0x43670d: ses_pool_task+fd
0x433592: Pool_Work_Thread+112
0x441128: wrk_thread_real+c8
0x7fd71754f9ca: _end+7fd716ecb202
0x7fd7172accdd: _end+7fd716c28515
sp = 0x7fd707c04120 {
fd = 13, id = 13, xid = 131844479,
client = 10.20.100.8 4496,
step = STP_FETCH,
handling = fetch,
err_code = 200, err_reason = (null),
restarts = 0, esi_level = 0
busyobj = 0x7fd707702020 {
ws = 0x7fd707702070 {
id = "bo",
{s,f,r,e} = {0x7fd707703aa0,+248,(nil),+58752},
},
do_stream
bodystatus = 4 (length),
},
http[bereq] = {
ws = 0x7fd707702070[bo]
"GET",
"/0/2/1/1/7/1/3.html",
"HTTP/1.1",
"User-Agent: httperf/0.9.0",
"Host: 10.20.100.12",
"X-Forwarded-For: 10.20.100.8",
"X-Varnish: 131844479",
},
http[beresp] = {
ws = 0x7fd707702070[bo]
"HTTP/1.1",
"200",
"OK",
"Server: nginx/0.7.65",
"Date: Tue, 05 Jun 2012 12:44:40 GMT",
"Content-Type: text/plain",
"Content-Length: 256",
"Last-Modified: Tue, 05 Jun 2012 12:43:12 GMT",
"Connection: keep-alive",
"Accept-Ranges: bytes",
},
ws = 0x7fd70830b158 {
id = "req",
{s,f,r,e} = {0x7fd70830c730,+216,(nil),+59632},
},
http[req] = {
ws = 0x7fd70830b158[req]
"GET",
"/0/2/1/1/7/1/3.html",
"HTTP/1.1",
"User-Agent: httperf/0.9.0",
"Host: 10.20.100.12",
"X-Forwarded-For: 10.20.100.8",
},
worker = 0x7fd70804ac60 {
ws = 0x7fd70804ae20 {
id = "wrk",
{s,f,r,e} = {0x7fd70804a450,0x7fd70804a450,(nil),+2048},
},
},
vcl = {
srcname = {
"input",
"Default",
},
},
},
2012-06-05 12:46:26 WARNING [0, 0]: cold-nogzip(httperf): Varnishstat uptime and measured run-time is too large (measured: 193 stat: 105 diff: 88). Did we crash?
2012-06-05 12:49:25 WARNING [0,178]: cold-nogzip(httperf): Panic detected. I think!
2012-06-05 12:49:25 WARNING [0, 0]: cold-nogzip(httperf):
Last panic at: Tue, 05 Jun 2012 12:47:47 GMT
Assert error in cnt_first(), cache/cache_center.c line 943:
Condition(req->sp == sp) not true.
thread = (cache-worker)
ident = Linux,2.6.32-38-generic,x86_64,-smalloc,-smalloc,-hcritbit,epoll
Backtrace:
0x431d88: pan_ic+d8
0x41b021: CNT_Session+781
0x43670d: ses_pool_task+fd
0x433592: Pool_Work_Thread+112
0x441128: wrk_thread_real+c8
0x7fd71754f9ca: _end+7fd716ecb202
0x7fd7172accdd: _end+7fd716c28515
sp = 0x7fd707935320 {
fd = 31, id = 31, xid = 0,
client = ,
step = STP_FIRST,
handling = deliver,
restarts = 0, esi_level = 0
ws = 0x7fd709dee158 {
id = "req",
{s,f,r,e} = {0x7fd709def730,0x7fd709def730,(nil),+59632},
},
http[req] = {
ws = (nil)[]
},
worker = 0x7fd70922cc60 {
ws = 0x7fd70922ce20 {
id = "wrk",
{s,f,r,e} = {0x7fd70922c450,0x7fd70922c450,(nil),+2048},
},
},
},
2012-06-05 12:49:25 WARNING [0, 0]: cold-nogzip(httperf): Varnishstat uptime and measured run-time is too large (measured: 372 stat: 97 diff: 275). Did we crash?
2012-06-05 12:49:25 WARNING [0, 0]: cold-nogzip(httperf): Out of bounds: uptime(97) less than lower boundary 100
2012-06-05 12:49:25 WARNING [0, 0]: cold-nogzip(httperf): Out of bounds: client_req(180854) less than lower boundary 1589840
2012-06-05 12:49:26 [1, 0]: cold-nogzip(httperf): Load: 14:49:26 up 5 days, 1:12, 3 users, load average: 0.84, 0.70, 0.88
2012-06-05 12:49:26 [1, 0]: cold-nogzip(httperf): Test name: cold-nogzip
2012-06-05 12:49:26 [1, 0]: cold-nogzip(httperf): Varnish options:
2012-06-05 12:49:26 [1, 0]: cold-nogzip(httperf): -t=3600
2012-06-05 12:49:26 [1, 0]: cold-nogzip(httperf): -s=malloc,10G
2012-06-05 12:49:26 [1, 0]: cold-nogzip(httperf): Varnish parameters:
2012-06-05 12:49:26 [1, 0]: cold-nogzip(httperf): thread_stats_rate=1
2012-06-05 12:49:26 [1, 0]: cold-nogzip(httperf): http_gzip_support=off
2012-06-05 12:49:26 [1, 0]: cold-nogzip(httperf): Payload size (excludes headers): 256
2012-06-05 12:49:26 [1, 0]: cold-nogzip(httperf): Branch: master
2012-06-05 12:49:26 [1, 0]: cold-nogzip(httperf): Number of clients involved: 24
2012-06-05 12:49:26 [1, 0]: cold-nogzip(httperf): Type of test: httperf
2012-06-05 12:49:26 [1, 0]: cold-nogzip(httperf): Test iterations: 2
2012-06-05 12:49:26 [1, 0]: cold-nogzip(httperf): Runtime: 372 seconds
2012-06-05 12:49:26 [1, 0]: cold-nogzip(httperf): VCL:
backend foo {
.host = "localhost";
.port = "80";
}
sub vcl_fetch {
set beresp.do_stream = true;
}
2012-06-05 12:49:26 [1, 0]: cold-nogzip(httperf): Number of total connections: 80000
2012-06-05 12:49:26 [1, 0]: cold-nogzip(httperf): Note: connections are subject to rounding when divided among clients. Expect slight deviations.
2012-06-05 12:49:26 [1, 0]: cold-nogzip(httperf): Requests per connection: 10
2012-06-05 12:49:26 [1, 0]: cold-nogzip(httperf): Extra options to httperf: --wset=4000000,0.50
2012-06-05 12:49:26 [1, 0]: cold-nogzip(httperf): Httperf command (last client): httperf --hog --timeout 60 --num-calls 10 --num-conns 3333 --port 8080 --burst-length 10 --client 23/24 --server 10.20.100.12 --wset=4000000,0.50
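The "Did we crash?" heuristic, seen twice in the cold-nogzip report above, compares the wall-clock run time against varnishstat's uptime counter: a varnishd restart resets the uptime counter, so a large gap between the two suggests a crash mid-test. A rough sketch of the idea (the slack threshold and names are guesses, not the fryer's actual values):

```python
def probably_crashed(measured_runtime, stat_uptime, slack=30):
    """Flag a likely crash when the wall-clock runtime exceeds
    varnishstat's reported uptime by more than `slack` seconds,
    since a restart resets the uptime counter to zero."""
    return (measured_runtime - stat_uptime) > slack

# Numbers from the second cold-nogzip warning above:
# measured 372s vs stat uptime 97s, diff 275s.
print(probably_crashed(372, 97))
```

Here the diff of 275 seconds, combined with the two panics logged during the test, makes the crash all but certain.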
2012-06-05 12:49:32 [2, 6]: 4gpluss-nostream(httperf): Starting test
2012-06-05 12:49:35 WARNING [0, 2]: Varnish failed to start. Fallback attempts starting
2012-06-05 12:49:35 [1, 0]: Removed (hopefully incompatible) parameter to try to start varnish. Key: thread_stats_rate=1
2012-06-05 12:49:36 [1, 0]: Removed (hopefully incompatible) parameter to try to start varnish. Key: thread_pool_max=300
2012-06-05 12:49:36 [1, 0]: Removed (hopefully incompatible) parameter to try to start varnish. Key: sess_timeout=60000s
2012-06-05 12:49:37 [1, 0]: Fallback worked. Parameter that seemed to cause problems: sess_timeout
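The fallback sequence logged above (and repeated before each 4gpluss test) removes one suspect parameter at a time and retries the start, reporting the last removal before success as the likely culprit. A rough sketch of that strategy (all names are illustrative, not the fryer's actual code):

```python
def start_with_fallback(start_fn, params, suspects):
    """Try to start with the full parameter set; on failure,
    drop suspect parameters one by one, retrying after each
    removal, until the start succeeds.

    start_fn(params) -> bool, True when varnishd comes up.
    Returns (final_params, removed_keys); raises if all fail.
    """
    if start_fn(params):
        return dict(params), []
    removed = []
    trimmed = dict(params)
    for key in suspects:
        if key in trimmed:
            del trimmed[key]
            removed.append(key)
            if start_fn(trimmed):
                # The last key removed is the likely culprit.
                return trimmed, removed
    raise RuntimeError("no parameter combination worked")

# Simulate this run: the start only succeeds once sess_timeout
# (renamed in later Varnish versions) is gone.
ok = lambda p: "sess_timeout" not in p
params = {"thread_stats_rate": "1", "thread_pool_max": "300",
          "sess_timeout": "60000s"}
final, removed = start_with_fallback(
    ok, params,
    ["thread_stats_rate", "thread_pool_max", "sess_timeout"])
print("culprit:", removed[-1])
```

As in the log, all three suspects get removed before the start succeeds, and the last one removed, sess_timeout, is reported as the parameter that seemed to cause problems.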
2012-06-05 13:10:12 [2,1235]: lru-random(httperf): Starting test
2012-06-05 13:21:36 [2,684]: siege-test(siege): Starting test
2012-06-05 13:21:55 WARNING [0,18]: siege-test(siege): Panic detected. I think!
2012-06-05 13:21:55 WARNING [0, 0]: siege-test(siege):
Last panic at: Tue, 05 Jun 2012 13:21:41 GMT
Assert error in VCL_recv_method(), ../../include/tbl/vcl_returns.h line 27:
Condition((req->sp) != NULL) not true.
thread = (cache-worker)
ident = Linux,2.6.32-38-generic,x86_64,-smalloc,-smalloc,-hcritbit,epoll
Backtrace:
0x431d88: pan_ic+d8
0x43a3b5: VCL_recv_method+1a5
0x4164c5: cnt_recv+1e5
0x41ae1d: CNT_Session+57d
0x43670d: ses_pool_task+fd
0x433592: Pool_Work_Thread+112
0x441128: wrk_thread_real+c8
0x7f666435f9ca: _end+7f6663cdb202
0x7f66640bccdd: _end+7f6663a38515
sp = 0x7f6655604620 {
fd = 16, id = 16, xid = 540277007,
client = 10.20.100.8 39352,
step = STP_RECV,
handling = deliver,
restarts = 0, esi_level = 0
ws = 0x7f6655571158 {
id = "req",
{s,f,r,e} = {0x7f6655572730,+152,(nil),+59632},
},
http[req] = {
ws = 0x7f6655571158[req]
"GET",
"/",
"HTTP/1.1",
"Host: 10.20.100.12:8080",
"Accept: */*",
"Accept-Encoding: gzip",
"User-Agent: JoeDog/1.00 [en] (X11; I; Siege 2.66)",
"Connection: close",
},
worker = 0x7f665643ec60 {
ws = 0x7f665643ee20 {
id = "wrk",
{s,f,r,e} = {0x7f665643e450,0x7f665643e450,(nil),+2048},
},
},
vcl = {
srcname = {
"input",
"Default",
},
},
},
2012-06-05 13:21:55 [1, 0]: siege-test(siege): Load: 15:21:55 up 5 days, 1:44, 3 users, load average: 0.42, 0.51, 0.55
2012-06-05 13:21:55 [1, 0]: siege-test(siege): Test name: siege-test
2012-06-05 13:21:55 [1, 0]: siege-test(siege): Varnish options:
2012-06-05 13:21:55 [1, 0]: siege-test(siege): Varnish parameters:
2012-06-05 13:21:55 [1, 0]: siege-test(siege): thread_stats_rate=1
2012-06-05 13:21:55 [1, 0]: siege-test(siege): Payload size (excludes headers): 256
2012-06-05 13:21:55 [1, 0]: siege-test(siege): Branch: master
2012-06-05 13:21:55 [1, 0]: siege-test(siege): Number of clients involved: 0
2012-06-05 13:21:55 [1, 0]: siege-test(siege): Type of test: siege
2012-06-05 13:21:55 [1, 0]: siege-test(siege): Test iterations: 1
2012-06-05 13:21:55 [1, 0]: siege-test(siege): Runtime: 15 seconds
2012-06-05 13:21:55 [1, 0]: siege-test(siege): VCL:
backend foo {
.host = "localhost";
.port = "80";
}
sub vcl_fetch {
set beresp.do_stream = true;
}
2012-06-05 13:22:02 [2, 6]: httperf-lru-nostream-default(httperf): Starting test
2012-06-05 13:25:35 WARNING [0,212]: httperf-lru-nostream-default(httperf): Panic detected. I think!
2012-06-05 13:25:35 WARNING [0, 0]: httperf-lru-nostream-default(httperf):
Last panic at: Tue, 05 Jun 2012 13:24:57 GMT
Assert error in VCL_fetch_method(), ../../include/tbl/vcl_returns.h line 56:
Condition((req->sp) != NULL) not true.
thread = (cache-worker)
ident = Linux,2.6.32-38-generic,x86_64,-smalloc,-smalloc,-hcritbit,epoll
Backtrace:
0x431d88: pan_ic+d8
0x439875: VCL_fetch_method+1a5
0x4186a9: cnt_fetch+479
0x41accd: CNT_Session+42d
0x43670d: ses_pool_task+fd
0x433592: Pool_Work_Thread+112
0x441128: wrk_thread_real+c8
0x7f43cb2269ca: _end+7f43caba2202
0x7f43caf83cdd: _end+7f43ca8ff515
sp = 0x7f43b8660c20 {
fd = 14, id = 14, xid = 1445715243,
client = 10.20.100.8 17484,
step = STP_FETCH,
handling = fetch,
err_code = 200, err_reason = (null),
restarts = 0, esi_level = 0
busyobj = 0x7f43b8b61020 {
ws = 0x7f43b8b61070 {
id = "bo",
{s,f,r,e} = {0x7f43b8b62aa0,+512,(nil),+58752},
},
do_stream
should_close
bodystatus = 3 (chunked),
},
http[bereq] = {
ws = 0x7f43b8b61070[bo]
"GET",
"/1/9/6/5/0/1.html",
"HTTP/1.1",
"User-Agent: httperf/0.9.0",
"Host: 10.20.100.12",
"X-Forwarded-For: 10.20.100.8",
"X-Varnish: 1445715243",
"Accept-Encoding: gzip",
},
http[beresp] = {
ws = 0x7f43b8b61070[bo]
"HTTP/1.1",
"200",
"OK",
"Server: nginx/0.7.65",
"Date: Tue, 05 Jun 2012 13:24:57 GMT",
"Content-Type: text/plain",
"Last-Modified: Tue, 05 Jun 2012 13:22:06 GMT",
"Transfer-Encoding: chunked",
"Connection: close",
"Content-Encoding: gzip",
},
ws = 0x7f43b9e36158 {
id = "req",
{s,f,r,e} = {0x7f43b9e37730,+136,(nil),+59632},
},
http[req] = {
ws = 0x7f43b9e36158[req]
"GET",
"/1/9/6/5/0/1.html",
"HTTP/1.1",
"User-Agent: httperf/0.9.0",
"Host: 10.20.100.12",
"X-Forwarded-For: 10.20.100.8",
},
worker = 0x7f43bb2aec60 {
ws = 0x7f43bb2aee20 {
id = "wrk",
{s,f,r,e} = {0x7f43bb2ae450,0x7f43bb2ae450,(nil),+2048},
},
},
vcl = {
srcname = {
"input",
"Default",
},
},
},
2012-06-05 13:25:35 WARNING [0, 0]: httperf-lru-nostream-default(httperf): Varnishstat uptime and measured run-time is too large (measured: 208 stat: 37 diff: 171). Did we crash?
2012-06-05 13:25:35 WARNING [0, 0]: httperf-lru-nostream-default(httperf): Out of bounds: n_lru_nuked(0) less than lower boundary 80000
2012-06-05 13:25:35 WARNING [0, 0]: httperf-lru-nostream-default(httperf): Out of bounds: client_req(29820) less than lower boundary 1989920
2012-06-05 13:25:35 [1, 0]: httperf-lru-nostream-default(httperf): Load: 15:25:35 up 5 days, 1:48, 3 users, load average: 0.64, 0.77, 0.65
2012-06-05 13:25:35 [1, 0]: httperf-lru-nostream-default(httperf): Test name: httperf-lru-nostream-default
2012-06-05 13:25:35 [1, 0]: httperf-lru-nostream-default(httperf): Varnish options:
2012-06-05 13:25:35 [1, 0]: httperf-lru-nostream-default(httperf): -t=3600
2012-06-05 13:25:35 [1, 0]: httperf-lru-nostream-default(httperf): -s=malloc,30M
2012-06-05 13:25:35 [1, 0]: httperf-lru-nostream-default(httperf): Varnish parameters:
2012-06-05 13:25:35 [1, 0]: httperf-lru-nostream-default(httperf): thread_stats_rate=1
2012-06-05 13:25:35 [1, 0]: httperf-lru-nostream-default(httperf): thread_pool_max=5000
2012-06-05 13:25:35 [1, 0]: httperf-lru-nostream-default(httperf): nuke_limit=250
2012-06-05 13:25:35 [1, 0]: httperf-lru-nostream-default(httperf): thread_pool_min=200
2012-06-05 13:25:35 [1, 0]: httperf-lru-nostream-default(httperf): Payload size (excludes headers): 10K
2012-06-05 13:25:35 [1, 0]: httperf-lru-nostream-default(httperf): Branch: master
2012-06-05 13:25:35 [1, 0]: httperf-lru-nostream-default(httperf): Number of clients involved: 24
2012-06-05 13:25:35 [1, 0]: httperf-lru-nostream-default(httperf): Type of test: httperf
2012-06-05 13:25:35 [1, 0]: httperf-lru-nostream-default(httperf): Test iterations: 1
2012-06-05 13:25:35 [1, 0]: httperf-lru-nostream-default(httperf): Runtime: 208 seconds
2012-06-05 13:25:35 [1, 0]: httperf-lru-nostream-default(httperf): VCL:
backend foo {
.host = "localhost";
.port = "80";
}
sub vcl_fetch {
set beresp.do_stream = false;
}
2012-06-05 13:25:35 [1, 0]: httperf-lru-nostream-default(httperf): Number of total connections: 200000
2012-06-05 13:25:35 [1, 0]: httperf-lru-nostream-default(httperf): Note: connections are subject to rounding when divided among clients. Expect slight deviations.
2012-06-05 13:25:35 [1, 0]: httperf-lru-nostream-default(httperf): Requests per connection: 10
2012-06-05 13:25:35 [1, 0]: httperf-lru-nostream-default(httperf): Extra options to httperf: --wset=1000000,0.1
2012-06-05 13:25:35 [1, 0]: httperf-lru-nostream-default(httperf): Httperf command (last client): httperf --hog --timeout 60 --num-calls 10 --num-conns 8333 --port 8080 --burst-length 10 --client 23/24 --server 10.20.100.12 --wset=1000000,0.1
2012-06-05 13:25:42 [2, 6]: httperf-rapid-expire(httperf): Starting test
2012-06-05 13:26:52 WARNING [0,70]: httperf-rapid-expire(httperf): Panic detected. I think!
2012-06-05 13:26:52 WARNING [0, 0]: httperf-rapid-expire(httperf):
Last panic at: Tue, 05 Jun 2012 13:26:18 GMT
Assert error in VCL_recv_method(), ../../include/tbl/vcl_returns.h line 27:
Condition((req->sp) != NULL) not true.
thread = (cache-worker)
ident = Linux,2.6.32-38-generic,x86_64,-smalloc,-smalloc,-hcritbit,epoll
Backtrace:
0x431d88: pan_ic+d8
0x43a3b5: VCL_recv_method+1a5
0x4164c5: cnt_recv+1e5
0x41ae1d: CNT_Session+57d
0x43670d: ses_pool_task+fd
0x433592: Pool_Work_Thread+112
0x441128: wrk_thread_real+c8
0x7fc9dd3819ca: _end+7fc9dccfd202
0x7fc9dd0decdd: _end+7fc9dca5a515
sp = 0x7fc9ce005320 {
fd = 14, id = 14, xid = 1816702906,
client = 10.20.100.8 7820,
step = STP_RECV,
handling = deliver,
restarts = 0, esi_level = 0
ws = 0x7fc9ce30e158 {
id = "req",
{s,f,r,e} = {0x7fc9ce30f730,+224,(nil),+59632},
},
http[req] = {
ws = 0x7fc9ce30e158[req]
"GET",
"/4/7.html",
"HTTP/1.1",
"User-Agent: httperf/0.9.0",
"Host: 10.20.100.12",
},
worker = 0x7fc9ce7d8c60 {
ws = 0x7fc9ce7d8e20 {
id = "wrk",
{s,f,r,e} = {0x7fc9ce7d8450,0x7fc9ce7d8450,(nil),+2048},
},
},
vcl = {
srcname = {
"input",
"Default",
},
},
},
2012-06-05 13:26:52 WARNING [0, 0]: httperf-rapid-expire(httperf): Varnishstat uptime and measured run-time is too large (measured: 66 stat: 33 diff: 33). Did we crash?
2012-06-05 13:26:52 WARNING [0, 0]: httperf-rapid-expire(httperf): Out of bounds: client_req(132960) less than lower boundary 989840
2012-06-05 13:26:53 [1, 0]: httperf-rapid-expire(httperf): Load: 15:26:53 up 5 days, 1:49, 3 users, load average: 0.95, 0.85, 0.68
2012-06-05 13:26:53 [1, 0]: httperf-rapid-expire(httperf): Test name: httperf-rapid-expire
2012-06-05 13:26:53 [1, 0]: httperf-rapid-expire(httperf): Varnish options:
2012-06-05 13:26:53 [1, 0]: httperf-rapid-expire(httperf): -t=2
2012-06-05 13:26:53 [1, 0]: httperf-rapid-expire(httperf): Varnish parameters:
2012-06-05 13:26:53 [1, 0]: httperf-rapid-expire(httperf): thread_stats_rate=1
2012-06-05 13:26:53 [1, 0]: httperf-rapid-expire(httperf): Payload size (excludes headers): 256
2012-06-05 13:26:53 [1, 0]: httperf-rapid-expire(httperf): Branch: master
2012-06-05 13:26:53 [1, 0]: httperf-rapid-expire(httperf): Number of clients involved: 24
2012-06-05 13:26:53 [1, 0]: httperf-rapid-expire(httperf): Type of test: httperf
2012-06-05 13:26:53 [1, 0]: httperf-rapid-expire(httperf): Test iterations: 1
2012-06-05 13:26:53 [1, 0]: httperf-rapid-expire(httperf): Runtime: 66 seconds
2012-06-05 13:26:53 [1, 0]: httperf-rapid-expire(httperf): VCL:
backend foo {
.host = "localhost";
.port = "80";
}
sub vcl_fetch {
set beresp.do_stream = true;
}
2012-06-05 13:26:53 [1, 0]: httperf-rapid-expire(httperf): Number of total connections: 100000
2012-06-05 13:26:53 [1, 0]: httperf-rapid-expire(httperf): Note: connections are subject to rounding when divided among clients. Expect slight deviations.
2012-06-05 13:26:53 [1, 0]: httperf-rapid-expire(httperf): Requests per connection: 10
2012-06-05 13:26:53 [1, 0]: httperf-rapid-expire(httperf): Extra options to httperf: --wset=100,0.30
2012-06-05 13:26:53 [1, 0]: httperf-rapid-expire(httperf): Httperf command (last client): httperf --hog --timeout 60 --num-calls 10 --num-conns 4166 --port 8080 --burst-length 10 --client 23/24 --server 10.20.100.12 --wset=100,0.30
2012-06-05 13:26:59 [2, 6]: streaming-grace(httperf): Starting test
2012-06-05 13:30:01 WARNING [0,181]: streaming-grace(httperf): Panic detected. I think!
2012-06-05 13:30:01 WARNING [0, 0]: streaming-grace(httperf):
Last panic at: Tue, 05 Jun 2012 13:28:50 GMT
Assert error in cnt_first(), cache/cache_center.c line 943:
Condition(req->sp == sp) not true.
thread = (cache-worker)
ident = Linux,2.6.32-38-generic,x86_64,-smalloc,-smalloc,-hcritbit,epoll
Backtrace:
0x431d88: pan_ic+d8
0x41b021: CNT_Session+781
0x43670d: ses_pool_task+fd
0x433592: Pool_Work_Thread+112
0x441128: wrk_thread_real+c8
0x7f760f6e99ca: _end+7f760f065202
0x7f760f446cdd: _end+7f760edc2515
sp = 0x7f7605412b20 {
fd = 1070, id = 1070, xid = 0,
client = ,
step = STP_FIRST,
handling = deliver,
restarts = 0, esi_level = 0
ws = 0x7f75f3f13158 {
id = "req",
{s,f,r,e} = {0x7f75f3f14730,0x7f75f3f14730,(nil),+59632},
},
http[req] = {
ws = (nil)[]
},
worker = 0x7f76031fac60 {
ws = 0x7f76031fae20 {
id = "wrk",
{s,f,r,e} = {0x7f76031fa450,0x7f76031fa450,(nil),+2048},
},
},
},
2012-06-05 13:30:01 WARNING [0, 0]: streaming-grace(httperf): Varnishstat uptime and measured run-time is too large (measured: 177 stat: 70 diff: 107). Did we crash?
2012-06-05 13:30:01 [1, 0]: streaming-grace(httperf): Load: 15:30:01 up 5 days, 1:52, 3 users, load average: 0.58, 0.75, 0.67
2012-06-05 13:30:01 [1, 0]: streaming-grace(httperf): Test name: streaming-grace
2012-06-05 13:30:01 [1, 0]: streaming-grace(httperf): Varnish options:
2012-06-05 13:30:01 [1, 0]: streaming-grace(httperf): -t=1
2012-06-05 13:30:01 [1, 0]: streaming-grace(httperf): Varnish parameters:
2012-06-05 13:30:01 [1, 0]: streaming-grace(httperf): thread_stats_rate=1
2012-06-05 13:30:01 [1, 0]: streaming-grace(httperf): thread_pool_add_delay=1
2012-06-05 13:30:01 [1, 0]: streaming-grace(httperf): http_gzip_support=off
2012-06-05 13:30:01 [1, 0]: streaming-grace(httperf): default_grace=10
2012-06-05 13:30:01 [1, 0]: streaming-grace(httperf): Payload size (excludes headers): 1M
2012-06-05 13:30:01 [1, 0]: streaming-grace(httperf): Branch: master
2012-06-05 13:30:01 [1, 0]: streaming-grace(httperf): Number of clients involved: 24
2012-06-05 13:30:01 [1, 0]: streaming-grace(httperf): Type of test: httperf
2012-06-05 13:30:01 [1, 0]: streaming-grace(httperf): Test iterations: 1
2012-06-05 13:30:01 [1, 0]: streaming-grace(httperf): Runtime: 177 seconds
2012-06-05 13:30:01 [1, 0]: streaming-grace(httperf): VCL:
backend foo {
.host = "localhost";
.port = "80";
.connect_timeout = 10s;
}
sub vcl_recv {
set req.grace = 15s;
}
sub vcl_fetch {
set beresp.do_stream = true;
set beresp.grace = 10s;
set beresp.ttl = 15s;
}
sub vcl_deliver {
set resp.http.x-fryer = "some test";
}
2012-06-05 13:30:01 [1, 0]: streaming-grace(httperf): Number of total connections: 10000
2012-06-05 13:30:01 [1, 0]: streaming-grace(httperf): Note: connections are subject to rounding when divided among clients. Expect slight deviations.
2012-06-05 13:30:01 [1, 0]: streaming-grace(httperf): Requests per connection: 1
2012-06-05 13:30:01 [1, 0]: streaming-grace(httperf): Extra options to httperf: --wset=1000,0.1 --rate 3
2012-06-05 13:30:01 [1, 0]: streaming-grace(httperf): Httperf command (last client): httperf --hog --timeout 60 --num-calls 1 --num-conns 416 --port 8080 --burst-length 1 --client 23/24 --server 10.20.100.12 --wset=1000,0.1 --rate 3
2012-06-05 13:30:08 [2, 6]: cold-default(httperf): Starting test
2012-06-05 13:33:22 WARNING [0,193]: cold-default(httperf): Panic detected. I think!
2012-06-05 13:33:22 WARNING [0, 0]: cold-default(httperf):
Last panic at: Tue, 05 Jun 2012 13:31:36 GMT
Assert error in cnt_first(), cache/cache_center.c line 943:
Condition(req->sp == sp) not true.
thread = (cache-worker)
ident = Linux,2.6.32-38-generic,x86_64,-smalloc,-smalloc,-hcritbit,epoll
Backtrace:
0x431d88: pan_ic+d8
0x41b021: CNT_Session+781
0x43670d: ses_pool_task+fd
0x433592: Pool_Work_Thread+112
0x441128: wrk_thread_real+c8
0x7fd27d2eb9ca: _end+7fd27cc67202
0x7fd27d048cdd: _end+7fd27c9c4515
sp = 0x7fd26d803b20 {
fd = 31, id = 31, xid = 0,
client = ,
step = STP_FIRST,
handling = deliver,
restarts = 0, esi_level = 0
ws = 0x7fd26e40b158 {
id = "req",
{s,f,r,e} = {0x7fd26e40c730,0x7fd26e40c730,(nil),+59632},
},
http[req] = {
ws = (nil)[]
},
worker = 0x7fd26dfe6c60 {
ws = 0x7fd26dfe6e20 {
id = "wrk",
{s,f,r,e} = {0x7fd26dfe6450,0x7fd26dfe6450,(nil),+2048},
},
},
},
2012-06-05 13:33:22 WARNING [0, 0]: cold-default(httperf): Varnishstat uptime and measured run-time is too large (measured: 190 stat: 105 diff: 85). Did we crash?
2012-06-05 13:36:43 WARNING [0,201]: cold-default(httperf): Panic detected. I think!
2012-06-05 13:36:43 WARNING [0, 0]: cold-default(httperf):
Last panic at: Tue, 05 Jun 2012 13:35:09 GMT
Assert error in cnt_first(), cache/cache_center.c line 943:
Condition(req->sp == sp) not true.
thread = (cache-worker)
ident = Linux,2.6.32-38-generic,x86_64,-smalloc,-smalloc,-hcritbit,epoll
Backtrace:
0x431d88: pan_ic+d8
0x41b021: CNT_Session+781
0x43670d: ses_pool_task+fd
0x433592: Pool_Work_Thread+112
0x441128: wrk_thread_real+c8
0x7fd27d2eb9ca: _end+7fd27cc67202
0x7fd27d048cdd: _end+7fd27c9c4515
sp = 0x7fd26e3cc720 {
fd = 27, id = 27, xid = 0,
client = ,
step = STP_FIRST,
handling = deliver,
restarts = 0, esi_level = 0
ws = 0x7fd26e5bf158 {
id = "req",
{s,f,r,e} = {0x7fd26e5c0730,0x7fd26e5c0730,(nil),+59632},
},
http[req] = {
ws = (nil)[]
},
worker = 0x7fd26e638c60 {
ws = 0x7fd26e638e20 {
id = "wrk",
{s,f,r,e} = {0x7fd26e638450,0x7fd26e638450,(nil),+2048},
},
},
},
2012-06-05 13:36:43 WARNING [0, 0]: cold-default(httperf): Varnishstat uptime and measured run-time is too large (measured: 391 stat: 93 diff: 298). Did we crash?
2012-06-05 13:36:44 WARNING [0, 0]: cold-default(httperf): Out of bounds: uptime(93) less than lower boundary 100
2012-06-05 13:36:44 WARNING [0, 0]: cold-default(httperf): Out of bounds: client_req(184120) less than lower boundary 1589840
2012-06-05 13:36:44 [1, 0]: cold-default(httperf): Load: 15:36:44 up 5 days, 1:59, 3 users, load average: 0.62, 0.76, 0.70
2012-06-05 13:36:44 [1, 0]: cold-default(httperf): Test name: cold-default
2012-06-05 13:36:44 [1, 0]: cold-default(httperf): Varnish options:
2012-06-05 13:36:44 [1, 0]: cold-default(httperf): -t=3600
2012-06-05 13:36:44 [1, 0]: cold-default(httperf): -s=malloc,10G
2012-06-05 13:36:44 [1, 0]: cold-default(httperf): Varnish parameters:
2012-06-05 13:36:44 [1, 0]: cold-default(httperf): thread_stats_rate=1
2012-06-05 13:36:44 [1, 0]: cold-default(httperf): Payload size (excludes headers): 256
2012-06-05 13:36:44 [1, 0]: cold-default(httperf): Branch: master
2012-06-05 13:36:44 [1, 0]: cold-default(httperf): Number of clients involved: 24
2012-06-05 13:36:44 [1, 0]: cold-default(httperf): Type of test: httperf
2012-06-05 13:36:44 [1, 0]: cold-default(httperf): Test iterations: 2
2012-06-05 13:36:44 [1, 0]: cold-default(httperf): Runtime: 391 seconds
2012-06-05 13:36:44 [1, 0]: cold-default(httperf): VCL:
backend foo {
.host = "localhost";
.port = "80";
}
sub vcl_fetch {
set beresp.do_stream = true;
}
2012-06-05 13:36:44 [1, 0]: cold-default(httperf): Number of total connections: 80000
2012-06-05 13:36:44 [1, 0]: cold-default(httperf): Note: connections are subject to rounding when divided among clients. Expect slight deviations.
2012-06-05 13:36:44 [1, 0]: cold-default(httperf): Requests per connection: 10
2012-06-05 13:36:44 [1, 0]: cold-default(httperf): Extra options to httperf: --wset=4000000,0.50
2012-06-05 13:36:44 [1, 0]: cold-default(httperf): Httperf command (last client): httperf --hog --timeout 60 --num-calls 10 --num-conns 3333 --port 8080 --burst-length 10 --client 23/24 --server 10.20.100.12 --wset=4000000,0.50
2012-06-05 13:36:51 [2, 6]: 4gpluss-nogzip(httperf): Starting test
2012-06-05 13:36:54 WARNING [0, 2]: Varnish failed to start. Fallback attempts starting
2012-06-05 13:36:54 [1, 0]: Removed (hopefully incompatible) parameter to try to start varnish. Key: thread_stats_rate=1
2012-06-05 13:36:54 [1, 0]: Removed (hopefully incompatible) parameter to try to start varnish. Key: thread_pool_max=300
2012-06-05 13:36:55 [1, 0]: Removed (hopefully incompatible) parameter to try to start varnish. Key: sess_timeout=60000s
2012-06-05 13:36:55 [1, 0]: Fallback worked. Parameter that seemed to cause problems: sess_timeout
2012-06-05 13:58:13 [2,1278]: purge-fail(httperf): Starting test
2012-06-05 14:01:53 WARNING [0,219]: purge-fail(httperf): Panic detected. I think!
2012-06-05 14:01:53 WARNING [0, 0]: purge-fail(httperf):
Last panic at: Tue, 05 Jun 2012 14:01:20 GMT
Assert error in VCL_recv_method(), ../../include/tbl/vcl_returns.h line 27:
Condition((req->sp) != NULL) not true.
thread = (cache-worker)
ident = Linux,2.6.32-38-generic,x86_64,-smalloc,-smalloc,-hcritbit,epoll
Backtrace:
0x431d88: pan_ic+d8
0x43a3b5: VCL_recv_method+1a5
0x4164c5: cnt_recv+1e5
0x41ae1d: CNT_Session+57d
0x43670d: ses_pool_task+fd
0x433592: Pool_Work_Thread+112
0x441128: wrk_thread_real+c8
0x7f5d62d4d9ca: _end+7f5d626c9202
0x7f5d62aaacdd: _end+7f5d62426515
sp = 0x7f5d53c04720 {
fd = 18, id = 18, xid = 1095460786,
client = 10.20.100.7 25417,
step = STP_RECV,
handling = deliver,
restarts = 0, esi_level = 0
ws = 0x7f5d46279158 {
id = "req",
{s,f,r,e} = {0x7f5d4627a730,+80,(nil),+59632},
},
http[req] = {
ws = 0x7f5d46279158[req]
"GET",
"/9/6/8.html",
"HTTP/1.1",
"User-Agent: httperf/0.9.0",
"Host: 10.20.100.12",
},
worker = 0x7f5d54114c60 {
ws = 0x7f5d54114e20 {
id = "wrk",
{s,f,r,e} = {0x7f5d54114450,0x7f5d54114450,(nil),+2048},
},
},
vcl = {
srcname = {
"input",
"Default",
},
},
},
2012-06-05 14:01:53 WARNING [0, 0]: purge-fail(httperf): Varnishstat uptime and measured run-time is too large (measured: 214 stat: 32 diff: 182). Did we crash?
2012-06-05 14:01:53 WARNING [0, 0]: purge-fail(httperf): Out of bounds: client_req(12955) less than lower boundary 290000
2012-06-05 14:01:53 [1, 0]: purge-fail(httperf): Load: 16:01:53 up 5 days, 2:24, 3 users, load average: 0.51, 0.65, 0.99
2012-06-05 14:01:53 [1, 0]: purge-fail(httperf): Test name: purge-fail
2012-06-05 14:01:53 [1, 0]: purge-fail(httperf): Varnish options:
2012-06-05 14:01:53 [1, 0]: purge-fail(httperf): Varnish parameters:
2012-06-05 14:01:53 [1, 0]: purge-fail(httperf): thread_stats_rate=1
2012-06-05 14:01:53 [1, 0]: purge-fail(httperf): Payload size (excludes headers): 1K
2012-06-05 14:01:53 [1, 0]: purge-fail(httperf): Branch: master
2012-06-05 14:01:53 [1, 0]: purge-fail(httperf): Number of clients involved: 24
2012-06-05 14:01:53 [1, 0]: purge-fail(httperf): Type of test: httperf
2012-06-05 14:01:53 [1, 0]: purge-fail(httperf): Test iterations: 1
2012-06-05 14:01:53 [1, 0]: purge-fail(httperf): Runtime: 214 seconds
2012-06-05 14:01:53 [1, 0]: purge-fail(httperf): VCL:
backend foo {
.host = "localhost";
.port = "80";
}
sub vcl_recv {
if (!req.url ~ "/0/0.html") {
set req.request = "PURGE";
}
set req.url = "/foo";
return (lookup);
}
sub vcl_hit {
if (req.request == "PURGE") {
set obj.ttl = 0s;
error 200 "OK";
}
}
sub vcl_miss {
if (req.request == "PURGE") {
error 200 "Not in cache but not confusing httperf";
}
}
2012-06-05 14:01:53 [1, 0]: purge-fail(httperf): Number of total connections: 300000
2012-06-05 14:01:53 [1, 0]: purge-fail(httperf): Note: connections are subject to rounding when divided among clients. Expect slight deviations.
2012-06-05 14:01:53 [1, 0]: purge-fail(httperf): Requests per connection: 1
2012-06-05 14:01:53 [1, 0]: purge-fail(httperf): Extra options to httperf: --wset=999,0.5 --timeout=5
2012-06-05 14:01:53 [1, 0]: purge-fail(httperf): Httperf command (last client): httperf --hog --timeout 60 --num-calls 1 --num-conns 12500 --port 8080 --burst-length 10 --client 23/24 --server 10.20.100.12 --wset=999,0.5 --timeout=5
2012-06-05 14:02:00 [2, 6]: streaming-gzip(httperf): Starting test
2012-06-05 14:04:34 WARNING [0,154]: Tests finished with problems detected. Failed expectations: 16 Total run time: 9752 seconds