[Fryer] master FAIL. 10 of 25 tests succeeded.
fryer at oneiros.varnish-software.com
Wed Jun 13 00:15:37 CEST 2012
Tests Failed: httperf-lru-nostream-gzip httperf-lru-nostream-gzip-deflateoff httperf-lru-default httperf-lru-stream-default httperf-hot httperf-lru-nostream-nogzip cold-gzip httperf-lru-stream-gzip httperf-lru-stream-nogzip cold-nogzip siege-test httperf-lru-nostream-default httperf-rapid-expire cold-default purge-fail
Tests OK: streaming memleak 4gpluss-stream 4gpluss basic-fryer 4gpluss-nostream lru-random streaming-grace 4gpluss-nogzip streaming-gzip
2012-06-12 19:32:16 [1,14]: Server pantoum checked out varnish-3.0.0-beta2-1015-g6094581 of branch master
2012-06-12 19:32:40 [2,24]: httperf-lru-nostream-gzip(httperf): Starting test
2012-06-12 19:35:36 WARNING [0,176]: httperf-lru-nostream-gzip(httperf): Panic detected. I think!
2012-06-12 19:35:36 WARNING [0, 0]: httperf-lru-nostream-gzip(httperf):
Last panic at: Tue, 12 Jun 2012 19:35:10 GMT
Assert error in cnt_fetch(), cache/cache_center.c line 639:
Condition(req->sp == sp) not true.
thread = (cache-worker)
ident = Linux,2.6.32-38-generic,x86_64,-smalloc,-smalloc,-hcritbit,epoll
Backtrace:
0x432138: pan_ic+d8
0x4188dc: cnt_fetch+49c
0x41b03d: CNT_Session+49d
0x436b2d: ses_pool_task+fd
0x433942: Pool_Work_Thread+112
0x4417a8: wrk_thread_real+c8
0x7f918d0539ca: _end+7f918c9cf202
0x7f918cdb0cdd: _end+7f918c72c515
sp = 0x7f917db02c20 {
fd = 24, id = 24, xid = 374158426,
client = 10.20.100.9 17182,
step = STP_FETCH,
handling = fetch,
err_code = 200, err_reason = (null),
restarts = 0, esi_level = 0
busyobj = 0x7f917de8a020 {
ws = 0x7f917de8a070 {
id = "bo",
{s,f,r,e} = {0x7f917de8baa0,+512,(nil),+58752},
},
do_stream
bodystatus = 3 (chunked),
},
http[bereq] = {
ws = 0x7f917de8a070[bo]
"GET",
"/1/9/4/0/4/2.html",
"HTTP/1.1",
"User-Agent: httperf/0.9.0",
"Host: 10.20.100.12",
"X-Forwarded-For: 10.20.100.9",
"X-Varnish: 374158426",
"Accept-Encoding: gzip",
},
http[beresp] = {
ws = 0x7f917de8a070[bo]
"HTTP/1.1",
"200",
"OK",
"Server: nginx/0.7.65",
"Date: Tue, 12 Jun 2012 19:35:10 GMT",
"Content-Type: text/plain",
"Last-Modified: Tue, 12 Jun 2012 19:32:43 GMT",
"Transfer-Encoding: chunked",
"Connection: keep-alive",
"Content-Encoding: gzip",
},
ws = 0x7f917dc68158 {
id = "req",
{s,f,r,e} = {0x7f917dc69730,+136,(nil),+59632},
},
http[req] = {
ws = 0x7f917dc68158[req]
"GET",
"/1/9/4/0/4/2.html",
"HTTP/1.1",
"User-Agent: httperf/0.9.0",
"Host: 10.20.100.12",
"X-Forwarded-For: 10.20.100.9",
},
worker = 0x7f917d0f6c60 {
ws = 0x7f917d0f6e20 {
id = "wrk",
{s,f,r,e} = {0x7f917d0f6450,0x7f917d0f6450,(nil),+2048},
},
},
vcl = {
srcname = {
"input",
"Default",
},
},
},
2012-06-12 19:35:36 WARNING [0, 0]: httperf-lru-nostream-gzip(httperf): Varnishstat uptime and measured run-time is too large (measured: 172 stat: 25 diff: 147). Did we crash?
2012-06-12 19:35:36 WARNING [0, 0]: httperf-lru-nostream-gzip(httperf): Out of bounds: n_lru_nuked(0) less than lower boundary 80000
2012-06-12 19:35:36 WARNING [0, 0]: httperf-lru-nostream-gzip(httperf): Out of bounds: client_req(9810) less than lower boundary 1989920
2012-06-12 19:35:37 [1, 0]: httperf-lru-nostream-gzip(httperf): Load: 21:35:37 up 12 days, 7:58, 0 users, load average: 5.78, 4.26, 1.71
2012-06-12 19:35:37 [1, 0]: httperf-lru-nostream-gzip(httperf): Test name: httperf-lru-nostream-gzip
2012-06-12 19:35:37 [1, 0]: httperf-lru-nostream-gzip(httperf): Varnish options:
2012-06-12 19:35:37 [1, 0]: httperf-lru-nostream-gzip(httperf): -t=3600
2012-06-12 19:35:37 [1, 0]: httperf-lru-nostream-gzip(httperf): -s=malloc,30M
2012-06-12 19:35:37 [1, 0]: httperf-lru-nostream-gzip(httperf): Varnish parameters:
2012-06-12 19:35:37 [1, 0]: httperf-lru-nostream-gzip(httperf): thread_stats_rate=1
2012-06-12 19:35:37 [1, 0]: httperf-lru-nostream-gzip(httperf): thread_pool_max=5000
2012-06-12 19:35:37 [1, 0]: httperf-lru-nostream-gzip(httperf): nuke_limit=250
2012-06-12 19:35:37 [1, 0]: httperf-lru-nostream-gzip(httperf): http_gzip_support=on
2012-06-12 19:35:37 [1, 0]: httperf-lru-nostream-gzip(httperf): thread_pool_min=200
2012-06-12 19:35:37 [1, 0]: httperf-lru-nostream-gzip(httperf): Payload size (excludes headers): 10K
2012-06-12 19:35:37 [1, 0]: httperf-lru-nostream-gzip(httperf): Branch: master
2012-06-12 19:35:37 [1, 0]: httperf-lru-nostream-gzip(httperf): Number of clients involved: 24
2012-06-12 19:35:37 [1, 0]: httperf-lru-nostream-gzip(httperf): Type of test: httperf
2012-06-12 19:35:37 [1, 0]: httperf-lru-nostream-gzip(httperf): Test iterations: 1
2012-06-12 19:35:37 [1, 0]: httperf-lru-nostream-gzip(httperf): Runtime: 172 seconds
2012-06-12 19:35:37 [1, 0]: httperf-lru-nostream-gzip(httperf): VCL:
backend foo {
.host = "localhost";
.port = "80";
}
sub vcl_fetch {
set beresp.do_stream = false;
}
2012-06-12 19:35:37 [1, 0]: httperf-lru-nostream-gzip(httperf): Number of total connections: 200000
2012-06-12 19:35:37 [1, 0]: httperf-lru-nostream-gzip(httperf): Note: connections are subject to rounding when divided among clients. Expect slight deviations.
2012-06-12 19:35:37 [1, 0]: httperf-lru-nostream-gzip(httperf): Requests per connection: 10
2012-06-12 19:35:37 [1, 0]: httperf-lru-nostream-gzip(httperf): Extra options to httperf: --wset=1000000,0.1
2012-06-12 19:35:37 [1, 0]: httperf-lru-nostream-gzip(httperf): Httperf command (last client): httperf --hog --timeout 60 --num-calls 10 --num-conns 8333 --port 8080 --burst-length 10 --client 23/24 --server 10.20.100.12 --wset=1000000,0.1
2012-06-12 19:35:43 [2, 6]: httperf-lru-nostream-gzip-deflateoff(httperf): Starting test
2012-06-12 19:39:05 WARNING [0,201]: httperf-lru-nostream-gzip-deflateoff(httperf): Panic detected. I think!
2012-06-12 19:39:05 WARNING [0, 0]: httperf-lru-nostream-gzip-deflateoff(httperf):
Last panic at: Tue, 12 Jun 2012 19:37:53 GMT
Assert error in CNT_Session(), cache/cache_center.c line 1639:
Condition((sp->req->sp) != 0) not true.
thread = (cache-worker)
ident = Linux,2.6.32-38-generic,x86_64,-smalloc,-smalloc,-hcritbit,epoll
Backtrace:
0x432138: pan_ic+d8
0x41b7ed: CNT_Session+c4d
0x436b2d: ses_pool_task+fd
0x433942: Pool_Work_Thread+112
0x4417a8: wrk_thread_real+c8
0x7f3ba315c9ca: _end+7f3ba2ad8202
0x7f3ba2eb9cdd: _end+7f3ba2835515
sp = 0x7f3b9511d920 {
fd = 14, id = 14, xid = 0,
client = 10.20.100.8 15055,
step = STP_WAIT,
handling = deliver,
restarts = 0, esi_level = 0
ws = 0x7f3b9330b158 {
id = "req",
{s,f,r,e} = {0x7f3b9330c730,0x7f3b9330c730,+32768,+59632},
},
http[req] = {
ws = (nil)[]
},
worker = 0x7f3b93612c60 {
ws = 0x7f3b93612e20 {
id = "wrk",
{s,f,r,e} = {0x7f3b93612450,0x7f3b93612450,(nil),+2048},
},
},
},
2012-06-12 19:39:05 WARNING [0, 0]: httperf-lru-nostream-gzip-deflateoff(httperf): Varnishstat uptime and measured run-time is too large (measured: 197 stat: 71 diff: 126). Did we crash?
2012-06-12 19:39:05 WARNING [0, 0]: httperf-lru-nostream-gzip-deflateoff(httperf): Out of bounds: n_lru_nuked(0) less than lower boundary 80000
2012-06-12 19:39:05 WARNING [0, 0]: httperf-lru-nostream-gzip-deflateoff(httperf): Out of bounds: client_req(145730) less than lower boundary 1989920
2012-06-12 19:39:06 [1, 0]: httperf-lru-nostream-gzip-deflateoff(httperf): Load: 21:39:06 up 12 days, 8:02, 0 users, load average: 0.84, 2.65, 1.59
2012-06-12 19:39:06 [1, 0]: httperf-lru-nostream-gzip-deflateoff(httperf): Test name: httperf-lru-nostream-gzip-deflateoff
2012-06-12 19:39:06 [1, 0]: httperf-lru-nostream-gzip-deflateoff(httperf): Varnish options:
2012-06-12 19:39:06 [1, 0]: httperf-lru-nostream-gzip-deflateoff(httperf): -t=3600
2012-06-12 19:39:06 [1, 0]: httperf-lru-nostream-gzip-deflateoff(httperf): -s=malloc,30M
2012-06-12 19:39:06 [1, 0]: httperf-lru-nostream-gzip-deflateoff(httperf): Varnish parameters:
2012-06-12 19:39:06 [1, 0]: httperf-lru-nostream-gzip-deflateoff(httperf): thread_stats_rate=1
2012-06-12 19:39:06 [1, 0]: httperf-lru-nostream-gzip-deflateoff(httperf): thread_pool_max=5000
2012-06-12 19:39:06 [1, 0]: httperf-lru-nostream-gzip-deflateoff(httperf): nuke_limit=250
2012-06-12 19:39:06 [1, 0]: httperf-lru-nostream-gzip-deflateoff(httperf): http_gzip_support=on
2012-06-12 19:39:06 [1, 0]: httperf-lru-nostream-gzip-deflateoff(httperf): thread_pool_min=200
2012-06-12 19:39:06 [1, 0]: httperf-lru-nostream-gzip-deflateoff(httperf): Payload size (excludes headers): 10K
2012-06-12 19:39:06 [1, 0]: httperf-lru-nostream-gzip-deflateoff(httperf): Branch: master
2012-06-12 19:39:06 [1, 0]: httperf-lru-nostream-gzip-deflateoff(httperf): Number of clients involved: 24
2012-06-12 19:39:06 [1, 0]: httperf-lru-nostream-gzip-deflateoff(httperf): Type of test: httperf
2012-06-12 19:39:06 [1, 0]: httperf-lru-nostream-gzip-deflateoff(httperf): Test iterations: 1
2012-06-12 19:39:06 [1, 0]: httperf-lru-nostream-gzip-deflateoff(httperf): Runtime: 197 seconds
2012-06-12 19:39:06 [1, 0]: httperf-lru-nostream-gzip-deflateoff(httperf): VCL:
backend foo {
.host = "localhost";
.port = "80";
}
sub vcl_fetch {
set beresp.do_stream = false;
set beresp.do_gzip = true;
}
2012-06-12 19:39:06 [1, 0]: httperf-lru-nostream-gzip-deflateoff(httperf): Number of total connections: 200000
2012-06-12 19:39:06 [1, 0]: httperf-lru-nostream-gzip-deflateoff(httperf): Note: connections are subject to rounding when divided among clients. Expect slight deviations.
2012-06-12 19:39:06 [1, 0]: httperf-lru-nostream-gzip-deflateoff(httperf): Requests per connection: 10
2012-06-12 19:39:06 [1, 0]: httperf-lru-nostream-gzip-deflateoff(httperf): Extra options to httperf: --wset=1000000,0.1
2012-06-12 19:39:06 [1, 0]: httperf-lru-nostream-gzip-deflateoff(httperf): Httperf command (last client): httperf --hog --timeout 60 --num-calls 10 --num-conns 8333 --port 8080 --burst-length 10 --client 23/24 --server 10.20.100.12 --wset=1000000,0.1
2012-06-12 19:39:12 [2, 6]: streaming(httperf): Starting test
2012-06-12 19:42:07 [2,174]: httperf-lru-default(httperf): Starting test
2012-06-12 19:45:29 WARNING [0,202]: httperf-lru-default(httperf): Panic detected. I think!
2012-06-12 19:45:29 WARNING [0, 0]: httperf-lru-default(httperf):
Last panic at: Tue, 12 Jun 2012 19:44:34 GMT
Assert error in cnt_first(), cache/cache_center.c line 949:
Condition(req->sp == sp) not true.
thread = (cache-worker)
ident = Linux,2.6.32-38-generic,x86_64,-smalloc,-smalloc,-hcritbit,epoll
Backtrace:
0x432138: pan_ic+d8
0x41b391: CNT_Session+7f1
0x436b2d: ses_pool_task+fd
0x433942: Pool_Work_Thread+112
0x4417a8: wrk_thread_real+c8
0x7f66e8dea9ca: _end+7f66e8766202
0x7f66e8b47cdd: _end+7f66e84c3515
sp = 0x7f66d9803520 {
fd = 22, id = 22, xid = 0,
client = ,
step = STP_FIRST,
handling = deliver,
restarts = 0, esi_level = 0
ws = 0x7f66d9b7c158 {
id = "req",
{s,f,r,e} = {0x7f66d9b7d730,0x7f66d9b7d730,(nil),+59632},
},
http[req] = {
ws = (nil)[]
},
worker = 0x7f66d8e86c60 {
ws = 0x7f66d8e86e20 {
id = "wrk",
{s,f,r,e} = {0x7f66d8e86450,0x7f66d8e86450,(nil),+2048},
},
},
},
2012-06-12 19:45:29 WARNING [0, 0]: httperf-lru-default(httperf): Varnishstat uptime and measured run-time is too large (measured: 198 stat: 55 diff: 143). Did we crash?
2012-06-12 19:45:30 WARNING [0, 0]: httperf-lru-default(httperf): Out of bounds: n_lru_nuked(0) less than lower boundary 80000
2012-06-12 19:45:30 WARNING [0, 0]: httperf-lru-default(httperf): Out of bounds: client_req(56340) less than lower boundary 1989920
2012-06-12 19:45:30 [1, 0]: httperf-lru-default(httperf): Load: 21:45:30 up 12 days, 8:08, 0 users, load average: 0.51, 1.34, 1.34
2012-06-12 19:45:30 [1, 0]: httperf-lru-default(httperf): Test name: httperf-lru-default
2012-06-12 19:45:30 [1, 0]: httperf-lru-default(httperf): Varnish options:
2012-06-12 19:45:30 [1, 0]: httperf-lru-default(httperf): -t=3600
2012-06-12 19:45:30 [1, 0]: httperf-lru-default(httperf): -s=malloc,30M
2012-06-12 19:45:30 [1, 0]: httperf-lru-default(httperf): Varnish parameters:
2012-06-12 19:45:30 [1, 0]: httperf-lru-default(httperf): thread_stats_rate=1
2012-06-12 19:45:30 [1, 0]: httperf-lru-default(httperf): thread_pool_max=5000
2012-06-12 19:45:30 [1, 0]: httperf-lru-default(httperf): nuke_limit=250
2012-06-12 19:45:30 [1, 0]: httperf-lru-default(httperf): thread_pool_min=200
2012-06-12 19:45:30 [1, 0]: httperf-lru-default(httperf): Payload size (excludes headers): 10K
2012-06-12 19:45:30 [1, 0]: httperf-lru-default(httperf): Branch: master
2012-06-12 19:45:30 [1, 0]: httperf-lru-default(httperf): Number of clients involved: 24
2012-06-12 19:45:30 [1, 0]: httperf-lru-default(httperf): Type of test: httperf
2012-06-12 19:45:30 [1, 0]: httperf-lru-default(httperf): Test iterations: 1
2012-06-12 19:45:30 [1, 0]: httperf-lru-default(httperf): Runtime: 198 seconds
2012-06-12 19:45:30 [1, 0]: httperf-lru-default(httperf): VCL:
backend foo {
.host = "localhost";
.port = "80";
}
sub vcl_fetch {
set beresp.do_stream = true;
}
2012-06-12 19:45:30 [1, 0]: httperf-lru-default(httperf): Number of total connections: 200000
2012-06-12 19:45:30 [1, 0]: httperf-lru-default(httperf): Note: connections are subject to rounding when divided among clients. Expect slight deviations.
2012-06-12 19:45:30 [1, 0]: httperf-lru-default(httperf): Requests per connection: 10
2012-06-12 19:45:30 [1, 0]: httperf-lru-default(httperf): Extra options to httperf: --wset=1000000,0.1
2012-06-12 19:45:30 [1, 0]: httperf-lru-default(httperf): Httperf command (last client): httperf --hog --timeout 60 --num-calls 10 --num-conns 8333 --port 8080 --burst-length 10 --client 23/24 --server 10.20.100.12 --wset=1000000,0.1
2012-06-12 19:45:37 [2, 6]: memleak(httperf): Starting test
2012-06-12 19:47:58 [2,140]: 4gpluss-stream(httperf): Starting test
2012-06-12 19:48:01 WARNING [0, 2]: Varnish failed to start. Fallback attempts starting
2012-06-12 19:48:01 [1, 0]: Removed (hopefully incompatible) parameter to try to start varnish. Key: thread_stats_rate=1
2012-06-12 19:48:01 [1, 0]: Removed (hopefully incompatible) parameter to try to start varnish. Key: thread_pool_max=300
2012-06-12 19:48:01 [1, 0]: Removed (hopefully incompatible) parameter to try to start varnish. Key: sess_timeout=60000s
2012-06-12 19:48:02 [1, 0]: Fallback worked. Parameter that seemed to cause problems: sess_timeout
2012-06-12 20:07:54 [2,1192]: httperf-lru-stream-default(httperf): Starting test
2012-06-12 20:11:44 WARNING [0,229]: httperf-lru-stream-default(httperf): Panic detected. I think!
2012-06-12 20:11:44 WARNING [0, 0]: httperf-lru-stream-default(httperf):
Last panic at: Tue, 12 Jun 2012 20:10:21 GMT
Assert error in cnt_first(), cache/cache_center.c line 949:
Condition(req->sp == sp) not true.
thread = (cache-worker)
ident = Linux,2.6.32-38-generic,x86_64,-smalloc,-smalloc,-hcritbit,epoll
Backtrace:
0x432138: pan_ic+d8
0x41b391: CNT_Session+7f1
0x436b2d: ses_pool_task+fd
0x433942: Pool_Work_Thread+112
0x4417a8: wrk_thread_real+c8
0x7f85699939ca: _end+7f856930f202
0x7f85696f0cdd: _end+7f856906c515
sp = 0x7f855b80d820 {
fd = 19, id = 19, xid = 0,
client = ,
step = STP_FIRST,
handling = deliver,
restarts = 0, esi_level = 0
ws = 0x7f8559ace158 {
id = "req",
{s,f,r,e} = {0x7f8559acf730,0x7f8559acf730,(nil),+59632},
},
http[req] = {
ws = (nil)[]
},
worker = 0x7f8559712c60 {
ws = 0x7f8559712e20 {
id = "wrk",
{s,f,r,e} = {0x7f8559712450,0x7f8559712450,(nil),+2048},
},
},
},
2012-06-12 20:11:44 WARNING [0, 0]: httperf-lru-stream-default(httperf): Varnishstat uptime and measured run-time is too large (measured: 225 stat: 82 diff: 143). Did we crash?
2012-06-12 20:11:44 WARNING [0, 0]: httperf-lru-stream-default(httperf): Out of bounds: n_lru_nuked(0) less than lower boundary 80000
2012-06-12 20:11:44 WARNING [0, 0]: httperf-lru-stream-default(httperf): Out of bounds: client_req(131670) less than lower boundary 1989920
2012-06-12 20:11:45 [1, 0]: httperf-lru-stream-default(httperf): Load: 22:11:45 up 12 days, 8:34, 0 users, load average: 6.51, 6.02, 3.16
2012-06-12 20:11:45 [1, 0]: httperf-lru-stream-default(httperf): Test name: httperf-lru-stream-default
2012-06-12 20:11:45 [1, 0]: httperf-lru-stream-default(httperf): Varnish options:
2012-06-12 20:11:45 [1, 0]: httperf-lru-stream-default(httperf): -t=3600
2012-06-12 20:11:45 [1, 0]: httperf-lru-stream-default(httperf): -s=malloc,30M
2012-06-12 20:11:45 [1, 0]: httperf-lru-stream-default(httperf): Varnish parameters:
2012-06-12 20:11:45 [1, 0]: httperf-lru-stream-default(httperf): thread_stats_rate=1
2012-06-12 20:11:45 [1, 0]: httperf-lru-stream-default(httperf): thread_pool_max=5000
2012-06-12 20:11:45 [1, 0]: httperf-lru-stream-default(httperf): nuke_limit=250
2012-06-12 20:11:45 [1, 0]: httperf-lru-stream-default(httperf): thread_pool_min=200
2012-06-12 20:11:45 [1, 0]: httperf-lru-stream-default(httperf): Payload size (excludes headers): 10K
2012-06-12 20:11:45 [1, 0]: httperf-lru-stream-default(httperf): Branch: master
2012-06-12 20:11:45 [1, 0]: httperf-lru-stream-default(httperf): Number of clients involved: 24
2012-06-12 20:11:45 [1, 0]: httperf-lru-stream-default(httperf): Type of test: httperf
2012-06-12 20:11:45 [1, 0]: httperf-lru-stream-default(httperf): Test iterations: 1
2012-06-12 20:11:45 [1, 0]: httperf-lru-stream-default(httperf): Runtime: 225 seconds
2012-06-12 20:11:45 [1, 0]: httperf-lru-stream-default(httperf): VCL:
backend foo {
.host = "localhost";
.port = "80";
}
sub vcl_fetch {
set beresp.do_stream = true;
}
2012-06-12 20:11:45 [1, 0]: httperf-lru-stream-default(httperf): Number of total connections: 200000
2012-06-12 20:11:45 [1, 0]: httperf-lru-stream-default(httperf): Note: connections are subject to rounding when divided among clients. Expect slight deviations.
2012-06-12 20:11:45 [1, 0]: httperf-lru-stream-default(httperf): Requests per connection: 10
2012-06-12 20:11:45 [1, 0]: httperf-lru-stream-default(httperf): Extra options to httperf: --wset=1000000,0.1
2012-06-12 20:11:45 [1, 0]: httperf-lru-stream-default(httperf): Httperf command (last client): httperf --hog --timeout 60 --num-calls 10 --num-conns 8333 --port 8080 --burst-length 10 --client 23/24 --server 10.20.100.12 --wset=1000000,0.1
2012-06-12 20:11:51 [2, 6]: httperf-hot(httperf): Starting test
2012-06-12 20:13:03 WARNING [0,71]: httperf-hot(httperf): Panic detected. I think!
2012-06-12 20:13:03 WARNING [0, 0]: httperf-hot(httperf):
Last panic at: Tue, 12 Jun 2012 20:12:32 GMT
Assert error in SES_ReleaseReq(), cache/cache_session.c line 355:
Condition((sp->req->sp) != 0) not true.
thread = (cache-worker)
ident = Linux,2.6.32-38-generic,x86_64,-smalloc,-smalloc,-hcritbit,epoll
Backtrace:
0x432138: pan_ic+d8
0x436085: SES_ReleaseReq+225
0x43666b: SES_Delete+16b
0x41b6bf: CNT_Session+b1f
0x436b2d: ses_pool_task+fd
0x433942: Pool_Work_Thread+112
0x4417a8: wrk_thread_real+c8
0x7fc0728b39ca: _end+7fc07222f202
0x7fc072610cdd: _end+7fc071f8c515
sp = 0x7fc066304220 {
fd = -1, id = 22, xid = 0,
client = 10.20.100.9 8332,
step = STP_WAIT,
handling = deliver,
restarts = 0, esi_level = 0
ws = 0x7fc063646158 {
id = "req",
{s,f,r,e} = {0x7fc063647730,0x7fc063647730,(nil),+59632},
},
http[req] = {
ws = 0x7fc063646158[req]
"",
"/0/9/3.html",
"HTTP/1.1",
"User-Agent: httperf/0.9.0",
"Host: 10.20.100.12",
"X-Forwarded-For: 10.20.100.9",
},
worker = 0x7fc063a32c60 {
ws = 0x7fc063a32e20 {
id = "wrk",
{s,f,r,e} = {0x7fc063a32450,0x7fc063a32450,(nil),+2048},
},
},
},
2012-06-12 20:13:03 WARNING [0, 0]: httperf-hot(httperf): Varnishstat uptime and measured run-time is too large (measured: 67 stat: 30 diff: 37). Did we crash?
2012-06-12 20:13:03 WARNING [0, 0]: httperf-hot(httperf): Out of bounds: client_req(98110) less than lower boundary 989840
2012-06-12 20:13:03 [1, 0]: httperf-hot(httperf): Load: 22:13:03 up 12 days, 8:35, 0 users, load average: 2.39, 4.85, 2.98
2012-06-12 20:13:03 [1, 0]: httperf-hot(httperf): Test name: httperf-hot
2012-06-12 20:13:03 [1, 0]: httperf-hot(httperf): Varnish options:
2012-06-12 20:13:03 [1, 0]: httperf-hot(httperf): Varnish parameters:
2012-06-12 20:13:03 [1, 0]: httperf-hot(httperf): thread_stats_rate=1
2012-06-12 20:13:03 [1, 0]: httperf-hot(httperf): Payload size (excludes headers): 256
2012-06-12 20:13:03 [1, 0]: httperf-hot(httperf): Branch: master
2012-06-12 20:13:03 [1, 0]: httperf-hot(httperf): Number of clients involved: 24
2012-06-12 20:13:03 [1, 0]: httperf-hot(httperf): Type of test: httperf
2012-06-12 20:13:03 [1, 0]: httperf-hot(httperf): Test iterations: 1
2012-06-12 20:13:03 [1, 0]: httperf-hot(httperf): Runtime: 67 seconds
2012-06-12 20:13:03 [1, 0]: httperf-hot(httperf): VCL:
backend foo {
.host = "localhost";
.port = "80";
}
sub vcl_fetch {
set beresp.do_stream = true;
}
2012-06-12 20:13:03 [1, 0]: httperf-hot(httperf): Number of total connections: 100000
2012-06-12 20:13:03 [1, 0]: httperf-hot(httperf): Note: connections are subject to rounding when divided among clients. Expect slight deviations.
2012-06-12 20:13:03 [1, 0]: httperf-hot(httperf): Requests per connection: 10
2012-06-12 20:13:03 [1, 0]: httperf-hot(httperf): Extra options to httperf: --wset=1000,0.25
2012-06-12 20:13:03 [1, 0]: httperf-hot(httperf): Httperf command (last client): httperf --hog --timeout 60 --num-calls 10 --num-conns 4166 --port 8080 --burst-length 10 --client 23/24 --server 10.20.100.12 --wset=1000,0.25
2012-06-12 20:13:10 [2, 6]: httperf-lru-nostream-nogzip(httperf): Starting test
2012-06-12 20:16:48 WARNING [0,218]: httperf-lru-nostream-nogzip(httperf): Panic detected. I think!
2012-06-12 20:16:48 WARNING [0, 0]: httperf-lru-nostream-nogzip(httperf):
Last panic at: Tue, 12 Jun 2012 20:15:25 GMT
Assert error in cnt_recv(), cache/cache_center.c line 1418:
Condition(req->sp == sp) not true.
thread = (cache-worker)
ident = Linux,2.6.32-38-generic,x86_64,-smalloc,-smalloc,-hcritbit,epoll
Backtrace:
0x432138: pan_ic+d8
0x41651b: cnt_recv+23b
0x41b18d: CNT_Session+5ed
0x436b2d: ses_pool_task+fd
0x433942: Pool_Work_Thread+112
0x4417a8: wrk_thread_real+c8
0x7fb7773139ca: _end+7fb776c8f202
0x7fb777070cdd: _end+7fb7769ec515
sp = 0x7fb769914a20 {
fd = 19, id = 19, xid = 539099493,
client = 10.20.100.9 15896,
step = STP_RECV,
handling = lookup,
restarts = 0, esi_level = 0
ws = 0x7fb768059158 {
id = "req",
{s,f,r,e} = {0x7fb76805a730,+296,(nil),+59632},
},
http[req] = {
ws = 0x7fb768059158[req]
"GET",
"/1/7/3/7/6/3.html",
"HTTP/1.1",
"User-Agent: httperf/0.9.0",
"Host: 10.20.100.12",
"X-Forwarded-For: 10.20.100.9",
},
worker = 0x7fb767592c60 {
ws = 0x7fb767592e20 {
id = "wrk",
{s,f,r,e} = {0x7fb767592450,0x7fb767592450,(nil),+2048},
},
},
vcl = {
srcname = {
"input",
"Default",
},
},
},
2012-06-12 20:16:48 WARNING [0, 0]: httperf-lru-nostream-nogzip(httperf): Varnishstat uptime and measured run-time is too large (measured: 214 stat: 82 diff: 132). Did we crash?
2012-06-12 20:16:49 WARNING [0, 0]: httperf-lru-nostream-nogzip(httperf): Out of bounds: n_lru_nuked(7459) less than lower boundary 80000
2012-06-12 20:16:49 WARNING [0, 0]: httperf-lru-nostream-nogzip(httperf): Out of bounds: client_req(103230) less than lower boundary 1989920
2012-06-12 20:16:49 [1, 0]: httperf-lru-nostream-nogzip(httperf): Load: 22:16:49 up 12 days, 8:39, 0 users, load average: 0.67, 2.74, 2.53
2012-06-12 20:16:49 [1, 0]: httperf-lru-nostream-nogzip(httperf): Test name: httperf-lru-nostream-nogzip
2012-06-12 20:16:49 [1, 0]: httperf-lru-nostream-nogzip(httperf): Varnish options:
2012-06-12 20:16:49 [1, 0]: httperf-lru-nostream-nogzip(httperf): -t=3600
2012-06-12 20:16:49 [1, 0]: httperf-lru-nostream-nogzip(httperf): -s=malloc,30M
2012-06-12 20:16:49 [1, 0]: httperf-lru-nostream-nogzip(httperf): Varnish parameters:
2012-06-12 20:16:49 [1, 0]: httperf-lru-nostream-nogzip(httperf): thread_stats_rate=1
2012-06-12 20:16:49 [1, 0]: httperf-lru-nostream-nogzip(httperf): thread_pool_max=5000
2012-06-12 20:16:49 [1, 0]: httperf-lru-nostream-nogzip(httperf): nuke_limit=250
2012-06-12 20:16:49 [1, 0]: httperf-lru-nostream-nogzip(httperf): http_gzip_support=off
2012-06-12 20:16:49 [1, 0]: httperf-lru-nostream-nogzip(httperf): thread_pool_min=200
2012-06-12 20:16:49 [1, 0]: httperf-lru-nostream-nogzip(httperf): Payload size (excludes headers): 10K
2012-06-12 20:16:49 [1, 0]: httperf-lru-nostream-nogzip(httperf): Branch: master
2012-06-12 20:16:49 [1, 0]: httperf-lru-nostream-nogzip(httperf): Number of clients involved: 24
2012-06-12 20:16:49 [1, 0]: httperf-lru-nostream-nogzip(httperf): Type of test: httperf
2012-06-12 20:16:49 [1, 0]: httperf-lru-nostream-nogzip(httperf): Test iterations: 1
2012-06-12 20:16:49 [1, 0]: httperf-lru-nostream-nogzip(httperf): Runtime: 214 seconds
2012-06-12 20:16:49 [1, 0]: httperf-lru-nostream-nogzip(httperf): VCL:
backend foo {
.host = "localhost";
.port = "80";
}
sub vcl_fetch {
set beresp.do_stream = false;
}
2012-06-12 20:16:49 [1, 0]: httperf-lru-nostream-nogzip(httperf): Number of total connections: 200000
2012-06-12 20:16:49 [1, 0]: httperf-lru-nostream-nogzip(httperf): Note: connections are subject to rounding when divided among clients. Expect slight deviations.
2012-06-12 20:16:49 [1, 0]: httperf-lru-nostream-nogzip(httperf): Requests per connection: 10
2012-06-12 20:16:49 [1, 0]: httperf-lru-nostream-nogzip(httperf): Extra options to httperf: --wset=1000000,0.1
2012-06-12 20:16:49 [1, 0]: httperf-lru-nostream-nogzip(httperf): Httperf command (last client): httperf --hog --timeout 60 --num-calls 10 --num-conns 8333 --port 8080 --burst-length 10 --client 23/24 --server 10.20.100.12 --wset=1000000,0.1
2012-06-12 20:16:56 [2, 6]: cold-gzip(httperf): Starting test
2012-06-12 20:20:12 WARNING [0,196]: cold-gzip(httperf): Panic detected. I think!
2012-06-12 20:20:12 WARNING [0, 0]: cold-gzip(httperf):
Last panic at: Tue, 12 Jun 2012 20:19:23 GMT
Assert error in cnt_first(), cache/cache_center.c line 949:
Condition(req->sp == sp) not true.
thread = (cache-worker)
ident = Linux,2.6.32-38-generic,x86_64,-smalloc,-smalloc,-hcritbit,epoll
Backtrace:
0x432138: pan_ic+d8
0x41b391: CNT_Session+7f1
0x436b2d: ses_pool_task+fd
0x433942: Pool_Work_Thread+112
0x4417a8: wrk_thread_real+c8
0x7f2a75ab59ca: _end+7f2a75431202
0x7f2a75812cdd: _end+7f2a7518e515
sp = 0x7f2a66926820 {
fd = 27, id = 27, xid = 0,
client = ,
step = STP_FIRST,
handling = deliver,
restarts = 0, esi_level = 0
ws = 0x7f2a668c0158 {
id = "req",
{s,f,r,e} = {0x7f2a668c1730,0x7f2a668c1730,(nil),+59632},
},
http[req] = {
ws = (nil)[]
},
worker = 0x7f2a66d7ac60 {
ws = 0x7f2a66d7ae20 {
id = "wrk",
{s,f,r,e} = {0x7f2a66d7a450,0x7f2a66d7a450,(nil),+2048},
},
},
},
2012-06-12 20:20:12 WARNING [0, 0]: cold-gzip(httperf): Varnishstat uptime and measured run-time is too large (measured: 192 stat: 48 diff: 144). Did we crash?
2012-06-12 20:23:22 WARNING [0,190]: cold-gzip(httperf): Panic detected. I think!
2012-06-12 20:23:22 WARNING [0, 0]: cold-gzip(httperf):
Last panic at: Tue, 12 Jun 2012 20:22:57 GMT
Assert error in cnt_fetch(), cache/cache_center.c line 639:
Condition(req->sp == sp) not true.
thread = (cache-worker)
ident = Linux,2.6.32-38-generic,x86_64,-smalloc,-smalloc,-hcritbit,epoll
Backtrace:
0x432138: pan_ic+d8
0x4188dc: cnt_fetch+49c
0x41b03d: CNT_Session+49d
0x436b2d: ses_pool_task+fd
0x433942: Pool_Work_Thread+112
0x4417a8: wrk_thread_real+c8
0x7f2a75ab59ca: _end+7f2a75431202
0x7f2a75812cdd: _end+7f2a7518e515
sp = 0x7f2a626f0d20 {
fd = 16, id = 16, xid = 1607797846,
client = 10.20.100.9 8078,
step = STP_FETCH,
handling = fetch,
err_code = 200, err_reason = (null),
restarts = 0, esi_level = 0
busyobj = 0x7f2a61a12020 {
ws = 0x7f2a61a12070 {
id = "bo",
{s,f,r,e} = {0x7f2a61a13aa0,+448,(nil),+58752},
},
do_stream
bodystatus = 3 (chunked),
},
http[bereq] = {
ws = 0x7f2a61a12070[bo]
"GET",
"/0/3/7/6/4/6/5.html",
"HTTP/1.1",
"User-Agent: httperf/0.9.0",
"Host: 10.20.100.12",
"X-Forwarded-For: 10.20.100.9",
"X-Varnish: 1607797846",
"Accept-Encoding: gzip",
},
http[beresp] = {
ws = 0x7f2a61a12070[bo]
"HTTP/1.1",
"200",
"OK",
"Server: nginx/0.7.65",
"Date: Tue, 12 Jun 2012 20:22:57 GMT",
"Content-Type: text/plain",
"Last-Modified: Tue, 12 Jun 2012 20:16:59 GMT",
"Transfer-Encoding: chunked",
"Connection: keep-alive",
"Content-Encoding: gzip",
},
ws = 0x7f2a6316e158 {
id = "req",
{s,f,r,e} = {0x7f2a6316f730,+216,(nil),+59632},
},
http[req] = {
ws = 0x7f2a6316e158[req]
"GET",
"/0/3/7/6/4/6/5.html",
"HTTP/1.1",
"User-Agent: httperf/0.9.0",
"Host: 10.20.100.12",
"X-Forwarded-For: 10.20.100.9",
},
worker = 0x7f2a670dcc60 {
ws = 0x7f2a670dce20 {
id = "wrk",
{s,f,r,e} = {0x7f2a670dc450,0x7f2a670dc450,(nil),+2048},
},
},
vcl = {
srcname = {
"input",
"Default",
},
},
},
2012-06-12 20:23:22 WARNING [0, 0]: cold-gzip(httperf): Varnishstat uptime and measured run-time is too large (measured: 382 stat: 25 diff: 357). Did we crash?
2012-06-12 20:23:23 WARNING [0, 0]: cold-gzip(httperf): Out of bounds: uptime(25) less than lower boundary 100
2012-06-12 20:23:23 WARNING [0, 0]: cold-gzip(httperf): Out of bounds: client_req(10730) less than lower boundary 1589840
2012-06-12 20:23:23 [1, 0]: cold-gzip(httperf): Load: 22:23:23 up 12 days, 8:46, 0 users, load average: 1.04, 1.39, 1.94
2012-06-12 20:23:23 [1, 0]: cold-gzip(httperf): Test name: cold-gzip
2012-06-12 20:23:23 [1, 0]: cold-gzip(httperf): Varnish options:
2012-06-12 20:23:23 [1, 0]: cold-gzip(httperf): -t=3600
2012-06-12 20:23:23 [1, 0]: cold-gzip(httperf): -s=malloc,10G
2012-06-12 20:23:23 [1, 0]: cold-gzip(httperf): Varnish parameters:
2012-06-12 20:23:23 [1, 0]: cold-gzip(httperf): thread_stats_rate=1
2012-06-12 20:23:23 [1, 0]: cold-gzip(httperf): http_gzip_support=on
2012-06-12 20:23:23 [1, 0]: cold-gzip(httperf): Payload size (excludes headers): 256
2012-06-12 20:23:23 [1, 0]: cold-gzip(httperf): Branch: master
2012-06-12 20:23:23 [1, 0]: cold-gzip(httperf): Number of clients involved: 24
2012-06-12 20:23:23 [1, 0]: cold-gzip(httperf): Type of test: httperf
2012-06-12 20:23:23 [1, 0]: cold-gzip(httperf): Test iterations: 2
2012-06-12 20:23:23 [1, 0]: cold-gzip(httperf): Runtime: 382 seconds
2012-06-12 20:23:23 [1, 0]: cold-gzip(httperf): VCL:
backend foo {
.host = "localhost";
.port = "80";
}
sub vcl_fetch {
set beresp.do_stream = true;
}
2012-06-12 20:23:23 [1, 0]: cold-gzip(httperf): Number of total connections: 80000
2012-06-12 20:23:23 [1, 0]: cold-gzip(httperf): Note: connections are subject to rounding when divided among clients. Expect slight deviations.
2012-06-12 20:23:23 [1, 0]: cold-gzip(httperf): Requests per connection: 10
2012-06-12 20:23:23 [1, 0]: cold-gzip(httperf): Extra options to httperf: --wset=4000000,0.50
2012-06-12 20:23:23 [1, 0]: cold-gzip(httperf): Httperf command (last client): httperf --hog --timeout 60 --num-calls 10 --num-conns 3333 --port 8080 --burst-length 10 --client 23/24 --server 10.20.100.12 --wset=4000000,0.50
2012-06-12 20:23:30 [2, 6]: 4gpluss(httperf): Starting test
2012-06-12 20:23:33 WARNING [0, 2]: Varnish failed to start. Fallback attempts starting
2012-06-12 20:23:33 [1, 0]: Removed (hopefully incompatible) parameter to try to start varnish. Key: thread_stats_rate=1
2012-06-12 20:23:33 [1, 0]: Removed (hopefully incompatible) parameter to try to start varnish. Key: thread_pool_max=300
2012-06-12 20:23:34 [1, 0]: Removed (hopefully incompatible) parameter to try to start varnish. Key: sess_timeout=60000s
2012-06-12 20:23:34 [1, 0]: Fallback worked. Parameter that seemed to cause problems: sess_timeout
2012-06-12 20:45:20 [2,1305]: httperf-lru-stream-gzip(httperf): Starting test
2012-06-12 20:48:55 WARNING [0,215]: httperf-lru-stream-gzip(httperf): Panic detected. I think!
2012-06-12 20:48:55 WARNING [0, 0]: httperf-lru-stream-gzip(httperf):
Last panic at: Tue, 12 Jun 2012 20:47:12 GMT
Assert error in cnt_first(), cache/cache_center.c line 949:
Condition(req->sp == sp) not true.
thread = (cache-worker)
ident = Linux,2.6.32-38-generic,x86_64,-smalloc,-smalloc,-hcritbit,epoll
Backtrace:
0x432138: pan_ic+d8
0x41b391: CNT_Session+7f1
0x436b2d: ses_pool_task+fd
0x433942: Pool_Work_Thread+112
0x4417a8: wrk_thread_real+c8
0x7ff58e2759ca: _end+7ff58dbf1202
0x7ff58dfd2cdd: _end+7ff58d94e515
sp = 0x7ff57f01dc20 {
fd = 28, id = 28, xid = 0,
client = ,
step = STP_FIRST,
handling = deliver,
restarts = 0, esi_level = 0
ws = 0x7ff57da57158 {
id = "req",
{s,f,r,e} = {0x7ff57da58730,0x7ff57da58730,(nil),+59632},
},
http[req] = {
ws = (nil)[]
},
worker = 0x7ff57e61ac60 {
ws = 0x7ff57e61ae20 {
id = "wrk",
{s,f,r,e} = {0x7ff57e61a450,0x7ff57e61a450,(nil),+2048},
},
},
},
2012-06-12 20:48:55 WARNING [0, 0]: httperf-lru-stream-gzip(httperf): Varnishstat uptime and measured run-time is too large (measured: 211 stat: 102 diff: 109). Did we crash?
2012-06-12 20:48:56 WARNING [0, 0]: httperf-lru-stream-gzip(httperf): Out of bounds: n_lru_nuked(0) less than lower boundary 80000
2012-06-12 20:48:56 WARNING [0, 0]: httperf-lru-stream-gzip(httperf): Out of bounds: client_req(302390) less than lower boundary 1989920
2012-06-12 20:48:56 [1, 0]: httperf-lru-stream-gzip(httperf): Load: 22:48:56 up 12 days, 9:11, 0 users, load average: 0.60, 0.83, 1.38
2012-06-12 20:48:56 [1, 0]: httperf-lru-stream-gzip(httperf): Test name: httperf-lru-stream-gzip
2012-06-12 20:48:56 [1, 0]: httperf-lru-stream-gzip(httperf): Varnish options:
2012-06-12 20:48:56 [1, 0]: httperf-lru-stream-gzip(httperf): -t=3600
2012-06-12 20:48:56 [1, 0]: httperf-lru-stream-gzip(httperf): -s=malloc,30M
2012-06-12 20:48:56 [1, 0]: httperf-lru-stream-gzip(httperf): Varnish parameters:
2012-06-12 20:48:56 [1, 0]: httperf-lru-stream-gzip(httperf): thread_stats_rate=1
2012-06-12 20:48:56 [1, 0]: httperf-lru-stream-gzip(httperf): thread_pool_max=5000
2012-06-12 20:48:56 [1, 0]: httperf-lru-stream-gzip(httperf): nuke_limit=250
2012-06-12 20:48:56 [1, 0]: httperf-lru-stream-gzip(httperf): http_gzip_support=on
2012-06-12 20:48:56 [1, 0]: httperf-lru-stream-gzip(httperf): thread_pool_min=200
2012-06-12 20:48:56 [1, 0]: httperf-lru-stream-gzip(httperf): Payload size (excludes headers): 10K
2012-06-12 20:48:56 [1, 0]: httperf-lru-stream-gzip(httperf): Branch: master
2012-06-12 20:48:56 [1, 0]: httperf-lru-stream-gzip(httperf): Number of clients involved: 24
2012-06-12 20:48:56 [1, 0]: httperf-lru-stream-gzip(httperf): Type of test: httperf
2012-06-12 20:48:56 [1, 0]: httperf-lru-stream-gzip(httperf): Test iterations: 1
2012-06-12 20:48:56 [1, 0]: httperf-lru-stream-gzip(httperf): Runtime: 211 seconds
2012-06-12 20:48:56 [1, 0]: httperf-lru-stream-gzip(httperf): VCL:
backend foo {
.host = "localhost";
.port = "80";
}
sub vcl_fetch {
set beresp.do_stream = true;
}
2012-06-12 20:48:56 [1, 0]: httperf-lru-stream-gzip(httperf): Number of total connections: 200000
2012-06-12 20:48:56 [1, 0]: httperf-lru-stream-gzip(httperf): Note: connections are subject to rounding when divided among clients. Expect slight deviations.
2012-06-12 20:48:56 [1, 0]: httperf-lru-stream-gzip(httperf): Requests per connection: 10
2012-06-12 20:48:56 [1, 0]: httperf-lru-stream-gzip(httperf): Extra options to httperf: --wset=1000000,0.1
2012-06-12 20:48:56 [1, 0]: httperf-lru-stream-gzip(httperf): Httperf command (last client): httperf --hog --timeout 60 --num-calls 10 --num-conns 8333 --port 8080 --burst-length 10 --client 23/24 --server 10.20.100.12 --wset=1000000,0.1
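The "subject to rounding" note above follows from how the total is split across clients: 200000 connections over 24 clients gives `--num-conns 8333` per client, which does not multiply back to the exact total. A minimal sketch, assuming the harness simply floors the division (the log's wording suggests this, but the real fryer code may balance differently):

```python
# Sketch: mapping "Number of total connections" to per-client --num-conns.
# Assumption: plain integer division per client, which is what produces
# the "expect slight deviations" caveat in the log.
total_connections = 200000
clients = 24

per_client = total_connections // clients          # matches --num-conns 8333
actual_total = per_client * clients                # what actually gets issued
shortfall = total_connections - actual_total       # connections lost to rounding

print(per_client, actual_total, shortfall)         # 8333 199992 8
```

The same arithmetic explains the other tests' commands, e.g. 80000 connections over 24 clients yielding `--num-conns 3333`.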
2012-06-12 20:49:03 [2, 6]: httperf-lru-stream-nogzip(httperf): Starting test
2012-06-12 20:52:38 WARNING [0,215]: httperf-lru-stream-nogzip(httperf): Panic detected. I think!
2012-06-12 20:52:38 WARNING [0, 0]: httperf-lru-stream-nogzip(httperf):
Last panic at: Tue, 12 Jun 2012 20:51:36 GMT
Assert error in CNT_Session(), cache/cache_center.c line 1639:
Condition((sp->req->sp) != 0) not true.
thread = (cache-worker)
ident = Linux,2.6.32-38-generic,x86_64,-smalloc,-smalloc,-hcritbit,epoll
Backtrace:
0x432138: pan_ic+d8
0x41b7ed: CNT_Session+c4d
0x436b2d: ses_pool_task+fd
0x433942: Pool_Work_Thread+112
0x4417a8: wrk_thread_real+c8
0x7fed714039ca: _end+7fed70d7f202
0x7fed71160cdd: _end+7fed70adc515
sp = 0x7fed70d20620 {
fd = 16, id = 16, xid = 1049920883,
client = 10.20.100.9 16730,
step = STP_DONE,
handling = deliver,
err_code = 200, err_reason = (null),
restarts = 0, esi_level = 0
ws = 0x7fed62291158 {
id = "req",
{s,f,r,e} = {0x7fed62292730,+240,(nil),+59632},
},
http[req] = {
ws = 0x7fed62291158[req]
"GET",
"/1/8/9/3/6/7.html",
"HTTP/1.1",
"User-Agent: httperf/0.9.0",
"Host: 10.20.100.12",
"X-Forwarded-For: 10.20.100.9",
},
worker = 0x7fed61266c60 {
ws = 0x7fed61266e20 {
id = "wrk",
{s,f,r,e} = {0x7fed61266450,0x7fed61266450,(nil),+2048},
},
},
vcl = {
srcname = {
"input",
"Default",
},
},
},
2012-06-12 20:52:38 WARNING [0, 0]: httperf-lru-stream-nogzip(httperf): Varnishstat uptime and measured run-time is too large (measured: 211 stat: 61 diff: 150). Did we crash?
2012-06-12 20:52:38 WARNING [0, 0]: httperf-lru-stream-nogzip(httperf): Out of bounds: n_lru_nuked(4426) less than lower boundary 80000
2012-06-12 20:52:38 WARNING [0, 0]: httperf-lru-stream-nogzip(httperf): Out of bounds: client_req(72942) less than lower boundary 1989920
2012-06-12 20:52:39 [1, 0]: httperf-lru-stream-nogzip(httperf): Load: 22:52:39 up 12 days, 9:15, 0 users, load average: 0.35, 0.75, 1.24
2012-06-12 20:52:39 [1, 0]: httperf-lru-stream-nogzip(httperf): Test name: httperf-lru-stream-nogzip
2012-06-12 20:52:39 [1, 0]: httperf-lru-stream-nogzip(httperf): Varnish options:
2012-06-12 20:52:39 [1, 0]: httperf-lru-stream-nogzip(httperf): -t=3600
2012-06-12 20:52:39 [1, 0]: httperf-lru-stream-nogzip(httperf): -s=malloc,30M
2012-06-12 20:52:39 [1, 0]: httperf-lru-stream-nogzip(httperf): Varnish parameters:
2012-06-12 20:52:39 [1, 0]: httperf-lru-stream-nogzip(httperf): thread_stats_rate=1
2012-06-12 20:52:39 [1, 0]: httperf-lru-stream-nogzip(httperf): thread_pool_max=5000
2012-06-12 20:52:39 [1, 0]: httperf-lru-stream-nogzip(httperf): nuke_limit=250
2012-06-12 20:52:39 [1, 0]: httperf-lru-stream-nogzip(httperf): http_gzip_support=off
2012-06-12 20:52:39 [1, 0]: httperf-lru-stream-nogzip(httperf): thread_pool_min=200
2012-06-12 20:52:39 [1, 0]: httperf-lru-stream-nogzip(httperf): Payload size (excludes headers): 10K
2012-06-12 20:52:39 [1, 0]: httperf-lru-stream-nogzip(httperf): Branch: master
2012-06-12 20:52:39 [1, 0]: httperf-lru-stream-nogzip(httperf): Number of clients involved: 24
2012-06-12 20:52:39 [1, 0]: httperf-lru-stream-nogzip(httperf): Type of test: httperf
2012-06-12 20:52:39 [1, 0]: httperf-lru-stream-nogzip(httperf): Test iterations: 1
2012-06-12 20:52:39 [1, 0]: httperf-lru-stream-nogzip(httperf): Runtime: 211 seconds
2012-06-12 20:52:39 [1, 0]: httperf-lru-stream-nogzip(httperf): VCL:
backend foo {
.host = "localhost";
.port = "80";
}
sub vcl_fetch {
set beresp.do_stream = true;
}
2012-06-12 20:52:39 [1, 0]: httperf-lru-stream-nogzip(httperf): Number of total connections: 200000
2012-06-12 20:52:39 [1, 0]: httperf-lru-stream-nogzip(httperf): Note: connections are subject to rounding when divided among clients. Expect slight deviations.
2012-06-12 20:52:39 [1, 0]: httperf-lru-stream-nogzip(httperf): Requests per connection: 10
2012-06-12 20:52:39 [1, 0]: httperf-lru-stream-nogzip(httperf): Extra options to httperf: --wset=1000000,0.1
2012-06-12 20:52:39 [1, 0]: httperf-lru-stream-nogzip(httperf): Httperf command (last client): httperf --hog --timeout 60 --num-calls 10 --num-conns 8333 --port 8080 --burst-length 10 --client 23/24 --server 10.20.100.12 --wset=1000000,0.1
2012-06-12 20:52:45 [2, 6]: basic-fryer(httperf): Starting test
2012-06-12 20:53:08 [2,22]: cold-nogzip(httperf): Starting test
2012-06-12 20:56:23 WARNING [0,194]: cold-nogzip(httperf): Panic detected. I think!
2012-06-12 20:56:23 WARNING [0, 0]: cold-nogzip(httperf):
Last panic at: Tue, 12 Jun 2012 20:55:18 GMT
Assert error in cnt_fetch(), cache/cache_center.c line 639:
Condition(req->sp == sp) not true.
thread = (cache-worker)
ident = Linux,2.6.32-38-generic,x86_64,-smalloc,-smalloc,-hcritbit,epoll
Backtrace:
0x432138: pan_ic+d8
0x4188dc: cnt_fetch+49c
0x41b03d: CNT_Session+49d
0x436b2d: ses_pool_task+fd
0x433942: Pool_Work_Thread+112
0x4417a8: wrk_thread_real+c8
0x7f1d0c1b89ca: _end+7f1d0bb34202
0x7f1d0bf15cdd: _end+7f1d0b891515
sp = 0x7f1cfd206520 {
fd = 20, id = 20, xid = 1760320253,
client = 10.20.100.8 6019,
step = STP_FETCH,
handling = fetch,
err_code = 200, err_reason = (null),
restarts = 0, esi_level = 0
busyobj = 0x7f1cfd335020 {
ws = 0x7f1cfd335070 {
id = "bo",
{s,f,r,e} = {0x7f1cfd336aa0,+248,(nil),+58752},
},
do_stream
bodystatus = 4 (length),
},
http[bereq] = {
ws = 0x7f1cfd335070[bo]
"GET",
"/0/2/4/9/1/5/4.html",
"HTTP/1.1",
"User-Agent: httperf/0.9.0",
"Host: 10.20.100.12",
"X-Forwarded-For: 10.20.100.8",
"X-Varnish: 1760320253",
},
http[beresp] = {
ws = 0x7f1cfd335070[bo]
"HTTP/1.1",
"200",
"OK",
"Server: nginx/0.7.65",
"Date: Tue, 12 Jun 2012 20:55:18 GMT",
"Content-Type: text/plain",
"Content-Length: 256",
"Last-Modified: Tue, 12 Jun 2012 20:53:12 GMT",
"Connection: keep-alive",
"Accept-Ranges: bytes",
},
ws = 0x7f1cff39c158 {
id = "req",
{s,f,r,e} = {0x7f1cff39d730,+216,(nil),+59632},
},
http[req] = {
ws = 0x7f1cff39c158[req]
"GET",
"/0/2/4/9/1/5/4.html",
"HTTP/1.1",
"User-Agent: httperf/0.9.0",
"Host: 10.20.100.12",
"X-Forwarded-For: 10.20.100.8",
},
worker = 0x7f1cfd5a8c60 {
ws = 0x7f1cfd5a8e20 {
id = "wrk",
{s,f,r,e} = {0x7f1cfd5a8450,0x7f1cfd5a8450,(nil),+2048},
},
},
vcl = {
srcname = {
"input",
"Default",
},
},
},
2012-06-12 20:56:23 WARNING [0, 0]: cold-nogzip(httperf): Varnishstat uptime and measured run-time is too large (measured: 190 stat: 64 diff: 126). Did we crash?
2012-06-12 20:59:34 WARNING [0,191]: cold-nogzip(httperf): Panic detected. I think!
2012-06-12 20:59:34 WARNING [0, 0]: cold-nogzip(httperf):
Last panic at: Tue, 12 Jun 2012 20:58:33 GMT
Assert error in CNT_Session(), cache/cache_center.c line 1639:
Condition((sp->req->sp) != 0) not true.
thread = (cache-worker)
ident = Linux,2.6.32-38-generic,x86_64,-smalloc,-smalloc,-hcritbit,epoll
Backtrace:
0x432138: pan_ic+d8
0x41b7ed: CNT_Session+c4d
0x436b2d: ses_pool_task+fd
0x433942: Pool_Work_Thread+112
0x4417a8: wrk_thread_real+c8
0x7f1d0c1b89ca: _end+7f1d0bb34202
0x7f1d0bf15cdd: _end+7f1d0b891515
sp = 0x7f1cf993f420 {
fd = 18, id = 18, xid = 0,
client = 10.20.100.9 5782,
step = STP_WAIT,
handling = deliver,
restarts = 0, esi_level = 0
ws = 0x7f1cfb257158 {
id = "req",
{s,f,r,e} = {0x7f1cfb258730,0x7f1cfb258730,+32768,+59632},
},
http[req] = {
ws = (nil)[]
},
worker = 0x7f1cfd1aac60 {
ws = 0x7f1cfd1aae20 {
id = "wrk",
{s,f,r,e} = {0x7f1cfd1aa450,0x7f1cfd1aa450,(nil),+2048},
},
},
},
2012-06-12 20:59:34 WARNING [0, 0]: cold-nogzip(httperf): Varnishstat uptime and measured run-time is too large (measured: 382 stat: 61 diff: 321). Did we crash?
2012-06-12 20:59:35 WARNING [0, 0]: cold-nogzip(httperf): Out of bounds: uptime(61) less than lower boundary 100
2012-06-12 20:59:35 WARNING [0, 0]: cold-nogzip(httperf): Out of bounds: client_req(61060) less than lower boundary 1589840
2012-06-12 20:59:35 [1, 0]: cold-nogzip(httperf): Load: 22:59:35 up 12 days, 9:22, 0 users, load average: 0.69, 0.75, 1.05
2012-06-12 20:59:35 [1, 0]: cold-nogzip(httperf): Test name: cold-nogzip
2012-06-12 20:59:35 [1, 0]: cold-nogzip(httperf): Varnish options:
2012-06-12 20:59:35 [1, 0]: cold-nogzip(httperf): -t=3600
2012-06-12 20:59:35 [1, 0]: cold-nogzip(httperf): -s=malloc,10G
2012-06-12 20:59:35 [1, 0]: cold-nogzip(httperf): Varnish parameters:
2012-06-12 20:59:35 [1, 0]: cold-nogzip(httperf): thread_stats_rate=1
2012-06-12 20:59:35 [1, 0]: cold-nogzip(httperf): http_gzip_support=off
2012-06-12 20:59:35 [1, 0]: cold-nogzip(httperf): Payload size (excludes headers): 256
2012-06-12 20:59:35 [1, 0]: cold-nogzip(httperf): Branch: master
2012-06-12 20:59:35 [1, 0]: cold-nogzip(httperf): Number of clients involved: 24
2012-06-12 20:59:35 [1, 0]: cold-nogzip(httperf): Type of test: httperf
2012-06-12 20:59:35 [1, 0]: cold-nogzip(httperf): Test iterations: 2
2012-06-12 20:59:35 [1, 0]: cold-nogzip(httperf): Runtime: 382 seconds
2012-06-12 20:59:35 [1, 0]: cold-nogzip(httperf): VCL:
backend foo {
.host = "localhost";
.port = "80";
}
sub vcl_fetch {
set beresp.do_stream = true;
}
2012-06-12 20:59:35 [1, 0]: cold-nogzip(httperf): Number of total connections: 80000
2012-06-12 20:59:35 [1, 0]: cold-nogzip(httperf): Note: connections are subject to rounding when divided among clients. Expect slight deviations.
2012-06-12 20:59:35 [1, 0]: cold-nogzip(httperf): Requests per connection: 10
2012-06-12 20:59:35 [1, 0]: cold-nogzip(httperf): Extra options to httperf: --wset=4000000,0.50
2012-06-12 20:59:35 [1, 0]: cold-nogzip(httperf): Httperf command (last client): httperf --hog --timeout 60 --num-calls 10 --num-conns 3333 --port 8080 --burst-length 10 --client 23/24 --server 10.20.100.12 --wset=4000000,0.50
2012-06-12 20:59:42 [2, 6]: 4gpluss-nostream(httperf): Starting test
2012-06-12 20:59:45 WARNING [0, 2]: Varnish failed to start. Fallback attempts starting
2012-06-12 20:59:45 [1, 0]: Removed (hopefully incompatible) parameter to try to start varnish. Key: thread_stats_rate=1
2012-06-12 20:59:45 [1, 0]: Removed (hopefully incompatible) parameter to try to start varnish. Key: thread_pool_max=300
2012-06-12 20:59:46 [1, 0]: Removed (hopefully incompatible) parameter to try to start varnish. Key: sess_timeout=60000s
2012-06-12 20:59:46 [1, 0]: Fallback worked. Parameter that seemed to cause problems: sess_timeout
2012-06-12 21:19:35 [2,1188]: lru-random(httperf): Starting test
2012-06-12 21:31:14 [2,699]: siege-test(siege): Starting test
2012-06-12 21:31:32 WARNING [0,18]: siege-test(siege): Panic detected. I think!
2012-06-12 21:31:32 WARNING [0, 0]: siege-test(siege):
Last panic at: Tue, 12 Jun 2012 21:31:18 GMT
Assert error in cnt_recv(), cache/cache_center.c line 1418:
Condition(req->sp == sp) not true.
thread = (cache-worker)
ident = Linux,2.6.32-38-generic,x86_64,-smalloc,-smalloc,-hcritbit,epoll
Backtrace:
0x432138: pan_ic+d8
0x41651b: cnt_recv+23b
0x41b18d: CNT_Session+5ed
0x436b2d: ses_pool_task+fd
0x433942: Pool_Work_Thread+112
0x4417a8: wrk_thread_real+c8
0x7f12820659ca: _end+7f12819e1202
0x7f1281dc2cdd: _end+7f128173e515
sp = 0x7f1272f08320 {
fd = 71, id = 71, xid = 1126170301,
client = 10.20.100.9 51036,
step = STP_RECV,
handling = lookup,
restarts = 0, esi_level = 0
ws = 0x7f1272b11158 {
id = "req",
{s,f,r,e} = {0x7f1272b12730,+200,(nil),+59632},
},
http[req] = {
ws = 0x7f1272b11158[req]
"GET",
"/",
"HTTP/1.1",
"Host: 10.20.100.12:8080",
"Accept: */*",
"Accept-Encoding: gzip",
"User-Agent: JoeDog/1.00 [en] (X11; I; Siege 2.66)",
"Connection: close",
"X-Forwarded-For: 10.20.100.9",
},
worker = 0x7f1277db5c60 {
ws = 0x7f1277db5e20 {
id = "wrk",
{s,f,r,e} = {0x7f1277db5450,0x7f1277db5450,(nil),+2048},
},
},
vcl = {
srcname = {
"input",
"Default",
},
},
},
2012-06-12 21:31:32 [1, 0]: siege-test(siege): Load: 23:31:32 up 12 days, 9:54, 0 users, load average: 0.25, 0.51, 0.64
2012-06-12 21:31:32 [1, 0]: siege-test(siege): Test name: siege-test
2012-06-12 21:31:32 [1, 0]: siege-test(siege): Varnish options:
2012-06-12 21:31:32 [1, 0]: siege-test(siege): Varnish parameters:
2012-06-12 21:31:32 [1, 0]: siege-test(siege): thread_stats_rate=1
2012-06-12 21:31:32 [1, 0]: siege-test(siege): Payload size (excludes headers): 256
2012-06-12 21:31:32 [1, 0]: siege-test(siege): Branch: master
2012-06-12 21:31:32 [1, 0]: siege-test(siege): Number of clients involved: 0
2012-06-12 21:31:32 [1, 0]: siege-test(siege): Type of test: siege
2012-06-12 21:31:32 [1, 0]: siege-test(siege): Test iterations: 1
2012-06-12 21:31:32 [1, 0]: siege-test(siege): Runtime: 14 seconds
2012-06-12 21:31:32 [1, 0]: siege-test(siege): VCL:
backend foo {
.host = "localhost";
.port = "80";
}
sub vcl_fetch {
set beresp.do_stream = true;
}
2012-06-12 21:31:39 [2, 6]: httperf-lru-nostream-default(httperf): Starting test
2012-06-12 21:35:07 WARNING [0,207]: httperf-lru-nostream-default(httperf): Panic detected. I think!
2012-06-12 21:35:07 WARNING [0, 0]: httperf-lru-nostream-default(httperf):
Last panic at: Tue, 12 Jun 2012 21:34:04 GMT
Assert error in CNT_Session(), cache/cache_center.c line 1639:
Condition((sp->req->sp) != 0) not true.
thread = (cache-worker)
ident = Linux,2.6.32-38-generic,x86_64,-smalloc,-smalloc,-hcritbit,epoll
Backtrace:
0x432138: pan_ic+d8
0x41b7ed: CNT_Session+c4d
0x436b2d: ses_pool_task+fd
0x433942: Pool_Work_Thread+112
0x4417a8: wrk_thread_real+c8
0x7ffa7e08e9ca: _end+7ffa7da0a202
0x7ffa7ddebcdd: _end+7ffa7d767515
sp = 0x7ffa7808e120 {
fd = 17, id = 17, xid = 0,
client = 10.20.100.8 17814,
step = STP_WAIT,
handling = deliver,
restarts = 0, esi_level = 0
ws = 0x7ffa73c36158 {
id = "req",
{s,f,r,e} = {0x7ffa73c37730,0x7ffa73c37730,+32768,+59632},
},
http[req] = {
ws = (nil)[]
},
worker = 0x7ffa6e23ac60 {
ws = 0x7ffa6e23ae20 {
id = "wrk",
{s,f,r,e} = {0x7ffa6e23a450,0x7ffa6e23a450,(nil),+2048},
},
},
},
2012-06-12 21:35:07 WARNING [0, 0]: httperf-lru-nostream-default(httperf): Varnishstat uptime and measured run-time is too large (measured: 204 stat: 62 diff: 142). Did we crash?
2012-06-12 21:35:07 WARNING [0, 0]: httperf-lru-nostream-default(httperf): Out of bounds: n_lru_nuked(0) less than lower boundary 80000
2012-06-12 21:35:07 WARNING [0, 0]: httperf-lru-nostream-default(httperf): Out of bounds: client_req(91323) less than lower boundary 1989920
2012-06-12 21:35:08 [1, 0]: httperf-lru-nostream-default(httperf): Load: 23:35:08 up 12 days, 9:58, 0 users, load average: 0.64, 0.73, 0.70
2012-06-12 21:35:08 [1, 0]: httperf-lru-nostream-default(httperf): Test name: httperf-lru-nostream-default
2012-06-12 21:35:08 [1, 0]: httperf-lru-nostream-default(httperf): Varnish options:
2012-06-12 21:35:08 [1, 0]: httperf-lru-nostream-default(httperf): -t=3600
2012-06-12 21:35:08 [1, 0]: httperf-lru-nostream-default(httperf): -s=malloc,30M
2012-06-12 21:35:08 [1, 0]: httperf-lru-nostream-default(httperf): Varnish parameters:
2012-06-12 21:35:08 [1, 0]: httperf-lru-nostream-default(httperf): thread_stats_rate=1
2012-06-12 21:35:08 [1, 0]: httperf-lru-nostream-default(httperf): thread_pool_max=5000
2012-06-12 21:35:08 [1, 0]: httperf-lru-nostream-default(httperf): nuke_limit=250
2012-06-12 21:35:08 [1, 0]: httperf-lru-nostream-default(httperf): thread_pool_min=200
2012-06-12 21:35:08 [1, 0]: httperf-lru-nostream-default(httperf): Payload size (excludes headers): 10K
2012-06-12 21:35:08 [1, 0]: httperf-lru-nostream-default(httperf): Branch: master
2012-06-12 21:35:08 [1, 0]: httperf-lru-nostream-default(httperf): Number of clients involved: 24
2012-06-12 21:35:08 [1, 0]: httperf-lru-nostream-default(httperf): Type of test: httperf
2012-06-12 21:35:08 [1, 0]: httperf-lru-nostream-default(httperf): Test iterations: 1
2012-06-12 21:35:08 [1, 0]: httperf-lru-nostream-default(httperf): Runtime: 204 seconds
2012-06-12 21:35:08 [1, 0]: httperf-lru-nostream-default(httperf): VCL:
backend foo {
.host = "localhost";
.port = "80";
}
sub vcl_fetch {
set beresp.do_stream = false;
}
2012-06-12 21:35:08 [1, 0]: httperf-lru-nostream-default(httperf): Number of total connections: 200000
2012-06-12 21:35:08 [1, 0]: httperf-lru-nostream-default(httperf): Note: connections are subject to rounding when divided among clients. Expect slight deviations.
2012-06-12 21:35:08 [1, 0]: httperf-lru-nostream-default(httperf): Requests per connection: 10
2012-06-12 21:35:08 [1, 0]: httperf-lru-nostream-default(httperf): Extra options to httperf: --wset=1000000,0.1
2012-06-12 21:35:08 [1, 0]: httperf-lru-nostream-default(httperf): Httperf command (last client): httperf --hog --timeout 60 --num-calls 10 --num-conns 8333 --port 8080 --burst-length 10 --client 23/24 --server 10.20.100.12 --wset=1000000,0.1
2012-06-12 21:35:14 [2, 6]: httperf-rapid-expire(httperf): Starting test
2012-06-12 21:36:27 WARNING [0,72]: httperf-rapid-expire(httperf): Panic detected. I think!
2012-06-12 21:36:27 WARNING [0, 0]: httperf-rapid-expire(httperf):
Last panic at: Tue, 12 Jun 2012 21:35:49 GMT
Assert error in cnt_prepresp(), cache/cache_center.c line 271:
Condition(req->sp == sp) not true.
thread = (cache-worker)
ident = Linux,2.6.32-38-generic,x86_64,-smalloc,-smalloc,-hcritbit,epoll
Backtrace:
0x432138: pan_ic+d8
0x4199b2: cnt_prepresp+292
0x41afdd: CNT_Session+43d
0x436b2d: ses_pool_task+fd
0x433942: Pool_Work_Thread+112
0x4417a8: wrk_thread_real+c8
0x7f6d244da9ca: _end+7f6d23e56202
0x7f6d24237cdd: _end+7f6d23bb3515
sp = 0x7f6d14f02f20 {
fd = 14, id = 14, xid = 1837192445,
client = 10.20.100.9 8024,
step = STP_PREPRESP,
handling = deliver,
restarts = 0, esi_level = 0
ws = 0x7f6d1a0be158 {
id = "req",
{s,f,r,e} = {0x7f6d1a0bf730,+384,(nil),+59632},
},
http[req] = {
ws = 0x7f6d1a0be158[req]
"GET",
"/0/2.html",
"HTTP/1.1",
"User-Agent: httperf/0.9.0",
"Host: 10.20.100.12",
"X-Forwarded-For: 10.20.100.9",
},
http[resp] = {
ws = 0x7f6d1a0be158[req]
"HTTP/1.1",
"OK",
"Server: nginx/0.7.65",
"Content-Type: text/plain",
"Last-Modified: Tue, 12 Jun 2012 21:35:18 GMT",
"Transfer-Encoding: chunked",
"Date: Tue, 12 Jun 2012 21:35:49 GMT",
"X-Varnish: 1837192445 1837174625",
"Age: 1",
"Via: 1.1 varnish",
"Connection: keep-alive",
},
worker = 0x7f6d15492c60 {
ws = 0x7f6d15492e20 {
id = "wrk",
{s,f,r,e} = {0x7f6d15492450,0x7f6d15492450,(nil),+2048},
},
},
vcl = {
srcname = {
"input",
"Default",
},
},
obj = 0x7f6d15646400 {
xid = 1837174625,
ws = 0x7f6d15646418 {
id = "obj",
{s,f,r,e} = {0x7f6d156465e8,+216,(nil),+248},
},
http[obj] = {
ws = 0x7f6d15646418[obj]
"HTTP/1.1",
"OK",
"Server: nginx/0.7.65",
"Date: Tue, 12 Jun 2012 21:35:48 GMT",
"Content-Type: text/plain",
"Last-Modified: Tue, 12 Jun 2012 21:35:18 GMT",
"Content-Encoding: gzip",
"Content-Length: 181",
},
len = 181,
store = {
181 {
1f 8b 08 00 00 00 00 00 00 03 25 8f 41 8e 43 31 |..........%.A.C1|
08 43 af e2 03 54 3d 45 97 b3 9d 03 d0 04 55 96 |.C...T=E......U.|
42 f8 4d 60 d4 e3 97 3f 7f 07 c2 7e 36 3f be d4 |B.M`...?...~6?..|
c0 63 a7 a1 fb f0 85 cd 80 98 c6 0d cd e7 d6 16 |.c..............|
[117 more]
},
},
},
},
2012-06-12 21:36:27 WARNING [0, 0]: httperf-rapid-expire(httperf): Varnishstat uptime and measured run-time is too large (measured: 68 stat: 36 diff: 32). Did we crash?
2012-06-12 21:36:27 WARNING [0, 0]: httperf-rapid-expire(httperf): Out of bounds: client_req(110630) less than lower boundary 989840
2012-06-12 21:36:27 [1, 0]: httperf-rapid-expire(httperf): Load: 23:36:27 up 12 days, 9:59, 0 users, load average: 3.02, 1.92, 1.15
2012-06-12 21:36:27 [1, 0]: httperf-rapid-expire(httperf): Test name: httperf-rapid-expire
2012-06-12 21:36:27 [1, 0]: httperf-rapid-expire(httperf): Varnish options:
2012-06-12 21:36:27 [1, 0]: httperf-rapid-expire(httperf): -t=2
2012-06-12 21:36:27 [1, 0]: httperf-rapid-expire(httperf): Varnish parameters:
2012-06-12 21:36:27 [1, 0]: httperf-rapid-expire(httperf): thread_stats_rate=1
2012-06-12 21:36:27 [1, 0]: httperf-rapid-expire(httperf): Payload size (excludes headers): 256
2012-06-12 21:36:27 [1, 0]: httperf-rapid-expire(httperf): Branch: master
2012-06-12 21:36:27 [1, 0]: httperf-rapid-expire(httperf): Number of clients involved: 24
2012-06-12 21:36:27 [1, 0]: httperf-rapid-expire(httperf): Type of test: httperf
2012-06-12 21:36:27 [1, 0]: httperf-rapid-expire(httperf): Test iterations: 1
2012-06-12 21:36:27 [1, 0]: httperf-rapid-expire(httperf): Runtime: 68 seconds
2012-06-12 21:36:27 [1, 0]: httperf-rapid-expire(httperf): VCL:
backend foo {
.host = "localhost";
.port = "80";
}
sub vcl_fetch {
set beresp.do_stream = true;
}
2012-06-12 21:36:27 [1, 0]: httperf-rapid-expire(httperf): Number of total connections: 100000
2012-06-12 21:36:27 [1, 0]: httperf-rapid-expire(httperf): Note: connections are subject to rounding when divided among clients. Expect slight deviations.
2012-06-12 21:36:27 [1, 0]: httperf-rapid-expire(httperf): Requests per connection: 10
2012-06-12 21:36:27 [1, 0]: httperf-rapid-expire(httperf): Extra options to httperf: --wset=100,0.30
2012-06-12 21:36:27 [1, 0]: httperf-rapid-expire(httperf): Httperf command (last client): httperf --hog --timeout 60 --num-calls 10 --num-conns 4166 --port 8080 --burst-length 10 --client 23/24 --server 10.20.100.12 --wset=100,0.30
2012-06-12 21:36:34 [2, 6]: streaming-grace(httperf): Starting test
2012-06-12 21:39:27 [2,173]: cold-default(httperf): Starting test
2012-06-12 21:43:00 WARNING [0,212]: cold-default(httperf): Panic detected. I think!
2012-06-12 21:43:00 WARNING [0, 0]: cold-default(httperf):
Last panic at: Tue, 12 Jun 2012 21:41:24 GMT
Assert error in cnt_first(), cache/cache_center.c line 949:
Condition(req->sp == sp) not true.
thread = (cache-worker)
ident = Linux,2.6.32-38-generic,x86_64,-smalloc,-smalloc,-hcritbit,epoll
Backtrace:
0x432138: pan_ic+d8
0x41b391: CNT_Session+7f1
0x436b2d: ses_pool_task+fd
0x433942: Pool_Work_Thread+112
0x4417a8: wrk_thread_real+c8
0x7f54e61309ca: _end+7f54e5aac202
0x7f54e5e8dcdd: _end+7f54e5809515
sp = 0x7f54d52db120 {
fd = 15, id = 15, xid = 0,
client = ,
step = STP_FIRST,
handling = deliver,
restarts = 0, esi_level = 0
ws = 0x7f54d6d46158 {
id = "req",
{s,f,r,e} = {0x7f54d6d47730,0x7f54d6d47730,(nil),+59632},
},
http[req] = {
ws = (nil)[]
},
worker = 0x7f54d7431c60 {
ws = 0x7f54d7431e20 {
id = "wrk",
{s,f,r,e} = {0x7f54d7431450,0x7f54d7431450,(nil),+2048},
},
},
},
2012-06-12 21:43:00 WARNING [0, 0]: cold-default(httperf): Varnishstat uptime and measured run-time is too large (measured: 208 stat: 95 diff: 113). Did we crash?
2012-06-12 21:46:11 WARNING [0,191]: cold-default(httperf): Panic detected. I think!
2012-06-12 21:46:11 WARNING [0, 0]: cold-default(httperf):
Last panic at: Tue, 12 Jun 2012 21:44:28 GMT
Assert error in cnt_first(), cache/cache_center.c line 949:
Condition(req->sp == sp) not true.
thread = (cache-worker)
ident = Linux,2.6.32-38-generic,x86_64,-smalloc,-smalloc,-hcritbit,epoll
Backtrace:
0x432138: pan_ic+d8
0x41b391: CNT_Session+7f1
0x436b2d: ses_pool_task+fd
0x433942: Pool_Work_Thread+112
0x4417a8: wrk_thread_real+c8
0x7f54e61309ca: _end+7f54e5aac202
0x7f54e5e8dcdd: _end+7f54e5809515
sp = 0x7f54d9b04b20 {
fd = 29, id = 29, xid = 0,
client = ,
step = STP_FIRST,
handling = deliver,
restarts = 0, esi_level = 0
ws = 0x7f54d6702158 {
id = "req",
{s,f,r,e} = {0x7f54d6703730,0x7f54d6703730,(nil),+59632},
},
http[req] = {
ws = (nil)[]
},
worker = 0x7f54d74ecc60 {
ws = 0x7f54d74ece20 {
id = "wrk",
{s,f,r,e} = {0x7f54d74ec450,0x7f54d74ec450,(nil),+2048},
},
},
},
2012-06-12 21:46:11 WARNING [0, 0]: cold-default(httperf): Varnishstat uptime and measured run-time is too large (measured: 399 stat: 102 diff: 297). Did we crash?
2012-06-12 21:46:11 WARNING [0, 0]: cold-default(httperf): Out of bounds: client_req(157313) less than lower boundary 1589840
2012-06-12 21:46:12 [1, 0]: cold-default(httperf): Load: 23:46:12 up 12 days, 10:09, 0 users, load average: 0.46, 0.85, 0.92
2012-06-12 21:46:12 [1, 0]: cold-default(httperf): Test name: cold-default
2012-06-12 21:46:12 [1, 0]: cold-default(httperf): Varnish options:
2012-06-12 21:46:12 [1, 0]: cold-default(httperf): -t=3600
2012-06-12 21:46:12 [1, 0]: cold-default(httperf): -s=malloc,10G
2012-06-12 21:46:12 [1, 0]: cold-default(httperf): Varnish parameters:
2012-06-12 21:46:12 [1, 0]: cold-default(httperf): thread_stats_rate=1
2012-06-12 21:46:12 [1, 0]: cold-default(httperf): Payload size (excludes headers): 256
2012-06-12 21:46:12 [1, 0]: cold-default(httperf): Branch: master
2012-06-12 21:46:12 [1, 0]: cold-default(httperf): Number of clients involved: 24
2012-06-12 21:46:12 [1, 0]: cold-default(httperf): Type of test: httperf
2012-06-12 21:46:12 [1, 0]: cold-default(httperf): Test iterations: 2
2012-06-12 21:46:12 [1, 0]: cold-default(httperf): Runtime: 399 seconds
2012-06-12 21:46:12 [1, 0]: cold-default(httperf): VCL:
backend foo {
.host = "localhost";
.port = "80";
}
sub vcl_fetch {
set beresp.do_stream = true;
}
2012-06-12 21:46:12 [1, 0]: cold-default(httperf): Number of total connections: 80000
2012-06-12 21:46:12 [1, 0]: cold-default(httperf): Note: connections are subject to rounding when divided among clients. Expect slight deviations.
2012-06-12 21:46:12 [1, 0]: cold-default(httperf): Requests per connection: 10
2012-06-12 21:46:12 [1, 0]: cold-default(httperf): Extra options to httperf: --wset=4000000,0.50
2012-06-12 21:46:12 [1, 0]: cold-default(httperf): Httperf command (last client): httperf --hog --timeout 60 --num-calls 10 --num-conns 3333 --port 8080 --burst-length 10 --client 23/24 --server 10.20.100.12 --wset=4000000,0.50
2012-06-12 21:46:18 [2, 6]: 4gpluss-nogzip(httperf): Starting test
2012-06-12 21:46:21 WARNING [0, 2]: Varnish failed to start. Fallback attempts starting
2012-06-12 21:46:21 [1, 0]: Removed (hopefully incompatible) parameter to try to start varnish. Key: thread_stats_rate=1
2012-06-12 21:46:22 [1, 0]: Removed (hopefully incompatible) parameter to try to start varnish. Key: thread_pool_max=300
2012-06-12 21:46:22 [1, 0]: Removed (hopefully incompatible) parameter to try to start varnish. Key: sess_timeout=60000s
2012-06-12 21:46:23 [1, 0]: Fallback worked. Parameter that seemed to cause problems: sess_timeout
2012-06-12 22:08:07 [2,1303]: purge-fail(httperf): Starting test
2012-06-12 22:12:56 WARNING [0,288]: purge-fail(httperf): Panic detected. I think!
2012-06-12 22:12:56 WARNING [0, 0]: purge-fail(httperf):
Last panic at: Tue, 12 Jun 2012 22:12:26 GMT
Assert error in CNT_Session(), cache/cache_center.c line 1639:
Condition((sp->req->sp) != 0) not true.
thread = (cache-worker)
ident = Linux,2.6.32-38-generic,x86_64,-smalloc,-smalloc,-hcritbit,epoll
Backtrace:
0x432138: pan_ic+d8
0x41b7ed: CNT_Session+c4d
0x436b2d: ses_pool_task+fd
0x433942: Pool_Work_Thread+112
0x4417a8: wrk_thread_real+c8
0x7f41859009ca: _end+7f418527c202
0x7f418565dcdd: _end+7f4184fd9515
sp = 0x7f4176702920 {
fd = 13, id = 13, xid = 0,
client = 10.20.100.8 24087,
step = STP_WAIT,
handling = deliver,
restarts = 0, esi_level = 0
ws = 0x7f4176924158 {
id = "req",
{s,f,r,e} = {0x7f4176925730,0x7f4176925730,+32768,+59632},
},
http[req] = {
ws = (nil)[]
},
worker = 0x7f4176b9bc60 {
ws = 0x7f4176b9be20 {
id = "wrk",
{s,f,r,e} = {0x7f4176b9b450,0x7f4176b9b450,(nil),+2048},
},
},
},
2012-06-12 22:12:56 WARNING [0, 0]: purge-fail(httperf): Varnishstat uptime and measured run-time is too large (measured: 284 stat: 29 diff: 255). Did we crash?
2012-06-12 22:12:56 WARNING [0, 0]: purge-fail(httperf): Out of bounds: client_req(9169) less than lower boundary 290000
2012-06-12 22:12:56 [1, 0]: purge-fail(httperf): Load: 00:12:56 up 12 days, 10:35, 0 users, load average: 0.25, 0.55, 1.02
2012-06-12 22:12:56 [1, 0]: purge-fail(httperf): Test name: purge-fail
2012-06-12 22:12:56 [1, 0]: purge-fail(httperf): Varnish options:
2012-06-12 22:12:56 [1, 0]: purge-fail(httperf): Varnish parameters:
2012-06-12 22:12:56 [1, 0]: purge-fail(httperf): thread_stats_rate=1
2012-06-12 22:12:56 [1, 0]: purge-fail(httperf): Payload size (excludes headers): 1K
2012-06-12 22:12:56 [1, 0]: purge-fail(httperf): Branch: master
2012-06-12 22:12:56 [1, 0]: purge-fail(httperf): Number of clients involved: 24
2012-06-12 22:12:56 [1, 0]: purge-fail(httperf): Type of test: httperf
2012-06-12 22:12:56 [1, 0]: purge-fail(httperf): Test iterations: 1
2012-06-12 22:12:56 [1, 0]: purge-fail(httperf): Runtime: 284 seconds
2012-06-12 22:12:56 [1, 0]: purge-fail(httperf): VCL:
backend foo {
.host = "localhost";
.port = "80";
}
sub vcl_recv {
if (!req.url ~ "/0/0.html") {
set req.request = "PURGE";
}
set req.url = "/foo";
return (lookup);
}
sub vcl_hit {
if (req.request == "PURGE") {
set obj.ttl = 0s;
error 200 "OK";
}
}
sub vcl_miss {
if (req.request == "PURGE") {
error 200 "Not in cache but not confusing httperf";
}
}
2012-06-12 22:12:56 [1, 0]: purge-fail(httperf): Number of total connections: 300000
2012-06-12 22:12:56 [1, 0]: purge-fail(httperf): Note: connections are subject to rounding when divided among clients. Expect slight deviations.
2012-06-12 22:12:56 [1, 0]: purge-fail(httperf): Requests per connection: 1
2012-06-12 22:12:56 [1, 0]: purge-fail(httperf): Extra options to httperf: --wset=999,0.5 --timeout=5
2012-06-12 22:12:56 [1, 0]: purge-fail(httperf): Httperf command (last client): httperf --hog --timeout 60 --num-calls 1 --num-conns 12500 --port 8080 --burst-length 10 --client 23/24 --server 10.20.100.12 --wset=999,0.5 --timeout=5
2012-06-12 22:13:03 [2, 6]: streaming-gzip(httperf): Starting test
2012-06-12 22:15:37 WARNING [0,154]: Tests finished with problems detected. Failed expectations: 15 Total run time: 9816 seconds