Hi all,
Last week I had an itch that ab (the web performance tester shipped with
Apache) just didn't scratch, so I decided to sprinkle some Ruby dust over
the problem and see what that bought me. RWB is a lightweight (and alpha
quality) performance/load testing tool for websites. It offers a few
nice features:
- You can build up a weighted list of URLs, then test them at a given
total number of requests and level of concurrency. Each request is
drawn at random from your weighted list.
- Given a group or family of URLs, you can specify them using a
weighted base URL and an array of extensions. Each time a request is
pulled from the group, it gets a random extension from the array.
- Reports are given for the run overall, and for each URL or URL group.
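Putting a test together looks roughly like this; since the whole thing is
still alpha, treat the builder and report method names here as illustrative
sketches rather than a final interface:

  # Illustrative sketch only -- method names are placeholders for the
  # shape of the API, not a stable interface.
  require 'rwb'

  urls = RWB::Builder.new
  urls.add_url(60, 'http://localhost/')            # 60% of requests hit the front page
  urls.add_url(10, 'http://localhost/about')       # 10% hit the about page
  urls.add_url_group(30, 'http://localhost/item/', # 30% hit a random item page,
                     %w(1 2 3 4))                  # picking a random extension each time

  runner = RWB::Runner.new(urls, 1000, 50)         # 1000 total requests, concurrency of 50
  runner.run
  runner.report_overall                            # percentiles for the whole run
  runner.report_urls                               # per-URL and per-group breakdowns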
Once I'm a bit happier with the engine and the reporting, RWB will be turned
into a DSL (rather like Rake) to make building and running load or performance
tests even easier.
For now though, I'd love to hear what works well (or doesn't), what additional
features people would like to see, etc.
Nice. I think your calculation of the mean request time might be wrong,
since the overall 50th percentile is 41 msecs with a shortest time of 18.
Also, where's the standard deviation?
Zed A. Shaw
<snip>
Concurrency Level: 50
Total Requests: 1000
Total time for testing: 4.148434 secs
Requests per second: 241.054817311786
Mean request time: 4 msecs
Overall results:
Shortest time: 18 msecs
50.0%ile time: 41 msecs
90.0%ile time: 55 msecs
99.9%ile time: 81 msecs
Longest time: 81 msecs
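For what it's worth, once the raw per-request timings are sitting in a plain
array of milliseconds somewhere, mean and standard deviation are only a few
lines of Ruby. A sketch (the timings array here is made up for illustration):

  # Sketch: mean and standard deviation over per-request timings in msecs.
  # The timings array is invented; in RWB it would be whatever the runner
  # collected during the run.
  timings = [18, 23, 41, 47, 55, 81]

  mean     = timings.inject(0.0) { |sum, t| sum + t } / timings.size
  variance = timings.inject(0.0) { |sum, t| sum + (t - mean) ** 2 } / timings.size
  std_dev  = Math.sqrt(variance)

  puts "Mean request time: #{'%.1f' % mean} msecs"
  puts "Std deviation:     #{'%.1f' % std_dev} msecs"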
Hi,
This looks really quite handy -- I think I've had the same issues as
you have had with ab. In a few days I'll try it on a couple of
systems I'm working on.
Since you are asking for suggestions, one thing that would be nice
(maybe it is already there and I just didn't notice it) is a warmup
phase. In the applications I'm working on, the first couple of hits on a
URL cause work to be done that I don't really want timed.
Not there yet, but I love the idea, so I'll add it shortly. How does
something like this look:
(in the RWB::Runner class)
warmup(num_times)         # this would just walk through the array of urls
                          # and url_groups num_times times, making
                          # num_times * array.length total requests

rand_warmup(num_requests) # this would randomly select urls to request,
                          # making num_requests total requests
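So a run with a warmup phase might look something like this (again just a
sketch; warmup and rand_warmup are only the proposed names above, nothing is
implemented yet, and the other calls are the same rough names as before):

  # Sketch only -- nothing here is built yet.
  runner = RWB::Runner.new(urls, 1000, 50)
  runner.warmup(2)           # walk every url/url_group twice, untimed
  # runner.rand_warmup(100)  # ...or fire 100 randomly chosen untimed requests
  runner.run                 # the timed run now starts against a warm app
  runner.report_overall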
···
On 11/8/05, Bob Hutchison <hutch@recursive.ca> wrote:
Looks good. Maybe with the ability to specify a set of URLs to warm up on and an order within them? Maybe too much work for small gain?
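Something like this is what I was picturing; it's completely made up, just
to show the shape of the idea:

  # Completely hypothetical -- just sketching an ordered warmup list.
  runner.warmup_urls(['http://localhost/login',
                      'http://localhost/dashboard'],
                     :ordered => true)   # hit these URLs, in this order, untimed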
Cheers,
Bob