My dev box runs Windows XP on an oldish Intel Core 2. It processes my rules in 13s, with a peak memory usage of about 600MB.
 
When I run the exact same code on my 64-bit Linux server with a quad-core 2.40GHz Xeon, it takes 22s and uses a peak of 1.5GB of memory.
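
For what it's worth, here is roughly how the numbers can be taken consistently on both boxes, assuming the rules run on a JVM (which I haven't stated above, so treat it as an assumption; runRules() is a hypothetical stand-in for the actual rule processing):

    import java.lang.management.ManagementFactory;
    import java.lang.management.MemoryPoolMXBean;
    import java.lang.management.MemoryType;

    public class RuleBenchmark {
        public static void main(String[] args) {
            long start = System.nanoTime();

            // runRules();  // hypothetical entry point for the rule processing

            long elapsedMs = (System.nanoTime() - start) / 1_000_000;

            // Sum peak heap usage across all heap memory pools since JVM start
            long peakHeapBytes = 0;
            for (MemoryPoolMXBean pool : ManagementFactory.getMemoryPoolMXBeans()) {
                if (pool.getType() == MemoryType.HEAP) {
                    peakHeapBytes += pool.getPeakUsage().getUsed();
                }
            }

            System.out.printf("elapsed: %d ms, peak heap: %d MB%n",
                    elapsedMs, peakHeapBytes / (1024 * 1024));
        }
    }

Measuring inside the process like this avoids differences in how the two operating systems report memory to external tools.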
 
What on earth is going on here? I would expect it to run faster on the server, with similar memory usage.
 
 
Cheers,
Tim