On 6 March 2013 21:37, Jeff Gonis jeff.gonis@gmail.com wrote:
Hi Frank,
So, to my chagrin, the performance tests on Squeak-CI are pretty much worse than useless, showing up to a 100% difference in times between builds. For my part I am looking into the SMark framework that Stefan Marr offered, working to learn it and see how I could use it.
But I was curious whether there are some other steps I could take in the meantime to make the times a little more repeatable. The first thing that comes to mind is that we "nice" the VM with, I believe, the default value before running it, for both the tests and the benchmarks. I don't know how to log into the server Jenkins is running on, so I can't check what the default nice value is, but I assume it is 0, as that seems fairly standard. Do you think it would be reasonable to try out a high-priority value for at least the benchmarks portion of the SqueakTrunk build? That is, fire off the benchmarks at a -20 priority and see if we can get some repeatability.
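Something like the following is what I have in mind. The VM and image names here are made up, and I'm assuming the Jenkins user can get root via sudo, since negative niceness requires privileges:

```shell
# Niceness a child process inherits by default -- I'm assuming 0 on the CI box
nice

# Lowering priority (positive increments) needs no privileges; the child
# runs at the parent's niceness plus the increment:
nice -n 10 sh -c 'nice'

# Raising priority (negative increments, e.g. -20) needs root, so the
# benchmark step would be something like (binary and image names made up):
#   sudo nice -n -20 ./squeak SqueakTrunk.image benchmarks.st
```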
I am not sure about the impacts this might have elsewhere, which is why I wanted to run it by you before sending a pull request. If some other easy win that I have overlooked leaps to mind, please let me know; otherwise I will keep plugging away with SMark, and also look into running the tests multiple times to warm up Cog.
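For the run-it-multiple-times idea, a minimal sketch of the "discard the early runs" bookkeeping at the shell level (the benchmark command is a stand-in; strictly, Cog's JIT warm-up has to happen inside a single image session, which is what SMark's warm-up iterations are for, but discarding early runs also smooths out cold-cache effects at the Jenkins level):

```shell
runs=5      # total benchmark runs
warmup=2    # early runs to discard
: > timings.log
for i in $(seq 1 "$runs"); do
  start=$(date +%s%N)
  sh -c 'true'    # stand-in for ./squeak benchmarks.image (made-up names)
  end=$(date +%s%N)
  echo $(( (end - start) / 1000000 )) >> timings.log   # elapsed ms per run
done
# Report only the post-warm-up timings:
tail -n +$(( warmup + 1 )) timings.log
```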
Thanks for your help, Jeff G.
Hi Jeff,
The box-admins folk might have further insight, but I think renicing to be hardcore might be the sensible thing. The performance tests don't run for long, and would only run with the SqueakTrunk build, so they shouldn't impact things too much... but I don't recall what other services run on that box.
frank