There were 4 stuck build processes that all appear to be related to testing Universes, I think. Here is an example:
jenkins 27839 0.0 0.0 1748 492 ? S Mar30 0:00 sh -c nice /var/lib/jenkins/workspace/SqueakTrunk/target/cog.r2701/coglinux/bin/squeak -vm-sound-null -vm-display-null "/var/lib/jenkins/workspace/SqueakTrunk/target/Universes.image" ../package-load-tests/Universes.st
jenkins 27841 2.2 2.3 1054528 24096 ? RN Mar30 60:37 /var/lib/jenkins/workspace/SqueakTrunk/target/cog.r2701/coglinux/bin/../lib/squeak/4.0-2701/squeak -vm-sound-null -vm-display-null /var/lib/jenkins/workspace/SqueakTrunk/target/Universes.image ../package-load-tests/Universes.st
I killed them.
Ken
Hi Ken,
Thanks. I'm hoping that once I've figured out the latest issue with CI this will go away. Jenkins aborts the build, but since the build kicks off the actual tests in separate child processes, it looks like those aren't killed when the build process is killed.
frank
On 1 April 2013 21:04, Ken Causey ken@kencausey.com wrote:
There were 4 stuck build processes that all appear to be related to testing Universes, I think. Here is an example:
jenkins 27839 0.0 0.0 1748 492 ? S Mar30 0:00 sh -c nice /var/lib/jenkins/workspace/SqueakTrunk/target/cog.r2701/coglinux/bin/squeak -vm-sound-null -vm-display-null "/var/lib/jenkins/workspace/SqueakTrunk/target/Universes.image" ../package-load-tests/Universes.st
jenkins 27841 2.2 2.3 1054528 24096 ? RN Mar30 60:37 /var/lib/jenkins/workspace/SqueakTrunk/target/cog.r2701/coglinux/bin/../lib/squeak/4.0-2701/squeak -vm-sound-null -vm-display-null /var/lib/jenkins/workspace/SqueakTrunk/target/Universes.image ../package-load-tests/Universes.st
I killed them.
Ken
I spoke too soon. There are 2 more from today. They are about 22 hours old so I'm going to go ahead and kill them. I wondered at first why there were 2 that started at almost the same time (4 minutes apart) but it appears these tests are being done both with an interpreter VM and Cog. Correct?
Ken
Yes, that's exactly right. So until I figure out the problem I guess we'll be looking at two hung processes per SqueakTrunk run.
frank
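Until the root cause is fixed, one possible stopgap (my suggestion, not something the thread's setup already does) is to bound each VM run with coreutils `timeout`, so a hung run under either VM dies on its own instead of lingering for days:

```shell
# One test run per VM (interpreter and Cog), each bounded by `timeout`.
# `sleep 300` stands in for the real "$vm ... Universes.image" invocation;
# a real job would use a limit of hours rather than 2 seconds.
for vm in interpreter cog; do
  timeout 2 sleep 300
  status=$?
  echo "$vm run exited with status $status"   # 124 means the timeout fired
done
```

With this in place a hung Universes test would cost at most the timeout per VM, rather than an indefinitely stuck process pair.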
I mean, obviously it's my fault. In this case, Universes depends on XML-Parser and Nebraska, and I've just stripped those out of the Trunk core image. Sigh. The fix is to construct an Installer script for Universes. The proper job is to make a SqueakMap entry with this Installer script (or go the Metacello route). But one step at a time...
frank
box-admins@lists.squeakfoundation.org