There are 168 hours in a week, and the box3 Altitude image has been up continuously for 189 hours. So the intermittent problems I've had with it haven't appeared for a while.
Chris
Am 19.11.2012 14:20, schrieb Chris Cunnington:
There are 168 hours in a week, and the box3 Altitude image has been up continuously for 189 hours. So the intermittent problems I've had with it haven't appeared for a while.
Hi,
I gave it a few clicks so it at least sees some traffic. Then I tried to wget it; that went very slowly (I'm friendly when I wget and use a random delay of 10 seconds). wget doesn't seem to know when a file is complete. The downloads don't work. Maybe the problem is in front of my computer and doesn't know enough about wget.
I wrote a Selenium test and ran it a few times. I can offer to run it on a few computers concurrently when it's convenient for you.
Did you do some load tests or longer continuous tests?
Cheers
Herbert
On 2012-11-19 9:52 AM, Herbert König wrote:
Hi,
I gave it a few clicks so it at least sees some traffic. Then I tried to wget it; that went very slowly (I'm friendly when I wget and use a random delay of 10 seconds). wget doesn't seem to know when a file is complete.
That's interesting.
The downloads don't work. Maybe the problem is in front of my computer and doesn't know enough about wget.
I'm not sure what you mean by downloads. Just downloading a page, you mean? Because it doesn't know when a page is complete. There's only one download link on the site and it works OK for me.
I wrote a Selenium test and ran it a few times. I can offer to run it on a few computers concurrently when it's convenient for you.
That sounds great. Thanks for testing.
Did you do some load tests or longer continuous tests?
Not so far. I just deployed it and monitored its resource consumption on box3 to ensure it didn't start hogging CPU, memory, etc.
Cheers
Herbert

_______________________________________________
Webteam mailing list
Webteam@lists.squeakfoundation.org
http://lists.squeakfoundation.org/cgi-bin/mailman/listinfo/webteam
Am 19.11.2012 17:44, schrieb Chris Cunnington:
The downloads don't work. Maybe the problem is in front of my computer and doesn't know enough about wget.
I'm not sure what you mean by downloads. Just downloading a page, you mean? Because it doesn't know when a page is complete. There's only one download link on the site and it works OK for me.
I'm new to Linux. I use wget on Windows as a website downloader, so afterward I get a set of files and folders I can browse offline. That isn't possible with what wget gets from http://173.246.101.237:8624/. When it follows a link like "Saving to: `173.246.101.237+8624/O7OBMjjYd15btNOhMlRmoxvn'", it never finishes; it just times out.
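For reference, "random delay 10" in the thread presumably maps to wget's real --wait and --random-wait options. A sketch of such a polite recursive fetch follows; the exact flag set is my assumption, and since the test server is long gone the command is only printed here rather than executed:

```shell
# Printed, not executed: the server at 173.246.101.237:8624 no longer
# exists. --wait=10 --random-wait makes wget pause a randomized delay
# of roughly 10 seconds between requests; the remaining flags are one
# reasonable guess at a polite offline mirror.
cmd='wget --recursive --level=2 --wait=10 --random-wait \
     --timeout=30 --tries=2 --convert-links --page-requisites \
     http://173.246.101.237:8624/'
printf '%s\n' "$cmd"
```

With session-specific links like the O7OBM... one above, a recursive crawl will keep discovering "new" URLs, which is consistent with the never-finishing behavior described.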
I wrote a Selenium test and ran it a few times. I can offer to run it on a few computers concurrently when it's convenient for you.
That sounds great. Thanks for testing.
Did you do some load tests or longer continuous tests?
Not so far. I just deployed and monitored its resource consumption on box3 to ensure it didn't start hogging CPU, memory, etc.
That's what I use Selenium for. I fire up several computers (VMs) with Firefox and start the tests on each. Then I log into my hosted server, fire up top, and watch memory and CPU. At the same time I RFB into my Squeak image and play with VM statistics.
When I'm back home we can do that, or, if you like (and use Firefox), you can get the Selenium add-on and run the attached test (File, Open, Run) on as many machines as you like. Yes, it's HTML, and you can copy and paste as many of the <tr> rows as you want to make it run longer. It's a really dumb test, just clicking up and down the menu on the left.
I just don't want to shoot down your server by running any number of long tests at full speed unsolicited.
wget and test attached.
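The attachment itself isn't preserved in the archive, but a Selenium IDE test of that era was just an HTML table of commands, which is what makes the copy-and-paste trick above work. A hypothetical minimal version (the command names are real Selenese; the link targets are invented for illustration):

```html
<table>
  <tr><td>open</td><td>/</td><td></td></tr>
  <tr><td>clickAndWait</td><td>link=About Squeak</td><td></td></tr>
  <tr><td>clickAndWait</td><td>link=Downloads</td><td></td></tr>
  <!-- duplicate <tr> rows like these to make the run longer -->
</table>
```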
Cheers
Herbert
On 2012-11-19 2:18 PM, Herbert König wrote:
This is great-looking stuff. I've added Selenium to Firefox, and I'm going to explore the files you attached; I'll need a day or so to dig into them. I thought Colin was on this list, but it appears he is not. He's now aware of the conversation that has been going on here, and hopefully we can get his input on some things.
Chris
On 2012-11-19 2:18 PM, Herbert König wrote:
That's what I use Selenium for. I fire up several computers (VMs) with Firefox and start the tests on each.
You have different VMs for different OSs?
Then I log into my hosted server, fire up top, and watch memory and CPU. At the same time I RFB into my Squeak image and play with VM statistics.
You get the actions to run over and over. That's why the file you sent has the same actions repeated, I guess.
When I'm back home we can do that, or if you like (and use Firefox) you get the addon Selenium and run the attached test (file, open, run) on as many machines as you like. Yes it's HTML and you can copy paste as many of the tr as you want to make it run longer. This is a real stupid test, just clicking up and down the menu on the left.
OK, I get it. I use Add Test Case... to load your SqueakSiteLonger.html file into Selenium, fire up a terminal, run top -d 1, and watch at whatever speed I set for it. Nifty.
I just don't want to shoot down your server by running any number of long tests at full speed unsolicited.
wget and test attached.
The files you downloaded with the non-RESTful file names are from the wget test. They never complete; they just time out.
Cheers
Herbert
On 2012-11-21 12:01 PM, Chris Cunnington wrote:
You have different VMs for different OSs?
Not different OSs, I guess; that doesn't matter. You have different VMs for different users, and then you can generate a high or low click frequency from several points at once.
This image has been running for ~270 hours. If you want to fire up your rig and try to overwhelm box3.squeak.org:8624, that could be useful. We could see how much it can take.
And if it isn't useful, at least it'll be amusing. :)
Chris
This image has been running for ~270 hours. If you want to fire up your rig and try to overwhelm box3.squeak.org:8624, that could be useful. We could see how much it can take.
And if it isn't useful, at least it'll be amusing. :)
It was at least amusing. I ran five clients, usually at two clicks per second, for half an hour. Sometimes I had them all run at full speed (next click as soon as the response from the web server was complete). I didn't see any degradation, and I assume being behind one home router was the limiting factor here. Anyway, if Squeak still uses a reasonable amount of memory, I think the Squeak site will never see more load than that.
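The rig itself isn't included here, but the shape of such a test is simple. A self-contained sketch of it; TARGET and the client/click counts are placeholders, and the real test drove Firefox via Selenium against box3.squeak.org:8624 rather than using curl:

```shell
# Several concurrent clients, each "clicking" (GETting) a page
# repeatedly with a delay between clicks. TARGET is a placeholder.
TARGET="${TARGET:-http://localhost:8624/}"
clients=5
clicks=10
for c in $(seq "$clients"); do
  (
    for k in $(seq "$clicks"); do
      # --max-time keeps a dead server from stalling the loop
      curl -s -o /dev/null --max-time 2 "$TARGET" || true
      sleep 0.5   # roughly two clicks per second, as in the thread
    done
  ) &
done
wait
echo "finished: $clients clients x $clicks clicks each"
```

Running all clients with the sleep removed corresponds to the "full speed" variant described above.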
Maybe one day we can do this while you watch the CPU and memory use? I assume Linux has tools to record these over a period of time.
Cheers
Herbert
On 2012-11-23 3:59 AM, Herbert König wrote:
It was at least amusing. I ran five clients, usually at two clicks per second, for half an hour. Sometimes I had them all run at full speed (next click as soon as the response from the web server was complete). I didn't see any degradation, and I assume being behind one home router was the limiting factor here. Anyway, if Squeak still uses a reasonable amount of memory, I think the Squeak site will never see more load than that.
Wow. That's amazing. It looks like it's fairly stable. This is the latest from top:
30743 chriscun 20 0 1042m 53m 1296 R 2.3 5.3 457:41.95 squeak
457 hours. That looks good.
I spent half an hour grepping Apache access logs looking for the half hour you tested. Then I realized there is no Apache stanza for this site, so there are no logs. Also, this server has no DNS entry (tinydns, as box2 uses DJB's tools). That'll change soonish.
The problem I had with Altitude may have been due to how I was deploying it. I'd start an image, log in with RFB, load in Altitude, and then the site code. This last time I prepared the image fully on the desktop, zipped it, FTPed it, unzipped it, and started it. The file it loaded at startup (i.e., squeaksite.st) was minimal:
(ALServer on: 8624 application: ALSqueakApplication new) start
I zip before FTPing, as the line endings in the source file can get messed up otherwise. That may be a factor in the greater stability, but I don't really know. I never took a degree in compsci; this is all basically voodoo to me.
Maybe one day we can do this while you are watching the CPU and Memory use? I assume Linux has tools to record these for a certain time.
I think that's a great idea. I'll have finished merging Altitude and Bootstrap today or tomorrow. Then I'd like to retire this image and we can blast the new one.
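For the recording Herbert asks about, one minimal option is a sampling loop over standard tools. A sketch; SQUEAK_PID is a placeholder for the image's PID (the script's own PID is used here only so the example is self-contained), and the interval and duration are arbitrary:

```shell
# Sample CPU% and resident memory of one process once per second and
# append to a log file. SQUEAK_PID would be the Squeak image's PID;
# the script's own PID stands in so this runs anywhere.
SQUEAK_PID=$$
for i in 1 2 3; do
  printf '%s %s\n' "$(date +%T)" "$(ps -o pcpu=,rss= -p "$SQUEAK_PID")"
  sleep 1
done > squeak-usage.log
wc -l < squeak-usage.log
```

Tools like sar or vmstat can record system-wide figures the same way; the ps loop just keeps the sketch dependency-free.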
It's exciting to see Altitude so stable. Thanks for doing this, Herbert.
Chris