On 2004-11-10 06:12:18 +0000, Kake L Pugh wrote:
On Tue 09 Nov 2004, Paul Makepeace <openguides.org@paulm.com> wrote:
I'm going to say it again since I've been Warnocked on this so far:
The issue is that the system is not resilient to bursts of searches, and it needs to be. Making it faster is great, since that increases the number of searches possible per second, but until something queues searches when more than 'n' are already running, we're still running a risk.
I think what people aren't understanding about your idea is what makes "a CGI script that searches an OpenGuide" so intrinsically different from every other CGI script that the only way to stop it taking down a server is to limit the number of copies that can run at once.
You're right - anything that doesn't execute pretty quickly is open to this.
I don't know why there are sometimes dozens of instances of this script in the process table. Perhaps it's some harvesting exercise - the sort of thing search scripts might be particularly vulnerable to? Wild guess.
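If it is robots doing the harvesting, the cheapest fix might be a robots.txt rule (I'm guessing at the script's path here):

    User-agent: *
    Disallow: /search.cgi

Well-behaved crawlers would then skip the search script entirely, though that does nothing against ill-behaved ones - hence the queuing idea below.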
I'm also trying to offer a solution that isn't heavy on your time, so the service can come back up ASAP: queuing searches during load.
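To be concrete, here's the kind of thing I mean - a rough, untested sketch in Perl, with the limit of 5 and the lock file names made up, where a search that can't grab one of 'n' lock slots waits its turn instead of piling on:

    use strict;
    use warnings;
    use Fcntl qw(:flock);

    my $max = 5;    # assumed cap on simultaneous searches
    my $slot;
    ACQUIRE: while (1) {
        for my $i ( 1 .. $max ) {
            open my $fh, '>', "/tmp/og-search-slot.$i" or die "open: $!";
            if ( flock( $fh, LOCK_EX | LOCK_NB ) ) {
                $slot = $fh;    # hold this lock until the process exits
                last ACQUIRE;
            }
            close $fh;
        }
        sleep 1;    # all slots busy: queue by polling until one frees up
    }
    # ... run the expensive Plucene search here ...
    # The lock is dropped automatically when the process ends, so a
    # crashed search can't wedge a slot.

flock appeals to me because it needs no extra infrastructure and the OS cleans up after dead processes; a fancier version could send back a 503 with Retry-After instead of sleeping.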
Or are you saying that there's something unusual about the server setup? Or that all CGI scripts other than the trivial should do this? Or that you've checked the code and looked at Plucene and done something compsci-ish (*waves hands*) to determine that the problem can't be solved in a non-CPU-intensive way?
I don't have that much time/skill :)
P