On Tue 09 Nov 2004, Paul Makepeace <openguides.org@paulm.com> wrote:
I'm going to say it again since I've been Warnocked on this so far:
The issue is that the system is not resilient to bursts of searches, and it needs to be able to withstand them. Making it go faster is great, since it increases the number of searches possible per second, but until there is something that queues searches when more than 'n' are happening, we're still running a risk.
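A minimal sketch of the sort of queueing Paul describes, assuming a flock-based slot scheme around the search CGI; the lock directory, slot count, and filenames are invented for illustration and aren't anything in the OpenGuides code:

    #!/usr/bin/perl
    use strict;
    use warnings;
    use Fcntl qw(:flock);

    # Illustrative names and numbers only, not from OpenGuides.
    my $max_concurrent = 4;
    my $lock_dir       = "/tmp/og-search-locks";
    mkdir $lock_dir unless -d $lock_dir;

    # Try to grab one of $max_concurrent slot locks without blocking.
    my $fh;
    my $got_slot = 0;
    for my $slot ( 1 .. $max_concurrent ) {
        open $fh, ">", "$lock_dir/slot.$slot" or next;
        if ( flock $fh, LOCK_EX | LOCK_NB ) {
            $got_slot = 1;
            last;
        }
        close $fh;
    }

    # All slots busy: queue on slot 1 with a blocking lock, so excess
    # searches wait their turn instead of piling onto the CPU at once.
    unless ($got_slot) {
        open $fh, ">", "$lock_dir/slot.1" or die "lock open failed: $!";
        flock $fh, LOCK_EX or die "lock failed: $!";
    }

    # ... run the actual search here; the lock is released when the
    # process exits and $fh is closed.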
I think what people aren't understanding about your idea is what makes "a CGI script that searches an OpenGuide" so intrinsically different from every other CGI script that the only way to stop it taking down a server is to limit the number of copies that can run at once.
Or are you saying that there's something unusual about the server setup? Or that all CGI scripts other than the trivial should do this? Or that you've checked the code and looked at Plucene and done something compsci-ish (*waves hands*) to determine that the problem can't be solved in a non-CPU-intensive way?
(So I think this was Warnock's Dilemma #4.)
Kake