Hi, I don't follow OG development, but as the host operator a few things
are coming up. The machine's load is gradually climbing over time, and
some of that is down to OG. Despite a relatively low hit rate, OG is
consuming quite a bit of resource; if OG started taking off it would
take the machine down.
First up: index.cgi takes 0.35s just to pass a `perl -c` syntax check,
so every CGI invocation pays that compilation cost. Any thoughts on
putting OG on a mod_perl server? I have mod_perl running here, of
course, and we'd need to coordinate some apache.conf stuff.
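For the sake of discussion, here's a minimal sketch of the apache.conf
side, using Apache::Registry to run the existing CGI scripts unmodified
under mod_perl. The paths are hypothetical, and the actual config would
need agreeing between us:

```apache
# Hypothetical paths -- adjust to wherever the OG scripts actually live.
# Apache::Registry compiles each CGI script once per child process, so
# the ~0.35s compile cost stops being paid on every request.
PerlModule Apache::Registry

Alias /openguides/ /usr/lib/cgi-bin/openguides/
<Location /openguides/>
    SetHandler     perl-script
    PerlHandler    Apache::Registry
    Options        +ExecCGI
    PerlSendHeader On
</Location>
```

The usual caveat applies: scripts that rely on a fresh interpreter per
request (globals, chdir, etc.) may need auditing before they behave
under a persistent interpreter.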
Second: supersearch.cgi gulps down CPU, often for seconds at a time; it
is a frequent resident of `top` output. This isn't really acceptable.
I'm going to request this feature be turned off unless an effective
optimisation plan, or some other way to reduce its impact here, is put
together pretty soon. Sorry about this, but it's encroaching on the
other services on the machine.
Third: I wonder if there's some way to instruct robots not to spider
parts of your wiki. This ought to speak for itself:
$ grep crawl /var/log/apache/london.openguides.org-access.log | grep 'action=edit' | wc -l
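One possible fix, sketched here with a hypothetical /wiki.cgi path:
standard robots.txt only does prefix matching on paths, so it can't
target a query-string parameter directly, but the major crawlers honour
the `*` wildcard extension:

```
# robots.txt -- hypothetical; assumes the wiki script lives at /wiki.cgi.
User-agent: *
Disallow: /*action=edit
```

A belt-and-braces alternative is for the software to emit
<meta name="robots" content="noindex,nofollow"> on the edit pages
themselves, which doesn't depend on any robots.txt extensions.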
Finally: I posted about a DoS and was wondering what the status of a
solution was. http://openguides.org/mail/openguides-dev/2004-October/000542.html
Paul (any overbearing tone unintentional ;-)
Paul Makepeace .............................. http://paulm.com/inchoate/
"If my elbow was straight, then I'll show you mine!"
I've mostly finished my latest batch of feeds-related changes. They're
live on the Cotswolds site, and Dom hopes to do a new release including
them in the not-too-distant future.
The main new stuff is:
* you can now get rss+atom feeds of the search results
* you can get rss+atom feeds of all the nodes in a category or locale
* you can get rss+atom feeds of all the categories or locales
* the feeds stuff should be (pretty) fully documented at
If people could poke at it all (lots of examples on the wiki) and see
how they get on, that'd be great.
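To show roughly what consuming one of these feeds looks like, here's a
small sketch using only the Python standard library. It pulls the item
titles out of an RSS document; the fetch URL in the comment is
hypothetical, so check the examples on the wiki for the real feed URLs:

```python
import xml.etree.ElementTree as ET

def node_titles(rss_xml):
    """Return the titles of the <item> elements in an RSS feed.

    Comparing on the local tag name (after stripping any namespace)
    means this works for both namespaced RSS 1.0 and plain RSS 2.0.
    """
    root = ET.fromstring(rss_xml)
    titles = []
    for el in root.iter():
        if el.tag.rsplit('}', 1)[-1] == 'item':
            for child in el:
                if child.tag.rsplit('}', 1)[-1] == 'title':
                    titles.append(child.text)
    return titles

# Hypothetical usage -- substitute a real feed URL from the wiki:
# import urllib.request
# xml_data = urllib.request.urlopen('http://example.org/...feed-url...').read()
# print(node_titles(xml_data))
```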
Then we just need people to do funky stuff with these feeds!
For example, I've got some Series 60 Python code that loads a
user-defined URL with the current (GPS-derived) lat+long. Anyone fancy
writing some code to go and find all the pubs within 2km of that
location, across all the guides?
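The distance half of that is easy enough; here's a sketch of the
great-circle (haversine) filter, with the pub list as made-up sample
data standing in for whatever the feeds return:

```python
import math

EARTH_RADIUS_KM = 6371.0

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in km between two (lat, long) points."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * EARTH_RADIUS_KM * math.asin(math.sqrt(a))

def pubs_within(pubs, lat, lon, radius_km=2.0):
    """Filter (name, lat, long) tuples to those within radius_km."""
    return [name for name, plat, plon in pubs
            if haversine_km(lat, lon, plat, plon) <= radius_km]
```

Glue that to the lat+long the phone reports and the pub coordinates
pulled from the category feeds, and you're most of the way there.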
Go on, be inspired :)