On Sat, Dec 3, 2011 at 7:16 AM, Kake L Pugh <kake@earth.li> wrote:
> Oh, sorry, I also meant to reply to this bit:
>
> On Thu 01 Dec 2011, Philip Neustrom <philipn@gmail.com> wrote:
>> DavisWiki will be ported over, yeah. I wrote a script to import the old DavisWiki-style wiki codebase and it wasn't /too/ tricky. It's only complex if you are dead-set on capturing everything *perfectly*. LocalWiki stores pages as HTML in the database, so writing an importer is actually really easy - you just grab the rendered HTML and then dump it into a Page object.
>
> If you would like to try importing RGL data into your system, we have a daily database dump here: http://london.randomness.org.uk/dbdump/
> and our data is all available under Creative Commons Attribution 2.0.
> (It's a bit more complex than just rendered HTML though.)
>
> Kake
I'll take a look! It'd probably be easier for me to write a scraper than to try to parse the OG wiki markup. Or is there an independent parser that can be used? E.g., is this a standardized markup format?
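Roughly, the core step of such a scraper - pull out just the content area of each rendered page before dumping it into a Page object - would look something like this (a stdlib-only sketch; the `id="content"` div is an assumption about the target site's markup, not RGL's or LocalWiki's actual structure):

```python
from html.parser import HTMLParser

class ContentExtractor(HTMLParser):
    """Collects the inner HTML of a <div id="content">...</div>."""

    def __init__(self):
        super().__init__()
        self.depth = 0    # <div> nesting depth inside the target div (0 = outside)
        self.chunks = []  # pieces of the inner HTML, in document order

    def handle_starttag(self, tag, attrs):
        if self.depth:
            if tag == "div":
                self.depth += 1
            attr_str = "".join(
                f' {k}="{v}"' if v is not None else f" {k}" for k, v in attrs
            )
            self.chunks.append(f"<{tag}{attr_str}>")
        elif tag == "div" and ("id", "content") in attrs:
            self.depth = 1  # entered the target div; don't emit its own tag

    def handle_endtag(self, tag):
        if self.depth:
            if tag == "div":
                self.depth -= 1
                if self.depth == 0:
                    return  # closed the target div itself; stop collecting
            self.chunks.append(f"</{tag}>")

    def handle_data(self, data):
        if self.depth:
            self.chunks.append(data)

def extract_content(html):
    """Return the inner HTML of the page's content div."""
    parser = ContentExtractor()
    parser.feed(html)
    return "".join(parser.chunks).strip()

print(extract_content('<div id="content"><p>Hi <b>there</b></p></div>'))
# → <p>Hi <b>there</b></p>
```

From there the import really is just stuffing that string into a Page and saving it.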
I'm going to be working on a MediaWiki importer tonight and tomorrow, so that may be a good starting point for other importer projects.
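For the MediaWiki side, the API will hand back rendered HTML directly via action=parse, so the importer mostly just builds the request and unwraps the JSON. A sketch (the endpoint URL is a placeholder, not any particular wiki's):

```python
import json
from urllib.parse import urlencode

API = "https://wiki.example.org/w/api.php"  # placeholder endpoint

def parse_request_url(title):
    """Build the URL for MediaWiki's action=parse, which returns a page's
    rendered HTML."""
    return API + "?" + urlencode({
        "action": "parse",
        "page": title,
        "prop": "text",
        "format": "json",
    })

def rendered_html(response_bytes):
    """Unwrap the HTML from an action=parse JSON response; with format=json
    the markup is nested under parse -> text -> '*'."""
    data = json.loads(response_bytes)
    return data["parse"]["text"]["*"]
```

Fetch parse_request_url(title) with urllib, feed the body to rendered_html, and you're back to the same "rendered HTML in, Page object out" shape as the DavisWiki import.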
-Philip
--
OpenGuides-Dev mailing list - OpenGuides-Dev@lists.openguides.org
http://lists.openguides.org/mailman/listinfo/openguides-dev