Hi there,
I've written a file-attachment patch for OpenGuides and CGI::Wiki, which stores the files in BLOBs. I've written the relevant CGI::Wiki::Store::Pg module for PostgreSQL, but I don't know so much about SQLite or MySQL. Do they support BLOBs? Are their interfaces similar enough to PostgreSQL's that I should be doing this stuff at a generic (CGI::Wiki::Store) level instead?
ta,
Toby
"Toby" == Toby Corkindale <tjc@wintrmute.net> writes:
Toby> I've written a file-attachment patch for OpenGuides and CGI::Wiki, which
Toby> stores the files in BLOBs. I've written the relevant CGI::Wiki::Store::Pg
Toby> module for PostgreSQL,
Why? Rows in Pg are unlimited length, and BLOBs are supported only to be "oracle-like". It's preferred that you simply store the data in the row.
Toby> but I don't know so much about SQLite or MySQL.
SQLite doesn't have blobs, but any column can store a lot:
The total amount of data in a single row of the database is limited to 1 megabyte. You can increase this limit to 16 megabytes, if you need to, by adjusting a single #define in the source tree and recompiling.
MySQL is not a database. :)
On Fri, Jan 23, 2004 at 06:23:25AM -0800, Randal L. Schwartz wrote:
"Toby" == Toby Corkindale <tjc@wintrmute.net> writes:
Toby> I've written a file-attachment patch for OpenGuides and CGI::Wiki, which
Toby> stores the files in BLOBs. I've written the relevant CGI::Wiki::Store::Pg
Toby> module for PostgreSQL,
Why? Rows in Pg are unlimited length, and BLOBs are supported only to be "oracle-like". It's preferred that you simply store the data in the row.
In that case, I shall.
Toby> but I don't know so much about SQLite or MySQL.
SQLite doesn't have blobs, but any column can store a lot:
The total amount of data in a single row of the database is limited to 1 megabyte. You can increase this limit to 16 megabytes, if you need to, by adjusting a single #define in the source tree and recompiling.
Hmm. 1MB isn't a lot in my book. But I won't be using SQLite anyway.
MySQL is not a database. :)
I've been trying to tell people this for years. And then SQLite came along, and suddenly MySQL looks feature-rich by comparison.
There's obviously a market for lightweight vaguely-indexed data-store libraries, but why do they always start trying to go above their station and become a fully-fledged database? Pick a niche market and stick to it, dammit! Grr.. I'm just waiting for SQLEvenLiter to appear once SQLite decides to add things like concurrency and locking. I hear MySQL has finally got around to supporting transactions and finer-grained locking, so evidently it's time for SQLite to move up and fill its place.
tjc.
On Fri, Jan 23, 2004 at 06:23:25AM -0800, Randal L. Schwartz wrote:
"Toby" == Toby Corkindale <tjc@wintrmute.net> writes:
Toby> I've written a file-attachment patch for OpenGuides and CGI::Wiki, which
Toby> stores the files in BLOBs. I've written the relevant CGI::Wiki::Store::Pg
Toby> module for PostgreSQL,
Why? Rows in Pg are unlimited length, and BLOBs are supported only to be "oracle-like". It's preferred that you simply store the data in the row.
DBD::Pg doesn't seem to prefer it after all:

wiki.cgi: DBD::Pg::db do failed: ERROR: parser: unterminated quoted string at or near "'BZh91AY&SY!!!z" at character 97
Note that this is with auto-quoting/escaping enabled, btw.
I tried using the quote method like so:

    $data = $dbh->quote($buffer, DBD::Pg::PG_BYTEA);

and that went into a seemingly-infinite (or at least very long) loop, dunking warnings about invalid characters into my webserver logs.
D'oh.
Looks like I'm going back to using the BLOBs method, which worked fine!
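(For reference, the route that usually avoids the quoting problem altogether is a placeholder with the parameter explicitly bound as BYTEA, so DBD::Pg does the binary escaping itself. A minimal sketch — the table and column names here are invented, not from the actual patch:)

```perl
# Sketch: insert binary data into a BYTEA column via a placeholder,
# binding the parameter as PG_BYTEA so DBD::Pg handles the escaping.
# Table/column names (attachment, name, content) are hypothetical.
use strict;
use warnings;
use DBI;
use DBD::Pg qw(:pg_types);

my $dbh = DBI->connect('dbi:Pg:dbname=openguides', '', '',
                       { RaiseError => 1 });

my $buffer = "BZh91AY&SY\x00\x01\xff";   # arbitrary bytes, including a NUL

my $sth = $dbh->prepare(
    'INSERT INTO attachment (name, content) VALUES (?, ?)'
);
$sth->bind_param(1, 'photo.jpg');
$sth->bind_param(2, $buffer, { pg_type => PG_BYTEA });
$sth->execute;
```

This needs a live PostgreSQL database to run, of course; the point is only the `{ pg_type => PG_BYTEA }` attribute on bind_param, which tells the driver the value is binary rather than text.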
tjc.
DBD::Pg doesn't seem to prefer it after all:

wiki.cgi: DBD::Pg::db do failed: ERROR: parser: unterminated quoted string at or near "'BZh91AY&SY!!!z" at character 97
Note that this is with auto-quoting/escaping enabled, btw.
Is that with a BYTEA field or a TEXT field? TEXT fields can't handle character 0, but BYTEA fields should. In any event, what I personally do is Base64-encode such data before sticking it in the DB. It takes a little more space, but it avoids quoting issues.
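(A minimal sketch of that round-trip with MIME::Base64 — the sample bytes are invented:)

```perl
# Sketch of the Base64 route: encode binary data to plain ASCII before
# storage, decode it on the way out. Survives any text column, at the
# cost of roughly a third more space.
use strict;
use warnings;
use MIME::Base64 qw(encode_base64 decode_base64);

my $binary  = "BZh91AY&SY\x00\x01\xff";    # arbitrary bytes, including a NUL
my $encoded = encode_base64($binary, "");  # "" suppresses embedded newlines
my $decoded = decode_base64($encoded);

die "round-trip failed" unless $decoded eq $binary;
```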
Matt
On Sat, 24 Jan 2004, Matt Clark wrote:
DBD::Pg doesn't seem to prefer it after all:

wiki.cgi: DBD::Pg::db do failed: ERROR: parser: unterminated quoted string at or near "'BZh91AY&SY!!!z" at character 97
Note that this is with auto-quoting/escaping enabled, btw.
Is that with a BYTEA field or a TEXT field? TEXT fields can't handle character 0, but BYTEA fields should. In any event, what I personally do is Base64-encode such data before sticking it in the DB. It takes a little more space, but it avoids quoting issues.
That introduces overhead proportional to the data size (roughly a third extra), not a constant overhead. Use a blob, and at least the overhead is constant.
S.
On Sat, Jan 24, 2004 at 11:17:40AM -0000, Matt Clark wrote:
DBD::Pg doesn't seem to prefer it after all:

wiki.cgi: DBD::Pg::db do failed: ERROR: parser: unterminated quoted string at or near "'BZh91AY&SY!!!z" at character 97
Note that this is with auto-quoting/escaping enabled, btw.
Is that with a BYTEA field or a TEXT field? TEXT fields can't handle character 0, but BYTEA fields should. In any event, what I personally do is Base64-encode such data before sticking it in the DB. It takes a little more space, but it avoids quoting issues.
That's a BYTEA field, yes. Looks like the DBD::Pg module doesn't handle the escaping properly or something.
I'm happy to stick with BLOBs, partly because they avoid the overhead of holding a 20MB file in memory as the return value from the DB. BLOBs give you a file descriptor, so I can just iterate until EOF, reading and printing to the client connection.
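(That read loop looks roughly like this with DBD::Pg's large-object func() interface — a sketch, with the subroutine name, handle/OID variables, and chunk size all my own invention:)

```perl
# Sketch of streaming a Pg large object to the client in chunks, so a
# 20MB attachment never has to sit in memory all at once.
# $dbh is a connected DBD::Pg handle; $lo_oid is the large object's OID.
use strict;
use warnings;
use DBI;

sub stream_large_object {
    my ($dbh, $lo_oid) = @_;

    $dbh->{AutoCommit} = 0;   # large-object ops must run inside a transaction

    my $fd = $dbh->func($lo_oid, $dbh->{pg_INV_READ}, 'lo_open');
    defined $fd or die "lo_open failed: " . $dbh->errstr;

    my $buffer;
    while ((my $n = $dbh->func($fd, $buffer, 16_384, 'lo_read')) > 0) {
        print substr($buffer, 0, $n);   # emit one chunk, iterate until EOF
    }

    $dbh->func($fd, 'lo_close');
    $dbh->commit;
}
```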
tjc
Toby Corkindale wrote:
Hi there,
I've written a file-attachment patch for OpenGuides and CGI::Wiki, which stores the files in BLOBs. I've written the relevant CGI::Wiki::Store::Pg module for PostgreSQL, but I don't know so much about SQLite or MySQL. Do they support BLOBs?
I believe their interfaces (and those of other DBs) are very similar: both can handle BLOBbish data. The only issue off the top of my head is to use placeholders rather than quoted values, but you were probably gonna do that anyway...
IIRC some databases don't like to sort, compare, or get distinct blob values. (You probably weren't gonna do that anyway).
Cheers
On Fri, Jan 23, 2004 at 02:58:02PM +0000, Tim Sweetman wrote:
Toby Corkindale wrote:
Hi there,
I've written a file-attachment patch for OpenGuides and CGI::Wiki, which stores the files in BLOBs. I've written the relevant CGI::Wiki::Store::Pg module for PostgreSQL, but I don't know so much about SQLite or MySQL. Do they support BLOBs?
I believe their interfaces (and those of other DBs) are very similar: both can handle BLOBbish data. The only issue off the top of my head is to use placeholders rather than quoted values, but you were probably gonna do that anyway...
Hmm, sounds like a very different interface. Currently I call a DBI/DBD::Pg function to get a file descriptor, and then I can read(), write() and seek() on it. I used Oracle when writing telephony apps for a few years, and that was my first experience of working with BLOBs. Sounds like I may have become ingrained/infected with that Oracleism then! I've always done PostgreSQL stuff the same way since. (I've been using PostgreSQL for a very long time, but never used it with massive fields until the v7 era, and it was hard to persuade Vodafone etc. to go with an open-source database in the late 90s.)
So, people are saying it's fine to just create a BYTEA or BLOB-type column and fill it with 20MB of binary data. OK then, off to re-write the code :)
I can see some disadvantages, in that I can't seek part-way through the file and re-write a few bytes (i.e. random access), but for file attachments in a wiki, that's not going to happen anyway.
IIRC some databases don't like to sort, compare, or get distinct blob values. (You probably weren't gonna do that anyway).
No..
ta,
Toby
openguides-dev@lists.openguides.org