Answering a question on Perlmonks (http://perlmonks.org/?node_id=601407) set me thinking about our geodata fields, in particular our lat + long. To be valid these must be numeric; in fact the Google Maps code will b0rk if there's garbage in there. That made me wonder whether the check could be done DB side, in the form of a constraint, or possibly as a trigger that also calculates the X and Y.
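For instance, a minimal Pg sketch (table and column names here are made up, not our actual schema) that rejects non-numeric garbage at the type level and keeps values inside valid coordinate ranges:

    -- Hypothetical table; NUMERIC refuses non-numeric input outright,
    -- and the CHECKs keep the coordinates inside sane ranges.
    CREATE TABLE location (
        id    serial  PRIMARY KEY,
        lat   numeric NOT NULL CHECK (lat  BETWEEN  -90 AND  90),
        long  numeric NOT NULL CHECK (long BETWEEN -180 AND 180)
    );

The X/Y calculation could then hang off a BEFORE INSERT OR UPDATE trigger on the same table. (One caveat: as far as I know MySQL parses CHECK clauses but silently ignores them, so this particular trick is Pg/Oracle/Sybase territory.)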
All of which made me think about the wider issue of data validation. I know MySQL isn't brilliant in this area, but I'm not up to speed enough with Pg to know what's on offer. I am aware of how commercial databases like Oracle and Sybase do it; that's my bread and butter.
If we had more validation checks in place, either server side or DB side, we could probably block more spam, as the 'bots tend to put rubbish and URLs into many of our fields.
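As a rough idea (again, table and column names invented for the sake of the example), even a crude regex check DB side would bounce the most obvious URL spam:

    -- Hypothetical: reject any free-text value containing an http(s) URL.
    -- !~* is Pg's case-insensitive "does not match regex" operator.
    ALTER TABLE comment
        ADD CONSTRAINT no_urls CHECK (body !~* 'https?://');

The proper place for this sort of check is probably still the application layer, with a constraint like this as a backstop rather than the first line of defence.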
<braindump />