Sunday 21 January 2007
Up until about six months ago, I was preventing spam on this site using a keyword list. As new spam would arrive, I would update the list to prevent it. It was a pain. Six months ago, I changed my comment form to use a number of tricks to make it difficult or impossible for a spambot to successfully post a comment. In that time, I have had 450 real comments, five spam comments (almost certainly by people), and perhaps 2500 failed spam attempts. That is a good ratio.
I’ve written up how I do it: Stopping spambots with hashes and honeypots.
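The write-up has the full details; as a rough illustration of what the "hashes" half can look like (a sketch under assumptions, not necessarily exactly what my form does), the form can carry a timestamp and a keyed hash of it, so a bot can't replay a canned POST or submit a stale form:

```python
import hashlib, hmac, time

# Hedged sketch of a hashed form token.  SECRET and the one-hour max age are
# hypothetical values for illustration.
SECRET = b"change-me"

def make_token(now=None):
    """Return (timestamp, mac) to embed in hidden form fields when rendering."""
    now = int(now if now is not None else time.time())
    mac = hmac.new(SECRET, str(now).encode(), hashlib.sha1).hexdigest()
    return now, mac

def check_token(timestamp, mac, max_age=3600):
    """On submission, reject if the hash doesn't match or the form is stale."""
    expected = hmac.new(SECRET, str(int(timestamp)).encode(), hashlib.sha1).hexdigest()
    fresh = (time.time() - int(timestamp)) < max_age
    return fresh and hmac.compare_digest(mac, expected)
```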
Last week, Damien wrote Negative CAPTCHA, about fooling spambots into identifying themselves with invisible fields. This is a component of my technique, so his post spurred me to explain what I do to keep spam off this site.
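The invisible-field idea is simple enough to sketch in a few lines. This is only an illustration, assuming a hypothetical field name ("website") and CSS class ("special"); the names on this site are different:

```python
# A minimal honeypot sketch: a field hidden from humans by CSS, checked on the server.
HONEYPOT_FORM = """
<style>.special { display: none; }</style>
<form method="post" action="/comment">
  <input type="text" name="name">
  <textarea name="body"></textarea>
  <!-- A human never sees this field; a bot that fills every input will. -->
  <div class="special"><input type="text" name="website"></div>
  <input type="submit" value="Add comment">
</form>
"""

def is_probably_spam(form_data):
    """A submission that filled in the invisible field has identified itself as a bot."""
    return bool(form_data.get("website", "").strip())
```

For example, `is_probably_spam({"name": "Ned", "body": "Nice post"})` is False, while a bot that dutifully fills in "website" makes it True.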
I was particularly interested in the comments on Damien’s post, as they show the variety of know-it-alls that boldly proclaim facts that are plain wrong, or miss the point.
For example, about the possibility of spambots properly parsing forms with invisible elements, Guymac wrote:
It’s a simple DOM method call to determine if an element, say a form field, is visible or not. So Bot writers could trivially work around this technique.
I was amused by his use of the word “trivially”, since it neatly glossed over the need to base the bot on a browser infrastructure, and ignored some of the ways that fields can be made invisible.
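To give a flavor of what "trivially" glosses over, here is a hedged sketch (the class names are hypothetical) of a few of those hiding techniques. Some of them can only be detected by running a real stylesheet and layout engine, not by "a simple DOM method call":

```python
import random

# A few ways a honeypot field can be hidden.  Off-screen positioning, a class
# defined in an external stylesheet, or a hidden parent element all require a
# full layout engine to detect.
HIDING_STYLES = [
    'style="display: none"',
    'style="position: absolute; left: -5000px"',
    'class="extra"',   # .extra { visibility: hidden; } lives in an external stylesheet
    'class="u"',       # the parent <div class="u"> is the element actually hidden
]

def honeypot_input(name="website"):
    """Render the honeypot input with a randomly chosen hiding technique."""
    return f'<input type="text" name="{name}" {random.choice(HIDING_STYLES)}>'
```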
My technique works well on this site. Maybe by writing it up, we can get some more good ideas flowing. I picked up some tips from commenters on Damien’s post that I have now integrated into my system.
Comments
Also, how come spammers don't use a browser engine such as Gecko or MSHTML to parse pages?