Is strict parsing a way to avoid wiki spam?

In the process of writing a LALR(1) parser for the NBTSWikiWiki, and then seeing the front page get spammed a second time, I wonder now if stricter parsing would have avoided the problem. If pages couldn't contain invalid markup constructs, and you just rejected the edit... then that last spam would have failed, since NBTSWikiWiki's syntax is a bit different than the spammers were expecting.

It’s probably a bad idea, but it’s a thought. In a more geeky use of a wiki, it might be very smart.
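The idea above — validate the markup strictly and reject the whole edit if it doesn't parse — can be sketched in a few lines. NBTSWikiWiki's actual grammar isn't shown here, so this is a hypothetical illustration using one made-up rule ([[...]] links must be balanced and non-nested); the function names and syntax rule are assumptions, not the wiki's real implementation.

```python
# Hypothetical sketch of strict-parse edit rejection.
# The real NBTSWikiWiki grammar is LALR(1); here we check only one
# illustrative rule: [[...]] links must be balanced and non-nested.
LINK_OPEN = "[["
LINK_CLOSE = "]]"

def markup_is_valid(text: str) -> bool:
    """Return True only if every [[ has a matching ]], with no nesting."""
    pos = 0
    while True:
        start = text.find(LINK_OPEN, pos)
        if start == -1:
            # No more opens; any stray ]] left over makes the page invalid.
            return text.find(LINK_CLOSE, pos) == -1
        end = text.find(LINK_CLOSE, start + len(LINK_OPEN))
        if end == -1:
            return False  # unclosed link
        if LINK_OPEN in text[start + len(LINK_OPEN):end]:
            return False  # nested open bracket inside a link
        pos = end + len(LINK_CLOSE)

def accept_edit(new_text: str) -> bool:
    """Reject the edit outright when the markup does not parse."""
    return markup_is_valid(new_text)
```

A spammer pasting markup written for some other wiki engine would likely trip a rule like this and have the edit bounced, which is the whole point: humans fixing their syntax is a small cost, bulk spam scripts never bother.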

Comments

vruba
Nov. 26th, 2004 02:27 am (UTC)
Syntax instructions as captcha – heh heh uuugh.
aredridel
Nov. 28th, 2004 10:34 am (UTC)
I thought you'd say that. Not even captcha, but telling humans who care from those who really don't want to waste their time.