Aria Stewart (aredridel) wrote,

Is strict parsing a way to avoid wiki spam?

While writing a LALR(1) parser for the NBTSWikiWiki, and then seeing the front page get spammed a second time, I wonder now whether stricter parsing would have avoided the problem. If pages couldn't contain invalid markup constructs, and you just rejected the edit... then that last spam would have failed, since NBTSWikiWiki's syntax is a bit different from what the spammers were expecting.

It’s probably a bad idea, but it’s a thought. In a more geeky use of a wiki, it might be very smart.
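
To make it a little more concrete, here's a rough sketch (Python, with a made-up whitelist of line-level constructs, not NBTSWikiWiki's real LALR(1) grammar): validate the submitted markup, and refuse to save the edit if anything falls outside what the grammar accepts.

import re

# Toy whitelist of constructs the pretend grammar accepts; the real
# NBTSWikiWiki grammar is LALR(1) and different, so these rules are
# invented purely to illustrate the reject-on-parse-error idea.
LINE_PATTERNS = [
    re.compile(r"^= .+ =$"),                          # heading
    re.compile(r"^\* .+$"),                           # bullet item
    re.compile(r"^$"),                                # blank line
    re.compile(r"^[^<>\[\]]*$"),                      # plain prose, no raw HTML or brackets
    re.compile(r"^[^<>]*\[\[[A-Za-z]+\]\][^<>]*$"),   # WikiLink, still no raw HTML
]

def validate(markup: str) -> list[str]:
    """Return error messages; an empty list means the edit parses cleanly."""
    errors = []
    for lineno, line in enumerate(markup.splitlines(), start=1):
        if not any(p.match(line) for p in LINE_PATTERNS):
            errors.append(f"line {lineno}: construct not in the wiki grammar")
    return errors

def save_edit(page: dict, new_text: str) -> bool:
    """Reject the whole edit if any line fails strict validation."""
    errors = validate(new_text)
    if errors:
        for e in errors:
            print("edit rejected,", e)
        return False
    page["text"] = new_text
    return True

if __name__ == "__main__":
    page = {"text": ""}
    spam = 'Buy pills: <a href="http://spam.example">cheap!</a>'
    print("saved:", save_edit(page, spam))   # raw HTML isn't a valid construct, so False

The spammer's markup, written for some other wiki dialect, never matches the grammar, so the whole edit bounces before it ever hits the page.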
