Anti-Spam Whitelist Requests
Please edit this page and add your wiki username (and MUD character name, if different from your wiki username) to the following list. I watch all edits to the entire wiki via RSS, so I should notice changes to this page within a few hours (depending on whether I'm asleep or whatever). If I'm then satisfied that your account is legitimate, I'll immediately add you to the "validated" group. I apologise for the hassle involved, but our anti-spam measures have proved largely effective, so until we get a MUD-linked system, this will have to suffice.
- Dummy Username
Also, whatever you did seems to have stopped the spam page generation. It hasn't stopped the spam user accounts being created, but they're no longer generating a spam page afterwards. A partial success, at least?
--Zexium 16:40, 5 March 2011 (UTC)
- It doesn't help me at all, since I just delete them as soon as I see them in the RSS feed, but it does at least stop them being useful to the spammers. We're still getting the odd spam page being created, but after I deleted that last spam page I added a ReCaptcha test to the registration page, to creating a page, and to adding a URL to a page. Whether they'll have any effect will depend on whether the spam is being created by actual humans, which I fear is the case.
- I do still have the option of trying other CAPTCHA methods, such as having the user answer a simple question: something that anyone from the MUD could answer but which would baffle random spammers. I could also disable new page creation for people who haven't verified their email address, although verifying an address isn't impossible for spammers, either. We'll see how things go, but at least I still have options open to me; I've just been going for the least intrusive and easiest-to-implement options so far.
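- For reference, roughly what those ConfirmEdit pieces look like in LocalSettings.php; this is a sketch only, and the exact file and setting names depend on the version of the extension in use:

  require_once( "$IP/extensions/ConfirmEdit/ConfirmEdit.php" );
  require_once( "$IP/extensions/ConfirmEdit/ReCaptcha.php" );
  $wgCaptchaClass = 'ReCaptcha';               # plus the ReCaptcha key settings
  $wgCaptchaTriggers['createaccount'] = true;  # registration
  $wgCaptchaTriggers['create']        = true;  # creating a new page
  $wgCaptchaTriggers['addurl']        = true;  # adding an external URL to a page
  # The question-and-answer idea would swap in QuestyCaptcha instead:
  # require_once( "$IP/extensions/ConfirmEdit/QuestyCaptcha.php" );
  # $wgCaptchaClass = 'QuestyCaptcha';
  # $wgCaptchaQuestions[] = array(
  #     'question' => "A question anyone from the MUD could answer",
  #     'answer'   => "the answer",
  # );
  # Requiring a confirmed email address before editing is a core setting:
  # $wgEmailConfirmToEdit = true;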
- If these are manual user edits, the spammers aren't realising that the pictures they upload aren't actually appearing on their pages. There's been a similar issue over on the wiki at imaginary-realities; I think the current thinking over there is that the edits are automated rather than manual, with something managing to OCR the CAPTCHA.
- I'm not sure what they're trying to achieve either - it seems a pretty ineffective sort of spamming to create an unreferenced page in a wiki, as no-one will see it unless they look at the recent pages list, and wiki spam links are unlikely to be followed.
- I've heard the suggestion before that they feed the wiki document into a search engine to raise the score of the linked page. If that's the case, then I wonder if there's something that can be included in the headers of unreferenced pages to make the likes of bingbot and googlebot ignore any links for link scoring?
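- MediaWiki does appear to have core settings along those lines; a rough sketch of what could go in LocalSettings.php (treat the details as illustrative rather than a recommendation):

  # Add rel="nofollow" to external links in page text, so crawlers like
  # googlebot and bingbot shouldn't pass ranking credit to spammed URLs.
  $wgNoFollowLinks = true;
  # The robots policy can also be tightened wiki-wide (or per namespace via
  # $wgNamespaceRobotPolicies), e.g. to stop crawlers following links at all:
  $wgDefaultRobotPolicy = 'index,nofollow';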
- --Zexium 19:29, 5 March 2011 (UTC)
- --Zexium 19:58, 5 March 2011 (UTC)
- Mishal (talk) (MUD finger) 20:01, 5 March 2011 (UTC)
Not actually spam! But it's being flagged :(
The Learnt at level table pages are uneditable! Whenever I try, I get this message:
The text you wanted to save was blocked by the spam filter. This is probably caused by a link to a blacklisted external site. The following text is what triggered our spam filter: http://akismet.com blacklist error
The templates for the rows were similarly uneditable until I changed the comments. I think the problem is that the filter interprets fa.ri.de.se and so on as links (even, apparently, when the text is in a template) and decides there are too many links on the page.
It doesn't seem to be any specific bit of text: by testing I've found that I can change either fa.ri.de.se or fa.ri.de (for example) and the edit then goes through, so the sheer variety of these link-like strings seems to be the real problem.
Is there a whitelist or something we could add those specific things to?
I also get a 403 error if I try to go to one of those templates' pages. I can edit the pages just fine, though, so I'm not sure what that's about.
--Ilde 01:56, 9 February 2012 (EST)
- I just made a small modification to the Akismet code to exclude a new "validated" user group that I created. That same group will also be allowed to bypass the CAPTCHAs. See the section I added to the main page for more details. Mishal (talk) (MUD finger) 06:01, 10 February 2012 (EST)
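- The CAPTCHA bypass is just ConfirmEdit's built-in permission; the Akismet side amounts to an early exit for the new group. A sketch of the general shape (the function name here is made up, and the real extension code will differ):

  # In LocalSettings.php: members of 'validated' skip all CAPTCHA checks.
  $wgGroupPermissions['validated']['skipcaptcha'] = true;

  # The Akismet-side change is essentially this kind of early return:
  function wfCheckEditWithAkismet( $user, $text ) {
      if ( in_array( 'validated', $user->getGroups() ) ) {
          return true;  // trusted group: don't send the edit to Akismet at all
      }
      // ... existing call to the Akismet API for everyone else ...
  }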
- Oh, and the 403 errors are the result of an IE6 XSS vulnerability fix, as detailed here and here. Not sure what to do about that, besides maybe making the Apache mod_rewrite rule only apply to browsers with IE6's user agent string... Mishal (talk) (MUD finger) 06:11, 10 February 2012 (EST)
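- Something like this, perhaps; a sketch of the Apache fragment, showing only the added condition (the existing RewriteRule from the linked fix would stay as it is, immediately below it):

  # Only apply the 403-returning rule to IE6, by matching its user agent:
  RewriteCond %{HTTP_USER_AGENT} "MSIE 6\."
  # ...existing RewriteRule from the IE6 XSS fix goes here, unchanged...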
- Great, thanks! Edited those pages and put up a note there. The 403 issue isn't a big deal (easy enough to see what a template is anyway); I just found it very puzzling. --Ilde 18:10, 10 February 2012 (EST)
Could you please add me to the whitelist? -Juppie