User talk:Mishal

From DWPriests
Revision as of 19:29, 5 March 2011 by Zexium (Talk | contribs)

Does my use of the delete template help? The intention was that everything I marked would end up in the pages-to-be-deleted category, which would make them easier to find.
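A minimal version of such a template might look like this (the template and category names here are illustrative guesses, not necessarily the ones used on this wiki):

```
<!-- Template:Delete (sketch) - putting {{delete}} on a page
     transcludes this text and files the page in the category -->
This page has been proposed for deletion.
[[Category:Pages to be deleted]]
```

Any page carrying the template then shows up automatically on the category page, which is what makes the marked pages easy to find in one place.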

Also, whatever you did do seems to have stopped the spam page generation. It hasn't stopped the user accounts being generated, but they're no longer generating the spam page afterwards. A partial success at least?

--Zexium 16:40, 5 March 2011 (UTC)

It doesn't help me at all, since I just delete them as soon as I see them in the RSS feed, but it does at least stop them from being useful to the spammers. We're still getting the odd spam page being created, but after I deleted that last spam page I added a ReCaptcha test to the registration page, to page creation, and to adding a URL to a page. Whether these will have any effect depends on whether the spam is being created by actual humans, which I fear it is.
I do still have the option of trying other CAPTCHA methods, such as having the user answer a simple question - something that anyone from the MUD could answer but which would baffle random spammers. I could also disable new page creation for people who haven't verified their email address, though that isn't impossible for spammers to get around either. We'll see how things go, but at least I still have options open to me - I've just been going for the least-intrusive and easiest-to-implement options so far.
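For reference, a sketch of what those two options could look like in MediaWiki's LocalSettings.php, assuming the ConfirmEdit extension is installed (the question and answer below are placeholders, not a real pair from this wiki):

```php
// Sketch for LocalSettings.php - assumes extensions/ConfirmEdit is present.
require_once "$IP/extensions/ConfirmEdit/ConfirmEdit.php";
require_once "$IP/extensions/ConfirmEdit/QuestyCaptcha.php";
$wgCaptchaClass = 'QuestyCaptcha';

// A question anyone from the MUD could answer, but a random spammer couldn't.
// (Placeholder question/answer - pick your own.)
$wgCaptchaQuestions[] = array(
    'question' => 'What MUD is this wiki about?',
    'answer'   => 'Discworld',
);

// Trigger the test on account creation and on adding external URLs.
$wgCaptchaTriggers['createaccount'] = true;
$wgCaptchaTriggers['addurl'] = true;

// The email-verification option: require a confirmed email address to edit.
// (Note this gates all editing, which is broader than just new-page creation.)
$wgEmailConfirmToEdit = true;
```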
Mishal (talk) (MUD finger) 17:11, 5 March 2011 (UTC)
If these are user edits, they're not realising that the pictures they upload aren't actually appearing in their pages. There's been a similar issue over on the wiki at imaginary-realities; I think the current thinking over there is that these are automated rather than manual edits, with something that's managing to OCR the CAPTCHA.
I'm not sure what they're trying to achieve either - it seems a pretty ineffective sort of spamming to create an unreferenced page in a wiki, as no-one will see it unless they look at the recent pages list, and wiki spam links are unlikely to be followed.
I've heard the suggestion before that they feed the wiki document into a search engine to raise the score of the linked page. If that's the case, then I wonder if there's something that can be included in the headers of unreferenced pages to make the likes of bingbot and googlebot ignore any links for link scoring?
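If the wiki is MediaWiki, there are existing settings along those lines; a sketch for LocalSettings.php (assuming a reasonably recent MediaWiki, 1.12 or later):

```php
// Sketch for LocalSettings.php.

// Add rel="nofollow" to external links in wiki pages, which tells
// search engines not to count those links for ranking. This is
// actually MediaWiki's default behaviour in modern versions.
$wgNoFollowLinks = true;

// Emit <meta name="robots" content="noindex,nofollow"> site-wide, so
// crawlers like bingbot and googlebot neither index the pages nor
// follow their links. (Per-page, the __NOINDEX__ magic word does the same.)
$wgDefaultRobotPolicy = 'noindex,nofollow';
```

With nofollow in place, a spam page that nobody links to should be worthless for search-engine scoring even if a crawler does find it.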
--Zexium 19:29, 5 March 2011 (UTC)