User talk:Mishal

From DWPriests
Revision as of 21:01, 5 March 2011 by Mishal (Talk | contribs)

Does my using the delete template help? The intention was that everything I marked would be in the pages to be deleted category, which would make them easier to find.

Also, whatever you did seems to have stopped the spam pages being generated. It hasn't stopped the spam user accounts being created, but they're no longer creating a spam page afterwards. A partial success, at least?

--Zexium 16:40, 5 March 2011 (UTC)

It doesn't help me at all, since I just delete them as soon as I see them in the RSS feed, but it does at least stop them being useful to the spammers. We're still getting the odd spam page created, but after I deleted that last one I added a reCAPTCHA test to the registration page, and to creating a page or adding a URL to a page. Whether those will have any effect depends on whether the spam is being created by actual humans, which I fear it is.
I do still have the option of trying other CAPTCHA methods, such as having the user answer a simple question - something that anyone from the MUD could answer but which would baffle random spammers. I could also disable new page creation for people who haven't verified their email address, but verifying an address isn't impossible for spammers either. We'll see how things go, but at least I still have options open to me - I've just been going for the least intrusive and easiest-to-implement options so far.
Mishal (talk) (MUD finger) 17:11, 5 March 2011 (UTC)
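For the record, the measures described above amount to something like the following in LocalSettings.php. This is only a sketch: it assumes the wiki runs MediaWiki with the ConfirmEdit extension providing the reCAPTCHA test, and the extension paths are illustrative rather than copied from this wiki's actual setup.

```php
# LocalSettings.php sketch - assumes MediaWiki plus the ConfirmEdit
# extension with its reCAPTCHA module (paths are illustrative)
require_once "$IP/extensions/ConfirmEdit/ConfirmEdit.php";
require_once "$IP/extensions/ConfirmEdit/ReCaptcha.php";
$wgCaptchaClass = 'ReCaptcha';

# Trigger the CAPTCHA on exactly the actions discussed above
$wgCaptchaTriggers['createaccount'] = true;  # registration page
$wgCaptchaTriggers['create']        = true;  # creating a new page
$wgCaptchaTriggers['addurl']        = true;  # adding an external URL
$wgCaptchaTriggers['edit']          = false; # leave ordinary edits alone

# The other option mentioned: require a confirmed email address to edit
$wgEmailConfirmToEdit = true;
```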
If these are user edits, they're not realising that the pictures they upload aren't actually appearing in their pages. There's been a similar issue over on the wiki at imaginary-realities; I think the current thinking over there is that it's automated rather than manual edits, with something that's managing to OCR the CAPTCHA.
I'm not sure what they're trying to achieve either - it seems a pretty ineffective sort of spamming to create an unreferenced page in a wiki, as no-one will see it unless they look at the recent pages list, and wiki spam links are unlikely to be followed.
I've heard the suggestion before that they feed the wiki document into a search engine to raise the score of the linked page. If that's the case, then I wonder if there's something that can be included in the headers of unreferenced pages to make the likes of bingbot and googlebot ignore any links for link scoring?
--Zexium 19:29, 5 March 2011 (UTC)
I'd agree with the search engine theory, and yes, we can add a 'rel="nofollow"' attribute to the links to stop search engines using them for page ranking, but that wouldn't stop the spam. It would stop the spam having any effect, but deleting the pages has much the same effect anyway, and that doesn't seem to have stopped them posting here.
Mishal (talk) (MUD finger) 19:42, 5 March 2011 (UTC)
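In MediaWiki terms, the nofollow behaviour is a single standard setting rather than something that has to be bolted on - a sketch, assuming a reasonably current MediaWiki:

```php
# LocalSettings.php sketch - $wgNoFollowLinks is a standard MediaWiki
# setting (enabled by default in recent versions); shown explicitly here
$wgNoFollowLinks = true;
```

With that set, an external link in wikitext such as [http://example.com spam] is rendered with rel="nofollow" on the anchor tag, so crawlers that honour the attribute won't count the link toward the target page's ranking.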
Yes, but I'm wondering if we're catching them fast enough; I imagine that as soon as the page is created they'll try to have it indexed by whatever search engine they're trying to gain pagerank on. Preventing the spam is best, but if we can't stop it outright, at least stopping the spammer benefiting from their efforts would feel like it was helping.
I think your comment about a simple question might be best. Given the principal theme of the wiki, maybe "What god does [Scoone / Althea / Kess / Halen / Vy / Rudolpho / Fillet / Khepresh] worship?"
--Zexium 19:58, 5 March 2011 (UTC)
The question doesn't even have to be that complex (an absolute newbie wouldn't have a clue, though would an absolute newbie need to be editing the wiki?). A "what is the name of the MUD?" or similar would be about as simple as you can get, I think, and if we're right about this spam being automated there's no way they could correctly answer it, even though the answer is on the wiki homepage.
Mishal (talk) (MUD finger) 20:01, 5 March 2011 (UTC)
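A question-based test along those lines would look roughly like this, assuming ConfirmEdit's QuestyCaptcha module is available; the answer string is an assumption standing in for whatever the real answer should be:

```php
# LocalSettings.php sketch - QuestyCaptcha ships with the ConfirmEdit
# extension; the question echoes the suggestion above
require_once "$IP/extensions/ConfirmEdit/ConfirmEdit.php";
require_once "$IP/extensions/ConfirmEdit/QuestyCaptcha.php";
$wgCaptchaClass = 'QuestyCaptcha';

$wgCaptchaQuestions[] = array(
    'question' => 'What is the name of the MUD?',
    'answer'   => 'Discworld',  # assumed answer - replace with the real one
);

# Ask it at the points the spam is coming in
$wgCaptchaTriggers['createaccount'] = true;
$wgCaptchaTriggers['create']        = true;
```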