User talk:Mishal

From DWPriests

Anti-Spam Whitelist Requests

Please edit this page and add your wiki username (and MUD character name, if different from your wiki username) to the following list. I watch all edits to the entire wiki via RSS, so I should notice changes to this page within a few hours (depending on whether I'm asleep or whatever). If I'm then satisfied that your account is legitimate, I'll immediately add you to the "validated" group. I apologise for the hassle involved, but our anti-spam measures have proved largely effective, so until we get a MUD-linked system, this will have to suffice.

  • Kloke

Spam

Does my using the delete template help? The intention was that everything I marked would be in the pages to be deleted category, which would make them easier to find.

Also, whatever you did do seems to have stopped the spam page generation. It hasn't stopped the user accounts being generated, but they're no longer generating the spam page afterwards. A partial success at least?

--Zexium 16:40, 5 March 2011 (UTC)

It doesn't help me at all, since I just delete them as soon as I see them in the RSS feed, but it does at least stop them being useful to the spammers. We're still getting the odd spam page being created, but after I deleted that last spam page I added a ReCaptcha test to the registration page, and when creating a page or adding a URL to a page. Whether they'll have any effect will depend on whether the spam is being created by actual humans, which I fear it is.
I do still have the option of trying other CAPTCHA methods, such as having the user answer a simple question - something that anyone from the MUD could answer but which would baffle random spammers. I could also maybe disable new page creation for people who haven't verified their email address, but verifying an email address isn't impossible for spammers, either. We'll see how things go, but at least I still have options open to me - I've just been trying to go for the least intrusive and easiest-to-implement options so far.
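For what it's worth, the ConfirmEdit triggers involved look roughly like this in LocalSettings.php - a sketch rather than our exact config, since the key names and the old ReCaptcha settings vary by version, and the keys here are placeholders:
  # Sketch only - not the exact config; key names depend on the ConfirmEdit version.
  require_once( "$IP/extensions/ConfirmEdit/ConfirmEdit.php" );
  require_once( "$IP/extensions/ConfirmEdit/ReCaptcha.php" );
  $wgCaptchaClass = 'ReCaptcha';
  $wgReCaptchaPublicKey  = 'your-public-key';   # placeholder
  $wgReCaptchaPrivateKey = 'your-private-key';  # placeholder
  $wgCaptchaTriggers['createaccount'] = true;   # CAPTCHA on registration
  $wgCaptchaTriggers['create']        = true;   # CAPTCHA when creating a new page
  $wgCaptchaTriggers['addurl']        = true;   # CAPTCHA when an edit adds an external URL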
Mishal (talk) (MUD finger) 17:11, 5 March 2011 (UTC)
If these are user edits, they're not realising that the pictures they upload aren't actually appearing in their pages. There's been a similar issue over on the wiki at imaginary-realities; I think the current thinking over there is that it's automated rather than manual edits, with something that's managing to OCR the captcha.
I'm not sure what they're trying to achieve either - it seems a pretty ineffective sort of spamming to create an unreferenced page in a wiki, as no-one will see it unless they look at the recent pages list, and wiki spam links are unlikely to be followed.
I've heard the suggestion before that they feed the wiki document into a search engine to raise the score of the linked page. If that's the case, then I wonder if there's something that can be included in the headers of unreferenced pages to make the likes of bingbot and googlebot ignore any links for link scoring?
--Zexium 19:29, 5 March 2011 (UTC)
I'd agree with the search engine theory, and yes, we can add a 'rel="nofollow"' attribute to the links to stop search engines using them for page ranking, but that wouldn't stop the spam itself. It would stop the spam from having any effect, but deleting the pages already achieves that, and it doesn't seem to have stopped them posting here.
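For reference, MediaWiki can do this globally with a one-line setting in LocalSettings.php (I believe newer versions even default to it):
  # Add rel="nofollow" to external links wiki-wide (the default in recent MediaWiki, I believe)
  $wgNoFollowLinks = true;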
Mishal (talk) (MUD finger) 19:42, 5 March 2011 (UTC)
Yes, but I'm wondering if we're catching them fast enough; I imagine that as soon as the page is created they're going to try and have it indexed by whatever search engine they're trying to get pagerank on. Preventing the spam is best, but if we can't stop it outright, at least preventing the spammers from benefiting from their efforts would feel like it was helping.
I think your comment about a simple question might be best. Given the principal theme of the wiki, maybe "What god does [Scoone / Althea / Kess / Halen / Vy / Rudolpho / Fillet / Khepresh] worship?"
--Zexium 19:58, 5 March 2011 (UTC)
The question doesn't even have to be that complex (an absolute newbie wouldn't have a clue, though would an absolute newbie need to be editing the wiki?). A "what is the name of the MUD?" or similar would be about as simple as you can get, I think, and if we're right about this spam being automated there's no way they could correctly answer it, even though the answer is on the wiki homepage.
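If we do go that route, a minimal QuestyCaptcha sketch might look something like this - the question and answer here are just illustrations, and the exact config format depends on the ConfirmEdit version installed:
  require_once( "$IP/extensions/ConfirmEdit/ConfirmEdit.php" );
  require_once( "$IP/extensions/ConfirmEdit/QuestyCaptcha.php" );
  $wgCaptchaClass = 'QuestyCaptcha';
  # Illustrative question and answer only
  $wgCaptchaQuestions[] = array(
      'question' => 'What is the name of the MUD?',
      'answer'   => 'Discworld',
  );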
Mishal (talk) (MUD finger) 20:01, 5 March 2011 (UTC)
Something I've been working on, potentially for the dw wiki (not sure if they're going to use it, but it was an interesting exercise anyway), that might be of interest: a hack to ConfirmEdit_body.php that checks that new account names correspond to valid, logged-in mud users created at least 24 hours previously. It uses fsockopen to do a quick telnet exchange with the mud server login screen for basic finger info. You can tweak it for any combination of valid mud user, created more than x days ago, and currently logged in to the mud. On the flip side, it means that new accounts on here will have to use their mud character names, but I think most people do anyway. Want the modified file? (Mine is for MediaWiki 1.14, which is what Drakkos is using on his wiki, which I think runs on one of Sojan's hosts anyway.) Zexium 20:01, 15 October 2011 (EDT)
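For context, the fsockopen side of such a check would presumably look something like the sketch below; the host, port and exact exchange with the login screen are guesses, not the actual modification:
  # Sketch only: host, port and the exchange with the login screen are guesses.
  function mud_finger( $name ) {
      $fp = @fsockopen( 'discworld.atuin.net', 4242, $errno, $errstr, 5 );
      if ( !$fp ) {
          return false;  # MUD unreachable - fail safe rather than blocking the wiki
      }
      stream_set_timeout( $fp, 5 );
      fwrite( $fp, "finger " . $name . "\r\n" );
      $response = '';
      while ( !feof( $fp ) && strlen( $response ) < 4096 ) {
          $line = fgets( $fp, 1024 );
          if ( $line === false ) {
              break;  # timed out or connection dropped
          }
          $response .= $line;
      }
      fclose( $fp );
      return $response;  # caller then parses creation date / login status out of this text
  }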
That sounds like a reasonable enough idea :) Personally I would have thought that performing a simple file_get_contents() on the MUD httpd's finger.c would have been easier (it requires a login, but it's HTTP basic auth, which PHP handles automatically with the http://user:pass@host/ syntax), but if the telnet method works (and doesn't break if, say, the MUD is down), I'm sure that will be fine.
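A sketch of the sort of thing I mean, with the URL, path and credentials being placeholders rather than real values:
  # Sketch of the HTTP approach - URL, path and credentials are placeholders.
  $url  = 'http://wikiuser:secret@discworld.atuin.net/finger.c?name=' . urlencode( $name );
  $page = @file_get_contents( $url );  # PHP's http wrapper sends the basic-auth header for us
  if ( $page === false ) {
      # httpd down or auth failed - treat as "can't verify" rather than "not a player"
  }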
One concern I have is that it would potentially be open to abuse, where someone could cause this wiki to DoS the MUD until Sojan bans this IP address, so some way of limiting the requests that it makes per-hour and/or per-IP address could be useful. It could cache lookups so that it only performs a lookup once for each username, but that doesn't stop someone scripting an attack that cycles through every possible username (aaaaa, aaaab, aaaac, etc.). I do, of course, realise that this makes it instantly much more complicated and involves the use of a database of some variety, but I'd hate to get in Sojan's bad books :p
I'm currently stuck in bed with an exciting post-breakup depression + ear infection combination, but if you poke me in the direction of your modification, I'll look into implementing it when I'm next at my desk :) Mishal (talk) (MUD finger) 09:18, 16 October 2011 (EDT)
Good points. The issue I have with using the http interface is that it either leaves a mud password in the code or needs an extra lookup to a secure file to read the mud password to use. I'm not sure which method uses less data, but I suspect that whatever the telnet method saves by avoiding the html and headers it loses again in the login menus. I have other code that does the same thing using curl and DOM manipulation of the received file, which is probably a better technical solution but again involves extra (curl, DOM) overhead. I'll have a think about implementing a way of throttling requests. Perhaps 3 strikes and then reject the IP. Zexium 11:07, 16 October 2011 (EDT)
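Something along those lines might be as simple as a per-IP counter; a very rough sketch, with the file path and limits made up and no file locking, just to show the shape of it:
  # Very rough throttling sketch - file path and limits are made up, no locking.
  function lookup_allowed( $ip, $max = 3 ) {
      $file   = '/tmp/mud_lookup_throttle.dat';
      $window = 3600;  # one hour
      $data   = is_readable( $file ) ? unserialize( file_get_contents( $file ) ) : array();
      $now    = time();
      $recent = array();
      if ( isset( $data[$ip] ) ) {
          foreach ( $data[$ip] as $t ) {
              if ( ( $now - $t ) < $window ) {
                  $recent[] = $t;  # keep only attempts inside the window
              }
          }
      }
      if ( count( $recent ) >= $max ) {
          return false;  # strike limit reached, reject this IP for now
      }
      $recent[]  = $now;
      $data[$ip] = $recent;
      file_put_contents( $file, serialize( $data ) );
      return true;
  }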
Update - I'm creating on the mud now, which means I can probably talk to Sojan about the easiest and most secure way to provide an external interface to an "is x a player" function. My development took a bit of a back seat due to some other issues, but I hope to revisit it soon. Thanks for the access changes. Zexium 17:50, 2 January 2012 (EST)

Not actually spam! But it's being flagged :(

The Learnt at level table pages are uneditable! Whenever I try, I get this message:

The text you wanted to save was blocked by the spam filter. This is probably caused by a link to a blacklisted external site.

The following text is what triggered our spam filter: http://akismet.com blacklist error 

The templates for the rows were similarly uneditable until I changed the comments. I think the problem is that it interprets fa.ri.de.se and so on as links--even, apparently, when the text is in a template--and thinks there are too many links on the page.

It doesn't seem to be any specific bit of text, since by testing I've found I can change either fa.ri.de.se or fa.ri.de (for example) and it's fine with that. It seems like the variety is really the problem.

Is there a whitelist or something we could add those specific things to?

I also get a 403 error if I try to go to one of those templates' pages. I can edit the pages just fine, though, so I'm not sure what that's about.

--Ilde 01:56, 9 February 2012 (EST)

I just made a small modification to the Akismet code to exclude a new "validated" user group that I created. That same group will also be allowed to bypass the CAPTCHAs. See the section I added to the main page for more details. Mishal (talk) (MUD finger) 06:01, 10 February 2012 (EST)
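For anyone curious, the change amounts to a group check before the Akismet lookup; this is just the shape of it (using the standard getGroups() call), not a verbatim diff of the extension code:
  # Shape of the change, not a verbatim diff: bail out before the Akismet lookup for validated users.
  global $wgUser;
  if ( in_array( 'validated', $wgUser->getGroups() ) ) {
      return true;  # skip the spam check entirely
  }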
Oh, and the 403 errors are the result of an IE6 XSS vulnerability fix, as detailed here and here. Not sure what to do about that, besides maybe making the Apache mod_rewrite rule only apply to browsers with IE6's user agent string... Mishal (talk) (MUD finger) 06:11, 10 February 2012 (EST)
Great, thanks! Edited those pages and put up a note there. The 403 issue isn't a big deal (easy enough to see what a template is anyway); I just found it very puzzling. --Ilde 18:10, 10 February 2012 (EST)

Could you please add me to the white list? -Juppie

Sorry, didn't see this until now :/ Will see if I can make this thing email me about edits... Mishal (talk) (MUD finger) 11:15, 13 May 2012 (EDT)

certclick*

Is it possible to blacklist usernames matching /^certclick.*/i? They seem to be rather pervasive at the moment. Zexium 16:04, 9 July 2012 (BST)
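One way that might work, assuming a LocalSettings.php hack is acceptable rather than a proper blacklist extension, is the AbortNewAccount hook; a rough sketch, with the function name and message text made up for illustration:
  # Rough sketch: reject new accounts whose names match the spam pattern.
  $wgHooks['AbortNewAccount'][] = 'rejectCertclickAccounts';
  function rejectCertclickAccounts( $user, &$message ) {
      if ( preg_match( '/^certclick/i', $user->getName() ) ) {
          $message = 'Account creation has been blocked.';  # illustrative message text
          return false;  # abort the account creation
      }
      return true;
  }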