CMcC would like it to be known that (Josh) is a sock puppet of He Who May Not Be Named, [unperson], Robert [unperson], whose name is Legion. This is known technically and indubitably, and not merely by textual analysis (which nonetheless shows all the hallmarks of [unperson]-thought). [1] and [2] give you some good background on this rapscallion, this netkook, this irresistible force of a lesser nature.

== Information required ==

(Josh) Since the previous software running this site was working fairly well, why was it replaced in the first place? Could we have a little information on what's going on?

CMcC thought it was all public information.

The previous incarnation was swamped by spiders. It was designed with locks and a lock-breaking mechanism with a timeout of 10 minutes. The spidering was such that the timeouts expired on valid edits, enabling multiple edits to be partially committed, thus corrupting the data.

It was considered a good idea to try to harden the server - to exclude rampant spiders. There was a mood for change, and change was necessary to prevent the inevitable repetition of the corruption.

After about a fortnight, nothing was being done, so I decided I'd be willing to port the backend to my Wub front end, on the basis that I needed to write hardening of that type anyway, and that Wub shouldn't suffer from the same network issues.

In the absence of any practical alternative, or of anyone to fix the problem in the time frame needed, that's what I chose to do. I get the benefit of testing Wub in a heavy-duty application. The wiki gets to be hardened against attack and accident.

When someone comes up with a better working implementation, I'm more than happy to hand it off to them. To that end, the wikit code is currently in Subversion and will be constituted as its own project. I don't plan to do much in terms of extending wikit beyond the functionality it had, but some people like jdc seem to have plans and the willingness to put them into action. Feel free to contribute!
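(To make the failure mode concrete, here is a minimal sketch of a lock-with-timeout scheme of the kind described above. It is not the actual Wikit code; the lock file name and the procedure names are invented for illustration.)

   # One lock file guards the whole wiki database; a lock older than
   # the timeout is presumed stale and is broken.  Under heavy
   # spidering the server ran so slowly that locks held by valid
   # edits aged past the timeout and were broken, letting two writers
   # commit partial edits at once - hence the corruption.
   set lockfile wikit.lock
   set timeout  600               ;# 10 minutes, as described above

   proc acquire-lock {} {
       global lockfile timeout
       if {[file exists $lockfile]
               && [clock seconds] - [file mtime $lockfile] > $timeout} {
           file delete $lockfile  ;# break a (presumed) stale lock
       }
       if {[catch {open $lockfile {WRONLY CREAT EXCL}} fd]} {
           return 0               ;# someone else holds the lock
       }
       close $fd
       return 1
   }

   proc release-lock {} {
       global lockfile
       file delete $lockfile
   }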
(Josh) Thanks Colin for your great efforts! Great attitude! Great spirit of initiative! You rolled up your sleeves, you spit in your hands and you moved forward. Way to go!

Unfortunately, in all modesty, I must admit that I am not versed enough in this sort of problem, but perhaps others are, and they might contribute solutions. They might even be able to contribute an algorithm of some kind. I doubt it, though, since you are playing in a very, very specialized area. But let's remain optimistic.

My 2 Euros: wouldn't it be a good idea to only let in participants with passwords, the same way it is done in the chat? That way vandals wouldn't be able to vandalize the wiki, and we could go back to the old Wikit. It seems to me (and to a lot of wiki webmasters) that times have changed: there are way too many cuckoos out there, so we cannot leave the gate open at night like we did in the old days. This wiki has always been very peaceful thanks to the fine and dedicated participants from all around the world; the participants have never caused a single problem here. Vandals are the ones who screwed up the wiki. They shouldn't have access to it in the first place. We should close the gate.

stevel Josh, this is a regular question and there's a regular answer ;)

Consider the analogy of a shop window. Occasionally you get vandalism, but the solution isn't to board up the shop window; rather, it is to replace the glass on the rare occasions it is smashed, and perhaps to install some security lighting. The wiki is Tcl's shop front, so we want to avoid boarding it up. One design goal of the wiki is to avoid barriers to people contributing (even if that means occasional vandalism). That's why we don't require passwords.

(Josh) Great answer! Thanks Steve. IMHO, in the absence of a technical solution, the implementation of a password system could nevertheless be the solution. Is it possible to examine how other wikis have fixed this very same problem and what solutions they implemented? I am sure they must also have been attacked by spiders.

stevel No, the implementation of a password system is explicitly not what we want. This decision has been quite deliberate and well considered over a number of years. Passwords are a barrier to people contributing, and they don't stop spammers.

The spidering issue has been dealt with via a honeypot (visit wiki page 5 if you want to see it in action; a sketch of the general idea appears a little further down). Also, forcing people to register before editing means we get a cookie on their browser, so we can detect persistent spammers should that become necessary. And once the revisions are working again, it will be easier to restore pages after vandalism. I'm not suggesting this system is perfect, but it is sufficient for now and preserves the open nature of this wiki. We could do a lot worse.

dkf: There are a number of ways to implement anti-spam measures, and it has been a long-standing policy of the Tcler's Wiki to avoid techniques that discourage contribution. Instead, our policy in the past has been to rely on the community to spot spamming and revert it rapidly. A few technical components supporting that policy are not yet back online, but experience shows that spam isn't a big problem when there is a large community of vigilant Recent Changes watchers.
However, by encouraging people to always contribute with a consistent ID (something which was not consistently done before), it becomes easier to trace the activities of persistent and annoying scum and to put in place measures to deal with them as necessary.

(Josh) We certainly all trust you guys to do what is best for this wiki. Interesting! I am new to this cookie security approach.

In a password system, a spammer (or a vandal) requests a password; he gets it, he spams, and he keeps on spamming (or vandalizing) until his password is revoked. Then he gets a new password using another e-mail address and starts the same behaviour all over again. Question: is that what you're suggesting?

The apparent problem with your cookie approach is that you end up blocking complete IP networks just to stop one individual from spamming, whereas with a password system you block only one spammer at a time. But then again, I could very well be mistaken. Since the installed cookie cannot be edited, you could stop Joe Blow@142433 so that he can't post from his computer anymore, while JamesK@142433 would still be able to post. Am I right? Question: can you actually stop anyone from deleting the cookie on his computer and taking a new one?

I went to page 5. Hey, you need good eyes to read those characters! I am sure more than one honest participant will be caught in the web!

I also tried to make sense of what Colin wrote above: "It was designed with locks and a lock-breaking mechanism with a timeout of 10 minutes. The spidering was such that the timeouts expired on valid edits, enabling multiple edits to be partially committed, thus corrupting the data." It is very well written and well formulated, but perhaps not clear enough for non-experts like me. Question: what does this all mean? Can you provide examples? No offense! I am not trying to be mean, difficult, or nosy; it's just that when I don't understand something, I ask questions. That's how I learn. :-)

DKF: Two reasons why I'm not giving details:
- There are no fixed rules anyway; we can tune our response as we see fit. (Let the Kangaroo Court decide their fate!)
- I don't want to give spammers a recipe for working around our response.
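(Going back to the honeypot stevel mentioned: without giving away anything specific, the general technique might look roughly like the sketch below. The trap URL and all procedure names are invented for illustration; this is not the actual wiki code.)

   # A trap URL that robots.txt disallows and that no human should
   # ever follow.  Any client that requests it anyway is assumed to
   # be a rogue spider, and its address goes on a ban list.
   set banned [dict create]

   proc deny {ip}        { return "403 Forbidden" }
   proc serve-page {url} { return "200 OK: $url" }

   proc handle-request {ip url} {
       global banned
       if {[dict exists $banned $ip]} {
           return [deny $ip]      ;# already trapped
       }
       if {$url eq "/_honeypot"} {
           dict set banned $ip [clock seconds]
           return [deny $ip]
       }
       return [serve-page $url]
   }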
And the light came! :-)

LV Josh, basically, the best guess as to what happened is that the code preventing multiple people from updating the single wiki file at the same time failed, and the single file was corrupted. The log files seemed to indicate that, around the time of the problem, a large number of files were being requested by a particular address. These included the edit-page URLs, each of which initiated a timer originally designed to give someone about 10 minutes to edit and submit a page. When those software timers ran out, the system began allowing further requests for those pages. It seems likely that multiple updates occurred while page edit locks were being lost this way, resulting in corrupted data.

(Josh) Thanks Larry! Now I get it! Or I am very close to getting it.

I presume the edit-conflict algorithm works this way: when A clicks on the Edit button, he has 10 minutes to save his edit; he has more or less taken control of the page. During those ten minutes anyone else can click on the Edit button for that page, but when he tries to save it, he will be shown the edit-conflict message and won't be able to save. Of course it would make more sense to disallow clicking the Edit button at all for a page under time-out, but that is not how it is generally done: anyone can click the Edit button during the time-out, but nobody else can save before the 10-minute time-out ends.

When the spider attack occurred, the server went berserk and the edit-conflict and time-out code did not function properly (to say the least! In fact it was total chaos), and therefore many edits were damaged.

Considering the above, doesn't it make sense to put in place a mechanism that won't allow any user to make more than, say, 5 edits every 5 minutes? I have seen this done on other wikis and it works fairly well. That way no creep would ever be able to attack the wiki's server again! Simple!

I suspect this limited-editing code was put in place on other wikis to counter such server attacks, but also to curb the enthusiasm of certain obsessive-compulsive users who were posting way too often :-) It worked very well. I believe this function was called time-limited editing or timed edits. I remember now: "slow posts" is what it was called.

This being said, it was a very wise decision to close the wiki altogether to fix the problem, and another wise decision to go slowly but surely in fixing it. I would appreciate comments concerning the possible implementation of this slow-post solution. This timed-edits solution, coupled with the Tcl Oceania Connection's efforts to strengthen the server, should provide an excellent solution to the current problem, I believe. The implementation of slow posts could be made even simpler on top of the current cookie system, so we would come full circle and have an excellent solution.

The Oceania connection is fortifying the castle so that it can resist the enemy's attacks. Excellent! But it would also make a lot of sense to make sure the enemies can't use the road to the castle! With such protection, I fail to see how a spider could put any strain on the server, since it couldn't make more than 5 posts every 5 minutes. Hopefully the code will be so good that it puts the strain on the creep's computer if he tries to spider us. That I would like! :-) And I'm sure I wouldn't be the only one! :-)
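(To make the slow-post idea concrete, here is a minimal sketch, assuming each save can be keyed by some client identifier such as the registration cookie discussed above. All names are invented for illustration.)

   # Allow at most $maxEdits saves per client within a sliding
   # $window of seconds; reject anything beyond that.
   set maxEdits 5
   set window   300               ;# 5 minutes
   array set history {}

   proc slowpost-allowed {client} {
       global maxEdits window history
       set now [clock seconds]
       set recent {}
       if {[info exists history($client)]} {
           # keep only the timestamps still inside the window
           foreach t $history($client) {
               if {$now - $t < $window} { lappend recent $t }
           }
       }
       if {[llength $recent] >= $maxEdits} {
           set history($client) $recent
           return 0               ;# over the limit; reject this save
       }
       lappend recent $now
       set history($client) $recent
       return 1
   }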
== Problem caused by the download of the wiki snapshot ==

Lars H: I think there is something wrong with LV's explanation. Clicking "Edit" doesn't lock the page; it only gives you a page with (a) an edit box, (b) "Save" and "Cancel" buttons, and (c) a hidden piece of data recording the version (basically the time it was last saved) of the page being edited. Locking the page only happens when you click "Save", and the wiki is only changed if the current version of the page at the time you save is the same version you checked out (if they differ, you get an "Edit conflict" page). Spiders requesting edit pages should therefore not be a problem (unless they also start clicking "Save", but that seems unlikely, since it's an entirely different operation at the HTTP level). Restricting the number of edits per user would therefore not make sense.

I've seen it claimed that the spidering problem was rather due to downloads of wiki snapshots, since generating a snapshot requires (required?) locking the entire wiki, and that could add up to enough time that locks set by ordinary editing operations in progress were broken. It would be interesting to see an explanation of how the switch to Wub addresses this issue...

(Josh) Larry's explanation was excellent. It might have been my interpretation that wasn't quite right. In fact you might very well be mistaken yourself, Lars. I believe the algorithm works this way, or somewhere along these lines:

1. A user requests a page, let's say Ask and you shall be given # 5.
2. He edits it and wants to save it. He clicks on Save.
3. The program looks for this page in the "pages requested for save" database.
4. It gets clearance: no one has requested to save this particular page, and there is no time-out on it. The ten-minute page time-out starts.

Now someone comes along during the ten-minute delay and requests the very same page. Fine. He clicks on Edit. He sees the edit box, the Save button, the Cancel button, etc. When he wants to save, the process described above starts again (at step 3). The page is in the timed-out pages database, so there is no way to save it: you get an edit conflict.

What the creep probably did was request hundreds of pages for editing and saving, thus screwing up the database and the code. Therefore my suggestion makes a lot of sense: if he had been on slow posts, he wouldn't have been able to save more than 5 pages every 5 minutes, and all his save requests would have been rejected except one or two.
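(For comparison, the save-time version check Lars describes might look roughly like the sketch below. Plain dicts stand in for the real page store; everything here is invented for illustration.)

   # The edit form carries a hidden copy of the version (here, the
   # last-save time) of the page it was generated from; a save is
   # rejected if the page has changed in the meantime.
   set pages    [dict create]     ;# page name -> text
   set versions [dict create]     ;# page name -> last save time

   proc edit-page {name} {
       # what "Edit" returns: the text plus a hidden version field
       global pages versions
       set text [expr {[dict exists $pages $name] ? [dict get $pages $name] : ""}]
       set v    [expr {[dict exists $versions $name] ? [dict get $versions $name] : 0}]
       return [list $text $v]
   }

   proc save-page {name newText baseVersion} {
       global pages versions
       set v [expr {[dict exists $versions $name] ? [dict get $versions $name] : 0}]
       if {$v ne $baseVersion} {
           return "Edit conflict" ;# page changed since it was checked out
       }
       dict set pages    $name $newText
       dict set versions $name [clock seconds]
       return "Saved"
   }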
== Lars' question; Josh's solution ==

In any case, please do not forget to address Lars' issue as well. Lars writes: "I've seen it claimed that the spidering problem was rather due to downloads of wiki snapshots, since generating a snapshot requires (required?) locking the entire wiki, and that could add up to enough time that locks set by ordinary editing operations in progress were broken. It would be interesting to see an explanation of how the switch to Wub addresses this issue..."

(Josh) Larry has written: "The log files seemed to indicate that, around the time of the problem, a large number of files were being requested by a particular address." Requested for what? For editing?

In any case, I would solve the current problem in the following manner:

1. I would put back Wikit.
2. I would write and implement the slow-post mechanism, after getting the algorithm from a wiki webmaster who has implemented it on his own wiki. Why reinvent the wheel? We would then run Wikit with the slow-post mechanism added.
3. Colin would continue his work on Wub in peace. When it is ready, it would replace Wikit plus the slow-post function, if need be.
[Category Wikit|Category Discussion]