PerlMonks
Duplicate of:nnnnnn field in the Consideration Nodelet? by BrowserUk (Patriarch)
on Feb 17, 2003 at 22:09 UTC ( [id://236130]=monkdiscuss )
There was a recent node from dws regarding accidental duplicates, and the most frequent reason for consideration is Duplicate. Unfortunately, the usual result is that both copies of the duplicate end up with replies, and both copies get considered for deletion. And tonight we had the situation of one node that was considered, approved, and front-paged all simultaneously. Would it be possible to add a few interlocks somewhere?

First, a node would not reach approved status until it had received more than one Approval click. Once a node has received a certain number of approvals, it would be immune to consideration except by an editor/pmdev/god.

Then, as duplicates are so common, add a dup_id field to the consideration nodelet, so that the 'other' node_id had to be supplied when considering for reasons of duplication; once one of the pair was considered as a duplicate, the other would be locked out from being considered on the same basis. The check would be done upon submit rather than when the nodelet is built: submitting a node for reason of duplication when the corresponding other node had already been so considered would reject the attempt. It should not be hard to at least verify that the 'other' node_id supplied was by the same author, which would prevent the situation I have seen a couple of times where nodes with similar titles but different authors were considered as duplicates.

It would also seem to make sense to withhold a Frontpage request from being actioned while a node is under consideration, or while it is mentioned as the Other node in a duplicate consideration.

Examine what is said, not who speaks. The 7th Rule of perl club is -- pearl clubs are easily damaged. Use a diamond club instead.
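A minimal sketch of the submit-time check proposed above. This is not PerlMonks code: the subroutine, field names (node_id, author_id), and the lock-out table are all hypothetical, assuming node records arrive as hashrefs and that the site keeps some record of which nodes are already in a duplicate consideration.

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Hypothetical validation run when a Duplicate consideration is submitted.
# $node           - hashref for the node being considered
# $other          - hashref for the node named in the dup_id field (or undef)
# $considered_dup - hashref keyed by node_id of nodes already in a dup consideration
sub validate_dup_consideration {
    my ($node, $other, $considered_dup) = @_;

    # The 'other' node_id must be supplied for a Duplicate consideration.
    return (0, 'dup_id required') unless defined $other;

    # Both halves of the pair must be by the same author.
    return (0, 'authors differ')
        unless $node->{author_id} == $other->{author_id};

    # If either node of the pair is already in a duplicate consideration,
    # lock this submission out -- only one of the pair may be considered.
    return (0, 'pair already considered')
        if $considered_dup->{ $node->{node_id} }
        || $considered_dup->{ $other->{node_id} };

    return (1, 'ok');
}
```

The key design point is that the check happens at submit time, not when the nodelet is built, so two monks racing to consider both copies cannot each pass a stale check.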
Back to Perl Monks Discussion