PerlMonks |
From my experience building and blocking web robots and script kiddies, the following have been the most effective:
- An extremely complex cookie mechanism in a dynamic external JavaScript file.
- Cookies sent in with random images on the page.
- IP address tracking.
- Tracking what order your parameters come in on which browsers.
- Whether they sent the request as a POST or a GET.
- Whether requests are coming in at regular intervals (if they arrive at intervals of less than 5 seconds, it is probably a robot of some kind).
- Whether the client has requested to log on under a vastly different user name.
- If you are trying to keep web robots from downloading your entire site, you can give people bandwidth quotas (I think Apache does this, though I'm not sure).
- Instead of sending the "he is logged on, this is his id" cookie with the logged-in page, send it from a style sheet on that page.

All of these methods can be worked around (IP tracking being the hardest), but implementing a set of them could make someone think twice about how badly they want your content.

-Douglas

In reply to Re: Password hacker killer
by GermanHerman
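The request-interval heuristic mentioned in the post (clients requesting less than 5 seconds apart are probably robots) could be sketched roughly like this. This is a minimal illustration, not the poster's actual implementation; the 5-second threshold comes from the post, while the `looks_like_robot` name and the three-strike count are assumptions:

```python
import time

MIN_INTERVAL = 5.0   # seconds; per the post, faster than this looks robotic
STRIKES = 3          # assumed: consecutive fast requests before flagging

last_seen = {}       # client ip -> timestamp of its previous request
fast_hits = {}       # client ip -> count of consecutive too-fast requests

def looks_like_robot(client_ip, now=None):
    """Record a request from client_ip; return True once it has made
    STRIKES consecutive requests less than MIN_INTERVAL apart."""
    now = time.monotonic() if now is None else now
    prev = last_seen.get(client_ip)
    if prev is not None and now - prev < MIN_INTERVAL:
        fast_hits[client_ip] = fast_hits.get(client_ip, 0) + 1
    else:
        fast_hits[client_ip] = 0   # a human-paced gap resets the counter
    last_seen[client_ip] = now
    return fast_hits[client_ip] >= STRIKES
```

A real deployment would also expire stale entries and would key on more than the IP address, since (as the post notes) IP tracking alone can be worked around.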