Re: enabling "Submit" button after X seconds.
by tinita (Parson) on Jun 13, 2010 at 12:11 UTC
This is not a Perl question. Your question is about HTML/JavaScript, but the problem itself cannot be solved with JavaScript like this.
A robot just submits the form. It doesn't care whether there is any JavaScript in the HTML that does something with the submit button. The only way to make robots' lives more difficult is to demand something "intelligent" from the user, as CAPTCHAs do.
Re: enabling "Submit" button after X seconds.
by bradcathey (Prior) on Jun 13, 2010 at 13:27 UTC
Robots work by filling out every field on a form. I learned the trick of hiding (using CSS) a text field with an innocuous name and then validating it on the Perl side of things. If the field contains a value, I reject it.
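A minimal sketch of that server-side check (the field name "website" is a hypothetical example; any innocuous name hidden via CSS will do):

```perl
use strict;
use warnings;

# Honeypot check: humans never see the CSS-hidden field, but
# form-filling robots populate every input they find.
# "website" is an illustrative field name, not a fixed convention.
sub is_robot {
    my (%params) = @_;
    return (defined $params{website} && $params{website} ne '') ? 1 : 0;
}

print is_robot(name => 'Alice')                  ? "robot\n" : "human\n";  # human
print is_robot(name => 'Bot', website => 'spam') ? "robot\n" : "human\n";  # robot
```

In a real CGI script you would read the field with `$q->param('website')` and reject the request before doing any further work.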
If you did want to create a delay, it is possible with JavaScript and jQuery, but that's too OT for the Monastery.
—Brad "The important work of moving the world forward does not wait to be done by perfect men." George Eliot
Re: enabling "Submit" button after X seconds.
by ikegami (Patriarch) on Jun 13, 2010 at 17:37 UTC
Those robots don't use the button per se. Disabling it is useless.
What you could do is include the time the form was generated in a hidden field, then compare the current time against that field when the form is submitted.
I'm not saying that "disabling" the submit button for a period of time is the best solution, just how to achieve it.
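A minimal sketch of that comparison, assuming a hidden field named `ts` and an arbitrary 5-second threshold:

```perl
use strict;
use warnings;

my $MIN_SECONDS = 5;   # assumed minimum time a human needs to fill the form

# When generating the form, embed the current time, e.g.:
#   printf '<input type="hidden" name="ts" value="%d">', time;

# When the form comes back, compare the embedded value to the clock.
sub submitted_too_fast {
    my ($ts, $now) = @_;
    return 1 unless defined $ts && $ts =~ /^\d+$/;   # missing or mangled field
    return ($now - $ts) < $MIN_SECONDS ? 1 : 0;
}

print submitted_too_fast(1000, 1002) ? "reject\n" : "accept\n";  # 2s gap: reject
print submitted_too_fast(1000, 1010) ? "reject\n" : "accept\n";  # 10s gap: accept
```

As noted further down the thread, a plain timestamp can be forged by a targeted robot, so the field may need to be signed or encrypted.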
With reCAPTCHA it worked just fine until now, for many, many months. I'm sure reCAPTCHA was broken recently, because I started to get automatic submissions (from different IPs) every 4-5 seconds over 1-2 hours, resulting in thousands of submissions and a heavy server load. Since disabling the button is impossible to achieve with Perl, I have to record the first access and the last access; if the difference is less than X seconds, the submission will not be processed. That seems to be the only solution at this time.
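A sketch of that metering idea, keeping first-access times in a plain hash keyed by IP (a real CGI would persist them in a session store or database; the 5-second threshold is arbitrary):

```perl
use strict;
use warnings;

my $MIN_SECONDS = 5;   # assumed minimum human fill-out time
my %first_seen;        # illustrative in-memory store; persist this for real use

# Record a visitor's first access (later accesses keep the original time).
sub note_access {
    my ($ip, $now) = @_;
    $first_seen{$ip} //= $now;
}

# Refuse submissions arriving less than $MIN_SECONDS after first access.
sub allow_submission {
    my ($ip, $now) = @_;
    return 0 unless exists $first_seen{$ip};   # never saw the form page
    return ($now - $first_seen{$ip}) >= $MIN_SECONDS ? 1 : 0;
}

note_access('10.0.0.1', 1000);
print allow_submission('10.0.0.1', 1002) ? "accept\n" : "reject\n";  # reject
print allow_submission('10.0.0.1', 1010) ? "accept\n" : "reject\n";  # accept
```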
The robot he described is a generic robot. It *won't* change the value to an older timestamp. It doesn't matter that it *could*.
If we were talking about a robot that specifically targets his site, then you'd have a point. There are solutions for that too, such as the aforementioned encryption. (Public-key encryption is unnecessary, though; symmetric encryption would be faster.)
A hidden field can be filled by those robots easily (it's just another input type).
Use public-key encryption, and encrypt the timestamp.
If you can't decrypt the timestamp, or it's missing, you know a robot is trying to circumvent your throttling attempts, so you throttle it :)
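A sketch of the protected-timestamp idea using a keyed MAC from the core Digest::SHA module rather than public-key crypto (the symmetric route mentioned elsewhere in the thread is simpler and faster; `$SECRET` is a hypothetical server-side key):

```perl
use strict;
use warnings;
use Digest::SHA qw(hmac_sha256_hex);

# The server signs the timestamp with a secret key; a robot cannot
# forge an older timestamp without knowing the key.
my $SECRET = 'change-me-server-side-secret';   # hypothetical key

# Produce the value to embed in the hidden field: "timestamp:mac".
sub sign_ts {
    my ($ts) = @_;
    return "$ts:" . hmac_sha256_hex($ts, $SECRET);
}

# Return the timestamp if the MAC checks out, undef otherwise.
sub verify_ts {
    my ($field) = @_;
    my ($ts, $mac) = split /:/, ($field // ''), 2;
    return undef unless defined $mac && $ts =~ /^\d+$/;
    return hmac_sha256_hex($ts, $SECRET) eq $mac ? $ts : undef;
}

my $field = sign_ts(time);                # embed this in the hidden input
print defined verify_ts($field)     ? "ok\n" : "tampered\n";  # ok
print defined verify_ts('1:abcdef') ? "ok\n" : "tampered\n";  # tampered
```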
Re: enabling "Submit" button after X seconds.
by Corion (Patriarch) on Jun 13, 2010 at 15:17 UTC
Re: enabling "Submit" button after X seconds.
by davies (Prior) on Jun 15, 2010 at 22:29 UTC
I bring to the problem a mind totally unencumbered by theory. In other words, I haven't a clue what I'm talking about. But it occurs to me that there are certain fields that must be completed on a web form. I believe (perhaps wrongly) that it's possible to hide fields from human view but not from robots. Therefore, assuming "name" is compulsory, would it be possible to have four fields, namea, nameb, namec and named, all in the same place but only one of which, chosen at random, is visible to the human? Anything filling in one of the hidden fields would have to be a robot.
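A rough sketch of that idea (field names are illustrative, and which field is visible would have to be remembered server-side, e.g. in the session):

```perl
use strict;
use warnings;

# Render four "name" fields, hide three with CSS, and remember
# server-side which one the human can actually see. Any value
# arriving in a hidden field must have come from a robot.
my @fields  = qw(namea nameb namec named);
my $visible = $fields[ int rand @fields ];   # store this in the session

my $html = '';
for my $f (@fields) {
    my $style = $f eq $visible ? '' : ' style="display:none"';
    $html .= qq{<input type="text" name="$f"$style>\n};
}
print $html;

# On submission: reject if any field other than $visible holds a value.
```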
If I've wasted anyone's time with a daft idea, I apologise.
Regards,
John Davies
The human behind the robot will run some tests first. If those hidden fields aren't supposed to be filled, the robot won't fill them either.
Back to my issue: I was able to solve it by leaving reCAPTCHA in place and also setting a short-lived cookie, plus a few other checks based on my website's and scripts' particularities.
I also reported this issue to reCAPTCHA, and I'm waiting for their response. I'm sure they are working on a solution right now.