Do you still use reCAPTCHA or something homemade?

For us, we use jQuery to inject a hidden form field upon form submission that sends a short time-sensitive and cookie-sensitive string that we validate on the server side.
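
A minimal sketch of that pattern, purely for illustration (the form id, field name, and `data-token` attribute are made up; the real string format isn't shown here):

```javascript
// Assume the server rendered a per-user, short-lived token into a data
// attribute on the form. The hidden field only appears if this handler
// actually runs, i.e. if the client executed the page's JavaScript.
$(function () {
  $('#contact-form').on('submit', function () {
    $('<input>', {
      type: 'hidden',
      name: 'form_token',
      value: $(this).data('token') // time-sensitive, cookie/user-bound string
    }).appendTo(this);
    // The form then submits normally with the extra field attached.
  });
});
```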

I’m always curious to know what everyone else is doing.

Using 2FA so that I can use Authy on my phone.

As an end user or as a web developer using their API?

I use it in an ASP.NET website for authentication. I found a tool that helps build the QR code that Authy can read (or you can manually add the code in Authy). The same tool validates the 6-digit code returned by Authy. Development is dormant at the moment, but if you want to know which one I used, I'll look it up.
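
Under the hood that flow is standard TOTP (RFC 6238): the QR code carries a shared secret, Authy derives a fresh 6-digit code from it every 30 seconds, and the server recomputes the same code to validate. A generic sketch of that check in Node, just to illustrate the idea (this is not the tool mentioned above):

```javascript
// Generic RFC 6238 TOTP check. `secret` is the raw key bytes; in practice
// it's the base32-decoded secret that the QR code handed to Authy.
const crypto = require('crypto');

function totp(secret, timeStepSec = 30, digits = 6) {
  const counter = Math.floor(Date.now() / 1000 / timeStepSec);
  const buf = Buffer.alloc(8);
  buf.writeBigUInt64BE(BigInt(counter));           // 8-byte big-endian counter
  const hmac = crypto.createHmac('sha1', secret).update(buf).digest();
  const offset = hmac[hmac.length - 1] & 0x0f;     // dynamic truncation
  const code = (hmac.readUInt32BE(offset) & 0x7fffffff) % 10 ** digits;
  return code.toString().padStart(digits, '0');
}

// Compare the 6-digit code the user typed (from Authy) against the
// code derived from the secret stored for that account.
function verify(userCode, secret) {
  return userCode === totp(secret);
}
```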

Yes, I use recaptcha.

I still use reCaptcha for basic form validation.
Can you explain a little more about how your solution works?
If it's generated on submit, how does it validate against a non-bot user?

I no longer use reCAPTCHA because I find the "find the road signs" puzzles designed to help their self-driving car research extraordinarily annoying.

I don't want to explain too much about how what I'm doing works here in this public forum, as we get a lot of spam bots that are designed specifically for DaniWeb. But I'll run you through the basics ...

Basically, what we do is use JavaScript to inject a hidden form input field upon pressing the Submit button. A lot of bots don't evaluate JavaScript on a page; they just send the HTTP request to submit the form without actually evaluating the page the form is on or loading the page's external JavaScript file, so they never trigger that code. If the input field doesn't exist, the server knows it's a bot.

Secondly, the input field contains an encoded string that only stays active for a couple of minutes and is unique to the user. Therefore, a human spammer can't submit the form manually, inspect the HTTP request being sent out, and then copy the string value used that one manual time into all future bot requests.
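
On the server, both checks fall out of the same field: reject if it's missing, and reject if the string is expired or doesn't match the user. A rough sketch in Node, purely illustrative (DaniWeb's actual stack and string format aren't shown in the thread):

```javascript
// The token is issued when the page is rendered and verified on submit.
// An HMAC ties it to the user and a timestamp, so copying one captured
// value into a bot doesn't work for long, or for anyone else.
const crypto = require('crypto');
const SECRET = process.env.FORM_TOKEN_SECRET; // server-side secret, never sent raw

function issueToken(userId) {
  const ts = Date.now().toString();
  const sig = crypto.createHmac('sha256', SECRET).update(userId + '.' + ts).digest('hex');
  return ts + '.' + sig;
}

function isValidToken(token, userId, maxAgeMs = 2 * 60 * 1000) {
  if (!token) return false;                         // field missing => the JS never ran
  const [ts, sig] = token.split('.');
  if (!ts || !sig) return false;
  if (Date.now() - Number(ts) > maxAgeMs) return false;  // older than a couple of minutes
  const expected = crypto.createHmac('sha256', SECRET).update(userId + '.' + ts).digest('hex');
  if (sig.length !== expected.length) return false;
  return crypto.timingSafeEqual(Buffer.from(sig), Buffer.from(expected)); // bound to this user
}
```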

We use something very similar to what Dani explained. But before getting to it, I would like to point out that the first step is server-side prevention of robots sending form data. Of course, even with a good server-side implementation you can't be 100% sure, but you minimize the risk. Let me give you an example: if the operation is a sign-up, it's not logical to accept more than one every minute from the same IP and/or session id and/or a hash of all the other info that you can get from the browser.
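
As a rough illustration of that kind of throttle (Express-style middleware with an in-memory store; a real setup would use something shared like Redis, and `req.sessionID` assumes a session middleware is in place):

```javascript
// Reject sign-up attempts that arrive less than a minute apart
// from the same IP + session fingerprint.
const lastAttempt = new Map();
const MIN_INTERVAL_MS = 60 * 1000;

function signupThrottle(req, res, next) {
  const key = req.ip + ':' + (req.sessionID || ''); // add any other browser info you can hash
  const now = Date.now();
  const prev = lastAttempt.get(key) || 0;
  if (now - prev < MIN_INTERVAL_MS) {
    return res.status(429).send('Too many sign-up attempts, try again in a minute.');
  }
  lastAttempt.set(key, now);
  next();
}

// app.post('/signup', signupThrottle, handleSignup);
```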

For client-side validation we use the mouseover event on an element that surrounds each submit button in order to generate a hash. Even if the robot runs JavaScript, it is really unlikely to fire an event that is bound not to the button itself but to a surrounding element. WebSockets through web workers help hide that process even more (how many robots can connect through WebSockets?), and even if they do, how many hours would anyone spend to understand how this implementation is done and how the hash is generated?
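
The hover part of that idea, sketched very roughly (class and field names are made up, and `makeToken` is just a stand-in for the private hash scheme, which is deliberately not described here):

```javascript
// Placeholder for the undisclosed hash scheme.
function makeToken() {
  return Math.random().toString(36).slice(2);
}

// Bind to the wrapper around each submit button, not the button itself.
// A bot posting the form directly (or firing a synthetic click) is
// unlikely to ever hover the surrounding element.
document.querySelectorAll('.submit-wrapper').forEach(function (wrapper) {
  wrapper.addEventListener('mouseover', function () {
    const input = document.createElement('input');
    input.type = 'hidden';
    input.name = 'human_token';
    input.value = makeToken();
    wrapper.closest('form').appendChild(input);
  }, { once: true }); // only needs to fire once per form
});
```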

Now that WebAssembly is available in most browsers, I am thinking (when I find some free time) of creating a WebAssembly process that completely hides the implementation on the client side (plus a new hash built from parts of the data to be sent). Of course there should be an alternative for old browsers that don't support WebAssembly, but because the percentage of those is very low I wouldn't even mind using reCAPTCHA for those rare occasions.

Using a CAPTCHA is easy (for the programmer / developer) and pretty safe, so I recommend it in very critical parts of the application, but I agree that it creates an awful user experience. I am also thinking of showing it (I haven't yet) only when a request passes the client-side bot prevention that we have now but fails the server-side bot prevention (one more thing on the TODO list ;) )
