Hi folks. When our programmer commented that "our web form needs a CAPTCHA", I felt it was time, in 2011, to rethink the old dilemma... There must be (or I must invent) a better solution! Why?
Most of my clients hate CAPTCHAs. They are typically male and female managers and decision makers aged 40 to 60, not nerds like myself. Now, admittedly, I often feel like a robot myself: I have to squint to read those meaningless obfuscated letters in the CAPTCHA... Sometimes I even fail when filling them in! Go back, etc. If that is a turnoff for me, just imagine what it must be like for human customers. So: shouldn't forms have better A.I. built in by now to smell the difference between spammy robots and real human visitors/clients?
The Big Picture
Tell humans and robots apart by awarding credibility points: 100 points = human, 0 points = robot.
AWARD human credibility points for (a browser-side sketch follows this list):
- human mouse movements that don't follow neat mathematical patterns
- a non-instantaneous reading delay between page load and the first input(s) in the form
- natural typing rhythm in the form: measurable delays between letters, spaces and completed words
- typical human hesitation: pausing, deleting, rephrasing, etc.
- the global flooding threshold not being reached (total number of submissions within the last hour)
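To make the award side concrete, here is a minimal browser-side sketch in TypeScript of the first three signals (mouse-path curvature, reading delay, typing rhythm). The form id "contact", the point values and all thresholds are my own assumptions for illustration, not an existing library's API:

```typescript
// Sketch of the "award" signals. creditPoints, the "contact" form id
// and the numeric thresholds are hypothetical.
let creditPoints = 0;
const pageLoadedAt = Date.now();

// Human mouse paths curve; bots tend to move in straight lines or jumps.
const samples: { x: number; y: number }[] = [];
document.addEventListener("mousemove", (e) => {
  samples.push({ x: e.clientX, y: e.clientY });
  if (samples.length === 30) {
    const [a, b, c] = [samples[0], samples[14], samples[29]];
    // A cross product near zero means the three points are collinear.
    const cross = (b.x - a.x) * (c.y - a.y) - (b.y - a.y) * (c.x - a.x);
    if (Math.abs(cross) > 100) creditPoints += 20; // curved, human-like path
  }
});

const form = document.getElementById("contact") as HTMLFormElement;

// Reading delay: award points only if the first input arrives a few
// seconds after page load, not instantly.
form.addEventListener("input", function onFirstInput() {
  if (Date.now() - pageLoadedAt > 3000) creditPoints += 20;
  form.removeEventListener("input", onFirstInput);
});

// Typing rhythm: humans type with irregular gaps between keystrokes.
const gaps: number[] = [];
let lastKeyAt = 0;
form.addEventListener("keydown", () => {
  const now = Date.now();
  if (lastKeyAt > 0) gaps.push(now - lastKeyAt);
  lastKeyAt = now;
  if (gaps.length === 20) {
    const mean = gaps.reduce((s, g) => s + g, 0) / gaps.length;
    const sd = Math.sqrt(gaps.reduce((s, g) => s + (g - mean) ** 2, 0) / gaps.length);
    if (sd > 30) creditPoints += 20; // irregular timing looks human
  }
});
```

A real implementation would of course report these signals to the server rather than trust a client-side score, since a spam bot controls its own JavaScript environment.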
RETRACT credibility points for (a server-side sketch follows this list):
- suspiciously instant pasting or completion of one or more form fields
- website hyperlinks in the form (very spammy and uncommon in most contact forms)
- a single IP rapidly reading many, many pages, with under one second of viewing time on every no-form page
- a honeypot URL/email field (present in the HTML but invisible to humans) being populated or non-empty
- the visitor's IP appearing in one of the worldwide networks of databases of spammy sites
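And a rough server-side counterpart in TypeScript (Node.js) for the retract signals: instant completion, links in the message, the honeypot field, flooding, and a DNS-based blocklist lookup. The field names (message, website), the penalty values and the zen.spamhaus.org zone are my assumptions; substitute whatever blocklist network you trust:

```typescript
// Sketch of the "retract" signals, run on the server at submit time.
import { promises as dns } from "node:dns";

const recentByIp = new Map<string, number[]>(); // ip -> submission timestamps

export async function retractPoints(
  ip: string,
  fields: Record<string, string>,
  fillTimeMs: number
): Promise<number> {
  let penalty = 0;

  // Suspiciously instant completion of the whole form.
  if (fillTimeMs < 2000) penalty += 30;

  // Hyperlinks in the message body are very spammy in a contact form.
  const links = (fields.message ?? "").match(/https?:\/\//g) ?? [];
  penalty += Math.min(links.length * 15, 45);

  // Honeypot: a field humans never see must stay empty.
  if ((fields.website ?? "") !== "") penalty += 50;

  // Flooding: many submissions from one IP within the last minute.
  const now = Date.now();
  const times = (recentByIp.get(ip) ?? []).filter((t) => now - t < 60_000);
  times.push(now);
  recentByIp.set(ip, times);
  if (times.length > 3) penalty += 30;

  // DNS blocklist lookup: a listed IPv4 address resolves under the zone;
  // a clean one fails with NXDOMAIN and throws.
  try {
    const reversed = ip.split(".").reverse().join(".");
    await dns.resolve4(`${reversed}.zen.spamhaus.org`);
    penalty += 40; // listed as a spam source
  } catch {
    /* not listed, or lookup failed: no penalty */
  }

  return penalty;
}
```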
With more than 50% human credibility, allow the form to be sent without a CAPTCHA. Below 50%, force an easy CAPTCHA puzzle to be shown. Below 25%? Show a difficult CAPTCHA, like today's default eye-squeezing nonsense-word ones. A sketch of that decision logic follows.
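The thresholds translate into a tiny decision function; a sketch, assuming the awards and penalties above have already been combined into a 0-100 score:

```typescript
// Map the final credibility score to one of three challenge levels.
type Challenge = "none" | "easy-captcha" | "hard-captcha";

function challengeFor(credibility: number): Challenge {
  if (credibility > 50) return "none";          // trusted: send directly
  if (credibility >= 25) return "easy-captcha"; // unsure: simple puzzle
  return "hard-captcha";                        // likely bot: full CAPTCHA
}
```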
Currently, websites assume we are 0% human by default, unless we prove otherwise. I feel it's time to reverse that false prejudice!
Imagine the user-friendliness
Your site distinguishing itself from others, showing your audience that it KNOWS the difference between a robot and a human. Imagine the advantage. I am trying to capture the essence of that distinguishing edge.
Obviously this question centers on inventive ideas and new A.I. code. Let us for a minute not think in existing .js, .css, .php, .cfm, etc., but first try to distinguish human from spam-server behavior, then think of simple, smart ways that provide a better, more user-friendly alternative than forcing your clients/visitors to solve CAPTCHAs.
The bounty on my question goes to the most elegant/clever/simple answer. Important side note: sometimes creative new ideas or questions are quickly or jealously closed as off-topic due to a lack of existing old answers. So it was with my question! As an art director I can say that this is the greatest bottleneck to innovation and progress. Fortunately some angels voted for reopening, and now it's a wiki! Thanks for not accepting yesteryear's remedies, and for pushing innovation forward!