Commenter I Ain't Spartacus was the first to pick the low-hanging fruit here, replying with: "If it takes you 3 tries, and 'today's artificial intelligence technology can solve even the most difficult variant of distorted text at 99.8 per cent accuracy' - then does that mean you're actually not a human?!"
He continued: "I've never been able to get my Captcha ratio much above 1 in 3 either. So does that mean I'm some sub-standard part of the Matrix?"
Performing worse than a bot is indeed something to be quite ashamed of, although the bot does at least have the advantage of being programmed to do only that one task, and isn't managing 20 tabs of streaming video in the background.
There Is An Advert That Never Goes Out
Commenters beneath TechCrunch's story about the new crowd-computing system Google has come up with were mostly of the serious type, debating how people with disabilities might cope, especially if they rely on some sort of software helper.
Stephen Hawking, for example, would be locked out of Ticketmaster's shopping basket were he to order his computer assistant to attempt to buy tickets to see Morrissey at the O2.
The ethics of Google using people power to sort cat photos was also called into question, with reader Wesley Joseph asking: "How is it ethical to ask people to type a word or click on a picture for 'security' or 'non-robot verification' purposes and then use that information for corporate financial gain? If I'm doing something that makes them money and they are lying to me about what I am doing, that's wrong."
Derek Williamson thinks that opinion is a bit hypocritical, though, saying: "Digitizing books and identifying road numbers for Maps/Streetview are both fairly useful to society as a whole as well, though. So you were okay with it when it was completely useless, but it's wrong if it's actually somewhat useful?"
And Sebastian Vidal further heaped shame upon Wesley, with his bitter: "Newsflash: Google uses everything for corporate financial gain - it's kind of their thing."