Liberty Alliance: CAPTCHA update - draft for discussion

Hi Again,

According to Wikipedia, the Liberty Alliance was dissolved in 2009:
https://en.wikipedia.org/wiki/Liberty_Alliance

It gave way to the Kantara Initiative, which is still active:
https://en.wikipedia.org/wiki/Kantara_Initiative

Janina

White, Jason J writes:
> We agreed at the meeting to circulate Scott's latest draft and to continue to coordinate the editing work on list. Please see the attached draft (assuming that the list processor distributes and archives it).
> 

> Abstract
> 
>    The use of CAPTCHAs in recent years has seen a number of different
>    approaches being used to tell humans and robots apart. While the
>    traditional identification of text in an image remains popular, other
>    mechanisms such as multi-device authentication and the Google reCAPTCHA
>    are gaining in prominence.
> 
>    However, for people with disabilities, it often remains the case that
>    the CAPTCHA presented determines that the user is not human. In
>    addition, research suggests that many popular CAPTCHA techniques are no
>    longer secure.
> 
>    This document examines a number of potential solutions that allow
>    systems to test for human users while preserving access for users with
>    disabilities.
> 
> Status of this document
> 
>    Editor: Scott Hollier
> 
>    For: RQTF
> 
>    Date: 6 April 2018
> 
>    This is an updated version of a document previously titled
>    “[1]Inaccessibility of CAPTCHA”
> 
>    Publication as a Working Group Note does not imply endorsement by the
>    W3C Membership. This is a draft document and may be updated, replaced
>    or obsoleted by other documents at any time. It is inappropriate to
>    cite this document as other than work in progress.
> 
> Table of contents
> 
>    [2]Abstract
> 
>    [3]Status of this document
> 
>    [4]Table of contents
> 
>    [5]1. The problem
> 
>    [6]2. Security effectiveness
> 
>    [7]3. Types of CAPTCHA and access implications
> 
>    [8]3.1 Traditional character-based CAPTCHA
> 
>    [9]3.2 Logic puzzles
> 
>    [10]3.3 Sound output
> 
>    [11]3.4 Limited-use accounts
> 
>    [12]3.5 Non-interactive checks
> 
>    [13]3.6 Federated identity systems
> 
>    [14]3.7 Multiple user devices
> 
>    [15]3.8 Image and video
> 
>    [16]4. Conclusion
> 
>    [17]5. Acknowledgments
> 
>    [18]6. References
> 
> 1. The problem
> 
>    Web sites with resources that are attractive to aggregators, such as
>    sign-up pages, travel and event ticket sites, Web-based email accounts
>    and social media portals, have taken measures to ensure that they can
>    offer their services to individual users without content being
>    harvested or exploited by Web robots.
> 
>    An initially popular solution was the use of graphical representations
>    of text in registration or comment areas. The site would attempt to
>    verify that the user in question was in fact a human by requiring the
>    user to complete a task commonly referred to as a Completely Automated
>    Public Turing test to Tell Computers and Humans Apart [[19]CAPTCHA].
> 
>    The CAPTCHA was initially developed by researchers at Carnegie Mellon
>    University and has been primarily associated with a technique whereby
>    an individual had to identify a distorted set of characters from a
>    bitmapped image, then enter those characters into a form. However, in
>    recent times the types of CAPTCHA that appear on Web sites and mobile
>    apps have changed significantly. As such, the term "CAPTCHA" is used in
>    this document to refer to all projects which are specifically designed
>    to differentiate a human from a computer.
> 
>    While online users have broadly reported finding traditional CAPTCHAs
>    frustrating to complete, it is generally assumed that a CAPTCHA can be
>    resolved within a few incorrect attempts. The point of distinction for
>    people with disabilities is that a CAPTCHA not only separates computers
>    from humans, but also prevents people with disabilities from completing
>    the processes they initiated. For example, reliance on visual and
>    textual verification comes at a huge price to users who are blind,
>    visually impaired or dyslexic. Likewise, audio CAPTCHAs present
>    challenges for people who are Deaf or hearing impaired, and the
>    assumption of traditional CAPTCHAs that all Web users can read a
>    particular character set or English-based words makes the test
>    inaccessible to a large number of Web users.
> 
>    Assistive technology users also face challenges because the CAPTCHA
>    image contains no meaningful text equivalent, as providing one would
>    make it a giveaway to computerized systems. In many cases, these
>    systems make it impossible for users with certain disabilities to
>    create accounts, write comments, or make purchases on these sites - in
>    essence, CAPTCHAs fail to properly recognize users with disabilities as
>    human. Such issues also extend to situational disabilities, whereby a
>    user may not be able to view a traditional CAPTCHA effectively on a
>    mobile device due to the small screen size, or hear an audio-based
>    CAPTCHA in a noisy environment.
> 
> 2. Security effectiveness
> 
>    It is important to note that the effectiveness of CAPTCHAs as a
>    security solution has deteriorated in recent years. Current CAPTCHA
>    methods that rely primarily on text-based or image-based problems can
>    largely be cracked using both complex and simple computer algorithms.
>    Research suggests that approximately 20% of traditional CAPTCHAs can be
>    broken using OCR algorithms (Hernández-Castro, C. J., Barrero, D. F., &
>    R-Moreno, M. D., 2016)(Li, Q., 2015).
> 
>    In addition, pattern-matching algorithms can in some instances achieve
>    an even higher success rate in cracking CAPTCHAs (Yan, J., & El Ahmad,
>    A. S., 2009)(Sano, S., Otsuka, T., Itoyama, K., & Okuno, H. G., 2015).
>    While efforts are being made to strengthen traditional CAPTCHA
>    security, more robust security solutions run the risk of reducing the
>    ability of typical users to understand the CAPTCHA that needs to be
>    resolved (Nakaguro, Y., Dailey, M. N., Marukatat, S., & Makhanov, S.
>    S., 2013).
> 
>    As such, it is highly recommended that, for both security and
>    accessibility reasons, alternative security methods such as two-step or
>    multi-device verification be considered in preference to the use of a
>    traditional image-based CAPTCHA.
> 
> 3. Types of CAPTCHA and access implications
> 
>    There are many techniques available to discourage or eliminate
>    fraudulent account creation or use. Several of them may be as effective
>    as the visual verification technique while being more accessible to
>    people with disabilities. Others may be overlaid as an accommodation
>    for the purposes of accessibility. The following list highlights common
>    CAPTCHA types and their respective accessibility implications.
> 
>   3.1 Traditional character-based CAPTCHA
> 
>    The traditional character-based CAPTCHA, as previously discussed, is
>    largely inaccessible and insecure. It presents letters or words in an
>    image designed to be difficult for robots to identify. The user is then
>    asked to enter the CAPTCHA information into a form.
> 
>    The use of a traditional CAPTCHA is particularly problematic for people
>    who are blind or visually impaired, as assistive technology cannot
>    process the image, thereby preventing users from entering the result
>    into the form. Due to the mechanisms used to prevent the CAPTCHA from
>    being read by robots, the characters are often distorted or have other
>    characters in close proximity, making them difficult to read visually.
>    The common CAPTCHA technique has also been found to be less reliably
>    solved by users with learning disabilities (Gafni & Nagar).
> 
>    In addition, there is currently a dominant assumption that all web
>    users can understand English, which is not the case. Examples such as
>    Arabic and Thai demonstrate the barriers associated with CAPTCHAs based
>    on written English and related language character sets (Tangmanee, C.,
>    2016).
> 
>   3.2 Logic puzzles
> 
>    The goal of visual verification is to separate human from machine. One
>    reasonable way to do this is to test for logic. Simple mathematical
>    word puzzles, trivia, and the like may raise the bar for robots, at
>    least to the point where it is more attractive for them to move
>    elsewhere.
> 
>    Problems: Users with cognitive disabilities may still have trouble.
>    Answers may need to be handled flexibly, if they require free-form
>    text. A system would have to maintain a vast number of questions, or
>    shift them around programmatically, in order to keep spiders from
>    capturing them all. This approach is also subject to defeat by human
>    operators.
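> 
>    As an illustration only, the following Python sketch shows how a site
>    might serve logic puzzles from a rotating pool and accept free-form
>    answers flexibly (for example treating "7" and "seven" as equivalent).
>    The question pool, accepted answers and normalization rules are
>    assumptions made for the sake of the example, not part of any standard.
> 
>      import random
> 
>      # Illustrative question pool; a real deployment would need a much
>      # larger, regularly rotated set so that spiders cannot simply
>      # harvest every question and answer.
>      PUZZLES = [
>          ("What is three plus four?", {"7", "seven"}),
>          ("How many days are in a week?", {"7", "seven"}),
>          ("What colour is the sky on a clear day?", {"blue"}),
>      ]
> 
>      def pick_puzzle():
>          """Return a (question, accepted_answers) pair chosen at random."""
>          return random.choice(PUZZLES)
> 
>      def check_answer(accepted, response):
>          """Handle free-form text flexibly: trim whitespace and ignore
>          case so that '7', 'Seven' and ' seven ' are all accepted."""
>          return response.strip().lower() in accepted
> 
>      question, accepted = PUZZLES[0]
>      print(question)                           # What is three plus four?
>      # In a real form handler, the response would come from the POST body.
>      print(check_answer(accepted, " Seven "))  # True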
> 
>   3.3 Sound output
> 
>    To reframe the problem: text is easy to manipulate, which is good for
>    assistive technologies, but just as good for robots. So, a logical
>    means of trying to solve this problem is to offer another, non-textual
>    method of using the same content. To achieve this, audio is played that
>    contains a series of numbers, letters or words being read out, which
>    the user then needs to enter into a form.
> 
>    However, according to the CNet article [[20]NEWSCOM], the sound output,
>    which is itself distorted to avoid the same programmatic abuse, can be
>    difficult to hear. There can also be confusion in understanding whether
>    a number is to be entered as a numerical value or as a word, e.g. ‘7’
>    or ‘seven’. There are also temporal issues in that if an element of an
>    audio CAPTCHA is not understood, the entire CAPTCHA needs to be
>    replayed. Not all audio CAPTCHAs currently provide a replay option,
>    meaning that an entirely new audio CAPTCHA often has to be generated if
>    any part of it is difficult to understand.
> 
>    Users who are deaf-blind, don't have or use a sound card, work in noisy
>    environments, or don't have required sound plugins are likewise left in
>    the lurch. Since this content is auditory in nature, users often have
>    to write down the code before entering it, which is very inconvenient.
> 
>    Although auditory forms of CAPTCHA that present distorted speech create
>    recognition difficulties for screen reader users, the accuracy with
>    which such users can complete the CAPTCHA tasks is increased if the
>    user interface is carefully designed to prevent screen reader audio and
>    CAPTCHA audio from being intermixed. This can be achieved by
>    implementing functions for controlling the audio that do not require
>    the user to move focus away from the text response field (Bigham, J.
>    P., & Cavender, A. C. 2009).
> 
>    Experiments with a combined auditory and visual CAPTCHA, requiring
>    users to identify well-known objects by recognizing either images or
>    sounds, suggest that this technique is highly usable by screen reader
>    users. However, its security-related properties remain to be explored
>    (Sauer, G., Lazar, J., Hochheiser, H., & Feng, J. 2010).
> 
>   3.4 Limited-use accounts
> 
>    Users of free accounts very rarely need full and immediate access to a
>    site's resources. For example, users who are searching for concert
>    tickets may need to conduct only three searches a day, and new email
>    users may only need to send a canned notification of their new address
>    to their friends, and a few other free-form messages. Sites may create
>    policies that limit the frequency of interaction explicitly (that is,
>    by disabling an account for the rest of the day) or implicitly (by
>    slowing the response times incrementally). Creating limits for new
>    users can be an effective means of making high-value sites unattractive
>    targets to robots.
> 
>    The drawbacks to this approach include having to determine a useful
>    policy by trial and error. It requires site designers to look at the
>    statistics of normal and exceptional users, and to determine whether a
>    bright line exists between them.
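> 
>    As a rough illustration, the explicit daily cap and the incremental
>    slowdown described above might be combined as in the following Python
>    sketch. The limit of three searches per day and the delay increment are
>    illustrative assumptions; real values would have to come from a site's
>    own traffic statistics.
> 
>      import time
>      from collections import defaultdict
> 
>      DAILY_LIMIT = 3           # illustrative explicit cap per account per day
>      BASE_DELAY_SECONDS = 0.5  # illustrative starting point for the slowdown
> 
>      searches_today = defaultdict(int)  # would be reset at the end of each day
> 
>      def handle_search(account_id):
>          """Apply an explicit daily cap and an incrementally growing delay."""
>          count = searches_today[account_id]
>          if count >= DAILY_LIMIT:
>              # Explicit policy: refuse further searches for the rest of the day.
>              return "limit reached - please try again tomorrow"
>          # Implicit policy: each additional search responds a little more slowly.
>          time.sleep(BASE_DELAY_SECONDS * count)
>          searches_today[account_id] = count + 1
>          return "search results"
> 
>      print(handle_search("new-user"))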
> 
>   3.5 Non-interactive checks
> 
>    While CAPTCHA and other interactive approaches to spam control are
>    sometimes effective, they do make using a site more complex. This is
>    often unnecessary, as a large number of non-interactive mechanisms
>    exist to check for spam or other invalid content.
> 
>    This category contains two popular non-interactive approaches: spam
>    filtering, in which an automated tool evaluates the content of a
>    transaction, and heuristic checks, which evaluate the behavior of the
>    client.
> 
>     3.5.1 Spam filtering
> 
>    Applications that use continuous authentication and "hot words" to flag
>    spam content, or Bayesian filtering to detect other patterns consistent
>    with spam, are very popular, and quite effective. While such risk
>    analysis systems may experience false negatives from time to time,
>    properly-tuned systems can be comparable to a CAPTCHA approach, while
>    also removing the added cognitive burden on the user.
> 
>    Most major blogging software contains spam filtering capabilities, or
>    can be fitted with a plug-in for this functionality. Many of these
>    filters can automatically delete messages that reach a certain spam
>    threshold, and mark questionable messages for manual moderation. More
>    advanced systems can control attacks based on post frequency, filter
>    content sent using the [[21]TRACKBACK] protocol, and ban users by IP
>    address range, temporarily or permanently.
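> 
>    The "hot words" and threshold-based moderation described above can be
>    sketched in a few lines of Python. The word weights and thresholds
>    below are invented for illustration; a production filter would use a
>    trained Bayesian model and a much richer feature set.
> 
>      # Minimal "hot word" scorer in the spirit of Bayesian spam filtering.
>      HOT_WORDS = {"casino": 2.5, "free money": 2.0, "click here": 1.5}
> 
>      DELETE_THRESHOLD = 4.0    # messages at or above this score are deleted
>      MODERATE_THRESHOLD = 2.0  # questionable messages go to manual moderation
> 
>      def spam_score(text):
>          lowered = text.lower()
>          return sum(weight for word, weight in HOT_WORDS.items() if word in lowered)
> 
>      def classify(text):
>          score = spam_score(text)
>          if score >= DELETE_THRESHOLD:
>              return "delete"
>          if score >= MODERATE_THRESHOLD:
>              return "moderate"
>          return "publish"
> 
>      print(classify("Click here for free money at our casino"))  # delete
>      print(classify("Great article, thanks for sharing"))        # publish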
> 
>     3.5.2 Heuristic checks and the Google reCAPTCHA
> 
>    Heuristics are discoveries in a process that seem to indicate a given
>    result. It may be possible to detect the presence of a robotic user
>    based on the volume of data the user requests, series of common pages
>    visited, IP addresses, data entry methods, or other signature data that
>    can be collected.
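> 
>    The kind of signature data listed above might be combined into a simple
>    score, as in the following Python sketch. The two signals and their
>    thresholds are assumptions chosen only to illustrate the idea.
> 
>      import time
> 
>      # Illustrative thresholds; a real site would tune these against its
>      # own traffic data.
>      MAX_REQUESTS_PER_MINUTE = 30
>      MIN_SECONDS_TO_FILL_FORM = 2.0
> 
>      def looks_automated(request_times, form_opened_at, form_submitted_at):
>          """Flag a client whose request rate or form-fill speed is
>          implausible for a human user."""
>          now = time.time()
>          recent = [t for t in request_times if t > now - 60]
>          too_many_requests = len(recent) > MAX_REQUESTS_PER_MINUTE
>          filled_too_fast = (form_submitted_at - form_opened_at) < MIN_SECONDS_TO_FILL_FORM
>          return too_many_requests or filled_too_fast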
> 
>    Again, this requires a good look at the data of a site. If
>    pattern-matching algorithms can't find good heuristics, then this is
>    not a good solution. Also, polymorphism, or the creation of changing
>    footprints, is apt to appear in robots, if it hasn't already, just as
>    polymorphic ("stealth") viruses appeared to get around virus checkers
>    looking for known viral footprints.
> 
>    Another heuristic approach identified in [[22]KILLBOTS] involves the
>    use of CAPTCHA images, with a twist: how the user reacts to the test is
>    as important as whether or not it was solved. This system, which was
>    designed to thwart distributed denial of service (DDoS) attacks, bans
>    automated attackers which make repeated attempts to retrieve a certain
>    page, while protecting against marking humans incorrectly as automated
>    traffic. When the server's load drops below a certain level, the
>    authentication process is removed entirely.
> 
>    An example of a CAPTCHA based on this approach is the Google reCAPTCHA,
>    which features a checkbox labelled ‘I am not a robot’ or similar
>    phrasing. The process works by collecting data such as mouse movement
>    and keyboard navigation to determine whether the user is a human or a
>    robot, while keeping the CAPTCHA process relatively simple.
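> 
>    For completeness, the server-side half of this flow is a single
>    verification call. The sketch below assumes the widely documented
>    reCAPTCHA v2 "siteverify" endpoint and the Python requests library; the
>    secret key and token values are placeholders.
> 
>      import requests
> 
>      VERIFY_URL = "https://www.google.com/recaptcha/api/siteverify"
> 
>      def verify_recaptcha(secret_key, response_token, remote_ip=None):
>          """Ask the siteverify endpoint whether the token produced by the
>          'I am not a robot' widget is valid for this site."""
>          payload = {"secret": secret_key, "response": response_token}
>          if remote_ip:
>              payload["remoteip"] = remote_ip
>          result = requests.post(VERIFY_URL, data=payload, timeout=10).json()
>          return result.get("success", False)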
> 
>    Anecdotal evidence suggests that this CAPTCHA is currently the most
>    accessible CAPTCHA solution and can be completed with a variety of
>    assistive technologies. However, there is little formalised research
>    investigating whether this is indeed the case. There is also the
>    additional concern that a failure to complete the reCAPTCHA tends to
>    result in a fall back to a traditional, inaccessible CAPTCHA.
> 
>   3.6 Federated identity systems
> 
>    Many large companies such as Microsoft, Apple, Amazon, Google and the
>    Liberty Alliance have created competing "federated network identity"
>    systems, which can allow a user to create an account, set his or her
>    preferences, payment data, etc., and have that data persist across all
>    sites and devices that use the same service. Due to large companies now
>    requiring a federated identity to use cloud-based services on their
>    respective digital ecosystems, the popularity of federated identities
>    has increased significantly. As a result, many Web sites and Web
>    services allow a portable form of identification to be used across the
>    Web.
> 
>     3.6.1 Single sign-on
> 
>    Ironically enough, Microsoft's Passport single sign-on system is itself
>    one of the very same services that currently utilizes visual
>    verification techniques. These
>    single sign-on services will have to be among the most accessible on
>    the Web in order to offer these benefits to people with disabilities.
>    Additionally, use of these services will need to be ubiquitous to truly
>    solve the problems addressed here once and for all.
> 
>     3.6.2 Public-key infrastructure solutions
> 
>    Another approach is to use certificates for individuals who wish to
>    verify their identity. The certificate can be issued in such a way as
>    to ensure something close to a one-person-one-vote system, for example
>    by issuing these identifiers in person and enabling users to develop
>    distributed trust networks, or by having the certificates issued by
>    highly trusted authorities such as governments. These types of systems
>    have been implemented for securing web pages and for authenticating
>    email.
> 
>    The cost of creating fraudulent certificates needs to be high enough to
>    destroy the value of producing them in most cases. Sites would need to
>    use mechanisms which are widely implemented in user agents.
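> 
>    As a minimal sketch of the verification step, the following Python code
>    (using the third-party "cryptography" package) checks that a presented
>    certificate was signed by a trusted authority's key. It assumes RSA
>    signatures and deliberately omits revocation checking and full chain
>    validation.
> 
>      from cryptography import x509
>      from cryptography.hazmat.primitives.asymmetric import padding
> 
>      def issued_by_trusted_ca(cert_pem, ca_cert_pem):
>          """Return True if the certificate's signature verifies against
>          the trusted authority's public key (RSA assumed)."""
>          cert = x509.load_pem_x509_certificate(cert_pem)
>          ca_cert = x509.load_pem_x509_certificate(ca_cert_pem)
>          try:
>              ca_cert.public_key().verify(
>                  cert.signature,
>                  cert.tbs_certificate_bytes,
>                  padding.PKCS1v15(),
>                  cert.signature_hash_algorithm,
>              )
>              return True
>          except Exception:
>              # Any verification failure is treated as an untrusted certificate.
>              return False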
> 
>    A subset of this concept, in which only people with disabilities who
>    are affected by other verification systems would register, raises a
>    privacy concern in that the user would need to telegraph to every site
>    that she has a disability. The stigma of users with disabilities having
>    to register themselves to receive the same services should be avoided.
>    With that said, there are a few instances in which users may want to
>    inform sites of their disabilities or other needs: sites such as
>    Bookshare [[23]BOOKSHARE] require evidence of a visual disability in
>    order to allow users to access printed materials which are often
>    unavailable in audio or Braille form. An American copyright provision
>    known as the Chafee Amendment [[24]CHAFEE] allows copyrighted materials
>    to be reproduced in forms that are only usable by blind and visually
>    impaired users. A public-key infrastructure system would allow
>    Bookshare's maintainers to ensure that the site and its users are in
>    compliance with copyright law.
> 
>     3.6.3 Biometrics
> 
>    A popular authentication method on mobile platforms is biometric
>    technology, whereby a fingerprint or facial recognition authentication
>    method is used. This process effectively limits the ability of spammers
>    to create infinite email accounts. The E.U./U.S. government
>    requirements referred to in section 3.5.3 explain the growing
>    popularity of dual-factor authentication using biometrics.
> 
>   3.7 Multiple user devices
> 
>    The use of multiple devices such as a computer, smartphone, tablet
>    and/or wearable could provide additional support for user
>    authentication. This could help address accessibility issues by using
>    the assistive technologies on each device to confirm that the user is a
>    human and is a specific user (Cetin, C., 2015). Biometrics, as
>    previously discussed, could also serve as one such device
>    authentication mechanism.
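> 
>    One common shape for such a scheme is a short one-time code pushed to a
>    second registered device. The Python sketch below is an
>    assumption-laden outline: the delivery channel is abstracted away, and
>    the five-minute validity window is illustrative.
> 
>      import hmac
>      import secrets
>      import time
> 
>      CODE_TTL_SECONDS = 300   # illustrative five-minute validity window
>      pending_codes = {}       # user_id -> (code, time issued)
> 
>      def start_verification(user_id, send_to_device):
>          """Generate a short one-time code and push it to the user's
>          second device via whatever channel the site already has."""
>          code = f"{secrets.randbelow(1_000_000):06d}"
>          pending_codes[user_id] = (code, time.time())
>          send_to_device(user_id, code)
> 
>      def confirm_verification(user_id, entered_code):
>          """Accept the code only within its validity window, comparing in
>          constant time to avoid leaking information."""
>          code, issued_at = pending_codes.get(user_id, (None, 0.0))
>          if code is None or time.time() - issued_at > CODE_TTL_SECONDS:
>              return False
>          return hmac.compare_digest(code, entered_code)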
> 
>   3.8 Image and video
> 
>     3.8.1 Visual comparison CAPTCHAs
> 
>    There are a number of new techniques based on the identification of
>    still images. These can include identifying whether an image shows a
>    man or a woman, or whether an image is human-shaped or avatar-shaped,
>    among other comparison solutions (Conti, M., Guarisco, C., & Spolaor,
>    R., 2015)(Kim, J., Kim, S., Yang, J., Ryu, J.-h., & Wohn, K., 2014)
>    (Korayem, M., 2015).
> 
>    While alternative audio comparison CAPTCHAs could be provided, such as
>    using similar or different tones for comparison, a reliance on visual
>    comparison alone would be difficult for people with vision-related
>    disabilities.
> 
>     3.8.2 3D CAPTCHA
> 
>    A 3D representation of letters and numbers can make the characters more
>    difficult for OCR software to identify, in turn making the CAPTCHA more
>    secure (Nguyen, V. D., Chow, Y.-W., & Susilo, W., 2014). However, this
>    solution has similar accessibility issues to traditional CAPTCHAs.
> 
>    We recommend further exploration of the use of risk analysis techniques
>    (as exemplified by the approach that Google has taken) to reduce the
>    need for CAPTCHA.
> 
>     3.8.3 Video Game CAPTCHA
> 
>    This approach requires the completion of a basic video game as a
>    CAPTCHA. The benefits include the removal of language barriers, and
>    multiple interface methods could potentially make such a solution
>    accessible (Yang, T.-I., Koong, C.-S., & Tseng, C.-C., 2015). It would
>    also have the benefit of making CAPTCHAs an enjoyable process, reducing
>    the frustrations generally associated with traditional CAPTCHAs.
> 
> 4. Conclusion
> 
>    The evolution of CAPTCHA techniques has highlighted that traditional
>    solutions such as text-based characters contained in images are not
>    only challenging for people with disabilities, but also insecure. While
>    the majority of CAPTCHAs in use remain challenging for people with
>    disabilities to complete, recent additions including the Google
>    reCAPTCHA, multi-device authentication and the increased prevalence of
>    federated identity systems currently provide the most accessible and
>    flexible options for separating humans from robots.
> 
>    However, while some CAPTCHA solutions are better than others, there is
>    currently no ideal solution. As such, it is important that any
>    implementation of a CAPTCHA does not prevent people with disabilities
>    from being identified as human.
> 
> 5. Acknowledgments
> 
>    Thanks to the following contributors: Kentarou Fukuda, Marc-Antoine
>    Garrigue, Al Gilman, Charles McCathieNevile, David Pawson, David
>    Poehlman, Janina Sajka, and Jason White.
> 
>    This publication has been funded in part with Federal funds from the
>    U.S. Department of Education under contract number ED05CO0039. The
>    content of this publication does not necessarily reflect the views or
>    policies of the U.S. Department of Education, nor does mention of trade
>    names, commercial products, or organizations imply endorsement by the
>    U.S. Government.
> 
> 6. References
> 
>    [AICAPTCHA]
> 
>      [25]aiCaptcha: Using AI to beat CAPTCHA and post comment spam, Casey
>      Chesnut. The site is online at
>      http://www.brains-n-brawn.com/default.aspx?vDir=aicaptcha
> 
>    [ANTIPHISHING]
> 
>      [26]Phishing Activity Trends Report July, 2005, Anti-Phishing
>      Working Group.   Available online at
>      http://antiphishing.org/APWG_Phishing_Activity_Report_Jul_05.pdf
> 
>    [ANTIROBOT]
> 
>      [27]Inaccessibility of Visually-Oriented Anti-Robot Tests, Matt May.
>      The site is online at http://www.w3.org/TR/turingtest
> 
>    [BOOKSHARE]
> 
>      [28]Bookshare.org home page. The site is online at
>      [29]http://www.bookshare.org
> 
>    [BREAKING]
> 
>      [30]Breaking a Visual CAPTCHA, Greg Mori and Jitendra Malik. The
>      site is online at http://www.cs.berkeley.edu/~mori/gimpy/gimpy.html
> 
>    [BREAKINGOCR]
> 
>      [31]Breaking CAPTCHAs Without Using OCR, Howard Yeend. The site is
>      online at http://www.puremango.co.uk/cm_breaking_captcha_115.php
> 
>    [CAPTCHA]
> 
>      [32]The CAPTCHA Project, Carnegie Mellon University. The project is
>      online at http://www.captcha.net
> 
>    [CHAFEE]
> 
>      [33]17 USC 121, Limitations on exclusive rights: reproduction for
>      blind or other people with disabilities (also known as the Chafee
>      Amendment): This amendment is online at
>      http://www.loc.gov/copyright/title17/92chap1.html
> 
>    [KILLBOTS]
> 
>      [34]Botz-4-Sale: Surviving DDoS Attacks that Mimic Flash Crowds,
>      Srikanth Kandula, Dina Katabi, Matthias Jacob, and Arthur Burger,
>      Usenix NSDI 2005. Best Student Paper Award. This paper is online at
>      http://www.usenix.org/events/nsdi05/tech/kandula/kandula_html/ or
>      http://nms.lcs.mit.edu/%7Ekandula/data/killbots.ps
> 
>    [NEWSCOM]
> 
>      [35]Spam-bot tests flunk the blind, Paul Festa. News.com, 2 July
>      2003. This article is online at
>      http://news.com.com/2100-1032-1022814.html
> 
>    [PINGUARD]
> 
>      [36]PIN Guard, ING Direct site. This site is online at
>      https://secure1.ingdirect.com/tpw/popup_whatIsThis.html
> 
>    [PWNTCHA]
> 
>      [37]PWNtcha - CAPTCHA decoder, Sam Hocevar. The site is online at
>      http://sam.zoy.org/pwntcha/
> 
>    [TRACKBACK]
> 
>      [38]Trackback, Wikipedia. The site is online at
>      http://en.wikipedia.org/wiki/Trackback
> 
>    [TURING]
> 
>      [39]The Turing Test, The Alan Turing Internet Scrapbook, 2002. The
>      document is online at
>      [40]http://www.turing.org.uk/turing/scrapbook/test.html
> 
>      * Bigham, J. P., & Cavender, A. C. (2009, April). Evaluating existing
>        audio CAPTCHAs and an interface optimized for non-visual use. In
>        Proceedings of the SIGCHI Conference on Human Factors in Computing
>        Systems
>      * Catuogno, L., & Galdi, C. (2014). On user authentication by means
>        of video events recognition. Journal of Ambient Intelligence and
>        Humanized Computing, 5(6), 909-918. doi:10.1007/s12652-014-0248-5
>      * Cetin, C. (2015). Design, Testing and Implementation of a New
>        Authentication Method Using Multiple Devices. In J. Ligatti, D.
>        Goldgof, & Y. Liu (Eds.): ProQuest Dissertations Publishing.
>      * Conti, M., Guarisco, C., & Spolaor, R. (2015). CAPTCHaStar! A novel
>        CAPTCHA based on interactive shape discovery.
>      * Gafni, R., & Nagar, I. The Effect of CAPTCHA on User Experience
>        among Users with and without Learning Disabilities.
>      * Hernández-Castro, C. J., Barrero, D. F., & R-Moreno, M. D. (2016).
>        Machine learning and empathy: the Civil Rights CAPTCHA. Concurrency
>        and Computation: Practice and Experience, 28(4), 1310-1323.
>        doi:10.1002/cpe.3632
>      * Kim, J., Kim, S., Yang, J., Ryu, J.-h., & Wohn, K. (2014).
>        FaceCAPTCHA: a CAPTCHA that identifies the gender of face images
>        unrecognized by existing gender classifiers. Multimedia Tools and
>        Applications, 72(2), 1215-1237. doi:10.1007/s11042-013-1422-z
>      * Kluever, K. (2008). Evaluating the usability and security of a
>        video CAPTCHA. In R. Zanibbi, Z. Butler, & R. Canosa (Eds.):
>        ProQuest Dissertations Publishing.
>      * Korayem, M. (2015). Social and egocentric image classification for
>        scientific and privacy applications. In D. Crandall, J. Bollen, A.
>        Kapadia, & P. Radivojac (Eds.): ProQuest Dissertations Publishing.
>      * Li, Q. (2015). A computer vision attack on the ARTiFACIAL CAPTCHA.
>        Multimedia Tools and Applications, 74(13), 4583-4597.
>        doi:10.1007/s11042-013-1823-z
>      * Nakaguro, Y., Dailey, M. N., Marukatat, S., & Makhanov, S. S.
>        (2013). Defeating line-noise CAPTCHAs with multiple quadratic
>        snakes. Computers & Security, 37, 91-110.
>        doi:10.1016/j.cose.2013.05.003
>      * Nguyen, V. D., Chow, Y.-W., & Susilo, W. (2014). On the security of
>        text-based 3D CAPTCHAs. Computers & Security, 45, 84-99.
>        doi:10.1016/j.cose.2014.05.004
>      * Sano, S., Otsuka, T., Itoyama, K., & Okuno, H. G. (2015). HMM-based
>        Attacks on Google's ReCAPTCHA with Continuous Visual and Audio
>        Symbols. Journal of Information Processing, 23(6), 814-826.
>        doi:10.2197/ipsjjip.23.814
>      * Sauer, G., Lazar, J., Hochheiser, H., & Feng, J. (2010). Towards a
>        universally usable human interaction proof: evaluation of task
>        completion strategies. ACM Transactions on Accessible Computing
>        (TACCESS) , 2(4), 15.
>      * Tangmanee, C. (2016). Effects of Text Rotation, String Length, and
>        Letter Format on Text-based CAPTCHA Robustness. Journal of Applied
>        Security Research, 11(3), 349-361.
>        doi:10.1080/19361610.2016.1178553
>      * Yan, J., & El Ahmad, A. S. (2009). CAPTCHA Security: A Case Study.
>        Security & Privacy, IEEE, 7(4). doi:10.1109/MSP.2009.84
>      * Yeh, H. T., Chen, B. C., & Wu, Y. C. (2013). Mobile user
>        authentication system in cloud environment. Security and
>        Communication Networks, 6(9), 1161-1168. doi:10.1002/sec.688
>      * Yang, T.-I., Koong, C.-S., & Tseng, C.-C. (2015). Game-based image
>        semantic CAPTCHA on handset devices. Multimedia Tools and
>        Applications, 74(14), 5141-5156. doi:10.1007/s11042-013-1666-7
> 
> References
> 
>    1. http://www.w3.org/TR/turingtest/
>    2. file:///tmp/captcha-update-20180406.html#abstract
>    3. file:///tmp/captcha-update-20180406.html#status-of-this-document
>    4. file:///tmp/captcha-update-20180406.html#table-of-contents
>    5. file:///tmp/captcha-update-20180406.html#the-problem
>    6. file:///tmp/captcha-update-20180406.html#security-effectiveness
>    7. file:///tmp/captcha-update-20180406.html#types-of-captcha-and-access-implications
>    8. file:///tmp/captcha-update-20180406.html#traditional-character-based-captcha
>    9. file:///tmp/captcha-update-20180406.html#logic-puzzles
>   10. file:///tmp/captcha-update-20180406.html#sound-output
>   11. file:///tmp/captcha-update-20180406.html#limited-use-accounts
>   12. file:///tmp/captcha-update-20180406.html#non-interactive-checks
>   13. file:///tmp/captcha-update-20180406.html#federated-identity-systems
>   14. file:///tmp/captcha-update-20180406.html#multiple-user-devices
>   15. file:///tmp/captcha-update-20180406.html#image-and-video
>   16. file:///tmp/captcha-update-20180406.html#conclusion
>   17. file:///tmp/captcha-update-20180406.html#acknowledgments
>   18. file:///tmp/captcha-update-20180406.html#references
>   19. http://www.w3.org/TR/turingtest/#ref-CAPTCHA
>   20. http://www.w3.org/TR/turingtest/#ref-NEWSCOM
>   21. http://www.w3.org/TR/turingtest/#ref-TRACKBACK
>   22. http://www.w3.org/TR/turingtest/#ref-KILLBOTS
>   23. http://www.w3.org/TR/turingtest/#ref-BOOKSHARE
>   24. http://www.w3.org/TR/turingtest/#ref-CHAFEE
>   25. http://www.brains-n-brawn.com/default.aspx?vDir=aicaptcha
>   26. http://antiphishing.org/APWG_Phishing_Activity_Report_Jul_05.pdf
>   27. http://www.w3.org/TR/turingtest
>   28. http://www.bookshare.org/
>   29. http://www.bookshare.org/
>   30. http://www.cs.berkeley.edu/~mori/gimpy/gimpy.html
>   31. http://www.puremango.co.uk/cm_breaking_captcha_115.php
>   32. http://www.captcha.net/
>   33. http://www.copyright.gov/title17/92chap1.html
>   34. http://www.usenix.org/events/nsdi05/tech/kandula/kandula_html/
>   35. http://news.com.com/2100-1032-1022814.html
>   36. https://secure1.ingdirect.com/tpw/popup_whatIsThis.html
>   37. http://sam.zoy.org/pwntcha/
>   38. http://en.wikipedia.org/wiki/Trackback
>   39. http://www.turing.org.uk/turing/scrapbook/test.html
>   40. http://www.turing.org.uk/turing/scrapbook/test.html


-- 

Janina Sajka

Linux Foundation Fellow
Executive Chair, Accessibility Workgroup: http://a11y.org

The World Wide Web Consortium (W3C), Web Accessibility Initiative (WAI)
Chair, Accessible Platform Architectures http://www.w3.org/wai/apa

Received on Thursday, 12 April 2018 10:14:49 UTC