Re: warning category for techniques / failures.

Thanks.

I can see the value of warnings. I just don’t think you should say they are “common ways that pages don’t pass”, which means “common failures”, because not passing means failure.

Because that becomes “Common failures that don’t automatically fail.”

Maybe something like:

Warning = something that needs to be manually checked because conformance changes for different contexts. 

Or some such.

It’s tough to figure out how to say it.
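As a concrete illustration of the “warning” category Alastair describes below, here is a minimal sketch in Python of the kind of check an automated tool might run for his ‘click here’ / ‘read more’ example. The function name and the word list are hypothetical, purely for illustration; they are not taken from any WCAG document or existing tool:

```python
import re

# Illustrative (not normative) list of link texts that usually carry
# no meaning on their own.
VAGUE_LINK_TEXTS = {"click here", "read more", "more", "here"}

def check_link_text(link_text: str) -> str:
    """Return 'warning' for vague link text, 'pass' otherwise.

    A warning is not an automatic failure: the link might still be
    accessible via an aria-label or its surrounding context, so a
    human has to check before calling it a failure.
    """
    normalized = re.sub(r"\s+", " ", link_text).strip().lower()
    return "warning" if normalized in VAGUE_LINK_TEXTS else "pass"

print(check_link_text("Click here"))                # warning
print(check_link_text("2016 annual report (PDF)"))  # pass
```

The point of returning "warning" rather than "fail" is exactly the distinction under discussion: the tool flags a probable problem for manual review instead of asserting non-conformance.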

gregg

> On May 4, 2016, at 3:16 AM, Alastair Campbell <acampbell@nomensa.com> wrote:
> 
> Gregg wrote:
>> "Do not understand:
>> 3. [New] Warnings (common ways that pages don’t pass, but don’t automatically fail.)
>> 
>> What does this mean?
> 
> Hi Gregg,
> 
> It is trying to say that: If your page does X, it probably fails. We are not 100% sure it fails, you might have passed some other way, but you’d better check.
> 
> There are probably more things we can document under a ‘warnings’ category than under failures, as they don’t have to be 100% failures in all circumstances.
> 
> I’m sure some of the testers on the list could come up with many examples. I’ll do a starter for 10 to give some examples:
> 
> - Data table doesn’t have a visible caption.
> - No visible label for a form field.
> - Related fields are not grouped with a fieldset & legend.
> - Main heading is not an H1.
> - Submit button isn’t at the bottom of the form.
> - Icon doesn’t have supporting text.
> - Use of ‘click here’ / ‘read more’.
> 
> None of these are definitely failures, but the presence of them on a page rings warning bells! 
> Many automated tools have a ‘warning’ category for things they pick up but cannot be sure are failures.
> 
> Obviously we could come up with millions of these, so it should be ‘common’ ones rather than all. We could even ask a testing tool person to see if they have any aggregate stats on these.
> 
> Kind regards,
> 
> -Alastair
> 
> 

Received on Thursday, 5 May 2016 15:56:09 UTC