EME test reports

One thing we should probably resolve: including key system strings in the subtest names creates misleading test gaps against our two-passing-implementations requirement. The wptreport tool treats each key-system-specific subtest as a separate test, so all com.microsoft.playready subtests are currently flagged as not having two passing implementations.

As a simple experiment, I edited the report JSONs to strip the key system names out of the drm subtest names. That results in:


- Test files: 105; Total subtests: 257
- Test files without 2 passes: 29; Subtests without 2 passes: 49; Failure level: 49/257 (19.07%)
- Completely failed files: 29; Completely failed subtests: 8; Failure level: 8/257 (3.11%)

Versus online (with key system names in the subtest names):


- Test files: 105; Total subtests: 299
- Test files without 2 passes: 50; Subtests without 2 passes: 91; Failure level: 91/299 (30.43%)
- Completely failed files: 50; Completely failed subtests: 10; Failure level: 10/299 (3.34%)

That means:


- Listing the key system in the subtest names results in 42 additional subtests.
- Most of these currently have more than one passing implementation.
- 2 subtests (drm-keystatuses-multiple-sessions) report as complete failures with the key system in the subtest name, but have one passing implementation when it is omitted.
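As a quick sanity check on the arithmetic above, the quoted totals can be recomputed directly (the numbers below are just the figures from the two runs restated):

```python
# Totals quoted above for the two report variants.
with_ks    = {"subtests": 299, "no_two_passes": 91, "complete_fail": 10}
without_ks = {"subtests": 257, "no_two_passes": 49, "complete_fail": 8}

def pct(part, whole):
    """Failure level as a percentage, rounded as in the report."""
    return round(100.0 * part / whole, 2)

# Listing the key system adds 299 - 257 = 42 subtests.
extra = with_ks["subtests"] - without_ks["subtests"]
print(extra)                                                     # 42
print(pct(with_ks["no_two_passes"], with_ks["subtests"]))        # 30.43
print(pct(without_ks["no_two_passes"], without_ks["subtests"]))  # 19.07
print(pct(with_ks["complete_fail"], with_ks["subtests"]))        # 3.34
print(pct(without_ks["complete_fail"], without_ks["subtests"]))  # 3.11
```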

Options to handle this are:


- Remove the key system names from the subtests, but lose visibility into key-system-specific failures.
- Leave them in and post those results to the website, but prepare manual tallies when we submit the PR for review.

Jerry

Received on Monday, 29 August 2016 19:37:02 UTC