- From: Ivan Herman <ivan@w3.org>
- Date: Fri, 9 Sep 2016 18:12:23 +0200
- To: W3C Public Annotation List <public-annotation@w3.org>
- Message-Id: <64B8D172-C1DA-4B0A-A7A7-A3AE71E1BC8F@w3.org>
Minutes are here:
https://www.w3.org/2016/09/09-annotation-minutes.html
Text version below
Cheers
Ivan
----
Ivan Herman, W3C
Digital Publishing Lead
Home: http://www.w3.org/People/Ivan/
mobile: +31-641044153
ORCID ID: http://orcid.org/0000-0003-0782-2704
[1]W3C
[1] http://www.w3.org/
Web Annotation Working Group Teleconference
09 Sep 2016
[2]Agenda
[2] http://www.w3.org/mid/092901d20a41$502a57f0$f07f07d0$@illinois.edu
See also: [3]IRC log
[3] http://www.w3.org/2016/09/09-annotation-irc
Attendees
Present
Shane McCarron, Tim Cole, Rob Sanderson (azaroth), Ben
De Meester, Benjamin Young (bigbluehat), TB Dinesh,
Randall Leeds
Regrets
Chair
Rob, Tim
Scribe
azaroth
Contents
* [4]Topics
1. [5]Minutes Approval
2. [6]Announcements
3. [7]Issue updates?
4. [8]Testing
* [9]Summary of Action Items
* [10]Summary of Resolutions
__________________________________________________________
<scribe> scribenick: azaroth
Minutes Approval
PROPOSED RESOLUTION: Minutes of the previous call are approved:
[11]https://www.w3.org/2016/09/02-annotation-minutes.html
[11] https://www.w3.org/2016/09/02-annotation-minutes.html
+1
<TimCole> +1
<bjdmeest> +1
<ivan> +1
RESOLUTION: Minutes of the previous call are approved:
[12]https://www.w3.org/2016/09/02-annotation-minutes.html
[12] https://www.w3.org/2016/09/02-annotation-minutes.html
Announcements
TimCole: We have successfully republished the three documents
in CR
... TPAC is fast approaching. We won't have any meetings there,
but need to think about schedule for the calls
ivan: I'll be out
... I leave on Sunday, so I'm here next week but out the
following week
ShaneM: I'll be tied up too
... 4pm in Lisbon?
... I have a meeting at 4pm on Friday.
Ivan: I'll be fried :)
TimCole: Cancel Sept 23rd meeting
azaroth: My regrets for next week.
TimCole: Okay, cancel 23rd but will meet next week
... Will meet on the 30th. Any other announcements or
questions?
Issue updates?
TimCole: There was an issue closed
Ivan: Sarven said that he was okay to close it, so I did
TimCole: What about the i18n issues?
ivan: No idea :( Anyone in the Social Web group might know
more?
TimCole: Would it be okay to reach out to Richard?
Ivan: I can do that.
bigbluehat: Social picked a very different solution for the
i18n issue. DPUB, on the other hand, did the same as us
... they went with no text direction stated. Web manifests and
ourselves have it as an explicit property
... Social are trusting the bidi character will be recognised
and implemented
TimCole: That's okay, we'll see what happens at the end of CR
... Any editorial progress?
azaroth: None, have been waiting for i18n issue to resolve
before making any further changes
TimCole: Any other issue related topics?
Testing
TimCole: Posted a short note about the state of the model
testing. A couple of issues have come up
... Haven't gotten new reports from implementers.
... For sections 1-4 of the model, we now have 173 assertions,
organized into 10 tests
... broken down by whether they validate a MUST or whether a
feature is implemented
... 10 tests means that if you run the suite, you paste your
annotation in 10 times
... the advantage of it is that if you want to test only the
requirements or a particular feature you can do so by running
only a subset of tests
... impressed by how quickly they run
... could probably reduce it however to fewer tests
... Shane, do you see any issue with the test runner software
having one test with 173 assertions?
ShaneM: No, the system doesn't care
... we moved the text box. It might not be merged yet though
... I don't have merge capability. The code gets reviewed and
hopefully someone merges it.
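(For context on the assertion structure described above: each
assertion is, in effect, a small schema that the pasted
annotation either satisfies or fails. Below is a minimal
TypeScript sketch of that idea, assuming the assertions are
JSON Schemas checked with ajv, as the ajv mention later in the
call suggests; the schema, annotation, and names are invented
for illustration and are not taken from the actual suite.)

    import Ajv from "ajv";

    // Invented assertion: "the annotation MUST use the anno.jsonld @context".
    const contextAssertion = {
      title: "Annotation MUST declare the anno.jsonld @context",
      type: "object",
      properties: {
        "@context": { const: "http://www.w3.org/ns/anno.jsonld" },
      },
      required: ["@context"],
    };

    // The annotation an implementer would paste into the test's text box.
    const pastedAnnotation = {
      "@context": "http://www.w3.org/ns/anno.jsonld",
      type: "Annotation",
      body: "http://example.org/post1",
      target: "http://example.com/page1",
    };

    // One assertion, one pass/fail result; the suite repeats this per assertion.
    const check = new Ajv().compile(contextAssertion);
    console.log(check(pastedAnnotation) ? "PASS" : "FAIL", contextAssertion.title);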
<bigbluehat> TimCole: which server are you looking at?
w3c-test.org? or testdev.spec-ops.io?
ShaneM: The people who have merge capability don't seem to pay
attention to our requests
ivan: You should try to talk to Philippe about this
... He's the one responsible for getting WGs to do things
properly. We went down this route for testing, but we shouldn't
be thwarted by issues like this
... at the end of the day, it's his responsibility
ShaneM: I can ping him now?
ivan: It may need a longer discussion.
ShaneM: There just needs to be someone on staff who takes care
of this sort of stuff
ivan: I don't have a practical proposal, but the current way
doesn't work
... it has created barriers for us many times already, which
isn't acceptable
... we're bound to deadlines, and accountable to W3M, but that
means we need to be able to do what we have to do
ShaneM: From our perspective, I should have stayed on top of
the PRs...
ivan: No, don't take it on you, you shouldn't have to chase
people, there should be a process
ShaneM: There are 173 outstanding requests, so it's not just us
ivan: I discussed this with Ralph as well. There's something
fundamentally wrong. We can come back to this in Lisbon
<bigbluehat>
[13]https://github.com/w3c/web-platform-tests/pulls?q=is%3Aopen+is%3Apr+label%3Awg-annotation
[13] https://github.com/w3c/web-platform-tests/pulls?q=is:open+is:pr+label:wg-annotation
TimCole: One question is whether we should go down to a few
tests
<ShaneM> bigbluehat:
[14]https://github.com/w3c/web-platform-tests/pull/3634 r?
[14] https://github.com/w3c/web-platform-tests/pull/3634
TimCole: easy to make that change now
<bigbluehat> yeah. working on that one now
TimCole: would just require all the assertions in one big test
azaroth: Would it be possible to copy the annotation from the
first test to the second test, and so forth? Then it would
populate the text box with that annotation
ShaneM: If it's the same annotation, that might be possible. If
it's a different annotation, it wouldn't make sense
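(One way the "might be possible" above could work, sketched in
TypeScript: keep the last pasted annotation in localStorage and
restore it when the next test's text box loads. The element id,
storage key, and the assumption that the runner pages share an
origin are all hypothetical, not how the runner actually works.)

    // Hypothetical helper: remember what was pasted into the text box and
    // restore it on the next test page. Id and storage key are invented.
    const STORAGE_KEY = "wg-annotation-last-input";

    function rememberAndRestore(textarea: HTMLTextAreaElement): void {
      // Restore the annotation pasted during the previous test, if any.
      const previous = window.localStorage.getItem(STORAGE_KEY);
      if (previous !== null && textarea.value === "") {
        textarea.value = previous;
      }
      // Keep the stored copy up to date as the tester edits it.
      textarea.addEventListener("input", () => {
        window.localStorage.setItem(STORAGE_KEY, textarea.value);
      });
    }

    const box = document.querySelector<HTMLTextAreaElement>("#annotation");
    if (box) rememberAndRestore(box);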
TimCole: Still need to deal with section 5 of the model, as
it's not annotations
... you'd paste in a collection or a page, which are
fundamentally different
... as many /clients/ won't implement that, it'll be a bit
strange
azaroth: We'd probably want annotation server implementers to
do the collections and pages
TimCole: Should this be a fourth set of tests?
azaroth: Easier at the end of CR if it was part of the model
testing, just a different section
... (explains issue)
ivan: We have entries in the model that are relevant to the
server, so closer to the protocol
... how does it affect the reporting?
TimCole: if there's one series of tests that starts with
annotations and then the collection, the client developers
won't know what to do with the collection and page tests
azaroth: and the server implementers won't necessarily have
anything useful for the first part
ShaneM: What if those tests were added to the protocol suite?
<Zakim> ShaneM, you wanted to ask if we should be pulling ajv
into the protocol tests to see what features of the data model
are used there? OR add a protocol exerciser to the model tests
ShaneM: or add server tests to the model tests to hit a server
and get the content?
ivan: My reaction was the same as your first option -- those
tests should be performed by the protocol testing procedure
<ShaneM> protocol !== data model though
ivan: that's where they come into the picture
... so the server/protocol implementers will test some of the
model, which is fine
... the reporting is tricky
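(A sketch of the "protocol exerciser" option in TypeScript:
fetch a collection from an annotation server and run the same
kind of schema check the model tests use. The endpoint, the
Accept header, and the assertion are placeholders for
illustration only, not part of either test suite.)

    import Ajv from "ajv";

    // Placeholder endpoint; a real run would point at the implementer's server.
    const CONTAINER_URL = "https://example.org/annotations/";

    // Invented assertion for a collection/page response (not from the suite).
    const containerAssertion = {
      title: "Response MUST be an AnnotationCollection or AnnotationPage",
      type: "object",
      properties: {
        type: {
          anyOf: [
            { enum: ["AnnotationCollection", "AnnotationPage"] },
            { type: "array",
              contains: { enum: ["AnnotationCollection", "AnnotationPage"] } },
          ],
        },
      },
      required: ["type"],
    };

    async function exerciseServer(): Promise<void> {
      // Fetch the container the way a protocol test would...
      const response = await fetch(CONTAINER_URL, {
        headers: { Accept: "application/ld+json" },
      });
      const body = await response.json();
      // ...then reuse a model-style schema check on what came back.
      const check = new Ajv().compile(containerAssertion);
      console.log(check(body) ? "PASS" : "FAIL", containerAssertion.title);
    }

    exerciseServer().catch((err) => console.error("request failed:", err));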
TimCole: one thing we could do is name the tests for the
collection and page carefully, and give different URLs to the
testers
... so the client people would only get annotations, and the
server people would only get collections and pages
+1 if that's possible
scribe: we could do that with regexps or folder names perhaps
ShaneM: Have a PR on the regexps
TimCole: The problem would be that the results would be
incomplete -- it would only have collection or annotation
tests. Would that merge okay?
ShaneM: I don't know the answer
azaroth: We really need that to work, as a client might not do
all the tests and we want to know what they did do
TimCole: I'd prefer that, as it keeps everything together in
the right place
ShaneM: The test description needs to clearly lay out the
expectations for what to put in the text box
TimCole: We may need to move all the tests into a different
child folder. Everything in this folder is for the annotation,
and everything in this folder is for the collections
... number of tests... since we can't currently retain the
annotation between tests, it might be better to reduce the
number of tests and put in lots of assertions
... one for all the MUSTs and one for all the optional SHOULDs
and MAYs
... would expect most of them to fail, but should get some for
each
+1
ShaneM: I'm fine with that
TimCole: Even clicking the button 10 times if the annotation is
copied would be a pain
... Last big issue for model testing is the process for
uploading results
... A single client implementation will create multiple types
of annotations.
... First type might exercise specific resources, second might
do text bodies
... but might not have one annotation that does both
... so want client implementers to run multiple annotations
through both the tests
... affects some of the counts and how to name the files
... need to not count the same implementation twice
... instructions get a bit complicated for what should be
uploaded
... e.g. that the readme needs to be updated
... The Illinois developer reported that the instructions were
long and cumbersome
ShaneM: Don't have a problem with people having to read
instructions. Issue is that we're all learning about this way
of testing
... don't have to upload the annotations if you don't want to
TimCole: Could create codes for people that we invite
... but they'd need to fill out the other information
... and can't do that for everyone up front
ivan: Try to invite people, but hard to invite folks not in the
WG
... have to actively bring people in
... if they come they'll have questions and someone will have
to help them get over the hurdles
... don't remember how difficult it was for the RDFa testing,
for example
... might have been a bit more automatic, but there were
instructions about what to set up
... there's always something like that, there's always some
help and intervention needed
TimCole: Would shorten things if we can set up the base line as
a template
... then don't need to describe it... just say to copy the
template
... can set up the folders for them to make it easier
ShaneM: If you don't understand github, there's no way you'll
be able to do this
... would be nice if there /was/ a way to upload of course
TimCole: Have to rename the results file, then fork the repo,
create a PR, add the results and annotations, and update the
readme file
... so not terribly complicated but harder with no examples
... who can help with instructions and Readme?
azaroth: Can try to take a look early next week
TimCole: We'll make some edits before Monday
... anyone else?
<tbdinesh_> I will ask someone in my team to do that
TimCole: Shared some of this with Randall and Nick but don't
know if they've had a look at it yet
... Probably they'll say it's too long, Benjamin gave the same
feedback
... Rob, do you have annotations from a client?
azaroth: Yep, I can do that
TimCole: Thanks Dinesh
<Zakim> ShaneM, you wanted to talk about updating process
ShaneM: in terms of update process, just do a single PR
... I've already corrected the files that weren't named
correctly for her
TimCole: By the end of the day we should have the right set of
reports for 1 implementation with 3 annotations
... going to rearrange the tests
ShaneM: I'll withdraw the PR to put in the new tests then
ivan: Looking at the 3 result files. Protocol looks pretty good
... no information about the two implementations
ShaneM: The implementation details are in the readme
... no way to have the implementation details in the response
file
ivan: Can the results file have a link to where to find these
things?
... Management will have no idea what to do with the file
... no facility in WPT reports for how to do it, so would have
to do it by hand
... Ahh, don't do it by hand. Just needs to be clear that
there's a readme file
<ShaneM>
[15]https://github.com/w3c/test-results/tree/gh-pages/annotatio
n-protocol
[15] https://github.com/w3c/test-results/tree/gh-pages/annotation-protocol
ivan: in the HTML file, through github.io, I'd just like a
static link to say go here to understand these
TimCole: We could put that link at the top of the file?
ShaneM: If that satisfies Ivan's requirement?
ivan: If that link is on the report, that's fine
... can it not say fail for optional things?
ShaneM: We've talked about that :( It has to say fail at the
moment
... we just need to say it's optional
ivan: If I look at the model, there's lots of fails
... the impression is that it looks bad
... don't know what to do though
TimCole: For any given annotation, for the optional features,
it might include only a very few of the features
... so just need two greens across the row
... even if there are 100 fails
ivan: As a reporting issue, it could be a problem
... at the transition call there'll be a long discussion as to
what is going on
<ShaneM>
[16]https://w3c.github.io/test-results/annotation-model/less-th
an-2.html
[16] https://w3c.github.io/test-results/annotation-model/less-than-2.html
TimCole: We can reduce down what we think of as a feature
<ShaneM> oops...
<ShaneM> Take a look at that
TimCole: we have a test for 'is this a selector' and then 7
tests for what sort of selector
... will see more green for general selectors than for specific
ones
... need to talk about this before the end of the month
... or put them into skipped
ivan: it's more the reporting
<ShaneM> "what is skipped
ivan: from high up, if what's optional is put somewhere else in
the report, it gives a different view
TimCole: There'll be the top part with lots of greens, the
MUSTs, and then 100-something mostly reds for the optionals
ivan: Psychologically speaking, I'd try to find a different
color than red
... but at least we need to separate them
TimCole: a skip would not show up in red?
ShaneM: what's a skip?
TimCole: In the test format, there's an option to skip?
ShaneM: ahh, that's just flow control on assertions
TimCole: If I skip an assertion, they wouldn't show up?
... there'd be a blank box?
ShaneM: Probably?
ivan: You understand the problem though, it will create issues
ShaneM: The fundamental problem is that the Web Platform (in
general) does not envision the notion of optional
... but we have it. I don't know how to represent that except
by naming the assertion clearly
... That's not reflected in the report, but we could
ivan: Is it possible?
ShaneM: It is. The WPT people hate that we've made our subtest
names complicated but whatever
... putting mandatory or optional at the beginning will make
them more complicated
azaroth: Could we leave a failed optional as a blank box?
ivan: that would be better
TimCole: I'll play with flow control this afternoon
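(A sketch of the separation being discussed, in TypeScript: if
each subtest name starts with "mandatory:" or "optional:", a
report post-processor can leave failed optionals as a blank
cell rather than a red FAIL. The names and results below are
invented; this is not the suite's real output format.)

    type SubtestResult = { name: string; status: "PASS" | "FAIL" };

    // Invented example results, using the proposed name prefixes.
    const results: SubtestResult[] = [
      { name: "mandatory: target property is present", status: "PASS" },
      { name: "mandatory: @context is anno.jsonld", status: "PASS" },
      { name: "optional: TextQuoteSelector implemented", status: "FAIL" },
      { name: "optional: FragmentSelector implemented", status: "PASS" },
    ];

    // Show a failed optional as a blank cell instead of a red FAIL.
    function displayStatus(result: SubtestResult): string {
      const isOptional = result.name.startsWith("optional:");
      return isOptional && result.status === "FAIL" ? "" : result.status;
    }

    for (const result of results) {
      console.log(displayStatus(result).padEnd(4), result.name);
    }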
ivan: Gregg has done full vocab testing?
azaroth: I have as well
ivan: The report is empty though
...
...
...
*tumbleweed*
scribe: ...
ShaneM: I'm doing my best
<Zakim> ShaneM, you wanted to talk about protocol testing...
ShaneM: I can do it by hand
ivan: Once we've done it, it's a small report we can do by hand
TimCole: Sorry for not talking about HTML serialization
... or protocol testing
ShaneM: Let's work on the sequencing thing and see if we can
figure it out
TimCole: No reason to test selectors if you don't have a
specific resource
... the more consolidated tests will make it easier
... Worried that the report will correctly have blank cells
ShaneM: Suggest reorganizing tests first
TimCole: Thanks all
Bye!
<ivan> trackbot, end telcon
Summary of Action Items
Summary of Resolutions
1. [17]Minutes of the previous call are approved:
https://www.w3.org/2016/09/02-annotation-minutes.html
[End of minutes]
__________________________________________________________
Minutes formatted by David Booth's [18]scribe.perl version
1.144 ([19]CVS log)
$Date: 2016/09/09 16:10:24 $
[18] http://dev.w3.org/cvsweb/~checkout~/2002/scribe/scribedoc.htm
[19] http://dev.w3.org/cvsweb/2002/scribe/
Received on Friday, 9 September 2016 16:12:34 UTC