
[PlugFest/Test] minutes - 5 December 2018

From: Kazuyuki Ashimura <ashimura@w3.org>
Date: Thu, 6 Dec 2018 11:59:42 +0900
Message-ID: <CAJ8iq9VNGKN9ccnjQeJg8LK4cOhQZEX4ONZ+i64neq-UwoSTXQ@mail.gmail.com>
To: Public Web of Things IG <public-wot-ig@w3.org>, public-wot-wg@w3.org
available at:
  https://www.w3.org/2018/12/05-wot-pf-minutes.html

also as text below.

Thanks,

Kazuyuki

---

   [1]W3C

      [1] http://www.w3.org/

                               - DRAFT -

                           WoT-PlugFest/Test

05 Dec 2018

Attendees

   Present
          Kaz_Ashimura, Michael_McCool, Ege_Korkan,
          Kunihiko_Toumura, Michael_Lagally, Taki_Kamiya,
          Toru_Kawaguchi, Tomoaki_Mizushima

   Regrets

   Chair
          McCool

   Scribe
          ege, kaz

Contents

     * [2]Topics
         1. [3]Test plan update
         2. [4]TestFest logistics
         3. [5]CR exit criteria
         4. [6]TD version for TestFest
         5. [7]JSON-LD validation tool
         6. [8]Additional meeting?
         7. [9]Possible interoperability report Note
     * [10]Summary of Action Items
     * [11]Summary of Resolutions
     __________________________________________________________

   <ege> link for the assertion tester:
   [12]https://github.com/egekorkan/thingweb-playground/tree/assertionTest

     [12] https://github.com/egekorkan/thingweb-playground/tree/assertionTest

   <McCool>
   [13]https://www.w3.org/WoT/IG/wiki/PlugFest_WebConf#Agenda_05.12.2018

     [13] https://www.w3.org/WoT/IG/wiki/PlugFest_WebConf#Agenda_05.12.2018

   <kaz> scribenick: ege

Test plan update

   McCool: so let's get started
   ... sort out logistics for next week
   ... made some changes, but mostly cosmetic
   ... we can go to my repo to see the actual branch
   ... input data needed for the report
   ... implementations needed from Fujitsu and Hitachi
   ... it would be useful to have it before the testfest
   ... would be good to have it out of the way
   ... so email me or do a PR
   ... against my repo or main repo
   ... Ege made some progress

   <kaz> scribenick: kaz

   Ege: TD validation tool
   ... how to test the assertions
   ... new tool here

   [14]https://github.com/egekorkan/thingweb-playground/tree/assertionTest

     [14] https://github.com/egekorkan/thingweb-playground/tree/assertionTest

   Ege: it runs through all the assertions
   ... and generates CSV
   ... same format as McCool's report
   ... and some extra information on why an assertion failed

   McCool: ok
   ... TD being tested

   Ege: directly creates the results

   McCool: forgot to mention this tool on the testfest logistics
   page
   ... visits mccool's updated-test-results

   [15]https://github.com/mmccool/wot-thing-description/tree/updated-test-results

     [15] https://github.com/mmccool/wot-thing-description/tree/updated-test-results

   McCool: once you have a result file
   ... put it here

   [16]https://github.com/mmccool/wot-thing-description/tree/updated-test-results/testing/inputs/results

     [16] https://github.com/mmccool/wot-thing-description/tree/updated-test-results/testing/inputs/results

   Kaz: can you put that instruction to the testfest page?

   McCool: has a README.md already

   [17]https://github.com/mmccool/wot-thing-description/tree/updated-test-results/testing

     [17] https://github.com/mmccool/wot-thing-description/tree/updated-test-results/testing

   Kaz: you can add the above URL to the testfest page then

   Ege: some of the assertions have problems
   ... some of them are combinations of multiple assertions
   ... in this case, both assertions have to be handled at once
   ... should I mark both of them as failed if either of them fails?

   McCool: there is a mechanism to track the situation
   (parents/children)
   ... you can go ahead and create a new assertion which is
   specialized
   ... we need more specialized assertions

   Kaz: if the parent assertion fails, the child assertions should
   fail as well
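
   The propagation rule described here could be sketched as follows
   (a minimal, one-level illustration; the dict structure, `parent`
   field, and assertion IDs are assumptions for this sketch, not the
   actual tool's internals):

```python
# Sketch: if a parent assertion failed, mark its children failed too.
# The result structure and assertion IDs here are hypothetical.

def propagate_failures(results):
    """One-level propagation of parent failures to child assertions."""
    for entry in results.values():
        parent = entry.get("parent")
        if parent and results.get(parent, {}).get("status") == "fail":
            entry["status"] = "fail"
    return results

results = {
    "td-context": {"status": "fail", "parent": None},
    "td-context-default": {"status": "pass", "parent": "td-context"},
}
propagate_failures(results)
# the child inherits the parent's failure:
# results["td-context-default"]["status"] == "fail"
```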

   McCool: explains the example of the top assertion
   ... we need all the assertions done
   ... good to know which assertions could be checked by the
   automatic tool
   ... maybe some of assertions can't be checked by the automatic
   tool

   Ege: e.g., an idempotency test

   McCool: we need to go through the test specification
   description as well
   ... maybe we can put some note here (at the test specification
   descriptions)
   ... if only part of the assertions can be tested automatically,
   that's fine
   ... once you find which can be handled, that would be good

   Toru: question
   ... Panasonic has some Things, like an air conditioner
   ... but the TDs are hand-written
   ... can they also be included?

   Kaz: I think they can be included, given that the TD is also
   exposed to the outside for applications
   ... as part of a TD producer, e.g., an air conditioner this time

   McCool: yeah, so we should strike the phrase
   "programmatically generated"
   ... this description is loose enough for proxies as well
   ... consume and produce
   ... currently Panasonic gave me implementation description here
   ... 4 devices
   ... all one implementation of one code-base

   Toru: tx

   McCool: probably Ege needs to flesh out the tool more
   ... some of the tests are manual

   Ege: would it be OK to categorize the assertions?
   ... e.g., JSON Schema, network tool, and manual

   McCool: yeah, we can flesh that out

   Ege: also another sort of assertions?
   ... additional fields

   McCool: there are 4 fields: pass, fail, not-impl, total
   ... the context column includes a contextual link
   ... it's actually in not-terrible shape now
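
   The four summary fields mentioned above could be tallied from a
   per-assertion results CSV like this (a sketch; the `ID`/`Status`
   column names and status values are assumptions, not the report's
   exact schema):

```python
import csv
import io

# Sketch: tally pass/fail/not-impl/total from per-assertion results.
# Column names and status values are assumptions for illustration.

csv_text = """ID,Status
td-name,pass
td-context,fail
td-security,not-impl
"""

counts = {"pass": 0, "fail": 0, "not-impl": 0}
for row in csv.DictReader(io.StringIO(csv_text)):
    counts[row["Status"]] += 1
counts["total"] = sum(counts.values())
# counts -> {'pass': 1, 'fail': 1, 'not-impl': 1, 'total': 3}
```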

TestFest logistics

   McCool: can add information here

   [18]https://github.com/w3c/wot/tree/master/testfest/2018-12-online/

     [18] https://github.com/w3c/wot/tree/master/testfest/2018-12-online/

   McCool: add information
   ... schedule
   ... webex
   ... any restriction for that?

   Kaz: no
   ... anybody from the WG/IG can join the calls

   McCool: Monday: Script webex
   ... Wednesday: Editors webex
   ... Friday: TD partly
   ... will work on the procedure
   ... and preparation TODO
   ... each organization with one or more implementations needs to
   submit an implementation description
   ... make sure all the implementations are online
   ... copy all TDs to the TDs subdirectory
   ... and data collection procedure
   ... validate TDs, generating results files per TD
   ... merge results files, giving one result file per implementation
   ... check in result files
   ... record any interop tests
   ... run npm
   ... (shows the resource for the "interop test" part)
   ... the system merges all the CSV reports to generate the table
   in the interop test part
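
   The "merge results files" step could look roughly like this (a
   sketch under an assumed merge rule: an assertion is a fail if it
   fails for any TD, a pass if it passes for at least one TD and
   fails for none, otherwise not-impl; the file layout and column
   names are also assumptions):

```python
import csv
import io

# Sketch: merge per-TD result files into one per-implementation table.
# Merge rule (an assumption, not the documented procedure):
# fail beats pass beats not-impl.
RANK = {"not-impl": 0, "pass": 1, "fail": 2}

def merge_results(csv_texts):
    merged = {}
    for text in csv_texts:
        for row in csv.DictReader(io.StringIO(text)):
            aid, status = row["ID"], row["Status"]
            if aid not in merged or RANK[status] > RANK[merged[aid]]:
                merged[aid] = status
    return merged

td1 = "ID,Status\ntd-name,pass\ntd-forms,not-impl\n"
td2 = "ID,Status\ntd-name,pass\ntd-forms,pass\n"
# merge_results([td1, td2])
# -> {'td-name': 'pass', 'td-forms': 'pass'}
```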

   Toru: do we use google hangout video for that purpose?

   McCool: last time Matthias provided that
   ... the question is what would happen this time
   ... Matthias is not here
   ... will ask him

   Ege: this is a free service, isn't it?

   McCool: not sure about the participant limit for the free
   service
   ... let me look into it
   ... another point to mention
   ... penetration test
   ... not really complicated
   ... looks like Ege's network service testing
   ... Elena wanted to try various things
   ... next section on assertion testing
   ... and then interop testing

   Ege: can it be automatically generated?

   McCool: that's what I'm assuming
   ... if you can do that, let's do that
   ... Elena can look into Burp Suite
   ... we can automatically generate a configuration file
   ... procedure to be determined
   ... btw, as for the interop testing part
   ... record any interop tests in testing/input/interop
   ... next week during the scripting call, we'll continue the
   discussion

CR exit criteria

   [19]https://github.com/w3c/wot/blob/master/testing/requirements.md

     [19] https://github.com/w3c/wot/blob/master/testing/requirements.md

   [20]https://github.com/w3c/wot/blob/master/testing/criteria.md

     [20] https://github.com/w3c/wot/blob/master/testing/criteria.md

   McCool: kaz mentioned that DCAT/SSN are better examples for
   data model spec

   Kaz: per the feedback from the call with Ralph and PLH yesterday,
   what we need to do is clarify the TD vocabulary and show that 2
   independent implementations use the vocabulary

   McCool: yeah
   ... there is still some confusion about assertions within the
   draft report, which needs to be updated

TD version for TestFest

   <kaz> FYI, the diff between published TD (oct 21) and the
   current editor's draft (nov 29) available at:
   [21]https://w3c.github.io/wot-thing-description/diff.html

     [21] https://w3c.github.io/wot-thing-description/diff.html

   McCool: which version of the TD spec should be used for the
   TestFest?
   ... essentially freeze the TD spec today
   ... if your implementation actually fails that's OK
   ... we're checking the testing procedure now
   ... this is a snapshot today

   [22]https://w3c.github.io/wot-thing-description/diff.html

     [22] https://w3c.github.io/wot-thing-description/diff.html

   Kaz: fyi, the above is the diff between the published version
   on Oct 21 and the current editor's draft on Nov 29

   McCool: ok
   ... let me capture the URL on the testfest page

JSON-LD validation tool

   [23]https://json-ld.org/

     [23] https://json-ld.org/

   Kaz: another suggestion from W3M was
   ... we might want to look into JSON-LD WG's validator above
   ... Ege, is your playground based on that?

   Ege: I built my playground validator from scratch

   Kaz: can you quickly look into the generic JSON-LD validator
   at: [24]https://json-ld.org/?

     [24] https://json-ld.org/

   Ege: can do that

   Kaz: would be helpful
   ... tx
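
   As a stopgap before full JSON-LD processing (which requires a real
   JSON-LD processor such as jsonld.js or pyld), a structural
   pre-check of a TD can be sketched with nothing but the standard
   library; the context URL and checks here are placeholders, not the
   normative ones:

```python
import json

# Sketch: minimal structural pre-check of a TD document.
# Real JSON-LD validation requires a JSON-LD processor; this only
# confirms the document parses and declares a context.

def precheck_td(td_text):
    td = json.loads(td_text)  # raises ValueError on malformed JSON
    return isinstance(td, dict) and "@context" in td

td = '{"@context": "https://example.org/td-context", "title": "Lamp"}'
# precheck_td(td) -> True
```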

Additional meeting?

   McCool: a small group of us is working on validation
   ... maybe some more work tomorrow?

   Kaz: let's do that during the Monday meeting

   McCool: first hour on Monday?

   Kaz: 1pm on Monday in Europe

   Ege: can make it

Possible interoperability report Note

   Kaz: btw, the feedback from the w3m guys included that the
   interoperability part of the draft implementation report
   doesn't have to be part of the official implementation report
   ... on the other hand, it would be useful for implementers to
   publish it as part of a separate interoperability test report
   as a WG Note

   [adjourned]

Summary of Action Items

Summary of Resolutions

   [End of minutes]
     __________________________________________________________


    Minutes formatted by David Booth's [25]scribe.perl version
    1.152 ([26]CVS log)
    $Date: 2018/12/06 02:58:16 $

     [25] http://dev.w3.org/cvsweb/~checkout~/2002/scribe/scribedoc.htm
     [26] http://dev.w3.org/cvsweb/2002/scribe/
Received on Thursday, 6 December 2018 03:00:49 UTC
