- From: Chris Weber <chris@lookout.net>
- Date: Mon, 12 Nov 2012 23:10:09 -0800
- To: Larry Masinter <masinter@adobe.com>
- CC: "julian.reschke@gmx.de" <julian.reschke@gmx.de>, Philippe Le Hegaret <plh@w3.org>, "public-test-infra@w3.org" <public-test-infra@w3.org>
On 11/12/2012 8:13 AM, Larry Masinter wrote:
> I think I heard from Philippe that perhaps W3C could help set up a
> server (with wildcard DNS entry) for testing? Or is there some other
> kind of instrumentation we could use in the browser to test URL
> parsing?

A server with a wildcard DNS entry would be great. It would be the most
transparent way I can think of to test how the parsed URL hits the wire.
But we might also need a dedicated domain name, e.g.
w3c-url-testing.com, because the server would have to handle all of the
arbitrary incoming requests produced by the test cases, and we wouldn't
want those interfering with the existing test domain, e.g.:

GET /foo/bar/1?2=3
GET /bar/foo/
GET /c%7C//foo/bar.html
GET /foo%2%C3%82%C2%A9zbar
...

With that information, and a unique test case identifier, we could
correlate the URL components in the HTTP request (host, path, query)
with those in the browser's DOM.

Browsers could be instrumented using APIs such as Chrome's webRequest,
but as far as I know that would require writing and installing an
extension for each browser.

I'm still working on the basic DOM parsing tests, most of which are now
working here:

http://www.lookout.net/test/url/

I still have plenty of cleanup to do, and more test groups to process,
namely the groups that break down each component of the URL. And that's
before I can get to any server-side work.

Best regards,
Chris
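[Editor's note: the server-side correlation idea described above can be sketched roughly as follows. This is a minimal illustration, not the actual W3C setup: it assumes a hypothetical `X-Test-Id` request header as the unique test-case identifier (a path prefix or query parameter would work equally well), and it records the host, path, and query exactly as they arrive on the wire so they can later be compared with the browser's DOM values for the same test case.]

```python
# Sketch: a logging HTTP server that records the URL components each
# request carries on the wire, keyed by a hypothetical X-Test-Id header.
# In the real setup, a wildcard DNS entry (e.g. *.w3c-url-testing.com)
# would point every test hostname at a server like this one.
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.parse import urlsplit

observed = {}  # test-case id -> components as seen on the wire


class RecordingHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # self.path is the request-target exactly as the client sent it,
        # e.g. "/foo/bar/1?2=3"; split it into path and query.
        parts = urlsplit(self.path)
        test_id = self.headers.get("X-Test-Id", "unknown")
        observed[test_id] = {
            "host": self.headers.get("Host", ""),
            "path": parts.path,
            "query": parts.query,
        }
        self.send_response(204)  # no body needed; we only record
        self.end_headers()

    def log_message(self, *args):
        pass  # silence the default per-request stderr logging


def start_server():
    """Start the recording server on an ephemeral port; return it."""
    server = HTTPServer(("127.0.0.1", 0), RecordingHandler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    return server
```

A test harness would then fetch each test URL from the browser, look up the matching entry in `observed` by test-case id, and diff it against the components the browser's DOM reports (`location.host`, `location.pathname`, `location.search`).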
Received on Tuesday, 13 November 2012 07:10:37 UTC