
Fwd: Re: URL Spec rewrite (bug 25946) and galimatias test results

From: Sam Ruby <rubys@intertwingly.net>
Date: Fri, 31 Oct 2014 01:08:08 -0700
Message-ID: <54534368.8090005@intertwingly.net>
To: "www-archive@w3.org" <www-archive@w3.org>
First attempt apparently didn't make it to the online archives.

- Sam Ruby

-------- Forwarded Message --------
Subject: 	Re: URL Spec rewrite (bug 25946) and galimatias test results
Date: 	Wed, 29 Oct 2014 01:14:18 +0100
From: 	Santiago M. Mola <santi@mola.io>
To: 	Sam Ruby <rubys@intertwingly.net>
CC: 	Michael(tm) Smith <mike@w3.org>, www-archive@w3.org


2014-10-25 2:36 GMT+02:00 Sam Ruby <rubys@intertwingly.net>:

     He suggested that I ask you for feedback on the following:


It's definitely a useful resource.

I think that the parser defined in the current spec is easier to follow
if you want to implement it as-is.

Your approach gives a better idea of what the parser should do at a
higher level. After implementing the parser following the current spec,
I had a hard time determining what the parsing output should be for some
cases. These new diagrams would solve that problem.

     I also said that I would test galimatias for compatibility.  I've
     posted the results here:


Thank you for taking the time to include Galimatias!

     A few notes: it doesn't appear to me that galimatias reports any
     recoverable parse errors (for example, including a tab or a linefeed
     inside a path).

Galimatias checks every defined error, both recoverable and fatal. It
provides a customizable ErrorHandler interface. The core provides a
DefaultErrorHandler that just ignores any recoverable error and
StrictErrorHandler, which fails on any error. The user could implement a
LoggingErrorHandler that logs recoverable parsing errors, a
CollectorErrorHandler that collects every error for later
analysis/validation, etc. I will add more error handlers to the core if
common patterns of use emerge among the user community.
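A collecting handler along those lines could be sketched as follows. The
interface shown here is a hypothetical, self-contained mirror of the
SAX-style contract described above (the real galimatias interface takes
its own parse-exception type); the class and method names are
illustrative, not the library's actual API:

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical mirror of the ErrorHandler contract described above:
// recoverable errors go to error(), unrecoverable ones to fatalError().
interface ErrorHandler {
    void error(Exception e);       // recoverable parse error
    void fatalError(Exception e);  // fatal parse error
}

// One possible CollectorErrorHandler: records every reported error
// for later analysis or validation reporting.
class CollectorErrorHandler implements ErrorHandler {
    private final List<Exception> errors = new ArrayList<>();

    @Override
    public void error(Exception e) {
        errors.add(e);             // remember the error, let parsing continue
    }

    @Override
    public void fatalError(Exception e) {
        errors.add(e);             // record it; the parser will abort
    }

    public List<Exception> errors() {
        return errors;
    }
}
```

A validator could then inspect errors() after parsing to report every
recoverable problem (such as a tab or linefeed inside a path) at once.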

     Also galimatias doesn't provide the interfaces that the URL Standard
     defines, for example to get the portname - an interface that is
     supposed to return null if the port matches the default port for the

Right. Galimatias does not implement the URLUtils interface. I have
opened an issue to keep a reference for it:

I'm still not sure I want to provide the URLUtils interface as-is. It's a
browser-centric API that I don't find particularly useful outside the
JavaScript-in-a-browser scope. Maybe it makes sense for standards
validation code such as validator.nu?

     Even with that accounted for, there still are a number of notable

     Null pointer exceptions, some examples:

Some of these seem to be caused by the call to url.host().toString() in
your test case. In these cases, host is null. This was the intended
behavior.
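A null-safe variant of that serialization step might look like the
sketch below. The helper name is hypothetical, and host is typed as
Object only to keep the sketch self-contained (in galimatias it would be
the library's host type):

```java
final class HostUtil {
    // Null-safe serialization of a possibly-absent host component:
    // URLs such as file: URLs can legitimately have no host at all.
    static String hostToString(Object host) {
        return host == null ? "" : host.toString();
    }
}
```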

     Returning an empty string instead of null for fragment:

AFAIK this is consistent with the standard (fragment, not hash). You can
change your testing class to:

result.put("hash", (url.fragment() != null && !url.fragment().isEmpty())
        ? "#" + url.fragment() : "");

     Returning an empty string instead of null for query:

Again, this is the standard (query, not search). You can change your
testing class to:

result.put("search", (url.query() != null && !url.query().isEmpty())
        ? "?" + url.query() : "");
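Both conversions follow the same pattern (null or empty component
serializes to the empty string, otherwise the component is prefixed), so
they could be factored into a small helper. The class and method names
here are illustrative, not part of galimatias:

```java
final class UrlSerializer {
    // Shared pattern behind the hash/search getters of the URL
    // Standard: a null or empty component serializes to "", while a
    // non-empty one gets its prefix ("#" or "?") prepended.
    static String withPrefix(String prefix, String component) {
        return (component != null && !component.isEmpty())
                ? prefix + component
                : "";
    }
}
```

The test class could then use, for example,
result.put("hash", UrlSerializer.withPrefix("#", url.fragment())).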

     ipv6 addresses not wrapped in []:

Right. IPv6 addresses are wrapped in [] when serialized as part of a
URL, but they are not wrapped when printed as standalone entities. I'll
fix it:
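The URL-embedding case can be sketched as below. Since a colon can only
appear in a host if it is an IPv6 address, its presence signals that
bracket-wrapping is needed to keep the colons from being read as a port
separator (the helper name is hypothetical):

```java
final class HostSerializer {
    // When a host is embedded in a URL string, an IPv6 address must be
    // wrapped in brackets so its colons are not mistaken for the port
    // delimiter; standalone serialization would omit the brackets.
    static String forUrl(String host) {
        return host.indexOf(':') >= 0 ? "[" + host + "]" : host;
    }
}
```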

     difference in case:

Galimatias is biased towards URL normalization. It tries to avoid
producing distinct serializations of URLs that are equivalent according
to the standard. A setting to disable this percent-encoding
normalization behaviour will be provided if there is a real-world use
case for it.
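One normalization of this kind, and a plausible source of the case
differences noted above, is uppercasing the hex digits of
percent-escapes so that equivalent escapes like "%7e" and "%7E"
serialize identically. A minimal sketch (the class name is
hypothetical):

```java
final class PercentNormalizer {
    // Uppercase the two hex digits following each '%' so equivalent
    // percent-escapes serialize to a single canonical form.
    static String normalize(String s) {
        StringBuilder out = new StringBuilder(s.length());
        for (int i = 0; i < s.length(); i++) {
            char c = s.charAt(i);
            out.append(c);
            if (c == '%' && i + 2 < s.length()) {
                out.append(Character.toUpperCase(s.charAt(i + 1)));
                out.append(Character.toUpperCase(s.charAt(i + 2)));
                i += 2; // skip the two digits already consumed
            }
        }
        return out.toString();
    }
}
```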

     If you want to review how I captured these results, the program I
     used can be found here:


     Please let me know if you identify any problems with that program,
     and I will be glad to rerun the tests.

I have added a test to check with

Do you have more up-to-date data?

I found Galimatias failed with this:


My latest code parses this URL as http://0xc0.0250.01./
This seems in line with the standard, which does not perform sanity
checks against DNS rules. See: https://github.com/smola/galimatias/issues/26
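The hex/octal reading behind that hostname can be sketched as follows,
assuming the labels are interpreted under the URL Standard's IPv4
number rules ("0x" prefix means hexadecimal, a leading "0" means octal,
otherwise decimal); the class and method names are illustrative:

```java
final class Ipv4Number {
    // Numeric-label parsing in the style of the URL Standard's IPv4
    // parser: under these rules the labels of 0xc0.0250.01 read as
    // 192 (hex), 168 (octal) and 1 (octal).
    static int parse(String part) {
        if (part.startsWith("0x") || part.startsWith("0X")) {
            return Integer.parseInt(part.substring(2), 16);
        }
        if (part.length() > 1 && part.startsWith("0")) {
            return Integer.parseInt(part.substring(1), 8);
        }
        return Integer.parseInt(part);
    }
}
```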
It also fails with this one:

But it is a valid domain name at the DNS level, and it is not
forbidden by the URL standard.

Apart from these two cases, Galimatias passes all test cases if the
query/search and fragment/hash differences are taken into account.

Please, let me know if there is any failure in the future or if you have
any feedback.

Thank you again!


Received on Friday, 31 October 2014 08:08:39 UTC
