- From: Ville Skyttä <ville.skytta@iki.fi>
- Date: Thu, 1 May 2008 15:00:44 +0300
- To: public-qa-dev@w3.org
On Thursday 24 April 2008, Olivier Thereaux wrote:
> On Mon, Apr 21, 2008, bugzilla@farnsworth.w3.org wrote:
> > ------- Comment #1 from ville.skytta@iki.fi 2008-04-21 07:13 -------
> > Done in CVS, will be in 4.4.
> > http://dev.w3.org/cvsweb/perl/modules/W3C/LinkChecker/bin/checklink.diff?r1=4.98&r2=4.99
>
> Ville,
>
> Thanks a lot for this fast patch. Do you have an idea of the things left
> to clear and clean before we can push a 4.4 release? I haven't had a lot
> of bandwidth for checklink in the past few weeks but I don't recall
> there was much in the way of a release.

I just had a look, and it seems that the thing needing most work is HTML
output in command line mode. I'm working on a patch; it should be ready
pretty soon.

The other thing I found is that the "Checking link" javascript callbacks
printed from perl don't get the URL to display in HTML escaped form (in
check_uri()). HTML escaping them is trivial (a rough sketch is at the end
of this mail), but doing so makes my Firefox and Konqueror display &'s as
&amp;'s in the status widget. I'm a bit slow today and maybe I'm missing
something obvious there, but doesn't that sound like a browser bug?

> It may also be worth looking again at where the project is heading.
> Checklink is now a venerable service, working well, and it seems from
> the feedback I receive now and then that the only issue is that its good
> behavior (following robots.txt and waiting a second between requests) is
> annoying in an age of faster web servers. It may be worth looking at
> whether to trigger the robotUA only in recursive mode, or whether we
> want to plug in a faster ajax interface (not for the recursive mode),
> etc.

Yep. And for significant new developments, also check whether Perl is
still the desired implementation language for it in the first place...
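
To be concrete about the escaping, what I have in mind is roughly the
sketch below. It is only an illustration: the callback name and markup
are made up, not what check_uri() actually prints, and the real fix
would sit inside that function.

    use strict;
    use warnings;
    use HTML::Entities qw(encode_entities);

    # Sketch only: escape the URL before interpolating it into the
    # <script> block that updates the "Checking link" status display.
    my $uri     = 'http://example.org/check?foo=1&bar=2';
    my $escaped = encode_entities($uri);   # &, <, >, " become entities

    print qq{<script type="text/javascript">showChecking("$escaped");</script>\n};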
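
On the robotUA point above, the kind of switch I would imagine (again just
a sketch, not checklink's actual W3C::UserAgent code, and the agent string
is made up) is to pick the polite robot agent only when recursing:

    use LWP::UserAgent;
    use LWP::RobotUA;

    # Hypothetical helper: honour robots.txt and the one-second delay only
    # in recursive mode, use a plain user agent for single-document checks.
    sub make_agent {
        my ($recursive) = @_;
        if ($recursive) {
            my $ua = LWP::RobotUA->new('W3C-checklink-sketch/0.1',
                                       'webmaster@example.org');
            $ua->delay(1 / 60);    # LWP::RobotUA's delay is in minutes
            return $ua;
        }
        return LWP::UserAgent->new(agent => 'W3C-checklink-sketch/0.1');
    }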
Received on Thursday, 1 May 2008 12:01:19 UTC