
Re: Proposal deriving from checkpoint 2.1

From: Kynn Bartlett <kynn-edapta@idyllmtn.com>
Date: Mon, 30 Jul 2001 12:00:49 -0400
Message-Id: <a0510030ab78b2f801767@[10.0.1.2]>
To: Anne Pemberton <apembert@erols.com>, jasonw@ariel.ucs.unimelb.edu.au, Web Content Guidelines <w3c-wai-gl@w3.org>
At 11:02 AM -0400 2001/7/29, Anne Pemberton wrote:
>         What is the lowest common denominator -

I hate the "lowest common denominator" arguments -- I think those tend to
waste time and put the wrong spin on things; from a PR standpoint the
phrase is easy to grasp, but it's also insulting. :)  So I will
assume you are asking "what are the edge cases"?

>consider the simplest site, consisting of one page of content. What 
>navigation mechanism is needed for such a site?

Ah, there's an easy trap to fall into.  You don't measure content in
terms of _pages_.  A page is just an arbitrary chunk of content presented
in one medium.  What might be a "page" on a web site could be six panels
of a tri-fold brochure if printed out.

And of course there's often a need for intra-page navigation as well;
consider the WCAG 1.0 document itself.  It is "one page" but includes
extensive navigation within itself.  So it's possible to have a single
page that requires navigation mechanisms!

That's why it's better to speak in terms of navigating content, and
the question then is "what navigation mechanism is appropriate for
this type of content"?

>When does a site grow big enough (or start out big enough) to need a 
>site map? an index or table of contents? more than either?

These are information architecture questions and are harder to quantify
than some things.  Which is why it would be nice if we could say:

      "Checkpoint X.X:  Use a reasonable architecture for your information."

Completely "uncheckable" and thus problematic.  (Or is the "checkable"
concept problematic?)

>does putting a search engine on a site mean linking to yahoo or 
>google from a page?

Good point; this is one of the ways that we will be misunderstood by
web developers, especially inexperienced ones.

>         What is involved in adding a search engine to a site? Do you 
>buy one and hook it up? Do you have to make one from scratch? Sorry 
>if these questions are terribly naive.
>                                                 Anne

No, these are good questions.

There are several ways to make a search engine for a site:

(1)  Buy the software, install it on your web server, index your pages,
      and write a form on the site to access it.

      This is, I believe, what the W3C has done for their web site, as
      they use AltaVista technology, I think.  (Maybe it wasn't bought,
      maybe it was donated.)

      No specific specialized knowledge is necessary here, as the search
      engine software should come with instructions and support, but you
      will need to have the ability (and understanding) to add software
      (e.g. CGI) to your web server.  (Not all web hosts provide this.)

(2)  Write the software yourself, but otherwise as per #1.  This is
      okay for smaller sites sometimes, and for specific applications and
      types of searching.  For example, the Virtual Dog Show allows you
      to look for entered dogs in a hierarchical manner, by class and
      breed, but also offers a specialized search engine which finds dogs
      based on user-entered criteria.

      This requires that you know how to program server-side applications,
      e.g. CGI, PHP, ASP, Perl, etc., and most likely SQL.
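
      To make the "write it yourself" option concrete, here is a rough
      sketch in Python (my own illustration -- the page names, sample
      text, and whole-word matching rule are all made up, and a real
      version would be a CGI script reading the query from a form and
      emitting an HTML results page):

```python
def build_index(pages):
    """Map each page name to the set of lowercase words it contains."""
    return {name: set(text.lower().split()) for name, text in pages.items()}

def search(index, query):
    """Return the names of pages containing every word of the query."""
    terms = query.lower().split()
    return sorted(name for name, words in index.items()
                  if all(t in words for t in terms))

# Toy "site" of two pages (illustrative stand-ins for real files):
pages = {
    "dogs.html": "entries listed by class and breed",
    "cats.html": "no dogs here at all",
}
index = build_index(pages)
print(search(index, "class breed"))   # prints ['dogs.html']
```

      Real site search engines do much more (stemming, ranking,
      phrase queries), but the core idea -- index once, look up per
      query -- is about this simple.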

(3)  Get some free software, but otherwise as per #1.

      The knowledge here is likely a bit more specialized, since most
      free software for web servers has weak documentation and assumes
      you know quite a bit about system administration.  But you don't need
      to know how to write a search function.

(4)  Use a third party service ("application service provider" model --
      the term ASP, by the way, is "overloaded" in that it can mean several
      things, sigh) which will automatically index your site and provide a
      searchable index.  (The search form is on your site, it sends users to
      the third party site for results, and those results link back to your
      site.)  These services might charge something or they might be free.

      AtomZ is an example of this type of service, as is SearchButton,
      which the HWG uses (though they have gotten out of the "free
      service to small sites" game).  Here is a URL for AtomZ (no endorsement
      implied):

           http://www.atomz.com/services/atomz_search/index.htm

(5)  Piggyback on existing search engines that allow you to restrict a
      search to a specific site.  For example, make a search box like
      this:

            Search:  [+host:kynn.com                   ]

      Make the form submit to AltaVista, and have the user type her search
      parameters after the "+host:kynn.com" parameter.  This is a "better
      than nothing" approach because AV's indexing is infrequent and the
      user interface ("don't delete the kynn.com part!") is less than
      optimal, but it may be better than no search at all.

Hope this helps.

--Kynn

-- 
Kynn Bartlett <kynn@reef.com>
Technical Developer Liaison
Reef North America
Accessibility - W3C - Integrator Network
Tel +1 949-567-7006
________________________________________
BUSINESS IS DYNAMIC. TAKE CONTROL.
________________________________________
http://www.reef.com
Received on Monday, 30 July 2001 12:31:25 GMT
