Re: IA of Max's JS files for upload?

This concerns improvements needed to the work completed so far on the
MSDN-JS import project, improvements like better page names and
conversion of content to template syntax.

There was a documentary about nuclear reactors that attributed the
high cost of construction in part to changing regulations.  By the
time a reactor was designed and then built, so much time had elapsed
that a new set of regulations had begun to take effect, requiring
considerable rework and additional cost.

Such is the nature of physical construction.  Software, however, is
very much unlike physical material; data input and output is more
like soft putty, easy to work and rework.  So rather than wait until
everything is perfect, my plan is to create interim versions of the
work completed so far, uploaded to a test wiki but also committed to
GitHub.

The next round of tools and scripts uses the interim version as input
and creates another set of output.  Successive rounds can proceed
like this until the conversion is complete.  The main driver is that
tools and scripts operating on local files work much faster and more
efficiently than operating against web resources (pages stored in a
wiki), which have to be downloaded and uploaded.
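The round-based flow described above can be sketched as a small script
that reads every page from the previous round's directory, applies a
transform, and writes the result into the next round's directory.  The
directory names, the ".wiki" extension, and the sample transform are
assumptions for illustration, not the project's actual tooling.

```python
import os

def run_round(transform, src_dir="round-alice", dst_dir="round-bravo"):
    """Apply one processing round: transform each .wiki file from
    src_dir and write the result into dst_dir (created if missing)."""
    os.makedirs(dst_dir, exist_ok=True)
    for name in os.listdir(src_dir):
        if not name.endswith(".wiki"):
            continue
        with open(os.path.join(src_dir, name), encoding="utf-8") as f:
            text = f.read()
        with open(os.path.join(dst_dir, name), "w", encoding="utf-8") as f:
            f.write(transform(text))

def tidy(text):
    """Example transform: strip trailing whitespace from every line."""
    return "\n".join(line.rstrip() for line in text.splitlines()) + "\n"
```

Each round's output directory then becomes the input directory of the
next round, so no step depends on downloading pages from the wiki.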

The spirit of a wiki is successive refinement, so having the bulk
import effort use that pattern seems apropos.

So the first round of output, consisting of all the wiki content so
far plus upload-mapping.wiki (a mapping of where each page should be
uploaded), is now located here and will be used as input for the next
round of processing:

    https://github.com/maxpolk/msdn-js-conversion/tree/master/round-alice
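A later round will need to read upload-mapping.wiki to know where each
local file should land.  The actual format of that file is not shown
here, so this sketch assumes one mapping per line, local filename and
target page name separated by a tab, with blank and "#" comment lines
ignored; adjust the parsing to the real format.

```python
def parse_upload_mapping(text):
    """Parse an assumed 'local-file<TAB>target-page' mapping format
    into a dict of {local filename: target wiki page name}."""
    mapping = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue  # skip blanks and comments
        local, target = line.split("\t", 1)
        mapping[local] = target
    return mapping
```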

The next round will address page names.  The internal content quality
has not been reviewed yet; that review will lead to fixes such as
removing empty "see also" sections at the bottom of some pages,
removing Microsoft-specific content like ActiveXObject (which runs
only in Internet Explorer), and so forth.
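A content-review pass could start by mechanically flagging the two
issues just mentioned.  This is only a sketch: the "== See also =="
heading syntax is an assumption about the wiki markup in use, and the
check only catches an empty section sitting at the very end of a page.

```python
import re

def find_issues(text):
    """Flag an empty trailing 'See also' section and any reference to
    ActiveXObject (Internet Explorer only) in one page's wikitext."""
    issues = []
    # Heading followed by nothing but whitespace to end of page.
    if re.search(r"==\s*See also\s*==\s*\Z", text, re.IGNORECASE):
        issues.append("empty 'See also' section")
    if "ActiveXObject" in text:
        issues.append("ActiveXObject reference")
    return issues
```

Running this over every file in a round's directory would produce a
work list for the manual review.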

Received on Saturday, 15 June 2013 16:31:45 UTC