maintaining thousands of WWW pages
I've developed a static approach to generating and maintaining large volumes of Web pages. It relies on the separation of content and form: essentially, the data structure (page content) is determined, template pages are designed (i.e. form is added), and a generic tool is adapted to the specific requirement to make the following efficient:
- data entry/image selection, and
- administration procedures (out for approval, OK, etc.)
by data entry/admin staff.
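The content/form separation above can be sketched in a few lines. This is a minimal illustration, not the actual tool: the template markup, field names, and `render_page` function are all hypothetical, standing in for whatever format the graphic designers' template pages actually use.

```python
from string import Template

# Hypothetical template page ("form") produced by a graphic designer;
# the $placeholders mark where page content is merged in at build time.
PAGE_TEMPLATE = Template("""\
<html><head><title>$title</title></head>
<body>
<h1>$title</h1>
<p>$body</p>
<p>Order info: $order_info</p>
</body></html>
""")

def render_page(record):
    """Merge one content record (as captured by data entry staff)
    into the template, producing a finished static HTML page."""
    return PAGE_TEMPLATE.substitute(record)

# Example content record -- fields are illustrative only.
page = render_page({
    "title": "Widget 42",
    "body": "A sample product description.",
    "order_info": "SKU 0042",
})
```

Because the same record drives every page, redesigning the site means editing one template rather than thousands of pages.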
The data entered during the day is compiled at night-time, and all index pages are created and FTPed to the servers automatically.
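A sketch of that nightly step, assuming the compiled pages are held as filename-to-HTML strings; the host name and credentials are placeholders, and the index format is invented for illustration:

```python
import io
from ftplib import FTP

def build_index(pages):
    """Rebuild the index page from the same data as the pages
    themselves, so every link is generated (never hand-edited)
    and link integrity follows automatically."""
    links = "\n".join(
        '<li><a href="%s">%s</a></li>' % (name, title)
        for name, title in sorted(pages.items()))
    return "<html><body><ul>\n%s\n</ul></body></html>" % links

def publish(files, host, user, password):
    """Nightly upload: FTP the compiled static files to the Web
    server. host/user/password are placeholders for real values."""
    ftp = FTP(host)
    ftp.login(user, password)
    for name, html in files.items():
        ftp.storbinary("STOR " + name, io.BytesIO(html.encode("utf-8")))
    ftp.quit()

index = build_index({"widget42.html": "Widget 42",
                     "gadget7.html": "Gadget 7"})
```

Once uploaded, the server does nothing but serve plain files, which is where the performance advantage over per-request dynamic generation comes from.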
The asserted advantages of this approach are:
- primarily static files => compatibility with ISPs, and much better performance where CPU resources are scarce on busy Internet servers,
- all links and pi data (order info, etc.) are created automatically, so integrity is excellent, and
- consistent user interfaces.
Adaptation costs per application are as low as US$10,000, which includes the template pages (form) produced by graphic designers. I think this compares well, even on cost, with proprietary database solutions once configuration is taken into account.
Over here, there always seems to be a debate between "static" and "dynamic" approaches to Web site development. Maybe I've missed something, but I believe an automated static approach wins hands down wherever it can effectively be applied (particularly with regard to server performance). Furthermore, it can be applied at least to the primary interfaces of all large Web sites, with links provided to serious database engines where the sheer volume of data requires a proprietary solution.
I'd really appreciate your comments on this approach, and would be delighted to discuss it further.
.. Brian O'Shea