Re: Build management with Delta-V

Bruce Atherton (bruce@hagbard.flair.law.ubc.ca)
Thu, 29 Jul 1999 14:17:22 -0700


Message-Id: <199907292111.OAA24944@hagbard.flair.law.ubc.ca>
Date: Thu, 29 Jul 1999 14:17:22 -0700
To: Jim Whitehead <ejw@ics.uci.edu>, ietf-dav-versioning@w3.org
From: Bruce Atherton <bruce@hagbard.flair.law.ubc.ca>
In-Reply-To: <NDBBIKLAGLCOPGKGADOJAEENCCAA.ejw@ics.uci.edu>
Subject: Re: Build management with Delta-V

At 07:49 PM 7/28/99 -0700, Jim Whitehead wrote:
>
>Some points in this space:
>
>1) source code local, compiler local, object files local
>2) source code remote, compiler remote, object files remote
>3) source code remote, compiler local, object files remote
>
>So, what are your thoughts?

I'd add another point to that space: 
4) source files remote, transformation tool (compiler) local, derived
(object) files local.

I find point 4 to be a useful architecture for implementing different
promotion levels within a sandbox. Want to run the production version of
some code with the latest changes from your sandbox? Tell the build system
to look for source files in your sandbox first, and failing that, to fetch
them from a given URL. This saves you from having to check out the entire
source tree.
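In build-script terms, the lookup I have in mind is something like the
following Python sketch (the paths, the URL, and the resolve_source()
helper are all made up for illustration; the sandbox-first ordering is
the point):

    import os
    import urllib.request

    SANDBOX = "/home/bruce/sandbox/src"        # hypothetical local sandbox
    BASELINE = "http://dav.example.com/src"    # hypothetical remote tree

    def resolve_source(relpath):
        # Prefer the copy in the local sandbox, if one exists.
        local = os.path.join(SANDBOX, relpath)
        if os.path.exists(local):
            with open(local, "rb") as f:
                return f.read()
        # Otherwise fall back to the versioned copy on the server.
        with urllib.request.urlopen(BASELINE + "/" + relpath) as resp:
            return resp.read()

Feed every input through resolve_source() and the sandbox overlays the
remote baseline automatically, with no full checkout needed.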

Perhaps I would see no need for point 4 if it were more common to put
derived files into a versioning system, but I think that is currently
regarded as bad practice: partly because of the risk of synchronization
errors between source and derived files, and partly because current
versioning systems deal poorly with binary files. I'm sure there are
other reasons I'm not aware of.

Consider rebuilding a program. Typically this involves a "make clean"
followed by a "make". Do you really want the versioning system to record
the state where the object files are deleted? Doesn't it make more sense
to perform the build steps locally and then, if you really want to, store
the resulting derived files?
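A sketch of that flow, with the same caveat that the URL and the file
name are made up:

    import subprocess
    import urllib.request

    def rebuild(publish=False):
        # The clean/rebuild cycle happens entirely in the local
        # workspace; the versioning server never sees the intermediate
        # state where the object files have been deleted.
        subprocess.run(["make", "clean"], check=True)
        subprocess.run(["make"], check=True)
        if publish:
            # Only now, and only if we really want to, store the
            # resulting derived file on the server.
            with open("myprog", "rb") as f:
                data = f.read()
            req = urllib.request.Request(
                "http://dav.example.com/derived/myprog",  # hypothetical
                data=data, method="PUT")
            urllib.request.urlopen(req)

The transient states of the build never touch the server; only the final
artifact does, and only when you ask for it.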

Just my POV, anyway.