Message-Id: <m119xUF-000OXNC@jazzie.com>
From: sds@jazzie.com (Sean Shapira)
To: Ken.Coar@Golux.Com (Rodent of Unusual Size)
Date: Thu, 29 Jul 1999 14:13:11 -0700 (PDT)
Cc: ejw@ics.uci.edu, ietf-dav-versioning@w3.org
In-Reply-To: <37A02C76.277A74D@Golux.Com> from "Rodent of Unusual Size" at Jul 29, 99 06:27:02 am
Subject: Re: Build management with Delta-V

> Jim Whitehead wrote:
> > 2) source code remote, compiler remote, object files remote
> >
> > In this model, you would submit the URL of a makefile, some
> > parameters, and a workspace to a remote build server, and the build
> > server would then go off and remotely compile the source code.

To which Ken Coar responded:

> In ye olden daze (before the onset of ye greye haires), this was
> called 'batch processing.'  [...]  been there, done that, dropped
> my obligatory box o' cards.

Normally I would agree with this sentiment, as it facilitates a less
restrictive (and thus more efficient) development environment.  But I
fear it doesn't always work, and Delta-V needs to support the cases
where it doesn't.

> I like the CVS model, which would have me replicate the source
> to my workspace, perform the build, and then check in any
> updated results as appropriate.

To my eye Ken has elided an important step: build verification
testing.  In a BVT the user at a minimum makes sure the derived object
is not totally broken (e.g., an executable that the loader will fail
to load).  Ideally the BVT would include a complete regression test
suite, run on all supported target platforms.

> I'm not in favour of any remote processing at all, unless
> features or functions are required that cannot exist locally.

I would not want to build and BVT for 8 or 10 different target
platforms locally -- I would much prefer to batch this work out,
especially in situations where cross-compilers were unavailable!
(This sentiment assumes the project doesn't simply rely on beta
testers to discover BVT failures....)
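To make the batch-it-out idea concrete, here is a minimal sketch of
Jim's "model 2" in Python.  Every name in it (run_build, run_bvt,
batch_build, the target strings, the makefile URL) is my own
illustration, not anything from a Delta-V draft: a build server
accepts a makefile URL and a list of target platforms, builds and
BVTs each one, and reports per-target results.

```python
# Hypothetical sketch of "model 2": a remote build server builds and
# BVTs for each target platform.  All names are illustrative only.

def run_build(makefile_url, target):
    """Pretend to compile for one target; a real server would fetch
    the makefile and invoke the toolchain here."""
    # Stand-in logic: succeed for every target except a known-bad one.
    return target != "broken-target"

def run_bvt(target):
    """Minimal build verification test, e.g. ask the loader to load
    the derived executable.  Stubbed out here."""
    return True

def batch_build(makefile_url, targets):
    """Build and BVT every target; return a {target: passed} map."""
    results = {}
    for target in targets:
        results[target] = (run_build(makefile_url, target)
                           and run_bvt(target))
    return results

results = batch_build("http://example.org/src/Makefile",
                      ["linux-x86", "solaris-sparc", "hpux-pa"])
```

The point of the sketch is only the shape of the request (makefile
reference plus target list) and the shape of the answer (one verdict
per target), not any particular build technology.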
Returning to Jim's model 2, in the extreme one would want the BVT used
as a precondition for successfully performing the versioning action.
In the vernacular, "The checkin should fail unless the BVT passes."

To make this work, the versioning system might need to handle
transactions that take a long time (hours, or even days) to complete.
And certainly it would require transactions that could span multiple
resources "atomically," i.e. such that the server-side BVT would be
run on objects derived from the entire set of changed sources, which
would then either be accepted together into the version tree, or
rejected together.

It seems likely the representatives of sophisticated versioning
systems (Atria was mentioned in a prior message) know about these
needs, and have voiced them in design team discussions.  True?

-- 
Sean Shapira         sds@jazzie.com         +1 206 443 2028
               Serving the Net since 1990.
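For concreteness, the all-or-nothing checkin described above can be
sketched in a few lines of Python.  The names (checkin,
CheckinRejected, build_and_bvt, the version-string notation) are
hypothetical illustrations of the semantics, not Delta-V protocol
elements: either the whole change set enters the version tree, or
none of it does.

```python
# Hypothetical sketch of "the checkin should fail unless the BVT
# passes," with atomic acceptance of a multi-resource change set.
# All names are illustrative only.

class CheckinRejected(Exception):
    """Raised when the server-side BVT fails; nothing is versioned."""

def checkin(version_tree, change_set, build_and_bvt):
    """Accept the entire change set into the version tree only if a
    build derived from the full set passes its BVT; otherwise reject
    the whole set and leave the tree untouched."""
    if not build_and_bvt(change_set):   # may take hours, or even days
        raise CheckinRejected("BVT failed; no resource was versioned")
    version_tree.extend(change_set)     # accepted together
    return version_tree
```

For example, checking in ["main.c@2", "util.c@1"] against a tree of
["main.c@1"] with a passing BVT yields a tree of all three versions;
with a failing BVT it raises CheckinRejected and the tree is
unchanged.  The long-running, multi-resource transaction is exactly
the part a Delta-V server would have to support.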