- From: Foteos Macrides <MACRIDES@sci.wfbr.edu>
- Date: Thu, 10 Oct 1996 13:06:57 -0500 (EST)
- To: dan@spyglass.com
- Cc: http-wg%cuckoo.hpl.hp.com@hplb.hpl.hp.com
Daniel DuBois <dan@spyglass.com> wrote:
>At 05:09 PM 10/10/96 +0200, Koen Holtman wrote:
>>some HTML form hacks would be needed to provide the same level of downwards
>>compatibility with existing browsers that Safe can provide, for example
>>
>>        <form action="..." method=post preferred_method=get-with-body>
>>        ....
>>        </form>.
>>So it boils down to cruft in HTTP vs. cruft in HTML.
>
>Aren't proxies disallowed from forwarding methods they don't understand?
>Wouldn't GETWITHBODY require a HTTP/1.2 (or rather, a 1.3, since servers
>would be forced to accept it in 1.2, but clients would need to not send it
>until 1.3, ala FullURL)?

	Safe: yes could be sent today.  What GETwithBody would be
replacing in this discussion is not just any GET, but those which would
otherwise have a ?searchpart.  The HTTP/1.1 draft states that the
Cache-Control and Expires headers *can* be used to permit and regulate
caching of replies to POST requests.  What exactly is still being sought
via a GETwithBodyInsteadOfSearchpart that can't be achieved via a POST
with "Safe: yes" and Cache-Control/Expires headers?  Are there *any*
headers or procedures which can't be made to treat a POST with "Safe:
yes" as, in effect, a GETwithBodyInsteadOfSearchpart?

				Fote

=========================================================================
 Foteos Macrides           Worcester Foundation for Biomedical Research
 MACRIDES@SCI.WFBR.EDU     222 Maple Avenue, Shrewsbury, MA 01545
=========================================================================
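[Editor's illustration] The cache rule Fote is arguing for can be sketched in a few lines. This is a hypothetical helper, not code from any draft or implementation: it treats a POST carrying "Safe: yes" *and* explicit freshness information (Cache-Control or Expires) the same way a cache treats an ordinary GET, which is exactly the equivalence to a GETwithBodyInsteadOfSearchpart claimed above.

```python
def is_cacheable(method: str, headers: dict) -> bool:
    """Hypothetical cache check: may a shared cache store the response?

    Treats POST + "Safe: yes" + explicit freshness (Cache-Control or
    Expires) as equivalent to a plain GET, per the argument above.
    """
    # Normalize header names and values for case-insensitive comparison.
    headers = {k.lower(): v.lower() for k, v in headers.items()}
    if method == "GET":
        return True
    if method == "POST":
        # "Safe: yes" asserts the request has no side effects; the
        # freshness headers grant explicit permission to cache the reply.
        safe = headers.get("safe") == "yes"
        fresh = "cache-control" in headers or "expires" in headers
        return safe and fresh
    return False

print(is_cacheable("POST", {"Safe": "yes", "Cache-Control": "max-age=3600"}))  # True
print(is_cacheable("POST", {"Safe": "yes"}))  # False: no explicit freshness
```

Under this reading, a form submission sent as POST with those headers is indistinguishable, for caching purposes, from the GET-with-searchpart it replaces.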
Received on Thursday, 10 October 1996 10:20:54 UTC