Concepts to improve HTTP/2.0

Hi,

I am familiar with the IETF; however, I have yet to make an official
submission.

I would like to put forward a concept that could further improve the
performance of HTTP/2.0.
I also have a couple of other concepts regarding content expiry headers,
which would affect HTTP/1.1.
Additionally, I would like to look into concepts to prevent unnecessary
server push of content that is already cached by the browser. Given
mobile bandwidth constraints, clients would obviously benefit from not
being pushed content they already have cached.

The full document on the concept can be found at the link below, and the
abstract follows at the end of this email.

https://docs.google.com/document/d/1xGY4GycBMt4zyCoJpzoIZrlLOs1bwaRVBfPE9aXdbyE/edit?usp=sharing

Could you please advise me on the path to follow to make a submission?


Kind Regards,

Wesley Oliver
HTTP Response Stream - An optimistic approach to performance improvement,
and the snowball effect of a response-body programming paradigm shift

Abstract

Traditionally, in HTTP/1.1, one is required to buffer an HTTP response on
the server side in case a change to the headers needs to be made somewhere
during the page generation code, because headers are not allowed to be
changed after the message-body has started to be transmitted. Changing
these semantics by removing this constraint in HTTP/2.0 would open the
door to a paradigm shift in HTTP response programming. The benefits are
improved and more optimal bandwidth utilization, reduced overall page
render and resource latency, and potentially an increase in the number of
page requests a server can process.
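
For illustration only (this sketch is not part of the linked document), a
minimal Node.js example in TypeScript of the constraint described above:
once the first body bytes have been flushed, the headers are committed, so
a handler that might still need to change the status or headers has to
buffer the whole body before sending anything.

import * as http from "node:http";

// Minimal sketch of the HTTP/1.1 constraint: headers are committed as soon
// as the first body bytes are flushed, so a late header change fails and
// the usual workaround is to buffer the entire response first.
const server = http.createServer((req, res) => {
  res.setHeader("Content-Type", "text/html");
  res.write("<html><body>Rendering...");   // body flushed: headers are now committed

  // Page-generation code later decides it actually needs a redirect:
  try {
    res.setHeader("Location", "/login");   // throws ERR_HTTP_HEADERS_SENT
  } catch (err) {
    console.error("Too late to change headers:", (err as Error).message);
  }

  res.end("</body></html>");
});

server.listen(8080);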
Concept:

Allow multiple responses to be sent over the wire for the same request,
whereby the last response transmitted over the wire forms the official
response that is permanently rendered in the client browser.

This is an optimistic approach: in the common case the response will not
change, which eliminates the need to buffer it. As soon as the network
buffer holds a full packet, or has been forcibly flushed, it can be
transmitted over the wire, reducing the response latency experienced by
the client. It also allows for improved bandwidth utilization after the
server has received the request, as the server can immediately start
sending response packets, reducing the bandwidth otherwise left idle while
the response is being generated and buffered before transmission.
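
As a sketch of how these proposed semantics could look on the server side,
the fragment below uses an invented OptimisticStream interface; respond,
write and supersede are hypothetical names for illustration only and are
not part of any existing HTTP/2 API.

// Hypothetical sketch of the proposed semantics; the OptimisticStream
// interface below is invented for illustration and does NOT exist in any
// current HTTP/2 stack.
interface OptimisticStream {
  respond(headers: Record<string, string>): void;  // start streaming immediately
  write(chunk: string): void;                      // flushed as packets fill
  supersede(headers: Record<string, string>, body: string): void; // replaces all bytes sent so far
  end(): void;
}

function handleRequest(stream: OptimisticStream, generatePage: () => Iterable<string>) {
  // Optimistic path: start transmitting as soon as the first chunk is
  // ready, instead of buffering the whole page.
  stream.respond({ ":status": "200", "content-type": "text/html" });
  try {
    for (const chunk of generatePage()) {
      stream.write(chunk);  // sent as soon as a packet fills or is flushed
    }
    stream.end();
  } catch {
    // Late failure: under the proposal a second response may be sent for
    // the same request, and the last response transmitted is the one the
    // client renders.
    stream.supersede(
      { ":status": "500", "content-type": "text/plain" },
      "Page generation failed"
    );
  }
}

Under this sketch the common case pays no buffering cost; only the rare
case in which the headers or content must change pays the cost of
retransmitting a superseding response.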




-- 
Web Site that I have developed:
http://www.swimdynamics.co.za


Skype: wezley_oliver
MSN messenger: wesley.olis@gmail.com

Received on Wednesday, 27 July 2016 06:20:15 UTC