
Re: Food for thought (resurfacing)

From: Alex Russell <slightlyoff@google.com>
Date: Tue, 29 Jul 2014 12:03:21 -0700
Message-ID: <CANr5HFUAMwxBgHLLmbE5O-aOGsRx2nQcQh-17C8TGUNY6sF5_A@mail.gmail.com>
To: Marc Fawzi <marc.fawzi@gmail.com>
Cc: "www-tag@w3.org List" <www-tag@w3.org>, Noah Mendelsohn <nrm@arcanedomain.com>, Marcos Caceres <w3c@marcosc.com>, Larry Masinter <masinter@adobe.com>
On 28 Jul 2014 22:39, "Marc Fawzi" <marc.fawzi@gmail.com> wrote:
>
> <<
> Antiquated systems without the ability to auto-update are the root of all
security and developer-pain evil. They should either be forcibly
disconnected from the network for everyone's good (a requirement which
special configuration environments are often aligned with) or upgraded.
> >>
>
> Tell that to our high-recurring-revenue customers in the financial
industry who have just upgraded from IE6 to IE8 and don't feel like
upgrading again for as long as Windows 7 lives.

Happy to. Point me at 'em.

> The web standards process is too slow and too imperfect for tomorrow's
world, which as we know is always approaching. Efforts like NiDIUM prove
that innovation cannot be dictated by any one group of people (in this case
the W3C, the TAG, and the major vendors who lead them), and disasters like
DRM on the Web (EME) are going to be countered by a new breed of browser
vendors who don't believe in sticking to outdated paradigms like HTML/CSS,
which were designed for the world of hypertext documents, not for serious
application development. There will come a time when major browser vendors
have to play catch-up with the new emerging paradigms while carrying the
burden of supporting the web's legacy technologies. Guess who's gonna win
that race long term?
>
> The web does. Not the W3C, TAG et al. All these organizations are
temporary constructs that have to find a niche place in the complex reality
of tomorrow.
>
> Just a verbalized prediction. That's all.
>
> On Mon, Jul 28, 2014 at 10:02 PM, Alex Russell <slightlyoff@google.com>
wrote:
>>
>> On Mon, Jul 28, 2014 at 5:21 PM, Larry Masinter <masinter@adobe.com>
wrote:
>>>
>>> > We're not to a fully auto-updating world yet, but are closer than
ever before and the trend lines are good.
>>>
>>> I think the issue (about dynamically loading engines) isn't the number
of players (one, three, or fifty) but the variety.
>>>
>>> Reality check please:
>>> Is that actually the real world, are the trend lines really that way?
>>
>>
>> Yes it is.
>>
>>>
>>> Or is it only if you are only looking at the auto-updating subset?
>>
>>
>> Nope. Legacy clients are being replaced with auto-updating clients in
general.
>>
>>>
>>> And if it's true the whole world is really trending toward auto-update
everything, is it unreservedly "good"?
>>
>>
>> Yes. Yes it is. Old code is pwn'd code.
>>
>>>
>>> Software updates tend to target (and are tested against) recent hardware
and platforms.
>>> Software updates are disruptive. Updates fix old bugs but can introduce
new ones.
>>> Software updates can be impractical in small-memory embedded systems or
those with special configurations and requirements.
>>
>>
>> Antiquated systems without the ability to auto-update are the root of
all security and developer-pain evil. They should either be forcibly
disconnected from the network for everyone's good (a requirement which
special configuration environments are often aligned with) or upgraded.
>>
>>>
>>> A fully auto-updating world, or one in which engines are dynamically
loaded, is good for fully auto-updating / dynamically loading browser
vendors (whether one or many), but not so good for end users of other
applications.
>>
>>
>> Given the last 10 years of web (in)security, we absolutely, positively,
100% know better. This might have been a reasonable argument in another
age, but not today. The jury is no longer out.
>>
>
Received on Tuesday, 29 July 2014 19:03:49 UTC
