Re: W3C Polyfill service?

Hey Brian,

On Sunday, October 30, 2016, Brian Kardell <bkardell@gmail.com> wrote:

>
>
> On Sun, Oct 23, 2016 at 12:49 AM, Andrew Betts <andrew.betts@gmail.com>
> wrote:
>
>> Alex:
>> > I'm not sure the W3C is really suited to the task of running
>> web-critical, high-performance infrastructure.
>>
>> Agreed. The angle that intrigued me was the idea of the W3C 'blessing'
>> (through co-branding, use of a w3.org subdomain, governance, etc.) a single
>> co-ordinated polyfill effort, but without taking on the responsibility for
>> the infrastructure.  Currently Fastly and Heroku run the infrastructure for
>> polyfill.io and they seem well suited to it.
>>
>> Travis:
>> > What would be the goal of a polyfill repository? Would they count as an
>> implementation of a feature? Could they also be used to help the testing
>> effort? If so, then I would love to see them integrated into
>> web-platform-tests
>>
>> Goals are to enable faster adoption of new platform features.  Developers
>> can be hesitant if they think the polyfill will bloat their code, or if
>> they are not sure if it's any good, etc.  Web platform tests are
>> interesting - in many cases polyfills are not perfect replicas of the
>> native feature, often due to technical feasibility, sometimes due to lack
>> of resources on the part of the author to cover the entire spec.
>>
>> Building a polyfill is harder than building the feature natively, since
>> your code needs to work in all browsers (and by definition not just the
>> latest ones) whereas the native impl only has to work in one version of one
>> browser.  At the same time polyfill authors are normally individuals with
>> limited resources whereas native implementations are created by large teams
>> within well funded corporations.  Also, the nature of polyfills means it's
>> possible to retrospectively fix bugs in them much more easily - especially
>> if they are served from a centrally managed CDN.
>>
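
As an aside on the "centrally managed CDN" point, for readers who haven't
used polyfill.io: the service assembles a bundle per request based on the
requesting UA and the features you ask for, so a fix to a polyfill reaches
every consumer without anyone redeploying.  From memory, the v2 endpoint
looks roughly like this (treat the URL shape and feature names as a sketch):

    <script src="https://cdn.polyfill.io/v2/polyfill.min.js?features=fetch,Promise"></script>
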
>> In polyfill.io we've looked at the feasibility of testing polyfills with
>> web platform tests, but in practice almost all polyfills fail those tests,
>> and yet people still use them successfully.  Obviously it's great when a
>> polyfill author makes a polyfill that passes WPTs, but since most don't,
>> making that a condition of acceptance into a community polyfill library
>> would be problematic.  I guess polyfills might help the testing effort in
>> other ways?
>>
>> A
>>
>
>
> I do think that there are several things that W3C and TAG can/should do
> regarding a lot of these ideas.  I've shared my thoughts in the past, and
> I've been trying to figure out the best way to comment, because there are
> a number of related issues/discussions I see happening that all seem to
> converge around ideas I've had and private discussions that I haven't had
> a chance to assemble into something presentable.  I'm going to skip the
> details and highlight the things that I think something should happen
> around; perhaps questions/comments will help me figure out how to best
> pursue/present more.
>
>
> Much of this is centered on the fact that things that are implemented in
> JavaScript alongside a draft proposal for standardization are not the
> traditional polyfill as defined by Remy Sharp - they have a number of
> distinct concerns that are expressly not concerns for real polyfills.  If
> we do this well, it is potentially a boon to the ecosystem, and there are
> lots of potential follow-ups to make sure that we can make it vibrant and
> successful.  If we do it poorly, we can actually harm the ecosystem.
>
> I think that 'thing' _requires_ a naming distinction from polyfill.  I also
> think it has a perfectly useful one already that has been widely
> disseminated in all forms of media, but I don't want to get bogged down in
> that in the main argument :)
>
> So one thing that TAG can do, and I thought there was agreement to do this
> in another TAG meeting, is to put out a note or finding of best practices
> around development of this 'thing' that helps it achieve its goals and
> avoid pitfalls/harm.  I think the most major bits are: a) forward
> compatibility, meaning you don't squat on the name before it is really
> standard (frequently a leading underscore or $ is enough)
>

There's a pretty big grey area here that I think is just as important: when
is it reasonable to drop the prefix? When a spec hits CR? PR? Has multiple
native impls? When most modern browsers support the feature?
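
To make (a) concrete, here's roughly the shape I have in mind - a sketch
only, and "fancyObserve" is a made-up feature name:

    // Speculative implementation of a hypothetical proposed feature.
    // It never claims the unprefixed name, and it steps aside entirely
    // if a real/native implementation ever ships.
    (function (global) {
      'use strict';
      if ('fancyObserve' in global) {
        return;                              // the real thing exists
      }
      global.$fancyObserve = function (target, callback) {
        // ...behaviour per the current draft would go here...
      };
    })(typeof window !== 'undefined' ? window : this);

The grey area is exactly about when that $ is allowed to fall away.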


> b) recognition of the fact that in order to have value early on,
> versioning really matters.  c) ways to express differences and limitations
>
> Another thing (again, I've mentioned this before) is that channeling
> people toward safer implementations, and harnessing this as part of the
> process, becomes useful to both sides (standards and developers).
>

Safety is an interesting concept here; do we mean "avoids premature
compatibility with a specific grammar or semantic which may be out of sync
w/ the final spec"?
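
Part of what I'd want "safe" to mean, per your (b) and (c), is being explicit
about which draft you track and where you knowingly diverge.  Continuing the
sketch from above (names, fields, and URL are all hypothetical):

    // Attach metadata so consumers can tell what this implements and
    // where it falls short of the proposal.
    if (window.$fancyObserve) {
      window.$fancyObserve.meta = {
        draft: 'https://example.org/fancy-observe/ (Oct 2016 draft)',  // placeholder
        version: '0.3.0',
        limitations: [
          'does not implement the batching option from the draft',
          'callback timing differs from the draft'
        ]
      };
    }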


> Incubation should encourage having one of these.  It should be linked in
> the proposal and be required to follow those best practices.  This would
> provide some kind of aid for developers to find such things and have at
> least some sense of their quality.
>

I think I've been phrasing this slightly differently: incubation should
seek to get developer feedback and iterate quickly based on it. There are
many cases where polyfills don't have enough power to provide a reasonable
facsimile of the eventual feature. The Houdini APIs come to mind. In these
cases it'd be a bit of a distraction to say "make a polyfill!" when feedback
about how well something like Custom Paint will function has to come from a
prototype implementation that devs can target (hence Origin Trials
<https://github.com/jpchase/OriginTrials/blob/gh-pages/explainer.md>).
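
Concretely, the kind of thing a page in a trial (or any page wanting to use
the prototype) might do - assuming the draft's CSS.paintWorklet.addModule()
shape, with a made-up worklet file and paint name:

    // Use the experimental implementation when the browser exposes it,
    // otherwise fall back or simply skip the enhancement.
    if (typeof CSS !== 'undefined' && 'paintWorklet' in CSS) {
      CSS.paintWorklet.addModule('checkerboard-paint.js');  // hypothetical worklet
      document.body.style.backgroundImage = 'paint(checkerboard)';
    } else {
      document.body.style.backgroundImage = 'url(checkerboard-fallback.png)';
    }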


> Finding and a sense of quality are two areas that I think really matter for
> success and that I would like to see more on.  Early in the industrial
> revolution, no one had much experience with the newly available raw
> materials - tinkerers produced a lot of goods of varying quality.  There
> was, for a time, a substantial impasse of trust.  Stores were gambling
> their reputations (lots of consumer laws were not in place) by taking on a
> product they knew nothing about, and consumers were gambling by purchasing
> a product they knew nothing about.  Independent signals really helped this
> problem and got us past that impasse.  Things like UL listing were a mark
> that people could trust.  If you think about it, we do that with tech
> already.  If lots of smart people are saying "this thing is good" then I
> look at it.  If I come across some random thing I've never heard of, I'm
> more skeptical.
>
> Ideally, what I would like to see is a vibrant incubation group where the
> bar to entry isn't too high or too low - and where incubations include
> reasonably good and safe implementations wherever possible.  Those
> proposals and incubations should provide a way to be indexed and collected
> in a caniuse/chromestatus kind of catalog for proposed standards, with
> some measure of quality as well as information about positive signaling by
> vendors.  Invited experts in particular fields should be able to positively
> signal too.
>
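
This catalog idea is the part I'd most like to see sketched out further.
For concreteness, I picture an entry as something roughly like this - every
field name and URL below is hypothetical, just to give the flavour:

    {
      "proposal": "Fancy Observe",
      "explainer": "https://example.org/fancy-observe/explainer",
      "implementation": "https://example.org/fancy-observe/impl.js",
      "status": "incubation",
      "follows_best_practices": true,
      "vendor_signals": {"positive": 2, "negative": 0},
      "expert_signals": ["invited expert endorsement"]
    }
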
> If we had these sorts of things, it would be very easy to shepherd
> developers toward them for standards efforts and participation, because it
> would make a lot of aspects of their lives easier and engagement more
> valuable.  It stops short of W3C "blessing" or "hosting" anything, but it
> would make those sorts of efforts considerably more plausible and useful by
> simply being somewhat integrated into the process.
>
>
>
>
> --
> Brian Kardell :: @briankardell
>

Received on Sunday, 30 October 2016 07:27:46 UTC