Re: [whatwg/encoding] Add a static decode and encode method to `TextEncoder` and `TextDecoder` (#267)

> Gecko doesn't actually use USVString in TextEncoder WebIDL, either. However, the WebIDL layer still has enough overhead that e.g. wasm-bindgen [conditionally avoids](https://github.com/rustwasm/wasm-bindgen/blob/57b1a57c5e221d5d66a34df9e6152c45de8da561/crates/cli-support/src/js/mod.rs#L1451-L1452) going through the WebIDL layer.

Hm, that seems very odd to me. I very much doubt that WebIDL inherently needs to be a slowdown here. The current overhead likely stems from the cost most engines incur when calling from JS into native code, and that is a performance issue that can be fixed. In V8-based runtimes, for example, `TextEncoder.encodeInto` can use V8's "fast API" so that the JIT can inline the call.
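To make the shape of the hot path concrete, here is a minimal sketch (plain TypeScript, with a hypothetical `writeUtf8` helper, not anything from the spec or an engine) of the kind of repeated `encodeInto` call whose cost is dominated by the JS-to-native transition rather than by anything WebIDL requires semantically:

```ts
// Minimal sketch of the hot path being discussed: repeated encodeInto calls
// into a preallocated buffer. Whether this is fast depends on how cheaply the
// engine can cross from JIT-compiled JS into the native implementation, not
// on anything WebIDL requires semantically.
const encoder = new TextEncoder();
const buffer = new Uint8Array(4096);

// Hypothetical helper name, for illustration only.
function writeUtf8(s: string): number {
  // encodeInto reports how many UTF-16 code units were read and how many
  // UTF-8 bytes were written into `buffer`.
  const { written } = encoder.encodeInto(s, buffer);
  return written ?? 0;
}

writeUtf8("hello, world");
```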

The only WebIDL conversion taking place here is the `DOMString` conversion, which is really just a call to [ToString](https://tc39.es/ecma262/#sec-tostring), a zero-cost operation if the argument is already a string. A native ES API would also need to do some runtime type checking (likely the exact same checks), so I really don't see why performance should be a reason to add a native JS API.
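For illustration only, here is roughly what that `DOMString` argument conversion amounts to, written as a hypothetical `coerceToDOMString` helper mirroring ToString semantics (not engine code):

```ts
// Sketch of the WebIDL DOMString argument conversion: ECMAScript ToString,
// which is the identity for values that are already strings.
function coerceToDOMString(value: unknown): string {
  if (typeof value === "string") return value; // already a string: no work
  // ToString throws for symbols; everything else is converted.
  if (typeof value === "symbol") {
    throw new TypeError("Cannot convert a Symbol to a string");
  }
  return String(value);
}
```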

TLDR: I don't think spec decisions should be driven by current performance issues in engines, as I don't see any technical reason why `TextEncoder#encodeInto` must be slower than a hypothetical `String#encodeIntoUTF8`. As far as I can tell, it's just a matter of optimizing the existing code paths.
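This is also easy to sanity-check empirically. A rough micro-benchmark sketch (assuming a runtime with `TextEncoder` and `performance` globals; since `String#encodeIntoUTF8` doesn't exist, only the existing API can be measured) could look like:

```ts
// Rough micro-benchmark of the per-call cost of TextEncoder#encodeInto on a
// short string, where fixed JS-to-native call overhead dominates the actual
// encoding work.
const enc = new TextEncoder();
const out = new Uint8Array(256);
const input = "hello, world";

const iterations = 1_000_000;
const start = performance.now();
for (let i = 0; i < iterations; i++) {
  enc.encodeInto(input, out);
}
const elapsed = performance.now() - start;
console.log(`~${((elapsed * 1e6) / iterations).toFixed(1)} ns per call`);
```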

Please correct me if I missed something obvious here.

-- 
Reply to this email directly or view it on GitHub:
https://github.com/whatwg/encoding/issues/267#issuecomment-1134573291

Received on Monday, 23 May 2022 11:49:50 UTC