- From: Domenic Denicola via GitHub <sysbot+gh@w3.org>
- Date: Tue, 02 Jun 2015 18:13:44 +0000
- To: public-device-apis@w3.org
> No, at the lowest level they may be polling, but it is invisible to the user at most systems, therefore it is seen as push.

Right, so it becomes a question of whether we want to expose low-level primitives or higher-level abstractions. I think the lower-level primitives make a lot of sense here (in many cases), because they allow more use cases. For example, note how games use polling of mouse/keyboard and gamepad buttons to achieve lower latency. (I've actually gotten many complaints about the event-based mousemove API from game developers at conferences.) Polling also allows more conservative or on-demand sampling, instead of sampling at a specific predefined frequency. Given that these calls will involve IPC to some degree, that can be valuable.

That said,

> Based on my experience with JavaScript programs (running in node.js on a variety of platforms) that either read sysfs GPIO directly or via some compiled native bindings, delivering an indication of value change or value read by emitting an event has been both extremely efficient. It's also very intuitive for those writing the programs.
>
> This is actually my exact experience with anyone that's ever used JavaScript to interact with sensory hardware.

from @rwaldron is good in-the-field experience, and we should heed it.

> Yes, ideally a standardized EventEmitter or similar would be preferable.

Key word "ideally" :). The main thing I want to communicate in this thread is: don't block your spec on other specs.

--
GitHub Notification of comment by domenic
See https://github.com/w3c/sensors/issues/21#issuecomment-108038846
Received on Tuesday, 2 June 2015 18:13:45 UTC