- From: Jussi Kalliokoski <notifications@github.com>
- Date: Mon, 26 Aug 2013 11:38:53 -0700
- To: WebAudio/web-midi-api <web-midi-api@noreply.github.com>
- Message-ID: <WebAudio/web-midi-api/issues/45/23284997@github.com>
I agree about the latency; we need to take it into account. Some use cases:
* Virtual instrument on a web page. Has inherent latency from the audio graph that the host needs to take into account. Needs a way to set the latency of its virtual input.
* Web page that does sequencing. Needs to take the latency of an external instrument into account. Needs a way to read the latency of its output.
* A web page that takes a guitar input and converts it to MIDI. Has latency in the A/D conversion and pitch detection and needs to convey that to the consumer. Needs a way to set the latency of its virtual output.
* Notation software. Needs to be able to sync various MIDI sources in order to get a synced composition. Needs a way to read the latency of its inputs.
So basically, I think what we need is a way to read the latency of normal ports (reporting 0 if it's not available) and a way to set the latency of virtual ports, e.g.
```webidl
partial interface MIDIInput {
  readonly attribute double latency;
};

partial interface MIDIOutput {
  readonly attribute double latency;
};

partial interface VirtualMIDIInput {
  attribute double latency;
};

partial interface VirtualMIDIOutput {
  attribute double latency;
};
```
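To make that concrete, here's a rough usage sketch of the sequencing case (TypeScript against the current API surface). The `latency` attribute, its unit (seconds here), and `VirtualMIDIInput` are only what's being proposed above, not anything that's specced or shipping:

```typescript
// Sketch: a sequencing page compensating for output latency, assuming the
// proposed read-only `latency` attribute on MIDIOutput, expressed in seconds.

async function sendNoteOnAt(timeMs: number, note: number, velocity: number) {
  const access = await navigator.requestMIDIAccess();
  const output = Array.from(access.outputs.values())[0];
  if (!output) return;

  // Read the reported output latency (a port that can't report it gives 0)
  // and schedule the message early so it actually sounds at `timeMs`.
  const latencySeconds = (output as any).latency ?? 0; // proposed attribute
  output.send([0x90, note & 0x7f, velocity & 0x7f], timeMs - latencySeconds * 1000);
}

// A virtual instrument page (the proposed VirtualMIDIInput) would do the
// opposite and *write* its latency, e.g. the latency of its audio graph:
//
//   virtualInput.latency = audioGraphLatencySeconds;
```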
---
Reply to this email directly or view it on GitHub:
https://github.com/WebAudio/web-midi-api/issues/45#issuecomment-23284997
Received on Thursday, 29 August 2013 03:09:06 UTC