Here's another example of code that is going to have problems with the
WebKit/Blink race-prone model:
var audioBuffer =
    audioContext.createBuffer(1, 10000, audioContext.sampleRate);
for (var i = 0; i < 10000; ++i) {
  audioBuffer.getChannelData(0)[i] = ...;
}
var audioBufferSourceNode = audioContext.createBufferSource();
audioBufferSourceNode.buffer = audioBuffer;
audioBufferSourceNode.start();
// ... wait for some event to fire ...
for (var i = 0; i < 10000; ++i) {
  audioBuffer.getChannelData(0)[i] = ...;
}
var audioBufferSourceNode2 = audioContext.createBufferSource();
audioBufferSourceNode2.buffer = audioBuffer;
audioBufferSourceNode2.start(audioContext.currentTime + 2);
As I understand it, in WebKit/Blink it's possible for the second set of
modifications to audioBuffer to corrupt the sound being played by
audioBufferSourceNode. Whether that actually happens could depend on the
timing of the event, how much buffering the Web Audio implementation
uses, how quickly the shared-memory system propagates writes between
cores, and other implementation details.
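The aliasing at the heart of this is easy to see with plain typed
arrays. A minimal sketch (my illustration, with bare Float32Arrays
standing in for channel data, since in this model getChannelData
returns a live view of the buffer's storage rather than a copy):

```javascript
// Two references to the same Float32Array alias each other: a write
// through one is visible through the other. This is the situation the
// WebKit/Blink model permits between the main thread and the audio
// thread (modulo memory-ordering details); a snapshot copy is immune.
var channelData = new Float32Array(4);
var aliased = channelData;          // shares the same storage
var snapshot = channelData.slice(); // independent copy

channelData[0] = 1.0;
console.log(aliased[0]);   // 1 -- the write is visible through the alias
console.log(snapshot[0]);  // 0 -- the copy is unaffected
```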
It's also worth considering what behaviors the spec should allow for code
like this. Should the spec allow implementations to not corrupt
audioBufferSourceNode, even if the delay between start() calls is
arbitrarily small? This code would be 100% reliable in Gecko, and it's easy
to see how authors could come to rely on that.
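For what it's worth, whatever the spec ends up saying, authors can
sidestep the hazard by never mutating a buffer that a live source node
might still be reading, e.g. by giving each node its own copy of the
data. A sketch of that pattern (playCopy is a hypothetical helper of
mine, not anything in the spec):

```javascript
// Hypothetical helper: play `samples` via a freshly allocated
// AudioBuffer, so later writes to `samples` cannot race with this
// node's playback even under the WebKit/Blink shared-memory model.
function playCopy(audioContext, samples, when) {
  var buffer = audioContext.createBuffer(1, samples.length,
                                         audioContext.sampleRate);
  // set() copies the samples into the buffer's own storage; the node
  // never aliases the caller's array.
  buffer.getChannelData(0).set(samples);
  var node = audioContext.createBufferSource();
  node.buffer = buffer;
  node.start(when || 0);
  return node;
}
```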
Rob