- From: guest271314 <notifications@github.com>
- Date: Fri, 30 Jun 2017 10:47:53 -0700
- To: whatwg/fetch <fetch@noreply.github.com>
- Cc: Subscribed <subscribed@noreply.github.com>
- Message-ID: <whatwg/fetch/issues/554/312331259@github.com>
Using `OfflineAudioContext` you can create an `AudioBuffer` containing only the audio data between a specific time range. This allows a sample of the audio to be played even when the requested range of playback does not begin at `0`.
```
async function audioSample({url, from, to, when = 0, channels = 2, sampleRate = 44100}) {
const duration = to - from;
const [ac, oac] = [new AudioContext(), new OfflineAudioContext(channels, sampleRate * duration, sampleRate)];
const response = await fetch(url);
const buffer = await response.arrayBuffer();
// decoding should only be necessary once to get an `AudioBuffer`
const data = await ac.decodeAudioData(buffer);
const source = oac.createBufferSource();
source.buffer = data;
source.connect(oac.destination);
// create an `AudioBuffer` of audio between `from` and `to`
source.start(when, from, duration);
const ab = await oac.startRendering();
return {ab, ac, oac};
}
// create an `AudioBuffer` of media content from 60 to 65 seconds
audioSample({url:"https://ia600305.us.archive.org/30/items/return_201605/return.mp3", from:60, to:65})
.then(({ab, ac, oac}) => {
console.log(ab, ac, oac);
const source = ac.createBufferSource();
source.buffer = ab;
source.connect(ac.destination);
// close `AudioContext`
// https://developer.mozilla.org/en-US/docs/Web/API/AudioContext/close
source.onended = event => ac.close().then(() => console.log(event, ac));
source.start();
});
```
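Note that the `OfflineAudioContext` length argument (`sampleRate * duration` above) is a count of sample frames, not seconds, so the seconds-based `from`/`to` range has to be converted to frames. A minimal sketch of that conversion, with clamping to the available duration; the helper name `rangeToFrames` and its clamping behavior are assumptions for illustration, not part of the Web Audio API or the snippet above:

```javascript
// Convert a [from, to] range in seconds to sample-frame offsets,
// clamping the range to [0, maxDuration].
// `rangeToFrames` is a hypothetical helper, not a Web Audio API method.
function rangeToFrames({from, to, sampleRate = 44100, maxDuration = Infinity}) {
  const start = Math.max(0, Math.min(from, maxDuration));
  const end = Math.max(start, Math.min(to, maxDuration));
  return {
    startFrame: Math.round(start * sampleRate),  // offset passed to `source.start()`
    frameCount: Math.round((end - start) * sampleRate)  // length for `OfflineAudioContext`
  };
}

// e.g. the 60–65 second slice used above, at 44100 Hz
rangeToFrames({from: 60, to: 65});
```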
[plnkr](http://plnkr.co/edit/QLZ2ju0DavdGNg2iOYof?p=preview)
https://github.com/whatwg/fetch/issues/554#issuecomment-312331259
Received on Friday, 30 June 2017 17:48:25 UTC