Audio-ISSUE-55 (HTMLMediaElementIntegration): HTMLMediaElement integration [Web Audio API]
http://www.w3.org/2011/audio/track/issues/55

Raised by: Philip Jägenstedt
On product: Web Audio API

https://dvcs.w3.org/hg/audio/raw-file/tip/webaudio/specification.html#MediaElementAudioSourceNode
https://dvcs.w3.org/hg/audio/raw-file/tip/webaudio/specification.html#AudioElementIntegration

The section "Integration with the audio and video elements" should be merged into the definition of MediaElementAudioSourceNode. Unfortunately, the combination of the two still leaves the behavior completely undefined: only an example is given, with no normative requirements for implementations.

What happens when an HTMLMediaElement:

* has readyState HAVE_NOTHING?
* is paused?
* is seeking?
* has no audio channels?
* switches the active audio track using the AudioTrack.enabled interface?
* is muted?
* has volume < 1?
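For context, the construction the questions refer to looks roughly like the following. This is a minimal sketch of the API usage under discussion, not text from the spec; the element id "player" and the gain value are illustrative assumptions.

```ts
// Route an HTMLMediaElement's audio output into the Web Audio graph.
const ctx = new AudioContext();
const el = document.getElementById("player") as HTMLMediaElement;

// The node whose behavior the issue says is left undefined.
const source = ctx.createMediaElementSource(el);

// Some illustrative downstream processing.
const gain = ctx.createGain();
gain.gain.value = 0.5;
source.connect(gain);
gain.connect(ctx.destination);

// The questions above ask what "source" outputs while the element is in
// states such as el.readyState === HTMLMediaElement.HAVE_NOTHING, el.paused,
// el.seeking, el.muted, or el.volume < 1.
```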
Fixed: https://dvcs.w3.org/hg/audio/rev/9224fb26e77d
Looks like "must behave in an identical fashion after the MediaElementAudioSourceNode has been created" answers all of the questions. I note that it implies that when two audio tracks are enabled at the same time, they are mixed before entering the Web Audio API, so there is no way to process them separately.
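A hedged sketch of that limitation, assuming an element that exposes multiple audio tracks (HTMLMediaElement.audioTracks is not widely implemented and is absent from standard TypeScript DOM typings, hence the cast): enabling two tracks still yields a single, already-mixed stream at the one MediaElementAudioSourceNode.

```ts
const ctx = new AudioContext();
const video = document.querySelector("video") as HTMLVideoElement;

// AudioTrackList, where supported; both enabled tracks feed the element's output.
const tracks = (video as any).audioTracks;
if (tracks && tracks.length > 1) {
  tracks[0].enabled = true;
  tracks[1].enabled = true;
}

// Only one source node is available per element; it carries the mixed result
// of all enabled tracks, so they cannot be routed to separate processing chains.
const mixed = ctx.createMediaElementSource(video);
mixed.connect(ctx.destination);
```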
Closing. See https://github.com/WebAudio/web-audio-api/issues for an up-to-date list of issues for the Web Audio API.