This is an archived snapshot of W3C's public bugzilla bug tracker, decommissioned in April 2019. Please see the home page for more details.
Audio-ISSUE-81 (OscillatorTypeModification): Oscillator type modification [Web Audio API]
http://www.w3.org/2011/audio/track/issues/81
Raised by: Philip Jägenstedt
On product: Web Audio API
https://dvcs.w3.org/hg/audio/raw-file/tip/webaudio/specification.html#Oscillator

This issue is related to https://www.w3.org/2011/audio/track/issues/28 but with the additional complication of internal state. May the oscillator type and wave table (for CUSTOM) be modified at any time, or should such modifications throw an exception if playbackState > UNSCHEDULED_STATE? It would certainly be simpler to implement if an Oscillator is "frozen" once created, but it's not clear if continuously modifying the waveform is desirable for some types of effects.
Yes, it should be possible to change the type or call setWaveTable() at any time.
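As a sketch of what "at any time" could look like in practice, the snippet below changes the type and installs a custom wave table on a live oscillator. It assumes this draft's setWaveTable()/createWaveTable() names (later drafts renamed them setPeriodicWave()/createPeriodicWave()); the coefficient helper and its 4/(pi*k) normalization are mine, not from the spec:

```javascript
// Hypothetical helper (not part of the spec): Fourier coefficients for a
// square-like custom waveform, using the common 4/(pi*k) series on odd
// harmonics. `n` is the number of coefficients, including the DC term.
function squareWaveCoefficients(n) {
  const real = new Float32Array(n); // all zero: a square wave is odd-symmetric
  const imag = new Float32Array(n);
  for (let k = 1; k < n; k++) {
    imag[k] = (k % 2 === 1) ? 4 / (Math.PI * k) : 0;
  }
  return { real, imag };
}

// Browser-only sketch, guarded so the pure helper above runs anywhere.
if (typeof AudioContext !== 'undefined') {
  const ctx = new AudioContext();
  const osc = ctx.createOscillator();
  osc.type = 'square'; // change the type on a live node...
  const { real, imag } = squareWaveCoefficients(16);
  // ...or install a custom table at any time (draft-era API names).
  osc.setWaveTable(ctx.createWaveTable(real, imag));
  osc.connect(ctx.destination);
}
```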
(In reply to comment #1) > Yes, it should be possible to change the type or call setWaveTable() at any > time. Should there be any smoothing or similar applied to the change (as specified for DelayNode, for instance)? Should there be a minimal delay between the change and when the effect takes place (e.g. if an Oscillator node is connected directly to the AudioDestination node, how long before the change becomes audible)?
(In reply to comment #2) > (In reply to comment #1) > > Yes, it should be possible to change the type or call setWaveTable() at any > > time. > > Should there be any smoothing or similar applied to the change (as specified > for DelayNode, for instance)? > > Should there be a minimal delay between the change and when the effect takes > place (e.g. if an Oscillator node is connected directly to the AudioDestination > node, how long before the change becomes audible)? I don't think it's essential for the change to occur with minimal delay. In other words, I don't think it's as critical for this change to happen as quickly as, say, a call to noteOn(0) (representing playing a sound immediately). I'm expecting the type will be changed as a result of some user action in the UI (changing an osc-type menu from square->sawtooth).
(In reply to comment #3) > I don't think it's essential for the change to occur with minimal delay. In > other words, I don't think it's as critical for this change to happen as > quickly as say a call to noteOn(0) (representing playing a sound immediately). > I'm expecting the type will be changed as a result of some user action in the > UI (changing osc-type menu from square->sawtooth). Since the interface isn't designed for (near) sample-accurate waveform switches (which would be required for doing things such as emulating some sounds from 8-bit systems, e.g. the rapidly changing waveform that can be heard in [1]), I think that an acceptable solution would be to move the type and WaveTable attributes to the constructor instead (and remove them from the interface). I can't think of a single application that would not work even with this restriction (either replace nodes when you change waveform, or use AudioGain to switch between multiple oscillators at precise times), and it sure would simplify a lot of things. Otherwise, I think that the specification must say something about the expected latency associated with changes to the type/WaveTable parameters. [1] http://www.youtube.com/watch?v=CT2GEVqsomQ&t=179
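The workaround suggested in comment 4 (gating several oscillators with gain automation to get precise switch times) might look roughly like this. The schedule helper is hypothetical, noteOn() was this draft's name for what later became start(), and a string-valued `type` is assumed for brevity:

```javascript
// Hypothetical helper (not in the spec): times at which to toggle between
// two oscillators; `active` is the index of the oscillator made audible.
function switchSchedule(startTime, interval, steps) {
  const events = [];
  for (let i = 0; i < steps; i++) {
    events.push({ time: startTime + i * interval, active: i % 2 });
  }
  return events;
}

// Browser-only sketch, guarded so the helper stays testable elsewhere.
if (typeof AudioContext !== 'undefined') {
  const ctx = new AudioContext();
  const gains = [ctx.createGain(), ctx.createGain()];
  const types = ['square', 'sawtooth'];
  gains.forEach((gain, i) => {
    const osc = ctx.createOscillator();
    osc.type = types[i];   // assuming string-valued type for brevity
    osc.connect(gain);
    gain.connect(ctx.destination);
    gain.gain.value = 0;   // both silent until scheduled
    osc.noteOn(0);         // this draft's name for start()
  });
  // Flip the two gains at exact times: sample-accurate waveform switching.
  for (const e of switchSchedule(ctx.currentTime + 0.1, 0.05, 20)) {
    gains[e.active].gain.setValueAtTime(1, e.time);
    gains[1 - e.active].gain.setValueAtTime(0, e.time);
  }
}
```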
(In reply to comment #4) > (In reply to comment #3) > > I don't think it's essential for the change to occur with minimal delay. In > > other words, I don't think it's as critical for this change to happen as > > quickly as say a call to noteOn(0) (representing playing a sound immediately). > > I'm expecting the type will be changed as a result of some user action in the > > UI (changing osc-type menu from square->sawtooth). > > Since the interface isn't designed for (near) sample-accurate waveform switches > (which would be required for doing things such as emulating some sounds from > 8-bit systems, e.g. the rapidly changing waveform that can be heard in [1]), I > think that an acceptable solution would be to move the type and WaveTable > attributes to the constructor instead (and remove them from the interface). I > can't think of a single application that would not work even with this > restriction (either replace nodes when you change waveform, or use AudioGain to > switch between multiple oscillators at precise times), and it sure would > simplify a lot of things. > > Otherwise, I think that the specification must say something about the expected > latency associated with changes to the type/WaveTable parameters. > > [1] http://www.youtube.com/watch?v=CT2GEVqsomQ&t=179 I think it's important to support changing the oscillator type and to be able to set the wavetable after creation, because these are features of the oscillator that are important to control. Even basic analog synthesizers allow this. An example use case changing oscillator type: http://chromium.googlecode.com/svn/trunk/samples/audio/oscillator.html An example use case calling setWaveTable(): http://chromium.googlecode.com/svn/trunk/samples/audio/tone-editor.html
(In reply to comment #5) > (In reply to comment #4) > > (In reply to comment #3) > > > I don't think it's essential for the change to occur with minimal delay. In > > > other words, I don't think it's as critical for this change to happen as > > > quickly as say a call to noteOn(0) (representing playing a sound immediately). > > > I'm expecting the type will be changed as a result of some user action in the > > > UI (changing osc-type menu from square->sawtooth). > > > > Since the interface isn't designed for (near) sample-accurate waveform switches > > (which would be required for doing things such as emulating some sounds from > > 8-bit systems, e.g. the rapidly changing waveform that can be heard in [1]), I > > think that an acceptable solution would be to move the type and WaveTable > > attributes to the constructor instead (and remove them from the interface). I > > can't think of a single application that would not work even with this > > restriction (either replace nodes when you change waveform, or use AudioGain to > > switch between multiple oscillators at precise times), and it sure would > > simplify a lot of things. > > > > Otherwise, I think that the specification must say something about the expected > > latency associated with changes to the type/WaveTable parameters. > > > > [1] http://www.youtube.com/watch?v=CT2GEVqsomQ&t=179 > > I think it's important to support changing the oscillator type and to be able > to set the wavetable after creation, because these are features of the > oscillator that are important to control. Even basic analog synthesizers allow > this. 
> > An example use case changing oscillator type: > http://chromium.googlecode.com/svn/trunk/samples/audio/oscillator.html > > An example use case calling setWaveTable(): > http://chromium.googlecode.com/svn/trunk/samples/audio/tone-editor.html Sorry, I forgot to mention that these two examples would need to be run in Chrome Canary or WebKit nightly, as the stable Chrome does not *quite* yet support this.
(In reply to comment #5) > I think it's important to support changing the oscillator type and to be able > to set the wavetable after creation, because these are features of the > oscillator that are important to control. Even basic analog synthesizers allow > this. I agree that it's a powerful tool to be able to change the waveform type and wavetable over time. For instance, you can use it to simulate real string instruments, and of course for doing funky synth effects (such as Fairlight CMI style changing-harmonics-over-time synthesis, as mentioned by Ray on the list). However, the problem right now is that the spec does not really say anything about timing accuracy. I'm not opposing having the attributes modifiable - I'm just saying that IF they are modifiable, we need to specify the behavior in more detail, including upper and lower limits on how long it may take for an attribute change to take effect, and possibly things such as the transition method (e.g. cross-fade or not) and internal phase preservation (or not). In other words, a Web developer must know what to expect, and similarly we must be able to write tests for this behavior. An easy way out here would be to make the attributes read-only, which IMO would actually make the interface more logical, especially w.r.t. the setWaveTable method & type attribute interaction. And I believe you could implement the mentioned examples anyway (just with a few more lines of code).
Do we still want to make the `type` attribute immutable? It may be too late for such a breaking change (something like what comment 4 suggests). If we agree on keeping the interface as-is, we need to be more precise about the behaviour a conforming implementation should adopt when mutating the `type` attribute. An author writing an application that requires a high degree of synchronization when changing waveforms would need to use something like GainNodes and the automation functions (setValueAtTime and friends) anyway, so very strict bounds might not matter much here. As comment 3 says, I also expect this to be used mainly in response to a non-time-critical action (say, a user changing the first oscillator in a multi-oscillator synth using a dropdown widget while designing a sound). In light of this use case for modifying the `type` attribute on a playing OscillatorNode, we could spec it this way:
- Considering that the node supports "custom" waveforms, possibly out of phase with the basic waveform types, we have two options: (1) the phase is preserved when switching between basic oscillator types (we need to spec coherent phase among the basic types for that; this is bug 17366), and switching to/from the "custom" type resets the phase; (2) switching the type resets the phase, leaving everything in the hands of the author. I'm in favor of adopting (2), for consistency.
- There should not be a crossfade when changing the type. If an author wants morphing between two or more waveforms, using a GainNode with multiple OscillatorNodes is easy enough.
- The lower bound for the type change to be reflected in the node output could be the next block; making this act like an a-rate parameter does not make much sense.
- I'm not sure what a realistic upper bound could be.
Depending on the generation method, switching oscillator types can be virtually free (changing the algorithm, when using time-domain generation) or more expensive (because an IFFT is needed, for example). In practice, considering the above, this probably does not matter too much.
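The two options in comment 8 can be illustrated with a small, hypothetical per-sample model. This is not the spec's rendering algorithm; the function name and waveform normalization are mine, purely to show how "preserve phase" and "reset phase" diverge at the switch point:

```javascript
// Hypothetical model: render `total` samples of an oscillator that switches
// from 'sine' to 'square' at sample index `switchIndex`. `resetPhase` picks
// option (2) (reset on type change) over option (1) (preserve the phase).
function renderWithSwitch(sampleRate, freq, switchIndex, total, resetPhase) {
  const out = [];
  let phase = 0; // normalized phase in [0, 1)
  let type = 'sine';
  for (let i = 0; i < total; i++) {
    if (i === switchIndex) {
      type = 'square';
      if (resetPhase) phase = 0; // option (2): reset on type change
    }
    out.push(type === 'sine'
      ? Math.sin(2 * Math.PI * phase)
      : (phase < 0.5 ? 1 : -1));
    phase = (phase + freq / sampleRate) % 1;
  }
  return out;
}
```

With the running phase at 0.75 when the switch happens, option (1) emits the square wave's low half (-1) while option (2) restarts at the high half (+1), which is exactly the author-visible difference being debated.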
(In reply to comment #8) > Do we still want to make the `type` oscillator types immutable? It may be > too late for such breaking change (something like what comment 4 suggest). I do not want to change types to be immutable, for the reasons Chris listed. > If we agree on keeping the interface as-is, we need to be more precise on > the behaviour a conforming implementation should adopt when mutating the > `type` attribute. Definitely agree. > In light of this use-case for modifying the `type` attribute on a playing > OscillatorNode, we could spec it this way: > - Considering the node support "custom" waveforms, possibly out of phase > with the basic waveform types, we have two options: > (1) The phase is preserved when switching between basic oscillator type > (we need to spec coherent phase among basic types for that, this is bug > 17366). Switching to/from "custom" type resets the phase ; > (2) Switching the type resets the phase, letting everything in the hands > of the author ; > > I'm in favor of adopting (2), for consistency. Consistency with what? The author doesn't have any phase control (other than doing the math, figuring out precisely the earliest point at which they can switch, and aligning phase that way). I would lean toward (1) for that reason. > - There should not be a crossfade when changing the type. If an author wants > morphing between two or more waveforms, using GainNode with multiple > OscillatorNode is easy enough ; +1. > - Lower bound for the type change to be reflected in the node output could > be the next block, making this act like an a-rate parameter does not make > much sense ; +1.
(In reply to comment #9) > > In light of this use-case for modifying the `type` attribute on a playing > > OscillatorNode, we could spec it this way: > > - Considering the node support "custom" waveforms, possibly out of phase > > with the basic waveform types, we have two options: > > (1) The phase is preserved when switching between basic oscillator type > > (we need to spec coherent phase among basic types for that, this is bug > > 17366). Switching to/from "custom" type resets the phase ; > > (2) Switching the type resets the phase, letting everything in the hands > > of the author ; > > > > I'm in favor of adopting (2), for consistency. > > Consistency with what? The author doesn't have any phase control (other > than doing the math, figuring out precisely when the earliest they can > switch, and aligning phase that way). I would lean toward (1) for that > reason. Consistency between switching between two basic oscillators and switching between a basic oscillator and a custom waveform (i.e. reset in both cases). I agree that (2) is probably okay as well, since it means less work for the author.
In the case of AudioBufferSourceNode, we let the buffer attribute be set during playback with no restrictions, and of course with no guarantees on when the "switch" happens. I don't see why we should strive to do something different here, therefore I think option (2) is probably better.
(In reply to comment #11) > In the case of AudioBufferSourceNode, we let the buffer attribute be set > during playback with no restrictions, and of course with no guarantees on when > the "switch" happens. I don't see why we should strive to do something > different here, therefore I think option (2) is probably better. This isn't the same case at all. In fact, if you change the buffer attribute, what happens? Does it reset? I would not expect it to. I tested, and Blink does in fact maintain phase across this change, and the more I've thought about it, the more I believe that is exactly the right thing to do. Otherwise, you have no idea what phase the oscillator is in anymore.
Web Audio API issues have been migrated to Github. See https://github.com/WebAudio/web-audio-api/issues
Closing. See https://github.com/WebAudio/web-audio-api/issues for up to date list of issues for the Web Audio API.