Wiring up WebAudio with WebVR

UPDATE

THREE.js developer Mr.doob has posted an important comment on this.

ORIGINAL ARTICLE

WebVR matters. And the great WebVR Boilerplate by Boris Smus lets you get started with it immediately.

Playing around with it, I got the idea of using the Web Audio API to spatialize the sound of an object within the scene, so that a person wearing headphones could not only see, but also hear where an object is located.

Since the Web Audio API is great, you can do that with ease.

Getting started

Download the WebVR Boilerplate and open index.html. In there we create a simple THREE.js sphere.

// Create sphere
var geometry = new THREE.SphereGeometry(0.4, 32, 32);
var material = new THREE.MeshNormalMaterial();
var sphere = new THREE.Mesh(geometry, material);

// Add sphere mesh to your three.js scene
scene.add(sphere);

Then comes some audio magic. Create an AudioContext, a BufferSource and a panner node. Load a sound into the buffer source and connect the nodes. Notice that we use a panner node that applies head-related transfer functions to the signal: HRTF gives us a much higher-quality spatialization algorithm than the panner node's default "equalpower" setting.

// Create audio context
var context = new AudioContext();
var source = context.createBufferSource();
var panner = context.createPanner();
// Important to set the panningModel to high-quality HRTF
panner.panningModel = "HRTF";

// Fetch the sound that the object emits, attach it to the source node and start playing
fetch('grow_beeps_sample.wav')
.then(function(response){
    return response.arrayBuffer();
})
.then(function(arrayBuffer){
    context.decodeAudioData(arrayBuffer, function(audioBuffer){
        source.buffer = audioBuffer;
        source.loop = true;
        source.start();
    });
});

source.connect(panner);
panner.connect(context.destination);

The Animation Loop

The animation loop is a function that is executed once per frame, typically 60 times a second. Here is where we can apply movement to objects.

Here, we are going to use Web Audio's AudioListener interface. It represents the position and orientation of the person listening to the audio scene. We do not need to set the AudioListener's position: by default it is (0|0|0), which is the same as the camera's position.
But we do need to set the AudioListener's orientation to match the camera's. For that, we need both the camera's front and top vectors to describe its orientation completely.
If we only had the camera's direction, i.e. its front vector, we would not notice when it rotates around its own Z axis. This is equivalent to rolling your head to the left or to the right: after rolling, you would still look in the same direction, but your ears would have moved to another position.

// Request animation frame loop function
function animate(timestamp) {
    
    timestamp = timestamp || 0;

    // calculate the current position of the circling sphere with sine and cosine functions
    var posX = 3 * Math.cos(Math.PI + timestamp/1500);
    var posZ = 3 * Math.sin(Math.PI + timestamp/1500);

    // set current X and Z location of the sphere
    sphere.position.x = posX;
    sphere.position.z = posZ;

    // get the camera's front vector, i. e. its world direction
    var cameraDirection = camera.getWorldDirection();

    // apply some three.js trick to obtain the camera's top vector
    var cameraTop = new THREE.Vector3(0,1,0).applyEuler(camera.getWorldRotation());

    // set the listener's orientation with front and top vector
    context.listener.setOrientation(cameraDirection.x, cameraDirection.y, cameraDirection.z, cameraTop.x, cameraTop.y, cameraTop.z);

    // set the position of the panner. it is equal to the sphere's position
    panner.setPosition(posX, 0, posZ);

    // Update VR headset position and apply to camera.
    controls.update();

    // Render the scene through the manager.
    manager.render(scene, camera, timestamp);

    requestAnimationFrame(animate);
}

That was all. There is much more fun you can have with the PannerNode. For example, you can set how quickly the object's volume falls off as it moves away from the listener. You can also define a cone outside of which the volume is reduced. Have a look at the Web Audio spec.
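For instance, a minimal sketch of tuning the distance and cone behaviour could look like the following; the concrete values are just examples, not taken from the demo:

// Choose how the volume falls off with distance
panner.distanceModel = "inverse";   // "linear", "inverse" or "exponential"
panner.refDistance = 1;             // distance up to which no attenuation is applied
panner.rolloffFactor = 2;           // how quickly the volume drops beyond refDistance

// Define a sound cone: full volume inside the inner cone,
// only coneOuterGain outside the outer cone
panner.coneInnerAngle = 60;
panner.coneOuterAngle = 180;
panner.coneOuterGain = 0.2;

// The cone points along the panner's orientation vector
panner.setOrientation(1, 0, 0);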

(Figure: coordinate system)

Demo

Be aware that the binaural effect can only work when wearing stereo headphones.

https://webaudiotech.com/sites/webaudio_webvr

When not wearing an HMD, drag with the mouse to move your head.

WebAudio spec is due to change

Be aware that the Web Audio specification is not finished yet. The interfaces described here may change, but they currently work in Chrome.
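As an example of such a change, newer drafts of the spec expose the listener's and the panner's position and orientation as AudioParams instead of the setOrientation()/setPosition() methods used above. Where a browser supports them, the update inside animate() could be written roughly like this (just a sketch, check support before relying on it):

// AudioParam-based way of setting the listener orientation
context.listener.forwardX.value = cameraDirection.x;
context.listener.forwardY.value = cameraDirection.y;
context.listener.forwardZ.value = cameraDirection.z;
context.listener.upX.value = cameraTop.x;
context.listener.upY.value = cameraTop.y;
context.listener.upZ.value = cameraTop.z;

// ... and of setting the panner position
panner.positionX.value = posX;
panner.positionY.value = 0;
panner.positionZ.value = posZ;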

More on this

Boris Smus has created a demo that includes a ConvolverNode, which adds reverb after the PannerNode has spatialized the signal. There is also a list with further resources on this topic in the blog post.
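The basic idea is to chain a ConvolverNode, loaded with an impulse response, between the panner and the destination. This is not the code from that demo, just a minimal sketch; 'impulse_response.wav' is a placeholder file name:

// Create a convolver and load an impulse response into it
var convolver = context.createConvolver();

fetch('impulse_response.wav')
.then(function(response){
    return response.arrayBuffer();
})
.then(function(arrayBuffer){
    context.decodeAudioData(arrayBuffer, function(audioBuffer){
        convolver.buffer = audioBuffer;
    });
});

// Insert the convolver between panner and destination
panner.disconnect();
panner.connect(convolver);
convolver.connect(context.destination);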

4 thoughts on “Wiring up WebAudio with WebVR”

  1. On the trick to get the top vector of the camera:
    I deduced this by trial and error.

    The euler angles returned by getWorldRotation() describe by what amount and in which directions an object has rotated.
    However, they do not include a vector which represents the starting point or a vector representing the end point of this rotation process.

    But what we want is the end point of the rotation process (in world space), because this is the camera’s new top vector (in world space).
    To obtain it, we must apply a rotation by these euler angles to the original top vector of the camera.
    We know that this vector is (0|1|0).

    Unfortunately, there is no documentation for the getWorldRotation() method on threejs.org yet, neither on the Object3D class nor on the Camera class.

  2. Is it possible to muffle the sound to zero when the sphere is behind us? Unfortunately, the use of “coneInnerAngle” does not have any effect.
