This project was inspired when I came across this GitHub repository, which implements a metronome in JavaScript. I immediately saw its potential for sequencing audio tracks so that they stay on beat with one another. I combined this with Three.js to visualize the music and create a more engaging experience.
The backbone of the visualizer is the Web Audio API's AnalyserNode. First, I created an AudioEngine class and initialized the required components.
```js
export function AudioEngine() {
  const parent = this; // captured for use in the callbacks below

  let audioLoader = new THREE.AudioLoader();
  this.ctx = new AudioContext();

  // Route all audio through a single analyser before the speakers.
  this.analyser = this.ctx.createAnalyser();
  this.analyser.connect(this.ctx.destination);

  this.loops = [];
  const audioRoot = './audio/';
  this.audioFiles = [
    'drumz.mp3',
    'bass.mp3',
    'more.mp3'
  ];
```
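Connecting the analyser straight to `ctx.destination` means each audio source only needs to connect to the analyser: the analyser passes the audio through to the speakers unchanged while exposing the fully mixed signal for the visualizer to sample.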
Then I created a function within the class to handle loading the audio files.
```js
async function loadAudio(file) {
  return new Promise(function(resolve, reject) {
    audioLoader.load(file,
      function(buffer) {
        // Wrap the decoded buffer in a Loop and register it with the engine.
        const audio = new Loop(parent.ctx, buffer, parent.analyser);
        console.log('loaded file: ' + file);
        parent.loops.push({ name: file, audio: audio, active: false });
        resolve();
      },
      function(xhr) {
        console.log('loading: ' + file + ' ' + (xhr.loaded / xhr.total) * 100 + '%');
      },
      function(err) {
        console.log('failed to load file: ' + file);
        console.log(err);
        reject(err);
      });
  });
}
```
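The `Loop` class comes from the metronome code and isn't shown here. For `loadAudio` and the scheduling below to work, it needs to expose `isPlaying`, `play(time)`, and `stop()`. A minimal sketch of what it might look like, assuming it wraps an `AudioBufferSourceNode` and routes it through the shared analyser; the details are my guess, not the repository's actual implementation:

```js
// Hypothetical sketch of the Loop wrapper assumed by loadAudio() above.
function Loop(ctx, buffer, analyser) {
  this.isPlaying = false;

  this.play = function(time) {
    // A buffer source can only be started once, so create a fresh one per play.
    this.source = ctx.createBufferSource();
    this.source.buffer = buffer;
    this.source.loop = true;
    this.source.connect(analyser); // analyser is already wired to ctx.destination
    this.source.start(time);
    this.isPlaying = true;
  };

  this.stop = function() {
    if (this.source) this.source.stop();
    this.isPlaying = false;
  };
}
```

With that in place, the constructor can kick off loading everything at once, e.g. `Promise.all(this.audioFiles.map(f => loadAudio(audioRoot + f)))`.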
From there, I also added a function to the audio engine to handle starting and stopping loops.
```js
this.scheduleLoops = function(time) {
  for (const loop of parent.loops) {
    // Start loops that have been toggled on but aren't playing yet.
    if (loop.active && !loop.audio.isPlaying) {
      loop.audio.play(time);
    }
    // Stop loops that have been toggled off.
    if (!loop.active && loop.audio.isPlaying) {
      loop.audio.stop();
    }
  }
};
```
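`scheduleLoops` takes a `time` argument so that loops can be queued slightly in the future and land exactly on a beat. Driven by a metronome-style lookahead scheduler, the call might look like the sketch below; the tempo, lookahead interval, and bar bookkeeping are illustrative assumptions, not the metronome repository's actual code:

```js
// Hypothetical driver: poll ahead of the audio clock and schedule loops
// to start or stop exactly on the next bar boundary.
const engine = new AudioEngine();
const lookahead = 0.1;            // seconds to schedule ahead of currentTime
const barLength = (60 / 120) * 4; // 4 beats at an assumed 120 BPM
let nextBarTime = engine.ctx.currentTime;

setInterval(function() {
  while (nextBarTime < engine.ctx.currentTime + lookahead) {
    engine.scheduleLoops(nextBarTime);
    nextBarTime += barLength;
  }
}, 25);
```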
From this point, it was simply a matter of integrating the audio into the visualizer class and arranging the rest of the presentation.
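To give a flavor of that integration: a typical approach, and the one sketched below assuming a Three.js `scene`, `camera`, `renderer`, and `mesh` already exist, is to sample the analyser every frame and map the data onto the scene. The averaging and scaling here are placeholders, not the project's actual presentation:

```js
// Hypothetical render loop: sample the analyser each frame and use the
// average frequency level to scale a mesh.
const freqData = new Uint8Array(engine.analyser.frequencyBinCount);

function animate() {
  requestAnimationFrame(animate);

  engine.analyser.getByteFrequencyData(freqData);
  const avg = freqData.reduce((sum, v) => sum + v, 0) / freqData.length;

  mesh.scale.setScalar(1 + avg / 255); // pulse with the overall level
  renderer.render(scene, camera);
}
animate();
```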