BACKGROUND
My goal is to create a JavaScript-based web app to analyse and display frequency information in audio sources, both in-page sources (<audio> tag) and signals streamed from the client's microphone. I am well on my way :)
As a keen saxophonist, one of my goals is to compare the information inherent in the tone of different saxophonists and instruments by examining the distribution of upper partials in relation to a fundamental pitch. In short, I want to derive a representation of why different instrumentalists and instrument brands sound different even when playing the same pitch. Additionally, I want to compare the tuning and frequency distribution of various 'alternative fingerings' against traditional or standard fingerings by the same player/instrument.
Accessing and displaying frequency information is a fairly trivial matter using the Web Audio API's AnalyserNode (created via AudioContext.createAnalyser()), which I am using in conjunction with the HTML5 Canvas element to create a frequency map or 'winamp-style bargraph' similar to the one found in 'Visualizations with Web Audio API' on MDN.
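The painting itself follows the standard pattern: copy the byte frequency data once per animation frame and draw one bar per bin. A minimal sketch of the sort of loop I'm using follows (the canvas id 'spectrum' and the 'analyser' variable are placeholders, not my actual code):

// Minimal sketch of the per-frame bargraph painting (placeholder names).
// 'analyser' is an AnalyserNode already connected to an audio source.
var canvas = document.getElementById('spectrum'),
    ctx2d  = canvas.getContext('2d'),
    bins   = new Uint8Array(analyser.frequencyBinCount);

function paint() {
    analyser.getByteFrequencyData(bins);             // fill 'bins' with 0-255 magnitudes
    ctx2d.clearRect(0, 0, canvas.width, canvas.height);
    var barWidth = canvas.width / bins.length;
    for (var i = 0; i < bins.length; i++) {
        var barHeight = (bins[i] / 255) * canvas.height;
        ctx2d.fillRect(i * barWidth, canvas.height - barHeight, barWidth, barHeight);
    }
    requestAnimationFrame(paint);
}
requestAnimationFrame(paint);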
PROBLEM
In order to achieve my goal I need to identify some particular information in the audio source, specifically the frequency in Hertz of the fundamental tone (for direct comparison between instrumentalists/instruments) and the frequency range of the source (to identify the frequency spectrum of the sounds I'm interested in). That information is to be found in the variable fData below...
// example...
var APP = function () {
    // ...select source and initialise etc..
    var aCTX = new AudioContext(),      // audio graph context
        ANAL = aCTX.createAnalyser(),   // AnalyserNode supplying the FFT data
        rANF = requestAnimationFrame,
        ucID = null;

    ANAL.fftSize = 2048;                // => frequencyBinCount of 1024

    function audioSourceStream(stream) {
        var source = aCTX.createMediaStreamSource(stream);
        source.connect(ANAL);

        var fData = new Uint8Array(ANAL.frequencyBinCount);

        (function updateCanvas() {
            ANAL.getByteFrequencyData(fData);   // refresh fData each frame
            // using 'fData' to paint HTML5 Canvas
            ucID = rANF(updateCanvas);
        }());
    }
};
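For reference, the runtime values I have to work with (inside audioSourceStream, where aCTX and ANAL are in scope) are all standard Web Audio properties; I suspect some combination of these is what the mapping depends on:

// Standard AudioContext / AnalyserNode properties, logged only for reference.
console.log(aCTX.sampleRate);            // sample rate in Hz, e.g. 44100 or 48000
console.log(ANAL.fftSize);               // 2048, as set above
console.log(ANAL.frequencyBinCount);     // fftSize / 2 = 1024 = fData.length
console.log(ANAL.minDecibels, ANAL.maxDecibels); // dB range that the byte values 0-255 are scaled from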
ISSUES
While I can easily represent fData as a bar- or line-graph, etc., via the <canvas> API, such that the fundamental and upper partials of a sound source are clearly visible, so far I have not been able to determine...
- The frequency range of fData (min-max Hz)
- The frequency (Hz) of each value in fData
Without this I cannot begin to identify the dominant frequency of the source (in order to compare variations in tuning against traditional musical pitch names) and/or highlight or exclude regions of the represented spectrum (zooming in or out, etc.) for more detailed examination.
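My best guess so far, pieced together from the spec, is that the bins span 0 Hz up to half the context's sample rate (the Nyquist frequency), so that bin i corresponds to roughly i * sampleRate / fftSize Hz, but I have not been able to confirm that this is correct. A sketch of what I think the mapping would look like (using aCTX and ANAL from the snippet above):

// Sketch of the mapping I *think* applies (please correct me if this is wrong):
// - getByteFrequencyData() fills fData with ANAL.frequencyBinCount values,
//   covering 0 Hz up to the Nyquist frequency (aCTX.sampleRate / 2).
// - So each bin would be (aCTX.sampleRate / ANAL.fftSize) Hz wide, and the
//   frequency of bin i would be roughly i * aCTX.sampleRate / ANAL.fftSize.
var nyquist  = aCTX.sampleRate / 2,            // e.g. 22050 Hz at a 44100 Hz sample rate
    binWidth = aCTX.sampleRate / ANAL.fftSize; // e.g. ~21.5 Hz per bin with fftSize 2048

function binToHz(i) {
    return i * binWidth;
}

// Dominant frequency: index of the loudest bin, converted to Hz.
function dominantHz(fData) {
    var maxVal = -1, maxIdx = 0;
    for (var i = 0; i < fData.length; i++) {
        if (fData[i] > maxVal) { maxVal = fData[i]; maxIdx = i; }
    }
    return binToHz(maxIdx);
}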
My intention is to prominently display the dominant frequency by pitch (note name) and frequency (Hz), and to display the frequency of any individual bar in the graph on mouseover. N.B. I already have a data object in which the frequencies (Hz) of all the chromatic pitches from C0 to B8 are stored, so the note-name lookup itself is covered (see the sketch below).
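Assuming that pitch table is an object mapping note names to frequencies (the name pitchTable below is just a placeholder for my actual object), finding the nearest chromatic pitch to a measured frequency is a simple nearest-match search:

// Hypothetical shape of my pitch table (the real object covers C0-B8):
// var pitchTable = { 'A4': 440.0, 'A#4': 466.16, /* ... */ };

// Return the note name whose stored frequency is closest to 'hz'.
function nearestPitch(hz, pitchTable) {
    var bestName = null, bestDiff = Infinity;
    for (var name in pitchTable) {
        var diff = Math.abs(pitchTable[name] - hz);
        if (diff < bestDiff) { bestDiff = diff; bestName = name; }
    }
    return bestName;
}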
Despite reading the AnalyserNode section of the Web Audio API specification several times, and virtually every page on this site and MDN about this subject, I still have no firm idea how to accomplish this portion of my task.
Basically, how does one go about turning the values in the Uint8Array fData into a representation of the amplitude at each frequency (in Hertz) that the fData array elements reflect?
Any advice, suggestions, or encouragement would be greatly appreciated.
BP