Low-Latency f0 Estimation for the Finger Plucked Electric Bass Guitar Using the Absolute Difference Function


  • Christhian Henrique Gomes Fonseca School of Electrical and Computer Engineering, University of Campinas
  • Tiago Tavares School of Electrical and Computer Engineering (FEEC), University of Campinas, Brazil




f0 estimation, low latency, audio-to-MIDI converter, music information retrieval, MIDI bass


Audio-to-MIDI conversion can be used to allow digital musical control through an analog instrument. Audio-to-MIDI converters rely on fundamental frequency estimators that are usually restricted to a minimum delay of two fundamental periods. This delay is perceptible in the case of bass notes. In this paper, we propose a low-latency fundamental frequency estimation method that relies on specific characteristics of the electric bass guitar. By means of physical modeling and signal acquisition, we show that the assumptions underlying this method generalize across electric basses. We evaluated our method on a dataset of musical notes played by diverse bassists. Results show that our method outperforms the Yin method in low-latency settings, which indicates its suitability for low-latency audio-to-MIDI conversion of the electric bass sound.
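To illustrate the baseline the abstract refers to, the following is a minimal sketch (not the authors' method) of f0 estimation via the absolute difference function, also known as the average magnitude difference function (AMDF): the lag that minimizes the mean absolute difference between a frame and its shifted copy is taken as the period estimate. Note that this generic formulation needs roughly two fundamental periods of signal, which is exactly the latency the paper seeks to reduce. All names and parameter values below are illustrative assumptions.

```python
import numpy as np

def amdf_f0(x, sr, fmin=30.0, fmax=400.0):
    """Estimate f0 as the lag minimizing the average magnitude
    (absolute) difference function over a range of candidate lags."""
    lag_min = int(sr / fmax)          # shortest period considered
    lag_max = int(sr / fmin)          # longest period considered
    n = len(x) - lag_max              # samples compared at every lag
    amdf = np.array([
        np.mean(np.abs(x[:n] - x[tau:tau + n]))
        for tau in range(lag_min, lag_max + 1)
    ])
    best_lag = lag_min + int(np.argmin(amdf))
    return sr / best_lag

# Synthetic "low E" bass tone (41.2 Hz) with one extra harmonic.
sr = 44100
t = np.arange(int(0.1 * sr)) / sr
x = np.sin(2 * np.pi * 41.2 * t) + 0.5 * np.sin(2 * np.pi * 82.4 * t)
f0 = amdf_f0(x, sr)
```

On this synthetic tone the estimate lands within about 1 Hz of the true 41.2 Hz, but note that the frame plus maximum lag span well over two periods of the lowest note, illustrating the latency floor discussed above.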



Author Biography

Christhian Henrique Gomes Fonseca, School of Electrical and Computer Engineering, University of Campinas

School of Electrical and Computer Engineering (FEEC), University of Campinas, Brazil

Interdisciplinary Nucleus for Sound Studies (NICS), University of Campinas, Brazil


GIBSON, J.; WARREN, A. The MIDI Standard. [S.l.]: http://www.indiana.edu/~emusic/361/midi.htm, accessed 05/09/2019.

DERRIEN, O. A very low latency pitch tracker for audio to MIDI conversion. 17th International Conference on Digital Audio Effects (DAFx-14), 2014.

KLAPURI, A. P. Multiple fundamental frequency estimation based on harmonicity and spectral smoothness. IEEE Transactions on Speech and Audio Processing, 2003.

RABINER, L. On the use of autocorrelation analysis for pitch detection. IEEE Transactions on Acoustics, Speech, and Signal Processing, v. 25, n. 1, p. 24-33, February 1977.

CHEVEIGNÉ, A. de; KAWAHARA, H. YIN, a fundamental frequency estimator for speech and music. The Journal of the Acoustical Society of America, v. 111, n. 4, p. 1917-1930, 2002.

HELLER, E. J. Why You Hear What You Hear. [S.l.]: Princeton University Press, 2012. (Chapter 23; pp. 437-504).

OXENHAM, A. J. Pitch perception. Journal of Neuroscience, v. 32, n. 39, p. 13335-13338, 26 September 2012.

CARIANI, P. A.; DELGUTTE, B. Neural correlates of the pitch of complex tones. I. Pitch and pitch salience. J. Neurophysiol. 76, 1996.

TERHARDT, E. Pitch, consonance and harmony. J. Acoust. Soc. Am. 55, 1974.

FORNARI, J. Percepção, Cognição e Afeto Musical. [S.l.: s.n.], 2010.

ISO 16:1975, Acoustics. Standard tuning frequency. [S.l.]: International Organization for Standardization, 1975.

IAZZETTA, F. Tutoriais de Áudio e Acústica. [S.l.]: http://www2.eca.usp.br/prof/iazzetta/tutor/acustica, accessed 04/25/2019.

NOLL, A. M. Pitch determination of human speech by the harmonic product spectrum, the harmonic sum spectrum, and a maximum likelihood estimate. Symposium on Computer Processing in Communication, University of Brooklyn Press, New York, v. 19, p. 779-797, 1970. Available at: https://ci.nii.ac.jp/naid/10000045637/en/.

OPPENHEIM, A.; SCHAFER, R. Discrete-Time Signal Processing. [S.l.]: Prentice Hall, 1999.

SINGH, C. P.; KUMAR, T. K. Efficient pitch detection algorithms for pitched musical instrument sounds: A comparative performance evaluation. In: 2014 International Conference on Advances in Computing, Communications and Informatics (ICACCI). [S.l.: s.n.], 2014. p. 1876-1880.

GERHARD, D. Pitch extraction and fundamental frequency: History and current techniques. Technical Report TR-CS 2003-06, 2003.

KNESEBECK, A.; ZÖLZER, U. Comparison of pitch trackers for real-time guitar effects. 13th International Conference on Digital Audio Effects (DAFx-10), 2010.

RESEARCHGATE. YIN, a fundamental frequency estimator for speech and music. [S.l.]: https://www.researchgate.net/publication/11367890, accessed 06/02/2020.

GREEF, W. The influence of perception latency on the quality of musical performance during a simulated delay scenario. University of Pretoria, Department of Music, 2016.

WANG, Y. Low latency audio processing. Queen Mary University of London, School of Electronic Engineering and Computer Science, 2017.

LESTER, M.; BOLEY, J. The effects of latency on live sound monitoring. Journal of the Audio Engineering Society, 2007.

JAIN, M. Numerical Methods for Scientific and Engineering Computation. 1st ed. [S.l.]: New Age International, 2003. 844 p.

LANGTANGEN, H. Finite Difference Methods for Wave Motion. Preliminary version. [S.l.]: Department of Informatics, University of Oslo, 2016.

JANSSON, E. Acoustics for Violin and Guitar Makers. 4th ed. [S.l.]: http://www.speech.kth.se/music/acviguit4/, 2002.

PORCIDES, C.; TAVARES, L. Resultados preliminares de um estudo comparativo de métodos de detecção de onsets em sinais de áudio. Anais do Simpósio de Processamento de Sinais da UNICAMP, v. 1, 2014.

GUYOT, P. Fast Python implementation of the YIN algorithm. http://doi.org/10.5281/zenodo.1220947, 2018. Accessed 01/02/2018.

FREEPATS. Sound Banks. [S.l.]: http://freepats.zenvoid.org/index.html, accessed 09/02/2020.




How to Cite

Fonseca, C. H. G., & Tavares, T. (2020). Low-Latency f0 Estimation for the Finger Plucked Electric Bass Guitar Using the Absolute Difference Function. Revista De Informática Teórica E Aplicada, 27(4), 79–94. https://doi.org/10.22456/2175-2745.103182



Selected Papers - SBCM 2019