A music score can go through many different versions. First the composer writes it, then the copyist may change it a bit, and then the printer may change it further. Or, the composer might create a slightly different score for different performances, as is the case with Handel’s Messiah.
When BarFinder was born, its goal was to create an easy way to line up all the different versions of a music score. A research group at the University of Paderborn in Germany wanted to compare all the editions of a single composer’s work. If one version has a note F where all the others have an F-sharp, the way to pinpoint the odd one out is to line up the bars in the score—which, until BarFinder, had to be done manually.
BarFinder specializes in finding barlines automatically. The project uses an open-source image-processing algorithm that extracts the position of barlines on a music score. The user inputs the image, and BarFinder deduces the number and position of measures on the page. BarFinder uses Optical Music Recognition, or OMR, to recognize the measures, and encodes all extracted data using the Music Encoding Initiative (MEI). Readers may be familiar with the technology’s textual counterparts: Optical Character Recognition (OCR), and the Text Encoding Initiative (TEI).
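The core idea of extracting barline positions can be illustrated with a toy vertical-projection heuristic: in a binarized score image, a barline shows up as a column that is black for (nearly) the full height of the system. This is only a sketch of the general technique, not BarFinder’s actual open-source algorithm; the function name and threshold are illustrative assumptions.

```python
def find_barlines(image, threshold=0.9):
    """Return x-positions of candidate barlines in a binary image.

    image: list of rows, each a list of 0 (white) / 1 (black) pixels.
    A column whose fraction of black pixels meets `threshold` is
    treated as part of a barline; adjacent columns are merged.
    (Toy heuristic for illustration, not BarFinder's algorithm.)
    """
    height = len(image)
    width = len(image[0])
    # Vertical projection: count black pixels in each column.
    projection = [sum(row[x] for row in image) for x in range(width)]
    candidates = [x for x in range(width) if projection[x] >= threshold * height]
    # Merge runs of adjacent candidate columns into single barlines.
    runs = []
    for x in candidates:
        if runs and x == runs[-1][-1] + 1:
            runs[-1].append(x)
        else:
            runs.append([x])
    # Report the centre column of each run.
    return [run[len(run) // 2] for run in runs]

# A 5x10 test image with full-height vertical lines at x=2 and x=7.
img = [[1 if x in (2, 7) else 0 for x in range(10)] for _ in range(5)]
print(find_barlines(img))  # → [2, 7]
```

Real scores are noisier than this: stems, slurs, and skewed scans all produce tall dark columns, which is why a production system combines staff detection and musical heuristics rather than a bare projection.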
The image above shows the steps of the image processing; the computer-recognized measures appear in red.
Besides the original use of comparing editions, there are many broader uses for BarFinder within the area of digitized music. BarFinder facilitates multimodal music presentation and the navigation of music catalogues, and enables, for example, synchronized score reading and playback. With the automatic recognition of the positions of all measures on the page, searching for any element becomes simpler.
BarFinder is one of many projects under Principal Investigator Dr. Ichiro Fujinaga (Associate Professor of Music Technology, Schulich School of Music, McGill) that use OMR to improve the searchability of digitized music. The project is funded, in part, by the Social Sciences and Humanities Research Council and the University of Paderborn.
To experiment with searchable digitized music files, have a look at the Liber Usualis: http://ddmal.music.mcgill.ca/liber/