In Western music, the semitone constitutes a theoretical and perceptual boundary between tones and serves as a unit for the categorical perception of intervals (Burns & Ward, 1978). However, melodic perception does not rely exclusively on this category; it also involves a notion of ‘correctness’. Although listeners usually classify melodies as “in tune” or “out of tune” depending on the size of interval deviations (smaller than a semitone) within a melody, the transition between the two categories remains unclear. This study examines the processes involved in the perception of pitch accuracy.
Twenty-five participants identified melodies as “in tune” or “out of tune” and rated their confidence in each answer. The pitch manipulation consisted of enlarging one interval in 5-cent steps (from 0 to 50 cents of deviation). The deviated interval was either a Major 2nd or a Perfect 4th and occurred in the middle or at the end of a 6-tone melody. The task was run twice, before and after an explicit definition of the two labels. Repeated-measures ANOVAs were conducted to examine the effect of the deviation on the proportion of in-tune answers and on confidence levels.
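As a minimal sketch of the arithmetic behind this manipulation (not the authors' stimulus-generation code): enlarging an interval by d cents multiplies the upper tone's frequency by 2^(d/1200), since a semitone spans 100 cents and an octave 1200. The reference pitch of 440 Hz below is an assumption chosen for illustration.

```python
# Illustration of enlarging an interval in 5-cent steps, as in the task above.
# The 440 Hz reference and the 200-cent (Major 2nd) interval size are
# demonstration values, not the study's actual stimulus parameters.

def cents_to_ratio(cents: float) -> float:
    """Frequency ratio corresponding to an interval of the given size in cents."""
    return 2 ** (cents / 1200)

def enlarge_interval(f_low: float, interval_cents: float, deviation_cents: float) -> float:
    """Frequency of the upper tone after enlarging the interval by a deviation."""
    return f_low * cents_to_ratio(interval_cents + deviation_cents)

# A Major 2nd (200 cents) above 440 Hz, enlarged in 5-cent steps up to 50 cents:
for dev in range(0, 55, 5):
    f_upper = enlarge_interval(440.0, 200.0, dev)
    print(f"{dev:2d} cents deviation -> upper tone at {f_upper:.2f} Hz")
```

Note that even the largest deviation used (50 cents, half a semitone) shifts the upper tone by only a few hertz per step, which is why the in-tune/out-of-tune boundary is a perceptual rather than an obvious acoustic one.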
For the participants who were able to learn the labels (n = 20), the proportion of in-tune answers varied greatly according to the amplitude of the deviation and depended on the size of the manipulated interval. Together with the confidence-level measurements, the identification data support a categorical perception process. Interestingly, explicitly learning the labels increased overall confidence but did not drastically modify the profile of the categories or the process behind the categorization.
This study suggests that explicit learning is not necessary to develop higher-order categories related to “correctness”. Nevertheless, such a process seems limited to certain intervals. Further investigation of other intervals and of individual differences appears promising for better understanding the mechanisms underlying music perception.