Publication Date
2008
Description
COMPOSERS COMMONLY USE MAJOR OR MINOR SCALES to create different moods in music. Nonmusicians show poor discrimination and classification of this musical dimension; however, they can perform these tasks if the decision is phrased as happy vs. sad. We created pairs of melodies identical except for mode; the first major or minor third or sixth was the critical note that distinguished major from minor mode. Musicians and nonmusicians judged each melody as major vs. minor or as happy vs. sad. We collected ERP waveforms, triggered to the onset of the critical note. Musicians showed a late positive component (P3) to the critical note only for the minor melodies, and in both tasks. Nonmusicians could adequately classify the melodies as happy or sad but showed little evidence of processing the critical information. Major appears to be the default mode in music, and musicians and nonmusicians apparently process mode differently.
Journal
Music Perception
Volume
25
Issue
3
First Page
181
Last Page
191
Department
Psychology
Link to Published Version
http://www.jstor.org/stable/10.1525/mp.2008.25.3.181
DOI
10.1525/mp.2008.25.3.181
Recommended Citation
Halpern, Andrea R.; Martin, Jeffrey S.; and Reed, Tara D. "An ERP Study of Major-Minor Classification in Melodies." Music Perception 25.3 (2008): 181-191.