Abstract:
|
We investigated the extent to which people can discriminate between languages on the basis of their characteristic temporal, rhythmic information, and the extent to which this ability generalizes across sensory modalities. We used rhythmic patterns derived from the alternation of vowels and consonants in English and Japanese, presented auditorily, visually, audiovisually, or through touch. Experiment 1 confirmed that discrimination is possible on the basis of auditory rhythmic patterns, and extended this finding to vision, using ‘aperture-close’ mouth movements of a schematic face. In Experiment 2, language discrimination was demonstrated using visual and auditory materials that did not resemble spoken articulation. In a combined analysis of the data from Experiments 1 and 2, a further benefit was found when auditory rhythmic information was available to participants. Although discrimination could be achieved using vision alone, auditory performance was nevertheless better. In a final experiment, we demonstrated that the rhythm of speech can also be discriminated successfully by means of vibrotactile patterns delivered to the fingertip. The results of the present study therefore demonstrate that discrimination between languages’ syllabic rhythmic patterns is possible on the basis of visual and tactile displays. |
Acknowledgments:
|
This research was supported by grants PSI2012-39149, PSI2009-12859 and RYC-2008-03672 from Ministerio de Economía y Competitividad (MINECO, Spain) to J. Navarra, the European COST action TD0904, and by grants PSI2010-15426 and CDS00012 from MINECO (Spain), 2009SGR-292 from Comissionat per a Universitats i Recerca del DIUE-Generalitat de Catalunya, and European Research Council (StG-2010 263145) to S. Soto-Faraco. |