Thursday, May 06, 2004
Some answers may come from a recent study published by a University of Rochester team. Examining how adults and children parse speech sounds (streams of made-up words), they found that the brain seems to perform all kinds of high-level statistical analysis of those sounds, unconsciously, somewhere between the ear and the conscious mind. Foreign languages always sound as if they're being spoken very fast, because the non-speaker isn't attuned to where the breaks should fall or how the words sound. With experience, though, the brain learns to break the constant stream of sound into understandable (or at least distinguishable) bits: words and phrases. This new theory of how that happens is most interesting.
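One common way this kind of statistical segmentation is described is in terms of transitional probabilities: within a word, one syllable reliably follows another, but at a word boundary almost anything can come next, so the probability dips. Here's a toy sketch of that idea in Python. To be clear, this is my own illustration, not the researchers' actual method, and the made-up words and the 0.5 threshold are invented for the demo:

```python
import random
import re
from collections import Counter

# Three invented "words", each made of three two-letter syllables.
words = ["bidaku", "padoti", "golabu"]

def syllables(w):
    return [w[i:i + 2] for i in range(0, len(w), 2)]

# Build a long, unbroken stream of syllables from randomly ordered words,
# mimicking continuous speech with no pauses between words.
random.seed(0)
stream = []
for _ in range(300):
    stream.extend(syllables(random.choice(words)))

# Count adjacent syllable pairs, then estimate P(next | previous).
pair_counts = Counter(zip(stream, stream[1:]))
first_counts = Counter(stream[:-1])

def transition_prob(a, b):
    return pair_counts[(a, b)] / first_counts[a]

# Insert a boundary wherever the transitional probability dips below
# a threshold, a crude stand-in for what listeners track implicitly.
segments, current = [], [stream[0]]
for a, b in zip(stream, stream[1:]):
    if transition_prob(a, b) < 0.5:  # threshold is arbitrary
        segments.append("".join(current))
        current = []
    current.append(b)
segments.append("".join(current))

print(sorted(set(segments)))
```

Within a word the transitional probability is near 1.0, while across word boundaries it drops to roughly 1/3, so the recovered segments come out matching the original made-up words, even though nothing in the stream explicitly marked where one word ended and the next began.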