Language was once thought to be a phenomenon confined to a single side of the brain.
New technology is helping researchers see how the brain maps sounds and language
onto meaning. Researchers at UC San Francisco found that people use both sides
of their brain to categorize and understand language.
This wasn’t the only discovery. Instead of responding to phonemes,
the brain actually responds to more elemental pieces of information called features.
The difference is profound: the individual sound matters less than the
categorization of these sounds at an elemental level. The brain processes speech
more deeply than scientists originally predicted.
The way in which a person uses the lips, tongue, or vocal cords
determines overall meaning and understanding. If this is true, then language
has a biological component and is rooted in deeply held abilities that make
us fundamentally human compared with other species.
The research is important because it can help people with
reading and speech problems. It may even change how we understand and
teach the English language. If speech is associated with the movements of
air within the mouth, then the classification and history of words can be analyzed on
a different level.