Study Calls for Early Test of Hearing

Times Staff Writer

A strikingly high percentage of children aged 2 to 4 suffer from total or partial hearing loss that goes undetected by their parents and doctors, even though early signs of the deficiency could have been recognized, a new study contends.

Because their deafness has not been diagnosed and treated, such youngsters often miss out on crucial early development of language skills. The deficit may take years to overcome or may be irreparable, according to a hearing specialist at the State University of New York.

This conclusion, argues Dr. James Coplan of the university’s Upstate Medical Center in Syracuse, underscores the importance of vigilance by parents and pediatricians. Coplan’s study was published in the journal Pediatrics.

He found that of 1,000 children referred to him, 4.6% proved to have undetected total or significant hearing loss--with the condition missed or ignored for as long as two to four years. The average delay was a year. For each baby who is totally deaf, he noted, another five or six are hearing impaired.

That such conditions exist is especially tragic, Coplan said in a telephone interview, because modern techniques make it possible to detect and assess hearing loss almost from the moment of birth.

Parents, Coplan said, should be acutely aware of a baby’s early communication development. A newborn, for instance, should look at a parent’s face and watch the lips when the parent speaks. At three months, Coplan said, a baby should have a “conversation,” in the sense that the child makes noises in seeming response to a parent’s talk.

“If the parents have any concern about hearing, they should be insistent on having their child tested by an audiologist,” Coplan said, even if such overtures are initially ignored or even rejected by a doctor. “Parents who are concerned about a child’s hearing are very often right,” he said.

Salt-Free Limitations

If you’re one of the millions of people who have gotten religion about overuse of salt--cutting it out of cooking and not adding it to already prepared foods--you may figure you’ve rid your diet of unwanted salt and the attendant sodium. Unfortunately, a new medical study warns, this may be disappointingly naive.

The depressing reality, concludes a study in the Lancet, a major British medical journal, is that in the average diet only 15% of dietary sodium comes from salt added in home cooking or sprinkled on at mealtime. Another 10% occurs naturally in the food to start with. The remaining 75% is added by manufacturers in various phases of food processing.

Thus, argues a team from the Rowett Research Institute in Scotland, the challenge is less a matter of banishing salt from one’s kitchen and dining-room table than of changing salt use in the food industry--though relying as much as possible on unprocessed foods clearly could help, too.

Ironically, the researchers found, cooking salt may not be as much of a villain in the sodium drama as originally believed: only 20% to 36% of the salt added to cooking water for vegetables, for instance, actually remains in the food when it is served. The sodium actually consumed is further reduced, the study found, by uneaten sauces and food left on the plate.
