In the mornings, as I push Sonya down West End Avenue in her stroller, Sonya sings. Loudly. Sometimes she takes requests. Most of the time she sings to her own soundtrack, on repeat. It’s pretty cute. Strangers often smile as we pass. I love to sing too, so I sometimes try to accompany, to which she immediately protests. “Stop Mom! It’s my turn! Okay?”
While Sonya often sings to express joy, she also turns to music when she is angry. Her current ‘go-to’ angry song is one she wrote herself. It goes:
I don’t like
I don’t like
I don’t like
As I often discuss on this blog, when we learned Sonya would need cochlear implants, we were initially very concerned about what this would mean for her appreciation of music. Now, Sonya hasn't yet demonstrated tone or pitch accuracy. She often starts a song in one key and ends up in another. According to AudiologyOnline, most kids develop this skill by kindergarten, though it may be more difficult to acquire for someone using cochlear implants. Cochlear implants were developed to access speech. Music is more complex. But, clearly, there is much we don't understand about electronic hearing. Despite the obstacles, Sonya loves to sing. She loves to dance. And for the time being, she seems to enjoy it as much as her friends who hear acoustically.
A MAP is a program that optimizes a cochlear implant user’s access to sound. The audiologist connects Sonya’s processors to a computer. Sonya then hears a series of beeps and the audiologist measures her response.
When Sonya was an infant, measuring this response was rather tricky. The audiologist might observe a change in eye movement, a head turn, or Sonya might stop moving. All of these behaviors indicated that Sonya was hearing the sound. Upon seeing such behavior, the audiologist would light up a black box with a toy playing the drum (or something similar). This conditioned Sonya to look at the box when she heard the sound. Thus, the audiologist was able to get a sense of which sounds Sonya could and could not hear.
Now that Sonya is two, her mappings are a bit different. When Sonya hears a sound, she puts a coin in a piggy bank. She is still working on this skill, but it is a much easier and more accurate way to determine whether or not she hears the sound.
First, the audiologist sends the sound directly to Sonya's processors. We don't hear it. Only she does. Then, the audiologist turns her processors on to detect noises in her environment. In the videos below, you can watch Sonya listen to the sounds through her processors at first, putting a coin in a pig each time she hears the sound. Then, she repeats sounds, which shows us that she is hearing the sounds in her environment as well. The sounds that Sonya is asked to repeat are called "Ling sounds". Ling sounds vary from high to low pitch, and they are considered the range of speech sounds needed to acquire language.
Sonya's speech therapists use the below Ling sound symbols when working with Sonya. When Sonya is presented with an airplane, for instance, she knows to make the "ah" sound. It's another way to make sure she is receiving the auditory input necessary to speak.
Our daughter Sonya was born deaf, but with the help of cochlear implants, she now hears nearly everything we do, albeit differently. Unlike hearing aids, which amplify sound, cochlear implants bypass the damaged area of the ear and stimulate the auditory nerve directly using electrodes. The signals produced by the electrodes are then recognized as sounds by the brain. Since Sonya is only 22 months, she unfortunately can't describe what it is like to hear with cochlear implants.
My family’s friend Barb Cole, however, can! Barb suffered from degenerative hearing loss, and received a cochlear implant in her retirement. In the following post, Barb shares her difficult yet fascinating story:
My Cochlear Hybrid Implant Story
By Barb Cole
In the beginning, I did not realize I had a hearing loss until a colleague told me I was talking very loudly on the phone. I was only in my mid 30s, but found the need to gradually increase the phone's volume at work so I could hear my clients. I probably had hearing issues before then and was not aware of them. I saw a doctor, who told me my hearing was borderline and that, at that time, I did not need hearing aids.
Ten years later, I decided to get my master's in special education. I finally got hearing aids in order to work with my students (children's voices are more difficult for me to hear than adults'). For the next 15 years I taught special needs students. I loved it, but my hearing was declining, and I needed to make a career decision for my students and for myself. So I retired.
At that time, I had received new hearing aids, but they were not really helping, so I looked into cochlear implants. Unfortunately, I was told I would not qualify for the devices, because I still had some hearing. My hearing loss was mostly in the high frequency range. I became depressed and frustrated. I could not understand my grandchildren or people on the phone. My hearing difficulties affected my speech. I struggled with pronouncing words correctly, especially multi-syllable words. I needed help going into stores, seeing doctors, and talking with friends. It was very disturbing and humiliating.
When we moved from Minneapolis to Chicago, my life changed! My new doctor said I qualified for a Cochlear Hybrid Implant. I would be his second patient, and I would participate in a Cochlear study. He was so positive and understanding. I had my operation in November 2015, which went very well (but recovery was very painful). I was connected with my processor in December. I feel like I am on a very long journey. The doctor says it can take up to two years to heal.
My brain needs to re-learn how to hear, so working with the audiologist has been extremely important. There were some misunderstandings during post-activation, because I was so used to writing everything down. I needed time to process information before I could understand it. My brain is not used to hearing, processing, and understanding at the same time. It turned out to be an adjustment for the audiologist too – I kept trying to slow her down and repeat important information.
As my brain was trying to connect with my implant, I would hear loud chirping sounds, which is normal. This is the brain's reaction to the new range of high frequency sounds (such as a child's squeal or a police siren) that I am now able to hear. As the audiologist programs my processor, my range of sounds has been expanding; this is done slowly. The chirping sounds are annoying, but as the brain adjusts, they begin to fade, until new sounds are introduced. The radio, TV, and other electronic devices tend to have high frequencies, which causes more chirping sounds.
My brain is still learning to work with my implant. I do two hours of hearing homework a day – it’s exhausting. Homework includes listening to audio books and following along with the printed books. I do hearing games and exercises on my iPad and computer. My husband reads to me and I read to him. I listen to the TV, radio, etc., with captions, as I need to hear different sound sources in order to make the connection in my brain.
Over the last six months, there has been much testing and adjusting. This past June was my six-month post-activation check-up, and I received the best news: my hearing is now in the normal range (at the bottom, but in the normal range) under perfect conditions. The chirping sounds had lessened, which meant my brain had successfully adjusted to the frequency range. The audiologist then expanded the frequency range, enabling me to hear more sounds.
One of the hardest adjustments has been dealing with people. I still need people to look at me and talk a little slower and more clearly, so that I can hear and understand them. Loudness is an issue. While I can adjust my devices for volume, when people talk too loudly the sounds I hear are distorted. I have thus lowered the volume of the TV, a big improvement. It is a bit easier to listen on the phone, but there are many variables: articulation, fluency, distortions, pitch, accents, and background noises. My pronunciation has greatly improved, however. I can now hear the difference between how I pronounce words and the correct way they should be pronounced.
Overall, I am glad I have the implant/processor, but it is a huge commitment and adjustment. Today I can hear birds and insects – even other people’s conversations!