The Dynamic Therapy Associates Blog
Training ABA Language Targets for Nonspeaking Students Using Speech Generating Devices (Dedicated or via Mobile Devices)
We met an adorable little fellow, Michael, today. He is an emerging communicator using the Speak For Yourself app, and he has ABA specialists who help him learn many new skills. Our chat with his mother reminded me of a tip sheet I created a while ago to help the ABA professionals working with us teach language to AAC users. AAC users have unique needs when working within an ABA protocol. Here are a few ideas that may help!
Implementation Notes: Because the student is producing language in a form other than speech, his training for tacting and manding will have some slightly different considerations than training for speaking patients. There are two types of prompts that can be used to teach communication via a speech generating device: stimulus prompts and response prompts.
Response Prompts: prompts provided by the partner (backward chaining with physical prompts, direct point cues, direct verbal cues, indirect cues and natural cues). Independent tacting on an SGD (speech generating device) requires navigation to reach vocabulary. Interventionists can remove this requirement and target tacting in isolation by using “partner-assisted navigation”: the therapist goes to the page where the vocabulary item exists and then asks the patient to label the presented photograph by touching the corresponding symbol on the page.
Independent manding on an SGD also requires navigation. If you want the patient to exhibit the skill of manding in isolation from the navigation demand, then, as with tacting, the therapist uses partner-assisted navigation to take the student to the appropriate page and then expects the student to select the desired item to request.
In order to achieve independent tacting and manding on his communication device, the student must master navigation. Navigation is taught through backward chaining in the following sequence:
1. The therapist navigates to the appropriate page for the student; the student is asked to mand/tact on that specific page.
2. The therapist demonstrates navigation to the appropriate page; the student is asked to mand/tact on that page.
3. The therapist navigates to a page that links directly to the specific page where vocabulary is targeted. The student is asked to select the button that links to the targeted page and then to mand/tact on that page. For example, the therapist navigates to a dictionary (“things” or “my words” page) and the student is asked to select the appropriate category button.
4. The therapist gradually backs out of navigation, teaching one page of navigation at a time, until the student can navigate from the main page to the specific vocabulary page.
Stimulus Prompts: prompts that are embedded in the page sets. Stimulus prompts are visual and position cues that are part of the presentation of the vocabulary. They can include the following:
· color coding,
· hiding extraneous buttons/messages,
· shape cues, and
· position cues on the page of the device.
If the training is completed on the student’s regular page set, the student will be able to use motor planning to assist him in navigation. Clinical evidence and research indicate that this motor planning is often attained even in the absence of an understanding of categorical, grammatical or functional vocabulary organization. Students simply learn the motor movements/locations on the screen required to get to the desired vocabulary. They use visual images to assist in the initial learning stages but, like adults who type on a keyboard, students learn the position of the linking buttons to increase their rate of communication over time.
By simplifying learning through stimulus and response prompts, students can learn independent navigation of their devices in order to produce language spontaneously. Prompting teaches words in the context of the student’s language system rather than on random pages that cannot be accessed independently by the student for future communication.
And remember, the most important thing about communicating using an AAC system is COMMUNICATION! Establishing social relationships is one of the most important functions of language development and communication, so, whenever possible, sit down and have a good ol' unstructured chat!
AAC plays an important role in helping our families navigate medical procedures, tolerate hospital/doctor visits, and explain their healthcare needs.
Kasey's mom Tracey describes her experience after surgery: "Kasey was groggy and on lots of medication post-surgery, but when offered her T10, she navigated to 'more'! We'd been giving her sips of apple juice and she clearly wanted more."
Here's a blog post I wrote for PrAACtical AAC on the subject of AAC and serious medical procedures:
Begin at the Beginning: Assessing Motivation, Communication and Voice Output for Children with Complex Communication Needs
We had the opportunity this month to meet some very special boys who have complex communication needs and sensory impairment. One of our boys is deaf-blind, ambulatory and nonverbal. One youngster is functionally blind and has severe motor impairment. He is nonverbal and nonambulatory. Our last fella is nonverbal, has a severe visual impairment and significant sensory defensiveness. All three boys are curious, focused when motivated and responsive. They all communicate primarily through affect, unconventional gestures and vocalizations.
The sensory evaluation starts many days in advance with collecting a wide variety of items to present systematically to the children in the following sensory categories: visual, auditory, tactile, proprioceptive, vestibular and olfactory. A little fun shopping (including some handy-dandy toolbox organizers) later, and this is what we came up with!
Our Sensory Collection:
Kendal and I have had some wonderful conversations. He's been telling me about his favorite music (country!), his new temporary home, his opinion about some of the people in his life... Kendal and I have known each other for many, many years. We've struggled through traditional therapy, YouTube infused visits, extremely short visits as he was determined not to join me and now we've settled down into this nice relationship. I've always loved Kendal but now I can say I truly enjoy interacting with him. The difference is that now Kendal and I really understand each other. Kendal and I interact with both of us using his communication device as a "translator box." Kendal struggles greatly with comprehending spoken language. His social and environmental awareness greatly exceeds his understanding of verbal language. Sometimes we think he hears that Charlie Brown teacher voice when we talk.
The technique I use most with Kendal is Aided Language Input. Very simply stated, I use Kendal's device, the NovaChat 10, to talk to him. The visual symbols paired with my (Charlie Brown teacher) speech help Kendal understand me so that he can participate in our conversations. I touch symbols on his device which correspond to the words I'm saying. The value of aided language input is that Kendal is able to see someone communicate the same way he does. He has a model for how aided communication works. There is no pressure for him to imitate my model. If he wants to participate, awesome! If he just wants to listen, that's okay too. Either way Kendal is learning language and building relationships. Kendal's mom is amazing and she uses this technique as much as possible at home and in the community. Kendal seems calmer and happier as a result of his improved understanding and ability to offer his own opinions. He's found a great deal of satisfaction in being able to tell us about the other people in his life and their ups and downs in caring for him. We've also seen an increase in Kendal's understanding of our words. He's responding more and more to our verbal language.
You'll see in our video that Kendal pays close attention to my symbol selection and then reads the message window text. He obviously understands and "gets" my jokes as we take turns talking. Since Kendal primarily communicates using 1-2 symbols per message, I used to simplify my messages to just a few symbols per statement. Now I use 4-7 symbols per statement because he's shown me he understands. There are some really wonderful resources on the internet to explain how Aided Language Input works. Here's a great compilation from PrAACtical AAC: http://praacticalaac.org/tag/aided-language-input/
We thought we'd just show you!
When I first meet a child for therapy, I always ask parents to tell me what their child likes to do. What characters, what toys, what games, what videos, what books.... So many times parents of children with severe speech and physical impairment look at me like I'm a little crazy when I ask about board games. "Well, she would probably like them but she can't do it." Since that sounds like a challenge, I have been adapting board games for many years. (Plus, I like games and my own kids are getting too old to play Candyland with me anymore!)
...you can just add numbers (or colors, or letters) to the playing pieces so your child can use his/her device to make a selection. For this game, Alli uses her communication device to pick 2 numbers to guess where the two halves of Gingerbread Man are hiding.
The next step is to make sure the game is as visually and physically accessible as possible. Here's a Candyland game I adapted to help kids with fine motor difficulty move and place their pieces without sliding and falling. We tap out each move so our kids with visual challenges can hear how far the players' pieces are moving.
I have been fortunate enough over the years to have had access to wonderful software from a variety of manufacturers. Although I have my favorite products, they never have everything I want for my therapy goals, so I find myself mixing and matching components of my activities to make them just right! I think it is great fun to throw all the different developers' ideas into one sandbox where they can all happily play together (at least in my world!). I love to slap one developer's symbols into another developer's software and throw it all on another developer's device. I'm positive they LOVE that too. The good ones, anyway!
iPads are wonderful and the amazing array of apps is dizzying. Nonetheless, apps are significantly less robust than full-blown software. So, here we are, back in the sandbox, pulling in the best ideas from one app to supplement the fabulous ideas of another. A little enthusiasm and app-switching savvy and we have therapy activities that are more meaningful and motivating than if we only used one app at a time the way nature (or the developer) intended. Something like this bunny dog who will sit on command and go for walks with you but doesn't bark or bite. And must be potty trained. The best of all worlds.
Why combine apps?
Peek-A-Boo Barn has a bouncing barn with animals that knock on the doors to come out. Adorable! We practice our greetings when the animals pop out. Clicker Sentences lets us write about our activity, include a picture, and print to take home.
Balloonimals lets kids blow up balloons and wiggle them into silly animated characters. Kids can take a photo and see the animals in different locales. Abitalk Sentence Builder lets you create sentences from your own images so we can write about our animals. (These are from my former intern, Katie Millican.)
First Phrases (Hamaguchi) is one of our "go-to" apps for teaching students verbs and simple phrase creation. We watch the animation and, when cued to repeat the phrase, our kids either verbally repeat OR use their speech generating devices to make their own two-word phrase. Super motivating, and it targets basic, common verbs. This app very nicely doesn't charge ahead until you activate an arrow but, as I mentioned before, kids are FAST! After we play this game, we use our Speech Box app with screenshots of the First Phrases animation scenes so we can sort through a pile of pictures and decide which ones to talk about. Speech Box is a great library of "boxes" of pictures. It seems the developer was thinking of providing easy access to sets of articulation cards (it comes with these), but it has a wonderful, broad set of photos AND allows you to add your own (screenshots of First Phrases in this case).
I use this concept for YouTube videos as well. Those little jokers go by super fast, and there are ALL of those "suggested" (and sometimes "suggestive") videos to grab attention. Figure out what video is a favorite and just snap a few screenshots of it to drop into another app.
First Words International is another fantastic Hamaguchi app that teaches early developing single-noun vocabulary in categories (vehicles, animals etc.). It is truly a wonderful app that introduces multiple examples of the targeted word (5 different buses appear on-screen) and then presents the targeted word on a display with 4 random picture distractors. Students are asked to find the "bus" on the page. Then students get to "spin the wheel" where, magically, a picture of the targeted word appears and they are asked to label it. We label with our voices, signs or our AAC device. Just to "cement" the concept in a little more firmly, we'll watch a video of the targeted word on our VideoTouch apps (animals, vehicles and instruments) or listen to its sound effect on the SoundTouch app. These are great apps that give multiple example videos or photos & sound effects for common nouns.
I'm definitely not done with this idea. I've been scanning all my older, but still amazing, materials books onto my iPad with the FasterScan HD app, coloring them in with the HELLO Crayon app (thanks to suggestions from the Twitterverse) and saving them into the Dropbox app so I can whip them out at a moment's notice for therapy, to print out for homework or to share with my therapists. So much you can do with that silly little iPad!
What are YOUR favorite apps to combine? Leave us a note in our comments so we can learn from each other!
Today was a day of story creating using the Story Creator app by Innovative Mobile Apps. This is one of the best apps. It also happens to be free. This app lets you import a picture from your camera roll, type text to describe the picture and record yourself reading the text. For therapy we use it to help our patients be motivated to write (and speak) multiword phrases about their favorite topics. So far this week we've made books about sweets, baseball, creepy characters, Disney (of course) and countries where we want to travel.
THIS IS HOW WE DO IT
Super motivating and easy. A great little activity which keeps our students focused and talking!
Hello, all! This post was inspired by Kate Ahern’s session titled “Bringing AAC Home” at ATIA in Orlando that Vicki and I attended. She had some great ideas and insight about our kiddos using their AAC devices at home and in the community. Many of us don’t have the opportunity to provide services to our clients outside the speech clinic (at home, in the community, at school), so we don’t always know how much their voice is heard when we aren’t in therapy. This does not mean, of course, that their needs aren’t met. I understand there are many things that need to be done at home, and using an AAC device can sometimes be viewed as “homework” that may or may not get done.
About the Author: I am an SLP who has the distinct fortune of having a job that is also my passion. I have been an AAC Specialist for almost 25 years in schools and my private clinic. I currently own Dynamic Therapy with my husband, Chuck (also of 25 years), who is my business partner and enabler. We have a wonderful staff of SLPs & AAC Specialists who work with us to help our patients. I hope you find my blog helpful as you join me in our journey with our unique and amazing friends! Vicki Clarke, MS CCC-SLP