Chapter 1: Natural User Interfaces and Accessibility

The widespread adoption of mobile computing is a good thing for librarians who care about access for all. That’s because mobile devices make use of natural user interfaces, and those interfaces are making computing easier for people of all ages and abilities, as you’ll see in this report.

This trend, combined with the move toward multi-device ecosystems and the emphasis on students as creators with mobile apps, means that mobile learning is headed in a direction that is empowering for learners of all abilities.

There are other trends in mobile learning, but this report focuses on these three:

  • natural user interfaces and accessibility
  • multi-device ecosystems
  • content creation with mobile devices

That’s because there are synergies between these trends that offer opportunities for those who care about access for all. We’ll also discuss opportunities for libraries and librarians and suggest resources for continuing your learning.

Many notes and links are included throughout the report, so it’s recommended that you sit down with your computer or mobile device and prepare to follow links to view interesting photos and videos related to the topics within.

I enjoyed gathering the research for this report, and I think you will find it inspiring to read about how mobile computing is making life easier for so many people of all ages and abilities.

Natural User Interfaces Are Making Computing Easier for All Ages and Abilities

NUI stands for natural user interface. Natural user interfaces are those where humans interact with computers using actions related to everyday behavior, such as touch, gestures, speech, and conversation.1 The goal is to make it easy for humans to understand and use computers without having to learn complicated or abstract ways of doing things. Designers of these interfaces aim to create experiences that feel just as natural to a novice as to an expert user—and for expert users it can feel like an extension of their body.

Putting the Human before the Computer: The Move from GUI to NUI

NUIs are a new branch in the evolution of human-computer interaction, after GUIs (graphical user interfaces). GUIs were designed to make computing easier to learn than command-line interfaces, where you had to remember specific commands and type them. By using the metaphor of a desktop, with a trash can, menus, and so on, users no longer needed to memorize commands. You could just look under each menu to find the command you needed. Now with NUIs, it’s possible to move away from abstract metaphors and toward interactions that feel more natural.

Some examples of natural user interfaces are touchscreens, speech recognition, and voice commands. There is also camera input, which allows the device to recognize objects in the real world, and augmented reality, which uses the camera to superimpose additional information on your view of the real world.

Experts agree that NUIs won’t entirely replace GUIs. Instead, they are opening up a new niche of computing that is accessible to a wider audience. GUIs will continue to exist for those applications and devices where they work best.2

Mobile devices are using many kinds of natural user interfaces, and that’s good news for learners. In this section, we’ll look at several examples of how these interfaces work and how they make computing easier for all ages and abilities. And in the final chapter of this report we’ll discuss some opportunities for librarians to help their communities by serving as expert advisors about these mobile technologies.

Types of NUIs

The NUIs we’ll discuss can be grouped into the following categories:

Touch, including the following:

  • Touchscreens and multi-touch gestures, such as those used on smartphones and tablets.
  • Haptic interfaces, such as the Apple Watch’s ability to tap your wrist as a way to notify you.
  • Force Touch and 3D Touch. These are Apple’s technologies used in the Apple Watch, the trackpad on newer models of Apple’s MacBook, and the latest models of iPhones (6s and 6s Plus). With Force Touch you can press down on the screen to activate certain functions. 3D Touch is more sensitive, offering different levels of response depending on how lightly or firmly you press.

Sound, including the following:

  • Speech recognition, such as that used by Siri and other digital assistants.
  • Conversational interfaces, where you can talk to your device and it reacts, such as the Amazon Echo, a device that listens for your voice commands to answer questions, play music, and control other smart devices.
  • “Hearables”—new kinds of devices that merge the health-tracking features of smart watches and fitness bands with high-quality audio like that found in premium earbuds. An example is “The Dash”—wireless, bio-sensing headphones.

Sight, including the following:

  • Camera as seeing eye, such as that used by Google Translate, which allows you to point your camera at text in one language and see it translated to another language on the fly.
  • Camera as scanner, such as with document-scanning or barcode-scanning apps, especially apps that use OCR (optical character recognition) to translate what the camera sees to written text. Scanbot for iOS and Android is a good example.
  • Augmented reality, such as the ability to show a virtual overlay on top of what you are seeing, like that offered by the app Layar.

In this report we’ll look at examples of how each of these types of interfaces can enable better access to learning, both for people with various types of disabilities and for everyone else.

3D Touch

www.apple.com/iphone-6s/3d-touch

Amazon Echo

www.amazon.com/Amazon-SK705DI-Echo/dp/B00X4WHP5E

The Dash smart earphones

https://store.bragi.com

Scanbot

https://scanbot.io/en

Layar

https://www.layar.com/mobile-download

Why Everyone Should Pay Attention to Matters of Disability

All of us move between periods of independence and dependence on others over the course of our lives. If you’ve ever had a broken bone or similar injury, you’ve experienced the need for assistance.

Do you wear glasses? Contact lenses? Perhaps you have braces on your teeth? Maybe you know someone who has a pacemaker. Whether your need for these technologies is temporary or permanent, you are using “assistive technologies.” In fact, one could say that all technology is assistive technology. Even your smartphone or headphones serve to augment your abilities in some way. Separating technologies into assistive technology versus “regular” technology no longer makes sense in a world where we all benefit from technologies that increase our abilities.

It’s important for all of us to think about how technology can help improve lives, whether or not we work with people who have recognized disabilities. The idea of “normalcy,” based on a bell curve of statistics about how your body or mind should perform, is a recent one (dating from the nineteenth century).3 This goes along with the idea that disabilities are medical conditions in need of cures and the idea that disabled bodies are somehow deficient.

These days there is talk of turning people into cyborgs, with both utopian and dystopian visions of how that might play out.4 Exoskeletons designed to make soldiers run faster and carry heavy loads have been developed from military research and are now being used to help people who are paralyzed stand and walk.5 This is proving to be wonderful for some people and has also sparked debate. There are those who want to “fix” people with disabilities while doing nothing to adjust infrastructure, laws, and social norms to make our society more accessible for all.6

This is a significant concern particularly in the autism community and the deaf community. They are not people in need of fixing. If you haven’t had the opportunity to think much about this issue, these two articles will help:

  • Judy Endow, “Is Autism a Disability or a Difference?”
  • Allegra Ringo, “Understanding Deafness: Not Everyone Wants to Be ‘Fixed’”

For more in-depth thinking about these issues, see Carrie Wade’s “Pity Is Not Progress: Why Disabled People Don’t Need to Be ‘Fixed.’”

Is Autism a Disability or a Difference?

http://ollibean.com/2014/06/20/autism-disability-difference

Understanding Deafness: Not Everyone Wants to Be “Fixed”

www.theatlantic.com/health/archive/2013/08/understanding-deafness-not-everyone-wants-to-be-fixed/278527

Pity Is Not Progress: Why Disabled People Don’t Need to Be “Fixed”

https://medium.com/@wadeacar/pity-is-not-progress-f416f0491f9e

Improving our physical and virtual worlds for people with disabilities makes things better for everyone. For example, adding curb cuts in sidewalks for wheelchair access makes it easier for those pushing strollers or pulling a rolling suitcase. Making text resizable on websites also helps people without any vision problems, for example, in situations where you are doing a presentation and need to show people in the back row something specific.

With this idea in mind, let’s now look at some categories of disabilities and how mobile devices with their natural user interfaces are helping to make access to computing easier for all of us.

Types of Disabilities

Apple’s iOS and Google’s Android have many features built in for improving accessibility. Apple has taken the lead in this area, and it has some useful groupings of features on its website. We’ll use those categories as a starting point.

For an overview, here are Apple’s accessibility categories for iOS, with the names of each feature.

  • Vision. VoiceOver, Siri, Speak Screen, Dictation, Zoom, Font Adjustments, Invert Colors and Grayscale, and Braille Displays for iOS.
  • Hearing. FaceTime, Closed Captions, Messages with iMessage, Mono Audio, Visible and Vibrating Alerts, and Made for iPhone Hearing Aids.
  • Physical and Motor Skills. AssistiveTouch, Siri, Switch Control, Touch Accommodations, Dictation, Predictive Text, Keyboard Shortcuts, Support for Third Party Keyboards, and Hardware Keyboard Support.
  • Learning and Literacy. Guided Access, Speak Screen, Dictionary, Safari Reader, and Speech Apps.

    Braille Displays for iOS

    https://www.apple.com/accessibility/ios/braille-display.html

    Made for iPhone Hearing Aids

    https://www.apple.com/accessibility/ios/hearing-aids

    iOS Speech Apps

    https://www.apple.com/accessibility/third-party/#speech

To learn more, see Apple’s iOS Accessibility site. For Android, which has similar features, see Google’s site. Android users need to keep in mind that not every device has the same set of features. Some features are specific to certain models of smartphones and tablets, so it’s best to look at your manufacturer’s website for specifics.

iOS Accessibility

https://www.apple.com/accessibility/ios

Android Accessibility Help

https://support.google.com/accessibility/android

Rather than go through each feature and explain it, let’s look at a few stories of particular people and how mobile devices are improving their lives. These examples are from the website Bridging Apps.7

Bridging Apps

http://bridgingapps.org

Stories of People Whose Lives Are Improved by the Use of Mobile Technologies

The Proloquo2Go App Is Helping Those with Autism Spectrum Disorder Become More Verbal

Susan is a speech-language pathologist at an outpatient rehabilitation clinic for children up to age twenty-one. Many of the children she works with have an autism diagnosis. One of the apps she finds most useful is called Proloquo2Go. The app makes it easy to add pictures with vocabulary words of objects that are meaningful to the child. The child can touch specific items on the screen as a way to start communicating. She tells the story of a student who at first could communicate only with sounds, squeals, and temper tantrums. Using this app, he was able not only to request things that he wanted, but also to greet people, ask questions, and begin speaking verbally in short phrases.

Proloquo2Go

www.assistiveware.com/product/proloquo2go

She discusses this and other apps for people with autism, and she emphasizes that while they can’t replace an actual therapist, these technologies are extremely beneficial. Many children are interested, engaged, and motivated when using this type of technology.

Read Susan’s full story on the Bridging Apps website.8

Apps Are Helping Those with Autism Spectrum Disorder Develop Fine Motor Skills, and They Serve as a Bridge to Becoming More Social

Michael is the father of Colin, a young boy with autism spectrum disorder. He tells the story of how his son likes to use an app called Write My Name. With it, he writes his name and his brother’s name and spells out words, tracing every letter of the alphabet with his finger on the screen. It helps him develop the fine motor skills that many kids with ASD struggle with. Michael also talks about how much Colin loves the calendar and calculator, since he’s precocious in math and with dates.

Michael makes the point that while some experts are critical of using apps with ASD kids because apps can take them away from interactions with people, many of these apps form a bridge that leads Colin to communicate more, in a way that feels less threatening to him. So Michael is very grateful for how these devices and apps are improving life for his son.

Read Colin’s full story on the Bridging Apps website.9

Write My Name

www.iactionbook.com/?slide=write-my-name&lang=en

A Successful Businessman and Entrepreneur Uses His iPad without Touching the Screen—He’s Been a Quadriplegic Since Age Eight

Todd is a successful businessman, mentor, and public speaker. An accident at age eight caused a spinal cord injury that left him as a C4 quadriplegic. He uses a device called a Tecla Shield that enables him to use his iPhone, iPad, or laptop without touching the screen.10 Control is done entirely with his chin and mouth since he has no movement below the shoulders.

I recommend taking a few minutes to watch his video, “iLove Stories of Independence,” where he tells the story of gaining more independence with these technologies and what that means to him. Voice recognition had come a long way since he was young, but you always had to press a button to start the process, something he could not do without help from others. With the advent of Apple’s Siri, he was able to communicate on his own for the first time using his iPhone. In the video you’ll also hear from a friend of his who is quadriplegic, and the two of them tell the amazing story of how they can get out into nature with their wheelchairs and iPhones, easily staying in touch with others without needing other people to do everything for them.

iLove Stories of Independence video

https://youtu.be/w9LGAJVLBEg

Read Todd’s full story on Bridging Apps, and watch his video on YouTube.11

A Seventh Grader with Cerebral Palsy Uses the Bookshare Service with an App Called Read2Go to Enjoy Books without the Help of Others

Connor is a seventh grader with cerebral palsy who used to depend on his mom to read to him. Now he uses a service called Bookshare that offers a free online library for students with print disabilities who qualify. He uses an app called Read2Go on his iPad to read the books on his own. With it, he can follow words that are highlighted on the screen and listen to the text being read aloud.

This type of reading experience is useful for a wide range of students, including those who have trouble focusing or concentrating on the screen. Read Connor’s full story on Bridging Apps.12

The people in these examples have what we would traditionally call a disability. As you read these and other examples in this report, think about how these technologies can be useful to a wide range of people of all ages and abilities, not only those we would call disabled.

Bookshare: An Accessible Online Library

https://www.bookshare.org

Read2Go

http://read2go.org

Examples of NUIs

Now let’s look at specific apps and devices that are making learning easier because of their natural user interfaces. These examples include apps designed specifically for people with disabilities and apps designed for everyone. All of these technologies can improve learning for a wide range of ages and abilities.

Touch

Touchscreens and Multi-touch Gestures

Human anatomy apps for medical students are booming. They make use of touchscreens in innovative ways to help students understand and visualize the human body and its processes. An excellent example is 3D Brain by Cold Spring Harbor Laboratory. It’s free and available for both Android and iOS.

You can use the touchscreen to rotate and zoom around interactive structures to learn about how each region of the brain functions. Watch Cold Spring Harbor’s demo video to see how the app works.

3D Brain for iOS

https://itunes.apple.com/en/app/3d-brain/id331399332?mt=8

3D Brain for Android

https://play.google.com/store/apps/details?id=org.dnalc.threedbrain&hl=en

3D Brain demo video

https://youtu.be/1Hmi2bVVzLQ

Another developer that makes excellent anatomy apps is 3D4Medical. Its Complete Anatomy Lab app for iOS is one of the apps rated highest by medical students. 3D4Medical has several different apps for various systems of the human body. I made a demo video of the app Brain & Nervous System Pro III for one of my online courses; you can watch it on YouTube. Not only can you slice and rotate the brain to learn about it, but you can also add your own notes, drop pins to mark specific regions (as you would in Google Maps), and quiz yourself.

Complete Anatomy Lab

http://completeanatomy.3d4medical.com

Brain and Nervous System Pro III

https://itunes.apple.com/en/app/brain-nervous-system-pro-iii/id451427448?mt=8

Brain and Nervous System Pro demo video

https://youtu.be/akWspUriVqU

For learning about anatomy, being able to touch and manipulate the images with your finger is so much more intuitive than using a mouse and menus. Read the user comments on these apps to see how much students appreciate them.13

Haptic Interfaces

Haptic interfaces are those involving touch. They can include the act of touching a device and also the device communicating back to you by touch or vibration. An example is the Apple Watch, which uses a technology that Apple calls its “taptic engine.” When you receive a notification, the watch lightly taps you on the wrist. You can also choose to receive subtle audio cues along with the taps.

Apple Watch

www.apple.com/watch/watch-reimagined

One way that the “taptic engine” is used is with the Apple Maps walking directions on the watch. Once you’ve entered where you want to go and started the directions, the watch will tap you in different ways to indicate whether to turn right or left. A steady series of twelve taps means to turn right at the intersection you’re approaching, and three pairs of two taps means to turn left. You also feel a vibration when you’re on the last leg of the trip and when you arrive.
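
Apple doesn’t expose its exact Maps tap patterns to developers, but watchOS apps can play a small set of built-in haptic cues through WKInterfaceDevice. Below is a minimal Swift sketch of how a hypothetical navigation app might signal turns with distinct taps; the TurnCue type and its mapping of cues to directions are illustrative assumptions, not Apple’s implementation.

```swift
import WatchKit

// Hypothetical helper: play a distinct built-in haptic for each turn cue.
// (Illustrative only; Apple Maps' 12-tap / paired-tap patterns are not a public API.)
enum TurnCue {
    case left, right, arrived

    func play() {
        let device = WKInterfaceDevice.current()
        switch self {
        case .right:
            device.play(.directionUp)    // one of watchOS's built-in directional haptics
        case .left:
            device.play(.directionDown)  // the complementary directional haptic
        case .arrived:
            device.play(.success)        // confirmation haptic on arrival
        }
    }
}

// Usage, when the next maneuver is a right turn:
// TurnCue.right.play()
```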

This is especially useful for blind users, as you can imagine. For an interesting demonstration, you can listen to an eleven-minute podcast by David Woodbridge, an assistive technology consultant for Vision Australia, who is blind himself.14 The podcast is an audio recording of his walk through a residential neighborhood with his seeing eye dog and the Apple Watch on his way to pick up his boys from school. You can hear the sounds of traffic on his street, the voice on his watch instructing him (“take a left on Gilda street in 20 meters”), and the audio dings that accompany the vibrations. He narrates how it works at every step, making you feel as if you are walking alongside him.

Woodbridge is not the only blind user to be enthusiastic about the Apple Watch. Molly Watt writes about her experience in her blog post “My Ears, My Eyes, My Apple Watch.” She is an advocate for those living with Usher Syndrome, which for her means she is deaf and mostly blind. Here’s what she has to say:

So far for me the most useful App on the Apple Watch is Maps—on my iPhone I can plan my journey from one destination to another, for me it will be on foot with Unis my guide dog. This is where Haptics really come into its own—I can be directed without hearing or sight, but by a series of taps via the watch onto my wrist—12 taps means turn right at the junction or 3 pairs of 2 taps means turn left, I’m still experimenting with this but so far very impressed—Usher Syndrome accessible!15

It’s interesting to think about the bigger picture of how haptic interfaces might be helpful for different situations and for people of all ages and abilities.

3D Touch and Force Touch

Force Touch was first introduced by Apple in the Apple Watch.16 With Force Touch the device can recognize that you are pushing down in order to activate various features, such as switching faces on your watch. 3D Touch is a newer feature, first announced for the iPhone 6s and 6s Plus, that is more sensitive and quicker to react and can offer different levels of action based on how firmly you press.17

With 3D Touch, you can use something called “peek” and “pop.” A peek opens a preview of what you’re looking at, such as an e-mail message in a list, and a pop takes you immediately to that content, such as the full e-mail message. The device gives you haptic feedback: a peek produces a response buzz of ten milliseconds, and a pop a slightly longer buzz of fifteen milliseconds. It’s built into Apple’s own apps, such as Mail and Maps, and third-party developers can add it to their apps as well.
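
For developers, peek and pop are adopted through a previewing delegate. The following is a minimal Swift sketch of how a third-party app might add the feature to a list of messages (iOS 9 or later); the message data and the bare preview screen are hypothetical placeholders, not code from any app mentioned in this report.

```swift
import UIKit

// A sketch of adopting "peek" and "pop" in a table of messages (iOS 9+).
class InboxViewController: UITableViewController, UIViewControllerPreviewingDelegate {

    let messages = ["Meeting moved to 3pm", "Lunch on Friday?", "Report attached"]  // placeholder data

    override func viewDidLoad() {
        super.viewDidLoad()
        tableView.register(UITableViewCell.self, forCellReuseIdentifier: "cell")
        // Register for previewing only when the hardware supports 3D Touch.
        if traitCollection.forceTouchCapability == .available {
            registerForPreviewing(with: self, sourceView: tableView)
        }
    }

    override func tableView(_ tableView: UITableView, numberOfRowsInSection section: Int) -> Int {
        return messages.count
    }

    override func tableView(_ tableView: UITableView, cellForRowAt indexPath: IndexPath) -> UITableViewCell {
        let cell = tableView.dequeueReusableCell(withIdentifier: "cell", for: indexPath)
        cell.textLabel?.text = messages[indexPath.row]
        return cell
    }

    // "Peek": a light press returns a preview of the row under the finger.
    func previewingContext(_ previewingContext: UIViewControllerPreviewing,
                           viewControllerForLocation location: CGPoint) -> UIViewController? {
        guard let indexPath = tableView.indexPathForRow(at: location) else { return nil }
        previewingContext.sourceRect = tableView.rectForRow(at: indexPath)
        let preview = UIViewController()
        preview.view.backgroundColor = .white
        let label = UILabel(frame: preview.view.bounds.insetBy(dx: 16, dy: 16))
        label.numberOfLines = 0
        label.text = messages[indexPath.row]
        preview.view.addSubview(label)
        return preview
    }

    // "Pop": a firmer press commits to the full view.
    func previewingContext(_ previewingContext: UIViewControllerPreviewing,
                           commit viewControllerToCommit: UIViewController) {
        show(viewControllerToCommit, sender: self)
    }
}
```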

Though right now this feature is available only for Apple devices, it’s likely that Android smartphones will add a similar feature.18

Here are a few examples of what app developers are doing with 3D Touch.

Evernote (iOS and Android)

Evernote is making it easier to move between your notes. Peek into links and press deeper to pop into the full website for that link. Use peek also to preview your notes from Evernote’s homescreen.

Evernote

https://evernote.com

Dropbox (iOS and Android)

Dropbox offers 3D Touch on the homescreen icon to activate new Quick Action shortcuts, such as viewing files you’ve recently added. Use 3D Touch inside Dropbox to peek into files and folders.

Dropbox

https://www.dropbox.com
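
Quick Actions like the ones Dropbox adds to its home screen icon are defined by the app itself, either statically in its Info.plist or dynamically in code. Here is a minimal Swift sketch of registering a dynamic shortcut; the identifier, title, and “Recents” action are hypothetical examples, not Dropbox’s actual shortcuts.

```swift
import UIKit

// A sketch of registering a dynamic home screen Quick Action (iOS 9+).
func registerRecentsShortcut() {
    let recents = UIApplicationShortcutItem(
        type: "com.example.app.recents",              // assumed identifier
        localizedTitle: "Recents",
        localizedSubtitle: "Files you added lately",
        icon: UIApplicationShortcutIcon(type: .time),
        userInfo: nil
    )
    UIApplication.shared.shortcutItems = [recents]
}

// When the user presses the icon and chooses the shortcut, the app delegate's
// application(_:performActionFor:completionHandler:) method is called to handle it.
```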

Genius Scan (iOS and Android)

Genius Scan is a very useful document scanner app. Use 3D Touch Quick Actions to scan from the cloud or from one of your photos.

Genius Scan

http://thegrizzlylabs.com

Sky Guide (iOS)

Sky Guide is one of many excellent astronomy apps for finding constellations, planets, and satellites in the night sky. Peek and pop give quick access to your favorites, search, and satellites from the homescreen icon.

Sky Guide

www.fifthstarlabs.com/#sky-guide

Magic Piano (iOS)

Magic Piano, the fun music-playing game and virtual piano, uses 3D Touch presses as a way to control the volume—the harder you press down on a piano key, the louder it will be.

Magic Piano

www.smule.com/apps
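
Behind a pressure-sensitive control like Magic Piano’s keys is the per-touch force value that 3D Touch hardware reports. A minimal Swift sketch of mapping press force to volume follows; the PianoKeyView class and its playNote function are hypothetical stand-ins, not the developer’s actual code.

```swift
import UIKit

// A hypothetical piano-key view: louder notes for firmer presses (iOS 9+, 3D Touch hardware).
class PianoKeyView: UIView {

    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
        guard let touch = touches.first else { return }
        var volume: Float = 0.8  // fallback for devices without 3D Touch
        if traitCollection.forceTouchCapability == .available, touch.maximumPossibleForce > 0 {
            // touch.force is reported relative to maximumPossibleForce;
            // it continues to update in touchesMoved(_:with:) as the press deepens.
            volume = Float(touch.force / touch.maximumPossibleForce)
        }
        playNote(volume: volume)
    }

    private func playNote(volume: Float) {
        // Hypothetical: trigger a sampler or audio engine at the given volume.
        print("Playing note at volume \(volume)")
    }
}
```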

Endless Alphabet (iOS)

Endless Alphabet is an app for kids to help them learn their ABCs and build vocabulary with fun puzzle games. 3D Touch enables you to press firmly on a letter to make it louder and more animated. This developer also makes another popular and useful app called Endless Reader (iOS). It introduces sight words to learn for reading fluency.

Endless Alphabet

www.originatorkids.com/?p=564

Endless Reader

www.originatorkids.com/?p=40

One of the best things about 3D Touch is that it means less visual scanning and less need for fine motor skills. It’s a time and energy saver. See Steven Aquino’s “What 3D Touch Could Mean for Accessibility” for more thoughts on how 3D Touch makes computing easier for those with disabilities.

What 3D Touch Could Mean for Accessibility

www.macworld.com/article/2983642/ios/what-3d-touch-could-mean-for-accessibility.html

Sound

Speech Recognition in a Dictionary

Speech recognition has improved quite a lot since the early days when Siri was first introduced on the iPhone. Many apps take advantage of this fact, and one simple example is the Merriam-Webster Dictionary app. With this app, you can speak a word, and the app recognizes it and shows you the dictionary definition. Of course, you can also look up words by typing them. Another useful feature is that when you tap a small speaker icon, the app pronounces the word for you. To see and hear this app in action, watch the short video demo that I made for my Apps for Librarians and Educators online course.

Merriam-Webster Dictionary app

www.merriam-webster.com/dictionary-apps/android-ipad-iphone-windows.htm

Merriam-Webster Dictionary demo video

https://youtu.be/111PwdSHVgo

Apps for Librarians and Educators online course

http://apps4librarians.com

Reading Aloud for Children

Many children’s books for the iPad are being designed to take advantage of features such as audio narration and voice recording. An excellent example is the children’s storybook app The Pedlar Lady of Gushing Cross.

In this app, you can have the story read aloud to you while you follow the text, or you can turn off the read-aloud function and read it yourself or to your child. The read-aloud narration is available in English, Spanish, or French. The app also includes a very nice feature where you can record your own voice reading the book and save it in the app.

To see this app in action, watch a short demo video I made for my online course Book as iPad App.

The Pedlar Lady of Gushing Cross

http://moving-tales.com/the-pedlar-lady-of-cushing-cross

Pedlar Lady of Gushing Cross demo video

https://youtu.be/QAmHQq4uoGY?t=3m34s

Book as iPad App online course

http://apps4librarians.com/bookapps

You can probably imagine how these features open up the reading experience for different ages and abilities. The multiple language options are also useful to learners of those languages.

Reading Aloud for All Ages

Capti Narrator is a useful app for having text read aloud to you. With it you can save text from many different sources, such as websites, e-books from Project Gutenberg or Bookshare, and documents in your Dropbox, Google Drive, OneDrive, Instapaper, or Pocket accounts. You can make playlists of items you want to hear and then listen to them whenever you like. You can also follow along, reading the text and seeing each word highlighted as it’s read. The app is available for iPhone, iPad, and iPod Touch and also for Firefox on the Mac and any browser on Windows XP or later.

In the settings, there are several different narrator voices and several different languages to choose from. If you want to hear some articles written in English and some written in Spanish, you can select which English voice and which Spanish voice to use. I like “Tessa” from South Africa, and “Paulina” from Mexico. You can set the speed to a rate that’s most comfortable to listen to.
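
Capti’s own code isn’t public, but the underlying technique of reading text aloud in a chosen voice and at a chosen rate is available to any iOS app through Apple’s AVSpeechSynthesizer. Here is a minimal Swift sketch; the sample sentences are placeholders, and the language codes correspond to the South African English and Mexican Spanish voices mentioned above.

```swift
import AVFoundation

// A minimal text-to-speech sketch (iOS 7+). Not Capti's implementation.
let synthesizer = AVSpeechSynthesizer()

func readAloud(_ text: String, languageCode: String = "en-ZA", rate: Float = 0.45) {
    let utterance = AVSpeechUtterance(string: text)
    // "en-ZA" selects a South African English voice; "es-MX" selects Mexican Spanish.
    utterance.voice = AVSpeechSynthesisVoice(language: languageCode)
    utterance.rate = rate  // choose a comfortable listening speed (0.0-1.0, default 0.5)
    synthesizer.speak(utterance)
}

// Usage:
// readAloud("The quick brown fox jumps over the lazy dog.")
// readAloud("Buenos días.", languageCode: "es-MX")
```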

You can listen to formats such as Word documents, PDFs, HTML pages, DRM-free EPUB e-books, and more. It’s a free app for the basic features, and with the premium app you can get more voices. There is also an educational version that institutions can purchase and make available to their users.

This type of app is useful for many different audiences, such as anyone who listens while working out or driving, people who learn better through listening, people with print reading disabilities, people learning foreign languages (including ESL students), and more.

The voices do quite a good job of sounding natural, with the exception of running the title of the article into the first sentence. The app is also customized to work together with screen readers, such as VoiceOver for iOS or JAWS for Windows.

Learn more and see video demos on the developer’s website.

Capti Narrator

https://www.captivoice.com/capti-site

Bookshare

https://www.bookshare.org

Capti for Education

https://www.captivoice.com/capti-site/public/entry/education

Capti website

https://www.captivoice.com/capti-site

Audio Wayfinding for the Blind

BlindSquare is an iOS app that helps blind people navigate the world. It uses GPS and the compass to locate you; then it gathers information from Foursquare and OpenStreetMap to tell you what’s nearby, such as coffee shops or transit stops. It uses high-quality voices to tell you about interesting places and street crossings nearby. It’s available in several different languages and works in many locations worldwide. You can use it together with navigation apps to hear what’s nearby as you are navigating. Learn how it works by listening to the podcast demo on the AppleVis website. Android users may want to look at a similar app, Sendero GPS LookAround.

BlindSquare

http://blindsquare.com

Introduction and Overview of BlindSquare podcast

www.applevis.com/podcast/episodes/introduction-and-overview-blindsquare-updated

Sendero GPS LookAround

https://www.senderogroup.com/products/shopandroid.htm

Conversational Interfaces: Talk with Your Device

Google Now, Microsoft’s Cortana, and Apple’s Siri are the leading smart digital assistants. You can talk to them and quickly find information.

Google Now

https://www.google.com/landing/now

Microsoft Cortana

www.windowsphone.com/en-us/how-to/wp8/cortana/meet-cortana

Apple’s Siri

www.apple.com/ios/siri

Now there is a new device from Amazon, the Amazon Echo, which is similar, but not a mobile device. It’s a Bluetooth speaker that serves as a conversational smart assistant. It takes the form of a nine-inch-tall, black cylinder that you plug in to power and connect to your Wi-Fi network.

Amazon Echo

www.amazon.com/Amazon-SK705DI-Echo/dp/B00X4WHP5E

You can activate listening mode by saying “Alexa.” Then ask your question. You can speak in a normal voice from anywhere in the room, and she answers in a natural way, using a woman’s voice. She can understand you even when there is background noise, like music playing in the room, and she can recognize anyone without being previously trained. Of course if you ask something beyond what she can do, Alexa will say that she can’t find the answer, and sometimes she does mishear you. However, she works very well for the things she’s programmed to do, which are many.

You can ask her to play music from Amazon Prime streaming music or from other services like Pandora or iHeartRadio. You can ask her to play any radio station from around the world, streaming from TuneIn.com. You can ask for news headlines (NPR, BBC, and others), traffic, and weather; ask her to read audiobooks you’ve purchased from Audible; set timers and alarms; ask factual questions; ask for sports scores; and ask her to add things to your shopping list, to spell a word, to tell you when the next appointment is in your calendar, to tell you jokes, and more. She also has quite a few witty responses drawn from pop culture.

The Echo is getting very positive reviews, and Amazon is planning to add more features and capabilities, especially since it is offering ways for developers to create apps for the Echo.19

To see her in action, start by watching Amazon’s video demo. If you want to enjoy the fun side of her with pop culture references, watch the video by Jerry David, “The Truth behind Amazon Echo.”

Echo video demo

https://youtu.be/KkOCeAtKHIc

The Truth behind Amazon Echo

https://youtu.be/EaynIXcWvyM

As you think about what devices like this can mean for people with disabilities, most likely blind users will come to mind first. See “Getting to Know Alexa: Amazon Echo,” a report from Kathryn Wyeth at the Michigan Assistive Technology Program, who finds the Echo useful.

Getting to Know Alexa: Amazon Echo

www.miassisttech.org/mdrcat/index.php/getting-to-know-alexa-amazon-echo

Smart Hearing Aids, Controlled with Apple Watch

Hearing aids are improving, and now there are apps that help you control them. What’s even more convenient is the ability to control a hearing aid from your Apple Watch while your iPhone is tucked away in your bag or pocket.

Twenty-year-old Molly Watt, who was mentioned earlier in the section on haptic interfaces, is deaf and partially blind due to Usher Syndrome. Her website is devoted to raising awareness about it and helping others understand the challenges.

Molly Watt’s website

www.mollywatt.com

In June 2015, she wrote a post called “My Ears, My Eyes, My Apple Watch.”20 She had previously written about how the small screen of the Apple Watch works well for her since she has a small window of vision in one eye. As is the case for many blind or deaf people, her sense of touch is very acute, so the haptic taps of the watch work very well for her.

In this post she describes the experience of getting a new hearing aid, the ReSound LiNX2 smart hearing aid. It works together with the ReSound Smart app (Android and iOS) to allow discreet control of volume, switching between smart programs (settings optimized for different environments), finding your mislaid hearing aids, and more. This is one of many models of hearing aids that connect to your smartphone via Bluetooth.

ReSound LiNX2 smart hearing aid

www.resound.com/en-US/hearing-aids/resound-linx2#.VjmtYWC5Ml0

ReSound Smart app

www.resound.com/en-US/hearing-aids/apps/smart-app

She tells the story of how for the first time she was able to tell which direction various sounds were coming from, a huge advancement for safety.

She also discusses making a phone call to her father from the watch and hearing his real voice for the first time.

On leaving the ReSound Offices and within 15 minutes I had my Apple Watch set up and it was a real “WOW” moment when I made my first call to my Dad, via my Apple Watch, his voice came straight into my ears, he sounded different, so much clearer than before, it dawned on me, I’d never heard my Dad’s real voice before, my Mum, ever faithful support and chauffeur sat beside me sounded totally different, even I sounded different to myself, it was strange, very strange, hard to process but it made me feel so emotional that day, day one, I was experiencing so much, new things for the first time ever! I spoke into my Apple Watch talking to my Dad, it was quite amazing, my iPhone was safely tucked away in my bag.21

She sums up how the Apple Watch works so well for her, especially for safety and communication.

I guess to sum up to date, since 24 April when I received my Apple Watch I have found a completely new way of dealing with my everyday challenges. The watch allows me to get from A to B safely with Guidedog Unis using maps and taptics. My friends who have Apple Watch can get my attention using taptics even if I cannot see them, which is more often than not but by using taptics it alerts me, is comforting and keeps me safe and confident in situations where previously I may have felt vulnerable. More and more of my friends now have Apple Watch and we have developed some basic codes to communicate by messaging on the small screen and of course I’m alerted by the vibrating which is brilliant.22

Remember that people with hearing impairments are often portrayed as broken in some way and that even with advances like this, it’s still important to accommodate everyone’s needs with closed captioning, ASL interpreters, and the like. Don’t expect everyone to want to be “fixed” by new technologies.23

Smart Hearing and Augmented Hearing—Smart Listening Features for Headphones

In addition to “smart” hearing aids, there are also “smart listening systems,” such as the Soundhawk Scoop, designed for anyone who wants to boost their hearing in noisy environments.24 The system consists of a wireless earpiece, a wireless microphone, and a charging case, together with a mobile app. It streams music or phone calls from your phone, and you can customize what you hear in the environment around you, which means it can also be used to tune in to conversations in a noisy place. You can place the wireless microphone near whatever you want to hear; move far away, and it will stream the sound to you. Keep an ear on your baby, perhaps? To learn more, see the Soundhawk website.

Soundhawk

www.soundhawk.com

Another new item, called The Dash, mentioned earlier, is three kinds of devices in one. It’s (1) a set of Bluetooth earbuds for streaming music from a smartphone, (2) a music storage device so you can listen to music without your smartphone, and (3) a fitness tracking system, measuring heart rate, running pace, distance, and more.

The Dash

https://store.bragi.com

It turns out that the ear is a good place to measure vital signs. The ear doesn’t move around as much as your wrist does, and it can be used to measure heart rate, blood pressure, temperature, and blood oxygen levels (pulse oximetry).25 New kinds of devices are being developed to combine fitness tracking with high-end audio.

The Here Active Listening System from Doppler Labs is a set of earbuds designed to customize real-time live sounds in your environment. Instead of streaming music from your smartphone, it’s meant to be used at live music concerts and other situations where you want to customize what you are hearing in real time. See the video on the Kickstarter page to learn more.

Here Active Listening System Kickstarter page

https://www.kickstarter.com/projects/dopplerlabs/here-active-listening-change-the-way-you-hear-the/description

Analysts are predicting that the market for hearables will be large since this type of technology is advancing quickly and the prices are coming down. In the words of commentator Mike Elgan:

These devices will give us super hearing like a comic-book mutant superhero.

The ability to customize which sounds we hear from our environment and which we tune out will become far more powerful. We’ll be able to carry on clear conversations at loud concerts, or do the opposite—tune out nearby conversations and hear only the music.

We’ll be able to set a range for the sound we hear—only sounds generated within 10 feet of us, with everything else blocked out, for example. Or, when we choose, we’ll be able to block out all sound.

When somebody says something that we don’t catch, a quick gesture will give us “instant replay” on what was said. And we’ll be able to retroactively capture audio for posterity.26

This may bring to mind some dystopian science fiction scenarios. Have you seen the movie Her, where Joaquin Phoenix’s character walks around with a smart earpiece that serves as his interface to an AI-based personal assistant?27 Or maybe you’ve seen the episode of the TV series Black Mirror called “The Entire History of You,” where everything is recorded all of the time and can be played back instantly.28

Like every new technology, the use of hearables has potential for both utopian and dystopian futures. On the positive side, one of the main reasons that people use any type of wearable device is to reduce the need to access their smartphones for every basic task. With wearables, it’s possible to blend technology more easily in your life without breaking the flow.

And for people who are just beginning to experience some hearing loss but don’t use hearing aids, these advances will be very helpful. Some estimates say that 16 percent of the world’s population suffers from hearing loss. For those over forty-five, there is a one in five chance that they have some hearing loss.29

Some people don’t want the stigma of being seen as “disabled” by wearing highly visible assistive devices like traditional hearing aids. Many of these new products are stylish, modern, and used by all kinds of people, something that helps them gain acceptance. For earbuds as fashion, see the Kickstarter campaign “OwnPhones—The World’s First 3D Printed Wireless Earbuds.” They even include a “jewelry collection” with earbuds made of silver, brass, or bronze or with gold plating. 3D printing makes these earbuds custom fitted to the shape of your ear.

OwnPhones: Wireless, Custom-Fit, 3D Printed Earbuds, Kickstarter page

https://www.kickstarter.com/projects/ownphones/ownphones-the-worlds-first-custom-fit-3d-printed-e/description

Sight

Camera as Seeing Eye for Language Translation

Smartphone cameras are being used in interesting ways beyond just taking photos. Having a camera in a mobile device means it can be used as a “seeing eye” of sorts, bringing visual information into an app for use. Here are some examples.

Google Translate (iOS and Android) uses the camera to take a photo of text in a language you don’t understand and gives you instant translation. For example, I lived in Budapest for a few months in the summer of 2015, and I don’t speak Hungarian. While shopping in a grocery store, I wanted to read the label on a package of frozen cherries—I wasn’t sure if they were pitted or not. Using Google Translate, I pointed my iPhone camera at the label, and immediately it showed the words pitted cherries superimposed over the Hungarian words on the label. That was extremely useful. It can translate without your needing to capture the photo—just by pointing your camera at the text. Google uses a technology for this called “deep neural nets,” which you can learn more about on the Google Research Blog.

See the World in Your Language with Google Translate, Google Official Blog

https://googleblog.blogspot.com/2015/07/see-world-in-your-language-with-google.html

How Google Translate Squeezes Deep Learning onto a Phone, Google Research Blog

http://googleresearch.blogspot.com/2015/07/how-google-translate-squeezes-deep.html

For more detailed translations, such as a long list of ingredients in tiny print, you can also capture the photo within the app, run your finger over the part where the words are, and save the translated text. It’s not 100 percent foolproof, but it’s close enough in many situations to be very useful.

The app currently recognizes twenty-seven languages, including English, French, German, Italian, Portuguese, Russian, Spanish, Bulgarian, Catalan, Croatian, Czech, Danish, Dutch, Filipino, Finnish, Hungarian, Indonesian, Lithuanian, Norwegian, Polish, Romanian, Slovak, Swedish, Turkish, and Ukrainian.

Camera As Seeing Eye for Tree Identification

Leafsnap (iOS) is a tree identification app that recognizes tree species from photos of their leaves. It was developed by Columbia University, the University of Maryland, and the Smithsonian.

When you see a tree that you wish to identify, put a leaf against a white background (carry a piece of paper on your hike) and snap a photo with the app. It matches the shape of the leaf against its database to recognize the tree and bring you to the entry. Like any good nature guide, it’s full of descriptive information about trees, including the seeds, flowers, buds, leaves, and more.

It currently includes trees found in the Northeastern United States and Canada. There is a separate version for trees in the United Kingdom. To see it in action, watch Leafsnap’s demo video.

Leafsnap

http://leafsnap.com

Leafsnap UK

https://itunes.apple.com/gb/app/leafsnap-uk/id877397884?mt=8

Leafsnap demo video

https://youtu.be/KCpR4JTEy4c

Camera as Seeing Eye with Currency Recognition for the Blind

LookTel Money Reader (iOS) instantly recognizes currency and speaks the denomination. This makes it easy for blind or visually impaired users to recognize and count bills. Point the camera at the bill, and it speaks the amount in real time. It also displays the denomination on the screen in high-contrast large numerals for those who have partial vision.

It recognizes 21 currencies, including the US dollar, Australian dollar, Bahraini dinar, Brazilian real, Belarusian ruble, British pound, Canadian dollar, euro, Hungarian forint, Israeli shekel, Indian rupee, Japanese yen, Kuwaiti dinar, Mexican peso, New Zealand dollar, Polish zloty, Russian ruble, Saudi Arabian riyal, Singapore dollar, and United Arab Emirates dirham.

To see it in action, watch the demo video.

LookTel also makes an app called LookTel Recognizer for recognizing everyday objects, such as those in your pantry or at the grocery store.

Another similar app that can scan text on any item and read it aloud is KNFB Reader for Android and iOS. It’s getting very positive reviews from blind users—just scan the item using VoiceOver commands, and it can recognize the text and read it to you. The app can also be used with a connected Braille display, handy for situations where you don’t want the text read out loud. See the KNFB Reader video demo.

LookTel Money Reader

www.looktel.com/moneyreader

LookTel Money Reader demo video

https://youtu.be/_HMVXEZNeNM

LookTel Recognizer

www.looktel.com/recognizer

KNFB Reader

www.knfbreader.com

KNFB Reader video demo

https://youtu.be/cS-i9rn9nao

Camera as Scanner

Scanbot is a document and QR code scanner (iOS and Android). Just open the app and use the camera to snap a photo of your document. It automatically recognizes the edges and offers tips to guide you in moving your phone to the best view of the document before it snaps. In addition to documents, you can scan barcodes and QR codes. It saves your documents as either PDFs or JPGs, and you can also save to various cloud services, like Dropbox, Evernote, or Google Drive. If you upgrade to Pro via an in-app purchase, you also get OCR (optical character recognition), password protection, the ability to add pages to existing scans, full-text search within your scans, and more.
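
Scanbot’s implementation isn’t public, but the edge-detection step that document scanners perform can be sketched with Core Image’s built-in rectangle detector. The function below is an illustrative assumption, not Scanbot’s code; a real scanner would follow detection with perspective correction and, for Pro users, OCR.

```swift
import UIKit
import CoreImage

// A sketch of the edge-detection step a document scanner typically performs (iOS 8+).
func detectDocumentEdges(in image: UIImage) -> CIRectangleFeature? {
    guard let ciImage = CIImage(image: image),
          let detector = CIDetector(ofType: CIDetectorTypeRectangle,
                                    context: nil,
                                    options: [CIDetectorAccuracy: CIDetectorAccuracyHigh])
    else { return nil }

    // The most prominent rectangle (ideally the page) and its four corner points.
    return detector.features(in: ciImage).first as? CIRectangleFeature
}

// The detected corners (topLeft, topRight, bottomLeft, bottomRight) can then be fed
// to a perspective-correction filter such as CIPerspectiveCorrection before OCR.
```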

It’s very handy to have a scanner in your pocket at all times. Students often use this app to scan printed worksheets and other files in order to be able to access them from anywhere.

See a video demo on Scanbot’s home page.

Scanbot

https://scanbot.io/en/index.html

Augmented Reality for Understanding Chemistry or Human Anatomy

Elements 4D (Android and iOS) is an app that works together with a set of wooden blocks or paper cubes you can make yourself (with a downloadable PDF template) to show visually what happens when you combine two elements from the periodic table.

Open the app and point your device toward an element on the block, add a second element, and move them together until they touch (in your augmented view). Then see what is created from your combination in the augmented view on your device. For example, hydrogen combined with oxygen results in water. Look at the Elements 4D website to see how it works.

You can download lesson plans from the developer’s website—with options for elementary school, middle school, or high school.

This developer also makes an app for learning human anatomy using augmented reality: Anatomy 4D (Android and iOS).

Elements 4D

http://elements4d.daqri.com

Elements 4D Chemistry lesson plan, grades 9–12

http://daqri-elementsweb.s3.amazonaws.com/lesson_plans/E4D_LessonPlan_HS.pdf

Anatomy 4D

http://blog.daqri.com/anatomy-4d-changes-the-way-we-learn-about-the-human-body

Anatomy 4D video demo

https://youtu.be/ITEsxjnmvow

These apps that use the camera in interesting ways are enhancing learning for students of all ages.

Summary

As you’ve seen in these examples, NUIs are using unique features of mobile devices to make interacting easier. This can open up computing to new audiences, such as the very young, the very old, or people with disabilities.

NUIs won’t completely replace GUIs, just like personal computers didn’t replace mainframes. Mainframes are still used widely in large organizations for banking, finance, health care, insurance, government, and more. Even though transactions are going mobile, they are often powered by mainframes on the back end.30 In the same way, GUIs will continue to be used for popular desktop applications, in office settings, and more.

But the use of NUIs on mobile devices is one of the main reasons that education and access to information are becoming easier for all ages and abilities. It’s not only the mobility of the devices that matters, but also their human-centered interfaces.

Notes

  1. WhatIs.com, s.v. “natural user interface (NUI),” accessed November 4, 2015, http://whatis.techtarget.com/definition/natural-user-interface-NUI.
  2. “The NUI may represent a revolution in computing, not because it replaces existing ways of interacting with computers, but because it enables computing to expand into new niches that could be of tremendous size and importance. Like previous interfaces, the NUI draws its power from reducing interface learning cost.” From Daniel Wigdor and Dennis Wixon, Brave NUI World: Designing Natural User Interfaces for Touch and Gesture (Burlington, MA: Morgan Kaufmann, 2011), Kindle locations 639–41, www.amazon.com/Brave-NUI-World-Designing-Interfaces-ebook/dp/B0058MX59I.
  3. Lennard J. Davis, Enforcing Normalcy: Disability, Deafness, and the Body (New York: Verso, 2014), www.amazon.com/Enforcing-Normalcy-Disability-Deafness-Body-ebook/dp/B00K4BA7O4.
  4. Encyclopedia of Science Fiction, s.v. “cyborgs,” April 10, 2015, www.sf-encyclopedia.com/entry/cyborgs.
  5. Natalie Healey, “Step by Step: The ReWalk Motorized Exoskeleton,” interview with Mukul Talaty, Medical Device Developments, November 24, 2014, www.medicaldevice-developments.com/features/featurestep-by-step---the-rewalk-motorised-exoskeleton-4447524.
  6. Rose Eveleth, “The Exoskeleton’s Hidden Burden,” The Atlantic, August 7, 2015, www.theatlantic.com/technology/archive/2015/08/exoskeletons-disability-assistive-technology/400667.
  7. Bridging Apps, accessed January 15, 2016, http://bridgingapps.org. The homepage describes the site’s purpose as “Bridging the gap between technology and people with disabilities.”
  8. Julie Melton Smith, “Susan’s Success Story,” Bridging Apps: Success Stories, August 29, 2015, http://bridgingapps.org/2015/08/susans-success-story.
  9. Michael McWatters, “Colin’s Success Story,” Bridging Apps: Success Stories, June 15, 2015, http://bridgingapps.org/2015/06/colins-success-story.
  10. To learn more about the Tecla Shield, see Julie Melton Smith, “Tecla Shield by Komodo,” Bridging Apps, July 17, 2014, http://bridgingapps.org/2014/07/tecla-shield-komodo.
  11. Julie Melton Smith, “Todd’s Success Story,” Bridging Apps: Success Stories, September 2, 2014, http://bridgingapps.org/2014/09/todds-success-story; Stabelfeldt, “iLove Stories of Independence.”
  12. Julie Melton Smith, “Connor’s Success Story,” Bridging Apps: Success Stories, January 27, 2014, http://bridgingapps.org/2014/01/connors-success-story.
  13. See, for example, reviews in “Brain and Nervous System Pro III,” 3D4Medical, accessed January 15, 2016, http://applications.3d4medical.com/brain_nervous.php.
  14. Download or stream the podcast: David Woodbridge, “Apple Watch 101: A Live Demonstration of Getting Walking Directions from the Watch and Apple Maps,” MP3 file, AppleVis, May 1, 2015, www.applevis.com/podcast/episodes/apple-watch-101-live-demonstration-getting-walking-directions-watch-and-apple-maps.
  15. Molly Watt, “My Ears, My Eyes, My Apple Watch,” Molly Watt: Living with Usher Syndrome (blog), June 6, 2015, www.mollywatt.com/blog/entry/my-ears-my-eyes-my-apple-watch.
  16. To learn more about the Force Touch, see Élyse Betters, “What Is Force Touch? Apple’s Haptic Feedback Technology Explained,” Pocket-lint, March 11, 2015, www.pocket-lint.com/news/133176-what-is-force-touch-apple-s-haptic-feedback-technology-explained.
  17. Kelsey Campbell-Dollaghan, “The iPhone’s New 3D Touch Is the Future of Apple User Interaction: Here’s How It Works,” Gizmodo, September 9, 2015, http://gizmodo.com/the-iphones-new-3d-touch-is-the-future-of-user-interact-1729628006.
  18. David Curry, “This 3D Touch-Style Tech Could Mean Android Phones Will Get Pressure-Sensitive Screens,” Digital Trends, October 7, 2015, www.digitaltrends.com/mobile/synaptics-3d-touch-for-android.
  19. David Pogue, “Amazon’s Echo Brings the ‘Star Trek’ Computer to Your Home,” Yahoo Tech, July 16, 2015, https://www.yahoo.com/tech/amazons-echo-brings-the-star-trek-computer-to-124102850474.html; “Echo Apps & Skills Are Coming,” Love My Echo, June 29, 2015, http://lovemyecho.com/2015/06/29/echo-apps-skills-are-coming.
  20. Watt, “My Ears, My Eyes, My Apple Watch.”
  21. Ibid.
  22. Ibid.
  23. David Peter, “The Hearing Monoculture Rejects Those Who Can’t Hear,” Model View Culture, issue 17, February 24, 2015, https://modelviewculture.com/pieces/the-hearing-monoculture-rejects-those-who-cant-hear.
  24. Ariel Schwartz, “Boost Your Ears to Superhuman Levels with These Cyborg Ears,” Fast Company, June 24, 2014, www.fastcoexist.com/3032226/healthware/boost-your-ears-to-superhuman-levels-with-these-hearing-aids-for-people-who-can-h.
  25. Rachel Metz, “Using Your Ear to Track Your Heart,” MIT Technology Review, August 1, 2014, www.technologyreview.com/news/529571/using-your-ear-to-track-your-heart.
  26. Mike Elgan, “New Earbuds Give You Super-hearing,” Computerworld, accessed August 17, 2015, www.computerworld.com/article/2971267/wearables/new-earbuds-give-you-super-hearing.html.
  27. IMDb listing for the film “Her,” www.imdb.com/title/tt1798709/.
  28. IMDb listing for the TV series “Black Mirror,” www.imdb.com/title/tt2089050/.
  29. Deborah Fountain Fugazy, “The Best of Tech—Hearables Are Changing Lives,” VMI News, VARTA Microbattery, September 24, 2015, www.varta-microbattery-usa.com/blog/hearables-the-necessary-wearables.-are-you-the-1-in-5?hsFormKey=2355e2e45548b94ea48f8d83af1fe3ed.
  30. Davey Alba, “Why on Earth Is IBM Still Making Mainframes?” Wired, January 13, 2015, www.wired.com/2015/01/z13-mainframe.
