While we often blame technology for being at the root of many evils, there are many ways tech, and specifically the Internet of Things, can help us.
One of the most promising is Alexa’s new “Show and Tell” skill. Users can hold an object up in front of an Amazon Echo Show, and the AI will identify the object.
Show and Tell
We all remember Show and Tell in school. It was often the highlight of the day when you or a classmate got to bring a prized possession into school and share it with your class.
Alexa’s Show and Tell works a little differently and serves a different purpose. Instead of sharing a prized possession with your classmates, you’re sharing it with Alexa, and the AI will identify the object for people who are vision-impaired.
Can’t tell the difference between a can of chicken noodle soup and a can of bean soup? Alexa can tell you which is which. Toothpaste or hemorrhoid cream? Alexa can tell you that, too.
To use the Alexa skill, users hold up the item in front of the Echo Show camera and ask, “Alexa, what am I holding?” Alexa’s artificial intelligence will identify the object.
“The whole idea for Show and Tell came about from feedback from blind and low-vision customers,” explains the head of Amazon’s Alexa for Everyone team, Sarah Caplener.
“We heard that product identification can be a challenge and something customers wanted Alexa’s help with. Whether a customer is sorting through a bag of groceries or trying to determine what item was left out on the counter, we want to make those moments simpler by helping identify these items and giving customers the information they need in that moment.”
The World Health Organization estimates that around 1.3 billion people across the globe live with some degree of vision impairment.
Josh Miele, principal accessibility researcher on the Alexa for Everyone team, conducts product research with blind Amazon users to help them live better lives with Amazon’s products, and his research helped shape Show and Tell.
Using Show and Tell
Show and Tell is already available on first- and second-generation Echo Show devices. Confusingly, it’s not available on more recent generations of the Show. Normally in the tech world, the earliest generations are the last to get a new feature or skill, though admittedly, the earlier models are larger and much more expensive.
Users of first- and second-gen Shows only need to hold an object up to their device’s camera and ask, “Alexa, what am I holding?” to start using the skill. It’s as simple as that.
Stacie Grijalva lost her sight at 41 and now explores ways assistive technology can help the visually impaired. “My job is to help people with visual impairments see how technology can affect people’s lives and make them feel better about what they do on a day-to-day basis,” she said.
She finds Show and Tell to be a “tremendous help and a huge time-saver.” She likes that she can “do it on my own by just asking Alexa.”
Do you know a visually impaired person who could be helped by Amazon’s Show and Tell skill? Have you struggled with this situation yourself? Tell us about your experiences in the comments section below.
Get the best of IoT Tech Trends delivered right to your inbox!