
Increasing Accessibility with Machine Learning

Students constructing accessibility matrix of Watson tools with the IBM Accessibility and Watson Health team.

“With any new tool or technology, it is important to have an awareness of the possibilities, the affordances it provides to the designer,” says Ellis Anderson, a graduate graphic design student who was recently part of Associate Professor of Graphic Design Helen Armstrong’s GD 503 Graduate Graphic Design Studio. Armstrong’s students spent the semester researching and experimenting with machine learning (ML), partnering with IBM Watson Health to learn more about the cutting-edge technology.

“The theme of the studio was machine learning and accessibility, so working with IBM Watson Health made perfect sense,” says Armstrong. To help students dig deeper into emerging technologies, Armstrong reached out to Kim Holmes, Visual and UX Designer at IBM. With Holmes’ help, a collaboration between the graduate studio and IBM Watson Health was established. Together, Armstrong and Holmes developed a design prompt that challenged students to consider “how might an interface harness the capabilities of machine learning to respond to impairment as a blind or visually impaired or deaf or hard of hearing user completes a specific task?”

“The IBM team generously gave their time to support the students during this project, and the students greatly appreciated their input.”—Helen Armstrong

The class split into two groups, with each designing an interface and device that would use ML to remove barriers to access for individuals with vision or hearing impairment. The groups interviewed individuals who are blind or visually impaired (BVI) or deaf or hard of hearing about their biggest “pain points” and then brainstormed how ML might mitigate these problems.

Graduate student Jessye Holmgren-Sidell says that interviewing potential users and user testing prototypes was a crucial step in the design process. One of the IBM team members who worked with students was profoundly deaf and “mentioned that she did not want a device that constantly required her to look at her phone while having conversations with her colleagues. She wanted to be ‘technology free’ in the moment,” says Holmgren-Sidell. The team based their design around this idea, creating Here-U, a discreet, watch-like device that users tap to signal that they cannot hear. The Here-U works in tandem with the user’s existing hearing aid or implant and adjusts sound settings based on user feedback, learning the environment in real time while also storing the information for future use. Additionally, users can allow their data to be shared online, allowing other Here-U users to benefit from their experience.

Team presentation at IBM
A student team delivers final presentation at IBM.

The second group of students created a device called NICO, which uses image recognition technology to detect spills in a user’s kitchen. The students came to this idea after conducting interviews that revealed the difficulty individuals who are BVI face in maintaining a clean kitchen. Without NICO, an individual who is BVI detects spills by running their hands over the countertop, which can be sticky or unsanitary.

To address this problem, the students designed a pair of eyeglasses that scan the user’s kitchen, audibly alerting the user to any spills via bone conduction technology. Only the user can hear what is being said, allowing for complete privacy and the possibility of eventually extending use beyond the kitchen. Using image recognition technology and a pool of data collected from all NICO users, NICO uses ML not just to notify users of a spill but to identify it as well.

Students designed both assistive devices based on the concept that the more they are used, the more they improve. “What’s interesting about ML is that it has the ability to reshape itself to an individual user or group of users through repeated interaction. This opens the door to all sorts of applications. As burgeoning, critical designers, it is our responsibility and privilege to explore these relatively uncharted territories,” says Anderson. ML isn’t a far-out sci-fi concept; it is an emerging technology that designers can use to improve quality of life.

Working with the IBM team helped students think big but also pragmatically. Anderson says their “feedback and advice helped rein in our ideas a bit, not as a limitation but more as a logistical framework. Partnering with an organization like IBM is valuable because it lets students catch a glimpse of how the professional field approaches the technology. It’s no longer abstract, it’s practicable.”

Armstrong was grateful to the IBM team for giving their time so generously throughout the project. Participating IBM team members included Kim Holmes, M.E. Miller [MID ‘12], Alexandra Grossi [MGD ‘17], Clara MacDonell [BGD ‘16], Kevin Schultz, and Jason Brown.

Final presentation at IBM HQ RTP
Group photo of GD503 at IBM on final presentation day. Helen Armstrong on far right.

Staci Kleinmaier is a professional writer and photographer in Apex, North Carolina. She uses words and images to tell stories. To see her work, visit www.stacikleinmaier.com.