{"id":23195,"date":"2019-01-24T12:12:35","date_gmt":"2019-01-24T12:12:35","guid":{"rendered":"https:\/\/design.ncsu.edu\/graphic-design\/2019\/01\/24\/increasing-accessibility-with-machine-learning\/"},"modified":"2023-02-04T16:17:44","modified_gmt":"2023-02-04T21:17:44","slug":"increasing-accessibility-with-machine-learning","status":"publish","type":"post","link":"https:\/\/design.ncsu.edu\/graphic-design\/2019\/01\/24\/increasing-accessibility-with-machine-learning\/","title":{"rendered":"Increasing Accessibility with Machine Learning"},"content":{"rendered":"
\"mapping<\/a>
Students constructing accessibility matrix of Watson tools with the IBM Accessibility and Watson Health team.<\/em><\/figcaption><\/figure>\n

\u201cWith any new tool or technology, it is important to have an awareness of the possibilities, the affordances it provides to the designer,\u201d says Ellis Anderson, a graduate graphic design student who was recently part of Associate Professor of Graphic Design Helen Armstrong<\/a>\u2019s GD 503 Graduate Graphic Design Studio. Armstrong\u2019s students spent the semester researching and experimenting with machine learning (ML), partnering with IBM Watson Health<\/a> to learn more about the cutting-edge technology.<\/p>\n

\u201cThe theme of the studio was machine learning and accessibility, so working with IBM Watson Health made perfect sense,\u201d says Armstrong. To help students dig deeper into emerging technologies, Armstrong reached out to Kim Holmes, Visual and UX Designer at IBM. With Holmes\u2019 help, a collaboration between the graduate studio and IBM Watson Health was established. Together, Armstrong and Holmes developed a design prompt that challenged students to consider \u201chow might an interface harness the capabilities of machine learning to respond to impairment as a blind or visually impaired or deaf or hard of hearing user completes a specific task?\u201d<\/p>\n

\u201cThe IBM team generously gave their time to support the students during this project, and the students greatly appreciated their input.\u201d\u2014Helen Armstrong<\/p><\/blockquote>\n

The class split into two groups, with each designing an interface and device that would use ML to remove barriers of access for individuals with vision or hearing impairment. The groups interviewed individuals who are blind or visually impaired (BVI) or deaf or hard of hearing about what their biggest \u201cpain points\u201d were and then brainstormed how ML might mitigate these problems.<\/p>\n

Graduate student Jessye Holmgren-Sidell<\/a> says that interviewing potential users and user testing prototypes was a crucial step in the design process. One of the IBM team members who worked with students was profoundly deaf and \u201cmentioned that she did not want a device that constantly required her to look at her phone while having conversations with her colleagues. She wanted to be \u2018technology free\u2019 in the moment,\u201d says Holmgren-Sidell. The team based their design around this idea, creating Here-U, a discreet watch-like device that users tap on to signal that they cannot hear. The Here-U works in tandem with the user\u2019s existing hearing aid or implant and adjusts sound settings based on user feedback, learning the environment in real time while also storing the information for future use. Additionally, users can allow their data to be shared online, allowing other Here-U users to benefit from their experience.<\/p>\n

\"Team<\/a>
A student team delivers its <\/em>final presentation at IBM.<\/em><\/figcaption><\/figure>\n

The second group of students created a device called NICO, which uses image recognition technology to detect spills in a user’s kitchen. The students came to this idea after conducting interviews that revealed the difficulty individuals who are BVI face in maintaining a clean kitchen. Without NICO, an individual who is BVI detects spills by running their hands over the countertop, which can be sticky or unsanitary.<\/p>\n

To address this problem, the students designed a pair of eyeglasses that scan the user\u2019s kitchen, audibly alerting the user to any spills via bone conduction technology. Only the user can hear what is being said, allowing for complete privacy and the possibility of eventually extending use beyond the kitchen. Using image recognition technology and a pool of data collected from all NICO users, NICO uses ML not just to notify users of a spill but to identify it as well.<\/p>\n

Students designed both assistive devices based on the concept that the more they are used, the more they improve. \u201cWhat\u2019s interesting about ML is that it has the ability to reshape itself to an individual user or group of users through repeated interaction. This opens the door to all sorts of applications. As burgeoning, critical designers, it is our responsibility and privilege to explore these relatively uncharted territories,\u201d says Anderson. ML isn\u2019t a far-out sci-fi concept; it is an emerging technology that designers can use to improve quality of life.<\/p>\n

Working with the IBM team helped students think big but also pragmatically. Anderson says their \u201cfeedback and advice helped rein in our ideas a bit, not as a limitation but more as a logistical framework. Partnering with an organization like IBM is valuable because it lets students catch a glimpse of how the professional field approaches the technology. It\u2019s no longer abstract, it\u2019s practicable<\/em>.\u201d<\/p>\n

Armstrong was grateful for the collaboration with IBM\u2014\u201cThe IBM team generously gave their time to support the students during this project, and the students greatly appreciated their input.\u201d Participating IBM team members included Kim Holmes, M.E. Miller [MID \u201812], Alexandra Grossi [MGD \u201817], Clara MacDonell [BGD \u201816], Kevin Schultz, and Jason Brown.<\/p>\n

\"Final<\/a>
Group photo of GD503 at IBM on final presentation day. Helen Armstrong on far right.<\/em><\/figcaption><\/figure>\n
\n

Staci Kleinmaier is a professional writer and photographer in Apex, North Carolina. She uses words and images to tell stories. To see her work, visit www.stacikleinmaier.com.<\/em><\/span><\/p>\n

This post was originally published<\/a> in College of Design Blog.<\/em><\/p>","protected":false,"raw":"[caption id=\"attachment_15898\" align=\"aligncenter\" width=\"800\"]\"mapping<\/a> Students constructing accessibility matrix of Watson tools with the IBM Accessibility and Watson Health team.<\/em>[\/caption]\r\n\r\n\u201cWith any new tool or technology, it is important to have an awareness of the possibilities, the affordances it provides to the designer,\u201d says Ellis Anderson, a graduate graphic design student who was recently part of Associate Professor of Graphic Design Helen Armstrong<\/a>\u2019s GD 503 Graduate Graphic Design Studio. Armstrong\u2019s students spent the semester researching and experimenting with machine learning (ML), partnering with IBM Watson Health<\/a> to learn more about the cutting-edge technology.\r\n\r\n\u201cThe theme of the studio was machine learning and accessibility, so working with IBM Watson Health made perfect sense,\u201d says Armstrong. To help students dig deeper into emerging technologies, Armstrong reached out to Kim Holmes, Visual and UX Designer at IBM. With Holmes\u2019 help, a collaboration between the graduate studio and IBM Watson Health was established. Together, Armstrong and Holmes developed a design prompt that challenged students to consider \u201chow might an interface harness the capabilities of machine learning to respond to impairment as a blind or visually impaired or deaf or hard of hearing user completes a specific task?\u201d\r\n

\u201cThe IBM team generously gave their time to support the students during this project, and the students greatly appreciated their input.\u201d\u2014Helen Armstrong<\/blockquote>\r\nThe class split into two groups, with each designing an interface and device that would use ML to remove barriers of access for individuals with vision or hearing impairment. The groups interviewed individuals who are blind or visually impaired (BVI) or deaf or hard of hearing about what their biggest \u201cpain points\u201d were and then brainstormed how ML might mitigate these problems.\r\n\r\nGraduate student Jessye Holmgren-Sidell<\/a> says that interviewing potential users and user testing prototypes was a crucial step in the design process. One of the IBM team members who worked with students was profoundly deaf and \u201cmentioned that she did not want a device that constantly required her to look at her phone while having conversations with her colleagues. She wanted to be \u2018technology free\u2019 in the moment,\u201d says Holmgren-Sidell. The team based their design around this idea, creating Here-U, a discreet watch-like device that users tap on to signal that they cannot hear. The Here-U works in tandem with the user\u2019s existing hearing aid or implant and adjusts sound settings based on user feedback, learning the environment in real time while also storing the information for future use. Additionally, users can allow their data to be shared online, allowing other Here-U users to benefit from their experience.\r\n\r\n[caption id=\"attachment_15900\" align=\"alignleft\" width=\"450\"]\"Team<\/a> A student team delivers its <\/em>final presentation at IBM.<\/em>[\/caption]\r\n\r\nThe second group of students created a device called NICO, which uses image recognition technology to detect spills in a user's kitchen. The students came to this idea after conducting interviews that revealed the difficulty individuals who are BVI face in maintaining a clean kitchen. 
Without NICO, an individual who is BVI detects spills by running their hands over the countertop, which can be sticky or unsanitary.\r\n\r\nTo address this problem, the students designed a pair of eyeglasses that scan the user\u2019s kitchen, audibly alerting the user to any spills via bone conduction technology. Only the user can hear what is being said, allowing for complete privacy and the possibility of eventually extending use beyond the kitchen. Using image recognition technology and a pool of data collected from all NICO users, NICO uses ML not just to notify users of a spill but to identify it as well.\r\n\r\nStudents designed both assistive devices based on the concept that the more they are used, the more they improve. \u201cWhat\u2019s interesting about ML is that it has the ability to reshape itself to an individual user or group of users through repeated interaction. This opens the door to all sorts of applications. As burgeoning, critical designers, it is our responsibility and privilege to explore these relatively uncharted territories,\u201d says Anderson. ML isn\u2019t a far-out sci-fi concept; it is an emerging technology that designers can use to improve quality of life.\r\n\r\nWorking with the IBM team helped students think big but also pragmatically. Anderson says their \u201cfeedback and advice helped rein in our ideas a bit, not as a limitation but more as a logistical framework. Partnering with an organization like IBM is valuable because it lets students catch a glimpse of how the professional field approaches the technology. It\u2019s no longer abstract, it\u2019s practicable<\/em>.\u201d\r\n\r\nArmstrong was grateful for the collaboration with IBM\u2014\u201cThe IBM team generously gave their time to support the students during this project, and the students greatly appreciated their input.\u201d Participating IBM team members included Kim Holmes, M.E. 
Miller [MID \u201812], Alexandra Grossi [MGD \u201817], Clara MacDonell [BGD \u201816], Kevin Schultz, and Jason Brown.\r\n\r\n[caption id=\"attachment_15899\" align=\"aligncenter\" width=\"800\"]\"Final<\/a> Group photo of GD503 at IBM on final presentation day. Helen Armstrong on far right.<\/em>[\/caption]\r\n\r\n
\r\n\r\nStaci Kleinmaier is a professional writer and photographer in Apex, North Carolina. She uses words and images to tell stories. To see her work, visit www.stacikleinmaier.com.<\/em><\/span>"},"excerpt":{"rendered":"

Helen Armstrong’s graduate graphic design studio at NC State Design collaborates with IBM Watson & IBM Accessibility to explore increased accessibility through machine learning.<\/p>\n","protected":false},"author":270,"featured_media":23196,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"_acf_changed":false,"source":"ncstate_wire","ncst_custom_author":"","ncst_show_custom_author":false,"ncst_dynamicHeaderBlockName":"","ncst_dynamicHeaderData":"","ncst_content_audit_freq":"","ncst_content_audit_date":"","ncst_content_audit_display":false,"ncst_backToTopFlag":"","footnotes":""},"categories":[1],"tags":[5],"class_list":["post-23195","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-uncategorized","tag-_from-newswire-collection-271"],"displayCategory":null,"acf":[],"_links":{"self":[{"href":"https:\/\/design.ncsu.edu\/graphic-design\/wp-json\/wp\/v2\/posts\/23195"}],"collection":[{"href":"https:\/\/design.ncsu.edu\/graphic-design\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/design.ncsu.edu\/graphic-design\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/design.ncsu.edu\/graphic-design\/wp-json\/wp\/v2\/users\/270"}],"replies":[{"embeddable":true,"href":"https:\/\/design.ncsu.edu\/graphic-design\/wp-json\/wp\/v2\/comments?post=23195"}],"version-history":[{"count":1,"href":"https:\/\/design.ncsu.edu\/graphic-design\/wp-json\/wp\/v2\/posts\/23195\/revisions"}],"predecessor-version":[{"id":24143,"href":"https:\/\/design.ncsu.edu\/graphic-design\/wp-json\/wp\/v2\/posts\/23195\/revisions\/24143"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/design.ncsu.edu\/graphic-design\/wp-json\/wp\/v2\/media\/23196"}],"wp:attachment":[{"href":"https:\/\/design.ncsu.edu\/graphic-design\/wp-json\/wp\/v2\/media?parent=23195"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/design.ncsu.edu\/graphic-design\/wp-json\/wp\/
v2\/categories?post=23195"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/design.ncsu.edu\/graphic-design\/wp-json\/wp\/v2\/tags?post=23195"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}