{"id":24994,"date":"2024-07-25T13:37:41","date_gmt":"2024-07-25T17:37:41","guid":{"rendered":"https:\/\/design.ncsu.edu\/graphic-design\/2024\/07\/25\/the-future-of-design-in-technology\/"},"modified":"2025-07-05T19:47:42","modified_gmt":"2025-07-05T23:47:42","slug":"the-future-of-design-in-technology","status":"publish","type":"post","link":"https:\/\/design.ncsu.edu\/graphic-design\/2024\/07\/25\/the-future-of-design-in-technology\/","title":{"rendered":"The Future of Design in Technology"},"content":{"rendered":"\n\n\n\n\n

Is AI here to take over our jobs? We asked two faculty members in the College of Design to explore the future of design within the realm of technology and to weigh some of the pros and cons of redefining their work in this endlessly transforming industry.

While the nature of design has always been to explore, question, ideate, iterate, prototype, test and test again, newly emerging technologies related to artificial intelligence (AI) have the potential to disrupt our current cycles of work and propose new roles for a future generation of students.

Helen Armstrong, a professor of graphic & experience design and director of graduate programs, has been researching in this space since 2015. One of the ways she’s preparing students at both the undergraduate and graduate levels for this shift is to encourage them to view their work as part of a larger system rather than as a one-off solution.

“When you design, you’re not designing something that finishes the moment your work is done,” she says. “The interfaces we’re designing today have to be able to interact with a whole ecosystem of other products within a spectrum of environments. These interfaces also have to anticipate and respond to user needs – before humans make requests. We are entering an era of anticipatory design. For example, a few years ago our students worked with an auto parts company to integrate AI into their retail spaces. One of the solutions was an anticipatory maintenance application that identified customer vehicles and then enabled members of the sales team to predict not only current maintenance needs but also near-future and far-future needs.”

Exploring user interface (UI) and user experience (UX) design means students must consider the end user of a product and the ways in which that user might interact with it – whether that’s the interface you use to access your mobile banking or the notification from your fridge reminding you to buy milk. Each of these elements can talk to the others and respond to human needs, and there’s an inherent agreement of trust that humans place in these systems.

“Our research has found that people tend to begin by over-trusting autonomous systems, but if they disagree with the system’s prediction, then that trust begins to erode very quickly,” Armstrong says. So how can designers build a bit of skepticism into a system and teach users more about how the machine might be making its predictions?


A lot of that connection boils down to an ever-increasing need to understand data and to consider the larger implications of working with tools backed by vast computing power.

Shawn Protz, a professor of architecture and digital technology, compares the arrival of AI to the earlier shift from hand-drafting to programs like AutoCAD.

“I understand that shift in process affected the outcomes in how we make architecture – moving from hand-drafting to the computer,” he says. “The computer was always part of my education – we used hand-drafting to learn how to move from 2D to 3D, but I don’t think it’s changed anything. The advances in BIM (building information modeling) garnered a lot of hype in the beginning, but now it’s just normal, expected use. It was hyped as something new, but it was really just an extension of what we’d always been doing in architecture. We’ve been using abstractions like plans and sections to understand complex spatial design, but it gave us the superpower of coordinating the way we draw and the way we think in three dimensions. I certainly had to learn some new software features, but everything just feels like an extension of what we’ve been trying to do this whole time. Even AI – I don’t see it as radically changing what we’ve already been doing for centuries.”

\"in
Yash Shah, a student in the Master of Advanced Architectural Studies (MAAS) program, created a workflow with the generative AI platform Midjourney to visualize future scenes of the Matunga Market in Mumbai as an example of a vibrant and flexible model of urbanism and food production.<\/figcaption><\/figure>\n\n\n\n

With the addition of AI, students and professionals will be able to redefine what it means to iterate on a design or a series of designs. Protz sees it as a way to augment work processes rather than remove part of the role. During a recent lecture at the College of Design, David Benjamin, founding principal at The Living and associate professor at Columbia’s GSAPP, shared his workflow of using machine learning in Grasshopper 3D to refine ideas and further his creative practice. “Benjamin sets up parameters, generates preliminary floor plans and can then work with his client to explore possibilities. So the work isn’t designing for you; it’s allowing you to see what’s unexpected and what’s missing, and it acts as another voice or agent rather than something that might be replacing you,” Protz says.

This use of AI as an early idea generator, with the designer refining the final product, is just one option for enhancing our current workflows. While AI helps us grasp complexity and can augment the creative process, it has the potential for unwanted consequences as well.

“We need to be considering what human skills we’re automating away versus what skills we could be shoring up through these systems,” says Armstrong. “Think about the GPS system in your car – its goal is to help you navigate from point A to point B. Often this is great, but sometimes you would prefer the system to teach you how to better navigate between spaces yourself. Recently I worked on a project with Dr. Matthew Peterson in which we were thinking about this concept in relation to human memory. We worked with the NC State Laboratory for Analytic Sciences to consider how generative AI might be used to help an intelligence analyst capture and remember vital information. We weren’t using AI to replace human memory but rather leveraging the technology to increase analysts’ natural memory capacities. Each time designers create an interface today, they make choices around automation that can either bypass certain human skills or team up with AI to expand those skills. That’s a very powerful position and a very heavy responsibility for designers right now,” she adds.

While the question of automation and future human abilities is one concern about AI, another deals with the larger unexpected ramifications of this technology. Issues of bias, misuse of data and even intense use of the earth’s natural resources all have implications in this field.

Armstrong sees her role as encouraging students to question their influence on the future of society. “I always tell my grad students – this is such a great time to be a student. We’re in the midst of figuring out what roles these systems can play in our future, and determining how we can build systems that support the future we want to live in,” she says.

\"caption
Students Hannah Faub, Randa Hadi, Harrison Lyman and Matt Norton in Armstrong\u2019s master of graphic & experience design class partnered with an auto parts company to explore how the company might ethically leverage machine learning to produce personalized, efficient, useful consumer interactions. The interface shows what a sales team member might see as an existing customer profile.<\/figcaption><\/figure>\n\n\n\n

Both she and Protz point to guiding resources that are keeping a watchful eye on the future of this technology. Guiding principles such as the AI Bill of Rights and advocates such as the AI Now Institute continue to bring meaningful discourse and governance to this topic.

“President Biden has recognized the potential impact unchecked technology advances could have on civil rights and democratic values, foundational principles to the United States of America. Based on this charge, the White House Office of Science and Technology Policy has identified five principles that should guide the design, use, and deployment of automated systems to protect the American public in the age of artificial intelligence. The Blueprint for an AI Bill of Rights is a guide for a society that protects all people from these threats – and uses technologies in ways that reinforce our highest values.”[1]

Armstrong encourages each of her students to read this document and to think of it as a design brief. “Every paragraph is something that designers could be thinking about and working on right now,” she says. She gives an example from the section on Notice and Explanation, which focuses on letting end users know that they may be impacted by the systems they use. It’s a design question – how do designers develop UX/UI designs that can meet the expectations outlined in the AI Bill of Rights?

Protz is also exploring the ethical implications of some of these technologies, especially considering those who wield the power behind these mighty datasets. While guiding principles such as the AI Bill of Rights encourage notice and explanation of the ways in which your data can be used, advocates such as the AI Now Institute point to the failure of big tech corporations to adhere to these “data minimization” models of accountability.[2] “This brings up all kinds of issues around privacy and bias that students need to be critically engaging with,” adds Armstrong.

Exploring some of these larger societal problems may be beyond the scope of architects, but there is more the profession could be doing to consider the ethical implications and use of these tools in professional practice. Just as we consider the ways in which the built environment affects our climate and landscape, we should also consider the intense energy and water consumption[3] that the data centers powering artificial intelligence require to churn out endless iterations of designs and visualizations.

Consistently pushing his students to be critical thinkers is part of Protz’s role as a professor. “I try to emphasize that all of these tools have a lineage and are part of a continuum of thinking,” he says.

“Think about what the tool allows you to do, what it limits you from doing, and what opportunities or frictions that creates.” He feels that a large part of his role in the college is embracing and questioning new digital tools, being willing to try every new thing that comes out, and “just play around with it.”

Armstrong feels that some of the shifts toward greater reliance on AI will be so subtle as to become invisible. “These capabilities are being unbundled and put into existing technologies. So it’s not like you go pick a special generative tool to get the output you want – they are being seamlessly incorporated into products so that we barely even know it’s happening.”

Understanding that these systems are predictive and have no true sense of truth or meaning is something Armstrong underscores as important to this work. A healthy dose of skepticism as we continue to work with these tools keeps us aware of both their immense potential and their limitations.

That hearkens back to Protz’s goal of helping students become critical thinkers. The College of Design is positioning its graduates not only to work with and explore the boundaries of new technologies such as AI, but also to step back and assess the ways in which these technologies have the power to shape our lives.

Both see AI as a huge shift in the means by which our work moves forward, with Armstrong likening the emergence of AI to the early days of the internet. The influence on the dissemination and processing of knowledge is profound, as is the ability to propagate false knowledge. Navigating that balance continues to be a space where designers can have influence in the future.

“As long as humans are part of the equation, designers will be needed. Machines cannot truly understand the human experience. Moving forward, we need to be designing interfaces that enable humans and machines to work together and engage thoughtfully with the strengths of both,” says Armstrong. “This space is only going to be more important as we move forward.”

This article first appeared in the May 2024 issue of Designlife magazine. Explore other articles from this issue.

Student Predictions

What future challenges do you think will affect your generation of designers?

“I think that designers are going to have to think about bigger problems. We’ll have to do more design thinking around systems that we didn’t think of as traditional design.” – Ellis

Story Contributors

Helen Armstrong is the director of the master’s in graphic & experience design (MGXD) program and a professor of graphic & experience design at NC State. Her research focuses on digital rights, human-machine teaming and accessible design. Her research partners have included IBM, Red Hat, REI, Advance Auto Parts, SAS Analytics, Sealed Air and the Laboratory for Analytic Sciences.

Shawn Protz is an assistant professor of architecture and digital technology. He has explored a range of subjects, from structural and environmental systems to digital representation and fabrication; past classes have covered design communication, building information modeling, climatic design, housing, inflatable architecture and tectonics. At NC State, Protz focuses on building a vibrant digital culture and developing coursework and research projects that explore emerging digital systems and materials.

Author: Christine Klocke is the director of communications and marketing at NC State University’s College of Design. She is a graduate of the University of North Carolina at Chapel Hill’s Hussman School of Journalism and Media.

Footnotes

1. “Blueprint for an AI Bill of Rights.” The White House, Office of Science and Technology Policy, n.d.
2. “Data Minimization as a Tool for AI Accountability.” AI Now Institute, April 11, 2023.
3. “The Climate Costs of Big Tech.” AI Now Institute, April 11, 2023.

This post was originally published in the College of Design Blog.