
The Future of Design in Technology

How will technology continue to affect our role as designers? We interviewed two faculty in the College of Design to explore technology’s role as it relates to ideation, human-machine teaming and the impacts on our rights and resources.

Erisa Harris, a student in Protz’s Advanced Architectural Design Studio, used Unreal Engine to digitally recreate Icelandic turf houses, a near-extinct but promising example of biogenic building technology. Erisa used photos and the AI texture generator Poly to create the digital turf material.

Is AI here to take over our jobs? We asked two faculty in the College of Design to explore the future of design within the realm of technology and to delve into some of the pros and cons of redefining their work in this endlessly transforming industry.

While the nature of design has always been to explore, question, ideate, iterate, prototype, test and test again, newly emerging technologies related to Artificial Intelligence (AI) have the potential to disrupt our current cycles of work and propose new roles for a future generation of students.

Helen Armstrong, a professor of graphic & experience design and director of graduate programs, has been researching in this space since 2015. One of the ways she’s preparing students at both the undergraduate and graduate level for this shift is to encourage them to view their work as part of a larger system, rather than as a one-off solution.
“When you design, you’re not designing something that finishes the moment your work is done,” she says. “The interfaces we’re designing today have to be able to interact with a whole ecosystem of other products within a spectrum of environments. These interfaces also have to anticipate and respond to user needs—before humans make requests. We are entering an era of anticipatory design. For example, a few years ago our students worked with an auto parts company to integrate AI into their retail spaces. One of the solutions was an anticipatory maintenance application that identified customer vehicles and then enabled sales team members to predict not only current maintenance needs but also near future and far future needs.”
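
To make the anticipatory-maintenance idea concrete, here is a minimal sketch of how such a prediction could be framed in code. It is purely illustrative: the service intervals, mileage horizons and the Vehicle record are all invented, and the students’ actual application is not described at this level of detail.

```python
# Illustrative sketch only: a toy version of anticipatory maintenance logic.
# Service intervals, horizons and the Vehicle record are hypothetical.
from dataclasses import dataclass, field

# Hypothetical service intervals, in miles.
SERVICE_INTERVALS = {
    "oil change": 5_000,
    "brake pads": 40_000,
    "timing belt": 90_000,
}

@dataclass
class Vehicle:
    vin: str
    mileage: int
    last_service: dict = field(default_factory=dict)  # item -> mileage at last service

def anticipate_needs(vehicle, horizon_near=3_000, horizon_far=10_000):
    """Bucket each maintenance item as a current, near-future or far-future need."""
    needs = {"current": [], "near future": [], "far future": []}
    for item, interval in SERVICE_INTERVALS.items():
        miles_since = vehicle.mileage - vehicle.last_service.get(item, 0)
        miles_remaining = interval - miles_since
        if miles_remaining <= 0:
            needs["current"].append(item)
        elif miles_remaining <= horizon_near:
            needs["near future"].append(item)
        elif miles_remaining <= horizon_far:
            needs["far future"].append(item)
    return needs

if __name__ == "__main__":
    car = Vehicle(vin="EXAMPLE123", mileage=62_500,
                  last_service={"oil change": 58_000, "brake pads": 25_000})
    print(anticipate_needs(car))
```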

Exploring user interface (UI) and user experience (UX) design means students must consider the end user of a product and the ways in which that user might interact with it – whether that’s the interface you use to access your mobile banking or the notification from your fridge reminding you to buy milk. Each of these elements can talk to one another and respond to human needs, and there’s an inherent agreement of trust that humans place in these systems.

“Our research has found that people tend to begin by over-trusting autonomous systems, but if they disagree with the system’s prediction, then that trust begins to erode very quickly,” Armstrong says. So how can designers build a measure of skepticism into these systems and teach users more about how the machine arrives at its predictions?


Much of that trust relationship comes down to an ever-increasing need to understand data and to consider the larger implications of working with tools backed by vast computing power.

Shawn Protz, a professor of architecture and digital technology, compares the arrival of AI to the shift from hand-drafting to programs like AutoCAD.
“I understand how that shift in process – moving from hand-drafting to the computer – affected the outcomes in how we make architecture,” he says. “The computer was always part of my education – we used hand-drafting to learn how to move from 2D to 3D, but I don’t think it fundamentally changed anything. The advances in BIM (building information modeling) garnered a lot of hype in the beginning, but now it’s just normal, expected use. It was hyped as something new, but it was really just an extension of what we’d always been doing in architecture. We’ve long used abstractions like plans and sections to understand complex spatial design; BIM gave us the superpower of coordinating the way we draw with the way we think in three dimensions. I certainly had to learn some new software features, but everything just feels like an extension of what we’ve been trying to do this whole time. Even AI – I don’t see it as radically changing what we’ve already been doing for centuries.”

Yash Shah, a student in the Master of Advanced Architectural Studies (MAAS) program, created a workflow with the generative AI platform Midjourney to visualize future scenes of the Matunga Market in Mumbai as an example of a vibrant and flexible model of urbanism and food production.

With the addition of AI, students and professionals will be able to redefine what it means to iterate on a design or a series of designs. Protz sees it as a way to augment work processes rather than replace part of the designer’s role. During a recent lecture at the College of Design, David Benjamin, founding principal at The Living and associate professor at Columbia’s GSAPP, shared his workflow of using machine learning in Grasshopper 3D to refine ideas and further his creative practice. “Benjamin sets up parameters, generates preliminary floor plans and can then work with his client to explore possibilities. So the tool isn’t designing for you – it’s allowing you to see what’s unexpected and what’s missing. It’s another voice or agent, rather than something that might be replacing you,” Protz says.
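
The generate-and-review pattern Protz describes – set parameters, produce many options, then let the designer and client react – can be sketched in a few lines. The example below is not Benjamin’s Grasshopper 3D workflow; the parameters, candidate representation and scoring criteria are invented purely to illustrate the loop.

```python
# Illustrative sketch only: a generic generate-and-evaluate loop, not David
# Benjamin's actual Grasshopper 3D / machine learning workflow. Parameters,
# candidate representation and scoring are invented for clarity.
import random

def generate_candidate(rng):
    """Sample one hypothetical floor-plan option from a small parameter space."""
    return {
        "width_m": rng.uniform(8, 20),
        "depth_m": rng.uniform(8, 20),
        "window_ratio": rng.uniform(0.1, 0.5),  # glazed fraction of the facade
        "core_offset": rng.uniform(0.0, 0.4),   # how far the service core sits off-center
    }

def score(candidate):
    """Toy evaluation: reward daylight and floor area, penalize awkward proportions.
    A real workflow would use project-specific criteria (and possibly a learned model)."""
    area = candidate["width_m"] * candidate["depth_m"]
    proportion_penalty = 2 * abs(candidate["width_m"] - candidate["depth_m"])
    return area * candidate["window_ratio"] - proportion_penalty

def shortlist(n_candidates=200, keep=5, seed=42):
    """Generate many options and keep a handful for the designer to review."""
    rng = random.Random(seed)
    candidates = [generate_candidate(rng) for _ in range(n_candidates)]
    return sorted(candidates, key=score, reverse=True)[:keep]

if __name__ == "__main__":
    for option in shortlist():
        summary = {k: round(v, 2) for k, v in option.items()}
        print(summary, "score:", round(score(option), 1))
```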

This use of AI as an early idea generator in which the designer refines the final product is just one option for enhancing our current workflows. While AI helps us grasp complexity and can augment the creative process, it has the potential for unwanted consequences as well.
“We need to be considering what human skills we’re automating away vs. what skills we could be shoring up through these systems,” says Armstrong. “Think about the GPS system in your car – its goal is to help you navigate from point A to point B. Often, this is great, but, sometimes, you would prefer the system to teach you how to better navigate between spaces yourself. Recently I worked on a project with Dr. Matthew Peterson in which we were thinking about this concept in relation to human memory. We worked with the NC State Laboratory for Analytic Sciences to consider how generative AI might be used to help an intelligence analyst capture and remember vital information. We weren’t using AI to replace human memory but rather leveraging the technology to increase analysts’ natural memory capacities. Each time designers create an interface today, they make choices around automation that can either bypass certain human skills or team up with AI to expand those skills. That’s a very powerful position and a very heavy responsibility for designers right now,” she adds.
While the question of automation and future human abilities is one concern about AI, another deals with the technology’s larger, unexpected ramifications. Issues of bias, misuse of data and even the intense use of the earth’s natural resources all have implications for this field.

Armstrong sees her role as encouraging students to question their influence on the future of society. “I always tell my grad students – this is such a great time to be a student. We’re in the midst of figuring out what roles these systems can play in our future, and determining how we can build systems that support the future we want to live in,” she says.

Students Hannah Faub, Randa Hadi, Harrison Lyman and Matt Norton in Armstrong’s master of graphic & experience design class partnered with an auto parts company to explore how the company might ethically leverage machine learning to produce personalized, efficient, useful consumer interactions. The interface shows what a sales team member might see as an existing customer profile.

Both she and Protz point to guiding resources that are keeping a watchful eye on the future of this technology. Guiding principles such as the AI Bill of Rights and advocates such as the AI Now Institute continue to bring meaningful discourse and governance to this topic.
“President Biden has recognized the potential impact unchecked technology advances could have on civil rights and democratic values, foundational principles to the United States of America. Based on this charge, the White House Office of Science and Technology Policy has identified five principles that should guide the design, use, and deployment of automated systems to protect the American public in the age of artificial intelligence. The Blueprint for an AI Bill of Rights is a guide for a society that protects all people from these threats—and uses technologies in ways that reinforce our highest values.”1

Armstrong encourages each of her students to read this document, and to think of it as a design brief. “Every paragraph is something that designers could be thinking about and working on right now,” she says. She gives an example from the section on Notice and Explanation, which focuses on letting end users know that they may be impacted by the systems they use. It’s a design question – how do designers develop UX/UI designs that can meet the expectations outlined in the AI Bill of Rights?
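
As one way to make that design question concrete, the sketch below imagines a simple record an interface team might use to surface notice and explanation to an end user. It is hypothetical: the Blueprint sets expectations rather than prescribing an implementation, and every field name here is invented.

```python
# Illustrative sketch only: one possible structure for a "notice and explanation"
# message. The Blueprint for an AI Bill of Rights describes expectations, not an
# implementation; all field names and example values here are hypothetical.
from dataclasses import dataclass

@dataclass
class AutomatedDecisionNotice:
    system_name: str    # which automated system was involved
    purpose: str        # what the system was used for
    outcome: str        # what it decided or recommended
    key_factors: list   # plain-language reasons behind the outcome
    contact: str        # where a person can ask questions or request human review

    def to_plain_language(self) -> str:
        factors = "; ".join(self.key_factors)
        return (
            f"An automated system ({self.system_name}) was used to {self.purpose}. "
            f"Outcome: {self.outcome}. Main factors: {factors}. "
            f"To ask questions or request human review, contact {self.contact}."
        )

if __name__ == "__main__":
    notice = AutomatedDecisionNotice(
        system_name="Example maintenance recommender",
        purpose="suggest services for your vehicle",
        outcome="brake service recommended within 3,000 miles",
        key_factors=["mileage since last brake service", "typical wear interval"],
        contact="your local service desk",
    )
    print(notice.to_plain_language())
```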

Protz is also exploring the ethical implications of some of these technologies, especially in light of those who wield the power behind these massive datasets. While guiding principles such as the AI Bill of Rights encourage notice and explanation of the ways in which your data can be used, advocates such as the AI Now Institute point to the failure of big tech corporations to adhere to “data minimization” models of accountability.2 “This brings up all kinds of issues around privacy and bias that students need to be critically engaging with,” adds Armstrong.

Exploring some of these larger societal problems may be beyond the scope of architects, but the profession could be doing more to consider the ethical implications of these tools in professional practice. Just as we consider the ways in which the built environment affects our climate and landscape, we should also consider the intense energy usage and water consumption3 of the data centers that power artificial intelligence as it churns out endless iterations of designs and visualizations.

Consistently pushing his students to be critical thinkers is part of Protz’s role as a professor. “I try to emphasize that all of these tools have a lineage and are part of a continuum of thinking,” he says.

“Think about what the tool allows you to do, what it limits you from doing, and what opportunities or frictions that creates.” He feels that a large part of his role in the college is embracing and questioning new digital tools – being willing to try every new thing that comes out and “just play around with it.”

Armstrong feels that some of the shifts toward greater reliance on AI will be so subtle as to become invisible. “These capabilities are being unbundled and put into existing technologies. So it’s not like you go pick a special generative tool to get the output you want – they are being seamlessly incorporated into products so that we barely even know it’s happening.”

Understanding that these systems are predictive and do not have a true sense of truth or meaning is something Armstrong underscores as important to this work. Having a healthy dose of skepticism as we continue to work with these tools encourages us to be aware of both their immense potential and their limitations.

That hearkens back to Protz’s goal of helping students become critical thinkers. The College of Design is positioning its graduates not only to work with and explore the boundaries of new technologies such as AI, but also to step back and assess the ways in which these technologies have the power to shape our lives.

Both see AI as a huge shift in the means by which our work moves forward, with Armstrong likening the emergence of AI to the early days of the internet. The influence on the dissemination and processing of knowledge is profound, as is the ability to propagate false knowledge. Navigating that balance continues to be a space where designers can exert influence in the future.

“As long as humans are part of the equation, designers will be needed. Machines cannot truly understand the human experience. Moving forward, we need to be designing interfaces that enable humans and machines to work together to engage thoughtfully with the strengths of both,” says Armstrong. “This space is only going to be more important as we move forward.”

This article first appeared in the May 2024 issue of Designlife magazine. Explore other articles from this issue.

Student Predictions

What future challenges do you think will affect your generation of designers?

“I think that designers are going to have to think about bigger problems. We’ll have to do more design thinking around systems that we didn’t think of as traditional design.” – Ellis

Story Contributors

Helen Armstrong is the director of the master’s in graphic & experience design (MGXD) program and professor of graphic & experience design at NC State. Her research focuses on digital rights, human-machine teaming and accessible design. Her research partners have included IBM, Redhat, REI, Advance Auto Parts, SAS Analytics, Sealed Air and the Laboratory for Analytic Sciences.

Shawn Protz is an assistant professor of architecture and digital technology. He has explored a range of subjects spanning from structural and environmental systems to digital representation and fabrication; past classes have covered design communication, building information modeling, climatic design, housing, inflatable architecture and tectonics. At NC State, Protz focuses on building a vibrant digital culture and developing coursework and research projects that explore emerging digital systems and materials.

Author: Christine Klocke is the director of communications and marketing at NC State University’s College of Design. She is a graduate of the University of North Carolina at Chapel Hill’s Hussman School of Journalism and Media.

Footnotes

  1. “Blueprint for an AI Bill of Rights.” The White House, Office of Science and Technology Policy, n.d.
  2. “Data Minimization as a Tool for AI Accountability.” AI Now Institute, April 11, 2023.
  3. “The Climate Costs of Big Tech.” AI Now Institute, April 11, 2023.