Design in the time of ChatGPT
A cautious human’s reflection
I’m a little wary of stepping into the AI-and-design debate. There are so many voices, and it feels on occasion like I’m wading into a rushing river armed with little more than a KeepCup and a mild caffeine dependency. The current is fast, the information contradictory, and the water is filled with Medium posts shouting all kinds of advice about how to navigate it (including this one!). In times of uncertainty we all grasp at predicting the future, but we’re far from great at it, let alone prophetic.
Still, it’s hard not to reflect on how AI is already shaping our work. I’ve used it a lot lately. I’ve relied on it to sharpen my grammar, summarise clumsy notes and bash out first drafts I can iterate from; I’ve even trauma-dumped on it during mental lows, and it’s been incredibly helpful. But I feel like something is missing.
What happens when everything we read is AI-generated? If the information is technically valid, does it matter who or what wrote it? On the one hand no, but something deeper says yes. There’s a physicality to human connection, even through writing. Something subtle and sensory happens when we connect with another person’s experience. AI can mimic the tone, but it doesn’t have its own human experience. We read more than just words.
After moving to Aotearoa New Zealand, I studied Environmental Planning and slowly embedded myself into my local community through coffee catch-ups, listening to people’s stories and observations, and through experiments in public space. I have become increasingly fascinated by temporary urbanism — essentially prototyping in public, with people. It’s messy, slow and social. Which, in essence, is the point.
Cities are complex systems. Understanding them and changing them requires not just insight, but consensus. Consensus doesn’t come through a prompt or a dashboard but through conversation, context, and trust. Designing in the public realm shouldn’t be transactional; it should be relational. This is, I suspect, where AI currently reaches its limits.
Design tasks that are craft-based, repeatable or heavily pattern-driven are already being soaked up by AI. We’re told junior roles are in jeopardy. The future, they say, belongs to those who can think systemically, synthesize complexity, and build meaningful relationships. In other words, to designers who can’t be easily automated.
Meanwhile, we’re told that working remotely from a beach in Bali is the dream. But the shine is coming off that particular digital nomad fantasy (The Guardian has thoughts). Partly, I think, because we’re waking up to how isolated work has become. Even creative collaboration is becoming transactional, done through tools and not relationships. We’re in meetings, not having conversations. We’re collaborating with avatars and emojis. It’s efficient but does it feel meaningful?
I’ve been reading KA McKercher’s Beyond Sticky Notes, and I’m reminded again of the kind of design work that matters most to me. The kind that builds trust, fosters participation and supports real conversations. I enjoy delving into and appreciating the complexity and nuance of real-world problem spaces. It’s the type of design and research that’s not so easily templated.
For some reason I often find myself on the techno-Luddite side of the fence, despite the fact that I’m truly fascinated by the potential of technology. I use Google Maps to drive to the supermarket, even if it’s embarrassingly close. Yes, I’ve outsourced some of my own sense of direction to the machine. Am I becoming dumber? Possibly. Am I grateful when it avoids traffic? Definitely. I’m not against the tech; I think I’m just cautious about the trade-offs.
Some literature suggests AI could actually support participatory design by lowering barriers to engagement or helping facilitate workshops. Maybe it can. But I think there’s also a risk that we rely on biased models or overvalue AI outputs to the point that human creativity, disagreement and dialogue get flattened. Participation isn’t just process, it’s power-sharing, and that is something I don’t think we should let go of lightly.
These days, I notice where AI helps, and where it falters. It speeds up production, helps me sprint solo and handles some of the grunt work. But it doesn’t know how to build a relationship. It doesn’t know what matters to people. It can’t sense the pause before a hard truth or the subtle joy of a co-created idea. It doesn’t read body language. That’s still our job.
And so, as everything seems to be speeding up (towards what, I’m not sure), I’ve started to reflect more on what I want to hold on to and what’s important to me amid what I think is one of the biggest technological changes we have ever experienced:
To navigate complexity with care and tact
To build trust with people
To collaborate with people directly
To feel the work in my body (a friend of mine once said my whole body is a research tool)
AI is a tool, and a mighty one at that, but it’s neither the work nor the reason for it. Innovation happens in many ways, and not all of them are technological. So, I am remaining cautiously optimistic. But I’m also trying to stay grounded in what makes design human, and hoping that’s enough to hold onto.