
The UX of AI: Why ‘Where’ Matters

By Nelson Taruc, Design Lead at Lextech

It’s not just about AI, it’s where you put it

On Oct. 3, 2024, OpenAI introduced canvas, a new interface for working with ChatGPT. As someone actively exploring the intersection of AI, design and code, I immediately gravitated toward the numerous demos of coding in canvas.

A great technical display? Yes. But the human experience of canvas is fundamentally flawed.

In an article on Apple Intelligence, Sebastiaan de With shared a sentence that resonated with me. It captures something I’ve known instinctively for years about the UX of AI:

“If you own the screen, you win.”

The AI tools that best understand your full context, and deliver their predictions exactly where they’re most useful in your coding workflow, will emerge as best in class.

It's not just about capabilities. It's where you deliver them.

In other words, the UX problem has nothing to do with what canvas can do. It has everything to do with where the AI work is being performed.

Or more precisely, where it is not done.

You see, canvas doesn’t currently work in Xcode, VS Code or any other IDE. It seems you’ll have to copy/paste or import/export to take advantage of canvas in your workflow.

As an AI designer, I find copy/paste from one app to another kludgy as heck. It’s the lowest-common-denominator integration you can offer. If your AI product requires extensive copy and paste to work with other tools, that’s a UX red flag.

The only way for tools like canvas to succeed is to “own the screen” (in this case, in an IDE) and deeply integrate these features into an existing code editor. VS Code seems the most likely candidate given OpenAI’s ties to Microsoft; it seems far less likely that Apple would allow such integration in Xcode for security reasons.

The other pathway for canvas is to become its own standalone IDE: code and publish inside a single app. That would eliminate the dependencies that come with integrating into another tool. I put the probability of this happening at less than 50 percent, but it’s an option.

A great example of AI “owning the screen” is the upcoming Swift Assist feature in Xcode. Changes the AI makes to your code appear instantly and can be previewed immediately in Xcode; no copy and paste required. You stay in the same tool as the AI, and it has access to the full context of your Xcode project (e.g. targets, dependencies and assets).

Granted, canvas is still in early beta. A lot can change. But until canvas finds a way to “own the screen,” its impact on development workflows will be limited. It’ll be great for simple coding explorations and experiments, but without tighter IDE integration it’ll be difficult for canvas to fit into a production-grade AI-assisted workflow.

The key takeaway: When it comes to the UX of AI, don’t just focus on functionality and “what” your product does. “Where” matters too!


