How AI and UX are shaping each other


As AI becomes increasingly pervasive in our daily lives, new relationships are being forged between customer and product, data and desired outcomes, and the designers and engineers building the tools to merge the two.

At the October edition of Samsung NEXT Europe’s “Building Tech …” event in Berlin, we teamed up with visual search engine provider Nyris to host a series of talks by industry experts, followed by a panel discussion exploring important issues that arise when artificial intelligence (AI) intersects with user experience (UX).

The definition of AI is continuously changing
What exactly is AI?

“The truth is I don’t know the answer to that, but I do know that the answer changes all the time — and that’s the problem with artificial intelligence,” said Holger Eggert, a strategic experience designer with more than 20 years of experience working on UX design in large corporations as well as startups. “Fortunately, the definition of ‘user experience’ doesn’t change all the time.”

When applying UX principles to AI, Holger summed it up simply: “You want to make something useful, then you want to make something usable, and then you want to make something delightful.”

Ultimately, he said, it’s about using UX design to shift perceptions from AI as “magic” towards AI as an accessible, functional, and trustworthy technology.

Not all data will be meaningful
“When you ask your data scientists to develop a [machine learning] model, they tell you, ‘Give me the data,’” said Gunnar Valge, senior product manager for ML/AI at CRM software company Pipedrive. “So that’s what we did at our company, and we went down a funnel…”

Gunnar went on to explain that in such a scenario, data scientists start with tons of data, most of which they don’t understand. Of the data that is understood, only a fraction is relevant to solving the problem at hand.

Of the relevant data, he said, only part is reliable, and of the reliable data, only a fraction correctly predicts the outcome the model is meant to predict. From that data, only a small fraction actually impacts end-user value.
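The funnel Gunnar describes can be sketched as a series of multiplicative filters. The stage names follow his description, but the retention fractions below are hypothetical illustrations, not figures from the talk:

```python
# Illustrative sketch of the "data funnel": each stage keeps only a
# fraction of what survived the previous stage. The fractions are
# made-up examples, not numbers reported by Pipedrive.
funnel = [
    ("understood", 0.50),   # data the team can actually interpret
    ("relevant", 0.40),     # relevant to the problem at hand
    ("reliable", 0.50),     # clean and trustworthy
    ("predictive", 0.30),   # actually predicts the target outcome
    ("impactful", 0.20),    # moves end-user value
]

remaining = 1.0  # start with all collected data
for stage, kept in funnel:
    remaining *= kept
    print(f"{stage:>10}: {remaining:.1%} of original data remains")
```

Even with generous fractions at every stage, the compounding effect means only a sliver of the original data ends up driving end-user value, which is the point of the funnel metaphor.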

Designers and engineers need to work together
“Developing a common vocabulary and understanding is always a challenge — a very people-oriented challenge — so I always encourage engineers and product people to have an open dialogue pretty much at the inception,” said Appu Shaji, CEO of computer vision startup Mobius Labs.

Echoing a similar sentiment, BCG Digital Ventures lead experience designer Anna Sardini said that while working on LabTwin, an AI-powered digital lab assistant, it was often difficult for designers to understand the technical constraints or repercussions of working with AI. To solve this issue, Anna and the design team decided to involve the engineering team in their process in three different ways.

“First, we worked with them as experts. For example, when we needed to get data from users, we interviewed a data scientist to gain a better understanding of what kind of data should be collected to move the needle,” she explained. “Secondly, we worked with them as co-creators. So once we had ideas or concepts, we talked them through with the engineers to see how they might work or function. Thirdly, we worked with them as partners. In our case, we brought data scientists into the labs to observe how they would use the product.”

If you’re in Berlin and want to stay updated on future “Building Tech …” events, be sure to sign up for our newsletter!
