Building AI products to scale

As more AI-first products come to market, companies increasingly face the challenge of industrialising and standardising the production of high-precision, robust, and enterprise-ready AI products.

Moving away from “classic” software development toward continuous learning pipelines raises new considerations: versioned datasets, annotation tools and workflows, and automated deployments to both the cloud and edge devices.
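
To make that shift a little more concrete, the sketch below shows one way a single step in such a pipeline might be described, with the dataset version pinned alongside the code and the deployment targets. The class and field names are purely illustrative and not taken from any of the panellists’ products.

```python
from dataclasses import dataclass, field
from typing import List


@dataclass
class PipelineStep:
    """Illustrative definition of one step in a continuous learning pipeline."""
    name: str
    dataset_version: str              # pinned, immutable snapshot of the training data
    annotation_schema: str            # version of the labelling guidelines / annotation tool output
    image: str                        # container image that runs the step
    command: str                      # entry point executed inside the container
    deploy_targets: List[str] = field(default_factory=list)  # e.g. ["cloud", "edge"]


# A retraining step pinned to a specific dataset snapshot, so the resulting
# model can always be traced back to exactly the data it was trained on.
retrain = PipelineStep(
    name="retrain-detector",
    dataset_version="snapshot-2019-03-01",          # illustrative version label
    annotation_schema="v3",
    image="registry.example.com/vision/train:1.4",  # hypothetical registry path
    command="python train.py --epochs 20",
    deploy_targets=["cloud", "edge"],
)
```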

On 7 March, Samsung NEXT Europe teamed up with Nyris, the AI-based object recognition engine, to host a panel of AI product experts discussing three key factors worth considering when scaling AI production.

The panel featured Valohai CEO Eero Laaksonen, Google Brain Germany’s Jakob Uszkoreit, Peltarion Deep Learning Researcher Justin Shenk, and BCG Digital Ventures Senior Data Scientist Nora Neumann.

Is standardisation helpful or harmful?
On this, the experts were divided. While Valohai and Peltarion are building products to help standardise AI development processes, Jakob of Google Brain felt that standardisation could become “too much of a good thing.” He suggested that over-architected infrastructure, built during a period of rapid technological change, risks slowing development down rather than speeding it up.

In a recent article, another Googler, Clemens Mewald, proposed a framework for deciding when to invest in standardised workflows and tooling and when to embrace a more loosely coupled, flexible infrastructure stack.

Build from naturally occurring data
Both Nora and Jakob recommended building products in areas where plenty of data occurs naturally. Referring to BCG Digital Ventures’ work validating business opportunities for large corporations, Nora explained that her company often has the luxury of access to years of data its clients have already gathered, which gives it a significant competitive edge over startups that lack the same kind of data backlog.

Regulations to consider
Valohai’s CEO Eero emphasised that GDPR and financial-sector regulations are already shaping the design of AI systems. With growing discussion of bias and increasingly stringent privacy laws, building auditable AI systems is becoming ever more important. The ability to version data and log the entire development cycle, from training to production, is crucial for reproducing experiments at any point in time.
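
As a rough sketch of what that kind of logging can look like in practice (generic Python using only the standard library, not Valohai’s actual implementation; all paths and field names are illustrative), a training run might write an auditable record tying together the data snapshot, code version, parameters, and results:

```python
import hashlib
import json
import subprocess
from datetime import datetime, timezone
from pathlib import Path


def dataset_fingerprint(path: Path) -> str:
    """Content hash of the training data, so the exact snapshot can be identified later."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()


def current_git_commit() -> str:
    """Commit hash of the training code; assumes the script runs inside a git checkout."""
    return subprocess.check_output(["git", "rev-parse", "HEAD"], text=True).strip()


def log_run(data_path: Path, params: dict, metrics: dict, out_dir: Path) -> Path:
    """Write a JSON record that makes the run auditable and reproducible later."""
    stamp = datetime.now(timezone.utc).strftime("%Y-%m-%dT%H-%M-%SZ")
    record = {
        "timestamp": stamp,
        "dataset_sha256": dataset_fingerprint(data_path),
        "code_commit": current_git_commit(),
        "params": params,
        "metrics": metrics,
    }
    out_dir.mkdir(parents=True, exist_ok=True)
    out_path = out_dir / f"run-{stamp}.json"
    out_path.write_text(json.dumps(record, indent=2))
    return out_path


# Example usage after a training run (values illustrative):
# log_run(Path("data/train.csv"), {"lr": 1e-3, "epochs": 20}, {"val_acc": 0.93}, Path("runs"))
```

With one such record per run, any model that reaches production can be traced back to the exact data snapshot and code commit that produced it.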


A big thank you to BCG DV for hosting the event! If you are in Berlin and want to stay up-to-date on future events, be sure to sign up to our newsletter!
