Game engines & the future of 3D worlds


Do we live in a simulation? Samsung Next's Iskender Dirik sat down with Dr. Danny Lange, Unity's SVP of AI & ML, to talk all things metaverse, gaming, simulations, and synthetic media.

What is Unity, and what exactly does your role there cover?

We are fundamentally a game engine company. We provide a platform for game development, but actually, we are more than that.

A game engine can be used for many other purposes. It's really a real-time 3D engine, an engine to create and drive 3D content. I run the AI team there, and our mission is to power the future of AI using our underlying engine.

Games have been used for decades to drive AI. And we have really refined the Unity engine and created an API so that you can push the boundaries of AI using our engine.

But it's actually a two-way street, because we also use AI to create content for the engine and to make games and other interactive experiences much more alive.

Could you give us some numbers about Unity to help us to understand your scale?

Yes, we have a fantastic developer environment. We have over 1.5 million developers on the Unity platform, developing applications for over 20 different device platforms. And the Unity engine was downloaded as a part of an application over three billion times in 2019.

How has Unity's game engine scaled outside of the video game industry? Which industries are being transformed by Unity?

A lot of industries. Think about your old-fashioned CAD design, sort of 2D blueprints. Take that into our world, and it becomes animated 3D content.

So if you want to design a new car, you can basically imagine it in Unity. You can use VR devices. You can design the car. You can virtually sit in it. You can see how the interior and the exterior of the car are going to look.

Go into architecture, and you can design a building and walk through it virtually. So think about the 2D CAD design becoming a 3D immersive experience.

Unity's game engine has become one of the most widely used technologies in industries like architecture, engineering, transportation, and many more. Why is that?

It's because you will rarely find anything more demanding than commercial game development. To build successful games, you have to check a lot of boxes: the game has to be highly performant, the graphics have to be beautiful, and it has to run on a variety of platforms. And then think about all the different genres, from first-person shooters to puzzle games, et cetera.

For a game engine, the bar is really, really high. And then you can take it into other areas and be highly competitive there, into fields where that same capability is basically disruptive to the technologies already being used.

At Samsung Next, we are excited about AI in general and about synthetic media in particular. How does AI help Unity users to build great games or simulations beyond gaming?

AI has a fantastic ability to assist humans, to be our extra brain when we create content. A lot of new techniques have come out over the last five to ten years that allow a creator to use AI to place content and create meaningful scenes in a more automated fashion, and to get AI assistance for things like texture creation.

We have some great examples of how to create very natural-looking fabric for interior designs and so on. So AI has really evolved to become a great creative assistant. The human is still the creator here, but the AI helps the user be more productive and create more lifelike results.

Do GANs, Generative Adversarial Networks, play an important role?

Absolutely. It's a remarkable technique for creating surfaces and textures. It can also be used to actually design and create scenes. GANs are one of the most exciting innovations of the last five years.
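For context, GAN is shorthand for the generator-versus-discriminator training scheme introduced by Goodfellow et al. in 2014: a generator G learns to produce samples while a discriminator D learns to tell generated samples from real ones, in the minimax game

$$\min_G \max_D \; \mathbb{E}_{x \sim p_{\text{data}}}\big[\log D(x)\big] + \mathbb{E}_{z \sim p_z}\big[\log\big(1 - D(G(z))\big)\big].$$

At equilibrium, the generator produces samples, here textures and surfaces, that the discriminator can no longer distinguish from real examples.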

We are very excited about synthetic media, and we see synthetic media as a third evolutionary stage of media. So, what do you think about synthetic media? Are you also excited about it and is it a topic for Unity and your work?

We're on the same page here. Games are inherently synthetic media. And as our graphics capabilities grow, as our compute grows, and as you embed AI into these systems, you get a more and more natural-looking, more and more realistic experience in these games.

Games actually become digital entertainment, they become synthetic media. What comes on top of that is the individual experience: when I go through a synthetic media-driven experience, it becomes unique. It becomes personalized to me. And you will have a different experience based on your interactions with the system.

There's another thing you are excited about, which is synthetic data. I know that you have been quite outspoken about the potential of synthetic data when it comes to overcoming bias in real-world data. Could you give us some context about that?

Sure. So, synthetic data is actually fantastic. For years, for decades, we have used real-world data to train AI systems, whether that is your shopping behavior or a computer vision system using supervised learning on hand-labeled imagery.

Moving into a world of synthetic data allows you to create much larger amounts of data, I would say practically an infinite amount of data at very low cost, and the data is perfectly labeled.

For the sake of our discussion today, let's just look at visual data. Instead of grabbing a camera and driving or walking down the street taking pictures of things, you can generate those objects, those visuals, in the Unity engine. And you can generate thousands, if not millions or billions, of images that are perfectly labeled, because it's the engine that created those images.
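As an illustration of that "perfectly labeled by construction" point, here is a minimal sketch in Unity C#: spawn a prefab at a random pose, let the frame render, save the image, and write out the pose the engine already knows as the label. The prefab, output folder, and label format are hypothetical placeholders; a production pipeline (Unity's Perception package, for example) adds proper sensor-level annotations, randomization, and standard dataset formats.

```csharp
using System.Collections;
using System.IO;
using UnityEngine;

// Minimal illustration: place an object at a random pose each frame, capture
// the rendered image, and record the ground-truth label the engine already knows.
public class SyntheticCaptureSketch : MonoBehaviour
{
    public GameObject targetPrefab;             // hypothetical prefab to photograph
    public int frameCount = 1000;
    public string outputDir = "SyntheticData";  // hypothetical output folder

    IEnumerator Start()
    {
        Directory.CreateDirectory(outputDir);
        for (int i = 0; i < frameCount; i++)
        {
            // Randomize the pose; because we set it, the label is exact by construction.
            Vector3 pos = new Vector3(Random.Range(-3f, 3f), 0f, Random.Range(2f, 8f));
            Quaternion rot = Quaternion.Euler(0f, Random.Range(0f, 360f), 0f);
            GameObject obj = Instantiate(targetPrefab, pos, rot);

            yield return new WaitForEndOfFrame();   // let the frame render

            // Save the rendered frame and its perfect label side by side.
            ScreenCapture.CaptureScreenshot(Path.Combine(outputDir, $"frame_{i}.png"));
            File.WriteAllText(Path.Combine(outputDir, $"frame_{i}.json"),
                JsonUtility.ToJson(new PoseLabel { position = pos, yaw = rot.eulerAngles.y }));

            Destroy(obj);
        }
    }

    [System.Serializable]
    public class PoseLabel { public Vector3 position; public float yaw; }
}
```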

So we have to remember that with machine learning, using real-world data amplifies real-world biases. You will train your system on what's already in the world, on what's in your world and in the pictures that you took in your location. It can be anything from an overrepresentation of adults walking down the street, so that your system will not be very good at recognizing children.

It may be something about skin color, it may be clothing, it may be hairstyles. So, just taking real-world data amplifies the biases in your data.

Moving to synthetic data, you can actually design a clean data set. You can control the distributions of body sizes, skin colors, et cetera, so that when you train your computer vision system, it is as unbiased as you can design it.

It moves the responsibility to the synthetic data designer, who has to be aware of how to ensure that the data is unbiased. But moving away from real-world data, which we all know is highly biased, is a good first step.
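To make "designing the distribution" concrete, here is a hedged sketch in Unity C#. The attribute lists and the SpawnPerson hook are hypothetical placeholders for whatever character-generation system a real project would use; the point is only that each attribute is sampled uniformly, so the balance of the training set is an explicit design decision rather than an accident of where the camera happened to be.

```csharp
using UnityEngine;

// Illustrative only: sample person attributes uniformly so every category is
// equally represented in the generated data set, rather than mirroring
// whatever mix the real-world footage happened to contain.
public class BalancedAttributeSampler : MonoBehaviour
{
    // Hypothetical attribute categories; a real project would drive a character system with these.
    static readonly string[] AgeGroups = { "child", "teen", "adult", "senior" };
    static readonly string[] SkinTones = { "type1", "type2", "type3", "type4", "type5", "type6" };
    static readonly string[] Clothing  = { "casual", "formal", "sportswear", "workwear" };

    public int samples = 10000;

    void Start()
    {
        for (int i = 0; i < samples; i++)
        {
            // Uniform sampling: every category has the same probability by design.
            string age   = AgeGroups[Random.Range(0, AgeGroups.Length)];
            string skin  = SkinTones[Random.Range(0, SkinTones.Length)];
            string cloth = Clothing[Random.Range(0, Clothing.Length)];

            SpawnPerson(age, skin, cloth);  // hypothetical hook into a character generator
        }
    }

    void SpawnPerson(string age, string skin, string clothing)
    {
        // Placeholder: a real implementation would configure and instantiate a rigged character here.
        Debug.Log($"person {age} / {skin} / {clothing}");
    }
}
```

A real pipeline would also log these sampled attributes as labels alongside each image, so the resulting balance can be audited after the fact.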

Is it fair to say that synthetic media and synthetic data are the solution to one of AI and machine learning's biggest challenges, which is training systems?

Absolutely. I mentioned that the mission of my team is to power the future of AI. And the way we do that is to create very easy-to-use APIs between AI systems and the Unity engine, so that you can create all the visual data that you need, and, I would say, cognitive data, like puzzles, and behavioral data, as we see in reinforcement learning.

We have created APIs so that it's easy to use the Unity engine to create extremely low-cost data in absolutely incredible amounts. Basically, creating easy access to training data is really democratizing AI, and that is our fundamental thesis.
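On the reinforcement-learning side, the API he is presumably describing is the Unity ML-Agents Toolkit, where a C# Agent subclass defines observations, actions, and rewards, while training runs outside the engine via the mlagents-learn Python trainer. Below is a minimal sketch using the 1.x-era API (method signatures differ slightly across releases); the reach-the-target task and reward values are made up for illustration.

```csharp
using UnityEngine;
using Unity.MLAgents;
using Unity.MLAgents.Sensors;

// Hedged sketch of a minimal ML-Agents agent (1.x-era API).
// The agent learns to push itself toward a target object.
public class ReachTargetAgent : Agent
{
    public Transform target;   // hypothetical goal object
    Rigidbody body;

    public override void Initialize()
    {
        body = GetComponent<Rigidbody>();
    }

    public override void OnEpisodeBegin()
    {
        // Reset with randomized positions so every rollout differs.
        body.velocity = Vector3.zero;
        transform.localPosition = new Vector3(Random.Range(-4f, 4f), 0.5f, Random.Range(-4f, 4f));
        target.localPosition    = new Vector3(Random.Range(-4f, 4f), 0.5f, Random.Range(-4f, 4f));
    }

    public override void CollectObservations(VectorSensor sensor)
    {
        // The engine is the source of perfectly accurate observations.
        sensor.AddObservation(transform.localPosition);
        sensor.AddObservation(target.localPosition);
        sensor.AddObservation(body.velocity.x);
        sensor.AddObservation(body.velocity.z);
    }

    public override void OnActionReceived(float[] vectorAction)
    {
        // Two continuous actions: force along x and z.
        body.AddForce(new Vector3(vectorAction[0], 0f, vectorAction[1]) * 10f);

        float distance = Vector3.Distance(transform.localPosition, target.localPosition);
        if (distance < 1.2f)
        {
            SetReward(1f);   // reached the goal
            EndEpisode();
        }
        else if (transform.localPosition.y < 0f)
        {
            EndEpisode();    // fell off the platform
        }
    }
}
```

In practice the agent also needs a Behavior Parameters component on the same GameObject, with observation and action sizes matching what the code reads and writes.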

What is the number one technical limitation that Unity, but also the gaming industry in general, faces today? And do you see that changing in the future?

I think there are several key limitations. One of them is basically creating as realistic-looking environments as possible. And that then drives a challenge around scale.

You want to have many objects in your synthetic data. You want to have many objects moving around. You want them to be natural looking. You want to have wind in there. You want to have different lighting conditions.

That's a huge challenge, but on the other hand, we are making huge progress, both on the software side, using AI to do this, and on the hardware side, where we're getting better and better hardware. So it is about pushing the limits of the capabilities of the hardware and the software all the time, and game engines are right out there on the frontier, constantly utilizing the latest graphics cards, the latest CPUs, et cetera.

We're constantly pushing the boundaries there, but it is really about natural-looking, realistic imagery and behavior at a massive scale.

Another concept we are super excited about is the idea of the metaverse. Can you describe what you consider to be the metaverse? How would you define it?

It's where the real world meets the virtual world. So if you think about AR systems, augmented reality systems, it's essentially this mixture of a virtual world, where virtual content is basically projected onto the real world with its existing physical content. It's where they meet and blend together. That, to me, is the metaverse.

And how is Unity already an enabling technology for the metaverse?

We are, because the use of virtual reality and augmented reality combined with AI creates this merged experience for our users.

Whether that is a car designer using VR goggles to have a merged experience of actually holding a steering wheel in a car, while at the same time using the goggles to change certain aspects of the design of the car.

It's where physical meets virtual. We're already doing that. And I think as we look into the future, we're going to see that scale: you're going to see how we can design buildings and create experiences in buildings.

An example I often think of is putting art on walls: when we wear our AR glasses, we can see personalized art on the walls of the buildings that we visit. This is coming. This is a very exciting technology.

In the future, we'll spend a significant amount of time in virtual realities. But how about ethics and laws? Can I do whatever I want on a virtual island, even if it's forbidden by law or not seen as ethical in the real world? How should we tackle these challenges?

That's a very interesting question. I think that we as humans, as a society, whether things are in the physical world or the virtual world, are going to uphold a general common set of ethics. We are going to require a certain level of proper behavior, whether it's virtual or real.

We see it today: there's already a virtual world in the sense of social media, where people post stuff. That stuff is no longer considered harmless, because it can impact elections. It can impact decision-making.

When you move into a virtual world, with synthetic media, into the metaverse, it gets even more complicated. And I think there will be rules around behavior, and we want that. We want to have some kind of society where there are rules for how we interact with each other, and some boundaries.

There are people out there who believe that we live in a simulation. So, perhaps in reality, Unity is much more developed than we all know at this point in time, and someone used it to simulate what we call life. Are we perhaps living in an early version of Unity? Do we live in a simulation?

I actually do think so. I do think that we live in a simulation. But don't think of this in a simplistic way, like an alien watching over a computer running the universe as a simulation.

But I do think that the laws of nature and the physical world that we see could very much be expressed as a simulation. Take quantum mechanics: the quantum behavior of time and space is not analog, it is discrete, which would fit very well with a simulation. Don't simplify this, but really consider it in a very abstract sense: physics itself may be a giant simulation.

What does the Unity developer ecosystem look like? Are there many Unity developers out there, and how do you find them if you want to build great stuff on Unity?

There are over a million developers on the Unity platform. At one point, Unity was one of the top 10 job titles on LinkedIn. Actually, LinkedIn changed the rules so that you cannot have a brand name as a job title any longer.

There are a lot of Unity developers across the world. For 15 years, it has been our mission to democratize game development. So there's been a constant focus on simple APIs and great tools to stimulate game development, and every team at Unity is really focused on lowering the barrier for developers.

We want to make it easy to create content, whether that is programmatically or using tools. And I think the sheer number of games being published every day on the Unity platform speaks to its usability and to the very, very successful ecosystem around Unity.

What are the skills you should look for when you search for people who can build that great stuff on Unity? Do you need people with an engineering background and experience in certain programming languages, or do you need creative people, game designers, or basically all of that?

It's a very diverse crowd. We see engineers. Our programming language is C#, which is kind of a Java-like language and fairly easy to learn.
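For a sense of how approachable that is, the canonical first Unity script is only a few lines of C#: a component you attach to any GameObject to make it spin.

```csharp
using UnityEngine;

// Attach to any GameObject: rotates it 90 degrees per second around the y-axis.
public class Spinner : MonoBehaviour
{
    public float degreesPerSecond = 90f;

    void Update()
    {
        transform.Rotate(0f, degreesPerSecond * Time.deltaTime, 0f);
    }
}
```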

We also see a lot of artists using our tools to create content. You have to look at what a game is: game development is a mixture of engineering work and engineering discipline in writing code and building the software. But without good art and without the involvement of artistic creators, it's not going to be a very exciting game to play. So we see a lot of that.

We also see people who are focusing on the operations and monetization of games. So, there's a wide range of roles involved in the Unity ecosystem.

Let's quickly speak about Europe. Why is Europe a great place for Unity?

I think it's because of the diversity. You have many different communities in Europe. Europe is an extremely diverse place that is very dense, with a broad set of different cultures. But at the same time, with the strength of the EU, you have seen a lot of cross-border interaction, a lot of collaboration. It's a very, very lively environment, and there are just so many creative people in Europe.

It's time for a shout-out. Who, in your opinion, is building particularly exciting stuff on Unity in Europe?

There are so many. And I'm now going to run the risk of mentioning someone and not all the others. So I'm going to say when it comes to AI, one of the most fascinating uses of Unity is done by DeepMind in London.

They're really advancing the field of artificial intelligence. Actually, their mission is artificial general intelligence. And they're using Unity as their primary vehicle for generating the very complex behaviors that they're training their AI systems on. I'll mention that one, so I don't have to get into game development and mention one studio or another. There's a lot of very, very interesting game development taking place right now.

Where in the area of enabling tech for the gaming and media industry do you see the biggest potential for venture capitalists?

I see it in the application space. And when I say application space, it's really broad. But if you look beyond gaming, at the use of the Unity engine in areas like engineering, robotics, construction, city planning, and complex spatial optimization, I think there are so many opportunities in there.

We have, at this point, only scratched the surface. I see it as my mission at Unity, and I've mentioned this several times, to enable this next generation of AI systems. And I think that will be a fantastic opportunity for founders and for VCs to get into: use the engine to build the real applications that we haven't seen yet, that we can barely imagine. That's where I would put my money.

Thank you, Danny, thanks a lot. It was very, very exciting and a pleasure.

The pleasure was mine too. And thank you for having me, it was very exciting.
