How Roblox is Using AI to Create Its Own Holodeck

This feature is part of AI Week. For more stories, including How AI Could Doom Animation and comments from experts like Tim Sweeney, check out our hub.


Although perceived by many as “that thing for kids with the chunky characters,” in the 17 years since Roblox first launched in September 2006, it has evolved into a sophisticated 3D game creation platform with 60 million daily users. There are more than 40 million different player experiences within Roblox right now, ranging in complexity from simple chat rooms to Call of Duty-style first-person shooters. Alongside the work Epic is doing with both Fortnite and Unreal Engine 5, Roblox is largely responsible for setting expectations around what’s possible with the much-fabled “metaverse,” and it has kickstarted the careers of thousands upon thousands of aspiring game makers.

The underlying technology of Roblox has been steadily evolving since the day it launched. The tools it provides range from simple graphical editors to a full-blown scripting language that allows creators to build virtually anything they can imagine. Powering all of that is a complex set of tools that now includes a growing number of AI-powered systems, making it simpler than ever to create and share ideas.

Roblox’s chief science officer, Morgan McGuire, has led the company’s research into AI since he joined in 2021. Previously, he was director of research at Nvidia, where he led work exploring the possibilities of augmented and virtual reality as well as its cloud-based “Hyperscale” graphics system. Prior to that, he was a professor of computer science at Williams College in Williamstown, MA, while also consulting for game engine company Unity Technologies, among others.

The following nuggets of AI-focused wisdom are taken from a much longer interview we recorded with McGuire during the 2023 Game Developers Conference. You can watch the full hour-long conversation, which goes into further detail on other aspects of Roblox, above.

Building games just by talking to the AI



What I didn’t see coming until about two years ago was that the advances in things like the chatbot models for AI also work for source code, and they work remarkably well.


Where eventually I would love to be – and this is our research plan – is at the point where you can give a page-long summary of what your game is about. Somewhere between what kids on a playground would say and what a professional game designer would write as the game design document, and then give it some reference art. “Here are some photos I like. This is the vibe I want.” The stuff you have concept artists for in a normal game studio. And then given all of that, it would give you a rough draft of an interactive experience.


I would love to get even the typing out of there. I think there’s all kinds of reasons, some of which are accessibility: not everybody can talk, and not everybody’s going to be comfortable, or in an office space where they can. What we’ve seen for human communication is that when we added facial tracking, you got so much more richness. We’ve been experimenting with hand tracking as well.


I think talking, hand gestures, facial expressions – when I’m working with a concept artist, the literal words I’m saying are only part of the message – the tone, the face, the gestures. I might be pulling up references and saying, “no, look at this thing on my phone, I want it more like this.” So I think all of that will come into the content creation tools. Tools that understand humans better are where it’s at. I think that’s where generative AI – the work with the transformer models, things like GPT-4 and ChatGPT that have been in the press a lot lately – is super exciting for content creation.

How AI helps with animation



There’s no suit, there’s no green screen. It just uses video to figure out what the character animation is, and then we map it to whatever character you have and we do all the blending stuff. It's like Hollywood motion capture, but instead of $20,000 a second, it's completely free and anyone can use it.
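As a rough illustration of the kind of pipeline McGuire is describing – and emphatically not Roblox’s actual implementation – here is a minimal Python sketch that pulls per-frame body landmarks out of ordinary video using the open-source MediaPipe pose estimator. A production system would then retarget those landmarks onto the player’s character rig and blend between clips.

```python
# Minimal sketch of video-based motion capture: extract per-frame body
# landmarks from plain video, no suit or green screen required.
# Assumes the open-source OpenCV and MediaPipe libraries; this is an
# illustration of the general technique, not Roblox's pipeline.
import cv2
import mediapipe as mp

def extract_pose_keyframes(video_path: str) -> list:
    """Return one list of (x, y, z) body landmarks per video frame."""
    keyframes = []
    cap = cv2.VideoCapture(video_path)
    with mp.solutions.pose.Pose(static_image_mode=False) as pose:
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            # MediaPipe expects RGB input; OpenCV decodes frames as BGR.
            result = pose.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
            if result.pose_landmarks:
                keyframes.append([(lm.x, lm.y, lm.z)
                                  for lm in result.pose_landmarks.landmark])
    cap.release()
    return keyframes

# A retargeting step (not shown) would map these 33 landmarks onto the
# joints of whatever character the player uses, then blend between clips.
keyframes = extract_pose_keyframes("performance.mp4")  # hypothetical file
print(f"captured {len(keyframes)} frames of animation")
```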



How Roblox uses AI to help with pathfinding in games



Our pathfinding, unlike most of the game engines I’ve worked on, is designed to scale to novice users as well as experts. You can take a world and you can say, “I want to get from here to there,” and you don’t have to handcraft a lot of stuff. It knows things like, “the character can swim. This thing looks climbable.” You don’t even have to flag it. It’ll realize those are stairs, or that’s a ladder, or that’s a wall I can climb.
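As a hypothetical sketch of the underlying idea – capability-aware pathfinding, where the agent’s abilities decide which terrain is traversable and at what cost – here is a plain A* search over a hand-labeled grid in Python. Roblox’s system goes further by inferring labels like “climbable” directly from geometry; the grid, costs, and capability names below are invented purely for illustration.

```python
# Hypothetical sketch of capability-aware pathfinding: A* over a labeled
# grid, where the agent's capabilities decide which cells it may enter.
import heapq

WALK, WATER, CLIMB = ".", "~", "#"   # "X" marks impassable cells
GRID = [
    "....~~....",
    "....~~..#.",
    "..X.~~..#.",
    "....~~....",
]
STEP_COST = {WALK: 1.0, WATER: 2.0, CLIMB: 3.0}  # swimming/climbing are slower

def neighbors(pos, caps):
    r, c = pos
    for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
        nr, nc = r + dr, c + dc
        if not (0 <= nr < len(GRID) and 0 <= nc < len(GRID[0])):
            continue
        cell = GRID[nr][nc]
        if cell == WALK or (cell == WATER and caps["swim"]) \
                        or (cell == CLIMB and caps["climb"]):
            yield (nr, nc), STEP_COST[cell]

def find_path(start, goal, caps):
    frontier = [(0.0, start)]
    came_from, cost = {start: None}, {start: 0.0}
    while frontier:
        _, cur = heapq.heappop(frontier)
        if cur == goal:
            break
        for nxt, step in neighbors(cur, caps):
            new_cost = cost[cur] + step
            if nxt not in cost or new_cost < cost[nxt]:
                cost[nxt] = new_cost
                came_from[nxt] = cur
                h = abs(goal[0] - nxt[0]) + abs(goal[1] - nxt[1])  # Manhattan
                heapq.heappush(frontier, (new_cost + h, nxt))
    if goal not in came_from:
        return None  # unreachable with these capabilities
    path, node = [], goal
    while node is not None:
        path.append(node)
        node = came_from[node]
    return path[::-1]

# A swimmer can cross the water; stripped of that capability, it cannot.
print(find_path((0, 0), (0, 9), {"swim": True,  "climb": False}))
print(find_path((0, 0), (0, 9), {"swim": False, "climb": False}))
```

Stripping a capability out of the query immediately changes what counts as reachable, which is why the engine can answer “I want to get from here to there” without hand-placed flags.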

How AI can make games feel more real



What I’ve seen, both from our own experiments and what others have shown me, is this notion of making characters that really feel alive. On Roblox, mostly because our game engine scales super well, we currently have a production ceiling of 500 players in the same game, fully interacting. We’re in beta going up to 700. We’re probably going to hit 1,000 very soon. My biggest research project right now is getting to a 50,000-player single instance. No sharding. You go to Lollapalooza in Roblox.


I think AI is going to make that breakthrough for us, where for a player avatar, you control the intent, but then we figure out everything about the context of animation for your body so that it moves naturally.


AI is enhancing – ironically – everything that’s human, but adding the elements that we don’t have tools to capture another way. That to me, I think will be magical. That’s where games cross over into being not virtual reality “the tech,” but virtual reality in the sense of a twin or a role-playing twin of the real world, that’s really exciting to me, because I hate the feeling that my interaction with another human is limited by the tech. I think AI has broken that barrier now.


How AI helps make in-game materials more “real”



We now have materials from text prompts. Creating a shape is step one, but then you’ve got to paint the shape. Painting it means all the physical properties, not just the colors. So you have to say how rough the materials are, how shiny they are, how they reflect light in different ways.


In Roblox, if you have a table, it isn’t just an empty 3D shape painted to look like wood, which is how most game engines work – it’s actually wood. It knows it’s wood, it knows how buoyant it is, it knows at what point it would fracture, it knows the density. We have 16 material categories, then you can color them. And now you can put your own custom materials on that. Just exploring the space of materials (with AI) is really fun, being able to say things like, “no, I wanted the bricks to be rougher or sticking out more. I want them painted white.”
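To make the idea concrete, here is a small, hypothetical Python sketch of what such a physically grounded material record might carry – the field names, values, and categories are invented for illustration and are not Roblox’s actual data model. A text-to-material generator would fill in both the visual (PBR) fields McGuire mentions and the physical ones.

```python
# Hypothetical sketch of a physically grounded material record: the
# visual (PBR) fields drive rendering, the physical fields drive
# simulation. Names and values are invented, not Roblox's data model.
from dataclasses import dataclass
from enum import Enum, auto

class MaterialCategory(Enum):
    WOOD = auto()
    BRICK = auto()
    METAL = auto()
    # ...a real palette would cover all 16 base categories

@dataclass
class Material:
    category: MaterialCategory
    # Visual properties (physically based rendering)
    base_color: tuple       # linear RGB, each channel 0..1
    roughness: float        # 0 = mirror-smooth, 1 = fully diffuse
    metalness: float        # 0 = dielectric, 1 = metal
    # Physical properties ("it knows it's wood")
    density_kg_m3: float          # buoyancy follows from density
    fracture_strength_mpa: float  # stress at which the shape breaks

    def floats_in_water(self) -> bool:
        return self.density_kg_m3 < 1000.0  # fresh water is ~1000 kg/m^3

oak = Material(
    category=MaterialCategory.WOOD,
    base_color=(0.55, 0.41, 0.26),
    roughness=0.8,
    metalness=0.0,
    density_kg_m3=750.0,          # oak is less dense than water...
    fracture_strength_mpa=40.0,
)
print(oak.floats_in_water())      # ...so the wooden table floats: True
```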


Using AI for content moderation



Moderation isn’t just a thumbs up or thumbs down anymore; it’s grading on a lot of axes, and then for each set of audiences, deciding, “is this content appropriate for them?” We have text AI moderation, we have image-based AI moderation, and audio-based moderation that we’re working on.


3D is a whole new world, so that’s one where we’re really charting a new course. Everything that goes out on the platform is moderated. Sometimes it’s by a human being, sometimes it’s an AI that filters it and then might call in the human or let it pass. And then increasingly what we’re trying to do is have the human moderation provide feedback, so that there’s a small number of humans who will always be there, but the AI is watching that and learning from it.


It’s protecting the creators and asking, “did you really want this?” It’s not just the content, it’s the content in context. Sometimes something might be appropriate in the abstract, but then when you see how it’s used, you realize [it’s not].
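A minimal Python sketch of that pattern – the axes, thresholds, and audience limits below are invented for illustration, not Roblox’s policy. The AI auto-approves or auto-rejects the confident cases, routes the gray zone to a human, and records the human’s ruling as a training label, which is the feedback loop McGuire describes.

```python
# Hypothetical sketch of AI-first moderation with human escalation.
# Axes, thresholds, and audience limits are invented for illustration.
from enum import Enum

class Verdict(Enum):
    APPROVE = "approve"
    REJECT = "reject"
    ESCALATE = "escalate"  # too close to call: route to a human moderator

# Per-audience ceilings on each scoring axis: the same content can pass
# for one audience and fail for another ("the content in context").
AUDIENCE_LIMITS = {
    "all_ages": {"violence": 0.2, "profanity": 0.2},
    "teens":    {"violence": 0.5, "profanity": 0.4},
}

feedback_log = []  # human rulings, fed back to the model as labels

def moderate(scores: dict, audience: str, margin: float = 0.1) -> Verdict:
    """scores: per-axis probabilities from text/image/audio classifiers."""
    limits = AUDIENCE_LIMITS[audience]
    # Anything near a limit is ambiguous; let a human decide.
    if any(abs(scores[axis] - lim) <= margin for axis, lim in limits.items()):
        return Verdict.ESCALATE
    if any(scores[axis] > lim for axis, lim in limits.items()):
        return Verdict.REJECT
    return Verdict.APPROVE

def record_human_review(content_id, scores, audience, decision: Verdict):
    # The small, always-present human team trains the AI as it works.
    feedback_log.append((content_id, scores, audience, decision.value))

scores = {"violence": 0.45, "profanity": 0.05}
print(moderate(scores, "all_ages"))  # Verdict.REJECT: clearly over the line
print(moderate(scores, "teens"))     # Verdict.ESCALATE: a human decides
record_human_review("asset_123", scores, "teens", Verdict.APPROVE)
```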

AI tech isn’t scary – how we might use it is



What I’m concerned about is not the tech itself. It’s really important that companies step up their ethical responsibility, and that if they don’t, governments step in and regulate this in intelligent ways, which is tough. Governments are by nature slow-moving; it takes a long time for them to get educated about new technology.


My concern is just how do we – as very different societies around the world, different governments, and different companies – engage with generative AI responsibly? Because what we’re seeing today, and most of the things I’ve seen people worry about, I think are not the right things to be worried about. I love the Terminator movie series and I love James Cameron, but I’m not worried about that kind of AI problem.



John Davison is IGN's publisher.
