Roblox’s New Art and AI Coding Tools: The Future of Game Development

Last updated on May 16th, 2023 at 09:24 am


At the Game Developers Conference on Monday, Roblox unveiled a set of AI tools that let the company’s millions of player-creators generate usable game code and in-game 2D surfaces from nothing more than text descriptions, and laid out its vision for the future of game development.

Stef Corazza, head of Roblox Studio, told a packed conference audience that the release is a big step toward “democratizing” game creation and opening it up to people “who were blocked by technical hurdles.”

The Roblox Code Assist beta released Monday morning certainly looks capable of letting users quickly and easily produce straightforward code snippets. In an example Corazza showed at the conference, a user could instruct the system to “make orb turn red and destroy after 0.3 seconds when the player touches it.” The system then generates a seven-line Lua function that does exactly that, drawing on an orb object the coder had defined earlier in the script.
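
That seven-line function isn’t reproduced here, but a hand-written Luau sketch of the same behavior, assuming an `orb` Part defined earlier in the script, might look something like this:

```lua
-- Hedged sketch of the kind of Luau the tool might produce for this prompt;
-- not the actual generated output. Assumes `orb` is a Part defined earlier.
local function onOrbTouched(otherPart)
    -- Only react when the touching part belongs to a player's character
    local humanoid = otherPart.Parent and otherPart.Parent:FindFirstChildOfClass("Humanoid")
    if humanoid then
        orb.Color = Color3.new(1, 0, 0) -- turn the orb red
        task.wait(0.3)                  -- wait 0.3 seconds
        orb:Destroy()                   -- then remove it from the scene
    end
end

orb.Touched:Connect(onOrbTouched)
```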

A second request, for a function to “create a 3 by 3 grid of orbs around orb,” produces a few more lines of code that insert a small grid of those objects into the game scene.
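
Again as a rough, hand-written approximation rather than the tool’s actual output, that kind of snippet could be as simple as:

```lua
-- Hedged sketch of what the second prompt might yield; spacing and parenting
-- choices here are assumptions, not details from the talk.
for x = -1, 1 do
    for z = -1, 1 do
        if x ~= 0 or z ~= 0 then -- keep the original orb at the center of the grid
            local clone = orb:Clone()
            clone.Position = orb.Position + Vector3.new(x * 4, 0, z * 4) -- 4-stud spacing (assumed)
            clone.Parent = orb.Parent
        end
    end
end
```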

According to Corazza, just four months ago it was unclear whether the tool would perform well enough for today’s public release. Since then, however, Roblox has taken advantage of recent advances in natural-language code generation.

The key to producing usable results for the company’s Code Assist beta, though, was fine-tuning that standard model with code from the Roblox platform itself. That crucial context, he said, “significantly improves the output quality.”

Corazza said the tool also requires context from the coders using it. Asking the AI to generate code in a blank document, he said, is like asking a subject-matter expert to take a test “in a completely white room where you didn’t hear the question completely.” In internal testing, Corazza said, attempts that started with just three lines of sample code had a 50 percent higher “acceptance rate” for the AI tool’s suggestions than attempts that started with no such “context” code.
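
To make the “three lines of sample code” idea concrete, a minimal context snippet, purely illustrative and not drawn from Roblox’s internal testing, could be nothing more than a definition of the object the prompt refers to:

```lua
-- Illustrative only: a few lines defining the object the prompt will mention,
-- giving the assistant something concrete to build on.
local orb = Instance.new("Part")
orb.Shape = Enum.PartType.Ball
orb.Parent = workspace
```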


According to Corazza, the primary goal of the Code Assist beta for now is to “help automate basic coding tasks so you can focus on creative work” and to let experienced coders avoid “having to work on simple stuff.” Further down the line, though, Corazza said he envisions a more chatbot-style interface that could serve as a learning tool, explaining how code works and documenting functionality for those learning the basics.

Roblox has released a Material Generator in addition to the AI code generator to automate the laborious process of layering flat art assets on top of the many 2D surfaces in a game world.

This goes beyond the basic image generation of a tool like Stable Diffusion: Roblox’s tool also automatically layers on additional “maps” for attributes like albedo, roughness, and “metalness,” which the game engine can then use to accurately reflect light and respond to other objects.
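
For a sense of where those maps end up, Roblox materials can be driven by a MaterialVariant whose texture slots correspond to exactly these attributes. The snippet below is a hedged illustration with placeholder asset IDs, not output from the Material Generator itself:

```lua
-- Hedged illustration: a MaterialVariant wires albedo, roughness, and metalness
-- maps into the engine's lighting model. Asset IDs are placeholders.
local MaterialService = game:GetService("MaterialService")

local variant = Instance.new("MaterialVariant")
variant.Name = "GeneratedRock"              -- hypothetical name
variant.BaseMaterial = Enum.Material.Rock
variant.ColorMap = "rbxassetid://0"         -- albedo map (placeholder)
variant.RoughnessMap = "rbxassetid://0"     -- roughness map (placeholder)
variant.MetalnessMap = "rbxassetid://0"     -- metalness map (placeholder)
variant.Parent = MaterialService
```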

However, Corazza said this is only “step one” of the company’s plans for AI-generated assets. The next step is an AI system that can create “specific geometry” and completely retexture a full 3D model or character. The team has seen “some early breakthroughs,” Corazza said, and he is “confident it will land” eventually, but it remains “a very hard problem to crack” because the system needs to understand the full context of the object itself (where various body parts go on a living character, for example).

Corazza added that the “holy grail” for this kind of tool is one that can also mimic a “specific game style”: take just a few drawings from a concept artist and have an AI instantly create an entire set of assets that are consistent with that style and work together.

As an “extreme example” of a possible future use case, Corazza suggested that someone might be able to type “Scene with a forest, a river, and a large rock” and receive a fully interactive, realistic 3D world that matches the prompt. “It will feel like nuclear fusion,” he said. “I’ll say two years until it’s finished.”

Even though Roblox’s AI system still does not always “suggest perfect code,” Corazza said an environment like Roblox is a natural place to experiment with these kinds of early, imperfect generative test cases. Unlike self-driving cars, where any AI error could have “massive consequences,” the “bar is a little bit lower” for AI-generated Roblox code and surfaces, he said. “If the generation isn’t good, there won’t be any catastrophic events, just click the button and make another one.”

The company is already planning, however, for the challenge of managing what it anticipates will be a massive influx of AI-generated content.

Corazza was enthusiastic about future waves of AI-powered game creation tools that will eventually “converge” around the generation of all of a game’s assets from a single text prompt, including materials, code, 3D assets, terrain, audio, avatars, and 3D scenes and images. In contrast to the current emphasis on fine, granular control at the code/vertex level, those future tools will be constructed more directly around capturing the “intent of the user.”
