September 20, 2024
Gaming

NVIDIA researchers harness generative AI in real time

NVIDIA researchers used NVIDIA Edify, a multimodal architecture for visual generative AI, to build a detailed 3D desert landscape in a matter of minutes during a live demonstration at SIGGRAPH's Real-Time Live event on Tuesday.

During the event, one of the most important sessions at the prestigious graphics conference, NVIDIA researchers demonstrated how, with the help of an AI agent, they could create and edit a desert landscape from scratch in five minutes. The live demonstration highlighted how generative AI can act as an assistant to artists by speeding up ideation and generating custom secondary assets that would otherwise have been sourced from a repository.

By dramatically reducing ideation time, these AI technologies will enable 3D artists to be more productive and creative, giving them the tools to explore concepts faster and streamline parts of their workflows. For example, they could generate the background assets or 360 HDRi environments a scene needs in minutes, rather than spending hours finding or creating them.

From idea to 3D scene in three minutes

Creating a complete 3D scene is a complex and time-consuming task. Artists must support their main element with many background objects to create a rich scene and then find an appropriate background and environment map to light it. Due to time constraints, they often have to balance quick results against creative exploration.

With the support of AI agents, creative teams can achieve both goals: quickly bring concepts to life and continue iterating to achieve the right look.

In the real-time live demonstration, researchers used an AI agent to instruct an NVIDIA Edify-powered model to generate dozens of 3D assets, including cacti, rocks, and a bull skull, with previews produced in just seconds.

They then instructed the agent to use other models to create possible backgrounds and a layout of how objects would be placed in the scene, and showed how the agent could adapt to last-minute changes in creative direction by quickly swapping out rocks for gold nuggets.

With a design plan in place, they asked the agent to create high-quality assets and render the scene as a photorealistic image in NVIDIA Omniverse USD Composer, an application for creating virtual worlds.

NVIDIA Edify accelerates the generation of environments

NVIDIA Edify models can help creators focus on core assets while accelerating the creation of environments and background objects using AI-powered scene generation tools. The real-time live demo showcased two Edify models:

  • Edify 3D: Generates ready-to-edit 3D meshes from text or image prompts. Within seconds, the model can produce previews, including rotating animations of each object, to help creators quickly prototype before committing to a specific design.
  • Edify 360 HDRi: Uses text or image prompts to generate up to 16K high-dynamic-range (HDRi) images of natural landscapes, which can be used as backgrounds and to light scenes.
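As an illustration of how a generated 360 HDRi could be wired in to light a scene, the sketch below writes a minimal USD (`.usda`) layer containing a dome light that uses the HDRi as its texture. The file name and intensity value are hypothetical placeholders, not part of NVIDIA's tooling:

```python
# Minimal sketch: wire a generated 360 HDRi into a USD dome light.
# The HDRi file name ("desert_360.hdr") and intensity are illustrative
# assumptions; any renderer that consumes UsdLux would pick them up.

def dome_light_layer(hdri_path: str) -> str:
    """Return a .usda layer that lights the stage with an HDRi dome light."""
    return f'''#usda 1.0
(
    defaultPrim = "Environment"
)

def Xform "Environment"
{{
    def DomeLight "EnvLight"
    {{
        asset inputs:texture:file = @{hdri_path}@
        float inputs:intensity = 1000
    }}
}}
'''

layer = dome_light_layer("desert_360.hdr")
print(layer)
```

Because the layer is plain text, it can be inspected directly or opened in any USD-aware tool, such as Omniverse USD Composer.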

During the demonstration, the researchers also showed an AI agent powered by a large language model, along with USD Layout, an AI model that generates scene layouts using OpenUSD, a platform for 3D workflows.

At SIGGRAPH, NVIDIA also announced that two leading creative content companies are giving designers and artists new ways to increase productivity with generative AI through tools powered by NVIDIA Edify.

Shutterstock has launched the commercial beta of its Generative 3D service, which lets creators quickly prototype and generate 3D assets using text or image prompts. Its Edify-based 360 HDRi generator has also entered early access.

Getty Images updated its Generative AI by Getty Images service with the latest version of NVIDIA Edify. Users can now create images twice as fast, with improved output quality and prompt adherence, plus advanced controls and adjustments.

Leveraging Universal Scene Description in NVIDIA Omniverse

3D objects, environment maps, and designs generated with Edify models are structured using USD, a standard format for describing and composing 3D worlds. This compatibility allows artists to immediately import Edify-powered creations into Omniverse USD Composer.

Within Composer, artists can further modify the scene using popular digital content creation tools, for example by repositioning objects, changing their appearance, or adjusting the lighting.
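To make the composition step concrete, here is a minimal sketch of how a generated asset might be referenced into a scene layer and repositioned, again expressed as plain `.usda` text. The asset path, prim name, and translation values are illustrative assumptions, not outputs of the demo:

```python
# Minimal sketch: compose a generated asset into a scene layer via a USD
# reference, then place it with a translate op. The asset file
# ("cactus.usd") and coordinates are hypothetical placeholders.

def placed_asset(name: str, asset_path: str, translate: tuple) -> str:
    """Return a .usda prim that references an asset and positions it."""
    tx, ty, tz = translate
    return f'''def Xform "{name}" (
    prepend references = @{asset_path}@
)
{{
    double3 xformOp:translate = ({tx}, {ty}, {tz})
    uniform token[] xformOpOrder = ["xformOp:translate"]
}}
'''

scene = "#usda 1.0\n\n" + placed_asset("Cactus_01", "cactus.usd", (2.0, 0.0, -1.5))
print(scene)
```

Because placement lives in the referencing layer rather than in the asset itself, the same generated mesh can be reused many times at different positions, which is how USD composition supports the kind of rapid layout iteration described above.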

Real-Time Live is one of the most anticipated events at SIGGRAPH, showcasing around a dozen real-time applications, including generative AI, virtual reality, and live performance capture technology.
