Unity puts “digital humans” in the spotlight in new 3D engine tech demo


At the Game Developers Conference, currently being held in San Francisco, Unity is showing off its engine’s photorealistic human rendering technologies in an impressive demo called Enemies.

The game engine wars continue to rage. While Epic has dominated the news since the release of next-gen consoles and graphics cards, thanks to several spectacular technical demos of Unreal Engine 5, this time it is Unity that is taking advantage of the Game Developers Conference (March 23 to 25 in San Francisco) to flex its muscles. The developer of the engine of the same name is presenting its technical demo Enemies there, showcasing the marvels made possible by its latest technologies for creating photorealistic “digital humans”.

This is a major issue for Unity, whose ambitions are not confined to the world of video games. Like Unreal Engine, Unity dreams of a much more universal role in the future. It aims to appeal to the film and visual effects industry – it is no coincidence that the company bought the Weta Digital studio in November 2021 – and to become an essential tool for the builders of the famous “metaverse”: two areas where the ability to create human avatars that are as realistic as possible is of critical importance.

Note that the Enemies demo focuses only on 3D rendering technologies per se, not on the tools available to developers for character creation. In this sense, Unity’s work cannot be compared to what Epic has done with its MetaHuman Creator system, whose primary objective is precisely to offer creators a sort of catalog of physical characteristics that can be freely assembled, so that anyone can build their own digital humans as quickly and simply as possible.

Multiple technologies combined

Among these rendering technologies, the extremely sophisticated skin shading system deserves particular attention. It includes, among other things, geometry virtualization carried out directly on the GPU, allowing extremely detailed meshes (important for representing peach fuzz, for example), as well as a wrinkle simulation system and remarkably natural handling of translucency (the coloration caused by light scattering inside the skin).
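To give an intuition of what such translucency handling approximates, here is a minimal, purely illustrative sketch of the classic “wrap lighting” trick long used to fake subsurface scattering. All names here are hypothetical, and this is a generic approximation, not Unity’s actual skin shader:

```python
import numpy as np

def wrap_diffuse(n_dot_l: np.ndarray, wrap: float = 0.5) -> np.ndarray:
    """'Wrap lighting' term: lets light bleed past the shadow
    terminator, a cheap stand-in for light scattering inside skin."""
    return np.clip((n_dot_l + wrap) / (1.0 + wrap), 0.0, 1.0)

def skin_tint(diffuse: np.ndarray, scatter_color=(1.0, 0.3, 0.2)) -> np.ndarray:
    """Shift shadowed regions toward a reddish scatter color,
    mimicking the coloration light picks up under the skin."""
    scatter = np.asarray(scatter_color)
    # Blend from the scatter color (in shadow) to white (fully lit).
    return scatter + (1.0 - scatter) * diffuse[..., None]

# Example: lighting across the terminator (N.L from -1 to 1).
n_dot_l = np.linspace(-1.0, 1.0, 5)
print(wrap_diffuse(n_dot_l))             # soft falloff past N.L = 0
print(skin_tint(wrap_diffuse(n_dot_l)))  # reddish tint in the shadowed zone
```

Production skin shaders go much further, scattering light per wavelength across detailed geometry; the point is simply that light “wrapping” past the shadow edge and picking up a reddish tint is what produces the soft, organic look described above.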

The eyes, for their part, benefit from detailed handling of caustics on the iris, which makes the transparency of the cornea feel tangible and very effectively avoids the “plastic eyeball” impression that 3D-modeled eyes so often give. As for the hair, the engine shows how it can now interface with Weta’s Barbershop pipeline – the very one behind the ultra-realistic rendering of the fur of the simian heroes of the latest Planet of the Apes trilogy in theaters.

Finally, the demo uses ray tracing to render reflections and ambient occlusion, and takes advantage of Nvidia’s DLSS technology to run in real time at 4K and 30 fps… on unspecified hardware.
