After the setbacks over Meta's supposedly real-time rendering of avatar legs (see article), Mark Zuckerberg's company is catching up with a new tech demo. And this time, there is no denying it: the demo, called Aura and set entirely against a black background, shows off the Meta Quest Pro's face-tracking and eye-tracking capabilities. Yes, the rendering is still unstable (and not faked, as was the case at Connect 2022), but every movement you see is authentic.
It is also a much more complete version of what we saw at Meta's last conference. And for good reason: the company chose to listen to its players, and more specifically to those who told Andrew Bosworth (head of Meta's AR/VR division) that they wanted their avatars to have real bodies, starting with legs.
Our colleagues at UploadVR were able to test the demo and came away quite amazed by the result, so much so that they felt they were standing in front of a real mirror: the avatar reproduced movements the headset's wearer did not even suspect. They admit the experience spared them the usual uncanny-valley impression, so faithful is the replica.
I was particularly drawn to the details of the skin. If I squint and wrinkle my nose, I see the skin around it tighten realistically, and it does the same when I raise my eyebrows. These subtle details, like the creases in the cheeks that move with the mouth, really give the impression that this is not just an object in front of me, but a living being.

Whether the expressions actually look like me when I'm wearing the headset is another question. Since this avatar's face doesn't match mine, it's hard to tell. But the fact that the movements are at least plausibly realistic is an important first step towards virtual avatars that feel natural and believable.
The Aura demo (the explorer?) will be published by Meta as an open-source project, in order to help more developers work on it. Icing on the cake: Zuckerberg's firm says the app is all-in-one, meaning that all the tools needed to create humanoid or human avatars are integrated in one and the same place.
Developers will also be able to use Aura's APIs (the pieces of code that let information pass from one application to another), which will include values corresponding to the FACS system (Facial Action Coding System), used in particular to describe the movement of the different muscles of the human face.
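To give a rough idea of how such FACS-based values might be consumed, here is a minimal, purely illustrative Python sketch. The article does not document Meta's actual API, so the names below (`FACS_ACTION_UNITS`, `clamp_weights`, `describe_expression`) are assumptions for illustration only; the action-unit numbers and names, however, come from the Facial Action Coding System itself.

```python
# Hypothetical sketch -- NOT Meta's actual API. Only the FACS action-unit
# numbers and names below are real; everything else is illustrative.

# A few real FACS action units (AU number -> facial muscle movement).
FACS_ACTION_UNITS = {
    1: "inner brow raiser",
    2: "outer brow raiser",
    4: "brow lowerer",
    9: "nose wrinkler",
    12: "lip corner puller",
}

def clamp_weights(raw_weights):
    """Clamp tracker output to the [0.0, 1.0] range an avatar rig would expect."""
    return {au: max(0.0, min(1.0, w)) for au, w in raw_weights.items()}

def describe_expression(weights, threshold=0.5):
    """Return the names of the action units that are strongly activated."""
    return [FACS_ACTION_UNITS[au]
            for au, w in sorted(weights.items())
            if au in FACS_ACTION_UNITS and w >= threshold]

# Example frame: squinting and wrinkling the nose, as in the tester's account.
frame = clamp_weights({4: 0.8, 9: 1.3, 12: 0.1})
print(describe_expression(frame))  # -> ['brow lowerer', 'nose wrinkler']
```

In a real pipeline, per-frame weights like these would drive the blendshapes of the avatar's face rig rather than be printed; the sketch only shows the shape of the data such an API could expose.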
Note that privacy is taken very seriously: Meta wants each avatar to be "unique" in order to prevent identity theft. In detail, developers will not be able to access raw images of the user's face, but will still have a binary signature to authenticate you. This verification relies on what makes you unique, for example the way you frown your eyebrows or wrinkle your nose, your little tics and quirks, and so on. This information will also be used in-game to match the physical "you" to the avatar you have chosen, whether humanoid or not.
As a reminder, and this was already the case in the days of the first Meta Quest, the images captured by the cameras cannot be viewed either by Meta or by developers, whether internal (in the case of the Meta Quest Pro) or external.