Meta Offers a Glimpse of R&D

Michael Abrash, Chief Scientist at Meta Reality Labs, offered an overview of part of the research and development behind the company’s push into XR and its metaverse at the Facebook Connect 2021 conference. This R&D represents billions of dollars of investment, and we expect to see some impressive technology emerge from it in the near future.

Abrash leads the team at Meta’s Reality Labs Research, which has been tasked with researching technology that Meta believes could be fundamental to XR and its metaverse in the years ahead. He shared some of that work at the Connect 2021 conference.

Full-Body Codec Avatars

Meta’s Codec Avatars project aims to build a system that can capture and represent hyper-realistic avatars for use in XR. The challenge is not only scanning a person’s body, but getting the avatar to move in a realistic manner and building a system capable of running all of that in real time, so that the avatar can interact and be interacted with.

Each time the company has shown off its codec avatars there have been visible improvements. The project originally produced only realistic avatar heads, but it has since progressed to full-body avatars.

A video of their progress shows the latest work on the codec avatars, in which researcher Yaser Sheikh explains that the full-body avatars now support even more complex facial expressions, eye movements, and hand and body gestures that involve self-contact. The presentation implies that all of the footage is rendered in real time in virtual reality.

Abrash acknowledged that securing your identity becomes very important now that hyper-realistic avatars are clearly on the near horizon. He stated that the company is currently “thinking about how we can secure your avatar, whether by tying it to an authenticated account, or by verifying identity in some other way.”

Hyper-realistic Skin and Hair Rendering 

The research group’s end goal is to eventually reach photorealism in its avatars, although the avatars already look remarkably realistic. Abrash showed off a simulation of the group’s latest work on lighting, skin, and hair rendering. There was no claim that this was happening in real time, but it gives us a good look at what the company is aiming for in future work on these holographic avatars.

Clothing Simulator

Not only the bodies of the codec avatars but the clothing as well will be realistic. Meta has stated that it intends for avatar clothing to remain an important way for users to express their individuality, which means the clothing must be photorealistic too. In a video, the company shows off a clothing simulation with some hands-on interaction to demonstrate its progress.

Real Time Virtual Places

XR can easily take us to all-new realities, and there is no reason we should not be able to teleport our friends into our living spaces. Meta demonstrated exactly that with a real-time recreation of a home and everything in it — quite a feat, considering everything needs to look realistic and run in real time.

In one video, Meta did this with an apartment and its typical contents. The system allows users to move around their real home and interact with their belongings while in virtual reality, keeping the two in sync.

If you have a virtual guest over, they can see you moving around your real home and interacting with your own belongings in a very natural way. With Augmented Reality glasses, this kind of technology will make the experience that much more immersive and entertaining. This is a best-case scenario for the technology, but given the pace of Meta’s progress, we have high hopes.

EMG Input

Controllers and hand tracking are useful input methods for XR today, but as we move to all-day headsets that we wear even in public, having to wave our arms around or stab at the air with our fingers just to type a message or switch applications could prove to be a pain.

Meta has been researching more subtle and natural input methods for XR via a wrist-worn device that reads muscle signals using electromyography (EMG). A prototype was shown for the first time at Connect 2021. Though it was only a simulation and a prototype, Abrash believes it to be “genuinely unprecedented technology” with large potential as well as room for improvement.

Contextual AI

The company is also working on contextual Artificial Intelligence systems to make AR even more useful. Abrash believes that by anticipating intent, a system could let a user rely on a single input button that chooses the right action depending on the context of the situation, such as turning on a light or an appliance like the television.
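To illustrate the idea of a single context-sensitive button, here is a minimal sketch of context-dependent action dispatch. All names and the context fields are hypothetical illustrations of the concept, not anything Meta has described or any real API.

```python
# Hypothetical sketch: one "action" button whose effect depends on what
# the system infers about the user's situation. The context dict and the
# action names are invented for illustration only.

def infer_intent(context):
    """Map a simple context description to the most likely intended action."""
    if context.get("looking_at") == "lamp" and not context.get("lamp_on"):
        return "turn_on_lamp"
    if context.get("looking_at") == "tv" and not context.get("tv_on"):
        return "turn_on_tv"
    # Fallback when no confident prediction exists: show a menu instead.
    return "open_menu"

def on_button_press(context):
    """The single button simply executes whatever action intent inference picks."""
    return infer_intent(context)

print(on_button_press({"looking_at": "lamp", "lamp_on": False}))  # turn_on_lamp
print(on_button_press({"looking_at": "tv", "tv_on": False}))      # turn_on_tv
print(on_button_press({}))                                        # open_menu
```

A real system would replace the hand-written rules with a learned model over gaze, location, and activity signals, but the dispatch structure — one input, context-chosen action — is the same.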

Abrash says it is “going to take about a dozen major technological breakthroughs […] and we’re working on all of them.” He added that this overview of current research and development covers only a fraction of the work the team is actually doing.