Meta, the company formerly known as Facebook, held a presentation today where it announced its rebranding and showed off what it imagines the future of computing will look like. Amid all the fluff, it actually had some interesting tech demos. The first showed off its lifelike Codec Avatars and an environment for them to exist in, which the company says was rendered in real time and reacted to real-world objects. Mark Zuckerberg also talked about the company’s work with neural interfaces that let you control a computer just by moving your fingers.

We’ll discuss the demos in just a moment, but it’s really worth seeing them for yourself. You can check them out on the Facebook keynote archive or on Reuters’ YouTube stream if you prefer that interface.

Gif: Meta

According to Meta, this is both a virtual avatar and a virtual environment.


What is the metaverse?

It depends on who you ask, but it usually refers to an array of interconnected digital spaces, sometimes experienced in VR, sometimes through a social network, and sometimes incorporating real-time reference points to the physical world. You can read more about it here.

During the presentation, Meta showed off the work it’s been doing on its Codec Avatars to give users better control over their eyes, facial expressions, hairstyles, and appearance. The company also demonstrated how it can simulate the way an avatar’s hair and skin react to different lighting conditions and environments, and even previewed its work on interactive clothing.

The presenter made it very clear that this tech is “still very definitely research,” and it’s no wonder: there’s an immense amount of hardware required to make one of the avatars, and Meta is likely using very high-powered computers to render them. But it’s at least exciting to see that Meta’s goal is to let us render ourselves with graphics on par with what cutting-edge video game engines are capable of.

Gif: Meta

According to Meta, this was rendered in real time.

The company also showed off its real-time environment rendering, which it said would eventually give your avatar a place to interact with others. The system also lets people interact with real-life objects, with those changes reflected in the virtual world. While realistic environments are nothing new, being able to change them in the real world and see those changes mirrored in a virtual one makes for a really cool demo (even if the tracking markers attached to the real-life objects stand out enough to be a bit distracting).