Scoop: Meta is building an XR calling service
Also: Matthew Ball interview
Welcome to Lowpass! This week: Meta’s XR calling service, and an interview with Matthew Ball.
This week’s Lowpass newsletter is free for all subscribers; next week’s lead story will only go out to paying members. Upgrade now to not miss it.
Scoop: Meta is building an XR calling service
Meta is working on an XR calling service that incorporates the company’s photorealistic Codec Avatars, according to a series of recent job listings. Intriguingly, one of the roles that Meta is looking to fill for this service is that of an iOS developer.
Meta has been working on Codec Avatars for a few years now, with the goal of ultimately “making it as natural and effortless to interact with people in virtual reality as it is with someone right in front of you,” as a 2019 blog post put it. While that work had long been research-dominated, the newly surfaced job listings suggest that the company is getting closer to testing an immersive telepresence service with Codec Avatars among its own employees.
A job listing for a design prototyper states that the hire will be working “closely with engineers, scientists, and research product managers to build an internal XR calling service.” “You will be responsible for the highest level of polish, creativity, and interaction that allows the team to activate users and collect feedback that informs and empowers researchers and cross-functional partners toward the next generation of Codec Avatars,” the job listing continues.
A Meta spokesperson declined to comment.
One of the issues that has held Codec Avatars back in the past was the complexity of scanning people to create their avatars. When Mark Zuckerberg recorded an interview with Lex Fridman using Codec Avatars last year, he admitted that the scanning process alone had taken hours. What’s more, Meta has built a massive 3D capture rig that ingests 180 gigabytes of data per second to create Codec Avatars.
More recently, Meta has been working on using mobile devices to scan what it calls “Instant Codec Avatars” – and that’s likely where the iOS developer the company is now looking to hire comes in. That developer will help “build and scale an internal XR calling service,” according to the job listing, which adds: “We are looking for a developer with experience in user interfaces, infrastructure, and/or tools supporting applications on the iPhone or iPad using the iOS SDK.”
While the job listing doesn’t specify how Meta intends to use iOS devices, the company is likely looking to build a scanning app powered by the Lidar sensors present in high-end iPhones and iPads.
Meta executives have said in the past that it may still take years for the company to bring Codec Avatars to its VR headsets. However, VR scoop hound Luna discovered this month that a recent Quest headset update already includes code to support Codec Avatars-powered video calls. I’d expect that the company will update us on its work in this space during September’s Meta Connect conference.
SPONSORED
Volu.dev, your spatial development toolkit
Building for WebXR is exhausting: local server, SSL certificates, IP address, port forwarding… Plus, headsets don't even have debug tools 😑
Meet Volu.dev, your spatial development toolkit.
🌈 Easily connect to your headset. Local-only, peer-to-peer connection, your code never leaves your network.
🧰 Inspect, debug, and tweak code, directly from your headset.
⚙️ Support for three.js, AFrame, and MRjs.
🎈 Free and account-less.
Matthew Ball: The metaverse isn’t about just one device or service
Former Amazon Studios exec turned VC Matthew Ball has been a metaverse evangelist ever since first writing about the subject in 2019. Ball published a seminal book about the metaverse in 2022, and released an updated and revised version titled The Metaverse: Building the Spatial Internet this week.
I recently caught up with Ball to ask him about the lessons he learned writing about the metaverse, how his thinking about VR headsets has changed, and how we’ll actually know that the metaverse has arrived.
Ever since you published your Fortnite essay in 2019, you’ve become known as “the guy” to explain the metaverse. Has that job gotten easier, or harder?
It has become both. It has become easier because I have a deeper and richer understanding of the term, and a better sense of which analogies work best to explain the metaverse.
At the same point, the [metaverse] mania of late 2021 and 2022 meant it's now necessary to overcome preconception and misconception. When I started writing about the space, the typical response was: What is the metaverse? Now, it’s more common for people to [say]: Isn’t the metaverse just VR, or isn’t crypto the metaverse? That’s a different challenge.
How have your own thoughts on the metaverse evolved over the past few years?
Where my thinking has evolved most of all is around what it takes to actually build it: technically, societally, and from a regulatory perspective. It relates to the laws of physics and the impracticality of certain hardware problems today. It spans standards and networking protocols, as well as head-mounted displays, and so on.