Arilyn Blog – All things augmented

AR Weekly | September 18 | 2020 | Facebook Connect edition

Written by Liisa Mathlin | 18.9.2020

This week we'll focus on Facebook Connect, an event looking to the future of AR and VR that was held online earlier this week.

Many of the sessions were aimed more towards VR creators and revolved around the new Oculus Quest 2, shipping from October 13th. Still, the event offered plenty of interesting insights into AR's future!

Creation of AR glasses begins

Probably the most groundbreaking news about Facebook's future AR efforts concerned Project Aria. The project, which started in September, aims to collect the information and data needed to build functioning AR glasses.

A group of Facebook employees and contractors will wear research glasses that capture video, audio, eye-tracking, and location data. With the information and data captured, Facebook Reality Labs will gain essential insights into what genuinely functioning AR glasses require.

Along with the technical requirements, the research focuses on privacy. Where should real-time mapping and data collection be off-limits, and how can non-users be protected?

The first step towards AR glasses for the masses is Facebook's smartglasses, which will be released next year. The glasses will be a collaboration between Facebook and eyewear pioneer Luxottica. The partnership enables future smartglasses and AR glasses to come in various designs instead of just one or two styles.

Future trends: Accessibility and scalability

The recurring themes throughout the keynote and sessions were the accessibility and scalability of XR solutions. New features and applications from Facebook include AR try-ons, The New York Times' AR articles, and AR Storytime. All of these will accelerate the adoption of AR, as they bring the technology within reach of wider audiences and help them understand it.

With better understanding, more communities and organizations will see which problems can be solved with XR. Better understanding creates more demand, so the solutions offered need to be scalable rather than custom.

In the panel discussion How developers and ISVs are accelerating VR adoption in the enterprise, the panelists (Jacob Loewenstein, Head of Business at Spatial; Justin Barad, CEO of Osso VR; and Kyle Jackson, CEO of Talespin) highlighted two things: out-of-the-box experiences and remote work. Yes, this was about VR, but these themes certainly apply to AR as well.

They stated that because most end-users don't have know-how in Unity or Unreal, they don't care about customizing the apps. So the solutions need to focus on repeatability and standardization. To scale the industry up, the focus should be on the un-sexy parts that make the technology understandable and adoptable, rather than on fancy details.

The future of augmented reality

Probably due to the COVID pandemic, lots of talks revolved around remote work, community, and productivity. Facebook Reality Labs cracked open the curtain on the applications they're working on. XR solutions like these will massively improve our new normal of working remotely and, in the end, our lives as a whole.

Applications introduced include ideal AR interfaces, faster text input, audio AR, and, of course, Project Aria. The latter gets the most focus, but AR audio also receives a nice amount of attention.

The goal of their AR audio application is to improve focus. With selectively filtering in-ear audio monitors, the user can filter out ambient background noise and center their attention on a conversation. The filtering works with what they call beamforming: you hear only the person you're looking at. With the help of an AR assistant, the user can include or exclude people in a conversation, so no part of the conversation gets lost.

The most exciting part for me was their vision of what the AR cloud, the Mirrorworld, the Magicverse, will be like and how we'll get there. Starting with Project Aria, a precise 3D map of the world will store all the information and data relevant to each user. This live map will consist of three layers: location, index, and content.

The location layer is exactly what you'd think: the locations you spend your time in. The other two layers store information about those locations: what is in them and why. The index layer consists of the objects and structures of the surroundings. The content layer is described as a personal ontology layer: it stores the relationships, histories, and predictions derived from the information in the other two layers, as well as information that's personally important to the user, such as flight details, RSVPs for a party, and favorite Thai restaurants.
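To make the three-layer idea concrete, here is a minimal sketch of how such a live map could be modeled as a data structure. All class and field names are my own assumptions for illustration, not anything Facebook has published.

```python
from dataclasses import dataclass, field

@dataclass
class LocationLayer:
    """Places the user spends time in."""
    places: list = field(default_factory=list)

@dataclass
class IndexLayer:
    """Objects and structures found in each place."""
    objects_by_place: dict = field(default_factory=dict)

@dataclass
class ContentLayer:
    """Personal ontology: relationships, histories, user-specific info."""
    personal_info: dict = field(default_factory=dict)

@dataclass
class LiveMap:
    """The three layers together form one live map per user."""
    location: LocationLayer = field(default_factory=LocationLayer)
    index: IndexLayer = field(default_factory=IndexLayer)
    content: ContentLayer = field(default_factory=ContentLayer)

# Example: a user's home office, the objects indexed in it, and
# a piece of personally relevant content tied to their life.
live_map = LiveMap()
live_map.location.places.append("home office")
live_map.index.objects_by_place["home office"] = ["desk", "monitor"]
live_map.content.personal_info["favorite restaurant"] = "a Thai place downtown"
```

The point of the sketch is simply that the location layer answers "where", the index layer answers "what is there", and the content layer answers "why it matters to this user".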

After all, the XR industry is all about people: training us better, improving collaboration and focus, and unleashing our world's and our own potential in better, more sustainable ways.

Computing is transforming from dragging, pointing, and clicking into something that threads seamlessly into everyday actions. The second wave of human-oriented computing is here.

Next week we'll get back to more than just one topic! Stay tuned.