AR Shared

Multi-player in Augmented Reality using WebXR API.

Themis García
5 min read · Dec 8, 2020
Orange Sphere following my position

Description

An exploration of features for collaboration in AR.
Project for Explorations in AR at ITP/NYU.
Code | Demo (for iPhone, open it in Mozilla's WebXR Viewer)

Motivations and Context

I'm not sure if it's the months of isolation while going to school, but I became more interested in shared and collaborative environments. Before the pandemic, there wasn't much need for these experiences, but in the situation we are living in now, I think we are being pushed toward this kind of interaction, and it may well stay after COVID times.

In this first part of the semester, I conducted design research on the distance learning experience in school. As part of the research process, we did an assessment of what tools teachers are using in the virtual environment. I noticed that there might be space for shared experiences in AR, especially on accessible platforms.

Are there similar projects out there? How does this fit in with previous work?
I found that there are tools very similar to what I am exploring, such as Vectary. Vectary is a collaborative 3D design tool with a feature that runs in Augmented Reality. However, these are robust products whose complexity makes them hard for first-time users. My exploration is oriented toward integration into applications such as Jamboard, Miro, or even presentation apps.

Goal

Find a meaningful interaction for sharing AR space.

Process

Although I explored only key features, I wanted to have a context for the design. For this, I created a simple user flow of the interaction where the feature would be useful. This flow is presented as a web-based whiteboard application that has an AR mode.

Wireframe for Key Features

Since this project is mostly a technical exploration, I decided to work on three wireframes to give more visibility to my main interactions:

  • Real-time model modification
  • Chat
  • User position visibility

Implementation

For the implementation of this project, I used WebXR, Socket.io and Three.js.

As a beginner coder, I spent the early stages of this prototype understanding the environment a little better. My first goal was simply to position a cube in a Three.js scene and be able to manipulate and view it in AR.

The cube moves through an array of x and y positions.
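As a rough sketch of that first goal, the cube can be stepped through a fixed array of x/y positions on each animation frame. This is an illustration under assumed names (`positions`, `cube`, `renderer`), not the project's actual code:

```javascript
// Assumed waypoints for the cube, in Three.js world units.
const positions = [
  { x: 0,   y: 0 },
  { x: 0.5, y: 0 },
  { x: 0.5, y: 0.5 },
  { x: 0,   y: 0.5 },
];

// Pure stepper: given the current index, return the next index and position,
// wrapping around at the end of the array.
function stepPosition(positions, index) {
  const next = (index + 1) % positions.length;
  return { index: next, position: positions[next] };
}

// In a Three.js render loop it would be applied roughly like this:
// let i = 0;
// renderer.setAnimationLoop(() => {
//   const step = stepPosition(positions, i);
//   i = step.index;
//   cube.position.set(step.position.x, step.position.y, cube.position.z);
//   renderer.render(scene, camera);
// });
```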

What has worked for me in coding so far is to build little by little and have a lot of small goals.

So, my second goal was to have models inside the AR session.

For my next step, I followed the class material and Socket.io tutorials, which helped me integrate Socket.io into the Three.js environment. One of the first things I added to the application was messaging, which let me test whether the server was connected.
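A chat like this can be sketched as a pure helper for the message log plus a little Socket.io glue. The event name `'chat message'` and the UI helpers are assumptions, not the project's actual code:

```javascript
// Pure helper: append an incoming message to a chat log (returns a new array).
function appendMessage(log, user, text) {
  return [...log, { user, text }];
}

// Client side, roughly (requires socket.io-client in the browser):
// const socket = io();
// let chatLog = [];
// socket.on('chat message', ({ user, text }) => {
//   chatLog = appendMessage(chatLog, user, text);
//   renderChat(chatLog); // hypothetical UI update
// });
// sendButton.onclick = () =>
//   socket.emit('chat message', { user: myName, text: input.value });

// Server side, roughly (Node + socket.io): relay every message to all clients,
// which also confirms the server connection is working.
// io.on('connection', (socket) => {
//   socket.on('chat message', (msg) => io.emit('chat message', msg));
// });
```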

Next, I used an event listener to change the position of the sphere and the color of the cuboid.
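That listener might look something like the sketch below. The color palette, mesh names, and the choice of `touchstart` are assumptions for illustration:

```javascript
// Assumed palette of hex colors for the cuboid.
const COLORS = [0xff8800, 0x0088ff, 0x88ff00];

// Pure helper: cycle to the next color in the palette.
function nextColor(colors, current) {
  const i = colors.indexOf(current);
  return colors[(i + 1) % colors.length];
}

// Browser/Three.js glue, roughly:
// let cuboidColor = COLORS[0];
// window.addEventListener('touchstart', () => {
//   sphere.position.set(Math.random() - 0.5, Math.random() - 0.5, -1);
//   cuboidColor = nextColor(COLORS, cuboidColor);
//   cuboid.material.color.setHex(cuboidColor);
// });
```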

The next GIF shows a shared environment where the sphere changes position when the screen is touched.

Change position

Next, I was able to continuously draw the positions of the other users. Each user does not see the trail of their own position.
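Filtering out your own position comes down to comparing user IDs. Here is a sketch under assumed event names and payload shapes (`'position'`, `'positions'`, `drawMarker`), not the project's actual code:

```javascript
// Pure helper: given a map of socket id -> position and my own id,
// return the positions of everyone except me.
function otherUsers(positionsById, myId) {
  return Object.entries(positionsById)
    .filter(([id]) => id !== myId)
    .map(([id, pos]) => ({ id, ...pos }));
}

// Roughly, with Socket.io + Three.js: each client periodically emits its
// camera position, and draws a marker for every other user it hears about.
// socket.emit('position', {
//   x: camera.position.x, y: camera.position.y, z: camera.position.z,
// });
// socket.on('positions', (positionsById) => {
//   for (const user of otherUsers(positionsById, socket.id)) {
//     drawMarker(user); // hypothetical helper that places/updates a small mesh
//   }
// });
```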

Track the position of other users and change the color of the sphere.

Findings

In this process, I want to assess where the value of participatory and collaborative environments in AR lies. I think that this environment has a great presentation value. It just needs to be more accessible and easy to use for a wider user base.

For the context of use, I think AR sessions could help build a better understanding of scale and proportion.

Seeing the positions of other users in relation to the model can bring engagement and information to the experience, much as Miro or Figma do when you see another user's cursor move. However, this part could add complexity around how to interpret and implement those shared virtual spaces.

Real-time communication could help with the engagement of these tools in a collaborative environment. In this exploration, I used the chat as the first interpretation of that.

Challenges

Calibrating space

The main challenge I noticed during this project was how to sync virtual spaces. When you open an AR session, each user starts at the same virtual origin, but that origin does not necessarily line up with the real world. I think this is not a major problem when users are not sharing the same physical space; however, it is a problem if they are. I have to do more research on how to resolve this.

Calling three.js and servers

I don't have much experience with servers, so I forgot to load Three.js from the server. This issue took me a long time to figure out, but it was simple to solve once I went to the Coding Lab to ask for help.

Conclusion

This project was more of a technical exploration for me, to better understand the flow of WebXR, real-time communication, and Three.js.

This is the very first time that I worked on a project where each achievement gave me joy. The reality is that each time that I opened the AR session and saw something new, even if it was glitchy, I was so impressed! 😅

There is a lot to improve and work on for this project. I will definitely continue this exploration to have the fundamentals for a really nice web application.

Resources and Reference

  • Experiments in AR class material
  • ITP Coding Lab
  • Socket.IO — Lynda tutorial
  • Three.js — Lynda tutorial

