Features of XR Collaboration Tools

This section identifies key XR collaboration tool features and describes each feature and its benefits in detail, helping you understand the potential of XR collaboration tools for your communication needs.

Modes of Interaction


XR platforms are increasingly incorporating voice-based technologies that improve the user experience and make interactions more intuitive. These bring interactions closer to what we naturally expect in real life while adding “super-powers” that are not available in an average face-to-face or video conferencing meeting. They include voice commands that can bring up information in the form of digital displays or holographic objects, and AI assistants that can take, save, and subsequently share meeting notes and action points, or even translate what is being said into various languages, in displays visible only to the relevant participants. These productivity-enhancing functionalities are appearing across platforms, and we expect the trend towards intuitive, voice-based interaction to continue as demand from the business community grows.


The ability to share information visually in three-dimensional space is a powerful advantage of XR collaboration tools. In addition to helping build a sense of presence and immersion, as mentioned above, it also reduces cognitive load, which, from a neuroscience perspective, allows more efficient comprehension and retention of the information being presented. Most people find it difficult to visualize abstract concepts and two-dimensional spreadsheet data; in XR, the same information can be presented in much more relatable and instantly recognizable ways that do not require the brain to constantly engage in a translation process that keeps us from being fully present in the moment. It follows that the quality of collaboration arising from such interactions is likely to be higher.


To create the sense of embodied cognition crucial for immersion and presence, a user’s physical actions must be accurately mirrored in the virtual environment. Controllers have become increasingly sophisticated at tracking and translating hand movements with little or no lag, and at letting users access menus and transport themselves around virtual environments with a few simple clicks. As XR technology evolves, however, such interfaces are likely to become more “transparent”: movements will be tracked without the user needing to hold an actual controller, and lightweight wearables will allow not only more nuanced movements (think individual fingers flexing as opposed to blocky, static hands) but also haptic feedback that can convey pressure, resistance, and temperature. In the not-too-distant future we might very well expect realistic (and entirely sanitary) XR handshakes to become the new social norm for business interactions.

Hand Tracking

Modes of Hand Tracking Based Interactions

With the rising interest in Virtual/Augmented Reality, combined with the fast development and improvement of available devices, new interaction features are becoming available and are of great interest for collaboration during XR events. Currently, the main types of interaction are 1. conventional controller-based interactions and 2. hand-tracking-based interactions. Each of these types has different sub-modes, and the interactions differ slightly from mode to mode. Here we want to point out a few of these modes for hand-tracking-based interactions in the context of collaboration and XR events.

Image courtesy of XR Bootcamp.

UI Mode

One of the first things you will see in VR is some kind of UI. Maybe it’s a main-menu-style panel with 2D buttons, a hand-wrist menu, or even 3D-style buttons, sliders, and so on. So how do we interact with UI using our hands? Traditionally, you use a pointer of some sort to interact with UI elements from a distance. With hand tracking, however, we can bring our fingers and hands into the equation to push buttons, slide knobs, turn things by grabbing, and much more. Interactions where you “pinch” specific control elements such as knobs, or touch switches with your fingertips, greatly increase immersion and at the same time look much more natural to those around you during a collaboration session.
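At its core, the “pinch” interaction described above can be sketched as a threshold test on the distance between the thumb tip and index fingertip reported by the hand-tracking runtime. The function names and the threshold value below are illustrative assumptions, not any specific SDK’s API:

```python
import math

PINCH_THRESHOLD = 0.02  # metres; illustrative value, typically tuned per device


def distance(a, b):
    """Euclidean distance between two 3D points given as (x, y, z) tuples."""
    return math.sqrt(sum((ai - bi) ** 2 for ai, bi in zip(a, b)))


def is_pinching(thumb_tip, index_tip, threshold=PINCH_THRESHOLD):
    """A pinch is registered when thumb and index fingertips nearly touch."""
    return distance(thumb_tip, index_tip) < threshold


# Fingertips 1 cm apart register a pinch; 10 cm apart do not.
print(is_pinching((0.0, 0.0, 0.0), (0.01, 0.0, 0.0)))  # True
print(is_pinching((0.0, 0.0, 0.0), (0.10, 0.0, 0.0)))  # False
```

Real implementations usually add hysteresis (separate thresholds for pinch start and release) so the gesture does not flicker on and off at the boundary.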

Kinematic Interaction Mode (Grabbing)

It’s most likely that you will also have to interact with, and more specifically grab, objects in VR. Objects grabbed kinematically can affect other objects physically, but cannot be affected themselves, at least not while in the grabbed state. This is not fully realistic, but it is more stable during the interaction itself, which is sometimes desirable, such as during a talk or discussion in front of other people.

Physical Interaction Mode (Grabbing)

Conversely to kinematic interaction mode, physical grabbing is more realistic. One downside is that grabbed objects move only as a reaction to physical forces. In most physics engines this can lead to stuttering, jumping, and other odd behaviors of the grabbed objects unless filtering and stabilization are applied. In general, while physical interaction mode adds a certain fun factor to collaboration with other people, it does not substantially enhance collaboration and interaction during an XR event.
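The difference between the two grab modes can be illustrated in a minimal one-dimensional sketch: a kinematic grab pins the object’s position directly to the hand each frame, while a physical grab applies a spring-damper force that pulls the object toward the hand, leaving it subject to the physics simulation. This is a generic illustration of the two approaches, not any particular engine’s implementation:

```python
def kinematic_step(obj_pos, hand_pos):
    """Kinematic grab: the object is pinned to the hand every frame,
    so external forces cannot disturb it while grabbed (stable but unrealistic)."""
    return hand_pos


def physical_step(obj_pos, obj_vel, hand_pos, dt=1 / 90,
                  stiffness=50.0, damping=10.0):
    """Physical grab: a spring-damper force pulls the object toward the hand.
    The object still participates in the simulation, which is where the
    stutter mentioned above can come from. Gains are illustrative."""
    force = stiffness * (hand_pos - obj_pos) - damping * obj_vel
    obj_vel += force * dt
    obj_pos += obj_vel * dt
    return obj_pos, obj_vel


# Kinematic: object snaps to the hand immediately.
print(kinematic_step(0.0, 1.0))  # 1.0
# Physical: object starts drifting toward the hand over several frames.
pos, vel = physical_step(0.0, 0.0, 1.0)
```

Engines typically add the “filtering and stabilization” noted above (velocity clamping, smoothing) on top of the physical variant to tame the jitter.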

Gaze Tracking

Just as voice and gesture recognition technologies have come along in leaps and bounds in recent years, gaze tracking has evolved to the point where devices use it to replace the need for controllers in many instances. Microsoft’s HoloLens 2, for example, allows you to trigger various actions simply by directing your gaze towards a holographic button or trigger, and automatically scrolls text in tandem with the rate at which a particular user reads. Although this is a relatively high-end feature at the moment, we can expect it to filter down to more affordable mass-consumer devices in the future, and for such intuitive, frictionless interactions to increasingly become the norm.

Other Features

Room Configuration

A “room” in VR is a very broad term, which can range from a deserted beach to an industrial plant or even the surface of another planet. It also encompasses traditional board rooms and basically anything else your imagination can conjure up. From Ian Dawson’s “Iron Man Jarvis” interface in Tony Stark’s lab to Dulce Baerga’s simulators in Second Life, the potential for branded and customized XR environments is tantalizing. There are already many ways in which users can build customized and branded rooms with relative ease, but it is also worth investing the time in optimizing the room to maximize your collaborative efforts, especially since you are likely to be spending a lot of time in those rooms as virtual meetings become more commonplace.

Highly Customized Meeting Environments 

While standard rooms have a limited set of prefabricated environments, customized environments can be integrated with a variety of content such as streams from Twitter, Facebook, and Instagram. Additionally, it is popular to upload customer banners and 3D models into the room so guests can interact with that additional content. 

XR meeting spaces can be designed to replicate real-world locations such as board rooms (with custom furnishings and equipment, as well as branding and reference information), or more creative settings. This can be particularly useful in educational contexts, where it would be possible, for example, to hold a class on the surface of the moon, or examine a jet engine inside a realistic virtual hangar. 

Private & Public Rooms

XR platforms largely tend to mirror the way we approach shared spaces in the real world, namely dividing them into public and private, with established social norms dictating how we access them and interact within them. Public rooms in XR tend to be readily accessible and free to use, yet offer limited privacy and customization features. Private spaces, however, offer options for the user to decorate and personalize the layout with virtual objects, social media feeds, bot assistants, and much more, depending on the platform. There is no set prescription about which type of room is best suited for each use case, but as a rule these would tend to fall into the same brackets as they would in the real world, where a casual meeting would likely be held in a public space that did not require extensive preparation, whereas an important and potentially confidential presentation would perhaps justify creation of a dedicated room.

Room Mirroring

Many XR collaboration tools provide users with the ability to effectively replicate real-world environments virtually, therefore “mirroring” their look and layout in XR and minimizing the need for creating and setting up environments from scratch.

Spatial Audio


Humans are hardwired to pay attention to sound and instinctively use it to map their surroundings, find points of interest, and assess potential danger. Spatial audio is therefore a key part of making collaboration in XR more immersive and building a sense of presence. It essentially emulates how we perceive sound in the real world by mimicking the pitch, volume, reverberation level, and other audio cues the brain would expect during such real-world experiences.

Building a dynamic soundscape is essential for effective immersive experiences, as it allows programmers to create content whose sounds can come from any direction. To achieve this, XR uses software algorithms that manipulate a program’s sound wave frequencies, making audio louder or softer depending on the user’s distance from a virtual object. The sound also shifts from one headphone speaker to the other as the person turns their head from side to side or as virtual objects move on their own. Rooms of different sizes feel different to us as humans, and if things don’t sound the way we expect them to, we instinctively feel quite uncomfortable.


XR collaboration tools offer a wide range of functionalities that replicate and/or augment other digital collaboration tools (file sharing, messaging, calendars, clocks, and timers) and even traditional real-world ones such as whiteboards and sticky notes.


Interaction with 3D Objects

XR tools allow for interactions that would be too difficult, expensive, or dangerous in real life. Architects can visualize different layouts, identifying potential issues and avoiding costly mistakes before construction even begins. Designers and engineers can test prototypes and view how minute changes affect aerodynamics and performance. Clinicians can optimize the layout of operating rooms, and manufacturers can easily move several tons of heavy machinery and equipment until they find the most efficient arrangement that can be agreed upon collectively and implemented with confidence.

Text Input

A current issue with most XR experiences is that traditional methods of text input can be awkward, slow, inefficient, and downright frustrating, as anyone who has attempted to type on a virtual keyboard can attest to. Advances in voice technology are likely to make voice-to-text input the norm for XR, as it offers an intuitive, quick, and hands-free alternative. 


There are a number of features that XR collaboration platforms provide to assist administrators and organizers of collaboration sessions. Here are some of the more important ones, along with information on how to make the best use of them.


Some of the XR collaboration tools provide diagnostics that you can use to evaluate your custom room design or additional content to make sure that your fellow collaborators will have a good experience.

Time Zone Management

Working with a virtual team can be complicated – especially if you are separated by different time zones. Some XR collaboration tools provide features to overcome time zone challenges and make the most of your geographically distributed team by, for example, automatically calculating time zone differences and taking those into account when arranging meetings. However, if time zone management isn’t a built-in feature of the XR Collaboration tool you are using, here are two tools that can be used to coordinate schedules across the world.

Every Time Zone: Need to know what time it is, or will be, across the world when you schedule your next XR collaboration? Every Time Zone lets you compare multiple time zones now or at a specified future date.

World Time Buddy: Planning an XR session across multiple time zones? World Time Buddy gives you a side-by-side view of every time zone you need, to help you choose the perfect time for your session.

When multiple people are working together from different time zones, communications can quickly get complicated, so it is a good idea to set a default time zone for your group—either where the majority of the participants are located or where your clients are. Alternatively, GMT (Greenwich Mean Time) is often still used as a standard. 
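The conversion these tools perform can also be scripted with Python’s standard `zoneinfo` module. The sketch below schedules a session at a fixed UTC time and shows each participant’s local time (the cities and date are illustrative):

```python
from datetime import datetime
from zoneinfo import ZoneInfo  # standard library since Python 3.9

# A session scheduled at 15:00 UTC on an illustrative date.
session_utc = datetime(2024, 3, 12, 15, 0, tzinfo=ZoneInfo("UTC"))

participants = {
    "New York": "America/New_York",
    "London": "Europe/London",
    "Tokyo": "Asia/Tokyo",
}

for city, tz in participants.items():
    local = session_utc.astimezone(ZoneInfo(tz))
    # zoneinfo handles daylight-saving rules automatically:
    # on this date New York is on EDT while London is still on GMT.
    print(f"{city}: {local.strftime('%Y-%m-%d %H:%M %Z')}")
```

Note that Tokyo lands on the following calendar day, which is exactly the kind of surprise that makes a default group time zone worth agreeing on.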

Session Recording and Transcripts

This feature can be extremely handy, not only for note-taking at a later stage without having to assign a person to be in charge of taking minutes during a meeting, for example, but also to enable participants who might not have been able to attend a meeting for whatever reason the chance to catch up. This is particularly useful in distributed teams where different time zones make it challenging for everyone to meet at the same time. 

An automatic session transcript can save significant time on note-taking during meetings and is a valuable tool for minimizing distraction and helping participants to become meaningfully engaged in their interactions.

Session Analytics

There are specific analytics that you will want to get from XR collaboration tools – primarily information about events occurring within both the artificial reality and the device being used to create the artificial reality. It can be useful to review session analytics to determine the effectiveness of your room design, XR tool feature access, and content assets. XR analytic metrics can be divided into XR scene metrics, XR device metrics, and attendee/session metrics.

XR Scene Metrics

Sometimes the design of a room, 3D space, or content you are using for XR collaboration isn’t effective enough. Analysts commonly visualize this information as a heat map, coloring the different regions of a VR space according to the amount of attention they receive from users. The more interest an area gets, the redder it appears. You can use these metrics to evaluate the results of your design and content layout and make changes as needed. Analytic categories for XR scene metrics include:

  • Event Zones (where users are participating within a room or virtual space)
  • Gaze Heatmaps (where users are focusing their eyes)
  • User Paths (how users flow through the XR environment)
  • Content Engagement (which content elements users are interacting with)
  • Tool Engagement (which functions users are interacting with)
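The gaze heatmaps listed above are typically built by bucketing gaze hit positions into grid cells and counting how often each cell is looked at. Here is a minimal sketch of that aggregation step; the coordinate convention and cell size are illustrative assumptions:

```python
from collections import Counter


def gaze_heatmap(gaze_points, cell_size=0.5):
    """Bucket gaze hit positions (x, z floor coordinates, in metres)
    into grid cells and count how often each cell was looked at.
    The counts are what a visualization layer would colour red-to-blue."""
    heat = Counter()
    for x, z in gaze_points:
        cell = (int(x // cell_size), int(z // cell_size))
        heat[cell] += 1
    return heat


# Three samples fall in the cell nearest the origin; one falls further away.
samples = [(0.1, 0.2), (0.3, 0.4), (1.2, 0.1), (0.2, 0.3)]
heat = gaze_heatmap(samples)
hottest_cell, hits = heat.most_common(1)[0]
print(hottest_cell, hits)  # (0, 0) 3
```

The same binning approach works for the user-path and event-zone metrics, just with different input streams (avatar positions over time rather than gaze hits).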

XR Device Metrics

Devices being used by XR collaboration participants should be effective for the experience. Technical issues will reduce people’s desire to use XR for collaboration, so we want to know about them right away. For example, VR needs to run at a minimum of 90 frames per second; a drop in frame rate can produce lag or choppiness that disorients your fellow collaborators. Here are some of the metrics that will be of interest:

  • Performance (FPS)
  • Teleportation Events (locomotion count)
  • Hardware Data (user devices by class and model)
  • HMD Collision with World
  • Controller Collision with World
  • Button Presses
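The performance (FPS) metric above is usually computed as a rolling average over recent frame times, with an alert when it falls below the target. A minimal sketch, assuming a 90 FPS target as stated above (the class and its names are illustrative):

```python
from collections import deque


class FrameRateMonitor:
    """Rolling-average FPS over the last `window` frames, flagging drops
    below a target rate (e.g. the 90 FPS commonly cited for VR comfort)."""

    def __init__(self, target_fps=90.0, window=90):
        self.target_fps = target_fps
        self.frame_times = deque(maxlen=window)

    def record_frame(self, frame_time_s):
        """Record the duration of one rendered frame, in seconds."""
        self.frame_times.append(frame_time_s)

    @property
    def average_fps(self):
        if not self.frame_times:
            return 0.0
        return len(self.frame_times) / sum(self.frame_times)

    @property
    def below_target(self):
        return 0.0 < self.average_fps < self.target_fps


# Simulate a headset rendering at 72 FPS: each frame takes 1/72 s.
monitor = FrameRateMonitor()
for _ in range(90):
    monitor.record_frame(1 / 72)
print(round(monitor.average_fps), monitor.below_target)  # 72 True
```

In a real analytics pipeline, `below_target` events would be timestamped and reported alongside the hardware data, so problems can be traced to specific device classes.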

Attendee/Session Metrics

Getting data about what people do during an XR collaboration meeting is important - especially for teachers and business managers. Tracking session time is common practice for Web analysts across all digital platforms, but it is especially relevant for XR since it can be a meaningful measure of engagement. When users are immersed in an XR experience, they tend to spend a lot of focus and time on exploring their surroundings (even when they are aware of interaction opportunities). Session time can offer valuable insight into the immersive and transportive effects of your VR experience. Here are some metrics to look for that will support this analysis:

  • Number of Attendees
  • Comparison between Attendee Counts, RSVPs, and Kicks or Removals
  • User Locations (geo mapping)
  • Session Time
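The comparison between attendee counts, RSVPs, and removals listed above boils down to a simple roll-up, for example a show-up rate per session. A hypothetical sketch (the field names are assumptions for illustration):

```python
def attendance_summary(rsvps, attendees, removals):
    """Roll up basic attendee/session metrics into a per-session summary.
    show_rate compares actual attendees against RSVPs, as suggested above."""
    show_rate = attendees / rsvps if rsvps else 0.0
    return {
        "rsvps": rsvps,
        "attendees": attendees,
        "removals": removals,
        "show_rate": round(show_rate, 2),
    }


summary = attendance_summary(rsvps=50, attendees=40, removals=2)
print(summary)  # show_rate of 0.8 means 80% of RSVPs attended
```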

Simultaneous Collaborators and Sessions

It is important to review each tool’s capabilities in terms of the maximum number of simultaneous collaborations you can have in one “room” or session and the total number of sessions. Most XR collaboration tools have capacity limits. Information about such limits is available in the XR Collaboration Product Directory hosted at XRCollaboration.com.

It is important to evaluate tools to determine if they support specific role types. As you engage in collaborations with more than 8-10 people you will find that having separate features for Admin, Moderators, Speakers, Collaborators, and Viewers will be valuable.

Streaming to Other Platforms

The use of XR devices is still not widespread. Therefore it is important to make the content experience accessible to those who don't have devices. Also, most collaboration platforms have a technical limit to the number of people who can inhabit a shared space (typically 30-40). For both of these reasons, the ability to stream a camera view of the XR collaboration experience to mobile, tablet, and desktop computer devices can be very valuable.

On-Device Casting

Some devices, such as the Oculus Quest 1 and 2, offer on-device casting or live-streaming functionality. You can stream your camera view directly to the mobile Oculus app, to your Chrome browser, or even directly to your Facebook feed. The first two methods are quite useful because you can then share your mobile phone or desktop browser display with others via standard videoconferencing tools, or even relay the livestream to other platforms such as Twitch or YouTube.


Android-based XR devices that don’t have on-device casting built in might be able to connect to a desktop computer using a third-party tool called Vyzor. Using Vyzor, you can share your Android device across the office or across the globe by just using a special URL.

In-App Casting

Some XR collaboration tools have a dedicated live-streaming feature built in and even allow the use and placement of a separate virtual camera to provide the source for an external livestream. Using multiple virtual cameras can allow remote participants to observe the XR collaboration session from multiple perspectives and with a view of different content – a very powerful feature indeed!

Each of you will be looking to address different needs when it comes to considering the use of an XR collaboration tool. We have identified a set of common requirements that should be considered when selecting the best tools and platforms to suit those needs. On our XRCollaboration.com website, we provide an interactive tool that can be used to tailor recommendations to those specific needs and requirements. 

High Level

  • Ability to participate via a wide range of devices to allow more convenient access to potential participants: PC-based VR, standalone VR, smartphone-tethered AR, standalone AR, tablets, smartphones, and Windows/Macintosh computers.
  • Stable and reliable software. Check the vendor’s blog or release history to see if they are regularly posting software updates.
  • Accessibility equal to or better than IRL (in real-life) events.
  • Intuitive interface, so that first-time users aren’t discouraged from participating. We’ve spent time gathering information on each tool’s first-time onboarding experience so you can plan ahead how much time it will take to get started.
  • Support for at least 5 participants in real time without glitches or lag. We’ve evaluated each tool’s ability to support concurrent users in a single collaboration session.
  • Both free access and paid access, and public access as well as private access. 

User Interface

An effective user interface should:

  • Allow users to quickly and easily visualize the way their avatar appears to others
  • Offer users a consistent set of features on supported platforms
  • Make interaction controls available without having to switch in and out of XR
  • Not have distracting, “always-on” UI elements that crowd the user’s view
  • Include a simple, toggle-able menu, with most common commands accessible via minimal steps 
  • Display time of day (in user’s time zone) and details of current session
  • Allow users to easily see the schedule and navigate to other sessions
  • Support both teleporting and joystick locomotion/turning
  • Use standardized controls for locomotion and interaction so users can easily switch between platforms 
  • Allow users to change settings without affecting their avatar movements
  • Include a pause function, so that users can temporarily leave the XR environment with ease (we all need “bio breaks”)
  • Have the functionality to save sessions, slides, and information on fellow attendees 
  • Allow participants to easily exchange contact details.