Published: April 21, 2026

An extended comparative analysis of AR/MR equipment for physical asset management applications

Francisco Malva1
Romeu Nogueira2
Jose Marinho3
Nuno Cid Martins4
Ana Malta5
Mateus Mendes6
Nuno Ferreira7
1-7Polytechnic University of Coimbra, Coimbra, Portugal
Corresponding Author:
Francisco Malva
Article in Press

Abstract

Despite the increasing adoption of Augmented and Mixed Reality (AR/MR) technologies in asset-intensive industries, organisations involved in Physical Asset Management (PAM) still lack domain-specific guidance for selecting appropriate AR/MR devices. Existing comparative studies largely focus on consumer, gaming, or generic enterprise contexts and fail to account for the operational, environmental, and safety constraints inherent to PAM. This study addresses this gap by conducting a qualitative, criteria-based comparative analysis of eleven commercially available AR/MR devices. The analysis evaluates lightweight AR glasses and MR headsets using parameters derived from PAM operational requirements, including processing capabilities, ergonomics, display characteristics, spatial mapping performance, interaction modalities, ruggedisation, security certifications, and software ecosystem support. Device specifications were collected from official vendor documentation and publicly available technical sources, and assessed through explicit parameter-to-use-case reasoning rather than numerical scoring. The results reveal a clear functional differentiation between device categories. AR glasses are better suited for long-duration field inspections and safety-critical environments due to their lightweight design, transparency, and ruggedisation, while MR headsets offer superior computational power, spatial mapping, and interaction capabilities, making them more appropriate for training, complex maintenance, and detailed asset visualisation in controlled settings. By aligning AR/MR device characteristics with PAM-specific operational demands, this study provides actionable guidance for practitioners and contributes to addressing a gap in the PAM literature.

1. Introduction

This paper is an extension of previous work presented at the International Conference on Physical Asset Management and Data Science [1], where the initial methodology and results were introduced. Building on that foundation, this study introduces Physical Asset Management (PAM) and its integration with Augmented and Mixed Reality (AR/MR) technologies.

Unlike Virtual Reality (VR), which immerses users in a fully simulated digital environment, AR overlays virtual elements onto the real world using devices such as smartphones, tablets, and dedicated AR hardware, significantly improving the accessibility and visualization of spatial and location-sensitive data. MR expands on this integration by unifying the digital and real worlds, allowing users not only to perceive virtual objects but also to interact with and manipulate them as if they coexisted with physical ones, thereby enabling more natural, dynamic, and collaborative interactions within hybrid environments.

As a result, the adoption of AR/MR devices has grown rapidly across multiple domains, particularly in the field of Physical Asset Management (PAM). For example, as shown by Ahmed [2] and Ojha [3], AR/MR was successfully applied to construction management and asset inspection/training respectively, showcasing their potential for real-world industrial use cases within PAM.

It is, therefore, critical that organizations have access to comprehensive and up-to-date resources to support informed decision-making regarding the selection and implementation of AR/MR equipment. Although several detailed papers and resources have explored the use of AR/MR in selected industries, such as the work of Artem et al. [4] and Arian et al. [5], none currently appears to compare AR/MR devices to gauge their suitability for PAM tasks in particular. As PAM work involves operational conditions that differ significantly from those typically considered in general AR/MR reviews, a domain-focused assessment is needed to examine AR/MR equipment in light of the operational, environmental, and integration constraints that are characteristic of PAM.

This article presents a comparative analysis of commercially available AR/MR devices, explicitly mapping hardware characteristics to operational requirements typical of asset-intensive environments. While AR/MR equipment has been widely studied, most existing comparison studies and benchmarking efforts focus on consumer virtual reality, gaming performance, or general enterprise applications. Such studies typically evaluate display fidelity, latency, interaction techniques, or developer ecosystems under laboratory or controlled conditions. In contrast, PAM imposes constraints such as long-duration field use, ruggedisation, safety certifications, offline operation, and tight integration with asset management workflows. These factors are rarely addressed in existing AR/MR devices comparisons, limiting their applicability for PAM decision-makers. The goal of this study is therefore not to identify a universally optimal device, but to provide a structured basis for selecting AR/MR equipment according to specific PAM use cases and operational constraints.

1.1. Physical asset management

Unlike financial assets such as stocks or bonds, whose value is largely shaped by external market conditions, physical assets derive their value primarily from their functional role within operational systems. While both financial and physical assets exist along a spectrum of liquidity, critical physical assets are distinguished by their deep operational integration rather than by their resale potential. Assets such as power-generation equipment, industrial production lines, or transport infrastructure cannot be exchanged or redeployed without substantial operational disruption, as they are embedded within specific workflows, safety requirements, and maintenance regimes. Their economic and strategic value is therefore determined by performance, availability, and reliability within the asset system, rather than by ease of liquidation.

Because of their physical nature, these assets demand continuous management. Factors such as regular maintenance, repair, upgrades, and compliance with safety or environmental regulations directly influence their performance, lifespan, and residual value. In addition, the management of physical assets extends beyond their active use to include considerations around construction, commissioning, repurposing, and even end-of-life processes such as decommissioning, recycling, or demolition.

To address these challenges, the discipline of PAM has emerged as a structured framework for optimizing the life cycle of physical assets. PAM encompasses the strategic, tactical, and operational activities required to balance performance, risk, and cost over time. It is not merely an informal specification, having been standardized in ISO 55000 [6], which defines guidelines for effective asset management. Maletič et al. [7] showcase the visible correlation between the implementation of core, structured principles of PAM and an increase in operational performance.

1.2. Integration with technology and AR/MR

In recent years, technology has become deeply embedded in PAM processes. Technology’s capacity for mass data collection and processing has allowed PAM to streamline several previously tedious tasks such as bookkeeping and maintenance.

PAM has transformed from a reactive discipline, where problems were resolved as they happened, to a preventive one, where technology is used to analyse assets and address issues before they become critical problems.

To this end, PAM has integrated the following technologies, among others:

1) Internet of Things (IoT) – Assets can be embedded with sensors that report their condition, enabling more expeditious predictive maintenance and failure detection. Zein and Karimah [8] showcase a success case, where using IoT technology to monitor machinery increased a company's asset utilization from 62 % to 85 %.
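As a rough illustration of the utilization figures cited above, asset utilization can be expressed as productive runtime over scheduled time. The sketch below is a generic example with hypothetical hour values; it is not the monitoring system described in [8].

```python
# Illustrative only: asset utilization as productive runtime / scheduled time.
# The runtime and scheduled-hour values below are hypothetical examples.

def asset_utilization(runtime_hours, scheduled_hours):
    """Fraction of scheduled time an asset actually ran."""
    if scheduled_hours <= 0:
        raise ValueError("scheduled_hours must be positive")
    return runtime_hours / scheduled_hours

# e.g. IoT runtime counters before and after condition monitoring was adopted
before = asset_utilization(runtime_hours=99.2, scheduled_hours=160)   # 0.62
after = asset_utilization(runtime_hours=136.0, scheduled_hours=160)   # 0.85
```

In practice, such counters would be aggregated from the condition data reported by the embedded sensors rather than entered manually.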

2) Big Data – In the information era, collection of data has become paramount to operational success. Extracting data from assets enables the creation of datasets, which facilitates data analysis and prediction, thereby increasing operational efficiency. With the current rise of Artificial Intelligence (AI) and usage of Machine Learning (ML) models, large volumes of data have become more relevant than ever. Campos et al. [9] propose an architecture for big data analysis in industrial asset management.

3) Digital Twins – A digital twin is a dynamic digital representation of a physical asset that integrates real-time data from sensors, maintenance records, and operational history to mirror the behaviour and condition of the asset. In PAM, digital twins enhance lifecycle management by enabling simulation of degradation, optimization of replacement schedules, and investment planning. They also support predictive and condition-based maintenance, as anomalies can be detected earlier, reducing unplanned downtime and costs. Furthermore, digital twins allow performance optimization through real-time benchmarking and analysis, while also strengthening risk and safety management by simulating fault or stress scenarios. Moretti et al. [10] describe a standardized Digital Twin digital architecture that was developed and applied at the West Cambridge DT research facility, which allowed for the development of a flexible asset management and monitoring system as a proof of concept.
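The digital-twin behaviour described above (mirroring sensor state and supporting condition-based maintenance) can be sketched as a minimal class. The field names and alarm thresholds are illustrative assumptions, not the architecture of Moretti et al. [10].

```python
from dataclasses import dataclass, field

@dataclass
class DigitalTwin:
    """Minimal digital twin sketch: mirrors an asset's latest sensor state
    and flags condition-based maintenance needs. Limits are hypothetical."""
    asset_id: str
    vibration_limit_mm_s: float = 7.1
    temperature_limit_c: float = 85.0
    state: dict = field(default_factory=dict)
    history: list = field(default_factory=list)

    def ingest(self, reading: dict) -> None:
        """Update the mirrored state from one IoT sensor reading."""
        self.state.update(reading)
        self.history.append(dict(reading))   # operational history for analysis

    def maintenance_alerts(self) -> list:
        """Condition-based checks against the mirrored state."""
        alerts = []
        if self.state.get("vibration_mm_s", 0.0) > self.vibration_limit_mm_s:
            alerts.append("vibration above limit")
        if self.state.get("temperature_c", 0.0) > self.temperature_limit_c:
            alerts.append("temperature above limit")
        return alerts

twin = DigitalTwin("pump-07")
twin.ingest({"vibration_mm_s": 4.2, "temperature_c": 61.0})
twin.ingest({"vibration_mm_s": 8.3})   # anomaly detected before failure
```

A production twin would additionally run degradation simulations and feed replacement scheduling, as discussed above; this sketch covers only the state-mirroring and anomaly-flagging core.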

More importantly, and as the focus of this paper, PAM has become increasingly intertwined with AR/MR technologies. This trend is supported by broader evidence beyond isolated case studies. A comprehensive systematic literature review by Koumou and Isafiade [11] analysed over 200 publications between 2010 and 2023 and identified a marked growth in the use of immersive technologies, such as AR, MR, and VR, within asset management-related research, particularly after 2019. Their analysis indicates that, while immersive approaches are increasingly applied across fourteen domains, the adoption of AR/MR remains moderate when compared to other enabling technologies such as Building Information Modelling (BIM), the Internet of Things (IoT), and Digital Twins. Nevertheless, the reviewed studies show that immersive technologies are frequently combined with these paradigms to support inspection, maintenance, training, and operational decision-making, providing contextualised visualisation and spatially situated information that cannot be achieved through conventional interfaces. This growing body of literature indicates a transition of immersive technologies from experimental prototypes toward more systematic integration within PAM-related workflows.

When integrated with enabling technologies such as IoT, Big Data, Digital Twins, and AI, AR/MR not only improves situational awareness but also facilitates data-driven decision-making in the field, reducing errors, downtime, and costs. As a result, AR/MR is emerging as a transformative tool within PAM, bridging the gap between advanced analytics and practical, on-site asset management practices. It has been a prolific field of study, with several works exploring the capabilities of the union between these two technologies.

AR/MR has been used successfully to drive processes in several areas. Several usage examples will be highlighted in the following subsections.

1.2.1. Machinery/manufacturing

AR/MR has been used successfully in the manufacturing industry in several areas, such as maintenance, training, and for instructions during the actual manufacturing process.

Kostolani et al. [12] present a predictive maintenance system powered by AR in a manufacturing plant, showcasing an increase in maintenance efficiency and prediction of failure.

Lai et al. [13] introduce an assembly instruction assistant that leverages Convolutional Neural Networks (CNNs), pose estimation algorithms, and AR to overlay CAD models of the individual parts and display assembly instructions for a carving machine. By comparing assembly using a paper manual against the AR-based system, it was found that the proposed AR-based solution decreased assembly time and error count by 33.2 % and 32.4 %, respectively.

1.2.2. Real estate

Amed et al. [14] propose the utilization of a BIM model along with AR/MR and VR to give an immersive overview of a house during its sales phase. Compared to traditional approaches, where price negotiations often generated friction between the developer/realtor and the customer, the introduction of an immersive overview of the property allowed for faster and less laborious price negotiations, since the customer could better understand the scale of the project and the associated underlying costs.

Satapathy et al. [15] demonstrate the development of a renting recommendation web application that enhances apartment/property visualisation through AR/MR, powered by the Vuforia SDK. Although the property’s model must be authored beforehand, the application allows the user to preview the location’s space and, therefore, to make a more informed decision.

1.2.3. Infrastructure

AR/MR has been used fairly extensively in infrastructure construction, visualization, and maintenance.

Dolas and Ulukavak [16] propose the usage of an AR application, implemented through the Unity engine, as a way to visualize the water and sewage infrastructure of the Seyrantepe neighbourhood in Turkey. Despite several technological drawbacks, the experiment was successful in showcasing the potential of such technology.

Mascarenas et al. [17] describe a novel AR framework that aims to aid in the development and maintenance of nuclear material infrastructures. Due to the highly volatile and reactive nature of nuclear material, the infrastructure surrounding it is among the most heavily regulated sectors. With several potentially dangerous, long, and demanding tasks related to nuclear upkeep, this framework attempts to ease these burdens by providing precise, detailed instructions for each piece of equipment through an interconnected marker-based architecture. A prototype of the proposed AR system was implemented and demonstrated in a surrogate environment, validating its viability and opening an avenue for further developments.

Additionally, AR/MR is a useful tool for safety in infrastructure management, allowing for an augmented training experience and for better predictive maintenance and fault prevention. Gong et al. [18] highlight the usage of AR/MR for safety, overlaying virtual building plans and safety information onto the real-world environment to help workers identify and navigate hazards. They also highlight its use in training workers on the proper use of equipment and tools to improve safety. Negi et al. [19] emphasize AR’s use in checking construction quality and anticipating structural failures. Its integration with other technologies such as IoT, AI, and Big Data further improves the accuracy of infrastructure management.

2. Methodology

For the purpose of this analysis, AR/MR-capable devices were classified into two primary categories:

1) AR Glasses: Lightweight, wearable devices designed to overlay digital information onto the user’s field of view without significantly obstructing vision. These are best suited for field operations, inspections, remote assistance, and tasks requiring mobility and hands-free access to information.

2) MR Headsets: Heavier, more immersive devices that primarily serve VR applications but include AR/MR functionality via passthrough cameras. These devices provide enhanced spatial awareness, interaction capabilities, and processing power, making them more appropriate for stationary or semi-mobile use cases such as training simulations, complex maintenance procedures, or asset visualization in controlled environments.

This categorization is necessary due to the inherent differences in ergonomics, application focus, and user interaction. AR glasses are generally intended for real-time support and lightweight visualization, while headsets are used where richer interaction and environmental mapping are essential.

The analysis followed a qualitative, document-based comparative approach. Technical specifications for each device were collected exclusively from official product sheets, vendor documentation, and publicly available technical sources. The parameters selected for comparison were chosen based on their operational relevance to PAM. Quantitative specifications such as resolution, storage size, camera megapixels, and field of view were compiled into descriptive tables to form the basis for a qualitative interpretation of device suitability. This interpretation considered how each technical parameter aligns with PAM requirements, including ergonomics, environmental constraints, and integration with digital workflows. Hands-on experience with one AR device (Microsoft HoloLens 2) and one MR device (Meta Quest 3) informed contextual observations, but was not used for performance measurement. Since the analysis is based on product specifications rather than empirical performance testing, the conclusions presented here should be interpreted as qualitative assessments of suitability rather than definitive performance rankings. The objective was not to score or rank devices but to identify trends and relative strengths across categories relevant to PAM, based on its operational needs and device characteristics. A quantitative scoring or weighting framework was intentionally not employed due to the absence of a validated PAM-specific benchmarking model and the high variability of operational priorities across PAM use cases. Introducing generic weights could bias results and reduce applicability across different asset management contexts.

2.1. PAM requirements

The selection of AR/MR devices for PAM must be grounded in the strategic and operational realities of asset-intensive industries. According to ISO 55000, asset management is the coordinated activity to realize value from assets. This process relies fundamentally on risk-based, information-driven decision-making, requiring that any technological intervention enhances data acquisition and processing without introducing operational fragility or safety risks [11], [20].

Concerning operational and environmental constraints, unlike controlled office or laboratory settings, PAM fieldwork often occurs in harsh environments characterized by exposure to mechanical loads and contaminants such as dust or corrosive liquids [21], [22]. Field studies indicate that hardware must be ruggedised to withstand drops from working heights (e.g., 1.2 meters) and resist water and airborne dust to ensure reliability [22].

In regulated industrial sectors, devices must comply with strict safety standards, in line with the principles of ISO 55000. This includes compatibility with personal protective equipment (PPE) such as hard hats, safety glasses, and gloves, which can otherwise impair gesture recognition and touch inputs on standard devices [4], [22].

Ergonomics and human factors must be considered as fundamental PAM requirements since the physical burden on the operator is a critical limiting factor for technology adoption. Industrial Human-Computer Interaction (HCI) literature highlights that heavy head-mounted displays (HMD) can cause neck fatigue, eye strain, and physical discomfort, often limiting effective continuous use to approximately two hours. However, PAM maintenance workflows often span full shifts, requiring devices that are lightweight and ergonomically balanced to prevent musculoskeletal stress. Additionally, the “hands-free” principle is essential in maintenance and assembly tasks, allowing workers to manipulate tools and components while simultaneously viewing instructions, a capability where AR smart glasses often outperform tablets or smartphones [4].

AR/MR devices must integrate with existing enterprise systems such as Enterprise Resource Planning (ERP), BIM systems, and Digital Twins to ensure data traceability. This integration facilitates real-time fault diagnosis and remote expert collaboration, which can significantly reduce travel and maintenance error rates [23]. However, field studies consistently report that industrial sites often suffer from intermittent or non-existent network coverage [23], [24]. Consequently, reliance solely on solutions such as cloud computing is a vulnerability, and devices must possess offline operation capabilities and sufficient local storage to prevent workflow disruption during connectivity loss [22], [23].
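The offline-first requirement above can be illustrated with a simple cache-through access pattern: serve asset documents from the network while coverage exists, refreshing a local copy, and fall back to that copy when connectivity drops. This is a generic sketch under assumed interfaces, not a vendor implementation; the fetch callable and document keys are hypothetical.

```python
class OfflineFirstStore:
    """Cache-through access to asset documents: fetch from the network when
    reachable (refreshing the local copy), fall back to the cache offline.
    The fetch interface and keys are illustrative assumptions."""

    def __init__(self, fetch_remote):
        self.fetch_remote = fetch_remote   # callable: key -> bytes, may raise
        self.cache = {}                    # stand-in for on-device storage

    def get(self, key):
        try:
            data = self.fetch_remote(key)  # e.g. download a 3D model or manual
            self.cache[key] = data         # refresh local copy for later use
            return data
        except ConnectionError:
            if key in self.cache:
                return self.cache[key]     # workflow continues offline
            raise                          # never synced: genuinely unavailable

def flaky_fetch(key, online):
    """Hypothetical remote endpoint that fails when coverage is lost."""
    if not online["up"]:
        raise ConnectionError("no coverage")
    return f"document:{key}".encode()
```

Documents consulted while online thus remain available during connectivity loss, which is the property field studies identify as essential for uninterrupted maintenance workflows.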

Based on this synthesis, the PAM-specific requirements operationalised in this study for evaluating AR/MR equipment are defined as: (i) ruggedisation and environmental resistance; (ii) compliance with safety and regulatory certifications; (iii) suitability for long-duration, mobile field use (ergonomics and weight); (iv) offline operation and local storage capability; (v) spatial awareness and documentation support for inspection and maintenance; and (vi) integration with management workflows (e.g., capacity to interface with existing asset information systems such as IoT and BIM). These requirements form the basis for the parameter selection and criteria-to-use-case mapping applied throughout the comparative analysis in Section 3.

2.2. Scenario-based PAM requirements mapping

Typical PAM scenarios can be organized into the following categories, reflecting practical applications of AR/MR technologies in alignment with industrial operational demands: full-shift field inspection, remote expert assistance, training and simulation, hazardous or regulated site operation, offline maintenance workflow, and indoor asset visualization and navigation.

Full-shift field inspection scenarios (S1) involve workers performing routine status checks and data collection across a facility over several hours. They navigate both indoor and outdoor environments, requiring a balance between access to digital data and physical situational awareness. This mobile, long-duration context requires devices with low weight and transparent optics to prevent neck fatigue and maintain situational awareness.

Remote expert assistance scenarios (S2) consist of field workers using AR/MR devices to collaborate in real-time with remote specialists, overlaying live instructions and technical manuals onto physical equipment to troubleshoot issues. Hardware must support hands-free interaction, robust audio/voice recognition for noisy environments, and reliable connectivity.

Controlled training and simulation scenarios (S3) enable workers to practice maintenance and assembly procedures or safety protocols in indoor, controlled environments, without physical or error risks. In this type of scenario, AR/MR technologies are used to overlay digital models and instructions onto real-world environments, often requiring high visual fidelity, a wide Field of View (FoV), and advanced spatial mapping to allow the manipulation of complex 3D virtual objects.

Hazardous or regulated site operation scenarios (S4) consist of operations occurring in highly volatile or reactive environments (e.g., nuclear infrastructure). These sites often require strict adherence to safety standards and the use of personal protective equipment. Devices must, thus, hold specific security certifications such as IP66/IP67 (dust/water resistance) and MIL-STD-810 (environmental durability). In this type of scenario, AR/MR devices can overlay hazard warnings to help workers identify and navigate risks.

Offline maintenance workflow scenarios (S5) involve assets located in remote locations, indoor and outdoor, or shielded plant rooms with no network coverage. In this case, maintenance must continue even when there is intermittent or non-existent network coverage. Equipment must therefore possess sufficient local storage and processing power for on-device AI operations or 3D model rendering. For example, an AR/MR device caches models and technical documentation locally, ensuring that alignment with the asset data is not lost when connectivity drops.

Indoor asset visualization and navigation scenarios (S6) result from the need to locate and interact with hidden or spatially complex assets, such as underground infrastructure and interior wiring. This requires advanced spatial mapping, SLAM (Simultaneous Localization and Mapping), and depth sensing (e.g., LiDAR – Light Detection and Ranging) to ensure digital overlays remain anchored to the physical asset. For example, AR/MR devices enable viewing hidden assets behind walls or underground locations, guiding the worker to the precise location of a fault.
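The critical requirements stated in the scenario descriptions above can be captured as a simple lookup structure and used to check which scenarios a given device profile supports. The dictionary encoding and feature labels below are an illustrative sketch derived from the S1-S6 descriptions, not a validated selection model.

```python
# Scenario -> critical device requirements, as stated in the S1-S6 prose.
# Feature labels are illustrative; the mapping follows the text above.
CRITICAL_REQUIREMENTS = {
    "S1_field":   {"lightweight", "transparent_optics"},
    "S2_remote":  {"hands_free_interaction", "robust_audio", "connectivity"},
    "S3_train":   {"display_fidelity", "wide_fov", "spatial_mapping"},
    "S4_hazard":  {"ruggedisation", "safety_certifications", "ppe_compatibility"},
    "S5_offline": {"local_storage", "onboard_processing"},
    "S6_indoor":  {"spatial_mapping", "slam", "depth_sensing"},
}

def scenarios_supported(device_features, requirements=CRITICAL_REQUIREMENTS):
    """Scenarios whose critical requirements are all met by a device."""
    return sorted(s for s, needed in requirements.items()
                  if needed <= device_features)

# e.g. a hypothetical lightweight, ruggedised AR glasses profile
glasses = {"lightweight", "transparent_optics", "hands_free_interaction",
           "robust_audio", "connectivity", "ruggedisation",
           "safety_certifications", "ppe_compatibility"}
```

Such a profile would qualify for the field, remote-assistance, and hazardous-site scenarios but not for training or indoor navigation, mirroring the AR-glasses-versus-MR-headset differentiation discussed in the results.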

Aligned with the discussion in the previous subsection, Table 1 summarizes the critical and relevant AR/MR device requirements for each defined type of PAM scenario. Requirements marked as critical correspond to conditions that are mandatory for safe, effective, or standards-compliant operation, while relevant requirements indicate desirable characteristics that enhance performance or usability without being strictly mandatory.

Table 1. PAM scenario-based requirements

Requirement
S1 field
S2 remote
S3 train
S4 hazard
S5 offline
S6 indoor
R1 – Lightweight/Wearability
R2 – Ruggedization
R3 – Safety/Compliance
R4 – Offline capability
R5 – Connectivity
R6 – Display/Audio quality
R7 – Spatial mapping
R8 – Interaction richness
R9 – Latency tolerance
R10 – SDK/Integration readiness
✓ – Critical requirement
△ – Relevant but non-critical requirement

2.3. Comparison parameters

In this study, the selected devices were evaluated based on the following parameters:

1) CPU and RAM: PAM applications frequently involve 3D visualization, object recognition, and spatial mapping, which are computation-heavy. Evaluating CPU and memory capacity ensures that devices can support these intensive tasks without lag.

2) Internal storage: Many PAM environments have limited or no internet access. Local storage of 3D models, technical documentation, and offline databases is crucial for autonomous operation.

3) Operating system: The OS determines application compatibility and integration with enterprise asset management software. A well-supported OS also affects security and developer accessibility.

4) Ergonomics: Many PAM tasks require prolonged usage of AR/MR devices, often in physically demanding conditions. Comfort is therefore critical for operator endurance and device usability.

5) Display resolution and type: High resolution is essential for interpreting technical diagrams and overlaying precise information. Display opacity also influences situational awareness and is a decisive factor in choosing between transparent AR and passthrough-based MR.

6) Audio and voice interaction: Industrial environments are often noisy and require hands-free operation. Devices must have robust audio systems and voice recognition to allow efficient interaction without the need for physical input.

7) Camera capabilities and spatial mapping: Essential for inspection and maintenance, where the ability to document assets and overlay spatially aligned information is crucial. SLAM, anchoring, and depth sensing are critical to this.

8) Connectivity: Real-time access to cloud-based systems and remote support functionalities depends on robust and versatile connectivity options, especially in distributed asset environments.

9) Artificial Intelligence support: Increasingly, AR/MR applications in PAM incorporate AI for predictive maintenance, object recognition, and anomaly detection. The ability of a device to support AI-driven applications is therefore an emerging factor of interest.

10) Interaction methods: Some PAM scenarios benefit from gestures or controller-based interaction. Evaluating a range of input modalities ensures adaptability to different contexts.

11) Response latency: Low latency is vital for immersive AR/MR applications, especially in MR headsets using passthrough video. High latency can result in a disorienting experience and reduced task efficiency.

12) Security certifications: In PAM, some devices may be used in harsh or regulated environments. Certifications like IP67 or MIL-STD-810 ensure the hardware’s durability and compliance with safety standards.

13) SDK and engine support: For scalable deployment and custom development, support for engines like Unity and Unreal is critical. SDK availability directly affects integration ease and development speed.

For each parameter, interpretation focuses on how specific technical characteristics enable or constrain typical PAM tasks, using explicit parameter-to-use-case reasoning (e.g., device weight for full-shift field inspections, certification profiles for hazardous environments, and spatial mapping capabilities for complex maintenance activities).
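As a concrete illustration of this parameter-to-use-case reasoning, a device's spec-sheet entries can be screened against scenario-specific constraints. The thresholds and field names below are illustrative assumptions for the sketch only; they do not constitute a validated PAM benchmark, which Section 3 notes does not yet exist.

```python
def screen_for_scenario(spec, scenario):
    """Qualitative screen of a spec-sheet dict against illustrative PAM
    scenario constraints; returns the list of unmet constraints.
    Thresholds (e.g. 400 g) are hypothetical examples, not standards."""
    issues = []
    if scenario == "full_shift_inspection":
        # Heavy HMDs are associated with fatigue in long-duration use.
        if spec.get("weight_g", 0) > 400:
            issues.append("too heavy for full-shift wear")
        if spec.get("display") != "optical_see_through":
            issues.append("passthrough display limits situational awareness")
    elif scenario == "hazardous_site":
        if spec.get("ip_rating", "") not in {"IP66", "IP67"}:
            issues.append("lacks dust/water ingress protection")
        if not spec.get("mil_std_810", False):
            issues.append("no environmental durability certification")
    return issues

# Hypothetical spec entries, in the style of the comparison tables.
rugged_glasses = {"weight_g": 190, "display": "optical_see_through",
                  "ip_rating": "IP66", "mil_std_810": True}
```

A heavy passthrough headset would fail both checks for the full-shift inspection scenario while potentially passing for controlled training, which is exactly the category-level differentiation the comparative tables are meant to surface.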

In addition to the parameters above, there are several other relevant factors that are important in PAM applications, but for which standardized or publicly available data is often lacking. These include:

1) Battery life: Many PAM tasks are carried out in the field over extended periods. Evaluating battery life ensures the device can operate through a full work shift without interruption. While this information is not always available across all devices, it remains a critical parameter.

2) Multi-user collaboration features: Important for remote assistance and synchronized work in distributed teams.

3) Battery charging options: Including swappable batteries or rapid charging capabilities.

4) Total cost of ownership: Encompassing the purchase price, cost of required accessories, software licenses (SDKs), and support/maintenance fees.

5) User authentication and access control: Especially relevant in secure environments or multi-user scenarios.

6) User feedback and field-testing results: Provides insight into real-world performance, durability, and usability.

Although these parameters are not included in the comparative tables due to data limitations, their importance is recognized, and they should be considered in practical evaluations or pilot studies.

Table 2 presents the selected AR glasses (GL) and MR headsets (HS) for analysis. AR glasses and MR headsets are, respectively, used to denote lightweight, optical see-through devices, and immersive head-mounted systems with advanced spatial mapping. These terms are used consistently throughout the manuscript. Although MagicLeap 2 is specifically designed for AR/MR applications, it was categorized with MR headsets due to its specifications and capabilities aligning more closely with that group.

It is important to note that the selection of devices does not reflect a ranking or endorsement. The list is based on prior research, updated to reflect current market offers, with older or discontinued models replaced by actively supported alternatives.

To gather device data, among other external sources, the following were consulted: the Vuzix M400 product sheet [25], Vuzix M4000 product sheet [26], RealWear Navigator 500 data sheet [27], a presentation on the ThirdEye X2 [28], MagicLeap2 Product Specification Version 6.2 [29], Meta Quest 3 Product Page [30], HTC VIVE XR Elite Specification Page [31], Varjo XR-4 Product Page [32], Pico 4 Product Page [33], Apple Vision Pro Product Page [34], and Microsoft HoloLens 2 Documentation [35]. In cases where additional context was required or specifications were incomplete, further academic and industry sources were consulted. When certain specification values could not be determined reliably, their corresponding table entries were marked with an em-dash (—).

Table 2. Selected AR and MR devices for analysis

| AR Glasses (GL) | MR Headsets (HS) |
| --- | --- |
| Vuzix M400 | MagicLeap 2 |
| Vuzix M4000 | Meta Quest 3 |
| RealWear Navigator 500 | HTC Vive XR Elite |
| ThirdEye X2 | Varjo XR-4 |
|  | Pico 4 Pro |
|  | Apple Vision Pro |
|  | Microsoft HoloLens 2 |

All conclusions in this study are expressed as qualitative assessments of device suitability based on manufacturer specifications and publicly available documentation, rather than empirically validated performance comparisons.

3. Results and discussion

The interpretations presented in this section are derived from the qualitative, criteria-based methodology described in Section 2. Conclusions regarding suitability are not based on assumed preferences or generic ergonomic norms, but on systematic mapping between documented device characteristics and the operational demands of PAM tasks. For example, observations related to ergonomics, mobility, or immersion are grounded in measurable parameters such as device weight, display type, field of view, and certification profiles, and are interpreted within the context of typical PAM scenarios (e.g., full-shift field inspections versus short-duration training sessions). This structured reasoning framework enhances the reliability and reproducibility of the analysis, even in the absence of empirical performance testing.

This section therefore presents comparative findings from the evaluation of the selected AR and MR devices, providing a qualitative comparison of key technical specifications relevant to their use in PAM. To the best of our knowledge, there are currently no standardised benchmarking frameworks specifically for AR/MR equipment within the PAM domain. Furthermore, distinct PAM contexts (e.g. hazardous industrial sites versus controlled cleanrooms) impose unique operational constraints, so applying a generic scoring model would require a customised framework, which does not yet exist. The development of such a framework is a complex undertaking that merits dedicated study. Consequently, the following discussion provides a qualitative assessment of suitability, mapping technical specifications directly to the distinct requirements and features relevant to the PAM context.

3.1. CPU and RAM

The computational capabilities of AR and MR devices are primarily determined by their central processing units (CPUs) and memory (RAM), which directly influence performance, responsiveness, and the ability to run complex applications. This section presents a comparative overview of the CPU and RAM specifications of selected AR glasses and MR headsets, highlighting the variations in processing power across devices designed for different use cases. Table 3 summarizes the key specifications of these systems.

The results show a clear division in processing capability between AR glasses and MR headsets. While AR glasses offer modest configurations, MR headsets far surpass them in computing power. As such, whenever the task demands it, MR headsets are generally more suitable for heavier processing workloads.

Table 3. CPU and RAM specifications of the selected AR/MR devices

| Device | CPU | RAM |
| --- | --- | --- |
| Vuzix M400 | Qualcomm XR1 | 6 GB |
| Vuzix M4000 | Qualcomm XR1 | 6 GB |
| RealWear Navigator 500 | Snapdragon 662 | 4 GB |
| ThirdEye X2 | Qualcomm XR1 | 4 GB |
| MagicLeap 2 | AMD Zen 2 | 16 GB |
| Meta Quest 3 | Snapdragon XR2 | 8 GB |
| HTC Vive XR Elite | Snapdragon XR2 | 8 GB |
| Varjo XR-4 | Host-dependent | Host-dependent |
| Pico 4 Pro | Snapdragon XR2 | 8 GB |
| Apple Vision Pro | Apple M2 | 16 GB |
| Microsoft HoloLens 2 | Snapdragon 850 Compute Platform | 4 GB |
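The split shown in Table 3 can be made operational as a first short-listing step. The sketch below is illustrative only: the RAM figures are transcribed from Table 3 (the Varjo XR-4 is omitted because its resources are host-dependent), while the structure and any threshold values are our own assumptions, not part of the study's method.

```python
# Illustrative short-listing helper; RAM figures transcribed from Table 3.
# The Varjo XR-4 is omitted because its CPU/RAM are host-dependent.
DEVICES = {
    "Vuzix M400": {"ram_gb": 6, "category": "GL"},
    "Vuzix M4000": {"ram_gb": 6, "category": "GL"},
    "RealWear Navigator 500": {"ram_gb": 4, "category": "GL"},
    "ThirdEye X2": {"ram_gb": 4, "category": "GL"},
    "MagicLeap 2": {"ram_gb": 16, "category": "HS"},
    "Meta Quest 3": {"ram_gb": 8, "category": "HS"},
    "HTC Vive XR Elite": {"ram_gb": 8, "category": "HS"},
    "Pico 4 Pro": {"ram_gb": 8, "category": "HS"},
    "Apple Vision Pro": {"ram_gb": 16, "category": "HS"},
    "Microsoft HoloLens 2": {"ram_gb": 4, "category": "HS"},
}

def shortlist(min_ram_gb):
    """Return the devices that meet a minimum RAM requirement, sorted by name."""
    return sorted(name for name, spec in DEVICES.items()
                  if spec["ram_gb"] >= min_ram_gb)
```

For example, requiring 16 GB of RAM leaves only the MagicLeap 2 and Apple Vision Pro, matching the observation that heavy processing workloads point to high-end MR headsets.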

3.2. Internal storage

Internal storage capacity and type are critical factors influencing the data handling capabilities and application reliability of AR and MR devices. This section presents an overview of the storage specifications of various AR glasses and MR headsets, highlighting differences in storage size, expandability, and underlying technology.

These specifications can impact the device’s suitability for data-intensive applications, such as real-time asset visualization, offline content access, and extended field use without reliance on cloud connectivity (especially in PAM, where some environments may not have a reliable connection to the Internet).

The data gathered on this subject is presented in Table 4. Storage capacities reinforce a very clear performance split. While the discussed AR glasses uniformly feature 64 GB flash storage, MR headsets offer significantly higher capacities.

Table 4. Storage specifications of the selected AR/MR devices

| Device | Size | Type |
| --- | --- | --- |
| Vuzix M400 | 64 GB | Flash |
| Vuzix M4000 | 64 GB | Flash |
| RealWear Navigator 500 | 64 GB | Flash + MicroSD |
| ThirdEye X2 | 64 GB | Flash |
| MagicLeap 2 | 256 GB | SSD |
| Meta Quest 3 | 512 GB | Flash |
| HTC Vive XR Elite | 128 GB | Flash |
| Varjo XR-4 | Host PC dependent | — |
| Pico 4 Pro | 128 GB / 256 GB | Flash |
| Apple Vision Pro | 256 GB / 512 GB / 1 TB | Flash |
| Microsoft HoloLens 2 | 64 GB | Flash |
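A simple capacity check can translate these storage figures into a go/no-go decision for offline use. The helper below is a minimal sketch; the OS reserve and headroom values are illustrative assumptions, not vendor figures.

```python
def offline_capacity_ok(storage_gb, dataset_gb, os_reserve_gb=12.0, headroom=0.10):
    """Check whether a device can hold an offline PAM dataset locally.

    os_reserve_gb approximates space taken by the OS and applications;
    headroom keeps a safety margin for logs and captured media.
    Both values are illustrative assumptions, not vendor figures.
    """
    usable_gb = storage_gb - os_reserve_gb
    return dataset_gb * (1.0 + headroom) <= usable_gb
```

Under these assumptions, a 40 GB documentation set fits on a 64 GB AR-glasses device, but a 50 GB set does not, illustrating why higher-capacity MR headsets are attractive for data-intensive offline workflows.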

3.3. Ergonomics

The successful deployment of AR/MR devices in PAM relies on ergonomic design, as this directly affects user adoption, operational efficiency and long-term usability. As field operators often perform physically demanding tasks over an extended period, comfort, ease of use and physical adaptability are critical factors in device selection. In this context, this subsection evaluates two key factors that influence ergonomic suitability: the weight and field of view (FoV) of selected AR/MR devices. The results of this research can be viewed in Tables 5 and 6, respectively.

The weight analysis highlights the ergonomic advantage of AR glasses, which in our sample weigh approximately 170-274 g (Table 5), in contrast to MR headsets, which are substantially heavier and typically exceed 500 g. Such weight can lead to user fatigue and limit the practical use of headsets for long-duration, mobile tasks.

Complementing this, the FoV comparison highlights a trade-off between form factor and visual immersion: AR glasses, while more comfortable and lightweight, generally offer a narrower field of view. In contrast, headsets provide significantly wider FoVs, offering a more immersive experience that is especially beneficial in wide environments or when managing high information loads.

In summary, based on the provided data, lightweight AR glasses are typically better suited for mobile, long-duration field operations, whereas headsets with larger FoV are more appropriate for stationary or semi-mobile scenarios requiring higher levels of visual immersion and interaction.
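This rule of thumb can be written down explicitly. The classifier below is a rough sketch of the reasoning above; the 300 g and 70-degree cut-offs are our own illustrative thresholds, not standardised ergonomic limits.

```python
def ergonomic_profile(weight_g, fov_deg):
    """Map weight and field of view onto the usage profiles of Section 3.3.

    The 300 g and 70-degree cut-offs are illustrative thresholds chosen to
    separate the AR glasses and MR headsets listed in Tables 5 and 6.
    """
    if weight_g < 300 and fov_deg < 70:
        return "mobile, long-duration field work"
    if weight_g >= 500 and fov_deg >= 70:
        return "stationary or semi-mobile immersive tasks"
    return "mixed profile"
```

Applied to the tables below, the Vuzix M400 (190 g, 16.8 degrees) falls into the field-work profile, while the Meta Quest 3 (515 g, 96 degrees) falls into the immersive profile.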

Table 5. Weight specifications of the selected AR/MR devices

| Device | Weight (grams) |
| --- | --- |
| Vuzix M400 | 190 |
| Vuzix M4000 | 222 |
| RealWear Navigator 500 | 274 |
| ThirdEye X2 | 170 |
| MagicLeap 2 | 260 |
| Meta Quest 3 | 515 |
| HTC Vive XR Elite | 625 |
| Varjo XR-4 | 665 |
| Pico 4 Pro | 586 |
| Apple Vision Pro | — |
| Microsoft HoloLens 2 | 566 |

Table 6. FoV specifications of the selected AR/MR devices

| Device | FoV (degrees) |
| --- | --- |
| Vuzix M400 Xtreme Kit | 16.8 |
| Vuzix M4000 | 28 |
| RealWear Navigator 500 | 20 |
| ThirdEye X2 | 42 |
| MagicLeap 2 | 70 |
| Meta Quest 3 | 96 |
| HTC Vive XR Elite | 110 |
| Varjo XR-4 | 115 |
| Pico 4 Pro | 105 |
| Apple Vision Pro | 100-120 |
| Microsoft HoloLens 2 | 52 |

3.4. Display resolution and type

The display characteristics of AR/MR devices greatly influence their effectiveness in various applications. Display resolution, type, and opacity collectively determine the usability of a device by influencing what their operators can perceive.

Display resolution governs the legibility of technical schematics and component labels during equipment inspections. High-resolution displays aid in more precise and legible visualization of any necessary content.

The display type affects contrast ratios and colour fidelity. While this parameter may be unimportant in some situations, use cases that require greater contrast or colour depth may benefit from choosing a display type with better colour optics.

Opacity is perhaps the most operationally significant parameter, creating a fundamental divide in AR/MR device capabilities. Transparent displays maintain environmental awareness while the equipment is in use and make passthrough simpler than on opaque displays, since there is no need to capture and re-project an image of the environment. The trade-off is that they are more expensive to produce, leading to costlier equipment and, in general, lower resolutions. The data gathered on this subject can be consulted in Table 7.

Table 7. Display specifications of the selected AR/MR devices

| Device | Resolution | Opacity |
| --- | --- | --- |
| Vuzix M400 | — | Opaque |
| Vuzix M4000 | — | Transparent |
| RealWear Navigator 500 | 720p | Opaque |
| ThirdEye X2 | 720p | Transparent |
| MagicLeap 2 | 1440p | Transparent |
| Meta Quest 3 | 4K | Opaque |
| HTC Vive XR Elite | 720p | Opaque |
| Varjo XR-4 | — | Opaque |
| Pico 4 Pro | 4K | Opaque |
| Apple Vision Pro | 4K | Opaque |
| Microsoft HoloLens 2 | 2K | Transparent |

Display technology showcases a trade-off between resolution and operational transparency. AR glasses typically offer lower resolutions and simpler display types, though some models like the Vuzix M4000 and ThirdEye X2 support transparent optics, which aid in preserving real-world visibility. MR headsets deliver higher resolutions, providing highly detailed visual feedback well aligned with asset visualization or intricate work procedures.

Notably, transparent headsets such as MagicLeap 2 and HoloLens 2 attempt to bridge this gap by offering both clarity and environmental awareness.

3.5. Audio and voice capabilities

Audio input and output features (Table 8) are crucial for effective operation and user interaction in AR/MR applications. Research such as [36] demonstrates that audio-based interactions significantly improve user attention and task efficiency compared to traditional interaction methods like touchscreens. Consequently, it is essential for AR/MR devices to possess robust and versatile audio capabilities to fully support these interaction modalities. This section outlines the audio and voice-related capabilities of selected AR glasses and MR headsets, focusing on the number and type of built-in speakers and microphones, support for external audio devices, and the presence of voice recognition capabilities.

Table 8. Audio and voice capabilities of the selected AR/MR devices

| Device | Speakers | Microphones | External audio | Voice recognition |
| --- | --- | --- | --- | --- |
| Vuzix M400 | 1 | 1 | Bluetooth | Yes |
| Vuzix M4000 | 1 | 1 | Bluetooth | Yes |
| RealWear Navigator 500 | 1 | 4 | Headphone jack | Yes |
| ThirdEye X2 | 2 | 1 | Headphone jack | Yes |
| MagicLeap 2 | 2 | 4 | Bluetooth | Yes |
| Meta Quest 3 | 2 | 3 | Bluetooth | Yes |
| HTC Vive XR Elite | 2 | 2 | USB-C | Yes |
| Varjo XR-4 | 2 | 2 | Headphone jack | No |
| Pico 4 Pro | 2 | 2 | Bluetooth | Yes |
| Apple Vision Pro | 2 | 6 | Bluetooth | Yes |
| Microsoft HoloLens 2 | 2 | 5 | Bluetooth | Yes |

Voice interaction is uniformly supported across all models, but headsets excel in audio fidelity, largely because they more readily support stereo output. This enables richer audio environments and technologies such as spatial audio, which enhance the user’s experience and allow deeper immersion in the AR/MR setting.

3.6. Camera capabilities and spatial mapping

Cameras and spatial mapping technologies are fundamental to AR/MR devices, as they enable accurate overlay of digital content onto the real-world environment. Cameras capture visual data that serves both as media for documentation and as input for spatial mapping. Spatial mapping refers to the process by which an AR/MR device scans and digitally reconstructs its surroundings, allowing virtual objects to be placed and anchored realistically within the physical space.

A key enabler of spatial mapping is Simultaneous Localization and Mapping (SLAM), which allows the device to track its position and orientation while concurrently building a map of the environment. This capability ensures that virtual content remains stable and correctly aligned even as the user moves. Anchoring further supports this by fixing virtual objects to specific points or surfaces in the real world, preventing unwanted drift.

Depth sensing technologies, such as Light Detection and Ranging (LiDAR) or infrared sensors, capture distance information to generate detailed 3D meshes of the environment. These meshes improve spatial awareness by enabling realistic interactions between physical and virtual elements, including occlusion and collision detection.
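The link between depth data and 3D meshes can be illustrated with the standard pinhole back-projection step that underlies most spatial-mapping pipelines: each depth pixel is lifted to a 3D point, and the resulting cloud is what meshing and occlusion algorithms consume. The snippet below is a simplified sketch, not any vendor's implementation, and the camera intrinsics (fx, fy, cx, cy) are hypothetical.

```python
import numpy as np

def depth_to_points(depth, fx, fy, cx, cy):
    """Back-project a depth image (metres) into an N x 3 point cloud.

    Uses the pinhole model: X = (u - cx) * Z / fx, Y = (v - cy) * Z / fy,
    where (u, v) are pixel coordinates and Z is the measured depth.
    """
    v, u = np.indices(depth.shape)  # v = row index (image y), u = column index (image x)
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return np.stack([x, y, depth], axis=-1).reshape(-1, 3)
```

A real device runtime performs this back-projection per frame, then fuses successive clouds into a persistent mesh that SLAM keeps aligned with the physical environment.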

Table 9 summarizes the relevant camera specifications of AR/MR devices. The key parameters are:

1) RGB Camera Count: Amount of RGB cameras used on the device to capture the surrounding environment.

2) Depth Sensing: Indicates the presence of sensors and algorithms that capture depth data and generate 3D meshes of the surroundings to enhance spatial mapping and realistic interaction.

3) Camera Resolution (MP): The megapixel count of the camera, indicating image quality and detail for media capture and spatial analysis.

4) SLAM: Indicates whether the device supports Simultaneous Localization and Mapping for real-time environment mapping and localization.

5) Anchoring: Denotes if the device can attach virtual objects persistently to real-world locations.

Table 9. Camera and spatial mapping capabilities of the selected AR/MR devices

| Device | RGB camera count | Depth sensing | MP | SLAM | Anchoring |
| --- | --- | --- | --- | --- | --- |
| Vuzix M400 | 1 | No | 13 | No | No |
| Vuzix M4000 | 1 | No | 12.8 | No | No |
| RealWear Navigator 500 | 1 | No | 48 | No | No |
| ThirdEye X2 | 1 | Yes | 13 | Yes | Yes |
| MagicLeap 2 | 2 | Yes | 12.6 | Yes | Yes |
| Meta Quest 3 | 2 | Yes | — | Yes | Yes |
| HTC Vive XR Elite | 2 | Yes | 16 | Yes | Yes |
| Varjo XR-4 | 2 | Yes | 20 | Yes | Yes |
| Pico 4 Pro | 2 | Yes | 16 | Yes | Yes |
| Apple Vision Pro | 2 | Yes | 6.5 | Yes | Yes |
| Microsoft HoloLens 2 | 2 | Yes | 8 | Yes | Yes |

Camera quality and spatial capabilities are key differentiators. While AR glasses often lack SLAM, anchoring, or depth sensing (except for the ThirdEye X2), all MR headsets support these features. Varjo XR-4, with dual 20MP cameras and LiDAR, offers unmatched environmental mapping. Devices like the Meta Quest 3 and Apple Vision Pro also integrate depth sensors and SLAM for precise spatial awareness. This makes headsets significantly more capable for tasks involving AR/MR overlays in dynamic or complex environments.

Another differentiator lies in geolocation capabilities. All analysed AR glasses are equipped with built-in GPS, enabling accurate outdoor tracking of assets and personnel, an important feature for field operations across large geographic areas. In contrast, the evaluated MR headsets do not include GPS. Instead, they rely on the aforementioned techniques such as SLAM for spatial awareness, making them more suitable for indoor environments where precise positioning is needed.

3.7. Connectivity

In PAM, reliable connectivity is essential for ensuring operational efficiency. It enables seamless data integration with asset management systems, facilitates remote collaboration, and supports a range of other critical functions. Connectivity features are generally standardized across both AR glasses and MR headsets: most devices support Wi-Fi, Bluetooth, and USB-C connections, providing robust wireless and wired communication options.

3.8. Artificial intelligence capabilities

As industries advance and PAM workflows become increasingly complex, the integration of AI within AR/MR devices presents a significant opportunity to enhance productivity.

Built-in AI capabilities enable real-time scene understanding, natural language processing, object recognition, predictive maintenance suggestions, among other tools that help enhance workflows. The following list summarizes the AI capabilities available across the selected devices:

1) MagicLeap 2: 2 dedicated ML cores/14 image processing cores.

2) Meta Quest 3: Meta AI.

3) Apple Vision Pro: Apple Intelligence.

4) Microsoft HoloLens 2: Cortana (Deprecated).

From this overview, it is evident that AI capabilities remain unevenly distributed across the AR/MR landscape. Most current AR glasses and MR headsets lack meaningful onboard AI functionality, relying instead on remote processing. This can introduce latency and limit real-time interaction, particularly problematic in bandwidth-constrained industrial environments.

Only a select few devices, such as the MagicLeap 2 and Apple Vision Pro, currently integrate dedicated AI hardware for local inference. These are better suited for sophisticated PAM applications that require on-device intelligence, such as autonomous diagnostics, voice-command control, and advanced environmental awareness.

However, the absence of dedicated AI processing hardware does not mean the remaining devices are incapable of AI tasks; they simply have less processing power for such workloads, and should greater computational power be required, the work may need to be offloaded to external devices.

3.9. Device interactivity

Efficient interaction is essential for AR/MR applications, not only in consumer settings but especially in industrial environments, where hands-free or intuitive input methods improve workflow efficiency. Table 10 presents the data gathered about the interaction methods available across the selected AR glasses and MR headsets.

Interaction methods vary widely. AR glasses are largely limited to voice and touch, with no hand tracking or controller support. Conversely, MR headsets universally support advanced interaction modes, including gesture control and physical controllers. The HoloLens 2 and MagicLeap 2 are particularly robust, offering seamless multimodal interfaces. This enables more intuitive and efficient workflows in PAM tasks requiring complex interaction, such as virtual equipment manipulation or remote training.

Table 10. Interactivity of the selected AR/MR devices

| Device | Touch input | Voice | Hand tracking | Controller support |
| --- | --- | --- | --- | --- |
| Vuzix M400 | Yes | Yes | No | No |
| Vuzix M4000 | Yes | Yes | No | No |
| RealWear Navigator 500 | Buttons only | Yes | No | No |
| ThirdEye X2 | Yes | Yes | No | No |
| MagicLeap 2 | No | Yes | Yes | Yes |
| Meta Quest 3 | No | Yes | Yes | Yes |
| HTC Vive XR Elite | No | Yes | Yes | Yes |
| Varjo XR-4 | No | Yes | Yes | Yes |
| Pico 4 Pro | No | Yes | Yes | Yes |
| Apple Vision Pro | No | Yes | Yes | No |
| Microsoft HoloLens 2 | No | Yes | Yes | Yes |

3.10. Response latency

Response latency, shown in Table 11, is the time elapsed from when an image is captured by a headset’s camera until it is displayed on the headset’s screen. Essentially, it measures the delay involved in “passing through” the real-world view, processing it, and projecting it onto the display. Although the study by Fuvattanasilp et al. [37] focused on mobile devices rather than MR headsets, it found that users prefer devices with a latency of 132 ms or lower, and that latency differences become noticeable when the gap exceeds 69 ms.
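These thresholds can be encoded directly when comparing candidate devices. The helper below simply applies the two figures reported in [37]; treating them as hard limits for headsets is our extrapolation, since the original study concerned mobile devices.

```python
PREFERRED_MAX_MS = 132   # users preferred latencies at or below this value [37]
NOTICEABLE_GAP_MS = 69   # latency differences become noticeable beyond this gap [37]

def latency_assessment(a_ms, b_ms):
    """Compare two devices' passthrough latencies against the thresholds of [37]."""
    return {
        "both_preferred": a_ms <= PREFERRED_MAX_MS and b_ms <= PREFERRED_MAX_MS,
        "difference_noticeable": abs(a_ms - b_ms) > NOTICEABLE_GAP_MS,
    }
```

Applied to Table 11, the Apple Vision Pro (11 ms) and HTC Vive XR Elite (40 ms) both fall comfortably inside the preferred range, and their 29 ms gap stays below the noticeability threshold.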

Table 11. Latency of the selected MR headsets

| Device | Latency (ms, estimated) |
| --- | --- |
| MagicLeap 2 | — |
| Meta Quest 3 | 39 [38] |
| HTC Vive XR Elite | 40 [38] |
| Varjo XR-4 | 22 [32] |
| Pico 4 Pro | — |
| Apple Vision Pro | 11 [38] |
| Microsoft HoloLens 2 | — |

Low response latency is critical for effective usability in augmented reality (AR) devices, as high latency creates a noticeable lag between the user’s movements and the corresponding visual update. This lag can cause the system to feel unresponsive and severely impact its usability. This is particularly relevant to MR headsets that do not utilize transparent displays but instead rely on video passthrough or similar mechanisms to render the real world digitally, enabling AR/MR functionality.

Among headsets, the Apple Vision Pro and Varjo XR-4 show a significant advantage over the other MR devices. Notably, devices with transparent displays mostly bypass this concern, presenting real-world visuals directly, making them more latency-tolerant.

3.11. Security certifications

In some scenarios, the device in use must be capable of enduring various unpredictable conditions. This aspect becomes even more critical in workplaces where strict adherence to EU or OSHA workplace safety specifications is necessary, making a guarantee of the device’s resilience and toughness paramount. Even though a few headsets hold some safety certifications, AR glasses usually take the edge on this front. The data on this aspect can be observed in Table 12.

Table 12. Security certifications of the selected AR/MR devices

| Device | Certifications |
| --- | --- |
| Vuzix M400 | IP67 |
| Vuzix M4000 | IP67 |
| RealWear Navigator 500 | IP66, MIL-STD-810 |
| ThirdEye X2 | ANSI/ISEA Z87.1, IP40, EN 166 |
| HoloLens 2 (Industrial Edition) | ISO 14644-1 Class 5-8; UL Class I, Division 2 (Groups A, B, C, D) |

For clarity, these certifications will be explained in the following itemized list:

1) IP67 [39]: Fully dust-tight and protected against immersion in water up to 1 meter for 30 minutes. Suitable for rugged environments with high dust and moisture exposure.

2) IP66 [39]: Fully dust-tight and protected against powerful water jets from any direction. Well aligned with industrial or outdoor settings where devices may be sprayed or hosed down.

3) IP40 [39]: Limited protection against solid objects larger than 1 mm (e.g., tools or wires); no protection against liquids. Suitable for clean, indoor environments.

4) MIL-STD-810 [40]: U.S. military standard for environmental durability, including resistance to shock, vibration, extreme temperatures, humidity, and altitude. Ensures suitability for field and industrial use.

5) ANSI/ISEA Z87.1 [41]: American national standard for eye protection. Ensures resistance to high-velocity impacts, splash protection, and optical clarity. Commonly required in industrial and construction sites.

6) EN 166: European standard for personal eye protection. Covers mechanical strength, dust and liquid resistance, and optical quality. Widely used in workplaces across the EU, it was replaced in November 2024 by EN ISO 16321:2022.

7) ISO 14644-1 Class 5 to 8 [42]: Certifies that the device meets the cleanliness standards for use in clean rooms. It complies with particle emission limits typically required in environments such as pharmaceutical manufacturing and semiconductor fabrication.

8) UL Class I, Division 2 (Groups A, B, C, D) [43]: Ensures that the device is safe for use in hazardous locations where flammable gases, vapours, or liquids may be present. This is a North American standard for equipment used in potentially explosive atmospheres.

AR glasses therefore have a noticeable advantage on this front, whereas MR headsets generally lack such certifications, making them less appropriate for exposure to more abrasive conditions.
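The IP codes above follow a fixed two-digit scheme (solids first, then liquids), which makes them straightforward to decode programmatically. The sketch below covers only the digit values that appear in Table 12; a full decoder would implement the complete IEC 60529 tables.

```python
# Meanings of the IP digits that occur in Table 12 (subset of IEC 60529).
SOLIDS = {"4": "objects larger than 1 mm", "6": "dust-tight"}
LIQUIDS = {
    "0": "no protection",
    "6": "powerful water jets",
    "7": "immersion up to 1 m for 30 minutes",
}

def parse_ip(code):
    """Decode an IP rating such as 'IP67' into its solid/liquid protection levels."""
    if not (code.startswith("IP") and len(code) == 4):
        raise ValueError(f"unsupported IP code: {code}")
    return {"solids": SOLIDS[code[2]], "liquids": LIQUIDS[code[3]]}
```

For instance, the Vuzix glasses' IP67 rating decodes to dust-tight construction plus protection during temporary immersion, consistent with item 1 of the list above.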

3.12. SDK and engine support

Engine and SDK support is crucial for a good developer experience and fast turnaround times, as support for graphical engines such as Unity [44] and Unreal [45] can significantly speed up development. The gathered information can be seen in Table 13.

Table 13. Engine and SDK support of the selected AR/MR devices

| Device | Unity | Unreal | SDK info |
| --- | --- | --- | --- |
| Vuzix M400 | Yes | No | No |
| Vuzix M4000 | Yes | No | No |
| RealWear Navigator 500 | Yes | No | No |
| ThirdEye X2 | Yes | Yes | VisionEye SDK |
| Meta Quest 3 | Yes | Yes | Meta SDK / OpenXR |
| HTC Vive XR Elite | Yes | Yes | VIVE OpenXR / Wave |
| Varjo XR-4 | Yes | Yes | Plugin support |
| Pico 4 Pro | Yes | Yes | Pico SDK |
| Apple Vision Pro | Yes | No | No |
| Microsoft HoloLens 2 | Yes | Yes | Native / OpenXR |

Development support is comprehensive across most MR headsets, with near-universal compatibility with the Unity and Unreal engines. OpenXR and proprietary SDKs further simplify integration into existing PAM platforms. AR glasses lag in this area, with only the ThirdEye X2 offering broader engine and SDK support. In terms of documentation, all listed SDKs are comprehensively documented, with detailed descriptions of functionality and a variety of examples covering their features.

3.13. Summary of comparative findings

Table 14 provides a qualitative summary of how the selected AR glasses align with the PAM scenario-based requirements considered in Subsection 2.2 (Table 1). All AR glasses analysed weigh between 170 g and 274 g, ensuring they meet the hands-free principle essential for full-shift maintenance and assembly tasks. Devices like the RealWear Navigator 500 and the Vuzix models are highly ruggedized for harsh environments, whereas the ThirdEye X2 is suited to cleaner, indoor environments due to its IP40 rating. All AR glasses analysed are designed for compatibility with personal protective equipment such as hard hats and safety glasses, and their transparent optics help maintain situational awareness in safety-critical environments.

Most AR glasses lack SLAM and depth sensing, making them more suitable for remote assistance and digital checklists rather than complex 3D asset visualization. Only ThirdEye X2 supports these advanced features. The analysed devices are primarily limited to voice and touch inputs, notably lacking the sophisticated hand tracking found in MR headsets. As optical see-through devices, AR glasses provide “zero latency” for real-world viewing, which is essential to maintain situational awareness. Concerning development support, the ThirdEye X2 offers the most versatile development support among the glasses, including Unreal Engine and the proprietary VisionEye SDK.

Table 14. Selected AR glasses requirement coverage

| Requirement | Vuzix M400 | Vuzix M4000 | RealWear Navigator 500 | ThirdEye X2 |
| --- | --- | --- | --- | --- |
| R1 – Lightweight/Wearability | ✓ | ✓ | ✓ | ✓ |
| R2 – Ruggedization | ✓ | ✓ | ✓ | △ |
| R3 – Safety/Compliance | — | — | — | — |
| R4 – Offline capability | — | — | — | — |
| R5 – Connectivity | — | — | — | — |
| R6 – Display/Audio quality | — | — | — | — |
| R7 – Spatial mapping | X | X | X | ✓ |
| R8 – Interaction richness | — | — | — | — |
| R9 – Latency tolerance | ✓ | ✓ | ✓ | ✓ |
| R10 – SDK/Integration readiness | X | X | X | ✓ |

✓ – Fully covered; △ – Partially covered; X – Not covered; — – not determined

Table 15 provides a qualitative summary of how the selected MR headsets align with the PAM scenario-based requirements considered in Subsection 2.2 (Table 1). MR headsets generally provide significantly higher computational power and 4K resolutions compared to AR glasses, making them more suitable for detailed 3D asset visualization and complex Digital Twin applications. However, they are substantially heavier than AR glasses, typically exceeding 500 g (e.g., the Varjo XR-4 at 665 g and the HTC VIVE XR Elite at 625 g), which increases the potential for neck fatigue and restricts their use to shorter or semi-mobile tasks. Unlike most AR glasses, MR headsets have strong spatial awareness capabilities, supporting SLAM, depth sensing, and multimodal interactions such as hand tracking and physical controllers.

Most MR headsets are fragile and lack industrial ruggedization certifications, making them unsuitable for hazardous environments; the Microsoft HoloLens 2 is an exception, with clean room and hazardous location certifications. Passthrough-based MR headsets (e.g., Meta Quest 3) introduce variable response latency (ranging from 11 ms to 40 ms), which can impact usability and cause disorientation if not strictly minimized. The HoloLens 2, as an optical see-through device, mostly bypasses this concern. Devices with large internal storage, such as the Apple Vision Pro (up to 1 TB) and Meta Quest 3 (512 GB), are rated as fully covering offline scenarios. The Microsoft HoloLens 2 and HTC VIVE XR Elite are marked as partial due to their smaller, non-expandable storage capacities of 64 GB and 128 GB, respectively. Concerning the SDK/integration readiness requirement, while most devices offer near-universal compatibility with the Unity and Unreal engines, the Apple Vision Pro is rated as partial because it currently lacks Unreal Engine support.

Table 15. Selected MR headsets requirement coverage

| Requirement | ML2 | Q3 | VIVE | Varjo | Pico | AVP | MHL2 |
| --- | --- | --- | --- | --- | --- | --- | --- |
| R1 – Lightweight/Wearability | — | — | — | — | — | — | — |
| R2 – Ruggedization | X | X | X | X | X | X | ✓ |
| R3 – Safety/Compliance | X | X | X | X | X | X | ✓ |
| R4 – Offline capability | — | ✓ | △ | — | — | ✓ | △ |
| R5 – Connectivity | — | — | — | — | — | — | — |
| R6 – Display/Audio quality | — | — | — | — | — | — | — |
| R7 – Spatial mapping | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ |
| R8 – Interaction richness | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ |
| R9 – Latency tolerance | — | — | — | — | — | — | — |
| R10 – SDK/Integration readiness | ✓ | ✓ | ✓ | ✓ | ✓ | △ | ✓ |

✓ – Fully covered; △ – Partially covered; X – Not covered; — – not determined
ML2: Magic Leap 2; Q3: Meta Quest 3; VIVE: HTC VIVE XR Elite; Varjo: Varjo XR-4; Pico: Pico 4 Pro; AVP: Apple Vision Pro; MHL2: Microsoft HoloLens 2

Table 16 provides a qualitative assessment of the suitability of the analysed AR/MR devices across the six PAM-derived scenarios defined in Section 2.2. AR glasses are generally rated high for the field inspection (S1) and hazardous site operation (S4) scenarios. This is due to their lightweight design (under 300 g), transparent optics, and ruggedisation, which prioritize operator endurance and situational awareness. The Vuzix M400, Vuzix M4000, and RealWear Navigator 500 are rated high for S4 scenarios due to their IP66/67 and MIL-STD-810 certifications. Among the headsets, only the Microsoft HoloLens 2 provides equivalent high suitability.

Table 16. PAM-derived scenario suitability of the selected AR/MR devices

| Device | S1 field | S2 remote | S3 train | S4 hazard | S5 offline | S6 indoor |
| --- | --- | --- | --- | --- | --- | --- |
| Vuzix M400 | ✓ | — | X | ✓ | — | X |
| Vuzix M4000 | ✓ | — | X | ✓ | — | X |
| RealWear Navigator 500 | ✓ | — | X | ✓ | ✓ | X |
| ThirdEye X2 | — | — | — | — | — | ✓ |
| Magic Leap 2 | — | — | ✓ | X | — | ✓ |
| Pico 4 Pro | X | — | ✓ | X | — | ✓ |
| Meta Quest 3 | X | — | ✓ | X | ✓ | ✓ |
| Apple Vision Pro | X | — | ✓ | X | ✓ | ✓ |
| HTC VIVE XR Elite | X | — | ✓ | X | — | ✓ |
| Microsoft HoloLens 2 | — | — | ✓ | ✓ | — | ✓ |

✓ – High suitability; △ – Medium suitability; X – Low suitability; — – not determined

The analysed MR headsets are rated high for the training and simulation (S3) and indoor asset visualization and navigation (S6) scenarios. Their high suitability is attributed to superior computational power and advanced spatial mapping capabilities, making them appropriate for complex visualization in controlled settings. The ThirdEye X2 AR glasses are also rated high for the S6 scenario type, since the device possesses the SLAM and depth-sensing hardware required for indoor anchoring and precise digital overlays. The RealWear Navigator 500 and most MR headsets are rated high for the offline maintenance workflows scenario (S5) due to the higher storage capacities or expandability options needed to manage large local datasets.

To conclude the technical analysis presented in this section, the findings are presented as a final descriptive and interpretative summary through a two-fold perspective: parameters used for the evaluation of the AR/MR devices (Table 17) and PAM-derived scenarios (Table 18).

Table 17. Comparative summary of AR glasses vs. MR headsets and their PAM suitability

| Parameter | AR Glasses (GL) | MR Headsets (HS) | Suitability for PAM |
| --- | --- | --- | --- |
| Processing and storage | Modest CPU/RAM; lower storage (typically 64 GB) | High-performance CPU; large storage (up to 1 TB) | GL: sufficient for remote assistance and digital checklists. HS: typically required for complex Digital Twins and offline AI processing |
| Ergonomics | Lightweight (<300 g); low physical fatigue | Heavy (>500 g); high potential for neck fatigue | GL: well aligned with full-shift, mobile inspection tasks. HS: likely better suited for short-duration or stationary tasks |
| Display and visuals | Transparent optics; lower resolution; high situational awareness | Opaque/passthrough; 4K resolution; immersive visual isolation | GL: safety-critical environments requiring a real-world view. HS: detailed schematic review and 3D visualization |
| Environment and safety | Rugged (IP67, MIL-STD-810, safety glass standards) | Generally fragile; lack industrial certifications | GL: hazardous, outdoor, or industrial sites. HS: controlled clean rooms, offices, or training facilities |
| Spatial mapping | Basic; often relies on GPS for outdoor tracking | Advanced (SLAM, LiDAR, depth sensing); no GPS | GL: geo-located outdoor assets. HS: precise indoor overlay and complex spatial anchoring |
| Interaction | Voice and touch (buttons/pad); hands-free focus | Hand tracking and controllers; multimodal input | GL: simple data entry/retrieval while working with tools. HS: manipulating complex virtual 3D objects |
| Latency | Zero latency (optical see-through) | Variable latency (video see-through, ≈11-40 ms) | GL: preferred where operator dizziness/disorientation is a risk. HS: acceptable if latency is minimized (<132 ms) |
| Software support | Proprietary SDKs; limited engine support | Broad Unity/Unreal/OpenXR support | GL: specific vendor workflows. HS: custom, scalable enterprise development |

From a parameter perspective, Table 17 provides a condensed overview of the differences between AR glasses and MR headsets. In line with the qualitative methodology outlined in Section 2, this summary maps the technical specifications discussed in the previous subsections directly onto their operational implications for PAM. From the perspective of PAM-derived scenarios, Table 18 presents a summary of the corresponding operational contexts, device parameters, and global suitability of AR glasses and MR headsets.

The comparison reveals a clear functional dichotomy. AR glasses prioritise operator endurance, environmental resilience, and situational awareness, making them the appropriate choice for fieldwork and active inspections. By contrast, MR headsets prioritise computational power, visual fidelity, and complex interaction, making them better suited to training and detailed 3D visualisation in controlled environments. This confirms that no single device can satisfy all PAM use cases, and that equipment selection must be strictly dictated by the operational profile of the task at hand.
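The selection logic described above can be illustrated with a minimal sketch. The following Python fragment is purely illustrative and is not part of the study: the task-profile fields and the decision rules are hypothetical simplifications of the dichotomy discussed in this section (endurance, safety, and situational awareness favouring AR glasses; compute-heavy 3D interaction favouring MR headsets).

```python
# Illustrative sketch only: a rule-based device-category selector encoding the
# functional dichotomy between AR glasses and MR headsets. Field names and
# rules are hypothetical simplifications, not taken from the study's data.
from dataclasses import dataclass

@dataclass
class TaskProfile:
    mobile: bool                # operator moves around the site
    long_duration: bool         # full-shift wear expected
    hazardous: bool             # safety certifications required
    needs_3d_interaction: bool  # manipulating complex virtual objects

def recommend_device(task: TaskProfile) -> str:
    """Return 'AR glasses' or 'MR headset' for a simplified task profile."""
    # Safety and endurance constraints dominate: ruggedisation, low weight,
    # and transparent optics favour AR glasses in the field.
    if task.hazardous or (task.mobile and task.long_duration):
        return "AR glasses"
    # Controlled, interaction-heavy tasks favour MR headsets' compute,
    # spatial mapping, and multimodal input.
    if task.needs_3d_interaction:
        return "MR headset"
    return "AR glasses"  # default to the lighter, more resilient option

inspection = TaskProfile(mobile=True, long_duration=True,
                         hazardous=False, needs_3d_interaction=False)
training = TaskProfile(mobile=False, long_duration=False,
                       hazardous=False, needs_3d_interaction=True)
print(recommend_device(inspection))  # AR glasses
print(recommend_device(training))    # MR headset
```

A real selection procedure would of course involve more parameters and finer gradations; the sketch only shows that the operational profile, not a global ranking, drives the choice.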

Table 18. PAM scenario-based summary of the suitability of AR glasses and MR headsets

Scenario: Full-shift field inspection
Operational context: Mobile; indoor/outdoor; safety-critical; long duration; intermittent connectivity
Relevant device parameters: Low weight (Table 5); transparent display (Table 7); ruggedisation and safety certifications (Table 12); offline capability and storage (Table 4)
Suitability: GL: ✓; HS: X

Scenario: Remote assistance during maintenance
Operational context: Mobile; hands-free; mixed indoor/outdoor; network-dependent
Relevant device parameters: Hands-free interaction (Table 10); camera and documentation capture (Table 9); audio/voice support (Table 8); connectivity (Section 3.7)
Suitability: GL: ✓; HS: X

Scenario: Controlled training and simulation
Operational context: Stationary/semi-mobile; indoor; controlled environment
Relevant device parameters: High visual fidelity and wide FoV (Tables 6–7); spatial mapping and depth sensing (Table 9); compute capacity (Table 3); multimodal interaction (Table 10)
Suitability: GL: X; HS: ✓

Scenario: Hazardous or regulated site operation
Operational context: Industrial; safety-critical; hazardous zones; PPE required
Relevant device parameters: Safety certifications and ruggedisation (Table 12); transparent optics (Table 7); minimal latency (Table 11); reliable audio (Table 8)
Suitability: GL: ✓; HS: X

Scenario: Offline maintenance workflow
Operational context: Field-based; disconnected; documentation-heavy
Relevant device parameters: Local storage and offline operation (Table 4); documentation capture (Table 9); OS/SDK support (Section 3.12); battery endurance (qualitative)
Suitability: GL: ✓; HS: △

Scenario: Indoor asset visualisation and navigation
Operational context: Indoor; spatially complex; precision-oriented
Relevant device parameters: SLAM, depth sensing, and anchoring (Table 9); wide FoV (Table 6); low latency (Table 11)
Suitability: GL: X; HS: ✓

✓ – Well-suited; △ – Partially suitable; X – Not suitable

4. Conclusions

AR/MR technologies have the potential to significantly enhance PAM by supporting maintenance and inspection processes, improving operational efficiency, and increasing safety in the management of physical assets. The main goal of this article was to provide a comparative overview of various AR/MR devices currently available on the market, with a focus on assessing their applicability within a PAM context. Most existing studies target consumer, gaming, or general enterprise use, meaning that PAM decision-makers often lack consolidated, PAM-specific hardware comparisons. This work aims to fill this gap and help organizations understand how device characteristics align with PAM requirements. Rather than promoting a universal “best” device, which does not exist in this context, this study offers a way to map device capabilities to PAM tasks.

The lack of first-hand experience with each device and the reliance on product sheets mean that device performance could not be empirically validated. In addition, missing or undisclosed manufacturer data reduced completeness for some parameters. While the scope of the study is limited, several key observations can be drawn:

1) AR glasses tend to offer more affordable and ergonomic solutions. Although they are typically more limited in terms of hardware capabilities, their lightweight design and resilience make them well-suited for use in environments that may be challenging for more delicate electronic equipment.

2) MR headsets, on the other hand, generally provide greater computational power and a broader range of functionalities. However, these advantages come with trade-offs, including higher costs, shorter battery life, increased bulk, and latency issues that are less pronounced in AR glasses.

In summary, no single category of device emerges as definitively superior. Rather, each is better suited to specific use cases and operational environments, underscoring the importance of context when selecting AR/MR equipment for PAM applications.

AR glasses align well with mobile inspections and long-duration field tasks due to their lightweight design, durability, and certification profiles. Their transparent optics are also advantageous in environments where situational awareness is critical. MR headsets, on the other hand, provide richer spatial mapping and visualization capabilities, making them more suitable for training, complex maintenance, and static or semi-mobile workflows. However, despite strong technical capabilities, devices without ruggedisation or safety certifications may be unsuitable for hazardous PAM environments.

For further research, live tests could be performed with the devices to address the limited scope of this study, which was based solely on product sheets and official documentation. Because such documentation does not reveal real-world performance, future research should measure response latency, battery endurance, SLAM stability, and ergonomic impact in operational settings. Furthermore, user acceptance testing could also be included, as it is an important metric in determining a device’s utility. Hands-on studies could assess cognitive load, usability, and acceptance during typical PAM workflows such as inspections or maintenance procedures.

Another relevant area to explore as future work is integration complexity, evaluating each device’s interoperability with asset management systems, IoT platforms, and digital twins.

Finally, while this study provides a qualitative mapping of device capabilities, the lack of a standardised benchmarking framework in existing literature is a notable shortcoming. Addressing this is not straightforward, as a universal scoring model would be inadequate for such a diverse field as PAM. An effective evaluation system therefore requires a dynamic, weighted-criteria approach that prioritises parameters such as ergonomics and latency based on specific use case profiles. The development of such a complex, adaptable framework merits its own dedicated study. This could serve as the quantitative counterpart to the qualitative insights presented in this article.
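As a minimal sketch of what such a dynamic, weighted-criteria approach might look like, the following Python fragment re-weights the same device parameters per use-case profile. All scores, weights, parameter names, and use-case labels are hypothetical examples introduced for illustration only; a real framework would derive them from measured data and expert weighting rather than the arbitrary values used here.

```python
# Illustrative sketch of a weighted-criteria device evaluation. All numeric
# values below are hypothetical placeholders, not measured results.

# Per-category capability scores on a 0-1 scale (hypothetical values).
DEVICE_SCORES = {
    "AR glasses": {"ergonomics": 0.9, "latency": 0.9,
                   "spatial_mapping": 0.4, "compute": 0.4},
    "MR headset": {"ergonomics": 0.4, "latency": 0.6,
                   "spatial_mapping": 0.9, "compute": 0.9},
}

# Use-case profiles re-weight the same parameters (weights sum to 1.0).
USE_CASE_WEIGHTS = {
    "field_inspection": {"ergonomics": 0.4, "latency": 0.3,
                         "spatial_mapping": 0.1, "compute": 0.2},
    "indoor_training":  {"ergonomics": 0.1, "latency": 0.2,
                         "spatial_mapping": 0.35, "compute": 0.35},
}

def rank_devices(use_case: str) -> list[tuple[str, float]]:
    """Rank device categories by weighted score for a given use-case profile."""
    weights = USE_CASE_WEIGHTS[use_case]
    scored = [
        (device, sum(weights[p] * scores[p] for p in weights))
        for device, scores in DEVICE_SCORES.items()
    ]
    return sorted(scored, key=lambda pair: pair[1], reverse=True)

print(rank_devices("field_inspection")[0][0])  # AR glasses
print(rank_devices("indoor_training")[0][0])   # MR headset
```

The key design point is that the device scores stay fixed while the weights change with the use case, so the same evidence base yields different rankings for different operational profiles, which is exactly why a single global score would be inadequate for PAM.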

References

  • F. Malva, J. Marinho, N. C. Martins, A. Malta, M. Mendes, and N. Ferreira, “A comparative analysis of ar equipment for physical asset management applications,” in Proceedings of PAMDAS 2025 – International Conference on Physical Asset Management and Data Science, Jul. 2025.
  • S. Ahmed, “A review on using opportunities of augmented reality and virtual reality in construction project management,” Organization, Technology and Management in Construction: an International Journal, Vol. 11, No. 1, pp. 1839–1852, Feb. 2019, https://doi.org/10.2478/otmcj-2018-0012
  • R. Ojha, “Integrating digital twin and augmented reality for asset inspection and training introduction,” International Journal of Research and Analytical Reviews, Vol. 11, Nov. 2024.
  • A. B. Solomashenko, O. L. Afanaseva, M. V. Shishova, I. E. Gulianskii, S. A. Sobolnikov, and N. V. Petrov, “Industrial applications of AR headsets: a review of the devices and experience,” Light: Advanced Manufacturing, Vol. 6, No. 2, p. 358, Jan. 2025, https://doi.org/10.37188/lam.2025.023
  • A. Mehrfard, J. Fotouhi, G. Taylor, T. Forster, N. Navab, and B. Fuerst, “A comparative analysis of virtual reality head-mounted display systems,” arXiv:1912.02913, Jan. 2019, https://doi.org/10.48550/arxiv.1912.02913
  • “Asset management – Vocabulary, overview and principles,” ISO Standardization, ISO 55000:2024, 2024.
  • D. Maletič, M. Maletič, B. Al-Najjar, and B. Gomišček, “An analysis of physical asset management core practices and their influence on operational performance,” Sustainability, Vol. 12, No. 21, p. 9097, Oct. 2020, https://doi.org/10.3390/su12219097
  • A. Zein and M. Karimah, “The role of internet of things (IOT) in enhancing asset management and operational efficiency,” Dirya: Journal of Economic Management, Vol. 2, No. 1, pp. 1–8, Jun. 2025.
  • J. Campos, P. Sharma, U. G. Gabiria, E. Jantunen, and D. Baglee, “A big data analytical architecture for the asset management,” in Procedia CIRP, Vol. 64, pp. 369–374, Jan. 2017, https://doi.org/10.1016/j.procir.2017.03.019
  • N. Moretti, X. Xie, J. Merino Garcia, J. Chang, and A. Kumar Parlikad, “Digital twin based built environment asset management services development,” in IOP Conference Series: Earth and Environmental Science, Vol. 1101, No. 9, p. 092023, Nov. 2022, https://doi.org/10.1088/1755-1315/1101/9/092023
  • K. O. Koumou and O. E. Isafiade, “Asset management trends in diverse settings involving immersive technology: a systematic literature review,” IEEE Access, Vol. 12, pp. 141785–141813, Jan. 2024, https://doi.org/10.1109/access.2024.3461548
  • M. Kostolani, J. Murin, and S. Kozak, “Intelligent predictive maintenance control using augmented reality,” in 2019 22nd International Conference on Process Control (PC19), pp. 131–135, Jun. 2019, https://doi.org/10.1109/pc.2019.8815042
  • Z.-H. Lai, W. Tao, M. C. Leu, and Z. Yin, “Smart augmented reality instructional system for mechanical assembly towards worker-centered intelligent manufacturing,” Journal of Manufacturing Systems, Vol. 55, pp. 69–81, Apr. 2020, https://doi.org/10.1016/j.jmsy.2020.02.010
  • B. A. Ouedraogo, L.-C. Lien, U. Dolgorsuren, and Y. N. Liu, “Using virtual reality and augmented reality for presale house customer change,” in 37th International Symposium on Automation and Robotics in Construction, Oct. 2020, https://doi.org/10.22260/isarc2020/0003
  • S. M. Satapathy, R. Jhaveri, U. Khanna, and A. K. Dwivedi, “Smart rent portal using recommendation system visualized by augmented reality,” Procedia Computer Science, Vol. 171, pp. 197–206, Jan. 2020, https://doi.org/10.1016/j.procs.2020.04.021
  • M. E. Dolas and M. Ulukavak, “Displaying infrastructure data with augmented reality technology in field works,” in Proceedings of the 3rd International Conference on Virtual Reality, 2021.
  • D. Mascareñas et al., “Augmented reality for enabling smart nuclear infrastructure,” Frontiers in Built Environment, Vol. 5, p. 82, Jun. 2019, https://doi.org/10.3389/fbuil.2019.00082
  • P. Gong, Y. Lu, R. Lovreglio, X. Lv, and Z. Chi, “Applications and effectiveness of augmented reality in safety training: A systematic literature review and meta-analysis,” Safety Science, Vol. 178, p. 106624, Oct. 2024, https://doi.org/10.1016/j.ssci.2024.106624
  • P. Negi et al., “Specific soft computing strategies for the digitalization of infrastructure and its sustainability: a comprehensive analysis,” Archives of Computational Methods in Engineering, Vol. 31, No. 3, pp. 1341–1362, Oct. 2023, https://doi.org/10.1007/s11831-023-10018-x
  • J. R. Minnaar, “Developing a framework for identifying and assessing data quality issues in asset management decision-making,” Stellenbosch University, Stellenbosch, 2015.
  • Y. Qian, K. Zhang, L. Cao, and Z. Zhang, “Design and research of distribution equipment and asset management system based on internet of things technology,” in 2nd IEEE Conference on Energy Internet and Energy System Integration (EI2), pp. 1–6, Oct. 2018, https://doi.org/10.1109/ei2.2018.8582376
  • M. Lorenz, S. Knopp, and P. Klimant, “Industrial augmented reality: requirements for an augmented reality maintenance worker support system,” in IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct), pp. 151–153, Oct. 2018, https://doi.org/10.1109/ismar-adjunct.2018.00055
  • B. Nagaiah, “Field service inspection using AR/VR systems integrated with enterprise systems,” International Journal of Multidisciplinary Research and Growth Evaluation, Vol. 6, No. 4, pp. 1274–1277, Jan. 2025, https://doi.org/10.54660/.ijmrge.2025.6.4.1274-1277
  • L. Bettina and O. Astrid, “Ergonomic design of mobil interaction devices to assist field worker and increase process safety,” Chemical Engineering Transactions, Vol. 82, pp. 145–150, Oct. 2020, https://doi.org/10.3303/cet2082025
  • “Vuzix m400 product sheet.” Vuzix, https://vuzix-website.s3.amazonaws.com/files/content/product-sheets/vuzix-m400-smart-glasses.pdf (accessed May 2025).
  • “Vuzix m4000 product sheet.” Vuzix, https://vuzix-website.s3.amazonaws.com/files/Content/product-sheets/Vuzix-M4000-Smart-Glasses.pdf (accessed May 2025).
  • “Realwear navigator 500 product sheet.” Realwear, https://realwear.at/wp-content/uploads/2021/12/datasheet_realwear_hmt-2_navigator500.pdf (accessed May 2025).
  • “Thirdeye x2 presentation.” BorealTech, https://borealtech.com/wp-content/uploads/2020/09/third-eye-x2-mr.pdf (accessed May 2025).
  • “Product specification version 6.2.” MagicLeap, https://www.magicleap.care/hc/en-us/article_attachments/33734272340493 (accessed May 2025).
  • “Meta quest 3 product page.” Meta, https://www.meta.com/quest/quest-3/ (accessed May 2025).
  • “VIVE XR Elite and Deluxe Pack.” HTC, https://www.vive.com/us/product/vive-xr-elite/specs/ (accessed May 2025).
  • “Varjo xr-4 product page.” Varjo, https://b2b-store.varjo.com/product/xr-4 (accessed May 2025).
  • “Pico 4 product specification page.” Pico, https://www.picoxr.com/global/products/pico4/specs (accessed May 2025).
  • “Apple vision pro product specification page.” Apple, https://www.apple.com/apple-vision-pro/specs/ (accessed May 2025).
  • “Microsoft hololens 2 documentation.” Microsoft, https://learn.microsoft.com/pt-pt/hololens/ (accessed May 2025).
  • J. Li, “Beyond sight: enhancing augmented reality interactivity with audio-based and non-visual interfaces,” Applied Sciences, Vol. 14, No. 11, p. 4881, Jun. 2024, https://doi.org/10.3390/app14114881
  • V. Fuvattanasilp, M. Kljun, H. Kato, and K. Pucihar, “The effect of latency on high precision micro instructions in mobile AR,” in MobileHCI’20: 22nd International Conference on Human-Computer Interaction with Mobile Devices and Services, pp. 1–5, Oct. 2020, https://doi.org/10.1145/3406324.3410716
  • T. Björk. “Apple vision pro benchmark test 1: see-through latency, photon-to-photon | optofidelity.” OptoFidelity, https://www.optofidelity.com/insights/blogs/apple-vision-pro-benchmark-test-1-see-through-latency-photon-to-photon (accessed 2024).
  • “Ingress protection (ip) ratings.” IEC, https://www.iec.ch/ip-ratings (accessed May 2025).
  • “MIL-STD-810G: department of defense test method standard for environmental engineering considerations and laboratory tests,” Department of Defense, United States of America, Military Standard MIL-STD-810G, Oct. 2008.
  • “ANSI/ISEA Z87.1-2020: American national standard for occupational and educational personal eye and face protection devices,” International Safety Equipment Association, Arlington, VA, 2020.
  • “ISO 14644-1:2015 – cleanrooms and associated controlled environments – part 1: Classification of air cleanliness by particle concentration,” International Organization for Standardization, 2015.
  • “UL and C-UL Hazardous areas certification for North America.” UL Solutions, https://www.ul.com/services/ul-and-c-ul-hazardous-areas-certification-north-america (accessed 2025).
  • “Unity Real-time development platform.” Unity Technologies, https://unity.com/ (accessed 2025).
  • “Unreal engine.” Epic Games, https://www.unrealengine.com (accessed 2025).

About this article

Received
October 6, 2025
Accepted
February 3, 2026
Published
April 21, 2026
Keywords
augmented reality
mixed reality
ar
mr
glasses
headsets
physical asset management
pam
Acknowledgements

This work is included in the CRIARTE: Construction with Intelligent Robotics and Revolutionary Architecture of Emerging Technologies project (COMPETE2030-FEDER-01192200), developed within the Innovation and Digital Transition Operational Programme (Portugal 2030) and the European Union.

Data Availability

The datasets generated during and/or analyzed during the current study are available from the corresponding author on reasonable request.

Author Contributions

Francisco Malva: investigation, methodology, visualization, writing-original draft preparation. Romeu Nogueira: investigation, visualization, writing-review and editing. Jose Marinho: conceptualization, formal analysis, methodology, project administration, validation, visualization, writing-review and editing. Nuno Cid Martins: conceptualization, formal analysis, methodology, project administration, validation, visualization, writing-review and editing. Ana Malta: resources. Mateus Mendes: funding acquisition, resources. Nuno Ferreira: resources.

Conflict of interest

The authors declare that they have no conflict of interest.