
Automation in Construction 107 (2019) 102940


Remote interactive collaboration in facilities management using BIM-based mixed reality

Khaled El Ammari (a), Amin Hammad (b,*)

(a) Department of Building, Civil and Environmental Engineering, Concordia University, Canada
(b) Concordia Institute for Information Systems Engineering, Concordia University, Canada
(*) Corresponding author. E-mail addresses: elamm_k@encs.concordia.ca (K. El Ammari), hammad@ciise.concordia.ca (A. Hammad).

https://doi.org/10.1016/j.autcon.2019.102940
Received 10 January 2019; Received in revised form 5 August 2019; Accepted 29 August 2019; Available online 5 September 2019

ARTICLE INFO

Keywords: Facilities management; Mixed reality; Feature-based tracking; Building information modeling; Remote interactive visual collaboration

ABSTRACT

Facilities Management (FM) day-to-day tasks require suitable methods to facilitate work orders and improve performance by better collaboration between the office and the field. Building Information Modeling (BIM) provides opportunities to support collaboration and to improve the efficiency of Computerized Maintenance Management Systems (CMMSs) by sharing building information between different applications/users throughout the lifecycle of the facility. However, manual retrieval of building element information can be challenging and time consuming for field workers during FM operations. Mixed Reality (MR) is a visualization technique that can be used to improve visual perception of the facility by superimposing 3D virtual objects and textual information on top of the view of real-world building objects. This paper discusses the development of a collaborative BIM-based MR approach to support facilities field tasks. The proposed framework integrates multisource facilities information, BIM models, and feature-based tracking in an MR-based setting to retrieve information based on time (e.g. inspection schedule) and the location of the field worker, visualize inspection and maintenance operations, and support remote collaboration and visual communication between the field worker and the manager at the office. The field worker uses an Augmented Reality (AR) application installed on his/her tablet. The manager at the office uses an Immersive Augmented Virtuality (IAV) application installed on a desktop computer. Based on the field worker location, as well as the inspection or maintenance schedule, the field worker is given work orders and instructions from the office. Other sensory data (e.g., infrared thermography) can provide additional layers of information by augmenting the actual view of the field worker and supporting him/her in making effective decisions about existing and potential problems while communicating with the office in an Interactive Virtual Collaboration (IVC) mode. Finally, to investigate the applicability of the proposed method, a prototype system is developed and tested in a case study.

1. Introduction

Facilities Management (FM) tasks are fragmented and require gathering and sharing large amounts of information related to facilities spaces and components, and covering historical inspection data and operation information. Despite the availability of sophisticated Computerized Maintenance Management Systems (CMMSs), these systems focus on the data management aspects (i.e., work orders, resource management and asset inventory) and lack the functions required to facilitate data collection and data entry, as well as data retrieval and visualization when and where needed. There are several research problems in the FM domain for supporting field tasks. These problems can be categorized as the following: (1) automatic data capturing and recording issues, (2) visualization issues, and (3) remote collaboration issues.

This research proposes an innovative Building Information Modeling (BIM) based approach for supporting large-scale facilities' field tasks. Specifically, the approach supports Interactive Virtual Collaboration (IVC) between the facility manager and the field worker using different Mixed Reality (MR) methods. Virtual Reality (VR) is a technology that builds a virtual 3D model in a computer to visually reproduce a computer-generated simulation of a real or imaginary environment, which the users can experience. Augmented Reality (AR) is a visualization technology that integrates a real-time view of the user's real environment and virtual objects within the same environment. Augmented Virtuality (AV) is a 3D environment augmented with a part of the real world (e.g., video). Mixed Reality (MR) is a term covering the continuum from reality to AR, AV, and VR [1].



The proposed system supports facilitating data collection and data entry. The AR module of the system provides an accurate selection of the facility element and retrieves all its BIM information without spending time to locate the element in BIM software (e.g. Autodesk Revit). This is facilitated by the accurate pose estimation the proposed hybrid tracking method provides. In addition, adding any extra information is done on the spot without having to search in the BIM model. When the field worker selects a real element (e.g., a light switch) through the AR view, he/she is provided with all the attributes of this component from the BIM database without dealing with the actual BIM model.

Previous research (e.g., [2–4]) shows that collaboration using Augmented Reality (AR) technology can have a significant impact on the Architecture, Engineering and Construction/Facilities Management (AEC/FM) industry. Although MR hardware and software systems are becoming more supportive for a wide range of collaboration scenarios in AEC/FM, there are a number of research issues in MR-based collaboration which need further investigation, including process collaboration and collaborative situational awareness. Furthermore, issues related to the taxonomy of visualization (i.e., when, what, and how to augment) and collaboration (e.g., levels and types of interaction, immersion, telepresence) need more investigation. Previous research did not investigate how MR can be used from the task point of view, or which MR mode is more convenient for field workers or office managers, since the nature of FM tasks requires managers to have more access to attribute information and less access to facilities' geometric data. On the other hand, field workers need to focus on the real facility components and require additional information and support with less time-consuming model navigation and data retrieval.

The proposed framework integrates multisource facilities databases, BIM models, and feature-based tracking in an MR-based distributed setting to retrieve information based on the time and the location of the field worker, visualize inspection and maintenance operations, and support collaboration and visual communication between the field worker and the manager at the office.

The objectives of this paper are: (1) investigating an automated method to capture and record location-based building element data (e.g., defects) and visually share them in real time with the remote office, and (2) investigating the potential of integrating MR and georeferenced BIM data to provide improved remote visual collaboration between field workers and the FM office with less mental load.

The field worker uses an AR-based application installed on his/her tablet, while the manager at the office uses an Immersive Augmented Virtuality (IAV) based application. Finally, to investigate the usability of the proposed method, a prototype system called Collaborative BIM-based Markerless Mixed Reality Facility Management System (CBIM3R-FMS) is developed and tested in a case study.

2. Literature review

During complex facility operations, problem solving still requires a team of experts to physically meet and interact with each other [5]. The identification of the problem and the creation of a shared understanding are major challenges for efficiently solving a problem [6]. The lack of collaborative support for field tasks negatively affects onsite workers' ability to sort out problems [7]. Furthermore, the long physical distance between project participants is one of the primary causes of delays in decision making [8]. According to Kestle [9], misinterpretation and miscommunication of project results and lack of delegated authority to field workers often hinder progress. Miscommunication between stakeholders can lead to a number of avoidable errors, such as those caused by wrong facility element identification, wrong data entering/re-entering, wrong interpretation, and difficulties in supporting field tasks with expert knowledge when needed. Furthermore, the collaboration between the office and field workers in real time is not limited to data sharing, as they need to collaborate to perform inspection or maintenance tasks. Real-time collaboration using MR can help solve these issues by providing better visual communication and interaction.

2.1. Collaboration in MR-based AEC/FM applications

Milgram and Colquhoun [1] introduced the MR continuum diagram, which defines the levels of augmentation. Besides the reality, the levels are Augmented Reality (AR), Augmented Virtuality (AV), and Virtual Reality (VR). MR integrates the real view of the user's environment and some virtual contents. Augmenting the view with virtual contents is called AR, while augmenting the Virtual Environment (VE) with a part of the real view is called AV. In these visualization techniques, additional virtual contents (e.g., picture, text or graphical model) can be superimposed and spatially attached to the relevant objects in the scene. The application of MR has been researched in different domains, such as in construction for supporting building inspection [3], proactive construction defect management using BIM [10], facilitating pipe assembly [11], improving students' performance in a building design project [12], enhancing construction safety [13], improving design collaboration [4], and accessing building information through a situational awareness approach [14].

Collaborative systems have been used in the AEC/FM industry to support product design, maintenance and factory planning. BIM-based systems such as Revizto [15], AutoCAD 360 [16], and Bluescape [17] allow site-office collaboration by sharing BIM and CAD drawings. The visual collaboration in these systems is limited to sharing 2D/3D views, documents and annotations about the facility components. However, collaboration in MR is relatively new due to the recent advancement of telecommunication technologies. Wang and Dunston [18] mentioned that AR can reduce the mental disturbance in collaborative design tasks. Wang et al. [4] proposed a collaborative design system to enhance the distributed cognition among remote designers. Their research focused on how their system affects collaboration based on distributed cognition and mutual awareness. Dong et al. [19] showed that AR can facilitate the communication of engineering processes. Similar research conducted by Datcu et al. [20] led to the development of a system that supports situational awareness among team members in the area of security. In the same context, a remote assistance system called ReMote allows a remote expert to assist workers in a mining site [21]. The remote expert views a real-time reconstruction of a remote workspace using a VR headset. Gauglitz et al. [2] discussed the development of a tablet-based AR system that integrates a touch-based interface used by a remote user to tag real-world elements with annotations for supporting FM operations. Oda et al. [22] introduced an AR-based remote collaboration system for supporting equipment inspection and maintenance. However, the visual collaboration presented in these research projects was done by sharing either the same view or the same model from different perspectives. This kind of collaboration neglects the fact that different users may have different interests in the data they want to access and interact with. These different interests can cause confusion and inaccuracy of data sharing during collaboration. In other words, the ability of a field worker and a remote expert to share ideas about what to do on a specific facility component may improve visual collaboration and reduce the inaccuracy of shared data that may lead to wrong decisions.

2.2. Location tracking methods for indoor MR applications

In recent years, location tracking methods have received great attention in the area of pervasive computing because many applications need to know where the objects of their interest are located. Location tracking in MR applications has been discussed by several researchers. A typology of AR applications based on their tracking requirements is presented in [23]. This research shows that the most challenging tracking issues in AR are calibration and registration. In FM, an accurate estimation of the location and the orientation of the camera used for the MR is an essential step to estimate the camera pose and to correctly register the augmentation on the actual scene.


In general, tracking methods can be classified into two categories: sensor-based and computer vision (CV) based. In the sensor-based approach, several location tracking technologies have been investigated, including the Global Positioning System (GPS), Wireless Local Area Network (WLAN), Inertial Measurement Units (IMU), Radio Frequency Identification (RFID), and Ultra-Wideband (UWB) Real-time Location Systems (RTLS) [24]. According to Deak et al. [25], UWB signals can pass through walls and offer high accuracy; however, they are relatively expensive to integrate due to the high infrastructure cost. For indoor applications, GPS has low coverage (4.5%) [26]. LaMarca et al. [26] also indicated that WLAN has high coverage (94.5%) but low accuracy in indoor environments (15–20 m). In the CV-based approach, the tracking methods can be either marker-based, using a registered printed marker, or markerless (also known as feature-based), using registered natural features [27]. The marker-based method requires creating a large database of markers. The creation and calibration of these markers is time-consuming and requires accurate placement in the real world to achieve accurate and reliable tracking. In FM, it is not practical to attach markers everywhere to maintain tracking. Supporting system mobility is important for FM applications, which eliminates the applicability of tracking methods that rely on RTLSs or marker-based CV approaches. In the feature-based tracking method, the detection is based on the features of the real environment and does not require markers. The feature-based method can provide acceptable tracking accuracy for MR-based applications supporting indoor FM operations. However, using feature-based tracking alone may not provide accurate and stable tracking for indoor MR applications, due to the lack of unique features and/or changes in light conditions.

To support the mobility of MR-based FMSs, Neges et al. [28] discussed a method that combines an Inertial Measurement Unit (IMU) and step-counter marker-based tracking for AR-based indoor navigation support. However, this method is based on a manual procedure to determine the start position for route calculation. Table 1 summarizes the related research efforts and compares them based on four categories: (1) data integration using BIM, georeferenced contents, sensory data, and scalability of the data; (2) tracking method (i.e., support for system mobility and users' telepresence); (3) visualization methods; and (4) collaboration type. Even though there are several research projects investigating collaboration, the research about remote interaction in FM is limited considering the different information needs of different stakeholders.

Table 1. Comparison of recent research efforts related to MR applications in AEC/FM. (Each effort is rated on integrated data (BIM support, georeferencing, sensory data), tracking (telepresence, mobility), visualization (augmented reality, virtual reality), and collaboration (guidance, data sharing, remote collaboration, real-time interaction). The compared efforts are: comparative effectiveness of mixed reality-based virtual environments in collaborative design [18]; collaborative visualization of engineering processes using tabletop augmented reality [19]; mutual awareness in collaborative design with an Augmented Reality integrated telepresence system [4]; a mobile Augmented Reality method for accessing building information through a situation awareness approach [14]; integrating mobile building information modeling and augmented reality systems [29]; Mixed-Reality for object-focused remote collaboration [30]; zero-latency real-time synchronization of BIM data in virtual reality for collaborative decision-making [31]; a precision study on augmented reality-based visual guidance for facility management tasks [32]; BIM-based multiuser collaborative virtual environments for end user involvement [33]; and a study on software architecture for effective BIM/GIS-based facility management data integration [34].)

3. Proposed method

As shown in Fig. 1, the framework supports two types of activities: (1) activities of the facility manager, where the routine FM work is performed, including the review of work orders, the scheduling and assigning of inspection and maintenance tasks, and the support of field workers using the IAV module; and (2) activities of the field worker (e.g., inspector), who uses a tablet to perform the assigned task in the AR-based module and collaborates with the facility manager in real time. The facility manager, using a VR headset, is immersed in the VR facility model and is able to interact with the BIM-based facility components and with the field worker in real time. The field worker uses a tablet-based AR module that allows him/her to interact with the real environment by retrieving georeferenced facility data, and to collaborate with the remote office by adding georeferenced annotations. The databases are used to add, update, and retrieve task-related data.

Table 2 defines which visualization technology is more suitable for which task. In addition, it classifies the relevant visual information which will be shown depending on the task. The hardware used in this research was selected carefully to provide improved visualization and perception that satisfy the needs of the facility manager at the office and the field workers with less visual and mental disturbance.

Table 2. MR technologies and their main roles for inspection and maintenance tasks.
IAV (user: facility manager at the office): IAV helps the facility manager visualize geo-located task information combined with the virtual model and the video captured by the field worker.
AR (user: field worker): AR allows the field worker to work freely without the need to search for relevant building elements in a complex building model, avoiding the complexity of navigation in the VE to find these elements.


Fig. 1. Interaction between office manager and field worker in CBIM3R-FMS.

During the remote collaboration session, the facility manager needs to have full access to the BIM information. The immersive mode of the IAV module allows the facility manager to access this detailed BIM information and also to interact with the VE using the virtual keyboard to write or modify any BIM data, tasks, or defect-related information. The VR controllers allow him/her to interact with the VE and to navigate, select elements, and add marks or sketches. On the other side, the field worker needs to focus more on the real environment and to have more specific BIM information based on his/her element interaction or FOV. Using a tablet with an easy-to-use interface allows the worker to easily retrieve BIM data by picking on the real element and to write any additional information. In addition, he/she can mark or sketch on the AR view using the tablet's touch screen.

CBIM3R-FMS requires using georeferenced BIM-based facility data, which is important for FM task collaboration because the field worker should know his/her relative location with respect to the target element to apply the augmentation. At the same time, the facility manager should know both the target element and the field worker locations to know how to assign and support tasks.

Using feature-based tracking alone is not efficient for indoor AR applications. For example, because of the repetitive nature and patterned style of multi-story office building design, unique features can be hard to find. This similarity in building element design may reduce the efficiency of the feature-based tracking system, since the Tracking database has similar features registered. In addition, feature-based tracking requires the system to search in a large database that has a large number of registered tracking data. Also, considering changes in lighting conditions and some reflective materials in indoor environments, some registered features may not be detected. Therefore, this research uses a hybrid tracking method [35] that combines: (1) marker-based tracking for retrieving the relevant feature-based tracking data from the Tracking database; (2) feature-based tracking as the main tracking method; and (3) gyroscope sensory data to capture the AR module pose orientation when registered features are not detected. The gyroscope of the tablet can be used to compensate for the loss of feature-based tracking, assuming that the field worker is at a fixed position and he/she is only changing the orientation of the tablet (a minimal sketch of this fallback is given below).

To build large-scale VEs for CBIM3R-FMS that satisfy the requirements of FM tasks, data from different sources need to be integrated. The integration process is based on three phases: (a) a data collection phase for multisource data including CityGML, BIM models, task data, and feature-based tracking data; (b) a data conversion phase to prepare the georeferenced BIM model to be used in the different modules; and (c) a data interaction phase, which allows the AR and IAV module users to interact with the processed georeferenced facility data through the four databases mentioned in Section 3.1. The main purpose of using CityGML is to provide the high-level visual reference for navigation purposes and the scalability of the system to manage large-scale facilities, such as university campuses, airports and large industrial compounds.

This research proposes two methods to enhance the visualization and the MR experience for both the field worker and the office facility manager: the Virtual Hatch and the Multi-Layer MR views. The Virtual Hatch allows AR module users to visualize hidden BIM content (e.g., drywall studs) as a virtual cut in the wall to enhance their depth perception. The Multi-Layer MR views, for both the AR and IAV modules, allow users to view content in an organized way that allows them to understand each other's context.
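As an illustration of the gyroscope fallback in the hybrid tracking method described above, the following minimal sketch (in Python for brevity; the actual prototype is a Unity3D application) keeps the position fixed and integrates only the orientation rates when the registered features are lost. All names and values are illustrative assumptions rather than the actual implementation.

# Minimal sketch (not the authors' code): hybrid pose tracking that uses
# feature-based pose estimates when available and falls back to gyroscope
# integration for the orientation when the features are lost.
from dataclasses import dataclass, field

@dataclass
class Pose:
    x: float = 0.0
    y: float = 0.0
    z: float = 0.0
    yaw: float = 0.0    # rotation around the vertical axis, in degrees
    pitch: float = 0.0
    roll: float = 0.0

@dataclass
class HybridTracker:
    pose: Pose = field(default_factory=Pose)

    def update(self, feature_pose, gyro_rates, dt):
        """feature_pose: Pose from feature matching, or None if tracking is lost.
        gyro_rates: (yaw, pitch, roll) angular rates in degrees per second."""
        if feature_pose is not None:
            # Feature-based tracking is the primary source for the full pose.
            self.pose = feature_pose
        else:
            # Features lost: assume the worker stands still and only rotates
            # the tablet, so integrate the gyroscope rates for the orientation.
            self.pose.yaw += gyro_rates[0] * dt
            self.pose.pitch += gyro_rates[1] * dt
            self.pose.roll += gyro_rates[2] * dt
        return self.pose

# One frame with features, then one frame without (yaw drifts to about 90.4).
tracker = HybridTracker()
tracker.update(Pose(4.2, 1.5, 7.0, yaw=90.0), (0.0, 0.0, 0.0), 1 / 30)
print(tracker.update(None, (12.0, 0.0, 0.0), 1 / 30))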


Fig. 2. Pre-processed data for CBIM3R-FMS.

3.1. Data pre-processing for CBIM3R-FMS

As shown in Fig. 2, there are four databases which are connected with the two modules of the system: the Tracking database, the BIM database, the Defect database, and the Task database. During the pre-processing phase, feature-based tracking requires collecting a large number of images within the building in order to detect the features that will be used for tracking. The field worker has the role of building and feeding the databases with the relevant information. First, he/she is responsible for collecting the images, which will be used to create the feature-based tracking data. Then, he/she is responsible for collecting defect data and storing them in the Defect database. On the other hand, populating the Tracking database should be done at the office by processing the collected images and 3D models of the building to create the feature-based tracking data. The BIM models should be created and saved in the BIM database. In addition, the facility manager updates the Task database based on the inspection reports, records and schedule. The pre-processing data workflow is as follows:

(1) The images should be collected by the field worker.
(2) BIM-based modeling is done at the office, resulting in newly created or updated models. The BIM data are saved in an IfcXML format in the BIM database. The IfcXML format is suitable for non-geometric data retrieval and exchange using an Extensible Markup Language (XML) parser. The database hierarchical structure inherits the structure of the IfcXML main nodes. For example, if the facility consists of a number of buildings and each building has many floors, the database will have the same hierarchy to facilitate data retrieval during real-time operations.
(3) The collected images should be processed, including the registration of the CV features by mapping them with the BIM model to generate tracking data. The tracking data are saved in the Tracking database.
(4) The VE is created based on the BIM models, which will be used in the IAV and AR modules. Furthermore, CityGML allows creating models with different Levels of Detail (LOD) for large 3D environments. CityGML data are used as the base model for CBIM3R-FMS. The 3D multisource data integration is processed and the result is transferred to a data format that keeps the necessary information, including element IDs and the hierarchy structure of the original model (e.g., Autodesk FBX technology). It is important to keep both the structured 3D data and the unique IDs to integrate with the BIM-based attribute database.
(5) The task scheduling is done by the facility manager, who should update the Task database with task-related information including the task ID, scheduled date, and assigned location, worker, and tools.
(6) Defects detected by the field inspectors are stored in the Defect database, including geotagged defect images and descriptive textual information about the defects. During the operations, since the field worker's pose is tracked using the location and orientation of his/her tablet, he/she will be able to take geotagged pictures of potential defects and automatically send them to the office. The images will be positioned at their relative positions in the VE. This will allow the facility manager to initiate the work order and assign it to the suitable inspector.

Moreover, the conditions of components are updated to reflect the work done by inspectors and technicians according to the preventive and corrective work orders. The updated data from the system can be retrieved and visualized using defined queries for visual analytics. In addition, new defect-related attributes, required for FM, are added to the Defect database.

The data in the Tracking database are stored in a relational database (e.g. MySQL) and linked with the relevant BIM component in the BIM database (e.g. a wall in a specific room) using the BIM component's unique ID. The link between the BIM component and defect records in the Defect database uses the same approach. Tasks in the Task database are also linked to spaces or specific elements in the BIM database using their IDs. The position information is used to generate the 3D objects for task marking.
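The linking described above can be illustrated with the following minimal sketch, in which SQLite stands in for the MySQL database used in the prototype; all table and column names are hypothetical.

# Minimal sketch (hypothetical schema): defect and task records reference
# BIM components through their unique element IDs.
import sqlite3

db = sqlite3.connect(":memory:")
db.executescript("""
CREATE TABLE bim_elements (element_id TEXT PRIMARY KEY, type TEXT, storey TEXT);
CREATE TABLE defects (
    defect_id INTEGER PRIMARY KEY,
    element_id TEXT REFERENCES bim_elements(element_id),
    x REAL, y REAL, z REAL,           -- georeferenced mark position
    severity TEXT, photo_path TEXT);
CREATE TABLE tasks (
    task_id INTEGER PRIMARY KEY,
    element_id TEXT REFERENCES bim_elements(element_id),
    scheduled_date TEXT, worker TEXT);
""")
db.execute("INSERT INTO bim_elements VALUES ('2O2Fr$t4X7Zf8NOew3FLOH', 'IfcWall', 'Level 9')")
db.execute("INSERT INTO defects VALUES (1, '2O2Fr$t4X7Zf8NOew3FLOH', 4.2, 1.5, 7.0, 'severe', 'img/leak.jpg')")

# Retrieving all the defects attached to a wall selected through the AR view:
for row in db.execute("""SELECT d.defect_id, d.severity, e.type, e.storey
                         FROM defects d JOIN bim_elements e USING (element_id)
                         WHERE e.element_id = ?""", ("2O2Fr$t4X7Zf8NOew3FLOH",)):
    print(row)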


Fig. 3. Interactive visual collaboration.

3.2. MR-based interactive visual collaboration (IVC)

The interactive visual collaboration (IVC) has two modes: (1) a non-constrained mode, where the Fields of View (FOVs) of the field worker and the facility manager are not matched and synchronized, and (2) a constrained mode, where their FOVs are synchronized. The IVC between the facility manager at the office and the field worker is important to avoid misinterpretation of the assigned inspection or maintenance tasks, which can lead to unnecessary rework and consequently reduced productivity [36]. In the MR-based collaboration, the manager at the office can assign location-based tasks and instantly share this information with the field worker using the IAV module. On the other side, the field worker, using the mobile AR module, can share the real view he/she sees with the manager. Visual guidance is given by the manager to facilitate performing the tasks. As shown in Fig. 3, the proposed system has two sides that support interaction and collaboration.

3.2.1. IAV-based module
On the office IAV side, the manager points at the specific place or component where the inspection or the maintenance should take place. The location of this picking is captured using a ray tracing method. This location information is used to place the generated task tag in the VE. The location information, along with a brief task description and the target building element information, is streamed to the on-site mobile AR-based module.

3.2.2. Mobile AR-based module
Using the mobile AR-based module, the field worker views the marked tasks and navigates through the real environment with the help of a rendered direction arrow for visual guidance. The arrow orientation is calculated based on the tagged task position and the tablet camera position. When the field worker reaches the assigned place and finds the task tag in the AR view, he/she marks the defect on the element through the AR interface. After marking the defect, the image of the defective element and the location information of the defect mark (e.g. a red sphere representing leakage on the ceiling) are streamed and placed in the VE.

In this two-way interaction, the manager is immersed in the VE to have more access to the facility information and to support the FM tasks by IVC and data sharing with the field worker. The collaboration aspect of the proposed framework allows both the manager at the office and the field worker not only to share the geometrical data of what they are interested in, but also to share their point of view, as the pose of the virtual camera in the office-side IAV can be changed based on the pose of the field worker's camera in the constrained mode. In the constrained mode, the facility manager's virtual camera pose in the VE is adjusted based on the AR module camera pose information. On the other hand, the free navigation mode allows the facility manager to navigate freely in the VE.

During an IVC session, two sets of pose data are streamed from each module: (1) camera pose data (i.e., location and orientation), and (2) picking point data (the location of the defect). First, to accurately position the manager in the IAV environment, the global coordinates and the orientation of the field worker (x, y, z, α, β, γ) should be streamed and converted to the IAV coordinate system (x′, y′, z′, α′, β′, γ′). This pose data sharing allows the manager to compare the real view of the target building element, which the inspector is looking at through the tablet, with the relevant element in the IAV environment. On the other hand, the three degrees of freedom (DoFs) of the head movement of the manager in the IAV environment can be used to update the 2D guiding arrows in the AR module to show the manager's point of interest. In addition, audio-based and text-based chatting can support the communication experience by sharing verbal descriptions or comments. Second, when the facility manager starts the IVC session with the field worker, his/her gesture-based picking location values (xo, yo, zo), which are used to instantiate task marks, pointing arrows, or sketches, are streamed to the AR module in real time. Then, these values are used to replicate the visual modalities (i.e., task marks, pointing arrows, or sketches). On the other hand, when the field worker adds an annotation (e.g., a sketch), his/her gesture-based picking location values (xw, yw, zw) are streamed to the IAV module. These values are used to replicate his/her sketch in the IAV module.
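A minimal sketch of this pose conversion is given below, assuming a rigid offset (a translation plus a yaw rotation about the vertical axis) between the global frame of the field worker and the IAV coordinate system; the actual transformation parameters depend on how the VE is georeferenced.

# Minimal sketch (assumed frame offset): mapping the streamed worker pose
# (x, y, z, alpha, beta, gamma) into the IAV coordinate system, with beta
# taken as the rotation around the vertical (Y) axis.
import math

def world_to_iav(pose, origin=(0.0, 0.0, 0.0), yaw_offset_deg=0.0):
    x, y, z, alpha, beta, gamma = pose
    ox, oy, oz = origin
    t = math.radians(yaw_offset_deg)
    # Rotate the horizontal (x, z) position about the vertical axis, then translate.
    xr = (x - ox) * math.cos(t) - (z - oz) * math.sin(t)
    zr = (x - ox) * math.sin(t) + (z - oz) * math.cos(t)
    return (xr, y - oy, zr, alpha, beta + yaw_offset_deg, gamma)

# The converted pose drives the IAV virtual camera in the constrained mode:
print(world_to_iav((12.0, 1.6, 5.0, 0.0, 90.0, 0.0),
                   origin=(10.0, 0.0, 4.0), yaw_offset_deg=180.0))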


Fig. 4. System architecture of CBIM3R-FMS.

3.3. CBIM3R-FMS system architecture

Based on the aforementioned general framework, Fig. 4 illustrates the system architecture, which consists of three parts: (a) the FM office IAV module, (b) the field worker AR module, and (c) the databases. The flow is described using four types of arrows: data updating for updating the databases, data retrieving for reading elements' defect attributes from the Defect database, data streaming for on-demand real-time video streaming and pose data streaming between the AR and IAV modules, and data flow for the feature-based tracking data. The dataflow between these components is explained in the following:

(1) On the IAV office side, the facility manager reviews work orders and schedules inspection or maintenance tasks. The manager starts assigning georeferenced inspection or maintenance tasks by pinpointing in the VE and selecting components. The task assignment could be based on the selected space (e.g., room) or element (e.g., wall). The selected component can be highlighted with colors to define the priority of the work orders. For example, red is used to indicate high priority, orange moderate priority, and yellow low priority. The task data, including the location information, are saved in the Task database.
(2) The video from the tablet camera is used for computer vision (CV) feature-based tracking. The video frames are used to detect the relevant features and retrieve them from the Tracking database. The tracking is activated based on the detected features.
(3) The embedded orientation sensors in the tablet feed both the mobile AR side and the office IAV side of CBIM3R-FMS with pose tracking data. On the mobile AR side, these tracking data can support the CV-based tracking when no registered features are detected. The tracking data of the tablet are streamed to update the pose of the virtual camera of the IAV module. The matching of the two modules' camera poses helps each user identify the other user's point of interest, which is scoped by the camera FOV.
(4) The current position of the field worker and the georeferenced tasks' location information are used to update the direction of the navigation guiding arrow as an augmentation to help the field worker navigate his/her way to the task location (a minimal sketch of this computation is given at the end of this section). Based on the pose of the tablet camera of the field worker, basic information about the task is retrieved from the Task database and augmented over the view of the real building element.
(5) The field worker can select an element by picking the real element through the AR view and retrieve additional BIM information from the BIM database. The picking is initiated based on a raycasting method [37], and the unique ID from the geometrically optimized model, as discussed in [35], is used to retrieve the relevant data from the database. Once the element is identified in AR, the field worker can add or modify inspection or maintenance information. This information can be geotagged images or textual information. The resulting data will be saved in the Defect database.
(6) The defect data stored in the Defect database, including building elements and element IDs, are used to update the AR module view.
(7) Based on the required task, the facility manager updates the BIM model. For example, if the required task is making an opening in a drywall, the BIM model is modified by the facility manager, using a BIM tool, and then the modified model is used to update both the AR and the IAV views. The facility manager retrieves the updated task data and starts collaborating with the field worker by augmenting the mobile AR view with his/her comments and interactive pointing and sketching.
(8) Extra sensory data, such as thermography images captured by an infrared (IR) camera, can add an additional layer of information to the augmented scene about the condition of the facility components, and help the inspector detect the source of the problem (e.g., water leakage inside the drywall).
(9) The field worker can share his/her AR module view with the manager to help him/her better diagnose the element condition, since the manager has more experience. This view will be shown in IAV mode.
(10) The manager collaborates with the field worker using virtual pointing arrows to guide him/her in real time and discusses, if needed, any task-related matters. This allows both users to be aware of each other's point or area of interest. The system also supports audio and text-based chatting to facilitate real-time communication.

The MR-based IVC helps both the field worker and the facility manager to visually communicate with each other and to share the necessary information related to the assigned task. The mobile AR-based module helps the inspector visualize the captured sensory data (e.g., the IR picture), mark the detected defect, and review the element's inspection history. On the other hand, the office IAV allows the manager to guide the field worker where to go, to share what he/she sees in his/her rich model, and to perform tasks that require more experience. For instance, the facility manager can help the field inspector in assigning the defect severity and discuss potential elements that require further inspection.
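The guiding arrow computation mentioned in step (4) of Section 3.3 can be illustrated with the following minimal sketch, which derives the arrow heading from the bearing between the tablet camera and the tagged task on the ground plane; the conventions (yaw measured from the +z axis, positive to the right) are illustrative assumptions.

# Minimal sketch (assumed conventions): the navigation arrow's rotation is
# the bearing from the camera to the task, relative to the camera's yaw.
import math

def arrow_heading_deg(cam_xz, cam_yaw_deg, task_xz):
    """Return the arrow rotation in degrees: 0 = straight ahead,
    positive = turn right, using a ground-plane (x, z) projection."""
    dx = task_xz[0] - cam_xz[0]
    dz = task_xz[1] - cam_xz[1]
    bearing = math.degrees(math.atan2(dx, dz))      # bearing from the +z axis
    return (bearing - cam_yaw_deg + 180.0) % 360.0 - 180.0

# Worker at (0, 0) facing +z; task tag at (3, 3): the arrow points 45 degrees right.
print(arrow_heading_deg((0.0, 0.0), 0.0, (3.0, 3.0)))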

Fig. 5. The uses of picking technique in AR and IAV modules.

3.4. Interaction modalities for AR and IAV modules

Picking [38] is the main technique used for interacting with the VE in the AR and IAV modules. The pick object can be a ray, which is extended from the finger picking position through the VE. When a pick is requested, the pickable objects that intersect with the pick object are computed. The pick returns a list of objects, from which the nearest object can be computed. Picking can be used by the field worker or the facility manager for different purposes. The picking done by the facility manager is used to select an element, assign tasks, or place a pointing arrow or sketch in the IAV module. The picking done by the field worker allows him/her to select an element, mark defects, or place a pointing arrow or sketch in the AR module. In the case of marking defects, the intersection between the ray and the 3D object is used as the center point of the defect mark (e.g., a sphere). Fig. 5 shows the different uses of the picking technique with different modalities for the AR and the IAV modules. Each module has two layers to deal with these modalities (i.e., created objects and received objects). Each user manipulates different MR modalities: (1) the facility manager tags a task using a blue cube to represent the task place in the IAV module; (2) the same cube is instantiated and positioned in the AR module, and 2D guiding arrows are used to refer to any off-screen objects to help the field worker find them in the AR module; (3) color-coded georeferenced tags (e.g., spheres) are added by the field worker to represent the defect severity; for example, a severe defect is tagged with a red animated tag, while a mild defect is tagged with a yellow static mark. Since the mark is georeferenced, the facility manager has the ability to search based on the location and components in the virtual environment to investigate possible causes; (4) the same objects will be visualized in both the AR and the IAV modules, and 2D guiding arrows are used to refer to any off-screen objects' positions to help the facility manager find them in the IAV module; (5) pointing orange 3D arrows and green-line sketching are used by the field worker to refer to a specific area of interest in the real environment; (6) the same pointing arrows and sketches are shown in real time in the IAV module. Steps (7) and (8) are the same as steps (5) and (6), except with a yellow color for the pointing arrow and a red color for the sketching. It is important to use different colors to identify the user who made them. The right side of the figure shows the output of these IVC components in the two modules.

4. Implementation

To facilitate the development process, the CBIM3R-FMS prototype system was developed using the Unity3D game engine [39]. The engine is used to build the VE, to allow the users to interact with the facility components in both modules, and then to collaborate by exchanging visual information about the assigned tasks. The VE was modeled by merging different sources of data: (1) an LOD2 CityGML model with textures for the city blocks. The city blocks of the downtown area of Montreal were created based on CityGML data provided by the city [40]. The CityGML data are used as the base model for CBIM3R-FMS, and the 3D multisource data integration is processed and transferred to Unity3D as an FBX model; (2) a detailed BIM model of one building (LOD400); and (3) virtual objects (e.g., office equipment). Furthermore, the coordinate systems of all the VE contents and the real-world objects are unified to facilitate integration and navigation in both the AR and IAV environments. This step required locating the 3D contents on a base map that is already georeferenced.

A case study is implemented at Concordia University's EV Building to demonstrate the validity of the proposed method. In this case study, the FM data required from different sources are integrated using standard formats [35] and defects are attached to specific components in the VE using Unity3D. The model was developed in Autodesk Revit [41] for one floor of the EV Building (Fig. 6(a)), adding one detailed BIM model (LOD400) for a research lab on the same floor (Fig. 6(b)). Also, the defects are considered as components of the 3D model. Other data are added regarding the maintenance procedures, such as the condition and last inspection date of the element that has the defect.

The unique ID for a building element in the FBX model is the same as the one in the IfcXML-based data stored in the database. When an element is selected in the Unity3D environment, the building element is highlighted and a query is activated to retrieve the element information.
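As an illustration of how such a pick resolves to the unique building element ID that keys the attribute query, the following minimal sketch intersects a pick ray with element bounding boxes (axis-aligned here for brevity; the prototype relies on the engine's raycasting against mesh colliders [37], and the element IDs are illustrative).

# Minimal sketch (simplified geometry): cast a ray from the picked point,
# find the nearest intersected element, and return its unique ID.
def ray_aabb(origin, direction, box_min, box_max):
    """Slab method: return the entry distance t if the ray hits the box, else None."""
    t_near, t_far = 0.0, float("inf")
    for o, d, lo, hi in zip(origin, direction, box_min, box_max):
        if abs(d) < 1e-9:
            if not (lo <= o <= hi):
                return None      # ray parallel to this slab and outside it
        else:
            t1, t2 = (lo - o) / d, (hi - o) / d
            t_near = max(t_near, min(t1, t2))
            t_far = min(t_far, max(t1, t2))
            if t_near > t_far:
                return None      # the slabs do not overlap: no intersection
    return t_near

def pick(origin, direction, elements):
    """elements: {element_id: (box_min, box_max)}. Return the nearest hit ID."""
    hits = [(t, eid) for eid, (lo, hi) in elements.items()
            if (t := ray_aabb(origin, direction, lo, hi)) is not None]
    return min(hits)[1] if hits else None

# The returned unique ID keys the query against the BIM attribute database.
elements = {"wall-07": ((0, 0, 4), (6, 3, 4.2)),
            "door-02": ((2, 0, 8), (3, 2.1, 8.1))}
print(pick((1.0, 1.5, 0.0), (0.0, 0.0, 1.0), elements))   # -> wall-07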

Fig. 6. Multilevel BIM models for the EV Building: (a) BIM model (LOD300) of the EV Building; (b) detailed BIM model (LOD400) of a lab.


4.1. CBIM3R-FMS software and hardware requirements

(a) Software requirements

(1) Unity3D is a 3D visualization and game development system. The engine has many features, including physics and rendering, that allow creating interactive 3D VEs. The main reasons for choosing Unity3D for implementing the proposed framework are: (a) the game engine can be used as an authoring tool, which allows focusing on the concepts of the prototyping; (b) it has integration capabilities for developing AR and VR tools, linking with external databases, and interoperability support (e.g., Autodesk's native FBX format); and (c) it supports multiuser real-time collaboration and data sharing.
(2) Vuforia [42] is an AR Software Development Kit (SDK) which supports feature-based tracking using mobile devices. The main use of this tool in this research is to create the tracking data and use them within the Unity3D development environment through the Vuforia Unity3D plug-in. The plug-in is used to access the registered tracking data inside Unity3D and compare them with the detected features captured by the tablet camera.
(3) For BIM attribute database interaction, the IfcXML parser package uses the Xerces Java Parser [43]. This parser provides XML parsing and generation, and implements the W3C XML and DOM (Document Object Model) standards, as well as the SAX (Simple API for XML) standard. The parser is used to read the IFC-based data of the building attributes and save them in the database.
(4) The hotspot application (i.e., Connectify [44]) is installed on the server where the IAV module is installed. This allows WLAN-based peer-to-peer connectivity between the two modules, connects them with the databases, and allows them to stream pose and video data and to call functions on the remote server.

(b) Hardware requirements

The visualization of the MR contents, as well as the local and remote interactivity of the CBIM3R-FMS modules, depends on specific hardware requirements. Fig. 7 shows the hardware used in the CBIM3R-FMS modules, which is explained in the following:

(1) The AR module is implemented as a feature-based mobile AR application installed on an Android tablet (Samsung Galaxy Tab S2 with a 1.8 GHz + 1.4 GHz octa-core processor and 3 GB RAM). The 8.0 MP CMOS back camera of the tablet is used for tracking and for feeding the IAV module with video frames. The tablet has connectivity options including Wi-Fi, GPS and Bluetooth. The sensors on the tablet include a proximity sensor, an accelerometer, an ambient light sensor and a gyroscope; however, in this research only the gyroscope is used, for the hybrid tracking method [35]. The FLIR ONE camera [45] is a thermal camera that can be attached to the tablet to provide thermal images. The camera has a resolution of 160 × 120 pixels, and the sensed temperature range is from −4 °F to 248 °F (−20 °C to 120 °C). The camera's sensitivity allows detecting temperature differences as small as 0.18 °F (0.1 °C). The light weight of the camera (78 g) makes it ideal for mobile applications.
(2) The IAV module is implemented as a standalone application and runs on a desktop PC (Dell Precision T1700 with an Intel(R) Core(TM) i7-4790 CPU with a processing speed of 3.60 GHz and 8 GB RAM). In addition, the IAV module uses two VR-based gadgets: (1) an Oculus Rift for the IAV visualization; and (2) Touch controllers for gesture-based tracking. Oculus Rift [46] is a head-mounted display (HMD) with two touch controllers, which allows the facility manager to interact with the VE in the IAV module by tracking the head movement. The main reasons for selecting Oculus Rift for this application are its wide horizontal FOV (100°), its display resolution (960 × 1080 pixels per eye), its low-latency positional tracking, and the support of its SDK2 version for integrating with the Unity3D game engine. The Touch controllers allow simulating the movement of the hands, so the facility manager can interact with virtual objects in a similar way to what he/she does in the real world.

4.2. Immersive augmented virtuality module

The IAV module of CBIM3R-FMS allows the manager to visualize activities from different views. Fig. 8(a) shows the large-scale VE (i.e., urban and building level) through the Graphical User Interface (GUI) in the IAV module. Fig. 8(b) shows the collaborative IAV view of the streamed camera view from the tablet of the field worker, superimposed on the VR model, which improves the situational awareness of the facility manager. The Touch controllers are used to interact with the VE. The GUI manipulation allows the facility manager to adjust the location of the GUIs considering his/her 360° spherical view. For example, the facility manager can move the GUI, using gesture-based direct manipulation behavior, to relocate the GUI away from his/her current FOV to focus on the task.

4.3. Augmented reality module

The AR module is the field worker side application, which is implemented in Unity3D, deployed as an Android application, and installed on a Samsung Galaxy Tab S2 tablet, which is equipped with WLAN connectivity. This allows the AR application to connect with the databases and to stream location and video data to the IAV module using the Remote Procedure Call (RPC) protocol. To track field workers in the VE, a special script was created to adjust the position (X, Y, Z) and the orientation (β) around the Y axis to define the location and heading of the field workers on the map (a minimal sketch of this mapping is given below). The AR module is able to track the field worker's camera pose using the hybrid tracking method [35]. The Vuforia Unity Extension [47] is used to access the registered tracking data inside Unity3D and compare them with the detected features in the images of the video captured by the back camera of the tablet. The video is also streamed to the IAV module.

Fig. 7. Hardware used in the implementation: an Android tablet with an attached FLIR ONE thermal camera (AR module), and a computer with an Oculus Rift headset and Touch controllers (IAV module).
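A minimal sketch of the map-tracking script mentioned in Section 4.3 is given below; the map scale and origin are hypothetical values for illustration.

# Minimal sketch (hypothetical scale/origin): reduce the tracked tablet pose
# to a map position and a heading vector from the rotation (beta) around the
# vertical (Y) axis, for drawing the field worker's icon on the floor map.
import math

def pose_to_map(pose, map_scale=0.1, map_origin=(50.0, 50.0)):
    """pose: (x, y, z, beta_deg). Return the (u, v) map point and heading vector."""
    x, _, z, beta = pose
    u = map_origin[0] + x * map_scale
    v = map_origin[1] + z * map_scale
    heading = (math.sin(math.radians(beta)), math.cos(math.radians(beta)))
    return (u, v), heading

print(pose_to_map((120.0, 1.6, 80.0, 90.0)))   # ((62.0, 58.0), heading ~ (1.0, 0.0))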


Fig. 8. Examples of office IAV views: (a) large-scale view showing the CityGML and BIM-based VE and the tracked worker location; (b) IAV view with the onsite tablet's streamed camera view superimposed on the VR model.

The augmentation of the virtual contents can be building elements, element IDs, or textual information about the inspection history stored in the Defect database. Considering that the field worker is using one hand to hold the tablet and the other to interact with the AR contents, the GUI was designed to minimize the data entry and maximize the AR view. When the field worker interacts with the GUI, he/she uses the picking behavior to select elements, to mark defects, or to add or retrieve defect information. When the field worker picks a real object through the AR view, the relative georeferenced component's bounding box and the attached data are shown as an augmentation. Then, the field worker can pick a specific point on the component to mark a defect. Once the defect mark (e.g., a sphere) is generated, a new window is opened for adding the defect information. The defect mark is georeferenced and its location information is stored automatically in the Defect database (a sketch of such a record is given below).

Fig. 9 shows the AR module view of the system. The view consists of a combination of the real camera video, the augmenting BIM-based elements, sensory data (i.e., the thermal image), and textual information retrieved from the BIM and Defect databases. Minimizing the GUI components of the AR module is important due to the limited screen size. Therefore, the GUI is designed using check toggles to enable or disable the GUI components as needed.
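A sketch of such a georeferenced defect record is given below; the field names are hypothetical, and the actual records follow the Defect database structure described in Section 3.1.

# Minimal sketch (hypothetical fields): the picked point becomes the center
# of the sphere mark and part of the record stored in the Defect database.
import json
from datetime import date

def make_defect_record(element_id, hit_point, severity, note, photo=None):
    return {
        "element_id": element_id,      # unique BIM component ID
        "mark": {"shape": "sphere",
                 "center": hit_point,  # georeferenced (x, y, z)
                 "color": "red" if severity == "severe" else "yellow"},
        "severity": severity,
        "note": note,
        "photo": photo,                # geotagged image path, if any
        "reported": date.today().isoformat(),
    }

record = make_defect_record("ceiling-21", (4.2, 2.9, 7.0), "severe",
                            "possible water leakage", "img/ir_0017.jpg")
print(json.dumps(record, indent=2))    # stored locally and streamed to the IAV side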

Fig. 9. Graphical user interface design of the AR module: the video of the real environment is augmented with BIM elements, the selected element information, and the thermal image, with controls for chatting, navigation guidance, the virtual hatch, object manipulation, inspection information, sketching/marking, scene lighting, the constrained mode, resetting the chat channel, and visual interaction.


inspection data, current status and any additional note that the in- remote IVC between the AR and IAV modules. The participants are
spector may add; (6) The button with a pencil icon to activate marking asked to perform 17 tasks. The first eight tasks are non-collaborative
or sketching; (7) The button LIGHT to enable the georeferenced scene mode and tasks from nine to 17 use the IVC mode. In the first mode, the
lighting [35]; and (8) The button with a broom icon to reset marking or participant playing the role of the field worker performs simple tasks on
sketching and to start a new session of IVC. The upper right button is to his/her own without assistance from the facility manager. In the second
allow constrained mode and update the remote IAV module view. The mode, the field worker is supported remotely by the facility manager in
thumb-up button is only used for the usability testing to allow test the IVC mode. The validation is done based on testing three types of
participants to confirm finishing their tasks for analysis purposes. tasks: tag finding tasks, interaction tasks, and collaboration tasks. In the
tag finding tasks, the field worker is asked to navigate and find assigned
4.4. IAV-AR remote interaction tasks with and without visual guidance support from the system. The
participant playing the role of the facility manager is asked to perform
As explained in Section 3.4, for accurate communication between the same navigation task. In the interaction tasks, the field worker and
the facility manager and the field worker, it is important to accurately specify the element or the point of interest on their screen views to avoid confusion. After updating the VR camera pose in the IAV module based on the detected AR camera pose, both users can point at an element to select it or to apply a mark. These events are captured, and the locations of the generated marks (e.g., color-coded spheres) can be instantly shared with each other. The geo-referenced mark locations are streamed using the RPC protocol. The data types sent through the Wireless Local Area Network (WLAN) connection between the AR and IAV modules are: (1) camera pose data as float values, (2) streamed video frames, (3) view picking point coordinates, and (4) textual data. First, the camera pose data are sent from the AR module to the IAV module to update the IAV camera pose in the VE. Second, the video frames of the field worker's tablet camera view are used to texture a 2D plane positioned in front of the IAV view; an additional layer of thermal images captured by the infrared camera is streamed for texturing the same 2D plane when enabled. Third, the view picking point coordinates are calculated using a ray tracing technique, which computes the coordinates of the intersection point between a ray generated by picking on the screen and an element collider. These coordinates are used to instantiate a marking object in each module by its user, and the same coordinates are instantly streamed to instantiate the same object in the other module. When element selection is disabled, the streamed coordinates are used to position the 3D guiding arrow, with a normal vector based on the direction of the interaction element. Fourth, textual information is sent between both modules; this information can be stored in the database to update the selected element's user interface in both modules.
When applying picking behavior in the IAV module or the AR module, an object collider should be detected to identify a hit on a 3D element (e.g., a wall). If this condition is satisfied, each module can independently instantiate a 3D object (e.g., a sphere) as a mark on the building element. The object has a different color in each module to differentiate between the local mark and the telemark. To instantiate the mark on the remote module, the virtual mark coordinates are received using the RPC protocol. Fig. 10 shows the data structure used in remote marking between the AR and IAV modules. The 3D marking model is used to instantiate a 3DMark object that has a mesh collider and a mesh renderer. The instantiated 3DMark holds Vector3 data and a module-specific color to avoid confusion. The Vector3 data are shared between the two modules using the RPC protocol over a Wi-Fi channel. For interaction, the AR module uses touch-based input to instantiate the mark, whereas the IAV module uses the controllers' buttons. The intersection point between a ray extended from the camera pose and the target surface is used to position the mark.

Fig. 10. Data structure for AR-IAV modules telemarking.
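To make the pick-and-telemark flow concrete, the following minimal Unity C# sketch illustrates the sequence described above: a touch pick is converted to a ray, the hit point on a building element collider instantiates a local mark, and the same coordinates are streamed so that the remote module instantiates a matching telemark. This is an illustrative sketch only, not the CBIM3R-FMS source code: the class TeleMarker, the field markPrefab, and the use of Unity's legacy NetworkView/RPC API are assumptions made for demonstration.

```csharp
using UnityEngine;

// Illustrative sketch of the touch-pick / telemark flow (assumed names, legacy Unity networking).
public class TeleMarker : MonoBehaviour
{
    public GameObject markPrefab;             // e.g., a small sphere used as the 3D mark
    public Color localColor = Color.yellow;   // color for marks created locally
    public Color remoteColor = Color.red;     // color for telemarks received from the other module

    void Update()
    {
        // AR module: touch-based picking (the IAV module would use controller buttons instead).
        if (Input.touchCount == 1 && Input.GetTouch(0).phase == TouchPhase.Began)
        {
            // Cast a ray from the camera through the touched screen point.
            Ray ray = Camera.main.ScreenPointToRay(Input.GetTouch(0).position);

            // A collider on a building element (e.g., a wall) must be hit.
            if (Physics.Raycast(ray, out RaycastHit hit))
            {
                PlaceMark(hit.point, true);
                // Stream the same coordinates so the remote module instantiates
                // an identical, geo-referenced mark.
                GetComponent<NetworkView>().RPC("PlaceRemoteMark", RPCMode.Others, hit.point);
            }
        }
    }

    [RPC]
    void PlaceRemoteMark(Vector3 position)
    {
        PlaceMark(position, false);
    }

    void PlaceMark(Vector3 position, bool isLocal)
    {
        // Instantiate the mark at the picked point; different colors avoid confusing
        // the local mark with the telemark, as described in the text.
        GameObject mark = Instantiate(markPrefab, position, Quaternion.identity);
        mark.GetComponent<Renderer>().material.color = isLocal ? localColor : remoteColor;
    }
}
```

In a production system the streamed payload would also carry the textual data and camera pose listed above; the sketch shows only the picking point coordinates, which is the minimum needed for telemarking.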
5. Evaluation of CBIM3R-FMS

The objective of this section is to evaluate the efficiency of the proposed framework and of the IVC method. To achieve this, a test scenario was prepared and usability testing was conducted with randomly selected participants who have some knowledge of FM inspection and maintenance tasks. For effectiveness measurement, the time, errors, and accuracy of the interaction data related to marking and sketching are captured and used to measure the efficiency and accuracy levels of the system. In the individual tasks, the field worker and the facility manager are asked to mark and sketch on their AR or IAV view, respectively. The time, number of errors, and accuracy of performing the tasks are calculated. In the collaboration tasks, the field worker interactively collaborates with the facility manager in the IVC mode to perform the assigned tasks. During the collaboration session, they mark on the same specific element in their different environments and sketch around a specific area. The time, number of errors, and accuracy of marking and sketching are captured and analyzed. After finishing all required tasks, the participants are asked questions targeting the specific module they are using. After collecting the testing data, an Analysis of Variance (ANOVA) is used to compare the means of two groups of observations [48] and to check whether there is a significant difference between the two groups of field workers under the different test conditions (i.e., collaborative vs. non-collaborative).

5.1. Usability testing

As a validation of the CBIM3R-FMS approach, a simple field task has been planned. In this task, the field worker is asked to perform the same task using the CBIM3R-FMS with and without enhanced visualization support, including the guiding arrows and sensory data. Based on the output of the two tests, the collected data were classified according to the percentage of errors that occurred, the time to finish the task, and the level of easiness as indicated by the field workers. The usability testing was done by 30 participants to evaluate the AR and IAV modules. For each test, two participants are asked to perform the 17 tasks in each module. Each test has two sessions. In the first session, one participant uses the AR module as a field worker and the other uses the IAV module as a facility manager. They both use their modules under the two testing conditions (i.e., with and without enhanced visualization support). In the second session, they switch their roles (as field worker and facility manager) and repeat the test with the newly assigned module.

The main task used in this test focuses on a reinstallation scenario where the field worker is required to replace a thermostat attached to a wall. Fig. 11(a) shows the old thermostat that should be replaced by the new model shown in Fig. 11(b). The developed system helps the field worker to check the installation conditions. Before installing the actual new thermostat, based on his/her judgment, the field worker can use his/her finger to position the new virtual thermostat at a suitable place by avoiding clashes with nearby visible objects (i.e., the emergency switch and the light switch) and invisible objects inside the wall (i.e., water pipes that emit heat). The IVC mode allows the facility manager to provide support to the field worker by visually guiding him/her throughout the test tasks.

Fig. 11. AR module views of the old and new thermostats: (a) the old thermostat (real); (b) the new thermostat (virtual). The annotated elements include the pipe, the electrical wires, the emergency switch, and the light switch.

Fig. 12(a) shows the mark and sketch in the IAV module positioned on the BIM model. Fig. 12(b) shows the AR module view with the remote mark (i.e., yellow arrow) and a sketch (i.e., red line) made by the facility manager. The yellow sphere mark represents the point where the facility manager is pointing in the VE. After signing the consent form (Ethics Review Board approval number 30008493), each participant is given 10 min to familiarize himself/herself with the AR or IAV module. As a first step, the participant is introduced to the usability test scenario and its tasks.

Fig. 12. IAV-AR remote marking and sketching: (a) local marking and sketching (IAV module view); (b) remote marking and sketching (AR module view).
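The clash check implied by this task (placing the virtual thermostat while avoiding the switches and the in-wall pipes) can be approximated in Unity with a physics overlap query against the BIM element colliders. The sketch below is a rough illustration under assumed names (ThermostatPlacer, clashMask, thermostatHalfExtents); it is not the system's actual placement code.

```csharp
using UnityEngine;

// Illustrative clash check for placing a virtual thermostat against BIM element colliders.
public class ThermostatPlacer : MonoBehaviour
{
    // Approximate half-size of the thermostat body (assumed value for illustration).
    public Vector3 thermostatHalfExtents = new Vector3(0.06f, 0.04f, 0.02f);

    // Layers holding the colliders of switches, wiring, and in-wall pipes from the BIM model.
    public LayerMask clashMask;

    // Returns true if the candidate pose overlaps any nearby BIM element collider,
    // i.e., the placement would clash with a visible or hidden object.
    public bool HasClash(Vector3 candidatePosition, Quaternion candidateRotation)
    {
        Collider[] hits = Physics.OverlapBox(candidatePosition, thermostatHalfExtents,
                                             candidateRotation, clashMask);
        return hits.Length > 0;
    }
}
```

A placement routine would call HasClash for the finger-dragged candidate position and reject or highlight positions that intersect the emergency switch, the light switch, or the heat-emitting pipes modeled inside the wall.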

Table 3 lists all the tasks for each module user. In the second step, the participant using the AR module is asked to independently perform seven consecutive tasks (Task 2 to Task 8). The purpose of these tasks is to measure the user performance without remote assistance, considering the physical constraints, including space availability, nearby electrical wires, and heat-emitting sources that may affect the thermostat. Then, in the third step, the AR module user starts using the IVC in the constrained mode (Task 9). The remaining tasks (10 to 17) involve interaction between the two modules, where the tasks depend on each other. The time to finish each task is recorded by the system in the Task database. Based on the recorded times and the answers to the questionnaire, statistical analysis is performed to measure: (1) the effectiveness of CBIM3R-FMS, which includes the accuracy and completeness with which participants achieve the specified goals; (2) its efficiency, i.e., the time used by the participants to achieve the tasks; and (3) participant satisfaction, which is measured by the comfort and acceptability of use. Table 3 includes the means and standard deviations (SD) of the time it took to perform each task and the average number of errors, E (i.e., the number of unsuccessful tasks), which is calculated as follows:

E = (∑_{j=1}^{R} E_j) / R (1)

where E_j is the number of errors made by participant j and R is the number of participants.

5.2. After-scenario questionnaire

The purpose of conducting the usability testing is to measure the effectiveness, efficiency, and user satisfaction of the CBIM3R-FMS AR and IAV modules using a questionnaire. To identify the test subjects and their knowledge of BIM and MR, the first seven questions are related to the participant's gender, age, level of education, professional background and FM experience, sight health, and familiarity with BIM and MR. After answering these questions, the participants are asked to provide their level of agreement with the statements presented in Table 4 using a 7-point Likert scale.


Each participant was asked the overall usability questions (Q8 to Q10), which are adopted from the IBM Post-Study System Usability Questionnaire (PSSUQ) [49], about their overall satisfaction level, including satisfaction with the easiness of completing the assigned tasks, the amount of time needed to finish the tasks, and the support of the visual information. In addition, the mental and physical loads are evaluated using two NASA Task Load Index (NASA-TLX) questions [50] (Q11 to Q12). To evaluate the collaboration support of CBIM3R-FMS, users were asked about their level of satisfaction with specific aspects of the test (Q13 to Q18). Finally, an additional optional question (Q19) asks for any extra comments or questions related to the conducted task or the CBIM3R-FMS experience.

5.2.1. Analysis of test results
5.2.1.1. Statistical analysis. Thirty participants (22 males and 8 females) participated in the test. The majority of subjects (19) are aged between 25 and 34. Table 5 summarizes the participants' collected demographic information. The majority of subjects (76.67%) have an engineering background and only eight participants (26.67%) have FM experience. Furthermore, 60% of the participants have used BIM in their projects. Regarding MR experience, 16 participants (53.33%) reported that they had VR experience and five (16.67%) had both AR and VR experience, whereas nine participants (30.00%) had no experience with either AR or VR. Out of the 30, nine (30%) reported having vision problems requiring glasses, which may make wearing the HMD unpleasant due to its limited adjustability to fit glasses. Out of the 30 participants, eight are FM workers (5 electricians and 3 general maintenance workers).

Table 6 shows the minimums, maximums, medians, means, standard deviations (SD), and Likert-scale responses for all items based on the subjects' experiment conditions using the different modules. The results show that IAV users are less satisfied (mean = 2.17) with the easiness of completing the task (Q8) compared with the AR module users (mean = 1.77). In addition, IAV users were less satisfied with the amount of time it took to complete the task (Q9) (mean = 2.20), whereas AR module users showed more satisfaction (mean = 1.77). The results also show that AR users are more satisfied with the support information they get during the task (Q10) (mean = 1.73), whereas IAV users are less satisfied (mean = 2.10). For both mental and physical loads (Q11 and Q12), the AR module users found the tasks less demanding (mean = 1.37 and 1.77, respectively) compared with the IAV users (mean = 1.83 and 2.33). During the IVC mode, both the AR and IAV users almost agree about the easiness of the remote arrow marking (Q13) and remote sketching (Q15). However, only AR users are satisfied with the accuracy of remote sketching (Q16) (mean = 2.07), while IAV users are less satisfied (mean = 2.57). Considering Question 17, the scores indicate that the IAV users consider finding the task tag in the VE to be easy (mean = 2.50); the AR module users, however, showed more confidence about how easy it is to find tagged tasks (mean = 1.90). During the IVC session, both users have almost similar satisfaction levels with the AR-IAV interaction experience (Q18) (mean = 1.57 and 1.43) and the AR-IAV visual experience (mean = 1.43 and 1.53).

Furthermore, Table 7 presents the minimums, maximums, means, and standard deviations (SD) of the time needed by the field workers to identify a building component (e.g., a light switch or a thermostat) under two conditions. The first condition is without using the IVC mode (as performed in Task 4), and the second condition is with the IVC mode (as performed in Task 11). By comparing the means, it is clear that finding tasks with IVC support took less time (1.82 s), which is about 15% of the time needed to perform the task without the support.

For the same test related to identifying a building component with or without IVC, Table 8 shows the results of the one-way ANOVA. In this case, the first group of field workers used IVC while the second group did not. The results are statistically significant between the different conditions of the test. All variables in the questionnaire were analyzed using a one-way repeated measures ANOVA with an alpha level of 0.05.

5.2.1.2. Participants' remarks. Table 9 summarizes the participants' remarks about their experience with the AR and IAV modules during the test. Their remarks can be categorized into two categories: positive remarks and negative remarks. Some remarks include recommendations about features they believe would improve FM tasks.

Fig. 13 summarizes the opinions of the eight FM workers about the overall usefulness of CBIM3R-FMS by comparing the current (conventional) practice and the IVC method based on five criteria: (1) easiness of use; (2) time to conduct the tasks; (3) accuracy of the information; (4) collaboration; and (5) adaptability of the method. By comparing the means of each category, the results show that the workers preferred using the IVC over their current practice. However, they have doubts about the adaptability of the system. Almost all agreed that using new technologies may expose their privacy: they may feel that they are being tracked and that all their mistakes can be recorded. The facility managers, of course, will disagree, since they do not consider this capability a disadvantage.

Fig. 13. Comparison chart between current practice and IVC task performance (radar chart over the five criteria: easiness, time, accuracy, collaboration, and adaptability; rated 0 to 5; series: current practice (conventional) vs. IVC).
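For reference, the one-way ANOVA reported in Table 8 reduces, for two independent groups, to computing the between-groups and within-groups sums of squares and the F-ratio. The routine below is a generic sketch, not the authors' analysis script; the class and method names are illustrative.

```csharp
using System;
using System.Linq;

// Minimal one-way ANOVA for two independent groups (illustrative, generic implementation).
static class OneWayAnova
{
    public static double FRatio(double[] group1, double[] group2)
    {
        int n1 = group1.Length, n2 = group2.Length, N = n1 + n2;
        double m1 = group1.Average(), m2 = group2.Average();
        double grandMean = (group1.Sum() + group2.Sum()) / N;

        // Between-groups sum of squares; df = k - 1 = 1 for two groups.
        double ssBetween = n1 * Math.Pow(m1 - grandMean, 2) + n2 * Math.Pow(m2 - grandMean, 2);

        // Within-groups sum of squares; df = N - k.
        double ssWithin = group1.Sum(x => Math.Pow(x - m1, 2))
                        + group2.Sum(x => Math.Pow(x - m2, 2));

        int dfBetween = 1, dfWithin = N - 2;
        double msBetween = ssBetween / dfBetween;   // mean square between groups
        double msWithin = ssWithin / dfWithin;      // mean square within groups

        // Compare the returned F against the critical value at the chosen alpha
        // (Table 8 reports F-critical = 4.007 at alpha = 0.05 for df = 1, 58).
        return msBetween / msWithin;
    }
}
```

With two groups of 30 participants each, the degrees of freedom are 1 (between) and 58 (within), matching the structure of Table 8; a computed F above the critical value indicates a statistically significant difference between the test conditions.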


Table 3
List of tasks and their average time and errors.

Non-collaborative mode:
Task 1. AR user: idle. IAV user: start the IAV module. Purpose: launching the server side first. Time: 0.79 s (SD 0.10); errors: 0.00 (SD 0.00).
Task 2. AR user: start the AR module and point at the test target area. IAV user: idle. Purpose: launching the client side second. Time: 0.77 s (SD 0.20); errors: 0.10 (SD 0.31).
Task 3. AR user: when the red text turns green, press the GUIDE button, follow the guiding arrow, and name the tagged Task No. 1. IAV user: idle. Purpose: to measure the time, errors, and effort needed with arrow guidance. Time: 1.22 s (SD 0.80); errors: 0.37 (SD 0.67).
Task 4. AR user: keep looking around and name the tagged Task No. 2. IAV user: idle. Purpose: to measure the time, errors, and effort needed without arrow guidance. Time: 11.88 s (SD 4.35); errors: 0.60 (SD 0.89).
Task 5. AR user: press the INFO button, go to Task No. 3, pick the target element (thermostat), and read the element name (green text). IAV user: idle. Purpose: to measure the time, errors, and effort needed to identify an element's attributes. Time: 3.05 s (SD 2.80); errors: 0.20 (SD 0.48).
Task 6. AR user: press the HATCH button to switch the Virtual Hatch on and reposition it around the target element. IAV user: idle. Purpose: to evaluate visual depth perception using user feedback in the after-test survey. Time: 3.47 s (SD 2.30); errors: 0.27 (SD 0.52).
Task 7. AR user: press the LIGHT button to switch the lights on. IAV user: idle. Purpose: to evaluate the georeferenced light effect using user feedback in the after-test survey. Time: 0.66 s (SD 0.20); errors: 0.00 (SD 0.00).
Task 8. AR user: press the SKETCH button and sketch over the triangle drawn on the wall as accurately as possible. IAV user: idle. Purpose: to measure the time, errors, and accuracy of local sketching. Time: 6.69 s (SD 1.60); errors: 0.33 (SD 0.61).

Collaborative mode:
Task 9. AR user: press the CONSTRAINED MODE button to reposition the IAV user. IAV user: once positioned, press any keyboard key to deactivate the constrained mode and use the touch controller gears to navigate around. Purpose: to update the IAV user camera pose remotely to its relative real location. Time: 4.99 s (SD 1.00); errors: 0.10 (SD 0.31).
Task 10. AR user: idle. IAV user: press Key 2 on the touch controller to mark the thermostat (yellow arrow). Purpose: to measure the time, errors, and accuracy of marking in the IAV module. Time: 2.01 s (SD 1.30); errors: 0.20 (SD 0.48).
Task 11. AR user: name the pointed element (three attempts). IAV user: idle. Purpose: to measure the time, errors, and accuracy of identifying a remotely pointed element. Time: 1.82 s (SD 0.84); errors: 0.23 (SD 0.50).
Task 12. AR user: press the MARK button on the yellow arrow as accurately as possible. IAV user: idle. Purpose: to measure the time, errors, and accuracy of marking in the AR module relative to the remote mark. Time: 1.02 s (SD 0.40); errors: 0.07 (SD 0.25).
Task 13. AR user: pick the marked element (i.e., thermostat) and update its current condition (i.e., Bad, Good, Excellent). IAV user: idle. Purpose: to measure the errors and the time needed to update records. Time: 3.76 s (SD 1.70); errors: 0.13 (SD 0.35).
Task 14. AR user: idle. IAV user: read the updated thermostat condition. Purpose: to measure the errors and the time needed to review updated records. Time: 1.17 s (SD 0.80); errors: 0.03 (SD 0.18).
Task 15. AR user: idle. IAV user: press Key 3 to sketch over the remote sketch done by the AR user. Purpose: to measure the time, errors, and accuracy of following the remote sketching. Time: 4.97 s (SD 1.00); errors: 0.20 (SD 0.41).
Task 16. AR user: press the PART button to place the new thermostat inside the created sketch. IAV user: idle. Purpose: to measure the time, errors, and accuracy of placing a building element in its proper place. Time: 5.15 s (SD 1.30); errors: 0.13 (SD 0.35).
Task 17. AR user: idle. IAV user: press the update button to update the record. Purpose: to measure the errors and the time needed to update records so that the AR user can see them. Time: 1.02 s (SD 0.30); errors: 0.00 (SD 0.00).

Table 4
After-scenario questionnaire.
No Question Likert scale

IBM (PSSUQ) 8 Overall, I am satisfied with the ease of completing this task. 1 = Strongly Agree to 7 = Strongly Disagree
9 Overall, I am satisfied with the amount of time it took to complete this task. 1 = Strongly Agree to 7 = Strongly Disagree
10 Overall, I am satisfied with the support information when completing this task. 1 = Strongly Agree to 7 = Strongly Disagree
NASA-TLX 11 Mental Demand: How mentally demanding was the task? 1 = Very Low to 7 = Very High
12 Physical Demand: How physically demanding was the task? 1 = Very Low to 7 = Very High
IVC 13 I am satisfied with the ease of remote arrow marking. 1 = Strongly Agree to 7 = Strongly Disagree
14 I am satisfied with the accuracy of remote arrow marking. 1 = Strongly Agree to 7 = Strongly Disagree
15 I am satisfied with the ease of remote sketching. 1 = Strongly Agree to 7 = Strongly Disagree
16 I am satisfied with the accuracy of remote sketching. 1 = Strongly Agree to 7 = Strongly Disagree
17 I am satisfied with the ease of finding the task tag. 1 = Strongly Agree to 7 = Strongly Disagree
18 Overall, I am satisfied with the AR-IAV interaction experience. 1 = Strongly Agree to 7 = Strongly Disagree
19 Do you have any other comments, questions, or concerns?

Table 5
Demographics of test participants.

Variables: Frequency (Percentage %)
Gender: Male 22 (73); Female 8 (27)
Age: 18–24 1 (3); 25–34 19 (63); 35–44 8 (27); 45–54 2 (7)
Level of education: High school 6 (20); BSc 10 (33); MSc 13 (44); PhD 1 (3)
Professional background: Engineering 23 (77); Non-engineering 7 (23)
FM experience: Yes 8 (27); No 22 (73)
BIM experience: Yes 18 (60); No 12 (40)
Mixed Reality experience: VR only 16 (53); AR only 0 (0); AR and VR 5 (17); None 9 (30)
Reported vision problem: Yes 9 (30); No 21 (70)

Table 7
Descriptive statistics for the time for identifying tagged tasks.

Time (s): Without IVC / With IVC
Minimum: 2.02 / 0.90
Maximum: 20.49 / 3.35
Mean: 11.88 / 1.82
Standard deviation: 4.35 / 0.84

Table 8
ANOVA analysis results for IVC dependent variables.

Source of variation: SS, df, MS, F-ratio, P-value, F-critical
Between groups: 2579.25, 1, 2579.25, 14.78, 0.0003, 4.007
Within groups: 10,118.86, 58, 174.46
Total: 12,698.11, 59
SS = Sum of Squares, df = Degrees of Freedom, MS = Mean Square, F = F statistic.
5.2.1.3. Evaluation of the accuracy of remote interaction. To evaluate the interactive collaboration experience using the IVC mode, it is important to evaluate the accuracy of remote marking and sketching with respect to the georeferenced BIM model. First, the facility manager is shown a target point (i.e., the center point of Fig. 14) and asked to mark on top of it. Second, the new pointing arrow added by the facility manager is shown in the AR module, and the field worker is asked to mark on top of it as accurately as possible. Then, the following three errors (maximum, minimum, mean, and standard deviation) are calculated, as shown in Table 10: (1) to the ground truth target in the IAV module, (2) to the ground truth target in the AR module, and (3) between the AR and IAV module marks.

Fig. 14. Comparison graph between AR module user marking and IAV user marking (legend: ground truth target, field marks, office marks).

Table 10
Statistics of errors for remote marking using IVC.

Errors (cm): Max, Min, Mean, SD
To ground truth target in IAV module: 1.80, 0.06, 1.12, 0.50
To ground truth target in AR module: 1.96, 0.09, 1.03, 0.56
Between AR and IAV module marks: 1.07, 0.03, 0.31, 0.28

The results show that the average error to the target point is 1.03 cm in the AR module and 1.12 cm in the IAV module, and the average error between the AR and IAV marks is 0.31 cm. These values indicate that the level of accuracy is acceptable for pointing at small facility components such as electrical components. Furthermore, the same accuracy is achievable when performing sketching.

Several factors may affect the accuracy of marking in the AR and IAV modules, as shown in Table 11. The user skill can be improved by more training, as one of the participants indicated. Although all participants had a training session, the results show that some participants had difficulty using the correct controller button for marking and sketching. Others had problems using the touch screen of the tablet and applying multiple touches, which resulted in multiple marks. Also, incorrect configuration of the VR hardware settings may result in accuracy issues. For example, if the tracking cameras are not configured correctly, the location of the controllers will not be tracked correctly, which means that the ray used for marking and sketching will be cast from a wrong position. The stability of the feature-based tracking is also important and may affect the accuracy of the marking and sketching: flickering augmented objects or invisible surfaces (i.e., surfaces of the bounding boxes of building elements) can cause inaccurate marking or sketching.

Table 11
Identified factors affecting marking accuracy.

IAV module: Skill: user skills with the controllers. Hardware: VR controllers. Software: HMD and controllers calibration.
AR module: Skill: user skills with the touch screen of the tablet. Hardware: sensitivity of the touch screen. Software: feature-based tracking.

Table 6
Descriptive statistics for the after-scenario questionnaire.
Question no.: AR module (Min, Max, Median, Mean, SD); IAV module (Min, Max, Median, Mean, SD)
Q8 1 3 2 1.77 0.68 1 3 2 2.17 0.79


Q9 1 3 2 1.77 0.73 1 4 2 2.20 0.89
Q10 1 3 2 1.73 0.64 1 3 2 2.10 0.71
Q11 1 4 1 1.37 0.72 1 4 1.5 1.83 1.02
Q12 1 4 2 1.77 0.90 1 4 2.5 2.33 1.12
Q13 1 3 1 1.33 0.55 1 3 1 1.37 0.56
Q14 1 3 1 1.47 0.63 1 3 1 1.53 0.68
Q15 1 3 2 1.57 0.57 1 3 1 1.47 0.57
Q16 1 4 2 2.07 0.74 1 5 2 2.57 1.07
Q17 1 3 2 1.90 0.66 1 6 2 2.50 1.38
Q18 1 4 1 1.57 0.73 1 4 1 1.43 0.68


Table 9
Summary of participants' remarks.

AR module, positive remarks:
• Collaborating with a co-worker and sharing his view looks very clear and accurate.
• Interacting remotely using marking and sketching saves time and eliminates errors.
• Finding building elements, without looking at maps or complex 3D models, is very interesting and time saving.
• BIM-based AR makes inspection and maintenance tasks much easier.
• The GUI is easy to use.

AR module, negative remarks:
• The AR module cannot be used in hands-free mode using AR glasses.
• Using a heavy and large tablet can be physically demanding for some users.
• Needs extra training to avoid making mistakes.
• Using the touch screen for marking and sketching can be tricky and can accidentally trigger some buttons.

IAV module, positive remarks:
• The interaction with the remote field worker feels like the user is actually there.
• Interacting with the remote worker can reduce the time spent waiting for approval.
• No motion sickness.

IAV module, negative remarks:
• An element search feature is missing.
• Needs more training with the controllers.

Overall, positive remarks:
• The AR-IAV visual interaction experience is very promising.

Overall, negative remarks:
• Adding audio chatting may save extra time.
• Adaptability of the system.
• Privacy issues.
5.2.1.4. CBIM3R-FMS effectiveness. According to ISO 9241 [51], product effectiveness is defined as the accuracy and completeness of user goal achievement. To measure the overall effectiveness of CBIM3R-FMS, first the percentage of tasks that users completed without any errors is calculated using Eq. (2):

E = [∑_{j=1}^{R} ∑_{i=1}^{N} n_{ij} / (R × N)] × 100 (2)

where N is the total number of tasks; R is the number of users who performed the tasks; and n_{ij} is the result of task i by user j (n_{ij} = 1 if the task was completed, and n_{ij} = 0 if the task was unsuccessful and the user failed to achieve the goal). Then, the effectiveness statistic error is calculated using Eq. (3):

σ_E = √[E(100 − E) / R] (3)

Finally, the overall CBIM3R-FMS effectiveness is calculated using Eq. (4):

Ē = E ± σ_E (4)

Fig. 15 shows the effectiveness scale of using CBIM3R-FMS. The percentages of participants who completed their tasks successfully are shown in gray; the percentages of unsuccessful ones are shown in black. The results show that the overall integral effectiveness E is 90.20% with an effectiveness statistic error (σ_E) margin of ±1.8. Therefore, the overall CBIM3R-FMS effectiveness is determined as good, with a sufficient degree of confidence, according to the scale presented in Table 12.

Fig. 15. CBIM3R-FMS effectiveness chart (percentage of completion by the 30 participants for Tasks T1 to T17; tasks completed without errors vs. tasks completed with errors).

Table 12
Effectiveness scale [51].

Very bad: 0–50%; Bad: 50–75%; Normal: 75–90%; Good: 90–100%.
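The effectiveness metrics of Eqs. (1) to (4) are simple enough to sketch directly. The code below is an illustrative implementation only: the class and method names are hypothetical, and the value of R supplied to Eq. (3) should match the number of observations actually used by the analyst, which is not restated here.

```csharp
using System;

// Illustrative implementation of the error and effectiveness metrics (Eqs. (1) to (4)).
static class Effectiveness
{
    // Eq. (1): average number of errors per participant.
    public static double AverageErrors(int[] errorsPerParticipant)
    {
        int R = errorsPerParticipant.Length;
        double sum = 0;
        foreach (int e in errorsPerParticipant) sum += e;
        return sum / R;
    }

    // Eq. (2): percentage of successfully completed tasks.
    // n[j, i] = 1 if user j completed task i, 0 otherwise.
    public static double OverallEffectiveness(int[,] n)
    {
        int R = n.GetLength(0), N = n.GetLength(1);
        int completed = 0;
        for (int j = 0; j < R; j++)
            for (int i = 0; i < N; i++)
                completed += n[j, i];
        return 100.0 * completed / (R * N);
    }

    // Eq. (3): statistic error of the effectiveness percentage E for R observations.
    public static double StatisticError(double E, int R)
        => Math.Sqrt(E * (100.0 - E) / R);

    // Eq. (4): overall effectiveness reported as the band E plus/minus sigma_E.
    public static (double low, double high) EffectivenessBand(double E, double sigmaE)
        => (E - sigmaE, E + sigmaE);
}
```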
6. Conclusions and future work

The main contribution of this paper is the development of an MR framework for facilities management that comprises a field AR module and an office IAV module. These two modules can be used independently or combined using interactive visual collaboration. The field and the office tasks can be performed independently, with the ability to consult and coordinate at any time without misinterpretation of the task-related data. The validation has three major parts: (1) usability testing to quantify and analyze the data related to the conducted questionnaire, including user satisfaction, time, and error analysis; (2) the accuracy of remote marking and sketching; and (3) system effectiveness. The validation results show that field task efficiency is improved by minimizing data entry time and errors. The field workers took much longer (mean = 11.88 s) to identify tagged tasks using the non-collaborative approach compared with the IVC approach (1.82 s), a time reduction of about 85%. The number of errors is also reduced by 62%. In addition, all participants agreed that being able to remotely mark a building element in an MR environment is useful for visually interacting and communicating with a remote co-worker. The average error between the facility manager's mark and the field worker's mark is 0.31 cm. This error is considered acceptable for inspection and maintenance tasks, where users not only can point at a building element but can also select a specific part of it; for example, the users can mark a part of a thermostat such as the screen, a button, etc. On the other hand, the marking results show that the marking error from the ground truth is larger in the AR module, at 1.03 cm. In addition, the ANOVA analysis showed that there is a statistically significant difference between the means of using CBIM3R-FMS with and without IVC.


Furthermore, the CBIM3R-FMS effectiveness validation results show that the overall integral effectiveness E is 90.20% with an effectiveness statistic error margin of ±1.8. This indicates that the system effectiveness is good according to the ISO scale. In addition, the results show that the AR module users were satisfied with the overall easiness of the module (mean = 1.77) and the time it took them to complete the assigned tasks (mean = 1.77). The mean overall user satisfaction levels of the IAV module are 2.17 for easiness of use and 2.20 for completion time. Participants indicated that some aspects of the CBIM3R-FMS approach can be improved; they recommended AR glasses for the AR module to allow the field worker to perform his/her tasks hands-free. Overall, they found CBIM3R-FMS a promising tool for FM daily inspection and maintenance tasks. However, the following limitations still remain to be addressed in the future: (1) to maintain accurate tracking, the system requires unique, well-distributed features to be used as targets, which can be challenging in indoor environments; (2) the proposed system does not support a hands-free mode that would allow the field workers to perform their tasks while using the AR module; (3) the system does not support occlusion for moving objects in the AR module; (4) using the touch screen for marking and sketching can be tricky and can accidentally trigger some buttons; and (5) privacy issues related to tracking field workers should be considered.

The proposed method has been implemented in a prototype system. Full implementation of the software will require major investments in two directions: (1) scalability of the software (i.e., full implementation of the different modules to cope with large-scale applications), and (2) development of the BIM database with a suitable level of detail for visualization purposes, to be used as the backbone of the system. Knowing that BIM is gaining increasing momentum in FM applications, it is hoped that the required BIM database will be available.

For future studies, as field workers need to work with their hands, it is recommended to investigate methods for projecting the AR module contents on surrounding surfaces instead of viewing them through a tablet or AR glasses. Although AR glasses can augment facilities' components based on their positions in the real world, augmenting objects in critical situations may cause confusion to the field worker and affect his/her safety.


In addition, most of the current AR headsets (e.g., HoloLens, Magic Leap) do not support mobility. Furthermore, these headsets have no input functionalities other than gazing or voice commands, which can be hard to use in the working environment. The challenge here is the interaction with the AR contents using the AR module functionalities. MIT's SixthSense technology can be considered for integration with the AR module [52]; this technology showed a new way of human-computer interaction that can be re-investigated to facilitate FM tasks, so that field workers can focus more on their actual inspection or maintenance tasks with minimal interaction with the AR GUI. In addition, visualization issues, including moving-object occlusion in AR, should be investigated. Finally, although the usability testing showed that participants have concerns about exposing their privacy by tracking their locations, tracking the field workers is strictly used to improve their performance. In addition, tracking can improve personal safety in case of emergencies. These aspects should be considered in future work as arguments to reduce the resistance of the workers to using tracking technologies in general.

References

[1] P. Milgram, H. Colquhoun, A taxonomy of real and virtual world display integration, Mixed Reality: Merging Real and Virtual Worlds, 1, 1999, pp. 1–26, https://doi.org/10.1007/978-3-642-87512-0_1.
[2] S. Gauglitz, B. Nuernberger, M. Turk, T. Höllerer, In touch with the remote world: remote collaboration with augmented reality drawings and virtual navigation, Proceedings of the 20th ACM Symposium on Virtual Reality Software and Technology, ACM, 2014, pp. 197–205, https://doi.org/10.1145/2671015.2671016.
[3] C. Koch, M. Neges, M. König, M. Abramovici, Natural markers for augmented reality-based indoor navigation and facility maintenance, Autom. Constr. 48 (2014) 18–30, https://doi.org/10.1016/j.autcon.2014.08.009.
[4] X. Wang, P.E. Love, M.J. Kim, W. Wang, Mutual awareness in collaborative design: an augmented reality integrated telepresence system, Comput. Ind. 65 (2) (2014) 314–324, https://doi.org/10.1016/j.compind.2013.11.012.
[5] S. Lukosch, Towards virtual co-location and telepresence based on augmented reality, Semantic Scholar, 2015, http://hci.cs.wisc.edu/workshops/chi2015/papers/chi15-lukosch.pdf.
[6] K.A. Piirainen, G.L. Kolfschoten, S. Lukosch, The joint struggle of complex engineering: a study of the challenges of collaborative design, International Journal of Information Technology & Decision Making 11 (6) (2012) 1087–1125, https://doi.org/10.1142/S0219622012400160.
[7] B. Sidawi, A. Alsudairi, The potentials of and barriers to the utilization of advanced computer systems in remote construction projects: case of the Kingdom of Saudi Arabia, Visualization in Engineering 2 (1) (2014) 3–14, https://doi.org/10.1186/2213-7459-2-3.
[8] Z. Deng, H. Li, C. Tam, Q. Shen, P. Love, An application of the Internet-based project management system, Autom. Constr. 10 (2) (2001) 239–246, https://doi.org/10.1016/S0926-5805(99)00037-0.
[9] L. Kestle, Remote site design management, Doctoral dissertation, University of Canterbury, New Zealand, 2009, http://hdl.handle.net/10092/3579.
[10] C. Park, D. Lee, O. Kwon, X. Wang, A framework for proactive construction defect management using BIM, augmented reality and ontology-based data collection template, Autom. Constr. 33 (2013) 61–71, https://doi.org/10.1016/j.autcon.2012.09.010.
[11] L. Hou, X. Wang, M. Truijens, Using augmented reality to facilitate piping assembly: an experiment-based evaluation, J. Comput. Civ. Eng. 29 (1) (2013), https://doi.org/10.1061/(ASCE)CP.1943-5487.0000344.
[12] A. Shirazi, A.H. Behzadan, Content delivery using augmented reality to enhance students' performance in a building design and assembly project, Advances in Engineering Education 4 (3) (2015) n3 (ISSN: EISSN-1941-1766).
[13] N. Yabuki, T. Onoue, T. Fukuda, S. Yoshida, A heatstroke prediction and prevention system for outdoor construction workers, Visualization in Engineering 1 (1) (2013) 11, https://doi.org/10.1186/2213-7459-1-11.
[14] J. Irizarry, M. Gheisari, G. Williams, B.N. Walker, InfoSPOT: a mobile augmented reality method for accessing building information through a situation awareness approach, Autom. Constr. 33 (2013) 11–23, https://doi.org/10.1016/j.autcon.2012.09.002.
[15] Revizto, Revizto: real-time issue tracking software for architecture, engineering and construction with a focus on collaboration and BIM project coordination, Retrieved on 02/23/2018 from: https://revizto.com/en/.
[16] Autodesk360, A360: collaboration for design teams, Retrieved on 03/15/2018 from: https://a360.autodesk.com/.
[17] Bluescape, Collaborative workspace and virtual workspace, Retrieved on 06/12/2019 from: https://bluescape.com/.
[18] X. Wang, P.S. Dunston, Comparative effectiveness of mixed reality-based virtual environments in collaborative design, IEEE Trans. Syst. Man Cybern. Part C Appl. Rev. 41 (3) (2011) 284–296, https://doi.org/10.1109/TSMCC.2010.2093573.
[19] S. Dong, A.H. Behzadan, F. Chen, V.R. Kamat, Collaborative visualization of engineering processes using tabletop augmented reality, Adv. Eng. Softw. 55 (2013) 45–55, https://doi.org/10.1016/j.advengsoft.2012.09.001.
[20] D. Datcu, M. Cidota, H. Lukosch, S. Lukosch, On the usability of augmented reality for information exchange in teams from the security domain, Proceedings of the Intelligence and Security Informatics Conference (JISIC), 2014 IEEE Joint, IEEE, 2014, pp. 160–167, https://doi.org/10.1109/JISIC.2014.32.
[21] W. Huang, L. Alem, F. Tecchia, HandsIn3D: supporting remote guidance with immersive virtual environments, Proceedings of the IFIP Conference on Human-Computer Interaction, Springer, 2013, pp. 70–77, https://doi.org/10.1007/978-3-642-40483-2_5.
[22] O. Oda, M. Sukan, S. Feiner, B. Tversky, Poster: 3D referencing for remote task assistance in augmented reality, Proceedings of the IEEE Symposium on 3D User Interfaces (3DUI), IEEE, 2013, pp. 179–180, https://doi.org/10.1109/3DUI.2013.6550237.
[23] J. Normand, M. Servieres, G. Moreau, A typology of augmented reality applications based on their tracking requirements, Proceedings of Location Awareness for Mixed and Dual Reality, LAMDa Workshop, Lisbon, Portugal, 2012.
[24] A. Motamedi, M.M. Soltani, A. Hammad, Localization of RFID-equipped assets during the operation phase of facilities, Adv. Eng. Inform. 27 (4) (2013) 566–579, https://doi.org/10.1016/j.aei.2013.07.001.
[25] G. Deak, K. Curran, J. Condell, A survey of active and passive indoor localisation systems, Comput. Commun. 35 (16) (2012) 1939–1954, https://doi.org/10.1016/j.comcom.2012.06.004.
[26] A. LaMarca, Y. Chawathe, S. Consolvo, J. Hightower, I. Smith, J. Scott, T. Sohn, J. Howard, J. Hughes, F. Potter, Place lab: device positioning using radio beacons in the wild, Proceedings of the International Conference on Pervasive Computing, Springer, 2005, pp. 116–133, https://doi.org/10.1007/11428572_8.
[27] R.M. Rodriguez, R. Aguilar, S. Uceda, B. Castañeda, 3D pose estimation oriented to the initialization of an augmented reality system applied to cultural heritage, Digital Cultural Heritage (2018) 277–288, https://doi.org/10.1007/978-3-319-75826-8_23.
[28] M. Neges, C. Koch, M. König, M. Abramovici, Combining visual natural markers and IMU for improved AR based indoor navigation, Adv. Eng. Inform. 31 (2017) 18–31, https://doi.org/10.1016/j.aei.2015.10.005.
[29] M. Chu, J. Matthews, P.E. Love, Integrating mobile building information modelling and augmented reality systems: an experimental study, Autom. Constr. 85 (2018) 305–316, https://doi.org/10.1016/j.autcon.2017.10.032.
[30] M. Feick, A. Tang, S. Bateman, Mixed-reality for object-focused remote collaboration, Proceedings of the 31st Annual ACM Symposium on User Interface Software and Technology Adjunct Proceedings, ACM, 2018, pp. 63–65, https://doi.org/10.1145/3266037.3266102.
[31] J. Du, Z. Zou, Y. Shi, D. Zhao, Zero latency: real-time synchronization of BIM data in virtual reality for collaborative decision-making, Autom. Constr. 85 (2018) 51–64, https://doi.org/10.1016/j.autcon.2017.10.009.
[32] F. Liu, S. Seipel, Precision study on augmented reality-based visual guidance for facility management tasks, Autom. Constr. 90 (2018) 79–90, https://doi.org/10.1016/j.autcon.2018.02.020.
[33] J.B. Sørensen, K. Svidt, BIM-based multiuser collaborative virtual environments for end user involvement, Proceedings of the 35th Annual International Conference of eCAADe (Education and Research in Computer Aided Architectural Design in Europe), 1, eCAADe, 2017, pp. 111–118.
[34] T.W. Kang, C.H. Hong, A study on software architecture for effective BIM/GIS-based facility management data integration, Autom. Constr. 54 (2015) 25–38, https://doi.org/10.1016/j.autcon.2015.03.019.
[35] K. El Ammari, Remote Collaborative BIM-based Mixed Reality Approach for Supporting Facilities Management Field Tasks, PhD thesis, Concordia University, 2018.
[36] R. Williams, H. Shayesteh, L. Marjanovic-Halburd, Utilising building information modeling for facility management, International Journal of Facility Management 5 (1) (2014).
[37] Z. Wu, N. Wang, J. Shao, G. Deng, GPU ray casting method for visualizing 3D pipelines in a virtual globe, International Journal of Digital Earth 2018 (2018) 1–14, https://doi.org/10.1080/17538947.2018.1429504.
[38] J. Han, Introduction to Computer Graphics with OpenGL ES, CRC Press, 2018 (ISBN-13: 978-1498748926).
[39] Unity3D, Unity3D game engine, Retrieved on 04/14/2018 from: http://unity3d.com.
[40] Ville de Montréal, The digital model of southwest and Ville-Marie - CityGML LOD2 buildings with textures, Retrieved on 05/10/2017 from: http://donnees.ville.montreal.qc.ca/dataset/maquette-numerique-batiments-citygml-lod2-avec-textures.
[41] Revit, Building Information Modeling Applications, Retrieved on 05/22/2018 from: http://www.autodesk.com/products/revit-family/overview.
[42] Vuforia SDK, How to install the Vuforia SDK Unity3D extension, Retrieved on 09/12/2017 from: https://library.vuforia.com/articles/Solution/Installing-the-Unity-Extension.
[43] Apache, Xerces java parser, Retrieved on 02/06/2018 from: https://xerces.apache.org/xerces-j/.
[44] Connectify, Connectify Hotspot, Retrieved on 06/12/2019 from: www.connectify.me/.
[45] FlirOne, Thermal imaging cameras for your Smartphone, Retrieved on 12/01/2018 from: https://www.flir.ca/flir-one/.
[46] Oculus, Virtual reality headset review, Retrieved on 01/03/2018 from: https://www.oculus.com/.
[47] Vuforia, Natural Features and Image Ratings, Retrieved on 01/03/2018 from: https://library.vuforia.com/articles/Solution/Optimizing-Target-Detection-and-Tracking-Stability#Natural-Features-and-Image-Ratings.
[48] J.R. Turner, J. Thayer, Introduction to Analysis of Variance: Design, Analysis & Interpretation, Sage Publications, London, 2001 (ISBN-13: 978-0803970748).
[49] J.R. Lewis, IBM computer usability satisfaction questionnaires: psychometric evaluation and instructions for use, International Journal of Human-Computer Interaction 7 (1) (1995) 57–78, https://doi.org/10.1080/10447319509526110.
[50] S.G. Hart, NASA-task load index (NASA-TLX); 20 years later, Proceedings of the Human Factors and Ergonomics Society Annual Meeting, Sage Publications, Los Angeles, CA, 2006, pp. 904–908, https://doi.org/10.1177/154193120605000909.
[51] ISO, ISO 9241-11:1998, Ergonomic requirements for office work with visual display terminals (VDTs) - Part 11: Guidance on usability, Retrieved on 05/04/2018 from: https://www.iso.org/obp/ui/#iso:std:iso:9241:-11:ed-1:v1:en.
[52] P. Mistry, P. Maes, SixthSense: a wearable gestural interface, Proceedings of ACM SIGGRAPH ASIA 2009 Sketches, ACM, 2009, p. 11, https://doi.org/10.1145/1667146.1667160.