Review
A State-of-the-Art Review of Augmented Reality in Engineering Analysis and Simulation
Wenkai Li 1,2, A. Y. C. Nee 1,2,* and S. K. Ong 1,2
1 NUS Graduate School for Integrative Sciences and Engineering, National University of Singapore,
28 Medical Drive, Singapore 117456, Singapore; wenkail@u.nus.edu (W.L.); mpeongsk@nus.edu.sg (S.K.O.)
2 Mechanical Engineering Department, National University of Singapore, 9 Engineering Drive 1,
Singapore 117576, Singapore
* Correspondence: mpeneeyc@nus.edu.sg
Abstract: Augmented reality (AR) has recently become a worldwide research topic. AR technology
renders intuitive computer-generated content over users’ physical surroundings. To improve process
efficiency and productivity, researchers and developers have paid increasing attention to AR
applications in engineering analysis and simulation. The integration of AR with numerical simulation,
such as the finite element method, provides a cognitive and scientific way for users to analyze
practical problems. By incorporating scientific visualization technologies, an AR-based system
superimposes engineering analysis and simulation results directly on real-world objects. Engineering
analysis and simulation involve diverse types of data that are normally processed using specific
computer software. Correct and effective visualization of these data on an AR platform can
reduce spatial and logical misinterpretation. Moreover, the tracking performance of AR
platforms in engineering analysis and simulation is crucial as it influences the overall user experience.
The operating environment of the AR platforms requires robust tracking performance to deliver
stable and accurate information to the users. In addition, over the past several decades, AR has
undergone a transition from desktop to mobile computing. The portability and proliferation of
mobile platforms have provided engineers with convenient access to relevant information in situ.
However, on-site working environments impose constraints on the development of mobile AR-based
systems. This paper aims to provide a systematic overview of AR in engineering analysis and
simulation. Visualization and tracking techniques, as well as implementation on mobile platforms,
are discussed. Each technique is analyzed with respect to its pros and cons, as well as its suitability
to particular types of applications.
1. Introduction
Engineering problems are generally mathematical models of physical phenomena [1]. There are
various types of typical engineering problems, such as those in solid mechanics, heat transfer,
fluid flow, electricity, and magnetism. Almost all physical phenomena, whether mechanical, biological,
aerospace, or chemical, can be described using mathematical models [2]. Mathematical models use assumptions
and appropriate axioms to express the features of a physical system. The solution of a physical problem
can be approximated by using engineering analysis and simulation techniques, such as numerical
simulation. With the help of advanced computer technology, computers can perform fast and
accurate calculations on substantial amounts of data and enable intuitive result visualization. Scientific
visualization can illustrate numerical simulation results graphically to enable engineers to understand
and glean insight from their data. There exist a number of numerical simulation software packages, many of
which are based on a WIMP-style (windows, icons, menus, pointers) environment. In the last several
decades, the trend of using innovative and intuitive systems to solve engineering problems has become
increasingly evident.
Augmented reality (AR) has been studied for several decades and can be combined with human
abilities as an efficient and complementary tool to enhance the quality of engineering analysis. An AR
system can overlay computer-generated contents on views of the physical scene, augmenting a user’s
perception and cognition of the world [3]. AR allows the users to continue interacting with both
the virtual and real objects around them. A near real-time interaction with these virtual and real
objects enables a user to judge multiple parameters simultaneously and analyze the problem efficiently.
A complete AR system should include three main elements, i.e., tracking, registration, and visualization.
AR technology with precise, real-time information augmentation is a foreseeable reality that can be
applied in almost any domain. Over the past decade, AR has undergone a transition from desktop to
mobile computing. The portability and proliferation of mobile platforms have provided engineers with
convenient access to relevant information in situ.
Integrating AR with engineering problems is a concept that has emerged in recent years.
Improvements in equipment performance have made data processing and near real-time display
possible. AR is capable of providing an immersive and intuitive environment for the user to achieve
near real-time simulation results for problem analysis. Many reviews have been conducted
to summarize the systems in this field. Behzadan et al. [4] presented a review of AR
in architecture and construction simulation, and Barsom et al. [5] provided a systematic review
of AR in medical and surgery-related simulation. Nee et al. [6] reviewed the research on AR
applications in manufacturing operations. Although many relevant works are mentioned
in those reviews, there are no rigorous review papers that focus on AR in engineering analysis and
simulation. Therefore, the objective of this paper is to fill this gap by providing a state-of-the-art
summary of mainstream studies of AR in engineering analysis and simulation. The remainder of this
review paper is organized as follows. Section 2 provides an overview of computer-aided technologies
in engineering applications. A statistical survey is included in this section for reference. Section 3
highlights areas of research concentration and paucity with a summary table. The techniques
used for AR-based engineering analysis and simulation are summarized in Section 4. Finally, Section 5
concludes this review and discusses possible trends in this field.
Virtual reality (VR) technologies have been employed by researchers to achieve an immersive
and interactive environment. Various VR-based visualization and interaction techniques have been
developed. VR applications using numerical simulation began in the 1990s. Several researchers [12,13]
focused on using the VR environment for finite element analysis (FEA) result visualization. Scherer and
Wabner [14] proposed a system for structural and thermal analysis. Their method visualizes FEA results
with a three-dimensional glyph. Another glyph-based simulation result visualization system was
proposed by Neugebauer et al. [15], in which stresses can be displayed using 3D glyphs. Buchau [16]
introduced a VR-based numerical simulation system, which integrates a COMSOL Multiphysics
solver for efficient computation. The post-processing of simulation data plays a vital role, as accurate
visualization of simulation data could improve user experience in the VR environment. By using
the Visualization Toolkit [17], the computation results can be visualized with the interaction function
provided [18,19]. Recently, deformation simulation has been conducted by several researchers [20–22].
Some of the studies use artificial neural network (ANN) and other approximation methods [23] to
achieve real-time solutions. Interaction methods have been studied to utilize the simulation results
provided in a VR environment so as to improve the efficiency of the analysis [24] and the design
process [25]. Even though VR systems can provide visualization of the engineering analysis and
simulation results in an intuitive and efficient way, there are still limitations. First, establishing a
virtual environment with all the information involved is difficult as the detailed physical models
and properties of the surrounding objects should be defined precisely. Second, there is no physical
relationship between a user and the virtual content, such that the user has no influence on the
environment; this reduces the sense of immersion experienced by the user. Furthermore, the
equipment for an immersive VR-based system is not cost-effective and can cause ergonomic problems,
such as nausea during use.
Vision-based tracking, on the other hand, is accurate but relatively slow. Current vision-based tracking
can be categorized into two methods, namely, marker-based tracking [30] and marker-less tracking.
The two tracking methods complement each other, and researchers have started to develop hybrid
methods to achieve a more robust tracking solution. Human-computer interaction can be achieved
using additional accessories, tangible user interfaces, hand gestures, and attached sensors [27].
(Two bar charts not reproduced: yearly figures for 2004–2017.)
Table 1. Research area and purpose of AR-based engineering analysis and simulation.
Table 6 summarizes the features and limitations of most of the AR-based engineering analysis
and simulation systems. Selected studies use different visualization methods, such as image overlay,
OpenGL programming, and specialized software kits, to visualize volumetric data and numerical
simulation results. A relatively stable tracking and registration module is also included. However,
the clear majority of current systems have some common limitations as well. Most of the AR systems
are designed for one specific scenario only, such as laparoscopic surgery [59]. In addition, the virtual
contents are mostly pre-calculated and hardcoded in the systems. Moreover, selected studies support
only one platform instead of multiple platforms; this lack of scalability restricts the application range
of these systems. Furthermore, most of the studies use AR as a visualization tool, and the possibility
of interacting with the simulation results is neglected.
Table 6. Common characteristics and limitations of current AR-based engineering analysis and
simulation systems.
Feature: Robust tracking performance is required for high-precision engineering operations.
Limitation: Designed for one specific scenario, with the pre-defined model hardcoded.

Feature: Efficient visualization tools are implemented for near real-time display.
Limitation: Mainly developed on one platform only; the lack of multi-platform support limits the usage of the system.

Feature: Accurate registration of computer-generated volumetric data and numerical simulation results on the real scene.
Limitation: Most systems lack an effective and intuitive interaction method; the system is only used for visualizing the results.
Figure 3 summarizes the research areas, research purposes, analysis and simulation methods, and
data types encountered in current AR-based engineering analysis and simulation systems. With the
development of computer technology, AR can be utilized to facilitate engineering analysis and
numerical simulation in both visualization and interaction. Tracking is one of the basic components of
an AR system; Table 7 shows the tracking techniques used in selected studies. Tracking techniques
can be divided into three categories, namely, marker-based tracking, marker-less tracking, and
GPS/sensor-fusion-based tracking. In addition, result updating ability has been one of the research
trends in recent years. Table 8 summarizes the simulation result updating methods in selected studies.
Some of the reported studies, such as [32,43,46,70,74,75], used pre-defined simulation results.
Systems based on pre-defined results are limited in functionality and lack flexibility and scalability.
In addition to pre-defined simulation results, result updating can be achieved in three different ways.
First, users can update the result through manual input. Second, the parameters can be updated using
computer vision techniques; for example, the deformation of tissues [58] and elastic objects [43] can be
tracked and analyzed with image processing. Third, the integration of sensor networks and solution
modules [27,41,42,46] enables near real-time result updates, which provides more possibilities for
AR-based systems. Sensor networks normally consist of three parts, namely, sensor nodes, gateways,
and client processing software. Real-time load data can be captured using sensor nodes and processed
with client software. Appropriate sensors should be selected and installed depending on the different
conditions of the applications.
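As an illustration of this three-part structure, the following minimal Python sketch shows sensor nodes pushing readings through a gateway queue to client software; the node names and load values are hypothetical, and the solver call is only a stub standing in for an actual FEA re-solve and AR view refresh.

```python
import queue
import random
import threading
import time

# Minimal sketch of the three parts described above:
# sensor nodes -> gateway -> client processing software.
gateway = queue.Queue()  # the gateway buffers readings from all nodes

def sensor_node(node_id):
    """A sensor node samples a (simulated) load value and pushes it to the gateway."""
    for _ in range(3):
        gateway.put({"node": node_id, "load_n": random.uniform(0, 500)})
        time.sleep(0.1)

def client_software():
    """The client consumes readings and would trigger a result update."""
    for _ in range(6):
        reading = gateway.get()
        # Here an FEA solver would be re-run with the new boundary load,
        # and the AR view would be refreshed with the new result.
        print(f"node {reading['node']}: load {reading['load_n']:.1f} N")

threads = [threading.Thread(target=sensor_node, args=(i,)) for i in (1, 2)]
for t in threads:
    t.start()
client_software()
for t in threads:
    t.join()
```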
Figure 3. The categories of research areas, research purposes, methods, and data types in AR-based
engineering analysis and simulation.
Table 7. Tracking methods in selected studies.

Table 8. Simulation result updating methods in selected studies.

Pre-defined: [31,33–36,55,59,60,62,65,66,69,70,73,74]
Manual input: [37,49,50,61,67,68]
Dynamic update, image processing: [43–45,47,48,51,53,54,56–58,60,64,68,71,77]
Dynamic update, sensors: [27,32,40–42,63,72]
The popularity and portability of phablet devices have promoted the development of mobile AR
platforms for outdoor and on-site engineering tasks, as shown in Table 9. Most reported AR systems
are still developed for indoor applications [41,43,74,77]. Outdoor AR systems could serve as a tool
to assist in important decisions without constraining the user’s whereabouts to a specially equipped
area. Some of the selected studies are based on a client-server network model for visualization of
simulation data [33–36,38–40,45,47,48,55,60,69,75]. However, due to limited device capabilities,
mobile AR platforms still require further development for engineering analysis and simulation.
The technologies to be developed for mobile AR systems include visualization and interaction methods.
A detailed discussion on the techniques used for AR-based engineering analysis and simulation is
summarized in Section 4.
Table 9. Platforms of selected studies.
4. Techniques Used for AR Applications in Engineering Analysis and Simulation
Different from common AR applications in other fields, AR applications in engineering analysis
and simulation require robust tracking and visualization performance. The characteristics of
engineering scenarios (e.g., lighting variation, poorly textured objects, marker incompatibility, etc.)
have posed difficulties to most of the AR techniques available [78]. This section aims to provide a
discussion on the techniques used for AR applications in engineering analysis and simulation.
4.1. Tracking
Tracking in AR technology refers to the dynamic sensing and measuring of spatial properties.
Most reported studies used the marker-based tracking method. Marker-based tracking has been
widely used ever since ARToolKit became available [30]. The advantage of marker-based tracking
is that it is computationally inexpensive and can deliver relatively stable results with a low-resolution camera.
In the research reported by Weidlich et al. [70], the FEA result is superimposed on a black-and-white
fiducial marker that is pasted on the machine. Huang et al. [41,42] implemented a tracking system
based on multiple markers. The multi-marker setup enhances the stability of the tracking performance
by providing a wider range of detection. However, marker-based tracking intrudes on the environment
with artificial markers, and for engineering applications the resulting visual clutter
should be avoided.
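As an illustration of why marker-based pose estimation is computationally inexpensive, the following Python sketch uses OpenCV's ArUco module as a stand-in for ARToolKit-style fiducials; it is not the setup of any cited study, and the camera intrinsics are hypothetical placeholders. The camera pose follows from a single planar PnP solve on the four detected corners.

```python
import cv2
import numpy as np

# Hypothetical camera intrinsics; in practice these come from calibration.
K = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0, 0.0, 1.0]])
dist = np.zeros(5)            # assume negligible lens distortion
MARKER_SIZE = 0.05            # marker edge length in metres

# 3D corners of a square marker centred at the origin of its own frame.
obj_pts = (MARKER_SIZE / 2) * np.array(
    [[-1, 1, 0], [1, 1, 0], [1, -1, 0], [-1, -1, 0]], dtype=np.float32)

# ArUco detector (OpenCV >= 4.7 API).
detector = cv2.aruco.ArucoDetector(
    cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50))

def pose_from_marker(frame):
    """Return (rvec, tvec) of the camera relative to the first detected marker."""
    corners, ids, _ = detector.detectMarkers(frame)
    if ids is None:
        return None
    # Four 2D-3D correspondences are enough for a planar PnP solve.
    ok, rvec, tvec = cv2.solvePnP(obj_pts, corners[0].reshape(4, 2), K, dist)
    return (rvec, tvec) if ok else None
```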
Marker-less tracking aims to use natural features to determine the camera pose. The natural
features are captured in every frame without relying on the previous frames. The interest points in
the frame are represented by descriptors [79], which can be used to match with the descriptors in
the tracking model. The research community has devoted significant efforts to feature detection.
Shi [80] stated that the right features are those that can be matched reliably. In an engineering
scenario, this means that the working area around the interest points should be visually distinct
and sufficiently textured. Different interest point detectors have been evaluated by researchers
worldwide [81,82]. A descriptor is created after an interest point has been selected.
Descriptors should capture the texture of the local neighborhood while being relatively invariant
to changes in scale, rotation, illumination, etc. Comparisons of different descriptors have been
provided [81,83,84]. Recently, natural feature tracking has also been implemented in outdoor tracking.
In outdoor tracking, the natural feature based method has been enhanced with additional measures,
such as built-in sensors in phablet devices, to make the solution robust and scalable. Koch et al. [85]
proposed a distributed mobile AR tracking framework for industrial applications. Ufkes et al. [86]
presented an end-to-end mobile AR tracking pipeline with a testing frame rate of nearly 30 Hz. Recently,
Yu et al. [87,88] provided a hybrid solution for tracking planar surfaces in an outdoor environment
with a client-server architecture. Similar tracking systems based on a client-server framework have also
been proposed [89–91]. The impact of outdoor mobile AR based tracking technology is summarized
in [92]. Outdoor tracking must address the additional challenge of searching a large localization
database. With the rapid development of phablet and HMD devices, future work should investigate
the integration of current techniques on handheld and HMD devices so that mobile AR
systems can adapt to complex outdoor scenes.
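The detect-describe-match cycle described above can be sketched with OpenCV's ORB detector, chosen here purely for illustration (it is not one of the specific detectors evaluated in [81,82]); RANSAC-filtered matches against a planar tracking model yield a homography mapping the model into the current frame.

```python
import cv2
import numpy as np

orb = cv2.ORB_create(nfeatures=1000)  # binary descriptors: fast to match
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)

def match_to_model(model_img, frame):
    """Match natural features in the current frame against a tracking model."""
    kp1, des1 = orb.detectAndCompute(model_img, None)
    kp2, des2 = orb.detectAndCompute(frame, None)
    if des1 is None or des2 is None:
        return None
    matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)
    if len(matches) < 10:          # too few reliable correspondences
        return None
    src = np.float32([kp1[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([kp2[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
    # RANSAC rejects outlier matches; H maps the model plane into the frame.
    H, mask = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
    return H
```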
Current mobile devices, such as smartphones and tablets, are equipped with an array of sensors,
including global positioning system (GPS), wireless networking, and IMU sensors. GPS estimates
the 3D position of the users. However, the accuracy of the measurement can vary from 1 m to 100 m.
Higher accuracy can be achieved with differential GPS (DGPS), which uses an additional correction
signal from ground stations. In addition to DGPS, real-time kinematics GPS (RTKGPS) further improves
the accuracy by measuring the phase of the signal. For smartphones and tablet devices, the accuracy
of the embedded GPS is normally within the range of 1 m to 5 m. Wireless networks, such as WiFi,
Bluetooth, and mobile networks, can also be used to determine position. Wireless-network-based
tracking is achieved by using the identifier assigned by the base station, and its accuracy is
determined by the strength of the signal. IMU-based tracking
uses magnetometer, gyroscope, and accelerometer to determine the position of the users. This sensor
fusion technique is normally combined with optical tracking and GPS in outdoor use.
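A minimal sketch of such sensor fusion, reduced to a single axis and assuming raw gyroscope and accelerometer samples, is the complementary filter below; production systems typically use full-state Kalman filtering instead, and all parameter values here are illustrative.

```python
import math

class ComplementaryFilter:
    """1-axis orientation estimate fusing gyroscope and accelerometer.

    The gyroscope is accurate over short intervals but drifts; the
    accelerometer gives an absolute (gravity-referenced) but noisy angle.
    Blending the two is the simplest form of IMU sensor fusion.
    """
    def __init__(self, alpha=0.98):
        self.alpha = alpha      # trust placed in the integrated gyro signal
        self.angle = 0.0        # estimated pitch angle in radians

    def update(self, gyro_rate, accel_y, accel_z, dt):
        gyro_angle = self.angle + gyro_rate * dt       # integrate gyro rate
        accel_angle = math.atan2(accel_y, accel_z)     # gravity reference
        self.angle = self.alpha * gyro_angle + (1 - self.alpha) * accel_angle
        return self.angle
```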
Visualization Toolkit (VTK) [96] is an open-source library with various supporting visualization
algorithms and interaction methods. These visualization algorithms and interaction methods have been
widely implemented in visualization tools, such as ParaView, Mayavi, VolView, etc. Bruno et al. [97]
presented a system named VTK4AR, which integrates basic VTK functions into an AR environment.
The CAD model and CFD streamlines can be augmented on real models. VTK4AR offers enormous
convenience to related research on scientific visualization of numerical simulation results.
Huang et al. [27,41,42] used VTK in an AR-based structural analysis system; the interaction functions
provided by VTK were utilized in this system to support volume slicing and clipping. De Pascalis [98]
presented a remote rendering method for mobile devices. The simulation results are generated in the
polygon file format (PLY), also known as the standard triangle format, and the PLY file can be rendered
remotely via VTK. However, the scalability of the system is restricted, as only pre-defined PLY files
can be visualized. Scientific visualization of volumetric data on mobile devices is still an untapped
research area as compared with desktop-based visualization. Figure 5 illustrates the approach of
integrating VTK with AR in current studies [41]. The visualization pipeline consists of several parts,
namely, vtkMappers, vtkActors, vtkRenderer, vtkCamera, and vtkRenderWindow. The images grabbed
by a camera are rendered as virtual objects by using the vtkRenderWindow and vtkRenderer.
The vtkActors represent the physical representation of the data in the rendering window. In order to
register vtkActors in the world coordinate system, the fundamental AR camera information is
transferred into vtkCamera.
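A minimal Python sketch of this pipeline is shown below, assuming VTK's Python bindings; the result file name and camera pose values are hypothetical placeholders, and the live camera background layer used in [41] is omitted for brevity.

```python
import vtk

# Data source: an FEA result stored as a VTK unstructured grid (the file
# name is a placeholder; a vtkPLYReader could stand in for the remotely
# rendered PLY results of [98]).
reader = vtk.vtkUnstructuredGridReader()
reader.SetFileName("fea_result.vtk")

mapper = vtk.vtkDataSetMapper()            # vtkMapper: data -> graphics primitives
mapper.SetInputConnection(reader.GetOutputPort())

actor = vtk.vtkActor()                     # vtkActor: the object in the scene
actor.SetMapper(mapper)

renderer = vtk.vtkRenderer()
renderer.AddActor(actor)

window = vtk.vtkRenderWindow()             # vtkRenderWindow: draws the scene
window.AddRenderer(renderer)

# Registration step: transfer the pose estimated by the AR tracking module
# into vtkCamera so the virtual result lines up with the real object.
# The values below stand in for real tracker output.
camera = renderer.GetActiveCamera()
camera.SetPosition(0.0, 0.0, 0.5)
camera.SetFocalPoint(0.0, 0.0, 0.0)
camera.SetViewAngle(60.0)                  # match the physical camera's FOV

window.Render()
```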
been proposed [104]. The projection-based AR allows users to collaborate without holding any device.
In remote collaboration, off-site users can see the scene being captured and transmitted, and video
streaming is the primary mode of live transmission in remote collaboration. In Boulanger’s work [105],
an AR tele-training system allows remote users to share the view of local users. In the work reported
by Shen et al. [106] and Ong and Shen [107], a system was described for remote users to view a product
model from different perspectives. On-screen annotation allows remote experts to look at the work
scene from any viewpoint and annotate the scene using corresponding tools [108].
4.4. Client-Server Network Architecture

Client-server architecture has been widely used in the AR community for tracking, remote
rendering, collaboration, etc. Simultaneous localization and mapping (SLAM) [109] is a technique
used for tracking an unknown environment. A client-side SLAM system can be integrated with
server-side localization, which takes full advantage of the server’s computational power without
affecting the portability of the mobile client. The concept of using the computational ability of the
server has influenced the development of visualization techniques as well. For visualizing engineering
analysis and simulation results, the general client-server system architecture can be summarized in
Figure 6. The server comprises multiple components and processes commands generated from the
client side. The result rendering module handling the simulation data converts the data into a readable
format for the client side. Although wireless network technology has developed considerably over the
last ten years, its performance still varies depending on the actual outdoor location, and the network
connection may not be stable enough to support remote processing of simulation data.
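The division of labor in this architecture can be sketched as a minimal client-server exchange; the endpoint, message format, and solver stub below are hypothetical and do not reflect the architecture of any cited system.

```python
import json
import socket

HOST, PORT = "0.0.0.0", 9000  # hypothetical endpoint for the render server

def solve_stub(params):
    """Placeholder for the server-side solver / result rendering module."""
    load = params.get("load", 0.0)
    return {"max_displacement": 1.0e-4 * load}  # illustrative output only

def serve_once():
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.bind((HOST, PORT))
        srv.listen(1)
        conn, _ = srv.accept()
        with conn:
            # Assume the client command fits in one recv() for brevity.
            request = json.loads(conn.recv(4096).decode())
            reply = solve_stub(request)            # convert to readable format
            conn.sendall(json.dumps(reply).encode())

if __name__ == "__main__":
    serve_once()
```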
5. Conclusions and Potential Future Directions
Today’s engineering analysis and simulation software aims to provide an easy-to-use interface
for the users. Augmented reality applications are becoming increasingly common in many different
fields. One of the major advantages of using AR instead of VR is that AR allows users to interact with
real objects in addition to virtual contents in the augmented scene, and can amplify human perception
and cognition of the real world. This paper has presented a state-of-the-art review of research studies
on AR applications in engineering analysis and simulation. Even though many researchers are
working on AR-based engineering analysis, no report has provided a comprehensive review of
those systems. The aim of this paper is to provide an overview of the recent developments in this
field to facilitate further investigation. Numerical simulation methods are powerful tools for engineers
who can perform on-site engineering problem solving with the integration of AR and numerical
analysis and simulation tools. This review starts with an overview of traditional computer-aided
technologies followed by a detailed analysis on selected studies. The technical approaches used
for addressing engineering analysis and simulation problems are discussed, which include tracking,
visualization, interaction, collaboration, and client-server network connection. Tracking techniques
have been investigated in Section 4.1. Sensor fusion techniques, such as using GPS or IMU, are
available ubiquitously in AR systems, but have insufficient accuracy to fully support AR tracking.
In addition to sensor-based tracking, optical tracking has been implemented to improve tracking
performance. Marker-based tracking relies on simple thresholding, through which the pose of the
camera can be estimated easily from the markers. Compared with marker-based tracking, natural feature
tracking can be performed in a scene that is not prepared artificially. Recently, with the development of
mobile computing, outdoor tracking poses additional challenges to AR. Visualization in AR-based
engineering analysis and simulation can be divided into three categories, namely, image overlay, format
conversion, and scientific visualization. Section 4.2 has described related visualization methods in
detail. The integration of VTK with AR has been introduced, as VTK is one of the fundamental
libraries underlying other visualization tools. The basic characteristics of AR-based engineering analysis and
simulation systems can be summarized as:
1. Robust tracking performance in the engineering scenario for enabling accurate registration of
virtual contents;
2. Accurate visualization techniques for numerical simulation results allowing engineers to evaluate
the problems efficiently; and
3. Intuitive interaction methods for volumetric data exploration.
AR is a promising tool for a wide range of engineering application areas. A further advancement
will be the integration of AR with engineering analysis and simulation tools, which has been evaluated
in several studies and applications [39,42,74,79]. In addition to the key research fields and technologies
presented in this paper, some directions for future work could be considered. One of the future
directions is a fully functional mobile AR platform. Current mobile AR solutions are still in their
infancy, as tracking and result visualization performance cannot meet current industrial
needs. Advanced computer vision and visualization technology could enable near real-time display
of numerical simulation results on mobile devices. Another possible direction is the use of sensor
networks and ubiquitous computing. An increasing number of commercial products are controlled
by a system-on-chip instead of traditional controllers. Sensors can be embedded in structures and
products for monitoring and maintenance in an AR environment, and the analysis and simulation data
can be provided for a better understanding of the conditions of the structures and products.
Author Contributions: Wenkai L., A. Y. C. Nee, and S. K. Ong collected and analyzed relevant research papers for
this review and wrote the paper.
Conflicts of Interest: The authors declare no conflict of interest.
References
1. Moaveni, S. Finite Element Analysis Theory and Application with ANSYS. Available online:
https://s3.amazonaws.com/academia.edu.documents/39672343/FINITE_ELEMENT_ANALYSIS.
pdf?AWSAccessKeyId=AKIAIWOWYYGZ2Y53UL3A&Expires=1503551514&Signature=8llCti61A3gvv0%
2BneizhZ%2Bo0egk%3D&response-content-disposition=inline%3B%20filename%3DFINITE_ELEMENT_
ANALYSIS.pdf (accessed on 6 April 2007).
2. Reddy, J.N. An Introduction to the Finite Element Method, 3rd ed.; McGraw-Hill: New York, NY, USA, 2006.
3. Azuma, R.T. A survey of augmented reality. Presence Teleoper. Virtual Environ. 1997, 6, 355–385. [CrossRef]
4. Behzadan, A.H.; Dong, S.; Kamat, V.R. Augmented reality visualization: A review of civil infrastructure
system applications. Adv. Eng. Inform. 2015, 29, 252–267. [CrossRef]
5. Barsom, E.Z.; Graafland, M.; Schijven, M.P. Systematic review on the effectiveness of augmented reality
applications in medical training. Surg. Endosc. 2016, 30, 4174. [CrossRef] [PubMed]
6. Nee, A.Y.C.; Ong, S.K. Virtual and Augmented Reality Applications in Manufacturing; Springer-Verlag: London,
UK, 2004.
7. Dong, F.H. Virtual reality research on vibration characteristics of long-span bridges with considering vehicle
and wind loads based on neural networks and finite element method. Neural Comput. Appl. 2017. [CrossRef]
8. Lian, D.; Oraifige, I.A.; Hall, F.R. Real-time finite element analysis with virtual hands: An introduction.
In Proceedings of the WSCG POSTER, International Conference in Central Europe on Computer Graphics,
Visualization and Computer Vision, Plzen, Czech Republic, 2–6 February 2004.
9. Quesada, C.; González, D.; Alfaro, I.; Cueto, E.; Huerta, A.; Chinesta, F. Real-time simulation techniques
for augmented learning in science and engineering. Vis. Comput. Int. J. Comput. Graph. 2016, 32, 1465–1479.
[CrossRef]
10. Ferrise, F.; Bordegoni, M.; Marseglia, L.; Fiorentino, M.; Uva, A.E. Can Interactive Finite Element Analysis
Improve the Learning of Mechanical Behavior of Materials? A Case Study. Comput. Aided Des. Appl. 2015, 12,
45–51. [CrossRef]
11. Rose, D.; Bidmon, K.; Ertl, T. Intuitive and Interactive Modification of Large finite Element models. Available
online: http://www.visus.uni-stuttgart.de/uploads/tx_vispublications/rbevis04.pdf (accessed on 18 July 2017).
12. Yagawa, G.; Kawai, H.; Yoshimura, S.; Yoshioka, A. Mesh-invisible finite element analysis system in a virtual
reality environment. Comput. Model. Simul. Eng. 1996, 3, 289–314.
13. Yeh, T.P.; Vance, J.M. Combining MSC/NASTRAN, sensitivity methods, and virtual reality to facilitate
interactive design. Finite Elem. Anal. Des. 1997, 26, 161–169. [CrossRef]
14. Scherer, S.; Wabner, M. Advanced visualization for finite elements analysis in virtual reality environments.
Int. J. Interact. Des. Manuf. 2008, 2, 169–173. [CrossRef]
15. Neugebauer, R.; Weidlich, D.; Scherer, S.; Wabner, M. Glyph based representation of principal stress tensors
in virtual reality environments. Prod. Eng. 2008, 2, 179–183. [CrossRef]
16. Buchau, A.; Rucker, W.M. Analysis of a Three-Phase Transformer using COMSOL Multiphysics and a Virtual
Reality Environment. In Proceedings of the 2011 COMSOL Conference, Stuttgart, Germany, 26–28 October 2011.
17. Avila, L.S.; Barre, S.; Blue, R.; Geveci, B.; Henderson, A.; Hoffman, W.A.; King, B.; Law, C.C.; Martin, K.M.;
Schroeder, W.J. The VTK User’s Guide, 5th ed.; Kitware: New York, NY, USA, 2010.
18. Hafner, M.; Schoning, M.; Antczak, M.; Demenko, A.; Hameyer, K. Interactive postprocessing in 3D
electromagnetics. IEEE Trans. Magn. 2010, 46, 3437–3440. [CrossRef]
19. Schoning, M.; Hameyer, K. Applying virtual reality techniques to finite element solutions. IEEE Trans. Magn.
2008, 44, 1422–1425. [CrossRef]
20. Hambli, R.; Chamekh, A.; Salah, H.B.H. Real-time deformation of structure using finite element and neural
networks in virtual reality applications. Finite Elem. Anal. Des. 2006, 42, 985–991. [CrossRef]
21. Santhanam, A.; Fidopiastis, C.; Hamza-Lup, F.; Rolland, J.P.; Imielinska, C. Physically-based deformation
of high-resolution 3D lung models for augmented reality based medical visualization. Available online:
http://www.felixlup.net/papers/2004_MICCAI_Hamza-Lup.pdf (accessed on 18 July 2017).
22. Tzong-Ming, C.; Tu, T.H. A fast parametric deformation mechanism for virtual reality applications.
Comput. Ind. Eng. 2009, 57, 520–538. [CrossRef]
23. Connell, M.; Tullberg, O. A framework for immersive FEM visualisation using transparent object
communication in a distributed network environment. Adv. Eng. Softw. 2002, 33, 453–459. [CrossRef]
24. Liverani, A.; Kuester, F. Towards Interactive Finite Element Analysis of Shell Structures in Virtual Reality.
In Proceedings of the 1999 International Conference on Information Visualisation, London, UK, 14–16 July 1999.
25. Ingrassia, T.; Cappello, F. VirDe: a new virtual reality design approach. Int. J. Interact. Des. Manuf. 2009, 3,
1–11. [CrossRef]
26. Ong, S.K.; Yuan, M.L.; Nee, A.Y.C. Augmented reality applications in manufacturing: A survey. Int. J.
Prod. Res. 2008, 46, 2707–2742. [CrossRef]
27. Ong, S.K.; Huang, J.M. Structure design and analysis with integrated AR-FEA. CIRP Ann. Manuf. Tech. 2017,
66, 149–152. [CrossRef]
28. Zhou, F.; Duh, H.B.L.; Billinghurst, M. Trends in augmented reality tracking, interaction and display:
A review of ten years of ISMAR. In Proceedings of the 7th IEEE/ACM International Symposium on Mixed
and Augmented Reality, Cambridge, UK, 15–18 September 2008.
29. Daponte, P.; De Vito, L.; Picariello, F.; Riccio, M. State of the art and future developments of the Augmented
Reality for measurement applications. Measurement 2014, 57, 53–70. [CrossRef]
30. Kato, H.; Billinghurst, M. Marker tracking and HMD calibration for a video-based augmented reality
conferencing system. In Proceedings of the 2nd IEEE and ACM International Workshop on Augmented
Reality, San Francisco, CA, USA, 20–21 October 1999.
31. Salah, Z.; Preim, B.; Rose, G. An approach for enhanced slice visualization utilizing augmented reality:
Algorithms and applications. In Proceedings of the 3rd Palestinian International Conference on Computer
and Information Technology (PICCIT), Palestine Polytechnic University, 9–11 March 2010.
32. Sutherland, C.; Hashtrudi-Zaad, K.; Sellens, R.; Abolmaesumi, P.; Mousavi, P. An augmented reality haptic
training simulator for spinal needle procedures. IEEE Trans. Biomed. Eng. 2013, 60, 3009–3018. [CrossRef]
[PubMed]
33. Carmo, M.B.; Ana, P.C.; António, F.; Ana, P.A.; Paula, R.; Cristina, C.; Miguel, C.B.; Jose, N.P. Visualization of
solar radiation data in augmented reality. In Proceedings of the IEEE International Symposium on Mixed
and Augmented Reality (ISMAR), Munich, Germany, 10–12 September 2014.
34. Carmo, M.B.; Cláudio, A.P.; Ferreira, A.; Afonso, A.P.; Redweik, P.; Catita, C.; and Meireles, C. Augmented
reality for support decision on solar radiation harnessing. In Proceedings of the Computação Gráfica e
Interação (EPCGI), Covilhã, Portugal, 24–25 November 2016.
35. Heuveline, V.; Ritterbusch, S.; Ronnas, S. Augmented reality for urban simulation visualization.
In Proceedings of the First International Conference on Advanced Communications and Computation,
Barcelona, Spain, 23–28 October 2011.
36. Ritterbusch, S.; Ronnås, S.; Waltschläger, I.; Heuveline, V. Augmented reality visualization of numerical
simulations in urban environments. Int. J. Adv. Syst. Meas. 2013, 6, 26–39.
37. Broll, W.; Lindt, I.; Ohlenburg, J.; Wittkämper, M.; Yuan, C.; Novotny, T.; Strothman, A. Arthur:
A collaborative augmented environment for architectural design and urban planning. J. Virtual Real. Broadcast.
2004, 1, 1–10.
38. Fukuda, T.; Mori, K.; Imaizumi, J. Integration of CFD, VR, AR and BIM for design feedback in a design
process-an experimental study. In Proceedings of the 33rd International Conference on Education and
Research in Computer Aided Architectural Design Europe (eCAADe33), Oulu, Finland, 22–26 August 2015.
39. Yabuki, N.; Furubayashi, S.; Hamada, Y.; Fukuda, T. Collaborative visualization of environmental simulation
result and sensing data using augmented reality. In Proceedings of the International Conference on
Cooperative Design, Visualization and Engineering, Osaka, Japan, 2–5 September 2012.
40. Bernasconi, A.; Kharshiduzzaman, M.; Anodio, L.F.; Bordegoni, M.; Re, G.M.; Braghin, F.; Comolli, L.
Development of a monitoring system for crack growth in bonded single-lap joints based on the strain field
and visualization by augmented reality. J. Adhes. 2014, 90, 496–510. [CrossRef]
41. Huang, J.M.; Ong, S.K.; Nee, A.Y.C. Real-time finite element structural analysis in augmented reality.
Adv. Eng. Softw. 2015, 87, 43–56. [CrossRef]
42. Huang, J.M.; Ong, S.K.; Nee, A.Y.C. Visualization and interaction of finite element analysis in augmented
reality. Comput. Aided Des. 2017, 84, 1–14. [CrossRef]
43. Paulus, C.J.; Haouchine, N.; Cazier, D.; Cotin, S. Augmented reality during cutting and tearing of deformable
objects. In Proceedings of the 2015 IEEE International Symposium on Mixed and Augmented Reality
(ISMAR), Fukuoka, Japan, 29 September–3 October 2015.
44. Fiorentino, M.; Monno, G.; Uva, A. Interactive “touch and see” FEM Simulation using Augmented Reality.
Int. J. Eng. Educ. 2009, 25, 1124–1128.
45. Fiorentino, M.; Monno, G.; Uva, A. Tangible Interfaces for Augmented Engineering Data
Management. Available online: https://www.intechopen.com/books/augmented-reality/tangible-
interfaces-for-augmented-engineering-data-management/ (accessed on 1 January 2010).
46. Niebling, F.; Griesser, R.; Woessner, U. Using Augmented Reality and Interactive Simulations to Realize
Hybrid Prototypes. Available online: https://www.researchgate.net/profile/Uwe_Woessner/publication/
220844660_Using_Augmented_Reality_and_Interactive_Simulations_to_Realize_Hybrid_Prototypes/
links/0c96052a9c0905da4e000000.pdf (accessed on 18 July 2017).
47. Uva, A.E.; Cristiano, S.; Fiorentino, M.; Monno, G. Distributed design review using tangible augmented
technical drawings. Comput. Aided Des. 2010, 42, 364–372. [CrossRef]
48. Uva, A.E.; Fiorentino, M.; Monno, G. Augmented reality integration in product development. In Proceedings
of the International conference on Innovative Methods in Product Design (IMProVe 2011), Venice, Italy, 15–17
June 2011.
49. Valentini, P.P.; Pezzuti, E. Design and interactive simulation of cross-axis compliant pivot using dynamic
splines. Int. J. Interact. Des. Manuf. 2013, 7, 261–269. [CrossRef]
50. Valentini, P.P.; Pezzuti, E. Dynamic splines for interactive simulation of elastic beams in augmented reality.
In Proceedings of the IMPROVE 2011 International Congress, Venice, Italy, 15–17 June 2011.
51. Ibáñez, M.B.; Di Serio, Á.; Villarán, D.; Kloos, C.D. Experimenting with electromagnetism using augmented
reality: Impact on flow student experience and educational effectiveness. Comput. Educ. 2014, 71, 1–13.
[CrossRef]
52. Mannuß, F.; Rubel, J.; Wagner, C.; Bingel, F.; Hinkenjann, A. Augmenting magnetic field lines for school
experiments. In Proceedings of the 2011 10th IEEE International Symposium on Mixed and Augmented
Reality (ISMAR), Basel, Switzerland, 26–29 October 2011.
53. Matsutomo, S.; Mitsufuji, K.; Hiasa, Y.; Noguchi, S. Real time simulation method of magnetic field for
visualization system with augmented reality technology. IEEE Trans. Magn. 2013, 49, 1665–1668. [CrossRef]
54. Matsutomo, S.; Miyauchi, T.; Noguchi, S.; Yamashita, H. Real-time visualization system of magnetic field
utilizing augmented reality technology for education. IEEE Trans. Magn. 2012, 48, 531–534. [CrossRef]
55. Liao, H.; Inomata, T.; Sakuma, I.; Dohi, T. Three-dimensional augmented reality for MRI-guided surgery
using integral videography auto stereoscopic-image overlay. IEEE Tran Biomed. Eng. 2010, 57, 1476–1486.
[CrossRef] [PubMed]
56. Haouchine, N.; Dequidt, J.; Berger, M.O.; Cotin, S. Single view augmentation of 3D elastic objects.
In Proceedings of the IEEE International Symposium on Mixed and Augmented Reality (ISMAR), Munich,
Germany, 10–12 September 2014.
57. Haouchine, N.; Dequidt, J.; Kerrien, E.; Berger, M.O.; Cotin, S. Physics-based augmented reality for 3D
deformable object. In Proceedings of the Eurographics Workshop on Virtual Reality Interaction and Physical
Simulation, Darmstadt, Germany, 6–7 December 2012.
58. Haouchine, N.; Dequidt, J.; Peterlik, I.; Kerrien, E.; Berger, M.O.; Cotin, S. Image-guided simulation of
heterogeneous tissue deformation for augmented reality during hepatic surgery. In Proceedings of the IEEE
International Symposium on Mixed and Augmented Reality (ISMAR), Adelaide, Australia, 1–4 October 2013.
59. Kong, S.H.; Haouchine, N.; Soares, R.; Klymchenko, A.; Andreiuk, B.; Marques, B.; Marescaux, J. Robust
augmented reality registration method for localization of solid organs' tumors using CT-derived virtual
biomechanical model and fluorescent fiducials. Surg. Endosc. 2017, 31, 2863–2871. [CrossRef] [PubMed]
60. Tawara, T.; Ono, K. A framework for volume segmentation and visualization using Augmented Reality.
In Proceedings of the 2010 IEEE Symposium on 3D User Interfaces (3DUI), Waltham,
MA, USA, 20–21 March 2010.
61. Kaladji, A.; Dumenil, A.; Castro, M.; Cardon, A.; Becquemin, J.P.; Bou-Saïd, B.; Haigron, P. Prediction of
deformations during endovascular aortic aneurysm repair using finite element simulation. Comput. Med.
Imaging Graph. 2013, 37, 142–149. [CrossRef] [PubMed]
62. Ha, H.G.; Hong, J. Augmented Reality in Medicine. Hanyang Med. Rev. 2016, 36, 242–247. [CrossRef]
63. Clothier, M.; Bailey, M. Augmented reality visualization tool for kings stormwater bridge. In Proceedings
of the IASTED International Conference on Visualization, Imaging and Image Processing, Marballa, Spain,
6–8 September 2004.
64. Underkoffler, J.; Ullmer, B.; Ishii, H. Emancipated pixels: Real-world graphics in the luminous room.
In Proceedings of the 26th annual conference on Computer graphics and interactive techniques, Los Angeles,
CA, USA, 8–13 August 1999.
65. Lakaemper, R.; Malkawi, A.M. Integrating robot mapping and augmented building simulation. J. Comput.
Civil. Eng. 2009, 23, 384–390. [CrossRef]
66. Malkawi, A.M.; Srinivasan, R.S. A new paradigm for Human-Building Interaction: the use of CFD and
Augmented Reality. Autom. Constr. 2005, 14, 71–84. [CrossRef]
67. Golparvar-Fard, M.; Ham, Y. Automated diagnostics and visualization of potential energy performance
problems in existing buildings using energy performance augmented reality models. J. Comput. Civil. Eng.
2013, 28, 17–29. [CrossRef]
68. Ham, Y.; Golparvar-Fard, M. EPAR: Energy Performance Augmented Reality models for identification of
building energy performance deviations between actual measurements and simulation results. Energy Build.
2013, 63, 15–28. [CrossRef]
69. Graf, H.; Santos, P.; Stork, A. Augmented reality framework supporting conceptual urban planning
and enhancing the awareness for environmental impact. In Proceedings of the 2010 Spring Simulation
Multiconference, Orlando, FL, USA, 11–15 April 2010.
70. Weidlich, D.; Scherer, S.; Wabner, M. Analyses using VR/AR visualization. IEEE Comput. Graph. Appl 2008,
28, 84–86. [CrossRef] [PubMed]
71. Issartel, P.; Guéniat, F.; Ammi, M. Slicing techniques for handheld augmented reality. In Proceedings of the
2014 IEEE Symposium on 3D User Interfaces (3DUI), Minneapolis, MN, USA, 29–30 March 2014.
72. Naets, F.; Cosco, F.; Desmet, W. Improved human-computer interaction for mechanical systems design
through augmented strain/stress visualisation. Int. J. Intell. Eng. Inform. 2017, 5, 50–66. [CrossRef]
73. Moreland, J.; Wang, J.; Liu, Y.; Li, F.; Shen, L.; Wu, B.; Zhou, C. Integration of Augmented Reality with
Computational Fluid Dynamics for Power Plant Training. In Proceedings of the International Conference on
Modeling, Simulation and Visualization Methods, Las Vegas, NV, USA, 22–25 July 2013.
74. Regenbrecht, H.; Baratoff, G.; Wilke, W. Augmented reality projects in the automotive and aerospace
industries. IEEE Comput. Graph. Appl. 2005, 25, 48–56. [CrossRef] [PubMed]
75. Weidenhausen, J.; Knoepfle, C.; Stricker, D. Lessons learned on the way to industrial augmented reality
applications, a retrospective on ARVIKA. Comput. Graph. 2003, 27, 887–891. [CrossRef]
76. Buchau, A.; Rucker, W.M.; Wössner, U.; Becker, M. Augmented reality in teaching of electrodynamics. Int. J.
Comput. Math. Electr. Electron. Eng. 2009, 28, 948–963. [CrossRef]
77. Silva, R.L.; Rodrigues, P.S.; Oliveira, J.C.; Giraldi, G. Augmented Reality for Scientific Visualization: Bringing
DataSets inside the RealWorld. In Proceedings of the Summer Computer Simulation Conference (SCSC
2004), Montreal, Québec, Canada, 20–24 July 2004.
78. Engelke, T.; Keil, J.; Rojtberg, P.; Wientapper, F.; Schmitt, M.; Bockholt, U. Content first: A concept for
industrial augmented reality maintenance applications using mobile devices. In Proceedings of the 6th ACM
Multimedia Systems Conference, Portland, OR, USA, 18–20 March 2015.
79. Lowe, D.G. Distinctive image features from scale-invariant keypoints. Int. J. Comput. Vis. 2004, 60, 91–110.
[CrossRef]
80. Shi, J. Good features to track. In Proceedings of the IEEE Computer Society Conference on Computer Vision
and Pattern Recognition, 1994 (CVPR’94), Seattle, WA, USA, 21–23 June 1994.
81. Gauglitz, S.; Höllerer, T.; and Turk, M. Evaluation of interest point detectors and feature descriptors for
visual tracking. Int. J. Comput. Vision 2011, 94, 335–360. [CrossRef]
82. Mikolajczyk, K.; Schmid, C. Scale affine invariant interest point detectors. Int. J. Comput. Vision 2004, 60,
63–86. [CrossRef]
83. Mikolajczyk, K.; Schmid, C. A performance evaluation of local descriptors. IEEE Trans Pattern Anal.
Mach. Intell. 2005, 27, 1615–1630. [CrossRef] [PubMed]
84. Moreels, P.; and Perona, P. Evaluation of features detectors and descriptors based on 3D objects. Int. J.
Comput. Vision 2007, 73, 263–284. [CrossRef]
85. Koch, R.; Evers-Senne, J.F.; Schiller, I.; Wuest, H.; and Stricker, D. Architecture and tracking algorithms for a
distributed mobile industrial AR system. In Proceedings of the 5th International Conference on Computer
Vision Systems (ICVS07), Bielefeld University, Germany, 21–24 March 2007.
86. Ufkes, A.; Fiala, M. A markerless augmented reality system for mobile devices. In Proceedings of the
International Conference on Computer and Robot Vision (CRV2013), Regina, Saskatchewan, Canada, 17–19
May 2013.
87. Yu, L.; Li, W.K.; Ong, S.K.; Nee, A.Y.C. Enhanced Planar Pattern Tracking for an Outdoor Augmented Reality
System. Int. J. Comput. Electr. Autom. Control Inf. Eng. 2017, 11, 125–136.
88. Yu, L.; Ong, S.K.; and Nee, A.Y.C. A tracking solution for mobile augmented reality based on sensor-aided
marker-less tracking and panoramic mapping. Multimed. Tools Appl. 2016, 75, 3199–3220. [CrossRef]
89. Gammeter, S.; Gassmann, A.; Bossard, L.; Quack, T.; and Van Gool, L. Server-side object recognition and
client-side object tracking for mobile augmented reality. In Proceedings of the IEEE Computer Society
Conference on Computer Vision and Pattern Recognition Workshops (CVPRW2010), San Francisco, CA,
USA, 13–18 June 2010.
90. Ha, J.; Cho, K.; Rojas, F.A.; Yang, H.S. Real-time scalable recognition and tracking based on the server-client
model for mobile augmented reality. In Proceedings of the IEEE International Symposium on VR Innovation
(ISVRI2011), Singapore, 19–20 March 2011.
91. Jung, J.; Ha, J.; Lee, S.W.; Rojas, F.A.; and Yang, H.S. Efficient mobile AR technology using scalable recognition
and tracking based on server-client model. Comput. Graph. 2012, 36, 131–139. [CrossRef]
92. Mulloni, A.; Grubert, J.; Seichter, H.; Langlotz, T.; Grasset, R.; Reitmayr, G.; Schmalstieg, D. Experiences with
the impact of tracking technology in mobile augmented reality evaluations. In Proceedings of the MobileHCI
2012 Workshop MobiVis, San Francisco, CA, USA, 21–24 September 2012.
93. Helfrich-Schkarbanenko, A.; Heuveline, V.; Reiner, R.; Ritterbusch, S. Bandwidth-efficient parallel
visualization for mobile devices. In Proceedings of the 2nd International Conference on Advanced
Communications and Computation, Venice, Italy, 21–26 October 2012.
94. Moser, M.; Weiskopf, D. Interactive volume rendering on mobile devices. Vision Model. Vis. 2008, 8, 217–226.
95. Anzt, H.; Augustin, W.; Baumann, M.; Bockelmann, H.; Gengenbach, T.; Hahn, T.; Ritterbusch, S. Hiflow3–A
Flexible and Hardware-Aware Parallel Finite Element Package. Available online: https://journals.ub.uni-heidelberg.de/index.php/emcl-pp/article/view/11675 (accessed on 18 July 2017).
96. Schroeder, W.J.; Lorensen, B.; Martin, K. The Visualization Toolkit: An Object-Oriented Approach to 3D Graphics,
4th ed.; Kitware: New York, NY, USA, 2006.
97. Bruno, F.; Caruso, F.; De Napoli, L.; Muzzupappa, M. Visualization of industrial engineering data in
augmented reality. J. Vis. 2006, 9, 319–329. [CrossRef]
98. De Pascalis, F. VTK Remote Rendering of 3D Laser Scanner Ply files for Android Mobile Devices. Available
online: http://hdl.handle.net/10380/3458 (accessed on 5 May 2014).
99. Augmented Reality Sandbox. Available online: idav.ucdavis.edu/~okreylos/ResDev/SARandbox (accessed
on 14 July 2017).
100. Lukosch, S.; Billinghurst, M.; Alem, L.; Kiyokawa, K. Collaboration in augmented reality. Comput. Support.
Coop. Work 2015, 24, 515–525. [CrossRef]
101. Fuhrmann, A.; Loffelmann, H.; Schmalstieg, D.; Gervautz, M. Collaborative visualization in augmented
reality. IEEE Comput. Graph. Appl. 1998, 18, 54–59. [CrossRef]
102. Rekimoto, J. Transvision: A hand-held augmented reality system for collaborative design. In Proceedings of
the International Conference on Virtual Systems and Multimedia, Gifu, Japan, 18–20 September 1996.
103. Dong, S.; Behzadan, A.H.; Chen, F.; Kamat, V.R. Collaborative visualization of engineering processes using
tabletop augmented reality. Adv. Eng. Softw. 2013, 55, 45–55. [CrossRef]
104. Benko, H.; Wilson, A.D.; Zannier, F. Dyadic projected spatial augmented reality. In Proceedings of the 27th
Annual ACM Symposium on User Interface Software and Technology, Honolulu, HI, USA, 5–8 October 2014.
105. Boulanger, P. Application of augmented reality to industrial tele-training. In Proceedings of the First
Canadian Conference on Computer and Robot Vision, London, ON, Canada, 17–19 May 2004.
106. Shen, Y.; Ong, S.K.; Nee, A.Y.C. Product information visualization and augmentation in collaborative design.
Comput. Aided Des. 2008, 40, 963–974. [CrossRef]
107. Ong, S.K.; Shen, Y. A mixed reality environment for collaborative product design and development.
CIRP Ann. Manuf. Tech. 2009, 58, 139–142. [CrossRef]
108. Gauglitz, S.; Nuernberger, B.; Turk, M.; Höllerer, T. In touch with the remote world: Remote collaboration
with augmented reality drawings and virtual navigation. In Proceedings of the 20th ACM Symposium on
Virtual Reality Software and Technology, Edinburgh, UK, 11–13 November 2014.
109. Tan, W.; Liu, H.; Dong, Z.; Zhang, G.; Bao, H. Robust monocular SLAM in dynamic environments.
In Proceedings of the IEEE International Symposium on Mixed and Augmented Reality (ISMAR2013),
Adelaide, Australia, 1–4 October 2013.
© 2017 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access
article distributed under the terms and conditions of the Creative Commons Attribution
(CC BY) license (http://creativecommons.org/licenses/by/4.0/).