Extended Reality Seminar Report
CHAPTER 1
INTRODUCTION
Augmented reality is the integration of digital information with the user's environment in
real time. Unlike virtual reality, which creates a totally artificial environment, augmented reality
uses the existing environment and overlays new information on top of it.
Extended reality (XR) is a term that encompasses real and virtual environments that are
generated by wearable devices or computer technology to provide an immersive experience. It
can also be described as a collection of all immersive technologies that combine real and virtual
worlds. Extended reality encompasses three main pillars of immersive technology: Virtual
Reality (VR), Augmented Reality (AR), and Mixed Reality (MR).
1.1 History
Extended reality is a term referring to all real-and-virtual combined environments and
human-machine interactions generated by computer technology and wearables. Extended
Reality has evolved over the years. We have tried to capture the timeline of all the key
milestones in the infographic below.
CHAPTER 2
EXTENDED REALITY
2.1 Extended Reality (XR)
Extended reality (XR) encompasses all the technologies, including VR and AR, on the
spectrum going from devices that interact with the real world to the ones that only work with
the virtual world. XR refers to the spectrum of experiences that blurs the line between the real
world and the simulated world. The technology immerses the user through visuals, audio, and
potentially olfactory and haptic cues. XR is an emerging umbrella term for all the immersive
technologies: the ones we already have today, Augmented Reality (AR), Virtual Reality (VR),
and Mixed Reality (MR), plus those that are still to be created. All immersive technologies
extend the reality experience by either blending the virtual and “real” worlds or by creating a
fully immersive experience.
2.2 Augmented Reality (AR)
In augmented reality, virtual information and objects are overlaid on the real world. This
experience enhances the real world with digital details such as images, text, and animation. You
can access the experience through AR glasses or via screens, tablets, and smartphones. This
means users are not isolated from the real world and can still interact and see what’s going on
in front of them. The most well-known examples of AR are the Pokémon GO game (Fig 2.1),
which overlays digital creatures onto the real world, and Snapchat filters that put digital objects
such as hats or glasses onto your head.
2.3 Virtual Reality (VR)
In contrast to augmented reality, in a virtual reality experience, users are fully immersed
in a simulated digital environment. VR visually takes the user out of their real-world
environment and into a virtual environment, typically using a headset for viewing coupled with
hand-held controllers to navigate the virtual space.
Individuals must put on a VR headset or head-mounted display to get a 360-degree view
of an artificial world that leads their brain into believing they are, e.g., walking on the moon,
swimming under the ocean, or stepping into whatever new world the VR developers created.
The gaming and entertainment industries were early adopters of this technology; however,
companies in several industries, such as healthcare, construction, engineering, and the military,
are finding VR to be very useful.
Currently, standard virtual reality systems use either virtual reality headsets or multi-
projected environments to generate realistic images, sounds, and other sensations that
simulate a user's physical presence in a virtual environment.
A person using virtual reality equipment is able to look around the artificial world, move
around in it, and interact with virtual features or items. The effect is commonly created by VR
headsets consisting of a head-mounted display with a small screen in front of the eyes, but can
also be created through specially designed rooms with multiple large screens.
3
2.4 Mixed Reality (MR)
Mixed reality (MR) is a term used to describe the merging of a real-world environment
and a computer-generated one. Physical and virtual objects may co-exist in a mixed reality
environment and interact in real time.
Mixed reality that incorporates haptics has sometimes been referred to as visuo-haptic
mixed reality.
In mixed reality, digital and real-world objects co-exist and can interact with one another
in real time. This is the latest immersive technology and is sometimes referred to as hybrid
reality. It requires a lot more processing power than VR or AR. Microsoft's HoloLens is a great
example: it allows you to place digital objects into the room you are standing in and gives you
the ability to spin them around or interact with them in any way possible. Companies are
exploring ways they can put mixed reality to work to solve problems, support initiatives, and
make their businesses better.
Mixed reality has been used in applications across fields including design, education,
entertainment, military training, healthcare, product content management, and human-in-the-
loop operation of robots.
CHAPTER 3
DESIGN
3.1 Interaction
Interactions with the application are the pillars for collaboration. The prototype focuses
on interior design and, more specifically, on object manipulation. The planned interactions are
the following:
Object Manipulation: grab, drag, throw, and scale. Object manipulation refers to the
ability to interact with virtual or augmented objects using natural hand gestures or other
input methods. This allows users to pick up, move, rotate, and resize objects, as well as
perform other actions such as throwing, stacking, or squeezing. Object manipulation is
essential for a wide range of XR applications, including gaming, design, and education.
Find Objects: rotate the camera and move around the environment. Finding objects refers
to the ability to identify and locate objects in the XR environment. This can be done
using a variety of techniques, such as voice input, gaze tracking, or object recognition.
Once an object has been identified, its location can be represented by two pieces of
information: its center of mass and its bounding box. This information can be used to
interact with the object in a variety of ways, such as moving it, highlighting it, or
providing additional information about it.
Object manipulation involves grabbing, dragging, throwing, and scaling objects. These
interactions must be synchronized across every user, which might cause conflicts when two
clients interact with the same object at the same time. This issue is normally solved by
implementing an ownership feature that only lets users interact with an object once they have
ownership over it. The request and transfer of ownership solves the issue of users interacting
at the same time with one object, as the sketch below illustrates.
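To make this concrete, the following C# sketch shows how such an ownership gate might look
in Unity with Photon, the networking layer described in Chapter 4. The GrabbableObject class
and its grab/hold entry points are hypothetical illustrations, not code from the prototype;
RequestOwnership and IsMine are standard Photon PUN 2 calls.

using Photon.Pun;
using UnityEngine;

// Hypothetical sketch, not code from the prototype: an interactable that may
// only be manipulated by the client that currently owns its PhotonView.
public class GrabbableObject : MonoBehaviourPun
{
    // Called by the local input system when the user tries to grab this object.
    public void OnGrabAttempt()
    {
        if (!photonView.IsMine)
        {
            // Ask for ownership; Photon arbitrates the request and transfer.
            photonView.RequestOwnership();
        }
    }

    // Called every frame while the user keeps holding the object.
    public void OnHold(Vector3 handPosition)
    {
        // Move the object only after ownership was granted, so two clients
        // can never drive the same object at the same time.
        if (photonView.IsMine)
        {
            transform.position = handPosition;
        }
    }
}

The IsMine check is what ultimately prevents simultaneous manipulation: until the transfer
completes, the second client's input simply has no effect on the object.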
In VR, in order to reach objects in the environment, and to account for the limitations
of physical space, users can teleport around the virtual environment (VE). As an initial
implementation, users would move to pre-defined points by pointing at them. However, since
some objects were almost unreachable, users are now allowed to teleport anywhere within a
pre-defined area. This increased user control over their position, since they only need to point
at the desired location. For pointing, a curved line was used, since it works better in crowded
environments where it is difficult to point at the ground in a straight line; a sketch of such a
pointer follows. In AR, markers are used to determine the current position of users. To move to
another room in the VE, users need to synchronize their position with a different marker, since
each division of the VE has its own marker.
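A curved pointer of this kind is commonly implemented by sampling a parabolic arc from the
controller and casting rays along it until the ground or an obstacle is hit. The Unity C# sketch
below illustrates the idea under that assumption; the class name and parameters are
illustrative, not taken from the prototype.

using UnityEngine;

// Illustrative sketch of a curved teleport pointer: sample a parabolic arc from
// the controller and cast along it until the ground or an obstacle is hit.
public class CurvedTeleportPointer : MonoBehaviour
{
    public float initialSpeed = 6f;  // controls how far the arc reaches
    public float timeStep = 0.05f;   // sampling resolution along the arc
    public int maxSteps = 60;

    // Returns true and a teleport target when the arc hits something.
    public bool TryGetTeleportPoint(Transform controller, out Vector3 target)
    {
        Vector3 position = controller.position;
        Vector3 velocity = controller.forward * initialSpeed;

        for (int i = 0; i < maxSteps; i++)
        {
            // One step of projectile motion: gravity bends the pointer downwards.
            Vector3 next = position + velocity * timeStep;
            velocity += Physics.gravity * timeStep;

            // Cast along the segment so the arc cannot pass through geometry.
            if (Physics.Linecast(position, next, out RaycastHit hit))
            {
                target = hit.point;
                return true;
            }
            position = next;
        }

        target = Vector3.zero;
        return false;
    }
}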
The combination of object manipulation and finding objects as two separate points
allows for a rich and immersive interaction experience in XR applications. For example, a user
could use hand gestures to pick up a virtual object, rotate it to inspect it from all sides, and then
place it down in a specific location. This type of interaction is not possible with traditional input
methods such as a keyboard and mouse.
3.2 Collaboration
Collaborative techniques provide users with tools that facilitate cooperation between
them. A set of collaboration mechanisms is defined that help users cooperate in a shared
environment. These mechanisms are the following:
Preview: Create preview, visualize preview, manipulate preview and accept preview.
3.3 Effects
Several effects were implemented to guide users to the location of the highlighted user or
object. For that, three visual effects were developed to guide them. The effects are the
following:
Arrow: An arrow is displayed on screen pointing in the direction the user must look at
to see the highlight source. The arrow is pink so that it has more contrast with the
environment.
Radar: The perimeter of a circle is displayed on the ground and starts moving towards
the center, which is the highlight origin. The effect helps users understand the position
of the object in the 3D world, since it moves along the environment towards the source.
Transparent Walls: To better visualize the origin of the effect, walls become
transparent around the screen position of the object (see the sketch after this list). This
also helps identify whether the object is currently occluded; if it is not, no walls become
transparent.
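As an illustration of the transparent-walls effect, the sketch below fades any wall that lies
between the camera and the highlight source. It assumes wall objects are tagged "Wall" and
use a material whose shader supports transparency; restoring full opacity afterwards is
omitted for brevity.

using UnityEngine;

// Illustrative sketch of the transparent-walls effect: fade every wall that
// lies between the camera and the highlight source.
public class TransparentWalls : MonoBehaviour
{
    public Transform highlightSource;
    public float fadedAlpha = 0.2f;

    void Update()
    {
        Vector3 origin = Camera.main.transform.position;
        Vector3 toSource = highlightSource.position - origin;

        // Collect everything the line of sight passes through.
        RaycastHit[] hits = Physics.RaycastAll(origin, toSource.normalized,
                                               toSource.magnitude);
        foreach (RaycastHit hit in hits)
        {
            if (hit.collider.CompareTag("Wall"))
            {
                // Make the occluding wall see-through; if nothing is hit,
                // no wall is faded, matching the behaviour described above.
                Renderer wall = hit.collider.GetComponent<Renderer>();
                Color color = wall.material.color;
                color.a = fadedAlpha;
                wall.material.color = color;
            }
        }
    }
}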
These effects help to guide users to the source of the highlight. However, since highlighting users
and objects works as a single notification system, there should be a clear differentiation
between each of the calls.
A highlighted object will display a yellow color that fades over time. The yellow portion
of the object is always visible, even when occluded. Once the yellow has completely faded, the
highlight is over. However, the color only starts to fade once the object is within a certain
percentage of the user’s field of view (FOV). Users can hide the effect by performing a
highlight on that same object, which, instead of sending a notification to other users, cancels
the effect. The same thing happens when receiving a highlight notification for an already
highlighted object: the effect is cancelled.
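A minimal sketch of this fading behaviour is given below, assuming the highlight is applied as
a material tint. The FOV check, fade duration, and colour blend are illustrative choices, and
rendering the tint through occluders is left out.

using UnityEngine;

// Illustrative sketch of the highlight fade: the yellow tint only starts
// fading once the object is within a given fraction of the user's FOV.
public class HighlightEffect : MonoBehaviour
{
    public float fovFraction = 0.5f; // portion of the FOV that triggers the fade
    public float fadeDuration = 3f;  // seconds until the highlight is over
    float intensity = 1f;            // 1 = fully highlighted, 0 = effect over

    void Update()
    {
        Camera cam = Camera.main;
        Vector3 toObject = (transform.position - cam.transform.position).normalized;

        // Fade only while the object lies inside the required portion of the FOV.
        if (Vector3.Angle(cam.transform.forward, toObject)
            < cam.fieldOfView * fovFraction)
        {
            intensity = Mathf.MoveTowards(intensity, 0f,
                                          Time.deltaTime / fadeDuration);
        }

        // Blend from yellow back to the object's base colour (white here for
        // simplicity).
        GetComponent<Renderer>().material.color =
            Color.Lerp(Color.white, Color.yellow, intensity);

        if (intensity <= 0f) enabled = false; // highlight is over
    }

    // Highlighting the same object again (or receiving a second notification
    // for it) simply cancels the effect.
    public void Cancel() { intensity = 0f; }
}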
Finally, to visualize objects in different locations and show other users their ideas of how
the space should look, we propose a previewing feature. Previews allow users to visualize an
object in another location before moving it. A preview is a “ghost” of the original object. It can
be manipulated just like any other interactable object. Upon creation, a preview is attached to
the user’s hand until it is dropped. The creation of a preview is a type of object manipulation,
but instead of moving the original object it creates a new one. The preview object only draws
the edges of the object’s polygons, which allows for the visualization of the interactable while
keeping it distinct from the original.
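The sketch below illustrates one way to create such a ghost in Unity: clone the object, strip
its physics, and swap in an edge-only material. The PreviewCreator class and the ghostMaterial
(an assumed wireframe or edge shader) are hypothetical.

using UnityEngine;

// Hypothetical sketch of preview creation: clone the object as a non-physical
// "ghost", swap in an edge-only material, and attach it to the user's hand.
public class PreviewCreator : MonoBehaviour
{
    public Material ghostMaterial; // assumed wireframe/edge shader

    public GameObject CreatePreview(GameObject original, Transform hand)
    {
        GameObject preview = Instantiate(original, hand.position, hand.rotation);

        // A preview must not collide or fall: it is only a visualization.
        foreach (Collider collider in preview.GetComponentsInChildren<Collider>())
            collider.enabled = false;
        if (preview.TryGetComponent(out Rigidbody body))
            Destroy(body);

        // Draw only the edges so the ghost stays distinct from the original.
        foreach (Renderer renderer in preview.GetComponentsInChildren<Renderer>())
            renderer.material = ghostMaterial;

        // The preview stays attached to the hand until it is dropped.
        preview.transform.SetParent(hand);
        return preview;
    }
}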
3.4 Prototype Design
The development phase yielded some insights into possible challenges that developers will
face when implementing these types of applications. The VR prototype was developed on two
different computers and VR systems: one computer used the HTC Vive and the other used the
Oculus Rift. The AR prototype was developed on a different computer and was tested using
several mobile devices. The project was developed in Unity3D, which provides some of the
needed tools, such as support for VR and AR devices and a physics and graphics engine, among
others. The implementation of the tools defined in the previous section allowed us to assemble
them into a prototype centered on interior design. An AR prototype was developed alongside
the VR one, which allowed for the same type of interactions.
CHAPTER 4
ARCHITECTURE
4.1 Architecture
To develop an application that supports multiple XR devices, the components that depend
on specific systems are separated into isolated scenes. These scenes are loaded into the
application depending on the device, as sketched below. Other components, including the
environment, are placed in a scene that is shared across every device. This makes it possible to
reuse the components that do not require specific devices. The diagram shown in Fig 4.1.1
represents two VR and two AR clients connected to the same server. The dashed lines represent
the communication between them that maintains a synchronized state of the VE. The
components inside each of the boxes are high-level descriptions of the different elements
implemented for the framework, as well as how they interact with each other.
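A minimal Unity sketch of this scene layout follows: the shared scene is always loaded
additively, and a device-specific scene (VR rig or AR rig) is added on top. The scene names are
hypothetical placeholders, and the VR/AR check is a simplification.

using UnityEngine;
using UnityEngine.SceneManagement;
using UnityEngine.XR;

// Illustrative sketch of the scene layout: the shared scene is always loaded,
// and a device-specific scene (VR rig or AR rig) is added on top of it.
public class DeviceSceneLoader : MonoBehaviour
{
    void Start()
    {
        // The environment and other device-agnostic components live here.
        SceneManager.LoadScene("SharedEnvironment", LoadSceneMode.Additive);

        // Simplified device check: load the VR scene when an XR headset is
        // active, otherwise fall back to the AR (mobile) scene.
        string deviceScene = XRSettings.isDeviceActive ? "VRClient" : "ARClient";
        SceneManager.LoadScene(deviceScene, LoadSceneMode.Additive);
    }
}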
Out of the four clients connected to the same server, one must be the master client, which
will be responsible for keeping the environment synchronized between all clients when conflicts
occur. Conflicts may arise, for instance, when two clients try to interact with the same object at
the same time, in which case, the master client will give ownership of that object to one of the
clients.
The networking tool should be flexible enough to allow different configurations of server
setup. Photon Engine is a well-known networking solution that supports Unity. It allows
developers to choose between hosting a server themselves or using Photon’s cloud servers. It
allows for multiple clients, from different devices, to connect to the same room and interact
with a shared environment.
Through Photon Engine, objects can be synchronized and communication can be
established between multiple users. In both the VR and AR prototypes, only the avatars and the
interactables are synchronized across users. However, the actions (marked with the blue color)
are specific to VR systems, since they depend on the input that is coming from them.
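The following sketch shows what the connection flow might look like with PUN 2: connect to
the configured server, join a shared room, and spawn a networked avatar. "DesignRoom" and
"Avatar" are hypothetical asset names; the Photon calls themselves are standard.

using Photon.Pun;
using Photon.Realtime;
using UnityEngine;

// Illustrative sketch of the connection flow: connect to the configured
// server, join a shared room, and spawn a networked avatar.
public class NetworkLauncher : MonoBehaviourPunCallbacks
{
    void Start()
    {
        // Uses whichever server the project is configured for: a self-hosted
        // Photon server or Photon's cloud.
        PhotonNetwork.ConnectUsingSettings();
    }

    public override void OnConnectedToMaster()
    {
        PhotonNetwork.JoinOrCreateRoom("DesignRoom",
            new RoomOptions { MaxPlayers = 4 }, TypedLobby.Default);
    }

    public override void OnJoinedRoom()
    {
        // Only the avatar (and the interactables) are synchronized across users.
        PhotonNetwork.Instantiate("Avatar", Vector3.zero, Quaternion.identity);
    }
}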
4.2 Proposed Solution
The evaluation stage was used to validate the developed prototype. This section focuses
on the VR component evaluation, to understand what collaborative mechanisms worked best
for VR applications. That was accomplished by testing the VR prototype with several users.
Although the connection to the server is remote, the tests were performed in the same space to
facilitate guidance throughout the experience. The information gathered during the tests takes
into consideration the research questions previously defined.
The tests were divided into two stages, corresponding to two levels of interaction with the
application.
In the first stage (S1), users were placed in separate environments to get used to the
application. As soon as users got used to the various interactions, including Tool Selection,
Teleportation, Object Manipulation and the creation, and subsequent acceptance, of Previews,
they moved on to the second stage (S2). In S2, the pair entered a shared environment, different
from the first one.
In S2, one of the users, User 1 (U1), would choose to go first. U1 would search for the
couch that was hidden in the environment. Upon finding it, User 2 (U2) would step inside VR
and wait for a highlight, sent by U1, calling attention to the couch. After receiving the
notification, U2 would start searching for the couch. U1’s task was to help U2 find it, by
performing highlights. These search tasks were timed to be analyzed later on. Afterward, U2
would create a preview of the couch and place it someplace else, which U1 would accept after
receiving a notification of a highlight to the previewed couch. They were then free to explore
the environment. Once they were separated or out of the line of sight of the other, users were
asked to highlight themselves.
CHAPTER 5
IMPLEMENTATION
The implementation of extended reality (XR) can vary depending on the specific type of
XR technology and the intended use case. However, there are some general steps that are
involved in implementing XR:
Identify the use case: Once you have a clear understanding of your use case, you can
start to choose the right XR technology and hardware.
Choose the right hardware: There are a variety of XR headsets and devices available,
each with its own strengths and weaknesses. Consider your budget, the desired level of
immersion, and the specific needs of your use case when choosing the right hardware.
Develop the XR content: Once you have the hardware, you need to develop the XR
content. This can be done in-house or by a third-party developer. If you are developing
the content in-house, you will need to use a specialized XR development platform.
Test and deploy the XR solution: Once the XR content is developed, you need to test
it thoroughly to make sure that it works as expected. Once you are satisfied with the
results, you can deploy the XR solution to your users.
Normally, VR applications use motion controllers (MCs) to interact with the virtual
world. However, since the number of buttons on an MC is not enough to account for the large
number of actions defined above, the actions are separated into several tools.
Tools determine which actions users can perform with each of the MCs. The tools can be
selected through a menu, but only one tool is active for each hand. Since each hand can have a
different tool, users can perform different tasks at the same time, e.g., teleporting with one
hand and grabbing an object with the other. The tools allow developers to map the same input
from the MCs to different behaviors, as the sketch below shows. The only input they cannot
control is the one used for the selection of tools, which is always active.
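The sketch below illustrates this tool mechanism: each hand carries one active tool, and the
same trigger press is routed to a different action depending on it. The class and method names
are illustrative placeholders; the four tools match the ones described in the next paragraph.

using UnityEngine;

// Illustrative sketch of the tool mechanism: each hand carries one active
// tool, and the same trigger press is routed to a different action.
public enum Tool { Manipulate, CreatePreview, AcceptPreview, HighlightSelf }

public class HandController : MonoBehaviour
{
    public Tool activeTool = Tool.Manipulate;
    Tool previousTool = Tool.Manipulate;

    // Always-active input: the tool menu calls this to switch tools.
    public void SelectTool(Tool tool)
    {
        previousTool = activeTool;
        activeTool = tool;
    }

    // Called when this hand's motion-controller trigger is pressed.
    public void OnTriggerPressed()
    {
        switch (activeTool)
        {
            case Tool.Manipulate:    GrabNearestObject(); break;
            case Tool.CreatePreview: CreatePreview();     break;
            case Tool.AcceptPreview: AcceptPreview();     break;
            case Tool.HighlightSelf:
                HighlightSelf();
                activeTool = previousTool; // single action: revert afterwards
                break;
        }
    }

    // Placeholder actions; the actual behaviours are described in Chapter 3.
    void GrabNearestObject() { }
    void CreatePreview() { }
    void AcceptPreview() { }
    void HighlightSelf() { }
}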
For the prototype, there are four tools in total, and they are color-coded to facilitate the
distinction. The blue tool is for manipulating objects, the purple one for creating previews, and
the green one for accepting previews. These three also map input for teleportation and
highlighting objects. Finally, the fourth tool highlights the user. Since this is a single action, it
reverts back to the previous tool after the highlight.
To run the prototype in AR, mobile devices are used. The interactions are done through
the device’s touch screen, which differentiates the experience from VR. AR is helpful for
visualizing data in the real world; in the case of our prototype, this means visualizing a whole
VE over the real environment. AR users can better understand how virtual objects are affected
by the real world. They can also leave arrows in the environment pointing at specific locations
or objects, e.g., to communicate where to place an object.
These differences in interaction make it more difficult to develop XR applications.
However, they are also what enables more extensive experiences, since users can adapt their
interactions with the application by changing devices. These multimodal interfaces allow
different users to collaborate from different perspectives, which can improve the result.
The sensors in the headset track the user's movements and orientation. The XR software
uses this information to render the XR content in real time. The user can then interact with the
XR content by moving their head and body.
The XR software typically uses a number of different techniques to create an immersive
experience for the user. These techniques include:
Stereo rendering: Stereo rendering creates a three-dimensional effect by displaying
slightly different images to each eye.
Head tracking: Head tracking allows the XR software to adjust the viewpoint of the
XR content based on the user's head movements (see the sketch after this list).
World tracking: World tracking allows the XR software to position virtual objects
in the real world.
Interaction physics: Interaction physics allows the user to interact with virtual objects
in a realistic way.
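As a small illustration of head tracking, the Unity sketch below reads the headset pose every
frame and applies it to the camera transform; stereo rendering, world tracking, and physics
are handled elsewhere by the XR runtime. The component name is hypothetical.

using UnityEngine;
using UnityEngine.XR;

// Minimal head-tracking sketch: read the headset pose every frame and apply
// it to the camera so the viewpoint follows the user's head movements.
public class HeadTracking : MonoBehaviour
{
    void Update()
    {
        InputDevice head = InputDevices.GetDeviceAtXRNode(XRNode.Head);

        if (head.isValid &&
            head.TryGetFeatureValue(CommonUsages.devicePosition, out Vector3 position) &&
            head.TryGetFeatureValue(CommonUsages.deviceRotation, out Quaternion rotation))
        {
            // The XR runtime then renders the (stereo) view from this pose.
            transform.SetPositionAndRotation(position, rotation);
        }
    }
}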
XR is still in its early stages of development, but it has the potential to revolutionize many
industries and aspects of our lives. As the technology continues to develop and become more
affordable, we can expect to see even more innovative and groundbreaking XR applications
emerge.
CHAPTER 6
APPLICATIONS
Entertainment and Gaming: XR for entertainment and games may sound light, but this
currently constitutes the number one market segment. Gamers can intimately feel what
their selected scenes would resemble in the flesh, whether crossing into another era or
place, or exploring fantastic futuristic worlds. Consumers can virtually experience live
music and sporting events from the comfort of their VR headsets.
Healthcare: Experts see multiple areas where VR technology can contribute to
healthcare, including mental well-being, physiotherapy, pharmaceutical development,
and education for professionals and patients.
Engineering and Manufacturing: Engineering and manufacturing can sometimes
involve dangerous tasks. The use of augmented reality enables workers to conduct
these actions from a safe distance. For instance, an employee can direct a robot to
perform tasks involving hazardous chemicals.
eCommerce and Retail: AR contributes to both online and offline shopping. In stores,
customers can quickly and easily learn all about products on display. You can, for
instance, instantly find reviews and recommendations just by pointing your camera at
an AR-enabled item. Retailers can also offer discounts through this medium while
uncovering shopping patterns in the data.
Education: XR enables virtual field trips, including to locations that you can’t reach in
person. You can also explore complex scientific topics in extreme detail, for example by
manipulating a 3D model of a molecule. In post-secondary education, XR enables
remote self-paced learning, and richer interconnections make the material more memorable.
Defense: By superimposing virtual data over a view of the real world, military personnel
can use XR technologies to navigate more easily across any terrain in the world; know
the location of friendly troops or reported threats; train and rehearse for anticipated
battle scenarios, and even overlay virtual enemies and obstacles as needed for better
preparation.
CHAPTER 7
MAJOR CHALLENGES IN DEVELOPING XR
Cost: The high cost of XR hardware and software has been a major barrier to entry for
many consumers and businesses. While prices have come down in recent years, XR
headsets and other equipment still remain expensive compared to traditional computing
devices. Additionally, the development of high-quality XR content can be
time-consuming and costly, further adding to the overall cost of XR experiences.
Technology complexity: XR technologies are complex and require a deep
understanding of hardware, software, and human-computer interaction to develop and
implement effectively. The creation of immersive and realistic XR experiences requires
a combination of advanced graphics, motion tracking, and input technologies, which
can be challenging to integrate seamlessly.
Lack of standardization: The absence of standardized platforms and protocols has
hindered the development of interoperable XR experiences. Currently, there are
multiple proprietary XR platforms, each with its own unique hardware and software
requirements. This lack of standardization makes it difficult for developers to create
content that can be easily used across different XR devices and platforms.
User experience: XR products can be difficult to use, especially for people who are not
familiar with the technology. Companies need to develop XR products that are easy to
use and provide a positive user experience. Creating engaging and comfortable XR
experiences is essential for long-term user adoption. However, several factors can
contribute to user discomfort or dissatisfaction with XR experiences, such as motion
sickness, eye strain, and social isolation. Addressing these issues requires a deeper
understanding of human perception and interaction in virtual and augmented
environments.
Privacy and security concerns: XR products collect a lot of data about users and their
environment. Companies need to address privacy and security concerns to ensure that
users trust their XR products.
Overcoming these challenges will require a concerted effort from XR technology
developers, industry experts, policymakers, and researchers. By addressing these issues, XR
has the potential to revolutionize various aspects of our lives, from entertainment and education
to healthcare and manufacturing.
CHAPTER 8
SCOPE OF EXTENDED REALITY
Recent research has revealed that more than 60 percent of respondents believe that
Extended Reality will become mainstream within just the next five years. This shows how
rapidly the technology is being developed, and how willing the public is to adopt it once it is
ready and available in the market.
Indeed, Extended Reality has plenty of uses and could be employed in all kinds of fields,
such as retail, real estate, marketing, training, entertainment, and more. The technology has
the potential to completely change the way we live our everyday lives, as it will alter our very
perception of reality.
A glance at current use cases shows the potential for XR across industries:
Retail: XR gives customers the ability to try before they buy. Watch manufacturer
Rolex has an AR app that allows you to try on watches on your actual wrist, and
furniture company IKEA gives customers the ability to place furniture items into their
home via their smartphone.
Entertainment: XR brings immersive experiences to the entertainment world and
offers consumers an opportunity to virtually experience live music and sporting events
from the comfort of their VR headset. While a majority of market share leans
heavily towards entertainment, it is not the only segment gearing up for a virtual expansion.
Marketing: Virtual realities have opened new ways for brands to engage with consumers,
offering immersive ways to interact with new products.
Training: Extended reality opens new avenues for training and education. People who
work in high-risk conditions, like chemists and pilots, can train safely from a more
conventional classroom setting. Medical students, meanwhile, can get hands-on practice
on virtual patients.
Real Estate: Property managers can streamline the rental process by allowing potential
tenants to view properties virtually, while architects and interior designers can leverage
XR to bring their designs to life.
Remote Work: XR removes distance barriers, allowing remote employees to seamlessly
access data from anywhere in the world.
Extended reality is not without its challenges. The spread of data presents a new layer of
vulnerability for cyber-attacks, while the high cost of implementation is a barrier to entry for
many companies.
CHAPTER 9
ADVANTAGES AND DISADVANTAGES
9.1 Advantages
Companies that apply environments powered by the technology gain valuable benefits, such as:
The provision of an unusual experience. A dive into a radically different reality
allows companies to provide their users with the possibility of visiting places of
interest or experiencing something new without leaving the house.
Efficient information uptake. XR provides its users with a more realistic view of their
subject matter, which allows them to be trained in a more effective manner.
Safe training. Those who need to practice in high-risk conditions, such as military
personnel or chemists, can train safely from conventional classrooms.
Seamless data access. XR removes distance barriers, allowing users to smoothly
access remote data.
9.2 Disadvantages
CHAPTER 10
CONCLUSION