
Augmented and Virtual Reality (B19CS7041)

Unit-1: Introduction to Augmented Reality (AR)

Definition of Augmented Reality:

 Augmented reality (AR) is an interactive experience of a real-world environment where the objects that
reside in the real world are enhanced by computer-generated perceptual information.
 AR can be defined as a system that fulfills three basic features: a combination of real and virtual worlds,
real-time interaction, and accurate 3D registration of virtual and real objects.

Scope of Augmented Reality:

 Augmented reality (AR) today is a little like the World Wide Web in the mid ’90s.
 AR is about using a portable device, such as a smartphone, to add a few extra details to what we see.
Examples include Google Glass and Pokémon Go, both of which received lots of hype — some positive
and some extremely negative — but have since joined the annals of yesterday’s tech fads. Currently the
most popular applications of AR are on Snapchat, where selfie lovers use smart filters to decorate and
animate photos on the fly.
 Currently, augmented reality jobs offer the greatest opportunities for creative professionals. This reflects
the broader user base for AR technology, although virtual reality jobs are also growing at a fast rate.
 Most of the augmented reality jobs available today are best described as existing job titles with an AR
descriptor. Common positions include:
o AR/VR content developer
o AR/VR content strategist
o AR/VR user experience designer
o Designer, animator, or sound artist specializing in AR & VR
o AR/VR community manager
o AR/VR project manager
A Brief History of Augmented Reality:

 Augmented reality in the 60s & 70s:


 1968: Ivan Sutherland, a Harvard professor and computer scientist, created the first head-mounted
display called ‘The Sword of Damocles’.
 1974: Myron Krueger, a computer researcher and artist, built a laboratory at the University of
Connecticut called ‘Videoplace’ that was entirely dedicated to artificial reality. Within these walls,
projection and camera technology were used to display onscreen silhouettes surrounding users for
an interactive experience. https://www.youtube.com/watch?v=d4DUIeXSEpk&t=2s

 Augmented reality in the 80s & 90s:


 1990: Tom Caudell, a Boeing researcher, coined the term ‘augmented reality’.
 1992: Louis Rosenberg, a researcher in the USAF’s Armstrong Research Lab, created ‘Virtual
Fixtures’, which was one of the first fully functional augmented reality systems.

 The system overlaid virtual guides on a workspace, allowing military personnel to virtually control
and guide machinery, and was used to train US Air Force pilots in safer flying practices.
 1994: Julie Martin, a writer and producer, brought augmented reality to the entertainment industry
for the first time with the theater production titled Dancing in Cyberspace.
 The show featured acrobats dancing alongside projected virtual objects on the physical stage.
 1998: Sportvision broadcast the first live NFL game with the virtual 1st & Ten graphic system –
aka the yellow yard marker. The technology displays a yellow line overlaid on top of the feed so
that viewers can quickly see how far the team must advance to get a first down.
 This system is still used today, although admittedly more advanced than it was in the late ‘90s.
Viewers have become accustomed to the yellow line marker and other additional graphics – most
don’t even know that this is a form of AR technology. https://www.youtube.com/watch?v=1Oqm6eO6deU
 1999: NASA created a hybrid synthetic vision system for its X-38 spacecraft. The system
leveraged AR technology to provide better navigation during test flights.
 The augmented reality component displayed map data right on the pilot’s screen.
 Augmented reality in the 2000s & today:
 2000: Hirokazu Kato developed an open-source software library called the ARToolKit. This
package helps other developers build augmented reality software programs. The library uses video
tracking to overlay virtual graphics on top of the real world.
 2003: Sportvision enhanced the 1st & Ten graphic to include the feature on the new Skycam system
– providing viewers with an aerial shot of the field with graphics overlaid on top of it.
 2009: Esquire Magazine used augmented reality in print media for the first time in an attempt to
make the pages come alive. https://www.youtube.com/watch?v=wp2z36kKn0s
 When readers scanned the cover, the augmented reality equipped magazine featured Robert Downey
Jr. speaking to readers.
 2013: Volkswagen debuted the MARTA app (Mobile Augmented Reality Technical Assistance),
which primarily gave technicians step-by-step repair instructions within the service manual.
 This adaptation of AR technology was groundbreaking, as it could be applied to many different
industries to align and streamline processes.
https://www.youtube.com/watch?v=H7RzyjNJH6c
 2014: Google unveiled its Google Glass devices, a pair of augmented reality glasses that users could
wear for immersive experiences.
 Users wore the AR tech and communicated with the Internet via natural language processing
commands. With this device, users could access a variety of applications like Google Maps,
Google+, Gmail, and more.
 2016: Microsoft started shipping its version of wearable AR technology, the HoloLens, which is
more advanced than Google Glass but came with a hefty price tag. It is definitely not an
everyday type of accessory. https://www.youtube.com/watch?v=4p0BDw4VHNo
 The headset runs on Windows 10 and is essentially a wearable computer. It also allows users to scan
their surroundings and create their own AR experiences.
 2017: IKEA released its augmented reality app called IKEA Place that changed the retail industry
forever. https://www.youtube.com/watch?v=UudV1VdFtuQ
 The app allows customers to virtually preview their home decor options before actually making a
purchase.
 Currently, as we become increasingly dependent on our mobile devices, the adoption of
augmented reality technology will continue to rise. AR software advances will lead the way, as
the overwhelming majority of consumers have a smartphone and already take it everywhere with
them, making it a convenient medium to bring AR to nearly every consumer.
 The truth is, augmented reality is already used by everyday consumers – they just don’t know it.
The Snapchat dog filter and others are powered by AR. The biggest shift in augmented reality will
have to be in how it is delivered, to change that perception.
 Wearable tech is slowly becoming the norm and, as this trend continues, people might be more
receptive to AR hardware.

Examples:

 IKEA Mobile App


 Nintendo’s Pokémon Go App
 Google Pixel’s Star Wars Stickers
 Disney Coloring Book
 L’Oréal Makeup App
 Weather Channel Studio Effects
 U.S. Army - Tactical Augmented Reality https://www.youtube.com/watch?v=x8p19j8C6VI&t=2s
Related Fields:

 Medical Training: From operating MRI equipment to performing complex surgeries, AR tech holds the
potential to boost the depth and effectiveness of medical training in many areas.
 Retail: In today's physical retail environment, shoppers are using their smartphones more than ever to
compare prices or look up additional information on products they're browsing. World famous
motorcycle brand Harley-Davidson is one great instance of a brand making the most of this trend,
having developed an AR app that shoppers can use in-store. Users can view a motorcycle they might be
interested in buying in the showroom and customize it using the app to see which colors and features
they might like.
 Architectural Design & Modeling: From interior design to architecture and construction, AR is
helping professionals visualize their final products during the creative process. Headsets enable
architects, engineers, and design professionals to step directly into their buildings and spaces to see how
their designs might look, and even to make virtual on-the-spot changes. Urban planners can even model
how entire city layouts might look using AR headset visualization. Any design or modeling job that
involves spatial relationships is a perfect use case for AR tech.

System Structure of Augmented Reality:

 The blend of direct perception (from the physical world) and computer-mediated perception needs
to happen in real time in order to provide a great AR experience.
 Sensors connect the physical world to the computer-mediated system. A sensor can be a camera,
an accelerometer, GPS, a compass, or a microphone. Sensors therefore form the first building block
of the AR architecture.
 Sensors can be classified into two categories:
o Those measuring a physical property of the environment that is not accessible to a human
sense, e.g., geo-location.
o Those capturing a physical property directly detectable by human senses, e.g., a
camera.
 The next building block is the context analyzer, which analyzes the data produced by the sensors.
This component may have two functions (a minimal sketch of the whole pipeline follows):
o To recognize that certain conditions are met, such as a face being detected by the camera, or the
GPS position of the user falling within a certain range.
o To recognize and track this condition, such as computing the 3D position of the detected face.
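As an illustration of the sensor and context-analyzer flow described above, here is a minimal Python sketch. All names (SensorFrame, ContextAnalyzer, detect_face, estimate_3d_position) are hypothetical placeholders, not part of any real AR framework.

```python
from dataclasses import dataclass

@dataclass
class SensorFrame:
    image: object   # camera pixels: a property directly detectable by human senses
    gps: tuple      # latitude/longitude: a property not accessible to human senses

def detect_face(image):
    """Placeholder for any face detector; returns landmarks or None."""
    return None

def estimate_3d_position(face):
    """Placeholder for pose estimation from detected landmarks."""
    return (0.0, 0.0, 1.0)

class ContextAnalyzer:
    def condition_met(self, frame: SensorFrame) -> bool:
        # Function 1: recognize that a condition holds (e.g., a face is in view).
        return detect_face(frame.image) is not None

    def track(self, frame: SensorFrame):
        # Function 2: track the condition (e.g., the detected face's 3D position).
        face = detect_face(frame.image)
        return estimate_3d_position(face) if face is not None else None

analyzer = ContextAnalyzer()
frame = SensorFrame(image=None, gps=(12.97, 77.59))
print(analyzer.condition_met(frame))   # False with the stub detector
```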

Key Technology in AR:

 Intelligent display technology:


 More than 65% of the information acquired by human beings comes from vision, which makes it
the most intuitive way for humans to interact with the real environment.
 The development of intelligent display technology has made augmented reality possible, and the
many kinds of display devices built on it have pushed AR to new heights.
 Specifically, three main categories of display device occupy an important position in the field of
AR technology today. First, the head-mounted display (HMD), born in 1968: the optical see-through
display developed by Professor Ivan Sutherland made it possible to superimpose simple
computer-constructed graphics on real scenes in real time. In later development, optical see-through
and video see-through head-mounted displays came to constitute the backbone of this category.
 Second, the handheld display: handheld devices are light and small, and with the popularity of
smartphones in particular, they present augmented reality through video see-through.
 Third, other display devices, such as PC desktop monitors, which match the real-world scene
information captured by a camera to a three-dimensional virtual model generated by the computer
and display the combined result on the desktop screen.
 3D registration technology:
 As one of the most critical technologies in an augmented reality system, 3D registration technology
enables virtual images to be superimposed accurately on the real environment.
 The main flow of 3D registration has two steps. First, determine the relationship between the
virtual image or model and the direction and position information of the camera or display
device.
 Second, the rendered virtual image and model are accurately projected into the real environment, so
that the virtual image and model can be merged with it.
 There are various approaches to 3D registration, such as registration based on hardware
trackers, registration based on computer vision, registration based on wireless networks, and
mixed registration technology, among which the former two are the most popular.
 Computer-vision-based 3D registration places reference points in the scene so that the direction
and position of the camera or display relative to the real scene can be determined. A minimal
projection sketch follows this list.
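To make the registration flow concrete, here is a minimal sketch of the second step under a standard pinhole-camera assumption: once registration has determined the camera pose (R, t) and the intrinsics K are known, a virtual 3D point can be projected into the camera image. All numeric values are hypothetical.

```python
import numpy as np

# Project a virtual world-space point into the camera image (pinhole model).
K = np.array([[800.0,   0.0, 320.0],    # focal lengths and principal point (pixels)
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])
R = np.eye(3)                           # rotation: world -> camera (from registration)
t = np.array([0.0, 0.0, 2.0])           # translation: camera 2 m from the anchor

def project(point_world):
    p_cam = R @ point_world + t         # world -> camera coordinates
    uvw = K @ p_cam                     # camera -> homogeneous image coordinates
    return uvw[:2] / uvw[2]             # perspective divide -> pixel coordinates

print(project(np.array([0.0, 0.0, 0.0])))   # anchor at world origin -> [320. 240.]
```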

 Intelligent interaction technology:


 Intelligent interactive technology is closely related to intelligent display technology, 3D
registration technology, ergonomics, cognitive psychology, and other disciplines.
 In AR systems, there is a variety of intelligent interactions, including hardware device
interactions, location interactions, and tag-based or other information-based interactions.
 With the development of intelligent interaction technology, augmented reality not only
superimposes virtual information to real scenes, but also realizes the interaction between people
and virtual objects in real scenes.
 In this interaction, people give specific instructions to a virtual object in the scene, and the
virtual object gives feedback, enabling the audience of the augmented reality application to have
a better experience.

General solution for calculating geometric & illumination consistency in the augmented
environment:

 Light probe methods:


 Methods based on light probes use a special hardware (usually a camera with a fisheye lens) to
capture the illumination in high quality.
 Some approaches capture the illumination in real time and use it for rendering in AR. These
approaches also calculate global illumination in the AR scene, which introduces light
reflections between virtual and real objects. The advantage of these methods is the high visual
fidelity of the rendered result.
 However, they need high computational power and are therefore generally not suitable for mobile
devices. One method for consistent near-field illumination does run on a mobile device:
it renders virtual objects lit consistently with the real world and calculates global
illumination. Its limitation is the requirement of multiple external cameras connected to a
computer, which sends the data to the mobile device via Wi-Fi.
 Light estimation methods:
 One approach estimates the real illumination from shadows: the distribution of illumination is
estimated by analyzing the relationships between image brightness and the occlusion of
incoming light.
 Another method estimates diffuse illumination in real time from arbitrary geometry captured by
an RGB-D camera. It reconstructs the real geometry and the surrounding illumination, which are
then used to render the virtual content in AR with consistent illumination.

 Rendering with natural illumination:


 Once the real light is reconstructed, rendering with natural illumination plays an important role
in achieving a consistent appearance of virtual and real objects.
 An efficient method computes irradiance environment maps using spherical harmonics; it is well
suited to calculating diffuse illumination. A fast approximation of the environment-map
convolution can be achieved by MIP-mapping. A minimal sketch of spherical-harmonic irradiance follows.
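As a concrete illustration, here is a minimal sketch of spherical-harmonic diffuse irradiance in the style of Ramamoorthi and Hanrahan: given the first nine SH coefficients of the captured light, irradiance at a surface normal is a closed-form sum. The coefficient values below are hypothetical.

```python
import numpy as np

# Hypothetical first nine SH coefficients of the environment light (one channel).
L = {(0, 0):  0.79,
     (1,-1):  0.39, (1, 0): -0.34, (1, 1): -0.29,
     (2,-2): -0.11, (2,-1): -0.26, (2, 0): -0.16, (2, 1): 0.56, (2, 2): 0.21}

# Clamped-cosine convolution constants per band l.
A = {0: np.pi, 1: 2.0 * np.pi / 3.0, 2: np.pi / 4.0}

def irradiance(n):
    """Diffuse irradiance for a unit surface normal n = (x, y, z)."""
    x, y, z = n
    Y = {(0, 0): 0.282095,
         (1,-1): 0.488603 * y, (1, 0): 0.488603 * z, (1, 1): 0.488603 * x,
         (2,-2): 1.092548 * x * y, (2,-1): 1.092548 * y * z,
         (2, 0): 0.315392 * (3.0 * z * z - 1.0),
         (2, 1): 1.092548 * x * z, (2, 2): 0.546274 * (x * x - y * y)}
    return sum(A[l] * L[(l, m)] * Y[(l, m)] for (l, m) in L)

print(irradiance((0.0, 0.0, 1.0)))   # irradiance for an upward-facing surface
```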
Unit-1: Introduction to Virtual Reality (VR)

Fundamental Concept of VR:

 Definition of Virtual Reality:


 Virtual reality (VR) is the term used to describe a three-dimensional, computer-generated
environment which can be explored and interacted with by a person. That person becomes part of
this virtual world or is immersed within this environment and, while there, is able to
manipulate objects or perform a series of actions.

 Origin of Virtual Reality:


 The exact origins of virtual reality are disputed, partly because of how difficult it has been to
formulate a definition for the concept of an alternative existence. Elements of virtual reality
appeared as early as the 1860s.
 1938: French avant-garde playwright Antonin Artaud took the view that illusion was not distinct
from reality, advocating that spectators at a play should suspend disbelief and regard the drama on
stage as reality. The first references to the more modern concept of virtual reality came from science
fiction.
 1935: Stanley G. Weinbaum's 1935 short story "Pygmalion's Spectacles" describes a goggle-based
virtual reality system with holographic recording of fictional experiences, including smell and touch.
 1962: Morton Heilig wrote in the 1950s of an "Experience Theatre" that could encompass all the
senses in an effective manner, thus drawing the viewer into the onscreen activity. He built a
prototype of his vision dubbed the Sensorama in 1962, along with five short films to be displayed in
it while engaging multiple senses (sight, sound, smell, and touch). Predating digital computing, the
Sensorama was a mechanical device.
 1968: Ivan Sutherland, with the help of his student Bob Sproull, created what was widely considered
to be the first head-mounted display (HMD) system for use in immersive simulation applications. It
was primitive both in terms of user interface and realism, and the HMD to be worn by the user was
so heavy that it had to be suspended from the ceiling. The graphics comprising the virtual
environment were simple wire-frame model rooms. The formidable appearance of the device
inspired its name, The Sword of Damocles.
 1978: The Aspen Movie Map was created at MIT. The program was a crude virtual simulation of
Aspen, Colorado, in which users could wander the streets in one of three modes: summer, winter,
and polygons.
 1980s: The term "virtual reality" was popularized by Jaron Lanier, one of the modern pioneers of the
field. Lanier founded the company VPL Research in 1985. VPL Research developed several
VR devices like the Data Glove, the EyePhone, and the AudioSphere. VPL licensed the Data Glove
technology to Mattel, which used it to make an accessory known as the Power Glove. While the
Power Glove was hard to use and not popular, at US$75 it was an early affordable VR device.
 1988: Star Trek - The Next Generation introduces “The Holodeck”.
 1989: VPL Research developed the DataSuit, a full-body outfit with sensors for measuring the movement of
arms, legs, and trunk. It was displayed at the Nissho Iwai showroom in Tokyo.
 1991: Carolina Cruz-Neira, Daniel J. Sandin and Thomas A. DeFanti from the Electronic
Visualization Laboratory created the first cubic immersive room, The Cave. Developed as Cruz-
Neira's PhD thesis, it involved a multi-projected environment, similar to the holodeck, allowing
people to see their own bodies in relation to others in the room.
 1993: Sega announced the Sega VR prototype for its Mega Drive console. It used LCD screens in the visor,
stereo headphones, and inertial sensors that allowed the system to track and react to the movements
of the user's head. In the same year, Virtuality launched and went on to become the first mass-
produced, networked, multiplayer VR entertainment system. It was released in many countries,
including a dedicated VR arcade at Embarcadero Center in San Francisco. Costing up to $73,000 per
multi-pod Virtuality system, they featured headsets and exoskeleton gloves that gave one of the first
"immersive" VR experiences. Antonio Medina, an MIT graduate and NASA scientist, designed a
virtual reality system to "drive" Mars rovers from Earth in apparent real time despite the substantial
delay of Mars-Earth-Mars signals.
 1995: The Virtual Boy was created by Nintendo and was released in Japan and North America. Also
in 1995, a group in Seattle created public demonstrations of a CAVE-like 270-degree immersive
projection room called the Virtual Environment Theater, produced by entrepreneurs Chet Dagit and
Bob Jacobson.
 1999: The Wachowskis release “The Matrix”.
 2003: Linden Lab releases “Second Life”.
 2007: Google introduced Street View, a service that shows panoramic views of an increasing
number of worldwide positions such as roads, indoor buildings, and rural areas. It also features a
stereoscopic 3D mode, introduced in 2010.
 2010: Palmer Luckey designed the first prototype of the Oculus Rift. This prototype, built on a shell
of another virtual reality headset, was only capable of rotational tracking. However, it boasted a 90-
degree field of vision that was previously unseen in the consumer market at the time. This initial
design would later serve as a basis from which the later designs came.
 2013: Valve discovered and freely shared the breakthrough of low-persistence displays which make
lag-free and smear-free display of VR content possible. This was adopted by Oculus and was used in
all their future headsets.
In early 2014, Valve showed off their SteamSight prototype, the precursor to both consumer
headsets released in 2016. It shared major features with the consumer headsets including separate 1K
displays per eye, low persistence, positional tracking over a large area, and fresnel lenses.
 2014: Facebook purchased Oculus VR for $2 billion. This purchase occurred before any of the
devices ordered through Oculus' 2012 Kickstarter had shipped. In that same month, Sony announced
Project Morpheus (its code name for PlayStation VR), a virtual reality headset for the PlayStation 4
video game console. Google announces Cardboard, a do-it-yourself stereoscopic viewer for
smartphones. The user places their smartphone in the cardboard holder, which they wear on their
head.
 2015: The Kickstarter campaign for Gloveone, a pair of gloves providing motion tracking and haptic
feedback, was successfully funded, with over $150,000 in contributions. HTC and Valve
Corporation announced the virtual reality headset HTC Vive and controllers. The set included
tracking technology called Lighthouse, which utilized wall-mounted "base stations" for positional
tracking using infrared light.
 2018: Steven Spielberg releases “Ready Player One”, based on the 2011 book of the same name by
Ernest Cline. 2018 also saw the release of a number of standalone headsets like the
Oculus Go, Vive Focus, and Lenovo Mirage, which do not require an additional device like a mobile
phone or a dedicated computer to run VR.
 2019: Oculus launched two new headsets with inside-out tracking, i.e., devices which do not need external
equipment for sensing the environment around the user. The Oculus Rift S is the successor to the
Oculus Rift, which was PC-based, and the Oculus Quest is a standalone VR headset with 6-DoF
tracking and two handheld controllers.
 2020: Oculus launched the Oculus Quest 2 in late 2020 which was priced $100 less than its
predecessor, the Oculus Quest.

 What are Virtual, Augmented and Mixed Realities?


 Virtual reality (VR) is a computer-generated scenario that simulates experience through senses and
perception. The immersive environment can be similar to the real-world, or it can be fantastical,
creating an experience not possible in ordinary physical reality. Instead of viewing the environment
on a screen in front of them, users get an immersive experience and are able to interact with the
environment.
 Augmented reality (AR) is an interactive experience of a real-world environment whose elements
are "augmented" by computer-generated perceptual information, sometimes across multiple sensory
modalities, including visual, auditory, haptic, somatosensory, and olfactory. The overlaid sensory
information can be constructive (i.e., additive to the natural environment) or destructive (i.e.,
masking of the natural environment) and is seamlessly interwoven with the physical world such that
it is perceived as an immersive aspect of the real environment. In this way, augmented reality alters
one’s ongoing perception of a real-world environment, whereas virtual reality completely replaces
the user's real-world environment with a simulated one. Augmented reality is related to two largely
synonymous terms: mixed reality and computer-mediated reality.
 Mixed Reality (MR) is the merging of real and virtual worlds to produce new environments and
visualizations where physical and digital objects co-exist and interact in real time. It allows you to
see and get immersed in your surroundings even while you are interacting with the digital objects
embedded in them. It gives you the ability to keep one foot in reality and the other
in the digital world, merging these two worlds together.

Components or Types of VR:

 Broadly, VR can be classified based on its type of immersion and the type of device you intend to
use.
 Based on Type of Immersion, VR can be categorized broadly as:
1. 360 Degree Media and
2. Computer Generated 3D VR (CG3D VR)
 360 Degree Media: These are 360-degree camera-shot images or videos, or scenes rendered in 3D.
Camera-shot 360 media let you experience a real-life place or scenario shot using a 360-degree
camera, while a rendered 360 image or video lets people experience scenes that were
computer-generated using a 3D application.

[Figure: A 360-degree panoramic image]

[Figure: A realistic 360 render of a house]

 Computer Generated 3D VR: This is completely 3D immersive VR where you build a 3D space
for the user to explore and interact with.
[Figure: Computer Generated 3D VR]

Primary features or factors that help to create a complete virtual reality experience:

 Immersion: as explained above, immersion is the trick of getting our brain to place itself in an
environment it is not currently in.
 Teleportation: the ability to move across various environments without having to leave your
premises. Virtual reality allows you to change your physical surroundings without moving
even an inch from your position.
 Interaction: when one is able to interact with the new environment one is looking at, the
immersion is amplified, making the belief that this virtual reality is an actual reality more
concrete.
 Presence: the feeling that one is actually in the place one sees oneself in.
 Sensory feedback: it is easy to break the illusion of virtual reality if our brain sees something but
our other senses reject that notion and rebel against it. When our other senses complement the
visual feedback the brain is receiving, the virtual reality becomes convincing.

Present Development on VR:

 Technological advances that helped make Virtual Reality possible:


 Virtual reality is not the direct result of technology developed dedicatedly for the intended
purpose; rather, it borrows from a range of other diverse technologies. For example,
most of the sensors, like the gyroscopes and motion sensors used to track head orientation
and body position in a VR headset, were primarily developed for smartphones. The small HD screens
initially developed as smartphone displays are used as the displays in a virtual reality headset.
Here is a list of a few technical advances that have made virtual reality possible:
 Haptics: Haptics is the use of touch as feedback to the senses, confirming the belief that
whatever the user is seeing is actually there.
 3D Display: 3D or 3-dimensional display is the technology that helps build the illusion of depth. To
present stereoscopic images and films, two images are projected superimposed onto the same screen
or display through different polarizing filters. The viewer wears low-cost eyeglasses which contain a
pair of different polarizing filters. As each filter passes only the light which is similarly polarized
and blocks the light polarized in the opposite direction, each eye sees a different image. This is used
to produce a three-dimensional effect by projecting the same scene into both eyes, but depicted from
slightly different perspectives. The display mechanisms that help achieve a 3D display are listed
below, followed by a minimal sketch of stereo eye placement:
o Stereoscopy: Stereoscopy (also called stereoscopics, or stereo imaging) is a technique for
creating or enhancing the illusion of depth in an image by means of stereopsis for binocular
vision.
o Polarization: A polarized 3D system uses polarization glasses to create the illusion of three-
dimensional images by restricting the light that reaches each eye (an example of stereoscopy).
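To make the "two slightly different perspectives" concrete, here is a minimal sketch of stereo eye placement: two virtual cameras are separated by the interpupillary distance, and the renderer draws the scene once per eye. The numeric values are hypothetical.

```python
import numpy as np

IPD = 0.064                                  # interpupillary distance (~64 mm, illustrative)
head_position = np.array([0.0, 1.7, 0.0])    # head at standing eye height (metres)
right_axis    = np.array([1.0, 0.0, 0.0])    # head's local "right" direction

# Each eye gets its own camera origin; the scene is rendered once per eye and
# the headset (or the polarizing filters) routes each image to the matching eye.
left_eye  = head_position - (IPD / 2.0) * right_axis
right_eye = head_position + (IPD / 2.0) * right_axis
print(left_eye, right_eye)
```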
 Alternate Frame Rendering: Alternate Frame Rendering (AFR) is a technique of graphics
rendering in personal computers which combines the work output of two or more graphics
processing units (GPU) for a single monitor, in order to improve image quality, or to accelerate the
rendering performance. In this technique, one graphics processing unit computes all the odd
video frames while the other renders the even frames.
 360 Degree View: The ability to construct displays that show a complete 360-degree
environment, either by placing an individual in an environment with displays surrounding them in
all directions, or by rendering images on displays placed in front of the eyes which move
with the chassis of the display as the head rotates.
 Motion and Orientation: The ability to measure motion and direction in space and translate it
into a virtual environment is critical for creating the illusion of virtual reality. This ability of
the HMD to respond correctly to the user’s actions in the virtual environment is achieved with the
help of these sensors (a minimal sensor-fusion sketch follows this list):
o Accelerometer: An accelerometer is an instrument used to measure the acceleration of a moving or
vibrating body, and is therefore used in VR devices to measure acceleration along a
particular axis. The accelerometer is used in our smartphones, for instance, to let the device know
whether the user is holding it in landscape or portrait mode. Similarly, a primary
function of the accelerometers in a VR device is to tell which direction the user is facing.
o Gyroscope: A gyroscope is a device used to measure orientation. The device consists of a wheel
or disc mounted so that it can spin rapidly about an axis which itself is free to alter in any
direction. The orientation of the axis is not affected by tilting of the mounting, so gyroscopes can
be used to provide stability or maintain a reference direction in navigation systems, automatic
pilots, and stabilizers.
o Magnetometer: A magnetometer is a device used to measure magnetic forces, usually Earth’s
magnetism and thus tell the direction that it is facing. A compass is a simple type of
magnetometer, one that measures the direction of an ambient magnetic field.
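One common way to combine these sensors is a complementary filter: the gyroscope is integrated for fast response, and the accelerometer's gravity-derived angle slowly corrects the resulting drift. Below is a minimal single-axis sketch with made-up sample values.

```python
ALPHA = 0.98   # trust placed in the integrated gyro vs. the accelerometer
DT = 0.01      # 100 Hz sample period (hypothetical)

def fuse(angle_deg, gyro_rate_dps, accel_angle_deg):
    """One filter step: integrate the gyro, then nudge toward the accel angle."""
    integrated = angle_deg + gyro_rate_dps * DT
    return ALPHA * integrated + (1.0 - ALPHA) * accel_angle_deg

angle = 0.0
for gyro, accel in [(10.0, 0.2), (10.0, 0.3), (0.0, 0.3)]:   # fake samples
    angle = fuse(angle, gyro, accel)
print(angle)   # follows the gyro but is pulled toward the accelerometer reading
```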
 Depth Sensing: As the name suggests, depth sensing is the ability of a computing system to measure
the depth of the real environment. The main components that make it possible are an IR (infra-red)
projector and an IR camera. The IR projector emits many dots into the surroundings in its line of sight;
the IR camera then observes these dots, and the processor calculates the position of
objects according to the shape, size, and density of the dots.
 Computer Graphics: This is probably the most critical topic in virtual reality. Although VR has
existed for many decades, only recently, with increasingly portable computing power
becoming easily accessible, has a lot of quality work in computer graphics become possible, which in
turn enables the kind of VR that we experience today.
 Light Field Camera: A light field camera, also known as plenoptic camera, captures information
about the light field emanating from a scene; that is, the intensity of light in a scene, and also the
direction that the light rays are traveling in space. This contrasts with a conventional camera, which
records only light intensity. One type of light field camera uses an array of micro-lenses placed in
front of an otherwise conventional image sensor to sense intensity, color, and directional
information. Multi-camera arrays are another type of light field camera. Holograms are a type of
film-based light field image.

 Challenges in virtual reality:


 Realistic sense
 No Nausea
 Depth
 Non-interfering Sensors
 Ergonomics
 Immersion
 Presence
 Teleportation
 Movements
 Interactions

 Basic terminologies in VR industries:


 HMDs: Head-mounted displays, also sometimes referred to as ‘virtual reality headsets’ or ‘VR
glasses’, attach straight to your head and present visuals directly to your eyes. HMDs may have a
small display optic in front of one eye (monocular HMD) or both eyes (binocular HMD). All
Oculus devices are HMDs, since they are mounted on the head and have display optics for
both eyes.
 FOV: The Field of View is the extent of the observable world that is seen at any given moment. The
field of view is usually given as an angle for the horizontal or vertical component of the FOV.
 A larger angle indicates a larger field of view. For immersive VR, the virtual world needs to fill our
entire FOV. As the device is brought closer to your eyes, the screen takes up more of your FOV.
Biconvex lenses magnify the screen further and make the virtual world your entire FOV. A small
worked example follows the figure below.

[Figures: HFoV and VFoV; Field of View]
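As a worked example of the relationship above, the horizontal angle subtended by a flat screen of width w at eye distance d is 2·atan(w / 2d); bringing the screen closer (as a headset shell does, before the lenses magnify further) widens the FOV dramatically. The screen sizes and distances below are hypothetical.

```python
import math

def horizontal_fov_deg(screen_width_m, eye_distance_m):
    """Horizontal FOV subtended by a flat screen at a given eye distance."""
    return math.degrees(2.0 * math.atan(screen_width_m / (2.0 * eye_distance_m)))

print(horizontal_fov_deg(0.12, 0.40))   # ~17 degrees: phone held at arm's length
print(horizontal_fov_deg(0.12, 0.05))   # ~100 degrees: same phone inside a headset
```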
 FPS: Frame rate, or frames per second, is the frequency (rate) at which consecutive images, called
frames, appear on a display. Displaying frames in quick succession creates the illusion of motion,
i.e., the more frames, the smoother the motion.

              Minimum Requirement            Naked-eye judder-free acuity
Frame Rate    90 fps with low persistence    >300 fps

 Transform: A transform is used to place bodies correctly in the world and calculate how they
should appear on displays. It consists of the Position (translation) and Rotation (orientation) of the object
with reference to the given coordinate system. It may also include the scale of the object in the virtual
world. A minimal sketch is given below.
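Here is a minimal sketch of a transform (position, rotation, scale) applied to a local point; the rotation is stored as a 3x3 matrix built from a single yaw angle for brevity, and all values are illustrative.

```python
import numpy as np

def yaw_matrix(degrees):
    """Rotation about the vertical (up) axis."""
    a = np.radians(degrees)
    return np.array([[ np.cos(a), 0.0, np.sin(a)],
                     [ 0.0,       1.0, 0.0      ],
                     [-np.sin(a), 0.0, np.cos(a)]])

class Transform:
    def __init__(self, position, rotation, scale=1.0):
        self.position = np.asarray(position, dtype=float)   # translation
        self.rotation = rotation                            # orientation (3x3)
        self.scale = scale                                  # uniform scale

    def apply(self, local_point):
        """Map a point from the object's local space into world space."""
        return self.rotation @ (self.scale * np.asarray(local_point)) + self.position

# A virtual object 2 m ahead of the origin, turned 90 degrees about the up axis.
t = Transform(position=[0.0, 0.0, 2.0], rotation=yaw_matrix(90.0))
print(t.apply([1.0, 0.0, 0.0]))   # local +x rotates to -z, then translates -> [0. 0. 1.]
```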

 DOF: Degrees of Freedom is the number of independently variable factors that can affect the
transform of an object. Ex: desktop mouse movement - 2DOF.
 The degrees of freedom of a VR setup depend on the sensors used in the setup (rotational tracking
only, or positional tracking as well):
o Head rotation (where I am looking) - 3DOF
o Object movement in space (where I am) - 3DOF
o Object movement + rotation in space - 6DOF
 Rotational degrees of freedom are identified by the amount of rotation about the pitch, yaw, and
roll axes.
[Figure: Axes indicating Degrees of Freedom]

 Latency: Latency is a time interval between the stimulation and response, or, from a more general
point of view, a time delay between the cause and the effect of some physical change in the system
being observed. VR and neuroscience experts have found through user studies that a latency greater
than 20ms causes motion sickness and discomfort and have projected that it may be necessary to
reduce it to 15ms or even 7ms to fully eliminate them.
 The direct perception of latency varies widely among people. Even when it is not perceptible, it has
been one of the main contributors to VR sickness. Adaptation causes great difficulty because people
can adjust to a constant amount of latency through long exposure; returning to the real world may
then be difficult, and for a period of time much of the real world may not appear stationary. A
hypothetical latency-budget calculation is sketched below.
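As a back-of-the-envelope illustration of the 20 ms threshold, the motion-to-photon latency is the sum of every pipeline stage between head movement and photons reaching the eye. The stage durations below are entirely hypothetical.

```python
# Hypothetical motion-to-photon budget for one frame (all values illustrative).
stages_ms = {
    "sensor sampling & fusion": 2.0,
    "simulation / render CPU":  4.0,
    "GPU rendering":            6.0,
    "scanout & display":        5.0,
}
total = sum(stages_ms.values())
print(f"motion-to-photon: {total:.1f} ms ->",
      "within budget" if total <= 20.0 else "risk of motion sickness")
```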
 Foveated imaging: Foveated imaging is a digital image processing technique in which the image
resolution, or amount of detail, varies across the image according to one or more fixation points. A
fixation point indicates the highest resolution region of the image and corresponds to the center of
the eye's retina, the fovea.
 In VR, foveated rendering is a technique used for performance optimization. It is more
effective with eye-tracking sensors. In the absence of eye tracking, Fixed Foveated Rendering (FFR) is a
technology that allows the edges of the eye texture to be rendered at a lower resolution than the
center. The effect, which is nearly imperceptible, lowers the fidelity of the scene in the viewer's
peripheral vision. Because fewer total pixels need to be shaded, FFR can produce a significant
improvement in GPU fill performance. A minimal shading-rate sketch follows.
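Here is a minimal sketch of the fixed-foveated idea: shading is done at full rate near the center of the eye texture and at progressively coarser rates toward the edges. The radii and rates below are hypothetical, not any vendor's actual FFR levels.

```python
import math

def shading_rate(u, v):
    """u, v in [0, 1] across the eye texture; returns pixels per shaded sample."""
    r = math.hypot(u - 0.5, v - 0.5)   # distance from the texture centre
    if r < 0.25:
        return 1                       # central region: shade every pixel
    elif r < 0.40:
        return 2                       # mid-periphery: one sample per 2x2 block
    return 4                           # far periphery: one sample per 4x4 block

row = [shading_rate(u / 8.0, 0.5) for u in range(9)]
print(row)   # full rate in the middle of the row, coarser toward the edges
```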
 ATW: Timewarp / time warping, also known as reprojection, is a technique in VR (long known
as post-rendering image warp) that warps the rendered image before sending it to the display,
to correct for head movement that occurred after rendering. Executing the timewarp operation
asynchronously in a separate thread is known as Asynchronous TimeWarp (ATW). In VR,
this technique is also used to generate intermediate frames in situations where the application cannot
maintain frame rate, helping to reduce judder.
 Also, when prediction is used for generating the frames, timewarp is used as a last-moment
adjustment to overcome prediction errors. A minimal rotational-warp sketch follows.
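The core of a rotational timewarp can be sketched in a few lines: compute the delta between the head orientation used at render time and the latest orientation, and re-aim each display ray into the already-rendered frame before sampling it. Poses and angles here are hypothetical.

```python
import numpy as np

def yaw(deg):
    a = np.radians(deg)
    return np.array([[ np.cos(a), 0.0, np.sin(a)],
                     [ 0.0,       1.0, 0.0      ],
                     [-np.sin(a), 0.0, np.cos(a)]])

R_render  = yaw(0.0)   # head orientation when the frame was rendered
R_display = yaw(2.0)   # head orientation just before scanout (2 degrees later)

# Maps a view-space ray of the new pose into the rendered frame's view space.
delta = R_render.T @ R_display

ray_display = np.array([0.0, 0.0, -1.0])    # centre-of-screen ray, view space
print(delta @ ray_display)                  # where to sample the rendered frame
```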
 ASW: Asynchronous Space Warp (ASW) is a frame-rate smoothing technique that almost halves
the CPU/GPU time required to produce nearly the same output from the same content. ATW is
limited to rotational correction, whereas ASW applies animation detection, camera translation, and head
translation to previous frames in order to predict the next frame. As a result, motion is smoothed, and
applications can run on lower-performance hardware.

 Engines/Tools to build VR experiences:


 There are several tools which you can use to create VR Content. Game Engines are primarily used to
create VR content. The most widely used Game Engines for VR Experiences are listed below.
 Unity3D: Unity is the most widely used engine for creating VR Experiences, due to its easy learning
curve, awesome community support and graphics capabilities.
 Unity 3D is a versatile engine and one of the most popular game engines used for developing games, VR,
AR, MR, and many other kinds of applications. It has a simple learning curve, and it is as powerful
as any other game engine out there.

 Unreal Engine: Unreal Engine is known for its superior graphics quality and its easy-to-use visual
node-based editor (Blueprint), which allows you to create great experiences without having to write
a single line of code.
 Godot Engine: Godot is the most powerful Open-Source Game engine. Heavily customizable, light
and packed with all kinds of features.
 WebVR: WebVR is an open specification that makes it possible to experience VR in your browser.
Its goal is to make it easier for everyone to get into VR experiences, no matter what device you
have. Because it runs in the browser, it works with almost any device.
 Scapic: Scapic is a really amazing online platform which you can use to create and prototype VR
scenes and experiences easily. You can easily make complete VR Experiences in minutes.
 Lumberyard: Offered by Amazon. If you are looking for a VR game engine that offers the full
convenience of developing games, this should be the one. It is comparatively a new entry in this
segment, and it is free with full source, which also means that one can tweak the engine if
necessary. It could be an excellent platform for developing online games without having to worry
about hosting a robust game.
 CRYENGINE: A game engine developed by Crytek, with its full source available. CRYENGINE has a proud
legacy of pushing new technologies, and early Oculus Rift demos like "Back to Dinosaur Island" are
inspiring developers around the globe to create new, mind-blowing experiences for gamers.

 A few more tools (modelling software with VR support):


 Blender: Blender is quickly becoming a favorite modeler for many VR developers. It is free and an
open-source software written in Python and is available for Windows, Mac, and Linux. There’s a
huge community of people devoted to this software and its use. Many websites provide tutorial
videos, forums, and documentation.
 SketchUp: SketchUp (formerly Google SketchUp) is a basic modeling application with a very low
learning curve that can get anyone up and running in a short amount of time. The tutorials on the
website are excellent, not only teaching the basics of the software but also serving as introductory
lessons in basic 3D modelling concepts.
 SimLab Soft: The VR edition of SimLab Composer is designed to turn any model into a VR scene on
whatever device the user is running, whether an HTC VIVE, Oculus Rift, desktop PC, or a mobile
device. In addition, it supports panoramic images, and augmented reality through an AR viewer.
SimLab Composer VR will not only turn your model into an interactive VR scene; it contains
advanced animation and VR capabilities that give you all the tools needed to create various VR
applications, from selling your designs to facilitating training, no matter what CAD application
you're using or what your 3D skill level is.
 IDEs for Native Development & a Few SDKs: If you are familiar with graphics libraries like
DirectX or OpenGL, you can use a few IDEs directly and apply your programming skills, or take the help
of a few SDKs. Some native platforms and IDEs that can be used to make AR or VR applications
are:
o Android Studio
o Visual Studio
o Spark AR
o Lens Studio
o AR Core
o AR Kit
o Vuforia
o Cardboard SDK

End of Unit-1
