
Unit I

Augmented Reality
Augmented reality is a technology that expands our physical world by adding layers of
digital information onto it. Unlike virtual reality (VR), AR does not create a whole
artificial environment to replace the real one with a virtual one; it appears within the
direct view of an existing environment and adds sounds, video, and graphics to it.
AR is thus a view of the physical, real-world environment with superimposed
computer-generated images, which changes the user's perception of reality.
The first commercial uses were in television and the military.
With the rise of the Internet and smartphones, AR entered its second wave and is now
mostly associated with interactive experiences: 3D models are projected directly onto
physical objects or blended with them in real time, and a variety of augmented reality
apps influence our habits, social life, and the entertainment industry.
AR apps typically anchor digital animation to a special ‘marker’, or use the phone's GPS
to pinpoint the user's location. Augmentation happens in real time and within the
context of the environment, for example overlaying scores onto the live feed of a sporting event.
There are four types of augmented reality today:
 markerless AR
 marker-based AR
 projection-based AR
 superimposition-based AR
AR in the 1960s. In 1968, Ivan Sutherland and Bob Sproull created the first head-
mounted display, which they called The Sword of Damocles. It was a crude device that
displayed primitive computer graphics.
AR in the 1970s. In 1975, Myron Krueger created Videoplace, an artificial reality
laboratory. Krueger envisioned interaction with digital objects through human
movements, a concept he realized using projectors, video cameras, and onscreen
silhouettes.
AR in the 1980s. In 1980, Steve Mann developed EyeTap, an early wearable computer
designed to be worn in front of the eye. It recorded the scene, superimposed effects on
it, and showed the result to the user, who could also interact with it via head
movements. In 1987, Douglas George and Robert Morris developed a prototype
heads-up display (HUD) that overlaid astronomical data on the real sky.
AR in the 1990s. The year 1990 marked the birth of the term “augmented reality”; it first
appeared in the work of Thomas Caudell and David Mizell, researchers at Boeing.
In 1992, Louis Rosenberg of the US Air Force created an AR system
called “Virtual Fixtures”. In 1999, a group of scientists led by Frank Delgado and Mike
Abernathy tested new navigation software that overlaid runway and street data onto
video from a helicopter.
AR in the 2000s. In 2000, the Japanese scientist Hirokazu Kato developed and
published ARToolKit, an open-source SDK; it was later ported to work with
Adobe Flash. In 2004, Trimble Navigation presented an outdoor, helmet-mounted AR
system. In 2008, Wikitude released the AR Travel Guide for Android mobile devices.
AR today. In 2013, Google beta-tested Google Glass, which connects to the Internet via
Bluetooth. In 2015, Microsoft introduced two brand-new technologies: Windows
Holographic and HoloLens (AR goggles with numerous sensors for displaying HD
holograms). In 2016, Niantic launched the Pokémon Go game for mobile devices. The app
took the gaming industry by storm, earning $2 million in just its first week.
How does Augmented Reality work?
For many of us, the question of what augmented reality is has a technical side: how does
AR actually work? AR can draw on a range of data (images, animations, videos, 3D models),
and people see the result in both natural and synthetic light. Unlike in VR, users
remain aware of being in the real world, which is enhanced by computer vision.
AR can be displayed on various devices: screens, glasses, handheld devices, mobile
phones, and head-mounted displays. It involves technologies such as SLAM (simultaneous
localization and mapping) and depth tracking (in brief, sensor data used to calculate the
distance to objects), along with the following components:

 Cameras and sensors. These collect data about the user's interactions and send it
for processing. The device's cameras scan the surroundings, and with this information the
device locates physical objects and generates 3D models (a small projection sketch follows
this list). These may be special-purpose cameras, as in Microsoft HoloLens, or ordinary
smartphone cameras used to take pictures and video.
 Processing. AR devices essentially act like small computers, something
modern smartphones already do. In the same manner, they require a CPU, a GPU,
flash memory, RAM, Bluetooth/Wi-Fi, GPS, and so on in order to measure speed, angle,
direction, and orientation in space.
 Projection. This refers to a miniature projector on AR headsets, which takes
data from the sensors and projects digital content (the result of processing) onto a surface
for viewing. In fact, projection in AR has not yet matured to the point of widespread use in
commercial products or services.
 Reflection. Some AR devices have mirrors that help the human eye view virtual
images. Some have an “array of small curved mirrors”, while others use a double-sided
mirror to reflect light both to a camera and to the user's eye. The goal of such reflection
paths is proper image alignment.
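
To make the camera-and-sensors step concrete, here is a minimal sketch of the pinhole projection that maps a 3D point in front of the device onto 2D screen coordinates, which is the basic geometry AR systems use when anchoring content; the focal lengths, image centre, and the 3D point are illustrative values, not taken from any real device.

```python
# Minimal sketch: projecting a 3D point onto the screen with a pinhole camera model.
# The intrinsics (fx, fy, cx, cy) and the 3D point are illustrative values.

def project_point(point_3d, fx, fy, cx, cy):
    """Map a 3D point (x, y, z) in camera coordinates to 2D pixel coordinates."""
    x, y, z = point_3d
    if z <= 0:
        raise ValueError("Point must be in front of the camera (z > 0)")
    u = fx * (x / z) + cx   # horizontal pixel coordinate
    v = fy * (y / z) + cy   # vertical pixel coordinate
    return u, v

# Example: a virtual object 2 m in front of the camera, slightly to the right.
print(project_point((0.3, 0.0, 2.0), fx=800, fy=800, cx=640, cy=360))
```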

Types of Augmented Reality


Marker-based AR. Some also call it image-recognition AR, as it requires a special visual
object and a camera to scan it. The marker can be anything from a printed QR code to a
special sign. In some cases the AR device also calculates the position and orientation of the
marker in order to position the content. Thus, a marker triggers digital animations for users
to view, and images in a magazine can turn into 3D models.
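
As a minimal sketch of marker detection, the snippet below finds square fiducial markers in a camera frame using OpenCV's ArUco module; it assumes opencv-contrib-python 4.7 or newer, and "frame.jpg" is just a stand-in for a live camera frame.

```python
# Minimal sketch of marker-based AR detection using OpenCV's ArUco module.
import cv2

frame = cv2.imread("frame.jpg")                     # stand-in for a camera frame
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

# A predefined dictionary of square fiducial markers plays the role of the "special sign".
dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
detector = cv2.aruco.ArucoDetector(dictionary, cv2.aruco.DetectorParameters())

corners, ids, _rejected = detector.detectMarkers(gray)
if ids is not None:
    # Each detected marker yields four corner points; an AR app would use them
    # to estimate the marker's position and orientation and anchor 3D content there.
    print("Detected marker ids:", ids.flatten())
```
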
Markerless AR. Also known as location-based or position-based augmented reality, it uses
GPS, a compass, a gyroscope, and an accelerometer to provide data based on the user's
location. This data then determines what AR content you find or receive in a given area.
On smartphones, this type of AR typically produces maps, directions, and information
about nearby businesses. Applications include event information, pop-up business ads,
and navigation support.
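
As a minimal sketch of the location-based idea, the snippet below computes the great-circle (haversine) distance between the user's GPS fix and a point of interest and shows the content only within a chosen radius; the coordinates and the 50 m threshold are illustrative assumptions.

```python
# Minimal sketch: deciding whether location-based AR content is in range.
# Coordinates and the 50 m radius are illustrative, not tied to any real app.
from math import radians, sin, cos, asin, sqrt

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two GPS fixes."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371000 * asin(sqrt(a))

user = (48.2082, 16.3738)   # user's GPS position
poi = (48.2085, 16.3742)    # a point of interest with attached AR content
if haversine_m(*user, *poi) < 50:
    print("Show the AR overlay for this point of interest")
```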

Projection-based AR. This projects synthetic light onto physical surfaces and, in some
cases, allows the user to interact with it. These are the holograms we have all seen in sci-fi
movies such as Star Wars. User interaction with a projection is detected through its
alterations.

Superimposition-based AR. This replaces the original view with an augmented one, fully or
partially. Object recognition plays a key role; without it the whole concept is simply
impossible. A familiar example of superimposition-based augmented reality is the IKEA
Catalog app, which lets users place virtual items from the furniture catalogue in their own
rooms.

Augmented reality devices


Many modern devices already support augmented reality, from smartphones and
tablets to gadgets such as Google Glass and other handheld devices, and these technologies
continue to evolve. For processing and projection, AR devices and hardware first of all
require components such as sensors, cameras, an accelerometer, a gyroscope, a digital
compass, GPS, a CPU, and displays, as already mentioned.
Devices suitable for Augmented reality fall into the following categories:
 Mobile devices (smartphones and tablets) – the most widely available option and the
best fit for AR mobile apps, ranging from pure gaming and entertainment to business
analytics, sports, and social networking.
 Special AR devices, designed primarily and solely for augmented reality
experiences. One example is the head-up display (HUD), which sends data to a transparent
display directly in the user's view. Originally introduced to train military fighter pilots,
such devices now have applications in aviation, the automotive industry, manufacturing,
sports, and more.
 AR glasses (or smart glasses) – Google Glass, Meta 2 Glasses, Laster See-
Thru, Laforge AR eyewear, etc. These units can display notifications from your
smartphone, assist assembly-line workers, provide hands-free access to content, and more.
 AR contact lenses (or smart lenses), taking augmented reality one step further.
Manufacturers such as Samsung and Sony have announced the development of AR
lenses. Samsung is working on lenses as an accessory to smartphones,
while Sony is designing lenses as standalone AR devices (with features such as taking
photos or storing data).
 Virtual retinal displays (VRD), which create images by projecting laser light into the
human eye. Although they aim for bright, high-contrast, high-resolution images, such
systems have yet to reach practical use.

Applications of AR
Augmented reality can complement our everyday activities in various ways. For
instance, one of the most popular applications of AR is gaming. New AR
games provide richer experiences to players, and some even promote a more active,
outgoing way of life (Pokémon Go, Ingress). Gaming is moving from purely virtual spaces
into real life, where players actually perform physical activities. One example is a
simple gym activity for kids by the Canadian company SAGA, in which children crack
cubes moving across a wall by hitting them with a ball.
In retail, AR can improve customer engagement and retention, as well as
brand awareness and sales. Some features can also help customers make wiser
purchases by providing product data together with 3D models in any size or color. Real
estate can also benefit from augmented reality through 3D tours of apartments and houses,
which can be manipulated to modify individual parts.
Potential areas for AR include:
 Education: interactive models for learning and training purposes, from
mathematics to chemistry.
 Medicine/healthcare: to help diagnose, monitor, train, localize, etc.
 Military: for advanced navigation, marking objects in real time.
 Art / installations / visual arts / music.
 Tourism: data on destinations, sightseeing objects, navigation, and directions.
 Broadcasting: enhancing live events and event streaming by overlaying content.
 Industrial design: to visualize, calculate or model.
VIRTUAL REALITY

1. What is VR?
Virtual reality (VR) is a new kind of user interface that, unlike conventional interfaces,
immerses a person in a digital 3D environment instead of having them watch a display.
Computer-generated imagery and content aim to simulate a real sense of presence
through the senses (sight, hearing, touch).
A virtual reality simulation requires two main components: a source of content and a user
device; in other words, software and hardware. Current systems include headsets,
omnidirectional treadmills, special gloves, and goggles. VR tools should provide
realistic, natural, high-quality images and interaction possibilities. To do this, devices rely
on measurements such as:
 image resolution,

 field of view,

 refresh rate,

 motion delay,

 pixel persistence,

 audio/video synchronization.

 The main challenge of VR is tricking the human brain into perceiving digital
content as real. That is not easy, and this “immersion” issue is what still holds
virtual reality experiences back from being fully enjoyable. For example, the human
visual field doesn't work like a video frame: besides roughly 180 degrees of frontal
vision, we also have peripheral vision.
 Yet VR visionaries are confident of overcoming such issues sooner or later,
campaigning for the concept and attracting investments worth millions. Virtual
experiences such as 360-degree videos and pictures, VR apps, and games are
already available, and there is a reasonable choice of headsets as well.

2. How does virtual reality work?


As mentioned, VR requires several devices: a headset, a computer or smartphone
(or another machine) to create the digital environment, and in some cases a motion-tracking
device. Typically, the headset displays content in front of the user's eyes, while a cable
(HDMI) transfers images to its screen from a PC. The alternative is a headset that works
with a smartphone, such as Google Cardboard or Gear VR, where the phone acts both as
the display and as the source of VR content.
Some vendors use lenses to turn flat images into three-dimensional ones. VR devices
typically achieve a field of view of around 100-110 degrees. The next key parameter is the
frame rate, which should be at least 60 frames per second for virtual simulations to look
realistic enough.
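
To make the field-of-view and frame-rate figures above concrete, here is a minimal sketch; the screen width and lens-to-screen distance are made-up values for a simple magnifier-style headset, not the specs of any real device.

```python
# Minimal sketch: two of the numbers that matter for a comfortable VR headset.
from math import atan, degrees

def field_of_view_deg(screen_width_mm, lens_to_screen_mm):
    """Approximate horizontal FOV for a simple magnifier-style headset (no distortion)."""
    return 2 * degrees(atan((screen_width_mm / 2) / lens_to_screen_mm))

def frame_budget_ms(fps):
    """Time available to render one frame, in milliseconds."""
    return 1000.0 / fps

print(f"FOV  ~ {field_of_view_deg(screen_width_mm=120, lens_to_screen_mm=45):.0f} degrees")
print(f"60 fps leaves {frame_budget_ms(60):.1f} ms per frame; 90 fps leaves {frame_budget_ms(90):.1f} ms")
```
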
For user interaction there are several options:
 Head tracking
The head-tracking system in a VR headset follows the movements of your head from side
to side and at different angles. It assigns X, Y, and Z axes to directions and movements, and
relies on components such as an accelerometer, a gyroscope, and a ring of LEDs around the
headset that an external camera can track. Head tracking requires low latency, around 50
milliseconds or less; otherwise users will notice a lag between their head movements and
the simulation.
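
As a minimal sketch of the head-tracking idea, the snippet below integrates gyroscope angular rates into yaw, pitch, and roll angles; the 100 Hz sample rate and the readings are made-up values, and real headsets fuse this with accelerometer (and often camera) data to correct drift.

```python
# Minimal sketch: integrating gyroscope rates (deg/s) into a head orientation.
DT = 1.0 / 100.0  # seconds between gyroscope samples (100 Hz assumed)

gyro_samples = [  # (yaw_rate, pitch_rate, roll_rate) in degrees per second
    (10.0, 0.0, 0.0),
    (12.0, -1.0, 0.0),
    (8.0, -2.0, 0.5),
]

yaw = pitch = roll = 0.0
for yaw_rate, pitch_rate, roll_rate in gyro_samples:
    yaw += yaw_rate * DT      # turning the head left/right
    pitch += pitch_rate * DT  # nodding up/down
    roll += roll_rate * DT    # tilting the head sideways

print(f"head orientation: yaw={yaw:.2f} deg, pitch={pitch:.2f} deg, roll={roll:.2f} deg")
```
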
 Eye tracking
Some headsets contain an infrared sensor that tracks the direction of your eyes inside the
virtual environment. The major benefit of this technology is a more realistic and deeper
field of view.
 Motion tracking
Though not yet engineered and implemented to its full potential, motion tracking would
raise VR to a whole new level. Without motion tracking you are limited in VR, unable to
move around the scene. Built on the concepts of 6DoF (six degrees of freedom) and 3D
space, motion-tracking options fall into two groups: optical and non-optical tracking.
Optical tracking typically uses a camera on the headset to follow movements, while
non-optical tracking uses other sensors on the device or on the body. Most existing
devices actually combine both options.
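
To illustrate the 6DoF concept, here is a minimal sketch of a pose with three position and three rotation components, used to move a tracked headset forward in the direction it is facing; all values are illustrative, and real tracking systems usually store rotation as a quaternion.

```python
# Minimal sketch: a 6DoF pose (three position + three rotation degrees of freedom).
from dataclasses import dataclass
from math import radians, sin, cos

@dataclass
class Pose6DoF:
    x: float      # position in metres
    y: float
    z: float
    yaw: float    # rotation in degrees about the vertical axis
    pitch: float
    roll: float

    def step_forward(self, distance):
        """Move in the horizontal direction the pose is facing (ignoring pitch and roll)."""
        self.x += distance * sin(radians(self.yaw))
        self.z += distance * cos(radians(self.yaw))

head = Pose6DoF(x=0.0, y=1.7, z=0.0, yaw=90.0, pitch=0.0, roll=0.0)
head.step_forward(0.5)   # walk half a metre forward
print(head)
```
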
3. Major VR market players
The number one is, of course, the Oculus Rift virtual reality headset. It is a small, well-
crafted device that requires a connection to a computer. A user can either sit or stand while
playing a game, though movement is somewhat limited. With tens of thousands
of units sold each year, Oculus stays at the forefront of the VR hardware niche.
Other honorable mentions are:
 Microsoft HoloLens, in contrast to the Oculus Rift, uses holographic technology
and is therefore often marketed as augmented reality rather than VR. It gives users the
opportunity to interact with holograms around them.
 HTC Vive – developed by HTC in partnership with Valve, the company behind the
Steam gaming platform; the Vive was the first headset for SteamVR products.
 Samsung Gear VR – uses Oculus head-tracking technology in combination with
Samsung Android smartphones (e.g. the Galaxy Note 4) to power mobile VR experiences.
Its lenses essentially turn the phone's screen into a stereoscopic display.
 Google Cardboard – the simplest and most affordable VR headset. A device
for Android smartphones costing around $15, with a variety of games and mobile
applications available from the Play Store.
 Google Daydream – Google's more advanced VR headset, which works with
smartphones; a standalone version with controllers is expected soon.
4. Areas of use / Applications of VR
Virtual reality has the potential to enable new discoveries and have a positive impact in
many areas of our everyday lives. When it is too dangerous or expensive to try
something out in reality, VR is a great option to have. Think of training anyone from
aircraft pilots to surgeons, and of areas such as:
 Education: training to acquire certain skills;
 Science: visualization of data and research;
 Medicine: monitoring, training, diagnosing;
 Industrial design and architecture;
 Gaming and entertainment: immersive and interactive experiences.
5. Virtual Reality SDKs to build VR Apps
5.1. HTC Vive / OpenVR SDK
Motion tracking: The Vive comes with two wireless infrared Lighthouse base stations.
They are placed in the corners of a room and track the headset's 37 sensors (70 in total,
including those on each controller).
Controllers: They’re basically a vertically bisected version of the Steam Controller, with
a trackpad, buttons and a pressure-sensitive grip in each hand.
Camera: A front-facing camera, which means users can overlay virtual objects onto their
real-world surroundings.
Hardware:  
 GPU: NVIDIA GeForce GTX 970 or AMD Radeon R9 290 (or better)
 CPU: Intel Core i5-4590 or AMD FX 8350 (or better)
 4 GB+ of RAM, HDMI 1.4 or DisplayPort 1.2 or newer
 1x USB 2.0 or greater port
 Windows 7 SP1 or newer
Best VR SDK for HTC Vive
For developers deciding which SDK to use to build VR apps for the HTC Vive, there are
three major options: the OpenVR kit, the SteamVR kit, and VRTK, all virtual reality SDKs
promoted by the Viveport community. The Viveport SDK is also a natural starting point if
you want to create and share apps for the HTC Vive. Here is a brief review of these
software development kits:
1. The OpenVR SDK by Valve is an API and runtime environment with good samples.
It supports multiple VR hardware vendors, so applications do not have to be
vendor-specific. Its runtime is SteamVR.
2. The SteamVR SDK lets developers create a single interface that works with different
VR headsets, including the HTC Vive. It also gives access to controllers and
chaperoning, provides models, and allows content preview in Unity play mode.
3. VRTK, or the Virtual Reality Toolkit, is a collection of handy scripts for
VR applications. It works with the Unity3D engine.
5.2. Oculus Rift / Oculus SDK
Motion tracking: The Oculus sensor can track your motion even if you turn your body
more than 180 degrees.
Controllers: Each controller has three face buttons: X, Y, and the Menu button on the left
controller; A, B, and the universal Oculus Home button on the right. Each controller also
has a clickable thumbstick, a trigger, and a grip trigger (for fingers other than the
forefinger).
Camera: The Oculus setup includes a tracking camera that works with infrared
light. This allows 360-degree positional head tracking in a fairly broad play area,
though within a relatively short range.
Hardware:  
 GPU: NVIDIA GeForce GTX 960 or AMD Radeon RX 470 (or better)
 CPU: Intel Core i3-6100 or AMD FX-4350 (or better)
 8 GB+ RAM, compatible HDMI 1.3 video output
 2x USB 3.0 ports
 Windows 7 or newer
5.3. Samsung Gear VR / Oculus Mobile SDK
Motion tracking and camera: The Gear VR unit itself acts as the controller; it contains the
field-of-view optics as well as a custom inertial measurement unit (IMU). This IMU, used
for rotational tracking, connects to the smartphone via micro-USB. The Gear VR also has a
touchpad, a back button, and a proximity sensor that detects whether the headset is being
worn.
Controllers: The controller measures only 48.1 x 38.2 x 108.1 mm and provides access to
the volume controls, the Back button, the home screen, and the touchpad.
Hardware:
 Galaxy S8
 Galaxy S8+
 Galaxy S7
 Galaxy S7 Edge
 Note 5
 Galaxy S6
 Galaxy S6 Edge
 Galaxy S6 Edge+
Oculus Mobile SDK
A natural question comes to mind: why is the best VR SDK for the Gear VR made by
Oculus? The answer is simple: Samsung's headset was originally built in collaboration with
Oculus, so their kit is a natural fit for building Gear VR apps. The Oculus Mobile SDK
contains tools and libraries for C/C++ development for Oculus devices as well as the
Samsung Gear VR.
For those who want to do VR development in Unity for the Gear VR, the setup should
include (as advised by the vendor):
 Unity 5.3.2
 Java Development Kit 8
 Android SDK 5.0 and tools
 Sample assets for Unity project
 Oculus signature file (osig)
5.4. Google Daydream View / Google VR SDK
Motion tracking and camera: In general, your experience with Google Daydream will
depend on which smartphone you lock inside the headset.
Controllers: There are only five buttons, including a trackpad that doubles as a button.
Below the trackpad is an app button, and below that the home button. Lastly, the volume
up and down buttons are on the right side of the controller.
Hardware:
 Pixel / Pixel XL
 Pixel 2/2 XL
 Moto Z
 ZenFone AR
 Mate 9 Pro
 Axon 7
 Galaxy S8/S8+
 Galaxy Note 8
Google VR SDK
Undoubtedly, the best tools for building VR apps for Google headsets are provided by
none other than Google. Its developer ecosystem is vast, with hundreds of
frameworks, tools, APIs, and SDKs. The Google VR SDK is not a single kit but many:
there are specific kits for Android and iOS, for the Unity and Unreal engines, and so on.
For smartphones and the Daydream headset, developers can use the Google VR SDK for
Android.
This VR SDK also acts as an NDK (native development kit), providing an API for native
code in C and C++. If you need the GitHub repository for a specific VR development
environment, check these out:
 Google VR SDK for Android
 Google VR SDK for iOS
 Google VR SDK for Unity
5.5. Google Cardboard / Google VR SDK
Google Cardboard is a folding cardboard viewer into which users place a
smartphone. It is universal, supporting a wide range of smartphone models.
Google Cardboard viewer includes:
 a piece of cardboard cut into a precise shape
 45 mm focal length lenses
 magnets or captive tape
 a hook and loop fastener (such as Velcro)
 a rubber band
 an optional near field communication (NFC) tag
Google VR SDK
This SDK for Android also supports Cardboard alongside Daydream View. In general,
Google's virtual reality SDKs offer all the tools needed to make VR apps for its platforms,
e.g. libraries, APIs, samples, and design guidelines. The hardware requirements for
Cardboard apps are affordable for pretty much everyone: a viewer and a smartphone.
With the Google VR SDK, developers can build virtual reality apps while spending less
time and effort on tasks such as:
 Correction of lens distortion (a small sketch of the idea follows this list)
 Audio
 Head tracking
 3D calibration
 Rendering
 Geometry configuration
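
As a rough illustration of the lens-distortion task above, the sketch below applies a simple radial pre-distortion to a normalized image point so that the viewer's lens cancels it out; the polynomial form is standard, but the coefficients are made-up values, since real SDKs derive them from each viewer's lens profile.

```python
# Minimal sketch: radial pre-distortion of a point before display.
# Coefficients k1 and k2 (and their signs) are illustrative; real viewers ship calibrated lens profiles.
def predistort(x, y, k1=0.22, k2=0.24):
    """Map a normalized image point so the headset lens cancels the distortion."""
    r2 = x * x + y * y                  # squared distance from the lens centre
    scale = 1 + k1 * r2 + k2 * r2 * r2  # radial distortion polynomial
    return x * scale, y * scale

print(predistort(0.5, 0.25))  # a point partway toward the edge of the view
```
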
VR platform comparison
To summarize all of the above, a feature comparison of the top virtual reality viewers can
help you determine what kind of VR app you might be able to develop, or what resources
might be required for your project.
Best game engines for VR
One of the most effective uses of VR so far has been in the gaming industry, so it is worth
naming the top engines for creating virtual reality games and apps. Unity has already been
mentioned a few times and is probably the best one to date.
1. Unity3D – a cross-platform game engine that is great for VR, as it supports the
Oculus Rift and all of the platforms mentioned above. It is very popular among
developers, offers a well-stocked asset store, and uses C# as its main scripting
language (earlier versions also supported UnityScript and Boo).
2. Unreal Engine – a game engine introduced back in 1998 that has since grown into
an efficient platform for building games, apps, and animations for VR headsets
and mobile devices. UE4 grants full access to the source code, comes with a
highly convenient visual scripting mode, and has outstanding compilation speed.
3. LibGDX – an open-source development framework written in Java. It provides a
unified API across platforms, from Windows and Linux to mobile operating systems
and web browsers. It enables fast iteration and prototyping, renders graphics via
OpenGL ES 2.0, and supports all popular audio formats.
4. AppGameKit VR – a game creation system for mobile devices that also works with
the Oculus Rift and HTC Vive. Commercial use is allowed without any obligation to
pay royalties, and the kit's commands allow quick creation of basic VR experiences.
Unlike the other engines listed here, which are free, AppGameKit costs about $20.
5. CryEngine – a royalty-free game engine with publicly available source code that
provides many features, some of them quite distinctive, e.g. fog rendering, cloud
shadows, weather effects, and color grading. It also offers a marketplace where
developers can find individual assets, 3D models, and sounds.
Role of XR in Industry

 Coupled with data from the internet of things, extended reality is changing the
way people work and driving industrial innovation
 In 2018, this could mean you're already a connected field worker. Earlier this
month, A1 Telekom Austria completed an eight-week pilot programme using the
Vuzix M300 Smart Glasses to enable maintenance experts to legally sign off on
procedures and to help on-site industrial technicians rectify defects found in
transmitter mast reviews.
 With “see-what-I-see” augmented reality (AR) software, remote assistance,
and the ability to capture images and video, the whole servicing process can take
less than 25 per cent of the time it would using traditional methods.
 XR is being used to build a bridge between human workers and robots
 This is just one way extended reality (XR) could power what Professor Klaus
Schwab, executive chairman of the World Economic Forum, and many others refer
to as the fourth industrial revolution. Smart sensors producing internet of
things (IoT) data and advances in robotics are being combined with emerging
technologies such as XR, which merges digital, physical, and human elements to
change industrial processes.
Widespread adoption of XR has led to a hike in industrial efficiency
 More important than any individual pilot announcement, though, are the
companies that jumped on board early, got a head start, and saw the results they
needed to move out of the trial phase into larger-scale deployment.
 For organizations that have graduated from small-scale pilots to widespread
adoption, the combination of XR and IoT is already changing the way people
work. The industry regarded as the most mature in its adoption of XR devices is
manufacturing, with a number of others, including architecture, construction,
logistics, and healthcare, following its lead.
 Take the Volkswagen Group. In 2016, it launched a series of virtual reality (VR)
training pilots for production and logistics staff using Vive VR hardware and the
enterprise software startup Innoactive. Since then, it has seen “significant
increases in efficiency and the effectiveness of our programmes”, says Dennis
Abmeier, Volkswagen Group's IT lead. Now it is on track to train 10,000
employees in XR by the end of 2018 across Volkswagen, Audi, Seat, Skoda, and
VW Utility Vehicles, via its digital reality hub.

XR invaluable for training and testing in industrial sector

 In addition, Volkswagen, along with BMW, another early adopter in the automotive
sector, is seeing results from increased virtual prototyping and production-line
training in XR. Bringing intelligent robots into workplaces is a core part of this
brave new industrial revolution, and XR is being used to build a bridge between
human workers and robots.
