Wearable Computing: A Review
Cliff Randell, University of Bristol
cliff@cs.bris.ac.uk
1. Introduction
The concept of wearable computing (wearables) emerged in the mid-1990s at a time
when carrying an ‘always-on’ computer combined with a head-mounted display and
control interface first became a practical possibility. In July 1996 a workshop ‘Wearables
in 2005’ was sponsored by the U.S. Defense Advanced Research Projects Agency. This
was attended by industrial, university and military visionaries to work on the common
theme of delivering computing to the individual. They defined wearable computing as
"data gathering and disseminating devices which enable the user to operate more
efficiently. These devices are carried or worn by the user during normal execution of
his/her tasks" (DARPA, 1996). One of the first advocates and adopters of this form of
computer usage, Steve Mann, further defined wearable computing and arrived at three
fundamental properties. Firstly, a wearable computer is worn, not carried, in such a way
that it can be regarded as part of the user; secondly, it is user controllable, not
necessarily involving conscious thought or effort; and lastly, it operates in real time - it is
always active (though it may have a sleep mode) and able to interact with the user at
any time (Mann, 1997/2).
It could be argued that other examples which met Mann and DARPA’s criteria included
the 1970s calculator wristwatches marketed by, for example, Pulsar and, later, Hewlett-
Packard. Dive computers, which first became common in the 1980s, are also worn, are
user controllable, and have sensors which operate in real time. In the art world Stelarc has
experimented with body sensors and actuators (Stelarc, 1997), and many artists have
developed unusual musical controllers. These wearable interfaces used sensor systems
affixed to the body or clothing of a performer measuring movement and/or body
functions such as heart rate or skin resistance. The interfaces connected to audio
equipment such as midi devices and sound synthesisers, sometimes also worn on the
body. Joe Paradiso at M.I.T. has taken a particular interest in these and, as well as
creating his own devices, provided a comprehensive overview of this field in an IEEE
Spectrum article 'New Ways to Play: Electronic Music Interfaces' (Paradiso, 1997).
Many other inventive backroom constructors also produced wearable systems, most
notably Mann with his WearCam and WearComp devices. Originally starting by building
a wearable 'photographer's assistant', he developed a series of wearables from the 1970s
to the present day featuring body mounted cameras and lighting, head mounted displays,
audio interfaces and many of the other features commonly associated with wearable
computing (Mann, 1997/2). His work with M.I.T. is outlined later in this chapter.
As the development of the wearable computer was originally inspired by the availability
of battery powered head mounted displays, it has been closely linked to this technology.
An overview of the challenges presented by these displays was summarised by Duchamp
in 1991. These were the hassle of the headgear, low resolution, eye fatigue, and the
requirement for dim lighting conditions (Duchamp et al, 1991). In addition battery life,
processor power and size, sensor availability and form, and availability of suitable
wireless communications added to the challenge of building viable wearable computers.
Fashion also plays its part: HMDs need to become acceptable as everyday wear without
arousing social antipathy. Significant technical improvements to these displays have
eased this situation; however, progress still needs to be made.
Initially the use of wearables aroused specific interest in three categories – industrial
manufacturing and distribution; the military; and academia. These interests were
primarily represented by Boeing, DARPA and the M.I.T. MediaLab, each of whom
envisaged applications in their fields. In this chapter we will be exploring the experiences
gained over the past ten years in each of these fields, concluding with a look at future
applications.
2. Industry
The quick and accurate availability of complex information to the worker in the field, or
in a non-office workplace, has been an objective of many organisations since the
establishment of computerised records in the 1950s. While this can be provided using
handheld devices, many workers use one or both hands while carrying out their tasks, and
also need to maintain eye contact with their piece of work. The wearable computer with a
hands free interface, e.g. speech, and a head-mounted display can provide a solution for
these workers.
Figure 7.1: Using the Boeing augmented reality wire bundling system.
The maintenance of complex machinery also provides a potential application field for
wearable computers. Maintenance manuals are often large, unwieldy documents which
can deteriorate rapidly with frequent use in workshop environments. The possibility of
using a head mounted display to overlay technical drawings and maintenance procedures
onto the actual equipment being maintained offers an attractive alternative. A wearable
computer can also be used to efficiently update the maintenance records for the
equipment while the procedures are being carried out. In addition the availability of video
clips illustrating procedures which are viewed while carrying out maintenance can assist
with training. A series of prototypes were developed at Carnegie Mellon University to
assess the issues associated with maintenance of airplanes, trains and tractors (Bass et al,
1997). These were principally the VuMan 3 and Navigator 2, both designed to help with
performing inspections by recording the identification of imperfect aircraft skin panels as
part of a job order process. By using checklists and forms on a wearable, a 50% reduction
was observed in the time to record inspection information, with the data entry to the
logistics computer being reduced from over three hours to two minutes. The C-130
system introduced a different emphasis by enabling the wearable to be used as a
collaborative device to support user training. This project identified design issues which
differentiated wearable and desktop computing, notably the user interface, and the
opportunity for a focussed design to provide a powerful yet simple tool for a limited
function. More recent maintenance innovations have included the Talking Assistant
(Schnelle et al, 2004) in which car maintenance is supported by a body-worn device
featuring location triggered audio icons, text-to-speech conversion and note recording.
Warehousing and inventory control using wearable computers has been proposed for
many years. This, like many industrial applications, requires that the wearable is
comfortable to wear for long periods - it should be lightweight, not generate heat, must
not get in the way, and have minimal cabling. Vocollect's Talkman wearable voice
computing terminal and integrated software suite provides a solution which has been
adopted by office equipment supplier Corporate Express Inc. (Vocollect). Following pilot
studies, Corporate Express has implemented the system in 22 distribution centres. The studies showed that,
compared to paper-based picking, the speech-based, wearable data collection system
boosted productivity by 50-60%, increased picking accuracy to 99.99%, reduced worker
training time and would deliver payback in less than a year.
The previous examples of wearable applications illustrate how a wearable computer can
assist with indoor tasks. However one of the main features of wearables is that they may
be able to operate anywhere. For this to be realistic the computer and its interfaces have
to be especially rugged. To assist technicians working in the field, Bell Canada selected
Xybernaut's Mobile Assistant to provide communications with the support infrastructure,
gain access to data and schematics, and to log progress while climbing utility poles and
descending into manholes (Xybernaut). Time savings of 50 minutes per day per
technician were recorded during a pilot study using this wearable. Similarly Xybernaut
have supplied wearables to support journalists in the field. Typically a television news
team consists of a cameraman, a sound operator and a reporter. By combining all the
technology into a wearable rig it is possible for journalists to cover a story on their own.
This can result in better stories and faster coverage. Wearable cameras, however, do not
provide the same picture quality as broadcast cameras, and the wearable computer can
itself become a diversion, creating a story of its own from the reactions of the general public.
Nevertheless they can generate material with remarkable immediacy as well as reducing
some of the costs associated with newsgathering.
3. Military
Quantum3D's Expedition wearable computer design (Quantum3D) provides immersive training for the armed services and
emergency response workers. As well as being able to reconstruct hazardous situations, it
is particularly suited to rehearsal of future missions. Squad level interaction based on a
distributed network of individual soldiers all equipped with the Expedition training
system is envisaged. With the ability to work within a correlated virtual world, squads
will be able to plan missions via the wearable interface, rehearse their course of action
prior to the actual training exercise, conduct virtual training exercises while engaging
intelligent computer generated forces, and review the action afterwards with unit scoring
and performance assessments. Figure 7.2 shows the complete Expedition system.
The health and well-being of service personnel also require special attention. The sensate
liner developed at Georgia Institute of Technology was designed specifically to monitor
the vital signs of combat casualties, as well as automatically detect and characterise a
wound in real time using bullet entry detection (Lind et al, 1997). Further health
monitoring applications are presented in the following section.
4. Medical and Health
The applications described previously have used position sensing technology to assist in a
variety of tasks. The knowledge of where the user is located clearly provides the basis for
many wearable designs. Wearables can also be designed to monitor well-being and
activity - the how and what of the user. This form of context sensing has been put to use
in wearable computers for medical and health applications and has met with more success
than in any other field. Body invasive devices, such as heart pacemakers, have become
commonplace. However as these devices are generally not user controllable they do not
fall into our definition of wearable computers. Wearables have the potential to monitor
health to assist with improving performance, e.g. in sports; with the prevention and detection of
illness through diagnosis; and even with treatment, though this usually involves some invasive
procedure. Examples of treatment by a wearable are insulin pump therapy for diabetics
(Doyle et al, 2004) and a brain implant to facilitate communication with speech-incapable
patients (Bakay and Kennedy, 1999).
Health monitoring applications were initially explored for military purposes with the
objective of remotely determining the physical status of troops in the field. The Personnel
Status Monitor was designed to predict when a soldier is either injured or fatigued using
a wide range of sensors, processing boards and a wristwatch display (Satava, 1997). A
simpler low-cost, lightweight, noninvasive, and adaptable system employed a single neck
mounted acoustic sensor to listen to the sounds of blood flow, respiration and the voice,
while minimising ambient sound (Siuru, 1997). The sensor can collect information
related to the function of the heart, lungs, and digestive tract or it can detect changes in
voice or sleep patterns, other activities, and mobility. Extensive testing with soldiers and
firefighters has demonstrated the effectiveness of this design to help understand the
interrelations between physiology, the task at hand, and the surrounding environment.
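As an illustration of the kind of signal processing an acoustic monitor of this sort might perform, the sketch below band-pass filters a microphone signal to emphasise low-frequency heart sounds while attenuating broadband ambient noise. The 20-150 Hz pass band, 2 kHz sample rate and function names are illustrative assumptions, not details of the design described by Siuru.

```python
# Illustrative sketch: isolate low-frequency heart sounds from a neck-mounted
# acoustic sensor by band-pass filtering. The 20-150 Hz band and 2 kHz sample
# rate are assumptions for demonstration, not parameters of the actual system.
import numpy as np
from scipy.signal import butter, filtfilt

FS = 2000                  # sample rate in Hz (assumed)
LOW, HIGH = 20.0, 150.0    # pass band in Hz (typical range for heart sounds)

def bandpass(signal, fs=FS, low=LOW, high=HIGH, order=4):
    """Zero-phase Butterworth band-pass filter."""
    nyquist = fs / 2.0
    b, a = butter(order, [low / nyquist, high / nyquist], btype="band")
    return filtfilt(b, a, signal)

if __name__ == "__main__":
    t = np.arange(0, 5, 1.0 / FS)
    # Synthetic input: a modulated 60 Hz "heart" component plus broadband noise.
    heart = np.sin(2 * np.pi * 60 * t) * (1 + np.sin(2 * np.pi * 1.2 * t))
    noise = 0.5 * np.random.randn(len(t))
    filtered = bandpass(heart + noise)
    print("input RMS: %.2f, filtered RMS: %.2f" %
          (np.sqrt(np.mean((heart + noise) ** 2)),
           np.sqrt(np.mean(filtered ** 2))))
```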
More recently health monitoring wearables have become commercially available in the
form of the Bodymedia product range (Bodymedia). This is based around an armband
design with sensors for detecting movement, heat flux, skin temperature, near-body
temperature, and galvanic skin response (see Figure 7.3). Data can be either viewed in
real time via a wireless link, or downloaded for analysis using the Internet. Meanwhile
academic research continues with health monitoring wearables such as the University of
Birmingham's Sensvest for monitoring sports activity (Knight et al, 2005); the
WEALTHY Wearable Health Care System which seeks to improve the comfort of
wearable systems by integrating sensors with the fabric of the user's clothes (Paradiso,
2004); and the GRID enabled system which can display live data, historical data, or
perform data mining developed by the University of Nottingham (Crowe et al, 2004).
Figure 7.3: The Bodymedia SenseWear armband.
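The sketch below suggests how readings from a multi-sensor armband of this kind might be represented and checked in real time. The sensor list follows the text (movement, heat flux, skin temperature, near-body temperature and galvanic skin response), but the field names, units and threshold values are assumptions for illustration and do not describe the SenseWear product itself.

```python
# Hypothetical sketch of a multi-sensor armband sample and a simple
# real-time check; field names, units and thresholds are assumptions only.
from dataclasses import dataclass
from typing import Iterable

@dataclass
class ArmbandSample:
    timestamp: float          # seconds since start of session
    accel_magnitude: float    # g, from the movement sensor
    heat_flux: float          # W/m^2
    skin_temp_c: float        # degrees Celsius
    near_body_temp_c: float   # degrees Celsius
    gsr_microsiemens: float   # galvanic skin response

def flag_unusual(samples: Iterable[ArmbandSample]):
    """Yield samples whose skin temperature falls outside a nominal range."""
    for s in samples:
        if not 30.0 <= s.skin_temp_c <= 38.0:   # illustrative bounds
            yield s

if __name__ == "__main__":
    stream = [
        ArmbandSample(0.0, 1.02, 55.0, 33.1, 31.5, 2.4),
        ArmbandSample(1.0, 1.10, 60.0, 39.2, 32.0, 2.6),  # out of range
    ]
    for sample in flag_unusual(stream):
        print("check reading at t=%.1fs: skin temp %.1f C"
              % (sample.timestamp, sample.skin_temp_c))
```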
Providing assistance for people with special needs has become an important role for
wearables. Many systems have been explored to provide the visually impaired with
guidance. Early examples of this were developed at the University of California, Santa
Barbara using GPS (Loomis, 1985). Evolving from a bulky backpack design, the current
system weighs only a few pounds and is worn in a pack slung over the shoulder. It uses an
electronic compass in conjunction with GPS, a spatial database of the UCSB campus
with GIS functionality, and a spatialised audio interface. Using this apparatus the visually
impaired can achieve improved access to the environment as well as having greater
independence of movement. A radically different approach was taken by the University
of Bristol (Campbell et al, 1995) in which real-time video from a body worn camera
produced images in which areas such as pavements were classified, identified and
presented to the visually impaired user as registered colour coded areas on a head
mounted display.
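A core computation in a GPS and compass based guidance system of the kind described above is turning the user's position fix, heading and a waypoint from the spatial database into a direction that can be rendered as a spatialised audio cue. The sketch below shows one way this might be done using the standard great-circle bearing formula; the coordinates, waypoint and function names are illustrative assumptions rather than details of the UCSB system.

```python
# Illustrative sketch: compute the bearing from the user's GPS position to a
# waypoint and express it relative to the compass heading, so that an audio
# cue can be panned left or right. Coordinates and names are hypothetical.
import math

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing from point 1 to point 2, in degrees."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    y = math.sin(dlon) * math.cos(phi2)
    x = (math.cos(phi1) * math.sin(phi2)
         - math.sin(phi1) * math.cos(phi2) * math.cos(dlon))
    return math.degrees(math.atan2(y, x)) % 360.0

def relative_bearing(user_lat, user_lon, heading_deg, wp_lat, wp_lon):
    """Angle of the waypoint relative to the direction the user is facing,
    in the range -180..180 degrees (negative = to the left)."""
    absolute = bearing_deg(user_lat, user_lon, wp_lat, wp_lon)
    return (absolute - heading_deg + 180.0) % 360.0 - 180.0

if __name__ == "__main__":
    # Hypothetical fix near a campus, facing due north, waypoint to the east.
    rel = relative_bearing(34.4140, -119.8489, 0.0, 34.4140, -119.8450)
    side = "right" if rel > 0 else "left"
    print("waypoint is %.0f degrees to the %s" % (abs(rel), side))
```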
The PARREHA project led by Oxford Computer Consultants (Greenlaw et al, 2002) is
directed at sufferers of Parkinson's Disease. The disease causes an inability to direct or
control movement, such as walking in a normal manner. The project assists sufferers to
walk normally by placing virtual visual cues as part of an augmented reality display. This
wearable design takes advantage of a little understood effect called kinesia paradoxa by
using the user's head mounted display to show brightly coloured stripes which scroll
towards the viewer as if they are walking down a tunnel.
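One minimal way to picture how such a display might be generated is to project a set of evenly spaced floor stripes into the image using a simple pinhole model and scroll them towards the viewer over time. The sketch below is purely illustrative; the focal length, eye height, stripe spacing and scroll speed are assumed parameters and are not taken from the PARREHA design.

```python
# Illustrative sketch: screen positions of floor stripes scrolling towards the
# viewer, using a pinhole projection (row below horizon = f * h / z).
# All parameters are assumptions for illustration only.
FOCAL_PX = 500.0       # focal length in pixels
EYE_HEIGHT = 1.6       # metres above the floor
STRIPE_SPACING = 0.5   # metres between stripes
SCROLL_SPEED = 1.0     # metres per second towards the viewer
NEAR, FAR = 0.5, 10.0  # visible depth range in metres

def stripe_rows(t):
    """Image rows (pixels below the horizon) of visible stripes at time t."""
    offset = (SCROLL_SPEED * t) % STRIPE_SPACING
    rows = []
    z = NEAR + (STRIPE_SPACING - offset)   # depth of the nearest stripe
    while z < FAR:
        rows.append(FOCAL_PX * EYE_HEIGHT / z)
        z += STRIPE_SPACING
    return rows

if __name__ == "__main__":
    for t in (0.0, 0.25, 0.5):
        print("t=%.2fs:" % t, ["%.0f" % r for r in stripe_rows(t)[:5]])
```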
Interaction and communication in this field can also be assisted by wearables: for
instance, the deaf can be supported by M.I.T.'s American Sign Language Recogniser
(Starner et al, 1998), and medics at accident scenes can use British Telecom's CamNet
system to forward images from body worn cameras to a hospital while talking to the
waiting doctors (Garner et al, 1997).
5. Personal Assistance
The Wearable Computing Project at the MIT Media Lab foresaw many of these agent-based
applications being used to help "smooth" the user's daily interactions (Starner et al,
1997). The Remembrance Agent in particular was designed to provide timely information
by searching for data associated with current location and activity; assisting with personal
organisation such as prompting the user when current, or future, activities might interfere
with each other; and building an expert database of knowledge personalised to the user
(Rhodes and Starner, 1996). Systems using physical context other than location have also
been developed. The DyPERS system presents information about museum exhibits, but,
instead of location, uses machine vision to detect what painting a wearer of the system is
currently viewing (Schiele et al, 1999). Camera-based applications were also explored in
which an environment could be augmented with personalised digital information: for
instance, using a wearable, a museum exhibition could be overlaid with virtual
information tailored to the user's interests (Mann, 1994). A team at Columbia University
carried out related work under the title of "Knowledge Based Augmented Reality"
(Feiner et al, 1993). In this project they explored overlaying graphical information onto
complex objects in a similar way to the industrial maintenance applications described
previously. The challenge identified here was how to design suitable content for the
envisaged tasks in order to most effectively communicate with the user.
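To make the idea of an agent continuously surfacing relevant material more concrete, the sketch below scores a small set of stored notes against the user's current context (a location tag plus a few words of recent text) by simple word overlap. The notes, location tags and scoring scheme are illustrative assumptions only and do not reproduce the actual Remembrance Agent implementation.

```python
# Hypothetical sketch of context-triggered retrieval in the spirit of a
# remembrance agent: each note carries free text and a location tag, and the
# note sharing the most words with the current context is suggested.
def tokens(text):
    return set(text.lower().split())

NOTES = [
    {"loc": "lab",     "text": "ask Joe about the sensor board revision"},
    {"loc": "cafe",    "text": "draft agenda for Monday project meeting"},
    {"loc": "lecture", "text": "follow up reading on augmented reality displays"},
]

def suggest(current_loc, recent_words):
    """Return the note that best matches the current location and text."""
    context = tokens(recent_words)
    def score(note):
        overlap = len(context & tokens(note["text"]))
        return overlap + (2 if note["loc"] == current_loc else 0)
    best = max(NOTES, key=score)
    return best if score(best) > 0 else None

if __name__ == "__main__":
    hit = suggest("lab", "revision of the sensor firmware")
    if hit:
        print("reminder:", hit["text"])
```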
Research into the use of wearables without head mounted displays, such as in the
previous section, has also produced a number of relevant applications. The University of
Bristol's Cyberjacket, originally developed to deliver location based multimedia
messages, was used to prototype a Tourist Guide application in which content is related
to the user's activity - with audio delivered when the user is active, and images when the
user is stationary (Randell and Muller, 2002). The same platform was employed to
investigate the future needs of the everyday shopper. Using a wearable with sensors to
determine proximity to retail outlets, and to further control a background data exchange, the
user's agent was able to browse the stock of nearby shops without entering the premises
(Randell and Muller, 2000).
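The activity rule behind the Tourist Guide described above can be captured very simply: choose the medium according to whether the wearer is moving. The sketch below shows one possible formulation; the speed threshold and content catalogue are assumptions for illustration rather than details of the Cyberjacket implementation.

```python
# Illustrative sketch of activity-dependent content selection: deliver audio
# while the wearer is walking and imagery when they are stationary. The speed
# threshold and content entries are assumptions for illustration only.
WALKING_THRESHOLD_MS = 0.5   # metres per second (assumed)

CONTENT = {
    "harbour": {"audio": "harbour_history.ogg", "image": "harbour_map.png"},
    "museum":  {"audio": "museum_intro.ogg",    "image": "floor_plan.png"},
}

def select_content(place, speed_ms):
    """Pick audio when moving, an image when stationary."""
    medium = "audio" if speed_ms > WALKING_THRESHOLD_MS else "image"
    item = CONTENT.get(place)
    return (medium, item[medium]) if item else None

if __name__ == "__main__":
    print(select_content("harbour", 1.2))   # walking -> audio
    print(select_content("harbour", 0.0))   # standing -> image
```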
While there are many forms of personal information devices now available to the
consumer, including mobile phones, PDAs and portable games consoles, the wearable
still provides the most advanced platform for personal applications. The wearable can go
beyond supporting the provision of digital information and multimedia to supporting the
wearer in the production of not only text but also audio, images and video in real
time (Mann, 1997/1).
6. The Future
While the widespread deployment of many of the applications already described will be
in the future, the issues regarding their feasibility are well understood. There are still
emerging applications where research has only begun recently. Two examples of this are
mobile games and fashionwear.
The challenges associated with mobile games, as with desktop games, are greater than
those of conventional mobile computer applications. Fast playability, realistic graphics and
intuitive user interfaces all require significant development for games to be practical on a
wearable. Nevertheless a team at the University of South Australia (UniSA) have
developed a wearable version of the popular Quake game. Using a six degrees of freedom
GPS/compass tracking system and a 3D model of the University campus, they are able to
overlay the ARQuake monsters on their normal vision using a head mounted display.
The player is able to 'shoot' the monsters using a single button handheld device (see
Figure 7.4). Though the research was originally addressing issues of tracking and
rendering, it also explored user interaction (Thomas et al, 2000). Similarly a team from
the Mixed Reality Lab at the National University of Singapore have created an outdoor
version of Pacman with real players represented by 'pacmen' and 'ghosts' in the virtual
world. Again using head mounted displays to view the virtual world, this game explored
immersion and interaction with the real world. As it is a multi-player game, social
interaction while playing was observed, and tangible artefacts, or 'ingredients', were
introduced, enabling the players also to interact with real objects with virtual properties
(Cheok et al, 2003).
Figure 7.4: An ARQuake monster in the real world.
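Registering a virtual monster against the real world in a game of this kind comes down to knowing where the creature sits relative to the player's position and heading, and whether it falls within the display's field of view. The sketch below illustrates a flat-ground approximation of that step; the local east/north coordinates, 60 degree field of view and 640 pixel display width are assumptions and do not describe the actual ARQuake tracker.

```python
# Illustrative sketch: place a world-anchored monster on the display given the
# player's position and compass heading, using a flat local east/north frame.
# The coordinates, field of view and screen width are assumptions only.
import math

FOV_DEG = 60.0
SCREEN_WIDTH_PX = 640

def screen_x(player_xy, heading_deg, monster_xy):
    """Horizontal pixel position of the monster, or None if out of view."""
    dx = monster_xy[0] - player_xy[0]   # east offset in metres
    dy = monster_xy[1] - player_xy[1]   # north offset in metres
    bearing = math.degrees(math.atan2(dx, dy)) % 360.0
    rel = (bearing - heading_deg + 180.0) % 360.0 - 180.0
    if abs(rel) > FOV_DEG / 2.0:
        return None
    # Map -FOV/2..+FOV/2 onto 0..SCREEN_WIDTH_PX.
    return (rel / FOV_DEG + 0.5) * SCREEN_WIDTH_PX

if __name__ == "__main__":
    x = screen_x(player_xy=(0.0, 0.0), heading_deg=0.0, monster_xy=(5.0, 20.0))
    if x is None:
        print("monster outside the field of view")
    else:
        print("monster drawn at x = %.0f px" % x)
```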
The connection between wearable computing and fashion was inevitable, and
CyberFashion shows have taken place regularly since the 1990s, including an annual event
at the SIGGRAPH graphics and interactive techniques conference (SIGGRAPH).
While many of the exhibits are conceptual, or do not have the traditional qualities of
fashion garments, technology is advancing to the point where sensors, computers and
displays can be integrated into garments in practical and aesthetically pleasing ways. One
of the first products to reach the marketplace was the Philips/Levi Strauss ICD+ jacket,
which incorporated an MP3 player and a mobile phone, and this has led other
manufacturers to incorporate device controls and interfaces into their jackets. Full integration
of a wearable with a fashion garment with expressive and aesthetic potential is in its early
stages. Elise Co explored computational fashion in her MIT thesis with creations
featuring bio- and movement sensors controlling displays in the garments' structure (Co,
2000). There is undoubted appeal for such fashion items, and products are starting to
become available commercially (Enlighted).
7. Conclusion
As with the desktop computer, there are many diverse applications for wearable
computers. In this chapter applications for industrial manufacturing and distribution,
military use, medical and health, personal use and emerging future designs have all been
described. Prototypes of these applications have all been constructed and some of these
have become commercially available. Though the wearable has not seen the widespread
acceptance given to the desktop, work continues in this field to meet the challenges that
inhibit its growth. User interfaces need to become more intuitive and easy to learn and
use; processors and sensors need to be effectively integrated into textiles; and displays,
whether head or body mounted, need to be effective under a wide range of lighting
conditions - all of these must be comfortable to wear and unobtrusive. Only when these
challenges have been successfully addressed will wearable computing become
ubiquitous.
References
Bass, L., Kasabach, C., Martin, R., Siewiorek, D., Smailagic, A. and Stivoric, J., The
Design of a Wearable Computer, Proceedings of CHI97, (1997) pp 139-146.
Bass, T.A., The Eudaemonic Pie, Houghton Mifflin Company, ISBN 0-595-14236-2, (1985).
Campbell, N.W., Mackeown, W.P., Thomas, B.T. and Troscianko, T., Automatic
Interpretation of Outdoor Scenes, British Machine Vision Conference, September (1995).
Cheok, A.D., Fong, S.W., Goh, K.H., Yang, X., Liu, W. and Farbiz, F., Human Pacman:
A Mobile Entertainment System with Ubiquitous Computing and Tangible Interaction
over a Wide Outdoor Area, Fifth International Symposium on Human Computer
Interaction with Mobile Devices and Services, (2003).
Co, E.D., Computation and Technology as Expressive Elements of Fashion, Thesis
submitted to the Program in Media Arts and Sciences, School of Architecture and
Planning, Massachusetts Institute of Technology, June (2000).
Crowe, J., Hayes-Gill, B., Sumner, M., Barratt, C., Palethorpe, B., Greenhalgh, C., Storz,
O., Friday, A., Humble, J., Setchell, C., Randell, C. and Muller, H., Modular sensor
architecture for unobtrusive routine clinical diagnosis. In: International Workshop on
Smart Appliances and Wearable Computing, (2004).
Doyle (Boland) E.A., Weinzimer, S.A., Steffen, A.T., Ahern, J.H., Vincent, M. and
Tamborlane, W.V., A Randomized, Prospective Trial Comparing the Efficacy of
Continuous Subcutaneous Insulin Infusion with Multiple Daily Injection Using Insulin
Glargine. Diabetes Care, 27, (2004), pp 1554-8.
Duchamp, D., Feiner, S.K. and Maguire Jr., G.Q., Software Technology for Wireless
Mobile Computing, IEEE Network Magazine, (1991), pp 12-18.
Feiner, S., MacIntyre, B. and Seligmann, D., Knowledge based augmented reality,
Communications of the ACM, 36(7), (1993), pp 53-62.
Garner, P., Collins, M., Webster, S.M. and Rose, D.A.D., The application of telepresence
in medicine, BT Technology Journal, 15(4), (1997), pp 181-187.
Greenlaw, R., Wessel, I.D., Katevas, N., Andritsos, F., Memos, D., Prentza, A. and
Delprato, U., PARREHA – Assistive Technology for Parkinson’s Rehabilitation, 1st
Cambridge Workshop on Universal Access and Assistive Technology (2002).
Knight, F., Schwirtz, A., Psomadelis, F., Baber, C., Bristow, W. and Arvanitis, N., The
design of the SensVest, Personal and Ubiquitous Computing, 9(1), (2005), pp 6-19.
Lind, E.J., Jayaraman, S., Rajamanickam, R., Eisler, R. and McKee, T., A sensate liner
for personnel monitoring applications, First International Symposium on Wearable
Computers, (1997), pp 98-105.
Loomis, J.M., Digital map and navigation system for the visually impaired. Unpublished
paper, Department of Psychology, University of California, Santa Barbara, (1985).
Mann, S., Mediated reality, Technical Report 260, MIT Media Lab, Perceptual
Computing Group, (1994).
Mann, S., Wearable Computing: A First Step Toward Personal Imaging, Computer,
30(2), (1997/1).
Mann, S., An historical account of the 'WearComp' and 'WearCam' inventions developed
for applications in 'Personal Imaging', First International Symposium on Wearable
Computers, (1997/2).
Paradiso, J., New ways to play: electronic music interfaces, IEEE Spectrum, December
(1997).
Paradiso, R., Loriga, G. and Taccini, N., Wearable health care system for vital signs
monitoring, Mediterranean Conference on Medical and Biological Engineering, (2004).
Quantum3D Incorporated, 6330 San Ignacio Avenue, San Jose, CA 95119, USA.,
Quantum3D product literature, www.quantum3d.com
Randell, C. and Muller, H., The shopping jacket: wearable computing for the consumer,
Personal Technologies 4(4), (2000), pp 241-244.
Randell, C. and Muller, H., The well mannered wearable computer, Personal and
Ubiquitous Computing, 6(1), (2002) pp 31-36.
Satava, R.M., Virtual Reality and Telepresence for Military Medicine. In: ANNALS
Academy of Medicine, Singapore 26(1), (1997), pp 118-120.
Schnelle, D., Aitenbichler, E., Kangasharju, J. and Mühlhäuser, M., Talking Assistant -
Car Repair Shop Demo. Proceedings of the Sixth International Conference on Ubiquitous
Computing. (2004).
Schiele, B., Oliver, N., Jebara, T. and Pentland, A. DyPERS: Dynamic Personal
Enhanced Reality System. Proceedings of the International Conference on Vision
Systems, (1999).
Siuru, B., Applying acoustic monitoring to medical diagnostics, Sensors, March (1997),
pp 51-52.
Starner, T., Mann, S., Rhodes, B., Levine, J., Healey, J., Kirsch, D., Picard, R. and
Pentland, A., Augmented reality through wearable computing, Presence: Teleoperators
and Virtual Environments, 6(4), (1997), pp 384-398.
Starner, T., Weaver, J. and Pentland, A., Real-time American Sign Language recognition
using desk and wearable computer based video, IEEE Transactions on Pattern Analysis
and Machine Intelligence, 20(12), (1998), pp 1371-1375.
Stelarc, From Psycho to Cyber Strategies: Prosthetics, Robotics and Remote Existence,
Cultural Values. 1(2), (1997), pp 241-9.
Thomas, B., Close, B., Donoghue, J., Squires, J., De Bondi, P., Morris, M. and Piekarski,
W., ARQuake: An Outdoor/Indoor Augmented Reality First Person Application. Fourth
International Symposium on Wearable Computers, (2000), pp 139-146.
Thorp, E.O., The invention of the first wearable computer, Second International
Symposium on Wearable Computers, (1998), pp 4-8.
Vocollect Incorporated, 703 Rodi Road, Pittsburgh, PA 15235, USA, Vocollect product
literature, www.vocollect.com.
Xybernaut Corporation, 12701 Fair Lakes Circle, Suite 550, Fairfax, Virginia 22033,
USA, Xybernaut product literature, www.xybernaut.com