Detecting Falls With Wearable Sensors Using Machine Learning Techniques
doi:10.3390/s140610691
OPEN ACCESS
sensors
ISSN 1424-8220
www.mdpi.com/journal/sensors
Article
Abstract: Falls are a serious public health problem and possibly life threatening for
people in fall risk groups. We develop an automated fall detection system with wearable
motion sensor units fitted to the subjects' bodies at six different positions. Each unit
comprises three tri-axial devices (accelerometer, gyroscope, and magnetometer/compass).
Fourteen volunteers perform a standardized set of movements including 20 voluntary falls
and 16 activities of daily living (ADLs), resulting in a large dataset with 2520 trials. To
reduce the computational complexity of training and testing the classifiers, we focus on the
raw data for each sensor in a 4 s time window around the point of peak total acceleration of
the waist sensor, and then perform feature extraction and reduction. Most earlier studies on
fall detection employ rule-based approaches that rely on simple thresholding of the sensor
outputs. We successfully distinguish falls from ADLs using six machine learning techniques
(classifiers): the k-nearest neighbor (k-NN) classifier, least squares method (LSM), support
vector machines (SVM), Bayesian decision making (BDM), dynamic time warping (DTW),
and artificial neural networks (ANNs). We compare the performance and the computational
complexity of the classifiers and achieve the best results with the k-NN classifier and
LSM, with sensitivity, specificity, and accuracy all above 99%. These classifiers also have
acceptable computational requirements for training and testing. Our approach would be
applicable in real-world scenarios where data records of indeterminate length, containing
multiple activities in sequence, are recorded.
Keywords: fall detection; activities of daily living; wearable motion sensors; machine
learning; pattern classification; feature extraction and reduction
1. Introduction
With the world's aging population, health-enabling technologies and ambulatory monitoring of
the elderly have become a prominent area of multi-disciplinary research [1,2]. Rapidly developing
technology has made mobile and wireless devices part of daily life. An important aspect of context-aware
systems is recognizing, interpreting, and monitoring the basic activities of daily living (ADLs) such as
standing, sitting, lying down, walking, ascending/descending stairs, and most importantly, emergent
events such as falls. If a sudden change in the center of mass of the human body results in a loss
of balance, the person falls. The World Health Organization defines falls as involuntary, unexpected,
and uncontrollable events resulting in a person impacting and coming to rest on the ground or at
a lower level [3].
Falls need to be considered within the same framework as ADLs since they typically occur
unexpectedly while performing daily activities. Falls are a public health problem and a health threat,
especially for adults of age 65 and older [4]. Statistics indicate that one in every three adults of age
65 or older experiences at least one fall every year. Besides the elderly, children, disabled individuals,
workers, athletes, and patients with visual, balance, gait, orthopedic, neurological, and psychological
disorders also suffer from falls. The intrinsic factors associated with falls are aging, mental impairment,
neurological and orthopedic diseases, and vision and balance disorders. The extrinsic factors are multiple
drug usage, slippery floors, poor lighting, loose carpets, the lack of handrails near bathtubs and toilets,
electric or power cords, and clutter and obstacles on stairways [5]. Although some of the extrinsic risk factors can be
eliminated by taking necessary precautions, intrinsic factors are not readily eliminated and falls cannot
be completely prevented. Since the consequences of falls can be serious and costly, falls should be
detected reliably and promptly to reduce the occurrence of related injuries and the costs of healthcare.
Accurate, reliable, and robust fall detection algorithms that work in real time are essential.
Monitoring people in fall risk groups should occur without intruding on their privacy, restricting
their independence, or degrading their quality of life. User-activated fall detection systems have little
practical use. Fall detection systems need to be completely automated and may rely on
multiple sources of sensory information for improved robustness. A commonly used approach is to
fix various sensors to the environment, such as cameras, acoustic, pressure, vibration, force, infrared
sensors, lasers, Radio Frequency Identification (RFID) tags, inertial sensors and magnetometers [6,7].
Smart environments can be designed through the use of one or more of these sensors in a complementary
fashion, usually with high installation cost [8]. Other people or pets moving around may easily confuse
such systems and cause false alarms. The main advantage of this approach is that the person at risk
does not have to wear or carry any sensors or devices on his body. This approach may be acceptable
when the activities of the person are confined to certain parts of a building. However, when the activities
performed take place both indoors and outdoors and involve going from one place to another (e.g., riding
a vehicle, going shopping, commuting, etc.), this approach becomes unsuitable. It imposes restrictions
on the mobility of the person since the system operates only in the limited environment monitored by the
sensors that are fixed to the environment.
Although most earlier studies followed the above approach for monitoring people in fall risk
groups, wearable motion sensors have several advantages. The 1-D signals acquired from the multiple
axes of motion sensors are much simpler to process and can directly provide the required 3-D motion
information. Unlike visual motion-capture systems that require a free line of sight, inertial sensors can
be flexibly used inside or behind objects without occlusion. Because they are light, comfortable, and
easy to carry, wearable sensors do not restrict people to a studio-like environment and can operate both
indoors and outdoors, allowing free pursuit of activities. The required infrastructure and associated
costs of wearable sensors are much lower than smart environments and they do not intrude on privacy.
Unlike acoustic sensors, they are not affected by the ambient noise. Wearable sensors are thus suitable
for developing automated fall detection systems. In this study, we follow this approach for robust and
accurate detection and classification of falls that occur while performing ADLs.
Fall detection is surveyed in [9,10]. Earlier work is fragmented, of limited scope, and not very
systematic. The lack of common ground among researchers makes results published so far difficult
to compare, synthesize, and build upon in a manner that allows broad conclusions to be reached. Sensor
configuration and modality, subject number and characteristics, considered fall types and activities,
feature extraction, and processing of the acquired signals differ in individual studies [11–14]. Although
most studies have investigated voluntary (simulated) falls, a limited number of involuntary falls have
been recorded in recent studies [15–17]. The latter is a very difficult and time-consuming task [16].
The small number of recorded real-world falls are usually from rare disease populations that cannot be
generalized to fall risk groups at large.
Machine learning techniques have been used to distinguish six activities, including falls, using an
infrared motion capture system [18]. Studies that use support vector machines are reported in [19,20].
In the latter study, a computer vision based fall recognition system is proposed that combines depth
map with normal RGB color information. Better results are achieved with this combination as the depth
map reduces the errors and provides more information about the scene. Falls are then recognized and
distinguished from ADLs using support vector machines, with accuracy above 95%.
To achieve robust and reliable fall detection and enable comparing different studies, open datasets
acquired through standardized experimental procedures are necessary. We found only three works that
provide guidelines for fall experiments [21–23] and only one that pursues them [8]. In [23], it is stated
that there is no open database for falls and the desirable structure and characteristics of a fall database
are described.
Although some commercial devices and patents on fall detection exist, these devices are not
satisfactory [22]. The main reasons are the high false alarm rates, high initial and maintenance
costs of the devices, and their non-ergonomic nature. Wearable fall detection systems are criticized
mainly because people may forget, neglect, or not want to wear them. If they are battery operated,
batteries will have to be replaced or recharged from time to time. However, with the advances
of Micro-Electro-Mechanical Systems (MEMS) technology, these devices have recently become
much smaller, more compact, and less expensive. They can be easily integrated into other available
alarm systems in the vicinity or to the accessories that the person carries. The lightness, low power
consumption, and wireless use of these devices have eliminated the concerns related to their portability
and discomfort. Furthermore, smartphones that usually contain embedded accelerometers are suitable
devices for executing fall detection algorithms [24–26].
Through wearable sensors and machine learning techniques, this study aims to robustly and accurately
detect falls that occur while performing ADLs. Instead of using simple rule-based algorithms that rely
on thresholding the sensory output (as in most earlier works), we employ features of the recorded
signals around the point of peak acceleration. To acquire a sufficient amount of data for
algorithm development according to the guidelines provided in [23], we limit our study to voluntary
(simulated) falls.
The rest of this article is organized as follows: in Section 2, we describe data acquisition and
briefly overview the six machine learning techniques. In Section 3, we compare the performance and
the computational requirements of the techniques based on experiments on the same dataset. We discuss
the results in Section 4, and draw conclusions and indicate directions for future research in Section 5.
2. Material and Methods
2.1. Data Acquisition
We used the six MTw sensor units that are part of the MTw Software Development Kit manufactured
by Xsens Technologies [27]. Each unit comprises three tri-axial devices (accelerometer, gyroscope,
and magnetometer/compass) with respective ranges of ±120 m/s², ±1200 °/s, and ±1.5 Gauss, and
an atmospheric pressure meter with a 300–1100 hPa operating range, which we did not use. We calibrated
the sensors before each volunteer began the experiments and captured and recorded raw motion data
with a sampling frequency of 25 Hz. Acceleration, rate of turn, and the strength of the Earth's
magnetic field along three perpendicular axes (x, y, z) were recorded for each unit. Measurements were
transmitted over an RF connection (ZigBee) to an Xsens Awinda Station connected to a remote PC with a
USB interface.
2.2. Experimental Procedure
We followed the guidelines provided in [23] for designing fall experiments. With Erciyes University
Ethics Committee approval, seven male (24 ± 3 years old, 67.5 ± 13.5 kg, 172 ± 12 cm) and seven
female (21.5 ± 2.5 years old, 58.5 ± 11.5 kg, 169.5 ± 12.5 cm) healthy volunteers participated in the
study with informed written consent. We performed the tests at Erciyes University Clinical Research
and Technology Center. We fitted the six wireless sensor units tightly with special straps to each subject's
head, chest, waist, right wrist, right thigh, and right ankle (Figure 1). Unlike cabled systems, wireless
data acquisition allows users to perform motions more naturally. Volunteers wore a helmet, wrist guards,
knee and elbow pads, and performed the activities on a soft crash mat to prevent injuries, with each trial
lasting about 15 s on average.
Figure 1. (a–c) The configuration of the six MTw units on a volunteer's body; (d) single
MTw unit, encasing three tri-axial devices (accelerometer, gyroscope, and magnetometer)
and an atmospheric pressure sensor; (e) the three perpendicular axes of a single MTw unit;
(f) remote computer, Awinda Station and the six MTw units.
A set of trials consists of 20 fall actions and 16 ADLs (Table 1) adopted from [23]; the 14 volunteers
repeated each set five times. We thus acquired a considerably diverse dataset comprising 1400 falls
(20 tasks × 14 volunteers × 5 trials) and 1120 ADLs (16 tasks × 14 volunteers × 5 trials), resulting
in 2520 trials. Many of the non-fall actions included in our dataset are high-impact events that may
be easily confused with falls. Such a large dataset is useful for testing/validating fall detection and
classification algorithms.
2.3. Feature Selection and Reduction
Earlier studies on fall detection mostly use simple thresholding of the sensory outputs (e.g.,
accelerations, rotational rates) because of its simplicity and low processing time. This approach is not
sufficiently robust or reliable because there are different fall types and their nature shows variations for
each individual. Furthermore, certain ADLs can be easily confused with falls. For improved robustness,
we consider additional features of the recorded signals. The total acceleration of the waist accelerometer
is given by:
A_T = (A_x² + A_y² + A_z²)^(1/2)          (1)
where Ax , Ay , and Az are the accelerations along the x, y, and z axes, respectively. We first identify
the time index corresponding to the peak AT value of the waist accelerometer in each record. Then,
we take the two-second intervals (25 Hz × 2 s = 50 samples) before and after this point, corresponding
to a time window of 101 samples (50 + peak index + 50), and ignore the rest of the record. Data from
the remaining axes of each sensor unit are also reduced in the same way, considering the time index
obtained from the waist sensor as reference, resulting in six 101 × 9 arrays of data. Each column of data
is represented by an N × 1 vector s = [s_1, s_2, . . . , s_N]^T, where N = 101. Extracted features consist of
the minimum, maximum, and mean values, as well as variance, skewness, kurtosis, the first 11 values
of the autocorrelation sequence, and the first five peaks of the discrete Fourier transform (DFT) of the
signal with the corresponding frequencies:
mean(s) = μ = (1/N) Σ_{n=1}^{N} s_n

variance(s) = σ² = (1/N) Σ_{n=1}^{N} (s_n − μ)²

skewness(s) = (1/(Nσ³)) Σ_{n=1}^{N} (s_n − μ)³

kurtosis(s) = (1/(Nσ⁴)) Σ_{n=1}^{N} (s_n − μ)⁴

autocorrelation(s, Δ) = (1/N) Σ_{n=1}^{N−Δ} (s_n − μ)(s_{n+Δ} − μ),   Δ = 0, 1, . . . , N − 1

DFT_q(s) = Σ_{n=1}^{N} s_n e^{−j2πq(n−1)/N},   q = 0, 1, . . . , N − 1          (2)
Here, DFT_q(s) is the qth element of the 1-D N-point DFT. We performed feature extraction for the
15,120 records (36 motions × 14 volunteers × 5 trials × 6 sensors). The first five features extracted
from each axis of a sensor unit are the minimum, maximum, mean, skewness, and kurtosis values.
Because each unit contains nine axes, 45 features were obtained (9 axes × 5 values). Autocorrelation
produces 99 features (9 axes × 11 features). DFT produces 5 frequency and 5 amplitude values, resulting
in a total of 90 features (9 axes × 10 values). Thus, 234 features are extracted from each sensor unit in
total (45 + 99 + 90), resulting in a feature vector of dimension 1404 × 1 (= 234 features × 6 sensors) for
each trial.
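As a concrete illustration of the windowing and feature-extraction steps described above, the following NumPy sketch computes the per-axis features for one simulated sensor-unit record (function names and the random data are ours; taking the five largest DFT magnitudes is a simplified stand-in for the five spectral peaks):

```python
import numpy as np

def extract_window(unit_data, peak_idx, half=50):
    # 50 samples before and after the peak index: 101 samples in total
    return unit_data[peak_idx - half:peak_idx + half + 1]

def axis_features(s):
    # Per-axis features of Equation (2): min, max, mean, skewness, kurtosis,
    # first 11 autocorrelation values, and five DFT magnitudes with their bins
    N = len(s)
    mu, sigma = s.mean(), s.std()
    d = s - mu
    skew = (d ** 3).mean() / sigma ** 3
    kurt = (d ** 4).mean() / sigma ** 4
    autoc = np.array([(d[:N - lag] * d[lag:]).sum() / N for lag in range(11)])
    mag = np.abs(np.fft.fft(s))
    bins = np.argsort(mag)[::-1][:5]     # five largest magnitudes (simplified)
    return np.concatenate(([s.min(), s.max(), mu, skew, kurt],
                           autoc, mag[bins], bins))

# One simulated ~15 s record at 25 Hz (375 samples) for a nine-axis unit
rng = np.random.default_rng(0)
window = extract_window(rng.standard_normal((375, 9)), peak_idx=187)
unit_vector = np.concatenate([axis_features(window[:, ax]) for ax in range(9)])
# 26 features per axis x 9 axes = 234 features per unit, as in the text
```

Concatenating the per-unit vectors of all six units would then give the 1404-dimensional feature vector described above.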
Table 1. Fall and non-fall actions (ADLs) considered in this study.

Fall Actions
Label  Description
1      front-lying
2      front-protecting-lying
3      front-knees
4      front-knees-lying
5      front-right
6      front-left
7      front-quick-recovery
8      front-slow-recovery
9      back-sitting
10     back-lying
11     back-right
12     back-left
13     right-sideway
14     right-recovery
15     left-sideway
16     left-recovery
17     syncope
18     syncope-wall
19     podium
20     rolling-out-bed

Non-Fall Actions (ADLs)
Label  Description
21     lying-bed
22     rising-bed
23     sit-bed
24     sit-chair
25     sit-sofa
26     sit-air
27     walking-fw
28     jogging
29     walking-bw
30     bending
31     bending-pick-up
32     stumble
33     limp
34     squatting-down
35     trip-over
36     coughing-sneezing
Because the initial set of features was quite large (1404) and not all features were equally useful
in discriminating between the falls and ADLs, to reduce the computational complexity of training and
testing the classifiers, we reduced the number of features from 1404 to M = 30 through principal
component analysis (PCA) [28] and normalized the resulting features between 0 and 1. PCA is a
transformation that finds the optimal linear combinations of the features, in the sense that they represent
the data with the highest variance in a feature subspace, without taking the intra-class and inter-class
variances into consideration separately. The reduced dimension of the feature vectors is determined by
observing the eigenvalues of the covariance matrix of the 1404 × 1 feature vectors, sorted in Figure 2a
in descending order. The largest 30 eigenvalues constitute 72.38% of the total variance of the principal
components and account for much of the variability of the data. The 30 eigenvectors corresponding to
the largest 30 eigenvalues (Figure 2b) are used to form the transformation matrix, resulting in 30 × 1
feature vectors.
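The PCA step can be sketched as follows (a minimal NumPy version under our own naming; a 60-dimensional stand-in replaces the 1404-dimensional vectors to keep the example small):

```python
import numpy as np

def pca_reduce(X, M=30):
    # Eigendecomposition of the covariance matrix of the feature vectors;
    # the M eigenvectors with the largest eigenvalues form the transformation
    Xc = X - X.mean(axis=0)
    eigvals, eigvecs = np.linalg.eigh(np.cov(Xc, rowvar=False))
    W = eigvecs[:, ::-1][:, :M]          # eigh returns ascending order
    Z = Xc @ W
    # Normalize each reduced feature to the [0, 1] range, as in the text
    return (Z - Z.min(axis=0)) / (Z.max(axis=0) - Z.min(axis=0)), W

rng = np.random.default_rng(1)
X = rng.standard_normal((2520, 60))      # stand-in for the 2520 x 1404 matrix
Z, W = pca_reduce(X)                     # Z: 2520 x 30 reduced features
```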
Figure 2. (a) All eigenvalues (1404) and (b) the first 50 eigenvalues of the covariance matrix
sorted in descending order.
The performance metrics are the sensitivity (Se), specificity (Sp), and accuracy (Acc) of each classifier:

Se = TP / (TP + FN) × 100          (3)

Sp = TN / (TN + FP) × 100          (4)

Acc = (TP + TN) / (TP + FN + FP + TN) × 100          (5)
Here, TP (a fall occurs; the algorithm detects it), TN (a fall does not occur; the algorithm does
not detect a fall), FP (a fall does not occur but the algorithm reports a fall), and FN (a fall occurs
but the algorithm misses it) are the numbers of true positives and negatives, and false positives and
negatives, respectively. Obviously, there is an inverse relationship between sensitivity and specificity.
For instance, in an algorithm that employs simple thresholding, as the threshold level is decreased, the
rate of FN decreases and the sensitivity of the algorithm increases. On the other hand, FP rate increases
and specificity decreases. As the threshold level is increased, the opposite happens: sensitivity decreases
and specificity increases. Based on these definitions, FP and FN ratios can be obtained as:
FP ratio = 1 − Sp
FN ratio = 1 − Se
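Equations (3)–(5) and the FP/FN ratios can be evaluated directly; applied to the average k-NN confusion matrix of Table 2, this sketch (function name ours) reproduces the reported 100%, 99.79%, and 99.91% figures:

```python
def detection_rates(TP, TN, FP, FN):
    # Sensitivity, specificity, and accuracy in percent (Equations (3)-(5))
    Se = 100.0 * TP / (TP + FN)
    Sp = 100.0 * TN / (TN + FP)
    Acc = 100.0 * (TP + TN) / (TP + FN + FP + TN)
    return Se, Sp, Acc

# Average k-NN confusion matrix over the 10 rounds (Table 2)
Se, Sp, Acc = detection_rates(TP=1400, TN=1117.7, FP=2.3, FN=0)
fp_ratio = 1 - Sp / 100              # FP ratio = 1 - Sp
fn_ratio = 1 - Se / 100              # FN ratio = 1 - Se
```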
In this study, we consider falls with ADLs because falls typically occur unexpectedly while
performing daily activities. An ideal fall detection system should especially be able to distinguish
between falls and ADLs that can cause high acceleration of body parts (e.g., jumping, sitting down
suddenly). The algorithms must be sufficiently robust, intelligent, and sensitive to minimize FPs and
FNs. False alarms (FPs) caused by misclassified ADLs, although a nuisance, can be canceled by the
user. However, it is crucial not to misclassify falls as some other activity. FNs, which indicate missed
falls, must be avoided by all means, since user manipulation may not be possible if a fall results in
physical and/or mental impairment. For example, long periods of inactivity (such as those that may
occur after a fall) may be confused with the state of sleeping or resting.
We distinguish falls from ADLs with six machine learning techniques and compare their
performances based on their sensitivity, specificity, accuracy, and computational complexity. In training
and testing, we randomly split the dataset into p = 10 equal partitions and employ p-fold cross validation.
We use p − 1 partitions for training and reserve the remaining partition for testing (validation). When
this is repeated for each partition, training and validation partitions cross over in p successive rounds and
each record in the dataset gets a chance of validation.
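The p-fold procedure can be sketched as follows (our own minimal implementation; with 2520 trials and p = 10, each validation partition holds the 252 feature vectors mentioned with Table 2):

```python
import numpy as np

def p_fold_splits(n, p=10, seed=0):
    # Randomly partition n trial indices into p equal parts; each part is
    # used once for validation while the other p - 1 parts train the model
    idx = np.random.default_rng(seed).permutation(n)
    folds = np.array_split(idx, p)
    return [(np.concatenate(folds[:i] + folds[i + 1:]), folds[i])
            for i in range(p)]

splits = p_fold_splits(2520)         # ten (train, validation) index pairs
```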
2.4.1. The k-Nearest Neighbor Classifier (k-NN)
The k-NN method classifies a given object based on the closest training object(s) [28]. Class decision
is made by majority voting from among a chosen number of nearest neighbors k, where k > 0. There
is no standard value for k because the k-NN algorithm is sensitive to the local data structure. Smaller k
values increase the variance and make the results less stable, whereas larger k values increase the bias
but reduce the sensitivity. Therefore, the proper choice of k depends on the particular dataset. In this
work, we determined the value of k experimentally as k = 7, based on our dataset.
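A minimal sketch of such a classifier, assuming Euclidean distance and majority voting (an illustration, not the authors' implementation):

```python
import numpy as np

def knn_predict(X_train, y_train, x, k=7):
    # Majority vote among the k training vectors closest to x
    dists = np.linalg.norm(X_train - x, axis=1)
    votes = y_train[np.argsort(dists)[:k]]
    return np.bincount(votes).argmax()

# Toy usage: two well-separated classes in 2-D
X = np.vstack([np.zeros((10, 2)), np.ones((10, 2))])
y = np.array([0] * 10 + [1] * 10)
label = knn_predict(X, y, np.array([0.9, 0.9]))   # nearest neighbors are class 1
```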
2.4.2. The Least Squares Method (LSM)
In LSM, two average reference vectors are calculated for the two classes that correspond to falls
and ADLs [28]. A given test vector x = [x1 , . . . , xM ]T is compared with each reference vector
ri = [ri1 , . . . , riM ]T , i = 1, 2 by calculating the sum of the squared differences between them:
E_i² = Σ_{m=1}^{M} (x_m − r_{im})²          (6)
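This nearest-reference-vector rule can be sketched as follows (function names are ours):

```python
import numpy as np

def lsm_fit(X, y):
    # Average reference vector for each of the two classes (falls / ADLs)
    return np.array([X[y == c].mean(axis=0) for c in (0, 1)])

def lsm_predict(refs, x):
    # Assign x to the class whose reference vector minimizes E_i^2 (Equation (6))
    return int(np.argmin(((refs - x) ** 2).sum(axis=1)))

# Toy usage with class means at the origin and at (1, 1, 1)
refs = lsm_fit(np.vstack([np.zeros((5, 3)), np.ones((5, 3))]),
               np.array([0] * 5 + [1] * 5))
label = lsm_predict(refs, np.array([0.8, 0.9, 0.7]))   # closer to class 1
```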
In Bayesian decision making (BDM), each class is modeled by a multivariate Gaussian distribution, and a test vector x is assigned to the class i = 1, 2 that minimizes:

(1/2) [ (x − μ_i)^T C_i^{−1} (x − μ_i) + log det(C_i) ]          (7)

where μ_i and C_i are the mean vector and covariance matrix estimated for class i.
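A minimal sketch of this decision rule, assuming equal class priors and the Gaussian model behind Equation (7) (names and the synthetic data are ours):

```python
import numpy as np

def bdm_fit(X, y):
    # Per-class mean vector and covariance matrix estimates
    return [(X[y == c].mean(axis=0), np.cov(X[y == c], rowvar=False))
            for c in (0, 1)]

def bdm_predict(params, x):
    # Evaluate Equation (7) for each class; the smaller value wins
    scores = [0.5 * ((x - mu) @ np.linalg.inv(C) @ (x - mu)
                     + np.log(np.linalg.det(C)))
              for mu, C in params]
    return int(np.argmin(scores))

# Toy usage: two Gaussian blobs centered at 0 and 5
rng = np.random.default_rng(2)
X = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(5, 1, (50, 2))])
y = np.array([0] * 50 + [1] * 50)
params = bdm_fit(X, y)
label = bdm_predict(params, np.array([4.8, 5.1]))   # near the class-1 mean
```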
3. Results
The framework used for the study is subject independent; the classifiers considered here were used to
process the complete dataset, instead of designing different classifiers for each subject. We present the
performance comparison of the six classifiers in Table 2. The k-NN classifier gives the best accuracy
(99.91%), followed by LSM, SVM, BDM, DTW, and ANN. The k-NN has 100% sensitivity, indicating
that falls are not missed with this method; however, two to three ADLs were misclassified over 2520 trials
in 10 rounds (Table 3). The average accuracies and standard deviations of the classifiers over 10 rounds
are provided in Table 3, where we observe the similarity of the results in each round, indicating their
repeatability. Because the k-NN classifier and LSM do not miss any falls, we consider them both reliable
classifiers. ROC curves for the classifiers are depicted in Figure 3.
Figure 3. ROC curves (TP rate versus FP rate) for some of the classifiers: k-NN, LSM, BDM, DTW, and ANN.
Table 2. Comparison of the results and the computational requirements of the six machine
learning techniques in terms of the training and testing times for a single fold (P: positive,
N: negative; confusion-matrix entries are averaged over the 10 rounds).

Classifier   TP       FN     FP     TN       Se (%)   Sp (%)   Acc (%)   Training Time   Testing Time
k-NN         1400     0      2.3    1117.7   100      99.79    99.91     318.2           76.6
LSM          1400     0      8.7    1111.3   100      99.22    99.65     2.2             32.7
SVM          1393.9   6.1    7      1113     99.56    99.38    99.48     –               –
BDM          1398     2      16.7   1103.3   99.86    98.51    99.26     –               –
DTW          1381.4   18.6   35.5   1084.5   98.67    96.83    97.85     2.5             10,089.0
ANN          1364.6   35.4   73.5   1046.5   97.47    93.44    95.68     33,816.6        13.5
Table 3. Classifier results over 10 successive rounds. AVG: average, STD: standard deviation.

(a) k-NN
Run   Se (%)   Sp (%)   Acc (%)   TN       FP       TP       FN
1     100      99.73    99.88     1117     3        1400     0
2     100      99.82    99.92     1118     2        1400     0
3     100      99.82    99.92     1118     2        1400     0
4     100      99.73    99.88     1117     3        1400     0
5     100      99.73    99.88     1117     3        1400     0
6     100      99.82    99.92     1118     2        1400     0
7     100      99.82    99.92     1118     2        1400     0
8     100      99.82    99.92     1118     2        1400     0
9     100      99.82    99.92     1118     2        1400     0
10    100      99.82    99.92     1118     2        1400     0
AVG   100      99.79    99.91     1117.7   2.3      1400     0
STD   0        0.0431   0.0192    0.4830   0.4830   0        0

(b) LSM
Run   Se (%)   Sp (%)   Acc (%)   TN       FP       TP       FN
1     100      99.29    99.68     1112     8        1400     0
2     100      99.29    99.68     1112     8        1400     0
3     100      99.20    99.64     1111     9        1400     0
4     100      99.20    99.64     1111     9        1400     0
5     100      99.20    99.64     1111     9        1400     0
6     100      99.11    99.60     1110     10       1400     0
7     100      99.11    99.60     1110     10       1400     0
8     100      99.38    99.72     1113     7        1400     0
9     100      99.20    99.64     1111     9        1400     0
10    100      99.29    99.68     1112     8        1400     0
AVG   100      99.22    99.65     1111.3   8.7      1400     0
STD   0        0.0847   0.0376    0.9487   0.9487   0        0

(c) SVM
Run   Se (%)   Sp (%)   Acc (%)   TN       FP       TP       FN
1     99.64    99.46    99.56     1114     6        1395     5
2     99.50    99.29    99.40     1112     8        1393     7
3     99.64    98.93    99.33     1108     12       1395     5
4     99.57    99.46    99.52     1114     6        1394     6
5     99.50    99.46    99.48     1114     6        1393     7
6     99.57    99.55    99.56     1115     5        1394     6
7     99.50    99.29    99.40     1112     8        1393     7
8     99.50    99.55    99.52     1115     5        1393     7
9     99.57    99.29    99.44     1112     8        1394     6
10    99.64    99.46    99.56     1114     6        1395     5
AVG   99.56    99.38    99.48     1113     7        1393.9   6.1
STD   0.0625   0.1882   0.0825    2.1082   2.1082   0.8756   0.8756

(d) BDM
Run   Se (%)   Sp (%)   Acc (%)   TN       FP       TP       FN
1     99.86    98.57    99.29     1104     16       1398     2
2     99.86    98.57    99.29     1104     16       1398     2
3     99.86    98.48    99.25     1103     17       1398     2
4     99.86    98.48    99.25     1103     17       1398     2
5     99.86    98.39    99.21     1102     18       1398     2
6     99.86    98.57    99.29     1104     16       1398     2
7     99.86    98.48    99.25     1103     17       1398     2
8     99.86    98.57    99.29     1104     16       1398     2
9     99.86    98.48    99.25     1103     17       1398     2
10    99.86    98.48    99.25     1103     17       1398     2
AVG   99.86    98.51    99.26     1103.3   16.7     1398     2
STD   0        0.0603   0.0268    0.6749   0.6749   0        0

(e) DTW
Run   Se (%)   Sp (%)   Acc (%)   TN       FP       TP       FN
1     98.71    96.79    97.86     1084     36       1382     18
2     98.71    96.96    97.94     1086     34       1382     18
3     98.79    97.23    98.10     1089     31       1383     17
4     98.79    97.14    98.06     1088     32       1383     17
5     98.57    96.61    97.70     1082     38       1380     20
6     98.64    97.23    98.02     1089     31       1381     19
7     98.79    96.96    97.98     1086     34       1383     17
8     98.43    96.61    97.62     1082     38       1378     22
9     98.50    96.25    97.50     1078     42       1379     21
10    98.79    96.52    97.78     1081     39       1383     17
AVG   98.67    96.83    97.85     1084.5   35.5     1381.4   18.6
STD   0.1313   0.3321   0.1992    3.7193   3.7193   1.8379   1.8379

(f) ANN
Run   Se (%)   Sp (%)   Acc (%)   TN       FP       TP       FN
1     97.64    93.39    95.75     1046     74       1367     33
2     97.93    93.21    95.83     1044     76       1371     29
3     96.57    94.11    95.48     1054     66       1352     48
4     98.00    93.75    96.11     1050     70       1372     28
5     97.29    92.86    95.32     1040     80       1362     38
6     97.50    93.57    95.75     1048     72       1365     35
7     97.86    93.84    96.07     1051     69       1370     30
8     97.00    94.38    95.83     1057     63       1358     42
9     97.21    92.86    95.28     1040     80       1361     39
10    97.71    92.41    95.36     1035     85       1368     32
AVG   97.47    93.44    95.68     1046.5   73.5     1364.6   35.4
STD   0.4545   0.6132   0.3048    6.8678   6.8678   6.3631   6.3631
We compare the computational requirements of the six machine learning techniques in the last
two rows of Table 2 in terms of the training and testing times required for a single fold of the dataset
that contains 252 feature vectors. We implemented the algorithms in a MATLAB 7.7.0 environment on
a Windows 7 computer with a 2.67 GHz quad core 64-bit Intel Core i5 processor and 4 GB of RAM.
In terms of the required training time, the classifiers can be sorted as BDM, LSM, DTW, k-NN, SVM,
and ANN in increasing order. In terms of the testing time, the order is ANN, SVM, LSM, BDM, k-NN,
and DTW.
4. Discussion
The availability of standardized open databases allows researchers to compare their results with those
of others. Diversity of the subjects, activity spectrum, and the number of trials are important factors in
constructing a database. When a limited number of activities that are easy to discriminate between are
performed by a small number of subjects, it may be possible to achieve very high accuracies. However,
such performance may not be maintained when the set of activities is broadened or new subjects
participate in the tests. Although some studies with very high (100%) sensitivity and specificity
exist [32,33], the performance of these algorithms degrades when implemented in the real world under
realistic conditions and with new users. There are many academic works with promising results but no
reliable off-the-shelf product on the market.
Figure 4. Total acceleration of the waist sensor during the fall actions: (a) back sitting;
(b) back lying; and (c) rolling out of bed. The average total acceleration for female/male
volunteers and the overall minimum/maximum total acceleration values that occurred during
the experiments are shown.
[Plot data: (a) back-sitting: maximum 7.7916 g, male average 4.6251 g, female average 4.0514 g, minimum 2.9172 g; (b) back-lying: maximum 6.2627 g, female average 4.4808 g, male average 4.3252 g, minimum 3.1562 g; (c) rolling-out-bed: maximum 5.2092 g, female average 3.2448 g, male average 3.1547 g, minimum 2.0123 g.]
The ADLs that we recorded in this study and included in our dataset are a subset of real-world ADLs,
many of which are high-impact events that may be easily confused with falls. Since laboratory-recorded
ADLs/falls and those that occur in a natural setting may have some differences, we compared the average
and peak acceleration values of the voluntary falls that we recorded, with those in [17], where some
involuntary falls by the elderly are recorded. Figure 4 shows sample signals recorded by the waist
sensor in our experiments (which is also the location of the sensor in [17]). Back sitting, back lying,
and rolling out of bed (Table 1; fall actions 9, 10, and 20, respectively) recordings are illustrated, with
average values for female/male volunteers over 35 (= 7 subjects × 5 trials) fall actions each and the
minimum/maximum total acceleration. The minimum/maximum values are determined over all records
and may belong to a female or a male volunteer. We observe that for a given type of fall, features of the
signals recorded from voluntary and involuntary falls are similar in nature. The average duration of the
impact from the maximum to the minimum value of total acceleration in both fall types (voluntary and
involuntary) is about 0.2 s. Thus, our experimental records are consistent with involuntary falls recorded
in an independently conducted study.
Our approach would be applicable to real-world settings where continuous data streams of
indeterminate length, containing multiple activities, are recorded. If the data stream contains falls in
between a sequence of ADLs, the multiple acceleration peaks can be easily identified. The signal pattern
in the time window around each peak can then be processed with machine learning techniques to evaluate
if it indeed corresponds to a fall. In real-world testing, we expect our system to give slightly lower
accuracies than under laboratory conditions.
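This segmentation can be sketched as a local-maximum search over the total-acceleration stream (the 2 g threshold and the synthetic stream are illustrative choices of ours, not values from the study):

```python
import numpy as np

def candidate_windows(a_total, fs=25, half_s=2.0, thresh=2.0):
    # Local maxima of total acceleration (in g) above a threshold; each
    # candidate yields the 4 s (101-sample at 25 Hz) window around the peak
    half = int(fs * half_s)
    out = []
    for i in range(half, len(a_total) - half):
        seg = a_total[i - half:i + half + 1]
        if a_total[i] >= thresh and a_total[i] == seg.max():
            out.append((i, seg))
    return out

# Synthetic stream: baseline near 1 g with two high-acceleration events
stream = np.ones(500)
stream[120], stream[350] = 5.0, 4.0
windows = candidate_windows(stream)      # two candidate windows, 101 samples each
```

Each extracted window could then be passed through the same feature-extraction and classification pipeline described in Section 2.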
The algorithms can be easily embedded into portable devices or accessories carried on the body that
can be connected to a telephone network [34]. This feature will allow prompt medical attention, improve
the safety, independence, and quality of living of those in fall risk groups, and contribute to the economy
by reducing the costs of medical healthcare.
5. Conclusions
We employ six classifiers based on machine learning to distinguish between falls and ADLs
using previously proposed, standardized experimental procedures. We compare the performance and
computational requirements of the machine learning techniques based on the same dataset and achieve
accuracies above 95%. The repeatability of the results over the 10 runs indicates the robustness of the
classifiers. The k-NN and LSM methods do not miss any falls; thus, we consider them reliable classifiers.
These classifiers also have acceptable computational requirements for training and testing, making them
suitable for real-time applications. The fact that we use standardized experimental procedures to perform
a comprehensive set of fall experiments sets an example in the fall detection area. This also makes our
approach more applicable to real-world scenarios where data records of indeterminate length, containing
multiple activities in sequence, are recorded. We plan to test the system with continuous data streams
acquired from falls and ADLs. To enable comparison among the algorithms developed in different
studies, we intend to make our dataset publicly available at the University of California, Irvine (UCI) Machine Learning
Repository [35]. Our daily and sports activities dataset is already available at the same website [36].
In our current work, we are investigating which of the six motion sensor units, and which axes of these
sensors, are most useful in activity and fall detection [37]. Incorporating information from biomedical
sensors for vital signs and from audio sensors may further improve the robustness of our fall detection
system. Our ongoing work considers embedding the fall detection algorithms in a mobile device
(e.g., a smartphone) worn at waist level.
Acknowledgements
This work was supported by the Erciyes University Scientific Research Project Coordination Department
under grant number FBA-11-3579 and the Scientific and Technological Research Council of Turkey
(TÜBİTAK) 2218 National Postdoctoral Research Scholarship Program. The authors wish to thank
Kenan Danışman, Director of the Clinical Research and Technology Center at Erciyes University, for
providing the experimental area during the course of the project. The authors would also like to thank
the 14 volunteers who participated in our experiments for their efforts, dedication, and time.
Author Contributions
Ahmet Turan Özdemir coordinated the experimental work, analyzed the experimental data, and
prepared the initial draft of the manuscript. Billur Barshan proposed the research topic and the machine
learning approach and supervised the study. She also contributed significantly to the writing and
organization of the manuscript.
Conflicts of Interest
The authors declare no conflict of interest.
References
1. Rodríguez-Martín, D.; Pérez-López, C.; Samà, A.; Cabestany, J.; Català, A. A wearable inertial
measurement unit for long-term monitoring in the dependency care area. Sensors 2013, 13,
14079–14104.
2. Ludwig, W.; Wolf, K.-H.; Duwenkamp, C.; Gusew, N.; Hellrung, N.; Marschollek, M.;
Wagner, M.; Haux, R. Health-enabling technologies for the elderly – an overview of services based
on a literature review. Comput. Meth. Prog. Biomed. 2012, 106, 70–78.
3. World Health Organization. Available online: http://www.who.int/violence_injury_prevention/
other_injury/falls/en/ (accessed on 15 June 2014).
4. Zakaria, N.A.; Kuwae, Y.; Tamura, T.; Minato, K.; Kanaya, S. Quantitative analysis of fall risk
using TUG test. Comput. Meth. Biomech. Biomed. Eng. 2013, doi:10.1080/10255842.2013.805211.
5. Bianchi, F.; Redmond, S.J.; Narayanan, M.R.; Cerutti, S.; Lovell, N.H. Barometric pressure and
triaxial accelerometry-based falls event detection. IEEE Trans. Neural Syst. Rehabil. Eng. 2010,
18, 619–627.
6. Doukas, C.N.; Maglogiannis, I. Emergency fall incidents detection in assisted living environments
utilizing motion, sound and visual perceptual components. IEEE Trans. Inf. Technol. Biomed. 2011,
15, 277–289.
7. Roetenberg, D.; Slycke, P.J.; Veltink, P.H. Ambulatory position and orientation tracking fusing
magnetic and inertial sensing. IEEE Trans. Biomed. Eng. 2007, 54, 883–890.
8. Rimminen, H.; Lindström, J.; Linnavuo, M.; Sepponen, R. Detection of falls among the elderly by
a floor sensor using the electric near field. IEEE Trans. Inf. Technol. Biomed. 2010, 14, 1475–1476.
9. Mubashir, M.; Shao, L.; Seed, L. A survey on fall detection: Principles and approaches.
Neurocomputing 2013, 100, 144–152.
10. Yang, C.-C.; Hsu, Y.-L. A review of accelerometry-based wearable motion detectors for physical
activity monitoring. Sensors 2010, 10, 7772–7788.
11. Malhi, K.; Mukhopadhyay, S.C.; Schnepper, J.; Haefke, M.; Ewald, H. A Zigbee-based wearable
physiological parameters monitoring system. IEEE Sens. J. 2012, 12, 423–430.
12. Aziz, O.; Robinovitch, S.N. An analysis of the accuracy of wearable sensors for classifying the
causes of falls in humans. IEEE Trans. Neural Syst. Rehabil. Eng. 2011, 19, 670–676.
13. Mariani, B.; Rochat, S.; Büla, C.J.; Aminian, K. Heel and toe clearance estimation for gait analysis
using wireless inertial sensors. IEEE Trans. Biomed. Eng. 2012, 59, 3162–3168.
14. Zhang, M.; Sawchuk, A.A. Human daily activity recognition with sparse representation using
wearable sensors. IEEE J. Biomed. Health Inform. 2013, 17, 553–560.
15. Bagalà, F.; Becker, C.; Cappello, A.; Chiari, L.; Aminian, K.; Hausdorff, J.M.; Zijlstra, W.; Klenk, J.
Evaluation of accelerometer-based fall detection algorithms on real-world falls. PLoS One 2012,
7, e37062.
16. Klenk, J.; Becker, C.; Lieken, F.; Nicolai, S.; Maetzler, W.; Alt, W.; Zijlstra, W.; Hausdorff, J.M.;
van Lummel, R.C.; Chiari, L.; Lindemann, U. Comparison of acceleration signals of simulated and
real-world backward falls. Med. Eng. Phys. 2011, 33, 368–373.
17. Kangas, M.; Vikman, I.; Nyberg, L.; Korpelainen, R.; Lindblom, J.; Jämsä, T. Comparison
of real-life accidental falls in older people with experimental falls in middle-aged test subjects.
Gait Posture 2012, 35, 500–505.
18. Luštrek, M.; Kaluža, B. Fall detection and activity recognition with machine learning. Informatica
2009, 33, 205–212.
19. Liu, S.-H.; Cheng, W.-C. Fall detection with the support vector machine during scripted and
continuous unscripted activities. Sensors 2012, 12, 12301–12316.
20. Dubey, R.; Ni, B.; Moulin, P. A depth camera based fall recognition system for the elderly. Image
Anal. Recognit., Lect. Notes Comput. Sci. 2012, 7325, 106–113.
21. Noury, N.; Fleury, A.; Rumeau, P.; Bourke, A.K.; Laighin, G.O.; Rialle, V.; Lundy, J.E. Fall
detection – principles and methods. In Proceedings of the 29th Annual International Conference of
the IEEE EMBS, Lyon, France, 23–26 August 2007; pp. 1663–1666.
22. Noury, N.; Rumeau, P.; Bourke, A.K.; Laighin, G.; Lundy, J.E. A proposal for the classification
and evaluation of fall detectors. IRBM 2008, 29, 340–349.
23. Abbate, S.; Avvenuti, M.; Corsini, P.; Vecchio, A.; Light, J. Monitoring of Human Movements
for Fall Detection and Activities Recognition in Elderly Care Using Wireless Sensor Network: A
Survey. In Application-Centric Design Book, 1st ed.; InTech: Rijeka, Croatia, 2010; Chapter 9,
pp. 147–166.
24. Kwapisz, J.R.; Weiss, G.M.; Moore, S.A. Activity recognition using cell phone accelerometers.
ACM SIGKDD Explor. Newsl. 2010, 12, 74–82.
25. Dai, J.; Bai, X.; Yang, Z.; Shen, Z.; Xuan, D. Mobile phone-based pervasive fall detection.
Pers. Ubiquitous Comput. 2010, 14, 633–643.
26. Lee, R.Y.W.; Carlisle, A.J. Detection of falls using accelerometers and mobile phone technology.
Age Ageing 2011, 40, 690–696.
27. MTw User Manual and Technical Documentation; Xsens Technologies B.V.: Enschede,
The Netherlands, 2014. Available online: http://www.xsens.com (accessed on 15 June 2014).
28. Duda, R.O.; Hart, P.E.; Stork, D.G. Pattern Classification, 2nd ed.; John Wiley & Sons, Inc.:
New York, NY, USA, 2001.
29. Chang, C.-C.; Lin, C.-J. LIBSVM: A library for support vector machines. ACM Trans. Intell. Syst.
Technol. 2011, 2, Article No: 27.
30. Keogh, E.; Ratanamahatana, C.A. Exact indexing of dynamic time warping. Knowl. Inf. Syst. 2005,
7, 358–386.
31. Haykin, S. Neural Networks and Learning Machines; Prentice Hall: Upper Saddle River, NJ,
USA, 2009.
32. Liu, J.; Lockhart, T.E. Automatic individual calibration in fall detection – an integrative ambulatory
measurement framework. Comput. Meth. Biomech. Biomed. Eng. 2013, 16, 504–510.
33. Bourke, A.K.; van de Ven, P.; Gamble, M.; O'Connor, R.; Murphy, K.; Bogan, E.; McQuade, E.;
Finucane, P.; Laighin, G.; Nelson, J. Evaluation of waist-mounted tri-axial accelerometer based
fall-detection algorithms during scripted and continuous unscripted activities. J. Biomech. 2010,
43, 3051–3057.
34. Chang, S.-Y.; Lai, C.-F.; Chao, H.-C.J.; Park, J.H.; Huang, Y.-M. An environmental-adaptive fall
detection system on mobile device. J. Med. Syst. 2011, 35, 1299–1312.
35. University of California Irvine (UCI) Machine Learning Repository. Available online:
http://archive.ics.uci.edu/ml/ (accessed on 15 June 2014).
36. Barshan, B.; Altun, K. Daily and Sports Activities Dataset, University of California Irvine
Machine Learning Repository, University of California, Irvine, School of Information and
Computer Sciences, 2013. Available online: http://archive.ics.uci.edu/ml/datasets/Daily+and+
Sports+Activities (accessed on 15 June 2014).
37. Dobrucalı, O.; Barshan, B. Sensor-Activity Relevance in Human Activity Recognition with
Wearable Motion Sensors and Mutual Information Criterion. In Information Sciences and Systems
2013, Proceedings of the 28th International Symposium on Computer and Information Sciences,
Paris, France, 28–29 October 2013; Gelenbe, E., Lent, R., Eds.; Springer International Publishing:
Cham, Switzerland, 2013; pp. 285–294.
© 2014 by the authors; licensee MDPI, Basel, Switzerland. This article is an open access article
distributed under the terms and conditions of the Creative Commons Attribution license
(http://creativecommons.org/licenses/by/3.0/).