Collaborative Fall Detection Using A Wearable Device and A Companion Robot
Fei Liang, Ricardo Hernandez, Jiaxing Lu, Brandon Ong, Matthew Jackson Moore, Weihua Sheng, Senlin Zhang
TABLE II: Confusion matrix of the proposed approach on the testing dataset (rows: true class; columns: predicted class).

True \ Pred   Backward   Forward    Left       Right      Sit down   Walk
Backward      0.9434     0.0138     0.0446     0.0235     0.0290     0.0032
Forward       0.0081     0.8489     0.0250     0.0451     0.0159     0.0021
Left          0.0232     0.0277     0.7902     0.1294     0.0654     0.0284
Right         0.0182     0.0723     0.1027     0.7373     0.0813     0.0221
Sit down      0.0020     0.0255     0.0232     0.0147     0.7935     0.0000
Walk          0.0051     0.0117     0.0143     0.0500     0.0150     0.9442

The confusion matrix of our proposed approach on the testing dataset is shown in Table II. The overall accuracy is 84%, as Fig. 11 shows, and it is obtained at the 53rd epoch. As shown in the table, the accuracy of some classes reaches 94%, including falling backward and walking. However, the accuracy of falling left, falling right, and sitting down is not as good as that of the other classes.

... and power consumption. Apparently, the results we reported here are still preliminary but promising. We need to further improve the whole system, particularly in the following aspects: 1) Energy consumption. We will conduct a more thorough investigation of the power consumption of the WAMU and optimize both hardware and software to further reduce the power consumption. 2) Fall detection. We will collect more data from more realistic falls and study the impact of various factors on fall detection accuracy, including the speed of falling, the location of falling, etc. 3) WAMU design. We will further improve the ergonomics of the WAMU to make it more human-friendly.
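As a quick sanity check on the figures above (not part of the original paper), the following Python sketch shows how the per-class accuracies and the reported ~84% overall accuracy follow from the row-normalized confusion matrix in Table II, assuming the six classes are roughly balanced in the testing dataset.

# Minimal sketch: per-class and overall accuracy from the row-normalized
# confusion matrix of Table II (assumes roughly balanced test classes).
import numpy as np

classes = ["Backward", "Forward", "Left", "Right", "Sit down", "Walk"]

# Rows: true class, columns: predicted class (each row sums to ~1).
conf = np.array([
    [0.9434, 0.0138, 0.0446, 0.0235, 0.0290, 0.0032],
    [0.0081, 0.8489, 0.0250, 0.0451, 0.0159, 0.0021],
    [0.0232, 0.0277, 0.7902, 0.1294, 0.0654, 0.0284],
    [0.0182, 0.0723, 0.1027, 0.7373, 0.0813, 0.0221],
    [0.0020, 0.0255, 0.0232, 0.0147, 0.7935, 0.0000],
    [0.0051, 0.0117, 0.0143, 0.0500, 0.0150, 0.9442],
])

# Per-class accuracy (recall) is the diagonal of the row-normalized matrix.
per_class = np.diag(conf)
for name, acc in zip(classes, per_class):
    print(f"{name:>8s}: {acc:.1%}")

# With balanced classes, overall accuracy is the mean of the diagonal,
# which recovers the reported ~84%.
print(f"Overall: {per_class.mean():.1%}")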