Mobile-based Activity Monitoring System for the Self-quarantine Patient

Authors

  • Annisaa Sri Indrawanti Institut Teknologi Sepuluh Nopember Surabaya
  • Eka Prakarsa Mandyartha Universitas Pembangunan Nasional Veteran Jawa Timur

DOI:

https://doi.org/10.33086/atcsj.v4i1.2085

Keywords:

Activity Recognition, Classification, Mobile, Monitoring system

Abstract

During the COVID-19 pandemic, not all patients can be hospitalized, so hospitals increasingly prescribe self-quarantine for patients with various diseases. This creates a need for a system that can monitor a patient's activity and position remotely. Modern mobile phones are equipped with sensors that can detect user movement, capturing not only the user's position but also the user's activity. This paper presents an activity and position monitoring system for self-quarantine patients that can be used in their homes. Mobile activity monitoring is achieved by activity recognition using classification methods. For performance testing, we evaluate and compare several classification methods for activity recognition: Naïve Bayes, KNN, KStar, and the J48 decision tree. We also test the impact of the sliding-window size (N samples per window) on recognition accuracy and select the N that gives the best accuracy. In addition to the patient's activity, the system monitors the patient's position using the Google Maps API. The results show that Naïve Bayes achieves an accuracy of 81.25%, KNN 78.125%, KStar 78.125%, and J48 75%. The window size that gives the best accuracy is N = 6, with an accuracy of 90.15%.
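The pipeline the abstract describes can be sketched as follows. This is a hypothetical illustration, not the authors' code: it segments a stream of synthetic tri-axial accelerometer samples into sliding windows of N samples, extracts simple magnitude features per window, and classifies each window with a minimal 1-NN classifier (standing in for the KNN variant tested in the paper). The activity labels, feature choices, and synthetic data are all assumptions.

```python
# Hypothetical sketch of windowed activity recognition (not the authors' implementation).
import math
import random

def windows(samples, n):
    """Split a stream of (x, y, z) samples into consecutive windows of n samples."""
    return [samples[i:i + n] for i in range(0, len(samples) - n + 1, n)]

def features(window):
    """Mean and standard deviation of acceleration magnitude over one window."""
    mags = [math.sqrt(x * x + y * y + z * z) for x, y, z in window]
    mean = sum(mags) / len(mags)
    var = sum((m - mean) ** 2 for m in mags) / len(mags)
    return (mean, math.sqrt(var))

def knn_predict(train, feat, k=1):
    """Classify a feature vector by majority vote among its k nearest neighbours."""
    ranked = sorted(train, key=lambda t: sum((a - b) ** 2 for a, b in zip(t[0], feat)))
    votes = [label for _, label in ranked[:k]]
    return max(set(votes), key=votes.count)

random.seed(0)

def synth(label, jitter, n=60):
    """Synthetic accelerometer stream: 'resting' = low variance, 'walking' = high."""
    stream = [(1 + random.gauss(0, jitter), random.gauss(0, jitter), random.gauss(0, jitter))
              for _ in range(n)]
    return stream, label

N = 6  # window size; the paper reports N = 6 as the most accurate setting

train = []
for stream, label in [synth("resting", 0.02), synth("walking", 0.5)]:
    train += [(features(w), label) for w in windows(stream, N)]

test_stream, _ = synth("walking", 0.5)
preds = [knn_predict(train, features(w)) for w in windows(test_stream, N)]
accuracy = preds.count("walking") / len(preds)
print(f"windowed 1-NN accuracy on synthetic 'walking' stream: {accuracy:.2f}")
```

In a real deployment the windows would come from the phone's accelerometer rather than synthetic data, and the classifier would be trained on labelled recordings of each target activity.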

Downloads

Download data is not yet available.

References

P. Keskinocak, B. E. Oruc, A. Baxter, J. Asplund, and N. Serban, “The impact of social distancing on COVID19 spread: State of Georgia case study,” PLOS ONE, vol. 15, no. 10, p. e0239798, Oct. 2020, doi: 10.1371/journal.pone.0239798.

Y. Liu, L. Nie, L. Liu, and D. S. Rosenblum, “From action to activity: Sensor-based activity recognition,” Neurocomputing, vol. 181, pp. 108–115, Mar. 2016, doi: 10.1016/j.neucom.2015.08.096.

L. Chen, J. Hoey, C. D. Nugent, D. J. Cook, and Z. Yu, “Sensor-Based Activity Recognition,” IEEE Trans. Syst. Man Cybern. Part C Appl. Rev., vol. 42, no. 6, pp. 790–808, Nov. 2012, doi: 10.1109/TSMCC.2012.2198883.

M. H. M. Noor, Z. Salcic, and K. I.-K. Wang, “Adaptive sliding window segmentation for physical activity recognition using a single tri-axial accelerometer,” Pervasive Mob. Comput., vol. 38, pp. 41–59, Jul. 2017, doi: 10.1016/j.pmcj.2016.09.009.

C. Ma, W. Li, J. Cao, J. Du, Q. Li, and R. Gravina, “Adaptive sliding window based activity recognition for assisted livings,” Inf. Fusion, vol. 53, pp. 55–65, Jan. 2020, doi: 10.1016/j.inffus.2019.06.013.

A. S. Indrawanti and W. Wibisono, “A change detection and resource-aware data sensing approaches for improving the reporting protocol mechanism for mobile user,” J. Ilmu Komput. Dan Inf., vol. 8, no. 2, Art. no. 2, Aug. 2015, doi: 10.21609/jiki.v8i2.307.

B. Sefen, S. Baumbach, A. Dengel, and S. Abdennadher, “Human Activity Recognition - Using Sensor Data of Smartphones and Smartwatches,” in Proceedings of the 8th International Conference on Agents and Artificial Intelligence, Rome, Italy, 2016, pp. 488–493, doi: 10.5220/0005816004880493.

S. Sani, N. Wiratunga, S. Massie, and K. Cooper, “kNN Sampling for Personalised Human Activity Recognition,” in Case-Based Reasoning Research and Development, vol. 10339, D. W. Aha and J. Lieber, Eds. Cham: Springer International Publishing, 2017, pp. 330–344.

P. J. S. Ferreira, J. M. P. Cardoso, and J. Mendes-Moreira, “kNN Prototyping Schemes for Embedded Human Activity Recognition with Online Learning,” Computers, vol. 9, no. 4, Art. no. 4, Dec. 2020, doi: 10.3390/computers9040096.

J. Wannenburg and R. Malekian, “Physical Activity Recognition From Smartphone Accelerometer Data for User Context Awareness Sensing,” IEEE Trans. Syst. Man Cybern. Syst., vol. 47, no. 12, pp. 3142–3149, Dec. 2017, doi: 10.1109/TSMC.2016.2562509.

Y. E. Shin, W. H. Choi, and T. M. Shin, “Physical activity recognition based on rotated acceleration data using quaternion in sedentary behavior: A preliminary study,” in 2014 36th Annual International Conference of the IEEE Engineering in Medicine and Biology Society, Aug. 2014, pp. 4976–4978, doi: 10.1109/EMBC.2014.6944741.

O. Banos, J.-M. Galvez, M. Damas, H. Pomares, and I. Rojas, “Window Size Impact in Human Activity Recognition,” Sensors, vol. 14, no. 4, Art. no. 4, Apr. 2014, doi: 10.3390/s140406474.

W. Wibisono, A. S. Indrawanti, and T. Ahmad, “A context-awareness approach for improving reporting protocol for activity and position tracking for social networking services,” in 2015 7th International Conference on Information Technology and Electrical Engineering (ICITEE), Oct. 2015, pp. 348–353, doi: 10.1109/ICITEED.2015.7408970.

H. Bragança, J. G. Colonna, W. S. Lima, and E. Souto, “A Smartphone Lightweight Method for Human Activity Recognition Based on Information Theory,” Sensors, vol. 20, no. 7, Art. no. 7, Jan. 2020, doi: 10.3390/s20071856.

A. V and M. Nirmalhaldikar, “Real Time Position Tracking System Using Google Maps,” www.ijsrp.org.

Published

2021-07-31

How to Cite

Sri Indrawanti, A., & Mandyartha, E. P. (2021). Mobile-based Activity Monitoring System for the Self-quarantine Patient. Applied Technology and Computing Science Journal, 4(1), 56–62. https://doi.org/10.33086/atcsj.v4i1.2085

Issue

Section

Articles