Department of Biomedical Engineering, Meybod University, Meybod, Iran
DOI: 10.30476/jhmi.2024.103265.1226
Abstract
Background: The proliferation of smartphone sensors, particularly accelerometers, has expanded the scope of Human Activity Recognition (HAR). HAR plays a key role in monitoring student health by offering real-time insight into physical activity and promoting healthier behaviors.
Objective: This study aims to develop an optimized deep learning model that monitors and classifies student activities from accelerometer data for real-time health monitoring.
Methods: We developed and optimized a novel deep learning framework based on a modified Bidirectional Long Short-Term Memory (BiLSTM) network enhanced by the Grey Wolf Optimizer (GWO). The BiLSTM automates feature learning from raw accelerometer data, while GWO tunes the hyperparameters to improve sequence processing and overall model performance. The public UCI-HAR and WISDM datasets were used for validation, with cross-validation ensuring model robustness. An edge computing approach was implemented to enable real-time processing.
Results: The proposed BiLSTM-GWO framework achieved a classification accuracy of 97.68%, outperforming existing methods in recognizing student activities. The model distinguished activities such as walking, sitting, and stair climbing more reliably, significantly reducing misclassification errors. Beyond accuracy, precision, recall, and F1 score were evaluated and all improved. GWO optimization also accelerated convergence, enhancing suitability for real-time applications.
Conclusions: Integrating edge computing into the framework provides real-time analysis and resource efficiency, making it well suited to health monitoring applications in educational settings.
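The abstract names the Grey Wolf Optimizer as the hyperparameter tuner for the BiLSTM. As an illustration only (not the authors' implementation), the sketch below shows the core GWO update in pure Python on a toy cost function; the objective, bounds, population size, and iteration count are all hypothetical choices made for this example.

```python
import random

def gwo(objective, bounds, n_wolves=12, n_iters=60, seed=0):
    """Minimal Grey Wolf Optimizer sketch: minimize `objective` over `bounds`.

    objective: maps a position (list of floats) to a scalar cost.
    bounds: list of (low, high) limits, one pair per dimension.
    """
    rng = random.Random(seed)
    dim = len(bounds)
    wolves = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_wolves)]
    best_seen = min(wolves, key=objective)

    def clip(x):
        # Keep candidate positions inside the search bounds.
        return [min(max(v, lo), hi) for v, (lo, hi) in zip(x, bounds)]

    for t in range(n_iters):
        # Rank the pack: alpha, beta, delta are the three best wolves.
        ranked = sorted(wolves, key=objective)
        alpha, beta, delta = ranked[0], ranked[1], ranked[2]
        a = 2 - 2 * t / n_iters  # exploration coefficient decays from 2 to 0
        new_wolves = []
        for x in wolves:
            pos = []
            for d in range(dim):
                est = 0.0
                for leader in (alpha, beta, delta):
                    r1, r2 = rng.random(), rng.random()
                    A = 2 * a * r1 - a
                    C = 2 * r2
                    D = abs(C * leader[d] - x[d])
                    est += leader[d] - A * D
                pos.append(est / 3)  # average the three leader-guided moves
            new_wolves.append(clip(pos))
        wolves = new_wolves
        cand = min(wolves, key=objective)
        if objective(cand) < objective(best_seen):
            best_seen = cand
    return best_seen

# Toy use: recover the minimum of a shifted sphere function at (1.5, 1.5).
best = gwo(lambda p: sum((v - 1.5) ** 2 for v in p), [(-5, 5), (-5, 5)])
```

In the paper's setting, `objective` would presumably wrap a BiLSTM training-and-validation run, with each dimension encoding a hyperparameter such as learning rate or hidden-unit count; the abstract does not specify which hyperparameters were tuned, so that mapping is an assumption.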
Rezaee, K. (2024). An advanced deep learning structure for accurate student activity recognition and health monitoring using smartphone accelerometer data. Health Management & Information Science, 11(2), 85-97. doi: 10.30476/jhmi.2024.103265.1226