Cross-view Gait Recognition Based on Fine-Tuned Deep Networks

Access
info:eu-repo/semantics/openAccess

Date
15 May 2024

Citation
Scopus EXPORT DATE: 05 March 2025
@CONFERENCE{Yaprak2024,
  url = {https://www.scopus.com/inward/record.uri?eid=2-s2.0-85200852754&doi=10.1109%2fSIU61531.2024.10600941&partnerID=40&md5=22bc212710d9fc57a2773eadb94d575e},
  affiliations = {Yazılım Mühendisliği Bölümü, Gümüşhane Üniversitesi, Gümüşhane, Turkey; Yazılım Mühendisliği Bölümü, Karadeniz Teknik Üniversitesi, Trabzon, Turkey; Yapay Zeka Mühendisliği Bölümü, Trabzon Üniversitesi, Trabzon, Turkey},
  correspondence_address = {B. Yaprak; Yazılım Mühendisliği Bölümü, Gümüşhane Üniversitesi, Gümüşhane, Turkey; email: busra.kucukugurlu@gumushane.edu.tr},
  publisher = {Institute of Electrical and Electronics Engineers Inc.},
  isbn = {979-835038896-1},
  language = {Turkish},
  abbrev_source_title = {IEEE Conf. Signal Process. Commun. Appl., SIU - Proc.}
}

Abstract
Gait recognition is a biometrics-based computer vision process used to identify people by their walking styles. Compared with other biometric modalities, gait offers a more convenient recognition process because it does not require high-resolution, close-range images and can be captured without contact. However, gait biometrics is strongly affected by cross-view variation, under which recognition performance drops significantly. In this study, the performance of fine-tuned VGG-16 and ResNet-50 deep CNNs is evaluated on the cross-view gait recognition problem. For this purpose, gait energy images (GEIs) and silhouettes obtained from CASIA-B, the most comprehensive dataset in gait recognition, are given as input to the networks. The experimental results show that the VGG-16 network achieves higher recognition rates in cross-view gait recognition. © 2024 IEEE.
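
As an illustration of the approach summarized above, the following is a minimal sketch of fine-tuning an ImageNet-pretrained VGG-16 to classify gait identities from GEIs. It assumes PyTorch/torchvision, a 124-class output head (the number of CASIA-B subjects), and a frozen convolutional backbone; the record does not specify the framework, layer choices, or training hyperparameters, so these are illustrative assumptions rather than the authors' exact setup.

import torch
import torch.nn as nn
from torchvision import models

NUM_SUBJECTS = 124  # CASIA-B contains 124 subjects

# Load VGG-16 with ImageNet weights and replace the 1000-way classifier
# with a gait-identity head (assumed fine-tuning strategy).
model = models.vgg16(weights=models.VGG16_Weights.IMAGENET1K_V1)
model.classifier[6] = nn.Linear(model.classifier[6].in_features, NUM_SUBJECTS)

# Assumption: freeze the convolutional backbone and train only the classifier layers.
for param in model.features.parameters():
    param.requires_grad = False

optimizer = torch.optim.Adam((p for p in model.parameters() if p.requires_grad), lr=1e-4)
criterion = nn.CrossEntropyLoss()

def train_step(geis: torch.Tensor, labels: torch.Tensor) -> float:
    # geis: batch of GEIs resized to 224x224 and replicated to 3 channels, shape (N, 3, 224, 224)
    # labels: subject identity indices, shape (N,)
    model.train()
    optimizer.zero_grad()
    loss = criterion(model(geis), labels)
    loss.backward()
    optimizer.step()
    return loss.item()

A ResNet-50 variant of this sketch would differ only in replacing the network's final fully connected layer (model.fc) instead of classifier[6].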
URI
scopus.com/record/display.uri?eid=2-s2.0-85200852754&origin=SingleRecordEmailAlert&dgcid=raven_sc_affil_en_us_email&txGid=3e7c0387fd0f6739aa75d838fadc8a55
https://hdl.handle.net/20.500.12440/6406