
Show simple item record

dc.contributor.author	Benli, Emrah
dc.contributor.author	Motai, Yuichi
dc.contributor.author	Rogers, John
dc.date.accessioned	2021-11-09T19:50:00Z
dc.date.available	2021-11-09T19:50:00Z
dc.date.issued	2020
dc.identifier.issn	1932-8184
dc.identifier.issn	1937-9234
dc.identifier.uri	https://doi.org/10.1109/JSYST.2019.2958747
dc.identifier.uri	https://hdl.handle.net/20.500.12440/4175
dc.description.abstract	Visual perception is an important component of human-robot interaction in robotic systems. Interaction between humans and robots depends on the reliability of the robotic vision systems. The variety of camera sensors and the capability of these sensors to detect many types of sensory inputs improve visual perception. The analysis of activities, motions, skills, and behaviors of humans and robots has been addressed by utilizing the heat signatures of the human body. Human motion behavior is analyzed through body movement kinematics, and the trajectory of the target is used to identify objects and the human target in omnidirectional (O-D) thermal images. Human target identification and gesture recognition with traditional sensors is problematic in multitarget scenarios, since these sensors may not keep all targets in their narrow field of view (FOV) at the same time. The O-D thermal view widens the robots' line of sight and improves perception in the absence of light. The human target is informed of its position, surrounding objects, and any other human targets in its proximity, so that humans with limited vision or a vision disability can be assisted to better navigate their environment. The proposed method identifies human targets in a wide FOV under light-independent conditions to assist the human target and improve human-robot and robot-robot interactions. The experimental results show that human targets are identified with high accuracy.	en_US
dc.description.sponsorship	U.S. Navy, Naval Surface Warfare Center Dahlgren; U.S. Army Research Laboratory (ARL), United States Department of Defense; Ministry of National Education of Turkey	en_US
dc.description.sponsorship	This work was supported in part by the U.S. Navy, Naval Surface Warfare Center Dahlgren, in part by the U.S. Army Research Laboratory, and in part by the Ministry of National Education of Turkey.	en_US
dc.language.iso	eng	en_US
dc.publisher	IEEE-Inst Electrical Electronics Engineers Inc	en_US
dc.relation.ispartof	IEEE Systems Journal	en_US
dc.rights	info:eu-repo/semantics/closedAccess	en_US
dc.subject	Kinematics	en_US
dc.subject	Robot sensing systems	en_US
dc.subject	Visual perception	en_US
dc.subject	Legged locomotion	en_US
dc.subject	Head	en_US
dc.subject	Command cognition	en_US
dc.subject	human motion analysis	en_US
dc.subject	multiple human targets	en_US
dc.subject	multiple robots	en_US
dc.subject	omnidirectional (O-D) camera	en_US
dc.subject	robotic perception	en_US
dc.subject	target identification	en_US
dc.subject	thermal vision	en_US
dc.subject	visual perception	en_US
dc.subject	walking behavior	en_US
dc.title	Visual Perception for Multiple Human-Robot Interaction From Motion Behavior	en_US
dc.type	article	en_US
dc.relation.publicationcategory	Article - International Peer-Reviewed Journal - Institutional Faculty Member	en_US
dc.description.wospublicationid	WOS:000543049900131	en_US
dc.description.scopuspublicationid	2-s2.0-85086070002	en_US
dc.department	Gümüşhane Üniversitesi	en_US
dc.authorid	Motai, Yuichi / 0000-0002-1957-1896
dc.authorid	Benli, Emrah / 0000-0001-8579-0539
dc.identifier.volume	14	en_US
dc.identifier.issue	2	en_US
dc.identifier.startpage	2937	en_US
dc.identifier.doi	10.1109/JSYST.2019.2958747
dc.identifier.endpage	2948	en_US
dc.authorwosid	Motai, Yuichi / G-9740-2017
dc.authorwosid	Benli, Emrah / AAC-5089-2019
dc.authorscopusid	57194856620
dc.authorscopusid	9734468900
dc.authorscopusid	57207977901


Files in this item:

There are no files associated with this item.

This item appears in the following collection(s).
