

dc.contributor.author      Benli, Emrah
dc.contributor.author      Spidalieri, Richard Lee
dc.contributor.author      Motai, Yuichi
dc.date.accessioned        2021-11-09T19:48:51Z
dc.date.available          2021-11-09T19:48:51Z
dc.date.issued             2019
dc.identifier.issn         1551-3203
dc.identifier.issn         1941-0050
dc.identifier.uri          https://doi.org/10.1109/TII.2019.2908626
dc.identifier.uri          https://hdl.handle.net/20.500.12440/3833
dc.description.abstract    Collaborative robotic configurations for monitoring and tracking human targets have attracted interest in the fourth industrial revolution. The fusion of different types of sensors embedded in collaborative robotic systems achieves high-quality information and significantly improves robotic perception. However, current methods have not deeply explored the capabilities of thermal multisensory configurations in human-oriented tasks. In this paper, we propose thermal multisensor fusion (TMF) for collaborative robots to overcome the limitations of stand-alone robots. Thermal vision exploits the heat signature of the human body for human-oriented tracking. An omnidirectional (O-D) infrared (IR) sensor provides a wide field of view (FOV) to detect human targets, and Stereo IR helps determine the distance of the human target in the oriented direction. The fusion of O-D IR and Stereo IR also creates a multisensor stereo for an additional determination of the distance to the target. The fusion of thermal and O-D sensors combines their individual advantages while compensating for their limited stand-alone prediction accuracy. The maximum a posteriori method is used to predict the distance of the target with high accuracy by weighting the distance results of TMF stereo from multiple platforms according to the reliability of each sensor, rather than relying on visible-band tracking methods. The proposed method tracks the distance estimate of each sensor instead of the target trajectory, as visible-band methods do. We show that TMF increases robot perception by offering a wide FOV and provides precise target localization for collaborative robots.   en_US
dc.description.sponsorship   U.S. Navy, Naval Surface Warfare Center Dahlgren; U.S. Army Research Laboratory (ARL), United States Department of Defense; Ministry of National Education of Turkey   en_US
dc.description.sponsorship   This work was supported in part by the U.S. Navy, Naval Surface Warfare Center Dahlgren, in part by the U.S. Army Research Laboratory, and in part by the Ministry of National Education of Turkey. Paper no. TII-17-2046. (Corresponding author: Yuichi Motai.)   en_US
dc.language.iso            eng   en_US
dc.publisher               IEEE-Inst Electrical Electronics Engineers Inc   en_US
dc.relation.ispartof       IEEE Transactions on Industrial Informatics   en_US
dc.rights                  info:eu-repo/semantics/closedAccess   en_US
dc.subject                 Collaborative robotics   en_US
dc.subject                 far infrared (IR) camera   en_US
dc.subject                 mobile robot   en_US
dc.subject                 multisensor   en_US
dc.subject                 omnidirectional (O-D) camera   en_US
dc.subject                 sensor fusion   en_US
dc.subject                 stereo IR   en_US
dc.subject                 target tracking   en_US
dc.subject                 thermal vision   en_US
dc.title                   Thermal Multisensor Fusion for Collaborative Robotics   en_US
dc.type                    article   en_US
dc.relation.publicationcategory   Article - International Peer-Reviewed Journal - Institutional Faculty Member   en_US
dc.description.wospublicationid      WOS:000474628100003   en_US
dc.description.scopuspublicationid   2-s2.0-85068593113   en_US
dc.department              Gümüşhane Üniversitesi   en_US
dc.authorid                Motai, Yuichi / 0000-0002-1957-1896
dc.authorid                Benli, Emrah / 0000-0001-8579-0539
dc.identifier.volume       15   en_US
dc.identifier.issue        7   en_US
dc.identifier.startpage    3784   en_US
dc.identifier.doi          10.1109/TII.2019.2908626
dc.identifier.endpage      3795   en_US
dc.authorwosid             Motai, Yuichi / G-9740-2017
dc.authorwosid             Benli, Emrah / AAC-5089-2019
dc.authorscopusid          57194856620
dc.authorscopusid          57209747760
dc.authorscopusid          9734468900
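
The abstract above states that the target distance is predicted with a maximum a posteriori (MAP) method that weights the distance results of TMF stereo from multiple platforms by sensor reliability. The record does not reproduce the paper's formulation, so the sketch below is only a minimal illustration under assumed independent Gaussian measurement noise, where MAP fusion with a flat prior reduces to an inverse-variance weighted mean; every name, constant, and the stereo-geometry helper here is hypothetical, not taken from the paper.

    import numpy as np

    def stereo_distance(disparity_px, focal_px, baseline_m):
        # Classic pinhole stereo geometry, Z = f * B / d: a hypothetical
        # helper standing in for how a Stereo IR pair yields target distance.
        return focal_px * baseline_m / disparity_px

    def map_fuse_distances(estimates_m, variances_m2):
        # With independent Gaussian measurement noise and a flat prior, the
        # MAP estimate of the true distance reduces to the inverse-variance
        # (precision) weighted mean of the per-sensor estimates.
        z = np.asarray(estimates_m, dtype=float)
        w = 1.0 / np.asarray(variances_m2, dtype=float)  # precision = reliability weight
        return float(np.sum(w * z) / np.sum(w))

    # Hypothetical readings: a Stereo IR estimate, a fused O-D/stereo
    # "multisensor stereo" estimate, and an estimate from a second platform.
    z_stereo = stereo_distance(disparity_px=24.0, focal_px=600.0, baseline_m=0.12)
    estimates = [z_stereo, 3.15, 2.90]   # metres
    variances = [0.05, 0.20, 0.15]       # smaller variance = more reliable sensor
    print(round(map_fuse_distances(estimates, variances), 3))   # ~= 3.003

Under these assumptions the fused estimate is pulled toward the lowest-variance sensor, which is one concrete reading of the abstract's phrase "according to the reliability of the sensors".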


Files in this item:

There are no files associated with this item.

This item appears in the following collection(s).
