Authors: Alfredo Chávez, Henrik Karstoft
Publisher: MDPI
E-ISSN: 1424-8220
ISSN: 1424-8220
Source: Sensors, Vol. 12, Iss. 4, 2012-03, pp. 3868-3878
Abstract
To enhance sensor capabilities, sensor data readings from different modalities must be fused. The main contribution of this paper is a sensor data fusion approach that reduces the limitations of the Kinect™ sensor by combining it with a laser range sensor. The sensor data are modelled in a 3D environment based on octrees using probabilistic occupancy estimation. A Bayesian method, which accounts for the uncertainty inherent in the sensor measurements, is used to fuse the sensor information and update the 3D octree map. The fusion yields a significant increase in the field of view of the Kinect™ sensor that can be exploited for robot tasks.
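The abstract describes a Bayesian update of a probabilistic 3D occupancy map built on octrees. The sketch below illustrates the standard log-odds form of such an update, with a flat voxel dictionary standing in for the octree; the class and function names and the per-sensor hit/miss probabilities are illustrative assumptions, not values taken from the paper.

```python
# Minimal sketch of a Bayesian (log-odds) occupancy update that fuses
# laser and Kinect depth readings into a shared voxel map.
# The dict-based VoxelMap stands in for the paper's octree; all names and
# sensor-model probabilities below are illustrative assumptions.
import math

def logodds(p):
    return math.log(p / (1.0 - p))

# Assumed inverse sensor models: probability that a voxel is occupied
# given a "hit" or a "miss" reading from each modality.
P_HIT = {"laser": 0.85, "kinect": 0.70}
P_MISS = {"laser": 0.30, "kinect": 0.40}

class VoxelMap:
    """Stand-in for an octree: maps integer voxel indices to log-odds."""
    def __init__(self, resolution=0.05, prior=0.5):
        self.res = resolution
        self.l0 = logodds(prior)      # prior log-odds (0 for p = 0.5)
        self.cells = {}               # (ix, iy, iz) -> log-odds

    def _key(self, xyz):
        return tuple(int(math.floor(c / self.res)) for c in xyz)

    def update(self, xyz, sensor, hit):
        """Bayesian update: add the measurement's log-odds to the cell."""
        p = P_HIT[sensor] if hit else P_MISS[sensor]
        k = self._key(xyz)
        l = self.cells.get(k, self.l0)
        self.cells[k] = l + logodds(p) - self.l0

    def occupancy(self, xyz):
        """Convert the stored log-odds back to an occupancy probability."""
        l = self.cells.get(self._key(xyz), self.l0)
        return 1.0 - 1.0 / (1.0 + math.exp(l))

# Example: the same voxel observed by both modalities reinforces the belief.
m = VoxelMap()
m.update((1.02, 0.51, 0.33), "kinect", hit=True)
m.update((1.02, 0.51, 0.33), "laser", hit=True)
print(round(m.occupancy((1.02, 0.51, 0.33)), 3))   # > 0.5, fused estimate
```

Because each modality carries its own hit/miss probabilities, readings from one sensor can strengthen or correct the belief in regions the other covers poorly, which is the mechanism behind the enlarged effective field of view described in the abstract.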
Related content
Visualization of Concrete Slump Flow Using the Kinect Sensor
Sensors, Vol. 18, Iss. 3, 2018-03
Categorization of Indoor Places Using the Kinect Sensor
By Oscar Martinez Mozos, Hitoshi Mizutani, Ryo Kurazume, Tsutomu Hasegawa
Sensors, Vol. 12, Iss. 5, 2012-05
Automatic Recognition of Aggressive Behavior in Pigs Using a Kinect Depth Sensor
By Jonguk Lee, Long Jin, Daihee Park, Yongwha Chung
Sensors, Vol. 16, Iss. 5, 2016-05
Detection of Cardiopulmonary Activity and Related Abnormal Events Using Microsoft Kinect Sensor
Sensors, Vol. 18, Iss. 3, 2018-03
Spatial Uncertainty Model for Visual Features Using a Kinect™ Sensor
By Jae-Han Park, Yong-Deuk Shin, Ji-Hun Bae, Moon-Hong Baeg
Sensors, Vol. 12, Iss. 7, 2012-06