MRsensing - Environmental Monitoring and Context Recognition with Cooperative Mobile Robots in Catastrophic Incidents

Multi-sensor information fusion theory concerns environmental perception activities that combine data from multiple sensory sources. Humans, like other animals, gather information from the surrounding environment using different biological sensors; combining these inputs allows them to structure decisions and actions when interacting with the environment. Under disaster conditions, effective multi-robot sensor information fusion can yield better situation awareness to support collective decision-making. Mobile robots can gather information from the environment by combining data from different sensors as a way to organize decisions and augment human perception. This is especially useful for retrieving contextual environmental information in catastrophic incidents where human perception may be limited (e.g., lack of visibility). To that end, this work proposes a specific configuration of sensors assembled on a mobile robot, which can be used as a proof of concept to measure important environmental variables in an urban search and rescue (USAR) mission, such as toxic gas density, temperature gradient and smoke particle density. These data are processed through a support vector machine classifier with the purpose of detecting relevant contexts in the course of the mission. The outcome of the experiments conducted with TraxBot and Pioneer-3DX robots under the Robot Operating System framework opens the door to new multi-robot applications in USAR scenarios. This work was developed within the CHOPIN research project, which aims at exploiting the cooperation between human and robotic teams in catastrophic accidents.
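
As a minimal illustrative sketch (not the authors' implementation, which the abstract does not detail), the context-detection step could be prototyped in Python with scikit-learn: the three measured variables (toxic gas density, temperature gradient, smoke particle density) form a feature vector fed to a support vector machine. All feature values, class labels, and parameters below are assumptions for illustration only.

# Hypothetical SVM context classifier over three environmental features.
# Training data, labels, and kernel settings are illustrative assumptions.
import numpy as np
from sklearn.svm import SVC
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

# Each row: [gas_density, temperature_gradient, smoke_particle_density]
X_train = np.array([
    [0.02, 0.1, 0.01],   # clear area
    [0.05, 0.3, 0.05],
    [0.60, 2.5, 0.40],   # fire outbreak
    [0.75, 3.0, 0.55],
    [0.40, 0.5, 0.70],   # dense smoke, no strong heat source
    [0.35, 0.4, 0.80],
])
y_train = ["clear", "clear", "fire", "fire", "smoke", "smoke"]

# Standardize the features, then fit an RBF-kernel SVM.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0, gamma="scale"))
clf.fit(X_train, y_train)

# Classify a new sensor reading gathered by the robot during the mission.
reading = np.array([[0.55, 2.2, 0.35]])
print(clf.predict(reading))  # e.g. ["fire"]

In a ROS-based deployment such a classifier would typically run in a node that subscribes to the fused sensor readings and publishes the detected context label, but the exact integration is not specified in the abstract.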