
Journal of Huazhong University of Science and Technology (Natural Science Edition) 2020, Vol. 48, Issue (9): 25-30  DOI: 10.13245/j.hust.200902

Section: Computer and Control Engineering
          Mobile robot navigation based on situational experience and sparse point cloud
LIU Dong, CHEN Fei, ZOU Qiang, CONG Ming
School of Mechanical Engineering, Dalian University of Technology, Dalian 116024, China
Abstract: To address the insufficiency of information in sparse point cloud maps for autonomous navigation tasks, a novel mobile robot navigation system is proposed that achieves environmental cognition by fusing situational experience with a sparse point cloud map, computing the globally optimal path while improving navigation accuracy. A sparse point cloud map is built to store significant landmarks in the environment. Inspired by the way humans navigate from situational experience, the process of experience accumulation is simulated to construct a situational experience map that encapsulates scene perception, pose information, and an event-transition set, giving the robot a topological understanding of the environment. The robot localizes itself with the sparse point cloud map while planning paths and controlling its behavior with the situational experience map. Experimental results show that the navigation system can generate the globally optimal path for different navigation tasks with high navigation accuracy.
Keywords: mobile robot; situational experience; sparse point cloud map; path planning; navigation
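The abstract describes a situational experience map as a topological graph whose nodes encapsulate scene perception and pose, with edges recording event transitions, over which a globally optimal path is planned. A minimal sketch of that data structure is given below; the class and field names, the cost model, and the choice of Dijkstra's algorithm as the planner are illustrative assumptions, not the paper's actual implementation.

```python
import heapq
from dataclasses import dataclass, field

@dataclass
class Experience:
    """One episodic experience node: a pose (as estimated against the
    sparse point cloud map) plus an appearance descriptor of the scene."""
    node_id: int
    pose: tuple   # (x, y, theta); illustrative representation
    scene: list   # scene-perception descriptor; illustrative

@dataclass
class ExperienceMap:
    """Topological graph of experiences; edges are event transitions
    weighted by an assumed traversal cost (e.g. odometric distance)."""
    nodes: dict = field(default_factory=dict)
    edges: dict = field(default_factory=dict)  # node_id -> [(neighbor, cost)]

    def add_experience(self, exp: Experience) -> None:
        self.nodes[exp.node_id] = exp
        self.edges.setdefault(exp.node_id, [])

    def add_transition(self, a: int, b: int, cost: float) -> None:
        # Record a bidirectional event transition between two experiences.
        self.edges[a].append((b, cost))
        self.edges[b].append((a, cost))

    def plan(self, start: int, goal: int):
        """Dijkstra shortest path over the experience graph;
        returns the list of node ids from start to goal, or None."""
        dist = {start: 0.0}
        prev = {}
        pq = [(0.0, start)]
        while pq:
            d, u = heapq.heappop(pq)
            if u == goal:
                break
            if d > dist.get(u, float("inf")):
                continue  # stale queue entry
            for v, w in self.edges[u]:
                nd = d + w
                if nd < dist.get(v, float("inf")):
                    dist[v] = nd
                    prev[v] = u
                    heapq.heappush(pq, (nd, v))
        if goal != start and goal not in prev:
            return None  # goal unreachable from start
        path = [goal]
        while path[-1] != start:
            path.append(prev[path[-1]])
        return path[::-1]
```

In this sketch, localization against the sparse point cloud map is assumed to happen elsewhere and only supplies each node's pose; the experience graph is used purely for topological path planning, mirroring the division of labor the abstract describes.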
Funding: National Natural Science Foundation of China (61503057); Key Project of the Natural Science Foundation of Liaoning Province (20180520017); Dalian Science and Technology Innovation Fund (2018J12GX035).

CLC number: TP242
Document code: A
Article ID: 1671-4512(2020)09-0025-06
How to cite this article:
LIU Dong, CHEN Fei, ZOU Qiang, CONG Ming. Mobile robot navigation based on situational experience and sparse point cloud[J]. Journal of Huazhong University of Science and Technology (Natural Science Edition), 2020, 48(9): 25-30.
DOI: 10.13245/j.hust.200902