
Fast Recognition Method of Multi-feature Trunk Based on RealSense Depth Camera

Fund projects:

National Key R&D Program of China (2018YFD0201400), Key R&D Program of Jiangsu Province (BE2018372), Natural Science Foundation of Jiangsu Province (BK20181443), and Key R&D Program of Zhenjiang City (NY2018001)



Abstract:

    To address the complex environmental backgrounds and large variations in illumination intensity that agricultural robots face during orchard positioning and navigation, a fast trunk recognition method based on an RGB-D camera was proposed, exploiting color, depth, width, and parallel-edge features. First, a RealSense depth camera was used to acquire color images and depth data of the orchard. The color image was then converted to HSV color space, superpixel segmentation was applied to the S component, and adjacent superpixel blocks with similar color and depth features were merged. Next, trunk width detection was performed on the depth image, and objects whose width confidence exceeded a threshold were retained as candidates. Finally, parallel-edge detection was applied to each candidate: region-of-interest (ROI) windows were selected along the candidate's edges for edge detection and a search for possible straight trunk edges, and when the edge confidence RB of an object exceeded the set threshold TLB, the object was recognized as a trunk. Extracting multiple trunk features effectively improved trunk recognition accuracy across different environments. Field tests on a mobile robot platform in an orchard evaluated the algorithm under strong, normal, and weak illumination. The results showed trunk recognition accuracies of 92.38%, 91.35%, and 89.86%, with average processing times of 0.54 s, 0.66 s, and 0.76 s per frame, respectively, demonstrating that the method performs trunk recognition in orchard environments stably and quickly.
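The first stage above (HSV conversion and merging of adjacent superpixel blocks by color and depth similarity) can be sketched as follows. This is a minimal illustration, not the paper's implementation: the S-channel formula is the standard RGB-to-HSV conversion, while the merge tolerances `s_tol` and `d_tol` are assumed values, and the superpixel segmentation itself (e.g. SLIC) is not reproduced here.

```python
import numpy as np

def s_channel(rgb):
    """Compute the HSV saturation (S) component of an RGB image.

    S = (max - min) / max per pixel (0 where max == 0), matching the
    standard RGB->HSV conversion applied before superpixel segmentation.
    """
    rgb = rgb.astype(np.float64) / 255.0
    cmax = rgb.max(axis=-1)
    cmin = rgb.min(axis=-1)
    s = np.zeros_like(cmax)
    np.divide(cmax - cmin, cmax, out=s, where=cmax > 0)
    return s

def should_merge(s_a, s_b, depth_a, depth_b, s_tol=0.1, d_tol=0.15):
    """Merge rule for two adjacent superpixel blocks: their mean S values
    and mean depths must both lie within tolerance (tolerances are
    illustrative assumptions, not the paper's values)."""
    return (abs(s_a.mean() - s_b.mean()) < s_tol
            and abs(depth_a.mean() - depth_b.mean()) < d_tol)

# Toy example: a saturated red patch vs. a gray patch.
img = np.zeros((2, 2, 3), dtype=np.uint8)
img[0] = [255, 0, 0]      # pure red -> S = 1.0
img[1] = [128, 128, 128]  # gray     -> S = 0.0
s = s_channel(img)
print(s[0, 0], s[1, 0])   # 1.0 0.0
```

Working on the S channel alone makes the segmentation less sensitive to brightness changes than segmenting the RGB image directly, which matches the paper's goal of robustness to varying illumination.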

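The width-feature stage checks whether a candidate object's physical width is plausible for a trunk. The abstract does not give the formula, so the sketch below assumes the usual pinhole-camera relation W = w_px · Z / fx and a linear confidence score; the focal length and expected trunk width are illustrative, not values from the paper.

```python
def physical_width(pixel_width, depth_m, fx=615.0):
    """Convert an object's pixel width to metres via the pinhole model:
    W = w_px * Z / fx. fx = 615 px is a typical RealSense D435 value
    (an assumption; read the real intrinsics from the device)."""
    return pixel_width * depth_m / fx

def width_confidence(width_m, expected=0.12, tol=0.08):
    """Illustrative confidence: 1.0 at the expected trunk width,
    falling linearly to 0.0 at expected +/- tol."""
    return max(0.0, 1.0 - abs(width_m - expected) / tol)

# A 60 px wide object at 1.2 m depth is ~0.117 m wide: trunk-like.
w = physical_width(pixel_width=60, depth_m=1.2)
print(round(w, 3), round(width_confidence(w), 2))   # 0.117 0.96
```

Because the width test uses the depth image rather than apparent pixel size alone, a distant wide object and a nearby trunk are not confused, which is why candidates can be filtered before the more expensive edge analysis.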
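The final stage searches the candidate's edge ROIs for straight, near-parallel edges and thresholds a confidence RB against TLB. The toy check below assumes edge points have already been extracted (e.g. by an edge detector inside the ROI windows); the scoring formula, tilt limit, and threshold are illustrative stand-ins, since the paper's exact RB/TLB definitions are not given in the abstract.

```python
import numpy as np

def edge_angle_deg(points):
    """Fit x = a*y + b to edge points given as (y, x) rows and return the
    line's deviation from vertical in degrees. Fitting x on y keeps the
    fit stable for the near-vertical edges of a trunk."""
    y, x = points[:, 0], points[:, 1]
    a, _ = np.polyfit(y, x, 1)
    return np.degrees(np.arctan(a))

def is_trunk_edges(left_pts, right_pts, t_lb=0.8, max_tilt=15.0):
    """Toy parallel-edge test: both edges must be near vertical, and the
    confidence rb (1.0 when perfectly parallel) must exceed t_lb."""
    a1, a2 = edge_angle_deg(left_pts), edge_angle_deg(right_pts)
    if abs(a1) > max_tilt or abs(a2) > max_tilt:
        return False
    rb = 1.0 - abs(a1 - a2) / max_tilt
    return rb > t_lb

# Two slightly tilted, parallel edges: x = 0.02*y + b (b = 10 and 40).
ys = np.arange(0, 100, dtype=float)
left = np.stack([ys, 0.02 * ys + 10.0], axis=1)
right = np.stack([ys, 0.02 * ys + 40.0], axis=1)
print(is_trunk_edges(left, right))   # True
```

Requiring two parallel, near-vertical edges rejects objects that pass the width test but lack trunk-like geometry (e.g. foliage clumps or irregular obstacles), which is how the multi-feature cascade keeps the false-positive rate down.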

Cite this article:

SHEN Yue, ZHUANG Zhenzhen, LIU Hui, JIANG Jianbin, OU Mingxiong. Fast Recognition Method of Multi-feature Trunk Based on RealSense Depth Camera[J]. Transactions of the Chinese Society for Agricultural Machinery, 2022, 53(4): 304-312.

History
  • Received: 2021-03-29
  • Published online: 2021-06-10