
Solid Waste Identification in Urban-rural Fringe Areas Based on UAV Remote Sensing and Weakly Supervised Learning
Fund Project: National Key Research and Development Program of China (2022YFB3903504)



Abstract:

Aiming to address the challenges of difficult annotation, complex features, and low boundary extraction precision for solid waste in remote sensing imagery, a two-stage method based on weakly supervised learning was proposed. In the first stage, an image-level labeled dataset was used for comparative experiments among five network models, and the Swin Transformer was ultimately selected as the feature learning model. Gradient-weighted class activation mapping (Grad-CAM) was then applied to visualize feature regions and obtain heatmaps, which were further processed with a combination of adaptive thresholding and color difference methods to extract a coarse outline of the solid waste. In the second stage, the DeepSnake model was applied to refine these contours. Experiments were conducted on UAV multispectral remote sensing imagery of six typical urban-rural fringe areas within the Langfang Development Zone, Hebei Province. In the first stage, testing of the five network models revealed a pronounced advantage for the Swin Transformer, with a precision of 93.8%, a recall of 95.0%, and an F1 score of 94.4%; visualization of attention regions likewise indicated the best recognition performance. The coarse outline extraction combining adaptive thresholding and color difference proved superior in the binarization comparison experiment. In the second stage, fine contour extraction was evaluated quantitatively with the average precision (AP) metric from the COCO dataset, yielding an AP of 91.3% at an IoU of 0.5 and 77.5% at an IoU of 0.75; a qualitative comparison of the contours from the two stages further highlighted the refinement effect of DeepSnake. The results demonstrate that the proposed method can accurately identify and extract solid waste by using an image-level labeled dataset, offers a pronounced accuracy advantage, and provides a viable approach for the ecological environment management of urban and rural areas in China.
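The core of the first stage described above (a Grad-CAM heatmap reduced to a coarse mask by an adaptive threshold) and the evaluation metrics quoted in the abstract (F1 from precision/recall, mask IoU) can be sketched minimally as follows. This is an illustrative assumption, not the paper's implementation: the heatmap here is synthetic, Otsu's method stands in for the adaptive threshold, and the color difference and DeepSnake refinement steps are omitted.

```python
import numpy as np


def otsu_threshold(gray: np.ndarray) -> float:
    """Otsu's method: choose the cut that maximizes between-class variance."""
    hist, edges = np.histogram(gray.ravel(), bins=256, range=(0.0, 1.0))
    prob = hist.astype(float) / hist.sum()
    centers = (edges[:-1] + edges[1:]) / 2.0
    w0 = np.cumsum(prob)                    # weight of class 0 (below cut)
    w1 = 1.0 - w0                           # weight of class 1 (above cut)
    mu = np.cumsum(prob * centers)          # cumulative mean
    mu_total = mu[-1]
    with np.errstate(divide="ignore", invalid="ignore"):
        var_between = (mu_total * w0 - mu) ** 2 / (w0 * w1)
    var_between = np.nan_to_num(var_between)
    return float(centers[int(np.argmax(var_between))])


def f1_score(precision: float, recall: float) -> float:
    """Harmonic mean of precision and recall."""
    return 2.0 * precision * recall / (precision + recall)


def mask_iou(a: np.ndarray, b: np.ndarray) -> float:
    """Intersection over union of two binary masks."""
    a, b = a.astype(bool), b.astype(bool)
    return np.logical_and(a, b).sum() / np.logical_or(a, b).sum()


# Synthetic Grad-CAM heatmap: a "hot" waste patch on a cool background.
rng = np.random.default_rng(0)
heat = rng.uniform(0.0, 0.3, size=(64, 64))
heat[20:44, 20:44] = rng.uniform(0.7, 1.0, size=(24, 24))

t = otsu_threshold(heat)
coarse_mask = heat >= t                     # coarse solid-waste region

# Hypothetical ground-truth mask for the synthetic patch.
gt = np.zeros((64, 64), dtype=bool)
gt[20:44, 20:44] = True
```

On this toy heatmap the threshold lands between the two intensity modes, so the coarse mask closely matches the patch; in the paper this coarse mask is what the second-stage DeepSnake model would then refine. The F1 figure quoted in the abstract can be checked directly: `f1_score(0.938, 0.950)` rounds to 0.944, matching the reported 94.4%.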

Cite this article:

馮權(quán)瀧,張鑫虹,師宏大,牛博文,陳泊安,高秉博.基于無人機遙感和弱監(jiān)督學(xué)習(xí)的城鄉(xiāng)結(jié)合部固體廢棄物識別[J].農(nóng)業(yè)機械學(xué)報,2025,56(4):303-312. FENG Quanlong, ZHANG Xinhong, SHI Hongda, NIU Bowen, CHEN Boan, GAO Bingbo. Solid Waste Identification in Urban-rural Fringe Areas Based on UAV Remote Sensing and Weakly Supervised Learning[J]. Transactions of the Chinese Society for Agricultural Machinery,2025,56(4):303-312.

復(fù)制
分享
文章指標(biāo)
  • 點擊次數(shù):
  • 下載次數(shù):
  • HTML閱讀次數(shù):
  • 引用次數(shù):
History
  • Received: 2024-03-04
  • Published online: 2025-04-10