
Extraction and Area Calculation of Chicken Comb and Wattle Based on YOLO v7 and Optimized U-Net Network

Authors: YANG Duanli, SHEN Hongshuo, CHEN Hui, GAO Yuan

Funding: National Natural Science Foundation of China (32172779); National Modern Agricultural Industry Technology System Project of the Ministry of Finance and the Ministry of Agriculture and Rural Affairs (CARS-40); Hebei Modern Agricultural Industry Technology System Construction Special Fund (HBCT2023210201); Baoding Science and Technology Plan Project (2211N014)

    Abstract:

    Traditional manual measurement of comb and wattle area in laying hens poses contact-induced stress risks, zoonotic disease hazards, and substantial measurement error. This study proposed an automatic segmentation and area-calculation method integrating YOLO v7 with an improved U-Net. Three components were presented: a two-stage detection framework in which YOLO v7 performs head-pose screening and ROI extraction, effectively eliminating interference from off-angle images; a CoT-UNet model that embeds Contextual Transformer (CoT) blocks into the U-Net encoder to fuse dynamic and static contextual features, combined with a purpose-built DyC-UP upsampling module whose dynamically adjustable convolution kernels strengthen the extraction of irregular edge features and markedly improve segmentation across differing comb shapes; and a pixel-to-area conversion algorithm that maps image space to physical space through a calibration coefficient. Experimental results showed that the improved CoT-UNet outperformed the baseline model by 4.77 and 8.75 percentage points in IoU for comb and wattle segmentation respectively, with precision gains of 5.31 and 5.06 percentage points. Absolute area errors for the comb (0.62–3.50 cm²) and wattle (0.10–2.93 cm²) were markedly smaller than those of traditional manual measurement (3.58–7.27 cm²). Multi-scenario validation across three postures, two shooting angles, and two shooting distances yielded relative errors of 2.41%–13.62% for comb area and 1.00%–29.21% for wattle area. The method enables non-contact, precise measurement of poultry biometric traits and provides reliable technical support for intelligent breeder selection.
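    As a concrete illustration of the two-stage pipeline, the sketch below chains a detector, a segmenter, and the calibration step in Python. The detector and segmenter callables, the "front_view" pose label, and the mask label convention are hypothetical stand-ins for the trained YOLO v7 and CoT-UNet models; the abstract does not specify these interfaces.

    import numpy as np

    def measure_comb_and_wattle(image, detector, segmenter, cm_per_pixel):
        """Two-stage measurement sketch.

        Assumed (hypothetical) interfaces:
          detector(image) -> list of (x1, y1, x2, y2, pose_label) boxes
          segmenter(crop) -> HxW integer mask {0: background, 1: comb, 2: wattle}
        """
        results = []
        for x1, y1, x2, y2, pose in detector(image):
            if pose != "front_view":      # stage 1: discard off-angle head images
                continue
            crop = image[y1:y2, x1:x2]    # ROI extraction
            mask = segmenter(crop)        # stage 2: pixel-level segmentation
            # stage 3: pixel counts -> physical area via the calibration coefficient
            comb_cm2 = np.count_nonzero(mask == 1) * cm_per_pixel ** 2
            wattle_cm2 = np.count_nonzero(mask == 2) * cm_per_pixel ** 2
            results.append({"comb_cm2": comb_cm2, "wattle_cm2": wattle_cm2})
        return results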
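    The Contextual Transformer (CoT) block referenced in the abstract originates from Li et al.'s Contextual Transformer Networks; below is a simplified PyTorch sketch of the general idea: static context from a grouped k×k convolution over the keys, and dynamic context from an attention map predicted by two consecutive 1×1 convolutions over the concatenated static keys and queries. The sigmoid gating standing in for the paper's local matrix attention, the channel-reduction factor, and the additive fusion are simplifications, not the authors' exact CoT-UNet design.

    import torch
    import torch.nn as nn

    class CoTBlock(nn.Module):
        """Simplified Contextual Transformer block (dim must be divisible by 4)."""

        def __init__(self, dim: int, kernel_size: int = 3, reduction: int = 4):
            super().__init__()
            # Static contextual keys: grouped k x k convolution.
            self.key_embed = nn.Sequential(
                nn.Conv2d(dim, dim, kernel_size, padding=kernel_size // 2,
                          groups=4, bias=False),
                nn.BatchNorm2d(dim),
                nn.ReLU(inplace=True),
            )
            # Value embedding: 1 x 1 convolution.
            self.value_embed = nn.Sequential(
                nn.Conv2d(dim, dim, 1, bias=False),
                nn.BatchNorm2d(dim),
            )
            # Two consecutive 1 x 1 convolutions predict the dynamic attention
            # from the concatenation of static keys and queries.
            self.attn_embed = nn.Sequential(
                nn.Conv2d(2 * dim, dim // reduction, 1, bias=False),
                nn.BatchNorm2d(dim // reduction),
                nn.ReLU(inplace=True),
                nn.Conv2d(dim // reduction, dim, 1),
            )

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            k_static = self.key_embed(x)                 # static context
            v = self.value_embed(x)
            attn = self.attn_embed(torch.cat([k_static, x], dim=1))
            k_dynamic = torch.sigmoid(attn) * v          # simplified dynamic context
            return k_static + k_dynamic                  # fuse both contexts

    In a U-Net encoder, such a block would replace or follow the standard double convolution at each resolution level, which matches the abstract's description of embedding CoT blocks into the encoder.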
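    DyC-UP is the authors' own module, and the abstract states only that it uses dynamically adjustable convolution kernels during upsampling. The sketch below is therefore an assumption-heavy illustration of that general idea, following the well-known CARAFE-style recipe of predicting a normalised k×k reassembly kernel for every output sub-pixel; it should not be read as the published DyC-UP design.

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class DynamicUpsample(nn.Module):
        """Content-aware 2x upsampling with per-location predicted kernels."""

        def __init__(self, dim: int, kernel_size: int = 5, scale: int = 2):
            super().__init__()
            self.k, self.scale = kernel_size, scale
            # Predict one k*k kernel for each of the scale*scale sub-pixel outputs.
            self.kernel_pred = nn.Conv2d(dim, kernel_size ** 2 * scale ** 2,
                                         3, padding=1)

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            b, c, h, w = x.shape
            k, s = self.k, self.scale
            # Per-location kernels, normalised over the k*k window.
            kernels = self.kernel_pred(x).view(b, k * k, s * s, h, w)
            kernels = F.softmax(kernels, dim=1)
            # Unfold k x k neighbourhoods of the input feature map.
            patches = F.unfold(x, k, padding=k // 2).view(b, c, k * k, h, w)
            # Weighted reassembly for each sub-pixel position.
            out = torch.einsum('bckhw,bkshw->bcshw', patches, kernels)
            out = out.view(b, c, s, s, h, w).permute(0, 1, 4, 2, 5, 3)
            return out.reshape(b, c, h * s, w * s)

    Because the reassembly weights depend on the input content, edge pixels can be upsampled with sharper, direction-sensitive kernels than a fixed transposed convolution provides, which is the plausible motivation behind strengthening irregular comb and wattle boundaries.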
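    The pixel-to-area conversion rests on a single calibration coefficient: the physical width covered by one pixel at the shooting distance, so that each pixel contributes the square of that coefficient to the area. A minimal sketch follows, assuming the coefficient is derived from a reference object of known size in the frame; the abstract does not state how the authors calibrate.

    import numpy as np

    def cm_per_pixel_from_reference(ref_width_px: float, ref_width_cm: float) -> float:
        """Calibration coefficient from a reference object of known width."""
        return ref_width_cm / ref_width_px

    def mask_area_cm2(mask: np.ndarray, cm_per_pixel: float) -> float:
        """Physical area of a binary mask: each pixel covers cm_per_pixel**2 cm^2."""
        return float(np.count_nonzero(mask)) * cm_per_pixel ** 2

    # Example: a 5 cm marker spanning 125 px gives 0.04 cm per pixel, so a
    # 2000-pixel comb mask maps to 2000 * 0.04**2 = 3.2 cm^2.
    coeff = cm_per_pixel_from_reference(125, 5.0)
    area = mask_area_cm2(np.ones((40, 50), dtype=np.uint8), coeff)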

Cite this article:

YANG Duanli, SHEN Hongshuo, CHEN Hui, GAO Yuan. Extraction and Area Calculation of Chicken Comb and Wattle Based on YOLO v7 and Optimized U-Net Network[J]. Transactions of the Chinese Society for Agricultural Machinery, 2025, 56(4): 415-426.

History
  • Received: 2024-03-04
  • Published online: 2025-04-10