PBE-UNet: A Lightweight Progressive Boundary-Enhanced U-Net with Scale-Aware Aggregation for Ultrasound Image Segmentation
Pith reviewed 2026-05-10 13:24 UTC · model grok-4.3
The pith
PBE-UNet segments lesions in ultrasound images more accurately than prior methods by combining scale-aware receptive fields with progressive boundary attention expansion.
A machine-rendered reading of the paper's core claim, the machinery that carries it, and where it could break.
Core claim
PBE-UNet addresses the challenges of scale variation and indistinct boundaries in ultrasound lesion segmentation by first using a scale-aware aggregation module to capture robust multi-scale contextual information through dynamic receptive field adjustment, then applying a boundary-guided feature enhancement module that progressively expands narrow boundary predictions into broader spatial attention maps to better cover wider segmentation error areas and strengthen feature focus on difficult regions.
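The abstract does not give SAAM's equations, so the "dynamic receptive field adjustment" idea can only be sketched here: several convolution branches at different dilation rates, fused by learned softmax gate weights, in the spirit of selective-kernel designs. The branch outputs and gate logits below are placeholders, not the paper's actual module:

```python
import numpy as np

def softmax(x, axis=0):
    # Numerically stable softmax over the branch axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def scale_aware_aggregate(branches, gate_logits):
    """Fuse multi-scale branch outputs with learned softmax weights.

    branches: list of (H, W) arrays, standing in for convolution outputs
    at different dilation rates; gate_logits: per-branch scalars, a
    stand-in for the learned gating network. Illustrative only.
    """
    w = softmax(np.asarray(gate_logits, dtype=float))
    return sum(wi * b for wi, b in zip(w, np.asarray(branches, dtype=float)))
```

With equal logits the fusion reduces to a plain average of the branches; a strongly positive logit on one branch effectively selects that receptive field.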
What carries the argument
The boundary-guided feature enhancement (BGFE) module, which treats boundaries not as fixed masks but as starting points that are progressively widened into attention maps to encompass segmentation error zones.
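The paper's exact expansion operator is not specified in the material quoted here. A minimal sketch of the idea, using plain morphological dilation (a 3x3 max filter) with a decaying weight per ring — an assumption, not the published BGFE — looks like:

```python
import numpy as np

def expand_boundary(boundary, steps=3, decay=0.5):
    """Progressively widen a binary boundary map into a soft attention map.

    Illustrative sketch only: each step dilates the current region by one
    pixel (Chebyshev distance) and assigns the new ring a geometrically
    decaying weight, so attention fades with distance from the boundary.
    """
    attn = boundary.astype(float)
    current = boundary.astype(bool)
    weight = 1.0
    for _ in range(steps):
        padded = np.pad(current, 1)
        dilated = np.zeros_like(current)
        h, w = current.shape
        for dy in (-1, 0, 1):
            for dx in (-1, 0, 1):
                # OR together all 3x3-shifted copies: one dilation step.
                dilated |= padded[1 + dy:1 + dy + h, 1 + dx:1 + dx + w]
        weight *= decay
        attn = np.maximum(attn, weight * dilated.astype(float))
        current = dilated
    return attn
```

A single boundary pixel thus becomes a halo of weights 1.0, 0.5, 0.25, ... — a soft attention map wide enough to cover error regions that extend beyond the thin boundary itself.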
Load-bearing premise
That the gains from the SAAM and BGFE modules hold up on new, unseen ultrasound images and that the outperformance is measured against a complete, fairly tuned set of competing methods.
What would settle it
An independent test on a fresh ultrasound dataset where PBE-UNet fails to exceed the accuracy of the strongest published baseline, or an ablation study where removing either module leaves performance essentially unchanged.
Original abstract
Accurate lesion segmentation in ultrasound images is essential for preventive screening and clinical diagnosis, yet remains challenging due to low contrast, blurry boundaries, and significant scale variations. Although existing deep learning-based methods have achieved remarkable performance, these methods still struggle with scale variations and indistinct tumor boundaries. To address these challenges, we propose a progressive boundary enhanced U-Net (PBE-UNet). Specially, we first introduce a scale-aware aggregation module (SAAM) that dynamically adjusts its receptive field to capture robust multi-scale contextual information. Then, we propose a boundary-guided feature enhancement (BGFE) module to enhance the feature representations. We find that there are large gaps between the narrow boundary and the wide segmentation error areas. Unlike existing methods that treat boundaries as static masks, the BGFE module progressively expands the narrow boundary prediction into broader spatial attention maps. Thus, broader spatial attention maps could effectively cover the wider segmentation error regions and enhance the model's focus on these challenging areas. We conduct expensive experiments on four benchmark ultrasound datasets, BUSI, Dataset B, TN3K, and BP. The experimental results how that our proposed PBE-UNet outperforms state-of-the-art ultrasound image segmentation methods. The code is at https://github.com/cruelMouth/PBE-UNet.
Editorial analysis
A structured set of objections, weighed in public.
Referee Report
Summary. The paper proposes PBE-UNet, a lightweight U-Net architecture for ultrasound lesion segmentation that incorporates a Scale-Aware Aggregation Module (SAAM) to dynamically adjust receptive fields for multi-scale context and a Boundary-Guided Feature Enhancement (BGFE) module that progressively expands narrow boundary predictions into broader attention maps to cover segmentation error regions. It reports superior performance over state-of-the-art methods on four ultrasound benchmarks (BUSI, Dataset B, TN3K, BP) and provides a code link.
Significance. If the empirical superiority holds under controlled re-implementations and ablations, the progressive boundary expansion and scale-aware aggregation could offer a practical, lightweight advance for handling low-contrast and variable-scale features in medical ultrasound, with direct relevance to clinical screening tasks.
major comments (3)
- §4 (Experiments): The central outperformance claim is load-bearing on fair attribution to SAAM and BGFE, yet the manuscript provides no evidence that prior SOTA baselines were re-trained under identical optimizer, augmentation, epoch, and loss schedules; without this, gains may stem from implementation differences rather than the proposed modules.
- §3.2 (BGFE module description): The assumption that iterative widening of narrow boundary predictions reliably covers error regions without inflating false-positive area is dataset-dependent and untested; no error-map visualizations, per-lesion-size breakdowns, or false-positive rate analysis are presented to validate this mechanism.
- §4 (Ablation studies): No component ablations isolating the individual contributions of SAAM and BGFE (or their interaction) are reported, which is required to substantiate that the reported gains arise specifically from these innovations rather than the base U-Net or training protocol.
minor comments (3)
- Abstract: Typo 'The experimental results how that' should read 'show that'; 'expensive experiments' is likely intended as 'extensive experiments'.
- Title and abstract: 'light weight' should be 'lightweight' for standard terminology.
- §4: The manuscript should include statistical significance tests (e.g., paired t-tests or Wilcoxon) on the metric improvements across the four datasets to strengthen the outperformance claims.
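For the suggested significance testing, a paired permutation test on per-image scores is a distribution-free alternative to the t-test or Wilcoxon test and needs only NumPy. The sketch below is generic; the per-image Dice values in the test are hypothetical, not results from the paper:

```python
import numpy as np

def paired_permutation_test(a, b, n_perm=10000, seed=0):
    """Two-sided paired permutation (sign-flip) test on per-image metric
    differences between two methods.

    a, b: per-image scores (e.g., Dice) for the two methods on the same
    test set. Returns an approximate p-value for the null hypothesis that
    the mean difference is zero.
    """
    rng = np.random.default_rng(seed)
    d = np.asarray(a, dtype=float) - np.asarray(b, dtype=float)
    observed = abs(d.mean())
    # Under the null, each paired difference is equally likely to have
    # either sign; flip signs at random and recompute the mean.
    signs = rng.choice([-1.0, 1.0], size=(n_perm, d.size))
    perm_means = np.abs((signs * d).mean(axis=1))
    # Add-one smoothing keeps the p-value strictly positive.
    return (1 + np.sum(perm_means >= observed)) / (n_perm + 1)
```

A consistent per-image advantage of a few Dice points over ten images already yields a small p-value, while identical score vectors give p = 1.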
Simulated Author's Rebuttal
We thank the referee for the constructive feedback. We address each major comment below, clarifying our experimental practices and committing to additions that will strengthen the manuscript without altering its core claims.
Point-by-point responses
-
Referee: The central outperformance claim is load-bearing on fair attribution to SAAM and BGFE, yet the manuscript provides no evidence that prior SOTA baselines were re-trained under identical optimizer, augmentation, epoch, and loss schedules; without this, gains may stem from implementation differences rather than the proposed modules.
Authors: We followed the official implementations and reported hyper-parameters from each baseline paper, applying identical data splits, augmentation pipelines, and loss functions across all methods on the four datasets. To eliminate any ambiguity, the revised manuscript will include an explicit table listing optimizer, learning rate schedule, epoch count, and augmentation details for every baseline, confirming that training conditions were matched as closely as possible to the originals. revision: yes
-
Referee: The assumption that iterative widening of narrow boundary predictions reliably covers error regions without inflating false-positive area is dataset-dependent and untested; no error-map visualizations, per-lesion-size breakdowns, or false-positive rate analysis are presented to validate this mechanism.
Authors: The BGFE design is grounded in our empirical observation of boundary-to-error gaps across the evaluated ultrasound datasets. We will add (i) qualitative error-map visualizations showing progressive expansion, (ii) per-lesion-size Dice and IoU breakdowns, and (iii) false-positive rate comparisons with and without BGFE in the revised experiments section and supplementary material to directly test the mechanism's behavior. revision: yes
-
Referee: No component ablations isolating the individual contributions of SAAM and BGFE (or their interaction) are reported, which is required to substantiate that the reported gains arise specifically from these innovations rather than the base U-Net or training protocol.
Authors: We agree that component-wise ablations are necessary. The revised manuscript will report a full ablation study on all four datasets, including variants using only SAAM, only BGFE, neither, and both modules together, with quantitative metrics and statistical significance tests to isolate their individual and combined contributions. revision: yes
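The promised ablations and error analyses rest on standard overlap metrics. For concreteness, Dice and IoU over binary masks can be computed as follows — a generic sketch, not code from the paper's repository:

```python
import numpy as np

def dice(pred, gt, eps=1e-7):
    """Dice coefficient between two binary masks (eps avoids 0/0)."""
    pred, gt = pred.astype(bool), gt.astype(bool)
    inter = np.logical_and(pred, gt).sum()
    return (2 * inter + eps) / (pred.sum() + gt.sum() + eps)

def iou(pred, gt, eps=1e-7):
    """Intersection over union between two binary masks."""
    pred, gt = pred.astype(bool), gt.astype(bool)
    inter = np.logical_and(pred, gt).sum()
    union = np.logical_or(pred, gt).sum()
    return (inter + eps) / (union + eps)
```

Reporting both per ablation variant (neither module, SAAM only, BGFE only, both) on each dataset would isolate the contribution of each component, and a per-lesion-size breakdown follows by grouping images on ground-truth mask area before averaging.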
Circularity Check
No circularity: empirical architecture validated on benchmarks
Full rationale
The paper proposes PBE-UNet with SAAM (scale-aware aggregation) and BGFE (progressive boundary expansion) modules motivated by observed challenges in ultrasound images. The central claim is outperformance on BUSI, Dataset B, TN3K, and BP datasets via experiments. No derivation chain, equations, or first-principles results exist that reduce to inputs by construction. No self-definitional steps, fitted parameters renamed as predictions, or load-bearing self-citations appear. Design choices (e.g., expanding narrow boundaries to cover error regions) are presented as architectural responses to empirical observations and are directly testable, keeping the work self-contained without circular reduction.
Axiom & Free-Parameter Ledger
free parameters (1)
- architecture hyperparameters for SAAM and BGFE
axioms (1)
- domain assumption: U-Net is an appropriate base architecture for medical image segmentation
invented entities (2)
- Scale-Aware Aggregation Module (SAAM): no independent evidence
- Boundary-Guided Feature Enhancement (BGFE) module: no independent evidence
Reference graph
Works this paper leans on
- [1] Al-Dhabyani, W., Gomaa, M., Khaled, H., Aly, F.: Deep learning approaches for data augmentation and classification of breast masses using ultrasound images. Int. J. Adv. Comput. Sci. Appl. 10(5), 1–11 (2019)
- [2] Bi, H., Cai, C., Sun, J., Jiang, Y., Lu, G., Shu, H., Ni, X.: BPAT-UNet: Boundary preserving assembled transformer UNet for ultrasound thyroid nodule segmentation. Comput. Methods Programs Biomed. 238, 107614 (2023)
- [3] Chen, F., Chen, L., Kong, W., Zhang, W., Zheng, P., Sun, L., Zhang, D., Liao, H.: Deep semi-supervised ultrasound image segmentation by using a shadow aware network with boundary refinement. IEEE Trans. Medical Imaging 42(12), 3779–3793 (2023)
- [4] Chen, G., Li, L., Dai, Y., Zhang, J., Yap, M.H.: AAU-Net: An adaptive attention U-Net for breast lesions segmentation in ultrasound images. IEEE Trans. Medical Imaging 42(5), 1289–1300 (2022)
- [5] Chen, G., Zhou, L., Zhang, J., Yin, X., Cui, L., Dai, Y.: ESKNet: An enhanced adaptive selection kernel convolution for ultrasound breast tumors segmentation. Expert Syst. Appl. 246, 123265 (2024)
- [6] Chen, J., Lu, Y., Yu, Q., Luo, X., Adeli, E., Wang, Y., Lu, L., Yuille, A.L., Zhou, Y.: TransUNet: Transformers make strong encoders for medical image segmentation. arXiv preprint arXiv:2102.04306 (2021)
- [7] Chen, L.C., Zhu, Y., Papandreou, G., Schroff, F., Adam, H.: Encoder-decoder with atrous separable convolution for semantic image segmentation. In: Proc. Eur. Conf. Comp. Vis. pp. 801–818 (2018)
- [8] Du, X., Xu, X., Ma, K.: ICGNet: Integration context-based reverse-contour guidance network for polyp segmentation. In: Proc. Int. Joint Conf. Artificial Intell. pp. 877–883 (2022)
- [9] Gong, H., Chen, G., Wang, R., Xie, X., Mao, M., Yu, Y., Chen, F., Li, G.: Multi-task learning for thyroid nodule segmentation with thyroid region prior. In: Proc. IEEE Int. Symp. Biomed. Imaging. pp. 257–261. IEEE (2021)
- [10] He, Y., Yang, G., Yang, J., Chen, Y., Kong, Y., Wu, J., Tang, L., Zhu, X., Dillenseger, J.L., Shao, P., et al.: Dense biased networks with deep priori anatomy and hard region adaptation: Semi-supervised learning for fine renal artery segmentation. Medical Image Anal. 63, 101722 (2020)
- [11] Hu, K., Zhang, X., Lee, D., Xiong, D., Zhang, Y., Gao, X.: Boundary-guided and region-aware network with global scale-adaptive for accurate segmentation of breast tumors in ultrasound images. IEEE J. Biomed. Health Informatics 27(9), 4421–4432 (2023)
- [12] Ibtehaz, N., Rahman, M.S.: MultiResUNet: Rethinking the U-Net architecture for multimodal biomedical image segmentation. Neural Networks 121, 74–87 (2020)
- [13] Kervadec, H., Bouchtiba, J., Desrosiers, C., Granger, E., Dolz, J., Ayed, I.B.: Boundary loss for highly unbalanced segmentation. Medical Image Anal. 67, 101851 (2021)
- [14] Lin, Y., Zhang, D., Fang, X., Chen, Y., Cheng, K.T., Chen, H.: Rethinking boundary detection in deep learning-based medical image segmentation. Medical Image Anal. p. 103615 (2025)
- [15] Liu, G., Zhou, Y., Wang, J., Chen, Z., Liu, D., Chang, B.: A cross-attention and multilevel feature fusion network for breast lesion segmentation in ultrasound images. IEEE Trans. Instrum. Meas. 73, 1–13 (2024)
- [16] Long, J., Shelhamer, E., Darrell, T.: Fully convolutional networks for semantic segmentation. In: Proc. IEEE Conf. Comp. Vis. Patt. Recogn. pp. 3431–3440 (2015)
- [17] Luo, X., Wang, Y., Ou-Yang, L.: LGFFM: A localized and globalized frequency fusion model for ultrasound image segmentation. IEEE Trans. Medical Imaging (2025)
- [18] Montoya, A., Sterling, D., Hasnin, kaggle446, shirzad, Cukierski, W., yffud: Ultrasound nerve segmentation. https://kaggle.com/competitions/ultrasound-nerve-segmentation (2016)
- [19] Ning, Z., Zhong, S., Feng, Q., Chen, W., Zhang, Y.: SMU-Net: Saliency-guided morphology-aware U-Net for breast lesion segmentation in ultrasound image. IEEE Trans. Medical Imaging 41(2), 476–490 (2021)
- [20] Oktay, O., Schlemper, J., Folgoc, L.L., Lee, M., Heinrich, M., Misawa, K., Mori, K., McDonagh, S., Hammerla, N.Y., Kainz, B., et al.: Attention U-Net: Learning where to look for the pancreas. arXiv preprint arXiv:1804.03999 (2018)
- [21] Qin, Q., Lin, Z., Gao, G., Han, C., Wang, R., Qin, Y., Li, S., An, S., Che, Y.: MBE-UNet: Multi-branch boundary enhanced U-Net for ultrasound segmentation. IEEE J. Biomed. Health Informatics (2025)
- [22] Qu, X., Zhou, J., Jiang, J., Wang, W., Wang, H., Wang, S., Tang, W., Lin, X.: EH-Former: Regional easy-hard-aware transformer for breast lesion segmentation in ultrasound images. Inf. Fusion 109, 102430 (2024)
- [23] Ronneberger, O., Fischer, P., Brox, T.: U-Net: Convolutional networks for biomedical image segmentation. In: Medical Image Computing and Computer-Assisted Intervention. pp. 234–241. Springer (2015)
- [24] Song, J., Zhou, M., Luo, J., Pu, H., Feng, Y., Wei, X., Jia, W.: Boundary-aware feature fusion with dual-stream attention for remote sensing small object detection. IEEE Trans. Geosci. Remote Sens. (2024)
- [25] Sun, F., Luo, Z., Li, S.: Boundary difference over union loss for medical image segmentation. In: Medical Image Computing and Computer-Assisted Intervention. pp. 292–301. Springer (2023)
- [26] Sun, Y., Wang, S., Chen, C., Xiang, T.Z.: Boundary-guided camouflaged object detection. In: Proc. Int. Joint Conf. Artificial Intell. pp. 1335–1341 (2022)
- [27] Tang, F., Ding, J., Quan, Q., Wang, L., Ning, C., Zhou, S.K.: CMUNeXt: An efficient medical image segmentation network based on large kernel and skip fusion. In: Proc. IEEE Int. Symp. Biomed. Imaging. pp. 1–5. IEEE (2024)
- [28] Tang, F., Wang, L., Ning, C., Xian, M., Ding, J.: CMU-Net: A strong ConvMixer-based medical ultrasound image segmentation network. In: Proc. IEEE Int. Symp. Biomed. Imaging. pp. 1–5. IEEE (2023)
- [29] Wang, C., Zhu, Y., Li, Q., Liu, S.Z.W.: MSA-Net: Masked separable attention network for breast ultrasound tumor segmentation. In: Proc. IEEE Int. Conf. Bioinform. Biomed. pp. 3289–3292. IEEE (2025)
- [30] Wang, C., Zhu, Y., Li, Q., Zhang, S., Liu, W.: MSA-Net: Masked separable attention network for breast ultrasound tumor segmentation. In: 2025 IEEE International Conference on Bioinformatics and Biomedicine (BIBM). pp. 2914–2919 (2025)
- [31] Wang, C., Zhu, Y., Wu, R., Shi, F., Li, Q., Liu, W., Hu, K.: PConv-UNet: Multi-scale pinwheel convolutions for breast ultrasound tumor segmentation. Displays 91, 103252 (2026)
- [32] Wang, Q., Wu, B., Zhu, P., Li, P., Zuo, W., Hu, Q.: ECA-Net: Efficient channel attention for deep convolutional neural networks. In: Proc. IEEE Conf. Comp. Vis. Patt. Recogn. pp. 11534–11542 (2020)
- [33] Wang, T., Jin, C., Chen, Y., Zhou, G., Ge, R., Xue, C., Shi, B., Liu, T., Coatrieux, J.L., Feng, Q.: GFA-Net: Global feature aggregation network based on contrastive learning for breast lesion automated segmentation in ultrasound images. IEEE Trans. Instrum. Meas. (2024)
- [34] Xue, C., Zhu, L., Fu, H., Hu, X., Li, X., Zhang, H., Heng, P.A.: Global guidance network for breast lesion segmentation in ultrasound images. Medical Image Anal. 70, 101989 (2021)
- [35] Yap, M.H., Pons, G., Marti, J., Ganau, S., Sentis, M., Zwiggelaar, R., Davison, A.K., Marti, R.: Automated breast ultrasound lesions detection using convolutional neural networks. IEEE J. Biomed. Health Informatics 22(4), 1218–1226 (2017)
- [36] Yin, H., Shao, Y.: CFU-Net: A coarse-fine U-Net with multilevel attention for medical image segmentation. IEEE Trans. Instrum. Meas. 72, 1–12 (2023)
- [37] Yue, G., Wu, S., Li, G., Zhao, C., Hao, Y., Zhou, T., Zhao, B.: Boundary-guided feature-aligned network for colorectal polyp segmentation. IEEE Trans. Circuits Syst. Video Technol. (2025)
- [38] Zhang, X., Li, X., Hu, K., Gao, X.: BGRA-Net: Boundary-guided and region-aware convolutional neural network for the segmentation of breast ultrasound images. In: Proc. IEEE Int. Conf. Bioinform. Biomed. pp. 1619–1622. IEEE (2021)
- [39] Zhao, G., Zhu, X., Wang, X., Yan, F., Guo, M.: Syn-Net: A synchronous frequency-perception fusion network for breast tumor segmentation in ultrasound images. IEEE J. Biomed. Health Informatics (2024)
- [40] Zhao, J., Liu, J., Fan, D., Cao, Y., Yang, J., Cheng, M.: EGNet: Edge guidance network for salient object detection. In: Proc. IEEE Int. Conf. Comp. Vis. pp. 8778–8787. IEEE (2019)
- [41] Zhou, T., Zhang, Y., Chen, G., Zhou, Y., Wu, Y., Fan, D.P.: Edge-aware feature aggregation network for polyp segmentation. Mach. Intell. Res. 22(1), 101–116 (2025)
- [42] Zhou, T., Ruan, S., Lei, B.: BUFNet: Boundary-aware and uncertainty-driven multi-modal fusion network for MR brain tumor segmentation. Medical Image Anal. 107, 103855 (2026)
- [43] Zhou, Z., Siddiquee, M.M.R., Tajbakhsh, N., Liang, J.: UNet++: Redesigning skip connections to exploit multiscale features in image segmentation. IEEE Trans. Medical Imaging 39(6), 1856–1867 (2019)