pith. machine review for the scientific record.

arxiv: 2605.08282 · v1 · submitted 2026-05-08 · 📡 eess.IV · cs.AI · cs.CV

Recognition: 2 theorem links

A Paired Point-of-Care Ultrasound Dataset for Image Quality Enhancement and Benchmarking via a cGAN Baseline


Pith reviewed 2026-05-12 03:37 UTC · model grok-4.3

classification 📡 eess.IV · cs.AI · cs.CV
keywords point-of-care ultrasound · POCUS · paired dataset · image enhancement · conditional GAN · cGAN · ultrasound imaging · image-to-image translation

The pith

A new paired dataset of low-end and high-end ultrasound scans enables a cGAN to substantially raise POCUS image quality.

A machine-rendered reading of the paper's core claim, the machinery that carries it, and where it could break.

The paper collects the first accurately paired dataset of low-end point-of-care ultrasound images and matching high-end ultrasound images by mounting both probes on a custom automated gantry. It then trains a conditional generative adversarial network, built on the pix2pix architecture with a U-Net generator and a mix of L1 and SSIM losses, after first pretraining on simulated ultrasound data. Evaluation on 1064 paired ex-vivo and phantom sets shows clear metric gains. A sympathetic reader would care because portable ultrasound is already used in low-resource and bedside settings, but its hardware limits often produce images too poor for confident diagnosis; if the mapping learned here generalizes, the same low-cost devices could deliver higher diagnostic value without new hardware.
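Read against the pix2pix recipe it names, the training objective would take roughly this form (a reconstruction from the description above, not an equation from the paper; the λ weights are unreported):

```latex
G^{*} \;=\; \arg\min_{G}\,\max_{D}\;
  \mathcal{L}_{\mathrm{cGAN}}(G, D)
  \;+\; \lambda_{L1}\,\mathbb{E}_{x,y}\!\left[\lVert y - G(x)\rVert_{1}\right]
  \;+\; \lambda_{\mathrm{SSIM}}\,\mathbb{E}_{x,y}\!\left[1 - \mathrm{SSIM}\big(G(x),\,y\big)\right]
```

where x is a low-end POCUS frame and y its gantry-registered high-end counterpart.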

Core claim

The central claim is that an accurately registered paired dataset collected via an automated gantry supplies a reliable supervised signal that allows a cGAN to translate low-quality POCUS images into versions whose structural similarity to high-end reference scans rises from 0.29 to 0.54, peak signal-to-noise ratio rises from 19.16 dB to 22.41 dB, and no-reference scores (NIQE from 7.95 to 4.44, PIQE from 31.12 to 19.99) also improve. The work releases the POCUS-IQ dataset publicly and presents the cGAN as a reproducible baseline for future benchmarking.
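Both full-reference numbers are standard quantities. As a minimal sketch (not the paper's evaluation code, which likely uses windowed SSIM rather than the single-window simplification below):

```python
import numpy as np

def psnr(ref, out, data_range=1.0):
    """Peak signal-to-noise ratio in dB between a reference and an output image."""
    mse = np.mean((ref - out) ** 2)
    return 10.0 * np.log10(data_range ** 2 / mse)

def global_ssim(x, y, data_range=1.0):
    """Single-window SSIM over the whole image (no sliding window),
    with the stabilizing constants c1, c2 from Wang et al. (2004)."""
    c1, c2 = (0.01 * data_range) ** 2, (0.03 * data_range) ** 2
    mx, my = x.mean(), y.mean()
    vx, vy = x.var(), y.var()
    cov = ((x - mx) * (y - my)).mean()
    return ((2 * mx * my + c1) * (2 * cov + c2)) / \
           ((mx ** 2 + my ** 2 + c1) * (vx + vy + c2))
```

Production evaluation would use a windowed SSIM (e.g. scikit-image's `structural_similarity`); the global variant above only illustrates the luminance/contrast/structure terms behind the 0.29 → 0.54 figures.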

What carries the argument

The accurately paired low-end POCUS to high-end ultrasound dataset collected with the automated gantry, which supplies the pixel-level training signal for the conditional GAN to learn the image-to-image mapping.

If this is right

  • The publicly released POCUS-IQ dataset supplies a new benchmark for any future image-enhancement or image-to-image translation method in ultrasound.
  • The same cGAN architecture with L1-plus-SSIM loss and simulation pretraining can be retrained on additional paired data to target other clinical ultrasound tasks.
  • If the learned mapping holds outside the ex-vivo and phantom domain, handheld POCUS devices could deliver images closer to high-end cart-based systems in low-resource environments.
  • The reported gains in both full-reference and no-reference metrics supply a concrete quantitative target that later methods must exceed to claim improvement.

Where Pith is reading between the lines

These are editorial extensions of the paper, not claims the author makes directly.

  • The same gantry-based pairing technique could be adapted to create supervised training sets for other portable imaging modalities such as low-cost MRI or optical devices.
  • Once the model is distilled to run on embedded hardware, real-time enhancement could be added to existing point-of-care ultrasound workflows without changing the acquisition protocol.
  • The dataset also enables controlled ablation studies on the relative contribution of the gantry alignment, the SSIM loss term, and the simulation pretraining step.
  • Future work could test whether the enhancement preserves or improves the visibility of specific diagnostic features such as lesions or vascular structures that matter in emergency medicine.

Load-bearing premise

The custom automated gantry produces spatially accurate, perfectly registered pairs between low-end POCUS and high-end images without residual misalignment or probe-pressure differences that would invalidate the supervised training signal.
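The review could not find a registration-error audit in the paper; as an illustration of how residual translational misalignment in a pair could be spot-checked, plain phase correlation suffices (a sketch, not the authors' method; intensity-based registration à la elastix, which the reference list includes, is the heavier-duty tool):

```python
import numpy as np

def phase_corr_shift(ref, moving):
    """Estimate the integer (row, col) translation of `moving` relative to
    `ref` from the peak of the phase-correlation surface."""
    spec = np.fft.fft2(moving) * np.conj(np.fft.fft2(ref))
    spec /= np.abs(spec) + 1e-12            # keep phase only
    corr = np.abs(np.fft.ifft2(spec))
    peak = np.array(np.unravel_index(np.argmax(corr), corr.shape), dtype=float)
    dims = np.array(ref.shape, dtype=float)
    # fold peaks past the midpoint of each axis back to negative offsets
    peak[peak > dims / 2] -= dims[peak > dims / 2]
    return tuple(peak)
```

A nonzero systematic offset across many pairs would indicate exactly the kind of residual misalignment that would contaminate the supervised signal.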

What would settle it

A blinded reader study in which radiologists diagnose from cGAN-enhanced POCUS images versus raw POCUS images on real patient scans, with high-end images as ground truth, would show whether the reported metric gains produce any measurable improvement in diagnostic accuracy.

read the original abstract

Purpose: We aim to enhance the image quality of point-of-care ultrasound (POCUS) devices using deep learning and a novel paired dataset of POCUS and high-end ultrasound images. Approach: We collected the first accurately paired dataset using a custom-built automated gantry system of low-end POCUS and high-end ultrasound images. A conditional generative adversarial network (cGAN) was utilized based on the pix2pix architecture, with a U-Net generator that incorporates both L1 and structural similarity index (SSIM) losses to improve perceptual quality. Pretraining on a simulation dataset further boosts performance. Evaluation was performed on 1064 paired ex vivo tissue and phantom ultrasound image sets. Results: Our approach improves the SSIM from 0.29 to 0.54 and PSNR from 19.16 dB to 22.41 dB. No-reference metrics also indicate substantial enhancement, with the Natural Image Quality Evaluator (NIQE) and Perception-based Image Quality Evaluator (PIQE) scores dropping from 7.95 to 4.44 and 31.12 to 19.99, respectively. Conclusions: This work presents the first publicly available accurately paired dataset of low-end POCUS to high end ultrasound images. Additionally, our results demonstrate the potential of the proposed framework to overcome hardware limitations of handheld POCUS, enhancing its diagnostic value in low-resource and point-of-care settings. The POCUS-IQ Dataset is publicly available at https://github.com/NKI-MedTech-AI/POCUS-IQ.

Editorial analysis

A structured set of objections, weighed in public.

A referee report, a simulated authors' rebuttal, a circularity check, and an axiom and free-parameter ledger. Tearing a paper down is the easy half of reading it; the pith above is the substance, this is the friction.

Referee Report

2 major / 1 minor

Summary. The manuscript introduces the POCUS-IQ dataset of paired low-end POCUS and high-end ultrasound images acquired via a custom automated gantry system. It trains a pix2pix-style cGAN (U-Net generator with combined L1 and SSIM losses, plus simulation pretraining) and reports quantitative improvements on a held-out set of 1064 ex-vivo/phantom pairs: SSIM rising from 0.29 to 0.54, PSNR from 19.16 dB to 22.41 dB, and no-reference metrics (NIQE 7.95→4.44, PIQE 31.12→19.99). The dataset is released publicly.

Significance. If the gantry pairs are verifiably registered to sub-pixel accuracy with matched probe pressure, the public paired dataset constitutes a useful benchmark resource for supervised POCUS enhancement research. The cGAN baseline demonstrates measurable metric gains on the collected data. However, the absence of in-vivo clinical validation, statistical testing, and comparisons to other enhancement methods limits the immediate translational significance.

major comments (2)
  1. [Data acquisition / Methods] Data acquisition section: the central claim that the custom gantry produces 'accurately paired' images suitable for supervised training is not supported by any reported calibration, landmark-based registration error, or pressure-sensor measurements. Without such quantification, residual misalignment or acoustic-coupling differences could mean the cGAN is partly learning to compensate for acquisition artifacts rather than performing hardware-invariant enhancement.
  2. [Results] Results and evaluation: the reported metric improvements on the 1064-pair test set lack accompanying details on hyperparameter tuning protocol, confirmation that the test set remained completely unseen during pretraining and model selection, and any statistical significance testing of the deltas (e.g., paired t-tests or confidence intervals).
minor comments (1)
  1. [Abstract / Conclusions] The abstract and conclusions assert clinical potential in low-resource settings, yet the study is confined to ex-vivo tissue and phantoms; a brief statement acknowledging this scope limitation would improve clarity.

Simulated Author's Rebuttal

2 responses · 0 unresolved

We thank the referee for the constructive comments, which help clarify the strengths and limitations of our work. We address each major comment below and outline the corresponding revisions.

read point-by-point responses
  1. Referee: [Data acquisition / Methods] Data acquisition section: the central claim that the custom gantry produces 'accurately paired' images suitable for supervised training is not supported by any reported calibration, landmark-based registration error, or pressure-sensor measurements. Without such quantification, residual misalignment or acoustic-coupling differences could mean the cGAN is partly learning to compensate for acquisition artifacts rather than performing hardware-invariant enhancement.

    Authors: We acknowledge that the original manuscript does not provide explicit numerical quantification of registration accuracy or pressure consistency. The custom automated gantry was engineered with fixed mechanical offsets and stepper-motor positioning to enforce repeatable probe placement and contact force between the low-end POCUS and high-end transducers. This design yields image pairs whose spatial correspondence is determined by the rigid geometry of the rig rather than post-hoc software registration. Nevertheless, we agree that reporting calibration details would strengthen the claim of suitability for supervised learning. In the revised manuscript we will add a dedicated subsection describing the gantry calibration procedure, including measured positioning repeatability and any available pressure-sensor data, together with a brief discussion of residual acoustic-coupling variability and its potential influence on the learned mapping. revision: partial

  2. Referee: [Results] Results and evaluation: the reported metric improvements on the 1064-pair test set lack accompanying details on hyperparameter tuning protocol, confirmation that the test set remained completely unseen during pretraining and model selection, and any statistical significance testing of the deltas (e.g., paired t-tests or confidence intervals).

    Authors: We confirm that the 1064-pair test set was never used during simulation pretraining, hyperparameter search, or model selection; all tuning was performed exclusively on the training split via cross-validation. To make this protocol transparent, the revised methods section will include the full hyperparameter grid, the cross-validation procedure, and an explicit statement that the test set remained strictly held-out. In addition, we will report paired t-tests and 95% confidence intervals on the metric deltas (SSIM, PSNR, NIQE, PIQE) to quantify statistical significance of the observed improvements. revision: yes
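The promised significance testing reduces to a paired analysis of per-pair metric deltas. A large-sample sketch (illustrative only: the normal 95% critical value 1.96 stands in for the exact t quantile, reasonable at n = 1064, and the numbers in the usage note are placeholders, not the paper's data):

```python
import numpy as np

def paired_delta(before, after, z=1.96):
    """Mean per-pair metric delta with a large-sample 95% CI and t-statistic."""
    d = np.asarray(after, float) - np.asarray(before, float)
    mean = d.mean()
    se = d.std(ddof=1) / np.sqrt(d.size)   # standard error of the mean delta
    return mean, (mean - z * se, mean + z * se), mean / se
```

For SSIM over the 1064 test pairs, `before`/`after` would hold each pair's raw and enhanced SSIM against the high-end reference; a CI excluding zero is what the revised manuscript would need to show.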

Circularity Check

0 steps flagged

No circularity: empirical gains measured on externally collected paired data

full rationale

The manuscript collects a new paired POCUS dataset via custom gantry hardware, trains a standard pix2pix cGAN (with L1 + SSIM losses and optional simulation pretraining), and reports direct metric improvements (SSIM 0.29→0.54, PSNR 19.16→22.41 dB) on 1064 held-out pairs. No equations, fitted parameters, or self-citations are invoked to derive the reported numbers; the gains are measured quantities on data external to the model. The derivation chain is therefore self-contained and non-circular.

Axiom & Free-Parameter Ledger

2 free parameters · 1 axiom · 0 invented entities

The work rests on the hardware assumption that the gantry produces pixel-perfect spatial correspondence and on standard supervised-learning assumptions that the collected pairs are representative and that the chosen loss combination yields perceptually better images.

free parameters (2)
  • L1 versus SSIM loss weighting
    The relative weight between L1 and SSIM terms is a tunable hyperparameter whose value is not reported in the abstract and must have been chosen to obtain the quoted numbers.
  • Simulation pretraining schedule and learning-rate schedule
    Details of how the simulation pretraining was performed and how it was combined with real-data fine-tuning are not given.
axioms (1)
  • [domain assumption] The automated gantry produces spatially accurate, pressure-matched pairs between low-end and high-end probes.
    This is the foundational premise that allows supervised training; any systematic misalignment would invalidate the learned mapping.
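As a concrete reading of the loss-weighting free parameter, the generator's reconstruction penalty presumably composes as a weighted sum; the weights below are hypothetical placeholders (the abstract reports none), and `ssim_fn` stands for any SSIM implementation:

```python
import numpy as np

def reconstruction_loss(pred, target, ssim_fn, lam_l1=100.0, lam_ssim=10.0):
    """Weighted L1 + (1 - SSIM) term added to the adversarial loss.
    lam_l1 / lam_ssim are HYPOTHETICAL values, not taken from the paper."""
    l1 = np.mean(np.abs(pred - target))
    return lam_l1 * l1 + lam_ssim * (1.0 - ssim_fn(pred, target))
```

The ledger's point is precisely that the ratio lam_l1 : lam_ssim had to be chosen somehow to reach the quoted metrics, so it belongs in any released training configuration.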

pith-pipeline@v0.9.0 · 5632 in / 1508 out tokens · 41782 ms · 2026-05-12T03:37:34.610964+00:00 · methodology

discussion (0)


Lean theorems connected to this paper

Citations machine-checked in the Pith Canon. Every link opens the source theorem in the public Lean library.

What do these tags mean?
  • matches: The paper's claim is directly supported by a theorem in the formal canon.
  • supports: The theorem supports part of the paper's argument, but the paper may add assumptions or extra steps.
  • extends: The paper goes beyond the formal theorem; the theorem is a base layer rather than the whole result.
  • uses: The paper appears to rely on the theorem as machinery.
  • contradicts: The paper's claim conflicts with a theorem or certificate in the canon.
  • unclear: Pith found a possible connection, but the passage is too broad, indirect, or ambiguous to say the theorem truly supports the claim.

Reference graph

Works this paper leans on

80 extracted references · 80 canonical work pages
