Background & Summary
An estimated 1.71 billion people worldwide are affected by musculoskeletal disorders, and there is a critical need for innovative diagnostic and therapeutic approaches[1](https://www.nature.com/articles/s41597-025-06047-9#ref-CR1 “Vos, T. et al. Global burden of 369 diseases and injuries in 204 countries and territories, 1990–2019: a systematic analysis for the Global Burden of Disease Study 2019. The Lancet 396, 1204–1222, https://doi.org/10.1016/S0140-6736(20)30925-9
(2020).“),[2](https://www.nature.com/articles/s41597-025-06047-9#ref-CR2 “Musculoskeletal health, https://www.who.int/news-room/fact-sheets/detail/musculoskeletal-conditions
(2022).“). Low back pain, fractures, and osteoarthritis are the most prevalent of these conditions, which together represent a significant health and economic burden, particularly in the context of an aging population[3](#ref-CR3 “Ferreira, M. L. et al. Global, regional, and national burden of low back pain, 1990–2020, its attributable risk factors, and projections to 2050: a systematic analysis of the Global Burden of Disease Study 2021. The Lancet Rheumatology 5, e316–e329, https://doi.org/10.1016/S2665-9913(23)00098-X
(2023).“),[4](#ref-CR4 “Cieza, A. et al. Global estimates of the need for rehabilitation based on the Global Burden of Disease study 2019: a systematic analysis for the Global Burden of Disease Study 2019. Lancet 396, 2006–2017, https://doi.org/10.1016/s0140-6736(20)32340-0
(2021).“),[5](#ref-CR5 “Compston, J. E., McClung, M. R. & Leslie, W. D. Osteoporosis. Lancet 393, 364–376, https://doi.org/10.1016/S0140-6736(18)32112-3
(2019).“),[6](#ref-CR6 “Wang, Y. X. J. et al. Elderly men have much lower vertebral fracture risk than elderly women even at advanced age: the MrOS and MsOS (Hong Kong) year 14 follow-up radiology results. Arch Osteoporos 15, 176, https://doi.org/10.1007/s11657-020-00845-x
(2020).“),[7](https://www.nature.com/articles/s41597-025-06047-9#ref-CR7 “Hunter, D. J. & Bierma-Zeinstra, S. Osteoarthritis. Lancet 393, 1745–1759, https://doi.org/10.1016/s0140-6736(19)30417-9
(2019).“). This demographic trend increases not only the risk of disability but also the complexity of the orthopedic surgical interventions required[8](#ref-CR8 “Hernigou, P. & Scarlat, M. M. Growth in musculoskeletal pathology worldwide: the role of Société Internationale de Chirurgie Orthopédique et de Traumatologie and publications. International Orthopaedics 46, 1913–1920, https://doi.org/10.1007/s00264-022-05512-z
(2022).“),[9](#ref-CR9 “Jeon, Y. D., Park, K. B., Ko, S. H., Oh, J. M. & Kim, S. G. Sports-related fractures in the geriatric population at a level I trauma center. BMC Geriatr 24, 464, https://doi.org/10.1186/s12877-024-05095-x
(2024).“),[10](#ref-CR10 “Gouzoulis, M. J. et al. Robotic Versus Navigation Assisted Posterior Lumbar Fusion: A National Database Study. Spine (Phila Pa 1976) https://doi.org/10.1097/brs.0000000000005032
(2024).“),[11](https://www.nature.com/articles/s41597-025-06047-9#ref-CR11 “Wang, D. et al. Radiographic and surgery-related predictive factors for increased segmental lumbar lordosis following lumbar fusion surgery in patients with degenerative lumbar spondylolisthesis. Eur Spine J https://doi.org/10.1007/s00586-024-08248-z
(2024).“). The complex nature of human anatomy, combined with the critical proximity of nerves and blood vessels to surgical sites and the poor visibility of the region of interest, demonstrates the need for precise intraoperative methods to guide these procedures[12](#ref-CR12 “Yamada, T. et al. Characteristics of pedicle screw misplacement using freehand technique in degenerative scoliosis surgery. Arch Orthop Trauma Surg 143, 1861–1867, https://doi.org/10.1007/s00402-022-04380-x
(2023).“),[13](#ref-CR13 “Chapman, J. R. Editorial. Spinal infections: a growing problem, which deserves our urgent attention. Neurosurg Focus 46, E3, https://doi.org/10.3171/2018.10.FOCUS18587
(2019).“),[14](https://www.nature.com/articles/s41597-025-06047-9#ref-CR14 “Lonstein, J. E. et al. Complications associated with pedicle screws. J Bone Joint Surg Am 81, 1519–1528, https://doi.org/10.2106/00004623-199911000-00003
(1999).“).
Surgical navigation systems that rely on intraoperative computed tomography (CT) or the fusion of preoperative CT with intraoperative fluoroscopy remain the gold standard for intraoperative guidance[15](#ref-CR15 “Elmi-Terander, A. et al. Augmented reality navigation with intraoperative 3D imaging vs fluoroscopy-assisted free-hand surgery for spine fixation surgery: a matched-control study comparing accuracy. Scientific Reports 10, 707, https://doi.org/10.1038/s41598-020-57693-5
(2020).“),[16](#ref-CR16 “Ghisla, S. et al. Posterior pelvic ring fractures: Intraoperative 3D-CT guided navigation for accurate positioning of sacro-iliac screws. Orthopaedics & Traumatology: Surgery & Research 104, 1063–1067, https://doi.org/10.1016/j.otsr.2018.07.006
(2018).“),[17](https://www.nature.com/articles/s41597-025-06047-9#ref-CR17 “Kochanski, R. B., Lombardi, J. M., Laratta, J. L., Lehman, R. A. & O’Toole, J. E. Image-Guided Navigation and Robotics in Spine Surgery. Neurosurgery 84, 1179–1189, https://doi.org/10.1093/neuros/nyy630
(2019).“). While promising developments in “CT-like” MRI, based on spoiled gradient, zero-, or ultrashort-echo-time sequences with bone-signal post-processing, are emerging as radiation-free alternatives with comparable accuracy, these technologies are not yet widely adopted[18](#ref-CR18 “Abel, F. et al. Deep-learning reconstructed lumbar spine 3D MRI for surgical planning: pedicle screw placement and geometric measurements compared to CT. European Spine Journal 33, 4144–4154, https://doi.org/10.1007/s00586-023-08123-3
(2024).“),[19](#ref-CR19 “Altorfer, F. C. S. et al. Feasibility and Accuracy of Robotic-Assisted Navigation for Thoracic Pedicle Screw Placement Using CT-like 3D-MRI. Spine https://doi.org/10.1097/brs.0000000000005366
(2025).“),[20](#ref-CR20 “Morbée, L., Chen, M., Herregods, N., Pullens, P. & Jans, L. B. O. MRI-based synthetic CT of the lumbar spine: Geometric measurements for surgery planning in comparison with CT. European Journal of Radiology 144, 109999, https://doi.org/10.1016/j.ejrad.2021.109999
(2021).“),[21](https://www.nature.com/articles/s41597-025-06047-9#ref-CR21 “Florkow, M. C. et al. Magnetic Resonance Imaging Versus Computed Tomography for Three-Dimensional Bone Imaging of Musculoskeletal Pathologies: A Review. Journal of Magnetic Resonance Imaging 56, 11–34, https://doi.org/10.1002/jmri.28067
(2022).“). As a result, most current intraoperative navigation systems still depend on ionizing radiation, exposing both patients and healthcare professionals to potentially harmful levels that raise concerns about long-term health risks[22](#ref-CR22 “Mendelsohn, D. et al. Patient and surgeon radiation exposure during spinal instrumentation using intraoperative computed tomography-based navigation. Spine J 16, 343–354, https://doi.org/10.1016/j.spinee.2015.11.020
(2016).“),[23](#ref-CR23 “Bourret, S. et al. Computed Tomography Intraoperative Navigation in Spinal Surgery: Assessment of Patient Radiation Exposure in Current Practices. Int J Spine Surg 16, 909–915, https://doi.org/10.14444/8319
(2022).“),[24](https://www.nature.com/articles/s41597-025-06047-9#ref-CR24 “Klingler, J. H. et al. Radiation Exposure to Scrub Nurse, Assistant Surgeon, and Anesthetist in Minimally Invasive Spinal Fusion Surgery Comparing 2D Conventional Fluoroscopy With 3D Fluoroscopy-based Navigation: A Randomized Controlled Trial. Clin Spine Surg 34, E211–e215, https://doi.org/10.1097/bsd.0000000000001077
(2021).“).
Additionally, they are limited in their ability to adapt quickly to changes that occur during surgery[25](#ref-CR25 “Huang, M., Tetreault, T. A., Vaishnav, A., York, P. J. & Staub, B. N. The current state of navigation in robotic spine surgery. Ann Transl Med 9, 86, https://doi.org/10.21037/atm-2020-ioi-07
(2021).“),[26](#ref-CR26 “Ewurum, C. H., Guo, Y., Pagnha, S., Feng, Z. & Luo, X. Surgical Navigation in Orthopedics: Workflow and System Review. Intelligent Orthopaedics: Artificial Intelligence and Smart Image-guided Technology for Orthopaedics, 47–63 https://doi.org/10.1007/978-981-13-1396-7_4
(2018).“),[27](#ref-CR27 “Fucentese, S. F. et al. Accuracy of 3D-planned patient specific instrumentation in high tibial open wedge valgisation osteotomy. Journal of Experimental Orthopaedics 7, 7, https://doi.org/10.1186/s40634-020-00224-y
(2020).“),[28](#ref-CR28 “Iannotti, J. P. et al. Accuracy of 3-Dimensional Planning, Implant Templating, and Patient-Specific Instrumentation in Anatomic Total Shoulder Arthroplasty. JBJS 101, 446–457, https://doi.org/10.2106/jbjs.17.01614
(2019).“),[29](https://www.nature.com/articles/s41597-025-06047-9#ref-CR29 “Alam, F., Rahman, S. U., Ullah, S. & Gulati, K. Medical image registration in image guided surgery: Issues, challenges and research opportunities. Biocybernetics and Biomedical Engineering 38, 71–89, https://doi.org/10.1016/j.bbe.2017.10.001
(2018).“), and their poor visualization of soft tissues can hinder the identification and avoidance of critical structures, potentially leading to perioperative complications[30](https://www.nature.com/articles/s41597-025-06047-9#ref-CR30 “Farshad, M., Aichmair, A., Gerber, C. & Bauer, D. E. Classification of perioperative complications in spine surgery. Spine J 20, 730–736, https://doi.org/10.1016/j.spinee.2019.12.013
(2020).“),[31](https://www.nature.com/articles/s41597-025-06047-9#ref-CR31 “Sakhrekar, R. et al. Pedicle screw accuracy placed with assistance of machine vision technology in patients with neuromuscular scoliosis. Spine Deform 12, 739–746, https://doi.org/10.1007/s43390-024-00830-1
(2024).“).
Ultrasound (US) imaging is a promising alternative in this setting. Known for its non-harmful, real-time, and dynamic imaging capabilities, US has been extensively used for diagnostic purposes perioperatively[32](#ref-CR32 “Dulchavsky, S. A. et al. Advanced Ultrasonic Diagnosis of Extremity Trauma: The FASTER Examination. Journal of Trauma and Acute Care Surgery 53, 28–32, https://journals.lww.com/jtrauma/fulltext/2002/07000/advanced_ultrasonic_diagnosis_of_extremity_trauma_.6.aspx
(2002).“),[33](#ref-CR33 “Wells, P. S., Lensing, A. W., Davidson, B. L., Prins, M. H. & Hirsh, J. Accuracy of Ultrasound for the Diagnosis of Deep Venous Thrombosis in Asymptomatic Patients after Orthopedic Surgery. Annals of Internal Medicine 122, 47–53, https://doi.org/10.7326/0003-4819-122-1-199501010-00008
(1995).“),[34](https://www.nature.com/articles/s41597-025-06047-9#ref-CR34 “Blankstein, A. Ultrasound in the diagnosis of clinical orthopedics: The orthopedic stethoscope. World J Orthop 2, 13–24, https://doi.org/10.5312/wjo.v2.i2.13
(2011).“). Despite its potential, applying handheld 2D US (HUS) in surgical guidance remains limited to pre-clinical and early clinical research in orthopedics[35](#ref-CR35 “Niu, K., Homminga, J., Sluiter, V. I., Sprengers, A. & Verdonschot, N. Feasibility of A-mode ultrasound based intraoperative registration in computer-aided orthopedic surgery: A simulation and experimental study. PLOS ONE 13, e0199136, https://doi.org/10.1371/journal.pone.0199136
(2018).“),[36](#ref-CR36 “Gueziri, H. E., Georgiopoulos, M., Santaguida, C. & Collins, D. L. Ultrasound-based navigated pedicle screw insertion without intraoperative radiation: feasibility study on porcine cadavers. Spine J 22, 1408–1417, https://doi.org/10.1016/j.spinee.2022.04.014
(2022).“),[37](#ref-CR37 “Brendel, B., Rick, S. W. A., Stockheim, M. & Ermert, H. Registration of 3D CT and Ultrasound Datasets of the Spine using Bone Structures. Computer Aided Surgery 7, 146–155, https://doi.org/10.3109/10929080209146025
(2002).“),[38](#ref-CR38 “Ottacher, D., Chan, A., Parent, E. & Lou, E. Positional and Orientational Accuracy of 3-D Ultrasound Navigation System on Vertebral Phantom Study. IEEE Transactions on Instrumentation and Measurement 69, 6412–6419, https://doi.org/10.1109/TIM.2020.2973839
(2020).“),[39](https://www.nature.com/articles/s41597-025-06047-9#ref-CR39 “Gueziri, H. E., Santaguida, C. & Collins, D. L. The state-of-the-art in ultrasound-guided spine interventions. Med Image Anal 65, 101769, https://doi.org/10.1016/j.media.2020.101769
(2020).“). This limitation is due to several challenges: US is still predominantly regarded as a diagnostic rather than a therapeutic tool; significant advancements in automatic image segmentation are required to enhance its practicality; and the observer-dependent nature of current ultrasound systems discourages many professionals from adopting it for intraoperative use. Recent advances in robot-assisted 3D reconstruction of 2D US images represent a breakthrough in this research area[40](#ref-CR40 “Jiang, W., Chen, X. & Yu, C. A real-time freehand 3D ultrasound imaging method for scoliosis assessment. J Appl Clin Med Phys 23, e13709, https://doi.org/10.1002/acm2.13709
(2022).“),[41](#ref-CR41 “Li, R., Davoodi, A., Cai, Y. & Vander Poorten, E. Automatic Robotic Scanning for real-time 3D Ultrasound Reconstruction in Spine Surgery. 11th Conference on New Technologies for Computer and Robot Assisted Surgery, Date: 2022/04/25-2022/04/27, Location: Napoli, Italy. https://lirias.kuleuven.be/retrieve/674549
(2022).“),[42](#ref-CR42 “Hacihaliloglu, I. & Vives, M. J. Real-time non-radiation-based navigation using 3D ultrasound for pedicle screw placement. The Spine Journal 20, S134–S135, https://doi.org/10.1016/j.spinee.2020.05.685
(2020).“),[43](#ref-CR43 “Li, R. et al. Development and evaluation of robot-assisted ultrasound navigation system for pedicle screw placement: An ex-vivo animal validation. Int J Med Robot, e2590. https://doi.org/10.1002/rcs.2590
(2023).“),[44](#ref-CR44 “Lei, L. et al. Robotic Needle Insertion With 2D Ultrasound–3D CT Fusion Guidance. IEEE Transactions on Automation Science and Engineering, 1–13. https://doi.org/10.1109/TASE.2023.3322710
(2023).“),[45](https://www.nature.com/articles/s41597-025-06047-9#ref-CR45 “Jiang, Z., Salcudean, S. E. & Navab, N. Robotic ultrasound imaging: State-of-the-art and future perspectives. Medical Image Analysis 89, 102878, https://doi.org/10.1016/j.media.2023.102878
(2023).“). Some US transducers are capable of real-time 3D imaging, but they currently suffer from lower imaging quality, structure insensitivity, and higher cost than 2D US transducers[46](#ref-CR46 “Chen, X., Chen, H., Peng, Y., Liu, L. & Huang, C. A Freehand 3D Ultrasound Reconstruction Method Based on Deep Learning. Electronics 12, 1527, https://doi.org/10.3390/electronics12071527
(2023).“),[47](#ref-CR47 “Daoud, M. I., Alshalalfah, A.-L., Awwad, F. & Al-Najar, M. Freehand 3D ultrasound imaging system using electromagnetic tracking. 2015 International Conference on Open Source Software Computing (OSSCOM), 1–5. https://doi.org/10.1109/OSSCOM.2015.7372689
(2015).“),[48](#ref-CR48 “Wang, N. et al. A Multiplexed 32 × 32 2D Matrix Array Transducer for Flexible Sub-Aperture Volumetric Ultrasound Imaging. IEEE Trans Biomed Eng 71, 831–840, https://doi.org/10.1109/tbme.2023.3319513
(2024).“),[49](#ref-CR49 “Fenster, A., Downey, D. B. & Cardinal, H. N. Three-dimensional ultrasound imaging. Physics in Medicine & Biology 46, R67, https://doi.org/10.1088/0031-9155/46/5/201
(2001).“),[50](https://www.nature.com/articles/s41597-025-06047-9#ref-CR50 “Bureau, F. et al. Three-dimensional ultrasound matrix imaging. Nature Communications 14, 6793, https://doi.org/10.1038/s41467-023-42338-8
(2023).“). Therefore, the community mainly utilizes tracking- or robot-based odometry to create 3D US data by stacking 2D US images along the acquisition axis[51](#ref-CR51 “Luo, M. et al. RecON: Online learning for sensorless freehand 3D ultrasound reconstruction. Medical Image Analysis 87, 102810, https://doi.org/10.1016/j.media.2023.102810
(2023).“),[52](#ref-CR52 “Li, R. et al. Robot-assisted ultrasound reconstruction for spine surgery: from bench-top to pre-clinical study. International Journal of Computer Assisted Radiology and Surgery 18, 1613–1623, https://doi.org/10.1007/s11548-023-02932-z
(2023).“),[53](https://www.nature.com/articles/s41597-025-06047-9#ref-CR53 “Bekedam, N. M. et al. Comparison of image quality of 3D ultrasound: motorized acquisition versus freehand navigated acquisition, a phantom study. International Journal of Computer Assisted Radiology and Surgery 18, 1649–1663, https://doi.org/10.1007/s11548-023-02934-x
(2023).“). This development promises to overcome the limitations of traditional 2D HUS by providing comprehensive 3D visualization of anatomical structures, facilitating observer-independent use, and allowing orientation-independent imaging. Moreover, US-based 3D reconstruction could enhance the perception capabilities of surgical navigation and robotic systems without the drawbacks of additional ionizing radiation.
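The tracking-based stacking described above can be sketched as follows. This is a minimal nearest-neighbour compounding example, assuming one 4×4 tracker pose per 2D frame and a fixed image-to-marker spatial calibration matrix; all function names, spacings, and grid parameters are illustrative and are not part of the released dataset tooling:

```python
import numpy as np

def pixel_to_world(tracker_pose, image_to_marker, pixel_uv, spacing_mm):
    """Map a 2D US pixel to 3D tracker (world) coordinates.

    tracker_pose:    4x4 pose of the transducer marker in tracker space.
    image_to_marker: 4x4 spatial calibration (image plane -> marker frame).
    pixel_uv:        (column, row) pixel indices in the 2D frame.
    spacing_mm:      (sx, sy) physical pixel spacing in mm.
    """
    u, v = pixel_uv
    sx, sy = spacing_mm
    # The 2D image lies in the z = 0 plane of the image coordinate frame.
    p_img = np.array([u * sx, v * sy, 0.0, 1.0])
    return (tracker_pose @ image_to_marker @ p_img)[:3]

def compound_volume(frames, poses, image_to_marker, spacing_mm,
                    voxel_mm, origin, shape):
    """Nearest-neighbour compounding of tracked 2D frames into a 3D grid."""
    vol = np.zeros(shape, dtype=np.float32)
    cnt = np.zeros(shape, dtype=np.uint16)
    for frame, pose in zip(frames, poses):
        T = pose @ image_to_marker
        rows, cols = frame.shape
        vv, uu = np.meshgrid(np.arange(rows), np.arange(cols), indexing="ij")
        # Homogeneous coordinates of every pixel centre in the image plane.
        pts = np.stack([uu * spacing_mm[0], vv * spacing_mm[1],
                        np.zeros_like(uu, dtype=float),
                        np.ones_like(uu, dtype=float)], axis=-1)
        world = pts.reshape(-1, 4) @ T.T
        idx = np.round((world[:, :3] - origin) / voxel_mm).astype(int)
        ok = np.all((idx >= 0) & (idx < np.array(shape)), axis=1)
        idx, vals = idx[ok], frame.reshape(-1)[ok]
        # Accumulate intensities and hit counts, then average overlaps.
        np.add.at(vol, (idx[:, 0], idx[:, 1], idx[:, 2]), vals)
        np.add.at(cnt, (idx[:, 0], idx[:, 1], idx[:, 2]), 1)
    return vol / np.maximum(cnt, 1)
```

Averaging overlapping contributions (rather than overwriting) is one common choice for compounding; the trade-offs between interpolation schemes are discussed in the freehand-reconstruction literature cited above.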
This clinical trial aimed to produce a dataset for advancing machine learning applications of 3D reconstruction algorithms using HUS and robot-assisted US (RUS) data, critical for facilitating its translation from pre-clinical research to clinical application for surgical guidance[54](https://www.nature.com/articles/s41597-025-06047-9#ref-CR54 “Cavalcanti, N. et al. A large, paired dataset of robotic and handheld lumbar spine ultrasound with ground-truth CT benchmarking. https://doi.org/10.48804/3XPCAE
(2024).“). One example of a potential downstream application in robot-assisted surgery is the FAROS project (EU Horizon 2020), aiming to leverage non-visual sensing technology to improve the autonomy of robotic pedicle screw drilling[55](https://www.nature.com/articles/s41597-025-06047-9#ref-CR55 “Project FAROS, EC Horizon 2020 Framework Programme (EU Framework Programme for Research and Innovation H2020), https://h2020faros.eu/
(2020).“).
Up to now, only a few musculoskeletal HUS datasets are openly accessible to the scientific community[56](https://www.nature.com/articles/s41597-025-06047-9#ref-CR56 “Hohlmann, B., Broessner, P. & Radermacher, K. Ultrasound-based 3D bone modelling in computer assisted orthopedic surgery – a review and future challenges. Computer Assisted Surgery 29, 2276055, https://doi.org/10.1080/24699322.2023.2276055
(2024).“), such as the Leg-3D-US datasets (44 volunteers)[57](https://www.nature.com/articles/s41597-025-06047-9#ref-CR57 “Krönke, M. et al. Tracked 3D ultrasound and deep neural network-based thyroid segmentation reduce interobserver variability in thyroid volumetry. PLOS ONE 17, e0268550, https://doi.org/10.1371/journal.pone.0268550
(2022).“), and a recent initiative, the TUS-REC challenge (100 volunteers)[58](https://www.nature.com/articles/s41597-025-06047-9#ref-CR58 “Li, Q., et al Trackerless 3D Freehand Ultrasound Reconstruction Challenge 2025 (TUS-REC2025) - Train Dataset (1.0.0) [Data set]. https://doi.org/10.5281/zenodo.15224704
(2025).“), which aims to open-source a large HUS dataset of bilateral forearms with accurate positional information. To our knowledge, ours is the first RUS dataset of this size and scope to be made publicly available for large-scale scientific and clinical analyses[45](https://www.nature.com/articles/s41597-025-06047-9#ref-CR45 “Jiang, Z., Salcudean, S. E. & Navab, N. Robotic ultrasound imaging: State-of-the-art and future perspectives. Medical Image Analysis 89, 102878, https://doi.org/10.1016/j.media.2023.102878
(2023).“),[54](https://www.nature.com/articles/s41597-025-06047-9#ref-CR54 “Cavalcanti, N. et al. A large, paired dataset of robotic and handheld lumbar spine ultrasound with ground-truth CT benchmarking. https://doi.org/10.48804/3XPCAE
(2024).“).
We present a comprehensive dataset from a single academic spine center, including demographics, HUS, RUS, and high-quality, ultra-low-dose CT images of the lumbar spine of 63 healthy volunteers[54](https://www.nature.com/articles/s41597-025-06047-9#ref-CR54 “Cavalcanti, N. et al. A large, paired dataset of robotic and handheld lumbar spine ultrasound with ground-truth CT benchmarking. https://doi.org/10.48804/3XPCAE
(2024).“). Figure 1 provides an overview of the study pipeline, from volunteer selection and eligibility criteria to the different imaging methods and data processing. As such, this dataset adds to the existing collection of lumbar spine medical imaging data and provides a foundation for developing advanced intraoperative strategies in managing musculoskeletal disorders using RUS for surgical guidance[54](https://www.nature.com/articles/s41597-025-06047-9#ref-CR54 “Cavalcanti, N. et al. A large, paired dataset of robotic and handheld lumbar spine ultrasound with ground-truth CT benchmarking. https://doi.org/10.48804/3XPCAE
(2024).“),[59](#ref-CR59 “Deng, Y. et al. CTSpine1K: a large-scale dataset for spinal vertebrae segmentation in computed tomography. arXiv preprint arXiv:2105.14711. https://doi.org/10.48550/arXiv.2105.14711
(2021).“),[60](#ref-CR60 “Löffler, M. T. et al. A Vertebral Segmentation Dataset with Fracture Grading. Radiology: Artificial Intelligence 2, e190138, https://doi.org/10.1148/ryai.2020190138
(2020).“),[61](https://www.nature.com/articles/s41597-025-06047-9#ref-CR61 “Sekuboyina, A. et al. VerSe: A Vertebrae labelling and segmentation benchmark for multi-detector CT images. Medical Image Analysis 73, 102166, https://doi.org/10.1016/j.media.2021.102166
(2021).“). Moreover, this ultrasound dataset complements the recent works published on medical imaging data descriptors for CT, MRI, and Finite Element Model data of the lumbar spine, adding more opportunities for machine learning approaches[54](https://www.nature.com/articles/s41597-025-06047-9#ref-CR54 “Cavalcanti, N. et al. A large, paired dataset of robotic and handheld lumbar spine ultrasound with ground-truth CT benchmarking. https://doi.org/10.48804/3XPCAE
(2024).“),[62](#ref-CR62 “Liebl, H. et al. A computed tomography vertebral segmentation dataset with anatomical variations and multi-vendor scanner data. Scientific Data 8, 284, https://doi.org/10.1038/s41597-021-01060-0
(2021).“),[63](#ref-CR63 “van der Graaf, J. W. et al. Lumbar spine segmentation in MR images: a dataset and a public benchmark. Scientific Data 11, 264, https://doi.org/10.1038/s41597-024-03090-w
(2024).“),[64](https://www.nature.com/articles/s41597-025-06047-9#ref-CR64 “Rasouligandomani, M. et al. Dataset of Finite Element Models of Normal and Deformed Thoracolumbar Spine. Scientific Data 11, 549, https://doi.org/10.1038/s41597-024-03351-8
(2024).“).
Fig. 1
Study pipeline. (a) Sixty-three volunteers were recruited based on in-/exclusion criteria and passed a clinical examination of the lumbar spine. (b) An ultra-low-dose computed tomography (CT) scan was conducted. (c) The US setup was calibrated for accurate image recording. (d) Handheld (HUS) and robot-assisted US (RUS) scans were conducted. (e) The collected data were processed: CT scans were segmented into 3D surface models, and the bone surface was manually annotated in 2353 HUS and 3738 RUS images.
Methods
The dataset of this study was obtained through the study pipeline described in Fig. 1, which encompasses the following steps:
- First, all participants’ demographic information, including sex, age, height, weight, and BMI, was collected (see section Volunteers).
- Afterward, a CT scan of the lumbar spine was performed on the same day, right before the US scans (see section Computed Tomography Scans).
- Next, paired ultrasound scans were acquired using two separate setups on each participant (see section Ultrasound Data Acquisition):
  - The system was previously calibrated for accurate image recording (see subsection Ultrasound Image Recording and Processing).
  - 223 HUS scans were performed (see section Handheld Ultrasound Scans).
  - 375 RUS scans were performed (see section Robot-assisted Ultrasound Scans).
- Finally, the obtained data were processed for accurate storage and further use (see subsection Data Processing):
  - All CT scans were segmented into 3D surface models (see section Computed Tomography Image Processing).
  - The bone surface was annotated in 2353 HUS and 3738 RUS frames (see section Data Annotation and Labeling).
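As an illustration of how such bone-surface annotations might be consumed downstream, the sketch below rasterizes a polyline annotation into a per-pixel binary mask suitable for training a segmentation model. The (row, col) polyline representation is an assumption made for illustration, not the dataset's actual label format:

```python
import numpy as np

def polyline_to_mask(points, shape):
    """Rasterize a bone-surface polyline annotation into a binary mask.

    points: (N, 2) array of (row, col) vertices along the annotated surface
            (an assumed format, for illustration only).
    shape:  (rows, cols) of the US frame.
    """
    mask = np.zeros(shape, dtype=bool)
    for (r0, c0), (r1, c1) in zip(points[:-1], points[1:]):
        # Sample each segment densely enough to hit every traversed pixel.
        n = int(max(abs(r1 - r0), abs(c1 - c0))) + 1
        rr = np.linspace(r0, r1, n).round().astype(int)
        cc = np.linspace(c0, c1, n).round().astype(int)
        mask[rr.clip(0, shape[0] - 1), cc.clip(0, shape[1] - 1)] = True
    return mask
```

A mask produced this way can be overlaid on the corresponding HUS or RUS frame to visually verify the annotation, or used directly as a training target.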
Volunteers
For this study, a total of 63 healthy adults (39 females, 24 males) aged 20–35 years (mean 25 years) with a BMI of 19–26 kg/m² (mean 22 kg/m²) were recruited. Healthy adults were chosen to ensure motion-free scanning protocols, satisfy the ethical constraints of ultra-low-dose CT, and create a publicly shareable reference dataset of healthy paired US and CT images[54](https://www.nature.com/articles/s41597-025-06047-9#ref-CR54 “Cavalcanti, N. et al. A large, paired dataset of robotic and handheld lumbar spine ultrasound with ground-truth CT benchmarking. https://doi.org/10.48804/3XPCAE
(2024).“). A detailed overview of the demographic characteristics is available in the repository. Oral and written informed consent were obtained from all volunteers before data collection. Additionally, all participants consented to further use of their data for research and photography publication when applicable. Limiting the cohort to healthy volunteers also enabled the completion of all acquisitions within the one-month window budgeted for trial data collection, as participants could be scheduled back-to-back; this approach would have been impractical with patient recruitment. Future work will involve pathological cohorts (e.g., vertebral fractures, osteoarthritis), where altered anatomy and potential discomfort may require customized positioning and imaging settings. The institutional review board and the local ethics committee of the Canton of Zurich, Switzerland, approved the study protocol (BASEC No: 2023-00350, 18.04.2023). Volunteer eligibility was assessed against the inclusion and exclusion criteria (Table 1). A study-specific clinical examination of the lumbar spine was performed on all volunteers to confirm eligibility, and pregnancy tests were conducted when applicable. All imaging data were collected between May and June 2023 at the University Hospital Balgrist, Zurich, Switzerland. Demographic data and inclusion/exclusion criteria were collected and managed using REDCap electronic data capture tools hosted at Balgrist University Hospital[65](https://www.nature.com/articles/s41597-025-06047-9#ref-CR65 “Harris, P. A. et al. The REDCap consortium: Building an international community of software platform partners. Journal of Biomedical Informatics 95, 103208, https://doi.org/10.1016/j.jbi.2019.103208
(2019).“),[66](https://www.nature.com/articles/s41597-025-06047-9#ref-CR66 “Harris, P. A. et al. Research electronic data capture (REDCap)—A metadata-driven methodology and workflow process for providing translational research informatics support. Journal of Biomedical Informatics 42, 377–381, https://doi.org/10.1016/j.jbi.2008.08.010
(2009).“). This study was registered as a clinical trial on ClinicalTrials.gov (Trial No: NCT05904418, 03.05.2023).
Data acquisition logistics
Given the collaborative nature of the project between KU Leuven and the University of Zurich, a substantial and coordinated effort was essential to ensure the quality and comprehensiveness of the dataset. Six months of protocol design, ethics approval, and pilot testing preceded a focused one-month on-site campaign at the University of Zurich, during which all 63 volunteers were scanned (≈ 2 h per participant; up to four participants per day). Each on-site session was staffed by six researchers: two medical personnel, who managed recruitment, performed the clinical examination, and oversaw the ultra-low-dose CT acquisition with the assistance of a CT technologist, and a team of four to five members responsible for protocol adherence, HUS imaging, real-time data logging, RUS scanning, and volunteer safety monitoring. After the acquisition, more than 15 months were spent processing, validating, and curating the CT and US datasets[54](https://www.nature.com/articles/s41597-025-06047-9#ref-CR54 “Cavalcanti, N. et al. A large, paired dataset of robotic and handheld lumbar spine ultrasound with ground-truth CT benchmarking. https://doi.org/10.48804/3XPCAE
(2024).“).
Computed tomography scans
Each participant underwent a tin-filtered ultra-low-dose (ULD) CT scan (NAEOTOM Alpha, Siemens, Erlangen, Germany) of the lumbar spine, including the vertebrae L1 to L5 and partial views of the iliac bones[67](https://www.nature.com/articles/s41597-025-06047-9#ref-CR67 “Stern, C. et al. Pelvic bone CT: can tin-filtered ultra-low-dose CT and virtual radiographs be used as alternative for standard CT and digital radiographs? Eur Radiol 31, 6793–6801, https://doi.org/10.1007/s00330-021-07824-x
(2021).“). The CT data were obtained using a specific acquisition protocol developed for this study. During this imaging procedure, the volunteers were positioned prone with their heads on a pillow and their arms above their heads (e.g., Fig. 2a). Further, a structured positioning cushion (MRI Knee support cushion, Siemens, Erlangen, Germany) was placed below the lower legs and ankles so that the spine adopted a similarly comfortable position during the CT scan as in the subsequent US examinations (e.g., Fig. 2b,c). Afterward, a CT topogram and axial reconstructions with 0.6 mm and 2.0 mm slice thickness and increment and a matrix size of 512 × 512 were collected, which are the current standard for ULD scans at our institution (Table 2)[67](https://www.nature.com/articles/s41597-025-06047-9#ref-CR67 “Stern, C. et al. Pelvic bone CT: can tin-filtered ultra-low-dose CT and virtual radiographs be used as alternative for standard CT and digital radiographs? Eur Radiol 31, 6793–6801, https://doi.org/10.1007/s00330-021-07824-x
(2021).“),[68](https://www.nature.com/articles/s41597-025-06047-9#ref-CR68 “Stern, C., Wanivenhaus, F., Rosskopf, A. B., Farshad, M. & Sutter, R. Superior metal artifact reduction of tin-filtered low-dose CT in imaging of lumbar spinal instrumentation compared to conventional computed tomography. Skeletal Radiol 53, 665–673, https://doi.org/10.1007/s00256-023-04467-5
(2024).“).
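When the CT volumes serve as ground truth against tracked US data, voxel indices must be mapped into patient-space millimetre coordinates using the scan's origin, spacing, and direction matrix. The sketch below shows this standard mapping; note that only the 0.6 mm/2.0 mm slice settings are stated above, so the in-plane spacing used in the test is an illustrative placeholder:

```python
import numpy as np

def voxel_to_patient(ijk, origin, spacing, direction=None):
    """Map CT voxel indices (i, j, k) to patient-space coordinates in mm.

    origin:    position of voxel (0, 0, 0) in mm.
    spacing:   per-axis voxel size in mm, e.g. (in-plane, in-plane, slice).
    direction: 3x3 axis-direction matrix; identity assumed if omitted.
    """
    if direction is None:
        direction = np.eye(3)  # axis-aligned scan assumed
    ijk = np.asarray(ijk, dtype=float)
    return np.asarray(origin, dtype=float) + direction @ (
        ijk * np.asarray(spacing, dtype=float))
```

The same convention (origin + direction · (index · spacing)) is used by common medical-imaging toolkits, which makes coordinates computed this way directly comparable to tracked US points after registration.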
Fig. 2
Experimental CT, HUS, and RUS scan setup. The volunteers were positioned prone on the table during (a) CT Scan, (b) HUS, and (c) RUS. In all three imaging examinations, the arms were rested above the head, and a positioning cushion supported the feet and lower legs for comfort. The US device was positioned at the head of the volunteer. The HUS and RUS scans were conducted in the same tracking space, with the camera placed approximately two meters away on the left side of the volunteer.
Ultrasound data acquisition
Two different US acquisition setups were performed: HUS and RUS. 223 HUS and 375 RUS scans could be completed in the data collection without errors or data-saving issues. The total number of HUS and RUS scans differs mainly because the “Perpendicular”-type US scans were repeated three times for each RUS scan. This approach was chosen because the perpendicular scans are the simplest for the system to execute, making them less prone to errors. Additionally, this type of scan represents the current standard for our scanning system to achieve sufficient bone surface reconstruction, ensuring reliable and consistent data collection[52](https://www.nature.com/articles/s41597-025-06047-9#ref-CR52 “Li, R. et al. Robot-assisted ultrasound reconstruction for spine surgery: from bench-top to pre-clinical study. International Journal of Computer Assisted Radiology and Surgery 18, 1613–1623, https://doi.org/10.1007/s11548-023-02932-z
(2023).“). The US section of the repository provides an overview of the availability of HUS and RUS scan types. All US imaging was acquired using a clinical US device (Aixplorer Ultimate, SuperSonic Imagine, Aix-en-Provence, France; Fig. 2c). The volunteers were positioned prone with arms folded above the head on a height- and tilt-adjustable operating table (Maquet Alphastar, Getinge AB, Göteborg, Sweden). A structured support cushion was placed beneath the lower legs and ankles to ensure the volunteers’ comfort during the scans and to recreate the same body position as during the preceding CT scan (e.g., Fig. 2c). The participants were instructed to remain stationary, breathe slowly, and relax. The US data were collected using a linear transducer (SuperLinear™ SL10-2) set to 10 MHz. The image depth (7.7 cm) and focus (4 to 7 cm, slightly adjusted for each volunteer) were previously assessed and defined by an expert radiologist for this custom imaging protocol. The gain, or brightness, of the US image was adjusted between 60% and 74%, depending on the volunteer and soft tissue thickness. Two identical transducers were available, one each for the HUS and RUS scans (e.g., Fig. 2b,c). The transducers were embedded in a custom-made 3D-printed transducer housing (material PA-12, Formiga P110, EOS GmbH, Munich, Germany) featuring markers with individual geometries of reflective disks (Atracsys LLC, Puidoux, Switzerland) for optical tracking (e.g., Fig. 3a,b). An additional marker was positioned on the midline of the volunteer’s spine to monitor chest movements during breathing (e.g., Fig. 3a,c). The US images were recorded at 1080 × 1920 pixels by a frame grabber (Epiphan, Palo Alto, U.S.A.) at 15 Hz.
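Because the 15 Hz frame stream and the optical-tracking pose stream are recorded at different rates, downstream use of the data requires pairing each frame with a probe pose. The following is a minimal nearest-timestamp matching sketch; the function name, tolerance, and pose rate are illustrative assumptions, not taken from the released code:

```python
import bisect

def match_frames_to_poses(frame_times, pose_times, tolerance=0.05):
    """Pair each US frame with the tracking pose whose timestamp is
    closest, rejecting pairs whose time gap exceeds `tolerance` (s)."""
    pairs = []
    for i, t in enumerate(frame_times):
        j = bisect.bisect_left(pose_times, t)
        # Candidate poses: the one just before and the one just after t.
        candidates = [k for k in (j - 1, j) if 0 <= k < len(pose_times)]
        best = min(candidates, key=lambda k: abs(pose_times[k] - t))
        if abs(pose_times[best] - t) <= tolerance:
            pairs.append((i, best))
    return pairs

# Frames at ~15 Hz paired against poses at ~100 Hz (illustrative rates):
frames = [0.0, 0.066, 0.133]
poses = [i * 0.01 for i in range(20)]
print(match_frames_to_poses(frames, poses))  # -> [(0, 0), (1, 7), (2, 13)]
```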
Fig. 3
Detailed view of the two separate experimental setups for the HUS and RUS scans. The transducers were housed in a custom-made cast featuring two markers for optical tracking. (a) The investigator on the volunteer’s left side conducted the HUS scans. (b) The custom-made robot end-effector included a 6-DOF force-torque sensor; for both RUS and HUS scans, a gel pad was attached to the transducer’s tip (picture-in-picture). (c) The robotic system was positioned on the volunteer’s right side. (d) The physician manipulated the robotic arm to define the scanning area on the volunteer’s lower back using admittance control.
The coordinate system of the US transducer is illustrated in Fig. 4a, with the US transducer advanced along its x- (red) and y-axis (green). During RUS scanning, a constant force of 5 N was maintained along the probe’s z-axis (blue) to ensure optimal image quality with limited tissue displacement. The force was measured only during RUS scanning, as measuring it during the HUS scans was not feasible without substantial hardware extensions that would have constrained the traditional HUS workflow. However, the physician conducting the HUS scans was carefully instructed to apply a similar force along the z-axis. During RUS scanning, this force was maintained through a hybrid control strategy that aligned the z-axis with the surface normal vector computed from a predefined trajectory derived from the physician’s manual point selection on the volunteer’s lower back[52](https://www.nature.com/articles/s41597-025-06047-9#ref-CR52 “Li, R. et al. Robot-assisted ultrasound reconstruction for spine surgery: from bench-top to pre-clinical study. International Journal of Computer Assisted Radiology and Surgery 18, 1613–1623, https://doi.org/10.1007/s11548-023-02932-z
(2023).“). Real-time control was facilitated by OROCOS (Open Robot Control Middleware) and eTaSL (expressiongraph-based Task Specification Language)[69](https://www.nature.com/articles/s41597-025-06047-9#ref-CR69 “Bruyninckx, H., Soetens, P. & Koninckx, B. The real-time motion control core of the Orocos project. IEEE International Conference on Robotics and Automation 2, 2766–2771, https://doi.org/10.1109/ROBOT.2003.1242011
(2003).“),[70](https://www.nature.com/articles/s41597-025-06047-9#ref-CR70 “Aertbeliën, E. & De Schutter, J. eTaSL/eTC: A constraint-based task specification language and robot controller using expression graphs. 2014 IEEE/RSJ International Conference on Intelligent Robots and Systems, 1540–1546, https://doi.org/10.1109/IROS.2014.6942760
(2014).“). The robotic US system implements several safety layers at both the software and hardware levels, ranging from low-level software safeguards to the hardware-based safety button.
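The constant 5 N contact force can be illustrated with a simple proportional admittance law along the probe z-axis: the measured force is compared with the target, and the probe is advanced or retracted accordingly. The gain and loop rate below are illustrative placeholders, not the values used by the actual hybrid controller:

```python
TARGET_FORCE_N = 5.0  # desired contact force along the probe z-axis
GAIN = 0.002          # admittance gain, m/s per N (illustrative)
DT = 0.01             # control period, s (illustrative 100 Hz loop)

def z_correction(measured_force_n):
    """One control cycle: return the displacement (m) to apply along
    the probe z-axis -- push deeper if the contact is too light,
    retract if it is too firm."""
    error = TARGET_FORCE_N - measured_force_n
    return GAIN * error * DT

print(z_correction(3.0))  # positive: move toward the skin
print(z_correction(7.0))  # negative: retract
```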
Fig. 4
Multiple types of HUS (b,c,f,g) and RUS (b–f,h) scans were completed. (a) The coordinate system of the US transducer in its coronal, sagittal, and axial views is shown with the x-, y-, and z-axis illustrated in red, green, and blue, respectively. All US scans except (g,h) start at the right side of the vertebra L5, shown with the transducer parallel to the lumbar spine in grey from the top view. The transducer then travels along the 12 predefined points on the scanning path (black) cranially toward the vertebra L1. (b–e) A torso and the lumbar vertebra L3 are illustrated with the transducer movement pattern from an axial view. Forward and reverse scanning paths are illustrated in orange and blue, respectively. (f) Anatomy is illustrated as in (b–e). The US transducer travels forward from right to left, tilting +20° to −20° above critical structures to follow the anatomical shape of the vertebra. (g) As the starting point, the transducer is positioned at a diagonal 45-degree angle to the lumbar spine on the right side of vertebra L5 (grey). When reaching the most cranial and caudal scanning positions at vertebrae L1 and L5, the transducer is turned 90° toward the spine. (h) The transducer travels from below the spinous process of vertebra L5 in a straight motion to just above the spinous process of vertebra L1.
Ultrasound image recording and processing
For accurate image recording, temporal and spatial calibration were performed using a custom-designed Z-phantom with the proposed robotic system before image acquisition[71](https://www.nature.com/articles/s41597-025-06047-9#ref-CR71 “Li, R., Kenan Niu, Y. C., Poorten, E. Comparative Quantitative Analysis of Robotic Ultrasound Image Calibration Methods. 2021 20th International Conference on Advanced Robotics (ICAR), 511–516, https://doi.org/10.1109/ICAR53236.2021.9659341
(2021).“). This calibration yields the transformation between the US images and the robot end-effector, enabling their synchronization with the end-effector poses. The transformation and calibration are explained in detail below (see section technical validation; spatial tracking and Fig. 7a). The pose tracking of the US probes in both HUS and RUS was achieved using an optical tracking camera (FusionTrack 500, Atracsys LLC, Puidoux, Switzerland), positioned approximately two meters away on the left side of the volunteer. The camera position was marked on the ground for reproducibility throughout the entire data collection, and the camera system was calibrated before each data collection day. A gel pad (Focus pad, aiSon™ Technologies, Benglen, Switzerland) was attached to the US transducer to ensure consistent skin contact without excessive pressure on the tissue (e.g., Fig. 3b; picture-in-picture).
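Conceptually, the spatial calibration lets a pixel in the US image be mapped into the tracker (camera) frame by chaining homogeneous transforms: pixel → image plane (mm) → probe marker → tracker. Below is a minimal sketch with identity transforms and illustrative pixel spacings; the function and argument names are assumptions, not the released API:

```python
import numpy as np

def pixel_to_tracker(u, v, T_tracker_marker, T_marker_image, sx, sy):
    """Map an image pixel (u, v) into the tracker frame.
    sx, sy           : pixel size in mm from the calibrated image geometry.
    T_marker_image   : 4x4 transform from the image plane to the probe
                       marker, obtained from the Z-phantom calibration.
    T_tracker_marker : 4x4 marker pose reported by the camera."""
    p_image = np.array([u * sx, v * sy, 0.0, 1.0])  # pixels -> mm, homogeneous
    return (T_tracker_marker @ T_marker_image @ p_image)[:3]

# Identity transforms and 0.1 mm pixels, purely for illustration:
print(pixel_to_tracker(100, 200, np.eye(4), np.eye(4), 0.1, 0.1))  # -> [10. 20.  0.]
```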
Handheld ultrasound (HUS) scans
A physician first palpated the pelvis and the spinous processes of the thoracic and lumbar vertebrae T12 to L5 to mark the anatomical area of interest for the subsequent US scanning. Then, the same physician confirmed the marked area with a preliminary HUS scan of the spinous and transverse processes. After that, at least three standardized and one optional HUS scan in an “S”-shaped scanning pattern were conducted on the lumbar spine for the data collection. The standardized scans included the “Perpendicular,” “Rescan,” and “LR”, while the optional scan was labeled “Variety” (e.g., Fig. 4b,c,f,g). A total of 71, 62, 61, and 29 scans, respectively, were completed for these categories. All standardized scans started on the right dorsal side of the pelvis, laterally to the right transverse process of the L5 vertebra, and progressed medially across the spine to the left side of the respective vertebrae. The “S”-shaped scan patterns advanced cranially toward the vertebra L1 in 3 to 4 cm increments.
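The “S”-shaped progression described above (rows crossing the spine in alternating directions, advancing cranially in 3 to 4 cm increments) can be sketched as simple waypoint generation; the coordinates, units, and function name below are illustrative, not part of the released tooling:

```python
def s_shaped_path(x_right, x_left, y_start, n_rows, row_step_cm=3.5):
    """Generate an 'S'-shaped scan path: each row crosses the spine,
    alternating direction, then advances cranially by row_step_cm
    (3-4 cm in the protocol described above)."""
    path = []
    for row in range(n_rows):
        y = y_start + row * row_step_cm
        start, end = (x_right, x_left) if row % 2 == 0 else (x_left, x_right)
        path.append(((start, y), (end, y)))
    return path

# Five rows roughly spanning L5 (caudal) to L1 (cranial), units in cm:
for segment in s_shaped_path(x_right=-5.0, x_left=5.0, y_start=0.0, n_rows=5):
    print(segment)
```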
The first US scan (e.g., Fig. 4b), “Perpendicular”, was completed with the US transducer perpendicular to the skin. For the second scan, “Rescan” (e.g., Fig. 4f), the investigator took the anatomical shape of the vertebrae into account and scanned back and forth at approximately 20-degree angles to the left and right over critical structures, such as the spinous, mammillary, and transverse processes. The focus on the anatomical shape brings this scan type closer to the clinical use case. Previous work has shown that the maximal tilt angle is 35 degrees and that the most commonly used tilt is 10 degrees to either side[72](https://www.nature.com/articles/s41597-025-06047-9#ref-CR72 “Essomba, T., Nouaille, L., Laribi, M. A., Poisson, G. & Zeghloul, S. Design Process of a Robotized Tele-Echography System. Applied Mechanics and Materials 162, 384-393, https://doi.org/10.4028/www.scientific.net/AMM.162.384
(2012).“). Preliminary tests by our consortium showed that 20-degree angles allowed us to scan the variable spinous bone surface with optimal imaging results and skin-to-probe contact[73](#ref-CR73 “Assche, K. V. et al. Robotic Path Re-Planning for US Reconstruction of the Spine. IEEE Transactions on Medical Robotics and Bionics 7, 755–767, https://doi.org/10.1109/TMRB.2025.3550662
(2025).“),[74](#ref-CR74 “Davoodi, A. et al. All-Ultrasound-Guided Path Planning for Robotic Pedicle Screw Placement. 2024 10th IEEE RAS/EMBS International Conference for Biomedical Robotics and Biomechatronics (BioRob), 641–647, https://doi.org/10.1109/BioRob60516.2024.10719904
(2024).“),[75](https://www.nature.com/articles/s41597-025-06047-9#ref-CR75 “Van Assche, K., Davoodi, A., Li, R., Ourak, M. & Vander Poorten, E. Path Re-planning for Robotic Ultrasound Imaging of Bony Structures. Authorea Preprints. https://doi.org/10.36227/techrxiv.23599437.v1
(2023).“). The third scan (e.g., Fig. 4c), “LR” or left-to-right, was started by tilting the US transducer at a 20-degree angle to the right. When the opposite transverse process was reached, the US transducer was tilted to the left at the same angle, and the scanning was repeated in reverse order. The same increments as in the previous scans were then used to move toward the upper spine. A fourth scanning path (e.g., Fig. 4g), “Variety”, was conducted in only a subset of the volunteers due to time limitations. The path started in the same location as the three standardized HUS scans and traversed the lumbar spine cranially and caudally. However, the US transducer was positioned at a diagonal 45-degree angle to facilitate optical tracking from the same camera position. Therefore, the US transducer had to be turned 90° toward the spine when reaching the most cranial and caudal scanning positions. A custom program with a graphical user interface (GUI) was used to store the collected data. The HUS scans took 3 minutes on average, generating 2700 images per scan.
Robot-assisted ultrasound (RUS) scans
After the HUS scans were completed, the volunteers remained in the same anatomical position for the subsequent RUS scans. All RUS imaging was acquired using a research system previously described and developed within a European collaboration project (FAROS)[52](https://www.nature.com/articles/s41597-025-06047-9#ref-CR52 “Li, R. et al. Robot-assisted ultrasound reconstruction for spine surgery: from bench-top to pre-clinical study. International Journal of Computer Assisted Radiology and Surgery 18, 1613–1623, https://doi.org/10.1007/s11548-023-02932-z
(2023).“),[55](https://www.nature.com/articles/s41597-025-06047-9#ref-CR55 “Project FAROS, EC Horizon 2020 Framework Programme (EU Framework Programme for Research and Innovation H2020), https://h2020faros.eu/
(2020).“). The proposed framework integrates automatic robotic scanning with US image and probe position recording.
The transducer housing for the RUS system included a 3D-printed segment incorporating a 6-DOF force-torque sensor (Nano25, ATI Industrial Automation, Apex, U.S.A.), to which an additional marker was affixed (e.g., Fig. 3b) to capture any movement between the US probe and the robot arm.
Before scanning, the physician manually selected twelve predefined points on the volunteer’s lower back using admittance control (e.g., Fig. 4a). These points define an “S”-shaped trajectory for the robotic arm during automated scanning. With the anatomical area of interest marked, it was checked whether the area was within the workspace limits of the robotic arm holding the probe. If necessary, the position of the surgical table was adjusted (e.g., Fig. 2a).
For each subject, three consecutive “Perpendicular” scans were conducted for the RUS session, as done previously with the HUS (e.g., Fig. 4b). Then “LR”, “LR switch”, “Spline”, “Spline2”, “Changing orientation”, and “Along the spinous process”-type scans were performed; due to time constraints, these were completed in 45, 54, 24, 27, 25, and 11 cases, respectively (e.g., Fig. 4c–f,h). These scanning paths were automatically generated based on three waypoints along each vertebra: (1) at the start of the transverse process, (2) at the center of the spinous process, and (3) at the end of the opposite transverse process. Most RUS scans started in the same position as the HUS scans, on the right dorsal side of the pelvis, lateral to the right transverse process of the L5 vertebra (e.g., Fig. 4a).
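As a rough illustration of how a continuous probe path can be derived from the three per-vertebra waypoints, the sketch below interpolates linearly along the cumulative arc length; the coordinates and the interpolation scheme are assumptions for illustration, not the system’s actual path planner:

```python
import numpy as np

def interpolate_path(waypoints, n_points):
    """Piecewise-linear interpolation through the three per-vertebra
    waypoints, parameterized by cumulative arc length."""
    wp = np.asarray(waypoints, dtype=float)
    seg = np.linalg.norm(np.diff(wp, axis=0), axis=1)   # segment lengths
    s = np.concatenate(([0.0], np.cumsum(seg)))          # arc-length parameter
    t = np.linspace(0.0, s[-1], n_points)
    return np.column_stack([np.interp(t, s, wp[:, d]) for d in range(wp.shape[1])])

waypoints = [(-3.0, 0.0, 1.0),  # right transverse process (hypothetical, cm)
             (0.0, 0.0, 2.0),   # spinous process centre (more superficial)
             (3.0, 0.0, 1.0)]   # left transverse process
path = interpolate_path(waypoints, 5)
print(path)
```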
In the “LR” path, the probe scans the full vertebra from the right to the left transverse process at a constant 20 degrees and returns along the same path at a constant −20 degrees (e.g., Fig. 4c). The choice of 20-degree angles follows the same rationale as described above for the HUS scans. In the “LR switch” scan, the US probe starts at the right transverse process and remains at a constant angle of 20 degrees until the center point at the spinous process (e.g., Fig. 4d). At the center, it rotates to −20 degrees and scans at this constant angle until the left transverse process has been entirely scanned. The “Spline” scan simulates a scan path following the shape of the vertebrae (e.g., Fig. 4e). The scan followed an estimated shape to position the US probe as perpendicular as possible to the bone surface, with a variable inclination of at most ±20 degrees as the scan progressed. The “Spline2” scan was introduced as a modified version of “Spline” (e.g., Fig. 4e), in which the inclination of the US probe gradually increases from 0 degrees at the right transverse process up to 20 degrees at the spinous process. This motion is then mirrored: the US probe inclination switches to −20 degrees at the spinous process and then gradually returns to 0 degrees at the left transverse process.
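The tilt profiles of the “LR switch” and “Spline2” scans can be written as simple functions of the normalized lateral position s across the vertebra (s = 0 at the right transverse process, s = 0.5 at the spinous process, s = 1 at the left transverse process); this parameterization is purely illustrative:

```python
def lr_switch_angle(s):
    """'LR switch': constant +20 deg until the spinous process
    (s = 0.5), then constant -20 deg to the left transverse process."""
    return 20.0 if s < 0.5 else -20.0

def spline2_angle(s):
    """'Spline2': inclination ramps 0 -> +20 deg toward the spinous
    process, flips sign there, then ramps -20 -> 0 deg toward the
    left transverse process."""
    if s < 0.5:
        return 40.0 * s
    return -20.0 + 40.0 * (s - 0.5)

for s in (0.0, 0.25, 0.5, 0.75, 1.0):
    print(s, lr_switch_angle(s), spline2_angle(s))
```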
The “Changing orientation” path followed a scanning trajectory such that the US probe was almost perpendicular to the underlying vertebral bone surfaces, similar to the HUS “Rescan” scan type (e.g., Fig. 4f). The trajectory for this path was determined by an algorithm based on the vertebral bone surface outline obtained from the previously acquired “Perpendicular” scan of the same volunteer[73](https://www.nature