Diagnostic accuracy of chest X-ray interpretation for tuberculosis by three artificial intelligence-based software in a screening use-case: an individual patient meta-analysis of global data


Background

Chest X-ray (CXR) screening is a useful diagnostic tool for testing individuals at high risk of tuberculosis (TB), but image interpretation requires trained human readers, who are in short supply in many high-TB-burden countries. CXR interpretation by computer-aided detection (CAD) software may overcome some of these challenges, yet evidence on its accuracy is still limited.

We established a CXR library with images and metadata from individuals across a range of risk groups who underwent TB screening in a variety of countries, in order to assess the diagnostic accuracy of three commercial CAD solutions through an individual participant meta-analysis.

Methods and findings

We collected digital CXRs and demographic and clinical data from six source studies involving a total of 2756 participants, 1753 (64%) of whom also had microbiological test results. All CXR images were analyzed with CAD4TB v6 (Delft Imaging), Lunit Insight CXR TB algorithm v4.9.0 (Lunit Inc.), and qXR v2 (Qure.ai), and were re-read by an expert radiologist who was blinded to the initial CXR reading, the CAD scores, and participant information. While CAD performance varied across source studies, the pooled, meta-analyzed summary receiver operating characteristic (ROC) curves of the three products against a microbiological reference standard were similar, with areas under the curve (AUCs) of 76.4 (95% CI 72.1-80.3) for CAD4TB, 83.3 (95% CI 78.4-87.2) for Lunit, and 76.4 (95% CI 72.1-80.3) for qXR. Neither the CAD products nor the expert radiologist met the triage-test targets of 90% sensitivity and 70% specificity. At the sensitivity of the expert radiologist (94.0%), all CAD products had slightly lower point estimates for specificity: 22.4% (95% CI 16.9-29.0) for CAD4TB, 34.6% (95% CI 25.3-45.1) for qXR, and 41.0% (95% CI 30.1-53.0) for Lunit, compared with 45.6% for the expert radiologist. At the same specificity of 45.6%, all CAD products had lower point estimates for sensitivity, but their CIs overlapped with the sensitivity estimate of the radiologist.
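
For readers who want to reproduce this kind of operating-point comparison on their own data, the sketch below is illustrative only (it is not the authors' analysis code; the scores and labels are simulated and all function names are ours). It shows how an ROC curve, its AUC, and the specificity achievable at a fixed target sensitivity, such as the radiologist's 94.0%, can be computed from per-participant CAD scores and a binary microbiological reference standard.

```python
# Illustrative sketch only (not the study's analysis pipeline): given per-participant
# CAD abnormality scores and a binary microbiological reference standard, compute
# ROC points, the area under the curve, and the specificity at a target sensitivity.
import numpy as np

def roc_points(scores, labels):
    """Return (sensitivity, specificity) arrays over all score thresholds."""
    order = np.argsort(-scores)              # rank images by descending CAD score
    labels = labels[order]
    tp = np.cumsum(labels)                   # true positives at each cut-off
    fp = np.cumsum(1 - labels)               # false positives at the same cut-offs
    sens = tp / labels.sum()
    spec = 1 - fp / (labels.size - labels.sum())
    return sens, spec

def auc(sens, spec):
    """Trapezoidal area under the ROC curve (sensitivity vs. 1 - specificity)."""
    fpr = np.r_[0.0, 1 - spec, 1.0]
    tpr = np.r_[0.0, sens, 1.0]
    return float(np.sum(np.diff(fpr) * (tpr[1:] + tpr[:-1]) / 2))

def spec_at_sensitivity(sens, spec, target=0.94):
    """Specificity at the first threshold that reaches the target sensitivity."""
    return spec[np.argmax(sens >= target)]

# Toy data: 500 simulated participants, scores on a 0-100-like scale.
rng = np.random.default_rng(0)
labels = rng.integers(0, 2, 500).astype(float)   # 1 = bacteriologically confirmed TB
scores = np.where(labels == 1,
                  rng.normal(60, 15, 500),       # TB cases tend to score higher
                  rng.normal(40, 15, 500))
sens, spec = roc_points(scores, labels)
print(f"AUC: {auc(sens, spec):.3f}; "
      f"specificity at 94% sensitivity: {spec_at_sensitivity(sens, spec):.3f}")
```

Note that a pooled analysis across several source studies, as reported here, would additionally require a meta-analytic model (for example, a bivariate random-effects model), which is not shown in this sketch.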

Conclusions

We showed that, overall, three commercially available CAD products had reasonable diagnostic accuracy for microbiologically confirmed pulmonary TB and may achieve a sensitivity and specificity that approximate those of an experienced radiologist. While threshold setting and cost-effectiveness modelling are needed to inform the optimal implementation of CAD products as part of screening programs, the availability of CAD will assist in scaling up active case finding for TB and hence contribute to TB elimination in high-burden settings.

Authors

  • Sandra Vivian Kik1, Sifrash Gelaw2, Morten Ruhwald1, Rinn Song1,3,4,5, Faiz Ahmad Khan6, Rob van Hest7,8, Violet N. Chihota9,10,11, Nguyen Viet Nhung12, Aliasgar Esmail13, Anna Marie Celina G. Garfin14, Guy B. Marks13,18,19, Olga Gorbacheva17, Onno W. Akkerman7,8, Kgaugelo Moropane9, Le Thi Ngoc Anh12, Keertan Dheda13,18,19, Greg James Fox15,20, Nina N. Marano21, Knut Lönnroth22, Frank Cobelens23, Andrea Benedetti6,24, Puneet Dewan25, Stefano Ongarello1 and Claudia M. Denkinger26.

Affiliations

  • 1. FIND, Geneva, Switzerland
  • 2. International Organization for Migration (IOM), Manila, Philippines
  • 3. Oxford Vaccine Group, Department of Paediatrics, University of Oxford, Oxford, UK
  • 4. Division of Infectious Diseases, Boston Children’s Hospital, Boston, MA
  • 5. Department of Pediatrics, Harvard Medical School, Boston, MA, USA
  • 6. McGill International TB Centre, Research Institute of the McGill University Health Centre, Montreal, Canada
  • 7. University of Groningen, University Medical Center Groningen, Department of Pulmonary Diseases and Tuberculosis, Groningen, Netherlands
  • 8. GGD Groningen, Groningen, Netherlands
  • 9. Aurum Institute, Parktown, South Africa
  • 10. School of Public Health, University of the Witwatersrand, Johannesburg, South Africa
  • 11. Division of Infectious Diseases, Department of Medicine, Vanderbilt University School of Medicine, Tennessee, USA
  • 12. National Lung Hospital, Hanoi, Vietnam
  • 13. Centre for Lung Infection and Immunity, Division of Pulmonology, Department of Medicine and University of Cape Town Lung Institute, Cape Town, South Africa
  • 14. Department of Health, Disease Prevention and Control Bureau, Philippines
  • 15. Woolcock Institute of Medical Research, Glebe, Australia
  • 16. University of New South Wales, Sydney, Australia
  • 17. International Organization for Migration (IOM), Geneva, Switzerland
  • 18. South African MRC Centre for the Study of Antimicrobial Resistance, University of Cape Town, Cape Town, South Africa
  • 19. Faculty of Infectious and Tropical Diseases, Department of Infection Biology, London School of Hygiene and Tropical Medicine, London, UK
  • 20. Faculty of Medicine and Health, The University of Sydney, NSW, Australia
  • 21. United States Centers for Disease Control and Prevention (CDC), Atlanta, Georgia, United States
  • 22. Department of Global Public Health, Karolinska Institutet, Stockholm, Sweden
  • 23. Amsterdam Institute for Global Health and Development and Department of Global Health, Academic Medical Center, Amsterdam, The Netherlands
  • 24. Department of Epidemiology, Biostatistics & Occupational Health, McGill University, Montreal, Canada
  • 25. Bill and Melinda Gates Foundation, Seattle, US
  • 26. Heidelberg University Hospital, Center of Infectious Diseases, Heidelberg, Germany
