qXR detects various abnormalities on Chest X-Rays

qXR is CE-certified


How is AI useful for Chest X-Rays?

Chest X-rays are among the most common radiology diagnostic tests, but reading them is also one of the most complex radiology tasks, with high inter-reader variability. AI can help make this process more efficient and reduce errors.
Because of their affordability, chest X-rays are used all over the world, including in areas with few or no radiologists. AI can facilitate automated screening for diseases like tuberculosis in low-resource settings.

  • Approx. 1.65 million TB deaths worldwide in 2016
  • Approx. 10.4 million people fell ill with TB in 2016
  • Only 61% of new TB cases were reported
  • More than 20% false-negative diagnostic errors on chest X-rays
  • Only 55% of physician time spent on face-to-face patient care

qXR detects multiple chest X-ray abnormalities


We use deep learning to develop solutions that identify these abnormalities and generate focus areas (heat maps) for each X-ray image.
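One common way to turn a CNN's internal features into the focus-area heat maps described above is class activation mapping (CAM). The sketch below is illustrative only, not necessarily qXR's exact method, and the feature maps and classifier weights are toy values:

```python
import numpy as np

def class_activation_map(feature_maps, class_weights):
    """Combine CNN feature maps into a single focus-area heat map.

    Generic class-activation-mapping (CAM) sketch. `feature_maps` has
    shape (channels, H, W); `class_weights` has shape (channels,) and
    holds the classifier weights for one abnormality class.
    """
    cam = np.tensordot(class_weights, feature_maps, axes=1)  # -> (H, W)
    cam = np.maximum(cam, 0.0)        # keep positive evidence only
    if cam.max() > 0:
        cam = cam / cam.max()         # normalise to [0, 1] for overlay
    return cam

# Toy example: 3 feature maps of size 4x4 (random, for illustration)
maps = np.random.rand(3, 4, 4)
weights = np.array([0.5, -0.2, 0.8])
heatmap = class_activation_map(maps, weights)
```

The normalised map can then be colour-mapped and overlaid on the original X-ray to highlight the region driving a given abnormality score.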

Abnormalities

  • Opacities
    • Consolidation
    • Fibrosis
    • Calcification
    • Atelectasis
  • Blunted Costophrenic Angle
  • Cavity
  • Tracheal Shift
  • Cardiomegaly
  • Hilar Prominence
  • Pleural Effusion
Sample detections: patient with opacity; cavity in the right upper lobe; cardiomegaly; blunted costophrenic angle.

Validation Study I

At Qure, our diverse dataset contains about 1.2 million X-ray images collected from various centres, each with its associated clinical report.

Of these, 1.15 million images are used to train our algorithms and the remaining 75,000 form our test set (the QXR-75k dataset).

AUCs on the QXR-75k test set

Finding QXR-75k
Fibrosis 0.9172
Cardiomegaly 0.9502
Consolidation 0.9301
Cavity 0.9264
Infiltration 0.8862
Hilar Prominence 0.9097
Blunted Costophrenic Angle 0.9231
Pleural Effusion 0.9682
Any Abnormality 0.8741

AUCs on the QXR-2k dataset (see Validation Study II)

Finding QXR-2k
Fibrosis 0.9317
Cardiomegaly 0.9755
Consolidation 0.9495
Cavity 0.9473
Infiltration 0.9174
Hilar Prominence 0.9628
Blunted Costophrenic Angle 0.9372
Pleural Effusion 0.9431
Any Abnormality 0.8919

Validation Study II

A dataset of 2,000 chest X-rays (the QXR-2k dataset) was collected from centres that did not contribute to our training/testing data, in two batches, B1 and B2.

B1 was randomly sampled from X-rays collected in a specific time period, while B2 was enriched with X-rays containing various abnormalities. Areas under the receiver operating characteristic curve (AUCs) were used to evaluate the individual abnormality detection algorithms.

Our algorithms were validated against a panel of three radiologists.
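The AUC used throughout these studies can be computed without any ROC plotting via the Mann-Whitney U statistic. A minimal sketch, with made-up labels and scores standing in for the radiologist-panel ground truth and the model's outputs:

```python
def auc(labels, scores):
    """Area under the ROC curve via the Mann-Whitney U statistic.

    `labels` are ground-truth 0/1 flags (e.g. a radiologist panel's
    consensus for one abnormality); `scores` are the model's predicted
    probabilities. The AUC equals the probability that a randomly
    chosen abnormal image is scored higher than a normal one, with
    ties counted as half.
    """
    pos = [s for l, s in zip(labels, scores) if l == 1]
    neg = [s for l, s in zip(labels, scores) if l == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Perfect separation of abnormal (1) from normal (0) cases
print(auc([0, 0, 1, 1], [0.1, 0.2, 0.8, 0.9]))  # → 1.0
```

An uninformative model that gives every image the same score yields 0.5, the chance-level baseline against which the AUCs in the tables above should be read.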

Learn more

Validation Study III

374 de-identified frontal chest radiographs of adult patients were processed with Qure's AI algorithm. To establish a standard of reference (SOR), two thoracic radiologists assessed all radiographs for these abnormalities; disagreements were resolved in consensus with a third radiologist. Two other radiologists (the test radiologists), unaware of the SOR and AI findings, independently assessed the presence of abnormalities on each radiograph.

AUCs

Abnormality        AI vs SOR   Radiologist vs SOR   AI vs Radiologist
Cardiomegaly       0.937       0.830                0.820
Pleural Effusion   0.872       0.877                0.814
Infiltration       0.845       0.762                0.756
Hilar Prominence   0.823       0.722                0.685

The Qure AI algorithm is accurate for the detection of abnormalities on chest radiographs and helps radiologists detect these abnormalities.

Learn more

Validation Study IV (against Chest CT Scans)

We used a retrospectively obtained independent set of de-identified chest X-rays from patients who had undergone a chest CT scan within 1 day (TS-1, n=187), 3 days (TS-3, n=197) and 10 days (TS-10, n=230) of the X-ray to evaluate the algorithms' ability to detect abnormalities that were not visible to the radiologist at the time of reporting on the X-ray.

Results

Of the 180 abnormal scans in TS-1, 106 (59%) had been picked up as abnormal on the original chest X-ray by the reporting radiologist, while 120 (67%) were picked up by the deep learning algorithm.

Comparison vs CT                      Accuracy   Sensitivity   Specificity
Hyperdense Abnormality, AI            0.49       0.41          0.85
Hyperdense Abnormality, Radiologist   0.44       0.34          0.91
All Abnormalities, AI                 0.67       0.67          0.71
All Abnormalities, Radiologist        0.59       0.59          0.71
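The accuracy, sensitivity, and specificity reported above all derive from the same confusion counts against the CT reference. A small sketch, using toy counts rather than the study's actual data:

```python
def diagnostic_metrics(tp, fp, tn, fn):
    """Accuracy, sensitivity and specificity from confusion counts.

    tp/fn: abnormal cases (per CT) the reader flagged / missed on the
    X-ray; tn/fp: normal cases the reader cleared / over-called.
    Counts here are illustrative, not taken from the study.
    """
    return {
        "accuracy": (tp + tn) / (tp + fp + tn + fn),
        "sensitivity": tp / (tp + fn),  # fraction of abnormal cases caught
        "specificity": tn / (tn + fp),  # fraction of normal cases cleared
    }

# Toy counts: 180 abnormal CTs of which 120 were flagged on the X-ray
m = diagnostic_metrics(tp=120, fp=15, tn=35, fn=60)
print(round(m["sensitivity"], 2))  # → 0.67
```

Sensitivity is the headline number for a screening use case (a missed abnormality is costlier than a false alarm), which is why the AI's higher sensitivity at comparable specificity is the key result of this study.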
Learn more

Consolidated Results

Finding                      Study-1   Study-2   Study-3
Fibrosis                     0.9172    0.9317    ---
Cardiomegaly                 0.9502    0.9755    0.9370
Consolidation                0.9301    0.9495    ---
Cavity                       0.9264    0.9473    ---
Infiltration                 0.8862    0.9174    0.8450
Hilar Prominence             0.9097    0.9628    0.8230
Blunted Costophrenic Angle   0.9231    0.9372    ---
Pleural Effusion             0.9682    0.9431    0.8720
Any Abnormality              0.8741    0.8919    ---

See ROC curves

Automated report

Our algorithms detect, classify, and localize abnormalities. Using these algorithms, we automatically generate a concise diagnostic report for each X-ray image.
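Report generation of this kind can be sketched as mapping per-abnormality detections (with optional localisations) to short narrative sentences. The structure below is hypothetical; qXR's actual report format is not specified here:

```python
def generate_report(findings):
    """Turn per-abnormality detections into a short narrative report.

    `findings` maps abnormality names to (present, location) pairs, a
    hypothetical output format for the detection and localisation
    models; `location` may be None when no region is localised.
    """
    lines = []
    for name, (present, location) in findings.items():
        if present:
            where = f" in the {location}" if location else ""
            lines.append(f"{name} detected{where}.")
    if not lines:
        return "No abnormality detected."
    return " ".join(lines)

print(generate_report({
    "Cavity": (True, "right upper lobe"),
    "Cardiomegaly": (True, None),
    "Pleural effusion": (False, None),
}))
# → Cavity detected in the right upper lobe. Cardiomegaly detected.
```

Keeping the findings structured until the final rendering step makes the same detections reusable for worklist triage or heat-map overlays as well as for the text report.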

qXR is an investigational device and is limited by United States law to investigational use only.

Read more about us




Collaborate With Us

Collaboration with clinicians helps us carry out further research on chest X-rays. Most of our research is in partnership with radiologists across the globe.

If you are interested, please reach out to us at partner@qure.ai