Radiologists are in the driver’s seat with AI as the co-pilot

Not just an algorithm; more than an image viewer.

Validated Technology

AI Metrics was validated in a multi-institutional study with 24 radiologists and 20 oncologic providers that compared the effectiveness of AI Metrics to current methods (manual tumor measurements and dictated reports) for advanced cancer image interpretation and reporting.



Average radiologist interpretation time was 18.7 min with current practice versus 9.8 min with AI Metrics (p < 0.001), nearly twice as fast.



Reporting accuracy was 73% with the current-practice method versus 91% with AI Metrics, a 25% relative increase in reporting accuracy.


Major Errors

Major errors occurred in 27.5% (99/360) of interpretations with current practice versus 0.3% (1/360) with AI Metrics (p < 0.001), a 99% reduction in major errors. The most common errors involved transferring measurements from the image to the report.


Inter-Observer Agreement

Inter-observer agreement among oncologic providers was 46% (55/120) with the current-practice method versus 73% (87/120) with the AI-assisted method (p < 0.001), a 58% relative increase in inter-observer agreement.
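As a quick check, the relative changes quoted in the results above follow directly from the raw counts. A minimal sketch (the `relative_change` helper is illustrative only, not part of AI Metrics):

```python
def relative_change(before: float, after: float) -> float:
    """Relative change from `before` to `after`, as a percentage."""
    return (after - before) / before * 100

# Reporting accuracy: 73% -> 91%
print(round(relative_change(73, 91)))              # 25 (% relative increase)

# Inter-observer agreement: 55/120 -> 87/120
print(round(relative_change(55 / 120, 87 / 120)))  # 58 (% relative increase)

# Major errors: 99/360 -> 1/360
print(round(relative_change(99 / 360, 1 / 360)))   # -99 (a 99% reduction)
```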


96% of Radiologists Preferred

In a post-study survey, 96% (23/24) of radiologists preferred the AI Metrics viewer and reporting system over the current-practice viewer with manual measurements and dictated text reports.


100% of Oncologists Preferred

In a post-study survey, 100% (20/20) of oncologic providers preferred the AI Metrics reports, with a graph, table, and key images, over the current-practice dictated text reports.

One Platform
Multiple Opportunities